Sunday, July 27, 2008

OpenOffice Modal Dialog "JRE Required"

I was trying to edit a chart in OpenOffice Writer on Ubuntu today when it popped up a modal dialog saying it needed the Java Runtime Environment (JRE) to complete the task (not in those words).

Now even after using the Ubuntu package manager, Aptitude, to install the required software, OpenOffice still hangs on the modal dialog, making it impossible to close OpenOffice or do anything else with it. The only solution is to kill the OpenOffice process, which is called soffice.bin, ie: pkill soffice.bin. This is the first annoyance I've had with OpenOffice, which apart from this 'lil quirk is the best document publisher/editor - even better than the Microsoft Office Suite.

It appears this bug has already been reported: http://qa.openoffice.org/issues/show_bug.cgi?id=74940.

Friday, July 11, 2008

BlogPulse Trends

BlogPulse offers a trend search very similar to Google Trends but specifically targeted at blogs.

Here is the blog trend for the words Joomla, Drupal and Wordpress over the last 6 months.
It shows the percentage of mentions of each word in blogs.

Lively - Googles 3D Virtual World

Just came across Lively.com, developed by Google, which creates a virtual 3D world similar to SecondLife.

I haven't tested out Lively yet, since I'm running Ubuntu and it only supports Windows at the moment. From one of their blog posts, however, we can gather that Lively is well integrated with the Web. They have widgets that allow visitors - with the Lively software installed - to jump into a Lively room embedded in a webpage.

By contrast, I believe SecondLife offers APIs with a REST interface as well as other network-level interfacing. Here is a SecondLife Facebook Application.

I'm wondering if Google will offer the same level of integration with their Lively 3D world. It would make for fun mashups.

Friday, July 4, 2008

Chaining Functions in JavaScript

A 'lil snippet I borrowed from the Google AJAX Libraries API:

chain = function(args) {
    return function() {
        for (var i = 0; i < args.length; i++) {
            args[i]();
        }
    };
};
You're probably familiar with the multiple window onload method written by Simon Willison, which creates a closure of functions within functions in order to chain them.
Here is the same functionality using the chain function defined above.
function addLoad(fn) {
    window.onload = typeof(window.onload) != 'function' ? fn : chain([window.onload, fn]);
}
Chain can also be added to Function.prototype, if that's how you like your JS.
Function.prototype.chain = function(args) {
    args.push(this);
    return function() {
        for (var i = 0; i < args.length; i++) {
            args[i]();
        }
    };
};
So the multiple window onload function would be:
window.addLoad = function(fn) {
    window.onload = typeof(window.onload) != 'function'
        ? fn : window.onload.chain([fn]);
};
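To see the prototype version in action outside the browser, here's a minimal self-contained sketch; the function names and the log array are just illustrative:

```javascript
// Sketch exercising the chain technique with plain functions.
Function.prototype.chain = function(args) {
    args.push(this);
    return function() {
        for (var i = 0; i < args.length; i++) {
            args[i]();
        }
    };
};

var log = [];
function first() { log.push('first'); }
function second() { log.push('second'); }

// second.chain([first]) pushes second onto the args list, so calling
// the combined function runs first, then second.
var combined = second.chain([first]);
combined(); // log is now ['first', 'second']
```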

Hacking Google Loader and the AJAX Libraries API

Google recently released the AJAX Libraries API, which allows you to load the popular JavaScript libraries from Google's servers. The benefits of this are outlined in the description of the API.

The AJAX Libraries API is a content distribution network and loading architecture for the most popular open source JavaScript libraries. By using the google.load() method, your application has high speed, globally available access to a growing list of the most popular JavaScript open source libraries.

I was thinking of using it for a current project that would use JS heavily; however, since the project used a CMS (Joomla), the main concern for me was really how many times MooTools would be loaded. Joomla uses a PHP-based plugin system (which registers observers of events triggered during Joomla code execution), and the loading of JavaScript by multiple plugins can be redundant, as there is no central way of knowing which JavaScript library has already been loaded, nor is there a central repository for JavaScript libraries within Joomla.

MooTools is the preferred library for Joomla and in some cases it is loaded two or even three times redundantly. I did not want our extension to add to that mess. To solve the problem I would test for the existence of MooTools, if (typeof(MooTools) == 'undefined'), and load it from Google only if it wasn't available. Now this would have worked well; however, I would have had to add the JavaScript for the AJAX Libraries API, and it would only be loading one script, MooTools, when I also had about 3-4 other custom libraries that I wanted loaded.
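The guard itself is simple; here's a sketch of the idea with a hypothetical stand-in for google.load (loadIfMissing and the loader callback are my own illustrative names):

```javascript
// Sketch: call the loader only when the named global is absent.
// "loader" stands in for google.load here.
function loadIfMissing(scope, name, loader) {
    if (typeof scope[name] == 'undefined') {
        loader(name);
        return true;
    }
    return false;
}

// Simulate a page where MooTools has not been loaded yet:
var scope = {};
var requested = [];
loadIfMissing(scope, 'MooTools', function(name) { requested.push(name); });
// requested is now ['MooTools']
```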

Now I thought, why don't I develop a JavaScript loader just like the Google AJAX Libraries API loader? It should be just a simple function that appends a script element to the document head. So I started with:

function loadJS(src) {
    var script = document.createElement('script');
    script.src = src;
    script.type = 'text/javascript';
    var timer = setInterval(closure(this, function(script) {
        if (document.getElementsByTagName('head')[0]) {
            clearInterval(timer);
            document.getElementsByTagName('head')[0].appendChild(script);
        }
    }, [script]), 50);
}

function closure(obj, fn, params) {
    return function() {
        fn.apply(obj, params);
    };
}
The loadJS function tries to attach a script element to the document head every 50 milliseconds until it succeeds.

This works, but there is no way of knowing when the JavaScript file has fully loaded. Normally, the way to figure out whether a JS file has finished loading from the remote server is to have the JS file invoke a callback function in the client JavaScript (aka JavaScript remoting). That, however, means you have to build a callback into each JavaScript file, which is not what I wanted.

So to fix this problem I thought I'd add another interval with setInterval() to detect when the remote JS file had finished loading, by testing a condition that holds once the file has completed. Eg: for MooTools it would mean that the object window.MooTools existed.
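That second interval boils down to polling a condition and firing a callback exactly once; here's a sketch of the idea (makePoller is a hypothetical helper name, not part of any library):

```javascript
// Sketch: fire a callback the first time a "load condition" holds.
// In the browser you would drive the returned check function with
// setInterval and clear the timer once it returns true.
function makePoller(condition, callback) {
    var done = false;
    return function check() {
        if (!done && condition()) {
            done = true;
            callback();
        }
        return done;
    };
}

// Eg: for MooTools the condition would be:
//   function() { return typeof window.MooTools != 'undefined'; }
// driven every 50ms:
//   var timer = setInterval(function() {
//       if (check()) clearInterval(timer);
//   }, 50);
```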

So I went about writing a JavaScript library for this, with a somewhat elaborate API where JS libraries register their "load condition test" before being loaded remotely (about one wasted hour, though not wasted if you learn something), only to realize that this wouldn't work for the purpose either. The reason is that it broke the window.onload functionality: some remote files would load before the window.onload event (cached ones) and others after, which made any JavaScript already written to rely on window.onload fail.

Last resort: how did Google do it? I had noted earlier that if you load a JavaScript file with Google's API, the file would always load before the window.onload method fired. Here is the simple test (in the debug output, the Google callback always fired first):

google.load("prototype", "1");
google.load("jquery", "1");
google.load("mootools", "1");
google.setOnLoadCallback(function() {
    addLoad(function() {
        debug('google.setOnLoadCallback - window.onload');
    });
    debug('google.setOnLoadCallback');
});
addLoad(function() {
    debug('window.onload');
});
debug('end scripts');
I had to take a look at the source code for Google's AJAX Libraries API (http://www.google.com/jsapi) to see how they achieved this.

It never occurred to me that you could force the browser to load your JavaScript before the window.onload event so I was a bit baffled. Browsing through their source code I came upon what I was looking for:

function q(a,b,c){if(c){var d;if(a=="script"){d=document.createElement("script");d.type="text/javascript";d.src=b}else if(a=="css"){d=document.createElement("link");d.type="text/css";d.href=b;d.rel="stylesheet"}var e=document.getElementsByTagName("head")[0];if(!e){e=document.body.parentNode.appendChild(document.createElement("head"))}e.appendChild(d)}else{if(a=="script"){document.write('<script src="'+b+'" type="text/javascript"><\/script>')}else if(a=="css"){document.write('<link href="'+b+'" type="text/css" rel="stylesheet"></link>'
)}}}
The code has been minified, so it's a bit hard to read. Basically it's the same as any JavaScript remoting code you'd find on the net, but the part that jumps out is:
var e=document.getElementsByTagName("head")[0];
if(!e){e=document.body.parentNode.appendChild(document.createElement("head"))}
e.appendChild(d)
Notice how it creates a head node and appends it to the parentNode of the document body if the document head does not exist yet.

Now that forces the browser to load the JavaScript right then, no matter what. Following that method, you can load remote JavaScript files dynamically and just use the regular old window.onload event or "domready" event, and the files will be available.

Apparently this won't work on all browsers, since Google's code also has the alternative:

document.write('<script src="'+b+'" type="text/javascript"><\/script>')
With a bit of testing, you could discern which browsers work with which method and use that. I'd imagine that the latest browsers would accept the DOM method and older ones would need the document.write approach.

So my JavaScript file loading function became:

function loadJS(src) {
    var script = document.createElement('script');
    script.src = src;
    script.type = 'text/javascript';
    var head = document.getElementsByTagName('head')[0];
    if (!head) {
        head = document.body.parentNode.appendChild(document.createElement('head'));
    }
    head.appendChild(script);
}

Anyways, I finally got my JavaScript library loader working just as I liked, thanks to the good work done by Google with the AJAX Libraries API.

Secure HTTP over SSH proxy with Linux

In a previous post I detailed how to secure your browser's HTTP communications by tunneling the HTTP session over an SSH proxy using PuTTY.

PuTTY is what you would use on a Windows desktop. If you're on a Linux desktop you do not need PuTTY, since OpenSSH should come with the distribution you use.

Running man ssh on your Linux desktop should give you the manual for the SSH client:

SSH(1)                                                         BSD General Commands Manual                                                         SSH(1)

NAME
     ssh - OpenSSH SSH client (remote login program)

SYNOPSIS
     ssh [-1246AaCfgKkMNnqsTtVvXxY] [-b bind_address] [-c cipher_spec] [-D  [bind_address:]port] [-e escape_char] [-F configfile] [-i identity_file] [-L
         [bind_address:]port:host:hostport] [-l login_name] [-m mac_spec] [-O ctl_cmd] [-o option] [-p port] [-R  [bind_address:]port:host:hostport]
         [-S ctl_path] [-w local_tun[:remote_tun]] [user@]hostname [command]

... etc ...
The synopsis gives you the format of the command and the options that can be used with the ssh command. Of interest is the -D option, which allows you to bind the SSH session to a local address and port. Below is the part of the manual explaining the -D option:
     -D [bind_address:]port
             Specifies a local “dynamic” application-level port forwarding.  This works by allocating a socket to listen to port on the local side,
             optionally bound to the specified bind_address.  Whenever a connection is made to this port, the connection is forwarded over the secure
             channel, and the application protocol is then used to determine where to connect to from the remote machine.  Currently the SOCKS4 and
             SOCKS5 protocols are supported, and ssh will act as a SOCKS server.  Only root can forward privileged ports.  Dynamic port forwardings can
             also be specified in the configuration file.

             IPv6 addresses can be specified with an alternative syntax: [bind_address/]port or by enclosing the address in square brackets.  Only the
             superuser can forward privileged ports.  By default, the local port is bound in accordance with the GatewayPorts setting.  However, an
             explicit bind_address may be used to bind the connection to a specific address.  The bind_address of “localhost” indicates that the listen‐
             ing port be bound for local use only, while an empty address or ‘*’ indicates that the port should be available from all interfaces.
Basically it means that you can start an SSH session using the OpenSSH client with a command such as:
ssh -D localhost:8000 user@example.com
and it will create a SOCKS proxy on port 8000 that will tunnel your HTTP connection over SSH to the server at example.com under the username user.

Now you can configure applications that access the internet to use the secure tunnel you've created to your remote SSH server. These are not limited to web browsers; you can configure your instant messenger, Skype, games, etc. to use the SOCKS proxy, as long as the communication protocol is supported.

Configuring Firefox to use the Socks Proxy

  • Tools -> Options -> Advanced -> Network
  • Under Connection click on the Settings button
  • Choose Manual Proxy configuration, and SOCKS v5
  • Fill in localhost for the host, and 8000 (or the port number you used) for the port
  • Click OK and reload the page

Now what you can do is have the ssh session start up when you start your desktop, if you want to use your secure tunnel every time you use Firefox or whatever program you have configured to use it. On Ubuntu (Debian) you'd add a shell script to your home directory.
Example:

#!/bin/sh
ssh -D localhost:8000 user@example.com
That should start up the SSH connection and create the SOCKS proxy when you log in. The other alternative is to create a launcher with ssh -D localhost:8000 user@example.com as the command, allowing you to launch the proxy whenever you need it.
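If you'd rather not have the script open a remote shell, ssh can also run the tunnel purely in the background: -N tells it not to execute a remote command, and -f backgrounds ssh after authentication (same host and port as above):

```shell
#!/bin/sh
# Create the SOCKS proxy without opening a remote shell:
#   -N  do not execute a remote command
#   -f  go to background after authentication
ssh -f -N -D localhost:8000 user@example.com
```

Combined with key-based authentication (below), this gives you a tunnel that comes up silently at login.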

You can also set up an SSH key for authentication instead of having to log in; this is detailed in other posts: http://pkeck.myweb.uga.edu/ssh/ and http://sial.org/howto/openssh/publickey-auth/. That allows you to use the proxy transparently in the background without having to start it and log in.

For Firefox you can switch between the proxy and a direct connection using the SwitchProxy extension.

Disclaimer: Please note that it is your responsibility to use the information in this article within the laws of your country. Some countries do not allow encryption of internet traffic, so you SHOULD NOT use this resource if you live in such a country. I provide this information without warranty and free of charge and will not be held accountable for any damages resulting from its use, etc. etc.