Google Docs Offline

I was able to try Google Docs Offline for the first time today and it's extremely cool.

The offline function uses the Google Gears web browser plugin, which I talked about last June, to keep an offline copy of all your documents. If you find yourself outside "the cloud" with no Internet access, you simply point your web browser at the docs.google.com domain and your browser reverts to the offline version. You can view documents and spreadsheets and even edit your documents offline, and it automatically syncs your changes when your connection is re-established.
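
I don't know how Google have actually implemented the syncing, but the general pattern is easy to sketch. Something like the following (a purely hypothetical illustration, not Google's code, with a made-up endpoint) keeps a local queue of edits while the browser is offline and flushes it when the connection comes back:

    // Hypothetical offline-editing sketch: queue changes locally while
    // offline, then sync them when the browser reports a connection again.
    type Edit = { documentId: string; content: string; editedAt: number };

    const pendingEdits: Edit[] = [];

    function saveEdit(edit: Edit): void {
      if (navigator.onLine) {
        sendToServer(edit);      // normal online behaviour
      } else {
        pendingEdits.push(edit); // store locally until we're back online
      }
    }

    function sendToServer(edit: Edit): void {
      const xhr = new XMLHttpRequest();
      xhr.open("POST", "/documents/" + edit.documentId, true); // hypothetical endpoint
      xhr.setRequestHeader("Content-Type", "text/plain");
      xhr.send(edit.content);
    }

    // When the connection is re-established, replay everything we queued.
    window.addEventListener("online", () => {
      while (pendingEdits.length > 0) {
        sendToServer(pendingEdits.shift()!);
      }
    });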

It was a tiny bit buggy for me when I first used it, but I think this is a significant step forward for web applications. I really hope that open standards emerge so that this kind of thing becomes commonplace; there are certainly efforts to make this happen.

3D Internet vs. 3D Web

There are a couple of things I've found out about recently which I think are significant developments in the 3D web.

Second Life Architecture Working Group

The Second Life Architecture Working Group are publicly working on defining a set of protocols which will open up the next generation of the Second Life "grid" to allow others to host Second Life style worlds. Second Life is a popular (the most popular?) online virtual world and currently has an open source client and closed source server. All servers are currently run by Linden Lab, but the company recognises that if Second Life is to become as ubiquitous as the World Wide Web, they have to open up the technology.

What I find interesting about this standardisation effort is the willingness of the group to investigate the use of existing open Internet standards where possible. Examples include HTTP, REST APIs, XMPP, FOAF, XFN and OpenID. I think this is a much more sensible approach than trying to define new protocols for every part of the technology.

The basic approach of the group currently appears to be to take each feature of Second Life and either match it to an existing open standard, or if none applies then define a new one. Meetings happen online, even inside Second Life itself, and the chat logs are available to view on their wiki.
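
To make that concrete with an entirely hypothetical example of my own (the URL and format choices here are not from the working group): an avatar's public profile could just be a FOAF document served over plain HTTP and fetched like any other web resource.

    // Hypothetical sketch: fetching an avatar's public profile as an
    // ordinary HTTP resource (FOAF is RDF/XML, so responseXML applies).
    function fetchAvatarProfile(avatarUrl: string,
                                onLoaded: (profile: Document) => void): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", avatarUrl, true); // e.g. "http://example.com/avatars/ben.foaf" (made up)
      xhr.onreadystatechange = () => {
        if (xhr.readyState === 4 && xhr.status === 200 && xhr.responseXML) {
          onLoaded(xhr.responseXML);
        }
      };
      xhr.send();
    }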

Vivaty

Vivaty were previously known as MediaMachines, who created the FluxPlayer for X3D viewing and FluxStudio for X3D authoring. Under their new brand they have recently launched a beta of a new 3D social networking service which uses X3D technology to provide online virtual worlds similar to Second Life, but with a greater emphasis on social networking. This is really a flagship for the X3D standard and it will be interesting to see how well it performs.

This new direction for the company appears to be an evolution from creating developer tools and implementations of Web3D standards, to providing end user web services which use those standards. I think this speaks volumes about the maturity of the technology.

3D Internet vs. 3D Web

What's interesting about contrasting these two developments is that they are both working towards building distributed online virtual worlds, but going about it in different ways. One is creating a 3D Internet and the other is creating a 3D Web. Also, one is taking a commercial service and turning it into an open technology, while the other is taking an open technology and turning it into a commercial service.

Here's my question. Are virtual worlds and the web different uses of the Internet in the same way that email clients and web browsers are different, or are virtual worlds one possible application of the 3D Web? It could be that both are true, similar to the fact that both email clients and webmail exist. In which case, the 3D Web is a web interface to virtual worlds. This then strays into the much wider debate of desktop vs. web browser as a software platform.

The Second Life client includes not only the real time rendering of interactive 3D vector graphics, but also a huge array of other technologies including authentication, instant messaging, presence, friends lists and even currency. I would call this a rich Internet client, because it involves much more than just a web browser. It's something which implements many different protocols over the Internet and is designed to be installed on a desktop PC, separate from a web browser.

The pure 3D Web approach I envisage would be slightly different. The client (web user agent) would include only the downloading/uploading (over HTTP) and rendering of 3D scenes (in X3D), with a client-side scripting engine (ECMAScript). HTTP authentication might also be included, as it is currently included in web browsers, but application specific protocols like instant messaging, exchange of currency and friends lists would be left to server-side web services. An instant messenger client may well use the Jabber protocol (XMPP), but would not require the user to download an IM client; it would simply be used via a web interface, much like Meebo. Currency? 3D PayPal perhaps.
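
As a rough sketch of what that division of labour might look like (every URL and service here is made up for illustration), the user agent only ever speaks HTTP: it fetches X3D scenes as XML and posts messages to server-side services.

    // Hypothetical 3D Web user agent sketch: everything goes over HTTP.
    // renderScene() stands in for the X3D rendering the user agent would provide.
    declare function renderScene(scene: Document): void;

    function loadScene(sceneUrl: string): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", sceneUrl, true); // e.g. "http://example.com/world.x3d" (made up)
      xhr.onreadystatechange = () => {
        if (xhr.readyState === 4 && xhr.status === 200 && xhr.responseXML) {
          renderScene(xhr.responseXML); // the scene is just XML, like XHTML or SVG
        }
      };
      xhr.send();
    }

    // Instant messaging stays on the server side: the client only posts to a
    // web service, which could be backed by XMPP, much as Meebo does.
    function sendChatMessage(serviceUrl: string, message: string): void {
      const xhr = new XMLHttpRequest();
      xhr.open("POST", serviceUrl, true); // hypothetical chat endpoint
      xhr.setRequestHeader("Content-Type", "text/plain");
      xhr.send(message);
    }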

The pure 3D Web option fits very neatly in the context of other web standards – we already have XHTML and the start of SVG implementations in web browsers, X3D could be the next step, with CSS and ECMAScript playing their parts.

XMPP for the 3D Web?

As an aside, one interesting idea which has been mentioned in the Second Life Architecture Working Group discussions is the idea of using XMPP not only as an instant messaging protocol but as a general purpose point-to-point protocol. I've thought about this before – the idea of using XMPP in place of HTTP to overcome the limitations of HTTP's synchronous request/response model. I know XMPP is extensible, but I don't know enough about it to know whether this would work. The 3D Web is surely going to be a big stretch for the hack that is AJAX, and XMPP could hold the answer to truly interactive 3D scenes.
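
I can only sketch the difference in behaviour rather than the protocol itself. With plain HTTP the client has to keep asking for updates; with a persistent two-way stream (which is what XMPP would provide) the server can push scene changes as they happen. The snippet below uses the generic WebSocket-style API purely as a stand-in for whatever an XMPP binding would actually look like:

    // Polling over HTTP: the client repeatedly asks "anything new?"
    function pollForUpdates(updatesUrl: string, onUpdate: (xml: Document) => void): void {
      setInterval(() => {
        const xhr = new XMLHttpRequest();
        xhr.open("GET", updatesUrl, true); // hypothetical updates resource
        xhr.onreadystatechange = () => {
          if (xhr.readyState === 4 && xhr.status === 200 && xhr.responseXML) {
            onUpdate(xhr.responseXML);
          }
        };
        xhr.send();
      }, 1000); // even one request per second is a lot of wasted round trips
    }

    // A persistent stream: the server pushes updates as they happen.
    // WebSocket is used here only as a stand-in for an XMPP-style connection.
    function streamUpdates(streamUrl: string, onUpdate: (data: string) => void): void {
      const socket = new WebSocket(streamUrl); // e.g. "ws://example.com/world" (made up)
      socket.onmessage = (event) => onUpdate(event.data);
    }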

Google Internship, Flat wanted in London

Internship 

After two years of trying and five interviews, I've finally got an internship with Google this summer. I'll be working as a Technical Solutions Engineer in the Partner Solutions Organisation at Google London, and the work I do will hopefully end up improving the results you get when you carry out a local search on Google Maps. I've been to visit the team I'll be working with and had lunch at Google, with a tour of the building. All I'm going to say is, I can't wait to start!

Flat 

I'm currently looking for short term accommodation in London over the summer. I'm looking for a 3 month let of a studio flat (or potentially a flat share) from mid-June to mid-September. I need somewhere in the range of £100-200 per week which is within easy reach of London Victoria station; somewhere on the District, Circle or Victoria tube lines would probably be ideal. My basic requirements are a double bed, a shower, an Internet connection and basic kitchen facilities.

If anyone has or knows of something suitable, I'd be extremely grateful if you could contact me by email (ben at tola.me.uk). So far I've been looking on rightmove.co.uk and gumtree.com and I'm not having a huge amount of success. There's lots out there, but I keep coming across problems.

Congratulations Alex

This is a slightly belated but massive congratulations to my university friend, Alex Smith.

Towards the end of October Alex had told me that he'd done something that had resulted in a company trying to give him money, but he said he couldn't tell me what it was, yet. Then I heard the university press office was frantically trying to get in touch with him and I started to wonder what it was!

On 24th October my housemate told me I should look at the front page of Birmingham University's web site. There was an article saying that Alex had won a $25,000 prize from Wolfram Research for proving that "Wolfram's 2,3 Turing Machine is Universal"! (No, I didn't know what that meant either until I looked it up.)

The story soon appeared on wired.com, Scientific American, Slashdot and Ars Technica.

In Stephen Wolfram's blog entry you can read how he didn't know whether it would take a month or a century to prove; Alex managed it in five months. In fact, his initial submission was just 47 days after the competition was announced! You can read Alex's 50 page proof here.

Alex described himself as "an undergraduate studying Electronic and Computer Engineering at the University of Birmingham, UK. He has a background in mathematics and esoteric programming languages". I worked with Alex on our robot project last year and this year we're in the same group for our Masters Group Project (Radio Orienteering). I've told him that now that he's a celebrity, he shouldn't let the fame go to his head 😛 We've still got work to do, after all. However, as with any major discovery, Alex's proof is already causing controversy and people are trying to prove him wrong. Alex has proved during countless interjections during lectures that he loves a good argument, so I expect it to keep him occupied for some time 🙂

I'll watch the debate unfold with interest (albeit with a regretful lack of understanding!); having his claims put under scrutiny is all part of the academic process.

Well done, Alex!

Lazy Browser Development is letting Flash Kill the Web

Adobe's Flash technology, which is fuelling some of the most innovative developments on the Internet, is simultaneously putting the World Wide Web as we know it at risk.

The World Wide Web is ubiquitous

It has far surpassed the expectations of its creator, Sir Tim Berners-Lee, and has come to embody what we think of as the Internet. But what gave the web such explosive growth and universal appeal was not technology alone; it was the openness with which the technology was developed. The World Wide Web Consortium (W3C) was founded to oversee the standardisation of the technology, built on top of Internet technology already standardised by the Internet Engineering Task Force (IETF). This meant that many developers could write software which implemented these open standards, giving us the diverse web servers and web user agents we see today.

The web is no longer just text and images

It is a multimedia platform with audio and video and the early stages of 3D virtual worlds. But if we look at how the multimedia web is coming about, we can see that it is being developed in quite a different way. Instead of the open standards and human-readable markup of XHTML, CSS and ECMAScript that underlie the original web, the multimedia web is being built on proprietary technology called Flash. Flash is currently the only widespread way of delivering audio and video across the web in a cross-platform way and is behind popular web sites like YouTube and Flickr.

Flash is not like the rest of the web

Flash was created by Macromedia and is now owned by Adobe, one of the big software giants of the world. Flash differs from web standards in that it is a closed, proprietary format owned by one company, and when Flash appears on a web page it appears as a cryptic "binary blob" that can only be interpreted by Adobe's software.

Web standards like XHTML mean that any web user agent or search engine can trawl the web and interpret the data stored on it. Search engines can index web pages, multiple companies can implement web user agents and web servers, and the web has a certain level of accessibility for the visually impaired user. Flash has none of these properties; it is a closed, proprietary technology owned by one company that doesn't play nicely with the rest of the web.

What we can do

If we care about the World Wide Web we should be supporting the same standards organisations that created it to drive innovation of the next generation. The problem is that often companies can develop innovative proprietary technology a lot faster than standards organisations can standardise technologies. If we're not careful, such standards will never emerge and the web will become hostage to large corporations, holding our data prisoner in proprietary, closed formats.

So what is Flash used for today that should use open web standards and what technology should we be supporting?

Audio/Video

Web sites like YouTube use Flash to embed video in web pages because it is currently the easiest way to deliver video across the web and reach the maximum number of people. This is due to an underlying problem with digital audio and video: every computing platform seems to use a different standard. Giants like Microsoft, Apple and Sony are in a standards war over digital formats. This has happened before with cassette tapes, VHS and CDs, and it is now happening again with DVDs.

But in the background a few sane people have defined some open standards for these things. Formats like Ogg Vorbis, Ogg Theora and FLAC are as close as we can get to open, patent-free formats for audio and video.

We should be building native support for these open formats into web user agents like Mozilla Firefox so that video can be delivered over the web in an open way.
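
If user agents did expose a native video object to ECMAScript, support for the open formats could be checked with something as simple as this. This is a purely illustrative sketch of a capability query, not an API that mainstream browsers currently provide natively:

    // Hypothetical capability check, assuming a native media element that
    // can be asked whether it supports a given format.
    function supportsOggTheora(): boolean {
      const video = document.createElement("video") as HTMLVideoElement;
      if (typeof video.canPlayType !== "function") {
        return false; // no native video support in this user agent
      }
      const answer = video.canPlayType('video/ogg; codecs="theora, vorbis"');
      return answer === "probably" || answer === "maybe";
    }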

2D Vector Graphics

Flash is by far the dominant technology for static and animated 2D vector graphics. But the World Wide Web Consortium already has a standard for this: it's called SVG. Not a lot of people realise that SVG has a feature set equivalent to Flash's, and that it is an open, XML-based standard like XHTML and a W3C recommendation. For some reason there is not yet a widespread full implementation of this standard in any web user agent (to Adobe's credit they have probably come the closest, but they have recently announced that they are going to stop supporting SVG).

Mozilla are working on implementing SVG in Firefox, but progress on such a huge standard is slow and for some reason it doesn't seem to be a high priority for them. We need to give organisations like Mozilla more help in implementing SVG natively in web user agents.
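
Where native SVG support does exist, it slots straight into the same DOM and ECMAScript model as XHTML. A minimal sketch, assuming a user agent with basic SVG DOM support:

    // Create a red circle with the standard DOM API; no plugin involved.
    const SVG_NS = "http://www.w3.org/2000/svg";

    const svg = document.createElementNS(SVG_NS, "svg");
    svg.setAttribute("width", "100");
    svg.setAttribute("height", "100");

    const circle = document.createElementNS(SVG_NS, "circle");
    circle.setAttribute("cx", "50");
    circle.setAttribute("cy", "50");
    circle.setAttribute("r", "40");
    circle.setAttribute("fill", "red");

    svg.appendChild(circle);
    document.body.appendChild(svg);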

3D Vector Graphics

3D virtual worlds are in their early stages on the Internet and are predominantly made up of applications like Second Life, which are very much like AOL before the web opened things up: a walled garden of a virtual world controlled by one company.

Just today I've seen an announcement claiming that MetaPlace is going to make virtual worlds work like the web. They're going to "democratise" virtual worlds by allowing anyone to create their own virtual world. At first glance, this appears to be the true 3D web. But actually Areae, who make MetaPlace, are going to use Flash technology to deliver these 3D worlds to the web browser. This is not how the web is supposed to work.

The Web3D Consortium exists to standardise formats for the 3D web and they are developing the X3D standard, another XML based standard like XHTML. If online virtual worlds are going to truly work like the web, they will surely use an open standard like X3D. We need to work on implementing X3D in web user agents.

Applications

Flash is very popular for applications online because of its programmable properties and its cross-platform nature. An alternative exists in a combination of technologies known as AJAX. Asynchronous JavaScript and XML is a way of using existing web technologies to create highly responsive web applications.

By applying AJAX techniques to SVG and X3D as well as XHTML, we can create some really interesting software.
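
For anyone who hasn't seen the technique, the core of AJAX really is that simple: fetch some XML in the background with XMLHttpRequest and update the page without reloading it. A minimal sketch (the URL and element names are made up):

    // Minimal AJAX: request XML asynchronously, then update part of the page.
    function updatePanel(panelId: string, dataUrl: string): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", dataUrl, true); // e.g. "/latest-news.xml" (hypothetical)
      xhr.onreadystatechange = () => {
        if (xhr.readyState === 4 && xhr.status === 200 && xhr.responseXML) {
          const title = xhr.responseXML.getElementsByTagName("title")[0];
          const panel = document.getElementById(panelId);
          if (panel && title) {
            panel.textContent = title.textContent; // no page reload needed
          }
        }
      };
      xhr.send();
    }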

The W3C is standardising web application formats in the Web Application Formats Working Group. WHATWG are contributing to this effort. 

In conclusion

We've become lazy with our web user agents. The truth is, standardising a computer format, implementing it and promoting its adoption is a lot of hard work. Developers can hardly be blamed for choosing proprietary options which already work today; they are pragmatists and simply use whatever works.

If we care about the openness of the web (and by that I mean the freedom of our information) then we need to put a lot more effort into developing web user agents for the next generation of the World Wide Web.

To borrow a phrase from the Mozilla Corporation's advertising campaigns:

Take Back the Web!

Victory for Open Standards – OOXML denied ISO fast track

Microsoft's application for their OOXML office format to be fast tracked to an ISO standard has been turned down. There were lots of rumours flying around all day yesterday about what the result of the latest vote was, but it has finally been officially announced by ISO. Groklaw has the details.

Standards for digital office documents might be a dull topic to discuss, but they're of huge political importance. Microsoft currently has a monopoly with their Office suite, and their proprietary office formats lock users' data into a single software vendor's products. This has the potential to cause huge problems for long term storage of data, not to mention forcing computer users (including governments, schools and hospitals) into purchasing costly Microsoft software in order to read the documents others create.

An international standard already exists for documents: it's called the Open Document Format (ODF) and it's an open standard used by many competing office suites including OpenOffice, StarOffice and KOffice. Microsoft is trying to push through a competing standard to ODF called OOXML, which will complicate document standards and which many believe is so technically complex that it could never practically be implemented by any company other than Microsoft. This would give the impression that Microsoft documents are in an open format, but actually perpetuate the problem of vendor lock-in.

To put it into perspective, the OOXML standard document is 6,545 pages long. The existing ODF standard achieves the same goal in a document which is just 867 pages long. The reason for this is that ODF references existing ISO standards for things like date formats and mathematical formulae, whereas OOXML invents new ways of doing everything. In fact, despite the impossible length of the OOXML standard document, huge parts are missing because the document references Microsoft implementations which people implementing the standard have no access to.
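
Date handling is a good illustration of why reusing existing standards matters. An ISO 8601 date is self-describing and parses the same way everywhere, whereas a spreadsheet-style serial number only means something once you also know which epoch and counting rules the vendor chose. A rough sketch (the serial-number scheme shown is the commonly used "days since the 1900 epoch" convention; treat the details as an approximation):

    // ISO 8601: self-describing, parses the same way everywhere.
    const isoDate = new Date("2007-09-04T00:00:00Z");

    // A serial-number date only makes sense with extra, vendor-specific rules.
    // 25569 is the serial commonly used for 1970-01-01 in the 1900 epoch scheme.
    function serialToDate(serial: number): Date {
      const msPerDay = 24 * 60 * 60 * 1000;
      return new Date((serial - 25569) * msPerDay);
    }

    console.log(isoDate.toISOString());             // 2007-09-04T00:00:00.000Z
    console.log(serialToDate(39329).toISOString()); // also 2007-09-04 in that scheme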

OOXML is the worst kind of standard. It was created for the wrong reasons, it is in competition with an already approved ISO standard, it is hugely complex to implement and up until now was being rushed through a standards process at an unreasonable speed. It gives me great faith in the international standards process that the ISO voted not to allow OOXML on a fast track to being a standard. Microsoft have been using some very underhand tactics to get the votes they needed and I'm pleased that their financial clout has not affected the result of the vote.

Reading Microsoft's press release you'd be forgiven for thinking that yesterday was a huge success for Microsoft, but it is pure spin. The reality is that they lost both of the two votes required for the fast track to be approved, and this is a setback for them.

But the fight is not over yet. The OOXML format was turned down for a fast track for now, but this does not prevent it from becoming a standard in the future. Many comments will have been made on the proposed standard, which Microsoft can address before trying to take the application further.

Microsoft are trying to make a mockery of the standards process; there is no need for a duplicate standard for documents and the proposed OOXML standard is completely self-serving. They should concentrate their efforts on adopting existing international standards rather than trying to force the world into using their own.