Graphical Software Design with UML and XMI

I'm currently studying parallel modules in UML and Java and whilst reading through the notes for UML an idea occurred to me.

If you could store the semantics of a UML diagram in an XML format, you could transform your models into SVG diagrams or XHTML documentation, and even generate a framework of code for the implementation of a computer program.

A computer program could be designed graphically using a drag-and-drop application with SVG, and collaboratively using a version control system. This would also, in theory, make it much easier to implement your program in multiple languages if you wished. In combination with reverse engineering of code back into UML, it would give people graphical, textual and code views of an application, depending on personal preference or their role in the development process.
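As a rough sketch of the code-generation half of this idea, here's how a script might walk an XML model and emit class stubs. To be clear, the element and attribute names below are invented and far simpler than real XMI, which uses UML namespaces and xmi:id cross-references; this only shows the principle.

```python
# Sketch: turn a simplified, *hypothetical* XMI-like model into Java class
# stubs. Real XMI is much richer; this just illustrates the transformation.
import xml.etree.ElementTree as ET

MODEL = """
<model>
  <class name="Account">
    <attribute name="balance" type="double"/>
    <operation name="deposit" returns="void"/>
  </class>
</model>
"""

def generate_stubs(xml_text):
    """Emit one Java class skeleton per <class> element in the model."""
    root = ET.fromstring(xml_text)
    stubs = []
    for cls in root.findall("class"):
        lines = [f"public class {cls.get('name')} {{"]
        for attr in cls.findall("attribute"):
            lines.append(f"    private {attr.get('type')} {attr.get('name')};")
        for op in cls.findall("operation"):
            lines.append(f"    public {op.get('returns')} {op.get('name')}() {{ }}")
        lines.append("}")
        stubs.append("\n".join(lines))
    return stubs

print(generate_stubs(MODEL)[0])
```

The same tree walk could just as easily emit SVG shapes or XHTML documentation instead of code, which is what makes a single XML model format so appealing.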

As this is quite an obvious use of UML I searched the web for UML and XML to find out who had already done this.

I found:

I'd be interested if anyone has experience of using these types of tools in practice and how useful they are.

User Experience in a Horizontal Market

I read a blog post a while back called Reaction to the iPhone reveals how the electronics industry failed to beat the iPod. The author talks about how the electronics industry is so obsessed with features that it forgets about user experience. Since then I've had a Design lecture from someone who also looks a lot like an Apple fanboy and who talked about the same kind of things.

But the point they bring up about user experience is an important one. Much of the electronics industry cannot understand why Apple products are so successful: they're overpriced and easily matched by cheaper products with the same or even better raw functionality.

Engineers are unusual customers

I don't own an iPod. I do own an Apple laptop, but only because I know I can install other operating systems on it if I choose to. I'm extremely impressed with the iPhone, but I'm very unlikely to buy one. That's because I'm an engineer at heart, and the opportunity for hacking on such a closed system is much smaller than with more open alternatives. Apple is about as proprietary as you can get. I'd rather own something I can hack on, even if that means sacrificing form for function. As an engineer, I'm one of the worst-placed people to judge the usability of a product.

This is a problem with the electronics industry, because the products are made by engineers like me. That's my first point, but there's a more important one.

Vertical Integration

The reason I believe Apple's products are so successful is the integrated user experience you get. Everything is so well integrated because of the vertical market Apple has created. They provide the equipment, the peripherals, the packaging and sales of that equipment, the operating system, the applications, the bundling and selection of content, and there are even links with the content creation itself.

When the button on your iPod is made by the same company you buy your music from, user experience is easy. I don't mean to trivialise the work that Apple put in to make this happen because they do an amazing job, I merely mean to point out that in a horizontal market it's going to be even harder.


A paper I read years ago by Milton L. Mueller talks about the economic consequences of digital convergence. He describes the transition from a vertical market (like Apple's) to a horizontal one where equipment, software, carriage, packaging and content will be distinct sectors of the market, run by different companies.

It's all the more interesting, then, that the company which has just reached a milestone in the convergence of computing, telecommunications and media did so with a vertical approach.

A horizontal market like the one described in that paper is my ideal for a converged market. It's good engineering and it's good for the consumer because it encourages a more competitive market. I think it's the market the Open Source community is heading for, which probably explains why they are notoriously bad at user experience.

The answer

So how do we achieve this level of user experience in a horizontal market where all of these functions are run by different companies? People who know me might guess that this is the part where I bring in my “open standards” mantra.

The only way we can hope to achieve this level of integration is by using open standards for the links between each level. Open hardware, open software, open networks and open formats for media.

If you want to converge different markets (which I believe is inevitable), those markets need to establish common languages to communicate with. If we achieve that ideal, everyone wins.

Metaverse Roadmap, Convergence at CES

In a comment on my last blog post I mentioned that:

“I have a vision of something which basically *is* the web, but in 3D. In fact, I think the user should be able to choose how they wish to view a given web resource – in plain text, 2D shapes, 3D shapes, simulated speech etc. This can be done with content negotiation in HTTP. The same resource could be rendered by lots of different devices, from a light switch to a 3D headset.”
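The content-negotiation idea in that quote can be sketched in a few lines: one resource, several representations, chosen by the client's Accept header. The media types and bodies below are invented placeholders, not a real implementation.

```python
# Sketch: one web resource served in multiple representations, chosen via
# HTTP content negotiation (the Accept header). The representations here
# are illustrative stand-ins for text, 2D (SVG) and 3D (X3D) views.
from http.server import BaseHTTPRequestHandler, HTTPServer

REPRESENTATIONS = {
    "text/plain": "a red cube at (0, 0, 0)",
    "image/svg+xml": '<svg xmlns="http://www.w3.org/2000/svg"><rect fill="red"/></svg>',
    "model/x3d+xml": "<X3D><Scene><Shape><Box/></Shape></Scene></X3D>",
}

def negotiate(accept_header):
    """Return the first media type the client accepts that we can serve."""
    for media_type in accept_header.split(","):
        media_type = media_type.split(";")[0].strip()  # drop q-values etc.
        if media_type in REPRESENTATIONS:
            return media_type
    return "text/plain"  # fallback representation

class Resource(BaseHTTPRequestHandler):
    def do_GET(self):
        chosen = negotiate(self.headers.get("Accept", "*/*"))
        body = REPRESENTATIONS[chosen].encode()
        self.send_response(200)
        self.send_header("Content-Type", chosen)
        self.end_headers()
        self.wfile.write(body)

# To serve it: HTTPServer(("localhost", 8080), Resource).serve_forever()
```

A 3D headset would send Accept: model/x3d+xml, a plain terminal text/plain, and both would get the same underlying resource.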

I then found the Metaverse Roadmap, a “public ten-year forecast and visioning survey of 3D Web technologies”. They have a wiki where you can input your thoughts. I was going to add my own vision statement about how the 3D web could be just one mode of interaction with a multimodal web (as mentioned above), but then I found this vision statement, which expresses a similar idea:

“The world will be the metaverse. People often think of Stephenson’s metaverse as an “other” place, and the web as a window onto cyberspace, but as Paul Saffo and Mike Liebhold of Institute for the Future note, the best model for the metaverse of 2016 may be an information-drenched world, where the 3D web is just one particular instantiation. Mixed reality is likely to be the dominant user experience. You will use virtual worlds when they are an appropriate mode of interaction, but they are not your primary mode of communication – you have your chat, your email, your augmented reality, your 2D and 3D browser, etc. While people will continue to use online spaces and media centers for particularly high quality 3D content, the pervasiveness of information access and augmented reality will give world itself new layers of “metaverse-itivity.” The ubiquity of small, portable Sidekick-like and wearable devices will enable immediate access. Voice will be used for many basic queries, but text, even IM text, is private and unobtrusive, so it will not disappear.”

Someone also mentions the need for a new type of browser which would give us “all our 3D access through one piece of software”, noting that “Open standards will be particularly important for this”. I've downloaded FreeWRL, the X3D renderer I want to use for Webscope.

In other news…

It seems CES is all about convergence again this year, with Apple's iPhone being announced alongside the Nokia N800, Apple TV and Windows Home Server. The iPhone was inevitable, but it sure is pretty now it's here, with very nice design touches like motion sensors and a multi-touch screen that I didn't expect to see yet. Note the lack of 3G and the presence of WiFi. This is the kind of hardware we should be thinking about for future web software development.

Second Life Client Open Sourced

In my fourth blog post of the day: Linden Labs has Open Sourced the Second Life client, announcing it in a blog post entitled Embracing the Inevitable.

Linden Labs always said that Open Sourcing the code was part of the long-term plan; I remember an interview on LUGRadio a while back. It's a shame it's only the client and not the server-side code, but they say they are staying open-minded about that. One step at a time.

My dream (as I described in March) would be a distributed system where anyone could set up their own server. It would use web standards and would just be like a collection of 3D web pages in X3D. It might be difficult to attain the same kind of user experience you get with Second Life, but it would be a great extension of the web.

Update: I've started a wiki page posing the question “What would be required to create a 3D web with a similar user experience to that of online virtual worlds like Second Life?”. You can log in with username:iwontspam password:ipromise or start a new account. I'd value input.

Nokia N800

I bought my Nokia 770 Internet Tablet in October and (as I thought might happen), Nokia have just released a new version, the N800.

This was reported on digg yesterday, and today I found the official Nokia product page. I expect they will launch it properly at CES this week.

The new model has 128MB of RAM, 256MB of ROM and a VGA webcam, and now supports up to two 2GB SD cards instead of a single RS-MMC card. It's got a revamped design and has now really gone mainstream, becoming part of the NSeries (like my Nokia N70 mobile phone). This is great news because it shows that the Linux-based Maemo operating system is seen as a success.

Maemo is one of the platforms I would like to port Webscope to. The new hardware will be useful if I start getting X3D rendering and video playback running, and will allow video calls as well. There's a long way to go before then though 😛

Open Standards and Free Software are making me OS-agnostic

I use three different operating systems on a daily basis – Windows, Mac and GNU/Linux – yet my data is always the same and I often use the same applications. Here's what I use on a regular basis:

| Task      | Open Standard(s)      | Free Software Application(s) |
|-----------|-----------------------|------------------------------|
| Email     | IMAP                  | Thunderbird                  |
| Calendar  | iCalendar over WebDAV | Mozilla Calendar             |
| Contacts  | LDAP                  | Thunderbird address book     |
| Documents | OpenDocument          | Open Office                  |
| Music     | Ogg & MP3 over HTTP   | VLC Media player             |
| Pictures  | SVG, PNG, JPEG        | The GIMP, Inkscape           |
| Code      | Subversion            | Eclipse & Subclipse          |
| News      | OPML, RSS, Atom       | Thunderbird news reader      |
| Chat      | Jabber & IRC          | Gaim                         |

Sometimes I'll use an OS-specific app if it provides a better experience, but still uses open standards. For example, I use iCal, iChat, iTunes and Address Book on the Mac with iCalendar, Jabber, MP3 and LDAP respectively (I know, MP3 isn't entirely open). I can easily chop and change which application I use, or even use different ones at the same time because my data is stored in such an accessible way.
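Part of what makes these formats so accessible is that most of them are plain text. As a small illustration (not how any of the apps above actually work), even a naive script can read event titles out of an iCalendar file; a real program should use a proper parser, since iCalendar also allows folded lines and property parameters.

```python
# Sketch: pull event summaries out of iCalendar data with a naive
# line-by-line parser, just to show how approachable the open format is.
ICS = """BEGIN:VCALENDAR
BEGIN:VEVENT
SUMMARY:UML lecture
DTSTART:20070115T090000Z
END:VEVENT
BEGIN:VEVENT
SUMMARY:Design lecture
DTSTART:20070116T110000Z
END:VEVENT
END:VCALENDAR"""

def event_summaries(ics_text):
    """Return the SUMMARY value of every event in the calendar text."""
    return [line.split(":", 1)[1]
            for line in ics_text.splitlines()
            if line.startswith("SUMMARY:")]

print(event_summaries(ICS))  # ['UML lecture', 'Design lecture']
```

Try doing that with a proprietary binary calendar file and the value of open formats becomes obvious.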

Although I believe that desktop Linux is very important and Ubuntu is my first choice of OS, what's more important is the open standards it uses for managing data. Free software won't fend off proprietary software by building a better desktop; it will win by making the operating system a user is running almost irrelevant.

What I'm working towards at the moment is hosting all of my data across web servers and having a web application to manage each type of data. That way I can access my data on any device with a web browser – including my phone and Internet tablet. I've had a web server at home for a couple of years now which I store some of my data on.

Web applications I've been using include:

  • Horde IMP
  • GMail
  • PHPiCalendar
  • Google Calendar
  • Google Docs & Spreadsheets
  • Ampache
  • Flickr
  • Gliffy
  • Trac
  • Gregarius
  • Google Reader

Some of these are hosted by companies, some are hosted on my own server, but what's important is that they use the same open standards.

One of the aims of Moya is to create a home server which manages all of these types of data and provides a web interface, making the client operating system irrelevant.

The New Command Line

In the draft specifications for webscope and moya, I mention a “versatile text input box” and “natural language command” respectively. I briefly describe these ideas in the Multimodal Web User Agent design concept.

In light of a Lifehacker article hailing the return of the command line, I thought I would clarify these thoughts further in a new design concept page, Natural Language Command Line.

We are seeing this trend all over the place: a versatile text input element which can be used to find information or carry out actions, with a much more loosely defined syntax than the traditional command line. In my implementation, a user agent accepts user input and turns it into ASCII text to be passed as a query string in an HTTP request. The receiving web server then interprets the user's command and attempts to carry out the requested action. Think of it as combining the address bar and search bar in Firefox, then adding something new.
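A toy version of that round trip might look like the following. The /command path, the command vocabulary and the dispatch-on-a-verb approach are all invented for illustration; a real implementation would need far richer natural-language parsing.

```python
# Sketch: a loosely structured command typed by the user, carried to a
# server as an HTTP query string, then interpreted server-side.
from urllib.parse import urlencode, parse_qs

def build_request(command):
    """User-agent side: wrap the typed command in a query string."""
    return "/command?" + urlencode({"q": command})

def handle_request(path_and_query):
    """Server side: recover the command and dispatch on a leading verb."""
    query = parse_qs(path_and_query.split("?", 1)[1])
    words = query["q"][0].split()
    verb, rest = words[0].lower(), " ".join(words[1:])
    if verb in ("play", "find", "email"):  # hypothetical verbs
        return f"{verb} -> {rest}"
    return f"unknown command: {rest}"

request = build_request("play some jazz")
print(request)                  # /command?q=play+some+jazz
print(handle_request(request))  # play -> some jazz
```

The point is that the "command line" becomes just another web resource, so any device with an HTTP client, from a phone to a 3D headset, can use it.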

Introducing Webscope and Moya

I don't know if anybody noticed but my homepage and Twisted Lemon's homepage have been down over the new year period. We're back up and running now because Moose Computer Services have moved our hosting from the old virtual machine to a new, real server. Hopefully now we've got rid of our noisy neighbours we won't have the problem again 🙂

While the site has been down I've been busy working on Hippygeek, which now has Subversion repositories and Trac projects up and running. (I've been playing with the Subclipse plugin for Eclipse. It's a bit clunky, but a very useful feature.)

In particular I've started two new projects, webscope and moya. I've been planning them both for a couple of years, but I've decided to make the thought process a bit more open in the hope that I'll make some progress towards implementing them. There's no code yet.


“Webscope is a unified interface for managing your information with multiple modes of interaction. It is a web resource manager – a hybrid web browser, web server, media player and window manager replacement.”

I'm hoping to write the front end using XUL, running on XULRunner, but the back end will include lots of other bits including an HTTP server.

You can click the link above for more information, or see the draft specification and UI mockups (which I created in Inkscape).

Webscope is an implementation of design concepts from my web site, including:


“Moya is software for a home information appliance, a central computer for the home. Features will include a media centre, social software and home automation with a minimalistic and multimodal web interface.”

Moya will be a combination of lots of existing projects in lots of different programming languages, loosely coupled with APIs. Any new components will probably be written in Python, which I'm learning at the moment.

You can click the link above for more information.

Moya is an implementation of some design concepts on my web site, including: