Here is a question/meditation I threw together at work, but I thought I would put it to a wider audience:

Why is it that when we use a desktop application, we accept that we may need a manual, that we may need training, and that practice makes perfect?

Yet when we use a web application, we expect it to be obvious. In fact, the more complicated the operation, the easier we expect it to be.

We frequently accept that we cannot make software idiot-proof, yet we attempt to do exactly that when we present an interface through HTML.

The underlying application is probably no different, yet by using a web browser we automatically assume a different mindset.

More often than not we succeed to some degree, but the very lack of this training, these manuals, and this user help means that when a user makes a mistake or makes a wrong assumption about the software's behaviour, it becomes the developer's fault.

Is it not the case that, for complex intranet applications, it is fair to expect the user to complete training before becoming a qualified end user?

Of course you should test, debug, and usability-test, but even then, is there not a balance between testing for usability and accepting that the user must have at least *some* foreknowledge before being able to understand the application you have built?

Just some thoughts I'm having based on work I'm doing :)

Simon