
Dumb, Thick, Thin – What Next?

Once upon a time, software professionals worked on dumb terminals. This probably evokes a lot of bad memories, a feeling of lack of control and a character-based user interface. The terminal was called dumb because it had no power of its own – no memory, no processor, no storage. Everything was communicated back to the server, creating a heavy dependency on it. This architecture called for a control mechanism at the server – centralised control. The user sitting at the dumb terminal had no control whatsoever.

Then came the microcomputer revolution, which brought the computer to the common man and resulted in the personal computer. With improvements in silicon technology, hardware became both available and affordable. This ushered in the era of desktops and the multitude of software that would run on them. Desktop operating systems put the control back in the hands of the user. The different software packages, the look and feel, and most of all the freedom to work without depending on another computer made the user feel powerful. Let's call these applications thick clients. This, coupled with networking, expanded the experience from a single computer to many, with multi-user software applications being developed. Advances in display monitors allowed more complex, usable and rich user interfaces, making computing all the more appealing.

As software technologies evolved, thick clients became more and more complicated. Along with the power came responsibilities and frustrations for the user. There were multiple versions of the same software package, and installations and upgrades caused sleepless nights. The terms “dll hell” and “rpm hell”, essentially caused by incompatible dependencies, were coined. The onus was on the thick client user to set the software up properly and use it correctly. The software vendors could not help in cases where the thick clients were not configured properly or were not being used in the right way. An additional disadvantage of installed software was that it could not be used by users on the move, a requirement for a lot of businesses. The software applications and information were not available from anywhere; usage was locked to that specific computer.

When the Internet came in, it was a revolution that changed not only software but eventually, through the browser, the life of the man on the street. The browser became the most important software application on the computer. The Internet allowed 24/7, access-from-anywhere power, freeing users to move around. Software technologies focused on delivering software over the Internet. This required no installation or setup by the user and eliminated the dll/rpm hells from individual machines. Hence these were called thin clients. Only certain computers had to be maintained and controlled – the web servers. The web servers encouraged hosted applications, which would be maintained by professionals and guaranteed 99.9% uptime. The user would not require anything more than a browser to take advantage of the software applications.
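
(For perspective, even a 99.9% uptime guarantee still permits about 0.001 × 365 × 24 ≈ 8.76 hours of downtime per year.)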

Today, almost every desktop functionality is available on the Internet, including word processing, which demands a rich user interface. The focus of software technologies today is Web 2.0 – making web applications more usable through technologies like AJAX. Users can not only access information from anywhere, but also share it with anyone around the world, search for it, and organize personal information – live life on the Internet.
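
To make the idea concrete, here is a minimal sketch of the AJAX pattern in TypeScript – the /api/articles endpoint and the "articles" element are hypothetical, not taken from any particular application. The point is that the browser asks the server for data in the background and updates part of the page in place, instead of reloading the whole document:

    // Minimal AJAX sketch: request data asynchronously and update the page
    // in place, with no full reload. The URL and element id are made up.
    function loadArticles(): void {
        const request = new XMLHttpRequest();
        request.open("GET", "/api/articles");
        request.onload = () => {
            if (request.status === 200) {
                const target = document.getElementById("articles");
                if (target) {
                    // Only this element changes; the rest of the page stays put.
                    target.textContent = request.responseText;
                }
            }
        };
        request.send();
    }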

This trend, though, is reviving the thin client scenario that had been employed in the mainframe days. The days of central control of configuration and customization, and of dependency on another computer to work, have returned. Not only that, personal computers now also have to be kept up to date with new technologies and innumerable plugins, along with the web servers, to use software applications. Otherwise you might see messages like “Your browser does not support …” and end up unable to use the software application. This has brought in a series of incompatibilities and innumerable software vendors who keep trying to one-up the competition. In the end, the user has to spend money upgrading to the latest technologies, whether they are required or not.
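
Those “Your browser does not support …” messages typically come from a capability check like the sketch below (TypeScript; localStorage is just an example feature) – the application probes the browser before using a feature and falls back, or gives up, when it is missing:

    // Sketch of browser feature detection: probe for a capability before
    // using it, and degrade (or complain) when it is absent.
    function supportsLocalStorage(): boolean {
        try {
            return typeof window.localStorage !== "undefined";
        } catch {
            // Some browsers throw on access when storage is disabled.
            return false;
        }
    }

    if (!supportsLocalStorage()) {
        // The kind of check behind "Your browser does not support ..." banners.
        alert("Your browser does not support local storage.");
    }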

The thin client is not exactly the same as the dumb terminal. Users can still customize their browser settings to their liking. But the control is still with the web server, and hence the dependency on it. One can argue that software developers and users now have a choice between thick and thin clients, or a combination. However, in reality, how many standards or technologies are geared towards thick clients?

Where do we go from here? Should we go back to the mainframe days, where at least the user did not have to upgrade the dumb terminal? Wasn't the dumb terminal a thin client with even the operating system hosted on the server? Is there a way to strike a balance and use both the thin client and thick client architectures? Or will there be a new architecture for the future?


Copyright Abhijit Nadgouda.
