
Active Code and Web Services - A Speculation

April 30, 2003 - Edited March 31, 2004 - A note regarding the potential for Web Services (.NET, etc.) to replace functions and services currently included in tightly linked MS operating system and networking software, and thus to increase the security and stability of both networks and "client" computers.
http://www.jimpvonka.com/jimstech/activweb.html


Let us begin with a useful example of a Web Service which relies on Active Code on the user's machine, that is, the client machine.  By clicking on the Active Scan icon to the right, you can invoke a scan of your machine.  It is a good facility, and a good way to experience what "Web Services" are all about.

Panda Software's Active Scan is a free web tool for detecting and eliminating viruses from client machines on the web.  However, it relies on the use of Microsoft ActiveX programming features, and so cannot be used in any web browser other than MS Internet Explorer.  (I do not use Internet Explorer except for special situations like Active Scan and Microsoft's Windows Update facility, and a few web sites which are not compliant with web (W3C) standards and cannot be accessed through standards-compliant browsers.)
Panda ActiveScan - On-line Virus Check

Another example is available at tax time.  In 2004, I used web services to file both state and federal income tax returns on the web.  That is, I did not install any tax preparation software package on my computer, but instead accessed web sites which permitted me to prepare my returns at the web site.  These web-enabled tax return preparation facilities did not rely on ActiveX components, so I was able to use my standards-compliant web browser, Mozilla, to access the sites and prepare my returns.  My returns are not complex, and I found the use of web-enabled preparation tools more convenient than purchasing, installing, and updating software on my computer.


I may have this backwards, or be completely wrong.  What follows is relatively uninformed speculation: notes I want to get down so that I can follow up with some research.

One of Microsoft's irredeemable errors in software architecture was to design as if the local machine and the rest of the world, the network, could safely be parts of a seamless fabric, freely exchanging and having access to information.  This error originated in the history of networking as a controlled environment, under professional administration, with extensive technical and organizational facilities supporting the security and integrity of the network.  In other words, the network was a unified technical system, with security methods built in and managed by technical networking security experts, and it existed in an organizational context where compliance with security measures was a condition of employment and could be managed and enforced by organizational management.  Extending software architectures designed for such an environment to the technically open TCP/IP internet, which lies outside the framework of even national management except in authoritarian nations, was completely inappropriate.

Operating system features that tightly bind the OS to networking functions, and that make the nature of those bindings nearly invisible to and unmanageable by computer users, have made the computer, for individual users linked to the internet, the "enemy in the living room".  These features, appropriate to a tightly managed, organizationally controlled network environment, make the home and small business computer outside such an environment vulnerable to attack, invasion, control, and destruction by people connected to the internet anywhere in the world.  And those people do not have to be terribly sophisticated to do those things.

In fact, it probably takes more sophistication to defend a home or small business computer from attack than it does to successfully attack one which is undefended.  The same features that affect individual users in this way also make life at work much harder for the network security people responsible for organizational networks.  When these networks are opened to the internet, the failings in the basic architectural concepts of the linked OS and network environment void the presumptions on which the security of the network is based: tight technical and organizational controls over the network and its users.

Correcting the problem is a major undertaking, requiring the redesign of the core architecture linking Microsoft's local Operating System to its networking facilities.  And such a redesign cannot be successfully undertaken unless an alternative means of delivering the services now provided through the tight linkage of these two functions can be designed and built, to replace what must be removed from the tightly linked environment in order to establish an appropriately modular OS-network relationship supporting secure services.

One part of the problem is the need to expose, and make directly and simply manageable, all networking-related functions tied to the Operating System.  By this I mean removing them as integral components and making them modules whose interactions with the OS are separately managed and controlled by the user, through simple and well-distinguished software interfaces.
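As a rough illustration of what I mean by "separately managed and controlled by the user", here is a small sketch in Python.  It is entirely hypothetical: the module names, the policy table, and the connection helper are my own inventions, not features of any real operating system.  The point is only that a networking function would have to be explicitly enabled by the user, in a plainly visible place, before it could touch the network at all.

# Hypothetical sketch only: the module names, the POLICY table, and the
# open_connection() helper are inventions for illustration, not features
# of any real operating system.  Every networking module stays disabled
# unless the user has explicitly turned it on.
import socket

POLICY = {
    "file_sharing": {"enabled": False},
    "remote_admin": {"enabled": False},
    "web_browsing": {"enabled": True, "allowed_ports": [80, 443]},
}

def open_connection(module, host, port):
    """Refuse any connection the user has not explicitly permitted."""
    rules = POLICY.get(module)
    if rules is None or not rules["enabled"]:
        raise PermissionError("networking module '%s' is disabled by the user" % module)
    if "allowed_ports" in rules and port not in rules["allowed_ports"]:
        raise PermissionError("port %d is not permitted for '%s'" % (port, module))
    return socket.create_connection((host, port))

# open_connection("web_browsing", "example.com", 80)    # permitted by the user's policy
# open_connection("remote_admin", "example.com", 3389)  # refused: module disabled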

But apart from that need, there is the issue of how to provide the services which the OS currently supports through what are known as "active components".  These are typified in my mind (and here I am out of my depth completely) by so-called "ActiveX" components and by active content in documents and files originated and operated on ("run") by some programs: Visual Basic Scripting in MS Word and other MS Office programs, with Outlook and Outlook Express being common offenders.  These features have been the source of millions, if not billions, of dollars in lost time and property due to attacks from outside the controlled environment which exploit their inherent features - features which constitute a failure of design on a massive scale in the context of internet-connected machines.

The issue presently before me is this:  Can a form of Web Services be designed which provides the facilities needed for communications between applications across the internet, without relying on inherently insecure methods and facilities active on local machines?  Such an approach to communication among applications on local machines would eliminate the need to transmit active content, potentially dangerous and damaging, between machines.  The functions and services currently provided by active content would be performed under the supervision of the Web Services tools and facilities on the web, not on the local machine.  "Web Applications" would run on tightly managed servers under the control of the organizations providing services (such as when the State of Kansas permits me to prepare my state income tax return on the Web using facilities on its servers, not on my computer, which serves only as a communications and data entry terminal for this purpose).

In this frame, the objective is the elimination of active code on the client side, no intrusion of active code from web sites onto client machines, and the processing of data and applications on servers - NOT on the client or local machine.  I wonder if such a model is technically feasible, and if so, whether anyone is working on building it.
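To make the model concrete, here is a minimal sketch of such a server-side web application, written in Python.  Everything in it is a placeholder of my own: the "income" field, the flat 5% rate, and the page text are invented, and a real tax service would be far more elaborate.  What matters is the shape of the thing: the client receives only plain HTML and submits a form; every calculation happens on the server under the provider's control; and no code of any kind is sent to, or run on, the client machine.

# Minimal sketch, using only Python's standard library.  The form field,
# the flat 5% rate, and the page text are placeholders for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

FORM_PAGE = (b"<html><body><form method='post' action='/'>"
             b"Income: <input name='income'> "
             b"<input type='submit' value='Compute tax'>"
             b"</form></body></html>")

class ServerSideApp(BaseHTTPRequestHandler):
    def do_GET(self):
        # The client gets a plain HTML form: no scripts, no active content.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(FORM_PAGE)

    def do_POST(self):
        # All processing happens here, on the server, never on the client.
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode())
        income = float(fields.get("income", ["0"])[0] or 0)
        tax = round(income * 0.05, 2)  # placeholder flat rate
        page = ("<html><body>Estimated tax: %.2f</body></html>" % tax).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page)

if __name__ == "__main__":
    HTTPServer(("", 8000), ServerSideApp).serve_forever()

In this sketch the local machine really is nothing more than a communications and data entry terminal, which is exactly the role I would like it to have.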

I fear that there is currently a trend in software and network design to further break down the wall between code on the client machine and code on networks, including the internet, and that this trend is turning the computer and other 'web enabled' devices in our space into unwanted and unwelcome intruders, and spies.  Let us be clear that such intrusion is NOT necessary to gain the benefits of using and interacting with the rest of the world through the internet.  It is an artifact of the current model of local, client machine interaction with networks.  And other models, I believe, exist and are vastly preferable.