Synopsis

Much of the established thinking on future information networks and working is constrained by, or at least incremental on, today's experience and practice. For example, 'the copper mind set' would have us believe that bandwidth, or more accurately bit transport capability, is a scarce and expensive commodity. However, optical fibre transmission has freed us from that constraint. We might also be led to the conviction that protocols such as X.25 are necessary, right and proper. Not so. They were designed for the noisy and unreliable copper cable networks of the 1970s. They are now outdated and outmoded - with frame relay their dying ember. Network and service management systems currently adopt a centralised approach which brings with it the baggage of large unwieldy software systems - also a vestige of an inflexible hardwired past. Advances in open distributed computing and artificial intelligence are providing superior alternatives for modern flexible information networking. In this paper we explore these new freedoms and give our personal views on the developments which might have an impact on our IT lives in the next decade.

Introduction

Our exploration of the future begins by reviewing the underlying trends in the support technologies, focusing on the rapid change in capability. We then consider the problems of scale and complexity that are now becoming severe in established networks. The need to tame these two problems is urgent and is both the second driver and a primary limiting factor. Interaction between business practice, home life and entertainment in the evolving information systems world is also seen as a further driver of change. After a critical look at today's software we indicate how various Artificial Intelligence (AI) and distributed system techniques will make a dramatic impact on future systems. The final sections give a prognosis for a global information system with a seamless integration of currently separate technologies providing intelligent support to people.

Technological Trends

Computing

Since their general introduction in the early 1980s, Personal Computers (PCs) have now completely populated business life in the western world. Many technical professionals now own two machines: a desk top and a lap top. This has arisen through the utility of being able to 'carry' the office to meeting places, and through performance/price incentives. To some extent PCs have replaced the traditional role of clerical assistants, drawing boards and filing cabinets. In future they may well simulate much of our laboratory, test and modelling space - and replace the human secretary. Today they provide a diary function, store documents, undertake word processing and interconnect their user with the global messaging systems. Impressive as these capabilities are, they still demand a high degree of skill on the part of the user. This need for users of modern IT to learn the complexities of their systems in order to unlock the potential of technology is an impediment to the mature, but less of a problem for the young. Even though we can be generally critical of today's technological packaging and user unfriendliness, it must be said that the performance is truly impressive. PCs currently deliver a hardware line-up which has echoes of yesteryears' supercomputers: 25 - 160 MHz clock; 64 bit processor; 32 Mbyte RAM; 1 Gbyte disk; CD, video and hi-fi sound. All this at affordable and reducing prices.
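The pace of this improvement is easy to underestimate. As a purely illustrative sketch of the kind of extrapolation used in the next paragraph - with a 1995 baseline of roughly 100 MIPS and a doubling of performance/price about every three years, both figures being our own assumptions rather than measurements - a few lines of code reproduce the sort of projection quoted below:

```python
# Illustrative only: project PC processing power forward by compound doubling.
# The baseline (about 100 MIPS in 1995) and the doubling period (about three
# years) are assumptions made for the sake of the example, not measured figures.

BASE_YEAR = 1995
BASE_MIPS = 100.0            # assumed typical high-end PC in 1995
DOUBLING_PERIOD_YEARS = 3.0  # assumed performance/price doubling period

def projected_mips(year: int) -> float:
    """Return the projected PC performance for a given year."""
    elapsed = year - BASE_YEAR
    return BASE_MIPS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for year in (1995, 2000, 2005):
        print(f"{year}: ~{projected_mips(year):,.0f} MIPS")
```

Under these assumptions the 2005 figure comes out at roughly 1000 MIPS, the order of magnitude suggested in the next paragraph; the point is not the precise numbers but how relentlessly a fixed doubling period compounds.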
At the supercomputer end of the spectrum the computing power is truly staggering. Specialised numerical processing machines operate at tens of GFLOPS and general purpose machines have arrays of hundreds of processors. The forces that have shaped the computer market are a combination of competition, application and technology. Chances are that the performance/price ratio will continue its upward trend for at least the next 10, and probably 20, years. Such extrapolation tells us that in 2005 PCs will be delivering c.1000 MIPS and supercomputers exceeding 10 TFLOPS. Given this remarkable advance in hardware ability, it is interesting to contemplate the relatively static situation in software. If only it were simpler, more reliable, understandable and low cost. Why not? We have yet to realise the analogous abstractions of the physical sciences that spurred our fundamental understanding of hardware. We have also failed to create the evolutionary environments that would produce fit-for-purpose solutions. We are like the Romans building bridges - we have methodologies, but no true understanding.

Networking

The widespread introduction of digital transmission and switching in the 1980s paved the way for the integration of voice and data transmission. Corporate users are now able to specify high rate digital circuits which co-exist with the telephony infrastructure. The last few years have seen the penetration of ISDN into small businesses and homes, thereby bringing the benefits of digital transmission to a wider user group. The copper pair local loop is now being stretched to the limit with the introduction of Video on Demand (VoD), which probably represents the maximum viable information transfer rate achievable. Optical fibre is the only technology that provides true future proofing for network operators, who have now accumulated over 20 years' experience with these systems. Operation at 2.4 Gbit/s is now commonplace, and 10 Gbit/s is readily achievable. Current research exploits the non-linear properties of fibres, with soliton transmission systems offering the prospect of 1 Tbit/s systems within the decade. These impressive transmission rates are not accompanied by correspondingly high system costs - quite the reverse. Each time the bit rate is quadrupled, the cost per bit transported falls by a factor of three. Further, the use of all-optical amplifiers has extended the reach of optical systems without the attendant cost of optical/electronic conversion. The technological capability now exists for all-optical fibre networks - no electronics, no limits, total transparency. The extension of such networks into business premises and homes will provide the bandwidth to meet the demands well beyond 2000. Although the integration of computing, switching and transmission technologies will provide the physical components required for future information systems, there remains the problem of software: both the software needed to run and maintain the network and, more importantly for the end users, the software for applications.

System Complexity and Software

Network and service providers have responded to these problems by installing ever more complex management systems. It is now commonplace for major switching sites, or line transmission equipment, to have their own built-in management and control system. This typically provides a means for managing the configuration of the equipment and dealing with hardware and software faults.
In order to provide centralised communications management these integral systems are connected to management centres. The development of large software systems is a very demanding enterprise with a high risk of failure. This failure can manifest itself as the inability of a system to meet its specification (a development failure), or as an in-service failure where the system falls short of its performance objectives. Therefore, as the scale and complexity of the networks grow, the risk of a major failure through a malfunction of the management system software increases significantly. If the traditional centralised approach to system software continues into the coming information age, the likelihood of major failure becomes unacceptably high. At the centre of the problem there exist twin difficulties: the inability of the software community to build highly reliable large systems, and the difficulty people experience analysing large data sets. Fortunately, current systems and AI research strongly indicates that solutions to these problems will be forthcoming.

More Data - Less Information

"We are increasingly data rich and information poor"

As companies rightsize, fragment and outsource more of their non-critical functions, employ fewer people, use more contract staff, and operate in a faster and more chaotic environment, the need for accurate and timely information will increase. Without access to good data, and the ability to turn it into good information, companies will fail to meet the challenges of the future. We as individuals will increasingly be asked to do more in a shorter time, and intelligent access to, and intelligent manipulation of, data will be increasingly vital. For most of us the removal of the PC and/or lap top would see our output drop to a fraction of what we currently achieve. In short, we are unable to operate without this computational power. PC technology is rapidly becoming our third lobe - an augmentation of the cerebral processor with a silicon surrogate. What it lacks is real intelligence - and that is what is required if it is to help us cope with the 21st century world of instant information: no delay, no time to contemplate, only rapid response.

New Approaches

Networks

Communications networks are by their very nature distributed systems, so it is questionable why their management has been centralised. An analysis of the history of network management indicates a number of principal forces:
Whilst a centralised approach enables skilled people to handle large volumes of network incidents, which leads directly to reduced manning levels, the amount of data collected is proportional to N², where N is the number of active nodes. This leads to GBytes of data per day being generated for the largest networks. An alternative, and new, philosophy is that by distributing management systems and increasing automation through AI, significant reliability and cost savings can be achieved. This is augmented by the progressive slimming down of network hardware through the advantages afforded by fibre and fewer network nodes. For example, a copper and radio based network of 6000 switches can be replaced with fewer than 100 switches in a fibre based equivalent. During the last decade research has demonstrated clearly that expert systems can automate many network management functions. Both rule and model based expert systems can diagnose faults on equipment and networks to a very high degree of precision. In addition, expert systems are making inroads into areas such as network planning and system design. Research into distributed network management clearly shows that distributed automation is feasible. In fact the kernel of the AI systems which power distributed management has been given a name - agents. The technical press have recently latched onto this name and are using it for any localised software system which does not reside in a mainframe. In our view the term agent should be reserved for a particular kind of AI system - one where negotiation is a central part of the software functionality and one where the agent can say no to a request.

Agent Definition
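As a minimal sketch of the definition above - an agent as a piece of software for which negotiation is central and which is free to say no - consider the following illustration. The class, its fields and its tariff logic are our own invention for the purpose of the example; they are not taken from any particular agent framework.

```python
# Minimal illustration of the agent notion used in this paper: a software
# component for which negotiation is central and which can say "no".
# All names and figures here are illustrative assumptions, not a standard API.

from dataclasses import dataclass

@dataclass
class Request:
    task: str
    effort_minutes: float     # estimated work the task needs
    offered_credit: float     # electronic credit offered for the work

@dataclass
class Reply:
    accepted: bool
    message: str
    counter_offer: float | None = None   # credit the agent would accept instead

class ManagementAgent:
    """A network management agent that negotiates and may refuse a request."""

    def __init__(self, name: str, spare_minutes: float, tariff_per_minute: float):
        self.name = name
        self.spare_minutes = spare_minutes            # effort this agent has left to sell
        self.tariff_per_minute = tariff_per_minute    # its going rate

    def negotiate(self, request: Request) -> Reply:
        # Refuse outright if the task exceeds the agent's remaining capacity.
        if request.effort_minutes > self.spare_minutes:
            return Reply(False, f"{self.name}: no capacity for '{request.task}'")
        price = request.effort_minutes * self.tariff_per_minute
        # Counter-offer rather than simply obey if the credit offered is too low.
        if request.offered_credit < price:
            return Reply(False, f"{self.name}: credit too low for '{request.task}'",
                         counter_offer=price)
        # Otherwise accept and commit the resources.
        self.spare_minutes -= request.effort_minutes
        return Reply(True, f"{self.name}: '{request.task}' accepted")

if __name__ == "__main__":
    agent = ManagementAgent("fault-diagnosis", spare_minutes=30, tariff_per_minute=2.0)
    print(agent.negotiate(Request("diagnose line card", 10, offered_credit=25)))  # accepted
    print(agent.negotiate(Request("correlate alarms", 10, offered_credit=5)))     # counter-offer
    print(agent.negotiate(Request("replan routing", 60, offered_credit=500)))     # refused
```

The essential feature is that the agent weighs each request against its own resources and tariff rather than simply executing it - the property which, in our view, distinguishes an agent from a mere remote procedure.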
Because agents are complete software systems that conduct a specialised task, they are ideal candidates for distributed systems. Furthermore, careful design of agent software could well allow significant reuse - in effect agent software could become an off-the-shelf product which will help to contain the software mountain.

Distributed Agent Systems

This network of distributed management agents co-operates and negotiates to provide the service (point to point video telephony) or information product (the latest blockbuster film) for the customer.

Mobile Agents

When a mobile agent has been received on a host computer it will be validated as an acceptable agent and allocated an amount of processing power. Additionally it will be given any necessary access to local databases etc. in order to accumulate the information necessary for its activities. Naturally, whilst resident on a foreign host the mobile agent is consuming resources, so it will have to carry electronic credit to pay its way. Upon completion of the set task - gathering and analysing information on company reports, for instance - the agent will be transmitted to the reporting point to download the information.

Applications and Services

21st Century Information Services and Products

Home
Work
The distinction between home services and business applications will become increasingly blurred as home working becomes more commonplace. The range of information channels and the diversity of entertainment sources available will provide significant opportunities for the service providers. Just as the end users will need agent technology to assist them to locate, negotiate and route information, the service providers will need to develop agent systems which assist the targeting of audiences and users. Ideally the service provider systems should build up a profile of every user connected. Such profiling systems must have the ability to track the changing interests and habits of users and anticipate their future needs. They can then become proactive providers of new information products.

Internet - The Bizarre Bazaar
With all these limitations, why is the Internet such a roaring success? Right now it is almost the only thing available of its type. So what should we look forward to?

Infonet - The Supermarket

In the information world the most common analogy to shopping is the library. Shelves of books, journals, magazines and papers in some regular order according to subject and author. We know exactly what we are looking for, and just by chance we find an interesting book that we didn't know about, or feel we will need at some future date. If only we could replicate this environment in the electronic world of information the advantages would be enormous. Consider the potential benefits:
21st Century Information Systems

The Chairman's Request

Today - you would need a team of several people and it would probably take several weeks to complete. In the future - he needs the answer within the hour, and this move will impact on your own company's stock position - so what do you do? There is no help - you are on your own.

Action 1:
Action 2:
Action 3:
Action 4:

As the user this was your view of the operation, but out in the network and the libraries the action was frenetic. The routing algorithms had ensured the smooth transmission of your agents, avoiding cable and electronic failures. The best deal on tariffs had been negotiated given the time limit of 10 minutes - and the files are massive - this has saved a little money - but it is for the chairman. If you had been given a day, then the files could have been held back until a low cost slot was available. The carrier agents had bid to carry your files, but they were under a lot of pressure because of the ten minute window - it cost. The information servers' agents made sure your agents got priority - but again it cost. But at least you have your information on time. In the next 50 minutes your Darwinian programme takes all the data and evolves a series of likely scenarios. These are in no way optimised, but they are fit for purpose, and they do paint a likely picture. This you present to the chairman as an animated 3D colour graphic of options - he can now make a decision.

The above may appear too futuristic, but almost every element has been demonstrated in isolation.

A Typical Day in 2010
Final Remarks

Inevitably there will be people who believe we have gone too far along the technical futures roadmap. We do not agree. All the ideas we have used are in existence now. True, we have extrapolated the scope and power of the agent paradigm, but to no greater extent than the performance increases we have witnessed in similar technology areas. The greatest challenges facing the new information industry in delivering the promises held out by the technology will be system integration and software maturity. There will have to be a concerted effort on the part of the telecommunications operators, computer vendors and the many specialist suppliers of equipment and software to create the information platform which will serve us all in the 21st century.