
Synopsis
Much of the established thinking on future information networks and working is constrained by, or at least incremental on, today's experience and practice. For example, 'the copper mind set' would have us believe that bandwidth, or more accurately bit transport capability, is a scarce and expensive commodity. However, optical fibre transmission has freed us from that constraint. We might also be led to the conviction that protocols such as X.25 are necessary, right and proper. Not so. They were designed for the noisy and unreliable copper cable networks of the 1970s. They are now outdated and outmoded - with frame relay their dying ember. Network and service management systems currently adopt a centralised approach which brings with it the baggage of large unwieldy software systems - also a vestige of an inflexible hardwired past. Advances in open distributed computing and artificial intelligence are providing superior alternatives for modern flexible information networking. In this paper we explore these new freedoms and give our personal views on the developments which might have an impact on our IT lives in the next decade.

Introduction
The integration of technologies, applications and services on an unprecedented scale is almost certain as telecommunications and computing converge. A key driver will be the migration of computer power to the periphery of networks resulting in major changes in business practice. A further significant force will be the increasing number of information, entertainment and other services for the office and home. In contrast to today, the degree of control afforded to the user by the providers of services and networks will have to be almost total. No one will wait.

Our exploration of the future begins by reviewing the underlying trends in the support technologies focusing on the rapid change in capability. We then consider the problems of scale and complexity that are now becoming severe in established networks. The need to tame these two problems is urgent and is both the second driver and a primary limiting factor. Interaction between business practice, home life and entertainment in the evolving information systems world is also seen as a further driver of change.

After a critical look at today's software we indicate how various Artificial Intelligence (AI) and distributed system techniques will make a dramatic impact on future systems. The final sections give a prognosis for a global information system with a seamless integration of currently separate technologies providing intelligent support to people.

Technological Trends


Computing
Since their general introduction in the early 1980s Personal Computers (PCs) have now completely populated business life in the western world. Many technical professionals now own two machines: a desktop and a laptop. This has arisen through the utility of being able to 'carry' the office to meeting places and through performance/price incentives. To some extent PCs have replaced the traditional role of clerical assistants, drawing boards and filing cabinets. In future they may well simulate much of our laboratory, test and modelling space - and replace the human secretary. Today they provide a diary function, store documents, undertake word processing and interconnect their user with global messaging systems. Impressive as these capabilities are, they still demand a high degree of skill on the part of the user. This need for users of modern IT to learn the complexities of their systems in order to unlock the potential of the technology is an impediment to the mature, but less of a problem for the young.

Even though we can be generally critical of today's technological packaging and user unfriendliness, it must be said that the performance is truly impressive. PCs currently deliver a hardware line-up which has echoes of yesteryear's supercomputers: 25 - 160 MHz clock; 64 bit processor; 32 Mbyte RAM; 1 Gbyte disk; CD, video and hi-fi sound. All this at affordable and reducing prices. At the supercomputer end of the spectrum the computing power is truly staggering. Specialised numerical processing machines operate at tens of GFLOPS and general purpose machines have arrays of hundreds of processors.

The forces that have shaped the computer market are a combination of competition, application and technology. Chances are that the performance/price ratio will continue its upward trend for at least the next 10, and probably 20, years. Such extrapolation tells us that in 2005 PCs will be delivering around 1000 MIPS and supercomputers exceeding 10 TFLOPS.
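The arithmetic behind such an extrapolation is simple compound growth. The sketch below is purely illustrative: the base figure and doubling period are our assumptions, chosen only to match the trend quoted above.

```python
def extrapolate(base_perf: float, years: float, doubling_period: float) -> float:
    """Project performance that doubles every `doubling_period` years."""
    return base_perf * 2 ** (years / doubling_period)

# Illustrative figures only: 100 MIPS doubling every ~3 years reaches
# roughly 1000 MIPS after a decade, matching the trend quoted above.
print(round(extrapolate(100, 10, 3)))
```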

Given this remarkable advance in hardware capability, it is interesting to contemplate the relatively static situation in software. If only it were simpler, more reliable, more understandable and lower cost. Why not? We have yet to realise the analogous abstractions of the physical sciences that spawned our fundamental understanding of hardware. We have also failed to create the evolutionary environments that would produce fit-for-purpose solutions. We are like the Romans building bridges - we have methodologies, but no true understanding.

Networking
The same technical and commercial forces that have shaped the computer scene have also been at work on telecommunications. Digital transmission and switching now provide near perfect voice communication across most of the globe. Mobile systems allow people to enjoy unprecedented freedom in where they access networks. Satellite technology will see mobile telephony interconnection achieved whilst in the air or at sea. Complementing this impressive world-wide voice network, a number of voice activated services are available: home banking, database access, messaging and control systems.

The widespread introduction of digital transmission and switching in the 1980s paved the way for the integration of voice and data transmission. Corporate users are now able to specify high rate digital circuits which co-exist with the telephony infrastructure. The last few years have seen the penetration of ISDN into small businesses and homes thereby bringing the benefits of digital transmission to a wider user group.

The copper pair local loop is now being stretched to the limit with the introduction of Video on Demand (VoD), which probably represents the maximum viable information transfer rate achievable. Optical fibre is the only technology that provides true future proofing for network operators, who have now accumulated over 20 years' experience with these systems. Operation at 2.4 Gbit/s is now commonplace, and 10 Gbit/s is readily achievable. Current research exploits the non-linear properties of fibres, with soliton transmission systems offering the prospect of 1 Tbit/s systems within the decade. These impressive transmission rates are not accompanied by corresponding high system costs - quite the reverse. Each time the bit rate is quadrupled the cost per bit transported falls by a factor of three. Further, the use of all-optical amplifiers has extended the reach of optical systems without the attendant cost of optical/electronic conversion. The technological capability now exists for all-optical fibre networks - no electronics, no limits, total transparency. The extension of such networks into business premises and homes will provide the bandwidth to meet the demands well beyond 2000.
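The cost rule quoted above can be expressed as a one-line calculation. This sketch is illustrative only; the function name and figures are ours, not drawn from any operator's cost model.

```python
import math

def cost_per_bit(base_cost: float, base_rate: float, new_rate: float) -> float:
    """Apply the empirical rule quoted above: each quadrupling of the
    bit rate cuts the cost per transported bit by a factor of three."""
    quadruplings = math.log(new_rate / base_rate, 4)
    return base_cost / (3 ** quadruplings)

# One quadrupling, from 2.4 Gbit/s to 9.6 Gbit/s, cuts the cost
# per transported bit to a third of its previous value.
print(cost_per_bit(1.0, 2.4, 9.6))
```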

Although the integration of computing, switching and transmission technologies will provide the physical components required for future information systems, there remains the problem of software: both the software needed to run and maintain the network and, more importantly for end users, the software for applications.

System Complexity and Software
The steady increase in services and features offered to customers has been accompanied by a rapid build up of system complexity. This has occurred through the sheer quantity of transmission and switching equipment required to provide global communications, the increasing number of carriers - all using different systems - and the necessity to maintain 24-hour service.

Network and service providers have responded to these problems by installing ever more complex management systems. It is now commonplace for major switching sites, or line transmission equipment, to have their own built in management and control system. This typically provides a means for managing the configuration of the equipment and dealing with hardware and software faults. In order to provide centralised communications management these integral systems are connected to management centres.

The development of large software systems is a very demanding enterprise with a high risk of failure. This failure can manifest itself as the inability of systems to meet specifications (a development failure) or an in service failure where the system falls short of its performance objectives. Therefore as the scale and complexity of the networks grow the risk of a major failure through a malfunction of the management system software increases significantly. If the traditional centralised approach to system software continues into the coming information age the likelihood of major failure becomes unacceptably high.

At the centre of the problem there exist twin difficulties: the inability of the software community to build highly reliable large systems and the difficulty people experience analysing large data sets. Fortunately, current systems and AI research strongly indicates that solutions to these problems will be forthcoming.

More Data - Less Information
The exponential growth of network management information over the past few decades is mirrored in the impact of system complexity on individuals. In professional life it is not unusual to spend 85% of your time trying to find information, 10% putting it into the right format, and only 5% making the critical decision.

"We are increasingly data rich and information poor"

As companies rightsize, fragment, and outsource more of their non critical functions, employ fewer people, use more contract staff, and operate in a faster and more chaotic environment, the need for accurate and timely information will increase. Without access to good data, and the ability to turn it into good information, companies will fail to meet the challenges of the future. We as individuals will increasingly be asked to do more in a shorter time and intelligent access and intelligent manipulation of data will be increasingly vital.

For most of us the removal of the PC and/or lap top would see our output drop to a fraction of what we currently achieve. In short, we are unable to operate without this computational power. PC technology is rapidly becoming our third lobe - an augmentation of the cerebral processor with a silicon surrogate. What it lacks is real intelligence - and that is what is required if it is to help us cope with the 21st century world of instant information, no delay, no time to contemplate, only rapid response.

New Approaches


Networks
Communications networks by their very nature are distributed systems. Therefore it is questionable why management has been centralised. An analysis of the history of network management indicates a number of principal forces:
  • The desire to reduce manning levels through centralising specialist skills.
  • An influx of computer network mindsets into the telecommunications industry that did not appreciate the differences between the two regimes.
  • A mindset that said: if it changes, measure, record and centrally analyse later.
  • A lack of expertise and understanding of distributed systems.

Whilst a centralised approach enables skilled people to handle large volumes of network incidents, which leads directly to reduced manning levels, the amount of data collected is proportional to N², where N is the number of active nodes. This leads to gigabytes of data per day being generated for the largest networks. An alternative, and new, philosophy is that by distributing management systems and increasing automation through AI, significant reliability and cost savings can be achieved. This is augmented by the progressive slimming down of network hardware through the advantages afforded by fibre and fewer network nodes. For example, a copper and radio based network of 6000 switches can be replaced with fewer than 100 switches in a fibre based equivalent.
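The N² growth in collected data can be made concrete with a back-of-envelope calculation. The bytes-per-pair figure below is an illustrative assumption, not a measured value.

```python
def daily_management_data(nodes: int, bytes_per_pair: int) -> int:
    """Centralised monitoring collects data about every pair of active
    nodes, so the daily volume grows roughly as N squared."""
    return nodes * nodes * bytes_per_pair

# Illustrative: at 100 bytes per node pair per day, a 6000-switch
# network generates gigabytes, while its 100-switch fibre
# equivalent generates only a megabyte.
print(daily_management_data(6000, 100))   # 3.6 GB per day
print(daily_management_data(100, 100))    # 1 MB per day
```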

During the last decade research has demonstrated clearly that expert systems can automate many network management functions. Both rule and model based expert systems can diagnose faults on equipment and networks to a very high degree of precision. In addition, expert systems are making inroads into areas such as network planning and system design. Research into distributed network management clearly shows that distributed automation is feasible. In fact the kernel of the AI systems which power distributed management has been given a name - agents. The technical press have recently latched onto this name and are using it for any localised software system which does not reside in a mainframe. In our view the term agent should be reserved for a particular kind of AI system - one where negotiation is a central part of the software functionality and one where the agent can say no to a request.

Agent Definition
An agent is a self-contained software program which has two parts:

  1. A generic part which performs communications, co-operation, and negotiation with other agents and users
  2. A role specific part which carries out specialised functions associated with its task

Because agents are complete software systems that conduct a specialised task they are ideal candidates for distributed systems. Furthermore careful design of agent software could well allow significant reuse - in effect agent software could become an off the shelf product which will help to contain the software mountain.
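The two-part structure defined above can be sketched in a few lines. Everything here is illustrative: the class name, the accept/decline rule and the capacity measure are our assumptions about how an agent that can 'say no' might look.

```python
class Agent:
    """Two-part agent as defined above: a generic negotiation layer
    plus a pluggable role-specific function."""

    def __init__(self, name, role_function, capacity=1):
        self.name = name
        self.role_function = role_function   # role-specific part
        self.capacity = capacity             # resource the generic part guards

    def negotiate(self, request):
        """Generic part: communicate, and refuse requests it cannot serve."""
        if request.get("effort", 0) > self.capacity:
            return {"accepted": False, "counter_offer": self.capacity}
        return {"accepted": True, "result": self.role_function(request)}

# A fault-diagnosis agent that declines oversized jobs:
diagnoser = Agent("fault-diagnoser",
                  lambda req: "diagnosed " + req["target"], capacity=2)
print(diagnoser.negotiate({"target": "switch-7", "effort": 1}))
print(diagnoser.negotiate({"target": "whole-network", "effort": 5}))
```

The role-specific part can be swapped without touching the negotiation layer, which is what makes the generic part a candidate for off-the-shelf reuse.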

Distributed Agent Systems
Agent technology frees the mind from the traditional approach which continues to dominate communications management thinking. Furthermore the agent paradigm allows a radically different approach to the marketing of information services to be implemented: for the first time an open market in network services and information products becomes viable. The central idea is that an agent represents an actor or resource in the information marketplace. That agent will then negotiate to achieve the best deal for its 'client'. This concept is shown diagrammatically in the figure below and a particular scenario is provided at the conclusion of the paper.

This network of distributed management agents co-operate and negotiate to provide the service (point to point video telephony) or information product (the latest blockbuster film) for the customer.

Mobile Agents
In the foregoing brief description of agent architectures we have, for clarity, concentrated on fixed agent structures. However, significant work is being conducted world-wide on agent paradigms in which the agents roam information networks. The idea is that mobile agent programs will be set tasks and be transmitted from computer host to computer host to carry out those tasks.

When the mobile agent has been received on a host computer it will be validated as an acceptable agent and allocated an amount of processing power. Additionally it will be given any necessary access to local databases etc. in order to accumulate the information necessary for its activities. Naturally, whilst resident on a foreign host the mobile agent is consuming resources. It will therefore have to carry electronic credit to pay its way. Upon completion of the set task - gathering and analysing information on company reports, for instance - the agent will be transmitted to the reporting point to download the information.
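The lifecycle just described, validation on arrival, metered resource use paid from carried credit, and accumulation of findings, might be sketched as follows. All names, prices and the admission rule are illustrative assumptions.

```python
class MobileAgent:
    """A roaming agent that carries a task and electronic credit."""
    def __init__(self, task, credit):
        self.task = task
        self.credit = credit
        self.findings = []

class Host:
    """A computer host that validates visiting agents and charges
    them for the processing they consume."""
    def __init__(self, name, cpu_price, data):
        self.name, self.cpu_price, self.data = name, cpu_price, data

    def admit(self, agent):
        if agent.credit < self.cpu_price:
            return False                  # refused: agent cannot pay its way
        agent.credit -= self.cpu_price    # metered resource consumption
        agent.findings.extend(d for d in self.data if agent.task in d)
        return True

agent = MobileAgent(task="smart skins", credit=5)
for host in [Host("IEE", 2, ["smart skins review"]),
             Host("R2F", 2, ["annual report"])]:
    host.admit(agent)
print(agent.credit, agent.findings)
```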

Applications and Services
The information age in the 21st century will be dominated by choice. Users will have a bewildering range of services, information channels and entertainment outlets to select from. Our vision of this future system requires the use of agent technology to assist users to manage their personal information space. N. Negroponte (Director, MIT Media Lab) has coined the term 'electronic butlers' for these types of agents. Just like human butlers the agents will have to learn the needs and whims of their owners. The services that could be supported by future information networks will include the following:

21st Century Information Services and Products


Home
  1. VR Games
  2. 3D films
  3. Video telephony
  4. Television channel access
  5. Newspaper delivery
  6. Electronic mail
  7. Timetable access
  8. Music on demand
  9. Shopping on demand
  10. Medicine on demand

Work

  1. Data visualisation
  2. Datamining
  3. Computer Aided Design
  4. Video telephony
  5. On demand video conference
  6. Specialist magazine delivery
  7. Electronic mail
  8. Database access

The distinction between home services and business applications will become increasingly blurred as home working becomes more commonplace.

The range of information channels and diversity of entertainment sources available will provide significant opportunities for the service providers. Just as the end users will need agent technology to assist them to locate, negotiate and route information, the service providers will need to develop agent systems which assist the targeting of audiences and users. Ideally the service provider systems should build up a profile of every user connected. Such profiling systems must have the ability to track the changing interests and habits of users and anticipate their future needs. They can then become proactive providers of new information products.
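One simple way such a profiling system could track changing interests is to let old observations decay while reinforcing new ones. This sketch is our illustration, not a description of any deployed system; the decay factor and scoring are arbitrary assumptions.

```python
from collections import defaultdict

class UserProfile:
    """Interest profile in which old observations fade, so the
    profile tracks a user's changing habits."""

    def __init__(self, decay=0.9):
        self.decay = decay
        self.scores = defaultdict(float)

    def observe(self, topic):
        # Fade all existing interests, then reinforce the one just seen.
        for t in self.scores:
            self.scores[t] *= self.decay
        self.scores[topic] += 1.0

    def top_interest(self):
        return max(self.scores, key=self.scores.get)

profile = UserProfile()
for topic in ["news", "news", "films", "films", "films"]:
    profile.observe(topic)
print(profile.top_interest())   # films
```

A proactive provider would use the top-scoring interests to target new information products at the user.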

Internet - The Bizarre Bazaar
There is no doubt that the Internet has sparked a massive interest in networking and information services. However, whilst many enthuse about this heroic experiment in organic growth, autonomous action and anarchy, it is wholly unsuited to anything other than amateur applications. It can be likened to a bazaar - what you are looking for is probably available, if only you knew where to find it - and if you are lucky enough to discover something useful, could you verify its pedigree and value? Nevertheless, the Internet is currently the only global information network, and it is growing exponentially in terms of the number of users, servers and data available. It also has primitive information agents (Gophers) at work today. So what is wrong?

  • The bit rate is very low - delays are seconds for speech, minutes or hours for pictures and large files.
  • The concatenation of slow routers gives wildly differing delay characteristics throughout and between transmissions.
  • Point to point reliability is very poor. Routers, servers and links fall over regularly.
  • Routing of information is not contained or in any way optimised. The Internet was originally designed to withstand the damage of a nuclear war - not to optimise route lengths and delays. It was designed for survival, not service.
  • Interfaces vary widely, are not user friendly, and their operation takes some skill and a lot of patience. This is not an environment for the computer illiterate.
  • The agent technology available is relatively dumb - there is no arbitration or negotiation, only seek and find.
  • The information sources tend to be chaotic, or at least disordered.
  • Commercial operations are not possible in the strict sense.
  • No one can be held responsible for the information available - is it valid, what is it worth, who are the originators, who owns it?

With all these limitations, why is the Internet such a roaring success? Right now it is almost the only thing of its type available. So what should we look forward to?

Infonet - The Supermarket
When we go shopping we do not expect to hunt for the soap powder or the cornflakes; we go straight to the shelf. There is some order, some regularity, that helps us quickly assemble all the necessities of life so we can get on with the exciting bits. But if we are shopping for clothes or luxury goods we expect to have to invest some time and effort, and anyway it can be fun. In this activity there is one vital ingredient that augments our preconceived notion of exactly what we want - serendipity: that chancing upon some style or thing we had not considered, or did not know that we wanted.

In the information world the most common analogy to shopping is the library. Shelves of books, journals, magazines and papers in some regular order according to subject and author. We know exactly what we are looking for, and just by chance we find an interesting book that we didn't know about, or feel we will need at some future date. If only we could replicate this environment in the electronic world of information the advantages would be enormous. Consider the potential benefits:

  1. Information could be located almost instantly - all of it - no duplication.
  2. Information would always be up to date, traceable and validated.
  3. Libraries of paper would be reduced to compact servers.
  4. Information and data correlation would be simple.
  5. All human knowledge would be available to everyone.
  6. Custom books could be produced by everyone to meet individual requirements.
  7. Going all electronic would move us a step on from the printed word and fixed 2D picture, to a world of animation, simulation, audio and video explanation.

21st Century Information Systems
In the Office
So far we have considered the various technical components necessary for the construction of future information systems. Let us now consider a practical example to bring all of these features together.

The Chairman's Request
The Chairman's Office: A message from the chairman requests information on a new smart skin technology and a consortium of pharmaceutical and electronic companies getting together to exploit their position on the world market. He needs a breakdown of the technology, the companies, likely partnerships and market potential.

Today - you would need a team of several people and it would probably take several weeks to complete.

In the future - he needs the answer within the hour - this move will impact on your own company's stock position - so what do you do? There is no help - you are on your own.

Action 1:
Prepare multiple agents for release onto the world's information network (by region) to seek out the R&D reports, papers and press releases on smart skins during the past year - but to return within 10 minutes. These are smart agents - last time you asked for information on smart skins you gave no time limit, and they can remember where the best return on effort was realised. They go for the IEE, IEEE and IOP journals and the XZW and R2F company reports first.

Action 2:
Prepare multiple agents for release onto the world's information network (by region) to seek out the stock position of all the companies known to be interested in smart skins. These agents agree to meet with Action 1 agents in the network in five minutes to cross check findings. After correlation they are to collect all available and relevant information for analysis.

Action 3:
Prepare multiple agents for release onto the world's information network (by region) to seek out the latest analysis and synthesis programmes to model this situation.

Action 4:
Release agents!
Ten minutes later you have all the information and modelling tools ready to complete the analysis.
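The dispatch-and-rendezvous pattern of Actions 1 to 3, releasing agents in parallel under a deadline and then correlating their findings, can be sketched as follows. The regions, sources and matching rule are illustrative stand-ins for real roaming agents.

```python
from concurrent.futures import ThreadPoolExecutor

def search_agent(region, sources):
    # A real mobile agent would roam remote hosts; this stand-in just
    # filters a local document list for the topic of interest.
    return [s for s in sources if "smart skins" in s]

regions = {
    "europe":   ["IEE smart skins paper", "press release"],
    "americas": ["IEEE smart skins report"],
}

# Release the agents in parallel with a ten-minute deadline, then
# rendezvous and correlate their findings.
with ThreadPoolExecutor() as pool:
    futures = {r: pool.submit(search_agent, r, docs)
               for r, docs in regions.items()}
    findings = {r: f.result(timeout=600) for r, f in futures.items()}

all_hits = sorted(h for hits in findings.values() for h in hits)
print(all_hits)
```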

As the user this was your view of the operation, but out in the network and the libraries the action was frenetic. The routing algorithms had ensured the smooth transmission of your agents, avoiding cable and electronic failures. The best deal on tariffs had been negotiated given the time limit of 10 minutes - and the files are massive - this has saved a little money - but it is for the chairman. If you had been given a day, the files could have been held back until low cost capacity was available. The carrier agents had bid to carry your files, but they were under a lot of pressure because of the ten minute window - it cost. The information servers' agents made sure your agents got priority - but again it cost. But at least you have your information on time.

In the next 50 minutes your Darwinian program takes all the data and evolves a series of likely scenarios. These are in no way optimised, but they are fit for purpose, and they do paint a likely picture. This you present to the chairman as an animated 3D colour graphic of options - he can now make a decision.
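A 'Darwinian programme' of this kind can be illustrated with a minimal evolutionary loop: mutate candidate scenarios, keep the fitter ones, and stop with a result that is fit for purpose rather than optimal. The scenario encoding and fitness function below are toy assumptions of ours.

```python
import random

def evolve_scenarios(score, seed_pool, generations=30, seed=0):
    """Minimal evolutionary loop: mutate the fittest scenario, add the
    child to the pool, and cull the weakest. Fit for purpose, not optimal."""
    rng = random.Random(seed)
    pool = [list(s) for s in seed_pool]
    for _ in range(generations):
        parent = max(pool, key=score)
        child = [g + rng.uniform(-0.1, 0.1) for g in parent]   # mutate
        pool.append(child)
        pool.remove(min(pool, key=score))                      # cull the weakest
    return max(pool, key=score)

# Toy fitness: how close a two-parameter scenario is to a target forecast.
score = lambda s: -((s[0] - 0.7) ** 2 + (s[1] - 0.3) ** 2)
best = evolve_scenarios(score, [[0.0, 0.0], [1.0, 1.0]])
print(best)
```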

The above may appear too futuristic, but almost every element has been demonstrated in isolation.

A Typical Day in 2010
Philip is a telecoms manager who commutes to work. His wife Jill is a social worker. They have a teenage son, Paul, still at school.

  1. 05.30 Philip's daily newspaper is delivered and stored ready for inspection. Of course Philip's newspaper is a selection of items drawn from many news sources by his personal information agent. A delivery indication is displayed on the console. Philip will engage his 'paper' over breakfast on the wafer thin wall display. Items of specific interest are downloaded onto his 'computer you wear' for 'head up display' later. Items of professional interest are despatched to his office.
  2. 06.45 The centralised alarm system wakes the family according to its weekly schedule.
  3. 07.30 Philip leaves for the office. The train is late. He directs a voice message agent to negotiate with those of his colleagues to delay his first meeting.
  4. 07.50 Paul wants to play the latest virtual reality video game before school. He asks for the latest top twenty games and selects number one. The display warns Paul that he can only play for 5 minutes before his weekly credit runs out. The home computer offers a series of cheaper games but Paul declines.
  5. 08.15 As Paul leaves for school, his mother ensures that his alarm caller is set. In the event of an incident he can activate the alarm to transmit his position and condition to police, parents and the school.
  6. 08.30 Jill receives a video phone call from a friend and they arrange to meet later for coffee. Jill reaches for an icon and a table is booked at her favourite coffee house.
  7. 09.15 Jill makes a video phone call to the virtual supermarket and inspects and chooses fish for dinner. Credits for the fish are automatically transferred to the fishmonger's account.
  8. 09.30 Jill dictates a letter to a friend, automatically inserting a video sampler of Paul's birthday party. Despatch is controlled and instantaneous.
  9. 11.00 During coffee with her friend Jill decides they should see a show on their next day off. She contacts a ticket agency on her 'body furniture terminal', books two electronic tickets which are debited to their individual accounts.
  10. 11.30 At the office Philip realises that the heating at home has not been turned down so he makes a telemetry call to check and reset it - he also gets a security and mail check.
  11. 15.00 Philip decides to work from home for the rest of the day. He transfers all incoming calls to his home. His office computer informs him of the video conference he had overlooked. At the touch of an icon the video conference is re-routed to Philip's home.
  12. 16.30 When Paul arrives home he starts his homework and realises he has left a rare file at school. He consults an on-line information service to see if they have a copy. He finds it is available and downloads the relevant sections. The customer agent would have chosen the cheapest delivery mechanism.
  13. 19.15 Philip is in his study watching a video transmission from a meeting where his boss is giving a paper. The rest of the family are watching a quiz program in the lounge. Meeting over, Philip watches the news, an item appears that the whole family should see, so he calls it back for replay later.
  14. 20.00 The business of the day over, the whole family enjoy a 3D blockbuster film.
  15. 21.00 Automatic security sensors are activated by the home computer as darkness approaches.

Final Remarks
Throughout this paper we have introduced some new ways of thinking about problems that have beset telecommunications and computing for some considerable time. We believe such paradigm shifts are essential to meet the technological challenges that 21st century Information Networks will create.

Inevitably there will be people who believe we have gone too far along the technical futures roadmap. We do not agree. All the ideas we have used are in existence now. True we have extrapolated the scope and power of the agent paradigm but to no greater extent than the performance increases we have witnessed in similar technology areas. The greatest challenges facing the new information industry to deliver promises held out by the technology will be system integration and software maturity. There will have to be a concerted effort on the part of the telecommunications operators, computer vendors and many specialist suppliers of equipment and software to create the information platform which will serve us all in the 21st century.

All materials created by Peter Cochrane and presented within this site are copyright © Peter Cochrane - but this is an open resource - and you are invited to make as many downloads as you wish, provided you use them in a reputable manner.