COPPER MIND SETS
Peter Cochrane

Much of our technological thinking and expectation is conditioned by a history of established practices and trends. In telecommunications it is radio and copper cables with constrained bandwidths and high distance-related costs; for software, the acceptance of massive complexity to realise relatively simple tasks; for the interface between man and machine, it is technology convenient to the machine. In each case the solutions become steeped in political and corporate correctness through massive investment programmes that perpetuate the established trajectory. However, every few decades technological progress across a number of apparently disconnected areas presents an opportunity for radical change. The great benefit of these epochs is that they effectively change the nature of the problem and allow new solutions to be engineered. Such an opportunity is now on the horizon with the dual realisation of very low-cost telecommunications and computing (with bits, distance, MIPS and storage almost free relative to other physical commodities). These prospects go far beyond multi-media, poor-quality teleconferencing, video telephones and co-operative working environments. But to grasp them requires new, and not old, mind sets.

1 PROLOGUE
Many systems and networks are still being designed as if the constraints of the copper past were still with us: compress the signal; save bandwidth; minimise hold times; continue with the old protocols although they are now the new bottleneck; use the established design rules despite the disappearance of the original technology! The apportionment of operational and economic risks also seems to be vested in the past and requires realignment. The wholesale reductions in the hardware content of networks and machines relative to the massive expansion (explosion!) of software are a prime example, as is the poor human interface suffered on many teleworking services when the need to reduce physical travel is so apparent and bandwidth is so abundant.

Today we sit at an interesting crossroads where advances in fibre optic technology have (or will soon have) removed the bandwidth bottleneck and distance-related costs in telecommunications. Integrated electronics has made information processing and storage effectively limitless and free, whilst software has introduced the potential for unexplained system and network failures on a grand scale. What should we do, or be doing? There would appear to be only a small number of key actions required to take the next major step. First, we need to start thinking about and engineering systems, not hardware, software, networks and applications as if they were in some way disconnected. Second, we need to embrace new tools and techniques that might help us understand the non-linear nature of the world we have constructed and are constructing. Third, we have to humanise machine interfaces and information presentation if we are to stand any chance of understanding the nature of our systems. In short, we need to be able to visualise before we can hope to conceptualise and understand. Finally, we should not shy away from starting again with a clean sheet of paper when incrementalism has driven us into a "technological box canyon".

In this paper we examine some of these issues and postulate other possibilities that are still being investigated or offer obvious potential.

This paper should be read in conjunction with the three sister papers in the "Telecommunications, Transport and Telepresence" session which it augments.

2 PERSPECTIVE
The growth of civilisation is critically dependent upon rapid and efficient telecommunication for its organisation, control and stability. From the runners and beacons used in ancient times, to the Chappe telegraph, electric telegraph, radio, and today's telephone and data networks, the need has always been for faster and more effective telecommunications. This is probably more evident today than at any other time with the emerging need to combine communication, computing and control. Tomorrow the list is likely to include robotics, telepresence and variants of virtual reality. It is interesting to reflect that none of this, not even a ubiquitous telephone network, would be possible with wholly copper transmission technology. It is doubtful whether there is enough copper, associated technology or money available for everyone to have just a telephone. Certainly the bandwidth limitations of copper and radio rule out any significant excursion into computing, control, telepresence and other future services. Only the timely arrival of optical fibre technology during the last 15 years gives us the means to attain the dream of telecommunications for all at a price we can all afford, with the added bonus of near-infinite bandwidth and service expansion in the future.

It is perhaps understandable that the development of copper transmission and switching technology over the past 150 years has conditioned many people in the industry to the network topology, configuration, service and performance of the kind we currently enjoy. The reality is that the status quo will not do! Optical fibre is the only means of realising the next paradigm change, which may well be greater than the leap from the beacon to the electric telegraph. All we have achieved so far with optics is more of the same with better performance at a lower price. What we have to do is rethink the problem of systems, networks, operations and services to encompass the advances made possible by this young technology.

The exponential growth of telecommunications continues even during the present global recession and there is no sign of any downturn. However, it is sad to reflect that 85% of all telephones are with the 15% of the population who have the money! Just 15% of us on this planet (about 825M telephones in total, over 700M of them in the developed countries) have good communications - the rest haven't. It would be nice to think that in the new millennium we could start to correct this imbalance. Indeed, long-term global stability is likely to become almost wholly dependent on us achieving such an apparent ideal. Even so, in terms of growth potential, a modest increase in industrialisation in the second and third worlds would see a massive growth in global communication.

Ecologically, IT has a significant part to play in the reduction of waste and raw materials. Unfortunately much of this technology has so far merely extended and expanded earlier paper-based practices. In reality all the base technologies to drastically reduce the consumption of paper (rain forest) and physical travel (barrels of oil) are available.

The cost of a telephone call, fax, computer communication session/file transfer or video conference, in terms of raw material usage, is insignificant compared with that of international travel - by at least six orders of magnitude. Even in domestic terms there are considerable opportunities for significant national savings. For example: the UK has an approximate population of 55M, a workforce of 25M, 23M dwellings and 25M telephones. The total transport cost, in energy terms, for the population to travel to work has been estimated at 1.8 x 10^9 GJ/year, whilst the entire telecommunication infrastructure consumes an estimated 5 x 10^6 GJ/year. If only 1% of the working population changed to full-time working from home, an estimated 10^7 GJ/year could be saved. Globally this figure is probably of the order of 10^9 GJ/year and could probably be increased significantly with enhanced telecommunications services.
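
As a quick back-of-envelope check, the saving quoted above follows directly from the figures given. The short Python sketch below is illustrative only; the per-worker commuting energy is derived from those totals, not independently sourced.

    # Rough check of the teleworking energy saving quoted above.
    workforce = 25e6              # UK working population
    travel_energy = 1.8e9         # GJ/year, total commuting energy
    telecoms_energy = 5e6         # GJ/year, entire telecoms infrastructure

    per_worker = travel_energy / workforce      # ~72 GJ/year per commuter
    home_workers = 0.01 * workforce             # 1% switch to home working
    saving = home_workers * per_worker          # ~1.8e7 GJ/year

    print(f"Energy per commuting worker: {per_worker:.0f} GJ/year")
    print(f"Saving from 1% home working: {saving:.1e} GJ/year")
    # => roughly 1.8e7 GJ/year, i.e. of the order 10^7 GJ/year as stated,
    #    and several times the 5e6 GJ/year used by the whole network.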

3 A SELECTION OF LIMITATIONS & IRRITANTS
Much of our IT has changed little since it was originally introduced and leaves a lot to be desired from a user point of view. Some notable examples are as follows:

  • A telephone that still has more or less the same bandwidth as it did in 1876.
  • Not being able to recognise people on the phone, and not being able to see them.
  • Not having global telecommunication and computing mobility.
  • Brittle software in robust hardware, systems, networks and machines.
  • Coding systems that distort the voice and grossly distort moving pictures.
  • Trying to communicate with visual images of people who are the wrong size and colour and are distorted when they move.
  • Sterile teleconferencing environments that lack basic meeting room facilities.
  • Having to travel so much when it could be circumvented by IT.
  • Having to send floppy discs and tapes via surface mail (the Frisbee Net!) because it takes hours to send the data over the phone line.
  • Time delays in communication, control, computing and associated interfaces.
  • A PC with a 25MHz clock (and even that is getting old!) and a PSTN only able to deliver 64kbit/s (or low multiples thereof) as a switched service.
  • A PC screen that often only displays 75% of one page in an 'N' page document.
  • Inappropriate technologies surviving, and/or having their lives extended for the wrong reasons.
  • The growth of conservatism and short termism in the West being overtaken by a more adventurist and long term view in the Pacific Rim.
  • E-Mail addresses with enough characters to define the position of every atom in the known universe.
  • Fighting through cascaded windows to achieve elementary objectives.
  • Having to communicate using an increasing amount of paper.
  • Voice command on car phones, but not on office machinery.
  • A growing paper and electronic filing system that still has to be categorised, sorted and culled on a regular basis.
  • Living in a world that is both speeding up and removing delays in the decision making process, whilst not providing new/adequate tools to cope.
  • Radical technological opportunities being lost (and/or suppressed) through political/regulatory pressures and short comings in industry.
  • etc, etc, etc ......

4 NEW NETWORKS
The longer transmission distances afforded by optical fibre line systems over their copper predecessors are predicating a reduction in the number of switching nodes and repeater stations. This will soon be augmented by the arrival of the optical amplifier and network transparency. This technology migration realises improvements across a broad range of parameters, including reduced component count, improved reliability, reduced power and raw material usage, and increased capacity and utility. A further logical (and revolutionary) development will see the concentration of more traffic onto a single fibre rather than several fibres in parallel. The advent of optical transparency over fibre and Wavelength Division Multiplexing (WDM) makes such a proposal the more reliable option, as the time to repair one fibre is far shorter than for multiple fibres!

Today we access < 0.005% of the fibre bandwidth available, and we might be able to approach 1% with currently available technologies. Moreover, everything we have seen optical fibre do so far may be totally eclipsed by moving from the linear to the non-linear regime. Fibre has hidden properties that we have not yet discovered, and therefore affords a potential for exploitation well into the 21st Century. We already see experiments with bit rates of 10 Gbit/s over 10^6 km of amplifying fibre.
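
To put these percentages into context, the short illustrative Python calculation below may help. The usable low-loss fibre window of roughly 30 THz is an assumption made here for illustration only, not a figure taken from this paper.

    # Illustrative only: assumes ~30 THz of usable low-loss fibre bandwidth,
    # a figure not given in the text.
    fibre_bandwidth_hz = 30e12

    used_today = 0.00005 * fibre_bandwidth_hz    # < 0.005% utilisation
    usable_soon = 0.01 * fibre_bandwidth_hz      # ~1% with current technology

    print(f"Accessed today : {used_today / 1e9:.1f} GHz")    # ~1.5 GHz
    print(f"Near-term limit: {usable_soon / 1e12:.2f} THz")  # ~0.3 THz
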
As the cost of transmission continues to fall we have to question the balance between transmission, switching and software. Radical reductions in the number of network nodes and repeater spans, consolidated switches, and reductions in network management and in the constraints imposed by software look to be good targets. We may also have to move towards thinking of the local call as spanning not just the London area but the whole of the UK, expanding in time to encompass Europe and gradually the whole planet. E-Mail is the only practical realisation of this concept presently available - and it is a subscription service!

4.1 Switching and Transmission
Today we only use time and space for switching in telecommunications. But at the present rate of expansion it is becoming clear that we will not be able to realise the density of electronics needed to meet future growth using these two parameters alone. It will become imperative to move into a third parameter to get the required equipment density and energy concentration into existing buildings. For example, 10 years ago a typical UK repeater station floor had to accommodate some 2.4 Gbit/s of speech circuit capacity. That same floor today carries 6.8 Gbit/s, and projections indicate that 40 Gbit/s will be required by the end of the millennium. This packing density cannot be achieved with conventional electronics alone. Another degree of freedom is required - wavelength is the obvious choice. Recent developments have seen the demonstration of suitable technology, such as contactless 'D-type' (leaky feeder) fibres embedded in printed circuit backplanes. When coupled to an Erbium-doped fibre, which amplifies the light signal(s) when pumped optically, a lossless multi-tap facility is realised which can distribute 10 Gbit/s and higher rates almost endlessly.

An interesting concept now arises - the notion of the infinite backplane. It could be used to link, for example, Birmingham, Sheffield and Leeds through the use of optically amplifying fibre that offers total transparency. Such concepts naturally lead to the idea of replacing switches by an optical ether operating in much the same way as radio and satellite systems do today. The difference is the near-infinite bandwidth of the optical ether. Demonstrators have already shown that a central office with up to 2M lines could be replaced by an ether system, but suitable optical technology is probably still some 20 or so years away. Systems of this kind would see all of the software, control and functionality located at the periphery of networks, with Telcos probably becoming bit carriers only!

4.2 Signal Format
Customer/service demand and technology are each providing the motivation to change the nature of telecommunications. In the midst of this revolution it is interesting to contemplate a further feature of communication history - the cyclic alternation between digital and analogue formats. Before the development of natural language, the "grunt or gesticulation" was no doubt the simplest indicator of man's agreement or displeasure. We have since progressed to use the tally stick, smoke signals, drums, music, the written word, Morse code, telephony, and PCM (Fig 1). In parallel, computation has also followed an alternating analogue - digital history. We might thus presuppose that this alternation will continue into the future, and may even be encouraged by the inherent qualities of optical fibre communications and photonic computing. There is certainly evidence that increasing complexity and future requirements may necessitate the use of analogue techniques to solve the problems of achieving a useful level of artificial intelligence, associative computing and control systems. Optical amplifier technology looks ideally placed to meet this challenge and may even promote the move back to analogue signalling!

4.3 Mobility
Whilst cellular radio technology is now well developed, and satellite mobile will no doubt follow, we suffer an apparent lack of spectrum, although we have yet to exploit microwave frequencies up at 180-300 GHz and higher. But we also have the ability to take radio off air, up-convert it to the optical regime, feed it down fibre that is transparent through optical amplification, and have that signal emerge in a distant cell undistorted. So, for multi-site working we could have access to the same facilities in all the buildings throughout an organisation, regardless of its geography/demography. We would effectively be sharing the same signal/work space, thereby giving all participants the illusion of being at the same location.

Another exciting prospect is optical wireless. The performance of free space optics in the home and office is very similar to microwaves, but with the advantage of a vastly greater bandwidth. Systems in operation in the research laboratories are already providing pico-cellular illumination of the desk, personal computer, staff member, and offer the potential of active badges/communicators and inter-desk/computer links. Applications in the local loop might also be anticipated as an alternative to fibre, copper and radio.

For global mobility the major system design challenge is likely to remain the organisational and control software necessary to track and bill customers. Other issues include the ability to deflect, hold or store calls that are traversing time zones at unsocial hours.

4.4 Satellite & Radio
Point-to-point satellite technology has now been eclipsed by optical fibre, which carries over 55% of all international communication. However, satellite now has a new role in broadcast, reaching difficult locations, mobility and service restoration. The path delay for geostationary satellites is problematic for real-time communication, but there are alternatives. There are proposals for low Earth orbit (~1000 km) satellites, about the size of a football and launched by super guns, to form a cellular system in the sky using 70 or so low-cost units. Although they will periodically fall out of the sky, their replacement would be at low cost.

Other exciting developments include direct satellite to satellite links using free space line-of-sight optics, on board signal processing and switching, more compact and efficient coding schemes, individually positioned multiple micro-beams, the use of much higher microwave frequencies than normal ground to satellite links, and even optical links from satellite to ground. All of this can be expected to extend the use of satellite technology by an order of magnitude (or so) beyond that of today. Fundamentally satellite systems look set to migrate to mobile, and difficult/rapid-access applications. The technology is however a refinement of what we already have, and other than using free space optical links, it is hard to see any further major innovations.

4.5 The Economics of Analogue and Digital
The development of PCM and transistor-based electronics jointly enabled the realisation of TDM transmission systems that could most economically combat the limitations of copper pair crosstalk and attenuation. By about 1960 PCM systems were becoming established, and in the 1970s they became widespread. This in turn led to the development of digital transmission systems at higher rates, which had well-defined and internationally agreed standards by the mid 1970s. At this time repeater spacings had fallen to 2 km for 2.048 Mbit/s on twisted-pair cables and 140 Mbit/s on coaxial cables. Systems operating at 565 Mbit/s rates were also being investigated but required repeaters at 1 km intervals. During this same period a number of administrations had completed economic studies that established the combination of digital transmission and switching as the most economic. This is reflected in Fig 2, which shows the options considered and their relative costs. Programme decisions made at that time are only now seeing their full realisation with the near-full digitalisation of national and international networks in the developed world.

A point to be recognised here is that historically the fixed assets of a Telephone Company (Telco) are generally measured in £Bn, and any major network changes take 5-10 years to complete. Before fibre technology the asset ratio for long lines would be of the order of 50% transmission and 50% switching. With the widespread introduction of optical fibre into modern networks the transmission asset base may now be as little as 10%, with some 50% of all resources expected to reside in the access network. This has significantly changed the economic balance away from the original figures. When repeaters are spaced at 50-100 km intervals and direct photonic amplification is introduced, an all-analogue system for national and international wideband services then looks far more attractive.

5 SOFTWARE
The biggest single problem with software science and engineering is the almost total lack of any science and engineering. Whilst we might use the excuse that it is a very young field, this does not give the full story. Indeed, should we even expect to achieve as mature an understanding as that enjoyed in mathematics and the physical sciences? In truth software is a further level of abstraction beyond mathematics and may well defy real understanding for some time. A suitable analogy might be the mechanical engineer trying to design a bridge on the basis of molecular descriptors, devoid of useful concepts such as 'Young's Modulus', or the electronics engineer designing circuits on the basis of quantum mechanics rather than 's' parameters. We are currently standing too close to the software problem - and we need to see through several more levels of abstraction before we are likely to form a full understanding and appreciation (Fig 3).

Unfortunately the above state of play in the development of software is little comfort to the designers, coders and users who are facing very real problems today. The solution looks some considerable way off, with structured programming, formal methods, requirements capture analysis (RCA) and object orientation being very unlikely to save the day. (Anyone who disagrees should contemplate the production of a bicycle using RCA - what chance is there that it will be right first time? Very little! In contrast, a series of rapid prototypes will certainly succeed). All that can be expected is a tidying up of the software production processes we already have! Very soon we will be back to where we started, with an expansion in the size of software we are able to produce, and even a modest improvement in our efficiency, but the level of confusion, uncertainty and engineering and operational risk will be at least the same, and may even be worse. The software factory approach might pay dividends, with tried and tested modules available in a reusable form (like Lego or standard integrated circuits from Texas or Motorola et al). However the chances are that history will repeat itself and the demand will be for custom software, for the same reasons we have had to produce custom and semi custom integrated circuits.

5.1 Size and Complexity
By any comparison with man's other efforts, software is becoming a cause for significant concern in terms of its sheer scale. For example: the complete works of William Shakespeare take up about 450 m of paper; the line code for a small telephone switch is about 1 km and a central office is in the 4-6 km range; network control centres are in the 6-10 km region; the Encyclopaedia Britannica is about 4.3 km. A full stop in the wrong place and the spacecraft misses the planet! In the software domain very minor things pose a considerable risk, and that risk appears to be growing exponentially as we look to the future. We have to find new ways of dealing with this increasing burden of risk, as the present trajectory looks unsustainable in the long term. Quite perversely, hardware is becoming more and more reliable while software is becoming more and more unreliable, so much so that we are realising sub-optimal system and network solutions (Fig 4). From any engineering perspective this growing imbalance needs to be addressed. If it is not, we can expect to suffer an increasing number of ever more dramatic (even global) failures.

5.2 Network Disaster Scale
Over the past few years there have been a number of spectacular telecommunication network failures on a scale not previously experienced. Quantifying the impact of these and future failures is now both of interest and essential if counter-measures and future network design are to be correctly focused. The key difficulty is the diversity of the failure types, causes, mechanisms, correction, avoidance and customer impact in each case. A simple means of ranking network failures by severity, such that they can be readily understood by the non-specialist as well as by the trained engineer, follows the approach used by Richter for ranking the severity of earthquakes. The total network information capacity outage (loss of traffic), in customer-affected time, is thus:

D = log10 (NT)

where N = number of customer circuits affected

T = total down time (in hours)

Exploiting the relationship with the Richter scale further, the scale levels reached by some typical and extreme events are as follows:

  1. On the earthquake scale, a magnitude of around 6.0 has special significance because it marks the (fuzzy) boundary between minor and dangerous events. A magnitude of 6.0 on our outages scale would represent an event in which, say, 100,000 people lost service for an average of 10 hours.
  2. Beyond this, earthquakes in excess of 7.0 magnitude are definitely considered major events, wreaking widespread havoc. In telecommunications networks outages above 7.0 are mercifully rare, but the series of related USA outages in the summer of 1991 would rank at such a level. Globally there appears to have been only one outage that exceeded level 8 on our scale. Estimated readings near 10.0 have so far occurred only for earthquakes.
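
As a minimal worked example of the scale, the short Python sketch below applies the formula to the illustrative figures above.

    import math

    def outage_magnitude(circuits, hours):
        """Disaster scale D = log10(N * T) for N customer circuits
        out of service for a total of T hours."""
        return math.log10(circuits * hours)

    # 100,000 customers losing service for an average of 10 hours
    print(outage_magnitude(100_000, 10))              # 6.0 - the minor/dangerous boundary

    # A hypothetical small event: 1,000 customers out for half an hour
    print(round(outage_magnitude(1_000, 0.5), 1))     # ~2.7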

In Fig 5 we have included qualitative ranges of media, regulatory and governmental reaction relative to our disaster scale.

5.3 Ants and ANTS - A Viable Alternative ?
It seems remarkable that we should continue along a trajectory of developing ever more complex software to do increasingly simple things. This is especially so when we are surrounded by simple organisms (moulds and insects!) that have the ability to perform complex co-operative tasks on the basis of very little (or no!) software. An ant colony is one example, where very simple rule-sets and a neural computer of only 200 (English garden ant) to 2000 (Patagonian ant) neurons are capable of incredibly complex social behaviour.

Numerous studies of ant and insect life forms have been reported with varying degrees of success. However, we have migrated to the ANT (Autonomous Network Telepher) as a likely contender for the future control of telecommunication networks. Initial results from simulation studies have shown considerable advantages over the conventional software currently used. In one recent study of network restoration, only 400 lines of ANT code replaced the 1,000,000+ lines presently used in the real network. Software on this scale is within the grasp of a designer's full understanding, and takes only a few days for a one-man team to write and test! Some initial results are given in Fig 6.
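
The ANT rule-sets and code themselves are not reproduced here, but the flavour of the approach can be sketched in a few lines. The toy Python example below is purely illustrative: many trivially simple agents wander a small mesh, avoid a failed link, and reinforce the links of successful paths, so that a restoration route emerges without any central control software.

    # Toy sketch only: the real ANT rule-sets are not given in the paper.
    import random
    from collections import defaultdict

    # A small mesh; the A-B link has failed and traffic must be restored around it.
    links = {('A', 'B'), ('B', 'C'), ('A', 'D'), ('D', 'C'), ('B', 'D')}
    failed = {('A', 'B')}

    neighbours = defaultdict(set)
    for a, b in links - failed:
        neighbours[a].add(b)
        neighbours[b].add(a)

    pheromone = defaultdict(float)

    def wander(src, dst, max_hops=6):
        """One agent wanders from src towards dst over working links only."""
        path, node = [src], src
        while node != dst and len(path) <= max_hops:
            choices = [n for n in neighbours[node] if n not in path]
            if not choices:
                return None
            node = random.choice(choices)
            path.append(node)
        return path if node == dst else None

    # Launch many simple agents; successful ones reinforce the links they
    # used, with shorter paths reinforced more strongly.
    for _ in range(500):
        path = wander('A', 'B')
        if path:
            for a, b in zip(path, path[1:]):
                pheromone[frozenset((a, b))] += 1.0 / (len(path) - 1)

    # The most reinforced links form the emergent restoration route (A-D-B).
    for link, weight in sorted(pheromone.items(), key=lambda kv: -kv[1]):
        print(sorted(link), round(weight, 1))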

5.4 Evolution
Another radical approach that mimics nature is that of competitive evolution. In this case a number of seed programs (say 10) are written to address a given problem. They are let loose and timed. The winner (the one that takes the shortest time to complete the function) is then frozen. The rest are instructed to mutate! The trial is run again and repeated, with the winner being frozen and the rest being instructed to mutate. If after a few hundred to a thousand trials a competitor remains at the bottom of the league table, it is culled by a reaper function. Repeating the process eventually realises an overall winner, after say 50,000 trials. The final form of the software may typically be expected to be half the size of the original, but operate about four times faster, and is likely to be incomprehensible (to the human) and far removed from the original seed from which it evolved. This is an interesting area that is developing rapidly for a number of diverse applications including telecommunications, control, computing, signal processing and adaptive systems. In Fig 7 we present a forecast for the rate of progress in this field, including the capacity development rates of desktop computers and workstations.
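
No listing of such an evolutionary trial is given in the paper, but the freeze-the-winner, mutate-the-rest, reap-the-laggards loop can be sketched as follows. In this illustrative Python example the competing "programs" are reduced to numeric genomes scored by a stand-in cost function; a real trial would of course time and mutate executable code.

    # Toy sketch of competitive evolution, not the method used in the study.
    import random

    TARGET = 42.0

    def cost(genome):
        """Stand-in for 'time to complete the function': smaller is better."""
        return abs(sum(genome) - TARGET)

    population = [[random.uniform(-10, 10) for _ in range(5)] for _ in range(10)]
    last_places = [0] * len(population)       # how often each competitor came last

    for trial in range(5000):
        ranked = sorted(range(len(population)), key=lambda i: cost(population[i]))
        winner, loser = ranked[0], ranked[-1]

        last_places[loser] += 1
        if last_places[loser] > 500:          # reaper: cull the persistent loser
            population[loser] = list(population[winner])
            last_places[loser] = 0

        for i in range(len(population)):      # winner frozen, the rest mutate
            if i != winner:
                j = random.randrange(len(population[i]))
                population[i][j] += random.gauss(0, 0.5)

    best = min(population, key=cost)
    print("Best error after 5000 trials:", round(cost(best), 4))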

6 SECURITY
Security is an ever-increasing problem, and if we are going to realise distributed computing, telepresence and tele-action systems via telecommunication networks, certification at a distance will become vitally important. To achieve this we can combine the scanning of the human face, eyes, lips, hands and thumbs to provide a high degree of security. This can be augmented with voice recognition and handwriting, and ultimately, if we are still not convinced, we could ask for some obscure piece of information known only to the user. Using a small combination of these techniques, we can today get error rates better than 10^-10 for a few thousand dollars.
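
The way a handful of individually modest checks can reach such a figure is simple arithmetic: if the checks are statistically independent, their false-acceptance rates multiply. The rates in the short Python sketch below are plausible illustrative guesses, not measured figures from this paper.

    # Illustrative only: assumes independent checks and guessed individual rates.
    face, voice, fingerprint = 1e-3, 1e-3, 1e-4

    combined = face * voice * fingerprint
    print(f"Combined false-acceptance rate: {combined:.0e}")   # 1e-10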

7 ROBOTICS
There are now around half a million robots in the world, mostly in the form of arms in assembly plants. In the future, as robots become autonomous and self-organising, we may get caught out by the rate of growth in demand for telecommunication capacity, in a similar manner to the recent growth in cellular radio. That is, we may not have radio stations in the correct locations ready for robots to do the things they have to do. They will need to know where they are, we will need to know where they are, and when they get into trouble because of their limited abilities we will need a capability for human intervention at a distance. The forecasts for autonomous robot production from Japan suggest the real possibility of machine-to-machine telecommunication traffic densities rivalling those of the human race by 2020!

8 VISUALISATION
Today we are living in an increasingly complex, and non-linear, world. Unless we can develop new techniques to help us visualise future systems, so that we can conceptualise and understand/formulate solutions, our technological development may become stunted. At present one of our key problems is the distillation of core information from the vast amounts of data generated by computers, networks and systems. Virtual Reality technology affords a novel opportunity to create new perspectives. Two examples of this approach are shown in Figs 8 and 9. In Fig 8 the migration of a lightning storm across East Anglia and the resulting network failure reports are presented on two animated 3D plates; the static figure does not do justice to the animation, which will be demonstrated at the conference. In Fig 9 a portion of the BT network is shown in 3D form. Here the user can "fly the network" and gain new perspectives from inside and outside switching and transmission links. This will also be demonstrated at the conference.

9 FUTURE GOODIES - A FEW SAMPLES

  • In 1995 a wrist-watch pager will be available, and around 2000 a wrist-watch communicator (with video?) will be realised. In the former case prototypes already exist, while in the latter the biggest challenge is the battery technology!
  • Using a very constrained subset of voice communication with a machine, we can get satisfactory I/O today. Natural human voice communication (and full cognition) with a machine is probably another 15, perhaps 20, years away, but by constraining the set that is being discussed, a workable level is possible now!
  • If the trajectory of electronics development continues we can expect to see machines that will be 1000 times more capable than present-day machines within 10 years (see the short arithmetic check after this list). In the new millennium we can thus expect computing power 1,000,000 times greater and storage capacities that are far higher than we can even contemplate today.
  • A simple question: when will a super computer equal a human from the standpoint of storage and sheer computational capability? A projection along our established trajectory suggests the answer to be the year 2010! At that time a super computer equals the human. By the year 2030 it will arrive as your friendly personal computer, requiring very much more than a 64 kbit/s ISDN port, yet no human intervention to establish communication with others of its kind!
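
The growth figures quoted in the last two items amount to capability roughly doubling every year, as the following one-line check shows (illustrative arithmetic only).

    # "1000x in 10 years" implies capability roughly doubling every year.
    print(2 ** 10)    # 1024     -> ~1000x after 10 years
    print(2 ** 20)    # 1048576  -> ~1,000,000x after 20 years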

10 CONSUMER TECHNOLOGY
Consumer/domestic technology can be expected to have a greater impact on the office and IT than previously experienced through the increased expectations of the younger audience who are the workers of tomorrow. A few likely examples are :-

  • Virtual reality is poised to reach the home in 1993 in the form of games machines. Already you can link two Game Boy terminals via a cable and play each other across the room. A $100 modem from Radio Shack allows players to communicate across town. (My kids have had strict instructions about this - we don't get free local calls in the UK!)
  • Suppose it is Cup Final day and the scene is Wembley Stadium. 100,000 people are in the stadium - minus one: you! In your seat is a virtual presence terminal with a fish-eye lens giving full 360° vision back into your home VR terminal via a satellite broadcast link. Swipe the terminal with your Amex card and you can sit at home in comfort and watch the game as if you were really there. You look to the left and right, front and back, and up and down: you have all-round vision giving you the perception of being at Wembley Stadium amongst the people around you.
  • Already we see miniature cameras being strapped onto cars, cricket stumps and quarterback helmets. The different view you get when watching sporting events this way is particularly interesting: it completely changes your perception.

11 FINAL THOUGHTS

In some respects we are currently suffering from technological indigestion, and it is likely to worsen as we approach the new millennium. We now have so much "ripe technology" (and even more is coming) that we are spoilt for choice. What is required is a new approach to telecommunications, computing and IT that breaks the bonds with the "copper" past and reaches out for new and novel solutions/applications.

12 FURTHER READING

  1. Kruger, M W: Artificial Reality II, Addison Wesley, 91.
  2. Cochrane, P et al: CAMNET, Interlink 2000, Aug 92.
  3. Ishii, H & Kobayashi, M: ClearBoard, CHI-92 Conference May 92.
  4. Cochrane, P & Heatley, D J: Optical Fibre Systems & Networks, ibid [2] Feb 92.
  5. Mason, C: "Software problem cripples AT&T long-distance network", Telephony, vol 218/4, 22 Jan 90, p 10.
  6. Neumann, Peter G: "Some reflections on a telephone switching problem", CACM vol 33 no 7, Jul 90, p 154.
  7. "SS7 errors torpedo networks in DC, LA", Telephony, vol 221/1, 1 Jul 91.
  8. "DSC admits software bug led to outages", ibid [7], 221/3, 15 Jul 91, pp 8-9.
  9. Davenport, P: "Scarborough returns to electronic dark ages", The Times, issue 63864, 15 Nov 90.
  10. "50 killed in Kirghizia earthquake", The Times, 21 Aug 92, p 6.
  11. Winter, C S & Martin J L F V: "Cybernetics in Biology - When will evolving software be possible?", Internal BT report not yet published.
  12. Brown, G.N., et al., "TENDRA - A simulation tool for the analysis of transport networks employing distributed restoration algorithms", IEE Colloquium on "Resilience in Optical Networks", London, 29th Oct'92.
  13. Whalley, S et al: Virtual worlds as an engineering tool, Internal BT Report.
  14. Foster, J & Cochrane, P: Assessing the scale of telephone network outages, IEE Electronics Letters - to be published.

FIGURE CAPTIONS
Fig 1. The Analogue-Digital Cycle
Fig 2. Economics of Analogue & Digital Telecommunications
Fig 3. Software as an Abstraction
Fig 4. Hardware-Software Reliability
Fig 5. Outage Disaster Scale for Networks
Fig 6. ANTs Network Example Result
Fig 7. Evolutionary Software
Fig 8. Lightning Strikes in East Anglia
Fig 9. Flying the Network - VR as an Engineering Tool