
From Copper to Glass
(The right idea, decisions, and investments at the right time...)

Perspective

FDM
The most visible manifestation of this process is the stacking of radio and TV transmissions in frequency space. Each channel has a well-specified, controlled and agreed operating frequency. The same principle is used in the telephone network for individual speech channels: on the biggest commercial systems, over 10,000 people converse over each of several coaxial tubes in this manner.
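A minimal sketch of the idea in Python (the tones, carrier spacings and simple amplitude modulation are illustrative only, not a real channel plan): each baseband channel is shifted onto its own carrier and the results summed onto one bearer.

    import numpy as np

    fs = 100_000                      # sample rate, Hz
    t = np.arange(0, 0.01, 1 / fs)   # 10 ms of signal

    # Three "speech" channels, here just tones below 4 kHz
    channels = [np.sin(2 * np.pi * f * t) for f in (500.0, 1200.0, 3000.0)]

    # Stack each channel onto its own carrier, spaced 4 kHz apart
    carriers_hz = (10_000.0, 14_000.0, 18_000.0)
    fdm = sum(ch * np.cos(2 * np.pi * fc * t)
              for ch, fc in zip(channels, carriers_hz))

    # Each channel now occupies its own slice of frequency space; a
    # receiver recovers one by mixing it back down and low-pass filtering.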

TDM
By sampling speech (or any other analogue signal) and quantising it into discrete levels, it may be coded into a series of numbers that can be transmitted as a bit stream. By interleaving many such streams it is possible to transport thousands of individual channels over copper, radio or optical fibre bearers. The largest commercial systems today transport over 30,000 simultaneous conversations.
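A minimal sketch of the interleaving (sample values and channel count are illustrative only):

    # Multiplex: take one PCM sample from each channel per frame,
    # round robin, then recover each channel at the far end.
    channels = [
        [10, 11, 12, 13],   # 8-bit PCM samples, channel 0
        [20, 21, 22, 23],   # channel 1
        [30, 31, 32, 33],   # channel 2
    ]

    stream = [s for frame in zip(*channels) for s in frame]
    # -> [10, 20, 30, 11, 21, 31, 12, 22, 32, 13, 23, 33]

    # Demultiplex: every Nth sample belongs to the same channel
    n = len(channels)
    recovered = [stream[i::n] for i in range(n)]
    assert recovered == channels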

DIGITAL v ANALOGUE
The decision to go all digital in the telecommunications network was taken on purely economic grounds. In the 1970s it was estimated that the provision of a global network would realise a cost advantage of about 50% with digital for a given end-to-end speech quality. This has been entirely validated, and indeed exceeded, by experience. It is also worth remembering the prime advantages of digital: a controlled signal fidelity almost irrespective of distance transmitted; easier to switch; easier to manufacture; and, in principle, easier to interface.

The UK telecommunications network is one of the most modern in the world, with 100% availability of ISDN, digital transmission and switching of all long-lines circuits, and direct customer dialling to over 129 countries. Today the UK enjoys access to more wide-band circuits and services than any other European country, and more per worker employed than the rest of the world. All of this was predicated on the very early decision to replace the twisted copper pair and coaxial cables, and the vast majority of the microwave radio network, with optical fibre. Today there is an installed base of over 4M km of single-mode fibre linking all UK network nodes with a medium that offers an almost infinitely expandable bit transport capacity. This fibre now carries over 95% of all UK telecommunications traffic and 75% of international connections. Achieving this advance was the result of timely R&D programmes coupled with strategic decisions on infrastructure and investment matched to exponentially growing demand for services.

To date optical fibre technology has saved the UK £Bns in providing low-cost telecommunication networks. This will soon be further enhanced through the introduction of optical signal processing, which will reduce the dependency on electronics and realise a transparent network. In turn this will see a partial return to analogue forms of transmission and switching, thereby bringing about further reductions in the number of network nodes, equipment, maintenance and operating costs, whilst increasing network flexibility and utility.

These developments, along with the creation of a highly competitive open market, have placed greater emphasis on the study of service delivery and the facilities that can be provided now and in the far future. They have also drawn attention to such major issues as software reliability and new forms of realisation to meet the needs of mobile and multi-media applications. The outdated protocols created for copper lines and old computer network architectures now represent major performance limitations. The lack of bandwidth in the copper-dominated local loop is also a limitation in the realisation of the so-called SuperHighway. The solutions to most of this are to hand; they only require the right political and engineering decisions.


Figure 1. Logistic curve of transmission system development.

Genesis
The history of telecommunications has seen a remarkable progression of technology meeting mankind's needs, with an accelerating ability to transport information at lower power and material costs year on year. As each technology has saturated at some fundamental capacity limit, it has been overtaken by some timely and often spectacular (and usually unexpected) advance. This is illustrated in Fig 1 for copper line systems. Here we see the limits of open copper conductors overtaken by loaded twisted-pair cables, then surpassed by electronic amplification, Frequency Division Multiplex (FDM) and Time Division Multiplex (TDM) on twisted-pair and coaxial cables. By 1960 it was becoming apparent that these cable-based technologies would present fundamental limits to further progress and growth within a decade or two. The fundamental capacity, in terms of the number of simultaneous telephone calls that could be carried, coupled with the basic cost in raw materials, physical space, power and investment, was set to create barriers to further progress. With hindsight this situation can now be seen to have posed a significant threat to the progress of the global economy.

One solution to the transmission capacity limit of copper cables and microwave radio that was avidly pursued in the 1970s was the trunk waveguide. A 50 mm diameter helically wound wire inside a resin-based pipe was shown to support a transmission bandwidth of some 60 GHz with relatively low loss. Repeater (amplifier) sites at 15 km intervals were demonstrated to be possible, provided the radius of curvature along the route was kept above 100 m or so. This represented a significant advance over the 2 km repeater spacing required by copper cables, and the economics looked sufficiently attractive for waveguide networks to be planned in the UK, France and the USA. However, the technology demanded costly civil engineering, with laser alignment of ducts and 'straight line' deployment for optimum performance. Nevertheless, development and planning progressed to an advanced stage, production plants for the waveguide were constructed, and the technology was at the point of being launched on a national scale as a backbone trunk system, when it was eclipsed in the mid-to-late 1970s by the arrival of low-loss optical fibre.

Interestingly, optical fibre is also a waveguide, but of a much smaller diameter (~125 µm), able to support the higher carrier frequency (~10^14 Hz) of light. Among its key features are: very small size, very low material usage, low production costs, an ability to use existing cable ducts, very low loss, and wide bandwidth (high traffic capacity). In the early days of fibre it was not evident just how dramatic these advances would become, or to what extent they would impact on the national and global economy.

Key Decisions

Repeaters, Amplifiers & Regenerators
As signals are transmitted over any medium they become weaker with distance travelled. So they have to be periodically restored in some way. In the early days of the electric wire telegraph this process was completed by a human being who 'Repeated' or 'Relayed' the signal. Hence the term 'Repeater' for what can now actually be an amplifier or regenerator, and 'Relay' for what is now an electromagnetic switch.

Strictly speaking, analogue signals are amplified, whilst their digital counterparts are regenerated. In the former case the process is ideally linear and distortion free. But in practice thermal noise and non-linear distortion are always added by every amplifier. Signal degradation is therefore strongly linked to the number of cascaded amplifiers (repeaters), and hence to the distance spanned. For twisted copper pairs and coaxial cable the distance between repeaters is typically 2 km, and thousands have to be cascaded to span continents and the globe. It is therefore extremely expensive to maintain signal quality over great distances.
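A rough sketch of why (the figures are illustrative, not real system parameters): each amplifier's gain restores the signal, but the noise added by every stage is carried forward, so noise powers accumulate down the chain.

    import math

    signal_mw = 1.0       # launched signal power, mW
    noise_mw = 1e-6       # noise each repeater adds, referred to its input

    for repeaters in (1, 10, 100, 1000):
        total_noise = repeaters * noise_mw      # noise powers add stage by stage
        snr_db = 10 * math.log10(signal_mw / total_noise)
        print(f"{repeaters:5d} repeaters: SNR = {snr_db:5.1f} dB")

In this simple model every tenfold increase in the number of cascaded amplifiers costs about 10 dB of signal-to-noise ratio.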

Digital signals offer the advantage of having a finite, and small, number of states. Provided the signal is not allowed to become too weak, it can simply be regenerated by a decision circuit. In practice, of course, the signal is also amplified, shaped and retimed as part of the regeneration process. Hence, with good design, the accumulation of signal distortion is broken and fidelity maintained irrespective of distance travelled.
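A toy decision circuit makes the point (bits and noise level are illustrative):

    import random
    random.seed(1)

    bits = [1, 0, 1, 1, 0, 0, 1, 0]
    received = [b + random.gauss(0, 0.1) for b in bits]   # noise from one span

    # Decide 1 or 0 against a threshold: an exact copy of the original,
    # so distortion does not accumulate from span to span.
    regenerated = [1 if level > 0.5 else 0 for level in received]
    assert regenerated == bits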

Interestingly, digital systems operating over copper cable also have repeaters every 2 km. Their fibre counterparts, however, routinely exceed 30 km and may even exceed 200 km. So the amount of equipment employed is drastically reduced.
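A back-of-envelope comparison using the spacings quoted above, for a nominal 7,000 km route:

    route_km = 7_000
    for medium, spacing_km in (("copper/coax", 2), ("optical fibre", 30)):
        print(f"{medium:14s}: about {route_km // spacing_km:,} repeaters")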

The last two decades have seen a number of key decisions responsible for the radical changes in national and global telecommunications that we all benefit from today. In chronological order, these were:

Figure 2. System cost and maintenance of optical fibre systems compared with copper cable.

Key Decision 1
During 1972-77 the British Post Office (BPO) decided to opt for a wholly digital network based on optical fibre technology. Major R&D programmes were launched in both switching and system realisation. Each month saw fundamental breakthroughs in fibre and device fabrication, performance and application. Fibre loss fell rapidly while the demonstrable capacity increased exponentially. It soon became apparent that buried repeaters would not be required, which in turn brought about significant reductions in system cost and maintenance compared with copper cables (Fig 2).

Key Decision 2
By 1980 a number of trials had been completed with industry using two different fibre types: multi-mode and single-mode. The former had a larger central core (~50 µm) and was relatively simple to handle and joint, whilst the latter had a very small core (~8 µm) and proved more difficult. However, the bandwidth of multi-mode fibre was severely limited by the multi-path propagation of light pulses, whereas single-mode fibre suffered negligible dispersion and offered an almost infinite bandwidth. By 1981 the Japanese had decided to opt for multi-mode, quickly followed by much of North America and most of the rest of the world. However, the BPO decided to go single-mode for all applications. This had two key consequences: all R&D became focused on solving the problems of one technology rather than two, and UK industry only had to produce single-mode fibre and devices for the bulk-buying sector of the time - telecommunications. As a result the UK was able to realise state-of-the-art systems at a lower cost, and in an earlier time frame, than any other nation. Most of the North American players reversed their decision within 12 months, whilst the Japanese continued to pursue the multi-mode route for a further three years. Ultimately they too opted for single-mode, but in most countries the illusion that multi-mode fibre was easier to install, and its devices cheaper for the local loop, persisted for many years.

Key Decision 3
In 1981 the BPO opted to modernise all aspects of its long- and medium-haul networks by the ubiquitous installation of single-mode fibre optimised for operation in the 1.3 µm window. Problems associated with fibre jointing, splicing, connectors and devices then had to be solved to meet the needs of the network plan.

Key Decision 4
Since the geographic distribution of all BPO buildings (offices, radio stations, switching and repeater sites) at the time was such that interconnection distances were <35 km, systems were specified for a single-span transmission distance that met or exceeded this figure, thus obviating the need for buried repeaters and associated power-feed equipment. By the mid-1980s this decision was further endorsed by the demonstration of research systems with a 100 km single span and bit rates >=1 Gbit/s.

Key Decision 5
The first production systems operated at 140 Mbit/s between the major trunk nodes (switching centres). This immediately realised major operational improvements in terms of lower maintenance cost, fewer outages and better tolerance to external interference. At the same time R&D system trials were realising >4 times this bit rate over the same spans. By 1985 the decision had been taken to upgrade all major systems to 565 Mbit/s, which in turn reduced the cost per bit transported by >1/3.

Key Decision 6
While fibre technology was being deployed, a series of developments led to the use of fusion splicing, as opposed to mechanical and/or epoxy based joints. This is still the case today. One of the many benefits is that the BT network is intrinsically capable of future upgrading to Duplex (bi-directional) operation at the same wavelength (with or without optical amplification) by virtue of the low loss/reflection of joints. This is not always the case in other countries where the use of non-fusion splices has resulted in high reflections and other problematic features.

Key Decision 7
The decision in the late 1980's to extend fibre systems to every switch site in the BT network has radically improved the reliability and performance of the network in all respects.

Key Decision 8
The next logical step would have been to extend fibre into the local loop, but like the Gulf War, the advance of fibre stopped at the border. At this point the dead hand of politics struck, momentarily stemming the advance in the guise of a new regulatory regime. By restricting the services that could be delivered, fibre in the local loop was rendered uneconomic for all but the medium and large corporate customers. Direct fibre deployment to these customers is prevalent, and R&D for local loop systems has solved all the installation and maintenance problems. For example, the development of Passive Optical Networks and their demonstration at Bishop's Stortford and Birmingham, and more recently in the Ipswich and Colchester VoD trials, validated the base technologies.


Figure 3. Utilisation of available fibre bandwidth by installed and planned systems.

Key Decision 9
A four-fold increase in the base bit rate to 2.4 Gbit/s is now planned with the same repeater spacing as for the previous 140 Mbit/s and now dominant 565 Mbit/s systems. This should see a further >1/3 cost reduction per bit carried, and will be coupled with a move to synchronous working, a streamlining of the software loading, and a reduction in the number of network nodes (switches).

It is worth noting that, even if today's state-of-the-art 10 Gbit/s systems were to be deployed, the utilisation of fibre bandwidth is unlikely to reach even 1% for some years (Fig 3). Indeed, it is reasonable to visualise the enormous capacity of fibre satisfying telecommunication needs well into the next millennium.
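The arithmetic behind that claim is stark. Taking the ~60,000 GHz of fibre window quoted later in this article, and assuming roughly 1 bit/s per Hz (an illustrative figure):

    window_hz = 60_000e9    # ~60,000 GHz of usable fibre window
    for bitrate_bps in (565e6, 2.4e9, 10e9):
        used = 100 * bitrate_bps / window_hz    # assumes ~1 bit/s per Hz
        print(f"{bitrate_bps / 1e9:5.2f} Gbit/s -> {used:.5f}% of the window")

Even a 10 Gbit/s system occupies well under 0.02% of the available window on this reckoning.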

Key Decision 'N' (Pending)
For any company moving into the local loop the major cost is not cable or equipment but civil engineering. Digging up the street and laying ducts is the dominant cost by at least an order of magnitude. In BT's case it did all the civil engineering decades ago for copper. The hole in the ground exists, but it is full of copper that ultimately has to be removed and replaced with glass. So why not do it right now? The answer is solely down to economics! With the ability to deliver services constrained by regulation, it turns out to be unviable. Freedom of action for BT will not come until the new millennium. At that time local loop fibre deployment may prove economic. Certainly the replacement of the old copper network will have to be with fibre.

Interestingly, the cable operators and other competitors in the UK are laying copper pairs and coaxial cable to their customers, with optical fibre stopping at some distant node. The primary reason for this is the underdeveloped technology imported from the USA. The fruits of this legacy, which amounts to £Bns of investment, will no doubt haunt UK networks for decades to come, manifest in limited bit rates and high operating costs compared with fibre.

What Next ?

Optical Amplifiers
Electronic amplifiers take in the electrons constituting a signal and create even more - this is the process of amplification - but we do not usually think of it this way. Amplification means 1 mW in and 1 W out - more electrons - more current. The additional electrons are provided from a power supply and are steered by transistors or some other device to the output terminals in sympathy with an input signal.

The same is true of light - only light is a series of discrete photons that can also be amplified. A laser is an amplifier - and it is also an acronym that has become a word - Light Amplification by Stimulated Emission of Radiation. A few photons propagate down a gas-filled tube, crystal or optical fibre and even more are created - amplification. In this case the additional photons can be provided externally from some other light source, or be converted from electrons.

The first successful optical amplifiers were modified laser structures. By removing the mirrors and attaching a fibre-in, fibre-out assembly at the end facets, a gain block can be realised - an optical amplifier. Later, it was found that by exciting a fibre with a high-power optical pump beam, the fibre itself could amplify. By doping the fibre with different rare earth elements (Erbium, for example) the efficiency of the process can be greatly improved.

The above historical perspective highlights the giant leap in technological capability compared with the copper past; however, the story is far from finished. As the R&D on optical fibre systems continues we might expect to see even more dramatic advances over the next two decades. In order of probable importance, these are:

Optical Amplifiers
Arguably the most significant of recent developments has been the optical amplifier. When used as cable repeaters, not only do they reduce the component count in the transmission path, which has clear benefits to reliability and running costs, but they also open up the full bandwidth of the fibre windows (~60,000 GHz with today's technology, which compares with ~100 MHz for a good copper cable). They also allow the development of novel transmission and network concepts that take full advantage of the properties of fibre.

The first optical amplifiers successfully developed were based on semiconductor lasers. Their structure gives rise to certain fundamental features which limit their overall performance and have detracted from their widespread use. Nevertheless, through their small size and GaAs material base, they are now receiving attention with regard to their integration with other GaAs devices and waveguides to form opto-electronic ICs.

The majority of today's developmental work is devoted to Erbium Doped Fibre Amplifiers (EDFA) since they can readily realise all the features crucial to high performance transmission systems. Furthermore, their principal parameters are defined by atomic composition rather than physical geometry as with the semiconductor variety, which means that manufacturing consistency and yield is high. Their realisation is simple, involving the direct doping of a silica fibre core to provide a medium which affords gain when optically pumped at an appropriate wavelength.
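A minimal power-budget sketch of an EDFA chain (all figures are illustrative; the -58 dBm term is the quantum noise reference for a 0.1 nm bandwidth at 1550 nm): the gain cancels each span's loss while amplified spontaneous emission (ASE) noise builds up amplifier by amplifier.

    import math

    spans = 80            # e.g. a long-haul route at ~90 km per span
    span_loss_db = 20.0   # fibre loss per span
    gain_db = 20.0        # EDFA gain set to cancel the span loss
    nf_db = 5.0           # amplifier noise figure
    launch_dbm = 0.0      # 1 mW launched; restored after every span

    # ASE power per amplifier in a 0.1 nm reference band (rough figure)
    ase_per_amp_mw = 10 ** ((nf_db + gain_db - 58.0) / 10)
    total_ase_dbm = 10 * math.log10(spans * ase_per_amp_mw)
    osnr_db = launch_dbm - total_ase_dbm
    print(f"OSNR after {spans} amplifiers: {osnr_db:.1f} dB")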

The Black Box
It is important to recognise that we have become conditioned to viewing repeaters and amplifiers as discrete entities: the "black box". With the advent of fibre amplifiers has come the prospect of distributed amplification, where the cabled fibre itself provides gain. Under these circumstances transmission is truly lossless. Furthermore, theory predicts that distributed amplification is optimal regarding noise accumulation: an echo of our earlier analogue FDM heritage.

International Systems
Fibre amplifiers in either discrete or distributed form will be central to future transmission routes and networks. For example, the recently installed TAT-12 and TAT-13 transatlantic cable systems each utilise discrete EDFAs throughout their ~7,000 km length. They provide 5 Gbit/s capacity on each of 4 fibres per cable between the UK, France and the USA in a self-healing ring. With a speech circuit capacity of 300,000 simultaneous conversations at an installation cost of only $750M, these cables provide >10x the capacity of previous systems at the same cost. Systems have been, or will soon be, installed to span all the oceans and seas of the planet, and their combined capacity already eclipses that of all the satellites currently in orbit.
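From the figures quoted above, the installed cost per speech circuit works out at:

    cost_usd = 750e6       # installation cost of the cables
    circuits = 300_000     # simultaneous conversations carried
    print(f"${cost_usd / circuits:,.0f} per circuit")   # -> $2,500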


Figure 4. Bit rate/distance achievements in laboratory, field trial and installed systems.

Optical Repeaters
Naturally enough, the first optical repeaters followed the established pattern of their electronic forebears. In fact, the first systems merely had photodetectors and lasers/LEDs glued onto the end of existing system designs. But the optimisation of fibres and devices soon delivered a far more efficient set of options. So the basic question is whether to regenerate or merely amplify the optical signal. The argument continues, and is quite complex due to the non-linear fibre physics involved. To date the vast majority of optical systems employ regenerators, and fibre amplifiers have only just arrived on the commercial scene. But they offer the prospect of new signal and network forms by removing all electronics from the signal path - end-to-end photons open up a new paradigm.

Solitons
A new generation of transmission systems that exploit, rather than fight against, the non-linear properties of fibre is at an advanced R&D phase. The key to these systems is the soliton: a particular pulse shape which, when correctly launched into the fibre, maintains its shape as it propagates, the linear distortion (pulse spreading through dispersion) being exactly cancelled by the non-linear distortion (pulse narrowing). After a certain distance solitons do eventually succumb to attenuation and dispersion and so break down. However, by placing optical amplifiers at critical intervals along the cable, or better still by using distributed fibre amplification, pulse breakdown is avoided. Fig 4 summarises some of the bit rate/distance achievements in laboratory, field trials and installed systems.
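In normalised units the pulse envelope u(z, t) obeys the non-linear Schrödinger equation, and the fundamental soliton is the sech-shaped pulse that propagates unchanged:

    $$ i\,\frac{\partial u}{\partial z} + \frac{1}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u = 0, \qquad u(z,t) = \operatorname{sech}(t)\, e^{iz/2} $$

The second term is the (anomalous) dispersion that spreads the pulse; the third is the Kerr non-linearity that narrows it; for the sech pulse the two balance exactly.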

Figure 5. The Optical Ether.

Optical Ether
If we consider exploiting the bandwidth of fibre in a similar manner to the radio spectrum in mobile communications, we rapidly migrate towards an Optical Ether, as shown in Fig 5. To address an end-point terminal, all that is required is the area wavelength (not area code) and the final (fine-grain) wavelength for WDMA, frequency for FDMA, or code in the case of CDMA. This approach could be extended to replace the central office for <200,000 lines with a 100% non-blocking capability, with potential for improvement to 4,000,000 lines. At today's service levels, up to 20 million lines could be accommodated, with potential for significantly more. Beyond this, conventional switching of some kind becomes necessary.
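A toy sketch of the two-level addressing idea (all wavelength names and terminal labels are hypothetical):

    # A coarse "area" wavelength selects a region; a fine-grain wavelength
    # (or frequency/code for FDMA/CDMA) selects the terminal within it.
    ether = {
        "area_lambda_1": {"fine_lambda_1": "terminal A",
                          "fine_lambda_2": "terminal B"},
        "area_lambda_2": {"fine_lambda_1": "terminal C"},
    }

    def route(area: str, fine: str) -> str:
        # Purely passive selection: no switch in the path
        return ether[area][fine]

    print(route("area_lambda_1", "fine_lambda_2"))   # -> terminal B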

The ether concept has been demonstrated in the laboratory with a small number of terminals switching 100 Gbit/s capacities, and in the London network with a more modest capacity. The key problems and limitations now hinge on the realisation of sufficiently miniature and reliable electro-optics, memory capacity, control systems and wavelength standards. The rate of developmental progress towards these requirements currently puts commercial realisation in the 2010 time frame.

Another interesting prospect for transparent networks is that of customer mobility. It is possible to transport raw microwave signals over fibre with only a single stage of frequency translation at each end. A radio cell can therefore be replicated over several remote and diverse locations to give the illusion of a single location. Dispersed locations then start to appear as one to the individual user. The certainty of this prospect increases with the realisation that optical wireless is also being developed for in-building and street use for both telephony and broadband services.

WDM is exactly the same as FDM, only the frequencies are much higher. It turns out to be more convenient to define the optical wavelength rather than the frequency, not least because the two techniques are sometimes overlaid. However, one cannot help but suspect that, had electronic engineers been at the front end of the R&D instead of physicists, we would be using FDM for both.

TDMA, FDMA, WDMA, and CDMA are techniques for multiple reuse of Time, Frequency, Wavelength and Code so that more simultaneous use can be made of the medium. TDMA is in common use in digital mobile telephone networks and is often referred to as a Ping-Pong system.


Figure 6. An optical "tapped bus" realised with D-fibre couplers in a network of weakly amplifying fibre.

Optical Bus
The concept of distributed fibre amplification can be applied to the data bus. For example, the optical equivalent of a "tapped bus" has been demonstrated using D-fibre couplers in a network of weakly amplifying fibre (Fig 6). Although noise accumulation eventually dominates, it is nevertheless feasible that such a bus could grow to accommodate >1000 taps. This, coupled with the ability to use fibre amplifiers to offset splitter loss in a branching network that fans out to several million customers, opens up the concept of Virtually Lossless Networks.
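A sketch of the signal and noise budget along such a bus (tap loss, gain and noise figures are illustrative):

    import math

    taps = 1_000
    tap_loss = 0.05               # 5% of power dropped at each tap
    gain = 1 / (1 - tap_loss)     # distributed gain offsets the tap loss
    ase_per_section = 1e-5        # noise added per amplified section

    power, noise = 1.0, 0.0
    for _ in range(taps):
        power = power * (1 - tap_loss) * gain   # held at launch level
        noise = noise * (1 - tap_loss) * gain + ase_per_section

    snr_db = 10 * math.log10(power / noise)
    print(f"signal {power:.2f}, noise {noise:.3f}, SNR {snr_db:.0f} dB")

The signal is held level while the noise grows only linearly with the number of taps, which is why >1000 taps remains feasible.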


Figure 7. Cyclic changes between analogue and digital transmission.

Analogue -v- Digital
The history of telecommunications has undergone cyclic changes with regard to the use of analogue and digital transmission (Fig 7). The earliest form of digital transmission, telegraphy using Morse code, was replaced by analogue transmission over a century ago with the arrival of the telephone. Analogue remained dominant, in FDM form, up until the late 1950s, at which point digital PCM was introduced. Within three decades the balance had reversed, with digital TDM substantially dominating over analogue. Today there is a hint of a return to analogue, with WDM affording a convenient way to increase fibre utilisation without recourse to high-speed electronics.

Optical Processing
The use of optical technology in storage and signal processing is commonplace in certain application areas: witness optical disks for PCs and CD players. The impact of optical processing on telecommunications has yet to be experienced, but is likely to be as profound. For example, many research-level neural networks are being realised with optical components as they offer massive parallelism and interconnection. There are many other incentives to migrate towards optical processing; the principal advantages are the potential for high speed with massive connectivity and low power consumption. For example, a massive 10^12 parallelism may be realised in an optical processor using holographic elements.

Fibres can also afford a degree of storage and optical processing by virtue of "structures" that can be written into their cores as variations in the refractive index. For example, a grating structure gives the fibre a wavelength-dependent attenuation which, in its simplest form, can realise an optical filter or, in a more complex form, an equaliser to flatten the frequency response of an optical amplifier.
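For a grating of pitch Λ written into a core of effective index n_eff, reflection is strongest at the Bragg wavelength:

    $$ \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda $$

So, for example, n_eff ≈ 1.45 and Λ ≈ 0.53 µm place the reflection peak near the 1.55 µm window.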


Outbox: Repeater Technologies.

And Almost Finally
In less than 20 years optical fibre has revolutionised the global telecommunications network. For BT this has been realised through timely R&D programmes coupled with the planning and deployment of systems developed and manufactured in the UK. In all respects BT and UK industry have enjoyed a lead that has seen the UK profit through the realisation of a network and services that are advanced by any standard. During the next 20 years optical technology can be expected to impact even more prominently by realising new networks, services and applications.

The realisation of network capacities orders of magnitude beyond those of today will be essential to service the growth in both fixed and mobile telecommunications. Specifically:

  1. Computer capacity will increase by 1000x in the next decade, and 1,000,000x in the decade that follows. At the same time clock rates will increase by >10x.
  2. Distributed computing is only effective with wide bandwidth channels between computers that are capable of data transfer rates on a par with the internal delays of the computers involved.
  3. The number of mobile units is expected to increase by 20x over that of today in the same time frame.
  4. The teleportation of human senses (voice, vision and touch) is already well developed and will be required to meet the needs of all sectors as they try to maximise the utilisation of highly skilled and qualified staff.
  5. Substituting telecommunications for physical travel will become a necessity as natural resources diminish.
  6. Controlling and servicing the global economy will demand improved telecommunications and computing.
  7. The introduction of Virtual Worlds technology is probably just the tip of an iceberg of new services that we could broadly classify as "not invented or thought of yet".

The network decisions and developments necessary to ensure that the infrastructure is available to service this latent and developing demand include:

  1. A consolidation and ultimate eradication of all switching nodes.
  2. The deployment of Optical Networks in the local loop.
  3. The deployment of optical amplifiers to realise network transparency.
  4. The deployment of new forms of software to reduce the massive risk build up due to unnecessary complexity.
  5. The introduction of WDM as a switching mechanism to replace TDM.
  6. The realisation of new fibre based devices that will simplify the coding, selection and switching processes.
  7. A migration away from digital back towards analogue signal formats, modulation and signalling.
  8. The introduction of pre-emptive fault/failure forecasting and diagnosis.

All of the technologies needed to realise such a revolutionary change in the next 20 years have been, or are currently being, researched. Among the major challenges will be their planning and phased introduction to meet the increasing demand for more bandwidth and new services. Probably the major obstacle will be the legacy systems of copper, radio, old optical fibre and human mind sets.

The recent trend of consumer electronics leading the high technology industries in the development of applications, both at the forefront of the technology capability and in a very short time frame, poses a significant challenge. Unless the telecommunications industry realises networks that are far more versatile, progress will either stall or change direction. In this respect the UK position looks more favourable than most.


Outbox: Sub-system Reliability.

The Ultimate Challenge
The much-publicised super highway is still a dream. No one knows how to design or build it. Combining all the facilities of the telephone, broadcast, cable and computer networks in one grand entity looks very ambitious to say the least. One thing is for sure: ATM is not the answer. Such networks operate very differently from any other and as a result exhibit new and interesting phenomena such as information waves.

Harmony & Chaos
Just 20 years ago all telephones were on the end of a wire and static, with users making an average of 2 or 3 telephone calls per day at unrelated times. True, there were busy hours, and meal times and tea breaks would see a distinct lack of calls, but by and large calls were governed by random events. This all changed with the arrival of the TV phone-in programme. Someone singing a song on TV could result in half a million people telephoning London to cast votes for their local hero in the space of 15 minutes. A new world of network chaos was born.

With the arrival of the mobile telephone a new phase of chaos erupted. Traffic jams, train and plane cancellations all trigger correlated activity - everyone calls home or office within a few minutes. Naturally enough, cellular systems become overloaded as thousands of people demand to be connected at the same time. So a transition has occurred, from a random world of reasonably distributed events, to a highly localised and correlated world of activity triggered by anything causing us to act in unison.

All of this might seem trivial and easy to repair, but consider the prospect of networked computing. When 5 or 10 of us meet, our low-cost NCs will be plugged into the same line or server. At critical times during our discussion, we will all wish to access information or download to distant colleagues. This will be correlated activity with a vengeance, and on a scale that is difficult to contemplate.

Probably the most famous example of correlated activity between machines was the computerisation of the London Stock Market and the Big Bang. Here machines programmed with similar buy and sell algorithms had no delay built in. Shortly after cutting over from human operators to machines the market went into a synchrony of buy, sell, buy, sell. This is an existence theorem for uncontrolled chaos - it happens.

Many people equate chaos to randomness, but they are very different. Chaotic systems exhibit patterns that can be near cyclic, and often difficult for us to perceive. Random systems on the other hand are totally unpredictable. Curiously without computers we would know little or nothing about chaos, and yet they may turn out to be the ultimate generators of network chaos on a scale we might not be able to match.

Information Waves
Anyone who drives on motorways will have experienced traffic waves created by some unseen event. Probably the best place to experience this phenomenon is the M25 when, for no apparent reason, the traffic speed can oscillate between 10 and 70 mph for long periods. Sometimes the traffic comes to a complete halt and then lurches forward to 40 mph and back down to 0. This is the classic behaviour of a system of independent entities in a serial queue having a delay between observation and action. In this case the observation might be an accident, a breakdown, or someone driving foolishly. The delay is between our eye, brain and foot. As soon as we see something and reach for the brake pedal, very shortly afterwards so does everyone else, and the wave starts.

There is no doubt about it, rubber-necking when driving is dangerous, but people do it. An accident occurs and people slow down to take a look, and then they speed up. Strangely, when the incident has been cleared away, the wave that has been set in motion may last the rest of the day. Whilst the traffic is dense, the wave motion persists long after the event has subsided. The system has an unseen memory - us. Might we then expect similar phenomena in electronic systems for communication between people and machines? This is a racing certainty - the technology and the phenomenon are already with us.
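The mechanism is easy to reproduce. In the toy model below (all parameters illustrative) each element simply copies the speed the one ahead had a few steps earlier; a single brief braking event at the front then travels backwards through the queue long after the front has recovered.

    N, steps, delay = 30, 120, 5
    speeds = [[1.0] * N]                    # speeds[t][i], all at full speed

    for t in range(steps):
        past = speeds[max(0, t - delay)]    # everyone acts on stale information
        nxt = [0.0 if 10 <= t < 15 else 1.0]        # lead element brakes briefly
        nxt += [past[i - 1] for i in range(1, N)]   # follow the one in front, late
        speeds.append(nxt)

    # By t = 115 the front has long recovered, yet elements ~17 back in
    # the queue are only now coming to a halt: the wave persists.
    for t in (12, 41, 83, 115):
        print(t, ["1" if v > 0.5 else "0" for v in speeds[t]])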

Packet switching and transmission systems, so beloved of the computer industry, are ideal for the creation of information waves. To date these have largely gone unnoticed because terminal equipment assembles packets to construct a complete message, file or picture. The end user sees nothing of the chaotic action inside the network as the information packets jostle for position and queue for transmission. Only when we try to use computer networks for real-time communication do we experience arrival uncertainties. Our speech sounds strange, with varying delays in the middle of utterances, and moving pictures contain all manner of distortion.
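A toy queue shows the effect (service time and burst pattern are illustrative): voice packets arrive on a strict 20 ms clock, but random data bursts ahead of them in the queue make their delivery times wander.

    import random
    random.seed(42)

    service_ms = 2.0          # time to forward one packet
    queue_free_at = 0.0
    delays = []

    for k in range(20):
        arrival = k * 20.0                                 # voice packet every 20 ms
        burst = random.choice((0, 0, 0, 5)) * service_ms   # occasional data burst ahead
        start = max(arrival, queue_free_at) + burst
        queue_free_at = start + service_ms
        delays.append(start + service_ms - arrival)

    print(["%.0f" % d for d in delays])    # delay jitters from packet to packet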

The reality is that packet systems are fundamentally unsuited to real-time communication. So why use them? It turns out that for data communication, where arrival time is not an issue, they are highly efficient in their use of bandwidth. These systems were born in an era where bandwidth was expensive, and they represent an entirely different paradigm to the telephone network. However, the champions of 'packet everything' always like to tell you that this is the true way for information to be communicated. Curiously, they often do this by sending you a single-line e-mail message with a 35-line header.

So what of the future? We now have a world of optical fibre and infinite bandwidth. A world of shrinking geography, where distance is becoming irrelevant and where the fundamental reliability of networks, communication and computing is increasingly dependent upon the electronics used for transmission and switching. If we are to see significant advances in reliability and performance then we have to eradicate much of the electronics used today. The contest will then be between two philosophies, the circuit switching of the telephone network and the path or packet switching of the computer industry. But because an optical fibre is like the M25 with a million lanes and no bad drivers, it might just be that these two diverse approaches will coalesce with the switching of light.

Who Pays?
The demand for telecommunications bandwidth is insatiable, the market competitive and vibrant, investment levels high, and still the SuperHighway seems a long way off. People complain of network costs when a telephone call to the USA is now less than 30p a minute, whilst in 1956 it was £2.80. In contrast, the price of a new car has gone up from £450 to £11,000 in the same time frame. How strange! Is it that valuing telecommunications more than transport drives us all to expect free access, or is it the other way around? Telecommunications charges in the UK are among the lowest in the world, up to 5 times lower than in West Germany and France, and very much on a par with the USA. But in the USA local calls are free - well, sort of - they are actually cross-subsidised by long-distance traffic. The deregulation of the USA market is removing the scope to continue this subsidy. Internet, Web TV and other heavy local call usage is putting a severe strain on the networks, and we may see charging emerge as a means to control demand. In the UK things are different.

It will probably take at least two decades before we all have access to the SuperHighway - fibre to office and home. It will cost around £15Bn to install, and the money has to be created from somewhere. Freedom to carry any form of traffic is an essential requirement for any company to generate the required revenues. Until then our access will be limited - but the costs will continue to fall. When we do get fibre all the way, the cost savings to operators will translate into the single most dramatic fall in charges. Then bandwidth, distance and time will be insignificant in the charging equation. We just won't care - bits will be like the rain - almost free.

