The human mind, and particularly the human memory, works in strange ways. The
Officious Bystander (TOB) has a mind like a down-market antique
shop, crammed from the floor-boards to the ceiling with useless
bits of old junk. Like the proud owner of such a shop,
TOB has convinced himself that some of his junk is very valuable,
although others are unable to see its worth. But on any
view, most of it is totally useless - and very hard to get rid
of. In TOB’s case, 1969 was a golden year for the acquisition
of useless information. He was then in fifth grade, at
a one-teacher State primary school. The teacher seemed
quite old, although he was probably younger than TOB is now.
He was also a stern disciplinarian, and had enormous faith
in the benefits of rote learning.
TOB will not deny that some of this rote learning was distinctly
beneficial - spelling, and the “times tables”, for example.
TOB has never regretted being compelled to learn, by heart,
the capitals of Europe, from Oslo in Norway to Sofia in Bulgaria;
though it seems strange that, at a time when young Australian
men were fighting and dying in Vietnam, it was thought more
important for Australian children to know that Lisbon is the
capital of Portugal, than to know that Djakarta (as it was then
spelt) is the capital of our nearest neighbour.
Amongst the many lists of information which TOB and his class-mates
had drummed into them was “the Sunshine Route” - a list of cities
and towns along the Queensland coast from Brisbane to Cairns.
It is surprising how often it proves useful to recall
that Home Hill comes after Bowen, and before Ayr.
One category of information, learnt by rote, has proved to
have a less enduring relevance - namely, the Imperial system
of weights and measures. But in 1969, nobody could have
foreseen the adoption of the metric system by Gough Whitlam’s
francophile government. Had Whitlam not introduced this
reform, in emulation of his hero Napoleon, one might still have
regular occasion to utilise the knowledge that there are 5½
yards (or 16½ feet) in a “rod, pole or perch”; that there
are (confusingly) 112 pounds in a hundredweight; or that there
are four pecks in a bushel. We were even taught the old
monetary system (pounds, shillings and pence) - although this
had been replaced three years earlier, on 14 February 1966 -
perhaps because the educational authorities were concerned that
the move to decimalisation might be reversed, but more probably
because the text-books which we were using had not yet been
updated.
The acquisition of all this inutile information was more
than compensated by rote learning of great and enduring significance.
It is difficult to imagine how any responsible educational
system could send students out into the world without the ability
to recite, from memory, at least one poem by each of Banjo Paterson
and Henry Lawson.
There is, however, one thing which has stuck in TOB’s mind
these past 32 years, which was not learnt by rote. It
appeared in a science text-book, which was a companion to the
science lessons which we received by “wireless radio” broadcasts
from the ABC. The long-forgotten author of this text predicted
that, “by the year 2000” (which then seemed a very long
way off), computers would be so small and so powerful that a
machine no larger than a television set would be able to perform
the work then undertaken by huge main-frame computers;
and, moreover, that these devices would become so inexpensive
that it would be common to see them in business offices, and
perhaps even in people’s homes.
In 1969, this was a remarkable prediction. This was
the year of the Apollo 11 moon landing, and Neil Armstrong’s
“one giant leap for mankind”. It is said that the computer
processing power available aboard Apollo 11 was the equivalent
of what is now contained in a pocket calculator. During the
final descent, the onboard guidance computer became overloaded
with the volume of data it was being asked to process, to the
point that Armstrong had to take semi-manual control of the
lunar module (the Eagle), and land it with little more assistance
from computers than Captain Cook had when he landed at Botany Bay.
In 1969, apart from the obscure writer of an obscure science
text for school-children, nobody had any conception of how computers
would revolutionise all aspects of society over the next quarter-century.
Even those closest to the action had little idea. In
1943, the founder and chairman of IBM, Thomas Watson, is reputed
to have expressed the view that “there is a world market for
maybe five computers”. In 1949, Popular Mechanics magazine
predicted that “computers in the future may weigh no more than
1.5 tons”. Even as recently as 1977, the founder and chairman
of Digital Equipment Corporation, Ken Olsen, could not understand
why “anyone would want a computer in their home”. And Bill
Gates, of Microsoft, is reputed to have said that he could not
see why anyone “would ever need more than 640K of RAM”.
These predictions are on a par
with the pronouncement by Richard Woolley, the British Astronomer
Royal, in 1956, that “space travel is utter bilge”; less than
two years later, Sputnik went into orbit.
Anyone who has seen the recent cinema release, Enigma, will
have noticed the huge and primitive machines which were used
by the code-breakers at Bletchley Park to crack the Germans’
wartime ciphers, including the famous “Enigma” codes. (Ultra,
by the way, was not the German term; it was the British code-name
for the intelligence derived from breaking those ciphers.)
The theoretical groundwork for such machines was laid by a
mathematician, Alan Turing, who in 1936 published a mathematical
paper entitled On Computable Numbers, proposing a universal
calculating machine which is now known as the “Universal Turing
Machine”. The most powerful machine constructed at Bletchley
Park, called Colossus (designed by the engineer Tommy Flowers
to attack the German teleprinter ciphers), was capable of
performing about 5,000 calculations per second - that is, it
had what would now be called a “clock speed” of 5,000 hertz
(Hz), or 5 kilohertz (kHz). Today, computer processing power
is measured in
megahertz (MHz) or gigahertz (GHz). A desktop or laptop
computer with a processing capacity of 100 MHz could perform,
in about eight minutes, the decoding process which Colossus
took 15 hours to achieve. The latest microprocessors have
“clock speeds” in excess of 1 GHz - that is, one thousand
million hertz, or about 200,000 times faster than Colossus.
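Taking the figures quoted above at face value (5 kHz for Colossus, 100 MHz and 1 GHz for modern processors), the ratios are easy to check - though this is only a rough sketch, since raw clock speed is an imperfect proxy for computing power:

```python
# Rough speed comparison, using the figures quoted in the text.
colossus_hz = 5_000           # Colossus: about 5,000 calculations per second
desktop_hz = 100_000_000      # a 100 MHz desktop or laptop processor
latest_hz = 1_000_000_000     # a 1 GHz microprocessor (one thousand million Hz)

# A 100 MHz machine is about 20,000 times faster than Colossus...
print(desktop_hz // colossus_hz)   # prints 20000

# ...and a 1 GHz machine about 200,000 times faster.
print(latest_hz // colossus_hz)    # prints 200000
```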
Later in the war, Turing went to New York to explain his
invention, and seek support for its improvement. At a
meeting of corporate executives in the technology sector, he
was asked whether what he was proposing could be described as
a “powerful brain”. Turing’s reply was, “No, I’m not interested
in developing a powerful brain. All I’m after is just
a mediocre brain, something like the President of the American
Telephone and Telegraph Company”.
To describe the growth in computer processing power since
the 1960s as “exponential” is literally accurate. In 1965,
Dr Gordon Moore, a founder of Intel Corporation, predicted that
computer processing power would double every 18 months, whilst
the cost would remain constant. This prediction has proved
remarkably accurate for more than 35 years. On the other
hand, cynics suggest that the benefits of “Moore’s Law” are
neutralised by “Gates’s Law” - the proposition (attributed to
Bill Gates) that “the speed of software halves every 18 months”.
In other words, whilst new computers may have the capacity
to perform twice as many calculations per second as compared
with computers manufactured 18 months earlier, the latest software
requires twice as many calculations to be performed in order
to achieve the same result. It is ironic that Gates’s
own company, Microsoft, is the worst culprit of the “software
bloat” which produces this result.
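As a toy illustration of how the two “laws” interact (taking the popular 18-month doubling period at face value):

```python
# Moore's Law, as popularly stated: hardware power doubles every 18 months.
# "Gates's Law", the cynics' riposte: software speed halves every 18 months.

def moores_law_factor(years: float) -> float:
    """Hardware speed-up after the given number of years."""
    return 2 ** (years / 1.5)

def gates_law_factor(years: float) -> float:
    """Software slow-down over the same period."""
    return 0.5 ** (years / 1.5)

# After 15 years (ten doublings), hardware is 1,024 times faster...
print(moores_law_factor(15))                         # prints 1024.0

# ...but if both "laws" hold, the net gain seen by the user is nil.
print(moores_law_factor(15) * gates_law_factor(15))  # prints 1.0
```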
Can Moore’s Law continue to hold good in the future? Computer
industry commentators suggest that it can - for at least another
5 or 10 years. Ultimately, however, it must hit a brick
wall; a real and (possibly) insuperable obstacle to increased
speed. This is the point at which “Moore’s Law” runs into
“Einstein’s Law”.
In simple terms, microprocessing speeds are enhanced by cramming
increased numbers of transistors on to smaller microchips. But
there is a finite limit beyond which the size of a microprocessor
chip can be reduced no further. Already, the conducting tracks
on a silicon chip are only a few hundred atoms wide; and, according
to conventional physics, a track can never be made narrower than
a single atom.
Electromagnetic signals travel within a microprocessor at
a speed approaching the speed of light, which again is an insuperable
barrier - at least according to conventional physics. Whilst
the speed of light (300 million metres per second) is incredibly
fast, it is a threshold which the “clock speeds” of modern microprocessors
are rapidly approaching. If a silicon chip is (say) 2
centimetres wide, then, at the speed of light, only 15 thousand
million electromagnetic signals can pass across it every second
- which equates to a “clock speed” of 15 GHz. If the size
of the chip can be reduced to (say) 1 centimetre, a maximum
speed of 30 GHz is conceivable. But if it is accepted
(in accordance with conventional physics) that circuits cannot
be thinner than 1 atom, and that electromagnetic charges cannot
travel faster than the speed of light, Moore’s Law cannot continue
to function beyond the end of this decade.
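The limit described above follows from simple arithmetic, on the simplified assumption (as in the text) that one signal must cross the full width of the chip in each clock cycle:

```python
# Upper bound on "clock speed" if each cycle requires one signal to cross
# the chip at the speed of light - the simplified model used in the text.
SPEED_OF_LIGHT = 300_000_000  # metres per second (approximately)

def max_clock_ghz(chip_width_metres: float) -> float:
    """Upper bound on clock speed, in GHz, for a chip of the given width."""
    return SPEED_OF_LIGHT / chip_width_metres / 1e9

print(max_clock_ghz(0.02))  # 2 cm chip: prints 15.0 (GHz)
print(max_clock_ghz(0.01))  # 1 cm chip: prints 30.0 (GHz)
```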
If this is the effect of “conventional physics”, what about
“unconventional physics” - that is, quantum physics. This
seems to be the logical future of computer technology. Quantum
physics is concerned with the behaviour of sub-atomic particles,
offering the prospect that the size of central processing units
can be reduced even further. And the remarkable thing
- the truly remarkable thing - about quantum particles is that,
at least in theory, some quantum effects appear to act faster
than the speed of light.
The use of quantum particles in computing is fraught with
problems. Not the least of these are the purely mechanical
problems in containing and manipulating particles which are
smaller than atoms. You cannot keep sub-atomic particles
in a sealed container, any more than you can carry water in
a butterfly net - and for the same reason. Sub-atomic
particles can only be contained by other sub-atomic particles;
and even then, there is the problem of “quantum tunnelling”.
According to classical (Newtonian) physics, if a body
has sufficient energy to pass through a barrier, it will always
do so; if it has insufficient energy to pass through a barrier,
it will never do so. Sub-atomic particles do not seem
to understand this principle. If a quantum particle possessing
a certain energy level is confined by a greater force, there
is a finite probability that the particle will escape; eventually,
it will inevitably do so.
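The inevitability of escape is just the arithmetic of repeated trials: any non-zero probability per attempt, however small, compounds towards certainty. (The one-in-a-million figure below is purely illustrative, not a physical value.)

```python
# Illustrative sketch only: a hypothetical, fixed escape probability.
ESCAPE_PROBABILITY = 1e-6  # one in a million per "attempt" at the barrier

def still_confined(attempts: int) -> float:
    """Probability that the particle has not yet tunnelled out."""
    return (1 - ESCAPE_PROBABILITY) ** attempts

# After a million attempts, escape is already more likely than not...
print(still_confined(1_000_000))    # roughly 0.37

# ...and after a hundred million attempts it is all but certain.
print(still_confined(100_000_000))  # vanishingly small (roughly 4e-44)
```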
Even more perplexing, however, is the impact of “Heisenberg’s
uncertainty principle”. Essentially, this principle dictates
that the behaviour of a sub-atomic particle can never be predicted
with certainty, only as a matter of probability - hence the
(mercifully theoretical) paradox of Schrödinger’s cat, which
is placed in a sealed chamber along with a mechanism which is
capable of killing the cat, depending on whether or not a quantum
event occurs. According to quantum theory, the cat is
in an indeterminate state - that is, it is both alive and dead
- until someone opens the container and observes its condition.
Einstein had a lot of trouble accepting the “uncertainty
principle”, declaring his belief that “God does not play dice”.
Yet the “uncertainty principle” is still a mainstay of
quantum theory.
Despite these extraordinary difficulties, research institutions
are currently exploring the prospects of quantum computing.
Whilst this research is being undertaken at a range of
institutions around the world, one of the leading research teams
is the Special Research Centre for Quantum Computing Technology,
at the University of Queensland.
The suggestion that quantum effects can act faster than
light also raises the possibility, at least in theory, of “backward
causation” - that is, the idea that an event at one moment in
time can be caused by an event occurring at a later moment in
time - which is the nearest thing to practical “time travel”
which serious physicists are prepared even to entertain.
This does not infringe Stephen Hawking’s “chronology protection
conjecture” - that quantum effects “make the universe safe for
historians” - for the very reason that the Heisenberg uncertainty
principle makes it impossible to regulate the phenomenon of
“backward causation”. So even quantum computers will not
be able to calculate tomorrow’s prices on the stock exchange,
or the winner of next year’s Melbourne Cup. But quantum
computers do offer the prospect of a “quantum leap” in computer
technology, so that today’s Pentium 4s will one day be considered
as cumbersome and inefficient as the Bletchley Park Colossus.
Such computers will be highly efficient at storing, retrieving
and manipulating the kinds of data which formed the basis of
rote learning in TOB’s primary education. But TOB expects,
and indeed hopes, that even a quantum computer will never be
able to write poetry like Paterson’s and Lawson’s.