# Why we should thank the Victorians for our PCs

## Astonishing inventions that made the IT revolution possible

### Boolean logic

Born in Lincoln in 1815, George Boole devised a form of logic that operates on just two values which can alternatively be thought of as true or false, or – more pertinently to our discussion of computing – 1 or 0.

Boolean logic defines various ways in which these values can be manipulated and combined. Examples include the AND function and the OR function, both of which take two inputs and then produce a single output. With the AND function, the output is a 1 only if both the inputs are 1s; whereas the OR function produces an output of 1 if either or both of the inputs are 1s.
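The two functions are simple enough to capture in a few lines. Here's a minimal Python sketch of their truth tables (the function names and the 0/1 convention follow the description above):

```python
# Boolean AND and OR over the two values 0 and 1
def AND(a, b):
    # Output is 1 only if both inputs are 1
    return 1 if a == 1 and b == 1 else 0

def OR(a, b):
    # Output is 1 if either or both inputs are 1
    return 1 if a == 1 or b == 1 else 0

# Print the full truth table for both functions
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b))
```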

The apparent simplicity of these functions doesn't do justice to their power. By combining the simple electronic building blocks that implement these functions (which are referred to as AND gates and OR gates), it's possible to create flip-flops, adders, shift registers and many more of the constituents of a computer that can work on binary numbers. As such, Boole had laid the theoretical foundations for today's computers.
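As an illustration of how such gates combine, here is a sketch of a half adder – the circuit that adds two binary digits – built entirely from AND, OR and NOT (the XOR needed for the sum bit can itself be made from those three). This is an illustrative construction, not a description of any particular historical machine:

```python
def AND(a, b): return a & b
def OR(a, b): return a | b
def NOT(a): return 1 - a

def XOR(a, b):
    # XOR built from the basic gates: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # Sum bit is a XOR b; carry bit is a AND b
    return XOR(a, b), AND(a, b)

# 1 + 1 in binary: sum 0, carry 1
print(half_adder(1, 1))  # (0, 1)
```

Chaining such adders bit by bit gives a circuit that adds whole binary numbers – exactly the kind of constituent part the article describes.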

Of course, it's a perfectly valid question to ask what would have been wrong with computing with decimal numbers, as Babbage's Analytical Engine was designed to do. A look at one of the earliest electronic computers, the University of Pennsylvania's ENIAC, provides just a glimpse of the advantages offered by moving to binary.

ENIAC handled decimal numbers, storing each digit in an electronic circuit called a ring counter that contained 36 valves. As 10 of these ring counters constituted a register, it took 360 valves to store a number in the range -9,999,999,999 to +9,999,999,999. By way of contrast, a 32-bit binary register can store numbers in the range -2,147,483,648 to +2,147,483,647.

Using similar electronic circuits to those used in ENIAC, a 32-bit register would have required just 64 valves – two per bit, or 70 for the 35 bits needed to match ENIAC's ten-digit range – or, in today's terms, 64 transistors.
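The arithmetic behind those figures is easy to check. A sketch of the calculation (the two-valves-per-bit figure is the flip-flop assumption made above):

```python
# Range of a 32-bit two's-complement register
bits = 32
lowest = -2**(bits - 1)        # -2,147,483,648
highest = 2**(bits - 1) - 1    # +2,147,483,647
print(lowest, highest)

# Valve count: ENIAC-style decimal vs binary flip-flops
decimal_valves = 10 * 36       # ten ring counters of 36 valves = 360
binary_valves = 2 * bits       # two valves per bit = 64
print(decimal_valves, binary_valves)
```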

### Data transmission

Today, data processing goes hand-in-hand with data communication, but to see the first developments in this technology we need to cross the Atlantic. Predating the telephone by more than 30 years, the telegraph is often considered the poor relation.

This undervalues the pioneering work of Samuel Morse, who first demonstrated the code that bears his name back in 1844. Morse Code uses short and long signals (known colloquially as dots and dashes), separated by gaps of varying lengths, to represent letters, numbers and a range of symbols.

It's really not too different from ASCII (the American Standard Code for Information Interchange), the code used in present-day data transmission. Morse Code was designed to be sent by hand, using a finely balanced switch called a Morse key, and received by ear or as marks on a paper strip, but it's also possible for computers to transmit it automatically.

In providing a means of automatic data transmission it achieved what has only become possible in recent times (and then not perfectly) through voice communication. It's commonly assumed that Morse Code was designed arbitrarily and that it was just by chance that the code for E, for example, is 'dot' whereas that for J is 'dot dash dash dash'. This does Samuel Morse a great disservice, as we'll see if we fast-forward over a hundred years to 1952.

One of the first methods of data compression – and now a widespread and essential technology in areas as diverse as data communications, photography and music reproduction – was Huffman Encoding. This analyses the data stream to determine how frequently each character occurs and then assigns codes of variable length, with shorter codes going to the most commonly encountered characters.

Although infrequently encountered letters end up being represented by longer codes than in non-compressed text, the short codes used for the common letters more than compensate, so the compressed text can be as little as half the size of the original.
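Huffman's scheme can be sketched in a few lines of Python. This builds the tree bottom-up: repeatedly merge the two least frequent symbols (or groups of symbols), prefixing one group's codes with 0 and the other's with 1. The text used at the end is just an illustrative sample:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # One heap entry per distinct character: (frequency, tiebreak, codes-so-far)
    heap = [(freq, i, {ch: ""})
            for i, (ch, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least frequent groups into one subtree
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("this is an example of huffman encoding")
print(codes)  # the most frequent characters get the shortest codes
```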

Going back to Morse Code, the two most common letters in the English language, E and T, are represented by 'dot' and 'dash' whereas the two least common ones, Q and Z, are represented by 'dash dash dot dash' and 'dash dash dot dot'.

So not only is Morse Code the world's first system for data transmission, it also has a built-in method of data compression.
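The point can be seen by putting those four codes side by side with letter frequencies. The frequency figures below are rough textbook percentages for English text, not values from the article:

```python
# The four Morse codes mentioned above, written with '.' and '-'
MORSE = {"E": ".", "T": "-", "Q": "--.-", "Z": "--.."}

# Approximate frequency of each letter in English text (%)
FREQ = {"E": 12.7, "T": 9.1, "Q": 0.1, "Z": 0.1}

for letter in "ETQZ":
    print(letter, MORSE[letter], "length", len(MORSE[letter]),
          "frequency", FREQ[letter])
```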

Moving on from Morse's telegraph lines to the wireless data transmission that's so familiar to us today, we need to return to this side of the Atlantic.

Born in Bologna, Italy, Guglielmo Marconi moved to England when the Italian government failed to invest in his work, which involved experimenting with the electromagnetic waves first demonstrated by Heinrich Hertz.

But while the German physicist had a theoretical interest in what would eventually be called radio, Marconi's interest was much more practical in nature. The history of Marconi's development of the wireless telegraph was one of increasing the transmission range step by step.

In 1897, in a demonstration to the British government, he transmitted a signal over a distance of 6km on Salisbury Plain. Later in the same year he demonstrated that radio waves could travel over the sea, first at a range of 6km and then over 19km.

During 1899 Marconi first achieved communication between Britain and France, and later equipped three ships of the Royal Navy with radio equipment allowing them to communicate over a distance of 137km.

Marconi's equipment might have transmitted and received radio signals, but the hardware bore no resemblance to the equipment that does the same job today. With no electronic devices such as valves or transistors, Marconi resorted to the brute-force approach.

To generate the signal he used a spark transmitter that applied a high voltage across an air gap to produce a lightning-like spark. The result was broadband electromagnetic radiation from ultraviolet through visible and into the radio spectrum.

His receiver used a coherer, a glass envelope containing iron filings that coalesced under the influence of radio waves, thereby allowing an electrical current to flow. The result of using such crude equipment was that the transmitter had to use very high power levels and the antennae were huge.

The station in Cornwall that Marconi used to achieve the first ever trans-Atlantic transmission demonstrated this. The transmitter consumed 25kW – several thousand times more than a mobile phone – and the antenna was supported by four 66m masts. Crude it might have been, but it worked – and the world suddenly became a lot smaller.

Many of the building blocks of the modern world might have come into place during the reign of Queen Victoria, but that doesn't necessarily mean that the information age could have been born back in the 19th century.

Babbage was a forward thinker but, hampered by the technology of the time, his Analytical Engine could only ever have calculated anything at a snail's pace. His creation couldn't have possibly made a dent in what we expect of today's computers.

The missing link – the one that did indeed act as the catalyst for the digital age half a century later – only saw the light of day after Queen Victoria's death. In 1904, the English physicist John Ambrose Fleming discovered that a glass envelope from which the air had been extracted would permit the flow of electricity between a heated cathode and an anode in one direction only. Today we'd call his creation a diode valve.

Three years later, American inventor Lee de Forest discovered that by placing a third electrode – a grid of wire – between the anode and the cathode, the current flowing between them could be influenced by the application of a small voltage. This ability to amplify weak signals contributed significantly to radio communication. In allowing a voltage on one circuit to control what was happening on another circuit, it was also the first electronic switch.

The triode – as de Forest's invention was called – could form the basis of Boolean logic gates and hence of an electronic computer. So much of what was needed for a computing revolution was therefore in place, but for the lack of that final piece of the jigsaw.

As it happened, we had to wait until the late 1940s for progress to continue.

---

First published in PC Plus Issue 289

Liked this? Then check out 5 technologies to thank the 1950s for