Thursday, August 30, 2007

Faster Calculating


Computer pioneer John W. Mauchly was born on this day in 1907 in Cleveland, Ohio.

John Mauchly was an obscure professor of physics -- in fact, he was the whole physics department at Ursinus College in Collegeville, Pennsylvania -- who was best known locally for his entertaining "Christmas lectures" on basic physical principles, illustrating Newton's laws with skateboards and bringing spectroscopic principles to bear on finding out the contents of Christmas gifts wrapped in colored cellophane.

His avocation, however, was weather prediction, and when he wasn't teaching Physics 101 to pre-med students he was writing papers on the effect of solar activity on rainfall patterns. His sticking point was that the mathematical analysis needed to support his theories demanded a better, faster calculating machine than any he had ever encountered.

Believing electronics to be the answer after a visit to Iowa to see the pioneering work of John Atanasoff, Mauchly enrolled in 1941 in a U.S. War Department-sponsored "defense training in electronics" course at Penn, where he met J. Presper "Pres" Eckert, a recent graduate of Penn's Moore School of Electrical Engineering. Together they discussed the possibility of designing an electronic calculating machine, a discussion which culminated in Mauchly's proposal for defense funding, "The Use of High-Speed Vacuum Tube Devices for Calculation." The government bit on the concept, needing faster ways to calculate artillery shell trajectories, and in 1943 Mauchly and Eckert began work on the ENIAC -- the Electronic Numerical Integrator and Computer.

As they built the 30-ton behemoth, they had to figure out not only how to coordinate the activities of 18,000 vacuum tubes but also how to find wire that rats would not eat. Ultimately ENIAC was used for 8 years on hydrogen bomb problems and on the calculation of Russian weather patterns.

Next, Mauchly and Eckert began work on the EDVAC (Electronic Discrete Variable Automatic Computer), a stored-program machine, but mathematician John von Neumann grabbed the reins of the project, and Mauchly and Eckert fell into a dispute with Penn over the ownership of their designs. So in 1948 they formed their own company, the Eckert-Mauchly Computer Corporation, to commercialize computers. They built the UNIVAC I, which they sold to such firms as Prudential and A.C. Nielsen (as well as the U.S. Census Bureau) for a price of $150,000 each, but they were cash-poor and sold out in 1950 to Remington Rand, the typewriter company, which in turn merged with Sperry in 1955 to form Sperry Rand.

Once Mauchly and Eckert had demonstrated that there was a market for large-frame computers, their much better financed competitor, IBM, began to pour its resources into the opportunity, eclipsing the success of their UNIVAC. Mauchly and Eckert, however, had shown the world that electricity could be used to solve mathematical problems, and they had produced the first commercial electronic digital computer.

Mauchly died on January 8, 1980.



Sunday, March 11, 2007

As We May Think


"Almost forgotten today, he essentially invented the world as we know it: not so much the things in it, of course, but the way we think about innovation, what it means, and why it happens." -- G.P. Zachary.

Vannevar Bush was born on this day in 1890 in Everett, Massachusetts. He was that rare combination of entrepreneur, visionary and mechanic -- a person whose handiwork has left profound marks on the scientific, governmental and economic features of the 20th century landscape, and whose bold technological paradigms continue to have an impact on the information age.

After receiving his Ph.D. in engineering jointly from Harvard and MIT, Bush spent World War I developing a magnetic submarine locator for the U.S. Navy, but he became frustrated with the red tape, which not only interfered with his design process but resulted in only 3 devices being installed on Navy ships before the Armistice. It was then that he realized that an engineer who did not understand politics and economics -- and the effect of new scientific advancements on existing political and economic institutions -- would never amount to anything. He returned to MIT after the war and, applying his appreciation of politics and economics, plunged into both research and administration, helping to make MIT a major center for electrical engineering in the process.

While at MIT in 1935, Bush designed and built a machine for solving complex differential equations, perhaps the first practical forerunner of all modern computers. Called the Differential Analyzer, the machine looked something like a printing press, weighed 100 tons, and had to be programmed with screwdrivers and hammers -- but it was highly effective in ballistics research, permitting the rapid calculation of artillery firing tables that accounted for variables such as temperature and wind.
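What the Analyzer did mechanically -- integrating a differential equation step by step -- is easy to sketch digitally. Here is a minimal, purely illustrative firing-table calculation in Python; the drag model and constants are my own simplifications, not anything from Bush's machine:

    import math

    def shell_range(v0, angle_deg, drag=5e-4, g=9.81, dt=0.01):
        """Integrate the equations of motion step by step -- the Analyzer's job, done digitally."""
        vx = v0 * math.cos(math.radians(angle_deg))
        vy = v0 * math.sin(math.radians(angle_deg))
        x = y = 0.0
        while y >= 0.0:                          # until the shell lands
            speed = math.hypot(vx, vy)
            vx -= drag * speed * vx * dt         # quadratic air drag, horizontal
            vy -= (g + drag * speed * vy) * dt   # gravity plus drag, vertical
            x += vx * dt
            y += vy * dt
        return x

    # One entry in a hypothetical firing table: muzzle velocity 450 m/s, elevation 45 degrees.
    print(f"range: {shell_range(450, 45):.0f} m")

Varying the drag coefficient with air temperature or adding a wind term to the horizontal step is exactly the kind of "accounting for variables" the firing tables required.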

Also while at MIT, Bush was the driving force behind the commercialization of a great deal of technology, virtually pioneering the concept of the "university spin-off company," co-founding Raytheon Manufacturing to build radio tubes as well as a half dozen other companies which eventually not only made him a wealthy man but gave him years of first-hand entrepreneurial experience.

In 1939, Bush moved to Washington to head the independent Carnegie Institution, and the following year became the chief of Franklin Roosevelt's National Defense Research Committee. Judging that the lack of coordination between science and government was a national security risk, he used his influence to obtain government funding for science research (unheard of at the time), particularly in the area of nuclear physics. Advocating the replacement of the outmoded tradition of having the government run factories, he was the architect of the system of awarding federal contracts to business and actively promoted cooperation among government, business and academia to encourage scientific advancement.

By 1941, Bush was administering virtually all U.S. defense research as director of the new Office of Scientific Research and Development, applying his red-tape-cutting, slash-and-burn style to a supervisory role in the development of everything from radar to sulfa drugs to the atom bomb. Einstein may have convinced Roosevelt that an atomic bomb mattered, and Oppenheimer may have directed the Manhattan Project, but it seems that, for better and for worse, Bush was the only man willing to herd all the cats necessary for the Manhattan Project to be born.

By continuing to press for governmental cooperation and the judicious use of federal funds for science after World War II, Bush kept in place the scientific infrastructure that later made possible such programs as NASA and the Internet (through the Advanced Research Projects Agency, an heir to the federal research system Bush designed).

Bush retired from scientific administration in 1955, but by that time he had begun yet another career as a theorist. In his 1945 Atlantic Monthly article, "As We May Think," Bush described a hypothetical device called the "memex": a means of harnessing the information explosion through a universal library whose owner could link associated pieces of information in some automated fashion, creating "trails" of thought which could then be shared with others. Bush's exploration of the possibilities of linking information is not only a forerunner of Ted Nelson's idea of "hypertext" but would appear to have been the germ of the idea behind the organization of Tim Berners-Lee's World Wide Web. In the same article he also proposed such concepts as a machine which could type one's words as they were spoken, and a "cyclops camera," to be worn on one's head for recording what one sees.
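The "trail" idea is concrete enough to sketch in a few lines of code. A minimal, illustrative model in Python -- the function names are invented, though the bow-research trail paraphrases Bush's own example in the article:

    # Sketch of a memex-style "trail": a named, ordered chain of associated
    # documents that its creator can replay or hand to someone else.
    trails = {}

    def blaze(name, *documents):
        trails[name] = list(documents)        # record the associative chain

    def follow(name):
        for step, doc in enumerate(trails[name], 1):
            print(f"{step}. {doc}")

    blaze("bow-and-arrow research",
          "properties of the Turkish bow", "elasticity of materials",
          "a new idea for an arrow")
    follow("bow-and-arrow research")          # replay the trail, step by step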

Vannevar Bush died on June 28, 1974 in Belmont, Massachusetts.


Friday, June 23, 2006

Codebreaker


"One day ladies will take their computers for walks in the park and tell each other, 'My little computer said such a funny thing this morning.'" -- Alan Turing.

Mathematician and code-breaker Alan Turing was born on this day in 1912 in London. A brilliant youth who was neglected by his parents, Turing was shy and socially awkward, despite being a gifted athlete as well as a math whiz. At boarding school he developed a deep friendship with a schoolmate, Christopher Morcom, whose death when Turing was 18 left him devastated. Both before and after Morcom's death, it appears that Turing was confused about his own sexuality; although in later years he made occasional comments about having children and settling down, he was thought to have been exclusively homosexual.

In 1931, Turing entered King's College, Cambridge, and found it rocked by a debate over the limits of mathematical analysis, in reaction to the concept of "undecidables" put forward by Kurt Gödel, who had proved that there were some mathematical problems beyond the reach of logic. With his 1936 paper "On Computable Numbers," Turing imposed his indelible stamp on the debate -- and on the history of computer science -- describing an imaginary machine that would be capable of answering any mathematical question which could be logically answered through algorithms. Turing observed that such a machine (which he called a "universal machine"), though universal in its ability to apply mathematical logic to a problem, would be incapable of deciding whether a question was "undecidable," thus supporting Gödel's view.
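The construction is concrete enough to sketch in modern code. A minimal simulator, with an illustrative transition table of my own devising (not Turing's notation): a Turing machine is just a state-transition table plus an unbounded tape, and one simulator can run any such table -- which is the "universal" part.

    # A minimal Turing machine simulator: a transition table plus an unbounded tape.
    def run(table, tape, state="start", blank="_", max_steps=10_000):
        cells = dict(enumerate(tape))       # sparse tape: position -> symbol
        pos = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(pos, blank)
            new_symbol, move, state = table[(state, symbol)]
            cells[pos] = new_symbol
            pos += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Example table: a unary incrementer.
    table = {
        ("start", "1"): ("1", "R", "start"),   # scan right past the 1s
        ("start", "_"): ("1", "R", "halt"),    # append one more 1, then stop
    }
    print(run(table, "111"))                   # -> 1111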

More interesting to computer historians, however, is that Turing's conceptualization of the universal machine was a theoretical blueprint for the modern programmable computer. Before anything could be made of the idea, World War II began, and Turing was invited to join the secret group of British cryptanalysts at Bletchley Park, where he led the British government's attempts to break German messages enciphered on Arthur Scherbius' Enigma machine, conceiving the logical approach to descrambling and the strategy of stringing together multiple descrambling machines. In conducting the war, Winston Churchill came to rely heavily on Turing's work, and he even increased funding to the project based on a direct request from Turing himself.

After the war, Turing worked at the British National Physical Laboratory and at the University of Manchester, attempting to build a prototype of his universal machine. A proponent of artificial intelligence, Turing unveiled his now-famous "Turing test" in 1950: a subject would be locked in a room and pose questions to 2 unseen answer providers, one human, one computer; if the subject could not tell from the content of the answers which was the human and which was the computer, then, according to Turing, the machine could be said to be "thinking" as well as the human.
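A toy sketch of the test's blind protocol, with canned stand-ins for the two respondents -- the names and replies here are invented purely for illustration:

    import random

    def human_reply(question):
        return "Honestly, it depends on my mood."

    def machine_reply(question):
        return "Honestly, it depends on my mood."   # a perfect mimic, for the sketch

    # Blind the judge: assign the two respondents to anonymous labels at random.
    players = [human_reply, machine_reply]
    random.shuffle(players)
    respondents = dict(zip("AB", players))

    for question in ["What is a typical day like for you?", "Do you enjoy poetry?"]:
        for label, reply in respondents.items():
            print(f"{label}: {reply(question)}")

    # Turing's criterion: if the judge can do no better than chance at naming
    # the machine, the machine is "thinking" as well as the human.
    guess = random.choice("AB")
    print("Judge names the machine:", guess,
          "(correct)" if respondents[guess] is machine_reply else "(wrong)")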

In 1952 he admitted to police that he was having a homosexual affair and was convicted of "gross indecency." Subjected to female hormones as "therapy" to curb his homosexuality, Turing grew depressed, and on June 7, 1954, in Wilmslow, Cheshire, England, he committed suicide at the age of 41 by eating a cyanide-poisoned apple (he had been an avid fan of Disney's Snow White and the Seven Dwarfs). His role in cracking the Enigma code would not be revealed until the 1970s.

Turing was also an accomplished marathon runner -- unofficially one of Britain's finest during the 1940s, with a personal-best time of 2:46:03, only about 11 minutes behind Delfo Cabrera's gold-medal-winning time of 2:34:51.6 at the 1948 Olympics.


Saturday, June 17, 2006

Nelson's Xanadu


"The story of Ted Nelson's Xanadu is the story of the dawn of the information age. Like the mental patient in Thomas Pynchon's Gravity's Rainbow who believes he is the Second World War . . . Nelson, with his unfocused energy, his tiny attention span, his omnivorous fascination with trivia, and his commitment to recording incidents whose meaning he will never analyze, is the human embodiment of the information explosion." - G. Wolf.

Information theorist Ted Nelson was born on this day in 1937 in Chicago, the son of film director Ralph Nelson and actress Celeste Holm.

After studying philosophy at Swarthmore, Nelson was pursuing a master's degree in sociology at Harvard when he enrolled in a computer course and began to have visions about the future of information. There he made an attempt, before the invention of word processing systems, to create a "writing system" which would allow writers to store and edit their work; unfortunately, he took an "incomplete" in the course. In the 1960s, he became known as a computer theorist, without actually producing software, and coined the words "hypertext" and "hypermedia" to refer to the linking of related texts or media -- a concept which had been explored as early as 1945 by Vannevar Bush.

With a growing reputation as a visionary, Nelson worked in and out of business and academia attempting to advance his ideas about the nonsequential, interlinked presentation of information and his predictions of millions of simultaneous users of this information, but his chronic lack of focus (he suffers from attention deficit disorder) and rebelliousness (among his favorite maxims are "most people are fools, most authority is malignant, God does not exist and everything is wrong") got him bounced from job to job.

Since the late 1960s, however, Nelson has been actively supervising the design of Xanadu (named after the "pleasure dome" of Samuel Taylor Coleridge's unfinished poem "Kubla Khan"), a proposed hypertext system in which all links between texts are two-way and which would provide for the publication of comments on existing works to appear as annotations; parallel retrieval and editing; version management; and an efficient system of copyright management. In effect, he had envisioned a universally accessible, self-updating electronic library/town meeting.
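The difference between the Web's one-way links and Xanadu-style two-way links is easy to sketch. A minimal, illustrative model in Python -- the class and method names are invented, not Xanadu's actual design: if every link is registered with both endpoints, any document can enumerate what points at it, which a one-way hyperlink cannot do.

    from collections import defaultdict

    class LinkRegistry:
        def __init__(self):
            self.outgoing = defaultdict(set)
            self.incoming = defaultdict(set)

        def link(self, source, target):
            self.outgoing[source].add(target)
            self.incoming[target].add(source)   # reverse direction, kept in sync

        def cited_by(self, doc):
            return sorted(self.incoming[doc])   # every document that points here

    registry = LinkRegistry()
    registry.link("essay-v2", "as-we-may-think")
    registry.link("commentary", "as-we-may-think")
    print(registry.cited_by("as-we-may-think"))  # ['commentary', 'essay-v2']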

In its early days, the proposed system anticipated Tim Berners-Lee's World Wide Web; since the Web's emergence in 1990, Nelson has offered his system as a less autocratic, more multi-dimensional and interactive alternative, disparaging the Web as a mere "child's wagon" in terms of its power and complexity. Like Charles Foster Kane's Xanadu construction project in Orson Welles' Citizen Kane, Project Xanadu is "still unfinished," despite having had the backing of no less than Autodesk (for a time) and despite Nelson's predictions that it would be released as long ago as 1976, 1988 and 1991. According to the official Xanadu website, Nelson's investors forced his work to be made available as "open source" in 1999, although Nelson seems to insist that Project Xanadu is ongoing as an independent project.

Nelson's critics tend to portray him, at worst, as a woolly charlatan, leaving behind him a pile of unfinished projects, disgruntled investors, and a collection of clever new buzzwords (including "docuverse," "cybercrud" and "softcopy"); at best, they see him as a brilliant, compulsive mad-monk, squandering his genius by toiling away at the mystically unattainable -- like Isaac Newton in his later years, searching for the keys to alchemy.


Saturday, April 29, 2006

"1 J/degree K = 10(23) bits"?


Information theorist Tom Stonier was born on this day in 1927 in Hamburg, Germany.

Stonier emigrated to the U.S. in 1939. He first won acclaim as the author of Nuclear Disaster (1964), a report to the New York Academy of Sciences on the potential biological effects of the detonation of a 20-megaton bomb in Manhattan.

An early proponent of the introduction of computers in education, Stonier followed the rise of computers with an eye on their effect upon the biological nature of human beings. In his book Information and the Internal Structure of the Universe (1990), Stonier observed that while science began as a study of the properties of "matter," during the past 200 years it has concerned itself with the properties of "energy," culminating in Albert Einstein's assertion that energy and matter are transducible (by way of the equation E = mc²). Stonier argued that "information" -- representing the changing expressions of all forms of organization on psychological, physical and biological levels -- has been identified by late-20th-century thinkers as elemental, much like matter and energy, and from this premise he proposed the seemingly preposterous yet compelling notion that information, matter and energy are all transducible into one another.

In an updating of Norbert Wiener's observations, Stonier analyzed the relationship between entropy (energy which becomes unavailable, disintegrating into disorder, as the irreversible process of change unfolds within our universe) and information, with the chronological moments in the history of the universe as the sample points on a graph. At the origin, the "Big Bang," entropy and information stand at zero; as one moves forward along the time axis, gravity, the weak and strong nuclear forces, and electromagnetism differentiate, matter evolves into more complex forms, systems (from atoms to molecules to bacteria to humans to civilizations) become more complex and ultimately self-organizing, and information expands exponentially, even as entropy theoretically increases. As Stonier put it, "One does not start with zero information and have proverbial monkeys typing at random hoping to author Hamlet. Instead, a highly advanced information system named William Shakespeare was born into an advanced information culture, and in due course added further information as the universe cycled on."
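The equivalence in this post's title follows from the thermodynamic price of a bit. If, in the Boltzmann/Landauer convention, one bit of information corresponds to an entropy of k·ln 2 (with Boltzmann's constant k ≈ 1.38 × 10^-23 J/K), then 1 J/K works out to roughly 10^23 bits. A quick back-of-the-envelope check -- a sketch of the arithmetic, not anything from Stonier's book:

    # How many bits correspond to 1 J/K of entropy, taking S = k * ln 2 per bit?
    import math

    k = 1.380649e-23                  # Boltzmann constant, J/K
    bits = 1.0 / (k * math.log(2))    # bits per joule-per-kelvin
    print(f"{bits:.3e} bits")         # ~1.045e+23, i.e. about 10^23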

His radical conclusion: "The concept that as the universe evolves, its information content increases, is in opposition to the idea that the increase in entropy will inevitably lead to the 'heat death' of the universe" -- as if to say that new ways of processing information, of recreating order within our systems, seem to evolve as the universe evolves and produces more information to be processed.

Stonier seemed to be suggesting a transduction of energy into information -- something knowable at some level by some perceiving system: an atom, a person, a human nervous system, one's genes. The suggestion of a common denominator for psychological, physical and biological systems, somewhere at the confluence of matter, energy and information, is an interesting anti-Cartesian metaphor, and it may yet have intriguing implications for our collective approach to future scientific questions in areas as diverse as neurobiology, energy processing and genetics.

Stonier died on June 28, 1999.
