It's been a rough year for the IT industry. The death of Apple co-founder Steve Jobs in October grabbed international headlines. But we also lost other major figures from almost every area of technology, including Xerox PARC founder Jacob E. Goldman, who died in late December. Here's one last look at some of the people who made a big difference.
Dennis M. Ritchie
Godfather of Unix, Father of C
September 1941 - October 2011
Arguably the most influential programmer of the past 50 years, Dennis Ritchie helped create the Unix operating system, designed the C programming language and promoted both, starting in the 1970s.
Ritchie worked closely with Unix designer Ken Thompson starting in 1969, integrating work by other members of the Bell Labs research group. And in 1971, when Thompson wanted to make Unix more portable, Ritchie radically expanded a simple language Thompson had created, called B, into the much more powerful C. Just how influential has all that work been? Unix spawned lookalikes such as Linux and Apple's OS X, which run devices ranging from smartphones to supercomputers. And by one account, eight of today's top 10 programming languages are direct descendants of C. (Read more about Unix in Computerworld's 40th anniversary of Unix package.)
While Ritchie was serious about Unix and its potential for creating a computing community, he knew better than to take himself too seriously. He quipped that Unix was simple, "but you have to be a genius to understand the simplicity." And Ritchie wasn't above an office prank. In 1989, he and Bell Labs cohort Rob Pike, with the help of magicians Penn and Teller, played an elaborate practical joke on their Nobel prize-winning boss, Arno Penzias. (You can see the prank in this video clip.)
Robert Morris
A Knack for Encryption
July 1932 - June 2011
Among the Bell Labs researchers who worked on Unix with Thompson and Ritchie was Bob Morris, who developed Unix's password system, math library, text-processing applications and crypt function.
Morris joined the Bell Labs research group in 1960 to work on compiler design, but by 1970 he was interested in encryption. He found a World War II U.S. Army encryption machine, the M-209, in a Lower Manhattan junk shop. Morris, Ritchie and University of California researcher Jim Reeds developed a way to break the machine's encryption system and planned to publish a paper on the subject in 1978.
Before they did, they sent a copy to the National Security Agency, the U.S. government's code-breaking arm -- and soon received a visit from a "retired gentleman from Virginia," according to Ritchie. The "gentleman" didn't threaten them, but he suggested discretion because the encryption techniques were still being used by some countries. The researchers decided not to publish the paper -- and eight years later, Morris left to join the NSA, where he led the agency's National Computer Security Center until 1994.
Ironically, it was Morris's son, Robert Tappan Morris, who brought him into the national spotlight: In 1988, the younger Morris, then 22, released an early computer worm that brought much of the Internet to its knees. The senior Morris said at the time that he hadn't paid much attention to his son's interest in programming: "I had a feeling this kind of thing would come to an end the day he found out about girls," he said. "Girls are more of a challenge."
John McCarthy
Intelligence, Artificial and Otherwise
September 1927 - October 2011
He may be best known as the creator of the Lisp programming language and as the "father of artificial intelligence" (he coined the term in 1956), but John McCarthy's influence in IT reached far beyond would-be thinking machines. For example, in 1957 McCarthy started the first project to implement time-sharing on a computer, and that initiative sparked more elaborate time-sharing projects including Multics, which in turn led to the development of Unix.
In an early 1970s presentation, McCarthy suggested that people would one day buy and sell goods online, which led researcher Whitfield Diffie to develop public-key cryptography for authenticating e-commerce documents. In 1982, McCarthy even proposed a "space elevator" that was eventually considered by a government lab as an alternative to rockets.
But McCarthy's first love was A.I., which turned out to be harder than he first thought. In the 1960s, McCarthy predicted that, with Pentagon funding, working A.I. would be achieved within a decade. It wasn't -- as McCarthy later joked, real A.I. would require "1.8 Einsteins and one-tenth of the resources of the Manhattan Project."
Ken Olsen
The Digital Man
February 1926 - February 2011
As an engineer working at MIT's Lincoln Laboratory in the 1950s, Ken Olsen noticed that students lined up to use an outdated computer called the TX-0, even though a much faster mainframe was available. The difference? The mainframe ran batch jobs, while the TX-0 (which Olsen had helped develop as a grad student) allowed online interactivity.
In 1957, Olsen and a colleague, Harlan Anderson, took that insight and $70,000 in venture capital money and started Digital Equipment Corp. (DEC) to make smaller, more interactive machines. The company's PDP minicomputers were inexpensive enough that a corporate department could own one (a PDP-7 was used to develop the first version of Unix at Bell Labs).
Olsen's management approach as CEO -- hire very smart people and expect them to perform as adults -- helped DEC become the second biggest computer maker after IBM. But Olsen was also opinionated and sometimes stubborn. He publicly grumbled about Unix (calling it "snake oil") even as his company sold lots of Unix workstations, and DEC was late to join the move to PCs. DEC's sales declined, and in July 1992, Olsen was forced to resign from the company he had co-founded. DEC was sold to Compaq six years later.
Paul Baran
April 1926 - March 2011
Working to make electronic communications bulletproof at the height of the Cold War, Paul Baran developed what would eventually become a core technology of the Internet: packet switching. Baran was a researcher at the Rand Corp. think tank in 1961 when he suggested that messages could be broken into pieces, sent to a destination by multiple routes if necessary and then reassembled upon arrival to guarantee delivery.
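The mechanics Baran described can be sketched in a few lines of Python (a toy illustration only; the `packetize` and `reassemble` helpers are invented for this example, not drawn from Baran's actual design):

```python
import random

def packetize(message: str, size: int):
    """Break a message into numbered pieces (hypothetical toy format)."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Rebuild the original message no matter what order the pieces arrive in."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetize("messages could be broken into pieces", 8)
random.shuffle(packets)  # pieces may travel by different routes and arrive out of order
assert reassemble(packets) == "messages could be broken into pieces"
```

Because each piece carries its own sequence number, the network is free to route them independently; reliability comes from reassembly at the destination rather than from the network itself.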
Baran wasn't the only one to think of the idea -- U.K. researcher Donald Davies came up with a remarkably similar idea at about the same time and gave it the name "packet switching." But the U.S. Air Force liked Baran's version of what was essentially an inexpensive, unreliable network with intelligence at the edges. AT&T, the dominant U.S. telephone company, didn't -- it had an expensive, reliable network, and company engineers publicly scoffed at Baran's idea.
However, packet switching was adopted for Arpanet, the predecessor to the Internet, and eventually for local-area networks in the form of Ethernet. Today, even phone calls are typically sent in digital packets. (This hour-long video interview shows Paul Baran receiving a 2005 Computer History Museum Fellow Award.)
Jean Bartik
Last of the First Programmers
December 1924 - March 2011
Jean Bartik was the last surviving member of the original programming team for the ENIAC, the first general-purpose electronic computer. But that understates her work. Bartik, the only female math graduate in her 1945 college class, was hired to make the physical connections that let the ENIAC perform artillery calculations, and she served as a lead programmer on the project. But Bartik also developed circuit logic and did design work under the direction of ENIAC's hardware developer, J. Presper Eckert.
After ENIAC, Bartik followed Eckert to work on both hardware and software for the commercial Univac I mainframe and the specialized BINAC (Binary Automatic Computer). But once the Univac was complete, Bartik retired at age 26 in 1951 to raise a family. She returned to a much-changed IT industry in 1967 and worked as an editor at several analyst companies until she was laid off in 1985, when she was in her 60s.
Jack Keil Wolf
Disk Drivin' Man
March 1935 - May 2011
There's a reason why the amount of information we can store on hard disks keeps growing -- and its name is Jack Wolf. That may be an overstatement, but it's not too much to say that Wolf did more than almost anyone else to use math to cram more data into magnetic drives, flash memory and electronic communications channels.
Wolf began his professional life as an information theorist, teaching and working at RCA and Bell Labs, with much of his work relating to compressing information. But in 1984, he moved to the new Center for Magnetic Recording Research at the University of California, San Diego. "I knew nothing about magnetic recording," he admitted in a 2010 lecture. "Not only did I not know how to spell coercivity, but the first time I mentioned it in a talk I mispronounced it. But UCSD reluctantly made me an offer as the first faculty member in CMRR."
It was a good choice. Wolf and his students, dubbed the "Wolf pack," cross-pollinated magnetic drive design with information theory, applying compression in increasingly creative ways, and spread Wolf's ideas throughout the industry.
Julius Blank
June 1925 - September 2011
Silicon Valley had many builders, but one of them literally built some of the high-tech hub's first silicon-making machines. Julius Blank was one of the "Traitorous Eight" engineers who founded Fairchild Semiconductor in 1957. He and his seven colleagues had acquired that unflattering sobriquet because they decided to strike out on their own just a year after Nobel Prize-winning physicist William Shockley had recruited them to create a new kind of transistor at Shockley Labs.
The Eight included future Intel founders Gordon Moore and Robert Noyce, but the lesser-known Blank had skills critical to the new venture: Before going to college, he had been trained as a machinist. Along with eventual venture capitalist Gene Kleiner, Blank built Fairchild's machine shop, created the manufacturing machinery and outfitted the rest of the fab. Within nine months, Fairchild went from occupying an empty building in Mountain View, Calif., to shipping its first transistor.
How well did that first hand-built equipment hold up? In 1962, Fairchild set up its first offshore plant in Hong Kong, and no new equipment was required. "We took the old, ancient equipment from Mountain View," Blank told an interviewer in 2008. "They just put it in crates and shipped it overseas. It came over there rusty, but they just sandblasted it, put a coat of paint on it and put it together; it worked fine."
Bob Galvin
No More Mobile Monopoly
October 1922 - October 2011
Motorola CEO Bob Galvin didn't design the first working handheld mobile phone -- one of his researchers, Marty Cooper, did that in 1973. But Galvin broke AT&T's monopoly on mobile-phone service in the U.S. by demonstrating a Motorola phone at the White House in 1981, spurring then-President Ronald Reagan to push the FCC to approve Motorola's proposal for a competing cellular network, just three years after AT&T had lost its long-distance monopoly.
Galvin, whose father and uncle started the business that would become Motorola, took the company's reins in 1956 and led it for more than three decades. During that time, Motorola expanded from the car radios and walkie-talkies it had been making into microprocessors (including the 68000 and PowerPC chips that powered Apple's Macintosh line), TVs and satellite communication systems.
Galvin also pushed to make Motorola's manufacturing competitive with non-U.S. companies, supporting development of the Six Sigma quality system starting in the 1970s. By the time Galvin retired as Motorola's chairman in 1990, the company dominated the cellphone hardware business.
Gerald A. Lawson
December 1940 - April 2011
The man who created the first home video-game system that used interchangeable game cartridges wasn't a typical Silicon Valley engineer. Jerry Lawson was 6-foot-6, more than 250 lbs. and African-American -- even more of an IT industry rarity in the 1970s than today. Lawson's creation, the Fairchild Channel F, arrived in 1976, a year before Atari's first home game system, and sparked an industry of third-party video games.
That wasn't as simple as it sounds. Lawson, who worked for a succession of government contractors before joining Fairchild Semiconductor, discovered that the biggest challenge with plug-in cartridges was satisfying the FCC's radio-frequency interference requirements. "It was the first microprocessor device of any nature to go through FCC testing," Lawson said in a 2006 interview. "We had to put the whole motherboard in aluminum. We had a metal chute that went over the cartridge adapter to keep radiation in. Each time we made a cartridge, the FCC wanted to see it, and it had to be tested."
The resulting game system was a moderate market success, but its biggest impact was on Lawson's friends at Atari, who rushed their own cartridge-based home system into production. The rise of the video game had begun.
George Devol
The Man With the Robot Arm
February 1912 - August 2011
If one man represents the real-world impact of IT, it's probably George Devol, who developed the first digitally programmable robot arm. A lifelong tinkerer with a fascination for electronics, Devol invented a system for recording sound for movies in the 1930s, then switched to systems that used photoelectric cells to open and close doors and sort bar-coded express packages (he also used "electric eyes" to count visitors to the 1939 New York World's Fair).
After starting a company that made anti-radar devices used by the U.S. Army in World War II, Devol turned his inventiveness to factory automation in the 1950s. The large programmable "Unimate" arm he developed used magnetic drum memory and discrete solid-state control components. It made its factory debut in 1961 on a General Motors assembly line in New Jersey, stacking freshly die-cast (and very hot) metal parts. By 1966, the arms were being used by other automakers for welding, spray-painting and applying adhesives, and the Japanese were using them, too. Within 20 years, Devol's Unimation was the biggest robotic-arm company in the world. (Here's a video interview in which Devol discusses his work.)
Devol's biggest public moment may have been one in which he never actually appeared. In 1966, the Unimate arm was a "guest" on television's Tonight Show, where the arm was programmed to sink a golf putt, pour a beer and lead the band. (See this video clip.)
Lee Davenport
December 1915 - September 2011
Lee Davenport didn't invent battlefield radar for tracking enemy planes, but the system he developed -- which used a computer to control anti-aircraft guns -- did its job better than any previous approach during World War II.
Recruited as a Ph.D. student in early 1941 by the top-secret Radiation Lab at MIT, Davenport oversaw day-to-day work on the SCR-584 anti-aircraft system. The U.S. Army began using it in combat in early 1944, first in Italy and then for the D-Day invasion. At the Battle of the Bulge, the radar system was also used to spot German ground vehicles in the snowy terrain.
In addition, the SCR-584 was used in 1944 to defend London against German buzz bombs. During that operation, Davenport said he found members of one U.S. anti-aircraft crew trying to read the SCR-584 manual in combat because they hadn't been trained to use the system. "Seven or eight buzz bombs came within range while I was there," he later said. "The crew never got a single shot off at any one of them." But once trained, the SCR-584 crews were very effective in shooting down the buzz bombs.
Wilson Greatbatch
Heartbeat of the Century
September 1919 - September 2011
It was an electronic mistake in 1956 that led to the first practical implantable cardiac pacemaker. Wilson Greatbatch, an electrical-engineering professor at the University of Buffalo, was building a heart rhythm monitor for the school's Chronic Disease Research Institute. When he attached a wrong-size resistor to a circuit, it produced intermittent electrical pulses -- which, Greatbatch realized, might be used to regulate a damaged heart.
Two years later, doctors at the Veterans Administration hospital in Buffalo demonstrated that a 2-cubic-in. implantable device built by Greatbatch could regulate a dog's heart. That same year, a pacemaker from Swedish designer Rune Elmqvist was implanted in a human patient, but it failed within days. In 1960, an improved version of Elmqvist's pacemaker kept a patient in Uruguay alive for nine months. But that year in Buffalo, 10 patients (including two children) received Greatbatch's device, and its battery lasted two years or more. In 1972, Greatbatch was able to re-engineer the device with a new battery that worked for more than a decade. (Visit the Vega Science Trust website to watch an hour-long video interview in which Greatbatch discusses his work.)
Frank Hayes has been covering the intersection of business and IT for three decades. Contact him at email@example.com.