Forbes.com



Burning Questions, Final Answers



What was the first technology?


In the interests of stark realism, we'd like to nominate the handheld rock (aka the Palm Pummeler), which could be used for all sorts of practical purposes, including bashing in the heads of competing Homo sapiens, thus advancing the gene pool of the winners.


In tough times, cash is king. Which of the large tech companies has the most cash?


As of early spring, Microsoft, by a long shot, with $26.9 billion, nearly twice the amount of Intel ($13.5 billion) and five times the third-place finisher, Dell, at $5.4 billion (also three times GE's cash position). By the way, among the largest tech companies, the one with the least cash is Sprint FON, at $122 million.


Will a cell phone stop a bullet as well as a Bible does?


After months of testing this out in the Forbes ASAP benchmark laboratory, we can safely say that the prize still goes to the Bible. According to Mike Hill, engineering professor at the University of California at Davis, "The flexible and fibrous Good Book will stand more of a chance of stopping a bullet. The brittle cell phone, replete with mechanical inhomogeneities, is unlikely to withstand the force." On the other hand, the word inhomogeneities might stop a bullet by itself.


Amazon says it will be profitable by the fourth quarter of 2001. Will it be?


No, at least not on the bottom line, the way most of us think of profitability. Amazon is referring to operating profit, which doesn't include its billion-dollar debt payments. According to at least two analysts, Scott Reamer of SG Cowen and Jeffrey Fieler of Bear Stearns, bottom-line profitability won't happen—if it ever happens—until the end of 2002 or 2003. Yikes! OK, what about operating profit? Will Amazon make its Q4 goal? With the downturn in the economy, this might be less likely. In fact, Reamer and Fieler think it's more likely Amazon will achieve the goal sometime in fiscal year 2002. The key seems to be whether Amazon can drive sales in its new electronic and toy stores and continue to drive fulfillment costs out of those parts of the company that are profitable—namely books, CDs, and videos.


Was the Internet boom and subsequent bust ultimately good or bad for the U.S. economy?


Good. Just take one simple measure: the Nasdaq, before and after the August 1995 Netscape IPO. Back then, the Nasdaq was trading around 1,000, with a total market cap of $1 trillion. As of press time, it was around 2,000, with a market cap of about $3 trillion.


What chart best sums up why the Nasdaq has fallen from its high of 5,000?





Stock prices and earnings, 1871-2000

Real (inflation-corrected) S&P Composite Stock Price Index and Real S&P Composite Earnings.

Source: Irrational Exuberance, by Robert J. Shiller, Princeton University Press, 2000


Who is high tech's Edison?


If you mean someone who consistently remained an inventive genius over an entire career, we'd nominate the late Bob Widlar. Widlar was also the wildest figure of Silicon Valley's Wild West early days. Haven't heard of him? That's because Widlar didn't work on digital chips; he worked on linear devices (also known as analog devices), the supporting technology for digital circuitry—not to mention television, radio, and telephones. Unlike digital components, with their teams of mask designers, linear devices were uniquely suited for lone artistic geniuses like Widlar. For 30 years at Fairchild Semiconductor, then National Semiconductor, then Linear Technology, Widlar regularly appeared with some stunning new creation.

Widlar was also a maniac. Stories of his antics are part of Silicon Valley legend. At Fairchild and National, he kept an ax in his office, and when frustrated, he would attack the nearest tree or linoleum floor. He was once spotted wandering drunk down New York's Fifth Avenue in a snowstorm trying to figure out how to walk to New Jersey for a sales call. When he quit Fairchild, he scrawled on the six-page exit questionnaire: I WANT TO GET RICH. X. (He never signed his name.)

At National Semiconductor, when the company temporarily fell on hard times in 1969 and had to cut back on landscaping expenses, Widlar showed up one morning in his Mercedes convertible, pulled a goat out of the trunk, leashed it to the bumper, and had it chew on the lawn for the rest of the day. That night, in a nearby bar, he auctioned the goat off to the highest bidder. Widlar spent his last years living in Mexico, regularly flying to the Valley to deliver his latest brilliant design. Tech is unlikely to see his kind again.


Who said, "Sitting at your computer surfing the Web is not as good as the mambo"?


Writer Mary Gordon, at a lecture in San Francisco. Not to be picky, but we'd choose the tango.


Will English become the official language of the Internet, as it has become in aviation?


No, non, nyet, and nix. For one thing, domain names are no longer limited to English. With multilingual domain name registration, many Internet users can type domain names in their native languages. For example, sites can register domain names in Korean and Japanese, as well as traditional and simplified Chinese characters. Portuguese, Spanish, and Arabic domains will be available in the near future.


In how many languages does Microsoft issue software?


Thirty.


How many languages does Bill Gates speak?


He speaks only English but writes in XML, C++, C#, and Visual Basic.


Who was the first person to envision the Internet?


In his 1934 Traité de Documentation, a Belgian lawyer named Paul Otlet conceived of a Universal Network for Information and Documentation in which access would be gained through multimedia workstations that lay waiting to be invented.

In 1938 H.G. Wells, who tended to see engineers as the saviors of the world, published World Brain, which advocated the development of an encyclopedia to provide a systematic ordering of human thought and to act as a sort of super university.

Of course, Tim Berners-Lee created the World Wide Web, while Marc Andreessen and Eric Bina created a way to navigate it, but they're relative newcomers.


Will we ever be attacked by an incurable computer virus?


If you've ever lost all your work because a virus deleted, deformed, or defaced your hard drive, you already have been attacked by an incurable virus.

But on a larger scale, it's very unlikely that a killer virus would have time to take over all or even large parts of the Internet. Detection methods are getting as sophisticated as the new polymorphic viruses, such as the Love Bug. Even a virus that keeps changing its code and size or other attributes still retains enough of a fingerprint or pattern from its ancestor, experts say, to let new types of antivirus scanners pick up its scent. The common cold, on the other hand...


Who invented cookies and how did they get that name?


What we mean by cookies today (a way for Web sites to secretly know who is visiting) was not the original meaning. Legend has it that in 1970 an IBM computer operator at Brown University, who allegedly locked users out of their terminals until they manually typed the word cookies, inspired two MIT students to create a similar program. Why cookies? One theory is that the name refers to a popular cookie-bear character from a 1960s commercial. The current meaning of cookies came into being in 1994, when Lou Montulli, one of Netscape's founding engineers, concocted the first batch of Web cookies as a simple mechanism to let users return to their favorite Web sites without going through a lengthy identification process each time. The man People magazine named the "Sexiest Internet Mogul" in 1999 had no idea that his invention would ignite a lasting debate over privacy.


Does the Web site of the Girl Scouts of America contain cookies?


Yes.


What is a Worm Light?


If you're 12, you know that hiding under the covers with a flashlight and comic books is old school. Now you can play Pokémon until 3 a.m. with the help of a tiny Worm Light attached to your Game Boy.

The discovery that the Game Boy's connector port produced enough power for a small light-emitting diode was the spark for the simple device. Just plug the Worm Light into the port on the left-hand side and turn on the Game Boy. No batteries needed. Just think of the possibilities when proctologists get ahold of one.


What is a "booth babe"?


According to GraphicPower.com, a Web site for graphics professionals, "Booth babes are an integral part of the Macworld Expo experience." Which is to say that young, smiling women in attire that might be described as business-carnal are essential to drawing prospective customers to listen to vapid pitches about vaporware.

To view some winsome booth babes, check out: http://www.graphicpower.com/showreports/mwesf2000/gallery/bbabes/boothbabes.stm


Are there safeguards in place to stop a 1929-type market crash?


It depends on which stock exchange you're talking about. Two years after the 1987 crash, the NYSE instituted a number of methods to stop large-volume automated computer trading. These include "circuit breakers," which, just like the ones in your utility closet, turn off trading when the Dow gets overloaded due to a steep decline. There are three types of circuit breakers. If the Dow decreases 10%, trading is halted for 60 minutes. If it falls 20%, trading is halted for two hours. And if it falls 30%, trading doesn't happen for the rest of the day. "Collars" are another way to cool the market, and, again, stop large computer trading. If the market moves up or down 210 points from the previous day's close, a collar is attached to the raging bull or bear, which, in effect, limits sell and buy prices. In 2000, collars were put in place 50 times, and as of March 2001, 15 times. The Nasdaq, however, has no similar safeguards. According to spokesperson Scott Peterson, the exchange lets the market "find its own level."
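For the literal-minded, the three circuit-breaker tiers described above reduce to a few lines of Python. This is a sketch of the rules as stated, not the NYSE's actual trading systems:

```python
def trading_halt(decline_pct):
    """Map a one-day Dow decline (in percent) to the NYSE
    circuit-breaker response described above."""
    if decline_pct >= 30:
        return "halted for the rest of the day"
    if decline_pct >= 20:
        return "halted for two hours"
    if decline_pct >= 10:
        return "halted for 60 minutes"
    return "trading continues"

print(trading_halt(12))  # prints: halted for 60 minutes
```

Note that the tiers are checked from worst to mildest, so a 25% plunge triggers the two-hour halt, not the 60-minute one.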


How does fiber optics carry information using light?


Our very own George Gilder might ask, "How many angels of infinite bandwidth can dance on grains of sand spun into crystalline fibers and woven into worldwide webs?" But we're not in George's metaphoric league. We just want to know how something the width of a human hair and wrapped in what looks like a very long garden hose can transmit the entire contents of the Library of Congress from San Francisco to New York in the blink of an eye. The answer? First, you convert all those ones and zeros of data into light pulses (on and off) generated by a laser. Second, the pulses are sent down an optical fiber that generally has three layers: The central core is the region where the light travels and is an ultra-pure glass "injected" with a rare-earth element like erbium that helps to keep the light pulse strong. The surrounding layer of pure silica glass acts like a mirror and keeps the light in the core by a kind of caroming reflection. The outer layer is a protective coating of plastic. Finally, the cabling process combines multiple fibers into a protective structure composed of a combination of plastics and Kevlar and/or metal, to protect its fragile contents from nasty things like backhoes.


When will we solve the Last Mile Problem?


Guess what? We already have. More than 50% of U.S. households can get broadband (i.e., cable modems or DSL lines), but only a fraction currently do. Both technologies, which can be up to 30 times faster than a regular dial-up modem, are expensive and can be a nightmare to install. Plus, many people already have broadband at work. But the real reason is a lack of killer applications. TV had Uncle Miltie, Milton Berle, whom everyone just had to watch. The Internet doesn't. But what about all those cool things you keep hearing about, like an on-demand big-screen movie experience or producing your own entertainment by controlling cameras at a sporting event? Those types of applications take huge amounts of bandwidth (say, 100 to 200 megabits per second), and at this point, that is only possible when fiber-optic cables come into the home. A few lucky communities, involved in test trials, are getting such bandwidth, but the majority of us will have to wait years. One estimate: 2020.


Are slide rules still manufactured?


Fiddlesticks—no one manufactures slide rules anymore. But if you're desperate for retrotech, you can buy them online from eBay. There's also a slide rule discussion group on Yahoo Groups called "Sliderule." Be there and be square.


Who is the oldest really good computer game player in the United States?


Check out former game show host Geoff Edwards, aka Poacher, who at the age of 70 still frags players one-fifth his age from his cyberheadquarters in Los Angeles. So concerned is Edwards to protect the feelings of his virtual victims that when he stops playing Quake, he signs off, "Gotta go. Mom's calling me for dinner."


What was the most money made selling a domain name?


In November 1999 Jake Winebaum and Sky Dayton, cofounders of the Internet incubator eCompanies, paid $7.5 million for Business.com. Since then, they have sunk more than $75 million into the site in hopes of making it into a successful business portal.


What does Esther Dyson actually do, and is she any good at it?


Like lots of other trumpeters of the New Economy, what Esther Dyson did best during the go-go '90s was make a joyful noise. Technically, she's president and owner of the computer consulting firm EDventure Holdings, but her real position is as the doyenne of the digerati. How one gets to be a doyenne we've never quite figured out, or we'd be one by now. Like celebrities who are famous for being famous, Dyson is important because a lot of people think she's important, and they pay plenty to attend her conferences, where she never fails to make entirely unsurprising pronouncements.

Daughter of renowned physicist Freeman Dyson, the man who said, "It is better to be wrong than to be vague," Esther has figured out how to do both simultaneously. Dyson knows Russian as well as guru-speak, which must have made it seem like a naturally good idea to invest in computer ventures in Russia. Our advice? Learn Chinese.


Has any tech executive ever been kidnapped?


Two assailants abducted Adobe cofounder Charles M. "Chuck" Geschke from the parking lot of his Mountain View, California, office on May 26, 1992. Held for a ransom demand of $650,000, Geschke spent four days in captivity before being rescued unharmed, but the incident served to draw dramatic attention to the lax security habits of Silicon Valley's new multimillionaires. Discussions of personal security for these high tech executives were initially kindled in the mid-1980s after Apple CEO John Sculley escaped a kidnapping attempt while out jogging. Later, however, he was run off the career path by Steve Jobs.


Is it possible, using existing technology, to keep a severed human head alive and conscious?


The most likely scenario is a head transplant. In a 1999 interview with New Scientist, Dr. Robert J. White, professor of neurosurgery at Case Western Reserve University School of Medicine in Cleveland, said he could scrub up and perform the operation in an afternoon. Quackery?

Hardly. White is a Harvard-educated career neurosurgeon who published much of his early work in such journals as Science and Nature.

White performed head transplants on rhesus monkeys in the 1960s and '70s. During one 1970 experiment, the transplanted heads were revived to a state of full consciousness for up to 36 hours. They were alert and responsive. Their eyes opened and followed objects. They could bite your finger if you got too close.

White never placed an intact head on a mechanical support system. Too many unanswered moral and ethical questions, he said. He did remove the brains from the skulls of monkeys and sustain them by connecting major arteries and veins to pumps and oxygen chambers. A brain is more difficult to sustain outside the body than other organs because it consumes a great deal of oxygen, is more sensitive to oxygen deprivation, and contains no carbohydrate reserve to feed its chemical reactions.


Does an elevator's Close Door button actually do anything?


Whether the button works is largely a matter of traffic flow, according to Michael Jordan-Reilly of Otis Elevator. In a hotel lobby, for instance, where people often are waiting with their luggage, the elevator is programmed so the Close Door button will work to help get them up to their rooms faster. But in a busy office building where companies want to jam as many people as possible into one elevator to keep their lobbies clear, the Close Door button won't work.


A lot of companies got hit when the bubble burst. Which of the largest tech companies was hit the hardest?


That's easy: Cisco, which lost $325.7 billion off its market cap between 1999 and 2000. CMGI was the biggest loser in terms of percentage drop in market cap: 97%.


Will there ever be compatibility among operating systems?


No.


Can the Bush administration's top technology adviser program his own VCR?


Surprise, surprise, he can. Or at least he says he can. (In the interest of full disclosure, we say we can, too, but we can't.) Then again, maybe it's not so surprising. Floyd Kvamme, a partner at the Silicon Valley venture capital firm of Kleiner, Perkins, Caufield & Byers, is, after all, an engineer. Before he started slathering entrepreneurs with money, he was one of the original "Fairchildren," the daring young nerds who launched the digital revolution. The good news is that our tech tsar is an actual techie. The bad news is that he isn't really a tsar. Word is that Commerce Secretary Don Evans isn't ready to surrender control of the main driver of the New Economy, so Kvamme will be a part-time adviser to the administration, not an inside-the-Beltway big shot. The Valley venturer and his wife gave more than $145,000 to help George W. become president, but apparently that wasn't enough to get a place setting in the inner circle.


Who first called it "surfing the Web"?


Well, it wasn't Vinton Cerf, father of the Internet. And it wasn't some rad code-writing surfer living in Malibu, and it definitely wasn't the Hittites (see question No. 45). The term was coined by one Jean Armour Polly, a former public librarian, while she was working on an article about the Internet in 1992:

"At that time I was using a mouse pad from the Apple Library. The one I had pictured a surfer on a big wave. 'Information Surfer,' it said. 'Eureka,' I said, and had my metaphor."

Except for that slightly suspect "Eureka," we believe her.


How many bones did Oracle CEO Larry Ellison break when he tried to body surf a 30-foot wave in Hawaii?


Well, OK, maybe it wasn't 30 feet high, but it was big enough to break Larry's clavicle in three places, his neck in two places, and several ribs. Among Hawaiian surfers, the term for kids who do really stupid things is grommet. On Wall Street, they just say, "Short the stock."


What percentage of Internet sites is porn related?


The percentage is almost impossible to determine—and we tried really, really hard.

We did find an estimate from an outfit called Datamonitor, suggesting that 69% (granted, a suspicious number) of all Internet content in the U.S. and Europe is adult related. But that was from 1998, and this is clearly a growth industry.

According to recent surveys by Zogby International, more than 40 million Americans look at Web porn, presumably shouting to their spouses, "I'm just reconfiguring my browser" when asked what's taking them so long.


What's the difference between a Ponzi scheme and the Internet stock bubble?


Let's get this straight: A Ponzi scheme is a premeditated fraud in which a con artist hypes a nonexistent or overvalued investment. The fraud is sustained for a while by paying "dividends" to early investors from funds collected from later ones. It takes its name from Charles Ponzi (or as we affectionately call him, The Ponz), who in 1920 took in an estimated $15 million selling phony promissory notes to gullible investors.

By contrast, the "Internet bubble" was about buying and selling overvalued stock in companies with dubious or nonexistent business plans. The run-up in value was based on the fact that many others had bought it. As you can see, there's a BIG difference!


What has proved to be the most inaccurate prediction in technology?


Without question, the promise of the paperless office. According to the PaperCom Alliance, a nonprofit that studies the future of paper-based products, electronic communication actually increases paper use in virtually all market sectors. As a side note, the very upstarts that promised to take advantage of electronic efficiency, e-commerce companies, have consumed massive amounts of paper with their direct mail, catalogs, and print advertising to build brand awareness.


When was the term vaporware coined?


Newton's Telecom Dictionary says the term originated after Bill Gates announced the Windows release at the fall 1983 Comdex show in Las Vegas. Then, like your last flight, the release was delayed, delayed, delayed.

Once out of the box, vaporware has spawned dribbleware (software released in small increments); slideware (products that exist only in a vendor's slide show); bloatware (software that eats your computer); not to mention brochureware, expireware, guiltware, hookemware, nagware, shovelware, trashware...stand by for no-there-there-ware.


What's the most notorious example of vaporware?


There's no official ranking, but Ovation Technologies' high-flying news conference in Manhattan's Windows on the World restaurant in 1983 is surely a standout.

The news conference was called to unveil what the Massachusetts startup hoped would be the sexiest product on the market, one that would help the company beat its main rival, Lotus Development Corp. Programmers spent months trying to build a financial spreadsheet of 700 vertical columns by 2,048 horizontal rows. The company raised $6.8 million in financing, set up a distribution deal with Tandy Corp., and received rave reviews in trade magazines.

Alas, Ovation didn't have a product, only a video presentation attached to a computer to give it a little verisimilitude. "It was smoke and mirrors," admitted Paul Davis, a former programmer for the company, in a Wall Street Journal article. Davis recalled that he had to snatch a computer keyboard from a reporter who was banging on the buttons to see if the program really worked.


Are black boxes really black?


Actually, they aren't. The cockpit voice recorder and flight data recorder—the so-called black boxes that investigators use to reconstruct the events leading up to a plane crash—are orange.


When was the term Silicon Valley first used and who coined it?


Forbes ASAP has the real story on this one. In 1971 Don Hoefler, then a reporter for Electronic News, a weekly tabloid magazine covering news, financials, and product announcements in the electronics industry, was sent to write a series of articles on the explosion in new semiconductor companies in the Santa Clara Valley, just south of San Francisco. Hoefler was sitting in a hotel lobby going over the notes from his interviews when he overheard two young out-of-town salesmen talking about their experiences visiting local firms. Being a good reporter, Hoefler not only listened but took notes. Said one salesman, "Boy, there sure are a lot of semiconductor companies around here these days." "Yeah," replied the other, "this place is turning into a regular Silicon Valley." Hoefler had his hook. He datelined the series "Silicon Valley, USA." The rest is history. To his credit, Hoefler never claimed kudos for inventing the name, only for recognizing it.


Who sent the first email?


Ray Tomlinson, in 1971 in Cambridge, Massachusetts, to another computer in the same room. The message? QWERTYUIOP—the keys across the top line of the keyboard. Unintelligible? Sure, but better than "Joke of the Day."


What is the oldest email chain letter still in circulation?


The Craig Shergold letter, which asks for cards to be sent to a 7-year-old boy who was diagnosed with terminal brain cancer in 1989. It started out as a mailed and faxed letter, but shortly thereafter found its way to email and Usenet and is still being circulated. Craig is now healthy and in his early twenties. Although he has requested that the cards stop, they continue to come.


What does ROFLMAO mean in an Internet chat room?


Rolling on the floor laughing my ass off.


The Pony Express traveled at 10 mph, the mail train at 60 mph, the airplane at 160 mph, the jet at 580 mph. What's the speed of email?


If all the networks, hubs, and routers are working flawlessly, an email sent from San Francisco to New York can travel at about 30,000 miles per second. International email, like international travel, takes a bit longer—about 10,000 miles a second.


How long does it take, and how many stops are involved, for an email to go from San Francisco to New Delhi, India?


Depending on countless factors that you may never understand, anywhere from a second to an hour. The bigger question may be: How does it get there at all?

First, your computer breaks the message down into chunks of digital bits, each about the size of 16 characters. These chunks are wrapped in "envelopes" with "address" information: the IP addresses of the mail servers where both the sender's and the receiver's mail accounts reside.

Press Send and these electronic fragments zip through a series of routers in places such as San Jose, New York, Santa Barbara, and Tokyo. Routers are "intelligent" devices that sit at gateways connecting the wires that make up the Internet.

The router sends the tiny packets of information via the quickest route, which often means the least-congested one. One packet, the "Dear John" part, might travel from San Francisco to San Jose, veer over to Tokyo, and finally land in New Delhi. Meanwhile, the "I love someone else" packet could travel from San Francisco to Honolulu to Mumbai and on to New Delhi. But they would typically arrive within milliseconds of each other, still heartbreakingly coherent.
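The chop-it-up, route-it-anywhere, reassemble-it-at-the-end trick can be sketched in a few lines of Python. This toy uses the article's 16-character chunks (real IP packets are far larger) and a sequence number standing in for the envelope's address information:

```python
import random

def packetize(message, size=16):
    # Split the message into fixed-size chunks, each wrapped in an
    # "envelope": here, just a sequence number for reassembly.
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    # Packets may arrive in any order, having taken different routes;
    # sort by sequence number and rejoin the chunks.
    return "".join(chunk for _, chunk in sorted(packets))

message = "Dear John, I love someone else. -- Jane"
packets = packetize(message)
random.shuffle(packets)  # simulate Tokyo vs. Honolulu routes and arrival order
assert reassemble(packets) == message  # still heartbreakingly coherent
```

Shuffling the packets before reassembly is the whole point: no matter which route each fragment took, the sequence numbers put the heartbreak back in order.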


What was the first money?


Plenty of odd stuff has been used over the millennia to represent value—seashells, whale teeth, beads, and scraps of paper with pictures of dead presidents. The Greeks even used iron nails, but the phrase "a nail for your thoughts" never caught on. Herodotus tells us that the kings of Lydia created the first coins in the eighth century B.C., though they might have swiped the idea from the Hittites.

In the late Middle Ages, the most sought-after and easily exchanged coins in Europe were forged in the mint of Venice from as early as the ninth century. These gold coins were called zecchinos, from zeccha, the word for mint, and give us the modern word sequins. Paper money evolved when goldsmiths gave receipts for coins left in their charge. But not until 1833 were the Bank of England's notes made legal tender.


Were there more suicides in Silicon Valley and San Francisco in 2000 than in previous years?


No, but if there is going to be a rise in suicides from the tech stock catastrophe of 2000, it likely won't happen until 18 months after the downturn began in March 2000. This phenomenon—call it the "post post-thing"—was observed in a long-term study at Johns Hopkins University in the early 1970s, according to Eve Meyer, executive director of San Francisco Suicide Prevention. People can endure a single personal crisis, such as losing a job or their savings, in the short term. "It is not until that person is laid off, runs out of savings, loses their family, and turns to substance abuse that they will start to contemplate suicide," she says.


How many blank CDs were sold worldwide in 1998? How many in 2000?


In 1998 it was 700 million. Last year this number grew to a staggering 3 billion. This year's sales volume is projected to be up some 45%. Thanks, Napster.


Whose scruffy loafers are these?


They belong to Bill Gates.


When was the novel Wired Love published and what was it about?


Wired Love, by Ella Cheever Thayer, was published in 1879 and was about a long-distance romance between telegraph operators.


How many bits in a nibble?


Four. A nibble is half a byte. Two bits, which used to be 25 cents, are now half a nibble.
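For the bit-curious, the arithmetic checks out in Python, where masking and shifting carve a byte into its two nibbles:

```python
byte = 0b10110110                  # one byte: 8 bits
high_nibble = (byte >> 4) & 0xF    # top 4 bits: 0b1011
low_nibble = byte & 0xF            # bottom 4 bits: 0b0110
two_bits = low_nibble & 0b11       # two bits: half a nibble, formerly 25 cents
```

A nibble can hold 16 values (0 through 15), which is exactly why one hexadecimal digit represents one nibble.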


In the original Mercury space program, what did the initials UCD stand for?


Urine Collection Device. And since you've asked, NASA didn't have one ready for Alan Shepard's first suborbital flight—with predictable results.


If technology increases productivity, why does automated directory assistance now take twice as long and work half as well?


AT&T residential service spokesperson Mark Spiegel, reached after only four phone calls, disputed the premise of our question: "No, you're wrong. It doesn't take longer."

So we called information in the 312 area code for the number of the Chicago Four Seasons Hotel. Simple enough inquiry, but after asking the automated computer the question, then repeating it for a human, we noticed that we'd been on the phone for an astonishing two minutes and three seconds. So Spiegel was right: It doesn't take longer—for AT&T.


Where did the term computer bug come from?


The term bug to indicate a glitch may have originated, as all things do, with William Shakespeare, who used the word to describe a disruptive event in Henry VI. But the computer bug story told most often comes from the lips of Admiral Grace Hopper. In 1945 a technician solved a glitch in the navy's Mark II Aiken Relay Calculator by pulling an actual insect out from between the contacts of one of its relays. The moth was taped into a logbook and put on display in the Naval Surface Warfare Center Computer Museum. Later, the incident was immortalized when geeks began to use the term debugging to describe solving problems with computer hardware or software.

So next time something goes awry in your Mark II Aiken Relay Calculator, remember Shakespeare's line: "So, lie thou there: die thou, and die our fear, For Warwick was a bug that fear'd us all."


By the way, who was Admiral Grace Hopper?


The admiral was one of the greatest military computer minds of her time. In 1944, as a lieutenant, she worked on the Mark I, a precursor to electronic computers, and later went on to help design the Univac I, the first commercial electronic computer.

Admiral Hopper was named the first computer science "Man of the Year" by the Data Processing Management Association in 1969. She was awarded the National Medal of Technology in 1991. She passed away in 1992. Five years later the navy commissioned a guided-missile destroyer, the USS Hopper, in her honor. The ship is nicknamed "Amazing Grace."

After spending her career in the military, it's no surprise she is quoted as saying, "It is much easier to apologize than it is to get permission."


Who is digital technology's Einstein?


Claude Shannon's theory of information is about as close as high tech has come to E=mc². In his 1948 paper, "A Mathematical Theory of Communication," Shannon showed that all information sources—telegraph keys, radios, people talking—have a rate at which they produce information. This can then be measured in bits per second. In other words, information is like any other measurable physical quantity, such as density or mass. Says Timothy Ferris, author of The Whole Shebang: A State of the Universe(s) Report, "If science continues to pursue the idea that the universe is based on computation, then Shannon's theory may loom large in future histories of science, as a first step toward understanding what is and is not information."
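Shannon's measure is simple enough to fit in a few lines of Python. This sketch computes the entropy of a source—the average information per symbol, in bits—and multiplies by an assumed symbol rate to get bits per second:

```python
from math import log2

def entropy_bits(probs):
    # Shannon entropy: average information per symbol, in bits.
    # Symbols with probability zero contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per flip; a loaded coin carries less,
# because its outcome is more predictable.
fair = entropy_bits([0.5, 0.5])      # 1.0 bit per symbol
loaded = entropy_bits([0.9, 0.1])    # about 0.47 bits per symbol

# At an assumed 100 symbols per second, the source's information
# rate is entropy times symbol rate, in bits per second.
rate = loaded * 100
```

The punch line is Shannon's: a telegraph key, a radio, and a person talking all boil down to the same unit of measure.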


What was one of the earliest inventions of Bill Hewlett and Dave Packard?


An automatic urinal flusher.


What happens to venture capital fundraising if Nasdaq remains in a prolonged bear market?


If history is any guide, fundraising could fall, at the very least, 69%, dropping from a high of $98 billion in 2000 to some $30 billion in a year's time. And if that isn't bad enough, it could take a minimum of three years to climb back to a point anywhere near the 2000 high.


If the smartest computer is no brighter than the dumbest human being, should we just forget the whole thing?


According to cybervisionary Ray Kurzweil, computers are already a lot smarter than the dumbest human being. "Our interneuronal connections compute at 200 calculations per second, whereas electronics is already 10 million times faster. Your personal computer can remember and retrieve billions of items of information easily, whereas we're hard-pressed to remember a handful of phone numbers."

Yet, despite Kurzweil's convincing statistics, we haven't met a computer yet that understood the difference between "To be or not to be" and "0001110101010101010101010101010101010101011111111111."


What's the worst tech decision ever made?


There are lots of contenders for this one, but the most egregious example is the herd mentality of venture capitalists over the past five years, as shown by their decision to invest $101.5 billion in Internet companies from 1996 to 2000. Just to be pessimistic, let's say all these companies plunge to their deaths. Would this compare to some of tech's other monumentally bad decisions? Believe it or not, yes. Early in Intel's life, a Japanese calculator company called Busicom enlisted Intel, then a memory chipmaker, to develop a new, multipurpose chip to make calculators cheaper. Intel responded by inventing the microprocessor. But calculator prices fell and Busicom let the rights revert to Intel. Estimated lost revenue: $170 billion.


How long will Moore's Law prevail?


Until August 1, 2007.


Are you sure?


No, we're not. But at some point, with current technology, it will simply be impossible to keep doubling the number of transistors on a chip every 18 months. It's not only a matter of space; the closer transistors are packed together, the hotter the chip runs. To overcome these hurdles, engineers in molecular and quantum computation are working to carry Moore's Law down to an entirely new scale. The real problem is economics: Will the price of research and production justify extending Moore's Law? The U.S. government thinks so and has allocated funds for its advancement. Gordon Moore himself pins the end of his law (with current technology) at sometime in 2007. As a self-fulfilling prophecy, the law demonstrates the power of prediction to become fact. The better question may be: Do we need Moore's Law to continue?


So, do we need Moore's Law to continue?


No and yes. No, because we only use about 5% of our PCs' power at the moment. Yes, because how else is the computer industry going to convince you to buy a new PC unless it's much faster and more powerful?


Whatever happened to cold fusion?


Like alchemy or the perpetual motion machine, cold fusion is one of those scientific pipe dreams we really, really wish would come true. Alas, the methods behind the claim that we could produce an almost infinite, magically efficient energy source by doing nuclear fusion at room temperature turned out to be inconsistent or fraudulent, depending on your level of skepticism.

Still, an underground network of cold fusion experimentation continues. Futurist Arthur C. Clarke is among the believers and funds Infinite Energy magazine. If the truth is out there, cold fusion is way out there.


Whatever happened to cold-filtered Miller Genuine Draft?


It's not genuine draft, and like cold fusion it's still an impossible dream.


What is the worst high tech merger?


Mattel's purchase of the Learning Company, an educational and entertainment software company, for $3.6 billion in May 1999. Mattel executives were hoping to become a significant presence in the children's software market. Unfortunately, the Learning Company was a business on the decline, a victim of commoditization, according to George Boutros, Credit Suisse First Boston's managing director and head of global technology, mergers, and acquisitions. Two years later, the toymaker sold the company for practically nothing.


What is the greatest product of the digital age?


You probably thought we'd say the Internet browser, and you'd be wrong. It's the Intel 8080 microprocessor. It wasn't the first microprocessor (that was the 4004) or even the first with eight bits (the 8008). But it was the first true single-chip microprocessor with a modern architecture. It set off the video game and personal computer revolutions and established the design that defines all microprocessors in use today. Without it, an Internet browser wouldn't be able to browse. And by the way, there are only 400 million people browsing the Internet, compared to 2 billion microprocessors.


What is the biggest failure in tech history?


Iridium, the global satellite telephone company, must be near the top. Its 2000 demise cost investors some $5 billion (and its assets were bought for pennies on the dollar by Iridium Satellite LLC, which launched service in April). Or pick any of the high-flying dot-com companies, such as AllAdvantage.com, which burned through $167.8 million of investors' money. Or Pets.com's $152.4 million flop. But before you get too pessimistic about all the dot-com failures, consider Trilogy Systems. In the early 1980s computer genius Gene Amdahl (noted for the IBM 360 and Amdahl Computer Corp.) raised more than $200 million in a bid to leapfrog Moore's Law by building a mainframe computer on a single chip. The prototype, complete with its own tiny cooling tower, was switched on—and produced enough heat to melt itself to scrap. End of Trilogy.


What's the best magazine about the digital revolution?


We thought you'd never ask. Of course, there is Wired, if you can read green type on a green background. And Industry Standard, which always wowed us with its name. Followed by the occasionally fishy Red Herring. The grandfather of the bunch is Upside, which is currently looking for same. But in all modesty we'd be remiss to leave out good old Forbes ASAP, the only magazine we know of where As Soon As Possible means every 60 days.