Sunday, July 27, 2008

Genealogy, the Idaho Russet and Innovation

In April I was elected Chairman of the New England Historic Genealogical Society, one of my favorite organizations on earth. Not only does the Society have a world class staff, but its Trustees and Councilors are a group of extraordinarily talented individuals who dedicate their time, talent and treasure to collecting, preserving and interpreting--so our mission goes--the stories of families in America.

Founded in 1845, the locus of the Society has moved rapidly in the last decade from its beautiful library on Newbury Street in Boston (still active and vibrant) to a global on-line presence.

With that in mind, these are remarks I made shortly after becoming Chairman. The subject appears to be the Idaho Russet potato, one of which I pulled from my pocket during the speech. I suggest, for full effect, that you find one in your kitchen and place it on your monitor now.

Or, I have provided some here for ambiance.

Several Trustees/Councilors have asked that I post my remarks on-line. As I only had the speech three-quarters complete the morning it was delivered, I am taking the liberty of completing it in writing here.

And, for the record, I love McDonald’s French fries. Though, as the Roman Terence advised, moderation in all things.

Here are my remarks:
I want to take you back to, say, 1920, for a minute and posit the following: I believe there are a number of businesses, a number of industries, where a reasonable person could have predicted how they might evolve over the next 80 years.

I'll give you an example: Automobiles.

I believe you could have looked at the automobile industry in 1920 and forecast a fairly accurate future for the automobile: faster, sleeker, more powerful, safer. You wouldn't have gotten the electronics right--things like antilock brakes--and you might even have assumed that by now the auto could fly or drive underwater; in other words, if anything, the industry has evolved more slowly than you might have predicted in 1920.

The same is true, I believe, of aircraft: faster, sleeker, higher. Maybe we would have had them landing on the moon, or hovering like helicopters. But, I don't think we would have been too far from reality.

Now, let me give you a contrary example. Some of you will recognize this as an Idaho Russet potato. Is there anything more low-tech than a potato? But here's the irony: take a relatively complex technology like the automobile and predict the future--not so hard. But take a low-tech commodity like a potato and, well, I don't think there is any way you could have stood on a potato farm with (the recently departed) J. R. Simplot in 1920 and predicted what would happen to the Idaho Russet over the next eighty years.

In 1920, of course, potatoes were already a staple of the American diet, but they were harvested by hand and usually eaten in one of three ways: baked, boiled or roasted. Today, of course, we still do that, but of the 79 lbs. we each eat on average annually, 40 lbs. are baked, boiled or roasted and 39 lbs. come in the form of--you guessed it--the most widely sold food item in the United States, French fries.

Today, the lowly Idaho Russet potato is picked and sent to a processing plant that handles several million pounds of 'taters a day. Rarely touched by human hands, it'll have its jacket blown off by hot steam and then be fired at 117 feet per second through a hose into a finely sharpened grate, coming out the other side in a perfect French-fry cut. Blanched. Frozen. Almost ready.

Because next we'll have to visit a fragrance and flavor company along the New Jersey Turnpike to come up with the flavoring to be added back to the fries to make them taste like French fries, since the processing tends to take much of the flavor away (and we're no longer allowed to fry them in beef tallow).

The real irony, of course, is that when you buy that bag of French fries and eat all 534 calories, 54% of those calories will come not from potatoes but from corn. (Those of you who have read The Omnivore's Dilemma will know that corn has taken an even stranger journey than the potato.)

What did we miss? A potato is, after all, a potato. And, were we predicting its future in 1920, we might have anticipated better pesticides or better fertilizer and higher yields. But how could we have anticipated, for example, the culture of the automobile that grew up in the 1930s and 40s in places like southern California and led to things like drive-in movies and restaurants? How about the proliferation of franchising in the 20th century? Or applying high-speed industrial processing to food products--i.e., treating potatoes as "production units"? Or the ability to ship cold product cross-country?

Hopefully you understand my point: a potato is still a potato, whether it's 1920 or 2008, but nothing else around it is much the same.

Now let's talk about genealogy for a moment. I believe you could have stood in this Society in 1920--or 1850, for that matter--and reasonably predicted what the genealogy world would look like in 1985. More books. Better access to archives. Better scholarship. More integration of national or global records. Maybe even, given the advent of photography, the arrival of microfiche and microfilm.

But here's the irony--where the potato gets shot at 117 feet per second out of the hose, so to speak:

I do not believe you could have stood in this Society in 1980, or perhaps even 1990, and reasonably predicted what the genealogical world would look like today. That's less than a generation of fortune-telling, and we likely would have guessed wildly wrong.

Why? Because the art and scholarship of telling our families' stories didn't change, but everything around it did. The web, email, podcasts and blogs. Digitization of data. Social networking and online sharing. A computer for every person.

So, let me "talk genealogy" to you for a second:

+ The Generations Network, the parent of Ancestry.com, boasts 2.5M active users and 8.7M unique visitors per month.

+ In Dec 2007, Findmypast.com was acquired by Scotland Online, which defines itself not as a genealogy company, but as an ISP and IT solutions provider that also owns ScotlandsPeople.

+ Familybuilder of New York, which searches for relatives via social networks, has attracted over 2 million registered users and over 7 million profiles since launching in June 2007.

I've just told you three critical things about the competitive genealogy world and I have yet to mention books, libraries or Idaho Russet potatoes.

That's not to say that books and libraries aren't still absolutely critical to great genealogy. But it does say the following, and this is the real reason I accepted the position of Chairman of the Society: What makes this place so exciting, so dynamic, and so full of opportunity is that we are a 163-year-old organization that focuses on history, embraces tradition and treasures scholarship--oh, and that just happens to be in a technology-driven, high-growth, rapidly evolving industry.

That speaks to being a new kind of organization, and I believe we have been watching that transformation over the last ten years.

Having our nineteenth-century Society thrive in the twenty-first century means nurturing all of the things that have made us a one-of-a-kind institution since our inception, and at the same time, taking smart risks, accepting ambiguity, addressing complexity, connecting old dots in brand new ways, and embracing technological change. It speaks to being very well capitalized so that we can address opportunities and weather setbacks. It means partnering in ways that advance the non-profit genealogy agenda. And it means having the kind of Councilors and Trustees who love tradition and love technology, and, on any given day of the week, cannot decide which they love more.
At which point I placed the Idaho Russet potato back in my pocket and sat down.

Wednesday, July 23, 2008

When the Other Side Adopts "Your" Innovation First


Earlier this year Seth Shulman’s The Telephone Gambit was published. In it, Shulman suggests that Alexander Graham Bell had a sudden, remarkable leap of intuition in his invention of the telephone after visiting the patent office in Washington and seeing key elements of Elisha Gray’s (very similar) patent.

That got me to thinking about the May 2006 New Yorker article which suggested that Mark Zuckerberg may have (might have, could have) stolen the idea for Facebook from his college friends. When Zuckerberg helped write some code for his dormmates one winter break and suddenly morphed from Harvard student to Harvard drop-out, bound for the West Coast, it sounded an awful lot like that same remarkable leap of inspiration that powered Alexander Graham Bell’s innovation. (Zuckerberg has since paid to make his problem go away.)

A few years ago I sat in on a National Chamber of Commerce meeting in Washington, D.C., to talk about global product and trademark theft. At the table were five major software companies, three of the four major league sports, a half-dozen luxury goods manufacturers and a representative of a high-end automobile company, who told me that there were knock-off firms in China capable of making every single part of his best-selling models.

It seems we know an awful lot about corporate, product and brand theft. We even read from time to time about corporate espionage, and know there are all kinds of sinister hacks traveling around the web, seeking to steal proprietary information.

Oh, and there’s that guy from Nigeria who will send you $1 million if you send him $10,000 first. (I read recently that there are almost 600 versions of this letter floating around.)

But, what about the other kind of theft, the kind where one side innovates, ignores or mismanages the innovation, and then watches the other side adopt it and take a commanding leadership position? This isn’t exactly innovation stolen. It’s more like innovation overlooked.

Usually at the price of great pain.

Michael Tougias and I wrote King Philip’s War: The History and Legacy of America’s Lost Conflict, which was published in 2000. King Philip, whose Native American name was Metacom, was the son of Massasoit, and his namesake war was fought between New England’s English colonists and members of the Wampanoag, Narragansett and other Algonquian tribes in 1675 and 1676.

Writing the book was a labor of love, meaning that neither of the authors was able to use the royalties to acquire a ski chalet in Telluride. Still, it sold well (and still does today, thanks to Michael’s gift as a one-man marketing machine), and has led to numerous speaking engagements at bookstores, libraries, historical societies and schools.

Of those, schools are perhaps my favorite, especially when I get in front of an antsy group of second-graders and try to dazzle them with tales of King Philip’s various body parts. (For the scoop on Philip’s famous head and hand, you’ll have to read to the bottom of the article.) Inevitably, however, the question arises: “Did the Indians use bows and arrows to fight the colonists?”

The short answer is, yes, but only when they had to--which was not very often. And therein lies a story of innovation “overlooked” that rivals anything happening today in China or on the web.

Here’s some quick history. By 1500, and certainly at the time of King Philip’s War in 1675, the matchlock musket was used almost universally as the weapon of choice for the common foot soldier. The matchlock was the first mechanism--really, a slow-burning wick that dropped into a pan of gunpowder--that allowed a soldier to use both hands to aim his musket.

It wasn’t elegant, and it often got wet and failed, but it served its purpose for two centuries.

Beginning about 1630, the flintlock was introduced to soldiers in Europe. The flintlock--in which a piece of flint, released by the trigger, strikes a spark into a flash pan and ignites the gunpowder--was a much superior technology to the matchlock. Not only was it faster to reload, but it was more likely to actually fire the musket.

By the time of King Philip’s War the flintlock was a 50-year-old innovation. However, it is almost certain that the colonial farmer who would be called upon to fight the war in the forests of Massachusetts and Rhode Island owned a matchlock musket. Geared to farming and raising livestock, these yeoman farmers had no particular reason to move away from their matchlocks. So, on training days, when able-bodied men were forced to practice with their militia on the village green—standing together in a clump and firing “volleys”--matchlocks were undoubtedly the weapon of choice.

Now, consider the Native American. The English in New England tried very hard to keep their Indian friends from gaining access to muskets. But it was a practical impossibility. Not only were individual English traders willing to sell or trade muskets to their Indian neighbors, but the French and Dutch ringing New England were also very willing to arm the natives.

It seems pretty clear to historians that the Native American, who often hunted for his meat, probably adopted and became proficient with flintlocks as soon as they became available in the colonies.

Thus, at the start of King Philip’s War, English farmer-soldiers with inferior matchlocks and poor shooting skills faced off against Native American marksmen who possessed superior European flintlocks.

Consequently, the first six months of King Philip’s War were essentially a rout in the Native Americans’ favor. At one point it appeared that the English might be pushed back to the Atlantic, pinned in their coastal towns like Boston.

The Indians simply took superior European technology and beat their European opponents over the head with it.

[There is an irony to this story, and a subject for another post, perhaps. At least one colonial commander was happy that the natives had switched from bows and arrows to guns because, to paraphrase, with bows and arrows they could get off five or six shots in the time it took to fire one musket ball. And, once the musket was fired, the colonial soldiers knew exactly where to aim by looking for the smoke. How many times does the older, inferior technology hold advantages over the shiny new innovation?]

Now, jump ahead nearly three centuries to the 1950s. By this time, Swiss watchmakers had dominated the world’s watch markets for more than a century, with particular success in the prior two decades as much of the world’s watch industry had turned to wartime applications. In 1924 a number of Swiss firms had pooled their resources to form the Swiss Laboratory for Watchmaking Research (LSRH), and by WWII they had introduced waterproof, shock-resistant and self-winding watches. Swiss firms stressed quality and beauty--essentially selling tiny, intricate, bejeweled mechanical machines--offered exclusively through jewelers and upscale retailers.

In 1967, the Swiss Horological Electronic Center invented the first quartz wristwatch. The quartz crystal could be made to vibrate so precisely that its oscillations would hold a watch’s accuracy to seconds per year (vs. minutes per week for competing automatics). A better technology, right?

The problem, as Amy Glasmeier says in Manufacturing Time, was that “With the advent of the quartz technology, the romance of the watch evaporated. Now virtually anyone could make watches and find a market for them in the world economy.”

This was decidedly un-Swiss, and the Swiss were loath to develop their own breakthrough technology. But it was just the recipe for an aggressive Japanese firm emerging rapidly from WWII.

In fact, by 1969, Seiko had duplicated the Swiss invention and launched the “Astron,” a serious blow to both Swiss and American watchmakers. By 1978, quartz watches overtook mechanical watches in popularity. From 1970 to 1984, the Swiss watch industry lost 60,000 jobs, while Swiss watch companies decreased from 1,600 to about 600.

Only in 1982, when the first Swatch prototypes were launched, did the Swiss watch industry begin its comeback.

Innovation invented and overlooked again—at great pain.

Needless to say, it just keeps happening. One of the great innovation heists in modern times occurred between Univac and IBM. Univac, credited with building the first commercial computer, knew without question that its magnificent machine was designed for scientific work. When business folk showed an interest, in fact, Univac ignored them. Meanwhile, IBM had followed suit with a computer of its own, this one designed for astronomical calculations. But when business came knocking, IBM decided to send a salesman. By 1960 Univac still had the most advanced computer, but IBM owned the market.

Oh, and since you’ve been good and read this far, let me tell you about King Philip’s body parts: When Philip was shot and killed in a swamp in Bristol, Rhode Island, in August 1676, he was decapitated and his head was sent to the Governor of Plymouth Colony, Josiah Winslow, as a trophy of the war. Winslow had the head set on a pike on the major thoroughfare in Plymouth, and there it sat for what we think was at least 20 years. (That alone will give you some sense of the impact the war had in New England.)

Meanwhile, one of Philip’s hands was identifiable by a scar from a previous firearms accident. This “remarkable” hand was severed and given to Alderman, the man who shot Philip. Alderman preserved it in a bucket of rum and made his living after the war exhibiting the hand for a few pennies in taverns around New England.

Now, when your teacher tells you that you must write me a thank-you note, you can forget all about the interesting story of matchlocks and flintlocks and innovation overlooked and, like my young second-grade friends, write, “Dear Mr. Schultz: Thank you for telling us the story of King Philip’s really, really cool head and hand.”

Tuesday, July 8, 2008

A Plague of Dead Squirrels (a.k.a. The Unintended Consequences of Innovation)

Yeah you got yer dead cat and you got yer dead dog
On a moonlight night you got yer dead toad frog
Got yer dead rabbit and yer dead raccoon
The blood and the guts they're gonna make you swoon


--Loudon Wainwright III, Dead Skunk

[NB: No animals were injured in the writing of this article.]

I am sad to report that I am predicting a plague of dead squirrels on the roads of my suburban New England neighborhood. Not tomorrow, but--guessing now--beginning in about 2010 or 2011, and easily stretching for a decade.

I’m predicting the same for your neighborhood as well.

And it won’t just be squirrels—it’ll be chipmunks and skunks, rabbits and raccoons, and a few mystified deer. I’m afraid, even in urban areas, there will be a few more bicyclists thrown from the carbon frames of their Kona King Zings. All beginning about 2010.

It will be, for better or worse, another in a long, unbroken line of unintended consequences surrounding otherwise staggeringly beneficial innovation.

Let me explain.

I just reviewed a book written by Clay McShane and Joel Tarr called The Horse in the City: Living Machines in the Nineteenth Century. It’s a fascinating look at the horse as a “living technology”--and a very persistent living technology--whose numbers continued to grow rapidly despite the steam and mechanization of the Industrial Revolution.

The horse of the 19th century is apt to call up a tableau from some bucolic farm, or perhaps of cowboys out on the open range. While these scenes certainly existed--powered by our allegiance to the American frontier myth--the explosion in the use of the horse in nineteenth-century America occurred primarily in urban areas.

By 1900, a city like New York contained an average of one horse for every 26 people, with 130,000 horses in Manhattan alone pulling street cars and food wagons, carrying firefighters and their equipment, removing snow, carting off the dead, and even providing sources of stationary power.

One of the most interesting features of an innovation is what happens (all around it) during rapid adoption. In the case of horses, we know the obvious: more people and goods got around the city faster and with less human energy. But what about the other consequences, the unintended consequences of such rapid adoption?

In the case of the nineteenth-century horse, the authors point to the following items:
Waste. It’s fair to say that Brooklyn agriculture was built on Manhattan manure. (Farmers termed Manhattan a “manure factory.”) It was only when imported guano became a cheaper commodity that this inter-borough trade slowed.

(Those of you interested in learning how bird poop is harvested, or indeed, how it achieved a sustainable competitive advantage over horse poop will, I’m afraid, have to seek sources outside this blog.)

Abuse. Urban reform groups like the ASPCA took up the welfare of the horse, policing against abuse while actively euthanizing old or lame horses, which were worth more to the rendering plant dead than alive. It became clear that most city-dwellers viewed the horse in utilitarian terms--a unit of production--subject to replacement when the creature became less productive.

Infrastructure. Cities had to create an extensive physical plant to support the horse, including municipal stables and carcass-removal programs. Parkways were created, in part, as venues for afternoon promenades. Meanwhile, the number of teamsters, hostlers and stable-keepers tripled from 1870 to 1890--a strange phenomenon in the face of the Industrial Revolution.

Medicine. The burgeoning urban horse population led to the rise of a skilled class of urban veterinarians.

Breeding. The horse became subject to breeding programs designed to increase its size and endurance.

Sprawl and suburbs. Street railroads pulled by horses not only encouraged the sprawl of residential neighborhoods but also enabled an expansion of amusement parks and resort destinations for the working class. Indeed, the size and stench of the attendant infrastructure virtually ensured that well-heeled urbanites would eventually find their way to suburbia, even if they had to create it in the process.

Farming. Hay production soared in the farmlands because of the growth of the urban horse. By 1909, more than half of New England’s farmland was involved in hay production. This led to improvements in hay-pressing technology and the ability to ship hay great distances.

Trade. A vast national and international trade in horses developed.
What McShane and Tarr make clear is that, while a horse is a horse (of course, of course), its innovative urban use led to a set of vast, largely unforeseen, and completely unintended consequences.

All of which got me thinking about some of the unintended consequences of more modern innovations.

Take the iPod, for example, one of the great entertainment gadgets of our times. Doesn’t it seem likely that one of the unintended consequences of the iPod will be a generation of Americans who begin to experience serious hearing loss in their 40s? Will the iPod one day double, with the flip of a switch, as a hearing aid?

Of course, its predecessor, the television, helped to create the couch potato, the TV dinner, and a habitually sleep-deprived society. (And we still adore it, so I suspect the iPod will still be treasured, even as our national hearing deteriorates.)

Some of the great unintended consequences of our time come from our medical innovations. The wonder drug of the twentieth century, penicillin, has led to the evolution of the superbug. Even Viagra (what could be wrong with Viagra?), a sensation with older men since its launch ten years ago, has the dubious distinction in a recent poll (of women--they finally polled women!) of leaving one-third of respondents just plain annoyed at having to have sex at the drop of a pill, and one in ten believing that Viagra led to their husband’s infidelity.

Of course, if you need ill-effects from innovation, look no further than the Web, which robs us of our time and concentration, and truly appears to be making us all stoopid.

The TV, the iPod and the Web; the Tinker to Evers to Chance of unintended consequences: Of the great inventions of the last century, one makes us fat, lazy and tired, one destroys our hearing, and one lowers our IQ and our ability to concentrate.

And we’d love to take an aspirin to make it all better, but that has unintended consequences as well. I just can’t remember what they are.

As for social innovation, a stunning article in the July/August Atlantic by Hanna Rosin suggests that one of the great social programs of our generation--demolishing public-housing projects in large cities to free the poor from the destructive effects of concentrated poverty--has led to steadily falling crime rates in large cities for the last 15 years. That’s the great news. The unintended consequence? Almost like a successful franchising scheme, violent crime didn’t disappear; it just relocated to the mid-sized cities. FBI data now pegs the most dangerous spots in America as Florence, South Carolina; Charlotte-Mecklenburg, North Carolina; Kansas City, Missouri; Reading, Pennsylvania; Orlando, Florida; and Memphis, Tennessee.

Which, speaking of the spread of violence, brings me back to my original prediction of lots and lots of dead squirrels.

In the same Atlantic issue, Jonathan Rauch masterfully profiles General Motors’ attempt to build a true electric hybrid by 2010 in his article, “Electro-Shock Therapy.” This is not your neighbor’s Prius, which is a gasoline-powered car with an electric assist. GM’s “Chevy Volt” will draw its power from any standard electrical socket and go 40 miles on a single charge. After 40 miles, a small gasoline engine will kick in, driving a generator that maintains the battery.

That means the wheels are always driven by the battery. That means the car will drive hundreds of miles on a tank of gas. That means the 75% of Americans who drive less than 40 miles a day will never buy any gas.

There are lots and lots of technology hurdles to clear, mostly around the battery, if GM is going to make its 2010 date. (You could have the car today if you didn’t mind towing the battery behind you in an air-conditioned U-Haul, for example.) But fear not; even if the date slips a bit, there will be electric cars on the road in the not-too-distant future. Like 2011.

Very eco-friendly. Very cool. Very innovative. Pretty darn fast. Awfully darn heavy. And very, very quiet.

And that is terribly bad news for squirrels. Because one of the unintended consequences of this breakthrough innovation will be, I’m afraid, a national sneak-attack on creatures of every sort caught, however momentarily, dallying in the road.

I sure hope someone is thinking about this. Maybe Michelin is inventing tires that whistle at some special squirrel frequency. Maybe the next generation of road asphalt comes with sensors. Because, in my town alone, I can think of any number of blind corners that are made safe only by the rumble of an internal combustion engine.

When I lived in New York City I used to worry about the squirrels of Central Park, confined to a little island and genetically severed from their brethren. I worried that they would become a race of beer-swilling, sausage-scarfing, spandex-wearing rodents who would one day strap on rollerblades.

Now, I am more inclined to worry about squirrels everywhere. And deer. And you and me, out jogging or riding our bikes.

Q: Why did the squirrel cross the road?

A: Because it couldn’t hear the one-ton, battery-powered rolling mass of silent steel bearing down on it at 50 MPH from around a blind corner.

Tuesday, July 1, 2008

It's a Messy, Messy World (or, "How the Web Was Won")

My very first job out of college was in the credit training program of the Chase Manhattan Bank on Wall Street. I never had any interest in becoming a banker, but I’d been deferred at business school for two years and needed to find something—“besides surfing or skiing,” per my grumpy Admissions officer—and Chase was kind enough to help out.

Frankly, despite my dim view of a banking career, I was pretty excited about working for one of the icons of American finance. At that time, the Chase (never just “Chase”) was a star, maybe the star, of world banking. David Rockefeller was CEO, and we actually got to see him nearly every week (when he was in town) reporting on his visits with potentates and on the general economy at the “Friday Morning Meeting.”

Consequently, it was a great surprise and a real disappointment for me to find, after a few months of work, that the Chase was peopled by--can you believe it?--people. The kind who occasionally did poor analysis. Sometimes made bad loans. Sometimes engaged in petty infighting. Took it as a personal affront if some other bank changed the prime rate before they did. Even lost a customer once in a while.

Oh, the Chase did have its share of brilliant people as well, but I was rattled by how human much of the activity was.

As a guy just out of college I saw this as an indictment of the Chase—it must have had one heck of a PR department to maintain its sterling reputation with all of these idiots walking the halls.

In retrospect, of course (with a little wisdom and time, and a few scars), I came to understand that the Chase had indeed been a truly great institution that just happened to have working at it, as I said, people. Good, smart, but ultimately flawed people.

Historian David McCullough is asked from time to time why history is important. After (I suspect) he suppresses a gag reflex, he offers a typically cogent response. Here’s one element of it, in relation to his book, 1776:
I want people to see that all-important time in a different way--in the way it was. For a number of reasons, including the absence of photographs, we tend to see the men and women of the Revolution as not quite real... it's a pageant in which the performers are all handsome as stage actors, with uniforms and dress that are always costume perfect. I want to be inside that other time. I want to convey the atmosphere of the time, what it was like to have been alive then, what the reality was for those people. I often think about how they would feel if they could read what I'm writing. I imagine them asking, "Does he get it?"

For me the key event is the Continental Army's escape from Brooklyn after being soundly defeated in the first full-scale battle of the war. It's the Dunkirk of the Revolution, and an example of both individual character and outside forces powerfully at work. On the one hand, Washington's role in the escape was leadership at its best. On the other hand, circumstances beyond his or anyone's control played a part almost beyond belief. If the wind had been blowing in a different direction, the British would have been able to bring their warships up the East River and seal off any possibility of escape for Washington and his troops. The war and the chances of an independent United States of America could have ended there and then.
Think about that. Washington had just lost a critical battle and made a complete mess of things, and if it hadn’t been for the right wind, Americans might now be drinking tea every afternoon.

To be fair, of course, if the weather had been different on the English Channel in 1588, the Spanish Armada might have destroyed the English fleet and Americans might now be drinking tea with paella every afternoon.

But, that’s the point, and why we study history. Without stories like this, everything looks neat and tidy and preordained. The history of the United States becomes: The Mayflower landed. The Revolution was won. The Civil War freed the slaves. American Idol premiered. The sun rose this morning.

The truth is, people are smart and courageous and stupid and weak and unpredictable. Even in iconic institutions, as my Chase experience proved. Plans go awry. Chance intervenes. Nothing—and I mean nothing—is preordained. We just get comfortable with the outcome and think, after a while, that it “had to be that way.”

That’s one reason why the oral history of the Internet—“How the Web Was Won”--in the July edition of Vanity Fair was such a welcome article.

This morning when you turned on your computer you saw a nicely arranged desktop, neatly gathered email, a bunch of your carefully harvested blog articles and access to a marvelous search engine. Maybe you shopped or bid on an auction, or checked your “wall” for postings from your friends.

Such cosmic order points to the history of the web as being: Something called ARPA. Mosaic. Netscape gets whipped by Microsoft. AOL. Amazon. eBay. Google. MySpace. The crazy guy dances on YouTube. The sun rose this morning.

It’s an easy trap to fall into, and makes us wonder, when we read about the current tribulations of Yahoo or Twitter, why the past is so darn neat and the present so awfully messy.

In the spirit of the Chase and David McCullough, then, and to disabuse you of this notion, I’ve captured my 14 favorite quotes from the Vanity Fair article. These come from the very folks who lived the early history of the web, the ones who remember the truth.

Enjoy, and remember when things don’t go perfectly today (and they won’t), that it’s just part and parcel of a messy, messy world.

1. Bob Taylor, the third director of ARPA’s computer-science division, proving that, every so often, innovation really is a “eureka” moment:
In my office in the Pentagon I had one terminal that connected to a time-sharing system at M.I.T. I had another one that connected to a time-sharing system at U.C. Berkeley. I had one that connected to a time-sharing system at the System Development Corporation, in Santa Monica. There was another terminal that connected to the Rand Corporation.

And for me to use any of these systems, I would have to move from one terminal to the other. So the obvious idea came to me: Wait a minute. Why not just have one terminal, and it connects to anything you want it to be connected to? And, hence, the Arpanet was born.

When I had this idea about building a network—this was in 1966—it was kind of an “Aha” idea, a “Eureka!” idea. I went over to Charlie Herzfeld’s office and told him about it. And he pretty much instantly made a budget change within his agency and took a million dollars away from one of his other offices and gave it to me to get started. It took about 20 minutes.
2. Paul Baran, who conceived one of the Internet’s building blocks—packet switching—while working at the Rand Corporation, reminds us, as Scott Berkun explains in his excellent book, The Myths of Innovation: “Ordinary things, people, and events are transformed into legends by the force of time, all the time.”
I get credit for a lot of things I didn’t do. I just did a little piece on packet switching and I get blamed for the whole goddamned Internet, you know? Technology reaches a certain ripeness and the pieces are available and the need is there and the economics look good—it’s going to get invented by somebody.
3. Baran, Bob Taylor (also with ARPA), and Bob Metcalfe (who invented Ethernet), on what happens when innovation upsets the status quo, how the Big Dogs sometimes miss the party completely, and why bad memories linger:
Baran: The one hurdle packet switching faced was AT&T. They fought it tooth and nail at the beginning. They tried all sorts of things to stop it. They pretty much had a monopoly in all communications. And somebody from outside saying that there’s a better way to do it of course doesn’t make sense. They automatically assumed that we didn’t know what we were doing.

Bob Taylor: Working with AT&T would be like working with Cro-Magnon man. I asked them if they wanted to be early members so they could learn technology as we went along. They said no. I said, Well, why not? And they said, Because packet switching won’t work. They were adamant. As a result, AT&T missed out on the whole early networking experience.

Bob Metcalfe: Imagine a bearded grad student being handed a dozen AT&T executives, all in pin-striped suits and quite a bit older and cooler. And I’m giving them a tour. And when I say a tour, they’re standing behind me while I’m typing on one of these terminals. I’m traveling around the Arpanet showing them: Ooh, look. You can do this. And I’m in U.C.L.A. in Los Angeles now. And now I’m in San Francisco. And now I’m in Chicago. And now I’m in Cambridge, Massachusetts—isn’t this cool? And as I’m giving my demo, the damned thing crashed.

And I turned around to look at these 10, 12 AT&T suits, and they were all laughing. And it was in that moment that AT&T became my bête noire, because I realized in that moment that these sons of bitches were rooting against me.

To this day, I still cringe at the mention of AT&T. That’s why my cell phone is a T-Mobile. The rest of my family uses AT&T, but I refuse.
4. Stewart Brand, co-founder of the Global Business Network and the Long Now Foundation, on the different and unplanned trajectories innovation takes:
The idea of Arpanet was that it was going to basically join up computational resources. It was not set up primarily to do e-mail—but the computational-resource connection turned out to be not so important, and the e-mail turned out to be the killer app. These were people who were just trying those two experiments, one to try to make the computational resources blend, and the other to stay in touch with each other conveniently. You were inventing in all directions, with no particular certainty what was going to play out.
5. Leonard Kleinrock of UCLA, on great moments underappreciated:
September 2, 1969, is when the first I.M.P. [or packet switch] was connected to the first host, and that happened at U.C.L.A. We didn’t even have a camera or a tape recorder or a written record of that event. I mean, who noticed? Nobody did. Nineteen sixty-nine was quite a year. Man on the moon. Woodstock. Mets won the World Series. Charles Manson starts killing these people here in Los Angeles. And the Internet was born. Well, the first four everybody knew about. Nobody knew about the Internet.

So the switch arrives. Nobody notices. However, a month later, Stanford Research Institute gets their I.M.P., and they connect their host to their switch. Think of a square box, our computer, connected to a circle, which is the I.M.P., 5, 10 feet away. There’s another I.M.P. 400 miles north of us in Menlo Park, basically at Stanford Research Institute. And there’s a high-speed line connecting those two. We are now prepared to connect two hosts together over this fledgling network.

So on October 29, 1969, at 10:30 in the evening, you will find in a log, a notebook log that I have in my office at U.C.L.A., an entry which says, “Talked to SRI host to host.” If you want to be, shall I say, poetic about it, the September event was when the infant Internet took its first breath.
6. On the unintended consequences of innovation: The arrival of e-mail was followed quickly by the arrival of “junk” e-mail, or spam. Gary Thuerk, a marketer for Digital Equipment Corporation, sent the first spam into the Arpanet in 1978—it was an open invitation to two product demonstrations in California. (The Ferris Research technology group estimates that the global cost of combating unwanted e-mails will reach $140 billion in 2008.)

7. On more unintended consequences of innovation:
As late as 1988, e-mail was still far from widely used—nearly all traffic was either academic or military-oriented. In that year Ronald Reagan’s former national-security adviser John Poindexter was indicted for his role in the Iran-contra scandal, and his trial was one of the first to bring e-mail into the courtroom. Dan Webb was the prosecuting attorney in U.S. v. Poindexter.

Dan Webb: I didn’t really know what e-mail was, to be honest with you. All of a sudden these top-ranking government officials were communicating back and forth with each other with amazing candor just as if they were in a conversation. And it opened my eyes to what, in effect, was a stunning change in the way evidence gets presented. What we’re always doing is we have witnesses, and we’re trying to reconstruct past historical events through the imperfection of recollection. All of a sudden you have these things called e-mails, where there’s a verbatim record of what was actually communicated at a point in time.
8. On yet more unintended consequences: When the Internet started to become a truly globalized system, the potential threats to it became more insidious—interconnectivity is both a strength and a weakness. The first significant attack came on November 2, 1988, in the form of the so-called Morris Worm, created by a Cornell graduate student named Robert Tappan Morris. Keith Bostic, a computer programmer then at Berkeley, was one of those who tracked Morris down.
Keith Bostic: Basically, Robert Morris finds a couple of security problems in Unix systems and figures he can write a worm. He’s a student. He’s not being malicious here. Fires that sucker off. And unfortunately he makes a pretty boneheaded programming error. Instead of doing what he intended, which was kind of, you know, to wander around the Net and have a good time, it just pretty much shut down all the network systems.
9. On how the media often misses the big stuff: The first browser to take off was Mosaic, created by Marc Andreessen, a student at the University of Illinois. Entrepreneur and Silicon Graphics founder Jim Clark soon took notice and partnered with Andreessen to create Netscape Communications.
Marc Andreessen: And so we basically said to ourselves, you know, if a lot of people are going to connect to the Internet, if only because of e-mail, and if all the P.C.’s are going to be going graphical, then you’ve got this whole new world where you’re going to have a lot of graphical P.C.’s on the Internet. Somebody should build a program that lets you access any of these Internet services from a single graphical program.

It sounds obvious in retrospect, but at the time, that was an original idea. When we were working on Mosaic during Christmas break between 1992 and 1993, I went out at like four in the morning to a 7-Eleven to get something to eat, and there was the first issue of Wired on the shelf. I bought it. In it there’s all this science-fiction stuff. The Internet’s not mentioned. Even in "Wired".
10. On the cost of poking the wrong bear:

Hadi Partovi was the group program manager for Internet Explorer at Microsoft. He later co-founded Tellme Networks and is president of iLike. Thomas Reardon was on Microsoft’s original Internet Explorer team.
Hadi Partovi: Both Marc Andreessen and Jim Barksdale were trash-talking basically. I mean, there was a competition between the companies, but it got to the point where they felt they were far enough ahead that they might as well trash-talk to build up the perception that these guys are going to win. On the one hand, you know, they were the David and we were the Goliath. On the other hand, Internet Explorer only had 5 percent market share in the Web-browser world, and nobody had even heard of it when we started out. And it definitely got people’s competitive juices up. Marc Andreessen had said something along the lines of “Windows will be reduced down to being a poorly debugged bag of device drivers.” And what that means is basically the relative value of Windows will be pretty much meaningless.

Thomas Reardon: Andreessen said that Windows was just a piece of shit. Well, that became a call to arms for us. We had this famous meeting called the Pearl Harbor Day meeting that year. Bill was going from talking about the Internet to: O.K., now we need a battle plan. The Internet Explorer team went from 5 people to 300.

Hadi Partovi: I personally printed out the strongest quotes from the Netscape people, with their faces, so if you walked down the hallway of the Internet Explorer team, you’d see the faces of one of these Netscape executives and what they said.
11. Jeffrey P. Bezos, founder of Amazon.com, on how it takes big and little ideas to keep innovation moving:
When we launched, we launched with over a million titles. There were countless snags. One of my friends figured out that you could order a negative quantity of books. And we would credit your credit card and then, I guess, wait for you to deliver the books to us. We fixed that one very quickly.

When we started out, we were packing on our hands and knees on these cement floors. One of the software engineers that I was packing next to was saying, You know, this is really killing my knees and my back. And I said to this person, I just had a great idea. We should get kneepads. And he looked at me like I was from Mars. And he said, Jeff, we should get packing tables.

We got packing tables the next day, and it doubled our productivity.
12. AT&T wasn’t alone in missing the Internet opportunity. Vinod Khosla created Sun Microsystems with Stanford classmates Scott McNealy and Andy Bechtolsheim, along with Bill Joy.
Vinod Khosla: The media people essentially did not think the Internet would be important or disruptive. In 1996, I got together the C.E.O.’s of 9 of the 10 major newspaper companies in America in a single room to propose something called the New Century Network. It was the C.E.O.’s of The Washington Post and The New York Times and Gannett and Times Mirror and Tribune and I forget who else. They couldn’t convince themselves that a Google, a Yahoo, or an eBay would be important, or that eBay could ever replace classified advertising.
13. On how innovation can make us all temporarily stark raving mad: The dot-com boom of the 1990s was epitomized by the initial public offering of Netscape Communications, in August 1995; on the opening day of trading, Netscape’s stock price almost doubled in value. Before long, Silicon Valley was the scene of the most frenzied investing in modern times. Some companies, such as Amazon.com and eBay, had realistic business models; many other start-ups did not. Record losses soon followed. Between March 10, 2000, and October 10, 2002, the NASDAQ Composite Index, which lists most technology and Internet companies, lost 78 percent of its value.
Hadi Partovi: There were so many start-ups where they’d have a fund-raising party. The company basically would have a business plan and a PowerPoint, no technology. They’d raise $10 million and then there’d be like $250,000 or $500,000 blown away just on the party.

Jeff Bezos: Many of those companies didn’t spend the money in a thrifty way. They would raise $25 million with a single phone call and then spend half of it on Super Bowl ads.

Hadi Partovi: Most investors didn’t understand the Internet. They just knew that these things that have “dot-com” next to them were worth a lot and were going to be really big someday, and they missed the last one. I remember DrKoop.com. And I remember they were losing money, I think $10 million a month or some crazy amount, and they still had an I.P.O. of almost a billion dollars, something really ridiculous.

Rich Karlgaard, whose Upside magazine was the first to cover the Silicon Valley start-up scene: The hottest job title during the frothy days was—you’d see 25-year-olds who had the title of “vice president, business development.” It was like sales without the quota. I remember asking one of these V.P., biz-dev guys how his company was doing, and he says, “Oh, it’s great, we’re into our third round of financing.” And I said, Well, how about the revenue side? Are you profitable? He says, “We’re a pre-revenue company.”
14. On eternal hope, and why the world remains a messy, messy place despite our best efforts otherwise:
Rich Karlgaard: And after it all, there was a bumper sticker you’d see in Palo Alto: “Dear God, one more bubble before I die.”