Tuesday, December 30, 2008

A Little Encouragement for Baby Boomers (& Happy New Year!)

The term aging Baby Boomers makes my teeth hurt, and I saw it more than once in the popular press in 2008. I suspect it’s an insidious plot by Gen X and Y to define their predecessors as “past their prime and fading into history.” Something they cooked up in journalism school. Last year, just before graduation.

Aging Baby Boomers. I hate that. And I respectfully submit that the very best days of our generation are still in front of us.

Maybe not physically.

OK. Definitely not physically.

But in terms of getting things done, making a contribution, and changing things for the better. The very best days are still in front of us—and it’s essential to our mental health to keep that idea front and center.

It reminds me of a visit my family made a few years ago to the beautiful Rijksmuseum in Amsterdam. The visit came not only during Rembrandt’s 400th birthday celebration, but during a renovation of the museum, so that the staff had assembled some of the very best of the collection in a very small area (and closed off the rest of the museum). This was music to the ears of touring parents who would try to get their children through about a dozen museums in ten days.

As we went through the guided tour I kept hearing about the Golden Age. The Golden Age this. The Golden Age that. It was then that I came to realize that the Netherlands—historic, lovely, democratic, energetic--thought of its best days as four hundred years ago, during the seventeenth century.

All I could think, in my ugly-American mindset, was how difficult it must be to live in a place where the belief is universally held that the golden age is not now, not tomorrow or next year or even a century from now, but 400 years ago.

Don’t you lose a step, a bit of energy, maybe some of your edge if you are always and ever on the downslope? Doesn’t it make the rocking chair look just a shade more inviting, to think the best is behind you?

With that in mind, and as you—my fellow Baby Boomers, reaching ages 49 to 66 in 2009 (if you accept Strauss and Howe’s 1943-1960 definition)--make your New Year’s Resolutions for 2009, think about some of the past contributions and milestones of folks "our age." If nothing else, it'll encourage you not to skimp when you make up your list.

These examples are all taken from Eric Hanson’s fun new book called A Book of Ages.

Here we go. Pay attention.

At age 49, Mark Twain wrote Huckleberry Finn and George Eliot wrote her masterpiece, Middlemarch. At 49, Abe Lincoln spoke out against slavery and lost his first debate to Stephen A. Douglas. Davy Crockett died at the Alamo after running out of bullets.

Have courage, roll with defeat, but don’t run out of bullets.

At age 50, Julius Caesar crossed the Rubicon, Henry Ford began manufacturing the Model T, FDR offered the nation “a New Deal,” Irving Berlin wrote God Bless America, Eugene O’Neill wrote The Iceman Cometh, Julia Child premiered The French Chef, and Igor Sikorsky (after 30 years of trial) flew his first helicopter. Grateful Dead guitarist Jerry Garcia introduced a line of neckties. Painter Chuck Close became paralyzed from the neck down but learned a new way to hold his paint brush and would continue painting three enormous canvases a year from his wheelchair.

Oh, and Charles Darwin, at age 50, published The Origin of Species.

Keep tinkering. Keep changing. Keep thinking outside the species. Never give up.

At age 51, Leonardo da Vinci painted the Mona Lisa. Some time later, and not to be outdone, 51-year-old Dr. Seuss was given a list of 225 words to use in his new book. Two of the words were cat and hat.

At age 52, milkshake machine salesman Ray Kroc learned that a hamburger restaurant in San Bernardino owned by Richard and Maurice McDonald was making forty milkshakes at a time. Kroc investigated.

At age 53, Walt Disney opened a theme park in California. Samuel F.B. Morse strung some wires between Washington and Baltimore. Charles Dickens was in a train returning from France that plunged off a bridge, killing ten. Dickens’ car was left hanging from the trestle. After he escorted a lady friend off the train, he went back on the teetering car to save a manuscript he had been writing.

Kind of puts into perspective the last time you lost your cool trying to recover that unsaved Word document, no?

At age 54, Oliver Cromwell became lord protector of England, Frederick Douglass was allowed to vote for the first time, and Robert E. Lee declined the invitation of Abraham Lincoln to lead Union forces. Alfred Nobel read his premature obituary in a French newspaper, found himself described as a “merchant of death,” and dedicated most of his enormous wealth to promoting peace.

Hanson’s book is a great read. A little bit male. A little bit artsy. Too much of a couple of folks in particular. But lots of fun to read. Buy it. There’s lots more that I’ve left out.

At age 55, Rachel Carson wrote Silent Spring, creating the modern environmental movement. Of less import but greater initial acclaim, Wilt Chamberlain published his memoirs, claiming 1.2 partners per day (20,000 in all) since he was 15.

At age 56, Henry Luce launched Sports Illustrated. George Frideric Handel premiered The Messiah.

Decades later we’re still singing, and still waiting for the bathing suit issue.

At age 57, Anaïs Nin admitted to two husbands, one a New York banker and the other a forest ranger in California. She compared her life to a trapeze. James Joyce shared copies of Finnegans Wake with his friends. George Washington was sworn in as the first president of the United States, saying he felt “like a culprit who is going to the place of his execution.”

It turned out pretty well, at least for Joyce and Washington.

At age 58, Miguel de Cervantes published part one of Don Quixote, Daniel Defoe published Robinson Crusoe, and Fyodor Dostoyevsky finished The Brothers Karamazov. Langston Hughes, who had written 26 books in 34 years, started writing from midnight to six or seven in the morning because people kept stopping by during the day and interrupting him.

Good energy and focus, eh?

At age 59, Elizabeth Taylor married for the eighth time. There’s lots more good 59-year-old accomplishments, but I’ll stop there. It seems like enough said.

At age 60, the prophet Muhammad and his followers conquered Mecca. Jack LaLanne swam from Alcatraz to Fisherman’s Wharf in San Francisco. Handcuffed. Towing a half-ton boat. Ditto the above—I’ll stop there.

At age 61, on a plane heading for Washington to be interviewed for a seat on the Supreme Court, Harry Blackmun did what he had always done when faced with a decision: he wrote a list of pros and cons. The pros won. Blackmun would go on to have a decisive influence on Roe v. Wade.

At age 62, Ed Sullivan uttered five words: “Ladies and gentlemen: the Beatles.”

At age 63, Lena Horne opened a one-woman show at New York’s Nederlander Theater that would run for 333 performances—the longest running solo show in Broadway history.

At age 64, Harry Truman was so far behind in the polls that pollsters just stopped asking. Truman embarked on a whistle-stop tour of hundreds of cities, went to bed election night thinking he had lost, and woke up elected. Isaac Newton was knighted, Mao Zedong launched the Great Leap Forward, and Henry Ford produced his 15 millionth Model T.

At age 65, Winston Churchill—arguably the greatest man of the 20th century—was elected prime minister of England for the first time. He told the country he had nothing to give but blood, toil, tears and sweat. Also at age 65, Andrew Carnegie offered $5.2M to the city of New York to build libraries, the start of some 2,800 libraries built nationwide from Carnegie funds.

At age 66, the oldest our Baby Boomers will be in 2009, Paul Revere built the first mill in the U.S. for rolling copper, eventually trademarked as Revere Ware.

Lest you think you can rest at 66, Baby Boomers, there’s more, but I’ll leave you with only a hint: At age 90, Frank Lloyd Wright was asked to design an opera house, two museums and a post office. More spectacularly, 90-year-old Sarah gave birth to Isaac.

Happy New Year, Baby Boomers! May our Golden Age be always in the future.

And remember: Don't run out of bullets.

Monday, December 15, 2008

The Incalculable ROI (Ode to the MBA and the Erie Canal)

Return on Investment (ROI) is a funny thing. It is deceptively easy to calculate but almost never comes out the way you expect. In many cases it doesn't work because what you can quantify will almost certainly pale in comparison to what you cannot.  These inestimable flows are, nonetheless, very real and very valuable.

I call this the incalculable ROI.

I was pondering such a thing when I read the recent Wall Street Journal article on Executive MBAs (EMBAs). Scott Thomas, a 31-year-old who was halfway through an EMBA program at a Cleveland school, dropped out to enroll in Ohio State University’s EMBA program. This doubled his tuition costs to $72,500 for the 18-month degree.

Later in the article the WSJ told us, based on their calculations, that this was probably a good thing for Mr. Thomas because the Ohio State program yields a 170% return on investment, third behind only Texas A&M and the University of Florida.

Here’s some of how the WSJ decided this:
We scoured the responses from our summer 2008 survey of EMBA graduates for data about salary, raises received after graduation, company-sponsored figures, tuition and out-of-pocket costs. . . . To calculate the benefit, or return, we used the graduate-reported median raise after completion of the program as the first-year salary increase. We added a 5% annual increase over the following four years, based on the average annual increase expected by compensation specialists and executive recruiters.
This kind of heroic analytics can result in being terribly precise about great inaccuracies.  It is the kind of limited ROI that, hopefully, Mr. Thomas will be taught to avoid during his EMBA training.
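For illustration only, the WSJ's method as described can be sketched in a few lines: count the median raise as the first-year benefit, grow it 5% annually over the following four years, sum the five years, and divide by total cost. The function name and the sample numbers below are hypothetical placeholders, not the WSJ's actual data or exact formula.

```python
def emba_roi(median_raise, total_cost, years=5, annual_increase=0.05):
    """Sum the raise over `years`, growing 5% per year after year one,
    then divide by total program cost (one plausible reading of 'ROI')."""
    benefit = sum(median_raise * (1 + annual_increase) ** y for y in range(years))
    return benefit / total_cost

# Hypothetical example: a $25,000 first-year raise against $72,500 in costs
roi = emba_roi(median_raise=25_000, total_cost=72_500)
print(f"{roi:.0%}")
```

Even a small change in the assumed raise or the growth rate swings the result substantially, which is rather the point about being terribly precise about great inaccuracies.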

In fact, there’s good indication that Mr. Thomas already “gets it.” One of his reasons for transferring to the Ohio State program was that “the alumni network is unbelievably large, and they’re unbelievably loyal.” That's one reason he agreed to double down on tuition. He has in mind a potential inflow, a potential return, that is—at this point in Mr. Thomas’ life—incalculable. Because, by moving to a program with a terrific alumni network, Mr. Thomas might well connect with a future investor in some future start-up. Or his next boss. Or a partner who goes on to help him launch a world-beating product. Whatever the “incalculable inflow,” Mr. Thomas believes that it will potentially—and more than likely—dwarf the ROI carefully calculated by the WSJ.

Steve Jobs’ well-traveled, oft-downloaded 2005 commencement address is another example of the incalculable return. Jobs dropped out of Reed College because he didn’t want to spend his parents’ life savings without having a clue what he wanted to do with his own life; he then spent the next year dropping in on classes for fun. One was a calligraphy class, something he took because he admired the art and was interested in learning how it was done.
None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them.
This is the incalculable ROI.  The quantifiable stuff is what you can see and reasonably anticipate, and it’s not unimportant or irrelevant by any means. But the real return almost always dwarfs the anticipated cash flows. The real return comes about because presumably smart, talented, ambitious people (like Scott Thomas) place themselves on a collision path with other smart, talented, ambitious people, or (like Steve Jobs) on an intercept course with interesting, engaging ideas.

And, the incalculable ROI applies especially to big ideas and huge initiatives.

Take the creation of the Erie Canal. On October 26, 1825, Governor DeWitt Clinton boarded the Seneca Chief at Buffalo, arrived in Albany nine days later, and then, with steamboats replacing horses, floated down the Hudson to New York harbor where he poured a keg of Lake Erie water into the Atlantic Ocean. It was, like the title of Peter L. Bernstein’s excellent book, a Wedding of the Waters.

The Erie Canal was 363 miles, 83 locks, 675 feet up and down, and cost $7,143,789 to build. A calculation of the ROI on the project would show that the construction cost was paid in nine years. In 1882, when tolls were finally abolished, the canal had produced revenue of $121 million, more than four times its operating costs.

On a micro level, a shipment of flour could, thanks to the Erie Canal, travel 2,750 miles by water before its transportation cost was equivalent to 130 miles of travel by road.
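As a quick back-of-the-envelope check of that flour figure, the source's own numbers imply water transport on the canal was roughly twenty times cheaper per mile than road haulage:

```python
# Sanity check of the freight figure above: 2,750 miles by water cost
# about the same as 130 miles by road, so the per-mile cost advantage
# of water transport is simply the ratio of the two distances.
water_miles, road_miles = 2_750, 130
advantage = water_miles / road_miles
print(f"Water was roughly {advantage:.0f}x cheaper per mile")
```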

All things considered, this was a pretty darn good ROI.

But, what did the Erie Canal really do? How do we, in retrospect, measure the incalculable ROI?

Well, thanks to the Canal, Albany, Utica, Syracuse, Rochester and Buffalo all became boomtowns while ten brand new municipalities were founded between Syracuse and Buffalo alone. And, New York City became one of the world’s elite cities: In 1824 some 324 vessels were counted in New York harbor. With the Erie Canal completed, an observer counted 1,241 vessels in the harbor one day in 1836.

Many hitherto subsistence farms in New York began growing crops for market. Farmers with new-found cash now joined their urban neighbors in the consumption of products from northern factories. Luxury articles became attainable by the working class. Thanks in large part to the Erie Canal, the cost of a wall clock dropped from $60 to $3 by midcentury, and a mattress from $50 to $5. Folks in Batavia were able to feast on Long Island oysters. Traders in New York City began buying commodity contracts ahead of the harvest, creating an early form of hedging and adding to the City’s financial clout.

The turmoil caused by the rapid development of western New York made it fertile ground for some of the greatest movements of the Second Great Awakening, forever changing religion in America.

The Erie Canal brought strong Yankee growth to Ohio, Indiana, and Illinois, keeping Southerners (arriving “up” the Mississippi) from dominating those states, a major factor in the years before the issue of slavery was settled. In fact, the Erie Canal changed the axis of American economic power from north-south to east-west, and would lead to an “age of canals”--especially in the North--that would provide a meaningful advantage during the Civil War.

The money generated by the Canal, the so-called Canal Fund, served as a kind of stabilizing central bank during the Crash of 1837 when no central bank existed.

The completion of the Erie Canal led to a sharp rise in patent applications along its route. New York State led the nation in new patents per capita in almost all sectors of the economy in the mid-nineteenth century, with only southern New England leading in manufacturing after 1830.

The Erie Canal meant Midwest food production would flood Europe, leading Great Britain in 1846 to repeal tariffs on food. This freed European labor for work in factories, reducing the cost of production around the world.   How do you calculate that ROI?

Scott Thomas believes in the incalculable ROI. So does Steve Jobs. So did DeWitt Clinton. I have a personal belief in the incalculable ROI.

I once bought a plane ticket to fly from Boston to Denver, with a stopover in Chicago. It was for a job interview. I was upset because the ticket was last minute and too expensive and I wasn’t sure the hiring company would pay for it (and I didn’t have much money). I wasn’t happy about the stopover in Chicago. I wasn’t sure I wanted the job and, after I’d interviewed, I was sure I didn’t want the job.

I suppose I was thinking that it would be a good experience. I could practice my interviewing. I might meet some interesting people. I could see how another company worked. I’d get to see the Rocky Mountains.

My calculated ROI on that ticket was miserable. But how about my incalculable ROI?

Well, on the way home, after meeting the company I didn’t want to work for, on the stopover I didn’t want to make in Chicago, with the ticket I didn’t want to pay for, I met a woman. Twenty-five years, six addresses, four dogs and three children later, we’re still together.

How do you calculate that ROI?

Thursday, December 4, 2008

Leadership in the White Space

Before I talk about white space I need to talk about dark matter.

Sometime in the 1930s scientists began observing phenomena, like clusters of galaxies, whose movements didn’t make sense based on what could be seen. There had to be more mass “out there” holding things together. In fact, there had to be a lot more mass. That was the first inkling of something called dark matter.

As time went on, more and more data suggested that not just some, but most of the cosmos was invisible. And scientists got very clever at observing, say, the way light from distant galaxies bends, figuring out the total mass it would take to create the bend, and then subtracting out the known mass. The result: dark matter.

It’s like Sherlock Holmes said, “Eliminate all other factors, and the one which remains must be the truth.” Even if it’s invisible. Even if nobody quite knows what it is.

Today, we think dark matter may be the most abundant and important stuff in the universe.

I believe there’s a comparable concept in business, organizations, and leadership, which is sometimes called “white space.”

It’s invisible, largely inscrutable, and may be the most important force in an organization. And there is no doubt that it bends light and energy to its will.

The other day on XM Radio I was listening to Bob Edwards interview one of XM’s Classical Music hosts. The two were discussing the enormous impact of Leonard Bernstein. Edwards asked a really good question, as he often does: If every member of the New York Philharmonic is a virtuoso musician able to read, interpret and play virtually any piece of music ever written for their instrument, what exactly does the conductor do?

The XM Classical host answered, and I’m paraphrasing slightly, “There’s a lot that goes on between the notes.”

That could be the title of a classical murder-mystery novel, no? Between the Notes: Mozart, murder, oboes, sex, and mis-tuned timpani. Yowza. The hard-boiled detective would look at the voluptuous flutist and say, “You may be able to hit the high E-sharp, but you have to know there’s a lot that goes on between the notes.”

Anyway, I thought the XM host touched on something real. There’s a kind of glue, a kind of energy that binds all of the talent in an organization together. Even when—or especially when—all the talent can hit every note on the button every time. With vibrato, even.

This reminded me of a CEO interview I read maybe ten years ago. This particular CEO, who led a brilliant management team, was asked what he did all day. He replied, “I just manage in the white space.”

In my first MBA year we were required to take something called Human Resources Management. I distinctly remember, on day one of the course, the professor said something like, “I know you all think this is a waste of your time. You think you should be learning how to read balance sheets and segment markets and value equity. And that’s part of what you should be learning. But I want you to know something: This is the most important course you’ll take. You just won’t recognize that for a while.”

He might have said—and we would have been equally skeptical—“Anyone can learn to do a cash flow. We’re going to spend the semester learning something very different, about operating in the white space. Some of you will never figure it out. But those of you who get it right, there’s some chance you’ll be successful.”

Learning to read a balance sheet is, in a way, like learning to play an E-sharp. It’s a real, teachable skill, one of the visible galaxies. It’s the stuff out there we can see and understand. But the stuff we can’t see, that binds all the cash flows and notes and keeps the galaxies from flying apart—that stuff needs lots of our attention, too.

Getting an orchestra of prima donnas to play together brilliantly is about leadership in the white space. Making a tough decision on an investment, building support along the way, and generating enthusiasm for the final decision--that’s leadership in the white space.

A short time ago we didn’t know dark matter existed. Now it appears to be the stuff that holds everything together. It’s where all the cosmic heavy lifting gets done.

In organizations, the white space does the same thing. It’s where all the notes get wired together. It’s where all the energy is manufactured. It is, in most cases, the fundamental contributor to success.

As for that murder-mystery novel I was proposing, the one with the sex and oboes, the one entitled Between the Notes?  Question the percussionist.

Any good detective will tell you to never trust a guy who hits things for a living.

Friday, November 28, 2008

It’s Beginning to Look (A Lot Less) Like Christmas

It’s no secret that consumers hit the brakes hard this fall, forcing U.S. retail sales down sharply and dimming prospects for a merry Christmas for the nation’s merchants.

Then came news that Christmas tree vendors were worried that consumers would do the unthinkable and go without this year. Some vendors, in fact, have put in orders for lower quality trees, hoping to keep their prices down.

(Personally, I think people will forgo Christmas in Hawaii and splurge instead on a better tree. Hard times tend to make home, hearth and tradition that much more important. At least that would be my bet.)

In any case, if your tree hooks to the left, leans to the right, and has a big bite taken out of its lower branches, at least you’ll know why.

Now, however, the unthinkable has happened: Even Santa is getting downsized this Christmas. In the face of the worst economy in years, communities around the country are scaling back the lights, shortening the parades and hiring fewer Santas. The WSJ reports that “Santa bookings have dropped so steeply that the Amalgamated Order of Real Bearded Santas, which represents 700 jolly souls in red velvet, held a series of meetings to discuss their economic survival.”

Santas who do get hired are reading stories instead of delivering gifts, just to keep their variable costs to a minimum.

It all makes you long for the kinder, gentler days of Santa when the big guy flew above economic downturns and the crass commercialism of the season. Or did he?

Well, I’ve finally finished Stephen Nissenbaum’s excellent The Battle for Christmas, and I’ve got news for you. Santa—who I always figured was just slightly younger than the Big Bang--was essentially invented in the 1820s and co-opted almost immediately by the rising material tide in the early Republic. He did, Nissenbaum argues, help to soften the season, which was violent and drunken in post-Revolution America. But Santa was and is part and parcel of our economic history, and has been intimately bound to our financial fortunes for two centuries.

Let’s return to the New York City of the early nineteenth century. As the population exploded from 33,000 in 1790 to 270,000 in 1835, the city spread rapidly northward from the very southern tip of Manhattan. Immigrants arrived and a growing, impoverished underclass arose. Nissenbaum writes, “In the second decade of the nineteenth century, New York underwent an explosion of poverty, vagrancy, and homelessness. That was followed in the third decade by serious outbreaks of public violence. In the eye of New York’s respectable citizens, the entire city appeared. . .to be coming apart completely.”

This was the downside to Jeffersonian and Jacksonian democracy. While Thomas Jefferson, surrounded by slaves and farmland, was spending himself into gentlemanly bankruptcy, he was also preaching the virtues of the common man. Unfortunately, a few too many common men didn't realize they were the cornerstones of virtuous democracy and were instead assembling in mobs and creating havoc in urban centers.

(I often think that Jefferson was able to wax eloquent about the common man because he never actually hung out with any. Adams, on the other hand, knew his neighbors well and became a Federalist, arguing against straight-up democracy. Sir Winston Churchill might have echoed Adams’s fears when he said “The best argument against democracy is a five-minute conversation with the average voter.”)

There was, in the first half of nineteenth-century America, what historians now refer to as “mobocracy”—a kind of violence that emerged as wage-labor developed, social norms were spun on their heads, older and newer immigrants clashed, and the American middle class was ultimately birthed.

Some of the mob violence was engendered less by emerging class warfare and more by alcohol. In 1825, the average American over fifteen years old consumed seven gallons of alcohol a year, mostly as whiskey and hard cider. The comparable number today is two gallons, mostly as beer and wine. Drunkenness was a serious public health issue in the early Republic that would be addressed with a vengeance by the emerging temperance movement.

Violence surfaced over Christmas and New Years in particular, Nissenbaum tells us, as a way for the poor and emerging wage-labor class to let off steam, a kind of escape valve. For wage-earners, the coming winter might mean lay-offs (as the rivers iced up), forced unemployment and want. Christmas could become a season to express dissatisfaction, ethnic or class resentments. It was “something less than a full-fledged radical movement but more than sheer, unfocused rowdiness.”

However defined, by the 1820s Christmas misrule had become an acute social threat. Bands of young street toughs, members of the emerging urban proletariat, had begun to travel freely and menacingly wherever they wanted. The year 1828 brought a particularly extensive and violent display, as an army of youth marched from the Bowery to Broadway to the City Hotel to a black neighborhood church to the city’s main commercial district to the Battery, causing destruction and beatings along the way.

As urbanization grew, well-to-do New Yorkers fled north to fenced and hedged estates, but the city was never far behind. In 1811, New York began a plan to construct a regular grid system of numbered streets, the ones we use to navigate by today.

At this point three men, Dutch-tinged “Knickerbockers,” enter our story. John Pintard was a prominent NYC merchant and civic leader, a founder of the New-York Historical Society, and a major player in establishing Washington’s Birthday, the Fourth of July and even Columbus Day as national holidays.

On the evening of December 31, 1820, Pintard’s household was awakened “by a band of loud revelers marching down Wall Street and directly outside his house, banging on drums, blowing fifes and whistles.” In a letter written early the next morning Pintard says he roused “mama” (in her kerchief, one wonders) and “threw on his clothes in haste, and down we sallied.” (Did he throw up the sash?)

Santa’s arrival? Not yet. Just a bunch of ruffians celebrating the season, causing havoc, assaulting a few pedestrians, and damaging an occasional shop along the way. Just another holiday night in early nineteenth-century New York City.

Pintard would spend the next day going to church and celebrating the season’s end with a round of visits with friends and acquaintances, followed by an afternoon family dinner of venison, holiday dishes and toasts. That was his kind of holiday—quiet, genial, safe, and domestic. (Nissenbaum does a great job contrasting the two versions of holiday celebration; buy the book!)

As a guy able to help create national holidays, could Pintard avoid taking a run at transforming Christmas?

In fact, from 1810 to 1830 or so Pintard set about inventing Christmas rituals in an attempt to create the perfect remembered holiday. He led the effort to bring St. Nicholas to America as the icon of the New-York Historical Society and the patron saint of NYC. In 1810 Pintard paid for the publication of a broadside, sponsored by the Society, which featured a picture of St. Nicholas bringing gifts to children during the Christmas season.

Now, enter our second Knickerbocker, Washington Irving, a friend of John Pintard. Irving’s The Sketch Book appeared in 1819/20, a smashing success, and one that made him an instant celebrity. Along with "Rip Van Winkle" and "The Legend of Sleepy Hollow," there were also five stories about Christmas. “In these stories Irving used Christmas as the setting for a culture in which all the classes joined together in paternalist harmony.”

Irving’s were lovely, popular stories, but historians now know that he was—like Pintard-- “inventing tradition.” Indeed, he later admitted that he had never actually seen the kind of Christmas he described. (In 1843 Dickens’ A Christmas Carol would round out our view of Christmas—roaring fires, carolers, snow-covered lanes and the assembly of families. Like Irving, Dickens was inventing tradition.)

Now, all it took was the contribution from a third Knickerbocker, Clement Clarke Moore. Moore, friends with Pintard and Irving, was a professor of Oriental and Greek literature at Columbia College, and later compiled a two-volume Hebrew dictionary at General Theological Seminary. But he was best known as the author of A Visit from St. Nicholas, now known as “’Twas the Night Before Christmas.”

Moore clearly borrowed from Irving, who mentioned St. Nicholas 25 times in his Sketch Book, including references to a wagon, Santa’s pipe, and a line that read, “laying a finger aside of his nose.” Moore undoubtedly drew from Pintard’s experiences as well—being awakened by an intruder on Christmas Eve. Even greater influences, however, were Pintard’s (and Moore’s) attitudes: conservative, opposed to Jeffersonian and Jacksonian mobocracy, fearful of the working class, and resistant to urbanization.

Interestingly enough, the year Moore wrote St. Nicholas for his children, the New York legislature gave men without property the right to vote; Pintard wrote, men “who had no stake in society.”

Even more pressing, in 1818 Moore’s huge family estate of Chelsea (now the namesake of the neighborhood north of Greenwich Village)—which in Moore’s youth had been pastoral and removed from the urban jungle--was being forever changed. In fact, part of Chelsea had been seized by eminent domain and was being split down the middle by something called “Ninth Avenue.” So, in the 1821 City Directory, Moore is found no longer living at Chelsea, but near the corner of Ninth Avenue and Twenty-first Street. The city had finally caught him.

Moore was upset by these changes to New York City, its loss of beauty and tranquility, given over to a conspiracy of “cartmen, carpenters, masons, pavers, and all their host of attendant laborers.”

These many forces came together in strange and wonderful ways in A Visit From St. Nicholas. Moore would take Pintard’s rude awakening of 1821 and turn it into a moment of magic—the basis for our own Christmas Eve.

First, in Moore’s poem, Santa posed no threat, though he was intentionally modeled as one of the working class. For example, he smoked the “stub of a pipe”-- a clear gesture to the proletariat, as patricians customarily smoked “mighty” long pipes (sometimes two feet long) known as aldermen or church wardens. (In fact, the working class often bought longer pipes and then broke them to a stub.)

In addition, Moore painted Santa as looking like a “peddler just opening his pack,” making him “something between a beggar and petty tradesman”—the veritable bottom of the barrel in the emerging craftsmen class.

Moore was not being subtle: The very kind of person the Knickerbockers feared most had now invaded his poetic home to do nothing more than respectfully deliver gifts to children.

There’s lots more, and Nissenbaum is superb in tracing the transformation of St. Nicholas from bishop into a proletarian Santa who brought upper and lower classes together peacefully—at least in verse—in nineteenth-century New York. Nissenbaum says the Knickerbockers used their invention of Santa Claus to help forge a “placid ‘folk’ identity that could provide a cultural counterweight to the commercial bustle and democratic ‘misrule’ of early-nineteenth-century New York.”

Oh, and that “finger-aside” thing? Today, we no longer know what “putting a finger aside of his nose” means. But to a nineteenth-century American, the message would be clear. Moore’s Santa was saying to the reader: “I’m only kidding. You know I don’t exist. Let’s keep this between the two of us.” Indeed, Pintard and Irving might have admitted the same to us, that their images and rituals of the holiday were largely inventive longing.

In 1823 A Visit from St. Nicholas was published by a newspaper in Troy, New York. The following year four new almanacs, all published in Philadelphia, printed the poem. By 1828, it was being printed widely around the nation. Not long after that (newly-formed departments of) police in Philadelphia and other American cities began looking for groups of unruly boys at Christmas, ready to throw them in jail. In fact, commercial Christmas presents had their start in the decade of the 1820s, and merchants began to have a vested interest in keeping the streets free of rowdy behavior so that shoppers could navigate their stores.

In 1834, a letter printed in a Boston Unitarian magazine would sound all too familiar to modern ears:
All the children are expecting presents, and all aunts and cousins to say nothing of near relatives, are considering what they shall bestow upon the earnest expectations. . .I observe that the shops are preparing themselves with all sorts of things to suit all sorts of tastes; and am amazed at the cunning skill with which the most worthless as well as most valuable articles are set forth to tempt and decoy the bewildered purchaser.

And leading the charge: Santa Claus. By the mid-1820s Santa was hawking goods, and by the early 1840s Santa had become a common commercial icon, a figure used by merchants to attract the attention of children to shops. (He had also, you'll note, regained his long pipe.) Interestingly enough, the depression that set in at the end of 1839—the deepest ever experienced by the United States—was openly countered by merchants who used Christmas as a way to attract shoppers. “Old Hard Times” was being replaced, readers were assured, by “Old Santa Claus.”

Just twenty years after the birth of Santa, and fifteen years after taking on a national presence, merchants knew that Christmas was the one time of year that good Jacksonian Americans, committed to frugality and many of them deeply distrustful of luxury, could be expected to buy and consume things—even if they did not need them and could not afford them. Christmas had become a special ritual time “when the ordinary rules of behavior were upended.”

“By mid-century,” John Steele Gordon wrote in An Empire of Wealth, “Christmas had become the major secular holiday it is today and would grow into the most important engine of the retail business.”

All of which brings us back to our own nagging recession and desperate need for a retail engine.

If only Santa would help us more.

Unfortunately, it’s really not looking very pretty. It turns out the Amalgamated Order of Real Bearded Santas (AORBS) is having internal labor problems of its own. The December issue of Harper’s Magazine reported that the Chapters and Lodges of the Pacific and Rocky Mountain Regions of AORBS separated from the parent organization this past April. Their charges, among others: showing a vindictive and persecutory attitude toward members, and engaging in un-Santa-like dialogue with members of the Order.

Vindictive and persecutory? Un-Santa-like dialogue? (Where have you gone, Joe DiMaggio?) Seriously. Page 26.

I may have to detox with a little Charlie Brown after this.

It’s a reminder that from his creation, Santa—invented as a sign of goodness, comfort and peace in a turbulent world—came from solid, sometimes combative working class stock. Not a bad combination.

Just watch out for that finger-aside thing.

Sunday, November 9, 2008

Smith, Slywotzky and Some Damn Beavers

Not far from where I’m sitting, just up the road apiece and across the state line, Jedediah Smith Sr. and his wife Sally Strong farmed land and raised a family that would eventually number 12 children.

The fourth, born in 1799 after the Smiths had moved from New Hampshire to New York, was Jedediah Jr. In 1821, Jed Jr. went to work for William H. Ashley’s fur trading company out of St. Louis. After a series of extraordinary adventures, Smith and partners acquired Ashley’s company and successfully took on the Hudson’s Bay Company monopoly in the fur trade. Today, Smith is remembered as one of the most successful explorers and entrepreneurs of the first half of the nineteenth century.

Also not far from where I’m sitting, in the lowlands along our driveway, a family of beavers is hiding out, waiting for sunset to repair their dam in a little culvert that flows under our driveway. These beavers don’t much bother me since our home sits on a rise above the lowlands. But, in a town where everyone has septic and wells, and where some of the wells reach the aquifer less than fifty feet below the surface, these beavers are bedeviling our neighbors.

It seems their magnificent dam is backing up water from nearby Cedar Pond, flooding acres and acres of well-tended backyard. It’s causing septic systems to overflow, and septic to flow into well water, and nothing that walks on two legs is very happy when that happens.

Finally, not far from where I’m sitting—in fact, on the bookshelf to my left--is a copy of Adrian Slywotzky’s The Art of Profitability, published in 2002. I had the opportunity to work with Slywotzky for a spell last century when a company for which he was consulting grew interested in partnering with our business. The Art of Profitability, which suggests that we’d all be a lot more profitable if we thought about profits on a regular, disciplined basis, is a delight to read and suggests a variety of ways to redefine business to make more money—a good tonic given our current economic straits.

Now, let me tell you how these things all tie together. And let me start with those dx!% beavers.

From about 1750, when they were hunted out of the Commonwealth, until 1928, when one was spotted in West Stockbridge, beavers had been absent from Massachusetts. Folks in Massachusetts were so excited by their reappearance that three more were imported from New York and released in nearby Lenox in 1932. Fourteen years later, the Massachusetts Department of Fish and Game reported that there were 300 beavers in 45 colonies around the state.

As with every other instance when human beings have messed with the balance of nature, we got it wrong. The well-meaning people of Massachusetts failed to offset their good deed by providing the beaver population with a natural predator. Then, to make matters worse for homeowners, in 1996 the state passed a ballot referendum prohibiting or restricting the use of many types of commonly-used traps.

The result: the beaver population in the state grew from 24,000 in 1996 to 70,000 in 2001.

Now, if I were pitching you on my new “beaver-pelt” business, I would take that annual growth rate of 24% and project it right into 2009, telling you that from one lonely beaver in 1928, the state beaver population is now over 385,000. And you can bet there are a slew of illegal-immigrant beavers and more than a few expired H-1B visa beavers to add to the organic growth of the population.
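For the curious, the back-of-the-envelope math behind that pitch can be sketched in a few lines. This is just an illustration of the compound-growth assumption in the text (24,000 beavers in 1996 growing to 70,000 by 2001, projected forward to 2009); the function names are mine, not anything from the state's wildlife reports.

```python
def annual_growth_rate(start, end, years):
    """Compound annual growth rate implied by two population counts."""
    return (end / start) ** (1 / years) - 1

def project(population, rate, years):
    """Project a population forward at a constant compound rate."""
    return population * (1 + rate) ** years

# Rate implied by the state's reported counts: 1996 to 2001.
rate = annual_growth_rate(24_000, 70_000, 2001 - 1996)
print(f"implied annual growth: {rate:.1%}")  # roughly 24%

# Naive extrapolation from 2001 out to 2009, as in the pitch.
estimate_2009 = project(70_000, rate, 2009 - 2001)
print(f"projected 2009 population: {estimate_2009:,.0f}")
```

Run it and the 2009 figure does indeed come out north of 385,000, which is exactly the kind of number a constant growth rate will cheerfully produce if nothing (a predator, a trap, a trend break) ever interrupts it.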

Despite the hundreds of thousands of beavers I now estimate reside in Massachusetts, we here in the neighborhood are only worried about the three or four which have set up housekeeping on the premises.

To solve the flooding problem, every morning one of my good neighbors comes by and takes down the allowed two inches of dam, letting the water flow back into Cedar Pond. Every evening the beavers repair their dam, ensuring that the floodwater stays in the backyards of my neighbors. The theory is—and I have yet to see it borne out—that we will eventually wear the beavers down and they will move away.

It all makes me pine for the 1820s and 1830s, the height of the beaver trade in America and across the Atlantic. Back then, one of my neighbors could have taken his gun, shot the beavers, walked to Newburyport or Salem or Boston, and sold their pelts for a good price. The pelts would be sent on to Europe to be made into one of the stylish hats that had Europeans all a-twitter.

And here is where Jedediah Smith comes in. Daniel Howe tells us the following about Smith, taken from Dale Morgan’s 1953 Jedediah Smith and the Opening of the West:
At the age of twenty-two [Smith] retraced much of Lewis and Clark’s route up the Missouri. During his short life Smith proved himself a natural leader, an intrepid explorer, and a successful businessman. Taking his Bible and a few companions, this sober, religious young man laid out the route of the future Oregon Trail over South Pass in 1824 and explored the regions of the Great Salt Lake. . .Along the thousands of miles that he traveled without maps, he fought some Indians, traded with others, survived hunger, thirst, snowstorms and floods, and got mauled by a grizzly. He successfully challenged the Hudson’s Bay Company in the fur business, and with two partners was able to buy out his employer Ashley in 1826. A rich man when he returned to St. Louis in 1830, Smith had seen more of the Rocky Mountain West than anyone else in his time. . .

If you are not familiar with the Hudson’s Bay Company (HBC), it is the oldest commercial concern in North America, dating from 1670. It now operates retail stores, and is owned by a private equity firm (which seems weird to say), but in the first part of the nineteenth century it was the most powerful private organization in North America and actually ruled most of Canada.

The HBC, for parts of two centuries, was essentially its own sovereign country.

How do you attack a giant? One way is to change the profit model, and that’s what Smith and his partners did. Starting in 1825, the firm of William H. Ashley paid salaries to keep white trapper-traders in the wilderness year round, departing from the depot-based, pay-per-pelt practice of HBC.

I once visited the offices of a very large pharmaceutical customer and, as we were walking down the hall, my guide was saying, “That’s where the VP Marketing is, and that’s where the VP Production is, and that’s where the executive from 3M has his office.”

“3M,” I asked?

“Well,” said my guide, “we do so much business with them that it makes sense they should have an office permanently here to help organize our purchases.”

This was essentially the idea that Smith and his cohorts hit upon: Let’s pay to keep the trapper-traders in the wilderness, living alongside the natives, building up long-term relations and establishing strong interior distribution. It was the Early Republic equivalent of the Application Engineer.

That’s how you take on a giant, then; you change the way profits are made.

All of which leads me back again to Adrian Slywotzky’s The Art of Profitability. In it, Slywotzky suggests that you read only one chapter of the book (i.e., one profit model) a week and really stew on the material. Embedded in most chapters is other recommended reading, like Innumeracy: Mathematical Illiteracy and its Consequences, Asimov on Astronomy, Einstein’s Dreams, Confessions of an Advertising Man, and Ezra Pound’s ABC of Reading. The entire story takes place as a conversation between a mentor and struggling mentee.

Chapters include Pyramid Profit (Mattel developed a barely-profitable $10 Barbie to block low cost knock-offs from establishing a connection to their customers), Multi-Component Profit (Coke makes a different profit per ounce in the supermarket, restaurant and vending machine), and Switchboard Profit (Michael Ovitz packaged talent, story, and critical mass in Hollywood to gain share and profits).

The chapter that struck me, however, was one called Customer Solution Profit. A company, FactSet, generated high profits by identifying a potential customer and then sending a team of people to work onsite at the company, sometimes for months, and almost always for free—an extraordinarily costly proposition.

It would be like paying salaries to trappers to stay in the woods year-round.

If FactSet won the account, however, they would have built such great relations with their customer, and such a tailored solution, that their profits soared, easily making up in the longer term for what they had invested in the first year. By then, their product and people were woven into the account, the expensive selling was done, and the account became a long-term, high-margin profit center.

There’s more where that came from, all courtesy of Slywotzky’s engaging storytelling and ability to see clear patterns in a variety of business models.

All of which, finally, leads me back to our beaver problem.

I’m sorry to say that Jedediah Smith died at the early age of 31, surprised by a Comanche hunting party. His life burned, in the parlance of Blade Runner, twice as bright but half as long.

I just looked up Adrian Slywotzky on Wikipedia and he appears to be very much alive and well, industrious as ever.

Given those two very different outcomes, and given that the rain is still falling hard, would you like to wager which future is most likely in store for our neighbors, the beaver family?

Wednesday, October 29, 2008

Don't Go Toward the Light!

How many hours of sleep do you lose a week because of electric lights and electronic gadgets? Maybe an hour every weeknight? Maybe three or four hours every weekend? Maybe more?

Fear not. The great restorative elixir of our age is coffee. It has become the worker's little helper, the drug that makes us clear-eyed in the mornings and props us up in the afternoons. It has evolved from drink to self-medication to lifestyle, philosophy, and economic juggernaut, all with the underlying mission of offsetting our loss to the bright lights and dancing screens of the night. Caffeine has turned Dunkin' and Starbucks into the great Pavlovian beacons of our age.

I was pondering this while reading the National Geographic article, “The End of Night,” which highlighted another aspect of having too much man-made light on our planet. Author Verlyn Klinkenborg's interesting claim is that we have “engineered night to receive us by filling it with light,” no different from damming a river.

"Now most of humanity lives under intersecting domes of reflected, refracted light. . .Nearly all of nighttime Europe is a nebula of light, as is most of the United States and all of Japan.” In the south Atlantic where squid fishermen use halide lamps to attract their prey, the light cast into space is brighter than Buenos Aires."

The consequence is light pollution. In many places on earth, we have lost the stars. Worse yet, “whenever human light spills into the natural world, some aspect of life—migration, reproduction, feeding—is affected.”

“Migrating birds collide with brightly lit tall buildings," Klinkenborg writes. "Insects cluster around streetlights, providing artificial hunting-grounds for bats. Birds sing at unnatural hours, breed earlier than they should, and put on fat too early for their migratory cycle. Hatchling sea turtles are confused by artificial lighting on the beach, with losses in the hundreds of thousands."

Then there’s the toll light takes on us. Klinkenborg adds, “for the past century or so, we’ve been performing an open-ended experiment on ourselves, extending the day, shortening the night, and short-circuiting the human body’s sensitive response to light. . .At least one new study has suggested a direct correlation between higher rates of breast cancer in women and the nighttime brightness of their neighborhoods.”

Imagine living in a country where 200 million adults are habitually tired. Imagine how grumpy people would be in traffic, how difficult they'd be to work with, and what bad listeners they’d all make. Imagine the foolish things that would go on in such a country where everyone gets robbed of an hour of sleep, compensates with caffeine all morning, and sleepwalks all afternoon.

In 1995, Wolfgang Schivelbusch wrote Disenchanted Night: The Industrialization of Light in the Nineteenth Century. In it, he discusses some of the social implications of light, and makes it clear that, while nobody likes to stub his toe in the dark, the adoption of round-the-clock artificial light has come at a huge cost to Europeans and Americans alike.

Here are a few of the things I learned:

1. For thousands of years, the flame remained essentially unchanged as a source of light for human activity. When people wanted more light, they added more flames. In 1688, for example, 24,000 lights—presumably wax candles--were used to illuminate Versailles.

2. Because artificial light was expensive, only royalty used it for extravagant displays like Versailles. “Artificial light was used for work, not for celebrations; it was employed in a rational, economical way. It emancipated the working day from its dependence on natural light, a process that had begun with the introduction of mechanical clocks in the sixteenth century.” Prior to that, Schivelbusch writes, “the medieval community prepared itself for dark like a ship’s crew preparing to face a gathering storm”—retreating indoors, closing the city gates, and bolting doors.

3. As long as the artificial light required was limited to individual craftsmen, candles and oil lamps were adequate. But, once industrial methods of production were adopted, artificial light was needed for larger spaces and longer periods of time. “In the factories, night was turned to day more consistently than anywhere else.”

4. Schivelbusch says that the wick was as revolutionary in the development of artificial lighting as the wheel was to transport. In fact, people grew so accustomed to wicks that, “in the dazzling brightness of the gaslight, the first thing people wanted to know was what had happened to the wick. ‘Do you mean to tell us it will be possible to have a light without a wick,’ an MP asked the gas engineer William Murdoch at a hearing in the House of Commons in 1810.”

5. Once a house was connected to a central gas supply, it lost its autonomy. “To contemporaries it seemed that industries were expanding, sending out tentacles, octopus-like, into every house. Being connected to them as consumers made people uneasy. They clearly felt a loss of personal freedom.” Many turned off their gas at night, like the medieval city closing its doors. By the mid-1820s most big cities in England had gas; by the late 1840s it had reached many small towns and villages. By 1829, gas was being used for street lighting.

6. There was, of course, a genuinely good reason to fear gas; early gasometers were expected to explode at any minute. And, often they did.

7. The most outstanding feature of gaslight was its brightness. Traditional flames paled in comparison. In fact, the gas flame was so bright people could not look at it directly. Hence, the need arose for shades and frosted glass as ways to dissolve and soften the concentrated light. Worse, though, was that gas used up so much air that it was impossible to stay in gas-lit rooms. People often felt it at the theater, where headaches were common; at home, gas caused headaches and sweating, and could ruin interior decorations. Household guides at the time recommended against gaslights in any of the common living areas.

8. By the mid-nineteenth century, 1,500 police patrolled Paris by day and 3,500 lanterns lit it by night. This lighting was so effective in reducing crime that lantern-smashing became a common crime. In Les Miserables, you might recall, one of the chapters ("A Boy at War With Street-Lamps") describes Gavroche out having his turn at the lanterns. In many cities, the magnificent signboards that decorated the front of shops were removed because they blocked too much light.

9. With signs coming down, shops transitioned to the lighted shop window. This paralleled the ability, about 1850, to make large sheets of glass. Together, these inventions allowed retail shops to extend their hours past sundown.

10. The electric light bulbs shown at the 1881 Paris Electricity Exposition were marketed as superior to gas in every way, shining evenly and steadily irrespective of the season. The bulb demonstrated was, by comparison, a little weaker than today’s 25 watt light bulbs. Unlike gaslight, all doors in the household were open to electric light.

11. Still, the electric light took some getting used to. As one observer noted, “There is something that is lost in electric light: objects (seemingly) appear much more clearly, but in reality it flattens them. Electric light imparts too much brightness and thus things lose body, outline, substance—in short, their essence. In candlelight objects cast much more significant shadows, shadows that have the power actually to create forms.”

And, because the electric light “lit” more of the space, and more brightly, it changed the nature of home decorating. “Muted colors are more compatible with the lively lighting in our homes.”

It’s been only about two centuries that humans have been able to control the dark. The unintended consequences of lighting the world are significant, both on ourselves and the creatures around us. New home decorations. Confused turtles. Tired people. Sick people.

Imagine a world without electricity and electronics. But before we do, let’s go get some coffee so we don't fall asleep while we're doing it.

Thursday, August 28, 2008

Historical Postcards & the Battle of New Orleans

For every nation there are a handful of events each century that are so stunning and transformative that they become what I call “historical postcards”—big, bright pictures burned into the memories of an entire living generation.

One such postcard for my father’s generation was Pearl Harbor.  He was just a young boy when the bombs fell, and I remember him telling me that he heard the news on the radio and hid under his bed for fear that his New England city would be next. He never forgot that terrible feeling of fear and loss, nor did many members of his generation.

There have been, by my count, five historical postcards in my generation. I do not include the assassination of Robert Kennedy, or the Space Shuttle Columbia disaster; these were big, important events, and I can still see the pictures in my mind. But an historical postcard, at least by my definition, changes the world as we know it.

Friday, August 1, 2008

Real Perspective on the "Greatest Generation"

I have an important question to ask you.

Who is the greatest baseball player of all time, Ken Griffey, Jr. or Alex Rodriguez?

If you are a precocious 12-year-old fan living in New York or Cincinnati, that may be a profound question worth debating.

If you are a baseball fan of just about any other stripe, that question is, to put it bluntly, boneheaded.

(If you hate baseball, a few more paragraphs and we’re onto the main topic.)

If you're in your teens or older, for example, you’ve seen Barry Bonds play. A little bit older and you’ve seen Hank Aaron. My age and you remember Mantle and Mays. Still older and it’s DiMaggio, Williams and Musial. And there are certainly plenty of folks still alive who saw Babe Ruth, Ty Cobb, and probably Christy Mathewson play.

The interesting thing about baseball, at least modern baseball, is that its entire history is still within the living memory of Americans. And when we have that kind of collective perspective on an issue, the ensuing debate can be very rich and very nuanced.

In fact, real perspective allows us to frame the question well in the first place.

On Sunday, July 27, the Boston Globe magazine ran a series of articles on the Baby Boomer generation. Boston-based freelance writer Tom Keane wrote a piece challenging Tom Brokaw’s praise of the WWII generation as the “Greatest Generation,” concluding instead that it was the Baby Boomer generation that really delivered on the promises (especially around human and civil rights) that the Greatest Generation fought to protect.

It was a thoughtful article and well written, but I couldn’t help but think it was a debate over a question framed without any real perspective. In fact, to the question “Which is the greatest generation ever in America, the Baby Boomers or the ‘Greatest Generation’?” I believe the correct answer is: “Ken Griffey, Jr.”

Here is the problem, or at least part of it: Americans are about as a-historical a culture as has ever existed. We look forward perhaps better than any culture ever has (a point to be debated, but only with perspective), but once something is in our rearview mirror, it’s really gone.

In fact, there are five generations alive at this moment in America—Gen X (trying to steal Scrabble on Facebook), Gen Y (complaining about working a 5-day week), the aforesaid Baby Boomers (mostly blogging, I think), the (more traditional) Silent Generation (being franchised by Tom Brokaw as the “Greatest Generation”) and the G.I. Generation (born 1901-1924).

(If you are interested in the full roster of generations, see Strauss and Howe’s book, Generations.) Technically, the sixth living generation, the Lost Generation (born 1883 to 1900) is still around, represented by a handful of hearty 108+ year olds.

So, when we think about the “greatest generation ever,” we a-historical Americans really have two living, mature generations to compare, because those are the only two we know anything about. As a culture, unfortunately, we are sometimes only as good as the last thing we lived.

[In fact, does it strike you as somewhat ironic, and therefore ephemeral, that the belief that the Baby Boomers are somehow the transcendent generation is being propagated by, well, Baby Boomers?]

The reason I mention all this is because I have just finished The Purpose of the Past, a collection of Gordon Wood’s reviews of history books for The New York Review of Books. I took a course from Professor Wood many decades ago and he is one of my heroes, having (with his mentor, Bernard Bailyn) rescued the American Revolution and the Constitution from Charles Beard and his forces of economic evil. I love Wood’s reviews, which are fair and measured but don’t pull any punches. As you read through the book, you come to realize that what Wood has written (over time, and unwittingly) is really a collection of interconnected essays which speak to our interpretation of American history in the twentieth century.

There were several reviews that struck me as particularly relevant, but none more than Wood’s review of Joyce Appleby’s Inheriting the Revolution: The First Generation of Americans, published in 2000. Perhaps it was coincidence that I read the Appleby review on the same day Tom Keane’s piece ran in the Boston Globe.

Wood writes:
We are often told that the baby boomers, that is, those born in the two decades or so following World War II, have brought the greatest transformation of political, social and cultural life in American history. Ever since this generation came of age in the 1960s and 1970s, it has involved America in a multitude of radical changes allegedly unmatched by the experience of any previous generation of Americans—changes in politics, civil rights, race relations, sexual habits, family life, women’s roles, cultural attitudes. . .But maybe this baby-boomer generation is not unique after all. If we read Joyce Appleby’s new book, we might conclude that at least one earlier generation, the first generation—those born in the two decades or so following the Declaration of Independence—participated in an equally radical, or perhaps even more radical, transformation of American society and culture.
Here, in a nutshell, are Keane’s and Appleby’s findings. You be the judge:
Keane on the Baby Boomers:

When the first boomers started coming of age, segregation was legal, mixed marriages were prohibited, and blacks and other minorities lived on the fringes of American society. Women weren’t much better off. . .And gays? As far as most of America was concerned, they didn’t even exist. . .Collectively, those groups were upward of 65% of today’s population. . .

Credit the baby boomers and their rejection of the world they were handed for changing that. Marches on Washington, women’s lib, Stonewall—it was under the boomers’ watch that women, minorities, and gays became part of the mainstream. . .The Greatest Generation may have saved the American Dream, but it was the boomers who helped make it come true for all.
Pretty good, eh? Pat, pat on your back. Pat, pat on my own back—though I have learned recently, much to my distress, that I may in fact be part of the lost “Generation Jones,” which, it turns out, has “an unrequited craving of unfulfilled expectations.”

And I thought that was the beginning of middle age heartburn.

But what of Appleby’s First Generation? How does it stack up against our Baby Boomers? Here, oh reader, some real perspective:

1. The First Generation was the most mobile in American history.

And I don’t mean moving from Novato to a pad overlooking the Bay in San Francisco. “Tens of thousands of ordinary folk pulled up stakes in the East and moved westward, occupying more territory in a single generation than had been occupied in the 150 years of colonial history. Between 1800 and 1820, the trans-Appalachian population grew from a third of a million to more than two million. ‘Never again,’ Appleby writes, ‘would so large a population of the nation live in new settlements.’”

In other words, this was America’s true frontier generation. For them, the frontier wasn’t some inspiring American myth—it was what they found every morning when they opened their front door.

2. The First Generation created America’s middle class.

Think about that. In the early nineteenth century, choices and occupations of all sorts multiplied in writing, publishing, journalism, school teaching, law, politics, medicine, civil engineering, painting, and preaching. To follow the careers of those in the first generation, writes Appleby, “is to watch the sprawling American middle class materialize, summoned into existence by political independence, thickening trade connections, and religious revivals, all tied together by print.”

3. The First Generation created a literate America.

If Baby Boomers have been forced to adjust to the Internet, Americans in 1800 were forced to read as a necessity of life and a principal activity of nation building.

Northern Americans became one of the most, if not the most, literate people in the world. Printers, publishers and booksellers all doubled in number in the first decade of the nineteenth century. By 1810 Americans were buying 24 million copies of newspapers annually, the largest aggregate circulation of any country in the world.

4. The First Generation re-created American religion, forging a landscape that we still occupy today.

Preachers sprang up everywhere; the Second Great Awakening sent over 3 million Americans to revivalist camp meetings in 1811 alone, and undermined the old established orders of Congregationalists and Anglicans by creating a new and uniquely voluntary religious world dominated by evangelical Methodists and Baptists. The “American evangelical Christian” was invented, as were brand-new sects—the Disciples of Christ and the Mormons—which had no European roots whatsoever.

5. And while they were at it, the First Generation created American capitalism and entrepreneurialism.

It was the willingness of “ordinary men and women. . .to move, to innovate, to accept paper money, and to switch from homemade goods once commercial ones were available” that accounts for the expansion of farming, commerce, credit and information.

Boot-strap manufacturing ventures proliferated in the rural North: “Bright young middling men from obscure backgrounds, like Peter Cooper and Amasa Goodyear, were able to take advantage of America’s unique conditions and opportunities to become successful businessmen. Novelty and inventiveness became their watchwords. Because American labor was expensive compared to European labor, these hustling entrepreneurs were eager to develop machines and tools to enhance productivity. They were risk takers as well, showing ‘a surprising willingness to venture outside the realm of their experience.’” America’s internal market became the largest in the world.

The Baby Boomers were good. But this First Generation created “a powerful myth about America that metamorphosed ordinary labor into extraordinary acts of nation building”—a myth so powerful that succeeding generations had trouble questioning it.

Including our own.

So, your call: Baby Boomers or First Generation?

And, after all this talk about the greatest generation, and real perspective, I have one final, very important question just to see if you have learned anything:
Who is the prettiest woman in the history of America, Scarlett Johansson or Jessica Alba?

If your answer is “Ken Griffey, Jr.,” you may proceed to the lightning round.

Sunday, July 27, 2008

Genealogy, the Idaho Russet and Innovation

In April I was elected Chairman of the New England Historic Genealogical Society, one of my favorite organizations on earth. Not only does the Society have a world class staff, but its Trustees and Councilors are a group of extraordinarily talented individuals who dedicate their time, talent and treasure to collecting, preserving and interpreting--so our mission goes--the stories of families in America.

Founded in 1845, the Society has in the last decade rapidly extended its locus from its beautiful library on Newbury Street in Boston (still active and vibrant) to a global online presence.

With that in mind, these are remarks I made shortly after becoming Chairman. The subject, improbably, is the Idaho Russet potato, one of which I pulled from my pocket during the speech. I suggest, for full effect, that you find one in your kitchen and place it on your monitor now.

Tuesday, July 8, 2008

A Plague of Dead Squirrels (a.k.a. The Unintended Consequences of Innovation)

Yeah you got yer dead cat and you got yer dead dog
On a moonlight night you got yer dead toad frog
Got yer dead rabbit and yer dead raccoon
The blood and the guts they're gonna make you swoon

--Loudon Wainwright III, Dead Skunk

[NB: No animals were injured in the writing of this article.]

I am sad to report that I am predicting a plague of dead squirrels on the roads of my suburban New England neighborhood. Not tomorrow, but--guessing now--beginning in about 2010 or 2011, and easily stretching for a decade.

I’m predicting the same for your neighborhood as well.

And it won’t just be squirrels—it’ll be chipmunks and skunks, rabbits and raccoons, and a few mystified deer. I’m afraid, even in urban areas, there will be a few more bicyclists thrown from the carbon frames of their Kona King Zings. All beginning about 2010.

It will be, for better or worse, another in a long, unbroken line of unintended consequences surrounding otherwise staggeringly beneficial innovation.

Let me explain.

I just reviewed a book written by Clay McShane and Joel Tarr called The Horse in the City: Living Machines in the Nineteenth Century. It’s a fascinating look at the horse as a “living technology”--and a very persistent living technology--whose numbers continued to grow rapidly despite the steam and mechanization of the Industrial Revolution.

The horse of the 19th century is apt to call up a tableau from some bucolic farm, or perhaps of cowboys out on the open range. While these scenes certainly existed--powered by our allegiance to the American frontier myth--the explosion in the use of the horse in nineteenth-century America occurred primarily in urban areas.

By 1900, a city like New York contained an average of one horse for every 26 people, with 130,000 horses in Manhattan alone pulling street cars and food wagons, carrying firefighters and their equipment, removing snow, carting off the dead, and even providing sources of stationary power.

One of the most interesting features of an innovation is what happens (all around it) during rapid adoption. In the case of horses, we know the obvious: more people and goods got around the city faster and with less human energy. But what about the other consequences, the unintended consequences of such rapid adoption?

In the case of the nineteenth-century horse, the authors point to the following items:

Waste. It’s fair to say that Brooklyn agriculture was built on Manhattan manure. (Farmers termed Manhattan a “manure factory.”) It was only when imported guano became a cheaper commodity that this inter-borough trade slowed.

(Those of you interested in learning how bird poop is harvested, or indeed, how it achieved a sustainable competitive advantage over horse poop will, I’m afraid, have to seek sources outside this blog.)

Abuse. Urban reform groups like the ASPCA took up the welfare of the horse, policing against abuse while actively euthanizing old or lame horses, worth more to the rendering plant than alive. It became clear that the horse was viewed by most city-dwellers in utilitarian terms--a unit of production--subject to replacement when the creature became less productive.

Infrastructure. Cities had to create extensive plant to support the horse, including municipal stables and carcass-removal programs. Parkways were created, in part, as venues for afternoon promenades. Meanwhile, the numbers of teamsters, hostlers and stable-keepers tripled from 1870 to 1890—a strange phenomenon in the face of the Industrial Revolution.

Medicine. The burgeoning urban horse population led to the rise of a skilled class of urban veterinarians.

Breeding. The horse became subject to breeding programs designed to increase its size and endurance.

Sprawl and suburbs. Street railroads pulled by horses not only encouraged the sprawl of residential neighborhoods but also enabled an expansion of amusement parks and resort destinations for the working class. Indeed, the size and stench of the attendant infrastructure virtually ensured that well-heeled urbanites would eventually find their way to suburbia, even if they had to create it in the process.

Farming. Hay production soared in the farmlands because of the growth of the urban horse. By 1909, more than half of New England’s farmland was involved in hay production. This led to improvements in hay-pressing technology and the ability to ship hay great distances.

Trade. A vast national and international trade in horses developed.

What McShane and Tarr make clear is that, while a horse is a horse (of course, of course), the innovative urban use of horses set off a vast array of largely unforeseen and completely unintended consequences.

All of which got me thinking about some of the unintended consequences of more modern innovations.

Take the iPod, for example, one of the great entertainment gadgets of our times. Doesn’t it seem likely that one of the unintended consequences of the iPod will be a generation of Americans who begin to experience serious hearing loss in their 40s? Will the iPod one day double, with the flip of a switch, as a hearing aid?

Of course, its predecessor, the television, helped to create the couch potato, the TV dinner, and a habitually sleep-deprived society. (And we still adore it, so I suspect the iPod will still be treasured, even as our national hearing deteriorates.)

Some of the great unintended consequences of our time come from our medical innovations. The wonder drug of the twentieth century, penicillin, has led to the evolution of the superbug. Even Viagra (what could be wrong with Viagra?), a sensation with older men since its launch ten years ago, has the dubious distinction in a recent poll (of women—they finally polled women!) of leaving one-third of women just plain annoyed at having to have sex at the drop of a pill, and one in ten believing that Viagra led to their husband’s infidelity.

Of course, if you need ill-effects from innovation, look no further than the Web, which robs us of our time and concentration, and truly appears to be making us all stoopid.

The TV, the iPod and the Web; the Tinker to Evers to Chance of unintended consequences: Of the great inventions of the last century, one makes us fat, lazy and tired, one destroys our hearing, and one lowers our IQ and our ability to concentrate.

And we’d love to take an aspirin to make it all better, but that has unintended consequences as well. I just can’t remember what they are.

As for social innovation, a stunning article in the July/August Atlantic by Hanna Rosin suggests that one of the great social programs of our generation--demolishing public-housing projects in large cities to free the poor from the destructive effects of concentrated poverty--has led to steadily falling crime rates in large cities for the last 15 years. That’s the great news. The unintended consequence? Almost like a successful franchising scheme, violent crime didn’t disappear; it just relocated to the mid-sized cities. FBI data now pegs the most dangerous spots in America as Florence, South Carolina; Charlotte-Mecklenburg, North Carolina; Kansas City, Missouri; Reading, Pennsylvania; Orlando, Florida; and Memphis, Tennessee.

Which, speaking of the spread of violence, brings me back to my original prediction of lots and lots of dead squirrels.

In the same Atlantic issue, Jonathan Rauch masterfully profiles General Motors’ attempts to build a true electric hybrid by 2010 in his article, “Electro-Shock Therapy.” This is not your neighbor’s Prius, which is a gasoline-powered car with an electrical assist. GM’s “Chevy Volt” will draw its power from any standard electrical socket and go 40 miles on a single charge. After 40 miles a small gasoline engine will ignite, driving a generator that will maintain the battery.

That means the wheels are always driven by the battery. That means the car will drive hundreds of miles on a tank of gas. That means the 75% of Americans who drive less than 40 miles a day will never buy any gas.
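For the skeptical, here is the back-of-envelope arithmetic. The battery figure comes from the article; the tank size and generator efficiency are my own illustrative assumptions, not GM specifications:

```python
# Rough range estimate for a series hybrid like the proposed Chevy Volt.
# Only the 40-mile electric range is from the article; the tank size and
# generator mileage below are illustrative assumptions.
battery_range_miles = 40   # electric-only range on a full overnight charge
tank_gallons = 12          # assumed fuel-tank capacity
mpg_on_generator = 50      # assumed efficiency once the gas generator kicks in

total_range = battery_range_miles + tank_gallons * mpg_on_generator
print(total_range)  # 640 -- "hundreds of miles on a tank of gas"

# And a sub-40-mile daily commuter never burns a drop:
daily_miles = 38
gallons_used = max(0, daily_miles - battery_range_miles) / mpg_on_generator
print(gallons_used)  # 0.0
```

Tweak the assumed numbers as you like; the conclusion survives any plausible choice.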

There are lots and lots of technology hurdles to clear, mostly around the battery, if GM is going to make its 2010 date. (You could have the car today if you didn’t mind pulling the battery around in an air-conditioned U-Haul, for example.) But fear not; even if the date slips a bit, there will be electric cars on the road in the not-too-distant future. Like 2011.

Very eco-friendly. Very cool. Very innovative. Pretty darn fast. Awfully darn heavy. And very, very quiet.

And that is terribly bad news for squirrels. Because one of the unintended consequences of this breakthrough innovation will be, I’m afraid, a national sneak-attack on creatures of every sort caught, however momentarily, dallying in the road.

I sure hope someone is thinking about this. Maybe Michelin is inventing tires that whistle at some special squirrel frequency. Maybe the next generation of road asphalt comes with sensors. Because in my town alone I can think of any number of blind corners that are made safe only by the rumble of an internal combustion engine.

When I lived in New York City I used to worry about the squirrels of Central Park, confined to a little island and genetically severed from their brethren. I worried that they would become a race of beer-swilling, sausage-scarfing, spandex-wearing rodents who would one day strap on rollerblades.

Now, I am more inclined to worry about squirrels everywhere. And deer. And you and me, out jogging or riding our bikes.

Q: Why did the squirrel cross the road?

A: Because it couldn’t hear the one-ton, battery-powered rolling mass of silent steel bearing down on it at 50 MPH from around a blind corner.