Thursday, December 17, 2009

Some Ideas Whose Time Has Come (Again)

The other day the WSJ ran a story on a low-tech craze sweeping Silicon Valley—entrepreneurial evenings devoted to the board game “Settlers of Catan.”  Of course, in SV it’s called “live networking” (where we used to just call it “game night”), but it’s all the same thing: people moving off their keyboards and socializing in person.  How quaint.

That’s not the only oldie but goldie making a comeback as we round the bend into a new decade.  Look at what’s happening on TV.  The biggest hit, attracting 22 million viewers every week, is “NCIS.”  As the WSJ reports, it “barely has a fan Web site. . .its viewers seldom time-shift,” and they are anything but the “young, urban demographic” that advertisers crave.  But “’NCIS’ is proof that even if the economics of the business are in upheaval, large swathes of the audience still want traditional storytelling, righteous heroes, and reality that’s not offensively gritty.”  Producers even say they avoid parochial or offensive humor.  How quaint.

(What's next?  Do you suppose people will begin playing solitaire again with actual cards?)

How about this: AOL is once again independent. The company that introduced many of us to the Web was also, ten years ago, supposed to herald the new era of synergistic “old and new media” when it combined with Time Warner.  The result has been an unmitigated disaster, with more than $100 billion in shareholder value lost.  Earlier this month the companies separated, each a smaller entity: AOL mostly behind the competition, and Time, Fortune and People trying to weather the advertising recession and the migration of eyeballs to the Web.  Old media is once again old media, looking for ways to survive and grow.  New media is once again new media, battling for technological advantage.  How quaint.

An article by Alan Tonelson in a recent issue of Harper's magazine suggests that “old things new” in national economics far transcends even the AOL/Time Warner debacle.  Tonelson says the story our business and political leaders have been telling us for decades, that the alarming decline in the U.S. manufacturing sector was simply ushering in a spectacular new era of information technologies, turned out to be just plain wrong.

It seems that manufacturing, something we used to be good at, might be something we really should be good at again, and fast.  “Today,” Tonelson writes, “the idea of maintaining a genuine American prosperity without a vibrant manufacturing sector stands exposed as a fairy tale.”

“American business leaders are cooling their long infatuation with ‘post-industrialism,’” Tonelson adds.  “Manufacturing is suddenly all the rage.  After forty years of outsourcing and globalization, business leaders are beginning to understand that real, self-sustaining American recovery and prosperity require a manufacturing base that is not only highly productive and innovative but is a much larger share of gross domestic product.”

Americans building real stuff.  How quaint.

I could go on.  Remember the age of conglomerates in the 1960s, which came to a screeching halt when we were warned by In Search of Excellence to “stick to the knitting” and by folks like Chris Zook to “profit from the core”?  Well, when the world’s largest, revered on-line bookseller expands into blenders and socks, it seems like a logical expansion from the core.  But cloud computing?  E-book hardware?   Or how about when the world’s largest search engine, the-greatest-company-built-on-algorithms-ever-devised-by-man, decides to launch its own phone?  Algorithm. . .search. . .hardware.  Of course--a natural progression from the core.  Sounds like the 1960s to me.  Very quaint.

And how about the coming disaster in cloud computing, or so tech prognosticator Mark Anderson predicts?  With everyone, big and small, storing sensitive corporate, consumer and personal data “somewhere else, in somebody else’s server” (my simple definition of “the cloud”), the time is ripe for massive espionage, fraud and theft.   That’ll encourage lots of folks, big and small, to reconsider where they store their sensitive data.  Some may even keep it on a server in their own company or home.  How quaint.

Finally, and unlike last holiday season (as I reported at the time), Christmas tree sales are exploding.  This is a good sign for all retailers, and a good sign for the economy.  More spending.  More stuff under the tree.  The Ghost of Christmas Past.

Maybe someone will even buy me a board game, like “Settlers of Catan.”  One that I could play with real people, in person.  Live networking in my own home.

Old things new.  How quaint.

Monday, December 14, 2009

Thinking About Thinking

One of the sub-industries that has developed in parallel with the growth of digitization and the Internet is made up of smart folks who are valiantly trying to divine what this assault of information is doing to our noggins.

If you are in your 70s, of course, you’ve been absorbing and adjusting to things like television and computers (plus civil rights, globalization and the sexual revolution) for decades--no small feat.  If you are in your 170s, there’s also been the telegraph and telephone, as well as the onslaught of print media (plus flight, the automobile, the corporation, a bunch of world wars and revolutions, and 25 different kinds of Coke, too). 

In fact, our brains have been under full-out, ever-shifting assault since at least the start of the Industrial Revolution.
 
A recent study makes the point:
Households in the United States consumed a mind-boggling total of 3.6 zettabytes of information and 10,845 trillion words in 2008.  That's a daily average of 33.8 gigabytes of information and 100,564 words per person.  Put another way, it's the equivalent of covering the continental United States and Alaska in a 7-foot-high stack of Dan Brown novels.
"We're all on information overload for good reason," said Roger Bohn, the study's lead author and a professor of management at UC San Diego.  "The amount we can assimilate is only a little bit more than what our ancestors could assimilate, but the amount that's available to us now is many orders of magnitude more," said Bohn, director of the school's Global Information Industry Center.
The UC San Diego researchers said the bulk of the bytes consumed came from three sources: 54.6 percent from computer games, 34.7 percent from television and 9.8 percent from movies.  The average American receives information for 11.8 hours of each day, or about 75 percent of the average time a person is awake, the report said. That compares to an average of 4.3 hours per day, based on a 1960 study.
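
(For the arithmetically inclined, the study’s totals and its per-person averages do hang together, at least if you assume a 2008 U.S. population of roughly 300 million.  Here is a minimal Python sanity check; the decimal units and the 365-day year are my assumptions, not the study’s.)

# Back-of-the-envelope check of the UCSD figures quoted above.
# Assumptions (mine, not the study's): decimal units
# (1 zettabyte = 1e21 bytes, 1 gigabyte = 1e9 bytes) and a 365-day year.

total_bytes_per_year = 3.6e21      # 3.6 zettabytes
total_words_per_year = 10845e12    # 10,845 trillion words

bytes_per_person_day = 33.8e9      # 33.8 gigabytes per person per day
words_per_person_day = 100564      # words per person per day

implied_pop_from_bytes = total_bytes_per_year / (365 * bytes_per_person_day)
implied_pop_from_words = total_words_per_year / (365 * words_per_person_day)

print(f"Implied population (from bytes): {implied_pop_from_bytes / 1e6:.0f} million")
print(f"Implied population (from words): {implied_pop_from_words / 1e6:.0f} million")

# Both land near 290-300 million, close to the 2008 U.S. population,
# so the totals and the per-person figures are mutually consistent.
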
Let’s emphasize Mr. Bohn’s observation: “The amount we can assimilate is only a little bit more than what our ancestors could assimilate, but the amount that’s available to us now is many orders of magnitude more.”  Therein lies the rub.  If we want to be productive, responsive and just plain healthy, it sounds as if our brains need to find new ways to cope.

All of which brings us back to the smart folks who are thinking about thinking. 

In A Whole New Mind, Daniel Pink takes perhaps the most extreme position, saying that the “keys to the kingdom are changing hands”—that the kinds of people who dominated life in the last few decades—“computer programmers who crank code. . .MBAs who could crunch numbers”—are giving way to people with very different minds:  “creators and empathizers, pattern recognizers, and meaning makers.”

We’re moving, Mr. Pink tells us, from “an economy and a society built on the logical, linear, computerlike capabilities of the Information Age to an economy and society built on the inventive, empathic, big-picture capabilities of what’s rising in its place, the Conceptual Age.”

Mr. Pink then goes on to posit that, as the left hemisphere of the brain analyzes details, the right hemisphere synthesizes the big picture.  So, you left-brained folks who scored high on the SAT, became CPAs and thought you had the world on a string—beware.  The right-brained folks are going to take over the world.

Between you and me, I don’t cotton much to Mr. Pink’s theory; if nothing else, it’s hard to find a time, at least since the Industrial Revolution, when the right brain wasn’t at least pari passu with the left brain and an essential ingredient of success.  To make the case that we’re moving into some new Conceptual Age, when we’ve been living with the need for high concept for several centuries, rings hollow to me.  Still, A Whole New Mind is a fun book to read and not without its charms (or its disciples).

More nuanced and convincing is Roger Martin’s thesis in The Opposable Mind, which holds that true leadership and genius derive from being able to hold two opposing ideas in mind without succumbing to an either-or decision.

Integrative Thinking is what Dean Martin (of the Rotman School at the University of Toronto) calls this, and it’s “the ability to constructively face the tension of opposing models and, instead of choosing one at the expense of the other, generate a creative resolution of the tension in the form of a new model.”

“The new model contains elements of the individual models but is superior to each.  This means that Integrative Thinkers are model creators, not model takers. Because of this, they are disproportionately able to come up with breakthrough ways of doing things.  They emerge as the admired and revered innovators.”

I like this concept.  It’s a better way of saying what I tried to suggest back in August when I wrote about the Great Imponderables.  There are simply issues that have no clear solution (or two diametrically opposed, terrible solutions) which we are all expected to solve anyway, issues which great leaders find a way to answer while the rest of us remain stumped.

There’s clearly a severe disadvantage in Dean Martin’s world to limiting use of the brain to one hemisphere or the other; after all, he doesn’t call it Integrative Thinking for nothing.

A bonus of Martin’s book is its many good examples, which focus on some of the great innovators in Canada, a pleasant change from the usual Silicon Valley crowd.

Another “thinking thesis” appeared recently in the Harvard Business Review.  Called The Innovator’s DNA, the article summarizes a six-year study to uncover the origins of creative business strategies in particularly innovative companies.

Leaving aside the fact that the article doesn’t tell us how it defines or measures innovation, or how it rated and selected its examples (which truly are the usual Silicon Valley crowd), Innovator’s DNA still offers a compelling theory: the most innovative CEOs spend more time than their peers on discovery activities.  These activities include associating, questioning, observing, experimenting and networking.

Of course, this is another blow to Daniel Pink’s “right-brain-in-ascendance” argument.   Discovery activities absolutely require that innovators engage both sides of their brain.  As Innovator's DNA, Roger Martin, and personal experience all suggest, if you plan to go to war each day, you simply cannot leave half your brain at home.

I would be remiss, too, if I didn’t throw into the mix a book I promised you (last October) that I would never, under any conditions, ever read: The Third Man Factor by John Geiger.  It tells the stories of folks like Ernest Shackleton, Amelia Earhart and Charles Lindbergh who, when subjected to intense stress or monotony, sensed the “presence” of a “third person,” an unseen companion who inevitably guided them to safety.

There’s even a story of a man who escaped the World Trade Center after a similar experience, led out by a guardian angel, as it were.

I had goose bumps reading the first part of the book, which I did with all the lights on and my wife close by.  But then—and guardian angel fans should cover their ears here—Geiger begins to explore what might really be happening.  The “third man” phenomenon, which can be recreated in the lab, may well be about “right-brain intrusions into the left hemisphere.”  Cool.  And, while this theory is disputed, an extension of it says that when the “Third Man” appears and “seems to be actively assisting someone in need, it is, in fact, a case of someone looking after their own immediate needs.”

Double cool.  The left brain told Shackleton to put one foot in front of the other in the deep Antarctic snow, and the right brain gave him comfort that all would be well through the creation of a warm, supportive presence.  It’s further proof that when you strap on your batteries in the morning to do battle, you had better be sure to have both halves of your brain fully charged.

Finally, and perhaps the best proof that integrative, full-brain thinking has been an imperative for centuries, I stumbled onto a brief but powerful book by Brooke Hindle, a pioneering student of the history of technology (and one badly in need of a Wikipedia entry).  Written in 1981, Emulation and Invention is a discussion of technology in the early American Republic and, in particular, the impact that James Watt's steam engine had after its first public showing in 1776.

As we know, in the late eighteenth century, Americans were a thinly dispersed people almost entirely engaged in agriculture—poor, dumb farmers scratching the soil with their hoes, just trying to eke out a living.  Yet, in short order, America would come to embrace and then lead the Industrial Revolution.

The reason, Hindle tells us, is that the stereotypical American farmer of the early nineteenth century was mostly myth.  One observer reported that “there is not a working boy of average ability in the New England States. . .who has not an idea of some mechanical invention or improvement. . .by which, in good time, he hopes to better his position, or rise to fortune and social distinction.”

Americans, Hindle says (thanks to a near-universal elementary school education), “were generally literate and at home in arithmetic, some geometry, and trigonometry.”  Europeans also noted Americans’ remarkable mobility.

Left-brained or right-brained?  Exactly: both.

In fact, the American farmer lived with machines of all sorts, “and a small group of mechanics and artisans worked daily with gears and gear trains, cams, ratchets, escapements, bearings, cylinders, pistons, valves and cocks—the basic elements of which the new machinery was constructed.  Moreover, the machinery the farmer knew—the seed drill, the turpentine and whiskey stills, the gristmills and sawmills, and the clock—were eminently comprehensible to all who worked with them.”

Another observer suggested an American might make “a horseshoe nail more slowly than his European grandfather. . .but he is thinking out a machine which will make it for him twice as well and a hundred times faster.”

In 1833, Michael Chevalier wrote that the American conformed “easily to new situations and circumstances; he is always ready to adopt new processes and implements, or to change his occupation.  He is a mechanic by nature.”  Americans moved frequently from job to job, even within a single establishment.  They never became expert at anything, Chevalier said, but had the breadth of understanding to apply solutions across related industries.

This combination of understanding how the gears work and deciding how to create something new with the gears is called, I think, Integrative Thinking.  It’s called the Innovator’s DNA.  It's called prospering in a world that seems to change every day and requires solutions that didn't exist yesterday.

More than that, it’s history’s way of reminding us once again--despite our conceit--that we’re just not that different.  Modern times are not as fast-paced or unique, relative to the past, as we tell ourselves they are.  In fact, one could argue that Daniel Pink is essentially right but simply 225 years late: Our ancestors taught us at least as much about the value of using both sides of our brains to cope with a rapidly changing world as have any of the latest best-sellers on Amazon.