LOOKING BACK…AND AHEAD
The Forgotten Man
In college, I read a breezy little book called How to Lie with Statistics, which stood me in good stead as a journalist. Not much of a numbers person anyway, I gained a healthy dose of skepticism for any statistic that purported to prove one thing or another. I brought that skepticism to Amity Shlaes’ “new history of the Great Depression” and once again found a picture painted by the judicious choice of numbers.
Shlaes, a senior fellow in economic history at the Council on Foreign Relations and a syndicated columnist at Bloomberg, has a very definite point of view. With muscular prose and a conservative perspective, she sets out to show that Franklin Roosevelt overstepped presidential authority and that the New Deal demonized big business in a way that was un-American. To be fair, she is just as hard on his Republican predecessor, Herbert Hoover. They both fiddled with the economy like 10-year-olds with a broken watch. Between the two of them, she says, “government intervention helped to make the Depression Great.”
Few still think, however, that the New Deal ended the Great Depression. World War II did that. What the New Deal did do was keep thousands of people from starving. Much of the work it created was make-work, yes, but at least there was work. And that brings us to the numbers. Each chapter of the book begins with the date and two statistics: the average unemployment for the year and the Dow Jones Industrial Average. The first figure lumps those employed in New Deal social programs in with the truly unemployed, dramatically inflating a number that is plenty bad enough on its own. The second is simply the wrong number. Why not gross domestic product, a real indicator of the size of the economy and its growth or contraction? Instead, the book tracks a stock market in which only a tiny fraction of the population could afford to invest.
This is not a one-note right-wing diatribe, but even with a grasp of the complexity of the problem, Shlaes does not give enough credit to the difficulty of the solution. The New Deal was flawed, as was Roosevelt. Some of its actions were big and thankfully short-lived blunders: the National Recovery Administration, for instance. Others were longer-lived and certainly unpleasant: the hounding of Andrew Mellon, for instance. Still others, although flawed, are permanent fixtures in America’s social and governmental landscape: Social Security, the Federal Deposit Insurance Corporation, the Securities and Exchange Commission, to name just a few.
Shlaes’ biggest criticism is that the Great Depression went on far longer than it should have. The recession within the Depression, in 1937, coincided with the Dust Bowl and terrible flooding in the Mississippi and Ohio valleys. “With money and the weather breaking down, men and women in America felt extraordinarily helpless,” she writes. Sounds uncomfortably familiar. That was also the year Roosevelt began his second term; he went on to be re-elected twice more. The American voter was Roosevelt’s biggest ally but, ironically, mistrust of that collective was his biggest failing. He felt compelled to tinker with the market rather than let it try to right itself. He felt that he knew best and, to paraphrase the famous plaque on his successor’s desk, when the buck stops with you, that is what voters expect you to feel.
I asked my mother to read this book and provide insight into my review. She said, with her 83 years of wisdom, that, where the Great Depression is concerned—not to mention the ’37 flood—once was enough.
Blind Spots: Why Smart People Do Dumb Things
We all smack our foreheads from time to time and groan, “What was I thinking?” We all, smart or not, do dumb things. We wonder aloud how those in charge make such daft decisions, and why we continue to believe that, this time, the decisions will be perfect.
A new crop of books on the neuroscience of decision-making aims to help us figure out why that happens. I have reviewed a number of them in recent issues (Forward, March/April 2009 and July/August 2009). The best of them, Brain and Culture (Forward, Sept/Oct 2009), put all of the dry science into a cultural and social context and came closest to providing comfort for our human failings.
This book, unfortunately, does none of those things. Instead, it reduces everything to 10 simplistic blind spots. We jump to conclusions. We fail to see the big picture. Ah, now I understand.
All of these books end with the hope that awareness of why we do dumb things will lead to a better world where we’ll all be reasonable, all decisions will be logical and, presumably, we will always grasp the big picture. None of these books has convinced me that that is even remotely possible, least of all this one.
Say Everything: How Blogging Began, What It’s Becoming and Why It Matters
Well written and intelligent, Say Everything lifts the lid on a world that we all can read but know very little about. As Heather Armstrong, one of the highest-profile bloggers interviewed here, says, readers feel that they know everything about her, courtesy of her daily online musings (dooce.com), but “ninety-five percent of my life is not blogged about.”
Rosenberg, cofounder of Salon.com and author of Dreaming in Code, digs past the high-profile Internet types we’ve heard of—Marc Andreessen, et al.—to introduce us to the people who really created the blogosphere. Unless you have jumped from cubicle to cubicle in Silicon Valley or, like a confirmed news junkie, actually read bylines, you will never have heard of these people. Justin Hall began what Rosenberg calls “oversharing” in 1994. Looking like a certifiable nut ball, with hair piled eight inches atop his head to cascade about his face “like a freeze-frame image of a fountain,” Hall has none of the private-public filter that most of us discover early in life. Armstrong discovered what generations of journalists learned the hard way: people read this stuff. If you make mistakes, reveal things that were shared in confidence, or talk about your family in the firm belief that they will never know, you will be caught. To be “dooced,” in Web slang, is to be fired because of something you posted on your blog.
The book’s elegant structure comes from the thoughtful choice of whom to profile, what order to place them in and the conclusions that each chapter draws. This is not a simple chronology, but a tale of many people working alone, discovering each other online and, by the seat of their pants, inventing a new medium. They literally made it up as they went along because they understood, like Benjamin Franklin, that moveable type wasn’t the point; what you did with it was. It’s not about the technology; it’s about the content.
The first thing that strikes you about them is how incredibly smart they are—geeks, yes, who fell in love with Asteroids as teenage video gamers, but geeks who can quote Vladimir Nabokov and Thomas Pynchon. They are intensely curious, entrepreneurial, innovative and very, very young. The second thing is how many of them did toil in those soulless cubicles and hit upon blogging as a way to exercise some creativity—something they “hadn’t found anywhere else … a kind of self-determination.”
9/11 changed everything in the blogosphere. The story broke there first because James Marino had gone into his office in lower Manhattan early to update his side business, Broadwaystars.com. At 8:56 a.m. he posted, “Something very terrible just happened at the World Trade Center.” “For the first time,” Rosenberg writes, “the nation and the world could talk with itself, doing what humans do when the innocent suffer: cry, comfort, inform, and, most important, tell the story together.” It became clear in that one episode “that communication—each reader’s ability to be a writer as well—was not some bell or whistle. It was the whole point of the Web, the defining trait of the new medium.”
9/11 also happened just as blogging technology took off. It had become possible for non-geeks to create and maintain blogs without ever writing a line of code. Over the course of the decade, it also became possible to make a living at it, even as blogging continued to attract legions of writers who post not for a salary but for the love of it.
The best part of the book is what all this means—what it tells us about where we could be headed. Blogging is no longer a new toy bounced about by a group of 20-something coders high on caffeine and sleep deprivation. There are currently 25,000 blogs, a number that doubles every five months, according to Technorati. Skeptics charge that blogging (along with other Web 2.0 technologies) will kill journalism, especially investigative journalism; that we will be caught in an endless “echo chamber,” shielded from information we disagree with; and that we will lose the “single national narrative” a free press provides, one that gives us identity. It’s actually very funny that we dismiss bloggers as lightweights in one sentence and then imbue them with such power in the next.
The blogosphere is, in fact, our 21st-century version of the agora—the central square of ancient Athens where citizens came to find the news of the day and to debate it in real time. The most wonderful thing about good blogs is that the person behind the words comes through—that a human being, an individual, has an opinion, unfiltered by “the media,” and is willing to put it out there for all to see. Yes, there is a lot of useless drivel, but “what every decent blog offers is a point of view,” writes Rosenberg. Bloggers are “more passionate, more numerous and more inclusive” than their predecessors, he says, and this “outpouring of human expression” should delight us.