Saturday, November 27, 2010

Thanksgiving downtime

Nope, still haven't called it quits on the blog yet. I'm just putting off writing my wrap-up/review/reflection on the experience of trying to read fifty books this year. In the absence of better excuses, I'll blame excessive turkey consumption and the general lethargy that always accompanies trips home for the holidays.

It's not over yet. Like the evil Derek Jeter's contract negotiations with the evil Yankees, this blog just goes on and on and on.

Friday, November 19, 2010

Second-half review

So, j'ai fini. The fifty-book challenge can finally, and mercifully, be laid to rest -- not that I didn't enjoy it, because I most certainly did. (And I'll get to that in a later post: the ups, the downs, the profound life lessons learned. Things like that. Hint: purchases of $25 or more on Amazon.com get free shipping. This was crucial in making the fifty-book challenge less challenging financially.) It's strange: these days I'm reading Freedom by Jonathan Franzen, a lengthy novel I'd been putting off forever, and there's absolutely no deadline for its completion. I'd almost forgotten what it was like to read without even a slightly gnawing sensation of panic.

Anyway, in keeping with tradition (by which I mean my solitary midpoint recap), allow me to dole out awards for the best and worst fiction and non-fiction among my last twenty-five books. But first, a few statistics. On the year, I read thirty-five works by men and fifteen by women. (Believe it or not, this ratio actually improved in the second half of my challenge, with sixteen books by men and nine by women. I am ashamed. In my defense, most of my selections were culled directly from major publications' book review sections, which are overwhelmingly biased towards male authors.) And although I considerably improved my fiction exposure (fifteen of my last twenty-five books, or sixteen if one counts the hopelessly naive polemic by Roger D. Hodge, The Mendacity of Hope), I ended the challenge with an even split between fiction and non-, with twenty-five books apiece. I would go into further detail -- divisions by nationality, book length in pages, median year published, etc. -- but that would only serve to depress me and, even worse, would require actual research, which (as anyone who's kept up with this blog should know by now) is the bane of my passively critiquing online existence.

Onward, then.

Best Non-Fiction Book: The Mystery Guest, by Grégoire Bouillier

This feels a little like cheating. As a memoir, The Mystery Guest hovers somewhere between the realms of fiction (from which all memoirs take their cues) and fact (to which all memoirs purportedly aspire). But while the genre is ambiguous, the quality of the story, and the depth of feeling it achieves, is anything but. Grégoire Bouillier manages to capture, in the space of a tidy little book with a very skinny spine, the inner psychotic that rears its ugly head in all of us, given the right (wrong?) circumstances. In the case of Bouillier, this circumstance is his invitation to a birthday party of a woman he does not know, as the "mystery guest" of a former lover who had left him without explanation five years before. Perfectly depicting the protagonist's -- his own -- frayed nerves amid the taut ambiance that builds throughout the party itself, Bouillier courageously unravels the mysteries of his mind, laying bare his insecurities and thus affording grateful readers an eerily familiar reminder of the sheer insanity of romance.

Honorable mention: Unfortunately, none.


Best Fiction Book: The Thieves of Manhattan, by Adam Langer

Perhaps it's the gleeful manner with which Adam Langer mocks every aspect of the publishing industry. Or perhaps it's simply that, in getting such literary bunk published, Langer saw his distaste for editors' discernment vindicated by his novel's very existence. But whatever the reasons, The Thieves of Manhattan is at once a laugh machine and a sober inspection of the challenges facing modern writers in a shifting publishing landscape. Employing a niche jargon so drenched in industry particulars that he includes a glossary at the end, Langer hilariously documents the commercialization of literature, a transformation that has placed the works of ex-cons and Pulitzer Prize winners on the same bookshelf at the local Barnes & Noble. Clearly, Langer is a man more amused than outraged at the rapidly disappearing distinction between novels and non-fiction, and he references numerous hoaxes, forgeries, and plagiarisms within his own novel. It may be that Langer, exhausted by high-minded denunciations of authorial appropriation, decided that the best rebuttal was to mirthfully engage in the practice himself. For this, The Thieves of Manhattan won't snag him a Pulitzer Prize, but it will provide his readers with a basic, and far more useful, reward: a most enjoyably clever story.

Honorable mention: The Lotus Eaters, by Tatjana Soli; and All Is Forgotten, Nothing Is Lost, by Lan Samantha Chang


Worst Non-Fiction Book: The Disenchantment of Secular Discourse, by Steven D. Smith

This may come as a bit of a surprise, since I was not unkind to Steven D. Smith in my review of his book. But my own brand of disenchantment stems not from a lack of substance but from a lack of style: The Disenchantment of Secular Discourse is, quite bluntly, not that interesting. Smith's particular axe to grind revolves around a practice he calls "smuggling": the influence of moral judgments on public dialogue despite their conspicuous absence as explicitly delineated premises. In the author's view, this results in a disingenuous conversation: the participants cannot help but unconsciously draw on their individual belief systems but are prevented, through a collective desire for credibility among peers, from admitting these principles' central role. The concept of "smuggling" is an intriguing one, but The Disenchantment of Secular Discourse (as suggested by the title itself) is, to put it lightly, an extremely dry analysis of its effects. Really, though: thumbs up for the idea.

Dishonorable mention: Once again, I didn't read any particularly horrible non-fiction books in the second half. It was, overall, a steadily decent non-fiction batch (without many outliers) this time around.


Worst Fiction Book: One Day, by David Nicholls

David Nicholls likely deserves better from me. It's not exactly fair for a beach read to be judged as a Serious Book. Then again, One Day was once reviewed in The New York Times. As Spider-Man's uncle once explained, not unkindly, "With great power comes great responsibility." Mr. Nicholls, I do hope your film adaptation of the book does well at the box office, since (as I mentioned in my earlier review) that was clearly the objective you had in mind the entire time. There is nothing wrong with this, except for the fact that books written as screenplays tend to exhibit, well, diminished literary value. And I don't think I'm being cruel here. The driving concept of the book -- a peek, on the same calendar day of each successive year, at a pair of mutually obsessed protagonists -- is better suited for straight-to-TV fare than for serious dissection. But read it I did, and skewer it I must. One Day is probably not so bad when the only alternatives are celebrity gossip mags and racy tabloids. People magazine Nicholls is not, but neither is he Ian McEwan or Margaret Atwood. John Grisham, then?

Dishonorable mention: Tinkers, by Paul Harding; and If You Follow Me, by Malena Watrous


I'm still not quite finished with this blog. There's definitely one more post coming, at the very least. Keep checking back!

Wednesday, November 17, 2010

#50: Animal Spirits

How did John Maynard Keynes know I'm not rational? Or at least, not always rational. According to authors George A. Akerlof and Robert J. Shiller, this is one key precept that vanished somewhere along the line from its initial expression by Keynes to the onset of the Great Recession seventy years later. The duo's book, Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism, is a concise attempt at its revival.

In mainstream economic theory, it is now nearly a foregone conclusion that humans act rationally when making economic decisions. In the aggregate, then, the macro-economy will reflect millions of minor judgment calls that, taken together, constitute the long-sought-after equilibrium. The problem with this theory (even if it never seemed to bother its most ardent champion, Milton Friedman) lies in its idealism. Are human beings rational? To an extent, yes. At other times, "people really are human, that is, possessed of all-too-human animal spirits," the authors write.

What are these animal spirits, and what do they do? The definition given here is "the thought patterns that animate people's ideas and feelings." This sounds suitably vague, which is precisely the point. In the rush to transform economics into a science, overweening economists threw the baby out with the bathwater, discarding the very real enigma of human behavior along with the failed economic theories of prior eras. Akerlof, the 2001 Nobel Prize-winner in economics, and Shiller want nothing more than to reintroduce these animal spirits to the field of economics and the public at large.

But first, a re-branding: what Keynes called "animal spirits" is now studied as "behavioral economics." The authors propose five psychological aspects of this discipline: confidence, fairness, corruption and bad faith, money illusion, and stories. Each of these plays a unique role within the macro-economy, but not always intuitively. Money illusion, for example, is the tendency to think in nominal rather than real terms; consider what happens when wage cuts are instituted following a deflationary trend. Even when the decrease in pay is commensurate with the drop in prices, employees usually feel cheated. A perfectly rational decision by an employer thus becomes an object lesson in the existence of money illusion (and influences the employees' perception of relative fairness as well).
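
For the numerically inclined, here's a minimal sketch of that scenario, with entirely hypothetical figures of my own invention rather than anything from the book: a nominal pay cut that exactly matches deflation leaves real purchasing power untouched, and yet it rarely feels that way.

```python
# Hypothetical numbers (not from the book), just to make the money-illusion
# arithmetic concrete: a 2% nominal wage cut paired with 2% deflation.
nominal_wage = 50_000      # annual salary before the cut
price_level = 1.00         # consumer price index before deflation

wage_cut = 0.02            # employer trims nominal pay by 2%
deflation = 0.02           # prices also fall by 2%

new_nominal_wage = nominal_wage * (1 - wage_cut)
new_price_level = price_level * (1 - deflation)

# Real (inflation-adjusted) wage before and after:
real_before = nominal_wage / price_level
real_after = new_nominal_wage / new_price_level

print(f"Real wage before: {real_before:,.2f}")   # 50,000.00
print(f"Real wage after:  {real_after:,.2f}")    # 50,000.00 (unchanged)
# Purchasing power is identical, yet most employees, fixating on the smaller
# nominal figure, feel cheated. That gap in perception is money illusion.
```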

This flies in the face of classical economics, in which humans are presumed to be supremely rational. (That such theories persist alongside the ongoing public fascination with the likes of Paris Hilton or, say, the British royal family is its own nifty testament to the inscrutability of the human mind.) So Akerlof and Shiller dutifully document the effects of each of their five factors before launching into eight key questions whose answers only make sense in light of the findings of behavioral economics.

This is an enlightening book, and one made all the more pleasant for its conspicuous lack of angry demagoguery. On a spectrum of bitterness running from Joseph Stiglitz to Paul Krugman, the authors of Animal Spirits are clearly more aligned with the former. This is an unexpected reprieve, which understandably lends additional gravitas to their cause. Their case can be summarized thus: don't buy too literally into the cult of the "invisible hand." Markets do fail, which is precisely why government regulation (and occasional intervention) is necessary. Of course, with the benefit of hindsight since Animal Spirits was published, it appears their advice -- like that of Stiglitz, Krugman, et al. -- has gone largely unheeded. What comes next is anyone's guess.

Sunday, November 14, 2010

Rescuing the Facebook generation

For the November 25th issue of The New York Review of Books, author Zadie Smith contributed an essay titled “Generation Why?” Ostensibly, the column was a review of Aaron Sorkin’s much-ballyhooed film, The Social Network, but Smith clearly had bigger fish to fry than nerdy billionaires (especially since Sorkin and director David Fincher had already undertaken this task so elegantly themselves).

No, the issue at stake was not Facebook but the “generation” for which it was created and for whom, perhaps, its existence circumscribes theirs. Smith, in attempting to extricate Facebook from its inevitable foundation myths, nevertheless concludes that she will someday “misremember my closeness to Zuckerberg [she, too, was on Harvard’s campus around the time of Facebook’s birth in 2004], in the same spirit that everyone in ’60s Liverpool met John Lennon.” And yet an acute sense of separation haunts her, as much for its seeming incongruity (Smith is only nine years Mark Zuckerberg’s senior) as for its depth.

“You want to be optimistic about your own generation,” Smith muses, with a touch of nostalgia. “You want to keep pace with them and not to fear what you don’t understand.” She would be wise to heed her own advice. For what she contends in “Generation Why?” – that for the unwashed masses who fancy Facebook, Twitter, et al among life’s requisites, their online reincarnations have themselves become unhinged from, or even superseded, reality – is as emblematic of the anachronisms of the old-guard cohort (whom she affectionately dubs “1.0 people”) as it is a functional indictment of their successors.

The New Yorker’s Susan Orlean stumbles into the same trap, albeit somewhat more amiably. On her blog, “Free Range,” she posits a new hierarchy of friendship: the Social Index, a ranking of relationships by the relative frequencies of online vs. offline contact. “Human relationships used to be easy,” she explains. But “now, thanks to social media, it’s all gone sideways.” Orlean then proceeds to delineate these subtle distinctions: between “the friend you know well” and “the friend you sort of know” and “the friend, or friend-like entity, whom you met initially via Facebook or Twitter or Goodreads or, heaven help us, MySpace,” and so on. Wisely, she keeps the column short and employs a jocular tone, one whose comic value is reaffirmed by her promotion of the Social Index on – where else? – Twitter, using the hashtag #socialindex.

But one can detect a beguiling undercurrent of cynicism beneath Orlean’s evident joviality. What Zadie Smith and Susan Orlean share – in addition to their niche of the “celebrity lifestyle” whose attainment, Smith assures us, is the raison d’être of the Facebook generation – is the creeping suspicion, despite reaching a career zenith, of their continuing exclusion from the proverbial “Porcellian Club” of Zuckerberg’s collegiate fantasies. This, then, is a fate to which both they and those they pity are likewise consigned. The irony, of course, is their refusal, or inability, to identify these “People 2.0” as their kindred spirits.

Smith opts instead for the appeal to authority. In this case, that role falls to Jaron Lanier, a “master programmer and virtual reality pioneer.” (Smith, who is 35, quickly reminds us that Lanier, 50, is “not of my generation,” an assertion whose brashness once more belies her commonalities with that perpetually group-conscious underclass of Facebookers.) Quoting extensively from Lanier’s book, You Are Not a Gadget, Smith appropriates the tech-philosopher’s arm’s-length aspect toward technology as her own, spraying the reader with snippets of his wisdom. (In the book’s preface, Lanier cautioned against this Girl Talk-esque brand of mishmash, lamenting that his words would be “scanned, rehashed, and misrepresented by crowds of quick and sloppy readers into wikis and automatically aggregated wireless text message streams.”)

But Smith and Lanier have separately, and preemptively, doomed themselves to contemporary irrelevance by adhering to a retrograde narrative of the modern condition. Together, their worst nightmare is the narrowing of human existence into unintentionally confined spaces. This process takes place via “lock-in,” a series of inadvertently interacting steps which, taken together, preclude the possibility of reversal or alteration. Such was the case, Lanier argues (and Smith dutifully recounts), in the invention of the MIDI file type, a once cutting-edge format for storing and playing digital music, whose binary limitations forced the beautiful infinity of analog melodies into a prepackaged sepulcher of bits and bytes. Once the standard had been formalized, the jig was up: there was no turning back. Music had forever changed, and not necessarily for the better. Lanier unwittingly reformulates – on behalf of the self-described “software idiot” Zadie Smith – these same fears in regard to social media.

These visions of doom are misplaced. One can feel almost viscerally the bored sighs emanating from countless millennials’ diaphragms as Zadie Smith ages before their very eyes: “When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears.” Such generous hyperbolizing obscures whatever consideration Smith’s fretting may warrant on the margins. If rescuing this Lost Generation is her utmost objective, then her plea for sanity, easily mistaken for groveling, will scatter Zuckerberg’s millions of disciples like so many cards in a two-bit parlor trick.

Notably, Zadie Smith gently ridicules the Facebook era’s emphasis on connectivity, remarking snidely that Zuckerberg “used the word ‘connect’ as believers use the word ‘Jesus,’ as if it were sacred in and of itself.” The quality of those interactions, she worries, is not worth the minimal effort exerted to vivify them. And yet she comes agonizingly close, on multiple occasions, to grasping the essence of this generation that remains simultaneously adjacent to, but seemingly unreachable from, her own. “Watching this movie, even though you know Sorkin wants your disapproval, you can’t help feel a little swell of pride in this 2.0 generation,” Smith concedes. “They’ve spent a decade being berated for not making the right sorts of paintings or novels or music or politics. Turns out the brightest 2.0 kids have been doing something else extraordinary. They’ve been making a world.”

Sound familiar? It should. The specter of John Lennon, the one “that everyone in ’60s Liverpool met,” haunts every word of “Generation Why?”  Even Zadie Smith, for whom Lennon (unlike Lanier) is clearly not a peer, cannot ignore the contemporary relevance of the former’s transformative impact on society. Culture may move more rapidly in the digital era than it did in the 1960s, but its disruptive rhythm has survived largely intact. Rebellion, experimentation, innovation: these are all hallmarks of the creative subculture, as each subsequent breakthrough quickly buries its predecessors. Mark Zuckerberg, then, is the spiritual descendant of John Lennon’s “Imagine.” We are, indeed, all connected (much to Smith’s everlasting surprise).

This is the epiphanic truth that the Facebook generation has uncovered, even if in so doing they remain blissfully unaware of the historical import of their actions. To be sure, their self-absorbed ignorance of a chronology of innovation is itself a product of the ever-shifting nature of modern culture. A generation once encompassed two or three decades; now, an absence of even five years from civilization would reduce the most precocious techie to the countenance of a Luddite. But, somewhat paradoxically (considering her alarm at Facebook’s social impact), Smith digests technology’s ephemeral nature with ease, as she states at the end of her essay: “I can’t imagine life without files but I can just about imagine a time when Facebook will seem as comically obsolete as LiveJournal.”

If this is the case, then what, precisely, is the cause for concern? Conceivably, Zadie Smith, who teaches literature, senses an intellectual fence over which the social media-savvy yet literarily deficient minds of her young charges are unable to vault. Perhaps, for a ponderous writer such as Susan Orlean, who once penned a 282-page paean to orchids, it is a fear of losing her audience to ever-decreasing attention spans. For Jaron Lanier, it may be the horror at a remix culture in which the devolution of works of art into haphazardly scissored segments (à la David Shields’ Reality Hunger) threatens the very nature of public expression. Perhaps Zadie Smith and Susan Orlean and Jaron Lanier and so many others of their age and temperament, finding themselves unable to “keep pace with [the younger generation],” succumb to the all-too-human instinct to “fear what [they] don’t understand.” In short, they face the same challenge that confronted the parents and teachers and writers of the ‘60s generation, fifty years later. They, like Mark Zuckerberg and the hordes of Facebook users who followed him in the quest for digital immortality, face the fear of oblivion.

Friday, November 12, 2010

#49: The Last Utopia

In just a few short weeks, the world will celebrate the sixty-second anniversary of the Universal Declaration of Human Rights. Adopted by the United Nations on December 10th, 1948, the document ushered in an unprecedented era of international rights norms that has since culminated in the prominence of human rights organizations such as Amnesty International and Human Rights Watch.

What Samuel Moyn argues in his book, The Last Utopia: Human Rights in History, is that the thematic line running from the UDHR's adoption in 1948 through today is misrepresented in the nascent field of human rights studies. Although cemented now as the defining moment that gave human rights its beginning, the Universal Declaration's appearance was, Moyn insists, "less the annunciation of a new age than a funereal wreath laid on the grave of wartime hopes."

This is a decidedly irreverent perspective on a movement whose brief and explosive history has (especially in recent years) been lionized as proof of civilization's continuing evolution. But Moyn is certain that these celebrants of human rights' march to glory have it all wrong. In fact, he argues, the UDHR was, if anything, more detrimental than it was helpful in facilitating the cause of human rights as it is known today. The UDHR's adoption "had come at the price of legal enforceability": by its inability to transcend ancient notions of state sovereignty, the declaration in effect bequeathed to nation-states the power of adjudication over their own adherence to human rights standards. Moyn's contention revolves around the fact that world leaders in the 1940s were understandably reluctant to cede any jurisdiction to the whims of a supranational institution, notwithstanding (or perhaps directly due to) its supposed impartiality.

I found the author's thesis compelling at first, as he explicitly delineated the prevailing global consensus of political leaders in the post-World War II era: a strong desire for peace was complemented by a profound wariness of others' intentions. In such an environment, the idea of subordinating a national legal framework to an international structure -- especially one in which the state itself could be held blameworthy -- was not an attractive proposition to any elites. And thus was born the Universal Declaration of Human Rights, a document whose noble goals disguised an impotent enforcement mechanism.

But Samuel Moyn's continued pounding on the heads of his readers quickly grows old. I cannot count the number of times (or the plethora of ways) he tries to convince his readers that today's edition of human rights bears little resemblance to, or is only a distant relative of, that of the 1940s. "As of 1945," Moyn writes in one instance, "human rights were already on the way out for the few international lawyers who had made them central in wartime." Elsewhere: "Instead of turning to history to monumentalize human rights by rooting them deep in the past, it is much better to acknowledge how recent and contingent they really are." And, "what mattered most of all about the human rights moment of the 1940s, in truth, is not that it happened, but that -- like the even deeper past -- it had to be reinvented, not merely retrieved, after the fact."

Virtually nothing is as consistently unsurprising as professorial loquacity. But even among academics, Moyn tests the limits of repetition. His mantra seems to have been: if something is worth writing, it's worth writing one hundred times. In this regard, then, he has succeeded. Unfortunately, much like human rights themselves for a time, Moyn proves far more adept at defining their history negatively than positively. It is obvious that he considers the UDHR only nominally relevant in jump-starting the human rights movement; what is less transparent is his perspective on its true origins.

Human rights constitute the last utopia of the book's title, but Samuel Moyn does little with this concept other than to restate it over and over (just as he does with his repudiations of the movement's alleged foundation myth). "When the history of human rights acknowledges how recently they came to the world," Moyn writes, "it focuses not simply on the crisis of the nation-state, but on the collapse of alternative internationalisms -- global visions that were powerful for so long in spite of not featuring individual rights." It was, in a sense, the worldwide disillusionment with grandiose visions of the past that gradually led to the introduction of human rights as a viable alternative. They offered a (facially) moral ideal where before only political ones had existed.

In short, "human rights were born as the last utopia -- but one day another may appear." Aside from brief mentions (and like so much else in The Last Utopia), Samuel Moyn leaves this final speculation largely unaddressed. As to the idea that modern human rights came about due to the Universal Declaration of Human Rights, however: well, that horse has already been beaten quite to death.

Tuesday, November 2, 2010

#48: Salvation City

Earlier this year I expressed the need to stop reading manifestos. This time it's dystopias that have drawn my ire: I think I'll take a break from these too. Salvation City, a novel by Sigrid Nunez, is no duller than some of the other post-apocalyptic books I've read in the past few years. It's also not particularly memorable.

Cole Vining is a thirteen-year-old orphan whose atheist parents died in a flu epidemic. The atheist bit matters, in this case, since so much of the narrative is focused on the conflicting identities of the young protagonist, as the storytelling jumps back and forth in time to pull all the strings together. Following his parents' death, and after a stint in an orphanage known as Here Be Hope, Cole is delivered to the rural Indiana home of Pastor Wyatt and his wife Tracy, in a place called Salvation City.

The kindly clergyman -- who, Cole notes ambivalently, "always looks right into the face of the person he is talking to" -- and his spouse are devout, fundamentalist Christians, and their peculiar lifestyle is frequently juxtaposed with Cole's earlier years amid the emotionally fraught relationship of his irreligious parents. In Salvation City, by which I mean both the book and the town, the question arises of what exactly constitutes a rescue from tragedy, and even of what constitutes tragedy itself.

For Cole's mother, Serena, even those neighbors who had opened their doors for assistance, as the flu swept through cities and towns, were deserving of the utmost suspicion: "But they were Jesus freaks, his mother said, and she didn't want to get involved with them. 'I mean, these people are actually happy about this catastrophe. They think any day now they're going to be sucked up to heaven.'" Her twin sister, Addy, in an attempt to reclaim Cole from his new home following Serena's death, expresses much the same sentiments: "'These fanatics will use religion to justify anything -- especially the ones who believe in the imminent rapture. You do understand, don't you? That's what these monsters were counting on? The Messiah was supposed to show up before I did.'"

Cole sees things somewhat differently. As he contemplates looming adulthood (from the wide-eyed vantage point experienced uniquely by young teens) and his adoptive father claims divine guidance in trying to persuade him to stay, Cole wonders: why "didn't Jesus send a message to him and Addy, too? Wouldn't that have helped them all?"

Sigrid Nunez leaves many questions such as this one open-ended, a seeming mockery of faith that becomes less flippant upon closer observation. Salvation City dwells on choices and asks, implicitly, the important question of what makes a home. But, as often befalls works of fiction whose circumstances require a great leap of imagination, the elusive answers never seem as important as the author intended them to be, and an apathetic reader is the disappointing consequence.