Sunday, December 12, 2010

Shipping out

That is all for now from 50 Books for 2010. Hope you enjoyed it. If you're in the market for (free) musings, ramblings, and/or reflections on everything from politics to sports to technology to books, please follow me over to my new project, The First Casualty, at www.jaypinho.com.

Happy holidays!

Saturday, December 11, 2010

A reflection

It feels somehow appropriate that it is here, in the Mission district of San Francisco, that my writer’s block has finally begun to recede. For several weeks now, ever since I typed the last sentence of my fiftieth book review of the year, words had eluded me, replacing the year-long jackhammering of my fingertips with anxious table-tapping. Muddy’s Coffee House, at 1304 Valencia, is proving to be my long-awaited antidote, much as countless cafes and bars within walking distance have provided a safe haven for yesteryear’s beatniks and the poets of today.

I am neither beatnik nor poet. I am, however, an Excel whiz: I create sales plans for an online company in New York, and I’m in the Golden State merely on business. But after reading Gregory Dicum’s recent feature in The New York Times, “A Book Lover’s San Francisco,” and eliciting a good friend’s boundless enthusiasm upon hearing of my trip to the West Coast, I decided a sign was a sign. Immediately after completing work today, I pointed my rental car, a Chevy Aveo with all the horsepower of a kitchen blender, in the direction of I-280 and my first-ever foray into the City by the Bay.

Although it has come to an end in San Francisco, mine is a literary journey that began last New Year’s Eve in Hong Kong, as I stood with my girlfriend atop the IFC mall to await the celebratory fireworks. She asked me if I had a New Year’s resolution. I’d always managed to steer clear of such reckless abandon in the past and, in retrospect, I blame the bitterly whipping wind and the cacophonous house music emanating from the rooftop bar for my anomalous response: “I want to read fifty books this year.”

What soon followed was a rapidly growing stack of books that started with SuperFreakonomics and ended with Animal Spirits, swallowing over ten months and forty-eight books in between. To keep myself committed, I started a blog and reviewed each book as I read it, praising some, excoriating others, and – when hungry, tired or bored – barely devoting four paragraphs each to the rest. If, as some claim, a year is best measured in books, it seems I’d learned that lesson at long last. Other lessons, however, proved harder to grasp. Among axioms of literature, “reading a book is a journey” springs immediately to mind, a trope as true as it is clichéd. Yet my always-looming year-end goal rendered me the journeying equivalent of the five-year-old in the backseat, wondering, “Are we there yet?”

And so it seemed to me, just as to that precocious (hypothetical) child, that I never was. As the year progressed and the inaugural fever pitch of my reading pace gradually ceded ground to work and procrastination, the practicalities of finding time just as subtly began to assert themselves. I decided, via executive fiat, to start reading shorter books. Cut out the dry non-fiction. Embrace short-story collections. These and other considerations crowded out my personal preferences, sacrificing the lengthy luxury of Jonathan Franzen’s 562-page Freedom and the satisfaction of Tolstoy’s War and Peace in favor of the immutable fifty-book bottom line.

Somewhere along the way, I became aware of the inevitable creeping sensation that my New Year’s resolution had shed its virgin luster. Where once the refrain was “only twenty-five left to go!”, there now remained only a sulking “eight left until I’m finally done with this stupid thing.” The blog, too, had become a chore. The whole endeavor was feeling, quite uncomfortably, more and more like school.

This is not to say that the occasional book didn’t capture my imagination. Some certainly did, from Olga Grushin’s surrealist portrait of a declining Soviet Union in The Dream Life of Sukhanov to Michael Lewis’ hilarious recounting of Wall Street’s outsiders in The Big Short to Grégoire Bouillier’s self-psychoanalysis in his endlessly relatable memoir The Mystery Guest, and many more besides. But the act of institutionalizing my reading stripped the written word of one of its most potent weapons: the ability to fully immerse a reader in a world of the author’s creation. With a ticking clock as the omnipresent soundtrack, my suspension of disbelief was relegated to intermittent moments of reading, often lost amongst the more numerous minutes spent fretting over my remaining schedule.

While this may read like a cautionary tale against setting numeric goals for book reading, it’s actually something a little different: a suggestion to aim high but to learn to be satisfied with a less-than-100% success rate. Which is why, even as I celebrated the dissolution of my writer’s block in San Francisco, I suppose I’ll just have to accept the fact that I still didn’t finish this essay until now, back in New York.

Saturday, November 27, 2010

Thanksgiving downtime

Nope, still haven't called it quits on the blog yet. I'm just putting off writing my wrap-up/review/reflection on the experience of trying to read fifty books this year. In the absence of better excuses, I'll blame excessive turkey consumption and the general lethargy that always accompanies trips home for the holidays.

It's not over yet. Like the evil Derek Jeter's contract negotiations with the evil Yankees, this blog just goes on and on and on.

Friday, November 19, 2010

Second-half review

So, j'ai fini. The fifty-book challenge can finally, and mercifully, be laid to rest -- not that I didn't enjoy it, because I most certainly did. (And I'll get to that in a later post: the ups, the downs, the profound life lessons learned. Things like that. Hint: purchases of $25 or more on Amazon.com get free shipping. This was crucial in making the fifty-book challenge less challenging financially.) It's strange: these days I'm reading Freedom by Jonathan Franzen, a lengthy novel I'd been putting off forever, and there's absolutely no deadline for its completion. I'd almost forgotten what it was like to read without even a slightly gnawing sensation of panic.

Anyway, in adhering to tradition (by which I mean my solitary midpoint recap), please allow me to dole out awards for the best and worst of fiction and non-fiction from my last twenty-five books. But first, a few statistics. On the year, I read thirty-five works by males and fifteen by females. (Believe it or not, this ratio actually improved in the second half of my challenge, with sixteen books by men and nine by women. I am ashamed. In my defense, most of my selections were culled directly from major publications' book review sections, which are overwhelmingly biased towards male authors.) And although I considerably improved my fiction exposure (fifteen of my last twenty-five books, or sixteen if one counts the hopelessly naive polemic by Roger D. Hodge, The Mendacity of Hope), I ended the challenge with an even split between fiction and non-, with twenty-five books apiece. I would go into further detail -- divisions by nationality, book length in pages, median year published, etc. -- but that would only serve to depress me and, even worse, would require actual research, which (as anyone who's kept up with this blog should know by now) is the bane of my passively critiquing online existence.

Onward, then.

Best Non-Fiction Book: The Mystery Guest, by Grégoire Bouillier

This feels a little like cheating. As a memoir, The Mystery Guest hovers somewhere between the realms of fiction (from which all memoirs take their cues) and fact (to which all memoirs purportedly aspire). But while the genre is ambiguous, the quality of the story, and the depth of feeling it achieves, are anything but. Grégoire Bouillier manages to capture, in the space of a tidy little book with a very skinny spine, the inner psychotic that rears its ugly head in all of us, given the right (wrong?) circumstances. In the case of Bouillier, this circumstance is his invitation to the birthday party of a woman he does not know, as the "mystery guest" of a former lover who had left him without explanation five years before. Perfectly depicting the protagonist's -- his own -- frayed nerves amid the taut ambiance that builds throughout the party itself, Bouillier courageously unravels the mysteries of his mind, laying bare his insecurities and thus affording grateful readers an eerily familiar reminder of the sheer insanity of romance.

Honorable mention: Unfortunately, none.


Best Fiction Book: The Thieves of Manhattan, by Adam Langer

Perhaps it's the gleeful manner with which Adam Langer mocks every aspect of the publishing industry. Or perhaps it's simply the fact that, in getting such literary bunk published, Langer's distaste for editors' discernment was vindicated by his novel's very existence. But whatever the reasons, The Thieves of Manhattan is at once a laugh machine and a sober inspection of the challenges facing modern writers in a shifting publishing landscape. Employing a niche jargon so drenched in industry particulars that he includes a glossary at the end, Langer hilariously documents the commercialization of literature, a transformation that has placed the works of ex-cons and Pulitzer Prize winners on the same bookshelf at the local Barnes & Noble. Clearly, Langer is a man more amused than outraged at the rapidly disappearing distinction between novels and non-fiction, and he references numerous hoaxes, forgeries, and plagiarisms within his own novel. It may be that Langer, exhausted by high-minded denunciations of authorial appropriation, decided that the best rebuttal was to mirthfully engage in the practice himself. For this, The Thieves of Manhattan won't snag him a Pulitzer Prize, but it will provide his readers with a basic, and far more useful, reward: a most enjoyably clever story.

Honorable mention: The Lotus Eaters, by Tatjana Soli; and All Is Forgotten, Nothing Is Lost, by Lan Samantha Chang


Worst Non-Fiction Book: The Disenchantment of Secular Discourse, by Steven D. Smith

This may come as a bit of a surprise, since I was not unkind to Steven D. Smith in my review of his book. But my own brand of disenchantment owes not to a lack of substance but to a lack of style: The Disenchantment of Secular Discourse is, quite bluntly, not that interesting. Smith's particular axe to grind revolves around a practice he calls "smuggling": the influence of moral judgments on public dialogue despite their conspicuous absence as explicitly delineated premises. In the author's view, this results in a disingenuous conversation: the participants cannot help but unconsciously draw on their individual belief systems but are prevented, through a collective desire for credibility among peers, from admitting these principles' central role. The concept of "smuggling" is an intriguing one, but The Disenchantment of Secular Discourse (as suggested by the title itself) is, to put it lightly, an extremely dry analysis of its effects. Really, though: thumbs up for the idea.

Dishonorable mention: Once again, I didn't read any particularly horrible non-fiction books in the second half. It was, overall, a steadily decent non-fiction batch (without many outliers) this time around.


Worst Fiction Book: One Day, by David Nicholls

David Nicholls likely deserves better from me. It's not exactly fair for a beach read to be judged as a Serious Book. Then again, One Day was once reviewed in The New York Times. As Spider-Man's uncle once explained, not unkindly, "With great power comes great responsibility." Mr. Nicholls, I do hope your film adaptation of the book does well at the box office, since (as I mentioned in my earlier review) that was clearly the objective you had in mind the entire time. There is nothing wrong with this, except for the fact that books written as screenplays tend to exhibit, well, diminished literary value. And I don't think I'm being cruel here. The driving concept of the book -- a peek, on the same calendar day of each successive year, at a pair of mutually obsessed protagonists -- is better suited for straight-to-TV fare than for serious dissection. But read it I did, and skewer it I must. One Day is probably not so bad when the only alternatives are celebrity gossip mags and racy tabloids. People Magazine he is not, but neither is he Ian McEwan or Margaret Atwood. John Grisham, then?

Dishonorable mention: Tinkers, by Paul Harding; and If You Follow Me, by Malena Watrous


I'm still not quite finished with this blog. There's definitely one more post coming, at the very least. Keep checking back!

Wednesday, November 17, 2010

#50: Animal Spirits

How did John Maynard Keynes know I'm not rational? Or at least, not always rational. According to authors George A. Akerlof and Robert J. Shiller, this is one key precept that vanished somewhere along the line from its initial expression by Keynes to the onset of the Great Recession seventy years later. The duo's book, Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism, is a concise attempt at its revival.

It is by now nearly a foregone conclusion, in the prevailing theory, that humans act rationally in their economic decisions. So in the aggregate, the macro-economy will reflect thousands and millions of minor judgment calls that, taken together, constitute the long-sought-after equilibrium. The problem with this theory (even if it never seemed to bother its creator, Milton Friedman) lies in its idealism. Are human beings rational? To an extent, yes. At other times, "people really are human, that is, possessed of all-too-human animal spirits," the authors write.

What are these animal spirits, and what do they do? The definition given here is "the thought patterns that animate people's ideas and feelings." This sounds suitably vague, which is precisely the point. In the rush to transform economics into a science, overweening economists threw the baby out with the bathwater, discarding the very real enigma of human behavior along with the failed economic theories of prior eras. Akerlof, the 2001 Nobel Prize-winner in economics, and Shiller want nothing more than to reintroduce these animal spirits to the field of economics and the public at large.

But first, a re-branding. What was once called "animal spirits" is now studied as "behavioral economics." The authors propose five psychological aspects of this discipline: confidence, fairness, corruption and bad faith, money illusion, and stories. Each of these plays a unique role within the macro-economy, but not always intuitively. Money illusion, for example, describes what takes place when wage cuts are instituted following a deflationary trend. Even when the decrease in pay is commensurate with the drop in prices, employees usually feel cheated. A perfectly rational decision by an employer thus becomes an object lesson in the existence of money illusion (and influences the employees' perception of relative fairness as well).
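To make that arithmetic concrete, here is a minimal sketch of my own (the salary and the 2% figures are hypothetical, not taken from the book): a nominal pay cut that exactly matches deflation leaves real purchasing power untouched, yet it is the nominal number that registers as a loss.

# Illustrative sketch of money illusion (hypothetical numbers, not from Animal Spirits):
# a nominal pay cut that exactly matches deflation leaves real purchasing power
# unchanged, yet workers tend to judge the cut by the nominal figure alone.

nominal_wage = 50_000          # salary before the cut, in dollars
wage_cut = 0.02                # 2% nominal pay cut
deflation = 0.02               # prices also fall by 2%

new_nominal_wage = nominal_wage * (1 - wage_cut)
new_price_level = 1 - deflation               # old price level normalized to 1.0

# Real wage = nominal wage deflated by the price level.
old_real_wage = nominal_wage / 1.0
new_real_wage = new_nominal_wage / new_price_level

print(f"Nominal wage: {nominal_wage:,.0f} -> {new_nominal_wage:,.0f}")
print(f"Real wage:    {old_real_wage:,.0f} -> {new_real_wage:,.0f}")
# The nominal wage drops by 2%, but the real wage is unchanged (49,000 / 0.98 = 50,000).
# Money illusion is the tendency to perceive the first line, not the second, as the loss.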

This flies in the face of classical economics, in which humans are presumed to be supremely rational. (That such theories persist alongside the ongoing public fascination with the likes of Paris Hilton or, say, the British royal family is its own nifty testament to the inscrutability of the human mind.) So Akerlof and Shiller dutifully document the effects of each of their five factors before launching into eight key questions whose answers only make sense in light of the findings of behavioral economics.

This is an enlightening book, and one made all the more pleasant for its conspicuous lack of angry demagoguery. On a spectrum of bitterness from Joseph Stiglitz to Paul Krugman, the authors of Animal Spirits are clearly more aligned with the former. This is an unexpected reprieve, which understandably lends additional gravitas to their cause. Their case can be summarized thus: don't buy too literally into the cult of the "invisible hand." Markets do fail, which is precisely why government regulation (and occasional intervention) is necessary. Of course, with the benefit of hindsight since Animal Spirits was published, it appears their advice -- like that of Stiglitz, Krugman, et al. -- has gone largely unheeded. What comes next is anyone's guess.

Sunday, November 14, 2010

Rescuing the Facebook generation

For the November 25th issue of The New York Review of Books, author Zadie Smith contributed an essay titled “Generation Why?” Ostensibly, the column was a review of Aaron Sorkin’s much-ballyhooed film, The Social Network, but Smith clearly had bigger fish to fry than nerdy billionaires (especially since Sorkin and director David Fincher had already undertaken this task so elegantly themselves).

No, the issue at stake was not Facebook but the “generation” for which it was created and for whom, perhaps, its existence circumscribes theirs. Smith, in attempting to extricate Facebook from its inevitable foundation myths, nevertheless concludes that she will someday “misremember my closeness to Zuckerberg [she, too, was on Harvard’s campus for Facebook’s birth in 2003], in the same spirit that everyone in ‘60s Liverpool met John Lennon.” And yet an acute sense of separation haunts her, as much for its seeming incongruity (Smith is only nine years Mark Zuckerberg’s senior) as for its depth.

“You want to be optimistic about your own generation,” Smith muses, with a touch of nostalgia. “You want to keep pace with them and not to fear what you don’t understand.” She would be wise to heed her own advice. For what she contends in “Generation Why?” – that for the unwashed masses who fancy Facebook, Twitter, et al among life’s requisites, their online reincarnations have themselves become unhinged from, or even superseded, reality – is as emblematic of the anachronisms of the old-guard cohort (whom she affectionately dubs “1.0 people”) as it is a functional indictment of their successors.

The New Yorker’s Susan Orlean stumbles into the same trap, albeit somewhat more amiably. On her blog, “Free Range,” she posits a new hierarchy of friendship: the Social Index, a ranking of relationships by the relative frequencies of online vs. offline contact. “Human relationships used to be easy,” she explains. But “now, thanks to social media, it’s all gone sideways.” Orlean then proceeds to delineate these subtle distinctions: between “the friend you know well” and “the friend you sort of know” and “the friend, or friend-like entity, whom you met initially via Facebook or Twitter or Goodreads or, heaven help us, MySpace,” and so on. Wisely, she keeps the column short and employs a jocular tone, one whose comic value is reaffirmed by her promotion of the Social Index on – where else? – Twitter, using the hashtag #socialindex.

But one can detect a beguiling undercurrent of cynicism beneath Orlean’s evident joviality. What Zadie Smith and Susan Orlean share – in addition to their niche of the “celebrity lifestyle” whose attainment, Smith assures us, is the raison d’être of the Facebook generation – is the creeping suspicion, despite reaching a career zenith, of their continuing exclusion from the proverbial “Porcellian Club” of Zuckerberg’s collegiate fantasies. This, then, is a fate to which both they and those they pity are likewise consigned. The irony, of course, is their refusal, or inability, to identify these “People 2.0” as their kindred spirits.

Smith opts instead for the appeal to authority. In this case, that role falls to Jaron Lanier, a “master programmer and virtual reality pioneer.” (Smith, who is 35, quickly reminds us that Lanier, 50, is “not of my generation,” an assertion whose brashness once more belies her commonalities with that perpetually group-conscious underclass of Facebookers.) Quoting extensively from Lanier’s book, You Are Not a Gadget, Smith appropriates the tech-philosopher’s arm’s-length aspect toward technology as her own, spraying the reader with snippets of his wisdom. (In the book’s preface, Lanier cautioned against this Girl Talk-esque brand of mishmash, lamenting that his words would be “scanned, rehashed, and misrepresented by crowds of quick and sloppy readers into wikis and automatically aggregated wireless text message streams.”)

But Smith and Lanier have separately, and preemptively, doomed themselves to contemporary irrelevance by adhering to a retrograde narrative of the modern condition. Together, their worst nightmare is the narrowing of human existence into unintentionally confined spaces. This process takes place via “lock-in,” a series of inadvertently interacting steps which, taken together, preclude the possibility of reversal or alteration. Such was the case, Lanier argues (and Smith dutifully recounts), in the invention of the MIDI file type, a once cutting-edge format for storing and playing digital music, whose binary limitations preternaturally forced the beautiful infinity of analog melodies into a prepackaged sepulcher of bits and bytes. Once the standard had been formalized, the jig was up: there was no turning back. Music had forever changed, and not necessarily for the better. Lanier unwittingly reformulates – on behalf of the self-described “software idiot” Zadie Smith – these same fears in regard to social media.

These visions of doom are misplaced. One can feel almost viscerally the bored sighs emanating from countless millennials’ diaphragms as Zadie Smith ages before their very eyes: “When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears.” Such generous hyperbolizing obscures whatever consideration Smith’s fretting may warrant on the margins. If rescuing this Lost Generation is her utmost objective, then her plea for sanity, easily mistaken for groveling, will scatter Zuckerberg’s millions of disciples like so many cards in a two-bit parlor trick.

Notably, Zadie Smith gently ridicules the Facebook era’s emphasis on connectivity, remarking snidely that Zuckerberg “used the word ‘connect’ as believers use the word ‘Jesus,’ as if it were sacred in and of itself.” The quality of those interactions, she worries, is not worth the minimal effort exerted to vivify them. And yet she comes agonizingly close, on multiple occasions, to grasping the essence of this generation that remains simultaneously adjacent to, but seemingly unreachable from, her own. “Watching this movie, even though you know Sorkin wants your disapproval, you can’t help feel a little swell of pride in this 2.0 generation,” Smith concedes. “They’ve spent a decade being berated for not making the right sorts of paintings or novels or music or politics. Turns out the brightest 2.0 kids have been doing something else extraordinary. They’ve been making a world.”

Sound familiar? It should. The specter of John Lennon, the one “that everyone in ’60s Liverpool met,” haunts every word of “Generation Why?”  Even Zadie Smith, for whom Lennon (unlike Lanier) is clearly not a peer, cannot ignore the contemporary relevance of the former’s transformative impact on society. Culture may move more rapidly in the digital era than it did in the 1960s, but its disruptive rhythm has survived largely intact. Rebellion, experimentation, innovation: these are all hallmarks of the creative subculture, as each subsequent breakthrough quickly buries its predecessors. Mark Zuckerberg, then, is the spiritual descendant of John Lennon’s “Imagine.” We are, indeed, all connected (much to Smith’s everlasting surprise).

This is the epiphanic truth that the Facebook generation has uncovered, even if in so doing they remain blissfully unaware of the historical import of their actions. To be sure, their self-absorbed ignorance of a chronology of innovation is itself a product of the ever-shifting nature of modern culture. A generation once encompassed two or three decades; now, an absence of even five years from civilization would reduce the most precocious techie to the countenance of a Luddite. But, somewhat paradoxically (considering her alarm at Facebook’s social impact), Smith digests technology’s ephemeral nature with ease, as she states at the end of her essay: “I can’t imagine life without files but I can just about imagine a time when Facebook will seem as comically obsolete as LiveJournal.”

If this is the case, then what, precisely, is the cause for concern? Conceivably, Zadie Smith, who teaches literature, senses an intellectual fence over which the social media-savvy yet literarily deficient minds of her young charges are unable to vault. Perhaps, for a ponderous writer such as Susan Orlean, who once penned a 282-page paean to orchids, it is a fear of losing her audience to ever-decreasing attention spans. For Jaron Lanier, it may be the horror at a remix culture in which the devolution of works of art into haphazardly scissored segments (à la David Shields’ Reality Hunger) threatens the very nature of public expression. Perhaps Zadie Smith and Susan Orlean and Jaron Lanier and so many others of their age and temperament, finding themselves unable to “keep pace with [the younger generation],” succumb to the all-too-human instinct to “fear what [they] don’t understand.” In short, they face the same challenge that confronted the parents and teachers and writers of the ‘60s generation, fifty years later. They, like Mark Zuckerberg and the hordes of Facebook users who followed him in the quest for digital immortality, face the fear of oblivion.

Friday, November 12, 2010

#49: The Last Utopia

In just a few short weeks, the world will celebrate the sixty-second anniversary of the Universal Declaration of Human Rights. Adopted by the United Nations on December 10th, 1948, the document ushered in an unprecedented era of international rights norms that has since culminated in the prominence of human rights organizations such as Amnesty International and Human Rights Watch.

What Samuel Moyn argues in his book, The Last Utopia: Human Rights in History, is that the thematic line running from the UDHR's adoption in 1948 through today is misrepresented in the nascent field of human rights studies. Although cemented now as the defining moment that gave human rights its beginning, the Universal Declaration's appearance was, Moyn insists, "less the annunciation of a new age than a funereal wreath laid on the grave of wartime hopes."

This is a decidedly irreverent perspective on a movement whose brief and explosive history has (especially in recent years) been lionized as proof of civilization's continuing evolution. But Moyn is certain that these celebrants of human rights' march to glory have it all wrong. In fact, he argues, the UDHR was, if anything, more detrimental than it was helpful in facilitating the cause of human rights as it is known today. The UDHR's adoption "had come at the price of legal enforceability": by its inability to transcend ancient notions of state sovereignty, the declaration in effect bequeathed to nation-states the power of adjudication over their own adherence to human rights standards. Moyn's contention revolves around the fact that world leaders in the 1940s were understandably reluctant to cede any jurisdiction to the whims of a supranational institution, notwithstanding (or perhaps directly due to) its supposed impartiality.

I found the author's thesis compelling at first, as he explicitly delineated the prevailing global consensus of political leaders in the post-World War II era: a strong desire for peace was complemented by a profound wariness of others' intentions. In such an environment, the idea of subordinating a national legal framework to an international structure -- especially one in which the state itself could be held blameworthy -- was not an attractive proposition to any elites. And thus was born the Universal Declaration of Human Rights, a document whose noble goals disguised an impotent enforcement mechanism.

But Samuel Moyn's continued pounding on the heads of his readers quickly grows old. I cannot count the number of times (or the plethora of ways) he tries to convince his readers that today's edition of human rights bears little resemblance to, or is only a distant relative of, that of the 1940s. "As of 1945," Moyn writes in one instance, "human rights were already on the way out for the few international lawyers who had made them central in wartime." Elsewhere: "Instead of turning to history to monumentalize human rights by rooting them deep in the past, it is much better to acknowledge how recent and contingent they really are." And, "what mattered most of all about the human rights moment of the 1940s, in truth, is not that it happened, but that -- like the even deeper past -- it had to be reinvented, not merely retrieved, after the fact."

Virtually nothing is as consistently unsurprising as professorial loquacity. But even among academics, Moyn tests the limits of repetition. His mantra seems to have been: if something is worth writing, it's worth writing one hundred times. In this regard, then, he has succeeded. Unfortunately, much like human rights themselves for a time, Moyn proves far more adept at defining their history negatively than positively. It is obvious that he considers the UDHR only nominally relevant in jump-starting the human rights movement; what is less transparent is his perspective on its true origins.

Human rights constitute the eponymous last utopia of his book's title, but Samuel Moyn does little with this concept other than to restate it over and over (just as he does with his repudiations of the movement's alleged foundation myth). "When the history of human rights acknowledges how recently they came to the world," Moyn writes, "it focuses not simply on the crisis of the nation-state, but on the collapse of alternative internationalisms -- global visions that were powerful for so long in spite of not featuring individual rights." It was, in a sense, the worldwide disillusionment with grandiose visions of the past that gradually led to the introduction of human rights as a viable alternative. It offered a (facially) moral ideal where before had existed only political ones.

In short, "human rights were born as the last utopia -- but one day another may appear." Other than brief mentions (and like so much else in The Last Utopia), Samuel Moyn leaves this final speculation largely unaddressed. As to the idea that modern human rights came about due to the Universal Declaration of Rights, however: well, that horse has already been beaten quite to death.