Everybody Hurts

January 21, 2012

Well, this is the project that has occupied my attention for the last year, leaving me little time to complain about all the important issues I would normally tackle on this blog (such as why we still don’t have jet packs).

I’m so excited that my book Tortured Artists is finally out. I hope everyone will support the sanctity of Ye Olde Dead-Tree Model and order a copy from Amazon, Barnes & Noble, or their Favorite Indie Bookstore.

See below for more information.

Tortured Artists
From Picasso and Monroe to Warhol and Winehouse, the Twisted Secrets of the World’s Most Creative Minds

Released March 2012 from Adams Media.

About the Book:

It is sometimes said that all great art comes from pain. Van Gogh painted The Starry Night while in emotional torment; Lennon and McCartney forged their creative partnership following the death of their respective mothers; Milton penned Paradise Lost after losing his wife, his daughter, and his eyesight. Such unremitting grief would send even the most grounded among us into a frenzied Xanax binge and associated fetal position, but these celebrated artists chose not to recoil in passive suffering. Instead, they turned their sorrow into something the world would cherish.

Tortured Artists examines the maladies that drive creative types to the brink of despair and the inspired works that are born from their anguish.

Visit the website at TorturedArtistsBook.com.

In Defense of Generation Y

August 17, 2010

The (scatterbrained) kids are all right

***

“I try to use my iPhone as little as possible,” a woman said to me last week at the very charming Jane Restaurant, in the West Village. When I asked her why, she continued with all the emphatic flair of an evangelist on Adderall. “Whenever you’re online, texting, playing games, instant messaging, whatever, you’re not in the here and now. And not being in the here and now causes stress on the body. The long-term effects of that really concern me.”

I agreed with her completely, of course—indeed, the conversation would not have been the least bit noteworthy had this particular woman been, like me, a thirty-something technology skeptic who adapted to the Internet slowly and with curmudgeonly resistance. But she wasn’t. In fact, at 21, she was barely old enough to remember when the Internet did not exist.

The discussion took place at an industry mixer held by a nonprofit organization that finds internships for journalism students. Editors and publishers from various news outlets were encouraged to attend the event, so the interns in town for the summer could drink, mingle, and trade war stories with seasoned journalism-industry types. However, being one of the few editors who actually showed up to the thing, I nearly bolted rather than face the uncomfortable proposition of having to make small talk with a roomful of 18- to 22-year-olds.

For a change, I’m glad I didn’t bolt. What began as an evening of clumsy discourse turned into—after several swigs of the complimentary raspberry champagne—an unexpectedly elucidating lesson in the complexities of the generation known as the Millennials.

In other words, I got schooled on Generation Y, the rapidly maturing cohort born roughly between 1982 and 2001. Although the idealistic punk in me despises the type of shallow pop-demography that fosters generational labels, the lazy writer in me finds such labels undeniably useful. That said, I do prefer the descriptive “Millennial Generation” to the more widely used, though insultingly derivative, “Generation Y”—a term cooked up by a writer no doubt lazier than myself. Say what you will about the kids who follow Generation X, they at least deserve a buzzy moniker of their own. They also deserve, as I discovered after mixing it up with them last week, far more credit than the stereotypes assigned to them allow.

First, it’s important to understand what makes the Millennials a unique cohort. While I’m a big believer that people are essentially the same in any generation, it is fair to assume that the majority of people born after, say, 1985 grew up in a drastically different world than did those of us who remember when music came on cassette tapes and Howie Mandel had hair. By the time the younger Millennials hit puberty, the Internet had long since unleashed its vast labyrinths of connectivity and interactivity, while many vestiges of Ye Olde Three-Dimensional World—writing utensils, land lines, brick-and-mortar businesses—were, if not eradicated, at least relegated to quaintness.

To be honest, I hadn’t given much thought to how information technology might affect the collective characteristics of the first generation to grow up with it. That the Internet’s culture of vanity and instant gratification could have negative effects on developing minds almost goes without saying, and in fact the Millennials may be the first generation whose members came of age with their own prefabricated set of stereotypes. (One could have predicted such books as Jean Twenge’s Generation Me years before it was published, in 2006.) Nevertheless, as stereotypes spread like wildfire, I was just as quick to assume, unjustly, that the Millennials are as the Old Guard portrays them: scatterbrained, media-addicted narcissists, with white earbuds permanently sutured to their auditory canals, who communicate almost exclusively through text messages and references to the Harry Potter movies. I did not peg them as being particularly ambitious, nor did I imagine their lifetime goals involved anything more than accumulating Twitter followers and maybe auditioning for one of those reality shows with the mean chefs.

Mind you, I still contend that as we advance toward crotchetiness we earn the prerogative to pigeonhole the younger generations, at least to some extent. It’s kind of a rite of passage for curmudgeons. Consider that not so long ago, the baby boomers accused my own cohort of being little more than a generation of cynical, apathetic, videogame-playing underachievers—and to a large degree they were right. However, just as the onset of Gen-X adulthood, typified by a mellower Will Ferrell and a more sensitive Queen Latifah, forced our elders to reevaluate their stereotypes, I am now forced to reevaluate mine. And, judging from the conversations I had with the fresh-faced youngsters at last week’s event, I was dead wrong.

These kids were thoughtful, alert, aware of their unique position on the cusp of a changing world, and, most notably, worried about their future. True, they are more self-centered than previous generations, but narcissism has been on the rise for decades, and the aging process always has a way of tempering overblown egos. And if the Millennials are, ultimately, less introspective than Generation X, it’s not for lack of character, but rather because there is simply too much else to focus on. While the economically prosperous Clinton years left many Xers to brood over the meaninglessness of American opulence, the subsequent groundswell of financial collapse, post-9/11 paranoia, and endless digital distractions has created an environment in which young people have little time to think, much less brood. And even though the Millennials are often criticized for their inflated sense of entitlement, the reality is they are being handed a world in utter shambles. As they enter the workforce, the Great Recession is staring them square in the face, with no sign of letting up. The Internet, over the last decade, has dismantled entire industries. Guess who will be left to pick up the pieces? It’s the Millennials who will face the unenviable task of maintaining social and economic sustainability in the post-three-dimensional world, and many of the kids I spoke with are pretty damn scared about that.

Since the mid-1990s, most of the talk about changing technology has focused on how to adapt to it. The Millennials are the first generation whose members face the opposite decision—not how to adapt to technology, but whether or not they should simply reject it. The Internet, to them, is not new media. It’s just something that exists, no different than radios, TVs, automobiles, and hotdogs on a stick. As the pioneering computer scientist Alan Kay once said, “Technology is anything that was invented after you were born.”

Last week, as I mingled with the people who will be running the world sooner than we think, I gained a new perspective on the future of America. I’m glad I stuck around. Admittedly, not all of the young people I met were as lively in conversation as the aforementioned evangelical iPhone moderate, but I was nevertheless quite gratified that no one, throughout the entire evening, interrupted me to send a text. And Harry Potter? His name never even came up.

This Flimsy House of Club Cards

June 13, 2010

I would not belong to any drugstore that would have me as a member

***

Several times a week, I shop at a nearby Duane Reade drugstore, not because I want to, but because it’s close to my apartment. The nearest supermarket is farther away by a full block—another neighborhood by New York City standards—and the less time I spend shopping, the closer I am to my ultimate goal of doing something other than shopping. And yet, try as I will to speed up the process, my transactions at Duane Reade are routinely stalled by invitations to join a disturbing consumerist cult, one whose mating call begins with a dreaded question asked by the cashier at the checkout counter.

“Do you have a Duane Reade Club Card?”

Typically, the query is followed by a pregnant pause, during which I am reminded that the cult leaders of Duane Reade Inc. will not rest until I agree to have that sterile D.R. logo tattooed on my pale behind. Of course, the cashier knows full well that I don’t have a club card. He asks me the same question every time I’m in the store, and my reply is always the same.

“No, I do not have a club card… Not today. Not ever.”

Club-card programs, once a rare gimmick in the retail world, have become an insidiously pervasive practice. Most major drugstores—Duane Reade, CVS, and Rite Aid, to name a few—adopted their own programs years ago, as have nutrition retailers like GNC and The Vitamin Shoppe, not to mention virtually every supermarket chain on the planet. To foster participation in the programs, the stores offer a superficially alluring incentive, namely the discount prices and special offers that are “rewarded” to anyone who signs up. Under the veil of euphemistic misnomers like Extra Care Card and Preferred Savings Card, retailers attempt to drive home the message that club cards are good for consumers. They’re not. They’re good for retailers, which had offered discounts long before club-card programs became commonplace in the 1990s.

Somewhere along the line, storeowners discovered that they can bully their customers into signing up for programs that do nothing more than track and monitor our shopping habits. Meanwhile, those who opt out of the programs are summarily gouged with so-called everyday prices that hover just north of what they should actually cost. In other words, when a store says that it’s rewarding customers who sign up for club cards, what it’s really doing is penalizing people who don’t—people like me. And while my refusal to sign up for club cards invariably evokes a kind of blasé, “Choose Your Battles” reaction, I can’t help but hold on to the notion that this is a battle worth choosing.

At the risk of sounding like a fringy alarmist who cries Orwell at the slightest erosion of privacy, I must, in this case, side with the fringy alarmists. I’m uncomfortable with any system that dangles convenience like a carrot, only to trick us into submitting to the convenience of the system itself. It is the kind of Pavlovian conditioning that ultimately leads to fewer freedoms for us and greater control for those in charge, and it’s a slippery slope. Imagine, for instance, a not-so-distant future in which we are presented with this hypothetical disclaimer:

Good citizens of the United States:

The new Federal Identification Microchip Program is voluntary. However, be advised that anyone who chooses not to undergo the chip-implant procedure will be ineligible to receive significantly reduced fees for a host of government services, including driver’s licenses, passports, H1N1 vaccinations, waste disposal, firearm permits, and many others.

True, going from club cards to a dystopian future of ID-chip implants is a stretch, but it doesn’t negate the fact that a store is a store, not a country club. A store is not an entity one would voluntarily “join,” and I can see no real reason to do so. I go to Duane Reade to buy things and leave. As I consider shopping a chore, not a pastime, I am really quite frightened by the notion of being “one of them,” which brings us to the emotional crux of my refusal to carry a club card.

I’ve thought long and hard about where to place the imaginary lines that separate me from the consumerist masses, the thing-worshiping parishioners of the International Church of Wal-Mart, or K-Mart, or Whatever-Mart. I’m not defending my imaginary lines, but I have discovered that I need them to function, just as we all need our own imaginary lines to recognize the chasm that separates us from them.

As creatures with an intense need for self-definition, we all draw lines. Whether we draw them on the basis of race, gender, economic status, education level, or the Beatles vs. Stones debate doesn’t really matter. To be denied the basic right to feel superior to someone else is to be denied a crucial part of the human experience. (Oh, you’re a Red Sox fan? Fuck you.) None of us is really above indulging in this type of petty thinking, at least on occasion. Even the enlightened Buddhist monk meditating on the highest mountain peak in the Himalayas is secretly feeling superior to his fellow monks on smaller mountains. Think about it. You feel superior to someone. Perhaps it’s the bitchy reality-TV star whose desperate need for attention is just plain sad, or maybe it’s the superstitious sap down the street who still needs religion to cope with life, or the smug atheist whose certainty that nothing exists beyond the material world just makes him come off like an asshole.

Or, hell, maybe the only person you really feel superior to is the guy who blogs about his aversion to drugstore club cards. It still counts.

That said, I will never join the ranks of retail club members, not because I am better than them, but because, inside, I am just like them. I am a closet elitist, a star-bellied Sneetch who keeps his star hidden behind ill-fitting hipster shirts—the kind you can’t buy at Whatever-Mart.

Still think I should choose my battles? Join the club.

A Good Enough Marriage

March 16, 2010

[My review of George Bernard Shaw’s Candida at the Irish Rep. Originally published in Show Business Weekly]

***

Candida
Written by George Bernard Shaw
Directed and designed
by Tony Walton
Irish Repertory Theatre
132 West 22nd Street
www.irishrep.org

Review by Christopher Zara

“There are convenient marriages but no delightful ones,” quips James Morrell, the socialist reverend at the center of George Bernard Shaw’s Candida. The statement is a cheeky citation from the French moralist François de La Rochefoucauld, but it also serves as a sort of footnote for this comedy about the trappings of modern marriage. Inadvertently, the quote also stirs curiosity about the powers that be at the Irish Repertory Theatre, whose pious rendering of Candida feels a bit like a marriage of convenience itself. It is a solid piece by a company with a knack for unearthing the natural humor and genetically imposed sadness inherent in the works of Irish playwrights, but under the punctilious vision of director and designer Tony Walton, it falls just shy of greatness.

Nevertheless, both Walton and the Irish Rep have a deep understanding of Shaw’s best devices, in particular his tireless female leads who so often cling to what little power they have in male-dominated Victorian England. Dana Ivey impeccably embodied this archetype as the unrepentant brothel magnate of Mrs. Warren’s Profession in the Irish Rep’s 2005 production of that play. Similarly, Melissa Errico confidently tackles the detached pragmatism and stern perceptiveness of Candida’s titular heroine, who is caught in a love triangle of sorts between the long-winded James, to whom she is married, and a young, idealistic poet named Eugene. The triangle itself is really an excuse to chip away at the façade of James and Candida’s seemingly happy marriage. When Eugene first professes his love for James’s thirty-something wife, James simply laughs it off. The reverend is as confident as he is successful — a renowned public speaker who ostensibly sees little threat in the shy, barely pubescent Eugene. James also idealizes his wife, but devotion alone, Eugene argues, is not love, and it isn’t long before the youngster’s passionate attacks on the marriage begin to wear the tired James down. Ultimately, the question of who is the stronger of the two is exposed in the rubble of their debates.


The cast excels at reciting Shaw’s witty and pungent dialogue, which, let’s be honest, could be spoken by “Jersey Shore’s” Snooki and still be a feast for the ears. The wonderful Errico, a duly credible object of affection, is complemented by the talented young actor Sam Underwood, who is both funny and affecting as the lovelorn Eugene. Though squeaky at times, Underwood has a dazzling physicality, bringing Eugene’s neurotic leaps and bounds to spastic life. Other cast standouts include the terrific Brian Murray, as Shaw’s requisite gruff capitalist Burgess, and Xanthe Elbrick, as James’s impetuous secretary, Prossy. Irish Rep producing director Ciaran O’Reilly turns in a competent performance as James, easily portraying the reverend’s smugness and wit, but the actor does not exude the type of raw charisma necessary to pass James off as the magnetic public figure he is purported to be. As such, references to women ogling James during his sermons are not as credible as James’s own suggestion that ladies’ churchgoing is merely a Sunday diversion.

We watch Candida as we would a summer blockbuster: on the edge of our seats, waiting. The hope is not for a mind-blowing alien world created by an army of effects wizards, but rather an equally riveting exposition of ideas, something to challenge us for the walk through Chelsea when it’s all over. The ideas are put forth with expert precision, but we are never really challenged, and maybe it’s because Candida never really transcends the sum of its parts, however proficient those parts might be. Still, it’s a worthy and timeless critique of matrimonial presumption, and for those whose interest is piqued by such things, the Irish Rep’s Candida is definitely a play to see — when it’s convenient, of course.

Hive Times

January 18, 2010

Surviving the whims of the many

***

The New York Times on Friday posted an eye-opening review of a new book by the famed Silicon Valley prophet Jaron Lanier, an early champion of Internet populism who now seems to have reconsidered his love for the ever-intractable Web. Lanier’s book, You Are Not a Gadget, condemns Web 2.0 culture for trampling intellectual property, diminishing the importance of uniqueness, and fostering the digital equivalent of mob rule. To prove his theory, the author cites some of my favorite guilty pleasures—Google, Facebook, Wikipedia—as insidious forces bent on stripping us of our individual voices.

I haven’t yet read Lanier’s book, but the Times review alone has stirred my longstanding ambivalence toward the Age of Ones and Zeros. I’ve always met this brave new world with resistance, adapting to new technology with one hand while pinching my nose with the other. As a member of a generation on the cusp of old media and new, I’ve watched about half of my cohorts embrace blogs, Twitter, and social networks while the other half slowly vanished into an ink smudge of analog obsolescence. At the same time, I slowly, begrudgingly, adopted the use of these cute little tools with a snarky sigh of acceptance, all the while proclaiming myself a lover of the clearly superior forms of old media with which I grew up.

I could never explain quite why I opposed the new order, and over the years I’ve chalked it up to a glut of idiosyncrasies—fear of the unknown, cultural myopia, Gen-X skepticism, plain old curmudgeonliness, or what have you—but my unease was always fueled by one particularly haunting thought: a world in which everyone has a say in everything.

On the surface, opposing such a world seems so bigoted. After all, why shouldn’t we all have a say in things? That’s fair, right? Yet the concept so rarely works in practical application that the wigged gentlemen who framed our Constitution even added a set of checks and balances to prevent it, hence the Bill of Rights, which protects the individual from mob rule. In other words, if enough people hated this blog post, they couldn’t simply “vote” to have me killed, as killing me would violate my civil right to, well, live. Individual civil rights trump the whims of the majority, and thank goodness for that; otherwise we would still live in a country of separate drinking fountains and an all-male Congress.

The Internet operates on the opposite principle, distorting the playing field so that the individual is lost under the illusion of democracy. On YouTube, for instance, a view is a view, and because videos of 17-year-old girls dancing around in their underwear will never fail to attract a certain sizable subset of the population, such videos may garner a higher ranking than a national address by President Obama. Does that make them more valuable to society? (Okay, depends on whom you ask.)

Meanwhile, as browser software has evolved to meet the growing interactivity of Web surfing, the former domain of sound opinion has become a breeding ground for unchecked schmuckdom. A review by the veteran New York Times critic A.O. Scott, for instance, is now saturated with flippant comments by readers, shielded by anonymity, who even get to rate the review from one to four stars—a kind of game of Critique the Critic. Not that the Times hasn’t always included editorials from everyday folk, but the new level of interactivity has one wondering when Pinch Sulzberger will decide to add a “like” button to stories about al Qaeda cells in Yemen.

The Internet isn’t going away (yes, I stopped wishing for that back in 1998), and putting the genie back in the bottle is about as practical as trying to convince people the world is flat. However, it’s worth pondering the questions about the future of our society that remain open-ended: If everyone has a voice, does anyone? And if we truly are moving toward some Borg-like hive mind à la Star Trek: the Next Generation, is there anything we can do to mitigate its ill effects? Can we choose a life off the grid, or is resistance futile?

Andy Warhol once said, “In the future everyone will be famous for 15 minutes,” but what he failed to foresee was how the changing definition of fame would render the concept obsolete. Time magazine can slap a mirror on its cover and dub us all “Person of the Year” just as easily as we can publish our own scathing commentary about Time magazine’s irrelevance. Andy’s future is here, and we’re all as famous as we are anonymous.

Our Tough-Luck Theater Town

December 16, 2009

I wrote this article a few months ago in Show Business Weekly about the struggling nonprofit theater industry. It seemed apt in light of the story in Crain’s New York Business on Monday about the Roundabout Theatre Company’s serious financial woes.

It’s anyone’s guess where New York’s theater artists will be a year from now. Hell, try six months.

***

Recession Clobbers Nonprofit Theaters
Dwindling funds to arts groups highlight disparity between commercial and non-commercial sectors

By Christopher Zara

At a time when fewer Americans can afford the extravagance of attending a Broadway show, one would expect a conference on the future of performing arts in New York City to be a gloomy affair. However, on the stately campus of Columbia University’s School of the Arts late last month, where hundreds of theater professionals and arts aficionados gathered for a four-hour symposium dubbed “Performing Arts at a Crossroads,” the mood was surprisingly peppy and optimistic.

The conference, part of the “Future of New York City” series presented by Crain’s New York Business, comprised a string of panel discussions with arts administrators and city officials who hoped to open a dialogue about how New York’s arts industry can weather the country’s worst financial crisis since the Great Depression. The discussions were both lively and rallying — a hopeful token of the city’s position as a leading purveyor of arts and culture — but in the end few solutions were offered as to what, if anything, should be done to combat dwindling funds in the theater industry, particularly within its embattled nonprofit sector.

“The arts are a powerful driver of everything we do and are in New York,” First Deputy Mayor Patricia Harris reminded the crowd in her opening speech, touting Mayor Michael Bloomberg as an avid arts enthusiast who returned cultural initiatives to the top of the municipal agenda on his first day in office, January 1, 2002. Less than four months after 9/11, Harris said, Bloomberg immediately green-lit Christo and Jeanne-Claude’s The Gates installation in Central Park. “Faced with a looming deficit that outpaced a national recession, why would a mayor even think about public art?” Harris asked rhetorically. “It’s because he was counting on The Gates to do three things that the arts do every single day: transform the quality of life in New York, enhance our identity and, yes, contribute to the city’s economy.”

Eight years after the 9/11 attacks, New York City is facing a new economic menace, one that has forced even the most devoted theatergoers to tighten their entertainment budgets. To be sure, the dire effects of the recession have not battered the theater community equally across the board. At last month’s conference, one panel discussion focused specifically on the theater business. Although its aim was to find common solutions for theater companies both large and small, the discussion ultimately underscored the huge economic disparity that exists between commercial and nonprofit theater.

Nina Lannan, chairman of the Broadway League, a trade association for Broadway theaters, maintained a downright cheerful demeanor in the face of questions about how the recession has affected the Great White Way. Lannan, who also worked as general manager last year on the Broadway shows Billy Elliot, 9 to 5 and Mamma Mia!, said Broadway ticket sales are down only about 10 to 15 percent, far less than in other areas of the city. During the last week of September, for example, nine out of the 26 shows currently on Broadway played at 90 percent capacity.

In short, while Broadway’s commercial sector had been expecting a sharp recession-related decline in ticket sales, anxiety over its bottom line has so far been unfounded. “We were all worried about the summer, but with many shows, we did proceeds over $1,000,000,” Lannan said. “And September has been pretty good for us, too.”

This is not to say that Broadway is immune to the recession. To illustrate this, Lannan cited the recent musical flop 9 to 5, which, despite being based on a popular movie, failed to find an audience and was forced to close on Labor Day weekend. “I think in this climate people are being more choosy about their shows,” she said.

For Todd Haimes, artistic director of the nonprofit Roundabout Theatre Company, the consequences of the recession have been far more devastating. Haimes was quick to acknowledge a drop in subscriptions and general attendance, but he said such declines have been minor in comparison to the falloff in corporate and private donations to the Roundabout. “Where we’re getting killed — and I think this is true to varying degrees for all nonprofits — is in contributions,” he said. “It was like falling off a cliff. We didn’t prepare for it at all.”

Haimes said that many of the major corporate foundations, which could be counted on in the past for huge donations to nonprofits, have been wiped out by the stock market. Moreover, he noted a general decline in corporate philanthropy that has taken place over the last two and a half decades. “When I came to New York in 1983, it was the responsibility of every corporation to give philanthropic dollars back to the community,” he said. “Over the years that shifted, which is sad. Now it’s become nothing.”

Panel moderator Steven Chaikelson, director of Columbia’s theater program, kept a necessary degree of optimism in response to Haimes’s bleak outlook on the future of nonprofit theater. Driving home the point that the arts industry is not a particularly stable one, even in times of economic growth, Chaikelson went so far as to balk at the conference’s ominous-sounding theme, “Performing Arts at a Crossroads,” which he admitted is a vague concept. “I read that title and scratched my head,” he said. “When have we not been at a crossroads? It seems like we’ve been facing the imminent demise of theater for the last 100 years.”

Average Community’s Jersey Homecoming

December 12, 2009

I hope everyone in the Northeast will come to the New Jersey screening of the new documentary my brother and I produced. (I’m a little nervous about the Q&A.) Press release below:

***

Remember City Gardens? Remember New Jersey hardcore in the 1980s? Well, you’re not alone.

The Zara Brothers’ Trenton-set documentary, AVERAGE COMMUNITY, will have its official New Jersey premiere this month at the Record Collector in Bordentown. The gritty punk-rock docu-memoir, directed by Trenton native Fred Zara, took home the Audience Award for Best Feature Documentary at the 2009 CMJ Film Festival in October, where it played to an enthusiastic crowd at the Norwood Screening Room in New York’s West Village.

Fred and Christopher Zara will be on hand for a Q&A. Hosted by City Gardens’ own Randy Now.

Wednesday, December 30
The Record Collector
358 Farnsworth Ave.
Bordentown, NJ
www.the-record-collector.com

All tickets are $10
Advance tickets at the store (assigned seats)
and from http://www.the-record-collector.com
NO children under 13 please.