The People’s Choice?

The establishment’s disdain for the people as sovereign over the political class helps explain why so many average citizens feel disconnected from our governing processes and find little inspiration or enlightenment in today’s editorial and commentary pages.

From nighttime cable talk to the morning daily newspaper, we increasingly live in an age when commentators have no practical experience in the subjects their media outlets proclaim as their areas of expertise.

This lack of down-to-earth, real-world experience is glaringly evident in the debate over nominating by primary versus by convention.

Let me use the recent Virginia Democratic Party primary for lieutenant governor as an example. Admittedly, the primary turnout was roughly 115,000, down about 25 percent from the 2001 statewide primary total. The Republican Party primary, which featured contests for all three statewide offices, attracted only about 175,000 voters for the top spot.

These are very disappointing numbers.

But in 1997, when Democrats last used the convention system to pick their statewide ticket, delegates were chosen at local meetings (usually at one location in a county or city). The actual number of people who turned out for these gatherings was at least 75 percent smaller than the voter totals for the June primary. That’s right: The convention process is far less democratic than even the low-turnout primary. This was equally true in 1993.

Indeed, in a good number of jurisdictions, local leaders had to scramble just to get people to file to run for delegate.

More important, however, is this fact of political life: The difference between a primary selection process and a convention selection process is far greater than even a 3-to-1 margin in turnout.

In a convention process, the chance of a candidate opposed by the party establishment winning a nomination is practically nil, at least by historic experience. The only candidate to ever buck the party leaders and win a spot on the statewide ticket picked at a convention was Doug Wilder in 1985. He would have been a sure winner in a primary, as his subsequent historic victories showed.

Let’s take 2005 as another example. Many key members of the Democratic Party establishment were privately working against the nomination of either former Congresswoman Leslie Byrne or Delegate Viola Baskerville. They preferred either Delegate Chap Petersen or Sen. Phil Puckett as the party’s lieutenant governor nominee.

Had Democrats used the convention process in 2005, I can safely say that these members of the party establishment would have been able to orchestrate the nomination of Petersen or Puckett through the influence a nonprimary mode of selection gives powerful party politicians.

But the primary effectively diluted this kind of veto power, giving rank-and-file Democrats power over the party bosses.

Unfortunately, Virginia is again being pushed to return to a top-down political process like the one dominant in the Byrd era. Democrats need to embrace an open, bottom-up politics.

If we return to a convention process in which at least 75 percent fewer people participate, we come very close to a taxation-without-representation scenario.

No doubt, this is what many state power-brokers — in politics, the media and big business — want: You pay but you don’t get much say, at least in terms of party nominees.

Paul Goldman is the senior policy adviser to Mayor L. Douglas Wilder. He served as Wilder’s chief political strategist and was an adviser on Mark Warner’s campaign for governor.

It’s Alive…Again!

Why Horror Films Are Getting a Second Chance

by Jeremy Griffin

There’s something funny about watching Cary Elwes saw through his ankle, or Milla Jovovich gunning down zombie dogs. Or about a videotape vile enough to kill you, yet courteous enough to call and tell you when that will be. Perhaps it’s just the absurdity of these things that makes them so appealing, the inherent silliness of such an eager departure from reality.

But then again, isn’t that what movies are for?

The evolution of horror films has been minimal, to say the least. While Hollywood continues to lose itself in new technologies and marketing pageantry, horror films embrace the same formula they’ve used for decades. And why not? It works. Just look at the success of “Boogeyman,” whose initial weekend earnings of $19.5 million set a new record for a film opening on Super Bowl weekend.

In fact, only two major advancements stand out in the recent blizzard of horror flicks. One is an increased use of special effects. This is probably due in large part to the “Matrix” trilogy, which catalyzed a growth spurt for the CGI industry.

The other is a much loftier attitude toward violence. More creative deaths can compensate for a repetitive storyline. And that’s really all the audience wants. Screw the storyline — show us a new way to die. In other words, you don’t need Jerry Bruckheimer to produce your picture; you just need to raise the stakes. Bigger monsters, louder screams, more blood. If the last 10 minutes of “Saw” didn’t keep you at least somewhat intrigued, then you are indeed the living dead.

Until fairly recently, the horror genre’s big names were staples of early-’70s to late-’80s pop culture. Leatherface, Pinhead, Freddy Krueger: none of these characters were multidimensional in the least. Nor did they need to be. Their sole purpose was to scare the daylights out of bored teenagers. And no one can deny that they excelled at this.

But then, for whatever reason, people lost interest. The audience grew up. Tipper Gore came along. “90210” happened. You also had the beginnings of an infamous series of school shootings in Mississippi, Kentucky, and eventually Colorado. Maybe the real world became scary enough. Needless to say, it suddenly wasn’t kosher to watch movies about good-looking young people getting butchered.

Of course, the genre didn’t disappear completely; it just kept a low profile. Unfortunately, movies like “Jason X,” which sends Jason into space, didn’t help boost its popularity. In the meantime, there was a barrage of more “sophisticated” pictures like “Titanic,” in which a very large boat sinks and Celine Dion sings a song about it. (It seems rather strange to me that, in the absence of any really noteworthy horror flicks, we’re always left with movies that make us feel like victims of some chainsaw-wielding halfwit…)

However, it wasn’t until Wes Craven’s “Scream” trilogy that audiences finally started to get the joke. Take a look back at Jamie Kennedy’s lectures on the mechanics of horror films, and you might catch on. These movies aren’t supposed to be serious. They’re supposed to be fun. The men and women who make them understand how trite they are. But you’ll notice that the most successful of these films don’t overcomplicate themselves. Instead, they embrace their own campiness. They capitalize on the traditional ploys — the gratuitous sex scenes, the demonic villains, and the gory deaths. Look at recent movies like “Freddy vs. Jason,” the remake of “Dawn of the Dead,” and “28 Days Later.” There’s nothing overwhelmingly original about these storylines, nor are the scripts anything to brag about. The only difference is the increased level of violence, which is the only way for these films to top their predecessors. And if you ask me, they did a pretty good job. One can’t help but revel in the brute implausibility. I saw each of these in theaters, and each time the audience laughed, applauded, and left looking quite satisfied.

Now, there are downsides to the reemergence of horror films. For instance, the “Scary Movie” series. While I can appreciate the good-natured jabs at films like “I Know What You Did Last Summer,” the first “Scary Movie” was enough for me. The other two seemed more like a collage of leftover jokes pasted together by a handful of overly stimulated fraternity brothers. Nevertheless, their popularity only points to the fact that there is much humor to be found in horror films. And the people most willing to point this out to us are the filmmakers themselves.

Horror movies have a way of tapping into the parts of the imagination we’re a bit too embarrassed to acknowledge. In a sense, it’s one of the most honest genres in that these films don’t pull any punches. What you see is what you get. No ethical obstacle course to run, no life lessons to carry away. Someone’s always going to walk into the dark room, and someone’s always waiting. And that’s oddly reassuring.

So don’t tell me that Sam Raimi isn’t laughing his way through the scriptwriting process. Or that Naomi Watts got the lead role in “The Ring” because she’s a good actress. If you think the abundance of breasts in the “Friday the 13th” flicks is unnecessary, or that the “Nightmare on Elm Street” movies are too predictable, or if you just can’t believe that Michael Myers won’t die, then you, my friend, are absolutely right — now go back to your copy of “Bridget Jones’s Diary” and keep quiet. As for me, I don’t see any value in looking to cinema for enlightenment. To quote Stephen King on the virtues of horror stories: “If you want to learn something, go to school.” I have a copy of “Schindler’s List” that I like to watch whenever I feel the need to tap into my emotional core and truly understand the human condition. But for purposes of pure entertainment, I also have a copy of “Night of the Living Dead.”

Guess which one gets watched more often.
