Tell Me What A Predicate Is Or I’ll Blow Your Head Off

One afternoon almost seven (!) years ago I received a degree in English with a concentration in writing from a respected and respectable institution.  Had my life depended on being able to define a grammatical predicate as I walked through West Philadelphia that evening, it would have been ka-blammo.

It wasn’t the university’s fault; it probably wasn’t my high school’s fault; it may not have been my fault.  But it shouldn’t have been so, I’m sure of that.  This is weaselly, but really I think it was the fault of grammar itself, and its failure to remain essential.  There is no Snake sticking a shotgun in your face.  Even if you don’t know what a predicate is, you can still communicate.  You can still communicate well.  And–the scary part–you can still hoodwink people into thinking you have some expertise concerning the English language and get a $160,000 piece of paper that says so.

I’ve spent a lot of time since 2005 refurbishing the lemon of a grammar education I was handed at graduation.  A professional writer drawing a blank about predicates, noun cases or coordinate adjectives is almost as bad as a surgeon drawing a blank about which is the uvula and which is the urethra.

The sky isn’t falling yet, but our language is going to get really bizarre and eventually really fractured if we keep telling ourselves that a set of standards for how we arrange our words either doesn’t matter or doesn’t exist.  Grammar’s failure to remain essential will be our failure.

Oh, right:

“One of the two main constituents of a sentence or clause, modifying the subject and including the verb, objects, or phrases governed by the verb, as opened the door in Jane opened the door.” (American Heritage Dictionary, 4th Ed.)



So nobody got the Pulitzer Prize for Fiction this year.  Buzzkill!  Look at the maidens’ dejected faces as they realize the bride just doesn’t feel like tossing the bouquet.  We who enjoy the spectator sport of publishing have been denied one of our year’s high points.

Sort of.

While Ann Patchett’s “indignation” and “rage” at the non-decision in this morning’s NYT are not misplaced, let’s remember that a little anarchy is often the best way out of a stale routine.  Recent Pulitzers have gone to good and great books, but at least now we have something more to talk about than how obvious or how obscure the pick was.  What’s more obvious–or more obscure–than nothing?

Also, the selection process is now etherized on the table and ready for dissection.  We want ANSWERS!  I think readers win any time the Secret Board of Shadowy Figures in charge of literary taste-making has to tip its hand a little.

What are these awards anyway?  If Sartre flipped l’oiseau to the Nobel Committee, aren’t these all just empty back-pats and integrity-undermining booby prizes??

Just kidding, I’m not going to rail against the idea of book awards.  They’re fun to anticipate, more fun to bet on; I bet it feels great to win one, and as Ms. Patchett says, they’re good publicity.  But if the lack of a 2012 Fiction Pulitzer is really this mega-blow to the enterprise of reading, things are much worse than I thought.

The bummer for me is that now I don’t have the validation I’d hoped for of all the nice things I’ve been telling friends and family about The Art of Fielding since October.

Except for how good the book still is.

Awww, C’mon!

"Seriously, buy it."

convince, persuade: You may be convinced that something or of something.  You must be persuaded to do something.

-AP Stylebook, 39th Edition

I don’t like either of these words.  In high school, I invested a lot of time and effort in Lincoln-Douglas debate.  Early mornings, late nights, Friday evenings–I was grinding out research and case revisions, or talking forcefully to the wall, practicing my stock rebuttals so the real ones would rain down like arrows at that weekend’s tournament.

I went to tournaments most weekends, sometimes on foot, sometimes by car, sometimes by plane.  My performance at these mattered more to me than my grades.  I usually did well enough to motivate trying even harder for the next one.

But there were times when I would debate before a judge who couldn’t care less about my intricate, trapdoor-laden arguments and my briefcase full of factoids.  I would unleash the beast on some lesser-prepared opponent and then kick the mangled corpse until my toes ached.

And the judge, either disliking my style or just not paying attention, would dismiss the heaps of evidence I had presented and sign the ballot against me with a simple note explaining that I “was not persuasive” or that I failed to “convince” him or her.

In retrospect these were probably good lessons in humility, but at the time they were crushing.  I would be sullen and angry for the rest of the weekend.  It seemed so unfair. Once, after such a decision eliminated me from a prestigious tournament with real silver-plated trophies, I vented my aggression on a sawtooth oak in a university quad with predictable results for my knuckles.

So that’s why I cringe a little whenever I read or am tempted to write “persuade” or “convince”.  The ghosts of poorly considered decisions and quarterfinalist awards begin to taunt me from the rafters.

This is juvenile, of course, and I’m getting over it.  These words describe a basic, essential process in how we interact with others and with ourselves: we didn’t want to do something, or weren’t sure we wanted to, and then someone or something changed our minds.  We’re all convincing and persuading and being convinced and being persuaded every day.

And it’s good to understand how they differ, because until now I’d seen them melded together in one molten lump.  A convincing never has to manifest itself outside the brain of the convinced.  A persuasion always leads to specific action.

It’s curious that convince derives from the Latin for “conquer”, considering it’s the gentler of the two in that it doesn’t necessarily compel action.  Persuade, which by definition has more real-world influence, derives from the Latin for “advise, make appealing, sweeten”.

Perhaps this is an etymological reminder that to get people to do what you want, a sly, gentle hand beats brute force every time.

I know what I know


Last Monday’s post touched on the idea of “write what you know”.  If you’ve ever solicited writing advice, this well-chewed piece of cud has certainly been passed to you by some supposed authority.  It may be the most canonical tip out there (or the most clichéd–what a fine line separates those two!).

It has merit, for sure.  Splattering your half-baked fantasies all over the page invites your reader to carry it straight to the trash outside, queasy and holding it at arm’s length.  Research can be a productivity-killing trap, and it allows no margin of error–getting any detail wrong can ruin the credibility of your whole project.  Sticking to what you already know to be true and real keeps you safe: safe from writer’s block and safe from being full of bullsnot.

Safe is boring.

The problem with “write what you know” emerges when it is taken as a thumbs-up that one’s autobiography–point A to point B to point C to point D–is literature or at least fascinating to anyone lucky enough to read it.

This doesn’t just apply to fiction and memoir, of course.  In the zillions of non-news articles out there, the difference is plain between “I’m writing about this because I’m an expert sharing expertise that can actually help you and anecdotes that will actually interest you” and “I’m writing about this because it happened to me once and I couldn’t think of anything more interesting”.  Amirite?

Look, I know I don’t have any special credibility spouting off about this.  Ask me about the unintentionally hilarious 29-page story I tried to pawn off in college from the perspective of a girl in 1970s Red China.  I’m just scared of anything that makes a villain of the imagination.  As writers, I think it’s our responsibility to see beyond the things rotting in the sun before everybody’s eyes.


P.S. Here’s the obligatory link to the same idea covered more thoroughly and lucidly by a renowned authority.  Thanks a lot, internet.

P.P.S. What the hell, make it two!

Eyeball Chambers


I just saw this improbable feel-good story: A blind woman is handwriting her novel, her pen dies, and she of course doesn’t realize it (though doesn’t a dry pen on paper feel a little different?).  When she learns she’s just embossed twenty-six pages with nothing but shallow trenches, the good blokes of the local constabulary are able to recover her work through some complicated-sounding forensic techniques.

This got me thinking about how important eyes are to writing.  Actually, I had already been thinking about it, but this got me thinking harder.  Since leaving the outdoorsy manual labor of my other career for life behind the keyboard, my eyes have had trouble adjusting.  They are red and itchy, my vision occasionally blurs for a moment, and Visine helps for about ten minutes.

Someone suggested I might be “forgetting to blink”, which is a little scary to consider.  I thought blinking was one of those things that took care of itself.

In my life I’ve been lucky to have good eyesight, and the thought that I’m messing it up worries me–both the principled fear of corrupting a genetic gift, and the practical fear of where it would leave me if things continued to get worse.  I don’t think I could write if I couldn’t see.  Granted, Milton and Homer pulled it off, and Trish Vickers (from the story) seems to be doing OK.  But I can’t imagine doing it myself.

I don’t think I’m going blind from three weeks of heavy computer use.  But going forward, eye health is going to be a primary concern of mine.  There is a medically recognized condition called Computer Vision Syndrome (or CVS–nice work by the pharmacy chain’s PR team for keeping it out of the news), which hopefully demonstrates I’m not making this up.

I welcome any suggestions on how to keep my eyes in fighting shape.  One self-prescription, effective immediately, is to write in longhand for at least two hours a day.   Another is to pry my attention away from the computer screen every 45-50 minutes no matter what.

Another is to remember to blink.

Halfway to Nowhere


I don’t think the semicolon will still be in common usage a generation from now.

I also think the numbers for tonight’s Powerball drawing will be 1, 6, 8, 35, 37 with Powerball 14.

Laugh it up, you sneering skeptics; at midnight I’ll have $98 million and you’ll have periods over commas.

In all seriousness, it seems–from my desk, at least–that written English is headed in a direction that does not favor the semicolon.  Since it first appeared in print, in 1494 (per Lynne Truss), disagreement has persisted about when and why to use it.

Debate is healthy in matters of punctuation, but eventually we reach a break point where everyone throws up their hands and walks away.  I think we’re just about there with the semicolon, despite recent helpful, if probably too-late, attempts to clarify it.

In Eats, Shoots and Leaves, Truss quotes a Cecil Hartley:

The stops point out, with truth, the time of pause
A sentence doth require at ev’ry clause
At ev’ry comma, stop while one you count;
At semicolon, two is the amount;
A colon doth require the time of three;
The period four, as learned men agree.

Gendered 19th-century word choice and treacly Britishness aside, I like the idea of comma-semicolon-colon-period tracing a continuum of timed pauses.  It’s easy to comprehend, and a good reminder that critical communicative cues like pauses, tone, inflection, and even sarcasm are often lost in writing, and that we need to use every tool at our disposal to convey them accurately.

But the semicolon has none of the job security the other three have.  The comma is friendly and versatile.  The colon has, if nothing else, list introductions locked down. The period will survive nuclear war.

The semicolon must defend its territory not only from “long” commas and “short” colons (thinking of pause times), but also from the em dash, which now seems preferred in less-than-formal writing to indicate that two-count.

And here’s the heart of it: The internet, the dominant publishing medium of today and tomorrow, favors less-than-formal writing.  That’s just a matter of numbers, which I wish I could cite exactly: how many words are published online every day versus how many are printed.

My guess is that the em dash is going to gradually kill the semicolon.  Academics and other ultra-formal writers will hang on to the bitter end, and ; may retain some quaint/retro appeal like handwritten letters have, but as far as common usage in thirty years?

Check the punctuation lost and found; maybe it’s under the irony mark.

P.S. Here’s a good article about why I may be wrong here.

Do You Have Change For A Fifty?


14. Avoid Fancy Words

Avoid the elaborate, the pretentious, the coy, and the cute.  Do not be tempted by a twenty-dollar word when there is a ten-center handy, ready, and able.

-The Elements of Style

Strunk and White’s four adjectives (five if you count “fancy”) describing the poorest of poor word choices are not that helpful, at least not as quick fixes.  Perhaps “elaborate” words are something all of us can easily be on guard against, but “coy”?  Does a fly on the wall in Gay Talese’s office hear the great writer mutter “no, no, that’s coy” as he scans a rough draft, popping his rare stylistic zits?

Maybe.  Maybe that sort of vigilance and perceptiveness is why he’s Gay Talese and we’re not.  But I know few others, myself included, who could even identify a “coy” word or phrase.

I struggle with overwriting.  Foremost, I struggle to define it and can only fall back on Supreme Court Justice Stewart’s evergreen comment on hardcore pornography (“I know it when I see it”).  But that’s only marginally useful as I’m stringing lines together.  Fancy, elaborate, coy, cute, pretentious… I don’t want any of these gremlins in my machine.  Who does?  But how do I know they’re there before the motor catches fire?

More immediately, I struggle not to do it.  I know that an overwritten line sagging with spelling bee words, incoherent similes and metaphors, and way-too-specific verbs has the potential to embarrass for a long time.  I’m looking hard right at the “gremlins/motor” thing above and the “Gay Talese zit” thing above that, at a loss for whether or not I’ve corrupted this post with them.

You know what, though?  I’m going to keep them.  English is too interesting not to keep trying things with it, at least on a freewheeling blog.  We’ve seen everything, so all that’s left is to try and see everything in different ways.

Spare, minimalist prose can be refreshing–even essential in some types of writing–but in most creative contexts it too often ends up as faux-Hemingway drivel more obviously awkward than overwriting is.

I think the key is to keep the ego in check, because overwriting (and derivative minimalist writing) is always an expression of the ego.  Whenever you see either, count on this: at some point the writer leaned back, appraised the pile of literary cow chips that somehow made it to your eyes, and thought “Damn, that’s good,” probably followed by “Damn, I’m good.”

Dangerous stuff.  It’s the same ego that makes you think you’re actually a better driver after three drinks–a truly malicious bugger.  And now that I’ve identified him, I’m going to root him out of my writing.  That’s a promise.