A column by Gary Webster
The computer ate my essay!
As a child, I never used "my dog ate it!" as an excuse for a late homework assignment. For one thing, for much of my childhood, my family didn't own a dog. . .not that my teacher would've known that. As I recall those golden years, I can't remember any of my classmates attempting to convince the teacher that their dog had chewed and swallowed their homework. I think the whole "my dog ate my homework" gag is the concoction of some comedy writer who certainly has gotten a lot of mileage out of it. I hope he or she copyrighted it. Besides, I can't imagine any pooch being that hungry. Homework would have to taste terrible.
However, my computer did eat the essay I'd written for December. I have no one but myself to blame. I'd written a perfectly nice Christmas essay in which I lampooned a Hallmark Channel holiday movie I'd seen while waiting for a repairman to come to my mother's house to fix her washing machine. My mother loves the Hallmark Channel. The film was about a cynical magazine reporter dispatched to a town called Mistletoe to investigate a man named Kris Kringle who claimed to be the real Santa Claus. Pretty creative stuff, huh? I'm still trying to get someone in a position of authority in Hollywood to read my screenplay while people are turning out hokey scripts about guys claiming to be the real Santa Claus and selling them to the Hallmark Channel and getting paid big bucks for them. Actually, someone in a position of authority in Hollywood has read my script, through an actress friend who read it and liked it and presented it to the man promising me that "he'll love it! He does this kind of stuff all the time!" He didn't love it. At least not enough to buy it. The guy did say he thought it should be a TV movie rather than a theatrical film, an observation I can't argue with. Too bad he doesn't work for the Hallmark Channel.
Anyway, the crux of the essay was the fact that even though I didn't watch the whole movie (the repair guy came about halfway through it), I saw enough to know how it ended. I can say with 99.9% certainty that the cynical female reporter fell in love with the man (a former divorce attorney) who owned the bed and breakfast she stayed in while investigating the old geezer she was convinced was either a mental case or a con man when she arrived in Mistletoe. The old geezer's interaction with the townspeople during the Christmas holiday convinced the reporter he genuinely was Santa Claus, and the film climaxed with her marriage to the proprietor of the bed and breakfast on Christmas Day. She submitted her story to her editor; then, having lost her cynicism thanks to Kris Kringle, quit reporting to help her new hubby run his business and they all lived happily ever after in Mistletoe. . .especially at Christmas. It had to have ended that way. It was a Hallmark Channel movie, for cryin' out loud!
Unfortunately, I didn't bother to back up the essay on my flash drive until it was time to send it on to the editor of this website, and it was lost when my computer was rendered essentially useless by a virus. The people at the local electronics store told me the virus that had struck my computer was so new, no one had yet found a fix for it, so the only way to get rid of it was to tear down the computer and delete all of my files. Anything that hadn't been backed up would be lost. And so my pithy and witty December essay met its unfortunate fate. I think the Grinch was responsible. Let that be a lesson to you: back up everything you do on your computer, or the Grinch may eat it.
I'm sorry you didn't get to read my December essay. You would've enjoyed it.
We now return to The Webster Chronicles on The History Channel.
Admittedly, what occurred this morning doesn't deserve a special program on cable television, but it is significant nonetheless. At approximately 9:43 Eastern Standard Time this morning, the fourth day of November 2014, I voted for a Republican for governor of Ohio.
I am a registered Democrat and proud of it. But the Democratic nominee for governor left me no choice but to break with a longstanding tradition and cast my ballot for the incumbent Republican. The Democratic nominee lost my vote when the news media revealed that, while serving as mayor of the Cleveland suburb of Lakewood, and then while serving as executive of Cuyahoga County, he had driven without a valid license for 10 years. That's not such a dastardly crime in and of itself, but the media also noted that, while driving for a full decade without a valid license, the guy was putting the hammer down on county employees who were caught doing the same thing. A classic case of "do as I say, not as I do." But even that didn't bother me as much as the nominee's spokeswoman's response to the revelation that the candidate had driven without a valid license for 10 years.
The campaign spokeswoman acknowledged the candidate's transgression, but tried to explain it away by saying that the candidate really, really, really meant to get to the BMV and have his license renewed, but every time he'd planned to do so, something came up and he just couldn't make it. Every time in a 10-year period. That's roughly 2,600 business days. Every time in 2,600 chances to visit the BMV and get his license renewed, something came up and he just couldn't make it. What a coincidence!
Any candidate expecting me to be stupid enough to swallow a lame excuse like that wasn't entitled to my vote, Democrat or not.
And so, if memory serves me, today I, for the very first time, voted for a Republican for governor of Ohio. It was a relatively painless experience, but I sincerely hope my party never puts me through it again. I have a long and proud history of voting for Democratic candidates for governor. I remember the first gubernatorial election in which I voted. It was 1978. I was 22 years old and a student at dear old Kent State University. I proudly voted for Richard F. Celeste, who preferred to be called Dick. I remember how stunned I, and my fellow students, were the next morning as we asked ourselves how our candidate had lost to the incumbent Republican, James A. Rhodes, who preferred to be called James. Actually, he preferred to be called "governor," as he would be until January of 1983. It was his fourth term as the head Ohioan, and it didn't take us long to figure out how he won. While we had voted for Celeste, our parents had all voted for Rhodes, and they had us outnumbered, 2-1.
I voted for Celeste three times. He won in 1982 and again in 1986, only because in 1986 the Republicans nominated Rhodes, who was 78 years old and just too doggone old to be governor! In 1990, I voted for Anthony Celebrezze, in 1994 I voted for Rob Burch (I think), in 1998 I voted for the Natural Law Party candidate because I was disgusted with the campaigns conducted by both the Democrat and the Republican. I voted for the Democrat in 2002, although I can't remember who he was. I voted for Ted Strickland in 2006 when he was elected, and in 2010 when he was booted out of office by John Kasich, the Republican I voted for today, even though I don't trust Republicans.
I trust candidates who give moronic answers to legitimate questions even less.
As the old advertisement for the supermarket tabloid used to say, inquiring minds want to know.
And if they don't, I do.
I grew up watching and loving cartoons. When my friends. . .well, let's call them the kids I went to school with. . .were socializing on Saturday mornings, I was glued to the living room couch, watching cartoons. From seven o'clock, when Barnaby appeared on my TV screen hosting two hours of Popeye cartoons, until one o'clock in the afternoon, when the final NBC animated offering for the day concluded, I barely moved. I'm sure I rose occasionally to use the bathroom, and maybe to raid the refrigerator for a snack, but otherwise, every Saturday morning was a cartoon marathon which held my rapt attention. And when the man on TV told me not to touch the dial, I listened. Although there were wall-to-wall cartoons on all three networks (that's all the networks there were in that ancient era), my television's dial never budged from channel 3, Cleveland's NBC affiliate. Maybe I was afraid I'd hurt my wrist if I twisted the dial, or maybe I was too lazy to get off the couch to do anything other than use the bathroom and grab a piece of toast. Or maybe I just really liked NBC's cartoons.
As I approach my golden years. . .I can't believe I just wrote that, but it's the truth, hard as it is to admit. . .I still enjoy cartoons. Nickelodeon is my favorite cable network. I have almost every episode of SpongeBob SquarePants on video, and I don't even have grandchildren. I don't even have children! Another Nickelodeon cartoon I get a kick out of is The Fairly Odd Parents, about a 10-year-old boy named Timmy Turner who is afflicted with the world's worst babysitter, the world's worst elementary school teacher, and the world's dumbest parents. The head fairy took pity on Timmy back in 2001 and gave him two fairy godparents to ease his pain, and to grant his every wish. . .within limits. According to fairy rules, fairy godparents can't grant a wish to enable their godchild to win a competition, and they can't grant a wish interfering with true love. At least not on The Fairly Odd Parents. Everything else is fair game, and Timmy and his godparents, Cosmo and Wanda, had some crazy adventures over the next 13 years and hundreds of episodes. Wanda was the smart fairy. Her husband, Cosmo, had a brain the size of a needle point.
A couple of months ago, Nickelodeon presented a film entitled A Fairly Odd Summer, in which Timmy (he stayed 10 years old in the cartoons, but in the live action films he was an adult) and his girlfriend Tootie save Fairy World from destruction by protecting the world's supply of the substance that gives fairies the power to grant wishes. In order to do so, Timmy fell into a swirling pit of doom, and emerged as a fairy himself. The head fairy re-assigned Cosmo and Wanda to another miserable 10-year-old, and that was the end of The Fairly Odd Parents. It took me several minutes after the ending credits to realize this. The show is over! There will be no more Fairly Odd Parents cartoons.
Throughout the show's 13 years on Nick, its creator, Butch Hartman, steadfastly kept one secret: the first names of Timmy's real parents, who were always referred to as either Mr. and Mrs. Turner, or Timmy's dad and Timmy's mom. I always figured that, in the final episode, Hartman would reveal their first names. But he didn't! The show is over, and we still don't know Mr. and Mrs. Turner's first names. Apparently, we never will. Only Hartman knows, and he isn't telling. I feel gypped. Doggone it, I want to know! What are Mr. and Mrs. Turner's first names?
I wonder if Hartman could be bribed?
How can they do this to me?
During my days as an undergraduate at dear old Kent State University, I never thought I'd find myself in the position I'm currently in: namely, doing just about anything to make a buck. Well, maybe not just about anything. Then again, I think walking around shopping malls wearing a furry dog costume, complete with a furry head, in 98-degree weather for four hours qualifies as just about anything. Yes, if you visited the Lodi Station Outlet Mall in beautiful Burbank, Ohio, during the summer of 2011 or 2012, that big furry dog in train engineer's clothing who waved at you as you were casually window shopping and otherwise minding your own business may have been me. There were some perks to the job, such as being mobbed by girls' soccer teams in their uniforms complete with shorts. They really liked the big furry dog. It's a good thing they couldn't see the sweaty, skinny man inside. Then there were the well-meaning mall patrons who'd walk by me and ask as I waved at them, "are you hot in there?"
As Homer Simpson would say, "D'oh!" It was 98 degrees in the shade and I was wearing enough fur to make stoles for each of the Kardashian sisters. . .who probably don't wear fur, inasmuch as it's politically incorrect. Of course I was hot in there!!! BTW, the big furry dog's fur is fake. I don't want animal activists descending on the mall with their picket signs, since I and my cohorts make a lot of money wearing character costumes there. I have, however, retired as the big, furry dog. One 98-degree afternoon was enough.
Since I, perhaps wisely, perhaps not, responded to the query "do you think you can play a character with a head?" with a resounding "maybe!" I have been a big, furry dog; the Easter bunny; Curious George, the friendly and inquisitive monkey; George's friend the man in the yellow hat (for which role I was able to use my very own head, with a yellow hat on it); a Teenage Mutant Ninja Turtle; a Mighty Morphin Power Ranger (the red one, if you know which one that is); Dr. Seuss's Cat in the Hat; Martha the talking dog (almost as big and furry as the big, furry dog); Hello Kitty; and Clifford, the big red dog (but not quite as furry as the big, furry dog). I may have forgotten a character or two. I wore the Hello Kitty costume last year at the Willoughby Hills Corn Festival, and I was looking forward to doing so again this year. More accurately, I was looking forward to earning $25 an hour wearing the costume at the Corn Festival this year. No such luck.
I hadn't really thought about it until the gentleman who schedules these appearances for our merry band of lunatics mentioned that the Corn Festival had been cancelled. I live within walking distance of the park where the festival was held the past few years, and it then struck me that I hadn't seen any posters advertising it. The chamber of commerce of my adopted home town usually started advertising the festivities in June, but not this year. Nor in July or even August.
Somebody cancelled the Corn Festival! Even worse, somebody deprived me of the opportunity to earn $50 by walking around in a cat costume for a couple of hours. I want to know who's responsible for this outrage! Get the mayor on the phone! I voted for him!
The Corn Festival is a long-standing Willoughby Hills tradition, dating all the way back to 2011 or maybe even 2010. It might even go back farther than that. And it's wrong to mess with tradition. The good people of Willoughby Hills looked forward to one last summer fling in the middle of September, and that's been stolen from them. Not to mention the chance to honor the farmers who grow the greatest of American vegetables, corn! What will they do next, cancel St. Swithin's Day?
Making up for lost time.
I promised plenty of political commentary in this mid-term election year back in January. Here it is August, and I haven't lived up to that promise, with the election three months away. Allow me to correct that mistake, although the political commentary that follows has nothing to do with this year's election.
The city fathers in my hometown. . .I was born in Cleveland, although I didn't grow up there, so I suppose that technically makes it my hometown. . .are patting themselves on the back as this commentary is being written for pulling off a coup. Few people, including yours truly, took local politicos seriously when they announced their intentions to entice the Republican Party to hold its 2016 presidential nominating convention in the brand new, $400 million convention center just constructed on Lakeside Avenue, across from Cleveland's city hall. I think the convention center cost $400 million. If it isn't a sports facility, I don't pay much attention to it. And the convention center isn't a sports facility, although politics has more than once been referred to as a sport.
Anyway, I didn't think Cleveland had the proverbial snowball's chance of landing the GOP convention, especially since it was competing against Las Vegas, among other cities. If you were a delegate to a political convention, where would you rather spend four days in the summer of 2016. . .Las Vegas or Cleveland? Me, too! But the Republicans realized they have positioned themselves as the party of traditional moral values, with much of their support coming from evangelical Christians, so how would it look for them to select their next presidential ticket in a place called "Sin City?" Thus, the decision eventually boiled down to Cleveland, Dallas, Kansas City and Denver. Why Denver and Kansas City were eliminated I don't know. I didn't even know they had been eliminated until an article on the front page of the July 6th edition of the local newspaper said so. Dallas, a cosmopolitan metroplex deep in the heart of perhaps the most Republican state in the union, came up short because it couldn't host the GOP in June of 2016. The convention would have to wait until July, and the Republicans didn't want to wait. Cleveland was available in June, and that's why the next GOP presidential ticket will be nominated on the shore of Lake Erie.
Here's what bugs me. Cleveland is an overwhelmingly Democratic city. The mayor who worked so hard to bring the Republican convention to town is a Democrat, as are most of the city and county officials who made the sales pitch to the Republican site selection committee. Admittedly, the mayor of Dallas is also a Democrat. I thought it was illegal to be a Democrat in Texas, but apparently not. The mayor of Columbus, which also made a bid for the Republican convention, is a Democrat, too.
Doesn't it seem the least bit hypocritical, or at least incongruous, that, after practically begging the Republicans to nominate their presidential candidate in Cleveland, the mayor of that city will, as soon as the delegates return home, turn his attention toward defeating that candidate? The same would have been true had the Republicans held their convention in Dallas or Columbus. "I did everything in my power to see to it you people nominated your presidential candidate in my city, now I'll do everything in my power to see that the candidate you nominated here gets his (or her) butt kicked in November by my party's candidate. But thanks for coming!"
I know, it's all about the money the delegates will (hopefully) spend while they're in town. But the concept still bugs me. At least it gave me a political essay.
Hey, you forgot one!
In my email today was an invitation from a young lady named Tina to attend a performance of Voodoo Macbeth at a local theatre. If you should happen to be reading this essay in a theatre, please don't read it aloud. According to theatrical tradition, it is bad luck to utter the name of Shakespeare's famous tragedy inside a theatre. Tina, with whom I appeared in a play many years ago, is performing in Voodoo Macbeth. It was the brainchild of the genius Orson Welles (and Welles would've been the first one to tell you he was a genius) who staged the play, set in Haiti rather than in Scotland, in 1936. I wonder what Shakespeare would've thought of that.
I also wonder what the Bard would've thought of our version of his play. Inspired by Tina's email, I looked up Macbeth on the internet and found a lengthy treatise on Wikipedia. Macbeth has been performed all over the world, with many of the stage's legendary actors playing the roles of the tragic Scottish nobleman and his overly ambitious wife. The author, or authors, of the internet article that took me an hour to read (and that was skipping several parts that really didn't interest me) listed most of the actors who have trod the boards in Macbeth, but somehow missed two: Webster and Marn.
In the spring of 1968, my sixth grade teacher, Kathleen Sullivan, got her hands on an abbreviated version of Macbeth and decided her class should perform the play on the stage in Upson elementary school's gym. I assume it was an abbreviated version, although the information I saw on the internet today kept mentioning that Macbeth, in spite of its five acts, is among Shakespeare's shorter tragedies. . .barely half the length of Hamlet. Anyway, Miss Sullivan saw actors among her students and held auditions for the play's various roles. Fancying myself a thespian, I decided to shoot the works. If I was going to be in the play, I may as well audition for the lead role. I had an ulterior motive: Jean Swearingen, the girl I had a crush on, was, according to scuttlebutt, going to try for the role of Lady Macbeth. If I was cast as Macbeth and Jean was cast as my wife, we'd have to rehearse together, forcing me to talk to her, which I was otherwise much too shy to do. You'll notice, however, that I mentioned the names Webster and Marn, not Webster and Swearingen. I, somehow, was chosen by my classmates to play the lead role in our production. Jean, unfortunately, wasn't selected as the female lead. Shirley Marn was cast as Lady Macbeth. Curses! Jean wound up as one of the three witches, so we had two scenes together. I still managed to blow the opportunity.
Nowhere in the voluminous story on Wikipedia is there any mention of Miss Sullivan's sixth grade class's version of Macbeth, performed once for students and once for family and friends. I don't recall if it was in April or May. I do remember the fire alarm sounding during dress rehearsal and not being cold as I stood outside in my leotards, so it was probably May. The article noted that there were three actors whose 20th century portrayals of Macbeth were considered by theatre critics to be definitive. One of them was Sir Laurence Olivier. Mine wasn't mentioned. Neither was Shirley's portrayal of Lady Macbeth mentioned among the landmark theatrical achievements of the century. Maybe because she messed up one of her lines during the Banquo's ghost scene. I probably flubbed a few lines, too. But I can still recite the "tomorrow and tomorrow and tomorrow" monologue. I don't know what it means, but I can still recite it.
Jean and I wound up working in the same office 20 years later. I was still too shy to talk to her, which was okay since I had a crush on someone else by that time.
To quote the immortal Bard, "what's in a name?"
Many years ago, when my old college pal Bill's first child had learned to speak, he asked me what I wanted the child to call me. "Would you rather that he called you by your first name, or Mr. Webster?" Apparently "Uncle Gary" wasn't an option, since I'd only be seeing the kid once a year, when I made my annual summer visit to Boardman. I told Bill that my preference was being called by my first name, since I never have liked being called Mr. Webster. As the old saying goes, "Mr. Webster is my father." However, I left the final decision to my friend, since the child was his and I felt it was his prerogative as a parent to decide how his offspring addressed me.
"I think he should call you Mr. Webster," said Bill, and that was what Jonathan and Jennifer called me until Bill decided they were old enough to call their elders by their first names. Which never happened, because before they reached that age, Jonathan had joined the Marines and Jennifer had gone away to college, and I haven't seen either of them since.
I was reminded of this by a story I saw on the internet the other day. I usually avoid the stories I see on my computer's AOL home page, because they're generally sensationalistic stories about carnage and destruction, but this headline intrigued me. Under a picture of former president George W. Bush were the words "guess what he wants his grandson to call him."
"I'll bet it's Mr. President," I thought to myself and clicked on the headline to find out if I was correct. Actually, there'd be nothing wrong with that, since protocol dictates that all former presidents should be addressed as "Mr. President" even when they aren't president any more, although I'm not sure that applies to family members. I'd be willing to bet Rosalynn doesn't call Jimmy "Mr. President." I can guarantee Hillary doesn't call Bill "Mr. President." I'm not so sure about Laura, but I certainly hope she doesn't.
That brings to mind another anecdote. I read that President Kennedy and his brother, the attorney general, were alone in the Oval Office discussing affairs of state one day, with Bobby continually addressing the president as "Mr. President."
"For cryin' out loud, Bobby, it's me! Your brother!" responded an agitated president. "Can't you call me Jack when we're by ourselves?"
Responded Bobby solemnly, "no, Mr. President, I will not." How did that story get out when there was no one else in the room? Don't ask me. I just hope it's true, because I think it's a cool story.
Back to the business at hand. Guess what former President George W. Bush wants his grandson to call him, as soon as the kid is able to talk? He's only a year old as this essay is written.
He wants his grandson. . .and, I would assume, any subsequent grandchildren Jenna and Barbara may produce. . .to call him "sir." Not grandpa, or granddad, or gramps. He wants to be addressed by his grandchildren as "sir."
Maybe that's the way they do things in Texas.
Fortunately, future Bushes will not have to call their grandfather "sir." Laura and Jenna quickly quashed that idea, informing the former leader of the free world that his grandson would not be required to address his grandfather like Oliver Twist asking the workhouse master for more gruel.
If Bush is ever knighted by the queen of England, they'd have to reconsider.
What do they have that I don't have?
Aside from fame and fortune, I mean.
Recently, a Harris poll asked Americans to name their favorite books, in an effort to compile a top 10 list. As has been the case roughly since the time of George Washington, the favorite book of those responding to Harris's poll was the Bible. King James Version, I suppose, although the story from which this information was gathered didn't specify. The next nine books on the list have something in common. See if you can guess what it is.
2) Margaret Mitchell's Gone With the Wind. The fact that this epic tale of the antebellum South is the second most popular book in the United States some 80 years after it was published should serve as an inspiration to all frustrated writers, which would include about 99% of us. That's because one of the publishers Mitchell queried about her book, according to one of my writing mentors, responded with a note telling her not to bother sounding out any other publishers because "there's no market for Civil War fiction these days." I wonder if that guy. . .I assume it was a guy. . .could have been related to the guy who recommended that Decca Records not sign a rock-n-roll group called the Beatles because they, in his informed opinion, "aren't marketable."
3) J.R.R. Tolkien's Lord of the Rings. Although I haven't read the book or seen the film, I think it's safe to assume this is not a book about the jewelry business.
4) Harper Lee's To Kill a Mockingbird. My ninth grade English teacher, Mr. Padavick, assigned this book to our class. I couldn't put it down until the final page. I saw the movie, too. An absolute masterpiece.
5) J.K. Rowling's Harry Potter series. The survey didn't mention one specific Potter book, so most of the people responding must've read them all and couldn't decide which one they liked best.
6) J.D. Salinger's Catcher in the Rye. What's with these authors and initials? My ninth grade world history teacher, Mr. Paul, read Catcher in the Rye to our class. He didn't have us read it, he sat at his desk and read it to us. Forty-three years later, I'm still trying to figure out what Holden Caulfield had to do with world history. Maybe he was distantly related to a president or emperor or something.
7) Herman Melville's Moby Dick. I've never read this book, either, but I know it has perhaps the most famous opening sentence in all of literature. "Call me Ishmael," says the narrator. No wonder everybody who starts the book wants to know what happens next. They're dying to find out if the guy changes his name. I would have.
8) Louisa May Alcott's Little Women. The last time I read a book about women, I was in elementary school. It was a series of four books about a nutty family named the Fripseys, and the main characters were all girls. Nowadays, I confine my reading about women to the Sports Illustrated swimsuit issue.
9) John Steinbeck's The Grapes of Wrath. I didn't know people in Oklahoma grew grapes. Or maybe they couldn't. Maybe that was why they moved to California. And why they were so full of wrath. The people, I mean, not the grapes.
10) F. Scott Fitzgerald's The Great Gatsby. I couldn't figure out what was so great about Gatsby, or the book, when I was assigned to read it in college. But you have to admit, F. Scott Fitzgerald is a great name for an author. I'll bet his agent thought of it. The guy's real name was probably Ralph Jones.
What do these nine books have in common? First of all, none of them are mine, and secondly, they're all FICTION! For the life of me, I can't understand why people would rather read about some spoiled brat living on a plantation in Georgia, or some kid studying magic in England, or some maniacal sea captain with a funny name wasting his life searching for a big fish. Okay, I know a whale is a big mammal, not a fish. Why do people waste their time reading about stuff that never happened when they could read my books about stuff that did happen? I don't get it!
No wonder everyone in my writing group writes fiction except me. Maybe I should get with the program.
With sincere apologies to Gene Siskel, Roger Ebert, Gene Shalit and Rex Reed.
In the 13 years I've been writing these essays, I believe this is my first movie review. Just minutes ago, I completed a spring tradition by viewing my all-time favorite baseball film, 1949's It Happens Every Spring. I don't recall the first time I watched the film on the day before the Cleveland Indians' season opener, but it has become a rite of spring, and with the Indians opening the 2014 season on March 31st (in California, not in Cleveland, thankfully) I watched the film today.
If you've never seen the film, It Happens Every Spring is a humorous piece of fluff starring Oscar-winner Ray Milland as mild-mannered college professor Vernon Simpson. Quite accidentally, Vernon creates a liquid substance that is repelled by wood. Being a baseball fan, Vernon applies some of the liquid to a baseball and, sure enough, it avoids the wooden stick he tries to hit it with. Vernon manages to convince the St. Louis Cardinals (who are never referred to by their nickname, only by their city) to sign him to a contract, and, despite numerous trials and tribulations, wins 38 games during the regular season and three more in the World Series. At the end of the movie, Vernon has exhausted his supply of the magic wood repellent and catches a break when he fractures his pitching hand while catching a line drive bare-handed for the final out of the World Series. He returns to college and is put in charge of its new scientific research laboratory, and everyone lives happily ever after.
As much as I like the film, the script has more than a few quirks in it. For example, Vernon is in love with one of his students, Debbie Greenleaf, who just happens to be the daughter of the college's president. Debbie's crazy about Vernon, too. Wasn't that scandalous for the era? A college professor dating a student is still scandalous today, for Pete's sake! Yet Debbie's parents don't seem to mind, perhaps because, even though Vernon and Debbie discuss marriage early in the film, their relationship hasn't advanced beyond the handshaking stage. This is shown when they meet in a crowded courtyard between classes. They shake hands. THEY SHAKE HANDS! They're practically engaged and THEY SHAKE HANDS! Well, they were in public, and this was 1949. And another thing: Vernon is obviously old enough to be Debbie's father. That was glossed over, too. For the record, Milland was 42 in 1949 and actress Jean Peters, a native of northeastern Ohio who played Debbie, was an ingenue of 22. A college professor dating a student half his age! How'd that slip by? The censors must've been in the lobby buying popcorn when Vernon and Debbie are introduced to the audience.
The scriptwriter, Valentine Davies, includes the obligatory no-hitter pitched by Vernon, who uses the name Kelly to try to keep Debbie and her father from finding out what he's up to. Since Vernon has discovered a substance that is repelled by wood, meaning the ball jumps over the batter's bat, wouldn't every game Kelly pitched have to be a no-hitter? I'm just saying, that's all.
Davies also includes a "called shot" home run, a stunt Babe Ruth, according to legend, pulled in the 1932 World Series. In the bottom of the ninth inning of a scoreless game, Kelly's catcher, Monk Lanigan (played by Paul Douglas), promises Kelly that "I'm going to win this one for you." He takes a swing at the first pitch that would embarrass a Little Leaguer, then winks at Kelly and slams the next pitch over the wall for a game-winning and pennant-clinching home run. There are exactly two people waiting to congratulate Monk at home plate after he rounds the bases: Kelly and the batboy. He's just won the freaking pennant with a home run, and there are two people standing at home plate to congratulate him?? I guess players didn't get excited in 1949 like they do today.
I forgot to mention that, in Kelly's first game, the first batter he faces looks old enough to be my grandfather, and I'm pushing 60. There must have been a shortage of young actors in 1949.
Then there's the scene in which Vernon recruits two baseball players who are also students in his chemistry class to help him test his miracle concoction. He has the players join him on the school's baseball diamond at 5:00 in the morning, when it just happens to be broad daylight! There is no place in the lower 48 United States where it is broad daylight at 5:00 AM any time of year. No place! By the way, one of the students is a young Alan Hale, Junior, who later became the skipper of the ill-fated S.S. Minnow on Gilligan's Island. One of Hale's lines was, "Okay, Professor." How many times would he say that on the uncharted tropical island?
Flaws aside, It Happens Every Spring remains my favorite baseball flick, and now that I have viewed it for the umpteenth time, the 2014 season may officially begin. Play ball!
Could this finally be the year?
If you aren't a basketball fan, you needn't read this month's essay, although I wish you would. I certainly don't mean to chase away any of my loyal readers, assuming I have some. Actually, this month's essay isn't totally about basketball. It just starts out that way. It's kind of about math, specifically the law of averages.
About two weeks after this essay is written, the 2014 NCAA men's basketball tournament will begin. I don't mean to snub the women, but the topic doesn't apply to the women's tournament, where the development I'm about to expound upon has happened at least once.
In 1985, the NCAA, in its infinite wisdom, expanded the men's basketball tournament from 48 teams to 64. The tournament is divided into four regions, with 16 teams in each region, seeded from #1 to #16. The four teams seeded #1 are, in the opinion of the NCAA selection committee, the four best teams in the field. One of the four is most likely to emerge as the champion, although that doesn't happen nearly as often as it's supposed to. The four teams seeded #16 are, in the opinion of the tournament committee, the weakest teams in the field. If the tournament committee had its druthers, those four teams wouldn't even be competing, but the NCAA awards an automatic bid to the "big dance" to each conference's champion. The number one seeds play the number 16 seeds in the first round, so as to put the lower-seeded teams out of their misery early. Nice try, here's your check, thanks for coming, have fun on spring break.
In the 28 tournaments since the field was expanded, there have been 112 match-ups between #1 seeds and #16 seeds. The number one seeds have yet to taste defeat. In other words, no #16 seed has beaten a #1. In even other words, the #1 seeds' record against the #16 seeds is an unblemished 112-0.
Each year, when the tournament begins, I watch the first round games hoping that losing streak will end. After all, it's the American way to root for the underdog, and there aren't many bigger underdogs than the teams seeded #16 in the NCAA tournament. A couple of #16's have come close to pulling off the biggest upset in tournament history, and it has happened in the women's tournament. When a #16 finally defeats a #1, it will become the most famous team in college basketball history. Could that 28-year losing streak end this year? If there truly is a law of averages, then it should!
There's a reason I'd like to see a #16 beat a #1 this year. It's because if that can happen, I figure anything can happen. There are a number of highly unlikely, if not downright impossible, things I'd like to see happen, and if a #16 can beat a #1, then so, too, can these things come to pass:
For example, if Podunk State somehow manages to defeat Duke in this year's tournament, I'll believe that it will be possible that, some Friday night, I'll hear a knock at my door, and when I open it, Kate Upton, the Sports Illustrated swimsuit issue cover model, will be standing there and ask me if I'm available. For the whole weekend. And she'll be wearing a bright red Baywatch-style swimsuit. Not that I have anything against bikinis, but I like the swimsuits the lady lifeguards wore on Baywatch.
If a #16 can beat a #1, then I can be the first person to win that magazine clearinghouse sweepstakes without even entering. My publisher can get an order for 10 million copies of one. . .make that both. . .of my books at a royalty of three bucks a book.
Go Podunk State! Underdogs everywhere are counting on you!
Where's Eddie Snowden when I need him?
Someone, or maybe several someones, in high places across the big pond (that would be the Atlantic Ocean) obviously doesn't want you to know the information I'm about to impart. I know this because my computer absolutely refused to reveal more than the first few paragraphs of an article I planned to use as background. The first article I accessed, from a source in the United Kingdom, prompted two warning boxes to pop up on my computer screen, telling me that my computer was about to be infected with damaging spyware and adware if I proceeded any further. In spite of the dire warning, I tried to proceed anyway, only to have my computer flatly refuse! No matter what I tried to do, the same doggone warning boxes appeared. Apparently my computer is smarter than I am, which shouldn't come as much of a surprise, if you've read many of my essays.
Next, I accessed a website based right here in the good old United States. Unfortunately, the story about the financial plight of Great Britain's royal family (the Windsors) I found on the NBC news website irritated my computer. Each time I accessed it, a box popped up telling me that my internet access had been lost, and did I want to restore it? Heck, yeah, I wanted to restore it. I needed more information! But each time I commanded the computer to return me to the NBC website, as soon as the story showed up on my screen, the same box returned telling me the internet had stopped working. I tried it four times before detecting the development of a pattern. I've never been much for conspiracy theories, but somebody obviously doesn't want me to reveal to you the information I'm about to reveal to you, in spite of warnings of spyware and adware and a reluctant internet. Could it be England's equivalent of the National Security Agency?
Great Britain's royal family. . .that would be Queen Elizabeth, Prince Philip, Prince Charles, Prince William, Prince Harry, Kate, the Duchess of Cambridge (I think William's wife is the Duchess of Cambridge. . .I know she's duchess of something) and the infant Prince George. . .is down to its last one million pounds, which, given the rate of exchange on the day this essay is being written, is equal to 1.7 million U.S. dollars. I found out that much before my internet went A.W.O.L. on me. I also found out that the royals were paid $58 million by Parliament last year but spent $55 million of it. Or was it the other way around? That information was included in the first few paragraphs of the story the internet was willing to reveal to me. The story also said the extravagant spending is being blamed not on the Windsor family, but on their "courtiers." Since I couldn't scroll down to find out who these courtiers are, I have no idea. Are courtiers like the president's cabinet? Since they apparently specialize in spending money, maybe courtiers are like Congress. No, that's Parliament.
Excuse me if I find myself unsympathetic to the royal family's dilemma. Particularly if Parliament is going to appropriate another $58 million for them to fritter away this year. I have a problem with the whole concept of "royalty." I find it repulsive. Our forefathers specifically forbade royalty in the Constitution, and I don't understand our fascination with it. Nor the fascination of the British, especially if they're forking out 50-some million pounds in taxes each year to support their lavish lifestyle. There'll be no tears shed for the Windsors in this column. I could get along nicely on 1.7 million clams this year.
Maybe Congress will appropriate 1.7 million bucks for me so I can prove it.
Class, let me have your attention please.
Having refrained from overloading this column with political commentary in 2013, which wasn't an election year of national significance, I'll return to my roots as an essayist this year, which most definitely is. One-third of the seats in the world's greatest deliberative body (or so its members claim), the United States Senate, will be up for grabs in November. So will all 435 seats in the lower chamber of the federal legislature, also known as the House of Representatives (which, to its credit, has never, to the best of my knowledge, claimed to be the world's second greatest deliberative body). Here in Ohio, we'll be electing a governor. If the early returns are any indication, the incumbent Republican doesn't have to worry about keeping his job. Of course, a lot can happen in 11 months.
And that brings me to the topic of this month's essay: jobs, or the lack thereof in the current economy, and why this is so. I, myself, am all too familiar with the present miserable state of the economy, having been employed only part-time for. . .hmm. Now that I think about it, I don't remember how long I've been employed only three hours per day, five days per week. It's been at least 4½ years, since it was in May of 2009 that I began researching my first book, which was published in 2012 and which not nearly enough people have purchased to date. I do my research when people with full-time jobs are working.
It was, if I'm not mistaken, sometime in 2010 that the Commerce Department, or the Labor Department, or whichever government department is responsible for the economy, declared the mini-depression that began in 2007 had ended. We've heard a lot since then about the "jobless recovery," which is a giant family-sized crock of horse turds. There's no such animal as a "jobless economic recovery," just as there's no such thing as a dry lake. If it's dry, it's not a lake anymore, it's a hole in the ground. If there are no jobs, it's not a recovery. And since unemployment has declined only to seven-and-a-fraction percent, there haven't been nearly enough jobs created to replace the jobs lost during the mini-depression. That's why politicians on both sides of the aisle, whether they're running for national, state, regional or local office, will be touting their skills as "job creators" for the next 11 months. The most important job they want to create, of course, is one for themselves. . .on the public payroll.
President Obama has spoken often of not only creating jobs, but, in his own words, "good-paying jobs." There used to be plenty of these in our economy, until employers discovered during the mini-depression that they could improve their companies' bottom lines by firing the employees with good-paying jobs and replacing them with employees who'd do the very same jobs (though probably not as proficiently) for a lot less money. Once these new employees with low-paying jobs become old employees with good-paying jobs, they'll be fired and replaced by new employees who'll be willing to work for the same low salaries the old employees worked for when they were new, if not even less. This is what's known as capitalism.
Here's what Democrats understand but won't admit, and Republicans understand but don't dare admit: no one starts a business to "create jobs," especially "good-paying jobs." Henry Ford didn't start the Ford Motor Company to create jobs. John D. Rockefeller didn't start the Standard Oil Company to create jobs. Ford and Rockefeller started their companies to make Ford and Rockefeller filthy rich, and the way to do that was to hire only the employees they absolutely needed, and pay those employees the absolute lowest salary those employees would work for. The pie is only so big, and the more labor gets, the less management gets, and management doesn't like that. Professor Coleman taught me that in junior college almost 40 years ago.
Maybe I can follow in the good professor's footsteps and get a position as an economics professor. At least until people start buying my books.
Copyright © 2014 by Gary Webster