Thank You, Graham Kerr, The Galloping Gourmet

In an overcrowded kitchen drawer, I have an old-fashioned egg beater with those fascinating gears, a mandoline, metal and wooden skewers, a device to strip corn from the cob, a device to butter corn on the cob, a device to chop hardboiled eggs, instant-read thermometers, biscuit cutters, nutcrackers, picks to dig out lobster meat, and stuff shoved in the back I may not have seen for years. This resting place also holds a bench scraper with a six-inch rule and the name Graham Kerr, with the first name in a fancy script and the second in rigid capital letters. When I see it, I often think of the important role Graham Kerr had in my life.

The mother, except for cooking bratwurst, which the father did, prepared the family meals. I might watch her in the kitchen, but I did little, if any, of the hands-on work. However, the mother worked as a grocery store clerk, and I often got my own breakfast and sometimes lunch. This often consisted of two shredded wheat biscuits (bite-sized shredded wheat was not then within my knowledge) with whole milk and copious amounts of sugar, which I would eat while reading the stories of Straight Arrow, the heroic Indian, printed on the cardboard that separated the layers of the cereal in the box. As I got older, however, I learned that I could actually turn on the stove and place bacon and an egg or two in a pan. I could cook bacon and eggs, although “easy over” was often beyond me. In college, I bought a hot plate and a Woolworth frying pan and would make bacon and eggs instead of taking a trek to the student union for a snack. My roommates were amazed that I could cook an egg; I was amazed they could not. But my culinary skills had not advanced beyond this rudimentary stage.

I was tired of dorms and roommates when I went to law school, and I lived alone in a small but serviceable apartment with a small but serviceable kitchen. I did not have a meal plan and did not have enough money to eat out. I had no choice but to cook. I cannot conjure up what I ate on a daily basis except for one famous recipe of the day—a chuck steak sprinkled with a dry onion soup mix tightly wrapped in foil and cooked long and low. It was delicious even if a lot fattier than something I would eat now.

My knowledge of food and its possibilities was limited, but one day I came back from a Torts class and turned on the tiny, rabbit-eared, black-and-white television. After the minute it took to warm up, I saw the Galloping Gourmet, Graham Kerr’s half-hour cooking show. He was lively, pun-filled, and pretending to sip wine incessantly in front of a doting studio audience of what seemed like all middle-aged women, one of whom would be taken by the hand and brought to the front to eat with him at the end of the show. And judging by his expression when he cut into whatever he had cooked, wielding knife and fork in what was then that strange European style, the food was beyond delicious.

Kerr’s cooking, like that of other chefs of the era, before shelves were stocked with many variants of extra virgin olive oil (I have looked up many times what extra virgin means, but I always forget), used gobs of clarified butter (I did not know what that was then and wondered how you could make butter clearer) and cream. (Kerr’s life later took a turn towards the religious, and perhaps as a result, several years after the Galloping Gourmet ended, he reappeared on television doing what I think he called “minimax” cooking, where he took traditional dishes and showed how they could be made with less fat and fewer calories. He wasn’t as much fun.)

His enthusiasm was infectious. Not only did new tastes await, he was trying to convince us that we could make those untried dishes ourselves in our own kitchen, seemingly in a half hour. Even so, for months the show was merely entertainment for me, not instructional. I could not make hamburgers consistently the way I wanted to; I was not about to undertake steak Diane or however he cooked lamb chops, which I had never eaten. Then one day the Galloping Gourmet made chicken Kiev. “I could make that,” I thought, and dared myself to do it.

I don’t remember every ingredient or step, but the basic concept was to make a flavored butter with parsley and maybe garlic, pound a chicken breast to make it thin (I certainly did not have a meat pounder but probably used the bottom of a frying pan—much kitchen equipment is not really needed), wrap the butter inside the breast, roll the wrap in an egg wash and bread crumbs, and fry the package (or perhaps fry until it is sealed and then bake it). My result was a beautiful golden brown as Graham said that it should be, but he stressed that chicken Kiev was only properly made if the butter had not escaped during the cooking and first came out when you cut into the finished product. I was a little nervous as I was poised above my creation with a fork and a steak knife gotten with a fill-up at a Shell gas station. The knife plunged in, and butter flecked with parsley green spurted out, drenching the chicken. I was excited and proud. (OK, you Freudians may want to make something out of breast, spurting liquid, and excitement, but you have a dirty mind. Sometimes a breast is just a chicken breast.) Thanks to Graham Kerr I could cook, and a new life opened for me.


Dave Barry in 1991 wrote that to “see the humor in a situation, you have to have perspective. ‘Perspective’ is derived from two ancient Greek words: ‘persp,’ meaning ‘something bad that happens to someone else,’ and ‘ective,’ meaning ‘ideally someone like Donald Trump.’”

I was surprised that my friend asked me the same trivia question that he had asked only a week earlier. Of course, he should have been surprised that I did not remember the answer.

“How comes it that our memories are good enough to retain even the minutest details of what has befallen us, but not to recollect how many times we have recounted them to the same person?” La Rochefoucauld, Maxims.

What song is stuck in your mind right now? I keep hearing “rollin’ on the river.”

A short, informative piece of writing is known as a “primer.” The pronunciation of that word is changing. It was once said with a short first vowel—prim-ur. Now it is frequently pronounced with a long vowel—pry-mer—the same as if it were a first coat of paint. Should we care? And can we talk about the pronunciation of “reprise”?

I would like my pre-pandemic life to return so that I might see signs again like I saw chalked on a board in Times Square: “Special. 2 drinks for the price of 2 drinks.”

I did not hear much mention during the campaign of the fact that for the majority of us, tax increases kick in next year. These were built into the Trump/Republican tax “cut” a few years back. But don’t worry if you are rich. Your tax decrease remains. There is generally good news for the rich.

Even if you do not think highly of Donald Trump, perhaps you should give him some credit for having children who seem so loyal to him.

“You can fool too many of the people too much of the time.” James Thurber.

Could everyone stop calling our country a “democracy”? It isn’t and never has been.

Could we stop worshiping our Constitution? It is important, but it is not and never has been perfect.

Could we get athletes to stop thanking the Almighty for their performances? I would just once like to hear an athlete say, “I was playing a great game until God made me fumble.”

Take Another Little Piece of My Heart Now, Baby

This will be posted the day after election day, but I am writing this before the votes have been counted. I have tried to be active in the election. I have voted, but I have also spent election day doing voter protection work. I am concerned about the results, but a personal matter is weighing on me. Today, November 4, I am off to the hospital.

I was getting short of breath. This was hardly surprising. I am old. (I prefer the tautology, “I am not as young as I once was.”) I am overweight. (I prefer, “I am not as thin as I ought to be.”) I am not as active as I pretend to myself. (I know that the seventeen steps—yes, I have counted them—from my bed to the toilet, no matter how urgently traversed three or four times during the night, do not count as aerobic exercise.)

However, this seemed more than the usual shortness of breath. (I had told the elegant Elspeth that it was no surprise I had been more out of breath recently because I had been around her more often. Instead of the flirtation I had hoped for—a brief, indulgent flirtation is the most I can hope for these days—she merely looked down and fiddled with her phone.) On the fourth doctor’s visit—three in person and one by teleconference via Skype after I failed to link up three other ways—my cardiologist (you are of a certain age and condition when you refer to your personal cardiologist) diagnosed a heart flutter. (I told the beautiful Beulah that it was no wonder that my heart was fluttering because I was running into her more often. She did not even look up from her phone.)

The doctor, trying to be reassuring, said this was “repairable,” as if I should be pleased to be lumped in with a slowly leaking tire or a mishung picture. He talked about medicines. A new one or two to take and increased doses of one already part of the regimen. (He tells me that I cannot count taking a slew of pills as aerobic exercise. Perhaps I need a new doctor.) And then I heard him say that he would send me to someone else in a few weeks who would “shock the heart.” Say what?! He said this close to Halloween, but he was not speaking metaphorically. A ghoul dripping fake blood would not leap out from behind a curtain with a bloodcurdling scream. Oh no, he was being literal. He said that something more modern is now used, but it is akin to the old doctor shows or Frankenstein movies where paddles are placed on the chest, someone yells, “Clear!”, the body is zapped and jerks about, and everyone waits expectantly to see if the heart beats on its own. Except in my case, the heart is already beating, just “fluttering,” and the goal is to reset the heart rate to its more usual 64 beats per minute instead of its present elevated state. I have rebooted a computer and a cable box many times. Now the goal is for the medical team to reboot my heart. That is today.

I am assured that I will be in a “deep sleep” for the procedure, but when (if?) I wake with its successful completion, I plan to tap dance and sing “Puttin’ on the Ritz.” Maybe I will finally be able to carry a tune as well as Peter Boyle.

Election Day

If you have not already done so, vote tomorrow. If you can, help someone else to vote. And then try to Aaron-Rodgers it and relax. Let whatever will be the equivalent of chads settle, the debates about not-completely-filled-in circles resolve, and the long lines in some precincts dissipate. Avoid rumors. If we are lucky, in a few days we can talk meaningfully about the country’s future, but today and tomorrow are not those days.

The Shortsighted Electoral College (concluded)

The major effect of the original Electoral College was not to give power to the small states but to the slaveholding states. Madison had said that a direct presidential election was “fittest” but it would harm the South, citing the more “diffusive” franchise in the North, but the Virginian slaveholder continued with the curious comment that with a direct election the South would “have no influence on the score of the Negroes. The substitution of electors obviated this difficulty. . . .” The “difficulty” was avoided by basing the number of electors on representation in the House of Representatives. The apportionment of the House, of course, incorporated the three-fifths clause where that percentage of slaves was used in the allocation of House seats.

The three-fifths clause was, therefore, incorporated into the Electoral College, giving extra power to the large slaveholding states. The first census in 1790 found that New York had a free white population of 313,000 and North Carolina had a free white population of 289,000. Each state had the same number of electors, however—twelve—after that first census. While New York had 21,324 slaves, North Carolina had 100,572. South Carolina had a free white population of 139,000, but New Jersey had thirty thousand more. Even so, South Carolina had twelve electors and New Jersey eleven. South Carolina had 107,094 slaves and New Jersey 11,423. (New Jersey is the starkest example of why Madison feared the effect a direct election would have on the South. Even if the franchise had been equally distributed in South Carolina and New Jersey, New Jersey with its larger white population no doubt would have had more power in picking the president; if the turnout was equal, New Jersey would have had about 20% more votes than South Carolina. But as Madison had to know, New Jersey then allowed women to vote, and its total vote might have been twice that of South Carolina’s. With the Electoral College as adopted, even though South Carolina had the smaller white population, it had more power in the presidential selection than New Jersey.)

Virginia had a free white population of 441,000; Pennsylvania had 422,000, about a four percent difference. Virginia had 292,627 slaves and Pennsylvania had 3,731. Even though forty percent of Virginia’s population could not vote, Virginia had forty percent more electors than Pennsylvania—twenty-one to fifteen.

A direct vote for President would have lessened the power of the South; instead, the Electoral College as adopted magnified it. The Founders recognized and said that large states would dominate the vote in the Electoral College and that Southern states would have special influence in picking a President because of the peculiar way in which slaves were counted.

Contrary to what some people now claim, the demigods of 1787 did not protect small states via the Electoral College, and their sop of requiring electors to vote for two people, with one not from the elector’s state, proved to be a laughable protection. The Framers in adopting the Electoral College did not foresee the rise of political parties, even though parties were in place only a few years after the Constitution was adopted and were evident in the first contested presidential election, after Washington retired in 1796.* By then, two men ran as a team, with one running for President and the other for Vice-President. The country made it through 1796 without a major problem, but the Electoral College caused a crisis in 1800.

Thomas Jefferson and Aaron Burr ran as a Republican team in the presidential election. The widespread understanding was that Jefferson was running for President and Burr for Vice President. John Adams, the Federalist incumbent, ran with his vice-presidential running mate Charles Cotesworth Pinckney against Jefferson and Burr. Jefferson got seventy-three electoral votes to Adams’s sixty-five, making Jefferson the apparent victor, but of course, because each elector had two votes, Burr received the same number of electoral votes as Jefferson. It was a tie, not foreseen by the Framers but close to inevitable with the rise of political parties.

The selection of the President in 1800 went to the lame-duck Federalist-dominated House, even though the Federalists had lost the election. That losing party had to decide which Republican, Jefferson or Burr, was the lesser evil. Thirty-six ballots later, Jefferson became the third President. And we got the Twelfth Amendment to fix this major flaw. That Amendment required electors to cast separate votes for President and Vice-President.** At least when it came to the Electoral College, the Framers did not see very far at all.

Remember this whenever someone suggests that the Framers were infallible or that the Constitution is a God-given document. And remember that the original Constitution gave the major slave-holding states the dominant power in picking the President.


*The Framers also did not foresee that electoral votes would be allocated by a winner-takes-all approach where the candidate with the most votes in each state would get all of that state’s electoral votes. That development, however, did not come quite as quickly as the rise of political parties. In 1796, even though Jefferson won the most votes in Pennsylvania, Virginia, and North Carolina, one elector in each of those states voted for John Adams instead and those three votes made Adams president. He received 71 electoral votes to Jefferson’s 68. Jefferson received the second most votes. (Adams’s running mate, Thomas Pinckney, garnered 59 electoral votes.) Under the electoral system then in place, Jefferson became Vice-President under his political enemy, Adams, an uncomfortable result.

**Elections might have been more fun if we still had the original electoral scheme as indicated by Alexander Hamilton’s devious actions in 1796. Although Adams and Hamilton were both Federalists, Hamilton did not want Adams to become President. Supposedly Hamilton approached electors in states Jefferson had won and urged those electors, after voting for Jefferson, to give their second vote to Thomas Pinckney. Hamilton was hoping that Jefferson-Pinckney votes plus Adams-Pinckney votes would give Pinckney the most electoral votes and the Presidency. Hamilton’s machinations seem to have borne some fruit, most notably in South Carolina where both Jefferson and Pinckney received eight electoral votes. The scheme failed because in a number of states that Adams won, the electors divided their second votes between Pinckney and other candidates or did not give any second votes to Pinckney. For example, Adams received nine votes in Connecticut, but Pinckney got only four, with five votes going to John Jay. New Hampshire gave six votes to Adams, but none to Pinckney. Pinckney received twelve fewer electoral votes than Adams. But think of the gamesmanship we might have if this original electoral edifice still existed.

The Shortsighted Electoral College

 With a presidential election looming, it should be a good time to examine again the efficacy of the Electoral College, but if the electoral vote follows the popular vote this time, the topic’s urgency will dissipate. There is, however, another good reason to consider the Constitution’s original electoral system. The insertion of Amy Coney Barrett onto the Supreme Court has made many think again about our Constitution and how to interpret it. A strain of constitutional interpretation regards the original men who framed the Constitution as so sagacious and farsighted that their constructs of 1787 are still perfect for us now. Some believe that God inspired the Constitution.

The Framers did write an amazing document. The governance it started continues in a form still somewhat recognizable from that of 1789, an extraordinary achievement. Nevertheless, an examination of the Electoral College the originators adopted reveals their foresight to have been quite limited. We should remember these limitations when some seek to deify the Framers and the Constitution.

After reading some contemporary comments suggesting that the point of the Electoral College was to preserve the powers of the small states so that the large states would not dominate the presidential selection, I pulled out The Records of the Federal Convention of 1787 edited by Max Farrand and The Federalist Papers to see what these sources said about the method of selecting the president. The issue was debated again and again in the Constitutional Convention of 1787. The delegates would agree to a method, but potential flaws in that selection process would circulate. A different scheme would be proposed and problems with the new proposal would be pointed out. This merry-go-round continued until near the end of the convention when the delegates finally settled on the Electoral College as it appears in the original Constitution.

The convention first voted to have Congress choose the President, but criticisms soon emerged. In James Madison’s words: “If the Legislature elect, it will be the work of intrigue, of cabal, and of faction: it will be like the election of a pope by a conclave of cardinals; real merit will rarely be the title to the appointment.” Foreign governments would try to influence Congress in the selection of the President because they would think it important “to have at the head of Government, a man attached to their respective politics and interests.” In addition, a basic goal of the Constitution, the separation of powers, would be compromised because the President would be beholden to Congress for his selection. Moreover, as Alexis de Tocqueville, the astute observer of the United States, wrote in Democracy in America forty years later, Congress, chosen to make laws, “would represent but imperfectly the wishes of the nation in the election of its chief magistrate; and that, as they are chosen for more than a year, the constituency might have changed its opinion in that time.”

This and many other methods were proposed and rejected: The state governors should select the President; electors selected by Congress should make the choice; electors drawn by lot from Congress should choose the President.

Madison said that the “fittest” way to select the President was to have a direct election, but he then noted two problems. “The first arose from the disposition of the people to prefer a Citizen of their own State, and the disadvantage this would throw on the smaller States.” Madison did not find this problem insurmountable and said “that some expedient might be hit upon that would obviate it.” The next speaker, however, differed with Madison’s optimism by saying, “The objection drawn from the different size of the States, is unanswerable. The Citizens of the largest states would invariably prefer the Candidate within the State; and the largest States would invariably have the man.” The delegates thought that a direct election would prejudice the smaller states, but what concerned them was that candidates from small states could not get elected because the parochial electorate in the large states would favor candidates from their states and those large-state votes would overwhelm the candidates from small states. (Reminder. In the last presidential election, Trump was then a lifelong resident of a large state, but New York overwhelmingly voted against the hometown boy. Perhaps the Founders were not familiar with the adage, “Familiarity breeds contempt.”)

Madison also maintained that a direct vote would undermine the South. Many northern states had eased the traditional requirement that only white male citizens who owned property could vote by allowing white males who paid taxes also to have the franchise, and in New Jersey, even women had the vote. Madison recognized that the “right of suffrage was much more diffusive in the Northern than the Southern States.” A higher proportion of people in the North could vote than in the South, and the South’s power would be diluted by a direct election.

Madison and others maintained that an electoral college, however selected, would obviate some of the concerns of a congressional selection. The electors would be chosen for only one purpose and would meet just once, and in the adopted version, not meet together but in the separate states so that there would be little opportunity for cabals, intrigues, and foreign influence.

An electoral college, however, does not necessarily alleviate the small-state concerns. Today many see the founders protecting the small states by giving them a slightly greater number of electors than is justified by their populations. The founders, however, addressed the small-state problem in a different way. The concern was that a candidate from a small state, even if worthy, would inevitably lose because the large-state electors would vote for one of their own. The solution: each elector would vote for two people, one of whom must not be from the elector’s state. The delegates thought that while one vote may go to someone from the home state, the second vote would be for the person seen as the best in the rest of the country, and if that person was from a small state, he could be elected with a collection of second-choice votes.

The Founders added another “accommodation to the anxiety of the smaller States,” as Madison wrote in a letter in 1823. If no person got a majority of the appointed electors, then the House of Representatives would choose the President from the five highest on the electoral list with each state having one vote. The largest and smallest states would be equal in this process, which, according to Alexander Hamilton in The Federalist Papers, would be “a case which it cannot be doubted will sometimes, if not frequently, happen.”

That Senators as well as Representatives were included in determining each state’s electors may seem to have been a major protection of the small states, but the delegates knew that the large states would dominate the Electoral College. Luther Martin, writing to the Maryland Legislature after the draft Constitution was promulgated but before it was adopted, said that the “large states have a very undue influence in the appointment of the President.” Gouverneur Morris, a delegate to the Constitutional Convention, writing in 1803, noted that it was recognized that the large states would dominate the Electoral College. Only if the matter went to the House of Representatives did the small states have a substantial voice in the presidential selection.

(concluded October 30)


I was waiting for an angiogram in a room divided into cubicles by curtains. I could hear the guy next to me chattering, not to me, but to the nurse, who was taking his history. When the guy learned that the nurse was a Filipino, he became more voluble because his sister-in-law was a nurse born in the Philippines. I and anyone else in the room learned how his sister-in-law had worked in New York City at Bellevue Hospital but that she now worked at Stony Brook Hospital on Long Island. She had married into an Italian-American family, and she loved cooking Italian food. He exclaimed proudly, “You wouldn’t believe the spread she puts out on New Year’s Eve. We all go to her house. The food is so beautiful, and she makes so many dishes.” When he went off for his procedure, I was left in relative silence for an hour or more before I was wheeled off, but my neighbor-patient’s comments continued to ring in my ear. They made me feel better about America.

New York City, along with several other jurisdictions, was named an “anarchist jurisdiction” by the Trump administration in an effort to withhold federal funds. But I also hear from conservatives that NYC has confiscatory taxes to support its oppressively big government. Anarchy, big government . . . if words have meanings, both terms can’t apply. Pick one epithet, not both.

The controversy over the elevation of Amy Coney Barrett to the Supreme Court brought about hopes and concerns about the future of Roe v. Wade as well as the future of same-sex marriage and other LGBTQ rights. The future of the Affordable Care Act also hangs in the balance. This has given me pause. I know that the Supreme Court will be considering a case about the ACA, but even though I have some understanding of constitutional law, I do not know the reasons that suggest that Obamacare is unconstitutional. Do you? I don’t believe many people do, but I know that many desire its end. Why? Nearly all the complaints lodged against the healthcare law are not true. (You can check them out! Go online.) I assume that nearly all of the Republican and conservative elites know that the attacks on the ACA are canards, but they still act as if the foundations of society are crumbling because of the law. Why the adamant opposition? A recent analysis by the Congressional Budget Office may reveal why the Republican establishment wants the end of the Affordable Care Act. Tax increases designed to fund Obamacare are concentrated on the top one percent, but its benefits are spread widely among the bottom 40% of income earners. Thus, the ACA produced an income increase of 3.6% in the bottom income quintile and a 3.2% income increase in the next higher income quintile. The middle quintile saw a 0.5% income increase, with minor income increases up the income scale until we get to the top 1%, where there was an income drop of 1.2%. Perhaps such income redistribution, above all else, explains why Paul Ryan, Trump, Mitch McConnell, and others wish to rid our country of that pernicious Affordable Care Act.

Real Americans I know have taken their six-year-old trick-or-treating. Real Americans I know have at least tried to carve a jack-o’-lantern with their kids. And Donald J. Trump?


(Guest post from the Spouse)

Recently, visiting a group of friends, I mentioned that I thought one of our other friends (who was absent) was “opinionated.” A collective smirk went around the room. And then someone said (jocularly, but kindly), “And you are NOT opinionated?!?” This surprised me as I have never thought of myself as “opinionated,” and I said as much. Scoff, scoff. Smirk, smirk. Hmmm…. So I started thinking more about what I meant by the term “opinionated.”

Opinionated to me does not mean “stuck in one’s habits.” If one routinely has morning coffee at 8:50 and takes a walk at 9:30 and doesn’t like those habits interrupted, that is not being “opinionated”; that’s being an old fuddy-duddy who doesn’t like her routines disturbed. Irritating enough, but not falling within the realm of “opinion.”

I realized that by “opinionated,” I meant “firmly entrenched in an idea for which one has no evidence.” In its worst examples: X holds the “opinion” that the majority of black men are deadbeats who run out on their families, deal drugs, and mug white women on a regular basis. Data? No. Y says that such a stereotype is unfair and biased. Data? Yes. X holds the “opinion” that Covid-19 is a hoax; Y says that the evidence supports an alternative narrative. In both instances, I find X is “opinionated” while Y has opinions, but they are based on data. One further example: I have no trouble holding the “opinion” that our current president is a threat to democracy and civic order. Given world enough and time (see Marvell, below), I can point to at least 421 million cogent reasons why this is so.

And so it is that well-meaning people can agree (I hope) on the need for certain sorts of “opinions” to be backed up by evidence.

However, this sort of fact-based reasoning becomes more nebulous in the realm of art. Chacun à son goût, and all that, but I believe…let me rephrase that…it is my opinion (the source of which is obscure to me, though see footnote*) that a literate person should make an effort to enunciate a reason for their goût. I like to think that my “opinions” on such matters are grounded in something that is akin to evidence (it’s “in the text” or, less convincing, “the author has said so in interviews”). However, pin-pointing/articulating such evidence is often a subjective exercise based on comparisons to past reading experiences and to the emotions elicited by them. Art, after all, is not a science. You say passage X is lyrical, beautifully descriptive, and moving. I say it is prosaic, pretentious, and off-putting. It’s the same passage! How can we read it so differently? Which one of us is “opinionated”? Which one of us is accurate?

Many who have grappled with the definition of “art” have fallen back on the explanation that “art” is whatever survives the test of time. Fine for Greek temples, Beethoven, Rembrandt, and certain English novels of the 19th century, but what are we to make of last Sunday’s New York Times Book Review? Many of us fall back on the collective opinions of others (The National Book Award committee, Michiko Kakutani, and Reese Witherspoon can’t be wrong, can they?). But aren’t our own educated opinions as good as theirs?

Perhaps it’s okay for us to have different opinions about “art.” It’s certainly okay if each of us experiences the world differently. But I see that it’s not okay for me to experience the world in so different a manner that I cannot empathize with another’s point of view…

however wrong it may be.


* When I was a junior in college – and a newly-minted English major – I took a seminar on the Metaphysical Poets (Donne, Herbert, Marvell, etc.). It was well beyond my reading sophistication, but I needed some English lit credits. We were asked to write an essay on (I think) Marvell’s poem “To His Coy Mistress.” To me it was banal; it sounded like a flowery Hallmark card, and I had the temerity to write as much. It was stupid on so many fronts that it’s almost hard to write about it. When the professor called me in to talk about the generous “C” I got on the paper, he basically said I didn’t know how to read, and he was right. My reading experience had been too meager to appreciate the subtleties and ironic joys of Marvell’s poem. I am happy to say that now, after an additional 50 years of reading, I experience the poem with delight. However, that “teachable moment” taught me that sometimes someone’s (including my own) goût is totally misguided.


I went to the doctor for a flu shot. When I made the appointment, I also said that I was concerned about shortness of breath. When I saw the doctor, I also told him that I had what is commonly known as “trigger finger” and about a recurrent pain that might be sciatica. Then the doctor said that he called this a “manly” visit. A woman, he said, would have come to him separately for each issue when it arose. The man, instead, decides to get a flu shot and then thinks, “I am going to the doctor. What else should I ask him about?”

One fury has God found inexpungeable:

The wrath of a woman who finds herself fungible.

                    Willard Espy

Donald J. Trump does not have a pet, but there must be a professional dog trainer in the White House. Mike Pence responds to the command “Heel!” better than any hound I have ever seen.

There were four million people in the Colonies, and we had Jefferson and Paine and Franklin. Now we have over three hundred million, and we have Trump and Pence. What wisdom can you draw from that? Darwin was wrong.

If ignorance is bliss, why does Trump seem so angry and unhappy?

Does this story have applicability today? A woman supposedly said to John Maynard Keynes that she wondered what David Lloyd George was like when he was alone in a room. Keynes responded, “When Lloyd George is alone, there is no one there.”

Perhaps this phrase ascribed to a soldier in Iraq applies to our country today: “So screwed up it was like pasting feathers together, hoping for a duck.”

Pat Paulsen, when he “ran” for President, said, “Issues have no place in politics. They only confuse matters.” I wonder, however, if he would still say, “The current system is rigged so that only the majority can seize control.”

The license plate holder on a nice-looking Genesis registered in Florida said: “Beautiful” Naples, Florida. I wondered about the quotation marks. Is it ironic or facetious to call Naples “beautiful”?

“If at first you don’t succeed, destroy all evidence that you tried.” Newt Heilcher

“If at first you don’t succeed, try, try again. Then quit. No use being a damn fool about it.” W.C. Fields

I am trying to expand my vocabulary, so I am going around saying, “The president’s rodomontade is rebarbative.”

“It’s better to keep one’s mouth shut and be thought a fool than open it and resolve all doubt.” Abraham Lincoln

Let’s Get Women Off the Supreme Court

Dear Loyal Readers Who Noticed That I Did Not Keep My Usual Posting Schedule Last Week,

Last Monday I posted a longer than ordinary essay about the Supreme Court nomination of Amy Coney Barrett. Because of its length, I had planned to skip my usual Wednesday post and resume this blog on Friday, but that day passed, too, barren of my wit and wisdom. You might assume that I was so wrapped up in the Senate hearings that I did not get to the keyboard. I wish that were so, but instead, some health issues had me in doctors’ offices where lasers zapped my eyes and other machines found additional problems with this aged body. In what was meant to be reassuring, the doctor said that the new problems were “repairable,” and the repair strategy, which apparently does not require the copious use of duct tape, is under way, but it all took up some of my time.

Even so, I still had many moments when I could have watched the hearings. Mostly I avoided them, expecting them to be as predictable as the Perry Mason reruns on MeTV, and I gather the Senate proceedings held few, if any, surprises. In the half hour I did watch, Barrett stated that her constitutional philosophy was not to place her own values into the Constitution or to seek the original intent of those who drafted the Constitution but, as other conservative judges now say, to apply the original public meaning of the document’s words. The Constitution, she said, does not evolve but, apparently, remains frozen in the eighteenth century. To her this is necessary so that judges will be neutral and not constitutionalize their individual values and views. (I have previously discussed this thinking on this blog in “We, the People of the United States,” posted July 26, 2018, and “Originally It Was Not Originalism,” posted August 22, 2018.)

Although I did not hear her use it, her explanation reminded me of Chief Justice John Roberts’s oft-mocked metaphor that judges should be mere umpires keeping their personal predilections at bay. The contention is that judging can and should be mechanistic. Moreover, rulings that use the standard of original public meaning are desirable because such meaning can be objectively determined.

My mind went whirring into the future. Twenty years from now our president is Philip K. Dick III, a sports fan. He notes in 2040 that tennis matches have long abandoned human officials for line calls, using machines instead. Baseball now registers balls and strikes without a human umpire, and footballs have chips implanted so that forward progress at the end of each play can be automatically recorded without the rather slapdash procedures of line or side judges in days of yore. Referees and umpires have moved beyond human judgments, and Dick remembers John Roberts’s words that Supreme Court judges should be like umpires. (Roberts, a mere eighty-five, is entering his thirty-fifth year of Court service.) Therefore, when Stephen Breyer dies at the age of 102 after forty-six years of service as an Associate Justice, Dick nominates a computer — which has had the Constitution, all court decisions, all dictionaries, all necessary history, and anything else that could be relevant to court decisions placed in its memory and which has been programmed to make decisions using these materials — to fill the Supreme Court vacancy. President Dick states that this will eliminate the dangerous human element from constitutional interpretation. Arnold, as the device is named, is ready to take the “seat,” but a cry goes up that Dick cannot do this. The Constitution does not allow the president to appoint non-humans to the highest court. (My imagination cannot discern the source of the cries, but presumably they don’t come from the conservative wing of the Supreme Court, consisting of Clarence Thomas at the age of 92, Samuel Alito at 90, Brett Kavanaugh at 75, Neil Gorsuch at 73, and Amy Coney Barrett at a spry 68, who all claim that they mechanically interpret the fundamental laws without invasion of human emotions.) References to Caligula are made, but a horse is a horse, of course, and Incitatus was never actually made a consul but merely a priest.
This is the United States Supreme Court, Dick says, and it is different. Human judgment should be removed from judging, as the conservatives maintain. Justice Arnold could make decisions without emotions and biases and, therefore, is better suited for the Court than any mere human.

The humans pull out their vest-pocket-sized Constitutions and flip pages to find the controlling text: The president “shall nominate, and by and with the Advice and Consent of the Senate, shall appoint . . . Judges of the supreme Court. . . .” (We seldom notice that the Constitution does not give the president the power to appoint Supreme Court judges. The president nominates and with the Senate appoints them. The president and the Senate jointly appoint the Supreme Court.)

All sorts of linguistic tools have emerged that can be used to show how words were used in the constitutional era, but I have only bothered to look at Noah Webster’s dictionary, the compilation of which started much earlier but was first published in 1828. It says that a “judge” is a “civil officer who is invested to hear and determine” civil or criminal causes. Webster defines an “officer” as a “person commissioned or authorized to perform any public duty.” There we have it. A person. With the original public meaning, a judge in the constitutional sense is a person, and Arnold is out. (Of course, much modern constitutional law depends on the legal fiction that a corporation is a person, but that is a story for another day.)

But now the original public meaningers look a little further. Webster states that a judge is a civil officer who decides causes “according to his commission.” His. Does this word include both men and women? Not according to Webster, who defines “his” as the “possessive of he,” not “he or she.” By this analysis, a judge within the meaning of the constitution is not only a person, but a male person with a commission. People now realize that the original public meaning of “judge” in the Constitution means a man. A third of the Supreme Court must go.

Of course, the framing generation could not have meant a non-human as a Supreme Court judge. Cyborgs were not on their radar (and, of course, radar was not on their radar in 1789). But neither was a female judge. That generation did not consciously reject women as judges; the possibility, as with non-humans, never occurred to them. Lawyers were men, and so were judges. (Some Framers may have thought of that woman lawyer, Portia, but surely they knew that in The Merchant of Venice the lying Portia came disguised as a man, Balthazar, claiming, without basis, to be a “doctor of law.”)

The original public meaning of judge in the Constitution meant a man. Shouldn’t the conservatives on the Supreme Court today read the word as it was meant in 1789?