Being Mocked for Mispronunciation

An aphorism did the rounds recently, and it struck a chord:

“Never make fun of somebody if they mispronounce a word. It means they learned it by reading.”

That’s been me all my life: mispronouncing words discovered exclusively through books. Sometimes the error has been pointed out through a gentle correction given quietly after the fact, or as part of a shared laugh over the gaffe, but not always.

The most unkind instance was when I was in law school. It was a constitutional law tutorial, compulsory and deadly boring. The word was “quorum”, which means the minimum number of members that must be present to start a valid meeting. It is pronounced kw-or-mm. I pronounced it kw-ere-mm, for I suppose I had never heard it spoken, having skipped the relevant (but mind-numbingly boring) lectures.

Many of my classmates snickered. A tall, blonde, athletic boy from a grammar school, who to this day makes me think of the Hitler Youth, snickered the loudest. Since I had to repeat the word many times in my little presentation, it got to the point where I stopped and asked the group: “What is so funny?”

The young Hitlerite decided to act as group spokesman. He took great pleasure in telling me that I was mispronouncing the word, and finished by asking if I had any sort of education whatsoever.

I glared at him, my fists balled, and I contemplated—quite seriously—leaping across the desk and punching him in the face, as would be voted appropriate by a quorum of Nannupians or public school ruffians of my ilk. But I knew it would cause more problems for me than for him. Indeed, I would have been ejected from the law degree I so loathed. Then the tutor interjected, and that was that.

That blonde boy was later accused of rape. Not a joke.

What’s the point? Well, it’s obvious. You can quietly and humbly correct someone’s pronunciation, and that is a kindness. But if you mock them to make them feel small and yourself big—then you are scum. I’m pretty sure that’s what the aphorism was trying to say.

The Great Spanish Coffee Swindle

Spaniards drink a lot of coffee—and I love their café cultura, it is central to life—but, Christ, is their coffee rough! Not every taza in every cafetería in every ciudad, but the overwhelming majority of coffee experiences in Spain are trials to be endured, the flavour bitter and burnt and retaining little that is recognisably coffee. And I just found out why: a diablo by the name of torrefacto.

Any half-decent coffee has this in common: it is 100% coffee. Torrefacto coffee is only 80% coffee, or less. It’s like the way Australian chocolate is not all chocolate; I mean, there has to be some allowance for a percentage of cockroach, right?

Torrefacto refers to the process whereby sugar is added as the coffee beans roast. This is a Spanish practice going back about 80 years, when sugar began to be added to the roast for one very special reason: it gave the final product a long shelf life (by embalming the beans in a caramel sarcophagus) and ensured no wastage. An added bonus was that the cheaper sugar bulked out the coffee, like the way the British pump their meat full of water.

The problem is that torrefacto coffee tastes like … how would the Spanish put it … mierda, caca, popó? Which is to say, like someone has very carefully coated some green coffee beans with liquid faeces and then roasted them to a crisp, ground them, prepared them with an espresso machine and served the result with a snappy vale. Oh, and apart from the fact that it tastes like shit, it may also be cancerous (the lesser concern here).

So, in short:

  • Coffee in Spain normally means torrefacto coffee → bitter, shit-flavoured coffee.
  • When it isn’t torrefacto coffee, the beans in Spain are still burned to a dark char → bitter, shit-flavoured coffee.

Silly Spain.

Loco Spain!

Ahh, Spain …

The Professional Sceptic

*A version of this article first appeared on Stumbling Through the Past.

The historian is two things above all else: a sceptic and an explainer.

Whatever the subject, the ultimate goal is explaining what happened, how it happened and why it happened. But reaching that goal requires enormous scepticism: suspiciously examining the sources, which will deceive in every imaginable way, and suspiciously examining the explanations other historians have offered.

This latter requirement of the historian—professional scepticism of fellow historians—is something seldom emphasised to the layperson or student. If it is in a book, especially one written by a credentialed scholar, then it is history, it happened. It can be added to, maybe interpreted through another lens, but the core stuff—when, what, where, how and why—is established. It doesn’t need to be raked over.

That is the misconception, sometimes willingly gilded by our profession. Now here is the unadorned truth of the matter: historians make a surprising number of errors, mostly small, occasionally monumental. The trick is not to be surprised, but to expect error, to seek error and to celebrate it when found, even when it is your own work that has been set ablaze.

The warmth of that fire—of his and her and my hard work turning to ash—means understanding has been advanced. After all, that is the goal, for history is not like mathematics with its proofs, chiselled out on stone tablets. It is like science, with its method centred on falsifiability and incremental gains in knowledge.

It goes something like this: the historian makes a claim they believe is true or might be true and corroborates the claim with all the evidence and argument they can muster. Then they wait for someone to further corroborate their work, or to point to a piece of evidence or argument that brings the whole edifice tumbling down. A new theory/argument/interpretation is advanced; the cycle repeats.

Hence, at the core of all this is one thing: sources—original sources.

My own experience must have been echoed by countless others. When I was a new PhD candidate and very naive about what historians did and how the practice of history worked, I wanted to tell the story of the imagined Australian Inland Sea—the sea that explorers like Charles Sturt figured existed in the Australian interior.

I realised that, first, I needed even earlier background information about what Europeans expected from southern hemispheric lands they might encounter. Hence, I read widely on the history of Terra Australis Incognita, the mythical continent imagined for the southern hemisphere. An obvious question came to the fore: how had the idea of Antipodes arisen?

Numerous books told me the answers, so I thought I knew. I’ve dug out my earliest notes from that initial research, a tad cringe-worthy, but early ideas set in writing often are:

The ancient geographical theory of Ptolemy of the second century A.D. … lent itself to deceptively scientific ideas about the globe being like any gravity-governed body: stability of the earth’s rotation on its axis was dependent on balance, and therefore the distribution of land mass above and below the equator was assumed to be equable. Hence, the southern continent could be expected to be a reflection of the Eurasian land mass, located behind the mirror of the equator.

Ptolemy, symmetry, continental balance: I had lapped up what other authors had written and figured it for erudition. But I remember the unease I eventually came to feel, a sick sort of feeling, because the scholar deep inside me was sceptical. Why? Because none of these books cited original sources. Because not one of my own conclusions cited original sources.

I was taking everything and everyone on the basis of authority. And supposing everyone before me had likewise taken it purely on authority, I wondered what might be at the bottom of it all—tortoises all the way down, perhaps?

In a perfect world, a historian would check the original sources for everything. If you have not checked the original source, don’t put it in your history, because you just don’t know.

But historians do not have infinite time, resources or sanity, so the second-best option is to rely only on other authorities who have checked the original sources and demonstrate rigour. Even historians citing their evidence sometimes make errors of interpretation, translation and context, and something so seemingly minor as a misplaced word can have major ramifications. Small errors can compound into significant misunderstandings and major mistakes, or even lead to the creation of full-blown myths that sweep a discourse.

In the third-best option, you are willing to rely on those who cite nothing original and—well, neither they nor you are really historians at that point. You are writers, possibly of fiction. I realised all those years ago that that was me—writing a story, not a history.

So, driven by a healthy dose of self-loathing for how lazy I had been, I went to the original sources (to the extent that was possible). What I found was this: not Plato, not Aristotle, not Ptolemy, nor any of the dozens of other ancient scholars I was able to read mentions symmetry or the need for the hemispheres to contain equal quantities of land to provide the earth with its poise—a concept I refer to as equipoisure.

The popular understandings about symmetry and balance are unadulterated myths. It really was tortoises all the way down!

As for the many modern writers and historians who state that the ancients did believe in Antipodes because of symmetry or a theory of balance—not one cites a jot of evidence. They don’t because there is no evidence (I say this to the best of my imperfect knowledge—this is one of my core claims awaiting corroboration or falsification; ego prefers corroboration, but both options get the job done).

This was a major revelation, for me if not broader humanity. The implications are huge. Take away the old chestnuts about symmetry and balance, and one of those fundamental questions historians are compelled to ask has never been fully answered—the why question.

That is, why did people in the past construct the idea of a massive southern continent and then choose to passionately believe in something that didn’t exist?

To answer that required evidence. I buried myself in treatises of natural philosophy, journals of explorers, old geography texts, proposals and memorials, letters, histories and maps—lots of maps. Luckily, researchers both amateur and professional have done a superb job of digging up innumerable artefacts that tell us something about the imaginary southern continent.

Though I am sceptical by nature and by trade, there is no denying the incredible work of these historians who precede me—and I fully acknowledge that my own work rests entirely on theirs. But…

I check their sources whenever I can, just to make sure. Sometimes—more often than most historians let on—something doesn’t quite add up. Perhaps a misinterpretation, perhaps an omission, perhaps a mistake. Sometimes it is mere minutiae, and sometimes it changes everything.

The result of my research is Antipodes: In Search of the Southern Continent (available from Monash University Publishing and select book stores). I chart the European idea of a mythical southern continent, an idea so potent in the minds of its fanciers that it helped shape early modern history. I ask a lot of why questions, and provide a few answers. Some of them may turn out to be right.

F**k the Oxford Comma

A minor quibble, not a matter of international import, merely enough to provoke the occasional fit of inappropriate rage: I am sick of well-meaning people, unaware of their own blinkered knowledge of the many standards of global English, pointing out what they believe is the grammatical howler of a writer having omitted the Oxford comma.

What is the Oxford comma, also known as the serial comma? It’s a little thing, like half a pig’s tail. In a list of three or more items, most Americans and some of the British include a comma before the “and” (or “or”) preceding the last item. For example, if such a person wanted to list “rage”, “apoplexy” and “sadness”, they would write rage, apoplexy, and sadness. The difference is merely that comma before the “and”.

Use it or don’t use it; for the most part, it’s mind-numbingly unimportant. Neither form—with or without the extra comma—is more or less correct. It is a custom, and only a custom, that differs by country, by institution and by person. It is not some rule of usage predicated upon inviolable linguistic laws discovered once the human genome was decoded. Dealer’s preference.

And yet it is so very common to receive straight-faced correctives, often from Americans, telling one they have erred for want of an Oxford comma. Such mindless prescriptivism is maddening. It shouldn’t be, but the contradiction inherent in the act just sets off a certain kind of person (that person would be me).

As for what the conscientious writer should do—that is, a writer not bound by custom and merely looking to maximise clarity and readability—there tend to be two main arguments put forth, one from each camp.

Those who discourage use of the Oxford comma point out that it is normally redundant. They are right, from the perspectives of both syntax and prosody. The “and” that comes before the last item in a list alerts the reader that it is the last item in the list (this is the syntactic function, concerned with arranging the units of meaning in a sentence). Likewise, the “and” encourages, or arguably necessitates, some sort of pause or change of rhythm as the sentence is read (this is the prosodic function, concerned with regulating the rhythm and melody of a sentence).

In this sense, then, the Oxford comma is heavy-handed. Writers are often encouraged to eschew surplusage, and the Oxford comma is nearly always, strictly speaking, surplus to the minimum required to convey one’s meaning.

But not always. Sometimes a list becomes confusing without that extra comma. If I want to list “the big boys, Jack and Jim” without using an Oxford comma, you cannot tell whether I mean three things (big boys, Jack, Jim) or if I’m just referring to the big boys, who are named Jack and Jim. No such problem with the trusty Oxford comma in the mix: “the big boys, Jack, and Jim”. So, there is no doubt, an Oxford comma should be in every writer’s arsenal.

However, to say that just because it is useful once in a blue moon every writer should use an Oxford comma in every list they write across their lifetime—we’re talking tens of thousands of lists, tens of thousands of redundant commas—is silly. It’s just getting suckered into the prescriptivist mentality: we have to have a hard and fast rule! There can be no discretion! Man and woman are not responsible enough to make their own decisions about when a comma is necessary! Yes, dammit, we will have hundreds of millions and very soon billions of redundant commas to ensure no one misses that rare occasion when it is actually needed!

Dear oh dear, the grammar nanny state, the most minor of all the dystopias.

My suggestion for those who can’t just live and let live, who are seeking some sort of guidance on this pressing matter of comma security: go with discretion. Trust the writer to use that comma when and only when it is required for clarity.

I work as an editor for a living, and that’s my policy. Mostly no Oxford commas, but I’ll throw one in once in a blue moon when it means the reader does not have to slow down too much or reread the passage to be sure of the author’s intended meaning. And if a writer does get the exercise of their discretion wrong—if they write a list without an Oxford comma and that causes some sort of confusion—point it out and put one in. Otherwise, shut up, piss off and leave me and my commas alone.

Hitchhiking Died With the Hitchhiker

On the articles page of my website I have just put up “Hitchhiking Died With the Hitchhiker”, an essay on hitchhiking I published in The Lifted Brow a few years ago. It recounts some of my personal experiences hitchhiking and tries to make sense of the decline in the practice. It’s meant to be an entertaining read, but hopefully offers a little illumination, too. An excerpt of what I’m calling “illumination” follows, and the full article can be found here.

***

So what does driving past a hitchhiker say about us?

What it doesn’t say is that we lack charity or goodwill. Australian individuals donate around four billion dollars to charities each year, and something like five million Australians volunteer their time. But while our intentions remain good, what has changed is the nature of that charity and goodwill: simply, ours is no longer a society willing to directly help the stranger.

There is a term for this—cosmopolitanism: the extension of hospitality, offered without coercion and without any expectation of something in return, to those people who are not part of our immediate community. It is the person stopping to help someone with their bonnet up on the side of the road; it is offering a bus seat to an old lady, or help to someone struggling with their shopping bags; it is saying hello to strangers; it is stepping in when someone is being abused or assaulted; it is helping to pick up the embarrassed student’s dropped bundle of books; it is asking someone in distress if they are okay.

At times we still see this cosmopolitanism, conspicuous and reassuring, but nearly always it takes a disaster to bring it out—the 2009 Victorian bushfires, for example, when scores of people banded together to offer assistance to the stricken. Or the 2011 Brisbane floods that saw people come from all over to help clear strangers’ yards and wash down muddy walls and pile debris on the kerb, some even lodging dispossessed strangers in their own homes. So, yes, at important times people still rally to help strangers, and this is laudable.

But what if the important times are also the seemingly insignificant moments between the disasters: the moments when most of our living is done? Much as we may wish it were otherwise, life is rarely about the few big events that help push us in new directions. It’s about the fabric of everyday experiences that weaves these events together. It’s about the unexceptional, and unnoticed, and seemingly unimportant. If community and cosmopolitanism are intangibles that germinate and grow through our everyday interactions, the question becomes, how many of us embody these values in the moments when no one is watching, when there is no big psychic pay-off, when there’s no media and no ambassadors to tell us we should be helping? Because that’s when community and cosmopolitanism matter—if they matter at all.

Physical and not psychic distance is the true barrier separating two strangers. The barrier that determines whether someone is community or outsider, us or other. Breach that barrier—if only for a second—and strangers suddenly become party to an ill-defined but potent kinship. The instant two people connect in person, the altruistic impulse evolved in our species fights its way to the fore. It explains why the most successful hitchhiking is also the most proactive: approaching drivers at servos and truck-stops, anywhere you can have a brief conversation to remind people you are a normal person, just like them.

Every time I unholster the thumb, I see this quirk of nature at play. On the return leg of my first hitchhiking trip, I was struggling to get a lift near the SA/NSW border. A truckie at the rest stop was having a meal before he turned in for the night, but he promised that if I was still there in the morning he’d help me out. Pete was his name. I was still there in the morning, bowling rocks at trees (0/0), so Pete gave me a ride, disregarding his employer’s no-hitchhiker rule. He took a bigger chance the next night when we parked at a popular truck-stop on the Nullarbor. Being in his cab was one thing; I sure as hell wasn’t meant to be sleeping on the deck of a dinghy lodged on the spine of his road-train. But Pete wanted to see me safe.

In fact, I can only think of one lift which was a bit unforgiving. I was a few hundred kilometres south of Canberra, taking a shortcut over the Snowies. Cars were few and far between. It was getting towards dusk when a leathery-looking miner stopped. I thought he said he was going to Tumut, which was good for me. Apparently he said Tooma. So about an hour later when he told me this was it, I was thoroughly discombobulated. All I could think to say was, “Thanks. Bye.”

Moments later the sun dropped behind the horizon. I scrambled to put on every article of clothing I had, which didn’t amount to much. Shivering in the dark atop a stark mountain tundra of grass and snow—yes, snow—I pondered just what use I could make of my hammock. Then along came some kindly semi-locals and rescued me, curious to know what a hitchhiker was doing at the turn-off to skiing fields in jeans, at night. I wasn’t sure myself.

Of course, most lifts are the opposite of that experience. In fact, earlier that day a couple and their son had picked me up, taken me to their home, given me a cuppa, offered me a bong, and then invited me to make use of their granny-flat. Maybe I should have accepted, I thought, watching my backpack roll out the back of their wagon as we did sixty down the main drag en route to dropping me off.

As it was, I had accepted a similar offer a few days before. A feller picked me up on the way to his mate’s place where he was set to go fishing, then watch the rugby and drink some beer. He asked if I’d like to come along. I said sure—sounds good. So we went fishing. I drank his beer. His mate gave me a bed. Next morning I was dropped at the turn-off to somewhere, while the driver headed to another mate’s for a bit of target practice. Yes, he did have a high-powered rifle on the back seat. No, I found nothing unusual about that.

The fact is, when people stop, or when you stop people, they want to help.

The only thing is, most people don’t stop.

[Full article here]
