Australians Love Refugees, Hate ‘Boat People’?

When it comes to ‘boat people’, majority Australian opinion is in step with government policy, and government policy is in step with majority public opinion, which holds that boats should be turned back and that anyone who gets through should be sent to mandatory, indefinite offshore detention.

Verification of this claim is readily at hand. Simply poke your head out of your terrace house window and ask a passer-by, or stop the next car you see on the main drag of your town, or ask the bloke next door mowing the lawn. People are willing—eager, even—to express their opinions on ‘boat people’.

More rigorous data tells the same story. A Scanlon Foundation survey found that 41% of people surveyed wanted either to turn back the boats or to keep the asylum seekers in permanent detention pending removal. Another 30% would allow only for temporary residence in Australia (the instability of temporary visas has been shown to cause serious mental illness). Just 24% would allow for permanent residence.

Similarly, a Lowy Institute poll on asylum seekers found that 71% of people thought the Australian government should turn back boats when it is safe to do so, and, failing that, 59% wanted offshore processing. Most polls are broadly consistent with these findings.

However, to confuse the situation, a poll by Essential Research found that 49% of people think boat people should be allowed to stay in Australia if they are found to be refugees—a minority of respondents, but still a significant level of support given the ongoing climate of antipathy.

This raises an important question: what percentage of ‘boat people’ seeking asylum in Australia are subsequently found to be refugees? Historically, the figure has varied from year to year, but it has generally fallen between 70% and 95%. That is, the overwhelming majority of all ‘boat people’ are refugees.

Add these facts and poll findings together and a contradiction seemingly appears: most Australians support turning refugees away—the unavoidable consequence of turning boats back when most of the ‘boat people’ are refugees—but a portion of those very same Australians think the ‘boat people’ (the ones they want turned back) should be allowed to stay in Australia. Welcome them on one hand, send them packing on the other.

Here’s how the discrepancy makes sense. The Essential Research survey found that 43% of people think most ‘boat people’ are not genuine refugees, and a further 25% don’t know one way or the other. In total, 68% of people are in the dark about the single most important fact about the asylum seekers who come by boat—that most of them are refugees.

Consider this correlation: 61% of people think that the government’s approach to ‘boat people’ is either just right or too soft, and 68% of people are ill-informed about who ‘boat people’ are. This is not a coincidence.

What is clear is that Australians do not like ‘boat people’. But that finding is very specific and comes with a caveat: it turns out they have little issue with refugees generally. In fact, an overwhelming majority of Australians support the Australian refugee resettlement program that takes refugees who are assessed and placed by the UNHCR.

The Scanlon Foundation found that between 2010 and 2012 support for the program increased from 67% to 75%. To be clear, resettlement refugees come by plane, not boat.

What gives? Well, our politicians have known for years. It’s the boats, stupid.

Australians don’t fear refugees. They fear ‘boat people’. And if you want to know why—read this.

Captain Cook Graffiti Paints a Confused Picture

Ahead of Australia Day, the statue of Captain Cook in St Kilda, Melbourne, has been smeared with pink paint (never mind that the paint job is actually quite striking, and Cook makes a fine, if unexpected, dandy). The graffitiing of Cook memorials is part of a protest that has been going on for years.

Take the Cook statue in Hyde Park, Sydney. Paint bombs were thrown at it on Australia Day morning in 2013. More recently, on 26 August 2017, the same statue was graffitied with the words, “CHANGE THE DATE” and “NO PRIDE IN GENOCIDE”.

There is another well-known monument and memorial to Cook in Melbourne. Though Cook never lived in Australia (British settlement did not begin until nine years after his death in Hawaii), in 1934 his parents’ cottage was dismantled in North Yorkshire, loaded into barrels and shipped across the seas, where it was reassembled in Fitzroy Gardens.

In 2014, that cottage was vandalised three days before Australia Day. Fluorescent orange, green and yellow paint bombs coloured the brickwork and tiles. Black paint spelled out: “26th Jan Australia’s SHAME!!!” A penis or other human appendage may also have been depicted, though due to poor artistry that remains conjectural.

The year before, two days after Australia Day, the cottage was paint-bombed. Then on 4 February a message was painted on the wall: “Cappy Cook was a crook killer liar theif”. (I was going to add [sic] to flag a misspelling, but further research suggests the author of the graffiti was really quite clever, using an alternative eighteenth-century spelling of thief from Cook’s time: theif.)

These actions have proved wildly unpopular in the Australian mainstream, with denunciations across both popular and social media. Yet despite the continuing public interest in Cook, his legend and status have for many years been on the radar of revisionist cultural-theory academics.

This is part of a backlash against the Great Man History of the 20th century, in which predominantly white male figures were credited with almost single-handedly advancing history.

For an unrelenting two hundred years, Cook was the poster boy of explorers. In Australia, his image and legend were used to promote the greatness of the country. “Australia’s history began, when Captain Cook anchored in Botany Bay in 1770,” stated one poster from the 1950s.

A few years later, Australia’s most famous historian, Manning Clark, said something similar in the opening line of his monumental History of Australia: “Civilization did not begin in Australia until the last quarter of the eighteenth century.” He would later regret this.

In the 80s and 90s, the tide began to turn against Great Man History, which meant, given his symbolic value, it also turned against Cook. His name became attached to an altogether different aspect of our history—the dispossession, oppression and slaughter of Indigenous Australians. Australia’s Columbus suddenly became all too Columbus-like.

Cook is now claimed by both camps: by the postmodern theorists who denounce the colonial age of which he was such an integral part, and by popular writers who continue to produce laudatory books, articles and television shows announcing his greatness.

Today, the irony of targeting Cook monuments to denounce Australia Day and the colonial history of Australia is that Cook neither discovered nor settled the land. The belief that he did is a common error.

A survey I conducted at a university among first-year Australian history students found that 43% of those students believed the British were the first Europeans to discover Australia, and 46% thought first discovery occurred in 1770 or later. In fact, it was the Dutch who did the most to reveal Australia to Europe, starting in 1606.

The other irony is that Cook truly was a great man, certainly so far as his duties as a naval captain were concerned. He inspired his men, kept them alive in an era when half or more of a crew could be expected to perish on long voyages, and he entered and explored Antarctic waters besieged by icebergs in a big wooden ship—arguably the greatest feat of exploration in history.

Cappy Cook was not a crook, nor liar nor thief nor killer. He was an instrument of his time—and now a symbol with little relationship to his uses by patriots and anarchists alike.

Being Mocked for Mispronunciation

An aphorism did the rounds recently, and it struck a chord:

“Never make fun of somebody if they mispronounce a word. It means they learned it by reading.”

That’s been me all my life: mispronouncing words discovered exclusively through books. Sometimes I have learned of my error through a gentle correction given quietly after the fact, or as part of a shared laugh over the gaffe, but not always.

The most unkind instance was when I was in law school. It was a constitutional law tutorial, compulsory and deadly boring. The word was “quorum”, which means the minimum number of members that must be present to start a valid meeting. It is pronounced kw-or-mm. I pronounced it kw-ere-mm, for I suppose I had never heard it spoken, having skipped the relevant (but mind-numbingly boring) lectures.

Many of my classmates snickered. A tall, blonde, athletic boy from a grammar school, who to this day makes me think of the Hitler Youth, snickered the loudest. Since I had to repeat the word many times in my little presentation, it got to the point that I stopped and asked the group: “What is so funny?”

The young Hitlerite decided to act as group spokesman. He took great pleasure in telling me that I was mispronouncing the word, and finished by asking if I had any sort of education whatsoever.

I glared at him, my fists balled, and I contemplated—quite seriously—leaping across the desk and punching him in the face, as would be voted appropriate by a quorum of Nannupians or public school ruffians of my ilk. But I knew it would cause more problems for me than for him; indeed, I would have been ejected from the law degree I so loathed. Then the tutor interjected, and that was that.

That blonde boy was later accused of rape. Not a joke.

What’s the point? Well, it’s obvious. You can quietly and humbly correct someone’s pronunciation, and that is a kindness. But if you mock them to make them feel small and yourself big—then you are scum. I’m pretty sure that’s what the aphorism was trying to say.

The Great Spanish Coffee Swindle

Spaniards drink a lot of coffee—and I love their café cultura; it is central to life—but, Christ, is their coffee rough! Not every taza in every cafetería in every ciudad, but the overwhelming majority of coffee experiences in Spain are trials to be endured, the flavour bitter and burnt and retaining little that is recognisably coffee. And I just found out why: a diablo by the name of torrefacto.

Any half-decent coffee has this in common: it is 100% coffee. Torrefacto coffee is only 80% coffee, or less. It’s like the way Australian chocolate is not all chocolate; I mean, there has to be some allowance for a percentage of cockroach, right?

Torrefacto refers to the practice of adding sugar while the coffee beans are roasted. It is a Spanish practice going back about 80 years, when sugar began to be added to the roast of the beans for one very special reason: it gave the final product a long shelf life (by embalming the beans in a caramel sarcophagus) and ensured no wastage. An added bonus was that the cheaper sugar bulked out the coffee, like the way the British pump their meat full of water.

The problem is that torrefacto coffee tastes like … how would the Spanish put it … mierda, caca, popó? Which is to say, like someone has very carefully coated some green coffee beans with liquid faeces, then roasted them to a crisp, ground them, prepared them with an espresso machine and served the result with a snappy vale. Oh, and apart from the fact that it tastes like shit, it may also be carcinogenic (the lesser of concerns here).

So, in short:

  • Coffee in Spain normally means torrefacto coffee → bitter shit-flavoured coffee.
  • When it isn’t torrefacto coffee, the beans in Spain are still burned to a dark char → bitter shit-flavoured coffee.

Silly Spain.

Loco Spain!

Ahh, Spain …

The Professional Sceptic

A version of this article first appeared on Stumbling Through the Past.

The historian is two things above all else: a sceptic and an explainer.

Whatever the subject, the ultimate goal is explaining what happened, how it happened and why it happened. But reaching that goal requires enormous scepticism: suspiciously examining the sources, which will deceive in every imaginable way, and suspiciously examining the explanations other historians have offered.

This latter requirement of the historian—professional scepticism of fellow historians—is something seldom emphasised to the layperson or student. If it is in a book, especially one written by a credentialed scholar, then it is history; it happened. It can be added to, maybe interpreted through another lens, but the core stuff—when, what, where, how and why—is established. It doesn’t need to be raked over.

That is the misconception, sometimes willingly gilded by our profession. Now here is the unadorned truth of the matter: historians make a surprising number of errors, mostly small, occasionally monumental. The trick is not to be surprised, but to expect error, to seek error and to celebrate it when found, even when it is your own work that has been set ablaze.

The warmth of that fire—of his and her and my hard work turning to ash—means understanding has been advanced. After all, that is the goal, for history is not like mathematics with its proofs, chiselled out on stone tablets. It is like science, with its method centred on falsifiability and incremental gains in knowledge.

It goes something like this: the historian makes a claim they believe is true or might be true and corroborates the claim with all the evidence and argument they can muster. Then they wait for someone to further corroborate their work, or to point to a piece of evidence or argument that brings the whole edifice tumbling down. A new theory/argument/interpretation is advanced; the cycle repeats.

Hence, at the core of all this is one thing: sources—original sources.

My own experience must have been echoed by countless others. When I was a new PhD candidate and very naive about what historians did and how the practice of history worked, I wanted to tell the story of the imagined Australian Inland Sea—the sea that explorers like Charles Sturt figured existed in the Australian interior.

I realised that, first, I needed even earlier background information about what Europeans expected from southern hemispheric lands they might encounter. Hence, I read widely on the history of Terra Australis Incognita, the mythical continent imagined for the southern hemisphere. An obvious question came to the fore: how had the idea of Antipodes arisen?

Numerous books told me the answers, so I thought I knew. I’ve dug out my earliest notes from that initial research; they are a tad cringe-worthy, but early ideas set in writing often are:

The ancient geographical theory of Ptolemy of the second century A.D. … lent itself to deceptively scientific ideas about the globe being like any gravity-governed body: stability of the earth’s rotation on its axis was dependant on balance, and therefore the distribution of land mass above and below the equator was assumed to be equable. Hence, the southern continent could be expected to be a reflection of the Eurasian land mass, located behind the mirror of the equator.

Ptolemy, symmetry, continental balance: I had lapped up what other authors had written and figured it for erudition. But I remember the unease I eventually came to feel, a sick sort of feeling, because the scholar deep inside me was sceptical. Why? Because none of these books cited original sources. Because not one of my own conclusions cited original sources.

I was taking everything and everyone on the basis of authority. And supposing everyone before me had likewise taken it purely on authority, I wondered what might be at the bottom of it all—tortoises all the way down, perhaps?

In a perfect world, a historian would check the original sources for everything. If you have not checked the original source, don’t put it in your history, because you just don’t know.

But historians do not have infinite time, resources or sanity, so the second-best option is to rely only on other authorities who have checked the original sources and demonstrate rigour. Even historians citing their evidence sometimes make errors of interpretation, translation and context, and something so seemingly minor as a misplaced word can have major ramifications. Small errors can compound into significant misunderstandings, major mistakes or even lead to the creation of full-blown myths that sweep a discourse.

In the third-best option, you are willing to rely on those who cite nothing original and—well, neither they nor you are really historians at that point. You are writers, possibly of fiction. I realised all those years ago that that was me—writing a story, not a history.

So, driven by a healthy dose of self-loathing for how lazy I had been, I went to the original sources (to the extent that was possible). What I found was this: not Plato, not Aristotle, not Ptolemy nor any other of the dozens of ancient scholars I was able to read mentions symmetry or the need for the hemispheres to contain equal quantities of land to provide the earth with its poise—a concept I refer to as equipoisure.

The popular understandings about symmetry and balance are unadulterated myths. It really was tortoises all the way down!

As for the many modern writers and historians who state that the ancients did believe in Antipodes because of symmetry or a theory of balance—not one cites a jot of evidence. They don’t because there is no evidence (I say this to the best of my imperfect knowledge—this is one of my core claims awaiting corroboration or falsification; ego prefers corroboration, but both options get the job done).

This was a major revelation, for me if not broader humanity. The implications are huge. Take away the old chestnuts about symmetry and balance, and one of those fundamental questions historians are compelled to ask has never been fully answered—the why question.

That is, why did people in the past construct the idea of a massive southern continent and then choose to passionately believe in something that didn’t exist?

To answer that required evidence. I buried myself in treatises of natural philosophy, journals of explorers, old geography texts, proposals and memorials, letters, histories and maps—lots of maps. Luckily, researchers both amateur and professional have done a superb job of digging up innumerable artefacts that tell us something about the imaginary southern continent.

Though I am sceptical by nature and by trade, there is no denying the incredible work of these historians who precede me—and I fully acknowledge that my own work rests entirely on theirs. But…

I check their sources whenever I can, just to make sure. Sometimes—more often than most historians let on—something doesn’t quite add up. Perhaps a misinterpretation, perhaps an omission, perhaps a mistake. Sometimes it is mere minutiae, and sometimes it changes everything.

The result of my research is Antipodes: In Search of the Southern Continent (available from Monash University Publishing and select book stores). I chart the European idea of a mythical southern continent, an idea so potent in the minds of its fanciers that it helped shape early modern history. I ask a lot of why questions, and provide a few answers. Some of them may turn out to be right.

F**k the Oxford Comma

A minor quibble, not a matter of international import, merely enough to provoke the occasional fit of inappropriate rage: I am sick of well-meaning people, unaware of their own blinkered knowledge of the many standards of global English, pointing out what they believe is the grammatical howler of a writer having omitted the Oxford comma.

What is the Oxford comma, also known as the serial comma? It’s a little thing, like half a pig’s tail. In a list of three or more items, most Americans and some of the British include a comma before the “and” (or “or”) preceding the last item. For example, if such a person wanted to list “rage”, “apoplexy” and “sadness”, they would write “rage, apoplexy, and sadness”. The difference is merely that comma before the “and”.

Use it or don’t use it; for the most part, it’s mind-numbingly unimportant. Neither form—with or without the extra comma—is more or less correct. It is a custom, and only a custom, that differs by country, by institution and by person. It is not some rule of usage predicated upon inviolable linguistic laws discovered once the human genome was decoded. Dealer’s preference.

And yet it is so very common to receive straight-faced correctives, often from Americans, telling one they have erred for want of an Oxford comma. Such mindless prescriptivism is maddening. It shouldn’t be, but the contradiction inherent in the act just sets off a certain kind of person (that person would be me).

As for what the conscientious writer should do—that is, a writer not bound by custom and merely looking to maximise clarity and readability—there tend to be two main arguments put forth, one from each camp.

Those who discourage use of the Oxford comma point out that it is normally redundant. They are right, from the perspectives of both syntax and prosody. The “and” that comes before the last item in a list alerts the reader that it is the last item in the list (this is the syntactic function, concerned with arranging the units of meaning in a sentence). Likewise, the “and” encourages, or arguably necessitates, some sort of pause or change of rhythm as the sentence is read (this is the prosodic function, concerned with regulating the rhythm and melody of a sentence).

In this sense, then, the Oxford comma is heavy-handed. Writers are often encouraged to eschew surplusage, and the Oxford comma is nearly always, strictly speaking, surplus to the minimum required to convey one’s meaning.

But not always. Sometimes a list becomes confusing without that extra comma. If I want to list “the big boys, Jack and Jim” without using an Oxford comma, you cannot tell whether I mean three things (big boys, Jack, Jim) or if I’m just referring to the big boys, who are named Jack and Jim. No such problem with the trusty Oxford comma in the mix: “the big boys, Jack, and Jim”. So, there is no doubt, an Oxford comma should be in every writer’s arsenal.

However, to say that just because it is useful once in a blue moon every writer should use an Oxford comma in every list they write across their lifetime—we’re talking tens of thousands of lists, tens of thousands of redundant commas—is silly. It’s just getting suckered into the prescriptivist mentality: we have to have a hard and fast rule! There can be no discretion! Man and woman are not responsible enough to make their own decisions about when a comma is necessary! Yes, dammit, we will have hundreds of millions and very soon billions of redundant commas to ensure no one misses that rare occasion when it is actually needed!

Dear oh dear, the grammar nanny state, the most minor of all the dystopias.

My suggestion for those who can’t just live and let live, who are seeking some sort of guidance on this pressing matter of comma security: go with discretion. Trust the writer to use that comma when and only when it is required for clarity.

I work as an editor for a living, and that’s my policy. Mostly no Oxford commas, but I’ll throw one in once in a blue moon, when it means the reader does not have to slow down too much or reread the passage to be sure of the author’s intended meaning. And if a writer does get the exercise of their discretion wrong—if they write a list without an Oxford comma and that causes some sort of confusion—point it out and put one in. Otherwise, shut up, piss off and leave me and my commas alone.
