This blog post first appeared on the Catalyst website on the 26th of September 2013.

In December 2009, 32-week-pregnant Brodie Donegan from New South Wales miscarried after being struck by a car. The driver was charged with causing grievous bodily harm to the mother, but the death of the unborn child prompted no additional penalties. This was not an unusual verdict: under State law, a foetus is not considered to be a human being in any meaningful sense. At best, it is considered to be a form of property; up to a certain stage, lawfully permitted to be terminated at will.

Needless to say, this is not the case for newborn infants. Killing a child, no matter its age, is deemed a heinous act in both legal and cultural terms. Although many countries, including Australia, recognise postpartum depression as a valid defence against murder charges, base penalties remain on par with standard homicide sanctions.

Biologically speaking, this is a curious state of affairs. Members of the human species do not ‘begin’ at birth any more than lizards or aphids do. While one might have been forgiven for thinking so in less scientifically developed times, we now know that the preceding period of gestation is merely the earliest of several stages of development in an organism’s life-span. Any suggestion that either ‘life’ or ‘personhood’ happens to coincide with one’s first unassisted breath is, thus, something of a semantic game.

Two significant factors, however, further complicate this issue: first, the foetus’s dependence on a human carrier, who must sacrifice her bodily autonomy in order to sustain it; second, the necessity of making arbitrary demarcations of some kind or other.

It is the former consideration that has historically dominated the abortion debate, and rightly so. No serious discussion on the matter can ignore the fact that the significant burden of pregnancy and childbirth – while gladly accepted by some – is not something that all women wish for or choose. The idea, then, of legally forcing a woman to carry an unwanted pregnancy to term can never be anything but problematic.

It is too glib, however, to dismiss the alternative as less so. Whilst it is unfortunate that anti-abortion rhetoric tends to be dominated by the Christian fundamentalist right, the concerns raised by that movement deserve to be taken a little more seriously. Despite feminist rhetoric to the contrary, the desire for patriarchal control over women’s bodies seems strangely absent from pro-life groups. In their eyes, they are fighting for human rights. We can and should reject their claims about the existence of a soul or some kind of divine right to existence, but it would be wrong to characterise their whole fight as frivolous.

At the moment, the battleground is the New South Wales parliament, and the bill in question is “Zoe’s Law”. Named after Donegan’s unborn child, it seeks to grant limited personhood under law to wanted foetuses at a late stage of development. It is perhaps a sign of the success of the pro-choice movement in Western society that any potential impact on abortion law is off the table, although the usual suspects are nonetheless hovering: the anti-abortion lobby, who see the bill as a possible step forward for their cause, and their opponents, who see it as the beginning of a dangerous slippery slope.

In some respects, this is characteristic of the abortion debate: two sides very much arguing at cross purposes. Their preferred designations – pro-life; pro-choice – exemplify that. It would perhaps be productive for each side to recognise that their opponents are, in one way or other, fighting for the same long-term goal – that is, human rights. For those of us not already invested in the debate, that is how the discussion ought to be framed.

It is important, however, to keep in mind the consequences of these positions. Firstly, it should go without saying that the rights of the mother need to be preserved as much as possible. It is deplorable that societies of the past forced women with unwanted pregnancies into dangerous unsupervised abortions (a paradigm that still, shockingly, seems to persist in parts of the developed world). Similarly, the accompanying social shame – fostered and tolerated by moralistic attitudes – has no place in a modern society. Any legislation aimed at protecting the rights of the foetus would have to be extraordinarily carefully worded lest mothers face prosecution for no longer wanting to carry a child.

These caveats need not be an impediment to reform. If, however, pro-life advocacy is to gain more credibility, sensible, realistic goals need to be articulated. Clearly, a fundamentalist interpretation of human rights – i.e. beginning at conception – is unworkable and likely unwanted by all but the most extreme pro-life advocates. That leaves two options: either to adjust the marker to some arbitrary point (of which birth might well be as good as any) or to adopt something of a sliding scale. The latter is an area oft-explored by Peter Singer, whose conclusions – that young post-natal infants may not warrant the legal protections offered to adults – have proven uncomfortable and highly controversial. These, however, are the discussions we need to have if we are ever to comprehensively get to grips with the topic of abortion.

The status quo view, reflected by Western law, is that it is absurd for the rights of a barely sentient organism to be made equal to those of an adult human being. Most reasonable people would agree with this. There is, however, a catch here: the modern paradigm of human rights, as we understand it in the post-enlightenment age, allows for little discrimination between individuals based on metrics such as intellectual capacity, sensory development or physical autonomy. It is not considered more of a crime to kill young children than infants, and it is certainly not considered less heinous to murder an eight-year-old than it is to do the same to an adult. Compared to these, the difference between a newborn and a late-term foetus is relatively insignificant.

Clearly, there is something more at stake here. In our society, our worth is not merely judged according to our productivity or ability to get the most out of life. For better or for worse, we take care of those who cannot protect themselves. While a newborn baby may not have much in the way of personhood, we recognise that it is a growing human being that ought to be cared for. An eight-month-old foetus is, arguably, very much in the same camp.

Of course, exceptional circumstances aside, abortions are generally not conducted at such a late stage; that, however, is no excuse not to legislate on the matter. While many pro-choice advocates will vehemently disagree, the fact is that there is nothing that a society such as ours doesn’t legislate on. Silence constitutes permission. While it may seem a noble conclusion, declaring the future of the foetus to be purely a matter of the mother’s preference is a rejection of its rights. That may be a reasonable conclusion, but it is a legislated conclusion.

The most unsettling consequence of this debate may well be that a fundamentalist defence of the human right to life may no longer be viable. In a utilitarian society, such as the one Singer advocates, this is a feasible conclusion. Indeed, with public support for euthanasia growing, we may well be headed in such a direction. Hand-wringing from religious groups aside, that is and will be an enormously challenging issue.

The debate over “Zoe’s law”, then, should be a welcome one. As much as the workings of our democracy may disappoint us, we should have a little faith in our lawmakers’ ability to work through complex issues without reaching fundamentally unreasonable conclusions. The progress of our society’s treatment of the matter – from the backroom abortions of the 1950s to Roe v Wade and beyond – ought to give us some hope. Considering what’s at stake, we may well need it.


This blog post first appeared on the Catalyst website on the 18th of September 2013.

In February 2013, the German parliament declared bestiality a criminal offence. This was not, needless to say, a controversial ruling; indeed, laws of this kind are now the norm in most Western social democracies. What might come as more of a surprise is that, a little less than half a century ago, many of these countries were legislating in the opposite direction, with archaic ‘sodomy’ laws being swept away as the sexual revolution of the late ‘60s began to reach the halls of parliament.

It was not that these lawmakers considered animal welfare to be an irrelevance; indeed, cruelty to animals had been punishable in some jurisdictions since the mid-19th century. By the 1970s, most Western countries – the former West Germany included – had laws of some kind pertaining to the treatment of vertebrates. What the repeal of sodomy laws exemplified was a societal shift away from regulation of personal morality and towards something more closely resembling utilitarianism. Wanton cruelty remained proscribed, but acts of non-harmful human/animal sexual interaction were no longer considered legally punishable.

For animal rights groups, this was far from sufficient. The central problem with the status quo, they argued, was the inability of an animal to offer meaningful consent; as such, bestiality was viewed to be somewhat akin to statutory rape. Over the past few years, their arguments have been upheld in jurisdictions as diverse as Norway and the Australian Capital Territory.

This view, however, throws up something of a quandary: if animals cannot consent, how is it that sexually mature animals engage in mating behaviour with each other, or even – in admittedly rare circumstances – initiate sexual contact with humans? Furthermore, why even consider consent in this area when it is denied to animals in so many others? Animals, after all, are given no say in whether they are slaughtered for meat, tested upon, culled, euthanised or kept in captivity.

Clearly, there is a much more fundamental question at stake: what rights, if any, can human society afford creatures from other species? What should we be aspiring to?

What is clear is that discrimination on some level will always be necessary. It is probably unworkable for the act of stepping on ants to carry manslaughter charges; likewise, few would advocate allowing the needs of many to outweigh the needs of one in cases where exterminating parasites is required in order to cure human illness. These may appear to be extreme hypotheticals, but they sufficiently illustrate the point: by prioritising human lives over less complex organisms, we must concede some form of favouritism.

How far that favouritism ought to extend is another question. Until the 19th century, belief in human exceptionalism – the view that humans and the rest of the animal kingdom existed on entirely different planes – was the scientific and cultural norm. Although folk concepts such as the human soul have ensured that that viewpoint retains some currency, developments in evolutionary biology have shown that the human species is far less distinct than we once thought it to be. This is not to say, however, that belief in human superiority is entirely unjustified – discrepancies in areas such as intelligence, competence and ability to communicate make it somewhat unavoidable.

That acknowledgement has shaped human views towards animals for as long as legal codes have existed. Where laws regarding animal welfare have existed, they have always been enacted in service of human interests. Animal cruelty laws, primarily designed to maintain a ‘civilised’ society, are no exception. So long as animals are denied a voice in how society is ordered – and, due to the rather large communication problem, it’s hard to see how this could ever not be the case – laws will always reflect human needs and desires.

That does not mean that progress in animal welfare laws won’t occur. As the science behind food production develops, meat and other animal products will become more easily simulated or replaced. Given our modern tendency to be distressed by unnecessary suffering, it is near-inevitable that exploitative industries will be phased out. Likewise, stronger regulations may further police our own day-to-day interaction with pets and wild animals.

Some, then, may see the new wave of laws against bestiality as a step in that direction. While that’s clearly the intention of animal rights lobbyists, modest regulations regarding other aspects of animal welfare suggest that legislators may have a rather different agenda.

As Peter Singer notes in his short essay ‘Heavy petting’, the concept of human exceptionalism still contributes significantly to views about bestiality. One need only turn to the Book of Leviticus in the Bible – still something of an influential text, needless to say – to find provisions for dealing with such crimes. “If a man has sexual relations with an animal, he must be put to death,” a passage decrees. “And,” it goes on, “you must kill the animal”. The message is clear: this is considered to be less an act of exploitation than of indecency; the real crime being one of self-debasement. That paradigm, it seems, remains as strong as ever. For those concerned by the legal system’s encroachment into matters of personal morality, there may be cause to treat these new laws with scepticism; for those concerned with animal welfare, they may yet prove a false dawn.


This blog post first appeared on the Catalyst website on the 11th of September 2013.

“Oz has elected a bigoted air-head to drag them backwards into mean prejudice and vainglorious chauvinism” – Paul Flynn, UK Labour Party MP

“It’s not goodies versus baddies; it’s baddies versus baddies.” – Tony Abbott on the Syrian conflict

For those dismayed by its results, the most galling aspect of the 2013 federal election is not that the conservatives won. Neither, for those of us removed from the partisan hysteria of the ALP, is it the spectre of prime bogeyman Tony Abbott becoming prime minister. Rather, it is the way the election was fought and decided: on simplistic arguments about national debt, asylum seekers and the carbon tax.

Certainly, the departing government was far from perfect. Four leadership spills and two changes in prime minister over the course of six years gave the ALP a deserved reputation for internal instability, while a few failed schemes from the early stages of Kevin Rudd’s stewardship contributed to a perception of ineptitude. For conservatives, Julia Gillard’s much-misunderstood promise not to institute a carbon tax was seized upon as clear evidence of deceit; for progressives, Rudd and Gillard’s aloof conservatism was trumped only by their willingness to throw refugees, same-sex couples and civil liberties under the bus for the sake of right-wing votes.

For all its faults, however, the Labor Party can claim to have governed competently. This includes, most significantly, having assisted the country to emerge from the global financial crisis relatively unscathed; likewise, successfully negotiating the difficulties of a hung parliament. By most objective standards, they have left the economy in an excellent position, with national debt and unemployment figures kept in check. The Gonski education reforms, introduction of the National Broadband Network and National Disability Insurance Scheme are all policies that, if maintained, should have long-lasting positive effects on Australian society.

Given that, it is ironic that the opposition chose to wage their campaign primarily on supposed Labor incompetence. These claims were, with few exceptions, highly dubious: that refugee boat arrivals had reached unmanageable levels; that the country’s debt was significant, and attributable to government profligacy; that the carbon tax, a fairly unexceptional market-based scheme for addressing climate change, was an extreme policy whose annulment was worthy of being the “first priority” of a new government.

Confronted with arguments such as these, a highly educated populace would not only have remained unconvinced; it presumably would have severely punished the proponents at the polling booths. The fact that the exact opposite has occurred, therefore, is profoundly troubling. Certainly, it is hard to avoid the conclusion that Australian voters are too easily manipulated.

Such is democracy, though. Educated opinions on science, economics and foreign policy are and always have been considered secondary to the construction of a good narrative; a three-word slogan given more weight than a dozen peer-reviewed papers. No higher authority is required for a mandate than mere popularity. We might well ask ourselves whether this is the best possible method of running a country.

Egalitarian democracy is founded upon the premise that all adult citizens deserve an equal say in electing representatives and, by proxy, creating laws. As such, it is arguably the freest, least oppressive, political system ever adopted on a national scale. It is a privilege that we should neither take for granted nor question glibly. That does not, however, mean that our current system ought to be considered above critique. There are, after all, few more fundamentally important topics of debate than the way in which a society should be organised.

Given that Australian democracy appears to be decidedly imperfect, we might ask ourselves the following questions: is the election of substantially inferior candidates merely an unavoidable drawback of an otherwise ideal system, or is it a problem that can be satisfactorily resolved? If the latter, how might we go about it?

The most apparent problem is that some people in our society are far more adept than others – through intelligence, level of formal education, personal experience or political engagement – at judging what might constitute a better government. If one accepts that premise, it is at least superficially true that giving such people more of a say would lead to a better-run, more intelligent society. If we accept that the latter is a worthy aim, the challenge is to propose an appropriate way of achieving it.

Firstly, removing suffrage from groups or individuals is clearly undesirable. It would be hard to see such a move as anything more than a one-way ticket to oppression and social stratification. Furthermore, on a purely pragmatic level, setting a bar at a certain level would inevitably lead to mass disenfranchisement and, quite possibly, social unrest. A slightly less problematic alternative might be ‘weighting’ votes according to various metrics, although it goes without saying that such a process could open up new avenues of corruption as well as pose significant logistical problems.

Perhaps, though, this is the wrong focus. Rather than entrenching inequalities by advocating elitist electoral policies, we would do better trying to work out how to build a more intelligent, educated and critically minded body politic. That is a point that we should surely all agree on.

People cannot always be expected to know what is in their own best interests. That will always be the case. Unless we can find a workable means of improving the system – and we should not be inhibited from trying – we have to work within its parameters. Until then, it is a truism that an important part of the education process is learning from mistakes. Let us hope that, this time around, the lesson is not too painful.


This blog post first appeared on the Catalyst website on the 4th of September 2013.

It’s difficult to think of many more damaging, unproductive and pervasive human reflexes than blame. It permeates all areas of society: from intimate and familial relationships to workplaces; from the political sphere to the mainstream media. Directed outwards, it leads to contempt and ostracisation; directed inwards, to depression and self-loathing.

To some, this might seem an unavoidable part of human life. It does not have to be. There is a significant difference between identifying culpability – that is, acknowledging the role one’s actions played in bringing about a negative outcome – and the act of blaming. The former is merely an assertion of causality; the latter, an exercise in moral superiority. To blame is to assert that one could have and should have acted differently and, therefore, to condemn for failing to do so.

This kind of conclusion is, of course, somewhat dependent on belief in free will. I have covered that topic in greater depth elsewhere, but suffice it to say that it is a belief that I – and a great number of modern neuroscientists – reject. In any case, not even the staunchest free will advocate would seriously assert that we make decisions in a vacuum; clearly, genetics, culture and the situations we perceive ourselves to be in contribute significantly to what we decide to do.

The decision-making process, as I see it, is essentially a matter of competing desires. Consider, for instance, a person contemplating cheating on their spouse: the decision they make will, essentially, come down to whether or not the combination of physical desire and excitement triumphs over inhibiting factors such as fear of negative consequences or empathy for their partner. Stress, memory and ability to think clearly will all contribute to the decision made. This kind of interplay is present in all situations, whether it be committing a crime or choosing which breakfast cereal to eat in the morning.

This is what makes the concept of free will so nonsensical. If these decisions are merely the results of factors out of our control — one, after all, does not choose what to desire or to what extent; such dispositions are inevitably some combination of extant personality traits and biology — then it is merely left to us to pick the option that feels ‘best’ at the time. That being so, how can we judge anyone else for making ‘wrong’ choices when we know that we would have done the same with their combination of genes, socialisation and stimuli?

Some will argue that it is important for an individual to take ‘responsibility’ for their actions. That is, I believe, a fallacy. What is commonly supposed to constitute a failure to accept personal responsibility – ‘making excuses’ – is usually either a matter of shifting the blame onto somebody else (the same fundamental error) or of irrationally assuming helplessness (for instance, presuming that one’s upbringing makes criminal behaviour inevitable). What is needed in such circumstances is not ‘responsibility’, and certainly not self-blame; what is needed is rational thought.

Even if this were not so, blame would still be highly problematic. Beyond matters of free will or morality, it simply doesn’t achieve anything positive. Far better, surely, to deal with the negative consequences of certain actions, use them as learning experiences and work towards positive future goals than to indulge in recriminations over past events. This is a case eloquently argued by Sam Harris in his ‘crocodile’ analogy.


It is not always possible to deal with these things rationally, of course, but if we at least understand what the ideal is, we are less likely to fall into patterns of blaming ourselves and others in our everyday lives.

In the realm of interpersonal relations, this is not such a radical conclusion: in therapy and self-help literature, it is arguably the mainstream view that blame is harmful and counterproductive. A more pressing problem is how to apply such a conclusion to the legal system.

It would be a mistake to assume that rejection of blame entails rejection of punishment. Most would agree that society cannot function without a legal code of some kind; furthermore, a legal code is meaningless without enforcement. Clearly, we need to have some mechanism whereby harmful behaviour is minimised; however imperfect, criminal punishment provides that.

This is a subject that psychologists Joshua Greene and Jonathan Cohen explore in their thought-provoking paper ‘For the law, neuroscience changes everything and nothing’. Here, they propose two models of law enforcement: ‘retributivism’, founded primarily on the concept of desert (that is, criminals receive the punishment that they ‘deserve’), and ‘consequentialism’, founded upon purely utilitarian concerns (that is, sanctions are calculated according to the good of society, with no regard for blame or vengeance). Given our growing understanding of neuroscience and the illusory nature of free will, they argue, we must move towards a fully consequentialist system.

In the Australian context, it is debatable what vestiges of retributivism remain; in the United States, where the above article originates, it is still discernible in practices such as capital punishment and mandatory sentencing. Whatever the case, I wholeheartedly agree that there is no place for it in a modern justice system. If a penalty cannot be defended in terms of a) its likely success in preventing a criminal from harming others and b) its ability to provide a suitable deterrent for others, combined where possible with c) its potential for aiding in the process of rehabilitation, then it is gratuitous.

What this might mean for the future of the justice system is another matter. Greene and Cohen propose a move away from punishment altogether and towards a sort of ‘treatment’ model. Although some have pointed out the potentially dystopian aspects of such an approach, there is at least some merit to it. Certainly, for serious repeat offenders such as Adrian Bayley, it is hard to argue against early therapeutic and/or medical intervention; to do otherwise if given the option would be gross negligence.

As our understanding of neuroscience progresses, these are questions that will inevitably need to be asked. In the meantime, we would do well to understand that even the most terrifying criminal is as much a victim of their biology and environment as we are. As we dispense with folk concepts such as good and evil, we would do well to treat blame and judgementalism as similar archaisms – attitudes that do nothing to make society a better place. By what higher standard could they possibly be measured?


This blog post first appeared on the Catalyst website on the 28th of August 2013.

When it comes to sex, we live in a society of seeming contradictions. On one hand, ours is a permissive age: there is little under the umbrella of consensual sexual behaviour that our institutions do not officially endorse or tolerate. Casual sex, masturbation and diverse sexual positions are no longer seen as vices to be legislated against or ‘fixed’ by the medical establishment; indeed, outside the realms of the more conservative religious organisations, you would be hard-pressed to find anyone in a position of authority willing to condemn such acts today. Sex is written about, discussed openly between friends and simulated on film and television. On the internet, pornography is near ubiquitous.

In theory, this might be the description of a sexually liberated society. In reality, it is something of a facade. While the sexual revolution of the ‘60s and ‘70s helped bring about significant improvements in the sex lives of ordinary Westerners, the spectre of sexual shame continues to pervade our culture. It prevails through the practice of slut-shaming, in which teenage girls and young women are vilified for their promiscuity or choice of attire. It prevails in popular magazines and tabloid newspapers, in which prurience is presented as disgust. It prevails in aspects of the legal code, in which teenagers are treated like criminals for engaging in mutual sexual interaction through electronic communications. Likewise, it prevails in the difficulties faced by former sex workers in seeking employment.

Broadly speaking, there are two possible explanations for this discrepancy: the first is that progress of any kind is slow; that we still have people in positions of power in our society whose coming-of-age occurred in a much more conservative era; that this sort of shame is a slowly-fading remnant from times when the religious right were more powerful. The second, more troubling, possibility is that the sexual revolution is long since over; that, rather than being in the midst of positive change, a new puritanism has set in for good.

One of the more supposedly humorous stories of the Australian electoral campaign was the revelation that a Queensland State MP, Peter Dowling, had taken a photograph of his penis in a wine glass. It was this accusation, coupled with some minor allegations about misusing travel allowances (later shown to be erroneous), that resulted in Dowling stepping down from his role on two parliamentary committees. What was only casually reported at the time was that the images in question had been sent by Dowling privately to his then-lover during the course of a consensual sexual relationship. The fact that she later disclosed these matters to media outlets and the Queensland Parliament seemed to provoke little consternation in the mainstream press; most outlets preferring to fulfil her apparent goal by participating in his humiliation instead.

This astonishing lack of respect for the privacy of public figures – crystallised a few years back in Channel Seven’s disgraceful ‘outing’ of New South Wales politician David Campbell, and also present in the Liberal Party’s hounding of former speaker Peter Slipper – may be fodder for further topics, but what it shows most clearly is that sex-shaming is as insidious as ever.

As a footnote, let us return to the phenomenon of teen sexting. Those who push for stronger regulations in this area cite the negative impacts endured by the victims – depression, loss of self-esteem and, in a few tragic cases, suicide – but, in doing so, they miss the point entirely. It is not the mere act of sending intimate photographs to a partner or friend that brings about such devastating consequences; nor could the dispersal of such images, devoid of cultural context, ever be sufficiently harmful on its own to lead young girls to kill themselves. The primary cause of these deaths, and of most of the other forms of damage caused by these incidents, is a culture in which shame and stigma are attached to sexuality; in which people – particularly young women – are bullied and derided as ‘sluts’ if they are seen to engage in sexual behaviour. We must ask ourselves whether or not that is an ethos that we are comfortable perpetuating.