At least 49 people are dead following a terrorist attack on two Mosques in Christchurch, New Zealand. The attack took place during Friday prayer and was live-streamed by one of the gunmen.
At least 49 people are dead.
Words fail me on days like today. The horror is simply too much to process; the hate too much to fathom. It seems unbelievable that such a thing could happen; yet the news is sadly unsurprising given the global rise of vicious and xenophobic rhetoric. Perhaps it was only a matter of time.
Nothing I can say or do will bring them back; nothing can undo what has been done. This isn’t some shining Hollywood movie; there’s no loophole to undo the past. This is, I’m afraid, simply the world we all live in now.
All I can do — all any of us can do — is to bear witness and to…to try to do better for tomorrow than we have done for today. To not use this moment of shock and grief to comfort ourselves before moving on with our lives, but to really think deeply and carefully about how our daily actions, inactions, and interactions shape this shared world we all live in.
Perhaps it’s because I’ve been reading N.K. Jemisin’s Broken Earth Series, but I can’t help thinking that these violent eruptions of horror should not be taken as isolated, random incidents, but are better interpreted as periodic expressions of the hate that’s been steadily building all along.
These attacks don’t come from nowhere and they aren’t the product of a single, deranged mind. Rather, the hate, anger, and fear which drive such violence have been growing steadily just below the surface.
It’s a privilege, in a way, to be able to describe these sentiments as “just below the surface.” By that, I mean many of us have the privilege in our daily lives to ignore the constant build-up of hate; many of us are only forced to confront this reality when it erupts in a particularly violent, horrific, and public way.
Many others, of course, are not so fortunate. They are forced to live anxious, guarded lives knowing full well such hate is thriving all around them. They experience its tremors every day.
It is hard to see what we don’t experience directly. It is hard to know what to do when our problems are so overwhelming. It is hard to find words in the face of such horror. It is hard to accept that this violence has occurred and nothing any of us can do will undo it.
It is hard. But this is the work we have ahead of us.
I am still processing the news coming out of New Zealand this morning. I am still trying to make sense of such senseless violence. I am still gasping for air, trying to find my footing on this rocky ground. I don’t know what to do, I don’t know what to say, I don’t know how to repair this broken world.
But I do know that there is a role for each of us in this work. Today’s violence may have been half a world away, but it could have just as easily been in my own backyard. Every single one of us — even a nobody like me — has a responsibility to quell our local tremors; to do everything in our power to make tomorrow better than today.
This doesn’t happen as some grand, dramatic scene in which we get to play the hero. No. No one will likely ever know or recognize this work. But this is the work to be done. Acts of love, acts of humanity, acts of peace and kindness; embracing a mode of everyday existence that actively seeks to quell this hate and to subtly prevent such violence by never letting it get that far.
It is, perhaps, a small comfort on days like today. It feels too small, too insufficient. And in many ways it is. There is much more work to do. There is always more work to do.
But the question we should be asking ourselves this morning isn’t how we let ourselves get here; how such violent hate continues to exist in the world. Rather, we should be examining the ways in which we, individually, have been complicit in allowing such hate to fester. We should be asking ourselves what we, individually, will do differently tomorrow; what we will do differently every day after that. We should be asking ourselves what specific steps we will take to make this world better, to actively work everyday towards building a world of love.
In recognition of International Women’s Day, SAGE Ocean — an initiative from Sage Publishing which supports computational social scientists — asked me to contribute to a blog post discussing challenges facing women in academia and reflecting on strategies for improvement.
I encourage you to read the full post here, featuring comments from Laura K. Nelson, Megan Squire, Lily Fesler, Diyi Yang, Kimberly A. Houser, Aleksandra Berditchevskaia and myself.
I’ve included my own answers below:
How do we nurture an academic landscape that is more accessible to women?
The challenges faced by women and gender minorities in academia go beyond a “pipeline problem.” In many cases, good scholars are actively forced from the academy by cultures of harassment and systems designed for people with power and privilege. Nurturing a more accessible academic landscape, then, means critically evaluating and rebuilding these systems; working to make academia more inclusive for all. In a practical sense, this means genuinely listening to concerns that are raised, learning from other people’s perspectives, and working collaboratively to make our antiquated systems better. This work goes beyond the dimension of gender, and seeks to make space for all who have traditionally been barred from academic life. The key thing to realize here is that “the way it has always been” is, by definition, imperfect, since those past strategies were developed around a relatively narrow sub-sample of the population and cannot be expected to generalize. As we get more data, as we learn from different types of academic experiences, we ought to adjust our thinking and our systems to ensure that no one’s scholarship is being systematically excluded.
What are the key challenges facing women in academia?
As an academic, I expect to be regularly critiqued for the substance of my ideas and the quality of my methods. This is good for science and it is how we all learn and improve. However, far too often, the possibility that I have the capacity to contribute intellectually is simply dismissed out of hand or the harshness of the criticism I receive exceeds what is conducive to scholarly debate. On numerous occasions I have had men yell over me while I try to explain my work, I’ve had men simply walk away when I introduce myself, I have been told that my presence indicates a lowering of the bar, and I have been sexually harassed in academic spaces. These are the things that make me want to leave academia. To be clear, there are many things I love about the ethos of academic pursuits: I love getting feedback, I love learning from people who are smarter than me, and, if I’m being honest with myself, I even love the stress and neuroses that come with trying to be successful in academic life. But to be constantly harassed and degraded, to have it made so perfectly clear that a significant portion of the community will never even consider me to have intellectual potential – that is the most disappointing challenge of all.
I share these experiences because they are not just mine – these stories are endemic amongst women in the academy. Non-binary and genderqueer people often face even worse harassment and regularly have their very existence questioned. Furthermore, gender is but one dimension along which people may experience exclusion and discrimination within academic communities. When academia protects and even rewards the scholars who perpetuate such harassment, it only further emphasizes the narrative of intellectual inferiority: we mourn the careers of abusive scholars more than we mourn the loss of the many people they push out.
What do you envisage the impact of increased gender diversity in academia will be?
While increased gender diversity is a good in its own right – representing increased diversity in perspectives and giving future scholars more opportunities to see themselves in academia – it can, perhaps, be better understood as an indicator rather than as an outcome in itself. Existing gender disparities in academia are indicative of a system in which people are regularly bullied and harassed out of the field; they are indicative of a system in which anyone who falls outside a perceived cis, white, male norm is put at tremendous disadvantage. They are indicative of patterns of abusive and dismissive behavior which serve to keep a broken status quo in place and systematically silence some voices. This is why I suggest that levels of gender diversity can be seen as an indicator rather than an outcome: a world in which academia has more gender diversity would be a world in which academia is less toxic. A world in which, quite simply, scholars are assessed by the quality of their scholarship.
Don’t forget to visit SAGE Ocean’s post to read other people’s responses as well!
One of the core tenets of deliberative theory is that every individual has agency and we each have a moral responsibility to respect, support, and meaningfully engage with ourselves and others as agents in the world.
In fact, I would be inclined to go so far as to argue that this is the core tenet of deliberative theory. That is, while the word “deliberation” itself suggests a process of discussing and reasoning, the ontological justification of “deliberative democracy” relies on “deliberation” as a deeply democratic process in which all perspectives are genuinely welcomed as adding value to the whole. This is what makes deliberation more than “just talk.”
Dewey seems to prefer to imbue the “democracy” portion of “deliberative democracy” with this power: when he argues that “democracy is a way of life,” he means that “democracy” is a metaphysical orientation which pervades the very ways in which we act and interact in the world. An orientation dedicated, roughly, to the belief that all people are equal and should share an equal role in co-creating the world.
While I’m not entirely sure I disagree with Dewey here, I am generally inclined to think instead of “deliberation as a way of life.” Dewey is right that democracy isn’t merely a system of government, yet the word does imply certain instrumental and institutional norms — e.g., not only recognizing every person’s voice, but some process of giving every person a vote. Democracy ensures that deliberation is not simply performative, that each person’s voice equates to some measure of power.
This is certainly appropriate as a societal approach, but the challenge in trying to “live democratically” is that many sites of daily life do not (and, arguably, should not) afford equal power to individual participants. There are good and important arguments to be had about whether this is appropriate or not, but the widespread assumption is that children, students, employees, and others in hierarchical systems should not expect to be afforded the power of citizens in a democracy.
Perhaps we ought to seek to push the bounds of democracy into all these spaces — I would certainly agree that many are less democratic than they ought to be — but I would broadly be inclined to argue that democracy is not appropriate under all circumstances and we would be wise to treat it as such.
If the building is on fire, I don’t want to take a vote before someone decides to pull the fire alarm.
(One could, of course, argue here that we’ve all given implicit consent for a fire alarm to be pulled under such circumstances and thus this action would not represent a breach of democratic living; but nonetheless there are less dramatic circumstances under which it is appropriate for executive decisions to be made.)
All of this is to say that while “democracy” makes normative claims about the way decisions ought to be made, “deliberation” makes claims about the normative interaction of epistemic relations. That is – deliberation is fundamentally about the thoughtful and thorough exploration of knowledge, with the explicit assumption that we each not only have access to different knowledge but that we build and interpret knowledge in different ways.
These concepts, of course, go closely together — as, indeed, the phrase “deliberative democracy” would suggest. The Good Society ought to give all members power in making those decisions (i.e., democracy), but members of The Good Society ought to interrogate their own and others’ views, seeking as much understanding as possible (i.e., deliberation).
My concern about “living democratically,” then, is that there are too many places at which a person may be inclined to cut corners — perhaps a toddler should not get to decide what they’re going to have for dinner each night. But while a reasonable person could certainly make an argument to that effect, making any exceptions quickly becomes a slippery slope — if there are, indeed, some circumstances under which democracy is not appropriate, how does one arbitrate those circumstances?
My answer here is, of course, deliberation — we ought to talk and discuss and argue about these questions.
But this also makes “living deliberatively,” as it were, a more powerful mode of interacting with the world. Taking deliberation as a way of life means accepting that there are settings which are hierarchical, where some people have more decision-making power than others. But it also means accepting, in a deep and meaningful way, that everyone brings value to the process, that every voice and perspective is desperately needed; that we all have agency regardless of how decisions are ultimately made.
This in turn creates a set of obligations – to share your voice, to listen to the perspectives of others, to fight for everyone’s voice to be heard; to recognize yourself and others as agents in the world.
I’ve been thinking about this a lot recently in terms of my pedagogical approach. My classroom is certainly not a democracy — and I’m not convinced that it ought to be — but my teaching style is decidedly deliberative. I want to know what my students think, I want them to tell me what works and what doesn’t work, I want them to argue against me, and to make suggestions for how they would like to see things done.
I like to share my pedagogical reasoning – to explain what types of learning styles a given approach is aiming to support; to ask whether my approach seems to meet that goal and whether there are other learning styles which need to be supported differently. As an instructor, these are my decisions to make, but they are not decisions I can or should make on my own. My classroom may not be a democracy, but my students are agents of their own learning and it’s important that they be treated as such.
This semester I’m teaching Programming with Data for Social Science, and my students have recently started reading Matt Salganik’s excellent book, Bit by Bit: Social Research in the Digital Age. The book gives a detailed and thoughtful overview of the many challenges and opportunities of computational social science.
One dimension Salganik introduces early on is the difference between “custommade” and “readymade” data. Borrowed from the art world, these phrases suggest different origins: custommades are intentionally created with a specific purpose in mind while readymades are repurposed.
Traditional social science methods tend to be custommades — you design your experiments, surveys, and sampling approach with a specific research question in mind. Data science, on the other hand, relies more on readymades — data are the detritus of some other goal.
Both of these approaches have their strengths and weaknesses, and each is appropriate in different contexts. Understanding this is a key piece of the art of computational social science.
Speaking of art, as I mentioned, the terms “readymade” and “custommade” come from the art world, and Salganik illustrates this metaphor by comparing two specific works of art:
[Marcel] Duchamp is best known for his readymades, such as Fountain, where he took ordinary objects and repurposed them as art. Michelangelo, on the other hand, didn’t repurpose. When he wanted to create a statue of David, he didn’t look for a piece of marble that kind of looked like David: he spent three years laboring to create his masterpiece. David is not a readymade; it is a custommade.
This excerpt doesn’t quite do justice to Duchamp’s work. Since Duchamp is somewhat less well known than Michelangelo, I’m afraid this leaves the metaphor incomplete. That is, you may understand Fountain as a readymade without really appreciating what that means.
So, here’s some more detail on that readymade piece of art. As the Philadelphia Museum of Art describes:
In the spring of 1917, Duchamp, with the help of several friends, notoriously submitted a porcelain urinal to an unjuried exhibition held by the Society of Independent Artists in New York. Purchased from a store that sold plumbing fixtures, this object, which was titled Fountain and signed “R. Mutt,” was rejected by a vote of the organizers, touching off a fierce debate.
Duchamp, who had been one of the organizers, resigned in protest.
In May of that year, the avant-garde magazine, The Blind Man — published by Duchamp and his friends — ran an editorial defending the work:
They [said] any artist…may exhibit.
Mr. Richard Mutt sent in a fountain. Without discussion this article disappeared and was never exhibited.
What were the grounds for refusing Mr. Mutt’s fountain: –
- Some contended it was immoral. Vulgar.
- Others, it was plagiarism, a plain piece of plumbing.
Now, Mr. Mutt’s fountain is not immoral, that is absurd, not more than a bath tub is immoral. It is a fixture that you see every day in plumber’s store windows.
Whether Mr. Mutt with his own hands made the fountain or not has no importance. He CHOSE it. He took an ordinary article of life, placed it so that its useful significance disappeared under the new title and point of view – created a new thought for that object.
While it may seem somewhat tangential to computational social science, the story of Mr. Mutt’s urinal makes me more fully appreciate the concept of readymade data.
Is it vulgar? Is it mundane? The art of Duchamp’s Fountain is in that very debate itself: he took an everyday item and made it worthy of public discussion, encouraging us to question conventional wisdom and to ask questions we didn’t even know we had.
Similarly, there are important concerns about readymade data – questions of ethics and meaning, which should be rigorously debated. But that debate itself is an integral part of the scientific endeavor. The art of this work is in engaging critically with these concerns.
We may now be so inundated with data, so used to our movements and habits being passively tracked, that it’s easy to forget: there’s something profoundly radical about repurposing found data for social science research.
We CHOOSE it, we place it so that its useful significance disappears under the new point of view. In reimagining and reinterpreting these data we bring new knowledge into the world; we create a new thought for that object.
There is no shortage of pithy, pseudo-inspirational, questionably attributed social justice quotes.
Margaret Mead’s apocryphal “Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it’s the only thing that ever has” is a favored target among friends of mine, complete with a compilation of suggestions for making the expression more accurate.
I, for one, though, have always found myself intrigued by the quote commonly attributed to Mahatma Gandhi: “be the change you wish to see in the world.”
It’s the kind of thing I like to glibly quip in lieu of a well-placed do it yourself.
This is in line with how the phrase is commonly interpreted: if you want to see change in the world, you have to be the change: you have to engage in the work and make the change happen.
It’s this DIY spirit that makes “be the change” a favored expression among service organizations.
Of course, there’s no evidence that Gandhi ever said this.
Rather, the writing that comes closest to this sentiment comes from an Indian Opinion article published on August 9, 1913 in which Gandhi wrote:
If we could change ourselves, the tendencies in the world would also change. As a man changes his own nature, so does the attitude of the world change towards him….We need not wait to see what others do.
This suggests a more quixotic vision — it’s not just about doing the work, it’s about fundamentally reorienting your relationship to the world in order to force the world to fundamentally reorient its relationship to you.
In the Salt March, for example, Gandhi used the traditional practice of producing salt from saltwater to protest British regulation and monopolization of salt production. It was more than civil disobedience — it was an act intentionally designed to bring the world the protestors wanted to life.
“Be the change,” then, is more than a call to service or an admonition to do the work. It is a challenge to unapologetically interact with the world as if it were the world you would have it be: to normalize realities by treating them as normal, to relentlessly tilt at windmills until the world accepts the truths you see.
There is something lovely in this sentiment, something inspirational in this vision of living in the world you want to live in, of building a better world by modeling a better world.
Yet, as with many things — reality is far more complicated, and we would be wise to critically interpret Gandhi in the context of his broader personal and philosophical approach.
While I would certainly be remiss to point to Gandhi’s words without acknowledging his deep anti-black racism and concerning sexual interest in young girls — the core commitment to non-violence for which Gandhi is so lauded is arguably problematic in its own right.
Indeed, it’s something of an understatement to say that Gandhi believed in non-violence. Rather, he believed in the transcendence of unshakeable virtue; that pureness of body and spirit could confront the most vile of evils; that suffering voluntarily brings such an inner strength as to provide the greatest thanksgiving, joy and deliverance — no matter what the cost.
That last sentence, incidentally, is taken largely from a November 26, 1938 piece titled simply, The Jews, in which Gandhi wrote: “If the Jewish mind could be prepared for voluntary suffering, even the massacre I have imagined could be turned into a day of thanksgiving and joy that Jehovah had wrought deliverance of the race even at the hands of the tyrant.”
So you can see, perhaps, why I would argue that Gandhi’s commitment to satyagraha went too far and even represents a moral failing when taken to its extremes.
It is also worth noting that the Indian Opinion passage above, which served as the inspiration for “be the change,” comes from an article (p. 242) in which Gandhi essentially argues that love is the cure for snake-bites:
…one of the best defences against snake-bite is to have only as much as we need of wholesome food…to avoid anger and fear and, even when bitten by a snake, not to fall dead with fear before even a remedy has been tried. One should have confidence in the potent effect of the purity of one’s life and ultimately take courage in the thought that the length of one’s days is that ordained by God.
If only we could “rid ourselves of all enmity towards any living creatures, the latter also cease to regard us with hate,” Gandhi argues.
But regardless of whether I love or hate a snake — it may still bite.
Ultimately, though, the interpretation of “be the change you wish to see in the world” comes down to a question of power.
Power isn’t just about the ability to control or coerce others, it is, in a sense, more fundamentally about the ability to control reality – to control the topics which get covered, the questions that get asked, and the perspectives that are considered. Power determines the bounds of normal and the imagination of what is possible. Power permeates our lived experience.
What’s inspirational about “be the change” is that it serves as a reminder that you have power, that your mere existence provides a pathway for shaping our shared experience of reality. “Be the change” is a proclamation that only you get to decide the kind of person you will be in this world; you get to decide what kind of world will be built from having a person like you in it.
The trouble is — there are far too many people who don’t get to decide. There are far too many people whose mere existence is under attack, who are met with hate and fear and violence just for the radical act of existing in this world.
You don’t get to “be the change” if you have to fight simply to “be.”
“Be the change,” then, is perhaps better interpreted as a statement of privilege; a commitment to allyship.
It is not enough to talk about making the world more just or more equitable; it is not even enough to engage in “the work” — though that’s certainly an important step.
No. If you have the privilege to be the change, if you have even a modicum of power over the tendencies of the world — then you hold that power in your every interaction, your every choice, your every experience.
Act like it.
Earlier this week, the United States Supreme Court issued two stays of injunctions, allowing a Trump administration policy barring transgender people from serving in the military to go into effect while cases against the policy proceed in lower courts.
If you’ve noticed some coverage of this case describing it as a “partial ban,” that’s because there are two exceptions baked into the now-implemented policy, which you can read in its entirety.
First, openly transgender individuals who are currently serving in the military will be allowed to continue their service, though there’s a caveat about “deployability” which I will return to shortly.
Just two and half years ago, on June 30, 2016, the Obama administration lifted the long-standing ban on transgender personnel. Known as the “Carter policy” since it was officially announced and implemented by then Defense Secretary Ashton B. Carter, the move came after the release of a commissioned RAND Corporation study which found that allowing transgender personnel to serve openly would have “minimal impact on readiness and health care costs.”
The same study estimated that between 1,320 and 6,630 transgender people were already serving on active duty. A more recent study, released by the Palm Center in 2018, puts that number at closer to 14,700.
In the time since the ban was lifted, a portion of those service members — an estimated 900 active duty personnel — have begun the process of transitioning. It’s worth noting here that there are service members who came out before the Carter policy took effect, and it’s entirely possible that more have been serving openly under the Carter policy without transitioning. However, I haven’t been able to find any estimates on either of those populations.
The 2018 memo describing the Trump administration’s new policy on transgender service members indicates that those who began serving openly since the Carter policy will be exempt from the ban: “The reasonable expectation of the Service members that the Department would honor their service on the terms that then existed cannot be dismissed…[they] may continue to receive all medically necessary treatment, to change their gender marker…and to serve in their preferred [sic] gender, even after the new policy commences.”
However, the new policy also includes an exception to the exemption: “the Service member…may not be deemed to be non-deployable for more than 12 months or for a period of time in excess of that established by Service policy (which may be less than 12 months).”
On its face, this caveat seems reasonable — it is the Department of Defense’s standard policy that “Service members who are considered non-deployable for more than 12 consecutive months will be evaluated for a retention determination by their respective Military Departments.”
Yet, the new policy on service by transgender individuals seems to have little leeway, whereas the broader policy for all military personnel allows that the “Secretaries of the Military Departments may retain Service members who are non-deployable in excess of 12 consecutive months, on a case-by-case basis, if determined to be in the best interest of the Service.” It’s entirely unclear to me, however, which of these policies would take precedence.
Furthermore, the 12-months non-deployable clause can be “gamed.” While I am not aware of any formal data on this, service member Cathrine Schmid suggests that some transgender military personnel are already being considered non-deployable for longer than necessary “as a sort of backdoor ban.”
If such manipulation seems unlikely, consider that in 2017 — before the announcement of the Trump administration’s policy — a similar backdoor move was used to discharge Riley Dosh, the first openly transgender graduate of West Point.
She recently shared her story in reaction to the new ban:
…As I prepared for my graduation from West Point, I was handed a memo from the Pentagon that said despite completing every requirement asked of me, I would not be allowed to commission as an officer. The reason? Despite the lifting of the ban on trans military troops by the Obama administration — one reason why I came out — I still required a medical waiver to become an officer. Both the previous administration and West Point supported my commission, but the Trump administration did not. Unable to receive a commission, I was discharged upon graduation. I remain the last person to be discharged for my gender identity. Two months after my graduation, President Trump tweeted that transgender individuals would no longer be allowed to serve in the military.
While technically allowing current personnel to continue serving, the new policy creates a mechanism through which transgender troops can be forced out, and it certainly makes it clear they are not welcome.
But — I’ve only discussed one of the exceptions to the ban on transgender military service.
In a move reminiscent of don’t ask, don’t tell, the other exemption is for a sort of Schrödinger’s transgender person: a person who is somehow simultaneously trans and not trans. As the “New Transgender Policy” memo outlines:
Transgender persons who have not transitioned to another gender and do not have a history or current diagnosis of gender dysphoria — i.e., they identify as a gender other than their biological sex but do not currently experience distress or impairment of function in meeting the standards associated with their biological sex — are eligible for service, provided that they, like all other persons, satisfy all mental and physical health standards and are capable of adhering to the standards associated with their biological sex. This is consistent with the Carter policy, under which a transgender person’s gender identity is recognized only if the person has a diagnosis or history of gender dysphoria.
There’s a lot to follow in this paragraph, but it’s important to understand it. This argument is the most insidious part of the ban on transgender service members.
Gender dysphoria, as defined by the American Psychiatric Association, “involves a conflict between a person’s physical or assigned gender and the gender with which he/she/they identify…People with gender dysphoria may often experience significant distress and/or problems functioning associated with this conflict between the way they feel and think of themselves and their physical or assigned gender.”
A formal diagnosis of gender dysphoria from a medical professional is the first step in seeking medically-necessary care related to transgender health. However, it’s important to note that there’s a difference between experiencing gender dysphoria and seeking out a medical professional in order to receive a diagnosis of dysphoria.
The diagnosis serves as a gateway for additional medical care but does not alone demarcate who is transgender. A person may be transgender whether they receive a medical diagnosis of dysphoria or not.
Furthermore, while I’ll save the American Psychiatric Association’s history of gender diagnoses for another post, the inclusion of gender dysphoria as a psychological condition is not without controversy. As Dr. Daphna Stroumsa explains:
The inclusion of gender identity and transgender-related matters in the [American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders] reflects an inherent problem. Although diagnostic coding is necessary to facilitate access to medical and surgical transition care, the pathologizing and stigmatizing suggested by its designation as a mental disorder is not. Such designation gives rise to an inherent contradiction in terms: what is presented as a mental condition has recognized medical and surgical treatment. These treatments are aimed not at affecting or changing mental state but rather at addressing the physical components that lead to the dysphoria. Such logic makes GID or gender dysphoria a unique case of surgically treatable mental illness, which is an oxymoron.
In other words, while it represents a problematic pathologizing of the transgender experience, the existence of “gender dysphoria” as a validated medical condition helps transgender people receive the health care they need.
Now, let’s go back to that policy memo. “Transgender persons who have not transitioned to another gender and do not have a history or current diagnosis of gender dysphoria…are eligible for service.”
There are precisely two types of people who do not have “a history or current diagnosis of gender dysphoria”: people who do not experience gender dysphoria and people who suppress, hide, or otherwise bury their experience of gender dysphoria.
Those who do not or have never experienced gender dysphoria are not relevant to this ban at all — they are not transgender. Service members who are transgender, then, may only continue to serve if they steadfastly avoid disclosing or seeking any support for their experiences of dysphoria.
In other words, they can be transgender…as long as they’re not transgender.
Lawyer Chase Strangio describes this bind brilliantly:
…[The policy states that] trans people can serve as long as we don’t transition, are comfortable in our assigned sex at birth, and have no health care needs related to transition. But this is what defines a person as trans and why the ban is definitionally a ban on trans service. Yet, this idea that we can suppress our trans-ness or that our identity is less real than the identities of non-trans people is behind much of the anti-trans rhetoric we see in government policy and public discourse.
And this is why I argue that this clause is the most insidious part of the ban on military service by transgender personnel.
It's bad enough that the policy will bar good people from serving their country. It's bad enough that current service members will be forced out or may choose to leave rather than serve in the face of such bigotry. It's bad enough that this discrimination is being justified by unfounded concerns about cost and cohesion.
But the most disturbing thing is that at its core, this policy suggests that transgender people have the choice to not be transgender; that their identity is not real, that it can be easily suppressed. This policy suggests that the concept of “transgender” doesn’t truly exist.
And that is unconscionable.
Transgender people exist and they have existed for far longer than we've had the word. It is no passing fad or modern invention; it is a fundamental piece of the human experience. A piece we've collectively been suppressing and deriding for far too long. If the existence of transgender people seems like it's "new," it is because we've only recently begun to grapple with a truth that has always been there.
The ban on transgender personnel serving in the military is just one element in a coordinated attack on the transgender community; an attack that aims to roll back the clock, to shove a whole population of people back in the closet and permanently lock them in there.
We cannot let that happen.
A new study from the CDC estimates that 2 percent of high schoolers — around three hundred thousand Americans — are transgender. And presumably, that number is depressed by significant underreporting. These young people are real. They exist. We cannot let them grow up in a world that would deign to tell them otherwise.
The transgender community will not be erased.
Here are a few things you can do to help:
Donate to one of these great organizations — or at the very least sign up for their mailing lists.
- National Center for Transgender Equality (NCTE, https://transequality.org)
- NCTE's political action fund (https://www.ncteactionfund.org)
- Transgender Law Center (https://transgenderlawcenter.org).
Call your representatives (Sorry, DC) — Even if you think they already agree with you, it’s good for them to know this is an issue their constituents are passionate about.
The word ‘crazy’ has the remarkable power to instantly render invalid whatever person, perspective, or practice it is applied to.
It suggests behavior that is illogical or irrational; that is so unpredictable as to defy the bounds of 'normal' human reason. It therefore invalidates through implicit othering — crazy people cannot be reasoned with, their behaviors can be neither interpreted nor explained, their beliefs carry little more meaning than noise.
Perhaps this is why ‘crazy’ is typically used as a pejorative.
Yet, the beliefs and behaviors that are deemed to be ‘crazy’ change over time. They are continually interpreted and reinterpreted to fit the narratives of the day. Madness, in other words, is a social construct.
Foucault documents this in detail, pointing to stories of the mad, insane, and crazy that seem absurd to our modern sensibilities. Scientifically-defended theories of hard bile and hot blood, concerns over contagious epidemics of women’s ‘hysteria,’ illness interpreted as a failure of morality.
Again and again in the West, cognition and behavior have been interpreted through a narrow normative lens: anyone who thinks or acts outside this framework is taken to be crazy.
‘Crazy’ then, is perhaps better understood not as a property of a person, but as a property of society. To call something crazy is to place it outside the bounds of standard social norms, to say that it is too far out there to be reasoned with rationally. It is the intellectual equivalent of throwing up your hands and declaring there is nothing to be done — a reasonable person simply cannot engage with crazy.
Yet, its very nature as a social construct raises the question: who determines what is crazy? Creative works are full of stories in which those deemed mad are perhaps the only reasonable ones. The French film King of Hearts, for example, contrasts the world created by asylum inmates with the brutal and senseless killing of World War I.
I find myself particularly drawn to the word ‘crazy’ because it is inexplicably gendered. It’s not quite as causal as the relationship between old and spry — but women are much more likely to be described as ‘crazy’ and the word has a long history of being used to discredit women and their experiences.
Given my description of ‘crazy’ above, this makes sense — if you can’t reason with someone who is crazy, if you can’t meaningfully interpret their words or actions, then you are free to dismiss their claims. There is simply nothing to be done. In this sense, the epithet intrinsically provides authority to the person using the word while diminishing the power of the person it’s applied to. It’s actually quite a brilliant tactical maneuver.
For this reason, many people prefer to avoid the word ‘crazy.’ There are other good reasons to avoid it, too — as you may have already inferred from the shaky language of this piece, ‘crazy’ has a deeply problematic tendency to casually lump together several different concepts. It dismisses mental health challenges, disparages neurodiversity, and glibly ostracizes any deviance from the supposed norm.
Yet — as someone who is ‘crazy’ along multiple of these dimensions — I find the word can give me power, too.
I wrote above that 'crazy' locates a person outside the bounds of the 'norm.' I think that's true, but — I don't find that the word itself places a normative judgement on that positioning. That is, we interpret 'crazy' to be bad because we implicitly assume that being outside the norm is bad. We accept that crazy people cannot be reasoned with because we implicitly assume that people who are outside the norm cannot be reasoned with. We feel embarrassed or ashamed when labeled as 'crazy' because we implicitly assume that falling within the norm is good.
I reject those claims.
For one thing, I don’t really believe in ‘normal.’ We are all crazy. But more deeply — what we generally take to be ‘normal’ only refers to an idealistic conception of a small slice of humanity. Why should any of us fall over ourselves trying to fit into a norm that doesn’t exist?
I refuse to feel shame for who I am.
In that sense, I find being labeled crazy to be quite freeing, actually. Oh, you thought you could diminish me by saying that I exist outside the norm? Oh, no no no, my friend – this is where I thrive.
Being crazy means being free to discover and create yourself, it means not worrying about conforming to the norm, and it means not letting anyone dictate your truth for you.
To be clear, there are still plenty of other things to worry about. I hardly mean to suggest that nothing is true and everything is permitted. Rather, the types of things one ought to worry about — being good, compassionate, respectful — are very different from trying to be ‘normal’ or trying to fit someone else’s mold of who you should be.
And that, perhaps, is the best thing about accepting the mantle of crazy: it gives other people permission to be crazy, too. When we shy away from talking about mental health, when we assume a neurotypical view, when we accept 'crazy' as a personal fault, we implicitly reinforce the idea that these are somehow shameful or wrong.
Embracing and even showcasing those pieces of ourselves not only can be personally fulfilling, it implicitly sends the message: None of us should have to hide who we are.
So that is why I frequently choose to refer to myself as ‘crazy,’ why I tend to talk about my thoughts, actions, choices, and diagnoses with such levity. I cannot hide who I am, and more than that — I don’t want anyone else to do so either.
So, though it may defy all norms and reason, I will continue to describe myself with that word. I will continue to think my crazy thoughts, act on my crazy impulses, and aim to be the best person I can be with no regrets for the fact that person will never be 'normal.' And I will do my best to create spaces where others feel they can genuinely do the same. I feel no shame or hesitation in this commitment; it is simply who I am: a total crazy person.
I’ll start today with a somewhat bold claim: science cannot exist without humanism.
Note that I’m not merely saying that science is improved by humanists or that it might be wise to have ethics keep pace with our technological advances. To be clear, I would argue for both those points as well; but my claim here goes deeper:
Science cannot exist without humanism.
In other words, the thing we call "science" can only properly exist through a critical examination of the myriad ways in which humans create and interpret the world around them. The humanities are not some nice add-on or a means to slap an "interdisciplinary" sticker on your work — they are, indeed, an intimate part of the scientific process itself.
To clarify, when I use the word science here, I more properly mean good science — science which is self-critical, methodical, and dogged in its pursuit of genuine understanding. There are, unfortunately, far too many things which would claim the mantle of science while definitively being bad science. Most notably, this includes some truly horrific medical experiments, but there are also more innocuous examples of bad science covering issues of replication, statistical techniques, questionable methodological choices, and even outright fraud.
My argument, then, is that the humanist orientation is a primary factor in differentiating between good science and bad science. I’m not sure I would go so far as to argue that it’s a sufficient condition, but I’ll argue here that it is a necessary condition. Science cannot exist without humanism.
I have done little so far to explain precisely what I mean by science and precisely what I mean by humanism, so let’s back up about two thousand years in order to elaborate.
Aristotle argues for three fundamental types of knowledge: techné, episteme, and phronesis. While not everyone may be familiar with these classifications, these categories still very much underlie the Western conception of knowledge, especially, perhaps, within academia.
Techné, or technical knowledge, is the province of professional schools. Doctors, lawyers, and MBAs are educated in the techné of their trades. Episteme is the domain of the sciences. Closest to our modern interpretation of “knowledge,” episteme is the slow, methodical discovery of universal truths. Finally, phronesis is the core concern of the humanities. In, perhaps, a sign of our collective devaluing of this work, phronesis is the least tied to our modern understanding of knowledge and thus is the most difficult to explain.
Often translated as "practical wisdom," phronesis is inherently action-oriented. One of Aristotle's core virtues, it is the ability to determine the right action in any context and to unquestioningly follow through on that action. It is about being virtuous but perhaps more subtly about knowing what is virtuous.
McEvilley, who argues for the translation "mindfulness," quotes Epicurus in describing phronesis:
“[Phronesis] patiently searches out the motives for every act of grasping and fleeing, and banishes those beliefs through which the greatest tumult enters the mind.”
While the word defies a simple English translation, you can see, perhaps, why I associate phronesis with the humanities: it is the knowledge of critical analysis, of situating ethical judgements in the context in which they occur. It is the work of perpetually asking the question, what should be done?
When Thomas More, Erasmus, and others began arguing for humanist approaches which centered human — as opposed to godly — agency as a force in the world, this naturally drew on earlier conceptions of phronesis.
Now, these categories of knowledge aren't perfectly split in the academy. Tenure track pressures of publishing, service, and teaching encourage a certain techné of their own — though someone considered brilliant in their field can often get away with poor demonstration of techné. Additionally, there have been some rather spirited discussions about a technical/humanist divide in philosophy, though here even the technical side — epitomized by metaphysics and epistemology — may still be more phronesis than techné. And Flyvbjerg has argued that trying to be episteme is the largest failing of modern social science — that to have meaning, social science must strive to be less like physics and more concerned with the phronetic questions of how to build the Good Society.
Yet, despite various intra-disciplinary battles, these types of knowledge have become largely separated from each other — and that divide is punctuated by a clear hierarchy of value. The war between episteme and phronesis is especially fraught — as episteme is broadly valued as a public good while phronesis is devalued as an indulgent exercise in self-reflection.
This divide is particularly striking in our so-called "post-truth" world that nevertheless pursues a strong positivist mentality. While you may be surprised to learn that we're living in a "positivist" era, in the philosophical sense, the term roughly refers to the assertion that some things are demonstrably factual and everything else is a matter of opinion.
This is, arguably, a core scientific tenet — if you can measure something, if you can systematically test different hypotheses, you can demonstrate whether something is factually true or not. If you cannot do these things you can make no rational argument as to the truth or validity of a given claim.
The positivist view implicitly devalues humanistic work. Anything that cannot be proven is subjective, and anything that is subjective is hardly worth rigorous study. Anybody may have a mere opinion.
Yet the positivist claim also overlooks a core humanist tenet — everything we observe, measure, and interpret is done through the lens of human experience. Even in the hardest of the hard sciences we are biased by what questions we think to ask, what funding we can get to pursue those questions, what methods we choose to apply, what works we choose to cite, what interpretations we find in our results, and whose scholarship we choose to value. Science is, fundamentally, a human endeavor.
If anything, the increasing tendency of “factual” things to be interpreted as “opinion” should only serve to emphasize the permeability of the positivist line. We cannot maintain a positivist system if we cannot even agree on what qualifies as factual.
Perhaps the easy way out of this bind is to belittle those who do not see the facts that we do, who, as far as we can tell, refuse to be properly thoughtful and educated. The challenge here, then, is differentiating a noble heretic who fights for Truth against a biased system from a troublesome troll who maliciously spreads misinformation cloaked in "factual" arguments. History has seen no shortage of either type of agent, and each is equally greeted with scorn in their time.
The truth of tomorrow is not necessarily the truth of today.
That's not to descend into total relativism and claim there is no such thing as truth and that all of reality is merely a matter of opinion. Rather, I would argue, truly good science requires remaining constantly skeptical. A good scientist interrogates the biases of their data, methods, and fundamental way of thinking — and that inherently means being skeptical of our individual and collective ability to accurately determine what is "true."
This is not at all easy to do — we are each products of and contributors to our collective social context, and it is arguably impossible to entirely separate ourselves from that context. Given that this challenge comes at the bottom of an increasing to-do list of practical career pressures, the whole task is even more daunting.
So while we each ought to seek to be humanists in our scientific endeavors, perhaps we’d do well to be glad that there are whole departments of scholars engaging seriously in this difficult work; questioning which parts of our received reality are deeply true and which parts are warping our precious scientific perceptions.
We cannot continue to pretend that science can be separated from the human experience, that it is somehow immune from the biases and fallibility of the humans who conduct it. We must recognize that the humanities are a public good and, indeed, provide the very foundation which allows for our work.
So when I argue that science needs humanism this is what I mean; that all scientific endeavors are prone to error and we cannot fully, scientifically, assess their truth-claims without first understanding the possible scope and implications of those errors. While we might prefer to separate the order of the scientific process from the messiness of human systems, aiming to do so is fundamentally bad science; it discards too many relevant variables. Good science requires self-skepticism, it requires an awareness of what is missing as much as it requires an awareness of what is there. Science needs phronesis, it needs to examine what is right as much as it needs to examine what is true.
Science cannot exist without humanism.
In July of 2013, I started writing publicly every (work) day. Then, after four and a half years, in November 2017, I stopped.
There are a lot of reasons why I started writing — and a lot of reasons why I let the habit go.
I was re-finding myself in 2013. After my father passed away in early 2012, I was absolutely shattered. I spent at least a year and a half just wandering the void; existing in the world without really living in it.
When at last I was ready to start thinking about picking up the pieces, I found I had become a very different person than I had been before. More caring, more compassionate, more acutely aware of the silent struggles we’ve all gotten so good at hiding from the world. And I felt more strongly than ever the need to put my own voice, skills, and energy to work towards the ongoing task of repairing the world.
This was a quandary for me. I’d long been committed to social justice; to doing what I could to make the world just a little bit better than I found it. But, at the same time, I had come to deeply internalize the belief which was consistently reinforced through so many of my experiences in the world: my voice didn’t matter. I didn’t matter.
I had aimed to put my time and energy towards good work simply because that was the right thing to do. It was laughable to think that anything I could do would ever amount to anything or that anyone would ever care for my opinion or insight.
It’s the sort of paradox which only makes sense within the bounded logic of one’s own head. I’d worked hard to elevate the agency of others; I’d argued that the voices and perspectives of all people are critical to building a more just world; I’d put so much of myself into advocating for these ideals — but I had never really believed them. How could I, if I didn’t believe in myself?
In my first post back in 2013, I described this challenge in relation to my plan to start writing publicly:
My struggle with blogging is that…in many ways, it requires a lot of ego. Well, I would say ego, but another may generously say “agency.” It requires standing up and saying, “I do have something to say, and I believe it’s worth your time to listen.” And that can be a lot to muster.
I see this challenge more broadly in the idea of being an active citizen, of truly engaging in public life…Even in smaller acts of engaging. To actively contribute to your community means believing that you have something to actively contribute.
Over the years, this sense of egoism continued to be the hardest struggle for me. Finding time and topics weren’t always easy, but those paled in comparison to the more fundamental challenge of constantly putting myself out there. Of acting like I had something worth saying even when I felt as though I were nothing at all.
But it was a good habit. It made me a better writer. It made me a better thinker. And doing all this writing publicly helped me find my voice. It helped me discover who I am and showed me that, indeed — words do matter. Much to my surprise, I found that sometimes even my poor, broken words could help.
So I kept writing.
As foolish, egotistical, and self-important as it seemed. I kept writing.
But things changed over the years. I got busier with graduate school, I had other writing tasks I needed to prioritize, I needed to pass my qualifying exams and propose my dissertation. I have no end to my list of practical excuses.
There are reasons and there are reasons, though. Fundamentally, I was scared. I started meeting strangers who would seek me out to tell me how much they loved the way I write; who would tell me that I had somehow managed to put into words something they had been thinking or feeling. I started getting more pushback on every sloppy mistake I made as I rushed to fulfill my self-imposed quota of posting every single day. I started to more deeply appreciate the consequences of my words as actions — while it still seems impossible to imagine, I found that my voice did have power.
As I grappled with these issues in mid-2017, I reflected:
In some ways, public writing feels even more egotistical than before. Being a doctoral student raises the stakes of self-importance; I’m declaring a value for my contributions through my occupation before I even open my mouth. Doctoral students may be nobody in the fiefdoms of academia; but it remains a fairly fancy calling to the rest of the world. I can hardly consider myself to be a nobody while laying claim to the capacity to someday contribute to human knowledge.
This was a lot to take in. How could my voice matter? In what universe would people begin by assuming I was possessed by a comfortable air of self-confidence? What did it mean for me — a person holding so much privilege in this world — to be taking up space?
My writing started to feel like less of an exercise of civic duty and self-discovery and more of a venue for self-aggrandizement.
At the same time, I was becoming less impressed with the quality of my writing overall. I’d gotten tired, lazy — relying on tired tropes of self-righteousness without thorough thought or depth. This tone was popular in some circles, but it did little to advance the sort of dialogue I want to pursue. It didn’t reflect the sort of writer, scholar, or person I wanted to be.
So I stopped.
I’d once needed to find myself through writing in public and then I needed to find myself by reflecting in private.
But I’ve missed this. I’ve missed the intentional thought that comes from public writing. I’ve missed the ongoing learning I’ve gained through on- and offline conversations about my posts. I’ve missed hearing thoughtful criticism of my views and my writing — I remain grateful to every person who has trusted me enough to tell me when they think I’m wrong or when I could have expressed myself better. I’ve missed making time to think about things beyond what’s required of me.
Over the last few weeks, I’ve continually caught myself “writing in my head” as I used to do all the time. I’m not quite sure where that voice went in the fervor and anxiety of the past year, but I’ve started to realize that I need and value this space. Something has changed in me once again, it seems.
All of this is to say: I’m back. I won’t be posting every day, but I will be posting regularly — at least once a week.
I will write about science, math, social justice, and democratic theory. I will write about mental health and graduate school and random facts I picked up somewhere. I will write about whatever I need to say that week.
As always, I invite your thoughtful reflections as I continue this journey. We will certainly not always agree, but I will value your perspectives and consider your arguments seriously and genuinely.
They say that democracy is dead — that people can’t talk about anything of import any more. But I don’t believe that. I refuse to believe that. Democracy’s not dead — it’s only resting.
I look forward to learning from you all.