The intent of this series is ambitiously to address, section by section over the course of a year, the celebrated Crowds and Power by Elias Canetti[1]. This is the fourth entry in the series and the second addressing Part 1 (The Crowd), which Canetti breaks up into several sections. Rereading my other post about this particular section (The Fear of Being Touched), I realized I did little to offer an alternative to the generalizations Canetti seems to be insisting upon. This post addresses that.

Canetti begins[2]:

There is nothing that man fears more than the touch of the unknown. He wants to see what is reaching toward him, and to be able to recognize or at least classify it. Man always tends to avoid physical contact with anything strange. In the dark, the fear of an unexpected touch can mount to panic. Even clothes give insufficient security: it is easy to tear them and pierce through to the naked, smooth, defenceless flesh of the victim. ¶ All the distances which men create round themselves are dictated by this fear (15, emphasis in original).

A claim made by Canetti’s advocates is that he tends to drop things in a reader’s lap and leave them to decide whether they are true or not. On principle, I oppose both this method and this explanation of it, to the extent that it describes Canetti’s book. I oppose this on principle because the idea that an author is (or can be) neutral is untenable. Barabash (1977)[3] cites from Thomas Mann’s Reflections of An Apolitical Man,

a title which speaks for itself, to a recognition that “being apolitical is nothing less than being simply anti-democratic”, that “when culture rejects politics, the result is error and self-delusion; it is impossible to withdraw from politics in this fashion—one only ends up in the wrong camp”. Recalling how at one time, “in the name of culture and even freedom I resisted with all my strength what I called ‘democracy’, meaning the politicisation of spiritual life”, Thomas Mann says that life taught him, and many like him, a terribly convincing lesson, graphically revealing the shameful ties between the apolitical aesthetic German burgher spirit and the most extreme forms of political terror, barbarism and totalitarianism. … We should recall these lessons more often. For the bourgeoisie today continues to increase and perfect its ability to play on notorious “anti-political” tendencies, and the danger of ending up in the camp of reactionaries may threaten the artist and intellectual who does not want to take any sides (15–6).

So it is not possible to “drop something in a reader’s lap” without an authorial point of view being implicated; such apolitical gestures must be understood as fundamentally reactionary. This fact brings out how reviewers at Amazon can turn Canetti’s work to the end of a kind of cynicism that resembles Schopenhauer’s:

The turning points of history reveal the true price of imaginary apoliticism with utmost clarity, as Mann convincingly demonstrates when he cites the example of Arthur Schopenhauer, “Nietzsche’s predecessor in the area of anti-intellectualism”. This scholar and philosopher, who declared that politics was philistinism, in 1848 called the revolutionary people “all-powerful scum” and “demonstratively proffered his opera-glasses to the officer who stood at the windows of his flat carrying out reconnaissance of the barricades so that it would be easier to direct fire against the insurrectionists”. “Is that what it means to be above politics?” the writer exclaims (16)[4].

Second, where an author seems to be indulging in this habit (of dropping something in the reader’s lap without any particular point of view expressed), it only seems that way because insufficient attention has been paid to the discourse of the work generally. For example, the observation dropped in a reader’s lap that Canetti begins his book with (“There is nothing that man fears more than the touch of the unknown”) is easily attached to an entire network of ideas that ground the statement in an imputable world-view or truth-class[5]. The idea that it is up to you to decide what something means, and that the author is merely an innocent reporter of these things, readily connects to the late-order capitalist/postmodern milieu that Barabash bashes.

But, saying this, what kind of engagement can come from confronting Canetti’s excessive generalization? How can I make sense of the notion that what I fear most is the touch of the unknown, &c?

A key move I have to make to “get onboard” with this is wrangle over the term “unknown”. If I am sitting in my study, and I suddenly feel something creepy-crawly on my leg or arm, I may tend to flinch violently; at a minimum, I will try to figure out what it is; I may try to bat it off. I think the strength of response may partly depend less upon the unknowability of the sensation and more upon the degree to which I am “violently” pulled away from whatever was engaging my attention. But it’s not necessary to insist upon this. The thing is: in the presence of such unexpected sensations, while I do most typically react in such a way so that I can identify (or see) what it is, it is not because the sensation originates from something “unknown”; rather, my life experience provides me with a wealth of things that it might be, including spiders or (potentially) other venomous insects, etc. My violent reaction (when it is violent) is not due to the thing being “unknown”—rather, it’s because it may be altogether too imaginable. The justness of this can be seen by flipping the circumstance around: if I am blindfolded, and my mate is going to lightly caress me with a feather, then the anticipation of that sensation and the very fact that I would put myself in such a “vulnerable” position shows that I am responding not to the unknown, but to the (in this case) pleasurably imaginable.

So if Canetti’s use of “unknown” has some sense, then it is by understanding unknown in a strictly subjective way, as “something currently not understood by me.” This is where Canetti’s generalization loses traction, less because “unknown” transforms merely into “what I’ve not experienced before” (or something like it), but in the insistence that an encounter with this must be fearful. Self-evidently, if something imaginably repugnant touches me, I will react with repugnance, and if something imaginably pleasant touches me, I will react with pleasure, ceteris paribus. There is in all of this a remarkably limited view of the unknown, insofar as here it boils down (in all human individuals as a system) to only those things not hitherto experienced (or, slightly more precisely, to any sensation that can be construed correctly or incorrectly in an either repugnant or pleasurable sense).

It is a false dichotomy to imagine that everything must be either repugnant (fearful) or pleasurable (attractive). It is, of course, child’s play to cram every human experience into the Procrustean bed of “love” or “fear” but this hyperbinary (by which I mean a dichotomy that is meant as an overarching simplification) does not actually get us anywhere in the final analysis, particularly because it cannot make sense of how we are sometimes attracted to what we fear and fearful of what we love. Like all mere binaries (particularly ones rooted in a false opposition, like this one), it is at least minimally necessary to expand the range of categories understood by the binary. Thus:



              toward the attractive   toward the repugnant
Attraction    Love                    Fascination
Repulsion     Loathing                Fear

These proposed terms can only be considered connotative, not denotative. If “love” and “fear” may be taken as obvious (a perilous assumption, to be sure), “fascination” (I considered “abnegation” also) points to those qualities that appeal to us despite our desire otherwise. Similarly, “loathing” points to those occasions where we “know it is good for us” and yet we want nothing but to negate, destroy, be done with it. These affects are perhaps most familiar in the sexual arena, where fascination may lure us (not again!) into yet another horrible relationship with a person we know is no good for us, while loathing may drive us to betray, break with, or call it quits with someone who otherwise seems to be perfectly suitable as a mate, spouse, &c. Minimally, then, we may see that the hyperbinary love/fear (and its expansion above) points to at least four more salient explanatory terms that “get more work done” and explain more empirically lived human experience.

The disservice done to the concept of the “unknown” here involves its misprision as something knowable (to me) in the first place. If we take the unknown seriously, as for example Lem takes seriously the notion of what constitutes the alien (particularly in his Fiasco and His Master’s Voice, but also indirectly in his presentation of robots in Return from the Stars), then it is not possible for me to fear it, or have any affective response to it whatsoever. Following that distinction I picked up from somewhere, fear requires an object; anxiety does not. This makes “fear of the unknown” rather “the experience of fear (or some other affect) in the presence of something as yet still unknown”. So, ignoring these various more solid groundings, what Canetti’s sentence seems more to point to is: “man dislikes nothing more than unwelcome intrusions.” He wants to see what intrusions are on their way into his orbit and be able to decide in advance to accept or decline the visitor. The fact that Canetti’s mind turns to the example of a robber in one’s house points to this.

With respect to, “In the dark, the fear of an unexpected touch can mount to panic,” we remain in the domain of anxiety, not fear. Women will tell us another story, but men may walk blithely down a dark alley or into a silent wood with an easy heart, so that it is only by the presence of a sensation (either in the environment or in one’s imagination)—a sound, a shadow, a fleeting thought, “here there be dragons”—that ease will turn to anxiety as we objectlessly imagine the altogether too imaginable sources for that sound, shadow, or rationale for the thought. But again, there are people for whom this would be positively exhilarating, loathsome, or fascinating, &c. Again, fear takes an object and, in general, is a wholly rational and sensible response to actually present dangers. The maniac brandishing a knife in the alley, the wild animal rearing up in the forest—these are sources of fear, so there is no question of these touches being “unknown”; quite the opposite, they are altogether too clear. It is certainly true that “in the dark, the fear of an unexpected touch can mount to panic,” but this is by (anxiously or dreadfully) imagining touches (like a murderer’s hand over our mouth) that are already known.

When Canetti insists “man wants to see what is reaching toward him, and to be able to recognize or at least classify it,” he is emphasizing the disjunction between the “unknown” and the “imaginable” that undercuts what he is writing. By definition, one can only misrecognize or misclassify the unknown, if it is noticed at all. We do, however, have one category for misrecognizing something while still trying to maintain its character as unknown: the “unknown”. It is likely this is really only “the unfamiliar” but nevertheless, one can take an attitude toward “the alien”, the “numinous”, or “the unknown” and label it as such. As meaning-making entities, we can never not “see” what is reaching toward us; our reflecting consciousness can never avoid (except by death, perhaps coma, or derangement so severe that the reflecting person disappears from consciousness) making sense of some kind of what we sense.

So again, it is less “the touch of the unknown” and more “the presence of unwelcome intrusions”. The adjective “unwelcome” is key here, and literally unlocks what Canetti seems to be getting at, because if the presence has not already been constructed in such a way that it is unwelcome, then one would either not notice the intrusion at all (even in a numinous sense) or wouldn’t fear (the presence of) it. This requires some kind of construction or identification of the intrusion (a priori even), so it is not a question of being unknown.  The descriptor, rather, is that it is unidentified, and (because it is unidentified) may default to unwelcome—at least in those cases where intrusion is not desired. Canetti is candid enough when he admits the jostle of someone attractive is another thing altogether.

So it is not that man “wants to see what is reaching for him”; rather, we each will see what is reaching for us, no matter where we direct our gaze. We may overlook one thing by looking away, but our gaze will then be met by whatever reaches from where we look, and even if we squint our eyes shut, then darkness, inner images, or ectopic phenomena will reach for us. We can negate the significance of anything our gaze falls upon as well, but we can only do so because we have already been “imposed upon” by sensing—our freedom is (at least in potential) in what sense we make of what we sense, but we are not free to determine or deny that we sense. As self-aware beings, I suspect that this is an existential human need on par with eating, drinking, sleeping, etc.

“Man always tends to avoid physical contact with anything strange.” Even in the context of Canetti’s paragraph, this is an ambivalent remark, signaled especially in the “always tends”—a generalization and subjunctive hedging all at once. That is, one can only “tend”; to “always tend” belies a (justly) vacillating mind, I say. Canetti knows (perhaps even in himself) that man does not “always avoid physical contact with anything strange”. I submit that that was probably the original of the sentence (“man always avoids contact with anything strange”), but this overtaxed even Canetti’s capacity for overgeneralization, and so (perhaps all at once, perhaps gradually in several revisions) the words “tends” and “physical” were added—he’d lost his nerve a bit after “There is nothing man fears more than the touch of the unknown.”

Let’s allow this could be a fault of the translation, or maybe even the translator herself losing her nerve in the face of the text’s overgeneralizations. If it is not already clear, let it be clear now and henceforth: when I say “Canetti” I cannot mean the once-living existential being who is imputed to have authored this text; I can mean only whatever sense I make of what seems to be evident through that text, not for the purpose of defaming the author, but for addressing the text. Second, lovers of Canetti might defend some point or sense in the original; one can say that the translator did a hack job here or there, and so forth. Be that all as it may, the myth that there is some “correct” text out there in lieu of this one solves nothing; there are, at most, only alternative texts, and critical opinion and fashion may place the tiara on this one or that one. Meanwhile, the publishers of both texts will continue to publish their texts, and the sense derivable in either will continue to be inputs to people’s thinking about these matters.

So I propose the thesis that the original of “Man always tends to avoid physical contact with anything strange” was “Man always avoids contact with anything strange” in order to better understand the text, to better get at the underpinnings that Canetti is laying out. For example, by allowing that “man always tends to avoid,” there is the acknowledgment that man sometimes tends not to avoid anything strange. We already have Canetti’s candor about attractive people, but I have pointed out above how the fear/love hyperbinary ignores cases of fascination (as attraction to things that are repugnant).[6] That is, the text’s admission of an “always tends” may be pointing precisely at cases of fascination.

What does the word “physical” (“physical contact with anything strange”) add then? On the face of it, this opens the door to man sometimes tending not to avoid spiritual or emotional contact with anything strange. It also places a particular emphasis on somatic repugnance that ties in with the jostling of bodies Canetti generally dwells upon in this section. It is difficult to ignore how this attaches to the protagonist of Canetti’s Auto-da-Fé, who walled himself in with books and wound up in a disastrous apocalypse by pursuing a physical liaison with a fascinating/repugnant woman (that is the book’s description of her, not mine). As Canetti uses the (hand-like) phrase “reaching toward” to describe the unknown, here particularly the sense is (in this “physical contact with anything strange”) one’s hand reaching out to touch that something strange.[7] But also, if one avoids physical contact with anything strange, an inference (perhaps necessary if not yet sufficient) is that one could also both (1) seek physical contact with anything familiar (love, in hyperbinary terms) and also (2) avoid physical contact with anything familiar (loathing).

So we have once again another expanded hyperbinary: familiar, strange, strangely familiar, and familiarly strange.

The switch from “unknown” to “strange” signals, I will say, the commitment “at work” in Canetti’s text. The strange is definitively not the unknown, though it’s easy enough to elide them. The unknown is something unknown; the strange is something carrying markers explicitly other than my experience. The former, if I take the notion seriously, offers me no category into which I may place it; the latter distinguishes itself as not a member of any category I already know. The former demands a kind of paradigm shift in my thinking to accommodate it—if I don’t simply misinterpret it into an available category; the latter requires me to make room in my existing thinking for this difference—if I don’t simply deny it presence in my thinking (or social world). One can liken these to the descriptions of adaptation prevalent in Piagetian cognitive psychology: assimilation and accommodation. In the case of assimilation (parallel here to the “strange”), complex but relatively familiar or unfamiliar objects or experiences are simplified to fit preexisting categories in one’s experience and thinking. In the case of accommodation (parallel here to the “unknown”), the structure of one’s cognition must alter in order to fit the realities of objects or experiences.

Thus, this substitution of “strange” for “unknown” is not an innocent move at all, but this is evident socially as well. If I encounter a human being who in manner, language, and appearance is essentially unknown to me, I can attempt to adjust my categories of “what is human” in light of my new experience or I can misinterpret all of the differences that I am noting as being essentially “the same” as what I already understand. Reactionary notions of “race-blindness” fit into this pattern, for obviously negative social ends. By contrast, if I encounter this same human being, I can construe them as strange and attempt to incorporate their apparent similarities vis-à-vis “what is human” into my preexisting categories or I can work myself up into xenophobic reaction to that strangeness. The issue here is less how I might react than the fact that these two basic categories of reaction are endemic and familiar enough. And since assimilation and accommodation themselves may be construed as a hyperbinary, one may see also the two categories of an assimilation of accommodation (i.e., the Devil’s cultural bargain of assimilation in general; the reduction of all human difference into the main category of dominating hegemony) or the accommodation of assimilation (i.e., the exoticisation of the Other, as in Orientalism, the Noble Savage, and the like).

These larger social issues notwithstanding, here the elision in the text from “unknown” to “strange”[8] is actually a crucial moment. In the “problematic” versions of accommodation and assimilation: (1) if accommodation implies adjusting to the given object or experience, then the familiar differences that objects or experiences present become the focus of attention with an aim to incorporating them into thinking; and (2) where assimilation implies the adjusting of the given object or experience, then the unfamiliar similarities that the object or experience presents become the basis of strangeness (and xenophobia). These versions are problematic because the former “misses” the actual nature of the object or experience while the latter denies the existence of the object or experience. The apparent familiarity of the former allows us to misconstrue it (albeit in a “friendly” way) as recognizable, while the apparently unfamiliar similarities of the latter allow us to misconstrue it (unfortunately in an “unfriendly” way) as strange—as “not recognizable”, as something that society actually “cannot recognize”. The dangers of the former include paternalism, Orientalism, &c; the dangers of the latter include marginalization, ostracizing, and genocide. Just to finish the thought, the “unproblematic” versions of accommodation and assimilation might be termed “learning” and “wisdom” respectively.

For brevity, I am going to refer to the “unknown” as synonymous with problematic accommodation (the “friendly” construing of apparently familiar differences) and “strange” as synonymous with problematic assimilation (the “unfriendly” construing of apparently unfamiliar similarities). Socially, the way that calling someone or a people “strange” (as socially not recognizable) leads to marginalization, ostracizing, genocide, &c., is clear enough as a social negative. The opposite insistence of someone or a people as “unknown” may be less immediately obvious as a problem, even after mentioning Orientalism, &c. It is salient how gay activism shifted from the early-80s claim “we are everywhere” to the current demand for marriage, which is simply the assimilationist demand (or claim), “we are you.” Any number of critiques of gay marriage precisely on this ground are offered by activists on behalf of non-heteronormative values. I suggest that this is a shift from a sense of being perceived by US culture as “strange” to being perceived as “unknown”. (There are, of course, any number of self-elected pundits who continue to use the dominant club of “strange” to browbeat community people who are nonheterosexually identified.) If in the 80s (and earlier) the equation of pedophilia and homosexuality was widespread enough to require answering,[9] the current laudable desire for necessary legal recognition of people who are not heterosexually identified is precisely a kind of “different but equal” discourse. Note the “inevitable” appearance of the word “recognition” there, because what is at stake (in terms of the descriptive social discourse about the issue) is precisely the shift from the not-recognizable reality of “homosexuality” in the past as opposed to the (now possible) recognition of “homosexuality” currently. The worried critique of this is that this recognition comes only at the price of the Devil’s bargain of cultural assimilation.

I pick this example because it’s more politically stomachable. Where the designation of someone or a people as “unknown” really shows its teeth is in how people discourse about (particularly not physically present) Others (i.e., in the Middle East, in Africa, in Asia). The emphasis here is on the “friendliness” of the discourse. Under the discourse of “strange” one can denounce lesbians, Jews, and immigrants within one’s culture as destroying it; that at least was who Juvenal blamed some 1800 years ago, showing that the list of usual suspects hasn’t changed much. But when it comes to colonizing, it is as useful to construe people as “unknown” as “strange” (i.e., Terrorists, “out to destroy our way of life”). LGBT activists (if it’s correct to call them that) in the Middle East have asked that the “help” offered by the Gay International cease, for the various problems it introduces. (This is not everywhere the case of course.) But one can tease out all kinds of problematics from these “friendly” insistences on “unknown”. Domestically, the fascination with (i.e., the cultural construction of) Black penises and octoroon mistresses is emblematic of this. Our enthusiasm for the Arab Spring is, of course, predicated on the paternalism of finally “enabling democracy” (amongst ungovernable Arab tribes). &c.

So I am not ignoring at all the problem of the “unknown”. If the bloodshed and violence in the Soviet Union was predicated (in part) on a class distinction that made non-proletarians “strange,” then the bloodbaths, disappearances, and widespread social destruction that occurred in South America[10] through neoliberalism proceeded (again, at least in part) by that same conceit that we now see in the Middle East as “exporting freedom.” In this sense, both “the unknown” and “the strange” can equally have devastating social consequences.

Only because Canetti himself shifts to the use of “strange” will I then emphasize the importance of resisting the desire to make the unfamiliar repugnant. Here, I am no longer using “unknown” or “strange” in the way I just did. Canetti’s shift from “unknown” to “strange” discloses (I claim) the actual foundation from which Canetti’s argument proceeds. Only indirectly, if ever (so far), would he have meant “unknown” in the (problematic) sense I mentioned. His emphasis of “physical contact” (the avoidance of “sticking your hands in something”) is emblematic. That he would shift the “unknown” (in an unproblematic sense) to the “strange” belies the recreation of tendencies toward marginalization, ostracizing, and genocide, and it is for that reason that I am particularly emphasizing resistance to that.

This emphasis on “physical contact” helps to illuminate Canetti’s “solution” to the fear of touch: the press of the crowd. The absent part of the equation here is, precisely, that such contact is “welcome”. Someone who is in a crowd and stays in a crowd (as Canetti describes it) self-evidently “welcomes” that presence, whether because it is finally a welcome touch (of the unknown or the known), because an unbearable prison of self is finally paroled into the mass-mind of the crowd, or whatnot. Just as in the mosh pit, all of the hard physical jostling and smashing together is, precisely, a “welcome intrusion” (in fact, an even specifically sought one). In part, this is precisely based on construing everyone present (whether in the mosh pit or the crowd) as having a similarity of purpose—in other words, one can pretend everyone is the same. Instead of becoming awash in an uncanny strangeness of others, there is the warm fuzzy of Gemütlichkeit (or even collective Schadenfreude). It is precisely this anti-strangeness that is the converse of the repugnance described when, walking on the street, bodies collide. Except insofar as one is free to assume another’s intentions, it appears that Canetti cannot impute similarity, or put attention on it, in a sidewalk setting—there, the somatic bump of one body against another must be an unwelcome intrusion.

It is worth noting that, under the notion of karma, one can arrive at the conclusion (one can arrive there by other concepts as well) that whomever one meets, perhaps even in the most passing of ways, is not a stranger; this is someone you have known in previous lives, there is a preexisting relationship (perhaps blissful, perhaps strained, but a relationship nevertheless). The point is not that such a person cannot be odd; it is rather that they cannot be a stranger. It is without a doubt swimming against the massively rushing current of hyperindividualism in the United States to suggest that we might benefit from not assuming everyone we meet (or see in public) is a stranger (is an unwelcome intrusion). A major trend of technology (portable music players and cell phones in particular) has allowed us to “carry our private world” into the public domain, more or less as a kind of boundary or bubble. It is not simply that we are all on our way to be doing this or that and cannot afford to stop to chat, but more that the trend is toward colonizing the public with our private worlds. There are limits to this, obviously. Etiquette has (spontaneously) developed that we tend to go outside when we get cell phone calls, but it takes being a bit more than simply an asshole to tell one’s friends around the table when out eating not to answer their phone or text, &c. One could argue that the “invasion of the public” into one’s home (through the Internet, social media) is a parallel move, but not quite. I may complain that the TV’s news (or the Internet) fills my private world with terrible stuff, but I’m still in control insofar as I can turn it off, ignore it, etc. Whatever (attempted) colonization of my soul I wanted to claim as going on, I’m complicit in it in some way. But when someone denies me their public self (by wearing headphones in public), the mutuality of the social setting is fucked up.
It would be received as really weird to ask such a person to “be present to me in this social world we share.” If someone were listening to loud music on a laptop without headphones or if they were carrying on in some kind of awful way, it would be much more normal seeming for me to ask them, for the sake of the social world, to tone it down, but our courage to stick up for the public this way is heavily undermined. Or, not our courage, but rather the social feasibility of it—the chances of not seeming off your rocker for making such a request.

Obviously more could be said on this point, but in particular Canetti’s shift to the word “strange” and the way that the crowd (for all he claims) functions primarily as a “welcome” form of touch (in distinction to the anti-social construction of “unwelcome” touch) is one of the ways that a cynic can leverage this text. Insofar as cynicism (cf. Barabash, above) is reactionary, this particular way of construing the social world (not necessarily the people in it) as unwelcome individually and welcome when I can imagine we are all the same is obviously an argument (whatever Canetti intends) that is good for the status quo. The less that people are people, the more manipulable they are. As Jung insists:

As the individual is not just a single, separate being, but by his very existence presupposes a collective relationship, it follows that the process of individuation must lead to more intense and broader collective relationships and not to isolation … (Psychological Types, ¶758)

A norm serves no purpose when it possesses absolute validity. A real conflict with the collective norm arises only when the individual way is raised to a norm, which is the aim of extreme individualism … The more a man’s life is shaped by the collective norm, the greater is his individual immorality (ibid, ¶761).

The construal of contact between individuals in the public sphere as “unwelcome” is a sign of the individual way being raised to a norm; so that the collective norm of a crowd is offered as a remedy (as “welcome”) because then not only is social life negated by giving absolute validity to a norm, but also individual immorality spikes. One can see this particularly in the degree of entitlement one encounters in public, with its implicit assumption that whatever is good is whatever I can get away with.


[1] All quotations are from Canetti, E. (1981). Crowds and Power (trans. Carol Stewart), 6th printing. New York, NY: Noonday Press. (paperback).

[2] I want to be clear, I’m dwelling on Canetti’s opening paragraph because his section (“The Fear of Being Touched”) has two main moments: providing discourse for the opening assertion and then characterizing its solution or opposite (that man overcomes his fear of being touched “in the crowd”). To the extent that the framing of a problem implies its solution, one will be able to infer in advance some rejoinders to Canetti’s opening salvo, but the details are still worth pursuing.

[3] Barabash, Y. (1977). Aesthetics and poetics. Moscow: Progress Publishers.

[4] See Mann, T. (1955), Gesammelte Werke (vol. 12). Berlin: Band, pp. 828, 830–1.

[5] It is not the purpose of this post to describe or characterize this truth-class; it is enough simply to note that it is not difficult to discern here.

[6] It necessarily oversimplifies things to rely upon the hyperbinary of attractive/repugnant, just as love/fear itself is overly simplifying. Etymologically: fascinate (v.)

1590s, “bewitch, enchant,” from M.Fr. fasciner (14c.), from L. fascinatus, pp. of fascinare “bewitch, enchant, fascinate,” from fascinus “spell, witchcraft,” of uncertain origin. Possibly from Gk. baskanos “bewitcher, sorcerer,” with form influenced by L. fari “speak” (see fame). The Greek word may be from a Thracian equivalent of Gk. phaskein “to say;” cf. also enchant, and Ger. besprechen “to charm,” from sprechen “to speak.” Earliest used of witches and of serpents, who were said to be able to cast a spell by a look that rendered one unable to move or resist. Sense of “delight, attract” is first recorded 1815.

Historically speaking, it is pertinent that the sense of “delight, attract” is only from the 19th century forward, and that fascination previously signaled something that “rendered one unable to move or resist”—Medusa must be a classical expression of this. All the same, the useful and marvelous specifics of this need not ultimately limit the range of examples for the hyperbinary expansion (fear, love, love of fear or fascination, fear of love or loathing). One could make similar remarks about welcome/unwelcome (i.e., those seemingly unwelcome visits that turn out to be highly fortuitous, and those seemingly welcome visits that are finally exposed as terrible. In the romantic domain, two immediate images are highly illustrative: when a mere friend comes to visit and an unforeseeable night in bed occurs, or the joyous arrival of one’s spouse-to-be, only to be told they are breaking up). So if I resort again and again to “attractive” and “repugnant” as descriptors in examples, it is vehemently against the notion—more frequently encountered, especially in the love/fear hyperbinary—that it is the only or even a necessary contrast.

[7] Canetti devotes other sections of his book specifically to hands and fingers, so this may not be an overreading.

[8] Here again, objections about translations are moot. Repeating the paragraph from before: Let’s allow that this could be a fault of the translation, or maybe even of the translator herself losing her nerve in the face of the text’s overgeneralizations. If it is not already clear, let it be clear now and henceforth: when I say “Canetti” I cannot mean the once-living existential being who is imputed to have authored this text; I can mean only whatever sense I make of what seems to be evident through that text, not for the purpose of defaming the author, but for addressing the text. Second, lovers of Canetti might defend some point or sense in the original; one can say that the translator did a hack job here or there, and so forth. Be that all as it may, the myth that there is some “correct” text out there in lieu of this one solves nothing; there are, at most, only alternative texts, and critical opinion and fashion may place the tiara on this one or that one. Meanwhile, the publishers of both texts will continue to publish their texts, and the sense derivable in either will continue to be inputs to people’s thinking about these matters.

[9] I heard a commentator report, not without scorn and not without referring to the landmark moment when the American Psychiatric Association depathologized homosexuality in its diagnostics manual, that pedophilia will similarly be depathologized in the DSM-V (due out May 2013). This remains to be seen; the current proposed language does not appear to delete this diagnosis. A tangentially related, but illustrative, issue may be seen here.

[10] I point to these examples principally to make clear how, as the heightening rhetoric against the Soviet Union in the Cold War advanced, the “Western world” was systematically destroying South America more aggressively than in previous eras under the bloody banner of neoliberalism. But neoliberalism has also wreaked devastation in the US and England—the accomplishment of Reagan and Thatcher was simply to manage the feat without as much overt bloodshed. That’s the claim, at least. The error in this claim is the supposed relative lack of violence. For instance, whatever extra-legal disappearances occurred under US-supported dictator Augusto Pinochet’s direction, the numbers pale in comparison to the “legal” disappearances that have led to mass incarceration (thanks to the Reagan Administration’s inauguration of the still-ongoing war on drugs). The plunge in literacy in Peru following the arrival of neoliberalism is at least offset by the rational decision of the Peruvian government to default on its onerous and socially destructive IMF loans. In the United States, the plunging literacy rate is met by calls for even higher standards that assure even more students will be resegregated (socially) and likely funneled into prison. In the calculus of viciousness, the thuggery of merely beating me over the head and dragging me off to shoot me seems at least less duplicitous and disingenuous than creating a social environment that funnels me on a slope almost inevitably toward failure—all the while blaming me for it.

The intent of this series is ambitiously to address, section by section over the course of a year, the celebrated Crowds and Power by Elias Canetti[1]. This is the third entry in the series, and the first that addresses Canetti’s Part 1 (The Crowd), which begins:

There is nothing that man fears more than the touch of the unknown. He wants to see what is reaching toward him, and to be able to recognize or at least classify it. Man always tends to avoid physical contact with anything strange. In the dark, the fear of an unexpected touch can mount to panic. Even clothes give insufficient security: it is easy to tear them and pierce through to the naked, smooth, defenceless flesh of the victim. ¶ All the distances which men create round themselves are dictated by this fear (15, emphasis in original).

My intention is not ever and always to reproduce the full text of Canetti’s book, but here, at the very outset of the text, some key points bear noting. With respect to the first sentence, for instance— “There is nothing that man fears more than the touch of the unknown”—I can readily imagine how the rhetorical function of generalization here may serve (by accident or design) to attract (by an appearance or actuality of profundity, truth, insight; i.e., precisely that kind of generalization Canetti is said to drop in the reader’s lap, leaving him or her to decide if it is true or not) or to repulse (by an appearance or actuality of crassness, inaccuracy, short-sightedness, that has the further consequence of eliminating that reader as a reader; i.e., the reader is nonplussed by the stupidity of the book and tosses it aside).

The significance of this kind of generalization, then, is not whether or not it might be true (or in what way it could be true) but rather to self-select the desired reader of the text (and eliminate the undesired reader). This may be deliberate or merely consequential. For it is obvious enough that one could, with equal seeming veracity, insist that there is nothing that man desires more than the touch of the unknown; or even deeper still, that whether a man fears or desires the touch of the unknown, these are but two sides of the same coin. What I notice, then, is not merely how this lends itself to becoming a shameless and unreflective preaching to the converted (e.g., the self-avowed cynic examined in the first post in this series), but how it more or less ensures that those who most stand to lose from the promotion of this point of view in society will tend to feel irked or disinvited or impelled to opt out of listening to, and so participating in, this dialogue.

What is most charming (or insidious) about this is how it effectively shifts blame onto the reader for “abandoning” the discussion. Canetti (or arguments in the same form he presents) is simply making a point; it’s your fault for not hearing him out. But this is an inaccurate statement. This kind of generalization, perhaps even in its unintentional forms, precludes countervailing dialogue—except that someone (psychologically quoting Alyosha from Dostoevsky’s Brothers Karamazov) simply says, “That can’t be so,” which (by definition) hardly counts as a dialogue. Imagine a monotheist (Judeo-Christian or Islamic) and an atheist having a conversation—the monotheist begins with the usual salvo, presuming that some kind of singular deity exists, &c. The atheist might be a good sport and go along with this dubious (or, from the atheist’s point of view, false) thesis, in order to demonstrate the internal fallacies of the monotheist’s arguments, but from the atheist’s point of view, the entire endeavor is masturbatory at best. Or perhaps one is trying to converse with a eugenicist, who wants to begin with the premise that Africans, or Jews, or Arabs, or Greeks, or any other so-called race of people is inferior to another. Or one can be in the presence of an (unintentionally ignorant) educator who, not knowing the history of eugenics lurking in the background of both IQ and standardized testing, wants to discuss the necessity of standardized testing.

The point here is that the premise of the would-be dialogue is fucked up. I’m not describing or advocating some blanket refusal to engage in dialogue—that seems to have been the strategy of the Right in the US, pursued particularly aggressively since 9/11—but rather noting that certain premises, from the outset, are not socially tenable. The inferiority of “other races” is not a conversation anyone should have to take seriously—although there are obviously enormous socioeconomic forces at work to dignify this overt and covert bigotry. The non-ambiguity of this particular example makes it seem inapposite when applied to the premise “a monotheist god exists,” but in fact very few people actually want to live under a theocracy in the US. Putting this point in a generalized way, the question is whether the premise under examination and the consequences of it predominating are desirable—which naturally demands we also get down to the brass tacks of determining our criteria for what constitutes desirability in society, but that is for other essays. Meanwhile, are the premises of IQ and standardized testing desirable in our culture? Is the premise of inferior races desirable in our culture? Is the premise of intolerant monotheism desirable in our culture? Is the premise that only an elite cadre of powerbrokers should be permitted access to answering these questions desirable?

Or, in a theoretically more modest guise, is the premise that man fears nothing more than the touch of the unknown desirable in our culture? For the purpose of this series, one has to at least allow that perhaps Canetti answers “yes.”

I want to remind any readers, I’m not interested in psychologizing about Canetti’s reasons for writing what he wrote. And in general, I intend to make myself deliberately blind to “other texts” and whatnot that might add or detract from the specific meaning of this text. So the discursion below will be rare, and is included solely for the help of contextualizing what we encounter here at the very outset of Canetti’s book. Specifically, in Auto-da-Fé (1935), Canetti’s protagonist notes, “You draw closer to truth by shutting yourself off from mankind” (15).

No mind ever grew fat on a diet of novels. The pleasure which they occasionally offer is all too heavily paid for: they undermine the finest characters. They teach us to think ourselves into other men’s places. Thus we acquire a taste for change. The personality becomes dissolved in pleasing figments of imagination. The reader learns to understand every point of view. Willingly he yields himself to the pursuit of other people’s goals and loses sight of his own. Novels are so many wedges which the novelist, an actor with his pen, inserts into the closed personality of the reader. The better he calculates the size of the wedge and the strength of the resistance, so much the more completely does he crack open the personality of his victim. ¶ Novels should be prohibited by the State.

It is no challenge to place an “=” between the ostensibly literary “You draw closer to truth by shutting yourself off from mankind” (Auto-da-Fé, 15) and the seemingly non-ironic, non-literary (i.e., discursive) “There is nothing that man fears more than the touch of the unknown” (Crowds and Power, 15).  Nor is it any great leap here to read out of this a neurotic phobia of contact—this has absolutely nothing to do with whatever good or poor reasons Canetti felt, believed, or thought he had for generalizing his experience or his observations of the world and everything to do with whether we should assent to that generalization. This is extra-factual, since Canetti is quite dead and it is a particular variety of social discourse (and publisher book sales) that keep this assertion in social circulation. And if it is true that man fears nothing more than the touch of the unknown, then one arguable consequence of this is we are condemned to the status quo, since we (or, more precisely, our handlers in the domain of the social) must then (1) go out of our way to avoid anything genuinely new as an alternative to the current untenable present, and (2) whatever “unknown” (or new) does come along will be treated in the mode of fear, rather than excitement, &c. In this way, Canetti’s text (by no means the only input to this problem) reinforces the walls of the oppressive labyrinth in which we currently find ourselves (even those who think they are most free).

In the first four paragraphs (of the section titled The Fear of Being Touched), assertions point to how the fear of being touched manifests. In public, our repugnance in being touched can be overcome if we are attracted to someone—so already a covert eros comes to the fore in the argument. What becomes immediately clear is that it is not the touch of the unknown, after all, but apparently the touch of the altogether too imaginable that Canetti’s text describes. He points to the “promptness of apology” we offer if we bump into someone on the street. If such contact is already eroticized, then the repugnance has something of a rational base—a man brushing against me amounts to solicitation in this context. But, at the risk of seeming cheeky, such touchiness of interpretation does not align with our reasons for offering an apology. It is not merely a fantasy or piece of sublimation that one is sorry for colliding with another; it is not at all, everywhere and always the case, that such apologies are offered to avoid getting into an argument or physical altercation with someone.

What I feel must be resisted is the tendency here to insist on a singular explanation for things. Berlin, in Russian Thinkers, has written an extensive essay (whose point Aileen Kelly’s introductory remarks beautifully formulate) that explores why pluralism (as opposed to monism) remains so difficult for human (or at least Western) cultures. It seems to hinge less on a claimed exclusivity of truth per se and more on the social enforcement of that exclusivity—and the 20th century (to say nothing of other eras, including the present) continues to bear witness to the conflagrations that result.

But in the present case, it is not only an insistence on a single meaning (to the apology offered for bumping into someone), but the particularly grossly reductive quality of this explanation.

The promptness with which apology is offered for an unintentional contact, the tension with which it is awaited, our violent and sometimes even physical reaction when it is not forthcoming, the antipathy and hatred we feel for the offender, even when we cannot be certain who it is—the whole knot of shifting and intensely sensitive reactions to an alien touch—proves that we are dealing here with a human propensity as deep-seated as it is alert and insidious; something which never leaves a man when he has once established the boundaries of his personality (15).

It won’t be stretching things to say this is stretching things, and it becomes child’s-play to see the temptation to resort to cheap psychologizing here. But again, if Canetti is merely neurotic, then why publish the book in several editions? I say it remains in circulation not because these words are “maybe even true,” but because they help to bulwark the status quo in a way that paralyzes resistance to the status quo.

A person might apologize out of a genuine sense of remorse for the collision; they might apologize out of a desire to avoid violence from giving someone offense; they may apologize out of a sense of something deep-seated and alert, &c. Nor must we reduce these three reasons to the “root” one (the deep-seated antipathy, &c). One can say—and any number of grumpy social critics, including Canetti, can insist—that the polite gesture of remorse is nothing but a sublimated or well-trained (civilized) avoidance of the older (say, Renaissance-era) punctiliousness about one’s reputation. But we needn’t only go back in time for these things. In prisons, one may often need to be careful about who one jostles, because it is a matter of face, of reputation. Inmates may frequently, and with good cause, be touchy and sensitive to any possible gesture by another to “make him look like a punk”—as someone who can be jostled. Not paying attention to these things can have dire consequences; more pertinently, that the inmate can imagine they may (sooner or later) have dire consequences further drives their touchiness about being jostled.

The reality of this points not to its generality, but rather its social specificity. In other domains, the threat of violence or the consequences of jostling are either not in play or do not rise to the level of social significance. If someone in a general social situation is so inept as to run into me, there is less at stake in terms of my reputation—it may be I nearly spilled my coffee, my sentence was interrupted, I experience some small pain somewhere. As with an inept display of driving, my reaction is a negative judgment about the competency of the other person—nothing like hatred ever enters into it. Only with a second, third, or fourth repetition of the same “clumsy” gesture would I begin to suspect that someone was fucking with me deliberately.

So, to be more precise, we might imagine that someone jostled will respond in whatever way they respond, and perhaps the range of those responses might be grouped into classes of response. Even in prisons, not everyone will or needs to be touchy about reputation—a simple, “Sorry, wood,” might do, &c. In any number of neighborhoods around the world (i.e., outside of prisons), face-saving is close on the cuff of jostling, &c. And in other locales, an aversion to contact may make it seem like the root of touchiness in general. So, what makes untenable the attempt to reduce a promptness of apology to something like “a human propensity as deep-seated as it is alert and insidious” is the failure of this premise to account for lived experience. Like any other monistic explanation, it applies where it applies, and distorts when it is misapplied where it does not apply. Jung makes the same point about the misapplication of Freudian and Adlerian psychologies (both of them monistic systems) where they do not apply.

Ultimately, all monistic gestures negate significance. They leave us without choices (except to rebel); they deny alternatives (and thus prove reactionary and supportive of the status quo).

Monistic framings of problems lead to monistic (one-sided) solutions. Thus, as man fears nothing more than the touch of the unknown, it “is only in a crowd that man can become free of this fear of being touched” (15).

It needs to be said: there may seem to be an element of bad faith in “picking on” Canetti’s text for its generalizing words (i.e., the “only” of “is only in a crowd that man can become free of this fear of being touched”). However, notwithstanding that an author may be held responsible for what is written, the dominant note in the text is precisely these kinds of continuous overstatements. It seems a piece of particularly wooden-headed ignorance to start with the generalization “there is nothing that man fears more than the touch of the unknown” when a moment’s reflection suggests the alternative “there is nothing that a human fears more than what she or he most fears”. And if that seems a tautology, one may recall von Foerster’s quip that he would demonstrate just how much work one could do with a tautology. (One could invoke Wittgenstein too, but I’d sooner not.)  In the marvelous The Naked Civil Servant, the following dialogue (recomposed from memory) occurs:

                Woman to Quentin Crisp: “Oh Quentin, I love you.”

                Quentin to Woman: “Oh, but I’m afraid I only love men.”

                Woman (laughing) to Quentin: “That doesn’t matter.”

                Quentin to Woman: “It matters to those for whom it matters.”

The insistence that there can only be one explanation (the Truth) and all the rest doesn’t matter is a fine piece of propaganda for damning people to oppression. At a minimum, Jung was able to imagine at least 8 classes of viewpoints; the Jains acknowledge that all truth is partial, so let’s not fight about it (and that truth manifests, in its partial forms, in seven ways). And do I really have to say, “It’s not that Jung or the Jains are right and everyone else is wrong?” It is, rather, that empirical observation of lived experience itself makes obvious that multiple classes of viewpoints are identifiable in culture and that oppression results from the (usually violent) imposition of one of those views to the detriment of all others.

So even if Canetti’s text could be excused these overstatements on the grounds that he doesn’t, in some way, really mean them as generalizations, their social use does nothing to foster a cultural discourse in which classes of points of view refrain from extirpating, silencing, or slandering other points of view.

Of necessity, this means (for this series) attempting to bracket the class of Truth embodied in the text. In the crowd, for instance, a person becomes identified with everyone else; it brings about “the reversal of the fear of being touched …. The feeling of relief is most striking where the density of the crowd is greatest” (16). The euphoria (or relief) that is being expressed here is not consonant with the cynicism marked by the Amazon reviewer.  That reviewer insists that Canetti’s text provides “a powerful and plausible view of life that you’re going to have to put out of your mind the next time you find yourself at a party, in the office, or in a crowded theater—well, really anywhere you find yourself confronted with other people. You see, they all have one driving passion: to survive you.” Whatever else the text might claim elsewhere, here there is seemingly solace (relief) to be found, and maximally, in the maximally condensed crowd. There is precisely no question of individual survival here, as all have become one, as it were—and the next few sections will suggest that in fact it is the crowd that wishes to survive, not anyone in it. Thus, while the affect of antipathy and hatred the text surveys may seem to be a similar basis for the misanthropy the Amazon reviewer boasts of, it seems rather that the two points of view do not, actually, share the same truth-class.

But stepping back from this, it can be further noted—again by the least moment of reflection—that even those who long for the touch of the unknown do not ever and always answer that longing by surrendering to the crowd. Because the text’s generalizations are so overstated, a single counter-example frequently suffices to dissolve them. One can easily imagine the person who, swamped by a crowd, has no desire but to get the hell out of there and, at the very least, to the fringe—or one can have the vision of the bookworm, holding his books to his chest in a kind of apotropaion as people surge around him; or conversely the inspired punk, smashing about in the mosh pit with a liberated sense of freedom (if not literal ecstasy from the music), not at all subsumed into the crowd but exalted above it (perhaps even literally crowd surfing). Once again, the monism of Canetti’s text is distinctly fascistic.

What is striking in this section—the opening section—is its exaggerated one-sidedness. An absurd overstatement of revulsion at being touched is followed by an absurdly limited account of the reaction to being in crowds. One could talk about denial and repression here as a way to get at how the text could emerge like this—that in all of the touchiness of the repugnance and antipathy was really just the stone-faced or stoic denial of a more motive, inward desire to be touched, even crushed, as closely as possible to other people—a description familiar to the Underground Man, who for all of his sickness and spite confesses if you just give him a cookie and a pat on the head he’ll collapse in a jelly-heap of gratitude and sweetness.

This oscillation may clarify itself further along in the text. For now, in this, the text’s opening section, the fundamental point seemingly striven at is that the violations of one’s boundaries (as a personality) that occur in day-to-day interactions are relieved in the boundary-dissolving press of the crowd. The text gives the impression that it is in the crowd that relief occurs, so I’m inclined to read the “hatred and antipathy and repugnance” of the fear of being touched not necessarily as a positive statement (as one of the centers of this truth-class) but in a negative light, as the undesirable condition that (unfortunately) prevails typically outside the domain of a crowd.

It may be that such a description accords with your experience; it is the inaptness of that description for other people’s experiences that moves me to challenge the monism of this view—not to extinguish your view or experience. It is possible that Canetti’s description is ironic, in which case the truth-class he’s describing may be salvaged from the irony by removing the scum of distaste, repugnance, or irony. Or it may be that Canetti is ambivalent about his subject, in which case it becomes necessary to carefully sort out the contradictions in an attempt to find the statistical center from which the text “really” operates. Whatever nuances may come out of this, the vulgarization of those nuances that takes Canetti’s text as a cool well at which to refresh one’s cynicism is immune to such niceties; the use of the text to support the premise that we should hate our neighbors grounds the center from which I’m operating and resisting.

[1] All quotations are from Canetti, E. (1981). Crowds and Power (trans. Carol Stewart), 6th printing. New York, NY: Noonday Press (paperback).

This is part 2 of a two-part post. Part 1 is here.

(Here is the declaration for “superstition” I offered from part 1 of this post: “The essence of superstition is to believe something despite all knowledge otherwise”. The list of superstitions is:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I come out of the closet, people will reject me or hurt me.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I see The Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.)

People who may have elected not to go see The Dark Knight Rises due to the shooting in Aurora, Colorado are exhibiting a variation of the closeted homosexual’s thinking, and the fact that some of you may be scoffing at such people does not let you off the hook. You will act every bit as superstitiously as soon as the consequences are dire enough. Thanks to the media coverage, one would have to have been incognizant of the event to go see The Dark Knight Rises after the shooting and not have the (unbidden) thought, “What if someone shoots up the theater I’m in,” go through one’s head like a bullet. Echoing in the back of that thought is the foundational kernel of superstition, “Well, someone might.” This is why I say it should still be called a superstition.

But I want to dwell on this some more. First, it should be more and more obvious that there is some serious egotism at work in superstitious thinking. Apparently, I am so vitally important to the world that if I come out of the closet, simply everyone will continuously beleaguer me in every way possible; they’ll even go out of their way to do so. And if I walk under a ladder, the very metaphysical principle of the cosmos itself (if not actually “god”) will personally intervene to punish me for my temerity. In this case, “bad luck” is exposed as “sin” insofar as transgressing against a sin will have the utmost dire consequences imaginable[1]. So sin, as an arch piece of superstition, re-casts the actual nature of the world into one where the individual takes on the utmost significance—so that by committing a sin, the whole apparatus of reality must be involved in correcting it. So it is the self-same egotism at work in the (rather excited, thrilling) fear that going to see a movie will put you in the thick of it, like those in Aurora, Colorado. I’m thinking specifically less of those people who stayed home out of fear (probably not most people) and more about the ones who added some additional spice to their experience of the movie by left-handedly “hoping” they’d get caught in a disaster as well—a wish that would evaporate the very moment the first spritz from the gas canister sprayed into the auditorium. One might pick infantile, juvenile, or adolescent as a way to describe this part of superstitious thinking[2].

So, superstitious thinking has a strong streak of excessive egotism about it. In fact, just as the shift from a claim of possible to plausible involves the slippery substitution of one “reality-principle” for another (as when sliding a “god” into a “godless” cosmos), the substitution of the communally shared and lived world of people for one in which I am the most determinative and important factor points to one of the central ways that superstition can get its hooks into us. Current life in the US (just to limit the scope of my comments) is certainly alienating; that adolescent sense of hopefulness or wanting to save the world seems relentlessly disabused by simply observing the world, &c. Either because we remain emotionally regressive around this point (where life supposedly convinces us gradually of our insignificance) or because we really do live with a sense of that insignificance, superstitious thinking is an obvious (and obviously appealing) counterbalance.

In several places, Jung remarks to the effect that modern man thinks he’s ever so modern, but give him a bit of a smack and he turns back into the very living image of an other-possessed non-modern. With the view of superstition I am presenting here, self-honesty can only make us realize how frequently we get entrenched into superstitious thinking. And so we go to the movie, and in moments of distraction it might come back to us again that we could be killed at any moment. Or, what is more likely, as soon as we see someone in the theater who could serve as a plausible projection point for the idea “someone might shoot the place up,” the thought will manifest there. We can then laugh it off, and feel proud of ourselves for not being one of the mindless sheeple stampeded by fear, but the irrepressible recurrence of the thought “it might happen” is the most salient point of this dynamic, because it’s that recurrence that will eventually wear us down. It’s on this basis that the jackass insists, “There are no atheists in foxholes.” If that is supposed to be some kind of proof that religion is secretly our last resort, I’m inclined to see it as a terrible piece of (self-) humiliation that the atheist experiences. The terrible, brainless stupidity of “maybe it’s true” has finally (in a moment when he is most vulnerable) worn him down to his last nubbin, and (in desperation) he reaches out to a completely false, completely inefficacious “solution” that he himself finds humiliating and revolting to grasp. If that’s the Good News, spare me.

I fault no one for wanting to avoid a sense of insignificance in life. But insofar as that is a caricature of human life, so the idealization proffered by religion (and superstitious thinking in general) that places the person at the center of the universe (as its sole and determining force) is equally undesirable and simply the exaggerated obverse of the coin. Of course there’s all kinds of “drama” in the religious view, if one bothers to look—the spiritual warfare of Satan himself (rarely ever one of his minions) showing up to personally fuck with you; just as it is considered the height of affective appeal that the Almighty Supreme loves you. One wants to survive a face-to-face with the devil, just as one wants to survive a face-to-face with James Eagan Holmes (or any of his ilk)—and all of this presupposes the necessity of such a confrontation in order for significance to be generated.

And this is where we begin to encounter superstition on a social level that has important, undesirable consequences. If we don’t walk under a ladder because it might bring bad luck, if we wash our hands because we might get sick, if we stay in the closet because violence might be done to us, this is all of a piece with not doing anything when oppressive security forces start rounding people up to be taken to camps because “maybe the charges are real.” On a practical level, we might choose to believe that because the alternative is too frightening, just as we might try to find a “rational” explanation for random violence that happens to people we know.

It must be reemphasized: these patterns of superstitious thinking apply only where people indulge them, to be sure, but we all have our own patterns of thinking that will manipulate us just as surely as someone who believes in the power of black cats, germs, or homophobes. The fact that we have to talk down, manage, or ignore the recurrent argument “but maybe it’s true” in whatever context it appears for us individually points to the nexus of the issue socially.

One of the more boring and tedious tropes in criticism of the US is the notion of greed. Most assuredly, many people are seriously greedy, but the overwhelming majority of people in the United States are consuming in excess for the sake of other people, usually children. We can always blame breadwinners and capitalists for being merely greedy, for benefiting from whatever financial depredations they practice upon other people, and of course those depredations have personally satisfying consequences for them, but they are also often in the service of supporting (willingly or not) loved ones, other family members, friends, &c. I mention this because these sorts of affective bonds can play important roles in leveraging our submission to an “it might be true” (superstitious) argument.

As a first fact, hardly anyone (even in Aurora, Colorado) would have thought twice about seeing The Dark Knight Rises after 20 July 2012 if the story were not widely distributed in the media. Superstition being cultural, there must be cultural dissemination to have superstition—more precisely, to leverage our resistance to the “it might be true” arguments of lived experience. The case of this movie is an excellent one, because so few of us are willing (really) to seriously believe the notion that anyone will jump up and open fire in the very theater I’m watching Batman in. In other words, the perceived consequences (though dire) are undermined because they are simply not credible enough. It’s probably this same “risk calculus” that permits us to get into a car and drive anywhere despite the overwhelmingly greater danger[3]. At this point it may be becoming clear why I left out the adjective “rational” when I wrote, “The essence of superstition is to believe something despite all knowledge otherwise.” Just as we might disingenuously elide the possible into the plausible, we similarly like to elide the reasonable into the rational—where the reasonable is simply that which might be done, and the rational is that which might most plausibly or sensibly be done.
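The “risk calculus” gestured at here can be made concrete. A minimal sketch, using deliberately rough, order-of-magnitude figures (these are illustrative assumptions, not sourced statistics):

```python
# Rough, order-of-magnitude comparison of two annual per-capita risks
# in the US: dying in a traffic accident versus dying in a mass public
# shooting. All figures are illustrative approximations.
us_population = 320_000_000          # approximate
traffic_deaths_per_year = 35_000     # approximate annual figure
shooting_deaths_per_year = 100       # generous approximation

traffic_risk = traffic_deaths_per_year / us_population
shooting_risk = shooting_deaths_per_year / us_population

# Driving is vastly more dangerous, yet it is the movie theater that
# provokes the superstitious "it might be true."
print(f"annual traffic-death risk:  ~1 in {round(us_population / traffic_deaths_per_year):,}")
print(f"annual shooting-death risk: ~1 in {round(us_population / shooting_deaths_per_year):,}")
print(f"driving is roughly {round(traffic_risk / shooting_risk)}x more lethal per year")
```

Even if the figures are off by a factor of two or three in either direction, the disproportion is so large that it cannot be what drives the fear; the dissemination of the story does that.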

I probably wouldn’t dispute that it is reasonable to “exercise caution when driving, since there are dangers when driving that can be avoided,” but I’m not going to agree that driving cautiously is rationally necessary in order to avoid accidents. This is exactly the same illegitimate move as when hand-washers cite medical mechanisms to justify other grounds for behavior. We all know perfectly well that both attentive and inattentive driving avoid accidents, &c., and that driving cautiously does not guarantee avoiding accidents. Nor is anyone going to claim that’s what they’re claiming, except that that’s exactly what they’re claiming. Just as hand-washing reduces the risks of infection (or smoking increases your risk of cancer), those statistical predictions are (1) only statistical, and (2) say nothing specific about the specific circumstances one is specifically involved in. We drive cautiously to avoid having accidents, but cautious driving can only guarantee a lowered risk of being in accidents. In other words, cautious driving isn’t really doing what we are hoping it does, just as washing our hands does not, just as being in the closet does not, and just as throwing salt over our shoulder does not. So even in this apparently obvious case of the “dangers of driving,” we see more of the same kind of pseudo-rational superstitious thinking at work. And the underlying objection that I have to this is that it means we are doing things that do not get us to the ends we desire (e.g., driving cautiously to avoid accidents, washing our hands to avoid disease)[4].

So it’s partly for this reason I do not claim that superstitious thinking overthrows our “rational knowledge”. My years of incautious driving could be used as an argument against the rational claim that I should drive cautiously. In other words, my lived experience runs contrary to the rational assertion that cautious driving is the only way to go. Lest there be any illusions, if one wants to say that “cautious driving” is synonymous with “defensive driving,” then I’ve not been a defensive driver; in fact, I’ve very often been an offensive driver, and when I was still in college, I was an offensive driver in the pejorative sense of the word. Looking back, I could say that in a sense I relied upon the defensiveness of other drivers to allow me to be an often irresponsibly aggressive driver. I’ve since calmed down a great deal, but it’s perfectly obvious to me that my empirical experience is contrary to what might be called a “rational knowledge” about how one should drive. Similarly, my empirical experience of not hand-washing is definitely not what might be called the “rational knowledge” claimed about hand-washing. And I know I’m not the only one in this regard. So, if one can conclude that it is reasonable to drive incautiously or not wash one’s hands (or walk under ladders), this is an empirical refutation of the “rational knowledge” claimed under the guise of “if you don’t drive cautiously, you’ll get in accidents” or “if you don’t wash your hands, you’ll get sick.”

So, succumbing to superstitious thinking, then, would be when I wash my hands on the presumption that that is enough to avoid being sick, or when I drive cautiously, placing all my faith in the notion that that will be enough to make me avoid accidents, or when I don’t walk under ladders with the notion that that’s enough to avoid bad luck. Now, of course one can immediately say, “Wait, that’s hardly a necessary or sufficient condition for any of those outcomes.” Of course, but a person doesn’t tell themselves, “I’m washing my hands to lower the risk of infection”; they say, “I’m washing my hands to avoid being sick” (or, perhaps even more often, “I’m washing my hands because it’s what you’re supposed to do”—which really points up the superstitious part of the thinking). But whichever of the three rationales people give for washing their hands after going to the bathroom, none of them actually has any relationship to the specific circumstances the person is in. If there are no germs being washed off, then one is not in fact lowering the risk of infection, because there’s no infection to be risked. Or, in the more ironic case, it is the infection on the paper towel dispenser that gets them, because they had to dry their hands off—thus the air-driven hand-dryers in public restrooms, which we all—if we have any sense, of course—hit with our elbows to turn on. And so forth.

It might seem I’m being willfully perverse, or that from all of this we should conclude that everything we ever tell ourselves can only be false, so fuck it. Thank you for the hysterics. The point of disconnect is not that everything we know must be necessarily false. I could wash my hands after going to the bathroom because I don’t like how they feel or smell. I could drive cautiously because I’m able to pay more attention that way. Medical science is so fraught with competing explanations at this point that it’s quite hopeless for me to listen to what they have to say in order to make a decision about not getting sick—hand-washing is certainly not one of those strategies. And if I’m afraid of going to a movie theater because someone will kill me, I’ll stay home and be victimized by the burglar who’d counted on me being out—bad luck, that. There are any number of people who would never go to sleep at night without locking their door—and other people who never lock their door. Some don’t lock their door because “that attracts the negative energy of robbers”. Some people don’t lock their car door because “if someone’s going to steal something in your car, at least you don’t have to pay to replace a broken window when you get it back.” All of these “rationales” (as at least worthy of being called pseudo-superstitions) are rooted in actual, lived (reasonable) experiences. I lock my door because, regardless of where I am in the world, I don’t want to create the opportunity for any random anyone to wander into my house—I don’t fantasize that it makes me secure or safe[5]. The only person who ever robbed me was a housemate—and that’s far and away statistically the most likely source of any domestic theft. &c.

Let’s review the list of superstitions again:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I come out of the closet, people will reject me or hurt me.
  • If I see the Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.

As already noted, if the media were not keeping me up-to-date on all the (relatively slight) mayhem and madness (compared to the overwhelming amount of peaceableness going on continuously), then the “but it might be true” would not manifest in my life vis-à-vis such mayhem and madness. And this is where the social aspect of superstition kicks in in earnest. If we want to wash our hands or lock our doors on what are essentially neurotic grounds, for the time being at least we can file that under the category of personal choice. But when 395,000 child abductions are by non-custodial parents yet we encourage people to think first and foremost about stranger danger, we have a problem. As for “where the problem is” where crime is concerned (and this from 2001, for goodness’ sake):

according to the Centers for Disease Control’s Youth Risk Behavior Survey, and the Monitoring the Future report from the National Institutes on Drug Abuse, it is our children, and not those of the urban ghetto who are most likely to use drugs. White high school students are seven times more likely than blacks to have used cocaine and heroin, eight times more likely to have smoked crack, and ten times more likely to have used LSD. What’s more, it is white youth between the ages of 12-17 who are more likely to sell drugs: one third more likely than their black counterparts; and it is white youth who are twice as likely to binge drink, and nearly twice as likely as blacks to drive drunk; and white males are twice as likely as black males to bring a weapon to school.

So whatever neurotic activity you’re engaged in to avoid “criminal Black youth,” it’s pretty obvious that all of that activity, and thus the activity you support both in your politicians (and the policies they draft) and the media stories that “inform” us about criminal Black youth, is wildly destructive and misguided with respect to social life. When you move to a certain area to avoid crime, that’s right in line with the most pernicious kind of social superstition. Remember again, superstition is to believe something (and act on it) despite all knowledge otherwise. Try to find someone who has any kind of grip on the actual, on-the-ground numbers for the so-called illegal aliens that the news is always trumpeting about. And yet how many of those millions have you actually encountered? More to the point, how many are “destroying American jobs” and the like? Of course, you don’t know, though you may be able to provide an anecdote about some business that hires “illegals” or whatnot. The salient point for your “argument” is: “what if it’s true.”

As with all superstitions, it is normalized by recourse to illegitimate usages of evidence from other domains. It is extremely uncontroversial to say there are many undocumented people in the United States. That is, of course, a non sequitur, and one could compare the number of jobs “lost” to “illegal aliens” with the number of jobs shipped overseas to places where wage exploitation is easier. People love to talk all kinds of shit about felons, and yet 1 in 4 people in the US now have criminal records of some sort, so it’s a dead certainty you know more than a handful. All that talk about “tougher crime laws” applies not to a mysterious cadre of no-ones you’ve never met, but to all kinds of your neighbors, probably relatives. With James Eagan Holmes’ rampage, once again legislatures get to get on their high horse and talk about tighter gun control—superstitiously trying to convince us that such measures actually achieve the ends desired. (Remember, where superstition is involved, the rationale one proposes does not actually reach the ends desired.)

What is at stake here is how the media (as the loyal mouthpiece of the self-elected Kings) becomes a major outlet (if not the major outlet) for whispering in our ear, “But what if it’s true.” And by leveraging that unanswerable argument—even without necessarily having to make us fearful—they are able then to manipulate us into going along with whatever nuttiness they are up to—whether it’s tighter gun laws, bigger or smaller government, war with Iran, rolling back civil rights, suspending habeas corpus, detaining people indefinitely without trial, flouting the Rule of Law, which (for better or worse) appears to be one of the major bulwarks of that part of civilization I’m not wholly opposed to, or generally distracting us from doing anything about the nuttiness they are up to. If it is too easy to turn the possible into the plausible, the reasonable into the rational, an essential part of this is how the unanswerable “it might be true” is turned into an irresistible “it might be true.”

Unfortunately, I don’t believe that politicians are merely monstrous jackasses out to fuck up the world. They are, however, no better than their fellow human beings when it comes to being susceptible to superstition, though we are right to expect them to be better than that. And much, much more is at stake when States follow superstitions. The part to emphasize is how the claim that the superstition makes does not actually achieve the end sought. For people who prefer conspiracy theories, it is easier to imagine that the War on Drugs was designed (from the start) to effect mass incarceration. I can imagine how this is comforting, because otherwise the tragedy of the thing is wholly too epic to wrap one’s brain around. However, whether by design or as a consequence, more than 100,000 civilians in Iraq are dead thanks to the “what if it’s true” that Hussein had weapons of mass destruction. As a nation, we should be wholly behind resistance to superstitious policies that do not eventuate in the end sought. We invaded Afghanistan because the government there would not hand over the individual they did not have. &c.

In addition to whatever political and social organizing we might do to resist the susceptibility of government(s) to superstitious policy-making—a process that must be deemed all the more alarming, not simply because more is at stake but because much more complicated and pseudo-sophisticated rationales will be tortured into existence to justify the superstitions; thus, Iraq was “the right war for the wrong reasons”—there is also the project of ferreting out superstitious thinking in our own lives. At this point, I must reiterate that this is not a question of acting only “rationally”. It is clear that one can torment data into a rationale using one’s rational faculties—we humans can (and will) excuse anything on any grounds. Recognizing this tendency means we can resist it, even if it will never go wholly away (because we will never be able to wholly dismiss the unanswerable argument “what if it’s true”). Nevertheless, we can resist that unanswerable argument precisely before it becomes irresistible.

Analogous to coming out, I’ve recently been in an online correspondence and have been having various sorts of self-insights and affective responses to them, and the process has been (obviously enough) interesting to me. But there was no reason to necessarily believe that my online interlocutor would be similarly charmed by my various mental and emotional conniptions. All the same, I wanted to share at least the fact that they were going on with him and, at the same time, worried that such sharing would not be well-received and might alienate him to the point of no longer corresponding with me. (Did I describe that well enough to make clear the analogy with coming out?) So, in the process of composing the email to him about all of this, I finally had a moment where I became utterly exasperated with my own self-consciousness and simply said “fuck me … Here’s what I’m trying to say.” Using myself as an example runs the risk of allowing you to dismiss the example as merely personal, so find in yourself your own example where you overcame whatever wall of self-consciousness you might have had going on. How sweet it would be to say that, for all of my worrying in advance, it turned out he didn’t mind hearing about my psychological adventures in the least and, in fact, received them with brimming enthusiasm—but that is not quite what happened. Rather, what became clear to me after he wrote back was the disparity between what I anticipated and what he actually responded with. What I mean: everything I anticipated proved to be essentially beside the point. Now, of course, because the email was delivered with all my second-guessing, that shaped how he responded, but the fear of rejection (or, not even a fear of rejection so much as a non-recognition of the human reaching-out I was doing) was not gratified.
I didn’t find myself pleased that he “didn’t reject me”; rather, I found myself annoyed that I’d allowed myself to succumb to such a worry, not because it was unfounded (i.e., didn’t come to pass after all) but because I’d indulged in a useless superstition.

This may start to seem subjectively trivial. We’ve all, I suspect, finally divulged how we feel one way or another to another person, and only after going through a lot of anticipatory conniptions first. People who come out after much quite real, agonized soul-searching then find that it was all quite the waste of time. The world-vision, the Weltanschauung, that was driving the whole “I can’t come out, because” line of thinking proves to be wholly unrelated to life as it is actually lived, even when one is subjected to anti-gay violence of one sort or another. Put another way, the person who avoids bad luck by not going under ladders never has that moment when (two or three non-bad-lucky days later) they think, “Gosh! How great it is I avoided all that bad luck.” Most likely, there’s no thinking about it whatsoever, but if there were some thinking, one might try to congratulate oneself for whatever it was one dodged. Never mind the grim possibility that one has some bad luck anyway.

This is all kind of the reverse of magical thinking, where we tend to remember certain moments of “parapsychological truth” while overlooking (because we don’t even see it) moments that refute that world-view. So, the person who comes out and realizes all those closeted conniptions were (ultimately) a waste of time can take courage from that fact, even if she or he then has to go into a new round of superstitious thinking that (like the closeted thinking) is ultimately the testing ground for finding one’s way out of that thinking. Inmates in prison who have settled into routines frequently loathe tremendously the thought of being transferred to a new prison (or even of being released back into the prison of society). There can be a (seemingly very valid) anticipatory fear that things will be awful whenever they get where they’re going next, but then they get there and things are both fine and awful in the way that they are fine and awful, quite apart from whatever fine and awful stuff was imagined in advance. Once again, all that preemptive worry turned out to be ultimately a waste of time (except as it provides evidence for why one needn’t engage in such preemptive worry).

The more relentless such superstitious thinking and the more dire the consequences of resisting such thinking, the more liberating it must be to resist its rigors. Once again, the parallel has to be cited for those who have come out of the closet—it’s perhaps the most familiar, most culturally visible instance of anti-superstitious thinking. But the point to emphasize right now is how you can look in your own life and spot those times when worry proved needless—not because things turned out well, but because things turned out differently than anticipated (things turning out well is simply the “happy reward” of cynical anticipations). For people who are and want to remain cynical—despite a social obligation to other people not to be—this advice will not be welcome or heeded. But for those who are finding themselves distorting the practice of their life because superstitious “what if it’s true” arguments are shutting them down, then it’s worth making the attempt to resist that unanswerable argument.

To repeat the opening declaration: the essence of superstition is to believe something despite all knowledge otherwise. Superstition is not, therefore, that part of wisdom that learns from experience and uses it for future action. Even the cynic who has never managed to have a successful relationship has lived, empirical knowledge that his or her future relationships are probably not going to work out—which probably means some self-reflection is due on why things haven’t worked out. Superstition, rather, is the belief in something despite all knowledge otherwise. The inmate who never shows his feelings “because he’ll be made into someone’s prison bitch,” even though he’s never actually tried showing his feelings, is succumbing to superstition. The person who washes their hands to avoid sickness even though they have numerous examples (from themselves and others) that their rationale doesn’t reach the aim sought is being superstitious. The politician and the citizen who pass discriminatory laws on the basis that “too much is at stake not to” are using superstition to erode the whole reason one bothers to have a society in the first place. To act on experience is what we humans do all the time; to act on tortured avoidances of certain things (whether those things have ever occurred at all or are simply so imaginatively frightful that they can’t be brooked) is the essence of what I do not want to have in the world I live in.

[1] Significantly, according to Judaism, the wages of sin are loss of social reputation and the diminishment of one’s family over the following generations. In Christianity, of course, Hell is the consequence. It is important to connect the “reputational paranoia” of the former to the same pattern exhibited by some inmates—and to remember how being trapped in one’s thinking (like being trapped in a prison) can become preferable to freedom. Hell, of course, is simply an eternal prison—and so it is no accident that (most kinds of) Satanism construe Satan as the symbol of freedom.

[2] I’m not emphasizing some kind of essential Schadenfreude here, though there could be that as well. Rather, insofar as one may anticipate future consequences, then when that future is negative, the thinking resembles (if not is) superstition; when the consequences are positive, they’re something else (if not also still essentially superstition, but I will maintain there is a distinction still). So one may have a naughty, adolescent, thrilled kind of “hope” that a shooter might show up at one’s viewing of Batman without really meaning that anyone there would actually be harmed. &c.

[3] It’s probably not exactly that simple. There is certainly some reassurance in the fact that we are the one at the wheel, and the widespread (and theoretically vetted) skills of other drivers in the face of the rules of the road further decrease the sense of consequences. For all of my confidence as a driver, I’m not sure I would sit down on a moped and hazard driving in Saigon traffic, although the self-evident “chaos” of their traffic is obviously amazingly well-organized and effective. The Vietnamese would do well to be afraid of me in such a setting. Similarly were someone from the US to drive in England or on Germany’s autobahn, etc. Nevertheless, the point is that each of us either adduces arguments against the “it might be true” for “if I go driving, I’ll be hurt” or finds some alternative to the imaginable direness we’re confronting. It may also seem, at this point, that such a reasonable risk calculus as is involved here no longer much resembles (or can be fitted into an interpretive scheme as) a superstition. For the agoraphobic (or the autophobic), this is clearly not the case. By definition, a phobia is an unreasonable fear, so that the only one who is not phobic is someone too unreflective to notice their phobias. My point is that just because we might be willing to pay the price for whatever sacrifice is demanded of us (in terms of giving in to superstitious thinking), such superstitious thinking should not be fostered in general, for reasons the rest of this essay makes clear.

[4] There is a kind of argument that could be raised here that is tangential to my main point. Just as hand-washing can increase one’s susceptibility to disease, so can cautious driving increase one’s risk of accidents. The only person I ever rear-ended at a stop light was someone who (turning right) started to go and then stopped, because she felt the oncoming traffic was too close. I moved forward as she started to go, and didn’t stop when she did. The point is not whose fault this is: she and I were in an accident because her caution, which I didn’t anticipate, made her drive in a way that permitted me to hit her. People driving too slowly (less than speed limits) on freeways, etc.—it would be interesting to know exactly how many accidents per year are caused by cautious driving. For the closeted homosexual, certainly back in the day, the fact of being closeted meant that trolling for sex (in parks, in bathrooms) often had considerably more danger. It used to be one could not be in security forces because, the argument was, being gay meant you were susceptible to blackmail. &c. One could multiply examples and provide counter-arguments, but the point is not “what happens in each of these specific situations” but rather “my anticipated thought about the circumstance must be true”. One can blame me (from a legal standpoint I certainly was culpable) for rear-ending the woman’s car, but in the lived world she was still in an accident—and it’s not negligible that for all of my wildly incautious driving over the years, that’s the only time I’ve ever actually hit someone. Against the doxa, “one should drive cautiously,” the rejoinder here is, “well, you’ve been pretty damn lucky then.” I mention this to point out how, in the face of a superstition, the resort to metaphysics (luck) is preferred to acknowledging that the doxa (“drive cautiously”) might not actually be as true as claimed.
All of this feeds precisely into the effect of superstitious thinking in public life.

[5] I’m not congratulating myself for an anti-superstitious stance here. Obviously, for me, the consequences of leaving the door unlocked are dire enough that I’ll make the minimal sacrifice of turning the deadbolt when I’m in the house. And as much as I might want to say it is rational for me to do that—as a kind of domestic recasting of Pascal’s Wager (what, really, do I lose by locking the door?)—I have to call it pseudo-superstitious, because living the alternative (i.e., freely choosing whether to lock or not lock the door) is not really on the map of possibility for me. I could do it, and it would bother me. I have friends who leave their doors unlocked; I don’t begrudge them that, etc. On the most abstract level, I could argue that I should be a door not-locker—what makes my locking the door a superstition, in my view, is that I do it without any actually empirically good reason to do so. I could trot out some racist bullshit about “neighborhoods” and so forth, but what are the “real crime statistics” where I live? What is the plausibility that a home invader should suddenly strike in my neighborhood? “But he might.” Put another way, you are sorely deluded if you think I’m excluding myself from the perils and social undesirability of superstitious thinking. My guilt does not excuse yours, however.

This is part 1 of a two-part post about our current social world.

The essence of superstition is to believe something despite all knowledge otherwise[1]. Examples:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I come out of the closet, people will reject me or hurt me.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I see the Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.

Obviously, I chose these examples to illustrate and broaden what I am proposing we understand as a superstition. Amongst what I would call the salient details at work in these superstitious conjectures, one in particular seems important above all else: they all hinge on unanswerable arguments.

What is important for us today is not that someone might actually seriously believe that walking under a ladder will bring bad luck; what matters is that, even when we know better, we still don’t walk under ladders. It is precisely for this reason that I walk under ladders; I am not going to hold myself hostage to what my thinking is doing at that moment.

Of course, walking under ladders doesn’t bring bad luck—belief in bad luck itself is already a popular superstition—but the thought may still cross one’s mind, “But what if it does?” On the basis of this laughable assertion of a possibility, we then hedge our bets and don’t run the risk of offending the gods of bad luck or whatnot. Resisting this kind of thinking is why people record themselves on YouTube blaspheming the Holy Spirit (i.e., publicly committing the unforgivable sin).

This is what I mean by an unanswerable argument. For all that we know walking under ladders doesn’t cause bad luck, the rejoinder “but it might” cannot be answered. Imagine a dialogue between two people: A: “Walking under ladders doesn’t cause bad luck.” B: “But it might.” A: “No, it does not.” B: “But it might.” Unless A walks away, this will never end with B admitting or coming to realize that the insistence here upon a possibility is, if not bogus outright, then utterly negligible. So, over against the idea that walking under ladders is bad luck, one might usefully ask, “Is that really plausible?” To no small extent, then, superstitions insist on “blackmailing” us with arguments based (spuriously or not) on possibilities, which could (and should) instead be weighed in terms of plausibilities.

Revisiting the initial list, now thinking in these terms:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I come out of the closet, people will reject me or hurt me.
  • If I see The Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.

For your own sake, it is worth noting which of the above you laugh off as obviously and absurdly neither plausible nor possible and which you are not so sanguine about.

Obviously, when people argue (with others or with themselves) about possibilities, they must implicitly take the thing to be plausible. However, certain kinds of possibilities, in order to be plausibilities, presuppose a shift of premise that itself may be groundless. A most notorious example of this is Pascal’s Wager, where he argues (logically enough, if to the great affront of Faith itself) that one loses nothing by believing in the biblical deity[2]. The argument hinges, in part, on the possibility that the biblical deity exists, but this possibility cannot be turned into a plausibility except by rejecting the whole of lived existence and substituting in its place a vastly different explanation for “life.” There is no question that, in the western monotheistic traditions, there has been no shortage of effort expended to interpret the nonexistence of the biblical deity in terms of faith (e.g., “invisibility is the form of god”; “silence is the voice of god”). The point is that, in everyday life, between two competing explanations, we almost invariably (however much this might be a problem) pick the explanation that accords with lived-experience. So if my prayer isn’t answered, I can thank the divine for unanswered prayers or I can—in a manner far more consistent with all of the other kinds of decisions I make each day about which explanation to follow—conclude that prayer isn’t an effective intervention. I am mentioning this not to bash religion, but to point out how the elision of possibilities into plausibilities is not necessarily a legitimate move, though it is a popular one.
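The structure of Pascal’s move—treating a bare possibility as if it were decision-relevant—can be sketched in toy decision-theoretic terms. Every number below is an invented assumption for illustration, not anything Pascal computed:

```python
# Toy expected-value sketch of Pascal's Wager (illustrative only; the
# probabilities and payoffs below are made-up assumptions, not data).
# The wager's structure: even a vanishingly small probability that the
# deity exists dominates the decision once the payoff for belief is made
# arbitrarily large -- the possibility is treated as decisive without
# ever being shown plausible.

def expected_value(p_exists, payoff_if_exists, cost_of_belief):
    """Expected value of believing, given the probability the deity exists."""
    return p_exists * payoff_if_exists - cost_of_belief

# With a finite payoff, a tiny possibility stays negligible:
small = expected_value(p_exists=1e-9, payoff_if_exists=1_000_000, cost_of_belief=1.0)
assert small < 0  # believing is a net loss under these assumptions

# Pascal's move: inflate the payoff until ANY nonzero possibility
# swamps the cost of belief.
large = expected_value(p_exists=1e-9, payoff_if_exists=1e12, cost_of_belief=1.0)
assert large > 0
```

The point of the sketch is that the conclusion is driven entirely by the stipulated payoff, not by any demonstrated plausibility of the premise—which is exactly the elision the paragraph above describes.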

I’ve gotten into some surprisingly contentious arguments about this when I attack the superstition, “If I don’t wash my hands after going to the bathroom, I’ll get sick.” The most I’ll grant here is that if I don’t wash my hands after going to the bathroom (in a public restroom), then I may be increasing my exposure to infection. Let us be precise. In order to become “sick” (whatever that means, as opposed to the more precise “become infected”), there must be some germs present in the public restroom that I am likely to be susceptible to. It is of course possible that such infectious germs may be present, but is it plausible? In order to make such a determination, one would have to be a bit of an expert on this particular bathroom, to say nothing of knowledgeable about what sort of infections I’m susceptible to. Against this claimed plausibility, which I’ve encountered from adamant people, stands my empirical experience that not washing my hands after going to the bathroom has not made me sick. In fact, I know of no moment in my life when failing to wash my hands after going to the bathroom could be causally linked to my being infected to the point where I would describe myself as sick.
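To make the possibility/plausibility distinction concrete, here is a toy back-of-the-envelope risk calculation; every probability in it is a hypothetical placeholder, since (as argued above) nobody making the superstitious claim actually estimates any of them:

```python
# Toy arithmetic for the hand-washing example. All three probabilities
# are hypothetical placeholders, invented purely to show the shape of
# the reasoning -- this is not real epidemiology.

def p_sick(p_germ_present, p_transfer, p_susceptible):
    """Chance of getting sick from one unwashed-hands episode, naively
    treating the three necessary conditions as independent."""
    return p_germ_present * p_transfer * p_susceptible

# "You might get sick" asserts only that this product is nonzero;
# turning that possibility into a plausibility would require actually
# estimating each factor, which the superstition never bothers to do.
risk = p_sick(p_germ_present=0.05, p_transfer=0.1, p_susceptible=0.1)
print(f"per-episode risk under these made-up numbers: {risk:.4%}")
```

The product of several small, unestimated probabilities is the whole content of “but you might”—and it shrinks, rather than grows, the more precisely one tries to state it.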

The people who make this argument to me are not necessarily neurotic hand-washers, although that class of person will be a leading and loud advocate of this superstition. But this points to the fact that the more “normalized” a superstition is, the more there will be pseudo-rational arguments to try to make it plausible. I say pseudo-rational because the quite evidently empirical medical fact about how infections occur in human bodies is leveraged here in an inapplicable context and toward an irrelevant end. That is, the universe of the neurotic hand-washer (and milder advocates of this superstition) is not one where a “public policy” of hand-washing would begin by performing an actual risk assessment in any given (public) bathroom setting to determine whether or not hand-washing in this particular instance is actually necessary; it proceeds instead from the premise that piss- or shit-daubed fingers are unnerving and disgusting. A quasi-sophisticated version of this leverages notions like “herd immunity” and the socio-moral responsibility of not becoming a carrier of such germs (to other people). Again, the legitimacy of these observations in their proper contexts is illegitimately marshaled to the superstition advocate’s purpose. Nor am I saying that any such policy of hand-washing could be or should be determined in this way; my object is only to point out that what is ultimately and relentlessly at root in the insistence “you should wash your hands after going to the bathroom or you’ll get sick” is no better an argument than “well, you might.”[3]

In any case, I suspect most people are definitively certain that walking under ladders, letting black cats cross their paths, and even not washing their hands after going to the bathroom[4] will have no untoward consequences, even if people don’t readily admit the last.[5] But when it comes to the threat of potential violence if one comes out of the closet, the pseudo-rationality of the argument acquires a much thicker shell. In part, this is because the anticipated consequences are more dire. If with black cats and ladders the threat consisted of the numinous but vague “bad luck,” and with not washing one’s hands merely the unpleasant but transient “illness,” with the social consequences of rejection for being queer one faces both physical violence outright and all manner of potential social violence (e.g., loss of work, inability to get work, loss of social status, &c).

A disadvantage of using “if I come out of the closet, people will hurt me” as an example is that it may make you feel the example has nothing to do with you. Yet almost no one lives wholly without some piece of personal information they are not “out” about, and which they keep to themselves to avoid (pseudo-rational) consequences. Again, just as the superstition of hand-washing inappropriately leverages empirical medicine to argue for its plausibility, so the very real, ongoing, and potentially ubiquitous violence against people who are queer-identified provides potentially excellent arguments for not being out. I am in no way suggesting that such violence is “all in one’s head”; it is most definitely committed by bigots, &c. But one of the main differences between being in the closet as opposed to out is that the former condition is marked by 24/7 awareness of potential violence, while the latter involves that awareness only in those circumstances where one (more or less validly) determines that such violence is, in fact, likely[6]. The advantage of using this superstition is the experience countless queer-identified people have reported upon coming out. I like to say that I have never met anyone who regretted coming out of the closet; for the sake of completeness, I’ll say there have been two people in my life who claimed to know someone who regretted coming out. There have been people who resented being outed, &c. The issue here is not to bog down in the details of the example, but to point to the greater leverage we can get with ourselves in this case: the consequences of going “against” the superstition can be so much more dire (up to and including death), but the relief people report from no longer giving in to this superstition is correspondingly immense. I am surely not alone in reporting the vast difference between the “paranoia” of my closeted existence and the near-total absence of direct violence I have subsequently experienced as a “faggot.”
The point is not that I have not been beaten up, of course. The point is that I have not spent decades living constantly with that worry or fear.

Partly, this is exactly because of the direness of the consequences. To go out into the world every day anticipating the worst violence on the off chance that someone figured out my sexuality was, of course, heavily tiresome and wearing. The clear implication is that liberation from similarly heavy and wearing superstitions in our lives portends just as much relief. In prisons (one could cite historical and current examples as well), one’s reputation is (arguably) all one has. There is the pseudo-rational fear (pseudo-rational insofar as it is rooted factually in certain kinds of epistemologies in prisons but here inappropriately extended to cover every waking minute of the day) that if anyone “thinks ill” of you, you will be made to suffer in some way (e.g., your shit will be taken, you will be made into someone’s prison bitch, you will lose social status and thus access to the things that will meet your needs). Here again, the consequences of not taking this superstition seriously (“if I don’t maintain my reputation, people will take advantage of me”) are dire, but the emotional cost of maintaining this superstition 24/7 is self-evidently not worth it, once the risk is taken not to worry about it constantly. Zillions of people who are queer-identified have discovered, somewhat to their chagrin or embarrassment, that all the fuss they imagined about coming out turns out to be wrong—that for the overwhelmingly vast number of people, no one gives a shit that you’re gay[7]. So the man who is hypersensitive about his reputation discovers (perhaps thanks to a particularly humiliating public experience) that his touchiness was not well placed (even after the public disaster).

Hopefully it is not controversial to say that most of us are not happy to forgo the life we’d rather be living because we feel we must act a certain way in order to avoid undesirable consequences. For something simple, like not walking under a ladder, we don’t “mind the sacrifice,” in particular because the consequences of disobeying aren’t so dire. Similarly, whatever 20 seconds of time wasted rinsing one’s fingers in a public restroom sink (never mind that studies show the sink to be the dirtiest area of a public bathroom—oh the irony!) may be called a sacrifice, perhaps it’s not really “warping the fabric of our lived experiences.” It is the smallness of this “sacrifice” that makes it unproblematic in general. But where the threat of violence starts to loom, as with coming out of the closet, maintaining the superstition becomes much more ubiquitous and depressing. This is exactly why hand-washing actually is a “trauma” for the neurotic hand-washer: the consequences of not washing one’s hands are construed in a very dire way—a way far more dire than most of us would credit[8].

[1] I’m tempted to say “rational” knowledge otherwise, but why I eschew this adjective will become clear soon enough.

[2] On atheistic grounds one obviously loses a great deal, perhaps one’s wholly terrestrial existence, but this was not what mattered to Pascal.

[3] It’s slightly gratuitous to continue this point, but some medical science suggests that excessively hygienic people actually increase their susceptibility to infection, because their immune systems are never tested in the trial by fire of infections. To whatever extent this proves true, the other arguments (about herd immunity and any social obligation not to pass along infectious germs) obviously must be similarly modified by this medical insight.

[4] There is doubtless a gendered distinction here. I would expect males to be more convinced that women’s fingers are filthier after urinating and vice versa. Males (even gay ones) seem to be little in the habit of freaking out over the possibility that the man whose hand they are shaking may have recently touched his penis. To whatever extent we guys “keep our junk clean,” we can count on other males to do so as well—that is, we have no trouble assuming that, if we even think of it at all, which is probably rarely the case. Besides, it’s not as if we douse our fingers in piss as we urinate, so that the more probable source of any “grossness” in a bathroom will be assumed to be not our trusty and beloved penis but that loathsome cloaca of the flush handle, &c. Reasonable as this assumption may seem, this explanatory framework would still have to explain how the flush handle became that loathsome cloaca in the first place, if nothing but hygienic, healthy genitals have been handled in the room. Doubtless, shit is the culprit—its viscous clinginess, the proximity of fingers (separated only by a tissue-layer of paper), &c. Because, seriously, while we might be able to count on people (guys) not to wag things around so vehemently that piss flies all over everything, isn’t there (honestly) just too much at risk when we’re asked to trust that some guy running his fingers over his anus hasn’t come into contact with the tiniest speck of feces? Finally, at last, we come to the heart of darkness, to the moment and presence of evil itself. Which is all to say: advocates of the hand-washing superstition will feel they are on much stronger, much less assailable ground when insisting that people who have taken a shit should wash their hands.

[5] Somewhere there are those amusing or sad studies that demonstrate: people are more likely to use a sink to wash their hands in a public restroom if they know there is someone else in the restroom with them, than if not. If it helps you to be honest with yourself, I’ll readily confess this is often the case with me—more precisely, I feel a sense of being judged (imagined or not) when I walk out of a public restroom without washing my hands. This whole cognitive process is itself superstition. I can argue to myself that it’s plausible someone in the bathroom might judge me, and however much I insist that is not the case, I can always come back with, “But they might.” This points up clearly how the issue ultimately has nothing at all to do with “what actually happens” in lived experience, but rather the virtual anticipation of what might happen, and the decisions we (all) make in light of those virtual anticipations.

[6] Even in these “likely” scenarios, one can frequently be surprised. Where being out at work is imagined as impossible, it may turn out otherwise; where one is certain family gatherings will be a hotbed of rejection, it turns out otherwise. It may be going out on a limb to say this, but it seems that (whether one is in or out of the closet) the actual source of violence toward oneself will, in general, not be predictable. One can only learn by risking and experimenting. And at least in principle—as also the whole purpose of learning anything at all in life in the first place—the more one knows, the more one may become able to negotiate the terrain of life, even if violence still pops up out of unforeseen (or unforeseeable) circumstances. This may be akin to the fatuous notion that one can eliminate accidents; by definition, an accident is the thing that cannot be avoided. In fact, it may be that we often excuse our own inattentiveness and irresponsibility by calling things accidents that we might reasonably have done more to avoid.

[7] This actually has its own set of social problematics, but that’s for another essay.

[8] I’m a bit of a purist—if the consequences of walking under ladders or not washing one’s hands are really of so little moment, then we should with equal “freedom of spirit” walk under ladders and wash or not wash our hands. As soon as it becomes obvious that our thinking is being manipulated in such a way that the equal choice between “walking under a ladder or not” or “washing one’s hands or not” disappears, then we have veered back into the territory of superstition. It would be a mistake, however, to construe “always walking under ladders” or “never washing one’s hands” as a neurotic compulsion when it is done on principle, precisely in protest of the superstition.

(revised 1-11-13)


How we talk about the world of nature, and what meaning we project upon it, matters. This seemingly universal human tendency has also varied tremendously and continues to provide a rich source of insight about human nature (rather than Nature). Thinking about how we think about lions, vultures, and hyenas (whether within or outside of the context of adoption) can show how we not only (inadvertently?) valorize and reinforce those aspects of patriarchy (embedded in the ways we talk about lions and vultures) that ruin both our social world and our socio-personal lives, but also denigrate the alternative images and stories we might tell ourselves to get out of this mess socially and personally. This is especially true for the marginalized, the subaltern, the nonconformist, the anti-heteronormative: groups we can all avoid membership in, ultimately, only by pretending the power structure lionizes and (like a vulture) nurtures us.


Semi-recently (25 April, 2012), an online exchange addressed some of the characteristic online/social performances that adoptive parents, especially mothers, allow themselves to indulge in. The below is an excerpt (the whole is available here); this is a response to another’s (indented) original observation:

I followed links to the horrible conversation over at Adoptive Families Circle (the one about the woman whose “birthmom” changed her mind and decided to parent. [NOTE: Since deleted.]) I smiled when I read your comment:

“When deciding on the proper analogy to use to give advice to this poster, I originally thought of going with vultures. But vultures actually have more empathy and candor than the author here, in that they actually wait for their prey to die before taking from them. So I went with lions, who hunt in packs (like this forum), who have little concern for those they target (like this author), and whose only concern is their own sustenance and livelihood, at the expense of others.”

Where I come from, we call them hyenas because they hunt in packs and only show up when their prey is in distress. I used to think women like this were vultures, but then I learned more about vultures and realized that is FAR too kind of an analogy. Vultures are actually amazing creatures—creepy to those who don’t understand them as they are always associated with death—but quite astounding once a person learns more about them. Vultures have strong family structures and most stay together in extended family units for their entire lives. They are one of the few birds that have developed their own games and are one of the only animals that doesn’t kill anything for survival.

Vultures take nothing but what nature/God/Life/The Universe gives them. The female of the species are phenomenal mothers, gentle and patient with their young, and fiercely protective of them. Their mothering skills are legendary throughout the history of the world and consequently, vultures are tied to symbols of mothering, royalty, and rebirth in traditional cultures. Within the ancient Egyptian culture, the hieroglyphic for “mother” is a vulture. It is also the same hieroglyph for Nekhbet, the Egyptian goddess of motherhood, queenship, death and rebirth, and creation of life.

So as I said earlier, calling some of the women of the adopting class “vultures” is far too kind of a thing to do. Having been the target of attack from this type of women, calling them lions is too nice, as well. There is something noble about a lion…not so much a hyena. Lions will at least kill their prey before devouring it. Hyenas don’t.

I want to first frame my response by saying I’m aware there is a certain tongue-in-cheek quality to the above, since what is at stake is the choice of metaphor. But I’d rather be accused of too much seriousness by insisting on pointing out that these kinds of discussions about what metaphor to use end up easily enough in disgusting historical cul-de-sacs—what, for instance, is the best metaphor for describing the Jew? A parasite? How about the Palestinian? An animal? The German language is so punctilious about this point that it has a separate verb to distinguish how humans eat (essen) from how animals eat (fressen). These things can have important consequences.

A first thing I note about the above exchange (in the quest for an apt metaphor) is the continuous “downing” that occurs. So from “vultures” the original poster shifts to “lions”; the respondent shifts from “vultures” to “hyenas”. Now, if these authors had the reins of a State at their disposal, we could rightly get nervous that an oppressive (neoliberal, fascist, totalitarian, theocratic) regime might be in the works; what I read here instead is an attempt to get out from under the mountain of shit piled on us by oppressive regimes, by resort to a similar kind of “downing”. Again, I’m not ignoring the more or less tongue-in-cheek starting point for the exchange. Also, in the final analysis, I have less “nervousness” about the original poster’s shift from vultures to lions than about the respondent’s shift from vultures to hyenas (as will be made clearer below).

Someone might accuse me of making an excuse for behavior I might otherwise condemn. However, similar behaviors at different points within a hierarchy are not identical. When the State murders, that is very different than when a citizen of the State murders or when an individual murders. Similarly, why the State murders is very different from why a citizen of the State murders or why an individual murders. Blurring these distinctions allows arguments of bad faith to slip in, so that various misapplications of the metaphors “State” or “individual” come into play—corporate personhood is the most glaring instance of this, but the personification of the government as the person of the President (in the US) is a similar phenomenon. Applying the expectation that the State should follow the Rule of Law to the individual is an example in the opposite direction.

So reading into these authors’ “downings” a (literal) reproduction of the kind of hierarchical oppression sanctified by the State is an error that enhances State power, and I reject that reading. At the same time, however, those who are sensitive to underdogs can find an unintended sympathy in the essentially ad hominem attacks on adoptive mothers that the above posts make. That sympathy can be smashed to pieces by going and reading the actual words said mothers proudly declaim online, so that’s worth remembering. For me personally, the criticalness of the posters is a bit of a distraction; I would rather be addressing other issues.

A first thing to point out about lions, hyenas, and vultures (as referenced by the posters specifically) is that they are all African creatures. The geographic ranges of lions (and vultures even more so) are not limited to Africa, of course, while hyenas almost exclusively are (at least in our imagination, if not in their actual evolutionary history)—particularly the spotted hyena (Crocuta crocuta), which tends to be taken as the definitive version of the hyena (cf. The Lion King). So, when non-Africans start talking about African creatures, cue in earnest the problem of representation. (In the United States, this discourse is somewhat modified by the actual presence of vultures in the environment. USers—that is, people in the US—may have some face-to-face familiarity with lions from zoos and circuses (or from traveling in Africa), but almost no one has actually encountered a hyena, even in zoos—less often than lions in any case.) And as far as the problem of representation goes, there is nothing like never encountering something to enable making pitifully grotesque generalizations about it (cf. US constructions of Muslims, Arabs, terrorists, illegal aliens, sex offenders, felons in general). I am not going to try hazarding even a summary of the many issues at stake when the “Western” world takes up representing Africa. It can only suffice for now to say that the meaning and use of African creatures in general, and the relationship of those representations to the people of Africa, is a nasty tangled knot.

The Lion & The Vulture (Patriarchy’s Father and Mother)

My intention is not to educate about hyenas solely to defame lions or vultures. You can get all kinds of relevant information about the spotted hyena simply by reading all of this (the section on “cultural representations” most pertinently). What I particularly want to address is the use to which this imagery is put (as exemplified by the posters and as obvious in culture everywhere). For the sake of illustration, the following (from Wikipedia, of course) aptly frames the main features of narratives about hyenas, when people bother to have a narrative about them at all:

The spotted hyena has a long history of interaction with humanity; depictions of the species exist from the Upper Paleolithic period, with carvings and paintings from the Lascaux and Chauvet Caves. The species has a largely negative reputation in both Western culture and African folklore. In the former, the species is mostly regarded as ugly and cowardly, while in the latter, it is viewed as greedy, gluttonous, stupid, and foolish, yet powerful and potentially dangerous. … Explicit, negative judgements occur in the Physiologus, where the animal is depicted as a hermaphrodite and grave robber. … In Africa, the spotted hyena is usually portrayed as an abnormal and ambivalent animal, considered to be sly, brutish, necrophagous and dangerous. It further embodies physical power, excessivity, ugliness, stupidity, as well as sacredness … In west African tales, spotted hyenas symbolise immorality, dirty habits, the reversal of normal activities, and other negative traits.

The original poster makes a shift from vulture to lion, on the grounds that “vultures actually have more empathy and candor … in that they actually wait for their prey to die before taking from them.” To this may be added that lions often steal more than they kill (often stealing from hyenas, no less) and that lionesses hunt down and “mercilessly” slaughter cheetah cubs. As human beings, of course, we can say, “Well, that’s just nature,” but we will still respond as human beings to these things. What I would say is not that this is a “misreading” of nature but rather that it points to a very serious and salient problem of human culture: that we make excuses for and valorize a creature that hunts down the children of other creatures and murders them for no apparent reason other than that it can. History shows that humans could have “learned from Nature” in re slaughtering our neighbors and their children. One can call it disturbing, then, that we valorize as the “King of the Jungle” a creature that steals more than it earns—which in its way seems an (apparently inadvertently) apt symbol after all, although people do not ironically valorize lions for this reason. The person responding to the original poster above specifically defends lions as having “something noble” and so feels compelled to find another target for a “most insulting metaphor” instead.

I’d hope it’s unnecessary to say this, but just in case: the point here has nothing at all to do with the actual nature or behavior of lions, but rather the stories (pro and con) that we make up to talk about them. And it would be an interesting byway of this topic to explore how those stories (supposedly “from nature”) became inputs for human social organization. But this also points out the issue: who gets to tell the stories, and who gets to make them up? If there is one thing that’s clear about cultural depictions of lions (in the United States, if not elsewhere), it is that male lions are the focus of attention. They fuck whomever they like—including the younger males who run in homosocial packs with them (this fact is not a widely acknowledged part of the lion “legend” but clearly goes along with the “secret” history of power, just as White plantation owners in the US south kept octoroons, &c). It is clear, as more and more evidence becomes more widely dispersed, that lionesses are where it’s at—they do the greater part of the “work,” so to speak, of lion culture (even if occasionally going off the rails and slaughtering their neighbors’ children). The male lion is most certainly an image of patriarchy, including the consequent direction of attention away from females (and thus the second-class citizenship accorded to them).

I don’t mean to overly psychologize the responder’s comments, but the language with which she shifts the metaphor from lions to hyenas is itself telling of patriarchy. For one, “There is something noble about a lion…not so much a hyena. Lions will at least kill their prey before devouring it. Hyenas don’t.” Obviously, I’ve been reacting to that word “noble” throughout my response here—what is ultimately noble about stealing more than you earn and killing your neighbor’s children; more to the point, what is the point of creating a narrative that calls that noble?[1] One can be awed by the murderous spectacle of a Stalin, but that doesn’t oblige one to count him a role model for morality.

On one hand, the responder’s complaint that hyenas “hunt in packs and only show up when their prey is in distress” is hardly apposite to hyenas alone; lions (to say nothing of countless other predators), in any case, are not appreciably different. In fact, insofar as this repeats what the first poster imputed to lions in the first place, it is unclear why it is even being raised against hyenas—unless I assume this is simply hypocritically excusing behavior in those one likes (lions) that one condemns in others (hyenas). The point of emphasis, rather, seems to be the pack (mob) behavior of hyenas as distinct from the “strong male” individualism of the lion. (Never mind that hyenas do often hunt alone; again, this is about stories we tell, not “facts”.)

So, in other words, there is an “anti-tribe” sentiment at work here. This comes out more strongly when looking at how the poster valorizes the vulture. For instance:

The female of the species are phenomenal mothers, gentle and patient with their young, and fiercely protective of them.[2] Their mothering skills are legendary throughout the history of the world (emphasis added).

This description dovetails with (matches identically) the sort of discourse about “mother” that accompanies patriarchy—that is, it fits nicely and tidily with the “strong male” image of the lion, which has (be it remembered) “something noble about it”.[3] But if we set aside for a moment this idealizing fantasia of “mother” that at least post-Enlightenment Western culture has had to insist upon in order to co-opt women into going along with it, then we can also see in the poster’s remark something that’s actually not quite orthodox:

Vultures are tied to symbols of mothering, royalty, and rebirth in traditional cultures. Within the ancient Egyptian culture, the hieroglyphic for “mother” is a vulture. It is also the same hieroglyph for Nekhbet, the Egyptian goddess of motherhood, queenship, death and rebirth, and creation of life (emphasis added).

I make this remark without knowing where the poster is in the world. But in the US context where I’m encountering it, the emphasis here on “mother” rather than “Egyptian” is entirely consistent with Western hegemony. That is, there’s a not-so-subtle substitution going on here that is (attempting to) equate present-day and ancient Egyptian “mothering”. Specifically, the absence of death and rebirth from the everyday notion of “mother” in US culture (to say nothing of the stripping away of spiritual significance that goes along with those themes) points to the radical emasculation (I use the word ironically and deliberately) that Nekhbet is “enjoying” in this equation.

But also, this vision of “mother,” which most assuredly is part of the human record of existence, is not meant in its aristocratic sense here either; queenship is wholly missing from US discourse, although in the Black Muslim community the language of women as “queens” does appear (as often with adulatory tones as with regressive/sexist implications about “traditional” roles, &c). As such, there are ways that this respondent’s description, by design or not, runs in parallel with (if not actually runs into) Noble Savage discourses—ones that are still alive and well in how the mothering of “poor Black mothers” in the US has a mystique about it perpetuated on both sides of the race lines. Though perhaps intending otherwise, this boils down contextually to an articulation all over again of ages-old patterns of patriarchy—an image of the strong father who owns his wife and children, and the strong mother who reproduces her husband’s authority over her upon the children. (Recall that the word “family” derives from the Latin for “domestic servant”.) Engels traces how the original rise of hierarchical oppression of the sort we can now recognize came about precisely by breaking up the traditional boundaries of “tribe” and substituting the artificial boundaries of “State”.

This makes even more obvious the “anti-tribe” sentiment at work here. The appearance of “queenship” not only invokes the role of Woman under patriarchy (some of whom borrowed or operated in light of the authority of their husband or father; the rest of whom were third-, fourth-, or fifth-class citizens), but specifically the shift from tribal to Statist social organizations. This shift was thousands of years in the making, of course, but the description still records the lines along which it occurred. Out of this progress, thousands of years later, the “nuclear” family emerges, still wielding its patriarchal stick (Foucault’s Discipline and Punish and Kate Ferguson Ellis’ The Contested Castle both provide clear windows on this shift), &c. Thus, we move to the world of the bourgeoisie and all its glowing claims with:

Vultures have strong family structures and most stay together in extended family units for their entire lives.

To parody the argument here most simply, vultures have “family values”—and hyenas don’t. (The issue, again, has nothing to do with the fact that lions or hyenas could equally be imputed these kinds of group/family dynamics. What matters is how we tell these stories to ourselves.)  In addition: vultures

are one of the only animals that doesn’t kill anything for survival. … Vultures take nothing but what nature/God/Life/The Universe gives them[4].

What I want to resist here is the framing that “ups” vultures as some kind of resister to an otherwise horrific creature-eat-creature world. Just as the poster finds lions more noble for denying hope to those they would eat (see footnote 1), here vultures get kudos for ignoring whatever misfortune befell that which they intend to eat. When we find a $100 bill on the ground, if we keep it, we know there is someone who has lost it. We can console ourselves that whoever lost it (being the type to have $100 bills in the first place) perhaps won’t really miss it too much, but if we tell ourselves this it is precisely to minimize our sense of benefiting from another’s misfortune—or, we might just nihilistically laugh about someone else being too stupid to live (note the similarity between such laughter and the laughter imputed to hyenas?). If vultures were strict fruitarians, then I might find more promise in this line of argument. Also, strictly speaking, vultures both eat maggots (killing to survive) and by eating do so at the expense of everything else that might feed on whatever they are eating (thus surviving by killing). I realize this is splitting a hair, but the notion that we can “survive without in some way being detrimental” to other things is a myth strongly in need of resistance. I mean, specifically, that the idea that it is okay for me to get a cheap burrito at Taco Bell (which I can do only because costs have been externalized to my fellow human beings) is nothing more than a way for me to feel comfortable with an untenable situation. The moral high ground I want to try to occupy by supposedly consuming in a “moral” way is merely a sign of my ignorance of my own complicity in the systems of oppression that make the US (and other elsewheres) possible.

The sort of nebulous “peace-love” discourse that seems to be floating around in this “doesn’t kill anything in order to survive” claim (I’m not going to attach this to claims from vegetarianism; vegetarians know they kill plants to survive) becomes overt with the claim that vultures only eat what some divine force (imaged as god, nature, cosmos, universe, etc.) provides. If vegetarians are trying not to “kill anything to survive,” it’s a very different paradigm that says, “I eat only what the universe supplies.” This is some quietist mysticism at root, attempting, it seems, to escape the circumstance of the world that makes “predating” upon others necessary in the first place. I’m not at all interested in critiquing this on absolute moral grounds. Until we reengineer our bodies, we do need energy inputs to persist. The Jains have it right when they say the rule should be to maximally minimize the extent of such predations—and they do (at least in their most serious practitioners) resemble strict scavengers. Luckily, Jainism is not stupid and doesn’t end its philosophy there, or else it would entail simply a kind of non-participation in the world that does nothing for the world (except to make the practitioner feel morally superior).

I’m not pretending we don’t want to live in the world comfortable in our own skins. A constant state of awareness about the irony of our existence could be difficult to bear, but I’d sooner say I’m a racist, sexist pig who is working on not being that way than pretend I’m already a sufficiently reconstructed human being. Whatever the respondent meant, in the current US context, the notion of eating “only what the universe supplies” is merely egoistic; it can be little more than a coping mechanism for dealing with a seemingly otherwise unbearable and crushing degree of guilt. That guilt, however, is just as much a coping mechanism. I say they are coping mechanisms because the end result of both outlooks is paralysis with respect to changing the circumstance that makes those coping mechanisms necessary. “Eating only what the universe provides” allows you to be “good,” while constantly laboring under a mountain of guilt keeps you feeling “bad”; but both fundamentally disable any impulse to change the social order. Rather, recognizing myself as complicit doesn’t then require me to do nothing about it by excusing my immorality as a myth or as human nature; I will not accept either of these stories (metaphors) and want an alternative.

If you really want to get silly with the “vultures eat only what the universe provides,” then we can blame them for not intervening when whatever it was that was dying was dying. Vultures are not picky about how what they eat died. They’ll eat the aged and the infirm as much as the young and freshly lion-killed (or hyena-killed). That’s precisely what gets vultures demonized as scavengers and gives us the sense of the use of that word as a synonym for those who prey upon misfortune. There’s a lost word (“quomodocunquizing”)—no wonder it’s lost—that means “to make money by any means whatsoever” (no matter how unscrupulous). Given our current commoditized world, it’s no wonder the word has been “forgotten”; but the vulturine benefit based in the misfortune of others (even the most natural of deaths) points to why that word gets used as it does, and that is also what makes it so appropriate for those vultures who prey upon the misfortune of those who cannot have children (i.e., the adoption vultures who traffic in human babies). That these vultures may be “fiercely protective” of their own young, and legendarily “kind, patient, and gentle” with their own children only points up again the ultra-fundamental hypocrisy at work—they will die for their own children (in theory) but will make a buck selling other people’s.

In this, the issue is not so much that the shift to hyenas as a metaphor is inappropriate, but that the re-framing of what vultures mean doesn’t do the work the respondent above wants the shift to do. Switching attention from vultures (“scavengers who prey upon the misfortune of others”) to lions (those apt symbols of brazen patriarchy) gets at a clearer image of what’s going on in the adoption circus. Shifting attention from vultures (as great mothers and Great Mothers) to hyenas (as tribal rabble that lacks family values or patience with children) actually makes the circumstance involved (of hierarchical might-makes-right) less clear.

Hyenas (An Other of Patriarchy)

I want to specifically emphasize some of the dominant imagery of hyenas in cultural representations: their imputed hermaphroditism and their filthiness. But first some broad strokes about the hyena’s reputation.

Spotted hyenas are matriarchal. Females are significantly bigger than the males, who play virtually negligible roles in the “cultural life” of hyenas. By that I mean, if history is told by the winners (i.e., what history is visible), then the activity of females comprises the “history” of spotted hyenas when human beings are watching them. Who knows what the boys are up to on their own, as it were—out of sight. One can also run a very long way pulling out the details about how the State (or the Strong Man) is “anti-tribe” (i.e., anti-gang)—and the way that the State (and Strong Man) carefully hide that they are simply the biggest gang going. White guys like to rag on guys from Mexico about this, as the witty if racialized observation runs: “fuck with one bean, you fuck with the whole burrito”. The implied “unfairness” of this “gang action” feeds into a whole conceit of one-on-one “honor”, which of course breaks down as soon as “white” people find themselves sufficiently threatened. Bosnian Serbs certainly weren’t one-on-one in Srebrenica, &c. In this context, it doesn’t matter that the hyena matriarchy does in fact have an Alpha Female (a Strong Woman). The sexism of patriarchy cannot acknowledge this, or simply doesn’t. Moreover, insisting that hyenas can’t or don’t act individually allows a whole network of criticisms about how Africans (and Arabs) are incapable of democracy, how there can never be a rule of law, how the law of tooth-and-claw can only prevail, and the whole host of other lies about the ungovernability of human social circumstances that are not organized according to “First World” models.

Certainly competition between female spotted hyenas is not without its significant scuffles. It may even be safe to say that the one poster’s remarks about vulture mothers (“The female of the species are phenomenal mothers, gentle and patient with their young, and fiercely protective of them”) describe the diametric opposite of spotted hyenas. The point here is not to down hyenas for bad parenting (especially since doing so would be downing much of the human articulation of non-nucleated parenting over the whole duration of the human species’ existence), but only to point out (along with their matriarchal social organization and the accusations of tribal attacks) how “badly suited” hyenas are for being assimilated into the currently dominant patriarchy that rules much of the world’s discourses.

Hyena “otherness” is expressed most obviously in their habit of eating dead things. This is bound not to endear them to human beings, that’s certain. Thus, the respondent had to provide a workaround for the vulture’s own habits in this regard. Maggots are the worst insect offenders (never mind that they become playful, annoying flies), and we may get along best with ants (as far as such scavengers go); we can even endure a plecostomus, provided we see them more as fish-tank cleaners than eaters of the dead. I have read, however, that hyenas have no predators and, remarkably, are not actually eaten by other (mammalian) carnivores or scavengers. If this is true, this rather remarkably puts them “outside” of the usual food cycles, further increasing their otherliness and probably adding to their sacredness in African folklore. (In folklore in general, that which is strikingly “other” is typically credited with sacred, but also therefore dangerous and useful, power—African witches make use of hyenas as mounts, &c.) One reason advanced for why hyena corpses are not devoured (even by other hyenas, who normally would not have such compunctions) is their wretched filthiness.

Hyenas have been called the third smelliest creature on the planet, though such lists are contentious and change a lot, and I can no longer remember where I originally found the list anyway. Partly this reputation is due to the notorious substance secreted by the hyena’s anal scent glands, which is apparently ghastly and hair-raising (and can be used by witches). In one respect, it’s not the truth that matters here. Pigs will tell you they’re known as the dirtiest farm animals, which chickens snicker about continuously. What is obvious is that hyena odor has made a striking impression on the human imagination. Whatever the justness of these accusations, the image of hyenas as unwashed rabble sets starkly against neuroses about the hygienics of civilization. And in fact an oft-ignored part of US racism—which Faulkner either indulged in continuously or expressed brilliantly, depending upon how you read his work; some would say that the very fact that this question is considered not settled is evidence of still-rampant racism—concerns the “funk” of African-Americans. Faulkner makes unmistakably clear how the different scent of Black people has an either attractive or repelling quality for the “white” nose, so that connecting that “funk” to the notorious stench of the (African) hyena further illustrates what historical lines are in play here. That the English and the French (at least) were notoriously stinky underscores that what is at stake here is not merely “smelling good,” but a more primal reaction to a difference in (body) scent. In the US, where there is a neurotic (Protestant) aversion to the body generally, the difference becomes one between various forms of artificial scent (as a symbol of cleanliness, real or imagined) and “body odor” as a sign of slovenliness and a host of other attributes long racistly attributed to the descendants of people abducted from Africa (or recently arrived through immigration). This neurotic equation of cleanliness and godliness (and all that uncleanliness and ungodliness therefore imply) had its backlash in Flower Child “naturalism”, and one of its modern expressions involves rejecting underarm deodorant for health reasons—some ingredients having been credited with the onset of Alzheimer’s disease.

Today, I asked someone what she thought of hyenas. I wondered if she would mention the Lion King, and she didn’t. [This blog was published before the movie-version of Life of Pi came out; there may be a new common referent point for hyenas now.] She described them simply as spotted animals that laugh. Spotted hyenas, of course, have a wider range than this, and this description is particularly telling:

It is said that feasting Hyaenas engage in violent fights, and there is such a croaking, shrieking and laughing at such times that a superstitious person might really think all the inhabitants of the infernal regions had been let loose.

The famous hyena laugh is apparently more often a signal of nervousness, used when running away (as opposed to something they do all the time or in every circumstance) and sometimes when chasing prey. It’s clear enough that the human equation of hyena “laughter” with the death of creatures and feeding on the dead will tend to lead us to conclude “the infernal regions had been let loose” and that hyenas are some amoral motherfuckers (to say nothing of expressing “immorality and dirty habits” as the West African folklore has it). It hardly takes any effort at all to connect them to someone like Alan Moore’s Comedian. But I think we can be a little more honest than that. Perhaps we have forgotten just how ribald a proper feast around carcasses can be for us humans at our barbecues and festivals of old. I’m not saying it was wholly disingenuous for us to connect the dots of “laughter” and “devouring the dead,” but hyenas have a wide range of sounds when feasting, as the above indicates; there’s no reason to have zeroed in on the laughter as if that was the whole of the story. Clearly (among Europeans and Africans) that was human projection at work. And in the domain of racism particularly, one of the most frequent accusations I have ever heard leveled by whites about blacks—I feel obliquely privileged that I’m able to witness this and disturbed that people feel I’m someone they can make this complaint to—is, “Blacks are loud.” As part of the bigger picture of racism, I remind myself that this accusation can crop up not only with other groups of non-whites, but also as an accusation leveled against poor whites. I see no reason to dwell on this overmuch, but simply want to underline again one more reason and way that hyenas cannot be accommodated into stereotypical notions of civilization—they’re not quiet in the right way.

So hyenas are matriarchal, tribal, loud, smelly, nihilistic (it’s that laughter) and feed on the dead (along with other things). At this point, it’s clear that their “reputation” really only makes them suitable as an insult for adoptive mothers rather than carrying any kind of justness or aptness beyond “the most vile creatures ever”. However, if it were only to arrive at this, I would not have bothered to write this long post about hyenas. For that, I must return to the feature that is easily the most unsettling, weird thing humans have imputed to hyenas: their hermaphroditism.

At first glance, it is difficult to tell if hyenas are more canine or feline. Most people say canine, and they would be wrong. So, right off the bat, hyenas thwart our impulse to categorize them. Cue the imputed hermaphroditism. Female hyenas have large, forward-facing clitorises that can actually become erect. (They greet one another with these erections; males must figure out how to mount the female through them; and young are born through this same passage, which sometimes tears, killing the mother.) Thus, even a factually correct description of the anatomy can unsettle some people. Meanwhile, their association with hermaphroditism (and/or magical transformation) unsurprisingly links them to tales of werehyenas and (male) witchcraft, and thus exacerbates this ambiguity; and we all know how certain elements of society cannot tolerate ambiguity at all (or, rather, they leverage threats of ambiguity to further agendas like defense spending, pogroms, and the like). Thus, “In Africa, the spotted hyena is usually portrayed as an abnormal and ambivalent animal, … associated with immorality, dirty habits, and the reversal of normal activities” … Not to keep you in suspense any longer, the link here with homosexuality is easy to make.

I don’t mean the link is exclusively with homosexuality. All morality that is not of the sanctioned morality, all habits that are not amongst the sanctioned habits, and the proper course of all normal activities (as opposed to the perversion of them in their reversal) are summarized in the hyena. It is being posited (in West African folktales) as “non-civilization,” though for that very reason it is (perfectly consonant with such a notion) also sacred (at times) in the way that everything that is Otherly with respect to civilization can be acknowledged (by cultures that choose the acknowledgement, rather than the repression, suppression, oppression, or sheer outright disavowal of the knowledge) as powerful and dangerous (i.e., sacred), and in need of containment or, barring that, warding magic. Thus, the hyena not only threatens the temptation of witchcraft (in places) but simply the temptation of “sin” (indulging in the profane) most generally.

So of course civilization must call all of that which is not civilization “brutish” or “ugly” or “stupid” or “smelly” or “loud”—despite the games hyenas have invented, despite their demonstrated learning and ability to coordinate hunting, despite their cuteness (or their visual aesthetic at least), and despite their power (with jaws, for instance, able to bite through elephant legs). It is arguably the case that—just as the early assertions of a period of human matriarchy (e.g., Bachofen) for the most part simply reversed the hierarchical oppression of man and woman—the matriarchy of spotted hyenas is one of naked dominance. I doubt it is that simple, but as a counter-demonstration of the universality of male-dominance (that is, as a counter-model in which males play a seriously subjugated role), the hyena can serve as the definition of a “feminist nightmare” and thus validly shake the foundations of patriarchal social organizations. Homosexuals (and the queer critique) play an important role in that foundational shaking as well.

Once again, we’re not dealing with hyena actualities here, but rather stories we tell about them, so the hyena (in its capacity for magic and transformation) posits a fluidity of identity that is highly Other relative to the fixed identities of hierarchical social organizations. I make a distinction between fluidity of identity (where one moves continuously, in the mathematical sense, from moment to moment) as opposed to fixed shifts in a hierarchy (where one can occupy different—changing—but structurally static roles within a hierarchy). The pack identity of the hyena (again: this is a function of human inability to differentiate sufficiently, more than a “fact” about actual hyena social organization) makes the blurring of identity that much harder for an observer to track; in racist terms, “all white people look the same” (&c). But aside from these accidents of inattention, which theoretically can be overcome by paying closer attention, the threat of transformation that the hyena proposes (the hermaphroditism) means you never really know what you are dealing with—thus you may be startled by the male who wants to “reverse the normal course of things” and get fucked in the ass by another male, or the female who wants to indulge in the “dirty habit” of cunnilingus. In its necrophagy, the hyena is a breaker of human taboos, and thus can (for some) signal the whole range of transgressive sexual philias. A Luciferian psychology begins to emerge, where whatever is good for civilization, the hyena stands as the opponent of that conceit—thus sometimes, in folktales, the trickster figures have to best the hyena figure. This is a curious (but logical) case of the Devil chastening the devils, since most trickster figures (though ambiguous) tend to incline ultimately toward being “on the side” of the society they love to twit. (Loki seems to be the most glaring exception to this, and Satan is not an example, because he is deemed an unambiguous evil in Christianity and part of Yahweh’s holy entourage in Judaism. Trickster figures can at best be only ambiguously evil, or good.)

In this respect, to call de Sade a hyena could not be far off. And part of the colloquial meaning of the use of the word hyena as an insult (i.e., those moments when writers may tend to resort to it as a descriptor) is not simply to point to a specific kind of depravity (vultures prey upon misfortune, sharks are coldblooded sociopaths, wolves are dangerously strong and rapacious), but to suggest the thing is depravity itself. If sharks are amoral, there are times when they will ignore you out of laziness; a hyena, by contrast, would drop everything to fuck with you. In one respect, the hyena is simply the creature “who goes too far”—in which case it is worth remembering that Sade said, “Anything carried to excess is good.”

Stepping back from this unholy nimbus, what I would particularly want to resist is the idea of letting this frame dominate the narrative we tell about hyenas. In all of nature, there is hardly a creature that has so vividly caught the human attention in the way the hyena has—it is the most doggedly (pun not intended) ambiguous creature—the folk tales bear witness to this. It has to be considered a kind of cultural brinksmanship to propose to spend one’s time concocting a story about something theoretically beneath one’s contempt, even as a morally instructive fable. The breadth of available stories makes this generalization perilous, of course, but there have been times when, having been treated to some blistering comments about the inferiority of Blacks, I then asked, “But if they’re really inferior, then how are you shown in a good light by being superior to them?” I would use the metaphor of beating up a child—if the barbaric nations are so childish and backward, then that might lead one to devise a program of “improvement”, but how could the ego sustain the conceit that one is thereby superior? Of course I’m taller than people shorter than I am—is that really the actual basis for my inflated sense of self? How could I take that seriously? Every would-be spinner of a heroic tale knows that the evil one contends with must be worthy, otherwise the hero’s heroism is something any motherfucker could do. It’s not simply that this or that figure bests a hyena; it’s that people tell stories about how dumb they are as well—a tacit acknowledgment that (at the very least) the hyena is worth telling a story about; so much for negating them, therefore. Not that hyenas live the reality they’re being described in terms of: “The IUCN’s hyena specialist group identifies the spotted hyena’s negative reputation as detrimental to the species’ continued survival, both in captivity and the wild.” (Take that statement in social terms about people as well.)

I’m not at all saying the folktales about hyenas are stupid, misguided, wrong, etc. I’m saying that taking them at face value misses a crucial element that the mythos of the hyena offers us as human beings who are caught in oppressive hierarchical relationships. Hyena stories point to the “world outside of the norms of culture”—for sexual nonconformists, the subaltern, the marginalized, this alone points to the value of the hyena as a symbol. Most assuredly, the dominant myth says we are disgusting, depraved, dangerous—this has all kinds of terrifying repercussions on our social life, so it’s worth remembering what hyena laughter really signals—escaping from the lions that would destroy them; keeping one’s humor in those dangerous situations is apparently a way to survive (and to alert others). The hyena embraces, if you will, its exalted wretchedness (a blending of the categories of the exalted and the wretched that a heterosexual mind—I’m going to engage in a dubious generalization—would never realize could be combined in the image of the homosexual). This isn’t an advocacy to accept negative labeling; quite the opposite. Someone once said the best way to end an argument is to agree. Someone calls you a dirty cocksucker? Try, “Yep.”

The counterargument runs: if you don’t deny it, they might hurt you. The fact is, they may hurt you whether you deny it or not—and if you deny it, you have to live with that. But more than this, where the will to hurt is absolute, denial or not will avail one nothing—that’s clear enough from history, and hyenas “know this”. Thus gay men have long since learned that, if someone calls you a cocksucker, it’s frequently devastatingly more effective to quip back, “Why? Are you interested?” Etc. And there are times when that will surely get you killed, just as surely as denial will (or will get someone else killed), because sometimes the lions intend to destroy you no matter what happens, because they can. It’s not for me to live in the shadow of that fear.


To return to the original point of departure for this essay: what’s the most apt metaphor for adoptive mothers? What I’m struck by most (now in retrospect) is how the three proposed animals actually emblematize the whole adoptive situation in general: the lions (as the overwhelming egotists who, like true Leos, are concerned only about themselves) are the adoptive and birth parents; the vultures are the ones gaining financially from the transaction; and the adoptee him- or herself is the hyena, the alien/Other hybrid creature, neither of this culture nor that, around which a whole series of apotropaions must be placed to contain the power and the danger of the child—one with “foreign DNA” (and all that xenophobia and exoticism ascribes to “foreign”). How many adoptees have been credited with “dirty habits,” “immorality,” or a “reversal of the NORMAL course of things”? And so forth. From the hyena’s point of view, the thickness of the label put on them (minimally, as “adopted”) tries to stickily fix what is really the shifting fluidity of identity. The parents say, “You’re mine,” but you know, at the same time, you’re also, at least in some sense, someone else’s. Or you were. You are of the culture, and not. Within and without, simultaneously. And all the while being crammed into the Procrustean (if loving) mold of the parents’ preconceptions in a way that seems inescapable. Simultaneously, you are conquered and unconquered, colonized and non-colonized—depending on how much you’ve figured out (or, if Jung is right, even if you’ve not consciously figured it out). You’ve been sold by vultures to lions, and you’re neither.

What is true for me, if it’s not already perfectly obvious, is that hyenas are magical in the proper sense of that word, so that I’m “into” what that myth does for me. The hope of transformation—a living creature that mediates between categories in a lived way without being lopsided or neurotically at war with itself. A diviner of taboos and places where others are nervous to enter. It also seems compelling and logical to me that it’s an apt symbol for anti-heteronormativity. So maybe I’m a hyena because I’m gay or because I’m adopted or because I’m both—and only being a hyena allows that duality (and other seeming dualities) without them negating each other in some kind of contradiction. This is not an and/both instead of an either/or, but a rejection of the and/both & either/or binary itself, insofar as it claims (not in solidarity with postmodernism) that all descriptions of experience (and so reality) can only be partial and transitory—the ambiguity of the hyena allows that condition to persist perpetually. As such, it provides an opportunity for becoming unstuck from whatever cultural ossifications prevail, even if not yet a way out, through, or around them; however, being unstuck from the current categories that stick us is a necessary first step. In this respect, I think the above kind of reading of narratives about the hyena (to say nothing of the adopted child) may point to a particular leverage point or point of departure for a “liberatory radicalization of self-awareness and thus expansion from within one’s lived experience, as complicated as that might be, and contrary to anything arrived at via an imposed foreign theoretical framework, or from within extant systems of oppression” (to refer to the original referrer referring to Fanon).


[1] Originally I didn’t have this in an endnote, but I suppose it may seem to some like an aside. With respect to the responder’s point that lions are more noble for killing their prey first: it may seem a bit odd to say, but (personification aside) is it really nobler to be sure your prey is dead before eating it? In the first place, dead prey is more tractable prey—you’re less likely to be harmed by it. Alice Cooper had one of his pet boas die because the rat he fed it somehow internally injured the snake, prompting Alice to remark, “It’s kind of like being attacked by your Wheaties.” So, if we want to talk about brutal practicality, ensuring your prey is dead before devouring it is very calculating. The argument hinges on the notion that being eaten alive is worse than not—and that might be so. But the fact is, once death occurs, it is irrevocable. For all the horror I might experience being eaten alive, I would like the chance later to look back on that experience—even with continuing horror, even with terror and loathing. I can kill myself then if it’s all just too much. There are plenty of circumstances I don’t want to experience, but we cannot (as human beings) seriously desire the “experience of being dead”—there is no such experience. But this is all to say, as the saying goes, that where there is life, there’s hope. More precisely, where there’s life, there is the opportunity to choose hope, and I’d sooner have that hope than not, even if it is in a hyena’s jaws. Once I’m dead, I’m out of the picture; there’s no more wanting, no suffering, no nothing. It might be that I’ve put this argument badly. The missing assumption in the poster’s remark is the guarantee that you will not survive the encounter of being prey. If it is the case that I’m going to eat you, and nothing will change that, then it would seem more humane to do you in lion-style.
(Evidence from the natural world probably contradicts the claim that lions always kill their prey before eating it; one of the more striking images from such moments in the documentaries is the strange calmness of the zebra as its neck is held in the lion’s jaws. I believe the claim is that the creature’s neck has been broken—so, in a context of paralysis, where one can’t feel the experience, is it nobler or not to then devour the prey “alive” … O moral conundrum!) The point here is—and the history of last-minute saves in the world suggests—that the notion “you are guaranteed to be dead” is not convincing. Even the psychopath knows it’s only really all over when the breathing stops. All the fun is in the torture while the person is alive (notwithstanding the necrophiliacs). The agony of the person and the hope of living are what often fuel such sadism; once the person really believes they won’t make it, numb resignation more than struggling may take over (except for the biological struggling of our physical bodies themselves, which can never be persuaded to “give up” except by anesthesia). Our bodies, in that respect, are at times more optimistic than our selves.

[2] Re “ferociously protective mothers”: in Toni Morrison’s Sula, a “fiercely protective mother” burns her heroin-addicted son to death in his bed. So let’s not have any illusions about what fiercely protective might mean—such fierce protection (especially as the biologists would interpret it) has nothing to do with the “spiritual growth” of the child, and everything to do with the “vanity” of the mother’s survival into the next generation. If Erich Neumann (and more generally Jung) is onto something, it is precisely by separation from such “fierce mothering” (i.e., by avoiding reabsorption back into the Source) that we ever emerge in the first place as human beings (cf. The Origins and History of Consciousness and The Great Mother). That this may involve a “Western conceit” is apt but beside the point here, as the blogger referring to the fierceness of mothers is communicating this in a Western context. I want to be clear, however: in saying this (precisely by pointing to a post-Enlightenment “Western” context), I am not commenting upon the ways that vultures have been valorized.

[3] I find particularly amusing those pictures of male lions (I don’t know that I’ve ever seen a picture of a lioness doing this) where “his Highness” has a positively stupid, slightly slack-jawed look on his face. In our house, we call this the “stupid lion look,” which our leopard-like female house cat sometimes exhibits as well—and which all of us have seen in “his Highness” in any number of Bush Jr. images. The insistence that there is “something noble” about the lion must inevitably include this (not necessarily acknowledged) look of stupidity as well—it’s absolutely par for the course, and the way that we can laugh at or feel empathy for the stupid lion is just one way “the system” blackmails us into going along with it.

[4] On the one hand, this claim reaches back to that paradisiacal era when we “as pure children of Nature” simply held out our hand and sustenance dropped into it. Many of the Native American tribes in the US Pacific Northwest were accused of the most rank indolence because they lived so comfortably from the burgeoning bounty of Nature all around them. It was said this abundance made them non-innovative—that being one of the recurrent critiques of Utopia: if all of our needs are cared for, we’ll just become fat wastrels lying around doing nothing. (To name only one idea from those cultures worthy of the human world’s heritage of ideas: the potlatch.) This notion (not killing to survive) also plays into the discourse of a return to childhood, which has a social function (in patriarchy) of keeping women and children helpless.

The intent of this series is ambitiously to address, section by section over the course of a year, the celebrated Crowds and Power by Elias Canetti. This is the second entry in the series.

The purpose of this post is to provide an introduction to a framework for encountering Canetti’s text[1]. It is expected this framework will evolve over the course of the series. As a starting point, opening the book twice at random, we can read:

No political structure of any size can dispense with order, and one of the fundamental applications of order is to time, for no communal human activity can take place without it. Indeed one might say that the regulation of time is the primary attribute of all government (397, emphasis added).

It is unnecessary to point out that a crowd of spermatozoa cannot be the same as a crowd of people. But there is undoubtedly an analogy between the two phenomena, and perhaps more than an analogy (247).

I do not pretend that every sentence of a work must be worthy of the praise given the whole (as “revolutionary” or “breathtaking”, as the object of Iris Murdoch’s or Susan Sontag’s admiration, as providing “dazzling insights and audacious intellectual leaps, some more convincing than others, [that] are startling, shocking–and maybe even true” or even “something brilliant but counterintuitive, and [left to] the reader to figure out both why he said it and whether it’s really true”)—the point here is merely to grab hold of the text, not only to compare receptions of it but also to begin framing the framework for encountering it in general.

The object here in any case is not yet to engage the specific content of the text, but only to point out qualities of the text, as it were. In the first paragraph above, the extreme generalization (i.e., “no communal human activity can take place without [the regulation of time]”[2]) easily invites contradiction. In the second paragraph, the text acknowledges in its first sentence that the analogy is unnecessary, and then insists upon (and even reinforces) it in the next. A friendly reading might call this a reasonable acknowledgment on Canetti’s part; an unfriendly reading might call it duplicitous, or a merely rhetorical move in the face of easily imaginable counter-evidence. Ultimately, the question of whether this analogy is apt (any imaginative analogy might be more or less apt) involves assessing its social consequences, rather than any definitive or speculative analysis of why Canetti resorted to it—thus, it is not so much the reader’s task (as Canetti’s fans insist) to determine whether his observations are “really true,” but whether his description of phenomena is one that does desirable social work for us. In principle, this is not always easy. It takes precious little imagination to read mere reductiveness into Canetti’s notion that crowds consist of spermatozoa; the fact that this explicitly excludes female participation (as part of the social fabric) seems an important part of the analogy[3]. So, quite apart from whether it is “really true” that crowds can be analogized with seminal discharges, we can ask whether we want to live in a social world where that is taken as true.

In attempting to assess the social desirability of the idea, it becomes tempting (and easy) to seek in Canetti’s biography an explanation for why he resorts to this analogy, but this misses the mark. To be sure, an artist (a writer) provides a vision of some sort, which is offered to the world (once a publisher decides it can and should be published). Whatever that vision purports, its social consequences may on the one hand be dismissed or defended (by fans or critics) as merely reflecting the idiosyncrasies of the writer. On this ground, Mein Kampf can be called simply its author’s personal vision—hardly an adequate or desirable stance, even if one isn’t willing to go so far as to insist that the book’s publication should be suppressed[4]. On the other hand, the argument runs that the use an author’s work is put to cannot be “blamed” on the book. In the world of music, this has been tested in court multiple times; Richard Ramirez unsuccessfully blamed his murderous rampages on AC/DC’s Back in Black. Similarly, certain kinds of outdatedness can cause an author’s fashion to wane; Shakespeare’s Merchant of Venice tends to be unproducible these days (one wishes the same were already true of The Taming of the Shrew, but the backlash against sexism has not yet reached such proportions).

The dichotomy here is between the (positive or negative) value of a work as an individual’s self-expression as opposed to the (positive or negative) value of the work in terms of its social consequences. Thus, we can argue that Mein Kampf or Merchant of Venice:

  • should be published (and remain in the public/social eye) for the cautionary tale (about a society of people) that we can read from the works, often quite against the authors’ intentions;
  • should be suppressed (and shielded from the public/social eye) since in the balance their social harm is greater than whatever good they provide;
  • should be published because the social good can be reached only by the contribution of individuals (good or bad);
  • should be suppressed because certain individual contributions can only have a socially disintegrative effect;
  • should be published so that the existence of certain kinds of people are never lost sight of; this is the cautionary tale again but directed to the kind of person in society rather than the kind of society of people;
  • should be suppressed because certain kinds of people should not be allowed to have their works produced; more generally, self-expression (as opposed to social expression) is socially detrimental;
  • should be published because individual works are simply that, individual works (good or bad); how other individuals use that work is not the author’s fault or responsibility;
  • should be suppressed because certain kinds of individual expression seem inevitably to be put to use in socially negative ways (cf. the notion that the limitation of individual liberty increases social freedom).

Jung provides an alternative to the either/or of “self” and “society” (and all that those terms imply and cover) when he writes,

As the individual is not just a single, separate being, but by his very existence presupposes a collective relationship, it follows that the process of individuation must lead to more intense and broader collective relationships and not to isolation (Two Essays on Analytical Psychology, ¶241 f10)[5].

In one way, this may be taken as a restatement of “the social good can be reached only by the contribution of individuals” above, except that the accent there remains individualistic. Rather than asserting the priority of “self” over “society” or vice versa, Jung may be taken as saying that we are already effectively collective and can only be “saved” from that collectivity by the process of individuation—which runs contrary to the (currently dominant) tenets of hyperindividualism.

On this view, we can neither reduce private action to an only personalistic interpretation (positively or negatively), particularly when it is expressed socially (e.g., as a published object), nor dismiss the uses of that social offering in merely personalistic terms[6]. This means (against John Boswell’s exculpation of Christianity vis-à-vis social attitudes towards homosexuality) that the bible does in fact play a non-negligible role in the social reproduction of homophobia. It means (against the NRA’s various forms of insistence) that guns cannot be deemed wholly innocent of gun deaths. It means in general (against the doxa of post-Industrial Revolution social orders) that technology itself is not an innocent tool that is only ever variously used or misused by human beings. As Jung stresses about obtaining social influence (i.e., “magical prestige”), making a point so obvious that its implications seem easy to overlook,

One could easily assert that the impelling motive in this development [of the desire to obtain magical prestige or social influence] is the will to power. But that would be to forget that the building up of prestige is always a product of collective compromise: not only must there be one who wants prestige, there must also be a public seeking somebody on whom to confer prestige (Two Essays on Analytical Psychology, ¶239).

Thus, as against the one-sided notion that “believers make liars,” there must also be those with the “will to power” to promulgate a lie. And against the one-sided notion of an authoritarian personality, there must also be those with the “will to power” to found or enforce authoritarian states. On this view, the fruitlessness of the “self” or “society” dichotomy becomes clearer: instead of (only) a Great Man theory of history (self) or a sociology of the authoritarian personality (society), Jung underscores the dynamic, interactive dialogue that occurs between any given “will to power” (“self”) and the environment (“society” or “public”) where that will to power occurs. This would be a platitude—indeed, the matter-of-factness of Jung’s remark shows that he views it as a matter of course—were it not for the fact that discourses about these things (self or society) have been, and still are, so divisively drawn.

Without covering this in further detail, my approach to Canetti’s text pays no attention to the “psychological” reasons for the work’s composition. What we have in our hands when we read Crowds and Power is not “Canetti’s vision” (at least not only); we also have whatever “social use” the publisher imagined for it. Whatever “merely autobiographical” elements inform Canetti’s book, those details are of interest only to people who want to understand the author, not the author’s work. Biography can be illuminating, of course, but it cannot be the justification or basis for a work’s value.

For example, we can psychologize till the cows come home about why someone would write (how someone would come to arrive at writing) “It is unnecessary to point out that a crowd of spermatozoa cannot be the same as a crowd of people. But there is undoubtedly an analogy between the two phenomena, and perhaps more than an analogy” (247), but this would be a non sequitur. We can revile or applaud Canetti on various grounds for such an observation, but what are its social consequences—that is what matters to me. Which people pick up this thread of an idea and fly their (pessimistic) banner under it? It would be absurd to pretend Canetti’s work does not provide such an excuse—precisely because a socially offered work has consequences in general (both desirable and undesirable). It is absurd to pretend that the National Socialists in Germany did not put Wagner’s music and Nietzsche’s philosophy to a certain use. We might find that objectionable, but objectionableness does not negate the fact (except by killing or silencing those who make the point). When Sontag (on the book’s back cover) reports that “Canetti dissolves politics into pathology, treating society as a mental activity—a barbaric one, of course—that must be decoded” (emphasis added), we are in the presence of a definite kind of use to which Canetti’s work has been put, as the reviews studied in the first post also make clear. And whatever salutary smart there might be in taking a view of social mental activity as “barbaric, of course,” the value of the individual’s observations on this point (both Sontag’s and, presumably, Canetti’s) stands in light of, and as part of, the context of a publisher’s decision to add more brutality of interpretation to the social discourse—to include assertions that crowds are (male-only) hordes of sperm that have no object but to outpace everyone else (so that they fail to reproduce and die), &c.
Publishers, of course, have a vast array of justifications for such activity, but the predominant one is selling books[7]. To be more precise, to whatever extent prestige is desired in offering a work for sale, the money involved never vanishes from the calculus of the transaction. And in that way, everything published may be implicated in the oppression of everyone. And pessimism is a great message for annihilating the individual will to action.

So, whatever Canetti might have meant by offering Crowds and Power, an inextricable part of it at this point is its participation in collective oppression. Resisting that is this series’ main purpose.

[1] All quotations are from Canetti, E. (1981). Crowds and Power (trans. Carol Stewart), 6th printing. New York: NY: Noonday Press. (paperback).

[2] The elemental objection here is that prior to any historical invention of means for the regulation of time (or in those current periods where such means of regulation have ceased to work or are unavailable), human communal activity nevertheless has and did manage to occur.

[3] Taking this paragraph independently, the absence of women from the vision is only partial. Insofar as Canetti limits the analogy to crowd as a wad of semen, Woman (as the passive object of insemination by the victoriously fastest-swimming sperm) is in fact “in the picture,” though in a significantly non-essential way. “Woman” may be the goal of the sperm-crowd in this case, but She does not participate; she waits—waits to be destroyed, overrun, penetrated by the winning sperm, &c. Few feminists, I suspect, would see this male-only sperm-crowd as an implicit compliment of some sort to women, as if they were above participating in crowd movements. In general, the absence of women from crowd movements has little to do with any imputed moral superiority (one that is above participating in such things). But even were that the case, this would smack altogether too much of the bourgeois fantasy of the separation of the spheres (the worldly and the domestic), where the figure of Woman (a reclaimed Eve) is imputed the capacity to save the world by her higher moral nature as a domestic angel. J.M.S. Tompkins’ The Popular Novel in England: 1770–1800 and Kate Ferguson Ellis’ The Contested Castle: Gothic Novels and the Subversion of Domestic Ideology ably expose and dismantle this idealization.

[4] There is something slightly surreal that one may go to “Adolf Hitler’s Amazon Page” at

[5] This remark is Jung quoting himself from Psychological Types (¶758).

[6] For more details on this, see particularly Krippendorff’s (2003) “Design of Cybernetics and Cybernetics of Design”.

[7] This is another perfect moment to point out the justness of Jung’s observation about the will to power. With respect to any number of culturally proffered products (books, violent video games, racist music), one frequently heard remark is, “We don’t make the markets; we just sell to them.” The immediate counter-argument is that offering something for sale itself manufactures a market. So on the one hand, the publishers “blame” the (authoritarian personality of) society for buying the products; they are just the helpless dictator, pandering to the wants of the masses. On the other hand, critics accuse publishers of a will to power—the desire to increase profits, status, social access to resources, &c (particularly in light of corporate personhood). The cyclical resurgence of variations of these arguments points to a circumstance of pseudo-analysis. (A critic once remarked that whereas in science there may be revolutions from time to time, nevertheless there is also a generally consistent forward progress in terms of knowledge gained. By contrast, where pseudo-science prevails, one encounters recurrent resurgences of different sides of a dichotomous argument—as between child-centered or standards-centered education, or between cognitivism and behaviorism in psychology, &c. By analogy, I am suggesting that in those domains where one encounters again and again mere reassertions of the same old claims, albeit dressed up differently, one is dealing with a pseudo-analytics that has the purpose, by design or by accident, of maintaining the status quo—as opposed to a circumstance where one makes some kind of progress in the deepening of an understanding, even when punctuated occasionally by epistemological revolutions.) So, whatever role we want to impute to a “public seeking [a publisher] on whom to confer prestige,” this does not let the publisher off the hook via various arguments that they are simply “selling to the crowd”.

The intent of this series is ambitiously to address, section by section over the course of a year, the celebrated Crowds and Power by Elias Canetti. This is the first post in the series.

As an opening gesture, I want to characterize briefly some aspects of how this work is received, by taking two examples from online book reviews. These are not, of course, scholarly reviews. But just as Northrop Frye once remarked in Anatomy of Criticism that popular literature often shows more clearly the archetypal outlines of literature than otherwise exemplary works of art, the “popular” readings of Canetti’s book reproduced below may be particularly illustrative.

That the author won the Nobel may suade the reader one way or another. But as this work is what got him the prize, which to me says the Nobel must be worth something. If you don’t know Canetti’s work, you won’t get the impression from the title that the man is incredibly funny. But he is. And yet his brand of humor comes only from surgical-precise observation of the ordinary. Canetti is the Montaigne of our time, of modernity, bearing all the marks and scars of our age. If Canetti’s prose has the disarming rambling style that we associate with Montaigne’s, it also has the latter’s power to draw out the most unexpectedly profound from the ordinary. Sort of like old fencing masters: they never run, never sweat, are never fancy, but they always beat you to the jugular. All the scholarship, all the discipline is hidden, like the hull of a ship that keeps the whole thing afloat. In this book, without torturing language, Canetti tells you more about the nature of power than Foucault, and more about the nature of crowds than a room full of social psychologists. (That such a feat is possible ought to be a sobering lesson in itself!) Canetti’s book is a wonderful mix of the potentially tedious (kangaroo behavior) and the…funky. For example, in describing the psychology of mass fear as it relates to its twin, the desire to out-survive others, he cites unexpected examples: burial customs in rural India in which a strenuous attempt is made to appease the spirit of the child if it dies a preventable death; the peculiar madness of Roman emperors; and the Viking warriors’ tradition of piling up a mound of stones before going into battle. Each warrior brings a stone and adds to the pile. After battle, each warrior removes one stone, thus leaving a mound of stones that would represent the dead. Contemplating by the fire the remaining mound was immensely satisfying to the survivors, apparently. Canetti’s notion of the crowd is never just a bunch of people. 
Canetti defines crowd as a cumulation of small units into a large ensemble, causing it to become something entirely different from the units that make it up. He sees nature as the teacher that taught man to behave as a crowd, as a liquid. For example, for the Germans, it is the forest with its innumerable trees, standing vertically, that has inspired the German soul since time primordial in its aspiration to become a marching liquid. For the Arabs, it is the sand of the desert. For the Dutch, it is the threatening sea itself. For the Mongols, the wind. Etc. Canetti’s prose is muscular, never bloated. Given that he was a man of letters, and not an anthropologist, it may be of some significance that his lifelong project—it took him some 30 years to write this book—was shaped by his lifelong preference for a world as envisioned by the ancient Greeks and the ancient Chinese in matters literary, moral, and philosophical. His science is the science of a man confident in his experience and aristocratic power of observation. Canetti never sets out to convince. He has nothing to sell. It is his style to simply put it in front of you, and then leave. Take it or leave it, but this book will never leave you once you begin it.

I suggest returning to this description over the course of this series. For now, I will focus only on the opening and the closing of this review. We are first told that Canetti won a Nobel Prize (without mention of what for), and at the end comes an appeal: “Take it or leave it, but this book will never leave you once you begin it”[1]. The next review, below, likewise ends with, “If you follow Canetti’s lead, you’ll surely come away changed by what you see.” This faith in the transformative capacity of Crowds and Power rests on the claim that “Canetti never sets out to convince. He has nothing to sell. It is his style to simply put it in front of you, and then leave.” In the review below, this sentiment is echoed in, “Canetti’s dazzling insights and audacious intellectual leaps, some more convincing than others, are startling, shocking–and maybe even true.” This language is similarly reprised in the editorial review.[2]

Crowds and Power claims to be a nonfictional “study” of crowds and power, not a fictional work about the same. In this respect, a mode of argumentation that is “take it or leave it,” where “here’s how I see it” is dropped in your lap and the author then leaves, makes for a very poor comparison with Foucault, even in his most postmodern methodologies. Foucault at least offers arguments for his evidence. Moreover, Canetti’s observations lean on being “maybe even true”? This is merely a slippery move (by the reviewer below) to say, “I want to believe this, but I really don’t have any reason to.” And why is this?

Well, if I’d ever once been a cockeyed optimist or a believer in the inherent goodness of humanity, this book would certainly have knocked the foundations out from under me and brought all my idealism crashing down. Fortunately, I guess, I already stand in the after-world of shattered illusions and so Canetti’s *Crowds and Power* didn’t disturb my uninterruptedly black view of human nature with even the briefest flicker of light. It only gave me another way to look at a bleak landscape.

This book is a massive—and for the most part massively entertaining—indictment of the human being at virtually every level of its existence. Whether alone, in packs, or full-sized crowds, our goal is not just survival, but to be the last man standing beside a pile of corpses. No kidding. Crudely put, that’s the bottom line, but its [sic] how Canetti adds up the facts to arrive at his thesis, or, perhaps more accurately, subtracts all the subterfuges we hide behind, that provides the real fascination of *Crowds and Power.*

Somewhat reminiscent at times of Frazer’s *Golden Bough,* Canetti’s masterpiece explores, in part, ancient as well as more recent, but still ‘primitive’ native cultures to reveal the power principle that drives civilizations and those who rule them. At the same time, he shows how the same ruthless dynamic is at work in modern society and in practically all human relationships. Animal behavior, paranoid schizophrenics, the hidden symbolism in the act of standing up, it’s all brought to bear. Canetti’s dazzling insights and audacious intellectual leaps, some more convincing than others, are startling, shocking–and maybe even true. The teeth in their smooth rows as mankind’s first inspiration for order, weapons, and eventually prisons? Is it possible? We laugh when someone trips and falls because it reminds us–in less `civilized’ times–of the fatal stumble of prey. As Canetti succinctly puts it, “Laughter is our physical reaction to the escape of potential food.”

I interrupt the review here. This reviewer allows that he is a cynic, and Canetti is essentially preaching to his choir, so that “Whether alone, in packs, or full-sized crowds, our goal is not just survival, but to be the last man standing beside a pile of corpses. No kidding. Crudely put, that’s the bottom line, but its [sic] how Canetti adds up the facts to arrive at his thesis.” As someone born in 1966, I was neither stunted by the horror of World War II nor stripped of my idealism by 1968. Programmatically, I categorically deny that any one person’s or group’s experience can ever be generalized to explain the human condition—hence the uselessness and inappropriateness of applying Freudian, Adlerian, or Lacanian psychology (to name just three) to all of humankind (even limiting the meaning of “humankind” to Europeans and their main descendants). By contrast, Jung and the Jains long ago demonstrated that Truth (I use a capital T deliberately) manifests much like Hindu deities do: as necessarily partial avatars[3].

The salient point here, without allowing myself to go off on every tangent attached to it, is that whatever the personal/historical reasons for adopting a pessimistic or nihilistic view, pessimism and nihilism are exceptionally well-suited to ensuring the perpetuation of the status quo. The scoffing of many toward the Occupy movement is an excellent case in point (as is the scorn heaped on the audacity of hope), particularly in the face of the so-called Arab Spring. Most assuredly, nothing will change if nothing is tried, so whether Occupy fails or not, to sit on the sidelines and declare it will fail is against one’s self-interests (unless you are the one percent—though perhaps even then). We are all born to die, even in utopia, so if that “failure” is all that matters, commit suicide now and get out of the way.

It is particularly against the use of Canetti’s book (if not Canetti’s book itself) in nihilistic terms that I am answering it. One of the more repeated observations from it (here in the reviewer’s apt summary) is, “We laugh when someone trips and falls because it reminds us—in less ‘civilized’ times—of the fatal stumble of prey. As Canetti succinctly puts it, ‘Laughter is our physical reaction to the escape of potential food.’” To paraphrase one of the most obscene remarks from Dostoevsky: “Prove to me beyond the shadow of a doubt that [Canetti’s observation] is true, and I will still reject it.” However, the observation is at best a hypothesis, even in the context of its “argument” (addressed much later in this series)[4]. The reviewer’s outlook here reflects what I would call an illegitimate use of the point of view exemplified by Augustine’s credo ut intelligam (“I believe so that I may understand”). The reviewer, admitting that

fortunately, I guess, I already stand in the after-world of shattered illusions and so Canetti’s *Crowds and Power* didn’t disturb my uninterruptedly black view of human nature with even the briefest flicker of light. It only gave me another way to look at a bleak landscape

believes that humankind is a disgusting mess and finds confirmation in Canetti’s observations. He adds, “There’s a short cautionary epilogue to the book in which Canetti holds out some scant hope, but you get the sense that he really didn’t feel it.” This reviewer posits that Canetti attempted to delude himself at the end of Crowds and Power; whatever the merit of this assertion, it is clear that the reviewer is working to maintain his cynical and nihilistic interpretation of the book.

A fundamental aim of this project is to resist this nihilism (whether in Canetti’s readers or the book itself), principally because it so ably damns most of us to an undesirable status quo. Resisting this move is worth bearing in mind while reading the rest of this review.

Supporting ideas and examples for such unsettling observations come from the most unexpected places and yet somehow they all come together through the medium of Canetti’s astounding intellect to provide a powerful and plausible view of life that you’re going to have to put out of your mind the next time you find yourself at a party, in the office, or in a crowded theater—well, really anywhere you find yourself confronted with other people. You see, they all have one driving passion: to survive you. […]

Here is one of the most significant aspects to be resisted. The thesis of an inherently disgusting and animal nature of humankind is being enthusiastically insisted upon here, emphasizing “Canetti’s astounding intellect” that provides “a powerful and plausible view of life that you’re going to have to put out of your mind the next time you find yourself at a party” &c. That powerful and plausible view of life is this: “You see, they all have one driving passion: to survive you.” One way to respond to this fundamentally repugnant and massively untrue idea would be to hope that believing it is its own reward, but not only would that be a cheap tit for tat, the primary historical victims of such an assertion have also been all those people whom the paranoiac believes are trying to survive them. It seems significant also that this socially untenable notion is exactly the kind of unsupported generalization one finds over and over in Canetti’s work—for example, in the opening sentence, “There is nothing that man fears more than the touch of the unknown”.

The reviewer concludes:

At times, *Crowds and Power* becomes mired in its own attempt at comprehensiveness; excerpts from source material, for instance, is either too long or repetitive or both and some of Canetti’s theories seem more the result of poetic imagination than philosophical speculation. But these are small caveats beside a work of such monumental scholarship and scope—a courageous work that stares relentlessly at the darkest places in the human psyche and doesn’t once squint. If you follow Canetti’s lead, you’ll surely come away changed by what you see.

Like the first reviewer, here at the end it becomes once again essential to resort to vague hyperbole, the “monumental scholarship and scope” of the work, its courage to stare “relentlessly at the darkest places in the human psyche” without once squinting or flinching. (Once again, it will be worthwhile to recall this description as we work through Canetti’s book.)

In any case, that one must take the darkest places of the human psyche as the starting point for generalizing about the human condition is a dubious proposal, even as a corrective against staring too long at the brightest non-corners of the human psyche. Also, as Jung makes clear (and as Lem delights in engineering in his fiction), when one comes up against the epistemological limit, what appears are not the facts of the human mind but the mirroring images of the one looking. Nietzsche is the most frequently cited abyss-starer in this regard, but the products of the darkest recesses of the human psyche (which by that very definition cannot be “seen” for being dark, no?) are not forever and perpetually dark themselves, as Jung saw over and over. The numinous quality of unconscious material might be terrifying or enlightening—the valence being determined by the attitude of the one looking.

There is no doubt that these reviewers’ “use” of Canetti in the manner they have is rooted exactly in one of the partial truths (lowercase t) referred to earlier. To say that this is a garish misreading of Crowds and Power is a bit lame—an academic point of view might add all kinds of articulations or details or simply point to deliberate or unintentional ironies Canetti built into his text to protect it from being “optimistically” received. Canetti’s text, in any case, is hardly the only one that feeds this particular pessimistic demon; the Bible provides ample reason as well. The point, generally, is that any use of a text (or any text that itself readily fosters a use) tending in the direction of nihilism or pessimism plays into the hands of the Powers That Be. Most assuredly, overly optimistic texts (and uses of texts) achieve much the same end—the former “freeze the soul with horror” so that one no longer acts; the latter “light a fire under one’s ass” so that we waste ourselves on nonproductive or distracting tasks.

If the second reviewer enables himself to face each day by annihilating any significance in it, that’s his own business; socially, however, it only assures that the hole he’s speaking out of gets deeper for everyone. If for me it is the courage of despair that keeps me hopeful in these proceedings, then let the primordial image of refusing the status quo, even if it means destruction, be my guiding light.

[1] The back of Crowds and Power of course has the usual marketing shilling from Iris Murdoch (who was Canetti’s lover) and Susan Sontag. Such marketing material and language takes this form: “Crowds and Power is a revolutionary work in which Elias Canetti finds a new way of looking at human history and psychology. Breathtaking in its range and erudition … Canetti offers one of the most profound and startling portraits of the human condition” (underlining added). The shamelessness of this, of course (as well as the invoking of the magic words of Iris Murdoch and Susan Sontag), has everything to do with getting you to part with your money and nothing to do with any merit of the book’s ideas.

[2] “Elias Canetti’s 1981 Nobel Prize was awarded mainly on the basis of this, his masterwork of philosophical anthropology about la condition humaine on an overpopulated planet. ¶ Ranging from soccer crowds and political rallies to Bushmen and the pilgrimage to Mecca, Canetti exhaustively reviews the way crowds form, develop, and dissolve, using this taxonomy of mass movement as a key to the dynamics of social life. The style is abstract, erudite, and anecdotal, which makes Crowds and Power the sort of work that awes some readers with its profundity while irritating others with its elusiveness. Canetti loves to say something brilliant but counterintuitive, and then leave the reader to figure out both why he said it and whether it’s really true.” —Richard Farr

[3] A massive digression threatens here. The first point that needs stressing is that these avatars of Truth are incommensurable; one does not arrive at the Truth by pooling them or comparing notes. The second is that, while each person’s avatar of Truth will necessarily have subjectively determined elements, there will also be broadly collective elements that whole classes of (literally) like-minded people will share. In Psychological Types, Jung proposed eight classes of truth (lowercase t), while Jainist doctrine stresses a sevenfold notion of truth (lowercase t). Thus, this position offers an alternative to the dichotomy of Absolute Truth versus absolute relativism. It is, in fact, a radically different epistemology from the one typically encountered in “Western” cognition, and cannot be parodied or “placed midway” between these untenable binaries. This proposes a distinctly antipostmodernist view.

[4] It is anticipating things to say this, but here goes. Augustine’s famous or notorious credo ut intelligam (“I believe so that I may understand”) appears to put the cart before the horse (i.e., one should rather know, so that one may then believe), and it has certainly been utilized at times by those particularly interested in controlling people through religion, since “faith is the evidence of things unseen” &c. In one respect, this is the fundamental complaint of Aristotle against Plato. The distinction I want to mention here is that there is a legitimate and an illegitimate use of credo ut intelligam; Canetti may prove guilty of the latter.

This is a continuing part of the ongoing, fruitfully lengthy dialogue between Daniel ibn Zayd and myself, but also between elements in the rest of the world. It builds on two posts here and here.

If my original post was using personal experience and general reflection to think through issues in transcultural adoption I’d never had occasion to think about (prior to being inspired to do so by ibn Zayd), he then usefully widened the lens of those ruminations to bring in a more encompassing view. The post below widens it even further. In one respect, we seem to be “talking past each other” insofar as we are not, for the most part, rummaging around in the details of each other’s blogs. I personally would like to respond to the salient points in ibn Zayd’s response, but I don’t think I would have done so usefully without first offering the below.

As usual, I feel the atmospheric crushing of “TLDR” lingering in the background of any Internet offering. That Daniel refers to his response as “rambling” and is obviously comfortable letting himself work out his thoughts “aloud” and in detail makes me comfortable offering something of the sort in reply to him, particularly as it only edges up against his response in a sideways way. But for you (the random you who is here), you might feel you’re in for a long haul. I don’t see, really, how that can be helped. If you’re not willing or able to reflect thoughtfully on an issue besetting the world, if you’re unable to see the frame through which you are seeing the world and don’t want to see it or spend the time to look, then try to enjoy the trap you’ve bought into for yourself, even if I imagine I can hear you crying to be loosed from it. I can only say that now is no time to be giving up.


What I would want to ask about immediately is the distinction between empire, colonization, and the social articulation of hierarchy that the Agricultural Revolution (c. 6000 BCE) necessitated.

As a network (a constellation) of related topics/replies to the above:

The Agricultural Revolution, by design or by consequence, formalized at least one variety of de jure or de facto slavery in the social orders people articulated through it. Lerner (1986) argues, however, that at least a proto-doulotic (“slave-istic”) discourse already existed in the status of female vis-à-vis male. The paradigm shift the Agricultural Revolution afforded may be read as a male colonization of the female; alternatively, one may read the history of colonization as instances of masculine (sexual) domination.

It is tempting to lay sexism at the feet of the Agricultural Revolution’s hierarchicalizing, but this is untenable. As ibn Zayd (2012) points out, it is helpful to distinguish between how a culture self-describes and what actually happens. Consequently, male-dominated anthropology on Papua New Guinea provides an inordinate amount of “woman bashing” by New Guinean males (in several cultures) without necessarily providing a window (partly due to androcentric biases on the anthropologists’ parts) onto the actual lived lives of women. Illustratively, among the Yąnomamö people (who live in the area of the border between Venezuela and Brazil), males are at liberty to mistreat women seemingly at will, and a major pastime involves raiding neighboring shabonos (“communal dwelling structures”) for more women. Patterns of similar (at least male-reported) denigration of women in classically nomadic cultures may be identified as well[1].

The point of mentioning this is to acknowledge that the denigration of women, which one can find all throughout the anthropological record, does not necessarily arise only in cultures marked by agriculture or its historically derived social forms.

One argument is that these denigrating patterns exist only post-contact in societies not organized along intensive agricultural lines. Granting that a tension in a traditional culture can involve a disparity between the description of lived reality and actual lived reality, such that the (necessary) reproduction of past cultural patterns in light of changing (social, environmental, historical) circumstances becomes virtually impossible, the premise that contact has an essentially inevitable virulence still seems like a negative reconstruction of the white man’s burden[2]. In a case like the Meiji Restoration (in Japan), it is easy to see how Christianity imposed a moralism on aspects of culture where it had not previously been; something similar occurred over the course of the European Reformation and Counter-Reformation (particularly with respect to homosexuality, which was handled far more laxly by the Catholic church up to the 14th century or so). When it comes to issues like “the denigration of women” or “homosexual behavior,” however, it becomes much more difficult (if not impossible) to blame contact strictly (with white people or anyone, for that matter), because both occur so widely and in such a variety of different cultural settings.

If we are going to at least pretend to respect the cultures we speak of (i.e., “represent”), then we cannot deny that culture’s autonomy in the face of contact, except in one case: violence; that is, where violence is involved, there is an obvious gesture on the part of the one using violence precisely to negate the autonomy of another. And while I well appreciate that there are many kinds of violence, here I’m specifically referring to physical violence (and thus also intimidation, the threat of physical violence or death to oneself or toward those one would not want to see harmed). On this view, “contact” specifically means conquering, domination, colonization—the Arabification of Africa, the Sovietization of non-Russian states, the Americanization of the Western Hemisphere, the “civilization” of the British empire, the Ottomization of southeast Europe, the Pharaohization of Africa, and so on and so on—and, barring such forceful conversions, then extermination (Nigeria’s Biafrans, Europe’s Jews, Palestine’s Palestinians, &c).

In any case, the “contact thesis” needs more work at the least. As far as the Papua New Guineans go, for instance, they reported they’d never encountered other people (but for other Papua New Guinean tribes, of course); one has to resort to some contortions to blame the apparent misogyny of Papua New Guinean men on contact or anthropological bias. It’s worth remembering that, almost alone in the whole world, some Papua New Guinean males effectively menstruate; that is, they cut their genitals in order to simulate the monthly bleeding of women. It’s not necessary here to amplify the “we hate women, we’re like them” motif; these are simply examples. And maybe it’s perfectly obvious, but I’ll say it anyway: I point to these without insisting that every pre-agricultural civilization must denigrate women; many of the peoples in North America (perhaps most famously the Haudenosaunee, or Iroquois) reflected a (near-)equality of women—my qualification here (“near-equality”) is due to feminist examinations of these (largely) egalitarian cultures, insofar as they (the feminists) tend still to insist that there is significant gender inequality (and not just recognition of gender distinctions) in these civilizations. In one Peruvian Indian tribe—whose name I cannot recall at the moment—where small-scale agriculture flourishes, the degree of equality between men and women is striking.

Putting this another way, one cannot credit white people (whatever that means) with inventing violence toward other cultures or violence toward other human beings. This is not to deny the kinds of institutionally valorized forms of violence that have been invented, of course. Nor does the violence of “dark-skinned tribes” against anyone else (white, black, or otherwise) provide an excuse or justification for violence (much as people love to resort to that argument). And, just to round this out, I’m not suggesting that violence was “invented” by anyone (any culture) in particular. What I am saying is that violence directed toward women (whether physical, economic, social, &c) is pre-historic (in the literal sense of the word), and so is homosexuality (although its storied expression over the course of human history is suppressed, misinterpreted, confused, and so forth). Just as there are matrilocal cultures where women had a measurable level of respect and dignity, there are cultures where “two-spirits” and “shamans” and other non-heteronormative cultural expressions are not subjected to the holocausts and massacres endemic to cultures that can’t “handle it”. What I want to be clear about: I don’t think it’s “white cultural imperialism” to refuse to allow “beating women” to hide behind the excuse of “cultural relativism”. I think that it is humanly correct to call out the Yąnomamö for how they treat women, no matter how well-socialized Yąnomamö women are to such treatment. Of course you learn how to cope; so did US slaves, but that doesn’t mean their enslavement was justified (much less right). So too with violence against homosexuals (for per se being homosexuals). I don’t doubt that gay activist meddling from the US can make things difficult for homosexuals in certain nations; but again, it is not white cultural imperialism to refuse to allow “oppressing homosexuals” to hide behind the excuse of “cultural relativism”.
If sexuality (like gender—the two are obviously tangled together) were not a fundamental fact of human existence, there would then be a very tiny, narrow window to try to squeeze that argument through, and even then only spuriously. It may sound like I’m saying everyone has to allow homosexuality to be openly practiced and celebrated with at least equal intensity to how heterosexuality is celebrated. (And why not?) But what I’m more particularly pointing to is that, due to violence toward gay-identified people, the very institutions of homosexuality that exist in a culture (and every culture has them) reflect that violence; insofar as violence against human beings is wrong, to defend those institutions of homosexuality (and heterosexuality) as culturally relative or right fails. This is not a slippery slope, because I’m not talking about all forms of cultural expression. Oppressing women for being women is wrong; oppressing someone because of their race is wrong; oppressing someone because of their sexuality is wrong. So if Ugandans want to execute faggots, they’re wrong; if Afghans want to oppress homosexuals (on religious or traditional cultural grounds), too bad. It’s not cultural relativism insisting upon this. The fact that a majority of humans are not gay-identified permits the trouncing and silencing of this point of view—some people go to prison and make a positive experience of it, but that doesn’t justify the prison system. Some people are colonized and have satisfying lives—that doesn’t justify colonization. Some women find deep satisfaction in living whatever cultural lives their world permits—that doesn’t justify sexism. It’s humanly brilliant that someone is able to make a purse of a sow’s ear, but that human triumph cannot be dignified as the desired norm for culture.

But lest I seem to have digressed, I’d venture we agree that physical brutalization (against women, against homosexuals, against people in general simply because of who they are) is an inhuman bigotry you (and I) do not accept arguments for. As we move away from outright genocides and pogroms toward “softer” forms of cultural violence (e.g., economic privations such as one saw in apartheid South Africa or, less formally, in the distribution of poverty in the United States), it can become more and more ambiguous to what extent a piece of oppression is “legitimately cultural” or still “merely oppression”. (The quotation marks are especially necessary there.) This obviously scales intra- and interculturally (internal oppression and colonial oppression). More bluntly, when is it culturally acceptable to deny the right of women, of homosexuals, of a colonized people, to the forms of well-being and life-meaning a culture valorizes? At a minimum, such a denial would have to be a self-denial (US women can elect to be house-slaves, if they want), but those individual choices cannot suspend the abstract choice to choose that self-denial for others. Homosexuals can elect to live their sexuality through the “secret venues” of bathhouses or whatnot, but that cannot negate the society-wide abstract choice to choose that self-denial.

This is probably the baseline “theme” of this whole post. And the issue can properly be complicated further by turning (as transcultural adoptees) to an even more oppressed group: children. Here, “upbringing” is “colonization” of a sort. Where boys make themselves willing to undergo (or are forced to undergo) some of the rites of initiation that readily prompt folks to decry them as “child abuse,” the seeming or real willingness of that process stands in stark contrast to the imposition of clitoridectomies (which Western feminism, among others, frequently critiques in unconsciously or even unabashedly “imperialist” terms). What seems to matter here is that, for boys, such tortures are for the sake of being acknowledged as socially equal (to other men, as adults). For girls, even if they wanted a clitoridectomy, these are for the sake of (as a consequence of) marking them as inferior relative to men (or, if we prefer, equal to women). Let me be clear: the apparent emphasis here on “exotic violence” only makes the broader issue that much more visceral. The violence toward children (of all sorts in the US) is largely undignified by any form of cultural tradition, being nothing but the gross practice of sheer terror (through rape, beatings, &c). Any aversion to violence in rites of initiation can only stink of hypocrisy. But even apart from this—that is, if such violence can be avoided—there is for children of minorities (and queer-identified kids) the emotional and intellectual (even the “spiritual”) abuse to be endured, to be grown accustomed to, to be internalized in order to become a “member of society” (rather: in order to “be allowed” to accept the limited status of a member of the culture, since the full range of “happiness” is not and cannot be available—the devil’s bargain of assimilation again). So I’m not at all only “looking abroad” for examples of violence.
I’m insisting, simply, that bigotry (against women, children, homosexuals—anyone defined as a lesser) is a universal and perennial human problem, no matter how deftly an individual or group may “take advantage” of the devil’s bargain.

You could take this as the “end of this post”. The bulk that follows continues to dig into and mine out and expand further the details impinging on the above.

Somewhere in The Dialogic Imagination, Bakhtin describes the theoretical primordial encounter between two languages (more or less as a synecdoche for the encounter of those cultures generally). His choice of language is interesting, because perhaps in some ways that is the most primordial strangeness of the stranger. A stranger dresses oddly, wears her hair a certain way, has tattoos, even walks backward or whatnot—all of these variations may be more or less readily comprehensible. A uniform serves to signal immediately “I am not you” but it comes also with the thought, “It’s a uniform, a type of clothes; I wear clothes too”. No doubt, some outfits are (by design) intimidating—and we must, of necessity, attend to the threat they pose before we can indulge in the luxury of contemplating the varieties of human expression—nevertheless, those varieties are still recognizable as varieties. But with spoken language, we are facing something else. Even when we know that other people speak other languages, a foreign tongue presents an unbridgeable gulf—unless we can find a translator. (In social practice, we resort to a lot of hand signals and ad hoc guesswork, smiles, laughter, &c.)

Bakhtin, then, uses this primordial encounter to emphasize an initial linguistic representation of an Other in terms of parody. It might take a lot of effort to clown another culture’s headdress, for instance, but verbal parody of another way of speaking is immediately to hand—through imitation. Parody, in Bakhtin’s use, need not mean an unfriendly gesture—unfriendly mocking is more properly attributed to satire. Parody involves a shared laughter, one that includes the one making the parody. It is easy enough to imagine some playful eye-rolling in a sentence like, “You say mā khalaqnā? But we say ‘we created’” as opposed to more hostile situations where the mocking turns derisive and people start insisting that a language sounds barbarous or is the language of barbarians, &c. This hostility extends to linguistic extermination (Russians not allowing Ukrainians to learn Ukrainian, &c), of course. Bakhtin does not insist the encounter must necessarily or always be friendly, but his model points to a range of qualities that can arise when one culture meets another—from apathetic to affective, and then (in the domain of the affective) either an inclusive/welcoming kind of orientation that takes seriously the language of the Other or an exclusive/nonwelcoming kind of orientation that (hierarchically) subsumes the language of the Other in some inferior status.

I want to split a hair about this. The history of Latin around Europe doubtless provides an almost limitless number of examples for how the paradigm of a “master language” and all of the (supposedly inferior, but widely spoken) “local languages” interact and inform one another. So it’s clear, I’m raising this because it is emblematic of how one can model the variety of orientations cultures can have to one another (using language as the central metaphor). Thus, my language can take the “word” of another and transplant it directly into mine (and it may remain italicized for generations and require a footnote any time someone uses it in a text until it finally reaches such a degree of expression that its use is finally familiar—as many Latin phrases now are in English), or I can (in good conscience) translate the word into my own language. The latter particularly involves a kind of (friendly) misrepresentation, even without invoking some understanding of language as a hermetically sealed, incommensurable set of signifiers that cannot (even by analogy) move from one set (language) to another. By this very act of translation, I set the framework for assimilation—I have decided (in a friendly way) that this is what you “mean” and now I expect you to conform to that. This notion of subsuming the meaning of the word to a totalizing vision—a gesture basic to poetry, in Bakhtin’s estimation—he calls monoglossia (“single-voicedness”). Such language is inevitably culture-forming, and Bakhtin tends to mean that in a negative, oppressive way. By contrast, heteroglossia (разноречие in the Russian original) is many-voiced; words already carry multiple meanings (i.e., the history of their use by many different people). In this respect, parody involves less a co-optation of the word to oneself and more a shared (if laughing) use of it—laughing because it is (currently, still) strange, but nevertheless shared in common with the Other user.

With this in mind, the more “contact” tends toward heteroglossia, the better; the more “contact” tends toward monoglossia, the more it tends toward colonization or extermination (as the two most overt forms of outright violence). These patterns play out, of course, intraculturally as well as interculturally—that is, as gestures these can be reintroduced at any point of interaction between two people or peoples. And these patterns play out at the discoursal level as well, where commentators make representations of the people(s) under discussion (as I am here).

If the United States, Britain, and Israel (my “axis of evil”) were ever not disgusting, that moment has surely passed. So I am in no way an enthusiast or apologist for those entities past and present, and even less so for the celebrity (celebrated) critics of the axis of evil who thrive here. Part of what I appreciate in our dialogue so far is that it is not marked overmuch by the kind of regulating negative feedback—in the cybernetic sense—that has the effect of ensuring that whatever disturbances are raised (by protests or criticism) are safely neutralized and absorbed as part of the normal operating of the system. That you are “outside the milieu” adds fresh air to typically stale arguments (which I recently allowed myself to plunge back into with others, rather stupidly), particularly because those arguments are framed (as often unconsciously as not) in a way that ensures they can do nothing but end in whimpering and helpless assent before the status quo. Meanwhile, I’m nonplussed by certain lines of argument or types of criticism that seem (equally unconsciously?) complicit in the crimes attributed.

It may be obvious, by bringing up the standard usage of “civilization” (i.e., the social articulations that came along with the Agricultural Revolution and its antecedents), that I’m going to say: to pretend that empire and colony are coterminous is already anachronistic, but, even more than this, it is a (deliberately) short-sighted view of the behavior of civilization since its inception. Ancient Sumeria “civilized” its neighbors violently, and that’s been going on ever since, so that one can say—not entirely without some historical justification—that the complaints of the Middle East about colonization are disingenuous: merely the last death-rattle of whatever bad management and foolhardiness led to the collapse of the empires of Sumer, Assyria, Persia, &c. But such an argument strikes me as mere politicking—using some vague argument about historical self-determination as an excuse to be allowed to dominate people locally (cf. Israel). Equally, the current axis of evil making this point doesn’t excuse its violent suppression of historical self-determination anywhere in the world. The whole thing boils down to force and what you can get away with, which is the worrisome part. I say this as someone who, whatever privilege I might have, can potentially lose it in a moment if security forces decide buttfuckers should be imprisoned or killed.

Bakhtin talks about “novelistic discourse” to describe elements that preexist his sense of the novel per se—as a particularly generic articulation of (artistic) elements. In the same way, I could try to identify colonistic discourse as elements that predate colonization per se—as a particularly generic articulation of (social, oppressive) elements. Lerner (1986), for instance, says that slavery (or patriarchy) began with the inferiorization of women. Basically, if women were not already slaves vis-à-vis men, then slavery itself, as an institution, could not have arisen.

Since she devotes a book to this, it seems tricky to summarize. In a context where one group (violently) conquers another, it was generally the case that the men were killed, while the women (and children) were spared. Rape, then, becomes an important weapon in such subjugation—though I would distinguish between its use and meaning in recent events like Bosnia and Rwanda as compared to much more ancient times. One could parody what I’m saying here as claiming that women long ago didn’t mind rape the way modern women do; rather, I’m saying that it can only be perilous to be too certain we can understand how women experienced “rape” 6,000 years ago compared to more recently. In any case, I suspect that Lerner’s historically feminist view has misled her slightly.

In a circumstance where one group conquers another, it seems plausible to presuppose that men and women alike would rebel against being put to the yoke, against being subjugated. The Roman Empire “solved” this problem by asking loyalty of those subjugated only in terms of taxes—as far as the rest of their life went (barring “unlawful opposition to the Empire”), they could go on as they wished. So the issue is: how can you trust that the conquered will be loyal (i.e., will not simply rebel)? Part of the Romans’ solution, of course, involved deportation into the mines or the army—this “ensures loyalty” insofar as that particular body is no longer in its original neighborhood to start trouble. It is a gesture that “extracts a resource”. Those who aren’t viewed as explicit resources in this way are at liberty to carry on as they like, provided they render unto Caesar. What this oversimplification points to is: how does Empire conceptualize a (conquered) region in terms of resources? This determination heavily shapes how Empire treats locals. If the region is militarily strategic, keeping the locals happy and loyal provides the necessary buffer; if the region has natural resources, the locals can participate in their extraction but should be otherwise willing to give them up and get out of the way of extraction; if a labor force is needed (locally or elsewhere), the locals should prepare to move, &c, &c.

To return to the historically very early moments of subjugation Lerner describes: other conquering populations, following the advent of intensive agriculture, did not necessarily resort to the extermination of men. In effect, “culture” (having already determined how to subjugate women) then applied that technology to the subjugation of (the surplus labor represented by) men in slavery per se. Among the population of the conquerors themselves, because women were already enslaved as childcarers (if not also as farmers, insofar as women seem to have invented farming in the first place), then as agricultural production grew in scope, only the conquering population of males would have been available for the managerial tasks of supervising large-scale agriculture. This would have contributed (over a long period of time) to the decline of female valuation (as the actual workers of agriculture itself) and to the eventual valorization of the managers. This shift, I say, is visible in the gradual denigration of the Great Mother (from Fricatrix par excellence to mere vessel of male creation) and the rise of patriarchal creator-deities (most famously in Yahweh and Allah).

Okay, but one can still ask (as Lerner does) why women “allowed” themselves to be enslaved in the first place. Or, as Susan Parenti put it, if women initially were recognized as having all the power (if not necessarily the social power), then how did that get turned around to the point of their subjugation?

As one intuitive part of the answer, “violence” has to be a significant part of this process. This is why I’m bringing up circumstances where conquering occurred. As such, one can imagine a necessary violence (rape) toward women as well, which needn’t occur only in such a cut-and-dried circumstance as one group/tribe subjugating another—the case of the Yąnomamö involves “conquering” at the level of one shabono and another. More precisely, men residing in one shabono might raid another, and vice versa. If this has something of the quality of a “game” (albeit a serious one) for the men, it is much harder to feel comfortable saying that the women are, in any sense, playing along with it.

In Shakespeare’s morally appalling comedy The Taming of the Shrew, the first thing to get out of the way is the dominating confusion about the word “comedy” that drives many producers to try to make the play “funny”. For Shakespeare, a comedy (per the Greek original) points to the restoration of the norms of society following a period of threat or chaos (Frye, 1957). In this respect, the New Year’s rites detailed by Eliade are (festive) comedies. Literarily and dramatically, the most conventional (and obvious) way to signal “the restoration of social norms following a period of threat or chaos” is a marriage; thus Shakespeare’s comedies (and other historical comedies, including our contemporary romantic comedies) sometimes match-make dramatis personae with hair-raising offhandedness. Moreover, there is nothing about this understanding of a comedy that necessarily excludes wit, humor, or laughter in general. Again, New Year’s rites involve all manner of laughter—some voluntary, some involuntary, as it were (e.g., insofar as “festival” breaks down the usual social dichotomies, male transvestitism—curiously prevalent during “Pride Week” in high school—makes for a very conventional gesture, which we might find humorous simply in its strangeness or unexpectedness, even as there will be people doing it who do not see it as a “joke” at all). So, the fact that there is wit and humor and laughter in The Taming of the Shrew has deceived some into thinking that the whole thing is meant to be “funny”. Meanwhile, the misprision of The Taming of the Shrew as “funny” requires considerable torsion of the drama’s circumstances—most particularly in the “hilarious” scene where Petruchio (literally) tortures Kate with starvation and sleep-deprivation (along with rape implied but not dramatized onstage by Petruchio’s male servants).
Perhaps the most to-hand resort for dealing with this “hilarious” material is to make it into a “knowing game” where Kate and Petruchio alike are “in it together” or at least “in the same boat”. I saw a partial production of this torture scene where it was played so that it was obvious that Petruchio had himself been up and without sleep for as long as Kate had. Here, the playfulness of the circumstance (if not also the play) was precisely in the “battle of wills between equals” going on. In fact, it was clear from the staging that Kate was actually faring better than Petruchio; he was far more exhausted from staying up. The performers did a less successful job of “equalizing” the fact that Kate (and by implication, Petruchio) had not eaten as well. If Petruchio had, for some reason, declared overconfidently, “I’ll not sleep till I see you submit to me, Kate,” and Kate is able (as an equal) to snub Petruchio by showing greater endurance, it’s not quite as convincing that Petruchio should elect to starve himself longer than Kate. Again, all of this presupposes that Kate is a willing and equal partner to this “game”. It presupposes (on grounds not at all supplied by Shakespeare) that Kate “really likes” Petruchio all along, from the first (or perhaps as a consequence of something like Elizabethan Stockholm syndrome). The title of the play itself argues against such an imputation, but so does the course of action in the play. Nevertheless, these days, without making such an inference (abetted sometimes also by deleting certain scenes that simply can’t be brought under the yoke of “funny”), the brutal misogynist fantasy of this “comedy” cannot be overlooked.

But maybe that is (besides its authorship by Shakespeare) precisely what allows this striking piece of nastiness and meanspiritedness to persist in the Western canon and in performance. So long as the independent woman, either a priori or somehow along the way—by decision, fiat, or torture—comes to accept the necessity of her submission (as a wife), whether hidden under a layer of humor or by construing from the beginning that she really, secretly, desired her rapist all along—that’s certainly an elemental part of patriarchal oppression in our current culture.

So whatever game the Yąnomamö males are playing, it is surely the women who “play along” outwardly who persist, because (in Yąnomamö culture at least) too much overt resistance to that playing along invites violence and even death.

So let there be no talk that when conquerors execute the males and allow the women to live that women assented in some way to their captivity. Even so, as soon as I start to try to imagine some form of “enlightened self-interest” as a reason for how women managed to be unproblematic enough to their conquerors that it wasn’t necessary to exterminate them, I start to get queasy. For instance, some modern physiological studies have suggested that women have a higher pain tolerance than men; in general, men kill themselves more frequently than women, which implies both that “men couldn’t take it” and that “women can”. So it could be said to be women’s capacity for endurance that made them able to bear subjugation better, and thus they could be spared execution. (Hypothetically, in the earliest days of subjugation, no one was killed on policy, but it gradually became clear that men, in their inability to “take it,” ultimately proved more trouble than they were worth, so it became smarter simply to execute them at the outset.) On this view, we can thank women for slavery because they taught us how to “endure anything”. I’d sooner not conclude that. Much as we’ve doubtless needed resilience to survive at all as a species, it’s also clear that our capacity to “endure anything” has led to innumerable circumstances where injustice prevails because we elected to endure, rather than risk death by resisting. Each situation is obviously more complicated than this, but the “patience” of women in the face of relentlessly immature male behavior is more than a little troubling.

Instead of impressive feats of endurance, one might instead attribute cow-like stupidity: women submitted to the yoke of their conquerors because … why not? Or some whorishness—whatever claimed attraction they experienced for their now-dead partner (or partners) translated with sensible ease to the smelly hirsutism or corporeality (as it might be) of their “new husband”—either because women are just easy that way or because the old husband was no less or more of a rapist than this new one. Any description of this sort seems to deny women any principled objection—like Molly Bloom’s too-celebrated “yes,” it suggests that women simply can’t or don’t say “no”. Now, obviously, in the unrecorded annals of invisible history, for every man who, presented with the question from his conqueror “submit or die,” answered “Die!”, there were also women who equally chose “die” … but of course, none of them hung around under the new order.

So, this suggests that conquerors could have been satisfied simply with submissiveness in males or females (by “submissiveness” I mean an “outward obedience that discloses nothing about how the person inwardly feels about that obedience”), but seemingly insisted on making a gender distinction instead. More simply, I’m saying that Lerner’s analysis (and the historical record) miss that there must doubtless have been females who had to be executed just as necessarily as “all the men” (and that there could at least have been certain men who similarly didn’t need to be executed). If it is a question of the conqueror being certain of one’s “loyalty” to the new order, then no male—if males are warriors—might ever be above suspicion, so that due diligence demands killing all the males, despite all protests of loyalty by any conquered. (One can imagine a historical progression where compassion yielded to tear-jerking, only to come back to haunt conquerors later. Those conquerors being, generally, wiped out for their kindness, the historical record suggests only that brutal “no male survivor” policies persisted.) I submit, then, that it was not simply women who were not executed point blank but women with children (whether that meant collectively cared-for children or their own biological children). This accounts for why children were typically not killed, why as a matter of course the conquerors inseminated nonpregnant women as soon as possible, and why conquerors could rely upon the outward cooperativeness of women (without having to impute some kind of woeful character defect to them).

Lerner seems not quite to reach this conclusion. For her, it is enough that women are (biologically) differentiated from men in their ability to bear children—along with the intimidating and humiliating fact of being raped—to account for their submission to being enslaved. This seems insufficient. Who knows, perhaps this category (of woman-as-loyal via childcare) even explains the (hypothetical) male who was not executed at the outset (i.e., precisely the female-identified male who embraced all the social affects of maternity without the biological fact of it).

The point Lerner wholly overlooks is that this means women are not the first slaves; children are. Lerner touches on this at one point, only to deny this category of enslavement (of children) by noting that boys will gradually grow up to be slavers while girls can only grow up to be slaves.

In the first place, some of our oldest human myths indicate the human being (in general) as being in the position of the child vis-à-vis reality. If we had no parents, no one “already here,” and came into the world, our explanation of our origin would not be bifurcated into male and female creation principles, and reality would be forever antecedent to us. If we go back in human existence far enough, through all the chains of “parents,” we can only arrive finally at the first children—and the myths in that realm characterize the origins of children themselves as coming out of rocks and streams and in general the natural world. One could literally become pregnant by wandering too close to a given stream and whatnot, because the source (eventually the “mother”) was Nature, reality, the world, call it what we will. Jung is very exacting about this—we can hardly be blamed for imagining that our (mythological) origins are not (1) coterminal with our biological creators, or (2) mythologically related to the projection of our (human) biological creators in universal Mothers and Fathers. But that explanation itself is already self-evidently told from the point of view of a child—it is a story that children-turned-parents have passed on to more children. Thus, humankind at the dawn of its consciousness was already explanatorily in thrall to imagined/imaginary parents—and children ever since have been in the same boat.

These days, if someone admitted they had bought a woman, the immorality of that would be almost universally recognized. And yet the mail-order bride springs immediately to mind, along with prostitution (as a temporary purchase), matters surrounding dowries (paid lip service to, at least, as secondary to the marital arrangement itself), and prenuptial agreements—this kind of “buying” has been/is normalized so that the transaction is invisible or at least downplayed. Ditto with adoption, obviously—except that one would find enormous consensus that the money has absolutely nothing to do with it at all except (if even this) in the most accidental way. In other words, one can quite openly buy a child (through the proper channels) and get in no trouble whatsoever.

Children embody the original form of slavery. But as subalterns, their nonrepresentation may be even more historically obscured than the experience of women. At the least, feminism has had the opportunity to speak up on behalf of women (and to misrepresent the original slavery as belonging to them, rather than to children), etc. Children are still the only slaves openly traded.

The foregoing means to encapsulate some moments of institutionalized oppression. With the Agricultural Revolution, whatever preexisting slavery (of children and women) there was became more aggressively articulated. It’s not that I need or want to assert that slavery was invented at this time, but there does seem to be a quality of the enslavement that differs from earlier human social arrangements. In all of these cases, the gesture of non-equal differentiation is crucial at every step along the way. If the original children of humankind felt themselves hierarchically less than the Source (however imagined), then that inferiority was imputed to their children, thereby creating the first slave-like creatures. I want to add here, however, that I have seen what happy children look like when they are growing up. These were youngsters in Vietnam, where I was able to visit for approximately 3 months total (over a few years). This is already rambling enough not to need to go into a report on what I saw in Vietnam, but it was definitely markedly different from what I have witnessed in the US. The best slave, however, is one that doesn’t know he is, one who is glad she is. In the US (and elsewhere), the emphasis would be “you are my child” (the child as possession); in Vietnam, it appeared to be “you are my child” (the child as kin). There is that notion, locatable in various times and places, that children are simply small adults, which provides social pros and cons (greater autonomy and respect, more readily exploitable as labor). There are certainly cultures and times where the parental ownership of a child is much stronger and more damaging (creating something much more closely analogous to slavery per se), but so long as the child-adult distinction prevails, to be a child is not to have all the powers and appurtenances of the adult.
The point of this, in general, is that if colonization (as it has been characterized in the last century or so) is going to make special claims, then it would help to make clear whatever further distinctions make those special claims valid—apart from the much older various enslavements of human beings that history (and nonhistory) has witnessed and not recorded.

This bears on the transcultural adoptee directly. The child is a colonized subject—a sapient human being subjected to adultification by more powerful presences who insist on the total righteousness of their adultifying mission. They are at once the imposers of an alien language, the suppressors of whatever “native” language a child would otherwise have/invent/reinvent (the fact that such a language can only seem wildly fictional at best shows how deeply this colonizing project has imprinted its necessity and capacity to real-ize), and the obstacles to other available languages; the exclusive provisioners of resources and goods to the child; the ready extractors of resources from the child (in “benign” cases by exploiting the child’s labor, in less benign cases by sexually exploiting the child); the imposers of religious beliefs (even when merely “framing” what options are out there); and so on. The fact that all of this is surrounded by a neutron-star-dense shroud of sentimentality and cries by parents that they mean well does not negate this analysis. Also, the particular ghastliness of this may be far more exacerbated in the United States than elsewhere, because the notion of the child (as property) is so much more desperately an invisible doxa.

By the way, my remarks about childhood in Vietnam were directed to prepubescent kids. The closer one gets to the tween years—when one is, for all intents and purposes, biologically an adult but not socially recognized as such—the more there will be butting of heads, and the more certain things, previously treated more laxly, can become more serious. The fact of arranged marriages is one—parents who’d never been hardcore might suddenly pull the parent card for arranging a marriage. Similarly, homosexual behavior is often a matter of little to no concern (sometimes it is even encouraged, to avoid pregnancies), but only to a certain age. Once one becomes marriageable, it’s time to be marriageable. There are various social technologies in Vietnam that make extra-marital sexual gratification quite possible, but it is extremely difficult for the gay male or female to avoid marriage altogether (from family pressure). I have no idea what kind of social pressure or repercussions there would be for an openly same-sex married couple in Vietnam—there are, as far as I know, no laws against it (and never have been). The point in all of this is that there’s a kind of tacit contract: “we had you, we cared for you and all of that … give us grandchildren.” To violate the terms of this (non-negotiated) contract tends to be extremely difficult; even Vietnamese people who get to the United States may spend the rest of their lives dodging their mother’s question, “When will you get married?”

Which reminds me of something in the hierarchy of agriculture. In Mesoamerica at least, Raul Garcia has characterized the social structure of the civilizations there as moral economies. The premise is: “yes, we’re in a hierarchy; I’m on top, you’re not. However, just as you have obligations and rights to me, I have them to you as well.” Garcia contends that, in the main, those “up” in the hierarchy actually did honor their obligations. Of course, nothing but vanity and culture compelled them to, but the “workability” of the scenario is not the point, especially as the scenario apparently did “work” for a long time. This notion of moral economy is extensible in both directions, provided that the motivations that compel both sides to honor their mutual obligations do not break down. Just as you emphasize the predominant importance of family worldwide, this can be seen as an expression of an effective moral economy where the parent and the child are in a “workable” scenario. However, you also draw attention to “community”. In the US, family tends to mean the immediate nuclear family; community is usually an abstraction; at best (it seems), families can aggregate up into a sense of community (especially around a common threat), but stress easily drives that tenuous solidarity out of the picture as immediate families simply attempt to get by. In the broader story of human history, “family” has tended to mean “community”—the people of Hawai’i make this explicit by having kin names only for “parent” and “child”. What I in English call my “uncle” in Hawaiian would be my “father”. My point is that “family” (in this community sense) is probably already where the “moral economy” originated.

Whether “family” is understood in a limited familial sense or in a broader community sense, what is obviously and immediately problematic involves the introduction of strangers. I don’t need to enumerate the various possible origins that a stranger (or strangers) might have; I will note that the Greek root “xen-” (as in “xenophobe”) connotes both “stranger” and “hospitality”. Whatever moral economy is in effect, there is then the situation where the newcomer, the stranger, is confronted by the available rights and obligations of those up and down in the moral economy, with a typical expectation by locals that the stranger will either leave or assimilate. The transcultural adoptee cannot, of course, leave (except through worlds of hassle).

I particularly mention this notion of a moral economy because it is perhaps a way to add the distinction I want to colonization (as opposed to resisting the idea of analyzing things in terms of colonialism because, really, at root it is simply the most recent wide-scale edition of domination). That is, while the domination of one people over another—as an extension of the domination of men over women, as an extension of the domination of adults over children—may trump and generalize the discourse of anticolonialism in general, understanding colonialization as patriarchy (a moral economy involving the enslavement of women and children under men) without the moral economy (or without a reliable mechanism for ensuring it operates) might be enough to distinguish colonialism once and for all.

If Empire is the “colonization of strangers”, this necessarily presupposes not only hierarchy, but a hierarchy of differentiation between neighbors (as nonstrangers and strangers). Bosnia and Rwanda (as two recent examples) showed how, almost literally overnight, “neighbors” can bifurcate into this problematic distinction. This necessary distinction can never go away—I’m repeating now some of the points you made in your reply (e.g., that the process of assimilation can never be achieved). Just as a child (for being a child) can’t be “taken seriously”, so the strangers of a colonized place cannot be taken seriously, cannot be trusted with self-determination, etc. What is curious for the colonized is the colonizers’ insistence that their terms of the situation must prevail—but then, as children, we know this strangeness already; parents are notoriously blind about saying, “Because I say so”—it pulls back the corner of the curtain and shows off the ever-waiting violence.

It’s anachronistic to say “men colonize women” or “adults colonize children”, although the articulations of “race” and “gender” and “culture” present in anticolonial discourse seem provocatively useful in the circumstance of man & woman and adult & child, particularly for the transcultural adoptee. But one might say this is problematic also, since colonized cultures most assuredly have their own articulations of colonization (of men over women, of adults over children) that antedate the colonial era per se by millennia. Nonetheless, the project is not simply to liberate a culture so that it may then oppress itself or others (cf. Israel)—the sins of the fatherland do not excuse the sins of its sons.


Bakhtin, M.M. (1981). The dialogic imagination: four essays (ed. M. Holquist, trans. C. Emerson & M. Holquist). Austin, TX: University of Texas Press.

Eliade, M. (1954). The myth of the eternal return (trans. W. Trask). New York, NY: Pantheon Books.

Frye, N. (1957). Anatomy of criticism: four essays. Princeton, NJ: Princeton University Press.

Ibn Zayd, D. (2012). FRAME AND PORTRAIT II: More thoughts on transcultural adoption. Available at

Lerner, G. (1987). The creation of patriarchy. New York, NY: Oxford University Press.

[1] Significantly, amongst the Yąnomamö this “woman bashing” co-exists with a situation in which sexual behavior is under an imposition of discretion but is not otherwise constrained. That is, except for certain incest taboos, there are no taboos about child/child or male/male sexual interaction. There are, however, prohibitions on any sexual behavior by a woman with anyone (even herself) other than her partner. There is a tendency to predict that demands on female chastity will be accompanied by at least claims for chastity in general (whatever else is actually going on—e.g., England’s Victorian-era obsession with female chastity and the vast number of prostitutes in London), but that is not the case here. So this makes particularly clear how specifically female nonsexualness has been singled out.

[2] In a collection of essays, Imagining Religion: From Babylon to Jonestown, Smith (1988) illustrates a vast disparity between the cultural description of bear hunting and the actual lived reality of it—or something like that; I’m working from memory here and no longer have the book in front of me. The purpose of his essay is to denigrate certain kinds of anthropology (as naively accepting a culture’s self-description), amongst other issues. At the farthest remove, there is always a disparity between our descriptions of experience and our experience itself; that is the existential circumstance we all find ourselves in. So if one researcher prefers too much the cultural description, this is not corrected by throwing it out in favor of an “actual description” (ignoring the perennial trickiness that the anthropologist too must be offering a description of “lived realities”). In any case, the cultural meaning is more bound up in and visible in the description of cultural activity than in any outsider’s description of “what’s really going on”. Consequently, changing lived realities will be lensed precisely through the cultural descriptions of lived experience; this is how traditional societies “translate” new “data” into existing molds. We can say this is a duplicitous move, but it’s what we necessarily must do at the cognitive level continuously—our descriptions of lived reality as we encounter it, right now, at this very second, are not only subjectively determined; we have no access to reality per se in the first place. So Smith’s (1988) objections are not apposite, especially as they seem politically motivated rather than motivated by the subject of analysis. One can also see how traditional culture responds in a “modern” setting in Vietnam; it would be too digressive to go further into details here.