The intent of this series is ambitiously to address, section by section over the course of a year, the celebrated Crowds and Power by Elias Canetti[1]. This is the fourth entry in the series and the second addressing Part 1 (The Crowd), which Canetti breaks up into several sections. Rereading my other post about this particular section (The Fear of Being Touched), I realized I did little to offer an alternative to the generalizations Canetti seems to be insisting upon. This post addresses that.

Canetti begins[2]:

There is nothing that man fears more than the touch of the unknown. He wants to see what is reaching toward him, and to be able to recognize or at least classify it. Man always tends to avoid physical contact with anything strange. In the dark, the fear of an unexpected touch can mount to panic. Even clothes give insufficient security: it is easy to tear them and pierce through to the naked, smooth, defenceless flesh of the victim. ¶ All the distances which men create round themselves are dictated by this fear (15, emphasis in original).

A claim made by Canetti’s advocates is that he tends to drop things in a reader’s lap and leave the reader to decide whether they are true or not. I oppose both this explanation and this method, to the extent that either describes Canetti’s book, and I oppose them on principle because the idea that an author is (or can be) neutral is untenable. Barabash (1977)[3] cites from Thomas Mann’s Reflections of an Apolitical Man,

a title which speaks for itself, to a recognition that “being apolitical is nothing less than being simple anti-democratic”, that “when culture rejects politics, the result is error and self-delusion; it is impossible to withdraw from politics in this fashion—one only ends up in the wrong camp”. Recalling how at one time, “in the name of culture and even freedom I resisted with all my strength what I called ‘democracy’, meaning the politicisation of spiritual life”, Thomas Mann says that life taught him, and many like him, a terrible convincing lesson, graphically revealing the shameful ties between the apolitical aesthetic German burgher spirit and the most extreme forms of political terror, barbarism and totalitarianism. … We should recall these lessons more often. For the bourgeoisie today continues to increase and perfect its ability to play on notorious “anti-political” tendencies, and the danger of ending up in the camp of reactionaries may threaten the artist and intellectual who does not want to take any sides (15–6).

So it is not possible to “drop something in a reader’s lap” without an authorial point of view being implicated; such apolitical gestures must be understood as fundamentally reactionary. This fact brings out how reviewers at Amazon can turn Canetti’s work to the end of a kind of cynicism that resembles Schopenhauer’s:

The turning points of history reveal the true price of imaginary apoliticism with utmost clarity, as Mann convincingly demonstrates when he cites the example of Arthur Schopenhauer, “Nietzsche’s predecessor in the area of anti-intellectualism”. This scholar and philosopher, who declared that politics was philistinism, in 1848 called the revolutionary people “all-powerful scum” and “demonstratively proffered his opera-glasses to the officer who stood at the windows of his flat carrying out reconnaissance of the barricades so that it would be easier to direct fire against the insurrectionists”. “Is that what it means to be above politics?” the writer exclaims (16)[4].

Second, where an author seems to be indulging in this habit (of dropping something in the reader’s lap without any particular point of view expressed), it only seems that way because insufficient attention has been paid to the discourse of the work generally. For example, the observation with which Canetti begins his book (“There is nothing that man fears more than the touch of the unknown”) is easily attached to an entire network of ideas that ground the statement in an imputable world-view or truth-class[5]. The idea that it is up to you to decide what something means, and that the author is merely an innocent reporter of these things, readily connects to the late-order capitalist/postmodern milieu that Barabash bashes.

But, having said this, what kind of engagement can come from confronting Canetti’s excessive generalization? How can I make sense of the notion that what I fear most is the touch of the unknown, &c?

A key move I have to make to “get onboard” with this is to wrangle over the term “unknown”. If I am sitting in my study, and I suddenly feel something creepy-crawly on my leg or arm, I may tend to flinch violently; at a minimum, I will try to figure out what it is; I may try to bat it off. I think the strength of response may partly depend less upon the unknowability of the sensation and more upon the degree to which I am “violently” pulled away from whatever was engaging my attention. But it’s not necessary to insist upon this. The thing is: in the presence of such unexpected sensations, while I do most typically react in such a way that I can identify (or see) what it is, it is not because the sensation originates from something “unknown”; rather, my life experience provides me with a wealth of things that it might be, including spiders or other (potentially venomous) creatures, etc. My violent reaction (when it is violent) is not due to the thing being “unknown”—rather, it’s because it may be altogether too imaginable. The justness of this can be seen by flipping the circumstance around: if I am blindfolded, and my mate is going to lightly caress me with a feather, then the anticipation of that sensation and the very fact that I would put myself in such a “vulnerable” position show that I am responding not to the unknown, but to the (in this case) pleasurably imaginable.

So if Canetti’s use of “unknown” is to have some sense, it is by understanding “unknown” in a strictly subjective way, as “something currently not understood by me.” This is where Canetti’s generalization loses traction, less because “unknown” transforms merely into “what I’ve not experienced before” (or something like it) than because of the insistence that an encounter with this must be fearful. Self-evidently, if something imaginably repugnant touches me, I will react with repugnance, and if something imaginably pleasant touches me, I will react with pleasure, ceteris paribus. There is in all of this a remarkably limited view of the unknown, insofar as here it boils down (taking human individuals as systems) to only those things not hitherto experienced (or, slightly more precisely, to any sensation that can be construed, correctly or incorrectly, in either a repugnant or a pleasurable sense).

It is a false dichotomy to imagine that everything must be either repugnant (fearful) or pleasurable (attractive). It is, of course, child’s play to cram every human experience into the Procrustean bed of “love” or “fear”, but this hyperbinary (by which I mean a dichotomy that is meant as an overarching simplification) does not actually get us anywhere analytically, particularly because it cannot make sense of how we are sometimes attracted to what we fear and fearful of what we love. As with all mere binaries (particularly ones rooted in a false opposition, like this one), it is at least minimally necessary to expand the range of categories understood by the binary. Thus:

             Attraction   Repulsion
Attraction   Love         Fascination
Repulsion    Loathing     Fear

These proposed terms can only be considered connotative, not denotative. If “love” and “fear” may be taken as obvious (a perilous assumption, to be sure), “fascination” (I considered “abnegation” also) points to those qualities that appeal to us despite our desire otherwise. Similarly, “loathing” points to those occasions where we “know it is good for us” and yet we want nothing but to negate it, destroy it, be done with it. These affects are perhaps most familiar from the sexual arena, where fascination may lure us (not again!) into yet another horrible relationship with a person we know is no good for us, while loathing may drive us to betray, break with, or call it quits with someone who otherwise seems to be perfectly suitable as a mate, spouse, &c. Minimally, then, we may see that the hyperbinary love/fear (and its expansion above) points to at least four more-salient explanatory terms that “get more work done” and explain more of empirically lived human experience.

The disservice done to the concept of the “unknown” here involves its misprision as something knowable (to me) in the first place. If we take the unknown seriously, as for example Lem takes seriously the notion of what constitutes the alien (particularly in his Fiasco and His Master’s Voice, but also indirectly in his presentation of robots in Return from the Stars), then it is not possible for me to fear it, or have any affective response to it whatsoever. Following that distinction I picked up from somewhere, fear requires an object; anxiety does not. This makes “fear of the unknown” rather “the experience of fear (or some other affect) in the presence of something as yet still unknown”. So, setting aside these various more solid groundings, what Canetti’s sentence seems more to point to is: “man dislikes nothing more than unwelcome intrusions.” He wants to see what intrusions are on their way into his orbit and be able to decide in advance whether to accept or decline the visitor. The fact that Canetti’s mind turns to the example of a robber in one’s house points to this.

With respect to “In the dark, the fear of an unexpected touch can mount to panic,” we remain in the domain of anxiety, not fear. Women will tell us another story, but men may walk blithely down a dark alley or into a silent wood with an easy heart, so that it is only by the presence of a sensation (either in the environment or in one’s imagination)—a sound, a shadow, a fleeting thought, “here there be dragons”—that ease will turn to anxiety as we objectlessly imagine the altogether too imaginable sources for that sound, shadow, or rationale for the thought. But again, there are people for whom this would be positively exhilarating, loathsome, or fascinating, &c. Again, fear takes an object and, in general, is a wholly rational and sensible response to actually present dangers. The maniac brandishing a knife in the alley, the wild animal rearing up in the forest—these are sources of fear, and there is no question of these touches being “unknown”; quite the opposite, they are altogether too clear. It is certainly true that “in the dark, the fear of an unexpected touch can mount to panic,” but this is by (anxiously or dreadfully) imagining touches (like a murderer’s hand over our mouth) that are already known.

When Canetti insists “man wants to see what is reaching toward him, and to be able to recognize or at least classify it,” he is emphasizing the disjunction between the “unknown” and the “imaginable” that undercuts what he is writing. By definition, one can only misrecognize or misclassify the unknown, if it is noticed at all. We do, however, have one category for misrecognizing something while still trying to maintain its character as unknown: the “unknown”. It is likely this is really only “the unfamiliar”, but nevertheless one can take an attitude toward “the alien”, the “numinous”, or “the unknown” and label it as such. As meaning-making entities, we can never not “see” what is reaching toward us; our reflecting consciousness can never avoid (except by death, perhaps coma, or derangement so severe that the reflecting person disappears from consciousness) making some kind of sense of what we sense.

So again, it is less “the touch of the unknown” and more “the presence of unwelcome intrusions”. The adjective “unwelcome” is key here, and literally unlocks what Canetti seems to be getting at, because if the presence has not already been constructed in such a way that it is unwelcome, then one would either not notice the intrusion at all (even in a numinous sense) or wouldn’t fear (the presence of) it. This requires some kind of construction or identification of the intrusion (a priori even), so it is not a question of being unknown. The descriptor, rather, is that it is unidentified, and (because it is unidentified) may default to unwelcome—at least in those cases where intrusion is not desired. Canetti is candid enough when he admits the jostle of someone attractive is another thing altogether.

So it is not that man “wants to see what is reaching for him”; rather, we each will see what is reaching for us, no matter where we direct our gaze. We may overlook one thing by looking away, but our gaze will then be met by whatever reaches from where we look, and even if we squeeze our eyes shut, then darkness, inner images, or entoptic phenomena will reach for us. We can negate the significance of anything our gaze falls upon as well, but we can only do so because we have already been “imposed upon” by sensing—our freedom is (at least in potential) in what sense we make of what we sense, but we are not free to determine or deny that we sense. As self-aware beings, I suspect that this making sense of what we sense is an existential human need on par with eating, drinking, sleeping, etc.

“Man always tends to avoid physical contact with anything strange.” Even in the context of Canetti’s paragraph, this is an ambivalent remark, signaled especially in the “always tends”—a generalization and a hedging all at once. That is, one can only “tend”; to “always tend” betrays a (justly) vacillating mind, I say. Canetti knows (perhaps even in himself) that man does not “always avoid physical contact with anything strange”. I submit that this was probably the original form of the sentence (“man always avoids contact with anything strange”), but this overtaxed even Canetti’s capacity for overgeneralization, and so (perhaps all at once, perhaps gradually over several revisions) the words “tends” and “physical” were added—he’d lost his nerve a bit after “There is nothing man fears more than the touch of the unknown.”

Let’s allow that this could be a fault of the translation, or maybe even the translator herself losing her nerve in the face of the text’s overgeneralizations. If it is not already clear, let it be clear now and henceforth: when I say “Canetti” I cannot mean the once-living existential being who is imputed to have authored this text; I can mean only whatever sense I make of what seems to be evident through that text, not for the purpose of defaming the author, but for addressing the text. Second, lovers of Canetti might defend some point or sense in the original; one can say that the translator did a hack job here or there, and so forth. Be that all as it may, the myth that there is some “correct” text out there in lieu of this one solves nothing; there are, at most, only alternative texts, and critical opinion and fashion may place the tiara on this one or that one. Meanwhile, the publishers of both texts will continue to publish their texts, and the sense derivable from either will continue to be an input to people’s thinking about these matters.

So I propose the thesis that the original of “Man always tends to avoid physical contact with anything strange” was “Man always avoids contact with anything strange” in order to better understand the text, to better get at the underpinnings that Canetti is laying out. For example, by allowing that “man always tends to avoid,” there is the acknowledgment that man sometimes tends not to avoid anything strange. We already have Canetti’s candor about attractive people, but I have pointed out above how the fear/love hyperbinary ignores cases of fascination (as attraction to things that are repugnant).[6] That is, the text’s admission of an “always tends” may be pointing precisely at cases of fascination.

What does the word “physical” (“physical contact with anything strange”) add then? On the face of it, this opens the door to man sometimes tending not to avoid spiritual or emotional contact with anything strange. It also places a particular emphasis on somatic repugnance that ties in with the jostling of bodies Canetti generally dwells upon in this section. It is difficult to ignore how this attaches to the protagonist of Canetti’s Auto-da-Fé, who walled himself in with books and wound up in a disastrous apocalypse by pursuing a physical liaison with a fascinating/repugnant woman (that is the book’s description of her, not mine). Since Canetti uses the (hand-like) phrase “reaching toward” to describe the unknown, the sense here (in this “physical contact with anything strange”) is particularly that of one’s hand reaching out to touch that something strange.[7] But also, if one avoids physical contact with anything strange, an inference (perhaps necessary if not yet sufficient) is that one could also both (1) seek physical contact with anything familiar (love, in hyperbinary terms) and (2) avoid physical contact with anything familiar (loathing).

So we have once again another expanded hyperbinary: familiar, strange, strangely familiar, and familiarly strange.

The switch from “unknown” to “strange” signals, I will say, the commitment “at work” in Canetti’s text. The strange is definitively not the unknown, though it’s easy enough to elide them. The unknown is something unknown; the strange is something carrying markers explicitly other than my experience. The former, if I take the notion seriously, offers me no category into which I may place it; the latter distinguishes itself as not a member of any category I already know. The former demands a kind of paradigm shift in my thinking to accommodate it—if I don’t simply misinterpret it into an available category; the latter requires me to make room in my existing thinking for this difference—if I don’t simply deny it presence in my thinking (or social world). One can liken these to the descriptions of adaptation prevalent in Piagetian cognitive psychology: assimilation and accommodation. In the case of assimilation (parallel here to the “strange”), complex but relatively familiar or unfamiliar objects or experiences are simplified to fit preexisting categories in one’s experience and thinking. In the case of accommodation (parallel here to the “unknown”), the structure of one’s cognition must alter in order to fit the realities of objects or experiences.

Thus, this substitution of “strange” for “unknown” is not an innocent move at all, and this is evident socially as well. If I encounter a human being who in manner, language, and appearance is essentially unknown to me, I can attempt to adjust my categories of “what is human” in light of my new experience, or I can misinterpret all of the differences that I am noting as being essentially “the same” as what I already understand. Reactionary notions of “race-blindness” fit into this pattern, for obviously negative social ends. By contrast, if I encounter this same human being, I can construe them as strange and attempt to incorporate their apparent similarities vis-à-vis “what is human” into my preexisting categories, or I can work myself up into xenophobic reaction to that strangeness. The issue here is less how I might react than the fact that these two basic categories of reaction are endemic and familiar enough. And since assimilation and accommodation may themselves be construed as a hyperbinary, one may see also the two categories of an assimilation of accommodation (i.e., the Devil’s cultural bargain of assimilation in general; the reduction of all human difference into the main category of dominating hegemony) or the accommodation of assimilation (i.e., the exoticisation of the Other, as in Orientalism, the Noble Savage, and the like).

These larger social issues notwithstanding, here the elision in the text from “unknown” to “strange”[8] is actually a crucial moment. In the “problematic” versions of accommodation and assimilation: (1) where accommodation implies adjusting to the given object or experience, the familiar differences that objects or experiences present become the focus of attention, with an aim to incorporating them into thinking; and (2) where assimilation implies the adjusting of the given object or experience, the unfamiliar similarities that the object or experience presents become the basis of strangeness (and xenophobia). These versions are problematic because the former “misses” the actual nature of the object or experience while the latter denies the existence of the object or experience. The apparent familiarity of the former allows us to misconstrue it (albeit in a “friendly” way) as recognizable, while the apparently unfamiliar similarities of the latter allow us to misconstrue it (unfortunately in an “unfriendly” way) as strange—as “not recognizable”, as something, actually, that society “cannot recognize”. The dangers of the former include paternalism, Orientalism, &c.; the dangers of the latter include marginalization, ostracism, and genocide. Just to finish the thought, the “unproblematic” versions of accommodation and assimilation might be termed “learning” and “wisdom” respectively.

For brevity, I am going to refer to the “unknown” as synonymous with problematic accommodation (the “friendly” construing of apparently familiar differences) and “strange” as synonymous with problematic assimilation (the “unfriendly” construing of apparently unfamiliar similarities). Socially, the way that calling someone or a people “strange” (as socially not recognizable) leads to marginalization, ostracism, genocide, &c., is clear enough as a social negative. The opposite insistence, that someone or a people is “unknown”, may be less immediately obvious as a problem, even after mentioning Orientalism, &c. It is salient how gay activism shifted from the early-80s claim “we are everywhere” to the current demand for marriage, which is simply the assimilationist demand (or claim) “we are you.” Any number of critiques of gay marriage precisely on this ground are offered by activists on behalf of non-heteronormative values. I suggest that this is a shift from a sense of being perceived by US culture as “strange” to being perceived as “unknown”. (There are, of course, any number of self-elected pundits who continue to use the dominant club of “strange” to browbeat community people who are nonheterosexually identified.) If in the 80s (and earlier) the equation of pedophilia and homosexuality was widespread enough to require answering,[9] the current laudable desire for necessary legal recognition of people who are not heterosexually identified is precisely a kind of “different but equal” discourse. Note the “inevitable” appearance of the word “recognition” there, because what is at stake (in terms of the descriptive social discourse about the issue) is precisely the shift from the not-recognizable status of “homosexuality” in the past to the (now possible) recognition of “homosexuality” currently. The worried critique of this is that this recognition comes only at the price of the Devil’s bargain of cultural assimilation.

I pick this example because it’s more politically stomachable. Where the designation of someone or a people as “unknown” really shows its teeth is in how people discourse about (particularly not physically present) Others (e.g., in the Middle East, in Africa, in Asia). The emphasis here is on the “friendliness” of the discourse. Under the discourse of “strange” one can denounce lesbians, Jews, and immigrants within one’s culture as destroying it; that at least was who Juvenal blamed some 1800 years ago, showing that the list of usual suspects hasn’t changed much. But when it comes to colonizing, it is as useful to construe people as “unknown” as it is to construe them as “strange” (i.e., Terrorists, “out to destroy our way of life”). LGBT activists (if it’s correct to call them that) in the Middle East have asked that the “help” offered by the Gay International cease, for the various problems it introduces. (This is not everywhere the case, of course.) But one can tease out all kinds of problematics from these “friendly” insistences on the “unknown”. Domestically, the fascination with (i.e., the cultural construction of) Black penises and octoroon mistresses is emblematic of this. Our enthusiasm for the Arab Spring is, of course, predicated on the paternalism of finally “enabling democracy” (amongst ungovernable Arab tribes). &c.

So I am not ignoring at all the problem of the “unknown”. If the bloodshed and violence in the Soviet Union was predicated (in part) on a class distinction that made non-proletarians “strange,” then the bloodbaths, disappearances, and widespread social destruction that occurred in South America[10] through neoliberalism proceeded (again, at least in part) by that same conceit that we now see in the Middle East as “exporting freedom.” In this sense, both “the unknown” and “the strange” can equally have devastating social consequences.

Only because Canetti himself shifts to the use of “strange” will I then emphasize the importance of resisting the desire to make the unfamiliar repugnant. Here, I am no longer using “unknown” or “strange” in the way I just did. Canetti’s shift from “unknown” to “strange” discloses (I claim) the actual foundation from which Canetti’s argument proceeds. Only indirectly, if ever (so far), would he have meant “unknown” in the (problematic) sense I mentioned. His emphasis on “physical contact” (the avoidance of “sticking your hands in something”) is emblematic. That he would shift the “unknown” (in an unproblematic sense) to the “strange” betrays the re-creation of tendencies toward marginalization, ostracism, and genocide, and it is for that reason that I am particularly emphasizing resistance to that.

This emphasis on “physical contact” helps to illuminate Canetti’s “solution” to the fear of touch: the press of the crowd. The absent part of the equation here is, precisely, that such contact is “welcome”. Someone who is in a crowd and stays in a crowd (as Canetti describes it) self-evidently “welcomes” that presence, whether because it is finally a welcome touch (of the unknown or the known), because an unbearable prison of self is finally paroled into the mass-mind of the crowd, or whatnot. Just as in the mosh pit, all of the hard physical jostling and smashing together is, precisely, a “welcome intrusion” (in fact, a specifically sought one). In part, this is precisely based on construing everyone present (whether in the mosh pit or the crowd) as having a similarity of purpose—in other words, one can pretend everyone is the same. Instead of becoming awash in an uncanny strangeness of others, there is the warm fuzzy of Gemütlichkeit (or even collective Schadenfreude). It is precisely this anti-strangeness that is the converse of the repugnance described when, walking on the street, bodies collide. Except insofar as one is free to assume another’s intentions, it appears that Canetti cannot impute or place attention on similarity in a sidewalk setting—there, the somatic bump of one body against another must be an unwelcome intrusion.

It is worth noting that, under the notion of karma, one can arrive at the conclusion (one can arrive there by other concepts as well) that whoever one meets, perhaps even in the most passing of ways, is not a stranger; this is someone you have known in previous lives; there is a preexisting relationship (perhaps blissful, perhaps strained, but a relationship nevertheless). The point is not that such a person cannot be odd; it is rather that they cannot be a stranger. It is without a doubt swimming against the massively rushing current of hyperindividualism in the United States to suggest that we might benefit from not assuming everyone we meet (or see in public) is a stranger (is an unwelcome intrusion). A major trend of technology (portable music players and cell phones in particular) has allowed us to “carry our private world” into the public domain, more or less as a kind of boundary or bubble. It is not simply that we are all on our way to do this or that and cannot afford to stop to chat, but more that the trend is toward colonizing the public with our private worlds. There are limits to this, obviously. Etiquette has (spontaneously) developed such that we tend to go outside when we get cell phone calls, but it takes being a bit more than simply an asshole to tell one’s friends around the table when out eating not to answer their phone or text, &c. One could argue that the “invasion of the public” into one’s home (through the Internet, social media) is a parallel move, but not quite. I may complain that the TV’s news (or the Internet) fills my private world with terrible stuff, but I’m still in control insofar as I can turn it off, ignore it, etc. Whatever (attempted) colonization of my soul I wanted to claim was going on, I’m complicit in it in some way. But when someone denies me their public self (by wearing headphones in public), the mutuality of the social setting is fucked up. It would be received as really weird to ask such a person to “be present to me in this social world we share.” If someone were listening to loud music on a laptop without headphones, or if they were carrying on in some kind of awful way, it would seem much more normal for me to ask them, for the sake of the social world, to tone it down, but our courage to stick up for the public this way is heavily undermined. Or, not our courage, but rather the social feasibility of it—the chances of not seeming off your rocker for making such a request.

Obviously more could be said on this point, but in particular Canetti’s shift to the word “strange” and the way that the crowd (for all he claims) functions primarily as a “welcome” form of touch (in distinction to the anti-social construction of “unwelcome” touch) are among the ways that a cynic can leverage this text. Insofar as cynicism (cf. Barabash, above) is reactionary, this particular way of construing the social world (not necessarily the people in it) as unwelcome individually and welcome when I can imagine we are all the same is obviously an argument (whatever Canetti intends) that is good for the status quo. The less that people are people, the more manipulable they are. As Jung insists:

As the individual is not just a single, separate being, but by his very existence presupposes a collective relationship, it follows that the process of individuation must lead to more intense and broader collective relationships and not to isolation … (Psychological Types, ¶758)

A norm serves no purpose when it possesses absolute validity. A real conflict with the collective norm arises only when the individual way is raised to a norm, which is the aim of extreme individualism … The more a man’s life is shaped by the collective norm, the greater is his individual immorality (ibid., ¶761).

The construal of contact between individuals in the public sphere as “unwelcome” is a sign of the individual way being raised to a norm, so that the collective norm of a crowd is offered as a remedy (as “welcome”); then not only is social life negated by giving absolute validity to a norm, but individual immorality spikes as well. One can see this particularly in the degree of entitlement one encounters in public, with its implicit assumption that whatever is good is whatever I can get away with.

Endnotes

[1] All quotations are from Canetti, E. (1981). Crowds and Power (trans. Carol Stewart), 6th printing. New York, NY: Noonday Press (paperback).

[2] I want to be clear: I’m dwelling on Canetti’s opening paragraph because his section (“The Fear of Being Touched”) has two main moments: providing discourse for the opening assertion and then characterizing its solution or opposite (that man overcomes his fear of being touched “in the crowd”). To the extent that the framing of a problem implies its solution, one will be able to infer in advance some rejoinders to Canetti’s opening salvo, but the details are still worth pursuing.

[3] Barabash, Y. (1977). Aesthetics and poetics. Moscow: Progress Publishers.

[4] See Mann, T. (1955). Gesammelte Werke (Band 12). Berlin, pp. 828, 830–1.

[5] It is not the purpose of this post to describe or characterize this truth-class; it is enough simply to note that it is not difficult to discern here.

[6] It necessarily oversimplifies things to rely upon the hyperbinary of attractive/repugnant, just as love/fear itself is overly simplifying. Etymologically: fascinate (v.)

1590s, “bewitch, enchant,” from M.Fr. fasciner (14c.), from L. fascinatus, pp. of fascinare “bewitch, enchant, fascinate,” from fascinus “spell, witchcraft,” of uncertain origin. Possibly from Gk. baskanos “bewitcher, sorcerer,” with form influenced by L. fari “speak” (see fame). The Greek word may be from a Thracian equivalent of Gk. phaskein “to say;” cf. also enchant, and Ger. besprechen “to charm,” from sprechen “to speak.” Earliest used of witches and of serpents, who were said to be able to cast a spell by a look that rendered one unable to move or resist. Sense of “delight, attract” is first recorded 1815.

Historically speaking, it is pertinent that the sense of “delight, attract” dates only from the 19th century forward, and that fascination previously signaled something that “rendered one unable to move or resist”—Medusa must be a classical expression of this. All the same, the useful and marvelous specifics of this need not ultimately limit the range of examples for the hyperbinary expansion (fear, love, love of fear or fascination, fear of love or loathing). One could make similar remarks about welcome/unwelcome (i.e., those seemingly unwelcome visits that turn out to be highly fortuitous, and those seemingly welcome visits that are finally exposed as terrible. In the romantic domain, two immediate images are highly illustrative: when a mere friend comes to visit and an unforeseeable night in bed occurs, or the joyous arrival of one’s spouse-to-be, only to be told they are breaking up). So if I resort again and again to “attractive” and “repugnant” as descriptors in examples, it is vehemently against the notion—more frequently encountered, especially in the love/fear hyperbinary—that this is the only or even a necessary contrast.

[7] Canetti devotes other sections of his books specifically to hands and fingers, so this may not be an overreading.

[8] Here again, objections about translations are moot. Repeating the paragraph from before: Let’s allow that this could be a fault of the translation, or maybe even the translator herself losing her nerve in the face of the text’s overgeneralizations. If it is not already clear, let it be clear now and henceforth: when I say “Canetti” I cannot mean the once-living existential being who is imputed to have authored this text; I can mean only whatever sense I make of what seems to be evident through that text, not for the purpose of defaming the author, but for addressing the text. Second, lovers of Canetti might defend some point or sense in the original; one can say that the translator did a hack job here or there, and so forth. Be that all as it may, the myth that there is some “correct” text out there in lieu of this one solves nothing; there are, at most, only alternative texts, and critical opinion and fashion may place the tiara on this one or that one. Meanwhile, the publishers of both texts will continue to publish their texts, and the sense derivable from either will continue to be an input to people’s thinking about these matters.

[9] I heard a commentator report, not without scorn and not without referring to the landmark moment when the American Psychiatric Association depathologized homosexuality in its diagnostic manual, that pedophilia will similarly be depathologized in the DSM-V (due out May 2013). This remains to be seen; the current proposed language does not appear to delete this diagnosis. A tangentially related, but illustrative, issue may be seen here.

[10] I point to these examples principally to make clear how, as the heightening rhetoric against the Soviet Union in the Cold War advanced, the “Western world” was systematically destroying South America more aggressively than in previous eras under the bloody banner of neoliberalism. But neoliberalism has also wreaked devastation in the US and England—the accomplishment of Reagan and Thatcher was simply to manage the feat without as much overt bloodshed. That’s the claim at least. The error in this claim is the supposition of a relative lack of violence. For instance, whatever extra-legal disappearances occurred under US-supported dictator Augusto Pinochet’s direction, the numbers pale in comparison to the “legal” disappearances that have led to mass incarceration (thanks to the Reagan Administration’s inauguration of the still-ongoing war on drugs). The plunge in literacy in Peru following the arrival of neoliberalism is at least offset by the rational decision of the Peruvian government to default on its onerous and socially destructive IMF loans. In the United States, the plunging literacy rate is met by calls for even higher standards that assure even more students will be resegregated (socially) and likely funneled into prison. In the calculus of viciousness, the thuggery of merely beating me over the head and dragging me off to shoot me seems at least less duplicitous and disingenuous than creating a social environment that funnels me onto a slope where I almost inevitably fail—all the while blaming me for it.

This is part 2 of a two-part post. Part 1 is here.

(Here is the declaration for “superstition” I offered in part 1 of this post: “The essence of superstition is to believe something despite all knowledge otherwise”. The list of superstitions is:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I come out of the closet, people will reject me or hurt me.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I see the Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.)

People who may have elected not to go see The Dark Knight Rises due to the shooting in Aurora, Colorado are exhibiting a variation of the closeted homosexual’s thinking, and the fact that some of you may be scoffing at such people does not let you off the hook. You will act every bit as superstitiously as soon as the consequences are dire enough. Thanks to the media coverage, one would have to have been incognizant of the event to go see The Dark Knight Rises after the shooting and not have the (unbidden) thought, “What if someone shoots up the theater I’m in,” go through one’s head like a bullet. Echoing in the back of that thought is the foundational kernel of superstition, “Well, someone might.” This is why I say it should still be called a superstition.

But I want to dwell on this some more. First, it has to be becoming more and more obvious that there is some serious egotism at work in superstitious thinking. Apparently, I am so vitally important to the world that if I come out of the closet, simply everyone will continuously beleaguer me in every way possible; they’ll even go out of their way to do so. And if I walk under a ladder, the very metaphysical principle of the cosmos itself (if not actually “god”) will personally intervene to punish me for my temerity. In this case, “bad luck” is exposed as “sin” insofar as committing a sin will have the most dire consequences imaginable[1]. So sin, as an arch piece of superstition, re-casts the actual nature of the world into one where the individual takes on the utmost significance—so that by committing a sin, the whole apparatus of reality must be involved in correcting it. So it is the self-same egotism at work in the (rather excited, thrilling) fear that going to see a movie will put you in the thick of it, like those in Aurora, Colorado. I’m thinking specifically less of those people who stayed home out of fear (probably not most people) and more about the ones who added some additional spice to their experience of the movie by left-handedly “hoping” they’d get caught in a disaster as well—a wish that would evaporate the very moment the first spritz from a gas canister sprayed into the auditorium. One might pick infantile, juvenile, or adolescent as a way to describe this part of superstitious thinking[2].

So, superstitious thinking has a strong streak of excessive egotism about it. In fact, just as the shift from a claim of possible to plausible involves the slippery substitution of one “reality-principle” for another (as when sliding a “god” into a “godless” cosmos), the substitution of the communally shared and lived world of people for one in which I am the most determinative and important factor points to one of the central ways that superstition can get its hooks into us. Current life in the US (just to limit the scope of my comments) is certainly alienating; that adolescent sense of hopefulness or wanting to save the world seems relentlessly disabused by simply observing the world, &c. Either because we remain emotionally regressive around this point (where life supposedly convinces us gradually of our insignificance) or because we really do live with a sense of that insignificance, superstitious thinking is an obvious (and obviously appealing) counterbalance.

In several places, Jung remarks to the effect that modern man thinks he’s ever so modern, but give him a bit of a smack and he turns back into the very living image of an other-possessed non-modern. With the view of superstition I am presenting here, self-honesty can only make us realize how frequently we get entrenched in superstitious thinking. And so we go to the movie, and in moments of distraction it might come back to us again that we could be killed at any moment. Or, what is more likely, as soon as we see someone in the theater who could serve as a plausible projection point for the idea “someone might shoot the place up,” the thought will manifest there. We can then laugh it off, and feel proud of ourselves for not being one of the mindless sheeple stampeded by fear, but the irrepressible recurrence of the thought “it might happen” is the most salient point of this dynamic, because it’s that recurrence that will eventually wear us down. It’s on this basis that the jackass insists, “There are no atheists in foxholes.” If that is supposed to be some kind of proof that religion is secretly our last resort, I’m inclined to see it as a terrible piece of (self-)humiliation that the atheist experiences. The terrible, brainless stupidity of “maybe it’s true” has finally (in a moment when he is most vulnerable) worn him down to his last nubbin, and (in desperation) he reaches out to a completely false, completely inefficacious “solution” that he himself finds humiliating and revolting to grasp. If that’s the Good News, spare me.

I fault no one for wanting to avoid a sense of insignificance in life. But just as that sense of insignificance is a caricature of human life, so the idealization proffered by religion (and superstitious thinking in general) that places the person at the center of the universe (as its sole and determining force) is equally undesirable and simply the exaggerated obverse of the coin. Of course there’s all kinds of “drama” in the religious view, if one bothers—the spiritual warfare of Satan himself (rarely ever one of his minions) showing up to personally fuck with you; just as it is considered the height of affective appeal that the Almighty Supreme loves you. One wants to survive a face-to-face with the devil, just as one wants to survive a face-to-face with James Eagan Holmes (or any of his ilk)—and all of this presupposes the necessity of such a confrontation in order for significance to be generated.

And this is where we begin to encounter superstition on a social level that has important, undesirable consequences. If we don’t walk under a ladder because it might bring bad luck, if we wash our hands because we might get sick, if we stay in the closet because violence might be done to us, this is all of a piece with not doing anything when oppressive security forces start rounding people up to be taken to camps because “maybe the charges are real.” On a practical level, we might choose to believe that because the alternative is too frightening, just as we might try to find a “rational” explanation for random violence that happens to people we know.

It must be reemphasized: these patterns of superstitious thinking apply, to be sure, only where people indulge them, but we all have our own patterns of thinking that will manipulate us just as surely as someone who believes in the power of black cats, germs, or homophobes. The fact that we have to talk down, manage, or ignore the recurrent argument “but maybe it’s true”—in whatever context it appears for us individually—points to the nexus of the issue socially.

One of the more boring and tedious tropes in criticism of the US is the notion of greediness. Most assuredly, many people are seriously greedy, but the overwhelming majority of people in the United States are consuming in excess for the sake of other people, usually children. We can always blame breadwinners and capitalists for being merely greedy, for benefiting from whatever financial depredations they practice upon other people, and of course those depredations have personally satisfying consequences for them, but they also are often in the service of supporting (willingly or not) loved ones, other family members, friends, &c. I mention this because these sorts of affective bonds can play important roles in leveraging our submission to an “it might be true” (superstitious) argument.

As a first fact, hardly anyone (even in Aurora, Colorado) would have thought twice about seeing The Dark Knight Rises after 20 July 2012 if the story had not been widely distributed in the media. Superstition being cultural, there must be cultural dissemination to have superstition—more precisely, to leverage our resistance to the “it might be true” arguments of lived experience. The case of this movie is an excellent one, because so few of us are willing (really) to seriously believe the notion that anyone will jump up and open fire in the very theater we’re watching Batman in. In other words, the perceived consequences (though dire) are undermined because they are simply not credible enough. It’s probably this same “risk calculus” that permits us to get into a car and drive anywhere despite the overwhelmingly greater danger[3]. At this point it may be becoming clear why I left out the adjective “rational” when I wrote, “The essence of superstition is to believe something despite all knowledge otherwise.” Just as we might disingenuously elide the possible into the plausible, we similarly like to elide the reasonable into the rational—where the reasonable is simply that which might be done, and the rational is that which might most plausibly or sensibly be done.

I probably wouldn’t dispute that it is reasonable to “exercise caution when driving, since there are dangers when driving that can be avoided,” but I’m not going to agree that driving cautiously is rationally necessary in order to avoid accidents. This is exactly the same illegitimate move as when hand-washers cite medical mechanisms to justify other grounds for behavior. We all know perfectly well that both attentive and inattentive driving can avoid accidents, &c., and that driving cautiously does not guarantee avoiding accidents. Nor is anyone going to claim that’s what they’re claiming, except that that’s exactly what they’re claiming. Just as hand-washing reduces the risks of infection (or smoking increases your risk of cancer), those statistical predictions (1) are only statistical, and (2) say nothing specific about the specific circumstances one is specifically involved in. We drive cautiously to avoid having accidents, but cautious driving can only guarantee a lowered risk of being in accidents. In other words, cautious driving isn’t really doing what we are hoping it does, just as washing our hands does not, just as being in the closet does not, and just as throwing salt over our shoulder does not. So even in this apparently obvious case of the “dangers of driving,” we see more of the same kind of pseudo-rational superstitious thinking at work. And the underlying objection that I have to this is that it means we are doing things that do not get us to the ends we desire (e.g., driving cautiously to avoid accidents, washing our hands to avoid disease)[4].

So it’s partly for this reason I do not claim that superstitious thinking overthrows our “rational knowledge”. My years of incautious driving could be used as an argument against the rational claim that I should drive cautiously. In other words, my lived experience runs contrary to the rational assertion that cautious driving is the only way to go. Lest there be any illusions, if one wants to say that “cautious driving” is synonymous with “defensive driving,” then I’ve not been a defensive driver; in fact, I’ve very often been an offensive driver, and when I was still in college, I was an offensive driver in the pejorative sense of the word. Looking back, I could say that in a sense I relied upon the defensiveness of other drivers to allow me to be an often irresponsibly aggressive driver. I’ve since calmed down a great deal, but it’s perfectly obvious to me that my empirical experience is contrary to what might be called a “rational knowledge” about how one should drive. Similarly, my empirical experience of not hand-washing is definitely not what might be called the “rational knowledge” claimed about hand-washing. And I know I’m not the only one in this regard. So, if one can conclude that it is reasonable to drive incautiously or not wash one’s hands (or walk under ladders), this is an empirical refutation of the “rational knowledge” claimed under the guise of “if you don’t drive cautiously, you’ll get in accidents” or “if you don’t wash your hands, you’ll get sick.”

Succumbing to superstitious thinking, then, would be when I wash my hands on the presumption that that is enough to avoid being sick, or when I drive cautiously placing all my faith in the notion that that will be enough to make me avoid accidents, or when I don’t walk under ladders with the notion that that’s enough to avoid bad luck. Now, of course one can immediately say, “Wait, that’s hardly a necessary or sufficient condition for any of those outcomes.” Of course, but a person doesn’t tell themselves, “I’m washing my hands to lower the risk of infection”; they say, “I’m washing my hands to avoid being sick” (or, perhaps even more often, “I’m washing my hands because it’s what you’re supposed to do”—which really points up the superstitious part of the thinking). But whichever of the three rationales people give for washing their hands after going to the bathroom, none of them actually has any relationship to the specific circumstances the person is in. If there are no germs being washed off, then one is not in fact lowering the risk of infection, because there’s no infection to be risked. Or, in the more ironic case, it is the infection on the paper towel dispenser that gets them, because they had to dry their hands off—thus the air-driven hand-dryers in public restrooms, which we all—if we have any sense, of course—hit with our elbows to turn on. And so forth.

It might seem I’m being willfully perverse, or that from all of this we should conclude that everything we ever tell ourselves can only be false, so fuck it. Thank you for the hysterics. The point of disconnect is not that everything we know must necessarily be false. I could wash my hands after going to the bathroom because I don’t like how they feel or smell. I could drive cautiously because I’m able to pay more attention that way. Medical science is so fraught with competing explanations at this point that it’s quite hopeless for me to listen to what it has to say in order to make a decision about not getting sick—hand-washing is certainly not one of those strategies. And if I’m afraid of going to a movie theater because someone will kill me, I’ll stay home and be victimized by the burglar who’d counted on me being out—bad luck, that. There are any number of people who would never go to sleep at night without locking their door—and other people who never lock their door. Some don’t lock their door because “that attracts the negative energy of robbers”. Some people don’t lock their car door because “if someone’s going to steal something in your car, at least you don’t have to pay to replace a broken window when you get it back.” All of these “rationales” (as at least worthy of being called pseudo-superstitions) are rooted in actual, lived (reasonable) experiences. I lock my door because, regardless of where I am in the world, I don’t want to create the opportunity for any random anyone to wander into my house—I don’t fantasize that it makes me secure or safe[5]. The only person who ever robbed me was a housemate—and that’s far and away statistically the most likely source of any domestic theft. &c.

Let’s review the list of superstitions again:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I come out of the closet, people will reject me or hurt me.
  • If I see the Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.

As already noted, if the media were not keeping me up-to-date on all the (relatively slight) mayhem and madness (compared to the overwhelming amount of peaceableness going on continuously), then the “but it might be true” would not manifest in my life vis-à-vis such mayhem and madness. And this is where the social aspect of superstition kicks in in earnest. If we want to wash our hands or lock our doors on what are essentially neurotic grounds, for the time being at least we can file that under the category of personal choice. But when 395,000 child abductions are by non-custodial parents and yet we encourage people to think first and foremost about stranger danger, we have a problem. As for “where the problem is” in terms of crime (and this from 2001, for goodness’ sake):

according to the Centers for Disease Control’s Youth Risk Behavior Survey, and the Monitoring the Future report from the National Institutes on Drug Abuse, it is our children, and not those of the urban ghetto who are most likely to use drugs. White high school students are seven times more likely than blacks to have used cocaine and heroin, eight times more likely to have smoked crack, and ten times more likely to have used LSD. What’s more, it is white youth between the ages of 12-17 who are more likely to sell drugs: one third more likely than their black counterparts; and it is white youth who are twice as likely to binge drink, and nearly twice as likely as blacks to drive drunk; and white males are twice as likely as black males to bring a weapon to school.

So whatever neurotic activity you’re up to to avoid “criminal Black youth,” it’s pretty obvious that all of that activity, and thus the activity you support both in your politicians (and the policies they draft) and the media stories that “inform” us about criminal Black youth, is wildly destructive and misguided with respect to social life. When you move to a certain area to avoid crime, that’s right in line with the most pernicious kind of social superstition. Remember again, superstition is to believe something (and act on it) despite all knowledge otherwise. Try to find someone who has any kind of grip on the actual, on-the-ground numbers for the so-called illegal aliens that the news is always trumpeting about—and how many of those millions have you actually encountered? More to the point, how many are “destroying American jobs” and the like? Of course, you don’t know, though you may be able to provide an anecdote about some business that hires “illegals” or whatnot. The salient point for your “argument” is: “what if it’s true.”

As with all superstitions, it is normalized by recourse to illegitimate usages of evidence from other domains. It is extremely uncontroversial to say there are many undocumented people in the United States. That is, of course, a non sequitur, and one could compare the number of jobs “lost” to “illegal aliens” with the number of jobs shipped overseas to places where wage exploitation is easier. People love to talk all kinds of shit about felons, and yet 1 in 4 people in the US now have criminal records of some sort, so it’s a dead certainty you know more than a handful. All that talk about “tougher crime laws” applies not to a mysterious cadre of no-ones you’ve never met, but to all kinds of your neighbors, probably relatives. With James Eagan Holmes’ rampage, once again legislators get to get on their high horse and talk about tighter gun control—superstitiously trying to convince us that such measures actually achieve the ends desired. (Remember, where superstition is involved, the rationale one proposes does not actually reach the ends desired.)

What is at stake here is how the media (as the loyal mouthpiece of the self-elected Kings) becomes a major outlet (if not the major outlet) for whispering in your ear, “But what if it’s true.” And by leveraging that unanswerable argument—even without necessarily having to make you fearful—they are able to then manipulate us into going along with whatever nuttiness they are up to—whether it’s tighter gun laws, bigger or smaller government, war with Iran, rolling back civil rights, suspending habeas corpus, detaining people indefinitely without trial, flouting the Rule of Law which (for better or worse) appears to be one of the major bulwarks of that part of civilization I’m not wholly opposed to, or generally distracting you from doing anything about the nuttiness they are up to. If it is too easy to turn the possible into the plausible, the reasonable into the rational, an essential part of this is how the unanswerable “it might be true” is turned into an irresistible “it might be true.”

Unfortunately, I don't believe that politicians are merely monstrous jackasses out to fuck up the world. They are, however, no better than their fellow human beings when it comes to being susceptible to superstition, though we are right to expect them to be better than that. And much, much more is at stake when States follow superstitions. The part to emphasize is how the claim that the superstition makes does not actually achieve the end sought. For people who prefer conspiracy theories, it is easier to imagine that the War on Drugs was designed (from the start) to effect mass incarceration. I can imagine how this is comforting, because otherwise the tragedy of the thing is wholly too epic to wrap one's brain around. However, whether by design or as a consequence, more than 100,000 civilians in Iraq are dead thanks to the "what if it's true" that Hussein had weapons of mass destruction. As a nation, we should be wholly behind resistance to superstitious policies that do not eventuate in the end sought. We invaded Afghanistan because the government there would not hand over the individual they did not have. &c.

In addition to whatever political and social organizing we might do to resist the susceptibility of government(s) to superstitious policy-making—a process that must be deemed all the more alarming, not simply because more is at stake but because much more complicated and pseudo-sophisticated rationales will be tortured into existence to justify the superstitions; thus, Iraq was "the right war for the wrong reasons"—there is also the project of ferreting out superstitious thinking in our own lives. At this point, I must reiterate that this is not a question of acting only "rationally". It is clear that one can torment data into a rationale using one's rational faculties—we humans can (and will) excuse anything on any grounds. Recognizing this tendency means we can resist it, even if it will never go wholly away (because we will never be able to wholly dismiss the unanswerable argument "what if it's true"). Nevertheless, we can resist that unanswerable argument precisely before it becomes irresistible.

Analogous to coming out, I've recently been in an online correspondence and have been having various sorts of self-insights and affective responses to them, and the process has been (obviously enough) interesting to me. But there was no reason to necessarily believe that my online interlocutor would be similarly charmed by my various mental and emotional conniptions. All the same, I wanted to share with him at least the fact that they were going on and, at the same time, worried that such sharing would not be well-received and might alienate him to the point of no longer corresponding with me. (Did I describe that well enough to make clear the analogy with coming out?) So, in the process of composing the email to him about all of this, I finally had a moment where I became utterly exasperated with my own self-consciousness and simply said "fuck me … Here's what I'm trying to say." Using myself as an example runs the risk of allowing you to dismiss the example as merely personal, so find in yourself your own example where you overcame whatever wall of self-consciousness you might have had going on. How sweet it would be to say that, for all of my worrying in advance, it turned out he didn't mind hearing about my psychological adventures in the least and, in fact, received them with brimming enthusiasm. Instead, what became clear to me after he wrote back was the disparity between what I anticipated and what he actually responded with. What I mean: everything I anticipated proved to be essentially beside the point. Now, of course, because the email was delivered with all my second-guessing, that shaped how he responded, but the fear of rejection (or, not even a fear of rejection so much as a non-recognition of the human reaching-out I was doing) was not gratified. I didn't find myself pleased that he "didn't reject me"; rather, I found myself annoyed that I'd allowed myself to succumb to such a worry, not because it was unfounded (i.e., didn't come to pass after all) but because I'd indulged in a useless superstition.

This may start to seem subjectively trivial. We've all, I suspect, finally divulged how we feel one way or another to another person, and only after going through a lot of anticipatory conniptions first. People who come out after much quite real, agonized soul-searching then find that it was largely a waste of time. The world-vision, the Weltanschauung, that was driving the whole "I can't come out, because" line of thinking proves to be wholly unrelated to life as it is actually lived, even when one is subjected to anti-gay violence of one sort or another. Put another way, the person who avoids bad luck by not going under ladders never has that moment when (two or three non-bad-lucky days later) they think, "Gosh! How great it is I avoided all that bad luck." Most likely, there's no thinking about it whatsoever, but if there were some thinking, one might try to congratulate oneself for whatever it was one dodged. Never mind the grim possibility that one has some bad luck anyway.

This is all kind of the reverse of magical thinking, where we tend to remember certain moments of "parapsychological truth" while overlooking (because we don't even see it) moments that refute that world-view. So, the person who comes out and realizes all those closeted conniptions were (ultimately) a waste of time can take courage from that fact, even if she or he then has to go into a new round of superstitious thinking that (like the closeted thinking) is ultimately the testing ground for finding one's way out of that thinking. Inmates in prison who have settled into routines frequently and tremendously loathe the thought of being transferred to a new prison (or even being released back into the prison of society). There can be a (very seemingly valid) anticipatory fear that things will be awful whenever they get where they're going next, but then they get there and things are both fine and awful in the way that they are fine and awful, quite apart from whatever fine and awful stuff was imagined in advance. Once again, all that preemptive worry turned out to be ultimately a waste of time (except as it provides evidence for why one needn't engage in such preemptive worry).

The more relentless such superstitious thinking and the more dire the consequences of resisting such thinking, the more liberating it must be to resist its rigors. Once again, the parallel has to be cited for those who have come out of the closet—it's perhaps the most familiar, most culturally visible instance of anti-superstitious thinking. But the point to emphasize right now is how you can look in your own life and spot those times when worry proved needless—not because things turned out well, but because things turned out differently than anticipated (things turning out well is simply the "happy reward" of cynical anticipations). For people who are and want to remain cynical—despite a social obligation to other people not to be—this advice will not be welcome or heeded. But for those who find themselves distorting the practice of their lives because superstitious "what if it's true" arguments are shutting them down, it's worth making the attempt to resist that unanswerable argument.

To repeat the opening declaration: the essence of superstition is to believe something despite all knowledge otherwise. Superstition is not, therefore, that part of wisdom that learns from experience and uses it for future action. Even the cynic who has never managed to have a successful relationship has lived, empirical knowledge that his or her future relationships are probably not going to work out—which probably means some self-reflection is due on why things haven't worked out. Superstition, rather, is the belief in something despite all knowledge otherwise. The inmate who never shows his feelings "because he'll be made into someone's prison bitch," even though he's never actually tried showing his feelings, is succumbing to superstition. The person who washes their hands to avoid sickness even though they have numerous examples (from themselves and others) that their rationale doesn't reach the aim sought is being superstitious. The politician and the citizen who pass discriminatory laws on the basis that "too much is at stake not to" are using superstition to erode the whole reason one bothers to have a society in the first place. To act on experience is what we humans do all the time; to act on tortured avoidances of certain things (whether those things have ever occurred at all or are simply so imaginatively frightful that they can't be brooked) is the essence of what I do not want to have in the world I live in.


[1] Significantly, according to Judaism, the wages of sin are loss of social reputation and the diminishment of one's family over the following generations. In Christianity, of course, Hell is the consequence. It is important to connect the "reputational paranoia" of the former to the same pattern exhibited by some inmates—and to remember how being trapped in one's thinking (like being trapped in a prison) can become preferable to freedom. Hell, of course, is simply an eternal prison—and so it is no accident that (most kinds of) Satanism construe Satan as the symbol of freedom.

[2] I'm not emphasizing some kind of essential Schadenfreude here, though there could be that as well. Rather, insofar as one may anticipate future consequences, when that anticipated future is negative, the thinking resembles (if it is not outright) superstition; when the consequences are positive, they're something else (if not also still essentially superstition, though I will maintain there is a distinction). So one may have a naughty, adolescent, thrilled kind of "hope" that a shooter might show up at one's viewing of Batman without really meaning that anyone there would actually be harmed. &c.

[3] It's probably not exactly that simple. There is certainly some reassurance in the fact that we are the ones at the wheel, and the widespread (and theoretically vetted) skills of other drivers in the face of the rules of the road further decrease the sense of consequences. For all of my confidence as a driver, I'm not sure I would sit down on a moped and hazard driving in Saigon traffic, although the self-evident "chaos" of their traffic is obviously amazingly well-organized and effective. The Vietnamese would do well to be afraid of me in such a setting. Similarly, were someone from the US to drive in England or on Germany's autobahn, etc. Nevertheless, the point is that each of us either adduces arguments against the "it might be true" for "if I go driving, I'll be hurt" or finds some alternative to the imaginable direness we're confronting. It may also seem, at this point, that such a reasonable risk calculus as is involved here no longer much resembles (or can be fitted into an interpretive scheme as) a superstition. For the agoraphobic (or the autophobic), this is clearly not the case. By definition, a phobia is an unreasonable fear, so that the only one who is not phobic is someone too unreflective to notice their phobias. My point is that just because we might be willing to pay the price for whatever sacrifice is demanded of us (in terms of giving in to superstitious thinking), such superstitious thinking should not be fostered in general, for reasons the rest of this essay makes clear.

[4] There is a kind of argument that could be raised here that is tangential to my main point. Just as hand-washing can increase one's susceptibility to disease, so can cautious driving increase one's risk of accidents. The only person I ever rear-ended at a stop light was someone who (turning right) started to go and then stopped, because she felt the oncoming traffic was too close. I moved forward as she started to go, and didn't stop when she did. The point is not whose fault this is: she and I were in an accident because her caution, which I didn't anticipate, made her drive in a way that permitted me to hit her. People driving too slowly (below the speed limit) on freeways, &c.—it would be interesting to know exactly how many accidents per year are caused by cautious driving. For the closeted homosexual, certainly back in the day, the fact of being closeted meant that trolling for sex (in parks, in bathrooms) often carried considerably more danger. It used to be that one could not serve in security forces because, the argument went, being gay meant you were susceptible to blackmail. &c. One could multiply examples and provide counter-arguments, but the point is not "what happens in each of these specific situations" but rather "my anticipated thought about the circumstance must be true". One can blame me (from a legal standpoint I certainly was culpable) for rear-ending the woman's car, but in the lived world she was still in an accident—and it's not negligible that for all of my wildly incautious driving over the years, that's the only time I've ever actually hit someone. Against the doxa, "one should drive cautiously," the rejoinder here is, "well, you've been pretty damn lucky then." I mention this to point out how, in the face of a superstition, the resort to metaphysics (luck) is preferred to acknowledging that the doxa ("drive cautiously") might not actually be as true as claimed. All of this feeds precisely into the effect of superstitious thinking in public life.

[5] I'm not congratulating myself for an anti-superstitious stance here. Obviously, for me, the consequences of leaving the door unlocked are dire enough that I'll make the minimal sacrifice of turning the deadbolt when I'm in the house. And as much as I might want to say it is rational for me to do that—as a kind of domestic recasting of Pascal's Wager, what really do I lose by locking the door—I have to call it pseudo-superstitious, because living the alternative (i.e., freely choosing whether to lock or not lock the door) is not really on the map of possibility for me. I could do it, and it would bother me. I have friends who leave their doors unlocked; I don't begrudge them that, etc. On the most abstract level, I could argue that I should be a door not-locker—what makes my locking the door a superstition, in my view, is that I do it without any actually empirically good reason to do so. I could trot out some racist bullshit about "neighborhoods" and so forth, but what are the "real crime statistics" where I live? What is the plausibility that a home invader should suddenly strike in my neighborhood? "But he might." Put another way, you are sorely deluded if you think I'm excluding myself from the perils and social undesirability of superstitious thinking. My guilt does not excuse yours, however.

This is part 1 of a two-part post on our current social world.

The essence of superstition is to believe something despite all knowledge otherwise[1]. Examples:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I come out of the closet, people will reject me or hurt me.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I see the Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.

Obviously, I chose these examples to illustrate and broaden what I am proposing we understand as a superstition. Amongst what I would call the salient details at work in these superstitious conjectures, one in particular seems important above all else: they all hinge on unanswerable arguments.

What is important for us today is not that someone might actually seriously believe that walking under a ladder will bring bad luck; what matters is that, even when we know better, we still don’t walk under ladders. It is precisely for this reason that I walk under ladders; I am not going to hold myself hostage to what my thinking is doing at that moment.

Of course, walking under ladders doesn't bring bad luck—belief in bad luck itself is already a popular superstition—but the thought may still cross one's mind, "But what if it does?" On the basis of this laughable assertion of a possibility, we then hedge our bets and don't run the risk of offending the gods of bad luck or whatnot. Resisting this kind of thinking is why people record themselves on YouTube blaspheming the Holy Spirit (i.e., publicly committing the unforgivable sin).

This is what I mean by an unanswerable argument. For all that we know walking under ladders doesn't cause bad luck, the rejoinder "but it might" cannot be answered. Imagine a dialogue between two people. A: "Walking under ladders doesn't cause bad luck." B: "But it might." A: "No, it does not." B: "But it might." Unless A walks away, this will never end with B admitting or coming to realize that the insistence here upon a possibility is, if not bogus outright, then utterly negligible. So, over against the idea that walking under ladders is bad luck, one might usefully ask, "Is that really plausible?" Thus, to no small extent, superstitions insist on "blackmailing" us with arguments based (spuriously or not) on possibilities, which can (or might at least otherwise) be viewed in terms of plausibilities.

Revisiting the initial list, thinking in these terms:

  • If I walk under a ladder, I’ll have bad luck.
  • If a black cat crosses my path, I’ll have bad luck.
  • If I don’t wash my hands after going to the bathroom, I’ll get sick.
  • If I come out of the closet, people will reject me or hurt me.
  • If I see the Dark Knight Rises, I’ll be shot by a madman.
  • If I move to a new town, bad things might happen.
  • If my spouse walks to work, he’ll be mugged or worse.
  • If we don’t build a wall on the border with Mexico, the US will be overrun.

For your own sake, it is worth noting which of the above you laugh off as obviously and absurdly neither plausible nor possible and which you are not so sanguine about.

Obviously, when people argue (with others or with themselves) about possibilities, they must implicitly take the thing to be plausible. However, certain kinds of possibilities, in order to be plausibilities, presuppose a shift of premise that itself may be groundless. A most notorious example of this is Pascal’s Wager, where he argues (logically enough, if to the great affront of Faith itself) that one loses nothing by believing in the biblical deity[2]. The argument hinges, in part, on the possibility that the biblical deity exists, but this possibility cannot be turned into a plausibility except by rejecting the whole of lived existence and substituting in its place a vastly different explanation for “life.” There is no question that, in the western monotheistic traditions, there has been no shortage of effort expended to interpret the nonexistence of the biblical deity in terms of faith (e.g., “invisibility is the form of god”; “silence is the voice of god”). The point is that, in everyday life, between two competing explanations, we almost invariably (however much this might be a problem) pick the explanation that accords with lived-experience. So if my prayer isn’t answered, I can thank the divine for unanswered prayers or I can—in a manner far more consistent with all of the other kinds of decisions I make each day about which explanation to follow—conclude that prayer isn’t an effective intervention. I am mentioning this not to bash religion, but to point out how the elision of possibilities into plausibilities is not necessarily a legitimate move, though it is a popular one.
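
To make the sleight of hand concrete, here is a minimal sketch (in my own notation, not Pascal's and not anything quoted above) of the expected-value form the wager is usually given, where p is whatever probability one grants to the deity's existence, c is the finite cost of belief, g the finite gain of unbelief, and L the loss assigned to unbelief if the deity exists:

$$
E[\text{believe}] = p\cdot(+\infty) + (1-p)\cdot(-c) = +\infty \quad \text{for any } p > 0,
$$
$$
E[\text{do not believe}] = p\cdot(-L) + (1-p)\cdot g, \quad \text{which stays finite (or worse).}
$$

The entire force of the comparison rests on treating p > 0, a bare possibility, as though it were already a live plausibility; refuse that promotion and the arithmetic never gets started.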

I've gotten into some surprisingly contentious arguments about this when I attack the superstition, "If I don't wash my hands after going to the bathroom, I'll get sick." The most I'll grant here is that if I don't wash my hands after going to the bathroom (in a public restroom), then I may be increasing my exposure to infection. Let us be precise. In order to become "sick" (whatever that means, as opposed to the more precise "become infected"), there must be some germs present in the public restroom that I am likely to be susceptible to. It is of course possible that such infectious germs may be present, but is it plausible? In order to make such a determination, one would have to be a bit of an expert on this particular bathroom, to say nothing of knowledgeable about what sort of infections I'm susceptible to. Against this claimed plausibility, which I've encountered from adamant people, stands the empirical experience I have had: not washing my hands after going to the bathroom has not made me sick. In fact, I can say that I know of no moment in my life when failing to wash my hands after going to the bathroom could be causally linked to me being infected to the point where I would describe myself as sick.

The people who make this argument to me are not necessarily neurotic hand-washers, although that class of person will be a leading and loud advocate of this superstition. But this points to the fact that the more "normalized" a superstition is, the more there will be pseudo-rational arguments to try to make it plausible. I say pseudo-rational because the quite evidently empirical medical fact about how infections occur in human bodies is leveraged here in an inapplicable context and toward an irrelevant end. That is, the universe of the neurotic hand-washer (and milder advocates of this superstition) is not one where a "public policy" of hand-washing would begin by performing an actual risk assessment in any given (public) bathroom setting to determine whether or not hand-washing in this particular instance is actually necessary, but rather one that proceeds from the premise that piss- or shit-daubed fingers are unnerving and disgusting. A quasi-sophisticated version of this leverages notions like "herd immunity" and the socio-moral responsibility of not becoming a carrier of such germs (to other people). Again, the legitimacy of these observations in their proper contexts is illegitimately marshaled to the superstition advocate's purpose. Nor am I saying that any such policy of hand-washing could be or should be determined in this way; my object is only to point out that what is ultimately and relentlessly at root in the insistence "you should wash your hands after going to the bathroom or you'll get sick" is no better argument than "well, you might."[3]

In any case, I suspect most people are definitively certain that walking under ladders, letting black cats cross their paths, and even not washing their hands after going to the bathroom[4] will have no untoward consequences, even if people don't readily admit the last.[5] But when it comes to the threat of potential violence if one comes out of the closet, the pseudo-rationality of the argument acquires a much thicker shell. In part, this is because the anticipated consequences are more dire. If with black cats and ladders the threat consisted of the numinous but vague "bad luck," and with not washing one's hands merely the unpleasant but transient "illness," with the social consequences of rejection for being queer one faces both physical violence outright and all manner of potential social violence (e.g., loss of work, inability to get work, loss of social status, &c.).

A disadvantage of using "if I come out of the closet, people will hurt me" as an example is that it may make you feel the example has nothing to do with you. Yet almost no one lives wholly without some piece of personal information they are not "out" about, and which they keep to themselves to avoid (pseudo-rationally anticipated) consequences. Again, just as the superstition of hand-washing inappropriately leverages empirical medicine to argue for its plausibility, so the very real, ongoing, and potentially ubiquitous violence against people who are queer-identified provides potentially excellent arguments for not being out. I am in no way suggesting that such violence is "all in one's head"; rather, it is most definitely committed by bigots, &c. But one of the main differences between being in the closet and being out is that the former condition is marked by an awareness of potential violence 24/7, and the latter only in those circumstances where one (more or less validly) determines that such violence is, in fact, likely[6]. The advantage of using this superstition is the experience countless queer-identified people have reported upon coming out. I like to say that I have never met anyone who regretted coming out of the closet; for the sake of completeness, I'll say there have been two people in my life who claimed to know someone who regretted coming out. There have been people who resented being outed. &c. The issue here is not to bog down in the details of the example, but to point both to the greater leverage we can get with ourselves (because, in this case, the consequences of going "against" the superstition can be so much more dire, up to and including death) and to the immense relief that people report from no longer giving in to this superstition. I am surely not alone in reporting the vast difference between the "paranoia" of my closeted existence and the near-total absence of direct violence I have subsequently experienced as a "faggot." The point is not that I have not been beaten up, of course. The point is that I have not spent decades living constantly with that worry or fear.

Partly, this is exactly because of the direness of the consequences. To go out into the world every day anticipating the worst violence on the off chance that someone figured out my sexuality was, of course, heavily tiresome and wearing. The clear implication is that liberation from similarly heavy and wearing superstitions in our lives portends that much relief. In prisons (one could cite historical and current examples as well), one's reputation is (arguably) all one has. There is the pseudo-rational fear (pseudo-rational insofar as it is rooted factually in certain kinds of epistemologies in prisons but here inappropriately extended to cover every waking minute of the day) that if anyone "thinks ill" of you, you will be made to suffer in some way (e.g., your shit will be taken, you will be made into someone's prison bitch, you will lose social status and thus access to the things that will meet your needs). Here again, the consequences of not taking this superstition seriously ("if I don't maintain my reputation, people will take advantage of me") are dire, but the emotional cost of maintaining this superstition 24/7 is self-evidently not worth it, once the risk is taken not to worry 24/7 about this. Zillions of people who are queer-identified have discovered, somewhat to their chagrin or embarrassment, that all the fuss they imagined about coming out turns out to be wrong—that for the overwhelming majority of people, no one gives a shit that you're gay[7]. So the man who is hypersensitive about his reputation discovers (perhaps thanks to a particularly humiliating public experience) that his touchiness was not well placed (even after the public disaster).

Hopefully it is not controversial to say that most of us are not happy to avoid living the life we'd rather be living because we feel we must act a certain way in order to avoid undesirable consequences. For something simple, like not walking under a ladder, we don't "mind the sacrifice," in particular because the consequences of disobeying aren't so dire. Similarly, while the twenty or so seconds spent rinsing one's fingers in a public restroom sink (never mind that studies show the sink to be the dirtiest area of a public bathroom—oh the irony!) may be called a sacrifice, it's perhaps not really "warping the fabric of our lived experiences." It is the smallness of this "sacrifice" that makes it unproblematic in general. But where the threat of violence starts to loom, as with coming out of the closet, it becomes much more ubiquitous and depressing to maintain the superstition. This is exactly why hand-washing is actually a "trauma" for the neurotic hand-washer; it is precisely because the consequences of not washing one's hands are construed in a very dire way—a way far more dire than most of us would credit[8].


[1] I’m tempted to say “rational” knowledge otherwise, but why I eschew this adjective will become clear soon enough.

[2] On atheistic grounds one obviously loses a great deal, perhaps one's wholly terrestrial existence, but this was not what mattered to Pascal.

[3] It’s slightly gratuitous to continue this point, but some medical science suggests that excessively hygienic people actually increase their susceptibility to infection, because their immune systems are never tested in the trial by fire of infections. To whatever extent this proves true, the other arguments (about herd immunity and any social obligation not to pass along infectious germs) obviously must be similarly modified by this medical insight.

[4] There is doubtless a gendered distinction here. I would expect males to be more convinced that women's fingers are filthier after urinating and vice versa. Males (even gay ones) seem to be little in the habit of freaking out over the possibility that the man whose hand they are shaking may have recently touched his penis. To whatever extent we guys "keep our junk clean," we can count on other males to do so as well—that is, we have no trouble assuming that, if we even think of it at all, which is probably rarely the case. Besides, it's not as if we douse our fingers in piss as we urinate, so that the more probable source of any "grossness" in a bathroom will be assumed to be not our trusty and beloved penis but that loathsome cloaca of the flush handle, &c. Reasonable as this assumption may seem, this explanatory framework would still have to explain how the flush handle became that loathsome cloaca in the first place, if nothing but hygienic, healthy genitals have been handled in the room. Doubtless, shit is the culprit—its viscous clinginess, the proximity of fingers (separated only by a tissue-layer of paper), &c. Because, seriously, while we might be able to count on people (guys) to not wag things around so vehemently that piss flies all over everything, isn't there (honestly) just too much at risk when we're asked to trust that some guy running his fingers over his anus hasn't come into contact with the tiniest speck of feces? Finally, at last, we come to the heart of darkness, to the moment and presence of evil itself. Which is all to say, advocates of the hand-washing superstition will feel they are on much stronger, much less assailable ground, when insisting that people who have taken a shit should wash their hands.

[5] Somewhere there are those amusing or sad studies demonstrating that people are more likely to wash their hands in a public restroom if they know there is someone else in the restroom with them than if not. If it helps you to be honest with yourself, I'll readily confess this is often the case with me—more precisely, I feel a sense of being judged (imagined or not) when I walk out of a public restroom without washing my hands. This whole cognitive process is itself superstition. I can argue to myself that it's plausible someone in the bathroom might judge me, and however much I insist that is not the case, I can always come back with, "But they might." This points up clearly how the issue ultimately has nothing at all to do with "what actually happens" in lived experience, but rather with the virtual anticipation of what might happen, and the decisions we (all) make in light of those virtual anticipations.

[6] Even in these "likely" scenarios, one can frequently be surprised. Where being out at work is imagined as impossible, it may turn out otherwise; where one is certain family gatherings will be a hotbed of rejection, it turns out otherwise. It may be going out on a limb to say this, but it seems that (whether one is in or out of the closet) the actual source of violence toward oneself will, in general, not be predictable. One can only learn by risking and experimenting. And at least in principle—as with the whole purpose of learning anything at all in life in the first place—the more one knows, the more one may become able to negotiate the terrain of life, even if violence still pops up out of unforeseen (or unforeseeable) circumstances. This may be akin to the fatuous notion that one can eliminate accidents; by definition, an accident is the thing that cannot be avoided. In fact, it may be that we often excuse our own inattentiveness and irresponsibility by claiming as accidents things we might reasonably have done more to avoid.

[7] This actually has its own set of social problematics, but that’s for another essay.

[8] I'm a bit of a purist—if the consequences of walking under ladders or not washing one's hands are really of no great moment, then we should with equal "freedom of spirit" walk or not walk under ladders and wash or not wash our hands. As soon as it becomes obvious that our thinking is being manipulated in such a way that the equal choice between "walking under a ladder or not" or "washing one's hands or not" disappears, we have veered back into the territory of the superstition. It would be a mistake, however, to construe "always walking under ladders" or "never washing one's hands" as a neurotic compulsion when it is done on principle, precisely in protest of the superstition.