Category Archives: Postmodernism

Putin captured Chernobyl

So Putin captured Chernobyl. A symbolic victory for someone who threatens to nuke Europe if he is opposed in his wars of aggression. Do we even have a plan for how to respond if he follows through? Because this is 1939 all over again, only with nuclear weapons on both sides.

For more on how to #StandWithUkraine, read my Medium post here.

Filed under Corruption, Economics, False controversies, Poetry, Postmodernism

boundaries

December 17, 2021 · 12:30 am

A Beautiful Mind

The screenplay for A Beautiful Mind, inspired by but very different from the book of the same name, has been criticized by some as whitewashing the hero’s faults to keep the audience sympathetic and promote a more understanding view of schizophrenia. One reviewer called it pure Hollywood to depict Nash as lonely and socially excluded, arguing he was more a monstrous bully in his own right than a victim of bullying. But biographical details I found elsewhere support the movie’s portrayal of Nash, at least in capturing the spirit of his story, even though important themes were re-imagined through fictitious events to streamline the plot.

These themes touch on a now-marginalized approach to understanding schizophrenia. It focuses on the correspondence, however distant it may seem, between the content of delusional beliefs and hallucinations and the patient’s real-life experiences. It treats the disproportionate stress a patient feels during the real-life events that trigger psychotic episodes as a product not only of genetic vulnerability but also of experience, with special attention to the pre-existing cognitive processes that allow these events to have such catastrophic consequences. What sets this approach apart is simple: the accounts given by the patient, even during a psychotic break, are not dismissed as “un-understandable,” and treatment enlists the patient’s own cognitive powers instead of suppressing cognition, whether by refusing to listen to psychotic ramblings or by relying on powerful sedative drugs and newer antipsychotics designed to inhibit the experience of “salience,” the perception that something (any observation, internal or external to the mind) matters, has significance, or holds meaning.

Representing these now-subversive themes and what they meant to John is important, considering how other discrepancies between the screenplay and events in his life conceal one of his most remarkable accomplishments. In reality he did not give an acceptance speech when he received the Nobel, as the organizers lacked confidence in his state of mind. He had not been in treatment for decades, and the line in the movie attesting to his use of newer antipsychotic drugs was inserted out of deference to the views of the screenwriter’s mother, a mental health professional who feared the movie would inspire other patients with psychosis to refuse these drugs. Nash is one of the rare survivors of schizophrenia known to have gone into remission despite refusing antipsychotics since his last discharge from involuntary inpatient care in 1970.

Dr. Rosen: You can’t reason your way out of this!
Nash: Why not? Why can’t I?
Dr. Rosen: Because your mind is where the problem is in the first place!

This exchange privileges the conventional wisdom that the schizophrenic mind is the patient’s own worst enemy, and not a potential ally in treatment. At other moments, the screenplay credits Nash with waging a tenacious battle to turn his mental prowess to his advantage through introspective critical thinking, despite the handicap of sedative drugs and entrenched self-defeating thinking patterns underlying his delusions and hallucinations. But Nash is held to be almost alone in having succeeded along these lines, and it is unheard of to counsel schizophrenic patients to try.

I think what Nash achieved through his own efforts is more widely attainable than most experts believe, particularly if supportive talk therapy is available. My account of schizophrenia draws on two bodies of research on psychosis that fall outside the traditional realm of genetic explanations and drug therapies: traumagenic models of psychosis, and newer research linking lifelong social cognitive deficits to vulnerability to psychosis and to treatment outcomes. I don’t want to discount the potential benefits of biomedical treatment altogether, but I do not believe these treatment strategies are always necessary or ever sufficient on their own. I have not delved into the literature arguing that the heritability of psychosis is less clear than is widely supposed, or into research on the limitations and side-effects of antipsychotic drugs. But I am familiar with research showing that many psychotic patients find little relief from the available medications, that those who do experience relief remain at significant risk of frequent relapse even when they adhere to treatment guidelines, and that some of the most common medication side-effects are more immediately stigmatizing than the illness itself, particularly movement disorders (tics and tremors) that, unlike psychotic symptoms, can be ever-present and impossible to conceal.

The conventional wisdom is now scathingly against traumagenic models of psychosis, yet it retains the theory that psychotic patients erupt into delusions and hallucinations in response to stressful life events. The key to holding both views at once is to insist that the events in question are in no way traumatic, and that the patient has a hair-trigger response to seemingly ordinary sources of stress. This only makes sense if you utterly ignore how clearly distressed the patient has become, treating the subjective experience of distress as meaningless in the same way the delusions and hallucinations are considered meaningless. It makes more sense when you look at how trauma is defined in the diagnostic manual of psychiatry, quite unlike the dictionary definition: unless the situation is life-threatening or threatens bodily harm, it is not trauma. Leaving aside how often assault histories recounted by psychotic patients are dismissed out of hand by treatment providers quick to infer that these are artifacts of psychobabble and not real events, this definition of trauma was designed to rescue combat veterans with PTSD from being lumped in with schizophrenic patients, with whom they might have many symptoms in common, in recognition that schizophrenia is perhaps the most stigmatizing diagnosis of all.

The expert community is more receptive to newer approaches that link psychosis to ordinary cognition by calling attention to the mediating role of social cognition, that is, the patient’s grasp of ordinary rules of human behavior and their ability to make accurate inferences about the intentions and dispositions of others. Without social skill it is not unusual to make mistakes when trying to read other people’s minds from their nonverbal cues and sometimes oblique statements about what they’re thinking, feeling or doing. And without close human contact, it is impossible to check one’s own speculations and ruminations against alternative perspectives, a process called “reality-testing” that keeps all of us from becoming profoundly out of touch. Without social skill, it is easy to miss opportunities for reality-testing and to live increasingly in one’s private thoughts, which can steadily grow less and less realistic. In fact, living alone is a known risk factor for developing full-blown psychotic symptoms even in people with no history of mental illness.

I see both these factors in biographical details about John Nash before he attended Princeton as a promising mathematics student. His brilliance as an original thinker was not quickly recognized in real life, and for his social awkwardness he was in fact bullied and socially excluded, experiences that would have reinforced the tendency he showed at an early age of being a loner with little sympathy for others. I imagine his commitment to intellectual challenges in childhood was in part compensation for his lack of social skill, as a source of pride and as something his parents and teachers, once they recognized his potential, warmly encouraged him to pursue. And given his bisexuality, I question whether his childhood was peaceful: many who knew him as a boy attest that he was bullied for showing homosexual interest in other boys. Later in life he was dismissed from his first job as a professional mathematician after being caught in a police action targeting homosexual activity in public bathrooms, a blow that predated any sign of psychotic symptoms and could only have been humiliating in the extreme. In this context, his later grandiosity seems in part justified by his intuition about the significance of his own work, which would not soon be acknowledged, and in part a natural defensive maneuver he needed to overcome shaming experiences that had undercut his sense of dignity.

The traumagenic model of schizophrenia does two things differently: it looks at trauma histories as factors in the patient’s biological and cognitive vulnerability to psychotic episodes, and it looks at the subjectively traumatic nature of the stressful life events that trigger psychotic episodes in adulthood. Childhood trauma can lead to some of the same brain abnormalities related to heightened sensitivity to stress observed in schizophrenia, suggesting that life experience, particularly during the brain’s early development, can produce biological differences conventionally attributed strictly to genes. And recent histories of interpersonal victimization are very common in acute psychotic patients, as well as being prominent themes in the content of delusions and hallucinations for many patients, not unlike the intrusive memories or reliving experiences of patients with PTSD. The literature on social cognition in psychotic patients, on the other hand, points to many areas of continual misunderstanding that, in a patient with hyper-arousal and inner fears of victimization, would allow paranoid ideation to arise when no real persecution is evident. A real history of being bullied could create these conditions of hyper-arousal and inner fear of victimization, and without good social cognition the beginnings of paranoia need not be utterly bizarre. By the time psychosis is discovered, however, delusions are by definition bizarre. Even so, I am convinced these beliefs are organic cognitions that can be explained, if the patient will enter reflectively into a therapeutic dialogue about the content of their delusions and help develop a narrative about how they arose. Left to their own ruminations in a state of paranoia that discourages reality-testing, a patient’s bizarre beliefs could, I think, find a prominent place in day-to-day thought and become fixed and unshakable, because their emotional intensity bends other thoughts toward them and lets them become organizing principles for making sense of the outside world.

In the attacks on A Beautiful Mind that focus on how much more endearing the movie’s hero is than the real-life John Nash, I will ignore the horrified reactions to his homosexuality and deficit of patriotism, and focus on the tales of the mathematician’s egocentricity and mean-spiritedness. Egotism and cruelty are often ascribed to another patient population known chiefly for poor performance at social cognition, people on the autism spectrum. A central idea explaining these traits is their inattention to the perspectives of others: in the extreme case they seem unaware of other minds, and at minimum they find putting themselves in other people’s shoes counterintuitive and demanding of concentration, unable to readily guess where someone else is coming from in a disagreement. The jargon for perspective-taking is “theory of mind,” and new research is showing that patients with psychotic symptoms also have pronounced difficulties in this area. This makes reality-testing especially difficult, in the sense that one would not readily believe anyone who contradicted one’s preconceived ideas, and would not seek out alternative perspectives to keep from getting “out of touch.” Maintaining seemingly bizarre delusions hinges on the patient’s ability to avoid contradictory evidence and privilege any perceptions that seem to reinforce the delusion somehow. Utter reliance on one’s own perceptions to evaluate the world, without critical feedback from others, makes this possible.

What could possibly short-circuit this self-reinforcing delusion, except change from within? Cognitive behavioral therapies offer a way to harness the authority of the patient’s own thoughts, teaching them to do their own reality-testing with limited and abstract guidance that allows them to keep trusting themselves as interpreters of their own lived experience. These therapies appeal to the patient’s capacity for rational cognition instead of assuming they have none, teaching flexible methods for reality-testing rather than dictating what is real and what is not. Their appeal over the status quo, from the patient’s perspective, is that they are offered as adaptive tricks for questioning the validity of distressing, potentially immobilizing perceptions instead of being tormented by one’s thoughts. Thus the patient’s distress is validated and the therapeutic alliance is focused on relieving this subjectively real distress, no matter how bizarre the patient’s account of private torments. The content of the delusions may be insurmountable in some respects, for no one in their right mind can shake the convictions of private knowledge; much of what we experience in life that matters to us is not witnessed by others, verifiable with evidence, or easy to validate with external sources. It is in the nature of memory to be rich in utterly private knowledge to which we can testify only on our own personal authority. Hopefully the cognitive distortions driving disproportionate reactions to seemingly minor sources of day-to-day stress can be redressed effectively enough to minimize the patient’s recourse to flight (out of the here and now, into a private and unreachable reality) and to reduce their avoidance of potentially stressful social situations that, more than anything else, can distract them from their inner world and at times offer critical feedback they can accept.

Nash: I’ve gotten used to ignoring them and I think, as a result, they’ve kind of given up on me. I think that’s what it’s like with all our dreams and our nightmares, Martin, we’ve got to keep feeding them for them to stay alive.

This moral of the story in A Beautiful Mind is not so far removed from what the real Nash says for himself. He was pleased with the movie (though he thought the music was too loud). In real life Nash concedes that returning to mathematical work and finding distraction from his delusional beliefs has been a recovery process, with the caveat that recovery from grandiose delusions is bittersweet. Yet in an interview with Schizophrenia Bulletin, he argued that the only way to de-stigmatize psychotic experiences would be to do away with the diagnosis of schizophrenia itself. I agree with him that the biomedical paradigm does not, as its proponents claim, reduce stigma at all. If anything it elevates the treatment provider’s disdain for the patient’s accounts, and in the presence of such disdain there is no therapeutic alliance at all.

What would that leave us with in the way of understanding? Some experts in mental health would argue that “it is more productive, theoretically and clinically, to research specific behaviours and cognitions than the heterogenous and disjunctive construct of schizophrenia, which has poor reliability and validity” (Read et al. in Trauma and Psychosis, eds. Larkin and Morrison 2006, citing Bentall 2004 and Read et al. 2004). I would not go as far as Foucault in stripping madness of objective validity and treating it as a relative category always socially constructed for political purposes, because I take issue with the postmodern project of understanding medicine as a political discipline that can best be critiqued with liberation ideologies that privilege every marginalized position above the claims of the hegemony of consensus. I should think the choice to have someone institutionalized in a psychiatric hospital against their will is often about containing (and trying to correct) behaviors that are objectively an imposition on others, if not as a forensic patient then because the behaviors seem self-destructive and intractable and are so difficult to understand. Involuntary hospitalization may not be the most constructive solution to containing or correcting such behaviors, but resort to these tactics is testimony to how readily others can agree that the behaviors are problematic. Only radical subjectivity would argue against understanding that something is wrong, and I doubt honest subjectivity would credit the person in question with freedom from distress brought on by these provocative behaviors, or the cognitions behind them.

That said, I will point out that hallucinations and delusions are not uncommon in healthy people who will never seek help for mental illness, and the best way to reduce stigma against people who do suffer from psychosis is to normalize these experiences and focus on therapies that improve quality of life outcomes, rather than treating any recurrence of hallucinations or delusions as symptoms of relapse in their own right. Nash thinks of mathematics as an art, and of madness as something great artists risk by virtue of their gift for originality, and their willingness to seek new ideas by taking unconventional perspectives. So I will close with some lines from the poetry of Paul Celan, one of my favorite writers, and an artist who did not survive his battle with psychosis:

“Autumn eats its leaf out of my hand: we are friends.
From the nuts we shell time and we teach it to walk:
then time returns to the shell.”

– from Corona, translated by Michael Hamburger

Unreal as this experience sounds, it is objectively magical, and enchantment with such otherworldly experience is surely not wrong in itself.

Filed under A Beautiful Mind, Poetry, Postmodernism

Bagehot’s namesake

Since this fan site is dedicated to Bagehot, I thought I’d better explain why I named my beagle after a Victorian economist I know chiefly as the author of a rhetoric on the progress of civilizations, which he felt culminated in the selection of the fittest, namely his own, the British Empire. Why not a Victorian poet, a less bigoted student of natural science, or a progressive political idealist? What made Bagehot’s name a heroic one for me was the boldness of his rhetorical defense of empiricism as the basis of political idealism. Already in Victorian England, on the heels of the Enlightenment and the scientific revolution, Bagehot’s positivism was on the defensive as a paradigm in which beliefs can be tested against evidence, knowledge is perfectible, and decisions can be rational. Instead of abandoning its flag, he strove to carry it forward into the field of political science. Can this be done without his solipsism? In Systems of Survival, a rhetoric on human morality that points toward organizing principles transcending history and culture, I see reason to believe there is a way: an alternative to the dizzying array of signs without definite referents proposed by postmodern theories of governance.

“Every way of man is right in his own eyes, Byron; the Lord ponders the heart. Proverbs 21.”

If it can be found in the Bible, there is nothing new in postmodernism, and in its earlier incarnations it was overruled by the demand for a justice system that serves the community, expressed as a religious doctrine meant to trump diversity in statecraft when the chosen people could not be ruled by one of their own. What is new is the idea that subject communities have rights of self-determination within polities that could overwhelm them but could not rule them humanely. It accommodates social differences that are economically alienating and that prevent subject groups from thriving in the political economy of the state. It approaches language barriers in the information economy as having cultural integrity, holding that if these languages were dissolved to facilitate information exchange, identity would be lost and the consequences would be profound for individuals and society. It privileges voice over status, allowing individuals from affected populations to contradict the experts who designed public policies and to point out that they have created perverse incentives. But these reforms are meaningless without the state’s commitment to protecting subject groups from the depredations of its own political economy. And for the state’s purposes, empiricism is paramount. Postmodern thought in the hands of a criminal defense attorney is as wicked as Ben Wade. And this is only a tantalizing hint at its potential to act as a double-edged sword.

In my experience, relativism is commonly cited as a justification for condoning corrupt practices in health services in developing countries. Perverse incentives are dressed up as cultural imperatives and disinformation is reinforced in the name of protecting access to uninformed patients, even where it threatens public health. In Systems of Survival I have found a theory of corruption that recognizes the importance of identity, diversity and minority rights. Too late to name my beagle after the author, Jane Jacobs, but time to apply her ideas to my own work. Bagehot, by the way, means badger in Old English.

What led me into this apologia for my beagle’s namesake was the centrality of indigenous rights to Q’Orianka Kilcher’s activism in Peru. I’ve been blogging about The New World a lot because it’s my favorite subject for love poetry, and she is one of my favorite actors. So I try to follow her work as an activist as well, mobilizing youth and engaging documentary filmmakers to empower the defenders of the Amazon who live and die on the front lines. Loggers, the poachers’ equivalent, murder and terrorize the people who live in the rainforest when they push new roads across their territories, sometimes with the collusion of a government determined to extract export revenues from the hinterland to finance development in the cities where voters are concentrated. It is a genocidal low-intensity conflict across all the borders of the Amazon, but it is rarely in the news.

Genocide used to give us greater anxiety about our international obligations as members of a global society than it does today. The power of video evidence of human rights abuses to stir our sense of conscientiousness now competes with its viability as a hook for selling newspapers, commercials, and emergency relief largesse that can be pilfered by local bandits or kleptocrats. Our President had to act almost unilaterally to respond to the threat of genocide in Libya, and only the innovation of drone warfare made doing so politically viable. If the war in Iraq taught us anything, it is that a civilian population cannot be protected from irregular fighters without foot patrols and heavy military casualties. These are sacrifices we are increasingly reluctant to make, particularly when the perpetrators of human rights abuses are not our military adversaries.

Postmodern political science has given us the notion that genocide arises out of human nature under conditions of class conflict along ethnic fault lines. Some ask: why would we sacrifice to fight the law of the jungle? Couldn’t genocides arise in ever-increasing numbers around the post-colonial world and render our efforts futile? But a worked example of this theory amounts to rationalizing in defense of the perpetrators. One war correspondent I’ve read can do better.

In The Warrior’s Honor, Michael Ignatieff links atrocities like genocide to two predictors: the use of irregular fighters like militias, and the availability of media monopolies to bombard the population with racist propaganda. The formula for racist propaganda is class-conscious in the irrational sense of fomenting paranoia about the distribution of wealth, but rather than targeting aggression with evidence of unexplained wealth, it invents a ‘predatory’ or ‘parasitic’ class along ethnic distinctions that the militias can ferret out in their neighbors, trading on vanity over slight differences in appearance and custom held to be important to identity. Indeed, he senses from his interviews with fighters in ethnic wars around the world that the slighter the difference, the more preening the distinction and the more pronounced the paranoia and violence, as though tenuous motives corrupt the soul more completely.

Such insights are rare even in someone who has seen the face of genocide firsthand in many incarnations. The narrative of a career covering small wars in Africa in The Zanzibar Chest is bewildered, alienated, ironic and traumatized instead. The author lost a friend to the events depicted in Ridley Scott’s Black Hawk Down, and I only recommend the book for a different point of view on those events. A more sensitive portrait of the experience of witnessing violence as a civilian is the movie Triage. This poem about it could be read as a spoiler, so if you haven’t seen the film you might not want to read it until you do.

A loss of innocence is inward, pure
in its compass of privacy fulfilled
by the annihilation of a ghost
that crept inside the circle of defense,
the magic outline of the protected
and self-consulting few, body and mind
and fellows whose bodies and minds matter.
The loss is truly invisibly yours,
its witness will not survive left outside.
A ghost is only visible to friends.
The peopled world outside does not look in,
the boundary is as solid as your skin
and as tactile, recoiling from danger
even to a place within your body
if your body comes to harm. Sentient,
the world within has artifacts, culture,
the tools you use to bring to life the sights
that your imagination calls its voice.
When you stand near a wildflower it stands
within the circle and is yours, alive
with meaning and identity and charm.
You move away and see instead a field,
a swath of color, a bright impression.
To leave behind a friend is not the same.
He left an artifact with you, a gift,
a way of answering you, too, you hear
the same inside your head when questions come.
He knows your voice, accompanies you toward
your inspiration, distracts you telling
his own story of your lives, resists you
without resentment, follows after all.
You are unfair to one another, kind
and selfish by turns, you owe each other.
To walk away is never just, alive
or dead, and in the unwhole world between
men hold a great deal over all, a right
to be salvaged from darkness and made well.
Only when death is certain do they fail
to compel every effort from us, then
the struggle is for dignity, not life.
And yet this is a fiction, dignity
is comfort and resilience and defense
from every insult, death can be gentle
only with the greatest of care (or luck).
Perhaps no ghost competes with self-defense,
but when the circle just recoils, like that,
and leaves a friend outside to die, it hurts.
To turn aside a ghost before his time
for fear of being drawn out of the world
he once inhabited with you is hard,
but deep inside the brainstem works a drive
to breathe, to bleed, to work against the dead,
and the reflecting mind is adamant
that life will be preserved. You will endure
and leave behind the defenseless at last.
The question is what you will have to say,
when loved ones crowd the safety of your home,
intruding on the circle, asking why
you find yourself alone, why you don’t fill
the conversation for your friend with shared
memories and the anticipation
of seeing him again as soon as planned.

Filed under Poetry, Postmodernism, Puppy love

Private knowledge in relationships


If you aren’t watching closely, you might not know what to believe about Lara Brennan. If you question everything, you might not believe the button in the last scene in Pittsburgh is the button she described. How you decide what to believe might say a lot about you. But I will simply say that I believe her husband knows.

I don’t think it’s healthy to abandon the desire to be believed. But for Lara Brennan, being believed has to come from faith, and so far as we know, only one person in her life has faith in her as a human being, her husband. Locked up, for what she believes will be the rest of her life, with women who are either guilty or without that solace, she tries to undermine his faith, both to adapt to the helplessness of her situation and to free him from the isolation his belief imposes on him.

Would these conditions matter if belief were truly subjective? Would they have the same motivation if their lives were governed by illusions, empty signs, and contingent cultural conventions? Maybe it’s just them, but their struggle is out of sync with the widely hailed collapse of human existence into solipsistic mind games. Are the Brennans mythical archetypes themselves, champions of a faith in the triumph of the real that survives as a superstition in a world dominated by the surreal? They don’t resemble the real life prison breaks for love described in one of the special features on the DVD.

“The object of life is not to be on the side of the majority, but to escape finding oneself in the ranks of the insane.” – Marcus Aurelius

Psychology, philosophy of mind and philosophy of language are deeply rooted in political life. I’m reading a history of the concept of intellectual disability right now that is in many passages fairly arcane, but seems driven by one scholar’s impression that legal competence evaluations are by and large a scandal motivated by sophistry and the posturing of average intellects as a rational elite. I question the notion that postmodernists have given us good reason to set aside any hope of firmly grasping empirical truths in private or public life. I suspect they are positing unanchored signs in language for political purposes, to facilitate flexible interpretations of written laws. But more on that later.

For now I just want to remark on how universal the experience of private observation is, and what that means for us as human beings when our private observations become contentious. Being disbelieved is extraordinarily isolating. It cuts deeper than being misunderstood, to face those who discredit what you have to say for yourself. I think this is why patients are not permitted to obtain copies of mental health professionals’ notes with their medical records. If the person you trust with your most private utterances responds by noting doubt, this is not to be disclosed. Whether they are mistaken or not, the patient’s sense of having been heard is at stake, and at times the best the treatment provider can offer for this need is a placebo. In other cultures I suppose this sort of mediation between patients and ghosts is handled by witchdoctors with ritual and herbal tea.

Filed under Postmodernism, The Next Three Days

Aristotle, meet Baudrillard

Virtual reality is on our horizon, and already video games are competing with face-to-face interaction as the dominant social venue for youth culture. Why are pictures of people more compelling than people in this market? The idea that our world is increasingly dominated by empty signs and simulations has been explored extensively in postmodern philosophy, particularly in the work of Baudrillard. Baudrillard is notoriously difficult to read, but someone at the Stanford Encyclopedia of Philosophy decided we ought to know something about the gist of his later work, and boiled one book down to this passage: “the society of production was passing over to simulation and seduction; the panoptic and repressive power theorized by Foucault was turning into a cynical and seductive power of the media and information society; … and revolution and emancipation had turned into their opposites, trapping individuals in an order of simulation and virtuality. … within the transformations of organized and hi-tech capitalism, modes of Enlightenment become domination, culture becomes culture industry, democracy becomes a form of mass manipulation, and science and technology form a crucial part of an apparatus of social domination.”

Late capitalism is consumerism, the economics of compulsive spending and of frivolous production for impulse buying and conspicuous display. Consumerism posits that money buys happiness, and if that is the thinking, displays of spending matter as much as conspicuous consumption, because the idea is that money spent produces happiness, even though money is intended to have only token exchange value and does not magically add up to the market conditions for acquiring happiness. The sign becomes the object, in the sense that the paper itself is believed to have the power to bestow happiness on its possessor, and acquiring money becomes as exciting as any Pavlovian precondition for a reward. This inspires industriousness in the pursuit of more money far beyond a successful individual’s appetite for discernibly more valuable goods. Hence the art in the vault, represented by a knockoff on the wall that can be replaced if stolen.

Is this really strange? You have to look back to find solid empiricism taken for granted. For empiricism free of caveat or equivocation, you need Aristotle:

“The most distinctive mark of substance appears to be that, while remaining numerically one and the same, it is capable of admitting contrary qualities.” From a postmodernist, this would be a preamble to showing that a word’s empirical referent can be two things at once. For Aristotle, it can only be one thing at a time. “The same statement, it is agreed, can be both true and false. For the statement ‘he is sitting’ is true, yet, when the person in question has risen, the same statement will be false.” For Aristotle, only the writing of the law is proactive, for this is how power is dispensed among the governed, and the truth about power changes with the letter of the law. From its interpretation he would expect an empirical clarity that in our society is not the rule at all.

“Those things are called relative, which, being either said to be of something else or related to something else, are explained by reference to that other thing.” Aristotle gives examples of comparative words that denote relationships among the things described such as ‘superior’ or ‘double.’ But then he stakes his position deep in empiricism by giving examples of incomplete generalizations, such as ‘ruddered’ for a boat (a boat is not necessarily ruddered) or ‘winged’ for a bird (a winged animal is not necessarily a bird). These descriptions are not correct relationships for Aristotle. A word’s usage is only correct in relation to its empirical referent if it is exact. “All relatives, then, if properly defined, have a correlative. I add this condition because, if that to which they are related is stated as haphazard and not accurately, the two are not found to be interdependent.”

There you are, Baudrillard: a world in which a sign that has taken on a life of its own is an aberration, and endorsing it is considered naïve. The reasoning is still available to us, if we allow that in this language there is no word for A, while in another language there are many particulars for A; should speakers of the two languages ever enter into conversation on A, the one would readily convince the other that a referent for A existed to be described, with the limitation that where conventional descriptions of A have novelty value, their validity will be a subject for skepticism pending further enquiry.

Has experience of the real per se changed? Among video gamers, I assume emotional investment in virtual reality erodes the importance of the differences between pictures of people and people. But the availability of rewards in virtual reality exceeds their availability in the rest of the world, and reward-seeking behavior is harnessed with empirical neuroscience in the marketing of products with addictive potential.

I credit the simpler philosophy with the more enduring truth. Baudrillard understands contemporary culture, but Aristotle understood underlying reality, even if he never made the connection between tadpoles and frogs. The referent Aristotle had in mind is still there. All that has changed is our level of self-assurance when we go about describing it. And of the modest lady, it is often said she doth protest too much.

Filed under Economics, False controversies, Postmodernism, Virtuosity