Philosophy

A community for the sharing and discussion of all things philosophical and theological. From analytic and continental philosophy, to theology, psychoanalysis, literary and critical theory, all are welcome here.

submitted 1 year ago* (last edited 1 year ago) by Swarming to c/philosophy
The Nature of Nature: 20th Century Explorations of the Body-Subject

Abdal Hakim Murad, Aziz Foundation Professor and Dean of Cambridge Muslim College, on 'The Nature of Nature: Two Early 20th Century Explorations of the Body-Subject'.

Why Freud Matters Now, More Than Ever (www.psychologytoday.com)
submitted 1 year ago by Swarming to c/philosophy

It was Arthur Schopenhauer’s birthday on February 22. Born in Danzig (now Gdańsk) in 1788, he was in some ways the most European of philosophers.

His parents chose the name Arthur because it had the same spelling in German, French and English. They wanted him to be cosmopolitan, and had tried to arrange for him to be born in England so that he would have the rights of a British citizen, but his mother became ill during the pregnancy and the plan fell through. He spent his youth in Hamburg, and as a young man travelled extensively in Belgium, France, Switzerland, Germany and England (where he attended a boarding school in Wimbledon for three months). He quickly became fluent in English and French as well as German.

As a philosopher, he’s best known for his pessimism, and the way he combined eastern philosophy with western metaphysics in his major work The World as Will and Representation; that and the fact that once, in a rage, he pushed an old neighbour down the stairs because she was being noisy outside his apartment – he had to pay her compensation until she died.

Schopenhauer’s philosophy influenced a remarkable number of musicians, artists and writers, including Richard Wagner and Thomas Mann. Leo Tolstoy described him as “the most brilliant of men” and hung his portrait in his study. He was Ivan Turgenev’s favourite philosopher. He influenced Zola, Proust, and Joseph Conrad, too.

Perhaps his most important influence in philosophy was on Friedrich Nietzsche, who later turned on him, mocking him for being a pessimist who played the flute (Schopenhauer did literally play the flute). But I want to focus here on Schopenhauer’s writing style. His clarity was what allowed him to have such a profound influence. He always wrote to be understood, something not all philosophers do. He risked being comprehensible, and the risk paid off.

His familiarity with the French and English literary traditions helped him to escape the forbidding style of Idealist philosophers writing in German, typified by Immanuel Kant and, later, by Georg Wilhelm Friedrich Hegel. Schopenhauer admired Kant’s thought, but utterly despised Hegel’s: he dismissed him – somewhat unfairly – as a charlatan and accused him of wilful obscurity. Once the public had recognised that Kant’s work was genuinely deep, although it had initially seemed incomprehensible, they began to associate incomprehensibility with profundity in general. That confusion of obscurity with depth still goes on – you need only look at the popularity of some impenetrable philosophy books by Slavoj Žižek to see that.

Schopenhauer, in contrast, wrote lucid prose because he wanted to be read and understood. “Truth,” he declared, “is fairest naked”: the simpler its expression, the more profound its influence. Humans, he believed, can only think clearly when dealing with one thought at a time.

He attacked as an abomination the German tradition (as he saw it) of saying six things at once in long, jargon-laden sentences. Write clearly, avoid obfuscation, say one thing at a time – that was all very good advice, and hardly radical. Sadly, many subsequent philosophers, perhaps most notably Martin Heidegger, didn’t take it to heart.

I’m sympathetic to Jonathan Glover’s assessment of Heidegger. In his book Humanity: A Moral History of the Twentieth Century, Glover suggested that Heidegger’s moral failing went further than his infamous enthusiastic support for the Nazis.

Glover argued very plausibly that Heidegger undermined philosophy’s role in “developing a climate of critical thought”. That’s true whenever philosophers adopt an oracular and obscure way of addressing the world. There is very little possibility of engaging with their ideas when it is almost impossible to pin down what it is they are saying. Glover slammed Heidegger for embodying the idea that “philosophy is an impenetrable fog, in which ideas not clearly understood have to be taken on trust”. Too often philosophers have written and continue to write in ways that encourage their readers to believe that if the language is obscure enough it must mean something important, and that they should be deferential, not critical.

In most areas of philosophy, even the most intractable ones, it is possible to write clearly and precisely. That way the critical conversation about some of the most profound questions we can ask ourselves can continue.

Obscurantists shouldn’t be treated with deference. We should call them out rather than assume they are deep thinkers. Sometimes fog is just fog.

It’s high time Schopenhauer, not Hegel or Heidegger, became the poster-person of continental philosophy.

submitted 1 year ago by Swarming to c/philosophy

To the people who know me best, I am a bizarre mix of discipline and ineffectuality. I rearrange my fridge daily with the efficiency of a professional Tetris player, but I once vanquished a snake plant after forgetting its monthly hydration needs. Waking up before sunrise poses no challenge for me, yet I lack the patience to cook anything that takes more than seven minutes. Recently, I completed a 16-mile run but scraped my knee in the process, didn’t bother to disinfect the wound, and found a healthy colony of bacteria on my leg the next day.

In many cultures and across many time periods, my minor triumphs would be seen as virtuous, and my habitual idleness might be considered a moral failure. Sloth is one of the seven deadly sins. Napoleonic France, the late Ottoman Empire, and the contemporary United States have all generally stigmatized laziness and praised industriousness. The notion that a person can embody both of those characteristics might feel incongruous.

Yet because of a linguistic fluke, I have never seen my actions as a problem. I grew up in South Korea, where there are two words that can roughly translate as “laziness”: geeureum and gwichaneum. Geeureum’s connotations are more or less identical to the English—the word bears the same condescension.

But gwichaneum lacks the negative valence. There’s even a kind of jest to it. To feel gwichan (the stem word of gwichaneum, which I’ll use here for simplicity) is to not be bothered to do something, to dislike it, or to find it too much effort. The key to understanding the term, however, is how it fits into Korean grammar: You can’t say “Bob is a gwichan person”; you can only say something like “Doing laundry is a gwichan endeavor for Bob.” The term describes tasks, not people. It places the defect within the act. Errands that are gwichan induce laziness in you.

To me, this is not mere verbal trickery. On the contrary, it is an illumination. Gwichan nails what’s wrong with the litany of errands that plague our everyday existence: Many of them don’t merit our devotion.

Thinking about our moments of indolence this way is not a renunciation of responsibility—life still demands that toilets be scrubbed and toddlers be fed. Gwichanism (a popular neologism in Korea) is not an apologia for anti-productivity or anti-work, and the gwichanist will still fulfill their vital life obligations.

You see, gwichanists aren’t unproductive; they’re perhaps meta-productive, interrogating the merit of every undertaking. For example, you wouldn’t call the ancient Greek philosopher Diogenes, who allegedly wrote at least a dozen books and seven tragedies, lazy. But in the presence of Alexander the Great, his only request was that the monarch get out of the way of his sunbathing. He illustrates a key difference between a lazy person and a gwichanist: The former is chronically unmotivated, and the latter is selectively, purposefully unmotivated. In that sense, gwichanism is a kind of controlled slackerism, a conviction that achievement is possible without a type-A personality.

In fact, to me, the downsides of being a go-getter or a lazy person might manifest similarly: Both an overcommitment to all of life’s responsibilities—however petty—and a refusal to commit to any of them can lead to an absence of agency in one’s life. Instead, being a gwichanist might help you reclaim your time. During my mandatory military service in Korea, I chose to cover up scratches on my boots with black marker instead of polishing them. Yes, this was the kind of aggressive corner-cutting that my superiors would have abhorred if they’d been aware of it, but it spared me enough time to read practically all of Vladimir Nabokov’s oeuvre.

The gwichanist philosophy comes with costs. Because of the wide array of tasks that I choose not to bother with, my everyday life sometimes feels comically inefficient. On a recent trip to Europe, I traveled with a portable printer, because I didn’t want to have to find places to print my writing for proofreading. At home, I put pens, Post-its, and legal pads everywhere, because I won’t write anything down if they are not readily available.

What other people find to be worth their effort or not might be arbitrary. But embracing gwichanism allows me to assert the primacy of my preferences, however esoteric. In its ideal form, the gwichanist lifestyle isn’t sloppy so much as breezy. Sure, some tasks just don’t get done. But the ones that matter do.

submitted 1 year ago* (last edited 1 year ago) by Swarming to c/philosophy

I’m very grateful for the invitation to respond to this year’s Big Ideas question: “Who do you think you are?” It’s a question I often ask myself and sometimes pose to others.

But the gadfly philosopher in me wants to be pesky here: I’m compelled to first question the question itself, to interrogate the assumptions that cause us to ask it in the first place.

When we are asked who we think we are, we are essentially being asked to explain how our sense of self develops. One approach to doing this is detailed in the guidance from The Big Ideas editors — to provide a “personal essay or narrative,” or more simply put, a “good story.” Perfectly reasonable. Who among us doesn’t like a good story?

But that, not surprisingly, leads me to ask a few other questions: What is the connection between the development of a sense of the self and a narrative of the self? How does the question of who you are or who I am become a question of storytelling? Is the self a story?

It’s this last question that I must pause here to investigate.

There is simply no denying the omnipresent force of the narrative idea of the self. It dominates our culture: We are the stories we tell about ourselves — and the better the story, the better the self that tells it.

Major philosophers such as Alasdair MacIntyre, Paul Ricoeur and Charles Taylor all defend variations of the idea that the unity of an individual life consists in the unity of the narrative that we recount about that life. To the philosopher Daniel Dennett, “We are all virtuoso novelists,” who “try to make all of our material cohere into a single good story. And that story is our autobiography.” For the philosophical neurologist and compulsive storyteller Oliver Sacks, “Each of us constructs and lives a ‘narrative’ … this narrative is us, our identities.”

It is a very compelling idea, even an irresistible one. Who doesn’t want to see their life as an unfolding story, a quest or what the popular scholar and writer Joseph Campbell called in his mythological studies “the hero’s journey”? The story is what motivates and validates the hegemony of memoir, which accounts — lest we forget — for much of what remains of the publishing industry.

What is so seductive about narrative ideas of the self is that one’s life story can be told as something with a beginning, a middle and an end. Often the story will be told around some defining trauma from the past — abuse or addiction will serve — or just having too much of a good time with sex and drugs and rock ‘n’ roll.

The template for such an idea of the self is religious. Stories of the self are very often narratives of salvation or redemption: I was a sinner and now I am saved. One thinks of Augustine’s “Confessions,” arguably the most influential autobiography in the Christian West. It tells the story of Augustine’s conversion from good-time pagan to bishop and strident defender of the faith.

Among the most widely read books in the 18th and 19th centuries was John Bunyan’s “Pilgrim’s Progress,” which vividly describes the self’s journey from the sin and destruction of this world to the Celestial City after passing through the Slough of Despond, or Swamp of Despair. Such redemption stories are incredibly popular.

We might think these stories are things of the past, since we’re supposedly living in a secular age. I have always been a skeptic in that regard, particularly in religion-obsessed cultures like the United States, where autobiographies that are essentially Christian accounts of salvation dominate. Just think about Barack Obama, who had written two such autobiographies by his mid-40s. They sold very well.

The moral conviction that drives this idea of the narrative self is that it is only through narrative that we can attain that great shibboleth of the modern age: authenticity. An authentic self is a self that can be made into a great story. And such stories can be relentlessly commodified. It’s not just Mr. Obama and his fellow politicians who resort to this technique. Almost everyone, from the obscure and downtrodden to the rich, fabulous and famous, does it: Tell your story to sell your story.

But marketability and book sales are not the point here. It is the validity of the concept of the narrative self I’m interrogating, and to be clear, I am skeptical. I suspect that those of us who tell and endlessly retell stories of our lives — keeping diaries, constantly journaling (which at some point became a verb) and imagining future memoirs — are simply engaged in an act of self-serving self-absorption. Even worse, this self-absorption often masquerades as a moral lesson.

I am leaning heavily here on the work of the British philosopher Galen Strawson. He criticizes the sense of the self as an ongoing, continuously retold story and proposes instead an episodic concept of the self, one that is more transient and fleeting, more discontinuous. The upshot — and, possibly, the upside — of this view is that rather than living tightly in the constantly retold past and imagined future, the episodic self might be said to live in the present, to let go and live now. Like Mr. Strawson, I think the self is more truthfully described as a series of episodic blips than some grand, unified self-aggrandizing tale.

Let us pause for a moment here to turn inward and inspect our own selves: What do we find? We become aware of something like a mental presence that might or might not be completely defined by our physical being. But such periods of unbroken experience of the self are very brief, just a few seconds or so, and are soon interrupted by the bleeps on our phones or by nodding off. And what about the self that just gazes off into the middle distance in a vague, absent-minded awareness? Why isn’t that who we really are?

The self is an odd, inconstant and variable thing. It can be quiescent, a kind of flatscreen humming or whirring, and then it can flare up or be flooded by anxiety or bitten or torn in two by bad conscience. Or again, the self can be completely immersed and engaged, as when watching a captivating sports game or enjoying a piece of music that we love. At such moments, we can be transported, taken outside ourselves. But then we fall back.

The self is not a stream. It is less a seamless flow than a series of jumps and starts that flicker into intense alertness before sliding back into inactivity or boredom. It is a discontinuous series of episodes, an amalgam of blips, pauses, stalls and restarts.

If such an episodic life dooms us to inauthenticity, then I say: So be it. The self is not a story.

But a discontinuous self forces us to live in the here and now rather than in the retold past and imagined future. We might even have the feeling that the self is constantly just beginning.

Life doesn’t need a narrative arc. We don’t have to be the stories we endlessly tell and retell about ourselves. Those stories are fabulation and — if told too often — falsification. The more gusto with which we tell stories about ourselves, the further we risk slipping from the truth. One doesn’t have to control one’s sense of self by constantly tying it back to some fictional story of identity.

To live episodically is to allow for the possibility of surprise in relation to the self. Sure, sometimes those surprises are bad. But sometimes they can be rather good.

To live episodically is also to accept that the self is something more fleeting, something composed of bundles of insights which very often come not from within us but from others. From people we love and trust. Or sometimes from people we mistrust and loathe.

Most importantly, to give up the narrative sense of the self is to allow for the possibility of metamorphosis, of the self being able to embrace new forms, new identities and new personae that we can inhabit and discard, then move on. Perhaps our very freedom consists in the capacity for such a metamorphosis.

Simon Critchley is a professor of philosophy at the New School for Social Research and the author of, most recently, “Bald: 35 Philosophical Short Cuts” and “Question Everything: A Stone Reader.”
