Exploring a theme brought up by the visual analysis of Jacques-Louis David’s “The Death of Socrates” (1787), we look at the concept of The Socratic Method and how it could be beneficial to us today.
(The following is a transcript of the above video)
Socrates’ mother was a midwife. Having watched her help women give birth, he declared that he modelled his philosophical career on hers, dedicating himself to assisting people in the throes of birthing thought. Rather than offering a simple formula, he developed a form of cooperative, argumentative dialogue based on asking and answering questions to stimulate critical thinking and elicit knowledge. It’s about collaboratively spotting all the holes in the bucket: interrogating presuppositions through a dialectical method. In discussion, one person defends a point of view while another asks questions that may lead the defender to contradict themselves in some way, eventually weakening the original statement.
But it’s not about winning an argument. It’s about collaboratively searching for truth. Few people would be willing to rationally compromise if the exercise were simply an ‘I’m right, you’re wrong’ debate, which is why the person asking the questions needs to appear as impartial as possible.
The problem we all face is that facts don’t change minds. Cognitive scientists recognise our basic tendency towards confirmation bias, in which we embrace everything that supports our position and reject anything that contradicts it, as an early evolutionary tool that allowed humans to excel in hypersocial groups. When living cooperatively, the ability to persuade is hugely beneficial, and our chances of persuading others increase the more convinced we are of our own position. This entrenched desire still reigns supreme: once opinions are forged, they are forcefully perseverant.
Another aspect of our fixed perceptions is what Sloman and Fernbach dub the “illusion of explanatory depth”: people believe they know much more than they actually do. When students were asked to rate their understanding of everyday devices such as toilets and zippers, they rated themselves highly. When subsequently asked to explain in detail how the devices worked, and then to rate their understanding a second time, their self-assessments dropped. The effort had revealed their ignorance to themselves.
We persist in believing in our own knowledge because we assimilate others’ expertise. Sloman and Fernbach write: “One implication of the naturalness with which we divide cognitive labour [is that there’s] no sharp boundary between one person’s ideas and knowledge [and] those of other members [of the group].” We find something easy to operate and assume we must understand it, rather than attributing our success at the task to someone else’s expertise. We see this on the casual sports field, where individual players over-estimate their own skill thanks to their team’s success. When we work together, it’s hard to know where one person’s expertise ends and another’s begins.
This borderlessness of knowledge, together with our tendency to be influenced socially rather than by facts, evolved to let us cooperate successfully and so avoid the ostracisation that would cost us the benefits and protection of living in a group. So at once we are incredibly fixed in our beliefs and over-estimate our knowledge, yet we are socially persuaded by the groups of people with whom we wish to belong.
This couldn’t be more true in today’s Post-Truth era and it is within the political sphere that these human tendencies are the most dangerous.
We are fooled by our innate desire to believe the people we have relationships with on social media; by our distrust of experts and our reverence of celebrity; by our general desire not to make waves at social gatherings, where we passively accept the news a friend might tell us; by our miscomprehension of our own depth of knowledge and our belief that we have the answers when we can respond with a broad regurgitated brushstroke such as “it’s good for the economy”, without considering the facts behind the generalisation, why that might be our priority in the first place, or at the expense of what. We are fooled by pedlars of hatred and fear and, most unfortunately, by our own laziness.
So how do we break the cycle? The simplest answer would be to start living a Socratically examined life: to think about our own thoughts, to engage in conversations with each other and break things down, to objectively deconstruct our biases and our generalisations. To be honest, we’d start having much more interesting conversations. That examination should extend to the media we consume, starting with an understanding that many people’s positions are baseless, self-interested or not well thought out. Remember that teacher you knew was wrong, but who had authority, so you were supposed to just listen to whatever they said? I’m pretty sure they remember me.
The Socratic Method isn’t a simple formula for uncovering truth. It’s a crucial exercise to reveal our own ignorance. In the process of interrogating our own beliefs we learn about our personal values and the plurality of truth.
But there’s another, more idealistic proposal. What if we required our media to take on the responsibility of not simply ‘reporting’ but thoroughly interrogating the implications of government policy; if we didn’t allow our media to under-estimate their audiences and accept a low level of engagement and a superficial understanding of our world; if we accepted that most people’s positions don’t hold up to interrogated reasoning, even if they are on radio or TV? Perhaps then we could make more informed choices about the products we buy, the celebrities we give voices to, the governments that lead us.
Engaging in thoughtful variations of the Socratic Method like these has the power to shatter the illusion of explanatory depth and reduce our urge to fight each other over arguments we don’t fully comprehend.
We need our media to act as the midwives of knowledge.