The idea of ‘ecologies of ideas’ stems from a number of thinkers, including Spinoza, Hasana Sharp and Félix Guattari. In The Three Ecologies (2000), Guattari describes an ethico-political-existential framework comprising the ecology of the psyche, the ecology of nature, and the ecology of the social.
Drawing on this, let’s explore some of the tendencies he notes. A desire for homogeneity, static sameness and segregation corrupts social relations, or sociality. We could think about ‘filter bubbles’ here, as well as, of course, extremist groups’ preoccupation with purity. We might then reflect on how ideas circulate and how some ‘stick together’ and dominate other ideas. Word Clusters can communicate this somewhat, but what interests us here is which ideas come together and which exclude one another, e.g. pluralism and white supremacy. Do they change in different contexts?
We will move from a more ‘personal’ approach to ‘my ideas’ towards simply looking at ideas and how they move: which ones ‘stick together’, which repel each other, which try to dominate and which are more open, which are more powerful and why. In this way we can map ecologies of ideas. You can also map your own psyche, society, community…
Which ideas are toxic or poisonous and which creative? This allows us to take up a different relation to ideas and to see how ’they have us’. Our own biographies, traditions, and histories will shape our responses and associations.
Source: Guattari, F. (2000) The Three Ecologies. London: Athlone Press.
Michalinos Zembylas develops another way of thinking about cartography, geography, and the mapping of affects. He describes how emotions move and stick (to bodies). This language of ‘emotional geographies’ explores how emotions move rather than personalising them. Emotions are public and bound up with discourses of power, including the organisation of hate, disgust, and so on.
He notes that socialisation practices and discourses, including non-corporeal (non-bodily) spatial and discursive signs, and hierarchies of power and position, are critical to shaping the presence or absence, as well as the intensity, of any given emotion. He explores how bodies are drawn together or move apart along racial or ethnic lines, with discourses using bodily markers to separate people.
He writes: “Hence movement is always embedded within certain socio-spatial contexts and connects bodies to other bodies; attachment to certain bodies (which are perceived to be similar) and distance from others (which are considered dissimilar) takes place through this movement, through being moved by the proximity or distance of others.
To put this differently: emotions do not come from inside us as reaction, but are produced in and circulated between others and ourselves as actions and practices. This circulation happens precisely because individuals do not live in a social and political vacuum but move and thus emotions become attached to individuals united in their feelings for something. If emotions shape and are shaped by perceptions of race and ethnicity, for example, then it is interesting to investigate how certain emotions ‘stick’ to certain bodies or flow and traverse space”.
Zembylas suggests that a systematic investigation of the movement of emotions and bodies in certain spaces/places would be very helpful. This could be in relation to the nation-state, the border, the boundary, proximity (to sit, to touch), departure and distance, segregation, separation, occupation…
Zembylas, M. (2011) ‘Investigating the emotional geographies of exclusion at a multicultural school’, Emotion, Space and Society, 4, pp. 151–159.
The last two themes explored ecologies of ideas and emotional geographies. Now we’d like to think about how these work in the online space. It’s important that young people understand how technologies work from the perspective of finance (e.g. clickbait), sociality (e.g. identity and feeling formation, and filter bubbles), and design.
The diagram to the left shows some of the ways in which algorithms operate. Algorithms are simply instructions, rules or procedures for performing a task or solving a problem; a recipe is a kind of algorithm. They offer useful ways of sorting, ordering, classifying and categorising information.
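To make the ‘recipe’ idea concrete, here is a minimal sketch in Python of an algorithm that sorts and categorises information. The posts, keywords and like-counts are invented for illustration; no real platform works exactly like this.

```python
# A toy algorithm: classify posts into topic 'bins' by keyword,
# after ordering them by popularity. The rules are just a recipe.

posts = [
    {"text": "new vaccine study published", "likes": 120},
    {"text": "match report: late winner", "likes": 340},
    {"text": "vaccine myths debunked", "likes": 95},
]

def classify(post):
    """Rule: if a keyword appears, assign that topic; otherwise 'other'."""
    if "vaccine" in post["text"]:
        return "health"
    if "match" in post["text"]:
        return "sport"
    return "other"

# Sort (order by likes, descending) and categorise (group by topic).
bins = {}
for post in sorted(posts, key=lambda p: p["likes"], reverse=True):
    bins.setdefault(classify(post), []).append(post["text"])

print(bins)
```

Nothing here is intelligent: a handful of fixed rules decides what counts as ‘health’ or ‘sport’ and what appears first, which is exactly the point about algorithms as procedures.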
However, in online spaces they can serve to amplify bias and create ‘clusters’, whereby people who already agree, or who might be sympathetic, are directed to information that reinforces their views. This means that in the online space, people might not encounter different perspectives in their social media feeds. The design of platforms also encourages certain kinds of user engagement, as much as possible! Developing digital literacy, reading the online world, means understanding how this works.
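The narrowing effect described above can be sketched in a few lines of Python. This is a toy simulation, not a model of any real recommender: the 80% ‘show more of the same’ weight and the three perspectives are our own invented assumptions.

```python
import random

# Toy 'filter bubble': the feed mostly repeats whichever perspective
# the user has already seen most, so their diet of views narrows.
random.seed(1)  # fixed seed so the run is repeatable

PERSPECTIVES = ["A", "B", "C"]

def next_item(history):
    """With 80% probability repeat the most-seen perspective;
    otherwise pick one at random. (The weight is an illustrative
    guess at engagement-driven ranking, not a real platform's.)"""
    if history and random.random() < 0.8:
        return max(PERSPECTIVES, key=history.count)
    return random.choice(PERSPECTIVES)

feed = []
for _ in range(50):
    feed.append(next_item(feed))

# One perspective comes to dominate the feed after a few early items.
shares = {p: feed.count(p) / len(feed) for p in PERSPECTIVES}
print(shares)
```

Running this shows one perspective taking well over half of the feed: the early, partly accidental choices get amplified by the reinforcement rule, which is the essence of the ‘cluster’ effect described above.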
One simple exercise is to check in with your body and emotions when you read a headline, or something pops up on your feed. Sit with that emotion rather than letting it drive your responses.
Earlier we described some of the ways in which ideas feel, for example the idea of justice. Understanding feeling or affect is important because it is often what drives responses and brings people into particular kinds of formations. White supremacy, for instance, feeds off and cultivates particular structures of feeling including grievance, loss, anger, fear and so on.
Digital platforms can bring together racial formations and structures of feeling, creating intensive experiences as people try on different kinds of racism, often masked as jokes. Online platforms can intensify emotions; Reddit, for example, involves long scrolling which draws someone further and further in.
Platforms promoting hate invite deep immersion and operate with a viral logic. Platforming hate attracts users and encourages them to engage, interact, share and spread ideas. This is also financialised: any use is good use. Young people are also attracted by the aesthetics of platforms.
These create spaces of participatory community which encourage the circulation not only of certain kinds of ideas, but also of certain kinds of affects, like fear, the quest for (perceived) justice, the exchange of grievances, or purity. Not all kinds of relationality are positive, particularly when they are motivated by fear, grievance or a ‘philosophy of loss’.
But it is helpful to talk with young people about how technologies work, how algorithms work, how ideas and feelings circulate, and how statements are often knowingly designed to provoke certain kinds of emotions.
The experience of the global pandemic meant that some (young) people were attracted to conspiracy theories, sometimes because others dismissed their fears, ridiculed them, or assumed that arguments alone would work. Evidence and information are important, but so too is listening with compassion when someone is drawn into these online spaces.
It can be hard to know how to talk about how technology works as an educator, but before getting into discussions about the content of ideas it is important to talk about where ideas come from, how they cluster, how platforms work, how money is made from clicks, and so on. Whilst there is more work to be done on ‘education and technology’, this overview, taken from the Institute for Strategic Dialogue’s Digital Citizenship Education: Programming Toolkit, provides a very useful introduction to key terms.
Fake news consists of articles or posts that look as if they contain factual information but in fact contain intentional disinformation aimed at deceiving people, or misinformation, which people share inadvertently. People create fake news for financial reasons, political goals (to influence opinion) or personal motives (to cause divides in society). It can be hard to distinguish fake news from truthful news.
Biased writing is when a writer shows favouritism or prejudice toward a particular opinion rather than being fair and balanced. It is important to distinguish opinion from fact, as failing to do so can lead to a poor understanding of issues, making it harder to deal with society’s problems.
Echo chambers are social spaces in which ideas, opinions and beliefs are reinforced by repetition within a closed group. Within echo chambers, dissenting views are unexpressed or unrepresented, dismissed or removed. They are comfortable because it’s easier to agree than disagree, but they can cause political fragmentation or polarisation, reduce opportunities to listen to people from different perspectives and backgrounds, and reduce empathy for those who hold different views.
Filter bubbles are the result of personalised search and newsfeed functions. They can be useful, directing you to the content you want to consume, but they can also be harmful, separating users from information that disagrees with their viewpoint. This can isolate users in political, social or ideological bubbles, in a phenomenon closely related to the echo chamber. It can push people towards more extreme positions and reduce their empathy for people who think differently.
Stereotyping occurs when people use an oversimplified and over-generalised set of characteristics to describe a group of people. People often adopt stereotypes because they offer a simple way to perceive the world. They become embedded in people’s thinking because they assume that the characteristics of one person are true for every other person who shares one or more of the same identifying characteristics, e.g. race, religion, gender, class or sexual orientation. When we use stereotypes we reduce people’s individuality and character nuances to a list of characteristics that are easy to fit into a particular category. This has the negative effect of distorting someone’s understanding of another person or group and stops them from recognising similar traits and commonalities they may have.
Scapegoating is the practice of singling out a person or group within society for negative treatment and blaming them for social or political problems. Scapegoating is a key driver of intolerance and prejudice. Scapegoating a group and blaming them for social problems presents a simple and clear narrative that can drive polarisation and hatred within society. Examples of scapegoating include the treatment of Jewish people by the Nazis, or the blaming of ethnic minorities for social or economic problems.
Us v Them thinking
An ‘us vs them’ mentality divides the world into a negatively viewed, stereotyped out-group (them), and a positively viewed in-group (us). Divisions can be based across a wide range of identities such as race, religion, gender, sexual orientation, class, nationality and political views. Differences are often projected through the use of stereotyping, and all members of the out-group are characterised as the same. ’Us vs them’ thinking is often used to polarise people, whether online or in real life, forcing individuals into a binary view of the content creator’s own making. The out-group is often blamed for the problems experienced by the in-group, and this is used to strengthen the way the in-group views themselves.
While there is no international legal definition of hate speech, it is widely recognised as speech which attacks, intimidates, humiliates, discredits or promotes violence against a person or group ‘based on their religion, ethnicity, nationality, race, colour, descent, gender or other identity factor’. Online hate speech is a major problem and something most individuals encounter at some point.
Online and offline spaces intersect, so it’s important to keep an eye on how information moves, to look at its outcomes, and to assess its sources. Some individuals and groups trade in hate and disinformation and have different reasons for doing so. Sometimes memes play into our existing prejudices, so notice when you agree a little too quickly with something that denigrates someone seen as ‘other’. Ciarán O’Connor of the Institute for Strategic Dialogue describes extremism and ideologies in the Irish context. We have come across many of these ideas through this module. It underlines why it’s so important for educators to address these issues as they arise with young people and with their peers.
In his talk for EDURAD, which is available on the website, O’Connor describes how hate is orchestrated online and how memes are used to deliberately spread disinformation.
Other avenues include gaming: youth workers from Ireland and the Netherlands who have been part of our dialogues describe how connecting with young people through gaming is important for understanding the kinds of opinions they are expressing and are exposed to. TikTok can also serve as a ‘safe space’ for hate and, as with other platforms and social media, code words such as “joggers” are used to bypass detection. Other strategies like this include dog-whistles, where something might seem harmless to a wider group but is understood by the in-group.
Groups such as the far-right are also piggy-backing on anti-vaxx protests in order to disseminate their views, as well as working through ‘well-being’ platforms and influencers.
In the world of big data, our identities can be described in lines of code. A way of opening up questions of identity can involve imagining identities written as code, describing characteristics, likes, preferences, habits, dislikes, activities.
Think about how identities are coded, whether they reflect or relate to anyone real, and how your own identity can be understood as just coding. This opens up possibilities for hacking the identity or code you’ve been given to create something new. Or for deciding to stay with your codes.
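The exercise above can be made tangible with a short, playful sketch in Python. Everything here, the fields, the values, the hack() helper, is our own invention for illustration; it is not how any platform actually stores identities.

```python
# An identity imagined as 'code': a few fields a platform
# might infer, stored as plain data.
identity = {
    "likes": ["gaming", "metal"],
    "habits": ["late-night scrolling"],
    "dislikes": ["ads"],
    "location": "Dublin",
}

def hack(identity, **changes):
    """'Hack' the code you've been given: return a new identity
    with some fields rewritten, leaving the original untouched."""
    return {**identity, **changes}

# Rewrite some of the code, or decide to keep it as it is.
remixed = hack(identity, likes=["gaming", "gardening"],
               habits=["morning walks"])
print(remixed)
```

The point of the sketch is that the ‘code’ is editable data, not a fixed essence: you can remix the identity you have been given, or choose to stay with it.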
With thanks to Jessica Foley