Algorithmic literacy: facing the black box society – 11/07/2023 – Education

In recent years, we have made significant progress in recognizing the urgent need to build young people's autonomy so that they can act safely, ethically and responsibly in society's informational environments. Increasingly present in educational standards, legislation and civil society initiatives, media education has emerged as a more effective and sustainable way of dealing with misinformation, rumors, hate speech, propaganda and other phenomena that can violate rights and even destabilize democracy.

But beyond the content that circulates in the media lies the most opaque part of communication ecosystems: the algorithms that, driven by commercial logic and interests, personalize what we see to the point of exposing us to selective slices of reality, steering behavior and shaping our opinions in subtle and sometimes harmful ways. These algorithms often prioritize and reinforce engagement with biased, offensive or violent content, and can even push more susceptible individuals toward extremist environments, and extremist actions.

With digital environments increasingly mediating our view of the world, facing these challenges requires looking not only at the ability to access and evaluate messages, but also, and increasingly, at educating young people to understand how their technological environment works and what effects it has. In the age of artificial intelligence, when human questions may receive incorrect or biased answers generated by predictive systems, computing urgently needs to enter the media education agenda.

It must be explored critically, however, in order to understand its impacts on social justice and democracy, and not merely as a work tool in a digital society. This new field, which expands the limits of information education and builds a bridge between computing and media education, is what we call "critical algorithmic literacy".

Today we are experiencing the exponential growth of data-driven automation: technologies, called algorithmic or artificial intelligence systems, that make predictions and decisions based on the data that feeds them. These systems operate silently and almost omnipresently in contemporary life, affecting everything from the choice of the next video shown to a child on YouTube to the systems that will determine that child's access to jobs or credit later in life. This is what has been called a "black box society", in which automated decisions, generally invisible to the average user, shape people's access to rights, services and information (Selwin, 2022).

In practice, media education can develop the skills young people need to perceive, question and influence the behavior of technological systems. Children and young people should be encouraged to explore how the algorithms that shape our internet search results work; they can question the ethics of prediction and recommendation systems, or even the design behind the interfaces of the social networks they use, including the so-called "dark patterns" that manipulate our decisions. They must be alert to dynamics that promote unattainable images or make certain groups vulnerable. They need to notice and question the exclusions and biases reflected in the output of generative AIs. Above all, they must understand the engagement and attention mechanisms that favor content that segregates, offends and destabilizes communities.
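The engagement mechanism described above can be made concrete with a deliberately simplified sketch. Everything here is invented for illustration (the signal names, the weights, the items); real platform ranking systems are vastly more complex and not public. The point is pedagogical: when a feed is ordered purely by predicted engagement, nothing in the objective penalizes divisive or misleading content, so it can rise to the top.

```python
# Hypothetical sketch of engagement-driven ranking. All field names
# and weights are invented for illustration; this is not any real
# platform's algorithm.

def engagement_score(item):
    """Score an item purely by predicted engagement signals."""
    return (item["predicted_clicks"] * 1.0
            + item["predicted_shares"] * 3.0
            + item["predicted_watch_seconds"] * 0.1)

def rank_feed(items):
    """Order a feed by engagement alone; accuracy and harm play no role."""
    return sorted(items, key=engagement_score, reverse=True)

feed = [
    {"title": "Calm explainer", "predicted_clicks": 40,
     "predicted_shares": 2, "predicted_watch_seconds": 300},
    {"title": "Outrage headline", "predicted_clicks": 90,
     "predicted_shares": 30, "predicted_watch_seconds": 120},
]

ranked = rank_feed(feed)
# The sensational item outscores the calm one (192 vs. 76) and is
# shown first, even though the scoring function knows nothing about
# truthfulness or social harm.
```

A classroom exercise built on a toy like this lets students change the weights themselves and observe how the feed reorders, which is exactly the kind of "opening the black box" the article argues for.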

In short, educating for these new socio-technical dynamics means recognizing that technologies are not neutral and embody the values of those who create or program them; that their effects are ecological, impacting and redefining social and economic relations; and that, acting on unequal societies, they can exponentially amplify social injustice and exclusion.

In this new environment, media education must go beyond building the skills to access, evaluate and create messages by examining authorship, purpose and context; it must also encompass a deeper understanding of the complex, and often hidden, dynamics between individuals, media and the technological systems that shape our world. Without the ability to identify and act on these systems, we remain vulnerable to the destabilizing effects of misinformation and polarization, which threaten institutions and social peace itself, and to the exclusionary potential of AIs. We need to open the black box.
