The importance of re-imagining tech to undo its damages
According to Edelman’s Trust Barometer 2020, 61% of the people surveyed felt that the pace of change in technology was too fast. It is clear that, as users and consumers, we cannot stop digital transformation, and we are increasingly aware that technology causes existing harms that must be addressed. Rendering dangerous systems obsolete may seem utopian, but these mechanisms are driving much of the damage in our modern society.
Many people have already been speaking about the consequences of harmful systems that impact people’s lives. However, they have been silenced, ignored or punished for demanding a better use of, for instance, AI systems. Often they have been treated like alchemists in the old days, as if their work were more speculative than factual. These people are bringing AI concepts and the abuses of automated systems closer to the public. In many cases, they are laying the first stepping-stone for those who want to understand and change the narrative.
Technologists, designers, developers and civil society organisations must work together to transform or re-signify current technologies. But how do we make sure everyone is aligned? How can we get all these actors connected?
There must be one or more people leading the conversation: people who can speak the language of all the participants, build bridges, and establish common ground and understanding. They are key to making sure no one is left behind and to keeping motivation high. Every actor matters in building strategies that can change the narratives, and with them the ideas we hold about certain technologies.
This is a critical issue that cannot wait. People cannot keep blaming a machine for its bad decisions when a human was behind it. It is time for underrepresented communities to lead the movement, as they have done with other historical shifts.
The best way to understand the consequences of harmful technologies is to listen to people from low-income or underrepresented communities. Their stories weave unique and barely believable narratives of digital mechanisms that denied them assistance, employment or housing.
These people are the clearest example of how AI, for instance, is not designed for all. We already have ample evidence that these kinds of tools mainly benefit white middle- and upper-class citizens, so why do we use them to decide who gets to eat today and not tomorrow? These people need immediate reparation, and automated systems must be evaluated by external auditors who can assess the risks of their application in our society. Beyond this, the narratives that orbit around AI must themselves be revised and re-signified.
We need to build spaces where we can speak openly about how technologies affect our daily lives, acknowledge their damages, and talk about how they make us feel, because too often we forget care when discussing these topics. Wellbeing and healing must always be at the centre, along with creative alternative stories of the near future we want to see, powered by conscious technologies.
Harmful technologies must be stopped. The reasons set out in this article represent only one of the approaches we can follow to dismantle them. We need strong coalitions of people from the digital and social spheres, working together to transform the current narratives. This is a call to action that cannot be ignored. Dismissing it will lead us towards more automated inequalities affecting more and more individuals, until inequality becomes the norm. Is that the world we want to live in?