The Association for Progressive Communications (APC) welcomes the focus of the UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance on the acute and structural threats that new information technologies, including artificial intelligence (AI), pose to the rights to non-discrimination and racial equality and to human rights principles and standards more broadly. We also welcome the opportunity to contribute to her report on this important topic.
Contrary to the popular belief that AI is neutral, infallible and efficient, it is a socio-technical system with significant limitations. One possible explanation is that the data used to train AI systems “emerges from a world that is discriminatory and unfair, and so what the algorithm learns as ground truth is problematic to begin with. Humans building these systems have their biases and train systems in a way that is flawed.”
But another explanation focuses on the global power relations within which these systems are built. AI systems are flawed because they amplify some voices at the expense of others, and because they are built by a few people and imposed on others. “In other words, the design, development, deployment and deliberation around AI systems are profoundly political.” The impact of AI is significant and varies with the context in which these systems are deployed and the purposes for which they are built. Understanding that impact means reckoning with the imperfect, discriminatory and unfair world from which these systems arise, and with the structural and historical legacies of the settings in which they are applied.
The 2019 edition of the Global Information Society Watch (GISWatch) report, produced by APC in partnership with ARTICLE 19, focuses on the impacts of AI from the perspectives of human rights, development and social justice, with a specific focus on the global South. This submission draws heavily on GISWatch 2019, extracting elements that are most relevant to the topic of this consultation.