No. 48 (2021)
Articles

Published 2021-12-31

Keywords

  • algorithms
  • discrimination
  • objectivism
  • statistics
  • surveillance

How to Cite

Idárraga Franco, H. F. (2021). DISCRIMINATION AND COLONIALITY IN THE ALGORITHMIC VISION. ARTISTIC APPROACHES FOR A CRITICAL REVIEW OF PEOPLE’S CLASSIFICATION. Revista 180, (48). https://doi.org/10.32995/rev180.Num-48.(2021).art-1000

Abstract

In light of artistic works made with digital media and, specifically, with artificial intelligence, this article proposes two perspectives for a critical analysis of the discrimination of people by machine learning models, in particular by classification models designed to carry out surveillance and social control tasks. On the one hand, it will be argued that, following a philosophical version of objectivism, various algorithmic models attempt to "objectively" classify people on the basis of their bodily features, linking those features to psychological and behavioural profiles. What is questioned here is the dubious relationship between people's visible features and their invisible characteristics, a relationship forged by a colonial gaze that is now reproduced in the operation of algorithmic vision. On the other hand, it will be argued that this discrimination is materialized in the very design of the classification models. To show this, first, the importance of statistics for the operation of machine learning will be addressed from the perspective of its historical relationships with police practices; second, it will be examined how that colonial gaze is reproduced in the data sets and in the names of the categories under which images are labelled and classified, thereby determining the reality algorithmically perceived by these classification models.