Gendered reality traps: How "objective" machine learning contributes to intersectional exploitation

In 2012, the American retail chain Target sent targeted advertisements for maternity products to women, none of whom had disclosed to the retailer that they were pregnant. The targeting was based on customers' volunteered and behavioural data concerning roughly 25 products, which enabled Target to assign each customer an automated "pregnancy prediction" score. Shortly after, an angry man stormed into one of Target's stores to complain that his daughter, who was still in high school, had received coupons for baby clothes and cribs in the mail. The manager admitted a mistake and called the man again two days later to apologise, at which point the father confessed that he had since talked with his daughter and that she was indeed pregnant.
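To make the mechanics concrete, the sketch below shows one way such a purchase-based score could work. Everything in it is an assumption for illustration: the product names, weights, and threshold are invented, and Target's actual model and its roughly 25 signal products are proprietary.

```python
# Hypothetical sketch of a purchase-based "pregnancy prediction" score.
# Product names, weights, and the threshold are invented for illustration;
# the real model is proprietary.

SIGNAL_WEIGHTS = {
    "unscented_lotion": 0.8,
    "calcium_supplement": 0.6,
    "zinc_supplement": 0.5,
    "cotton_balls_bulk": 0.4,
    "scent_free_soap": 0.7,
}

SCORE_THRESHOLD = 1.5  # invented cutoff for triggering maternity marketing


def pregnancy_score(purchases: set[str]) -> float:
    """Sum the weights of signal products present in a purchase history."""
    return sum(w for product, w in SIGNAL_WEIGHTS.items() if product in purchases)


customer_purchases = {"unscented_lotion", "calcium_supplement", "scent_free_soap"}
score = pregnancy_score(customer_purchases)
if score >= SCORE_THRESHOLD:
    print(f"score={score:.1f}: customer flagged for maternity marketing")
```

The point of the sketch is that no single purchase reveals anything; it is the aggregation of individually innocuous signals that produces the sensitive inference.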

Consider another instance: founded in 2011, Lenddo is a Singapore-based startup that assesses the creditworthiness of potential loan applicants in emerging markets, many in the global South, where people often lack traditional credit histories or even bank accounts. Applicants download Lenddo's app onto their smartphones, and the company uses their entire digital footprint to score them. Lenddo claims that five million people have received loans through its partners in this way, which of course also implies that others have been denied loans on the same basis.

These examples illustrate that the trend today is not just widespread data collection but the processing of that data for profiling. In both cases, the concern is not the mere collection of data, which was obtained in compliance with legal consent requirements, but the collation and processing of that data to find correlative patterns among seemingly unrelated variables and thereby infer new information about the person concerned. On the basis of this inferred information, the person is either targeted for maternity marketing or assessed for a loan.
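The inference step described here can be seen in a toy experiment: a model trained only on innocuous behavioural variables can recover a sensitive attribute it was never given directly, because those variables correlate with it. The data below is synthetic and the feature names are invented; this is a sketch of the general mechanism, not either company's method.

```python
# Minimal sketch of correlation-based profiling: a classifier trained only
# on innocuous behavioural features learns to predict a sensitive attribute
# it was never directly given. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hidden sensitive attribute (e.g. pregnancy) - never collected directly.
sensitive = rng.integers(0, 2, size=n)

# Behavioural proxies that happen to correlate with the sensitive attribute.
purchases = sensitive * 0.8 + rng.normal(0, 0.3, size=n)  # shopping pattern
browsing = sensitive * 0.5 + rng.normal(0, 0.5, size=n)   # browsing pattern
X = np.column_stack([purchases, browsing])

model = LogisticRegression().fit(X, sensitive)
print(f"sensitive attribute inferred with accuracy {model.score(X, sensitive):.2f}")
```

Because the proxies carry most of the signal, the model recovers the hidden attribute with high accuracy, even though that attribute never appears among its inputs.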

Though both examples leave one with a sense of unease, it is difficult to articulate what exactly is wrong with such big data-based targeted advertising or credit scoring.

Continue reading at GenderIT.org.
