A team of independent researchers from the University of Southern California (USC) has released a report titled "Auditing for Discrimination in Algorithms Delivering Job Ads".
The goal of the research was "to develop a novel methodology that measures skew in ad delivery that is not justifiable on the basis of differences in job qualification requirements in the targeted audience". The researchers created advertisements for delivery-driver jobs in North Carolina and ran them on both Facebook and LinkedIn. (Because LinkedIn does not track gender, they inferred it using an ontology of names commonly used by men and women in the region.)
The researchers registered as advertisers on both Facebook and LinkedIn and bought ads for jobs with the same qualification requirements. They advertised two delivery-driver jobs: one for Domino's (pizza delivery) and one for Instacart (grocery delivery). More of Domino's current drivers are men, while the reverse holds for Instacart; the researchers chose this pairing to control for the existing demographics of each job.
LinkedIn's recruitment advertising had not previously been studied in this manner, so the results are limited. However, the researchers report: "Our work provides the first analysis of LinkedIn’s ad delivery algorithm. With the exception of one experiment, we did not find evidence of skew by gender introduced by LinkedIn’s ad delivery, a negative result for our investigation, but perhaps a positive result for society."
Facebook, however, continues to lag. ProPublica brought the issue of gender and racial discrimination in Facebook's employment and housing advertising to light in 2016. At the time, the publication reported: "[Facebook] gives advertisers the ability to exclude specific groups it calls 'Ethnic Affinities.' Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment."
These findings, coupled with a settlement agreement to a legal challenge, forced Facebook to make changes to the targeting capabilities of its platform.
The findings from USC indicate that those changes were not enough. (Prior studies have already shown that the platform's advanced settings still allow advertisers to discriminate: custom and lookalike audiences, for example, can be used to run discriminatory ads if an ill-intentioned advertiser sets out to do so.) This study, by contrast, looked at advertisers who did not start off with ill intent. Even so, Facebook consistently discriminated against women.
As the researchers put it: "We confirm that Facebook’s ad delivery can result in skew of job ad delivery by gender beyond what can be legally justified by possible differences in qualifications, thus strengthening the previously raised arguments that Facebook’s ad delivery algorithms may be in violation of anti-discrimination laws. We do not find such skew on LinkedIn."
This report adds to charges that Facebook discriminates against people of color, Black men in particular. The company is already under EEOC investigation over those charges.
According to MIT Technology Review: "Facebook’s algorithms are somehow picking up on the current demographic distribution of these jobs, which often differ for historical reasons. (The researchers weren’t able to discern why that is, because Facebook won’t say how its ad-delivery system works.) ‘Facebook reproduces those skews when it delivers ads even though there’s no qualification justification,’ says Aleksandra Korolova, an assistant professor at USC, who coauthored the study with her colleague John Heidemann and their PhD advisee Basileal Imana."
Speaking to MIT Technology Review, Christo Wilson, an algorithmic-bias researcher at Northeastern University, added: “How many times do researchers and journalists need to find these problems before we just accept that the whole ad-targeting system is bankrupt?”