Measuring Bias in Job Recommender Systems: Auditing the Algorithms
We use an algorithm audit of China’s four largest job boards to measure the causal effect of a job seeker’s gender on the jobs that are recommended to them, and to identify the algorithmic processes that generate those recommendations. Focusing on identical male and female worker profiles seeking jobs in the same industry-occupation cell, we find precisely estimated but modest amounts of gender bias: Jobs recommended to women pay 0.2 percent less, request 0.9 percent less experience, come from smaller firms, and contain 0.07 standard deviations more stereotypically female content such as requests for patience, carefulness, and beauty. The dominant driver of these gender gaps is content-based matching between posted job ads and the declared gender in new workers’ resumes. ‘Action-based’ mechanisms, which depend on a worker’s own actions or recruiters’ reactions to their resume, contribute relatively little to the gaps we measure.
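The headline audit statistics are comparisons of the jobs recommended to otherwise identical male and female profiles. The minimal sketch below uses hypothetical data and field names (not the authors' code or dataset) to show how such percent gaps and standardized differences could be computed for one matched pair of profiles.

```python
# Illustrative sketch only: field names, values, and helpers are hypothetical,
# not the paper's code or data. It shows how recommendations served to a
# matched male/female profile pair could be summarized as percent gaps and
# a standardized difference, as in the abstract's headline numbers.

from statistics import mean, stdev

def pct_gap(female_vals, male_vals):
    """Percent difference in means, female relative to male."""
    return 100.0 * (mean(female_vals) - mean(male_vals)) / mean(male_vals)

def std_diff(female_vals, male_vals):
    """Difference in means expressed in pooled standard deviations."""
    pooled = stdev(female_vals + male_vals)
    return (mean(female_vals) - mean(male_vals)) / pooled

# Hypothetical recommendations collected for one matched pair of profiles
# in the same industry-occupation cell.
female_recs = [{"wage": 4950, "exp_years": 2.0, "stereotype_score": 0.31},
               {"wage": 5020, "exp_years": 1.5, "stereotype_score": 0.22}]
male_recs   = [{"wage": 5000, "exp_years": 2.0, "stereotype_score": 0.18},
               {"wage": 5060, "exp_years": 2.0, "stereotype_score": 0.15}]

wage_gap = pct_gap([r["wage"] for r in female_recs],
                   [r["wage"] for r in male_recs])
exp_gap = pct_gap([r["exp_years"] for r in female_recs],
                  [r["exp_years"] for r in male_recs])
content_gap = std_diff([r["stereotype_score"] for r in female_recs],
                       [r["stereotype_score"] for r in male_recs])

print(f"Wage gap: {wage_gap:.1f}%  Experience gap: {exp_gap:.1f}%  "
      f"Stereotyped content gap: {content_gap:.2f} SD")
```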
Shuo Zhang and Peter J. Kuhn, "Measuring Bias in Job Recommender Systems: Auditing the Algorithms," NBER Working Paper 32889 (2024), https://doi.org/10.3386/w32889.