That Time We Found Gender Bias Hidden in a Podcast Recommendation System
This case study examines how gender bias surfaces in podcast recommendation systems built on latent feature representation (LFR) models. Using a dataset of 19,000 users and 31,000 podcasts, the researchers analyzed associations between user gender and podcast genres such as true crime and sports. By comparing models trained with and without gender as a feature, they visualized bias directions in the latent space, tested classification scenarios, and flagged statistically significant gender associations in item embeddings. The findings show how algorithmic systems can unintentionally replicate gendered preferences found in society, underscoring the need for fairness auditing and bias mitigation in real-world recommender models.
Source: HackerNoon