Debiasing Recommendation Systems: Lessons from Uber Eats' Home Feed
Understanding Position Bias in Recommendation Systems
In today's data-driven world, recommendation systems power our digital experiences, from what we watch to what we eat. However, these systems often suffer from various statistical biases that can skew recommendations and prevent users from discovering truly relevant content.
One of the most significant challenges in recommendation systems is position bias - the tendency for users to interact more with items at the top of a feed, regardless of their actual relevance. This creates a feedback loop where top-positioned items receive more interactions simply due to their placement, reinforcing their high ranking in future recommendations.
How Uber Eats Tackled Position Bias
Uber Eats recently shared their approach to addressing position bias in their home feed recommendations. The home feed is a personalized list of restaurants tailored to each user's preferences based on their order history. To provide truly personalized recommendations, they needed to accurately estimate the conversion rate (CVR) - the probability of a user ordering from a particular restaurant after seeing it in the feed.
The team at Uber Eats discovered that position bias was significantly affecting their recommendation quality. They measured this by randomly permuting store rankings for a small percentage of users and computing conversion rates at each position. In an unbiased system, conversion rates on randomized rankings should be roughly constant across positions. Instead, they observed a clear decline in conversion rates further down the feed, confirming the presence of position bias.
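The measurement above can be sketched in a few lines. This is an illustrative reconstruction, not Uber's code: it assumes a log of (position, converted) pairs collected under randomized rankings, and simply aggregates the conversion rate at each position.

```python
from collections import defaultdict

def cvr_by_position(impressions):
    """Aggregate conversion rate per feed position.

    `impressions` is an iterable of (position, converted) pairs collected
    while rankings were randomly permuted, so relevance is independent of
    position and any trend across positions reflects position bias.
    """
    counts = defaultdict(lambda: [0, 0])  # position -> [conversions, impressions]
    for position, converted in impressions:
        counts[position][0] += int(converted)
        counts[position][1] += 1
    return {pos: conv / total for pos, (conv, total) in sorted(counts.items())}

# Toy log: conversion rates fall with position even though the ranking was
# random - the signature of position bias.
log = [(0, 1), (0, 1), (0, 0), (1, 1), (1, 0), (1, 0), (2, 0), (2, 0), (2, 0)]
rates = cvr_by_position(log)
```

On the toy log, the per-position rates decline from position 0 downward, which in a real system would be the signal that triggered this whole investigation.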
The Examination Model Approach
Uber Eats developed an "examination model" to understand and address position bias. This model distinguishes between three key concepts:
Impression: When a store appears in a user's feed
Examination: When a user actually looks at the store
Conversion: When a user places an order
The key insight is that position bias primarily affects the probability of examination. Once a user examines a store, the probability of ordering (true CVR) should reflect the store's actual relevance to the user.
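This factorization can be written down directly. The sketch below assumes the standard examination-model form, P(convert | impression) = P(examine | impression) × P(convert | examine), where only the first factor depends on position:

```python
def observed_cvr(p_examine, true_cvr):
    """Examination-model factorization (assumed standard form):

        P(convert | impression) = P(examine | impression) * P(convert | examine)

    Position bias enters only through p_examine; true_cvr reflects the
    store's actual relevance to the user.
    """
    return p_examine * true_cvr

# Hypothetical numbers: the same store (true CVR 0.2) is examined 90% of
# the time at the top of the feed but only 30% of the time lower down.
top = observed_cvr(0.9, 0.2)     # observed CVR at the top
lower = observed_cvr(0.3, 0.2)   # observed CVR lower in the feed
```

The observed CVR differs threefold between the two placements even though the store's true relevance is identical - exactly the distortion the examination model is designed to separate out.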
Beyond Position: Multiple Factors Affecting Bias
Interestingly, Uber Eats discovered that position bias isn't solely determined by vertical position. Other factors significantly influence examination probability:
Device type and operating system
UI layout differences
Whether stores are presented individually or in carousels
This multi-faceted nature of position bias required a more sophisticated solution than traditional approaches like:
Training on randomized data
Inverse propensity weighting
Using position as a feature in the model
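For context, inverse propensity weighting - the second traditional approach above - corrects the biased logs by up-weighting each impression by the inverse of its examination probability. The sketch below is illustrative and assumes the propensities per position are already known, which in practice is itself hard to estimate and is part of why Uber Eats moved beyond this approach:

```python
def ipw_cvr(impressions, propensity):
    """Inverse-propensity-weighted CVR estimate (illustrative sketch).

    `impressions`: iterable of (position, converted) pairs.
    `propensity[pos]`: assumed probability that a user examines that
    position. Weighting each sample by 1 / propensity undoes the
    over-representation of top positions in the observed conversions.
    """
    num = den = 0.0
    for position, converted in impressions:
        w = 1.0 / propensity[position]
        num += w * converted
        den += w
    return num / den

# Hypothetical propensities: top position examined 90% of the time,
# second position only 30%.
log = [(0, 1), (0, 1), (0, 0), (1, 1), (1, 0), (1, 0)]
estimate = ipw_cvr(log, propensity={0: 0.9, 1: 0.3})
```

On this toy log the naive CVR is 0.5, while the weighted estimate is lower, because the unweighted average over-counts conversions from the heavily examined top slot.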
The Two-Tower Solution
Uber Eats implemented an innovative deep learning architecture with two specialized towers:
Position Bias Tower: Estimates the probability of examination based on position-related features (position, device type, UI layout)
True CVR Tower: Estimates the actual conversion probability based on relevance features (store ratings, cuisine type, etc.)
The outputs from both towers are combined during training, but during inference (when making actual recommendations), only the True CVR Tower's predictions are used for ranking. This effectively removes position bias from the recommendations.
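The train-time combination and inference-time split can be sketched minimally. The code below is a toy stand-in, not Uber's architecture: each "tower" is reduced to a single logistic layer, and all feature values and weights are hypothetical. What it preserves is the key mechanic - the towers' outputs are multiplied during training, but ranking uses only the relevance tower.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def position_tower(position_features, weights):
    """Toy stand-in for the Position Bias Tower: maps position-related
    features (position, device type, UI layout) to P(examination)."""
    return sigmoid(sum(w * f for w, f in zip(weights, position_features)))

def relevance_tower(relevance_features, weights):
    """Toy stand-in for the True CVR Tower: maps relevance features
    (store ratings, cuisine type, ...) to the true CVR."""
    return sigmoid(sum(w * f for w, f in zip(weights, relevance_features)))

def training_prediction(pos_feats, rel_feats, pos_w, rel_w):
    """During training, the observed CVR is modeled as the product of the
    two towers' outputs (examination probability x true CVR)."""
    return position_tower(pos_feats, pos_w) * relevance_tower(rel_feats, rel_w)

def rank_stores(stores, rel_w):
    """At inference, only the relevance tower's score drives the ranking,
    so position bias never enters the final ordering."""
    return sorted(stores, key=lambda s: relevance_tower(s["rel_feats"], rel_w),
                  reverse=True)

# Hypothetical stores with a single relevance feature each.
stores = [{"name": "a", "rel_feats": [0.1]}, {"name": "b", "rel_feats": [2.0]}]
ranked = rank_stores(stores, rel_w=[1.0])
```

Because the position tower is only ever multiplied in during training, the relevance tower is forced to explain the part of the observed conversions that position features cannot, which is what makes its inference-time scores debiased.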
To prevent the towers from interfering with each other's learning, they:
Provided each tower with exclusive access to relevant features
Applied regularization techniques like L1 regularization and dropout to the Position Bias Tower
Impressive Results
The results were compelling:
Offline analysis showed the true CVR estimates were less correlated with position
Online experimentation revealed statistically significant increases in orders per user
Users placed more orders directly from the home feed and relied less on search functionality
By addressing position bias, Uber Eats created a more relevant and engaging recommendation experience that better reflects users' actual preferences.
The Future of Debiased Recommendations
This work represents an important step forward in creating fairer, more accurate recommendation systems. As Uber Eats continues to refine their approach, they plan to tackle other biases like neighbor bias and selection bias.
For businesses building recommendation systems, this case study highlights the importance of identifying and addressing statistical biases. By doing so, we can create more personalized experiences that truly reflect user preferences rather than artifacts of interface design or algorithmic feedback loops.
What biases might be affecting your recommendation systems? The journey to truly personalized recommendations requires constant vigilance against these hidden influences.
Read more: Improving Uber Eats Home Feed Recommendations via Debiased Relevance Predictions | Uber Blog