7 Ways Uber Eats Scaled Search Without Breaking the System (or the User Experience)
The backend bloat was real - but they rewrote the rules, not just the code
TLDR
Uber Eats stopped overloading client tokens and moved all metadata lookup server-side for faster, cleaner, and safer search flows
They rebuilt ingestion to prioritize what matters (like price or availability) over cosmetic fluff, with real-time updates where they count
Search was split into recall-first and rank-second layers: fetch fast, personalize later
Delivery zones switched to hex-based sharding (H3) for better balance, accuracy, and performance
They grouped stores by real delivery time (0–10, 10–20 minutes, and so on), not just geography, so results feel faster and smarter
Indexes were reorganized by vertical (Eats vs. Grocery) based on how users search, not how data is stored
Result? 60% faster retrieval, 20% smaller index size, and happier users who get what they want before they finish typing
They didn’t just scale; they made every millisecond matter.
They stopped stuffing tokens like overpacked tiffins
Old way - Clients carried bloated tokens with every tiny ad detail. Think of it like giving the waiter the whole menu instead of just saying “paneer tikka.”
New way - Token just says “ID: 123.” Server does the rest.
Why it matters - Faster, lighter, less client headache. Now a toaster oven doesn’t need to remember the whole recipe.
If you're building anything - Stop pushing context downstream. Keep the state upstream, where the server can look it up.
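Here's a minimal sketch of the idea in Python - all names and fields are made up for illustration, not Uber's actual schema. The client hands back an opaque ID; the server keeps the metadata and resolves it per request.

```python
# Hypothetical sketch: the client token carries only an ID; the server owns
# the metadata and looks it up per request. Names and fields are illustrative.

AD_METADATA = {
    "123": {"campaign": "lunch-promo", "bid": 0.42, "creative": "banner_a"},
}

def handle_search(token: dict) -> dict:
    """Old way: the token carried the whole metadata blob.
    New way: the token carries just the ID, and the server does the lookup."""
    ad_id = token["ad_id"]                 # e.g. {"ad_id": "123"}
    metadata = AD_METADATA.get(ad_id, {})  # server-side lookup; client stays thin
    return {"ad_id": ad_id, **metadata}

print(handle_search({"ad_id": "123"}))
```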
Ingestion isn’t sexy, but it decides if you’re on time or irrelevant
Batch for big updates, Kafka for real-time. Urgent stuff jumps the line.
Example - Price just changed? That goes in now. “New store banner” can wait.
What does that really mean - Not everything deserves prime time. If you prioritize like a people pleaser, you’ll crash like one.
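A rough sketch of that routing decision, with hypothetical field names - in production the fast lane would be a Kafka topic and the slow lane a bulk index rebuild, but the split itself is the point.

```python
# Hypothetical sketch: route urgent field changes to the real-time path,
# everything else to the batch path. Field names are illustrative.

REALTIME_FIELDS = {"price", "availability", "eta"}   # changes users feel immediately

realtime_queue, batch_queue = [], []

def ingest(update: dict) -> str:
    """Send an update down the fast lane only if it touches an urgent field."""
    if REALTIME_FIELDS & set(update["changed_fields"]):
        realtime_queue.append(update)   # e.g. publish to a real-time stream in production
        return "realtime"
    batch_queue.append(update)          # picked up by the next bulk index rebuild
    return "batch"

print(ingest({"store_id": "s1", "changed_fields": ["price"]}))        # realtime
print(ingest({"store_id": "s1", "changed_fields": ["banner_image"]})) # batch
```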
Search isn't magic, it’s math in disguise
First pass = fast and wide. Like scanning a room for your friend.
Second pass = deep and personal. Like choosing who to marry.
What went wrong before - Trying to search everything, everywhere, all at once. Latency spiked, servers cried, users yawned.
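Here's a toy version of the two-pass shape - the stores, scoring, and preference flag are all invented for illustration, but the structure (cheap recall over everything, expensive ranking over a shortlist) is the takeaway.

```python
# Hypothetical sketch of two-phase search: cheap recall first, heavier ranking
# on the small candidate set second. Data and scoring are stand-ins.

STORES = [
    {"id": "a", "name": "Shake Shack", "tags": {"burger"}, "rating": 4.6, "eta_min": 9},
    {"id": "b", "name": "Burger Barn", "tags": {"burger"}, "rating": 4.1, "eta_min": 22},
    {"id": "c", "name": "Sushi Spot", "tags": {"sushi"}, "rating": 4.8, "eta_min": 15},
]

def recall(query_tag: str, limit: int = 100) -> list:
    """Pass 1: fast and wide -- a cheap filter, no personalization."""
    return [s for s in STORES if query_tag in s["tags"]][:limit]

def rank(candidates: list, user_prefers_fast: bool) -> list:
    """Pass 2: deep and personal -- heavier scoring, but only on the shortlist."""
    key = (lambda s: s["eta_min"]) if user_prefers_fast else (lambda s: -s["rating"])
    return sorted(candidates, key=key)

print(rank(recall("burger"), user_prefers_fast=True))  # Shake Shack first
```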
Geo-sharding isn't cute; it's survival
They use H3 hexes to divide the world. Not by cities, not by ZIP codes, but by hexagonal cells.
Why hexes? Because delivery radii are roughly circular, and circles map poorly onto square grids. Hexes approximate them more closely, pack more evenly, and give cleaner delivery zones.
If you’ve ever shipped a map-based product - Don’t divide by administrative boundaries; divide by user behavior and system load. That’s where the real world lives.
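A small sketch of hex-based shard assignment, assuming the h3-py package (v4 API, latlng_to_cell). The resolution and shard count are illustrative, not Uber's real settings.

```python
# Sketch of hex-based shard assignment. Resolution and shard count are made up.
import hashlib
import h3

NUM_SHARDS = 16
RESOLUTION = 7  # roughly 5 km^2 cells; pick to match delivery-zone granularity

def shard_for(lat: float, lng: float) -> int:
    """Map a store location to a shard via its H3 cell, not its city or ZIP."""
    cell = h3.latlng_to_cell(lat, lng, RESOLUTION)    # hex cell id as a string
    digest = hashlib.md5(cell.encode()).hexdigest()   # stable hash of the cell
    return int(digest, 16) % NUM_SHARDS               # same hex -> same shard

print(shard_for(40.7580, -73.9855))  # a Manhattan coordinate
```

Same hex, same shard - so one delivery zone's stores stay together, and load spreads by cell rather than by city boundary.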
They bucketed delivery time like Netflix buckets genres
Not all stores deliver at the same pace. Uber grouped them into buckets: 0–10 min, 10–20, 20–30.
Example - A user in Manhattan sees Shake Shack in the 10-minute bucket, not Domino’s from Jersey.
What it fixes - You don’t want “best rated” if it’s cold by the time it gets to you. Fast is relevant.
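Bucketing by ETA is just a lookup over fixed ranges - here's a tiny sketch using the same 0–10 / 10–20 / 20–30 minute groups mentioned above (the function name and cutoff behavior are assumptions for illustration).

```python
# Minimal sketch of bucketing stores by estimated delivery time instead of distance.
# Bucket edges mirror the 0-10 / 10-20 / 20-30 minute groups above.

BUCKETS = [(0, 10), (10, 20), (20, 30)]

def eta_bucket(eta_minutes: float):
    """Return the (low, high) bucket an ETA falls into, or None if slower than 30 min."""
    for low, high in BUCKETS:
        if low <= eta_minutes < high:
            return (low, high)
    return None

print(eta_bucket(9))    # (0, 10)  -> Shake Shack in Manhattan
print(eta_bucket(24))   # (20, 30) -> Domino's from Jersey, if it shows up at all
```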
Shards aren't balanced? Then your users won't be either
Latitude sharding? Cool for timezone diversity, bad for dense cities.
Hex sharding? Clean, consistent, and doesn’t choke during dinner rush.
Think of it like this - Would you split your team by department or by how loud they type? Choose the one that spreads the load, not just the labels.
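If that analogy feels abstract, here's a toy simulation - synthetic data, not Uber's - showing why latitude bands choke on a dense city while hashing a fine-grained cell spreads the same stores across shards.

```python
# Toy simulation, purely illustrative: most stores in a dense city share one
# latitude band, so that shard eats the whole dinner rush.
import random
from collections import Counter

random.seed(0)
# 1,000 stores: 80% packed around one dense-city latitude, 20% spread out
lats = [40.7 + random.uniform(-0.05, 0.05) if random.random() < 0.8
        else random.uniform(25, 49) for _ in range(1000)]

def latitude_shard(lat, num_shards=8):
    """Slice the latitude range into equal bands."""
    return int((lat - 25) / (49 - 25) * num_shards) % num_shards

def cell_shard(lat, num_shards=8):
    """Stand-in for hashing a fine-grained cell id: nearby cells scatter across shards."""
    return round(lat * 1000) % num_shards

print(Counter(latitude_shard(l) for l in lats))  # one shard takes most of the load
print(Counter(cell_shard(l) for l in lats))      # load spreads far more evenly
```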
They didn’t fix performance with algorithms; they fixed it with empathy
Index layout was the bottleneck. Grocery vs. restaurant: different query behavior, different document structure.
If you treat every user the same, you’ll waste every millisecond you saved.
Example - Don’t return 200 kinds of “milk” from one store. Give variety, relevance, and speed. Like a good playlist, not a random shuffle.
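One common way to get that playlist feel is to cap how many near-duplicates each category can contribute - a minimal sketch below; the per-category limit and field names are assumptions, not Uber's actual ranking logic.

```python
# Hypothetical sketch of the grocery-side fix: cap results per category so a
# query for "milk" returns variety instead of 200 near-identical SKUs.
from collections import defaultdict

def diversify(results: list, per_category: int = 3) -> list:
    """Keep at most `per_category` items from each category, preserving rank order."""
    seen = defaultdict(int)
    diverse = []
    for item in results:
        if seen[item["category"]] < per_category:
            diverse.append(item)
            seen[item["category"]] += 1
    return diverse

hits = [{"name": f"Whole Milk {i}", "category": "dairy"} for i in range(200)]
hits += [{"name": "Oat Milk", "category": "plant-based"}]
print(len(diversify(hits)), [x["name"] for x in diversify(hits)])
```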
Why It Matters Now
Because everyone’s pretending they’re “building for scale” without realizing scale doesn’t mean more; it means smarter. It’s not “handle more traffic,” it’s “handle more pressure without losing poise.”
And if you’re still architecting systems like it’s 2018, just remember: Uber didn’t just fix their search, they rewrote the rules of how fast food finds you before you even know you're hungry.
Reference
https://www.infoq.com/presentations/optimization-search-uber/