patterns and estimation
"about how many piano tuners are there in chicago?"
this is a fermi estimation problem. you're not supposed to know the answer; you're supposed to reason your way to a decent estimate using facts you already know.
chicago has about 2.7 million people. maybe 1 in 20 households has a piano — call it 5%. average household size is about 2.5, so roughly 1 million households, meaning ~50,000 pianos. a piano should be tuned once or twice a year — say 50,000-100,000 tunings per year. a tuner can do maybe 4 per day, works 250 days per year, so about 1,000 tunings per year per tuner. that gives 50-100 piano tuners in chicago.
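the whole chain above fits in a few lines of python. every input is a rough assumption, which is the point of a fermi estimate:

```python
# fermi estimate: piano tuners in chicago. every number is a rough guess.
population = 2_700_000          # chicago, approximately
household_size = 2.5            # average people per household
piano_rate = 0.05               # ~1 in 20 households has a piano
tunings_per_piano = (1, 2)      # tunings per year, low and high
tunings_per_tuner = 4 * 250     # 4 tunings/day, 250 working days/year

households = population / household_size
pianos = households * piano_rate
low = pianos * tunings_per_piano[0] / tunings_per_tuner
high = pianos * tunings_per_piano[1] / tunings_per_tuner
print(f"~{pianos:,.0f} pianos, {low:.0f}-{high:.0f} tuners")
# → ~54,000 pianos, 54-108 tuners
```

writing it out like this also makes sensitivity obvious: change any one assumption and rerun, and you see exactly how much the final answer moves.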
the actual answer is somewhere around 100-200. we're within a factor of 2, which is excellent for a problem we "knew nothing about."
why estimation matters
estimation is the skill of getting approximately right answers with minimal information. it's arguably more useful than precise calculation because in most real situations, you don't have precise inputs.
in my math modeling competition work (HiMCM, MCM/ICM), the first step for every problem is estimation. before building a fancy model, you sanity-check: what's the right order of magnitude? if your model predicts that a city needs 50,000 ambulances, something is wrong. if it predicts 50, that might be reasonable. engineering and modeling always starts with this kind of gut-check.
order-of-magnitude reasoning
the most powerful estimation tool is thinking in powers of 10:
- 10⁰ = 1 (a person)
- 10¹ = 10 (a classroom)
- 10² = 100 (a lecture hall)
- 10³ = 1,000 (a small school)
- 10⁴ = 10,000 (a stadium section)
- 10⁵ = 100,000 (a large stadium)
- 10⁶ = 1,000,000 (a city)
- 10⁷ = 10,000,000 (a large metro area)
- 10⁸ = 100,000,000 (a large country)
- 10⁹ = 1,000,000,000 (the largest countries)
- 10¹⁰ = 10 billion (the world, roughly)
once you have these reference points, you can locate almost any quantity. "how many restaurants in new york city?" well, 8 million people, maybe 1 restaurant per 100 people... so ~80,000. (actual answer: about 27,000 — we're in the right order of magnitude, and the discrepancy tells us something interesting about the restaurant-to-person ratio.)
the key insight: being wrong by a factor of 3 is fine. being wrong by a factor of 1,000 means you're confused about something fundamental. order-of-magnitude reasoning catches the catastrophic errors.
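"how wrong is wrong?" can be made precise by measuring the gap in orders of magnitude, i.e. the log-10 ratio of estimate to actual. a small sketch (the helper name is mine):

```python
import math

def magnitude_error(estimate, actual):
    """how many orders of magnitude apart two quantities are."""
    return abs(math.log10(estimate / actual))

# the restaurant estimate: 80,000 guessed vs ~27,000 actual
print(round(magnitude_error(80_000, 27_000), 2))   # → 0.47

# a model that predicts 50 ambulances when ~50,000 are needed
print(round(magnitude_error(50, 50_000), 2))       # → 3.0
```

under half an order of magnitude is a good fermi estimate; three orders of magnitude is the "confused about something fundamental" zone.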
pattern recognition
humans are pattern-recognition machines. we see faces in clouds, hear words in noise, find trends in random data. this is both our greatest mathematical strength and our greatest mathematical weakness.
when it works: noticing that sales spike every december. recognizing that a function is growing exponentially. seeing that two seemingly different problems have the same structure. pattern recognition is the engine of mathematical intuition.
when it fails: seeing patterns in random noise. the "hot hand" debate in basketball raged for decades partly because humans are so eager to see streaks in random sequences. financial "technical analysis" finds patterns in stock charts that often aren't really there. conspiracy theories are pattern recognition run amok.
the antidote is statistical thinking: asking "how likely would this pattern be by chance?" if you flip a coin 100 times, you'll very likely get a streak of 6 or more of the same face somewhere. that's not a pattern; it's expected randomness. this connects directly to probability: distinguishing signal from noise is fundamentally a probabilistic question.
mental math tricks
some useful estimation heuristics:
the rule of 72: to find how long it takes money to double at x% interest per period, divide 72 by x. at 6% interest, money doubles in ~12 years. this works because the exact doubling time is about 69.3/x (from ln(2) ≈ 0.693), and 72 is close while having many small factors, which makes the division easy to do in your head.
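comparing the rule against the exact compound-interest formula shows how good the approximation is at everyday rates:

```python
import math

def doubling_time_exact(rate_pct):
    # exact periods to double under compounding at rate_pct per period
    return math.log(2) / math.log(1 + rate_pct / 100)

def doubling_time_rule72(rate_pct):
    return 72 / rate_pct

for r in (2, 6, 12):
    print(r, round(doubling_time_exact(r), 1), doubling_time_rule72(r))
# 2%: exact ~35.0 vs rule 36.0
# 6%: exact ~11.9 vs rule 12.0
# 12%: exact ~6.1 vs rule 6.0
```

within the 2-12% range the rule is off by at most a few percent, which is well inside fermi tolerance.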
dimensional analysis: if your answer has the wrong units, it's wrong. this catches a surprising number of errors. "speed = distance × time" — nope, the units don't work out. must be distance / time.
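you can even mechanize the units check. a toy sketch, tracking only length and time exponents (the representation is mine, not a real units library):

```python
# a toy dimensional check: a unit is a (length_exp, time_exp) tuple
METER = (1, 0)
SECOND = (0, 1)

def mul(u, v):
    # multiplying quantities adds their unit exponents
    return (u[0] + v[0], u[1] + v[1])

def div(u, v):
    # dividing quantities subtracts their unit exponents
    return (u[0] - v[0], u[1] - v[1])

speed = div(METER, SECOND)    # m/s → (1, -1)
wrong = mul(METER, SECOND)    # m·s → (1, 1), not a speed
assert speed == (1, -1)
assert wrong != speed         # the units catch the bad formula
```

real versions of this idea (units-aware number types) exist in several languages, and they catch exactly the "distance × time" class of mistake at compile or run time.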
anchor and adjust: start with something you know and adjust. "how tall is that building?" well, each floor is about 3 meters, I count 15 floors, so ~45 meters. you're using arithmetic and visual estimation together.
break it down: any complex estimation becomes tractable when you decompose it into simpler estimates. this is the core of fermi estimation — you might be wrong on individual factors, but errors tend to cancel when you multiply several rough estimates together.
the connection to abstraction
estimation is where layer 1 thinking meets layer 3 thinking. the act of estimating forces you to build a mental model: what are the relevant quantities? how do they relate? what's the structure of the problem?
this is exactly what harrison means by "the organizational lens" — you're not just computing, you're structuring your understanding of a situation. a good fermi estimate reveals the key parameters of a system, which parameters matter most (sensitivity analysis), and which you can safely ignore. that's mathematical thinking at its most practical. pattern recognition itself is what abstraction formalizes — noticing that two different situations share the same structure is both the essence of estimation and the essence of abstract mathematics.