Who Do You Want to Be When You Grow Up?

You have been asked this question. Almost every adult you’ve met, from your aunties to complete strangers, has asked the same thing again and again: “Who do you want to be when you grow up?”

And then you came up with some sort of answer that changed every other day: doctor, pilot, president. The list goes on.

At first glance, it seems like an innocent question. What harm could there be in asking children what they want to be when they grow up?

Well, not much at first. But as you get older, the question itself doesn’t change; the way it’s asked does.

When you are sixteen, your parents or teachers aren’t asking out of curiosity. They are asking to see if you have already decided on a path you will follow. They want to make sure that you know what you will be doing with your life. They want to make sure you can get into a decent school, secure a job, and most importantly, avoid unemployment.

Suddenly, the innocence of the question is gone. It feels judgmental, loaded with pressure. God forbid you are a high school senior with no idea what you want to do with your life. You will probably be seen as a failure, because supposedly everyone else has already figured it out.

Sure, the pressure often starts with parents, but it certainly doesn’t end there. University admissions only add to the problem. The way applications are designed pushes high school students to prematurely choose their life’s direction.

I’ve been down that path. When I was applying to colleges, I stopped thinking about what I wanted to do. Instead, I started optimizing my activities to fit a “career narrative.”

I thought I wanted to do business. But since I hadn’t done anything resembling “real business,” I convinced myself it would make for a weak application. On the other hand, I had plenty of experience in politics and international relations, so I decided to apply in that direction.

I found myself crafting a story, connecting my activities and honors into a cohesive narrative that supposedly proved I was destined to be a politician.

I had to choose. I had to declare a career path. I had to pick a major. Most importantly, to even have a chance at a good university, I had to identify my “spike,” my narrative, my supposed passion, and build my application around it.

It became a game of optimization:

If I apply to economics, how do I make my experiences fit that story?

If I apply to politics, how do I turn my TEDx talk into evidence for that?

At just seventeen or eighteen, I was forced to figure out what I wanted to do with my life.

Looking back, I realize that even that “choice” was never entirely mine. It was shaped by what surrounded me: the opportunities I had and the ones I didn’t. If I had grown up exposed to different paths, maybe I would have chosen differently. The truth is most of us just pick from what we can see.

Educational systems in most countries expect you to commit to a field when you’re barely starting your life. You have no idea what you truly want, yet you’re already pushed to pick a high school focus that leads to a college major that leads to your first job. It is an unnervingly linear process, built on the assumption that teenagers are capable of making a lifelong decision about their future.

But the world these systems were designed for no longer exists. The meaning of “career” itself has become unstable. Our parents could pick a profession and keep it for decades. Today, the ground moves constantly. Technology reshapes industries every few years, and people are expected to reinvent themselves more than once. The idea of one fixed professional identity has quietly collapsed, yet our institutions still behave as if permanence were possible.

Universities continue to sort students into narrow lanes because that is how their infrastructure was built. They need to forecast which departments will grow, how many faculty members to hire, how many classrooms to allocate. It is simpler to admit a “future economist” than an undecided student whose interests might span several fields. Employers rely on those same signals when they hire. The result is an education pipeline that favors efficiency over exploration, even when exploration has become the only realistic way to adapt.

This is why the problem goes beyond individual pressure. It is not just about ambitious parents or overbearing teachers; it is structural. The system rewards early specialization because it is convenient to manage, not because it reflects how life actually unfolds.

And so the question, “Who do you want to be when you grow up?” carries an outdated logic. It assumes there is a single, stable answer waiting to be discovered, when in reality, the answer keeps evolving. 

A better question would sound more open: What do you want to learn next? What direction feels meaningful right now? What are you curious about? These questions allow change, which is what real growth demands.

I still think about that original question, though. Adults keep asking it, just in more careful ways. Sometimes they mean “Will you be safe?” or “Will you be useful?” or “Will you be okay?” And maybe the only honest answer is that safety today comes from learning how to adapt by staying curious, staying capable, and staying willing to change your mind.

And if you asked me today who I want to be when I grow up, I’d probably still pause, because there’s no real way to answer it. But I’ll tell you this: I’d like to be someone who keeps learning, who doesn’t mistake a temporary path for a permanent identity, and who’s at peace with not having a final answer.

Execution Eats Ideas for Breakfast

People like to think of original ideas as one-off sparks. In reality, the same ideas keep appearing in different places, to different people.

Around 10,000 years ago, in distant parts of the world, groups that had never met and knew nothing of one another began making the same choice. They started tending wild plants, learning which ones could be coaxed to grow, and, little by little, settled where those plants thrived. From these scattered beginnings, without coordination or contact, agriculture emerged.

Learning this made me wonder how original ideas actually start. And if agriculture wasn’t truly original, are startup ideas any different? History suggests they’re not.

The basic techniques for making stone tools were independently invented in Africa and Eurasia. Writing appeared independently in Mesopotamia, China, and Mesoamerica. Even beer shows up in multiple regions without an obvious single ancestor.

Science follows the same pattern. Calculus is credited to both Newton and Leibniz. Natural selection was articulated by both Darwin and Wallace.

When conditions align, many people converge on the same answer. That’s not an insult to genius; it’s a description of how the world pushes minds toward the next available step.

I prefer the word “convergence” to “inevitability.” “Inevitable” sounds like fate and ignores how many supposedly inevitable ideas die on the wrong cost curve. Convergence means the environment highlights a few options as the next steps. As new possibilities appear, choices narrow, and many people see the same good ideas. What’s scarce isn’t the idea itself, but the right way and the right time to act on it.

You can see the same pattern in technology. The web browsers that mattered didn’t come from a single person in a dorm. Mosaic, Netscape, and Internet Explorer all appeared within a short window because the Internet had crossed a threshold that made a browser not just plausible but necessary.

Social networks arrived in a cluster for the same reason. Friendster, MySpace, Orkut, and Facebook weren’t copying a secret blueprint. The infrastructure and habits that mattered had clicked into place.

Video calling followed the same arc. Zoom didn’t invent talking over the internet; it made the idea feel reliable once bandwidth, codecs, and the messy details of real networks were good enough.

If originality guaranteed success, the first movers would dominate. They often don’t. Being early often turns out to be the same as being wrong.

Microsoft made tablets long before Apple. The timing was off. The complements were missing, the costs were wrong, and the behavior wasn’t normalized. Originality without readiness yields blueprints, not businesses.

Jobs and Apple taught the same lesson in personal computing. Other companies had shipped personal computers before Apple, but they were clunky, hobbyist-grade machines that never became cultural or consumer breakthroughs. Apple didn’t invent the category; it executed it well. It took the same general idea and paired it with design, usability, and marketing that made it feel inevitable in retrospect. Being first didn’t matter; doing it right did.

So what’s actually scarce?

First, the right way to execute an idea: the architecture, the process, the go-to-market strategy, the pricing and market design, and the operational cadence that makes a concept not just exist but compound.

Second, timing: the state of cost curves, infrastructure, user habits, regulation, and capital availability.

Third, the pressure of convergence itself: once a domain is primed, many teams will see the same idea. The difference isn’t spotting it, but making it work when it can finally work.

This is uncomfortable for founders who want to claim being first, but reassuring if your focus is on execution and building. Consider how the winners actually won.

Facebook didn’t beat MySpace by inventing social media. It enforced a real-identity graph that raised trust and relevance, rolled out campus by campus to build dense local networks, opened a platform that attracted outside developers, and did the unglamorous work on spam and abuse that made the feed usable.

Google didn’t win by claiming the word “search.” It delivered better relevance through PageRank, paired it with an auction-based ad model and self-serve tooling that scaled monetization, and disciplined itself on latency, so using it became a habit.

Zoom didn’t win because video calling was new. It obsessed over call quality, simplified joining to a single link, handled ugly network realities so calls just worked, and rode distribution built into invites and links.

SpaceX didn’t get credit for dreaming about reusable rockets. People dreamed about them for decades. SpaceX gets credit for the method that made reuse routine and cheap: vertical integration, a high test cadence, control systems that could land a stage precisely, and operations that treated reuse as normal instead of a stunt.

Sometimes being first helps. If network effects are strong and compound quickly, getting to critical mass first can matter. If you can lock up distribution through defaults or platform bundles, that advantage can endure. If data flywheels start early and the marginal value of data stays high, an early lead can extend.

But being first won’t save you if regulation blocks adoption, if a crucial complement is missing, or if the unit economics only work after another tenfold drop in costs. Many ideas that feel inevitable on a whiteboard don’t clear those hurdles in a real market.

Founders often chase originality because it flatters the ego. They’d rather be credited with the idea than with the process that turns the idea into reality. But markets don’t care about ego. They care about solved problems. They care about whether the product works in the current state of the world.

There’s no point in hoarding ideas. Someone else is already working on them. The useful question is not whether the idea is new but whether the moment is right and whether your method will compound.

How do you tell if the moment is right?

Look at the boring parts. Which cost curve bent in your favor? Do you have the complements (payments, identity, APIs, hardware), or can you supply them? Has the habit normalized, or will you be teaching the market? Which channel is truly open to you? What rules could block you, and is there a path through? What’s your wedge: small, obvious, expandable? If you can’t answer with specifics, you’re early. And early is just the polite way of saying wrong.

If I said originality doesn’t matter at all, I’d probably be wrong. It does matter. It matters most when you find a unique angle: something true and valuable that others haven’t spotted and that customers actually want.

It matters when you design a business model that competitors can’t easily copy, when you run operations that cut costs differently, and when you tell your story so people want to buy. Original ideas might seem common; what’s scarce is an original angle people actually care about, which gives you an edge.

True leaps, the exceptions, exist too. CRISPR is a genuine step change. Blockchain opened a new design space. Deep learning unlocked capabilities that weren’t available before. But even then, impact depends on complements, tooling, regulatory clarity, and distribution. A breakthrough isn’t the same as a product or a company. Execution decides whether the idea matters.

If you’re a founder, translate this into a simple discipline. Don’t ask whether the idea is original. Ask why now. Ask what made this impossible five years ago and whether that constraint is actually gone. Ask which complement you still lack and whether you can create it cheaply enough. Ask where your moat will live, and force yourself to pick: code, data, distribution, operations, or regulation. Ask what would make your advantage grow faster if ten competent teams shipped similar products tomorrow.

Convergence means you’re not special for seeing it. You’re special for making it work.

The market remembers the team that turned the obvious next step into the obvious default.