Nobody’s running around panicking about the plumber shortage.
But in three years, we might be talking about a paralegal shortage. An accountant shortage. A customer service rep shortage—because the people who used to fill those seats figured out what the data already shows and got out early.
Andrej Karpathy (yeah, the former Tesla AI director and founding member of OpenAI) built something that should be required viewing for anyone with a salaried job. He analyzed every occupation in the US economy—143 million jobs total—and scored each one on AI exposure from 1 to 10, pulling data straight from the Bureau of Labor Statistics Occupational Outlook Handbook. You can see it yourself at karpathy.ai/jobs.
I’ve looked at a lot of “AI will replace jobs” content over the years. Most of it is vague enough to mean nothing. This one hits different.
Because it’s a map. And the pattern on the map is impossible to ignore.
Here’s What the Data Actually Shows
Let me give you the short version before we dig in:
Job: AI exposure score
Plumber: 1/10
Firefighter: 1/10
Carpenter: 2/10
Nurse: 2/10
Surgeon: 2/10
Personal Trainer: 2/10
Chef: 2/10
Teacher: 5/10
Software Developer: 5/10
Real Estate Agent: 5/10
Marketing Manager: 5/10
Accountant: 7/10
Lawyer: 7/10
Financial Analyst: 7/10
HR Specialist: 7/10
Customer Service Rep: 9/10
Cashier: 9/10
Receptionist: 9/10
Bookkeeper: 9/10
Data Entry Clerk: 10/10
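To make the distribution concrete, here's a minimal sketch that buckets the scores above into the three bands the rest of this piece talks about. The scores are copied from the table; the bucket thresholds are my own illustration, not part of Karpathy's tool.

```python
from collections import Counter

# Exposure scores copied from the table above (1 = low, 10 = high).
exposure = {
    "Plumber": 1, "Firefighter": 1, "Carpenter": 2, "Nurse": 2,
    "Surgeon": 2, "Personal Trainer": 2, "Chef": 2,
    "Teacher": 5, "Software Developer": 5, "Real Estate Agent": 5,
    "Marketing Manager": 5,
    "Accountant": 7, "Lawyer": 7, "Financial Analyst": 7,
    "HR Specialist": 7,
    "Customer Service Rep": 9, "Cashier": 9, "Receptionist": 9,
    "Bookkeeper": 9, "Data Entry Clerk": 10,
}

def bucket(score: int) -> str:
    """Illustrative thresholds -- not from the original dataset."""
    if score <= 3:
        return "low (physical, in-person)"
    if score <= 6:
        return "middle (bifurcating)"
    return "high (information in, information out)"

counts = Counter(bucket(s) for s in exposure.values())
for label, n in counts.items():
    print(f"{label}: {n} jobs")
```

Even on this small sample, the split is lopsided: the high-exposure band is the largest, and it's exactly the "take info in, give info out" jobs.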
Look at that distribution for a second. Really look at it.
The jobs with the lowest exposure scores share something obvious once you see it: they require a physical body in a specific location, doing something that requires reading a room, a pipe, a patient. The jobs with the highest scores share something equally obvious: they’re mostly about receiving information, processing it, and outputting something back.
That’s it. That’s the whole thing.
If your job is “take info in, process it, give info out”—you’re in trouble.
If your job requires being physically present or earning human trust face-to-face—you’re probably fine.
Why This Pattern Makes Sense
We built an entire white-collar economy around tasks that AI is genuinely good at.
We called it “knowledge work.” We made it prestigious. We went $200K in debt to get the degrees that unlocked the door to it. And now the machines can do the core of it—not perfectly, not completely, but well enough that the economics of employment are going to shift.
A data entry clerk making $35K processes structured information for 8 hours a day. That is quite literally the textbook definition of what a language model does. It’s not a knock on data entry clerks—that work has real value—but the job description reads like a capability list for a 2022 AI demo.
The counterintuitive flip in this data is who’s actually safe.
The plumber. Exposure score: 1.
Think about what a plumber does. They show up to a house they’ve never seen before, crawl under a sink or into a crawl space, diagnose a problem that’s never identical to the last one, physically manipulate tools in a three-dimensional space, and then interact with a homeowner who’s stressed, skeptical, and making a judgment call about whether to trust them. All in the same visit.
AI cannot do that. Not today. Not in 2027. Probably not by 2030 in any meaningful deployed sense. The bottleneck isn’t intelligence—it’s embodiment, physical dexterity, and presence.
Meanwhile, the lawyer hits a 7/10. Not because lawyers aren’t smart or valuable—they absolutely are—but because a huge portion of legal work is exactly the pattern: ingest documents, identify relevant precedent, synthesize into an argument, output a memo or brief. The reasoning is sophisticated. The underlying mechanics are still information processing.
The Thing Nobody’s Talking About: The Middle Is Where It Gets Complicated
The 1s and 10s are easy to reason about. The 5s are where people are going to get blindsided.
Software developers score a 5. And I’ve already watched this play out in real time.
I talk to engineering leaders every week. The ones at companies actively deploying AI coding tools are telling me they’ve seen a 30-40% productivity jump on boilerplate work. Junior developers who used to spend their first year writing CRUD functions are now doing it in a fraction of the time.
That’s not a 5-year trend. That’s happening right now.
But here’s what’s also true: the senior engineer who can architect systems, who understands which abstractions break under load, who can walk into a room with a VP and translate business requirements into technical decisions—that person isn’t going anywhere. They’re actually getting more valuable, because now they can do more with less scaffolding.
Same with marketing managers. A 5 makes sense. The part of marketing that’s pure information processing—trend reports, campaign performance analysis, first-draft copy generation—is being automated. The part that’s judgment, taste, understanding what a brand stands for in a specific cultural moment, knowing when to break the rules—that’s still human territory.
Teaching is an interesting 5. The lecture component—explaining a concept clearly—AI can do passably. The relationship component, knowing which student is checked out because something’s going on at home versus which student genuinely doesn’t understand the material, adjusting mid-lesson when you read a room wrong—that’s irreducibly human.
The 5s are going to bifurcate. Some of those jobs will effectively become 8s as AI eats the repetitive core. Some will drop to 2s as the human-judgment component gets isolated and protected. Which direction your specific role goes will depend entirely on which half of the job you spend your time on.
What This Means If You’re Thinking About This Seriously
I’m not going to pretend I have a clean three-step playbook here. I don’t. The honest answer is that nobody knows exactly how fast this moves or where it stops.
But a few things seem clear:
Proximity to physical reality is going to be undervalued for a while. There’s a generation of people who were told that getting your hands dirty was for people who didn’t have other options. That narrative is going to flip. The skilled trades have a real labor shortage already. If you’re 22 and good with your hands, you might be sitting in a better position than the 22-year-old who took the paralegal job.
The “information middleman” role is probably where the most disruption concentrates. Not the expert who generates the insight—but the layer between the expert and the decision-maker, whose job is to package and transmit. That layer is thin already and getting thinner.
Human trust is becoming an actual competitive moat. This sounds fluffy until you think about it concretely. The financial advisor who’s been your family’s financial advisor for 15 years—who knew your father, who was there when you got divorced, who you called when your business was struggling—that person isn’t competing with a chatbot. The trust layer is structural, not just relational. It’s built on presence and time and mutual vulnerability. AI can’t fake that. At least not yet.
The $3.7 trillion in wages exposed to high-AI-impact jobs is a macro signal. Karpathy's tool shows $3.7 trillion in wages across high-exposure occupations. That's not a rounding error. If even a fraction of that work gets absorbed by AI tools over the next decade, the downstream economic effects—on employment, on wages, on where people choose to invest in education—are going to be significant. This isn't a technology story anymore. It's a macroeconomics story.
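For intuition on where a "wages exposed" number comes from: sum headcount times mean wage over every occupation above an exposure threshold. The sketch below shows the arithmetic; the headcounts, wages, and threshold here are invented for illustration, not BLS figures, and the real total comes from the full occupation dataset.

```python
# Each row: (occupation, exposure score, employment, mean annual wage).
# All numbers below are made up to show the shape of the calculation.
occupations = [
    ("Data Entry Clerk", 10, 150_000, 35_000),
    ("Customer Service Rep", 9, 2_800_000, 39_000),
    ("Plumber", 1, 480_000, 61_000),
    ("Lawyer", 7, 730_000, 146_000),
]

HIGH_EXPOSURE = 7  # illustrative cutoff for "high exposure"

# Total wage bill sitting in high-exposure occupations.
exposed_wages = sum(
    n * wage for _, score, n, wage in occupations if score >= HIGH_EXPOSURE
)
print(f"wages in high-exposure occupations: ${exposed_wages / 1e9:.1f}B")
```

The point of the exercise: the total is dominated by large headcounts and high wages, which is why white-collar categories like law and finance move the number far more than any single clerical role does.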
The Question I Keep Coming Back To
I spent a good chunk of my career working in operations—figuring out how to do more with the resources available, which increasingly means figuring out how to deploy technology against tasks that used to require headcount.
And I’ve watched smart, capable people get caught flat-footed by this shift. Not because they were lazy or unaware. But because they were heads-down building expertise in the thing that was working, while the ground shifted under them.
The heat map Karpathy built is the most honest version of that ground shift I’ve seen in a visual format. It’s not a prediction. It’s a description of structural vulnerability based on what these jobs actually require people to do.
The question worth sitting with isn’t “is my job on the list?”
It’s: “Which half of my job is on the list?”
Because that’s where the real work starts—figuring out which parts of what you do are irreducibly human, doubling down there, and being honest about what’s already being eaten.
The map is out. The only move left is to read it.