
Software engineering is going through a revolution. The latest versions of AI coding tools like Anthropic’s Claude Code and OpenAI’s Codex have wowed professionals while leaving many anxious about their careers. Engineers at some of the AI labs say AI writes 100% of their code now.

Amjad Masad, CEO of Replit, which lets users create apps and websites by describing what they want in plain language, even said at Charter’s Leading with AI Summit that he thinks eventually, “software engineering, as a role, sort of disappears. What [you’ll] have is generalist product people, problem solvers, or systems people.”

For a better picture of how AI is changing the job of a software engineer today and where things are headed, we spoke with Hadi Partovi, founder and chairman of Code.org, a nonprofit dedicated to ensuring that all K-12 students learn about AI and computer science. He believes that even as AI writes more code, software engineers’ jobs will change and their prospects remain bright. Here are highlights from that conversation, edited for length and clarity:


What’s your impression of the latest AI coding tools, as someone who’s used them?

I’m a strong believer that the old model of software engineering is dead. But that doesn’t mean that software engineering is dead. The way I describe this is I say, ‘Coding is dead, long live coding.’

The history of making software has had different levels of how you do it. It used to be punching holes in punch cards and there [were] literally ones and zeros. Then it was typing very low-level commands in assembly language. Then you could learn higher-level languages. Then with open source, there [are] entire libraries and frameworks of things that people have built.

Now, as [AI researcher and OpenAI founding member] Andrej Karpathy says, the new programming language is English. With every one of these tools, you [can] describe in English what you want. But it’s not English that everybody understands. It’s just like [how] being a lawyer is in English. If you ask the average teenager to read legal code, they’d just be like, ‘I don’t understand it,’ because you need practice and understanding. This is similar.

Anthropic CEO Dario Amodei recently said that he thinks we could be six to 12 months away from AI models being able to do “most, maybe all” of what a software engineer does. What do you think?

I don’t know if it’s six to 12 months, but I’m quite sure—because I was actually just listening to [one of Amodei’s] interviews—that he’s talking about the current tasks [of a software engineer]. The person who’s providing the English language instruction to the AI is not speaking your average mom or dad’s English. They’re not saying ‘Please make me a ridesharing platform app, thank you very much.’

How long before the person building the software doesn’t need to give the AI the technical specifications?

I think we are eventually going to get to a point where you just very simply say what you want [and] all the software is [made] for you. I don’t think that’s six to 12 months away. I also think once we get there, we’re at what people call the singularity. Everything gets weird because then anything you’d want to do is done by AI for you. Then there’s no jobs for anybody, not just coding jobs because you could be like, ‘I want to write software that automates what an accountant does.’

Before that happens, what will the software engineer of the future do?

The job of a future software engineer is like a merger of a software architect and a technical product manager. In a typical software engineering team, you might have the product manager who’s like, ‘This is what the customer wants and here’s what the design should be like.’ In an ideal world, they are technical. So they’re not just like, ‘I’m dreaming up things, but I don’t know how it could possibly work.’ They should ideally be technical enough to speak in a technical language.

Then a software architect figures out, ‘We’re going to need these functions, this database, these are the pieces it needs. Here’s what this engineer does, and here’s what that engineer does.’ The architect and the product manager combined—that’s the software engineer of the future. The AI will do the [work] that the software architect would have assigned to coders.

Last year, you made the case for people to continue studying computer science, and you argued that it’s a liberal art. Can you explain that case?

First of all, I’m not the only person saying this. The first person who said it was Steve Jobs. If you look at the Stanford University description of the computer science major, it doesn’t say, ‘This is a great major if you want to get a job in tech.’ It says, ‘This is a great major if you want to work in law, government, [or] business.’

When you design an operating system, there’s a limited amount of memory. There’s a limited amount of compute power. If you have multiple apps trying to use all of that stuff, how do you get them to share it? It’s a lot like in a government where there’s a limited amount of roads or park spaces. How do you design laws so everybody can share them?

The comparison to law is that an algorithm is a lot like a contract. When you write an algorithm, you say, ‘If the user clicks here, then do this. And if they double click that, then do this, unless they [do] this.’ In a contract you’d say, ‘If you’re buying the house, but it fails the inspection, we do this, but if it passes, but there’s these changes that need to be made, we do it that way.’
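The contract-as-algorithm analogy can be sketched in a few lines of code; the house-inspection scenario, function name, and return values below are illustrative placeholders, not anything from the interview.

```python
# A sketch of the algorithm-as-contract analogy: each conditional
# clause spells out what happens in a given case, just as the
# clauses of a (hypothetical) home-purchase contract do.

def house_purchase_outcome(passed_inspection: bool, changes_needed: bool) -> str:
    """Decide the next step in a hypothetical house purchase."""
    if not passed_inspection:
        return "renegotiate or walk away"     # clause: inspection failed
    if changes_needed:
        return "complete the agreed repairs"  # clause: passed, with conditions
    return "proceed to closing"               # clause: passed cleanly

print(house_purchase_outcome(True, False))  # → proceed to closing
```

As with the legal language Partovi mentions, the precision lies in enumerating every case explicitly rather than in any exotic vocabulary.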

The comparison to business is when you write software, one piece of software and another piece of software will want to interface with each other. Running a business [and] designing an organization is also about figuring out what the marketing team need[s]. How do you ask it [for] the things you need? What’s the language they speak? What are the things they can do on behalf of the sales team?

Let’s imagine you’re talking to an 18-year-old who’s interested in studying computer science but is nervous because they’ve seen how much better AI has gotten at coding. What would you tell them?

I’d say you’ve been learning math your entire life—since you were six years old…we don’t teach people math because of the jobs in math. We teach them math because it teaches them logical thinking, [and how to think] about functions, variables, and problems. The same applies to computer science.

But unlike math, there’s [still] a ton of jobs in software engineering. And every other job is going to eventually be automated by somebody who’s a software engineer.

The way [software engineering] work is being done now is a lot easier. It’s a lot more human. It’s a lot more creative. When I studied computer science, 30% of the time was problem solving [and] 70% was, ‘a semicolon goes here’ [or] ‘I forgot to close the angle bracket over here.’ In many ways, it’s now a much more enjoyable job because you focus on the problem solving and the creativity and less on the nitty gritty.
