How to level up your job in the emerging AI economy

Artificial intelligence (AI), like cloud computing a few years before it, is upending the economics of information technology. In many ways, AI has the power to make technology much more efficient. The challenge, however, is helping people and organizations move to the next level and adapt to the new AI reality.

I had the opportunity to discuss the evolving tech economy with Dr. Susan Athey, who was recently appointed chief scientific advisor to Keystone Strategy. Athey is also an economics professor at Stanford University and former chief economist for Microsoft.

Also: In a surprise twist, Meta is suddenly crushing Apple in the innovation battle

“It’s hard to fully capture quantitatively the benefits of being more nimble and being able to add more features and do more projects, and do experimentation and innovation that you might have not otherwise done,” said Athey. She sees opportunities ahead if people and organizations are properly prepared. 

“It’s difficult and expensive to build and deploy AI-driven systems, but the net result is technology infrastructures and applications that deliver more quickly and efficiently. Operating these systems may be a little easier once they’re up and running,” she said. “Relative to machine learning that I’ve done the last 16 to 17 years in industry, this latest round is easier to maintain, and requires less complex coding.”

Overall, she continued: “I feel like we’re seeing the convergence and finally seeing the payoff of lots of investments that we’ve collectively made as an industry over time. People have learned how to make modular code. They’ve learned a lot of the optimization, which used to be very finicky and is now this very high-performing, general-purpose optimization routine. The newest algorithm can just plug into those optimization routines.”  

As a result of this transformation, Athey said technology professionals need to rethink their roles and careers. “I think that coding has gotten easier. My students at Stanford are probably writing 80% of their code using Copilot,” she said. “It’s good at finding syntax errors and writing tedious code. Knowing a particular language is less important. I coded in like 10 different languages since I started my career.”

Also: 3 ways to help your staff use generative AI confidently and productively

But while these technologies help with more straightforward coding processes, Athey said technology projects still require higher-level architectural skills: an understanding of “structure and how things should be done.” The AI economy will also demand evaluation and logical-thinking capabilities.

“We put out thousands of computer science and engineering students at Stanford every year. They all are very good at downloading a data set from the web and doing stuff with it. Training stuff, optimizing stuff, predicting stuff, classifying stuff, comparing model A to model B, and comparing their performance. However, they have very, very little training in asking, ‘What does it mean? How would you know when or why it is doing well? What are the weaknesses? What kind of data would help improve it?'”
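That gap is easy to see in practice. The sketch below, a minimal illustration assuming scikit-learn and a synthetic dataset, covers the workflow Athey describes, and stops exactly where she says the training stops: at a headline metric.

```python
# Minimal sketch of the "compare model A to model B" workflow Athey describes,
# using scikit-learn and a synthetic dataset purely for illustration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score

# Stand-in for "downloading a data set from the web and doing stuff with it."
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model_a = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
model_b = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# The comparison students are trained to do: a single headline number per model.
print("model A accuracy:", accuracy_score(y_test, model_a.predict(X_test)))
print("model B accuracy:", accuracy_score(y_test, model_b.predict(X_test)))

# The questions Athey says are missing begin where this script ends:
# on which examples do the models disagree, why, and what data would help?
```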

The challenge with AI models is that “they are going to be giving you wrong answers a share of the time,” Athey said. “We don’t have the science to know when is it giving you wrong answers and when is it giving you right answers. Like maybe you don’t have enough young people in your data sets. You try to hallucinate more of them. But that may or may not actually help you learn more about young people. I need to assess that I’m not just hallucinating those features of young people. And that’s not built in. And the model doesn’t know – it won’t tell you. The model has no way to know that directly.”
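One way to make that concern concrete is to measure how a model actually performs on the underrepresented group before trusting synthetic records to fix it. The sketch below is hypothetical: the column names and the tiny table are invented for illustration, not drawn from Athey's work.

```python
# Hedged sketch: check per-group performance instead of relying on an
# aggregate score. The "age_group" column and the data are hypothetical.
import pandas as pd

results = pd.DataFrame({
    "age_group": ["18-24", "18-24", "25-54", "25-54", "55+", "55+"],
    "y_true":    [1, 0, 1, 1, 0, 1],
    "y_pred":    [0, 0, 1, 1, 0, 1],
})

# Per-group accuracy exposes what an overall score hides: the model may look
# fine in aggregate while failing on the slice you care about.
per_group = (
    results.assign(correct=lambda d: d["y_true"] == d["y_pred"])
           .groupby("age_group")["correct"]
           .agg(["mean", "count"])
)
print(per_group)

# If the "18-24" slice is both small and inaccurate, synthetic examples may or
# may not help -- as Athey notes, the model itself cannot tell you whether the
# hallucinated records reflect real young people.
```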

Also: How your business can best exploit AI: Tell your board these 4 things

Athey said the bottom line is that today’s and tomorrow’s technology professionals will handle and pipe in the data that fuels AI-driven enterprises. “In using the new kinds of AI, there’s a bit of learning about the value of your data. What’s the value of external data sources? What initiatives have you tried before that didn’t work because you didn’t have enough data? Are there initiatives that you could try again now?” She said part of the challenge is that AI models may need to consume “historical unstructured messy data.”

Executives and professionals need to be versed “in the next layer of analysis that requires a lot of logical thinking. It requires understanding statistics and conditional expectations. You need mathematical framing. To ask, ‘What does it mean for this to be right? And to give an accurate answer too?'” And right now, that level of critical thinking, and the tools to support it, are still in short supply, said Athey.
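A small example of that mathematical framing: a predicted probability is only “right” in the conditional-expectation sense if, among cases where the model predicts roughly p, the outcome actually occurs roughly p of the time. The sketch below, using entirely synthetic data, runs that basic calibration check.

```python
# Hedged sketch of the conditional-expectation framing: bin a model's
# predicted probabilities and compare each bin's average prediction with the
# observed outcome rate. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
true_prob = rng.uniform(0.1, 0.9, size=10_000)        # the unknown E[Y | X]
outcomes = rng.binomial(1, true_prob)                  # observed labels
predictions = np.clip(true_prob + rng.normal(0, 0.1, 10_000), 0, 1)  # model scores

bins = np.linspace(0, 1, 11)
bin_ids = np.digitize(predictions, bins) - 1
for b in range(10):
    mask = bin_ids == b
    if mask.any():
        print(f"predicted ~{predictions[mask].mean():.2f}  "
              f"observed {outcomes[mask].mean():.2f}  n={mask.sum()}")
```

It is a crude check, but it is exactly the kind of “what does it mean for this to be right” question Athey argues is still in short supply.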
