What comes after the AI coding boom
The final Sources Live interview from Davos, with Turing CEO Jonathan Siddharth.
One of the most common jobs in the world, by volume, will be evaluating and training AI models. That’s Jonathan Siddharth’s prediction. He believes companies will need millions of people across every industry and function to show AI systems what expert work actually looks like.
Siddharth is the CEO and co-founder of Turing, which has quietly become one of the most important companies in the AI supply chain. Founded in 2018 as a remote developer marketplace, the company pivoted to AI training data after OpenAI came calling in 2022, asking for help generating coding data that would go on to power ChatGPT’s reasoning abilities. Today, Turing’s data powers the top five models on SWE-bench, one of the most closely watched coding benchmarks in AI. The company works with all of the major frontier labs and says it has a network of around four million coders worldwide.
Back in January, I sat down with Siddharth in front of a live audience at Brunswick Home on the sidelines of the World Economic Forum in Davos, Switzerland, for the final installment of my inaugural Sources Live interview series. Our conversation got into why coding AI has advanced so rapidly compared to other domains, how the same reinforcement learning techniques are now being applied to other fields, and why Siddharth believes almost all future AI systems will be human-in-the-loop rather than fully autonomous.
He also shared his take on what makes tools like Claude Code work so well, and made the case for why current enterprise AI adoption across the S&P 500 still “rounds down to zero.” Towards the end of the conversation, we even got surprisingly philosophical.
You can watch the full interview on YouTube. Thanks to Disruptive for making it possible.
If you haven’t already, make sure to check out the full Sources Live series from Davos, including my conversations with the leaders of Google DeepMind, Scale, ElevenLabs, Skild, Reflection, and Meta.
The following transcript has been lightly edited for length and clarity: