Plenty of industries are speculating about what AI means for their futures. It could be a “particularly big problem” for legal professionals and could “reinvent” market research. It might change what it means to be a graphic designer. It might empty out call centers. Will white-collar office workers around the world become significantly more productive, redundant, neither, or both? An open letter calling for a “pause” on “giant AI experiments,” signed by an unusual cast of figures including Elon Musk, Yuval Noah Harari, and Andrew Yang, frames the question as starkly as possible: Should all the jobs be automated away, including the fulfilling ones?
These are mostly projections made by one industry, from a distance, about its plans for all the others. Inside the tech industry, though, there is a little more certainty about when and where AI automation will matter most. The future may be autocomplete: AI is coming for software first. Where else would it start?
As a feeling, this is understandable. Capital is draining out of the rest of the startup scene and rushing into AI. Big tech companies are announcing major AI investments while laying off thousands of workers in other roles. It’s all anyone in the industry can talk about, and true believers are everywhere. If you’re already the kind of person who worries about not working on the next big thing, the next big thing may feel as if it’s bearing down on you, or at least about to change your job in unpredictable ways, even if you’re a staff software engineer at a company that has nothing to do with AI at all.
But the idea that software development will feel the effects of LLM-based AI first rests on more than nervous vibes. While laypeople were getting their first taste of experimental chatbots like ChatGPT and image generators like DALL-E and Midjourney, coders were already working with AI assistants, some of them built on the same underlying technology. GitHub Copilot, a coding assistant developed by Microsoft and OpenAI, aims to speed up common programming tasks by “analyzing the context of the file you’re editing and the related files” and offering suggestions about what comes next. Lately, it has become more ambitious and assertive, taking on a wider range of programming tasks, including debugging and code commenting.
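To give a sense of what that looks like in practice, here is a minimal, hypothetical sketch of the kind of completion a Copilot-style assistant offers: the programmer writes a signature and a docstring, and the tool proposes a body. The function name and data are invented for illustration and are not taken from GitHub’s documentation.

```python
# A programmer types the signature and docstring below; a Copilot-style
# assistant infers the intent from that context and suggests the body.
# (Hypothetical example for illustration; not from GitHub's docs.)

def average_order_value(orders: list[dict]) -> float:
    """Return the mean 'total' across a list of order records."""
    # Everything below is the sort of completion such a tool might suggest.
    if not orders:
        return 0.0
    return sum(order["total"] for order in orders) / len(orders)


if __name__ == "__main__":
    sample = [{"total": 10.0}, {"total": 30.0}]
    print(average_order_value(sample))  # 20.0
```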
Reviews of Copilot range from enthusiastic to mixed. At the very least, it’s a pretty good autocompleter for many coding tasks, which suggests the underlying models have done an impressive amount of “learning” about how basic software works. Game developer Tyler Glaiel found that GPT-4 failed to solve tricky, novel programming test problems and, like its content-generating ilk, was prone to making things up in ways that can “waste a lot of time.” Still, he gave it some credit when asking whether GPT-4 could “actually write code”:
Given an explanation of the algorithm, or a description of a well-known problem with plenty of existing examples on the web, GPT-4 can absolutely write the code. Most of the time it’s just assembling and remixing what it has seen, but to be fair … a lot of programming is just that.
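Glaiel’s point is easier to see with a concrete case. The sketch below shows the sort of well-worn problem, binary search, that an LLM can reproduce readily because the web is full of nearly identical implementations; it is an illustrative assumption, not one of his actual test problems.

```python
# Binary search: a well-known, heavily documented problem an LLM can
# "assemble and remix" from countless existing examples on the web.
# (Illustrative sketch only; not drawn from Glaiel's tests.)

def binary_search(items: list[int], target: int) -> int:
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


if __name__ == "__main__":
    print(binary_search([1, 3, 5, 7, 9], 7))  # 3
    print(binary_search([1, 3, 5, 7, 9], 4))  # -1
```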
Describing his encounters with these tools, former Twitter VP and Googler Jason Goldman likened the technology to a familiar figure in the industry: a manager who can’t really code.
OpenAI was early to release usable AI coding tools, but this week Google announced a partnership with Replit, a popular software-development environment, on a general-purpose coding assistant. In an interview with Semafor, Replit CEO Amjad Masad, who is understandably excited, described coding as “a near-perfect use case for LLMs” and said his company’s eventual goal is for assistants to become “fully autonomous,” something you could treat like a temp worker.
This month, SK Ventures’ Paul Kedrosky and Eric Norlin laid out a broader bullish case for AI in software development:
The current generation of AI models is a missile aimed directly, if unintentionally, at software production itself. Sure, chat AIs can do swimmingly at crafting undergraduate essays, or at churning out marketing materials and blog posts (as if you needed more of either), but such technologies are like dark magic when it comes to creating, debugging, and speeding up software production. And they do it at almost no cost.
This is partly because “[s]oftware is even more rule-based and grammatical than conversational English, or any other conversational language,” and because “programming is a good example of a predictable domain.” The investors predict this will let people create and use software they couldn’t before, rapidly paying down “society’s technical debt” before unleashing an unpredictable wave of innovation.
And, hey: maybe! In any case, it’s clear that the software industry is highly exposed to the newest products and that its workers and employers are testing and adopting them quickly. The labor effects of LLM automation (fewer jobs, more jobs, different jobs, wage pressure, layoffs) will reveal themselves first, or at least most fully, in an industry where the technology is being deployed most completely and where it seems especially competent.
One such place is inside the companies that make AI tools. Google is a software company that wants to bring AI-based software to its users and to other companies’ customers. It is also an employer with more than 150,000 employees that just cut 6 percent of its workforce. In announcing the layoffs, Google CEO Sundar Pichai addressed the company’s investment in AI directly. “Being constrained in some areas allows us to bet big on others,” he wrote, adding that pivoting the company to be AI-first years ago had transformed its business and the industry as a whole, and emphasizing the huge opportunities AI creates ahead. This can be read two ways. Google is certainly in an enviable position to sell and provide AI tools to others. It is also probably the ideal customer for its own tools purporting to improve productivity: dozens of offices full of coders, product managers, email writers, deck makers, and meeting schedulers (to say nothing of countless low-wage contractors spread across the globe) on which to test the tools and collect results in a single enterprise environment. Before Google really understands what its products will do for its customers and their employees, it will probably start to understand what those products do for Google itself.
In its own analysis, OpenAI suggested that certain tech jobs are especially likely to be exposed to LLM-based tools. The company found that roles relying heavily on science and critical-thinking skills were negatively correlated with exposure, while “programming and writing skills” were positively correlated with it. It also estimated that about 80 percent of the U.S. workforce could have at least 10 percent of their work tasks affected by the introduction of LLMs, and that about 19 percent of workers could see at least half of their tasks impacted.
Now, it’s easy for one of the world’s leading LLM companies, which has fewer than 400 employees, to make wild guesses about what the future holds for everyone else. (OpenAI does, however, rely on thousands of overseas contract workers to help clean up its models, doing work that could plausibly be “exposed” to automation in the short term.) It’s also a prediction that might interest OpenAI’s biggest funder, Microsoft, which is rolling out AI-powered features across a range of popular software, including Windows, Office, and, of course, GitHub. Like Google, Microsoft is cutting costs, mostly by cutting thousands of jobs, including GitHub’s overseas team. Its investments in AI can likewise be read two ways: as a bet on a new kind of product it can profit from, and as a plain investment in labor-saving automation, like a new machine on the factory floor.