GitHub Copilot, based on OpenAI's Codex, helps boost productivity – Business Insider

While ChatGPT has spurred a lot of excitement around generative AI, the conversational chatbot is not the only technology to come from OpenAI.
The Microsoft-backed company has developed a slew of generative-AI products, including one called Codex that’s already helping engineers become more productive. 
Codex is the basis for GitHub Copilot, a tool that functions as a sort of autocomplete for software engineers. Copilot looks at the code an engineer is writing and suggests the next few lines they might want to write to solve their problem. Users can also write a comment in human language describing the function they want to build, and Copilot will provide a suggested solution.
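To make that concrete, here is a hypothetical sketch of that workflow in Python: the engineer types a plain-language comment, and a Copilot-style suggestion fills in the function body. The function name and logic are illustrative only, not drawn from any project mentioned in this story.

```python
# The engineer types a plain-language comment describing the goal;
# a Copilot-style completion proposes the function body below.

# parse "YYYY-MM-DD" strings and return the number of days between them
from datetime import date

def days_between(start: str, end: str) -> int:
    # Suggested completion: convert both strings to dates and subtract.
    start_date = date.fromisoformat(start)
    end_date = date.fromisoformat(end)
    return abs((end_date - start_date).days)
```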
Software engineers often have to put together scaffolding, or boilerplate: the repetitive setup code most programs need, akin to the outline of an essay. Because this code is written so often, Copilot can easily reproduce it. GitHub has said that in files where it’s enabled, Copilot is responsible for upward of 40% of the written code. Engineers who spoke with Insider said it had already proved a major boon for productivity.
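The kind of scaffolding being described looks something like the command-line setup below, written in Python with the standard argparse module. It is a generic example of boilerplate that appears in countless projects, which is exactly why an autocomplete tool can reproduce it after the first line or two; it is not code from GitHub or the companies quoted in this story.

```python
# Generic boilerplate: a command-line entry point that varies little from
# project to project, making it easy for an autocomplete tool to fill in.
import argparse

def main() -> None:
    parser = argparse.ArgumentParser(description="Example command-line tool")
    parser.add_argument("--input", required=True, help="Path to the input file")
    parser.add_argument("--verbose", action="store_true", help="Print extra detail")
    args = parser.parse_args()
    if args.verbose:
        print(f"Reading {args.input}")

if __name__ == "__main__":
    main()
```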
Cyrus Ghazanfar, the chief technology officer at 401(k)-management startup Beagle, said his company began experimenting with Copilot in recent months. He now requires his team of three engineers to use the tool because he quickly found it saved him plenty of time by eliminating rote work. Already, Copilot has proved useful for creating databases, he said.
“In these frameworks, there’s a specific way of creating databases,” Ghazanfar said. “I find Copilot super useful because there’s a lot of repetition when you do these kinds of things.”
For example, many databases have the same properties, like a timestamp that says when the data was created. Ghazanfar said he could now use Copilot to quickly build a database that includes these properties, just by asking the AI-powered bot to make it. 
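The article doesn’t name the framework Beagle uses, so the sketch below assumes a SQLAlchemy-style model purely for illustration. It shows the repetition Ghazanfar is describing: nearly every table carries the same bookkeeping columns, such as a creation timestamp, so a tool that has seen one model in a project can suggest the next.

```python
# Hypothetical SQLAlchemy-style model illustrating repetitive table definitions;
# the table and column names are invented for this example.
from datetime import datetime, timezone
from sqlalchemy import Column, DateTime, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Account(Base):
    __tablename__ = "accounts"

    id = Column(Integer, primary_key=True)
    email = Column(String, nullable=False)
    # Boilerplate columns that repeat across nearly every table; after seeing
    # one or two other models, an autocomplete tool can fill these in.
    created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
    updated_at = Column(
        DateTime,
        default=lambda: datetime.now(timezone.utc),
        onupdate=lambda: datetime.now(timezone.utc),
    )
```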
Bill Mers, the vice president of engineering at LookDeep Health, said it’s “almost creepy” when Copilot figures out exactly what code he’s trying to write. Copilot gleans its knowledge from the large corpus of projects on GitHub — which could present legal issues — but it also uses the context of the project an engineer is working on to suggest tailored code. 
Mers estimated that Copilot had saved him 10% of the time he would’ve normally spent coding. Occasionally, he writes comments inside his code explaining to other engineers what the scripts are designed to do, and Copilot can pick up on that and do it for him, drafting human-language descriptions of exactly what the code is doing.
“That was a bunch of typing, a bunch of thinking, I didn’t have to do,” Mers said.
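A rough sketch of the pattern Mers describes, with the explanation written as a comment the tool might draft: the engineer supplies the code, and a Copilot-style suggestion supplies the plain-language description. The function and the wording of the comment are invented for illustration.

```python
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    # Copilot-style suggested description: computes a fixed monthly payment
    # using the standard amortization formula, falling back to straight
    # division when the interest rate is zero.
    if annual_rate == 0:
        return principal / months
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)
```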
While Copilot’s suggestions are “pretty accurate,” using the tool can be a double-edged sword because it’s easy to trust it too much, Ghazanfar said, which means the developer must go back and correct errors when the suggested code doesn’t work. Tools like Copilot and ChatGPT return their answers with confidence even when they’re wrong, so users still need to have some expertise to recognize errors and address them. 
Even former GitHub CEO Nat Friedman, who oversaw Copilot’s launch, has spoken about the product’s flaws. At a recent event in San Francisco, he compared using Copilot to playing a lottery: programmers write their code, and while Copilot’s suggestions are frequently wrong, every now and then it suggests an answer that perfectly solves a developer’s need, Friedman said.
Though Copilot may be good at solving common programming needs, like ChatGPT, it is prone to “hallucination,” a term the AI industry has adopted to describe chatbots confidently suggesting answers that are completely made up.
“Copilot will give you a very compelling answer that looks great, and this code looks great, and these API calls look great. But then it turns out the APIs don’t even exist,” Mers said, referring to application programming interfaces, which help apps communicate with each other. 
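To illustrate the failure mode Mers is pointing at, the snippet below shows the kind of plausible-looking but nonexistent call a tool can confidently suggest. The requests library is real, but it has no get_json_with_retries function; the code reads well in review and fails the moment it runs. This is a deliberately fabricated example, not a suggestion Copilot actually produced.

```python
# Deliberately fabricated example of a hallucinated API: `requests` is a real
# library, but it has no `get_json_with_retries` function, so this call
# raises AttributeError at runtime despite looking idiomatic.
import requests

def fetch_user(user_id: int) -> dict:
    return requests.get_json_with_retries(
        f"https://api.example.com/users/{user_id}", retries=3
    )
```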
In other words, if developers don’t already understand what they’re looking for, they’ll quickly get stuck using Copilot. That’s one reason engineers likely won’t go extinct anytime soon. Another is that experts believe tools like Copilot will only increase demand for developers, since software can be created at a faster cadence. Developers will simply be able to do more, but they’ll still need to understand how the code works.
“Sure, you could create a basic iOS app with Copilot,” Ghazanfar said. “But to do more-complicated things you need to know what you’re trying to ask, how to fix the suggested code when it’s wrong, and the best way to write code when it’s not a super opinionated framework where there’s just one way of doing things.” And because Copilot is relying on existing code from around the web, it’s not building the next generation of technology, which will still require the critical thinking skills of a human. 