OpenAI's language model GPT-3 has established a reputation for its flexibility, having been used for everything from AI-generated self-help blogs to identifying paintings from simple captions. In the latest development, the language model has been used to generate game code, according to programmer Ivan Mathy (who goes by @OKatBest on Twitter). "All you need to do is describe what you want," Mathy said, "and GPT-3 will try to write it down as a Unity C# script."
Just a week ago, the same language model was trained on cringe-worthy data sets of people trying to flirt. The resulting pick-up lines were, surprisingly, less awful than what we humans come up with when we attempt to woo others. One of the results was: "Picked up some pretty flowers. Wanna smell them? Here, try to take my hand off." From awkward flirting to writing surprisingly efficient game code, GPT-3 is proof that, even though its skills are limited to fairly basic, logic-oriented text generation, it can still create something unique.
Simple is best — A command written in plain language with direct, clear instructions works best, according to Mathy. For example, GPT-3 can execute the command to "create a script that moves game object to left and right in sine motion with speed and magnitude as parameters." The sentence may sound dull, but the AI will understand it and write the corresponding code. Here's proof from Mathy.
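For readers who don't work in Unity, a hand-written sketch of the kind of script that prompt describes might look like this (this is an illustration of the task, not Mathy's actual GPT-3 output; the class and field names are our own):

```csharp
using UnityEngine;

// Moves the attached GameObject left and right along a sine wave.
// "speed" and "magnitude" are the two parameters the prompt asks for,
// exposed in the Unity Inspector as public fields.
public class SineMover : MonoBehaviour
{
    public float speed = 1f;      // oscillation frequency
    public float magnitude = 1f;  // oscillation amplitude

    private Vector3 startPosition;

    void Start()
    {
        // Remember where the object began so it oscillates around that point.
        startPosition = transform.position;
    }

    void Update()
    {
        // Offset along the x-axis follows sin(speed * t), scaled by magnitude.
        float offset = Mathf.Sin(Time.time * speed) * magnitude;
        transform.position = startPosition + Vector3.right * offset;
    }
}
```

A dozen lines of boilerplate-heavy code like this is exactly the sort of well-trodden pattern a model trained on public code can reproduce from a one-sentence description.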
It may not be able to create full-fledged games, but programmers have wondered aloud if GPT-3 could assist with creating dialogue and conversations for Non-Playable Characters (NPCs) in games. Normally these characters aren't all that complicated, but perhaps with GPT-3's "175 billion parameter language model," developers might be able to create more in-depth dialogue, as AI researcher Chintan Trivedi explains.
You can’t please everyone — One Reddit user, BaguetteTourEiffel, offered a more critical take on the AI-generated game code from Mathy. "GPT-3 is getting tiring," they said. "For those who don't know, it is a gigantic neural network (unrunnable on any normal computer) that has been trained on an ungodly amount of data."
While it is true that the GPT-3 model does not innately "understand" data, its ability to parse massive swaths of data and then generate coherent code is impressive nonetheless. Mathy wants programmers to try it, too. "If you have an OpenAI API key," he tweeted, "you can try this yourself with the prompt followed by 'using UnityEngine,' on a new line, which essentially tells GPT-3 that you're looking for a piece of code, since it's trying to complete what you started."
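Put concretely, a prompt following Mathy's recipe might look something like this (an illustrative sketch; the exact wording of the first line is up to you):

```
Create a script that moves a game object left and right in sine motion with speed and magnitude as parameters.
using UnityEngine;
```

The trick is that `using UnityEngine;` is the standard first line of almost every Unity C# file, so a model that completes text will naturally continue with more C# rather than with prose.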
GPT-3 may have a long way to go before it’s creating the next Super Mario Bros., but we shouldn’t discount how far it’s come. Even a decade ago this sort of AI-powered automation was all but impossible. So who knows what it’ll be capable of in another 10 years’ time. Hopefully, it won’t take your job.