The Gran Turismo series has always offered the ultimate racing experience on PlayStation, but now Polyphony Digital’s racing game can offer the ultimate racer, too, in the form of GT Sophy.
Working in collaboration, Sony AI, Sony Interactive Entertainment (SIE), and Polyphony created a state-of-the-art artificial intelligence racing driver called Sophy. Using deep reinforcement learning, GT Sophy took on the task of learning to drive from scratch, before slowly progressing to become a capable, and then unbeatable, racing opponent.
Sophy started learning to drive in April 2020 using reinforcement learning, in which the agent receives positive feedback for desirable driving and negative feedback for mistakes. Sony then utilized cloud computing to allow Sophy to run “tens of thousands of simultaneous simulations” in order to master the art of racing as fast as possible. After that, it was just a case of adapting to a competitive race involving humans.
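To make the idea of learning from positive and negative feedback concrete, here is a toy sketch of tabular Q-learning on a five-step “track.” This is purely illustrative: GT Sophy uses deep reinforcement learning at a vastly larger scale, and none of the states, actions, or reward values here come from Sony’s system.

```python
import random

def train(episodes=2000, seed=0):
    """Toy Q-learning: learn when to accelerate and when to brake."""
    rng = random.Random(seed)
    n_states = 5              # positions along a toy "track"; state 4 is the finish
    actions = (0, 1)          # 0 = brake (slow but safe), 1 = accelerate (fast, risky at the end)
    q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate

    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # Epsilon-greedy: mostly exploit the best known action, sometimes explore.
            if rng.random() < eps:
                a = rng.choice(actions)
            else:
                a = max(actions, key=lambda act: q[(s, act)])

            if a == 1 and s == n_states - 2:
                reward, s2 = -10.0, 0            # wipeout near the finish: negative feedback
            else:
                reward = 1.0 if a == 1 else 0.2  # faster progress earns more positive feedback
                s2 = s + 1

            # Standard Q-learning update toward reward plus discounted best future value.
            best_next = max(q[(s2, b)] for b in actions)
            q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
            s = s2
    return q
```

After training, the table favors accelerating early in the lap but braking at the risky final step, which is the same feedback-driven trial and error, scaled up enormously, that let Sophy discover fast racing lines on its own.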
The culmination of this training was demonstrated at the second Race Together day, held on Oct. 21 last year. Sophy won all three races, and the AI was also witnessed “adapting to a tough situation when it had a wipeout early in the third race and still went on to come in first place.”
Sony is keen to point out that Sophy wasn’t created simply to be faster at Gran Turismo than human players. Instead, Kazunori Yamauchi, President of Polyphony Digital, explains that, “The goal with Gran Turismo Sophy is ultimately to entertain people.” So imagine Sophy pushing human players to get faster, or helping them learn new tracks or techniques for specific sections, and you get an idea of what Sony and Polyphony are likely planning for this new AI. Sophy isn’t going to be limited to Gran Turismo, either. It’s an AI expected to be used in other games, too.