The rise of voice chat in online video games has inevitably increased bad behavior among players – after all, it's a lot easier to scream filth and abuse at other people than it is to stop playing and type it at them. That sort of toxicity is difficult to moderate, and some companies are turning to AI to help keep a lid on it. Speaking at GDC, Roblox senior technical director Kiran Bhat and voice safety lead Hannes Heikinheimo said Roblox has been using machine learning for voice chat moderation for a year and has had good success with it – but they acknowledged that sometimes humans still do the job better.
"Moderating voice in real time is really difficult, because to moderate voice you not only have to know what the person is saying, you also want to capture the tone and the intent in order to make a decision about whether something is toxic or not," Bhat said during the talk. "There's also context, which is part of whether or not something is offensive. So it's a really difficult problem."
Easing that challenge somewhat, Bhat explained that at least 85% of what's considered toxic falls into four major categories, and the "majority" of offenses can be caught with a list of the top 50 keywords.
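Roblox hasn't published how its system is built, but as a rough illustration of why a short keyword list can do so much of the work, here's a minimal sketch of a cheap first-pass screen that flags a transcript before any heavier model has to weigh tone, intent, and context. The keyword set and function name here are hypothetical placeholders, not anything Roblox has described.

```python
import re

# Hypothetical placeholder keywords; Roblox's actual top-50 list is not public.
TOP_KEYWORDS = {"keyword1", "keyword2", "keyword3"}

def quick_screen(transcript: str) -> bool:
    """First-pass check: flag a transcript if it contains any top keyword.

    A simple screen like this can catch the bulk of clear-cut offenses
    cheaply, leaving the harder, context-dependent calls to a heavier
    model or a human reviewer.
    """
    words = set(re.findall(r"[a-z']+", transcript.lower()))
    return not TOP_KEYWORDS.isdisjoint(words)

if __name__ == "__main__":
    # No placeholder keywords present, so this prints False.
    print(quick_screen("an example utterance"))
```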
I am naturally skeptical of big claims about AI's capabilities, but Roblox's voice safety lead Heikinheimo claims impressive results: Since the system was rolled out about a year ago, it has been expanded to cover 31 countries, and during that time, he said, the number of abuse reports has dropped.
One of the benefits of using machine learning for voice moderation is "consistency," Heikinheimo said: Unlike humans, machines don't have differing perspectives or opinions, they don't get tired or angry, and they don't have bad days – they just do what they do. So "in very clear-cut cases [of toxic behavior], machines are more efficient and more consistent than people." But the essential humanity that can make us fallible also enables us to outperform machines in situations that aren't so clear-cut: Machines can be good at pattern matching, but when it comes to understanding intent and applying discretion, well, "humans still beat the machines."
"Humans are still better at things where the call can be very close," Heikinheimo said. "Or it might be something rare, where we don't have that much data to train the machine learning system. So in those cases humans can still be better."
Roblox isn't the only one embracing AI-powered voice moderation. At GDC 2024, for example, we spoke with two software makers working on machine learning systems to help moderate voice chat. It's also worth noting that Activision has previously reported similar success with its AI-powered voice moderation tools in Call of Duty.
Natural skepticism aside, I'll acknowledge that machine learning (AI, whatever you want to call it) has meaningful real-world uses, and this is a good example of one. But I also appreciate that it shows it's a tool, not a cure-all, and that we humans still have a place in this brave new world of the future.