The PlayStation 5 competes with the new Xbox Series X and Series S not only in power and games catalog, but also in other aspects that can be just as important, or even more so, to the user experience. For this reason, Sony's next console will offer a new option to combat abuse when playing online. We explain how it works.
Sony will fight against toxic users
Online gaming is one of the great attractions for many players. However, there are times when it can be a real nightmare, especially for female players, because some users (to put it politely) have not yet realized that everyone has the same right to play, and they devote themselves to insulting others through text and voice chats.
Sony wants to prevent this type of situation in voice chats and ensure that its user community is as healthy as possible. So, although we will all have to keep working on education around equality, the company will offer a new tool on the PS5 with which you can report toxic users you may encounter while playing.
However, before describing how it works, it is important to clarify where it can be used. The abusive-comment reporting tool will only be available on PS5, but PS4 users will receive (or may have already received) a notification when updating their system, because voice chats will have cross-platform support. So it is important that everyone understands what this new tool is.
What’s more, to avoid problems and accusations of privacy violations, Sony will warn all users that their conversations could be recorded on other users’ consoles. This applies only to the console’s native party chats, not to a game’s public in-game chat.
How PS5 chat moderation works
As we have already mentioned, the new system for reporting abuse in voice chats will only work on the PS5. The console will record your voice conversations, but will store only the most recent five minutes, locally on the console. This is important to keep in mind: there is no automatic upload to an external server, so you can rest assured about your privacy.
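A "most recent five minutes only" store like the one described above behaves like a rolling buffer: new audio pushes the oldest audio out. The following is an illustrative sketch of that idea, not Sony's actual implementation; the class, chunk size, and names are all assumptions for demonstration.

```python
from collections import deque

BUFFER_SECONDS = 5 * 60   # keep only the last five minutes
CHUNK_SECONDS = 1         # assume audio arrives in 1-second chunks

class RollingAudioBuffer:
    """Hypothetical local store that never holds more than 5 minutes."""

    def __init__(self):
        max_chunks = BUFFER_SECONDS // CHUNK_SECONDS
        # deque with maxlen drops the oldest chunk automatically
        self.chunks = deque(maxlen=max_chunks)

    def append(self, chunk):
        self.chunks.append(chunk)

    def duration(self):
        return len(self.chunks) * CHUNK_SECONDS

buf = RollingAudioBuffer()
for second in range(600):          # simulate ten minutes of chat
    buf.append(f"audio@{second}s")

print(buf.duration())              # capped at 300 seconds
print(buf.chunks[0])               # oldest retained chunk is from second 300
```

The key property is that retention is bounded by construction: audio older than five minutes simply no longer exists on the device, which is consistent with the privacy framing in the article.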
The purpose of these audio fragments is to give you, as a user, the ability to report another user who has made an abusive or insulting comment. If that happens, you can use the tool to select a 20-second fragment in which the offending remark can be heard.
In addition to those 20 seconds, the tool itself will add 10 seconds before and after the selection. So if you need to report abuse in the console’s native voice chats, the submitted clip will be 40 seconds in total.
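The clip arithmetic above (a selection of up to 20 seconds, padded by 10 seconds of context on each side, and everything confined to the 5-minute buffer) can be sketched in a few lines. This is a hypothetical illustration of the described behavior; the function name and validation rules are assumptions.

```python
BUFFER_SECONDS = 5 * 60   # the 5-minute local recording window
MAX_SELECTION = 20        # user may select at most 20 seconds
PADDING = 10              # context added before and after the selection

def padded_clip(start, end):
    """Return the (start, end) of the clip actually submitted for review."""
    if not (0 <= start < end <= BUFFER_SECONDS):
        raise ValueError("selection must lie within the 5-minute buffer")
    if end - start > MAX_SELECTION:
        raise ValueError("selection may be at most 20 seconds long")
    # pad by 10s on each side, clamped to the buffer boundaries
    return max(0, start - PADDING), min(BUFFER_SECONDS, end + PADDING)

start, end = padded_clip(120, 140)   # user selects seconds 120-140
print(end - start)                   # 40 seconds submitted in total
```

Note the clamping: a selection near the very start or end of the buffer yields a clip shorter than 40 seconds, since there is no recorded audio beyond the buffer edges to pad with.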
This material will then be sent to the team of Sony employees in charge of assessing whether any kind of abusive behavior has occurred. If it has, that same staff will take the appropriate measures and apply them against the profile of the user who committed the offense.
A useful tool or the opposite?
Now that you know how the tool works, what do you think? The idea and intention are good, and it could be quite useful in achieving a much more pleasant gaming experience and a healthier user community. Even so, it is also understandable that some users might be wary about their privacy.
Each user will have to weigh this for themselves, and it remains to be seen how people use the tool and what changes Sony may apply once it sees how the tool works in practice. But one thing is certain: curbing toxic users is necessary.