Microsoft has announced new security measures for its AltspaceVR platform. With virtual reality and metaverse environments under increasing scrutiny over their safety and security practices, Microsoft is taking additional steps to protect users and create a safer, more inclusive environment.
Microsoft removes public social hubs from AltspaceVR
In a recent AltspaceVR blog post, Microsoft confirmed major changes to the virtual reality platform’s public social hubs.
In short, to help prevent the harassment users face in these spaces, “social hubs including Campfire, News and Entertainment Commons hosted by AltspaceVR will be removed” effective immediately. In addition, AltspaceVR’s existing safety bubble feature will become the default setting for all users, and attendees joining events will be muted automatically on entry.
Along with those changes, Microsoft is evaluating its content rating system for Events to make sure users don’t inadvertently enter rooms with inappropriate content. It is also strengthening moderation across the entire AltspaceVR platform, making it easier to report offending content.
AltspaceVR will require a Microsoft account
Another change is the mandatory use of a Microsoft account to sign in to AltspaceVR. The move brings AltspaceVR in line with Microsoft’s other online services and helps the company manage access based on users’ ages.
Microsoft says it will require all users to be logged in to AltspaceVR with a Microsoft account (MSA). As it does for Xbox, Windows, and other Microsoft services, the company will integrate MSA accounts with Microsoft Family Safety, allowing parents to grant or restrict access to AltspaceVR for family members aged 13+ who download the app from the Microsoft Store.
The Microsoft account requirement won’t take effect alongside the other changes; the company “will communicate the exact timing and how parents can implement these changes in the coming months.”
Big tech companies boost personal security in the metaverse
Metaverse security is a concern for every company racing to build virtual platforms. Users entering a virtual space, whether a VR experience or a metaverse world, want to feel safe in their environment, and building the tools to ensure that should be a baseline requirement for any new project.
However, as Microsoft is now discovering, retrofitted protection tools like the default safety bubble and explicit content rating systems are just as important for existing platforms like AltspaceVR.
Microsoft is not the first big tech company to implement security measures for a metaverse platform. Meta’s Horizon Worlds and Horizon Venues also introduced personal bubbles after a series of videos surfaced showing users being harassed, with one user invading another’s space.
Similar to Microsoft’s safety bubble, Meta’s Personal Boundary creates a four-foot safety space around a user’s avatar. Vivek Sharma, vice president of Horizon, said the Personal Boundary will “create more personal space for people and make it easier to avoid unwanted interactions.”
Sharma also said, “If someone tries to enter your Personal Boundary, the system will block their forward movement as they reach the boundary. You won’t feel it; there is no haptic feedback.”
For once, Microsoft and Meta are reading from the same hymn sheet, recognizing that if people are to start using metaverse platforms in earnest, tech companies will have to step up and ensure user safety, just as they must on social media, Xbox, or any other social-enabled platform.