As Meta continues to build next-level, more immersive social experiences in the metaverse, it needs to focus on responsibility, and develop these new spaces with the safety and security of all users in mind.
After all, if you're asking people to spend more time in a fully immersive, reality-distorting headset, the potential impact on mental health is likely to be greater than that of a typical social media app.
What's concerning, then, is a Wall Street Journal report that Meta has disbanded its Responsible Innovation team, the group tasked with monitoring and addressing concerns about the potential negative and adverse effects of its various products.

As reported, the Responsible Innovation team included around 20 engineers and ethicists who worked with internal product teams, as well as external privacy experts, researchers, and users, to identify potential concerns about new products and changes to Facebook and Instagram.
Even with this team in place, Meta hasn't always gotten things right. Without these additional checks, it's hard to imagine this aspect improving.
Meta has confirmed the decision to disband the group, but says that it remains committed to the team's goals. Meta also noted that most Responsible Innovation team members will continue doing similar work elsewhere in the company, and that it believes these efforts are better spent within more "problem-focused" teams.
Of course, it's impossible to know what this really means within Meta's broader development process, or how it might affect future projects. But Meta is now rolling out its most immersive, most interactive, and potentially most impactful experiences ever.
More than ever, it seems that we need additional guidance.
This is the key concern with Meta's metaverse development: that it will once again "move fast and break things," pushing ahead with immersive VR without due consideration of mental health and other implications.
Meta already has a history in this regard. It never fully considered the implications of Facebook user data falling into the wrong hands, as with Cambridge Analytica, which gathered insights about users for supposed research purposes years before anyone took notice. It never seemed to consider how its algorithms could warp people's perceptions when tuned to the wrong metrics, how well-funded political groups could run large-scale influence operations with the potential to shift democratic processes, or how Instagram filters might affect self-perception and teen mental health.
Of course, Meta is learning these lessons now, and implementing fixes and procedures to address each one. But in every case, those measures came in retrospect; Meta did not foresee these problems. It saw only new opportunities, as fast as Mark Zuckerberg's eternal optimism could propel the company into new territory and new paradigms of connection.
Meta no longer uses "move fast and break things" as its mantra. In 2014 it settled on "Move fast with stable infrastructure," and its stated mission of "bringing the world closer together" followed, before the wording morphed a few more times through 2018.
That may sound more thoughtful, but is Meta actually approaching things in a more considered way as a company? Or will we see the negative impacts of VR social play out in much the same way?
Again, Meta has learned its lessons and has come a long way. However, early problems with the metaverse, and the disbanding of the Responsible Innovation team, raise concerns that Meta will, as usual, put scale above safety, inspired by what could happen instead of thinking about who might get hurt in the process.
As will no doubt be on display at next month's Connect conference, Meta's metaverse is full of possibilities, offering a whole new way to participate in fully interactive, customizable environments where virtually anything is possible.
For better or worse.
Meta is very keen to highlight the good, but the flip side cannot be overlooked, as it has been many times in the past.
In this case, the impacts could be far worse, which is why it's important that questions continue to be raised about Meta's development process in this regard.