Below: Twitter reveals it has removed China-based operations seeking to influence US politics, while the FTC accuses an education technology company of having weak cybersecurity practices.
If the metaverse dream comes true, regulators will face a whole new set of privacy concerns
Recently, Meta CEO Mark Zuckerberg painted a rosy picture of the future of his company’s big bet on transforming human communication through an immersive virtual world known as the metaverse. Responding to last quarter’s dismal results, Zuckerberg told investors that the company’s new $1,500 virtual-reality-powered Quest Pro headset would help employees work better than they can with regular computers.
“I think our work is historically significant, laying the foundation not only for an entirely new way of interacting with each other and integrating technology into our lives, but also for the long-term foundation of our business,” he told investors.
What Zuckerberg didn’t say is that policy watchers and industry representatives are already grappling with the thorny ethical and regulatory questions that would arise if a service like the Quest Pro becomes popular.
One of the most vexing problems facing Meta and others is what to do with the detailed information these devices collect about users and their interactions in immersive virtual spaces. The Quest Pro improves on previous iterations of VR headsets by tracking the wearer’s eyes and facial muscles, allowing users to express emotions through a virtual avatar.
Meta says the face and gaze tracking features are completely optional and turned off by default, and that images captured by the cameras are processed on the device and then discarded. But as my colleague Geoffrey A. Fowler reported, Meta will still convert users’ facial reactions into data and send it to app makers that have asked for permission.
The XR Association, whose members include Meta, Microsoft and Google, says manufacturers should build privacy controls into their devices and make the public aware of how their information is being used. Companies can control where they process the data they collect and can obscure images of bystanders standing near people wearing AR- or VR-equipped glasses or headsets, said XRA CEO Elizabeth Hyman.
“Our fundamental approach to this is privacy by design: make sure the consumer understands what the technology does,” Hyman said, “and give that consumer or user control” over how that data is used.
However, Samir Jain, policy director at the Center for Democracy and Technology, argues that the traditional model, in which technology companies inform users about how they use the information they collect and give them the option to opt out, may not work in virtual reality. For one thing, the data these devices can collect is far more intimate than the information gathered through text- and video-based social media services, Jain said.
“That model becomes especially difficult when we’re talking about things like heart rate and pupil dilation, which may be unconscious and may reveal emotions,” he said. “It can reveal inner feelings that a person is otherwise unable to express or completely unaware of.”
Aside from several legislative efforts to study virtual and augmented reality, the issue has not risen to the top of Congress’s legislative agenda. But both Jain and Hyman acknowledged that regulators would eventually have to step in if Zuckerberg manages to make the metaverse dream a reality.
Twitter cracks down on China-based operations trying to sway US politics
Three China-based operations sought to covertly influence U.S. politics in the run-up to the midterm elections by amplifying polarizing topics, Jeremy B. Merrill, Joseph Menn and I report. The operations involved about 2,000 accounts, some of which claimed to be in the United States, that piled on hot-button topics such as allegations that the 2020 election was rigged and criticism of the transgender community.
“The takedown of the networks, which operated primarily from April to October, comes at a stormy time for the social media giant, which was preparing for its sale to billionaire Elon Musk and facing continued scrutiny over how it cracks down on disinformation ahead of next week’s midterm elections, with political control of Congress at stake,” my colleagues and I wrote. Twitter did not respond to a request for further comment.
FTC pursues education technology firm Chegg for “negligent security.”
The Federal Trade Commission has accused Chegg, a prominent provider of educational software, of lax cybersecurity practices that led to data breaches exposing the personal information of tens of millions of users, Natasha Singer reports. Chegg has agreed to implement a comprehensive data security program to resolve the claims, the FTC said.
“The FTC’s enforcement action against Chegg amounts to a wake-up call for the U.S. education technology industry,” Singer wrote.
The action comes months after the FTC unanimously warned education technology companies against illegally monitoring students and maintaining weak cybersecurity programs. A May investigation by Human Rights Watch found that many educational tools are designed to send data to advertising companies, and that few tell parents how their children’s data is used.
US politicians’ use of TikTok calls into question the app’s readiness for misinformation
The growing presence of politicians on the app suggests it could play a bigger role in future elections. And social media and national security experts are concerned that the app isn’t as equipped to spot disinformation as other social networks, Cat Zakrzewski, Naomi Nix and Taylor Lorenz report.
The findings come from a report by the Alliance for Securing Democracy, a U.S. nonprofit organization that investigates foreign efforts to interfere with democratic institutions, my colleagues wrote.
- Democrats are more likely to embrace TikTok, according to the report, with 34 percent of candidates for Senate, House, governor and secretary of state having TikTok accounts.
- Politicians are still learning how best to use the app, according to a review of posts on those accounts, my colleagues wrote. “Some encourage young people to vote.”
TikTok announced new policies and initiatives ahead of the midterm elections, including adding labels to political content and directing users to its election center. “We take our responsibility to protect the integrity of our platform very seriously,” said spokesperson Ben Rathe. “We continue to invest in our policy, safety and security teams to combat election misinformation, and we verify the accounts of U.S. politicians.”