Meta AI Glasses: Highly private scenes end up with subcontractors

When Google presented its AR headset “Google Glass” as a prototype in 2012, privacy advocates warned that the integrated camera could continuously record the wearer’s surroundings and transmit the footage to the manufacturer’s servers. Fourteen years later, a comparable device, the Meta Ray-Ban Display, is freely available – and the concerns voiced back then apparently remain valid: an investigative report by the Swedish newspaper “Svenska Dagbladet” shows that images and videos from the glasses end up on the screens of “data annotators”. Some of this content is extremely private in nature.

The journalists spoke to thirty employees of Sama, a company based in Kenya that evaluates video and audio material on Meta’s behalf. The data annotators’ work is essential for the reliability of Meta AI: by confirming or contradicting the AI’s assumptions, they help optimize the capabilities of the underlying model. Users, however, are unlikely to know which scenes are used for Meta AI’s further development. Employees report material that is clearly too private for AI training: occasionally they have to watch people undressing, going to the toilet, or being intimate with one another.

Impossible to turn off
In Sweden, the journalists put this to the test and tried to stop the transmission of data for training purposes. It proved impossible: the smartphone app that controls the Meta AR glasses let them cut off communication with the manufacturer’s servers, but then the AI no longer provided any answers. Anyone who wants to use the artificial intelligence must agree that data will “occasionally” be used for AI training. When this happens, and which recordings are affected, remains hidden. The Kenyan data annotators’ reports include recordings in which no AI interaction takes place at all.

Contradiction to GDPR?
Under the General Data Protection Regulation (GDPR), manufacturers are required to handle personal information in accordance with EU data protection rules. Meta explains that this does not mean the data collected this way has to remain in the EU; it merely has to be processed in accordance with EU rules. The company did not respond to the question of whether and how this is guaranteed for subcontractors in low-wage countries. Former Meta employees confirmed that intimate details are supposed to be excluded from AI training: people’s faces and private details such as credit card numbers are meant to be automatically obscured, and overly intimate interactions filtered out. But according to the ex-Meta employees, this automated filtering regularly makes mistakes.
