Microsoft’s new AI feature “Recall” for Windows 11, which takes screenshots of user activity, caused an uproar as soon as it was announced, with critics questioning Microsoft’s promises of encryption and data protection. Now a security researcher has uncovered serious security flaws in the feature and describes it as a “security disaster”.

Kevin Beaumont, a renowned cybersecurity expert, tested the feature and found that user data is stored unprotected in a SQLite database containing detailed records of all activity on the PC. Because the data sits in the user’s profile folder, it is easily accessible to malware running under that account: attackers could steal the data and use it to compromise systems further.
Beaumont warns that this could give cybercriminals immense power and lead to an AI-powered worst-case scenario.
Microsoft’s encryption promise called into question

Microsoft assures that the data is protected against remote access by being stored and encrypted locally on the device. Beaumont, however, questions this claim: he explains that the database is accessible even without administrator rights and that the data can be exfiltrated remotely via the files in AppData.
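To make the point concrete, a few lines of Python are enough to show why unencrypted, user-readable storage is problematic: any process running under the logged-in user account can open such a database without elevation. The path and schema below are placeholders, not the actual Recall layout, which Beaumont has not published.

```python
# Minimal sketch: a process running as the logged-in user can open an
# unencrypted SQLite database stored in the user profile - no administrator
# rights required. Path and table names are PLACEHOLDERS, not the real
# Recall schema.
import os
import sqlite3

db_path = os.path.expandvars(r"%LOCALAPPDATA%\ExampleRecallStore\activity.db")

conn = sqlite3.connect(db_path)
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
).fetchall()
print("Tables readable without elevation:", [name for (name,) in tables])
conn.close()
```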
To substantiate his concerns, Beaumont exfiltrated data from the Recall database and built a website that lets users upload a copy of the database and search its contents. He is withholding the technical details of the automated exfiltration of the Recall database for now, waiting for a response from Microsoft before publishing this information.
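Such a search site presumably works on the same principle: once a copy of the database has been exfiltrated, ordinary SQL is enough to mine it for sensitive strings. The table and column names in the sketch below are assumptions made purely for illustration; the real Recall schema is not public.

```python
# Hedged sketch of a keyword search over an exfiltrated copy of the database.
# Table and column names (captured_text, ocr_text, ...) are ASSUMPTIONS.
import sqlite3

def search_capture(db_file: str, keyword: str):
    with sqlite3.connect(db_file) as conn:
        return conn.execute(
            "SELECT timestamp, app_name, ocr_text FROM captured_text "
            "WHERE ocr_text LIKE ?",
            (f"%{keyword}%",),
        ).fetchall()

# Example: hunt for anything that looks like a credential in the stolen copy.
for ts, app, text in search_capture("stolen_copy.db", "password"):
    print(ts, app, text[:80])
```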
In view of the potential security problems, Beaumont recommends pausing the rollout of the feature for the time being. Microsoft has not yet responded to the security and data protection concerns. The Recall feature is, however, optional and can be disabled by users who feel uncomfortable with it. Regardless of the concerns, Copilot+ PCs with the AI-based Recall feature are due to launch on June 18.
While the “Recall” feature in Windows 11 offers certain benefits, it also poses significant security risks. Users will have to decide for themselves whether the potential benefits of the feature outweigh the privacy concerns. It is advisable to be aware of the risks and to disable the feature if necessary.
Source: doublepulsar