Microsoft has announced major changes to its recently unveiled AI-powered Recall feature, part of the new line of Copilot+ PCs, in response to blistering criticism from security researchers about potential privacy risks. The company said it would make the feature opt-in, require biometric authentication to access stored data, and add further layers of encryption.
Introduced last month, Recall was touted as a groundbreaking capability that would automatically capture screenshots as users worked, enabling them to search their computing history using natural language queries. But security experts quickly raised red flags, warning that the feature’s vast data collection and lack of robust protections created serious privacy and security vulnerabilities.
In a blog post, Pavan Davuluri, Microsoft’s Corporate Vice President for Windows + Devices, acknowledged the “clear signal” from critics that the company needed to strengthen safeguards and make it easier for users to choose whether to enable Recall. The changes, which will be implemented before the feature’s public release on June 18, include:
- Making Recall opt-in during PC setup, with the feature turned off by default
- Requiring Windows Hello biometric enrollment and “proof of presence” to view the Recall timeline and search its contents
- Adding “just in time” decryption of the Recall database protected by Windows Hello Enhanced Sign-in Security (ESS)
- Encrypting the search index database
The additional encryption is particularly notable: it should make it significantly harder for attackers or unauthorized users to read the potentially sensitive data Recall captures, even if they gain access to the database. Stored screenshots will now be double encrypted and decryptable only with the authenticated user’s biometrics on their enrolled device.
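To make the “just in time” approach concrete, here is a minimal, purely illustrative sketch in Python of how screenshot data might be kept encrypted at rest and decrypted only after a successful proof-of-presence check. This is not Microsoft’s implementation: the Fernet cipher stands in for whatever encryption Recall actually uses, and verify_user_presence() is a hypothetical placeholder for Windows Hello Enhanced Sign-in Security (ESS).

```python
# Illustrative only: encrypted-at-rest snapshot store with "just in time"
# decryption gated on an authentication check. Fernet stands in for whatever
# cipher Recall actually uses; verify_user_presence() is a hypothetical
# placeholder for Windows Hello Enhanced Sign-in Security (ESS).
from cryptography.fernet import Fernet


class SnapshotStore:
    def __init__(self) -> None:
        # A real system would keep this key in hardware-backed storage,
        # not generate it in application code.
        self._cipher = Fernet(Fernet.generate_key())
        self._snapshots: list[bytes] = []  # only ciphertext is retained

    def add_snapshot(self, image_bytes: bytes) -> None:
        """Encrypt a captured screenshot before it is stored."""
        self._snapshots.append(self._cipher.encrypt(image_bytes))

    def read_snapshots(self, presence_verified: bool) -> list[bytes]:
        """Decrypt snapshots only after proof of presence succeeds."""
        if not presence_verified:
            raise PermissionError("biometric proof of presence required")
        return [self._cipher.decrypt(token) for token in self._snapshots]


def verify_user_presence() -> bool:
    # Hypothetical stand-in: a real implementation would call into the
    # platform's biometric authentication (e.g. Windows Hello ESS).
    return True


if __name__ == "__main__":
    store = SnapshotStore()
    store.add_snapshot(b"example screenshot bytes")
    images = store.read_snapshots(presence_verified=verify_user_presence())
    print(f"decrypted {len(images)} snapshot(s)")
```

The point of the pattern is simply that plaintext never sits on disk and is reconstructed in memory only after the user authenticates.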
Critics, including notable cybersecurity firms and privacy advocates, argued that the persistent storage and processing of screen captures could become a target for malicious actors. The outcry reached a peak when an investigative report by the BBC highlighted vulnerabilities that could potentially be exploited to access sensitive information without adequate user consent.
Responding to the criticism, Microsoft published a post on its Windows Experience Blog detailing the decision to make Recall an opt-in feature during its preview phase. “Privacy and security are paramount,” the post stated, emphasizing that the company is taking steps to reassess the feature’s impact on user privacy.
The future of Recall: Balancing innovation with user trust
The decision to make the feature opt-in has been met with mixed reactions. Some observers commend Microsoft for taking swift action in response to the feedback. “Turns out speaking out works,” said Kevin Beaumont, a cybersecurity researcher, in a post on X.com. “Microsoft are making significant changes to Recall, including making it specifically opt in, requiring Windows Hello face scanning to activate and use it, and actually trying to encrypt the database they say.”
Others, meanwhile, question whether the feature has any supporters at all. “In all seriousness, I’ve seen zero positivity about Recall (the Windows feature which takes screenshots every 5 seconds), which leads me to believe no-one thinks this is a good feature,” said Dr Owain Kenway in a post on X.com. “But is there a secret undercurrent of pro-Recall users embarrassed into silence?”
Microsoft has committed to a thorough review and revision of Recall’s security measures. According to the company, it plans to conduct extensive testing with users who opt into the preview once that review is complete, gathering more data to refine the feature’s security framework.
This incident underscores the delicate balance tech companies must maintain between innovating with cutting-edge AI technologies and ensuring the privacy and security of their users. It also highlights the growing role of public and expert scrutiny in shaping the development and deployment of new technologies in the digital age. As Microsoft navigates these challenges, the tech community and its users will undoubtedly keep a close watch on how Recall evolves and how it might set precedents for future AI integrations in consumer technology.