For those who aren’t aware, Microsoft have decided to bake what is essentially an infostealer into the base Windows OS, enabled by default.
From the Microsoft FAQ: “Note that Recall does not perform content moderation. It will not hide information such as passwords or financial account numbers."
Info is stored locally - but rather than something like Redline stealing your local browser password vault, an attacker can now just steal the last three months of everything you’ve typed and viewed in one database.
Recall uses a bunch of services themed CAP - Core AI Platform - all enabled by default.
It spits constant screenshots (the product brands them “snapshots”, but they’re hooked screenshots) into the current user’s AppData as part of image storage.
The NPU processes them, extracting the text into a database file.
The database is SQLite, and you can access it as the user, including programmatically. Stealing it 100% does not require physical access.
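To illustrate the point about programmatic access: any code running as the logged-in user can query the database with ordinary SQLite tooling. The table and column names below are my assumptions for illustration, not Microsoft's documented schema, and the sketch builds a stand-in database in a temp directory so the query logic is runnable anywhere:

```python
import os
import sqlite3
import tempfile

# Hypothetical sketch: schema names here are assumed, not Microsoft's.
# On a real Copilot+ PC the database sits under the current user's
# AppData, readable with ordinary user-level file permissions.
# We build a stand-in database so the access pattern is demonstrable.
db_path = os.path.join(tempfile.mkdtemp(), "recall-demo.db")

conn = sqlite3.connect(db_path)
conn.execute(
    "CREATE TABLE WindowCaptureTextIndex_content "
    "(id INTEGER PRIMARY KEY, c1 TEXT, c2 TEXT)"  # c1: window title, c2: OCR text
)
conn.execute(
    "INSERT INTO WindowCaptureTextIndex_content (c1, c2) VALUES (?, ?)",
    ("Outlook - Inbox", "password: hunter2"),
)
conn.commit()

# Any process running as the logged-in user can search it like this -
# no SYSTEM rights, no physical access, just a SELECT.
rows = conn.execute(
    "SELECT c1, c2 FROM WindowCaptureTextIndex_content WHERE c2 LIKE ?",
    ("%password%",),
).fetchall()
for title, text in rows:
    print(f"{title}: {text}")
conn.close()
```

That's the whole "attack": open a file the user already has read access to and run a keyword search.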
And if you didn’t believe me… I found this on TikTok.
There’s an MSFT employee in the background saying “I don’t know if the team is going to be very happy…”
They should probably be transparent about it, rather than telling BBC News you’d need to be physically at the PC to hack it (not true). Just a thought.
I wonder whether Microsoft's engineers are following the SQLite code of ethics, since they're using it in Windows OS with Copilot+ Recall? :D https://sqlite.org/codeofethics.html
So the code underpinning Copilot+ Recall includes a whole bunch of Azure AI backend code, which has ended up in the Windows OS. It also has a ton of API hooks for user activity monitoring.
Apps themselves can also search and make themselves more searchable.
It opens a lot of attack surface.
The semantic search element is fun.
They really went all in with this and it will have profound negative implications for the safety of people who use Microsoft Windows.
Say you deal with a sensitive matter on your Windows PC - e.g. an email you delete. Does Copilot Recall still store the deleted email?
Answer: yes. There's no feature to delete screenshots of things you delete while using your PC. You would have to remember to go and purge screenshots that Recall makes every few seconds.
If you or a friend use disappearing messages in WhatsApp, Signal etc, it is recorded regardless.
It comes up a lot as people are rightly confused, but if you wonder what problem Microsoft are trying to solve with Recall:
It isn't them being evil - it's middle-aged business leaders who can't remember what they're doing driving decisions about which problems to solve.
A huge amount of business leaders are dudes who have no idea what the fuck is happening. This leads to the Recall feature.
I managed to find out how BBC News came to print, in a headline story, that it was not possible to steal Recall data without being physically at the device (which is false) - this is from the journalist:
Just to clarify, I can access it without SYSTEM too. Microsoft are about to set cybersecurity back a decade by empowering cyber criminals via poor AI safety. Feature ships in a few weeks.
The latest Risky Business episode on Recall is good, but one small correction - it doesn’t need SYSTEM rights.
Here’s a video of two MSFT employees gaining access to the Recall database folder - with the SQLite database right there. Watch their hacking skills. (You don’t need to go to these lengths as an attacker, either.) Cc @riskybusiness
I’m not being hyperbolic when I say this is the dumbest cybersecurity move in a decade. Good luck to my parents safely using their PC.
This is the out-of-box experience for Windows 11's new Recall feature on Copilot+ PCs. It's enabled by default during setup and you can't disable it directly here. There is an option to tick "open Settings after setup completes so I can manage my Recall preferences" instead.
You allow BYOD so people can pick up webmail and such. It’s okay, because when they leave you revoke their access, and your MDM removes all business data from the machine ✅
What the employee does: opens Recall, searches their email, files etc and pastes the data elsewhere.
Nothing is removed from Recall, as it is a photographic memory of everything the former employee did.
A recent DHS-published report handed to the US President said it had "identified a series of Microsoft operational and strategic decisions that collectively pointed to a corporate culture that deprioritized enterprise security investments and rigorous risk management".
Microsoft: let’s use AI to screenshot everything users do every 5 seconds, OCR the screenshots, make it searchable and store it in AppData!
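That one-liner really is the whole pipeline. Here's a minimal sketch of it, with the capture and OCR steps stubbed out (on real hardware they'd be a screen grab every few seconds and NPU-accelerated text extraction) and an assumed schema, since Microsoft hasn't documented one:

```python
import sqlite3

def store_snapshot(conn, screenshot_bytes, ocr_text):
    # One cycle of the assumed Recall pipeline: persist the screenshot
    # and its extracted text so both become searchable later.
    conn.execute(
        "INSERT INTO snapshots (image, text) VALUES (?, ?)",
        (screenshot_bytes, ocr_text),
    )
    conn.commit()

# Stubs standing in for the capture and OCR stages.
def fake_capture():
    return b"\xff\xd8..."  # stand-in JPEG bytes

def fake_ocr(image_bytes):
    return "Visa 4111 1111 1111 1111"  # whatever happened to be on screen

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE snapshots (id INTEGER PRIMARY KEY, image BLOB, text TEXT)"
)
store_snapshot(conn, fake_capture(), fake_ocr(None))

# The searchable index now contains the card number verbatim -
# this is exactly why "no content moderation" matters.
hits = conn.execute(
    "SELECT text FROM snapshots WHERE text LIKE '%4111%'"
).fetchall()
print(hits)
```

Once on-screen data has been through that loop, it's plain text in a user-readable database, regardless of how the originating app protected it.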
I went and looked at YouTube for Recall to get out of the echo chamber and I can only find one positive video. Even the people at the event are slating it, including people with media provided Copilot+ PCs.
There’s some content creators who’ve realised it records their credit cards, so they’re making videos of their cards going walkies.
It’s going to be interesting to see how Microsoft get out of this one. They may have contractual commitments to ship Recall with external parties.
I thought they were risking crashing the Copilot brand with this one, but I was wrong looking at the videos and comments on them - I think they’re crashing the Windows consumer brand.
The reaction to a photographic memory of what people do at home has - you’ll be surprised to know - not been seen as a reason to buy a device, but a reason not to.
Microsoft has been declining to comment on criticism of Recall for a week - but they have apparently told a journalist at Future, off the record, that changes will be made before Copilot+ devices drop in the coming days.
This may include an attempt to invalidate researcher criticism, we’ll see.
I hadn’t been aware until today of the external reaction to Recall. Holy shit. Tim Apple must be pleased.
Everything from media coverage to YouTube to TikTok is largely negative. All the comments are negative.
These videos have tens of millions of views and hundreds of thousands of comments.
I knew it would be bad but.. it’s worse. I’ve spent hours looking at the sentiment and.. well, they probably would have got better coverage from launching an NFT of pregnant Clippy.
Three Copilot+ Recall questions that keep coming up.
Q. Can you alter the Recall history?
A. Yes. You can edit the OCR database and alter the screenshots, either as the logged-in user or as software running as that user. There is no audit log of changes.
Q. Are they snapshots, as Microsoft says, or screenshots?
A. They are just screenshots - JPEGs.
Q. What is to stop apps on your machine accessing your Recall covertly?
A. Nothing. There is no audit log of access.
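The "no audit log" answers are worth making concrete. A sketch below, again using an assumed schema on an in-memory database: any code running as the logged-in user can silently rewrite the OCR history, and nothing anywhere records that the change happened:

```python
import sqlite3

# Illustrative only: the schema is assumed. The point is that user-level
# code can rewrite Recall's history, and no audit trail exists.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ocr_text (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO ocr_text (body) VALUES ('original evidence')")
conn.commit()

# Silently rewrite history - there is no audit table to log this UPDATE,
# so the tampered row is indistinguishable from a genuine capture.
conn.execute("UPDATE ocr_text SET body = 'nothing to see here' WHERE id = 1")
conn.commit()

tampered = conn.execute(
    "SELECT body FROM ocr_text WHERE id = 1"
).fetchone()[0]
print(tampered)
```

The same holds for reads: a covert SELECT by a malicious app leaves no trace either.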
If anybody is wondering what Microsoft's reaction to the Copilot+ Recall concerns is: they're continuing to decline comment to every media outlet.
I've seen the talking points MS staff have been given for enterprise customers, and they're nonsense handwaving.
Microsoft are making significant changes to Recall, including making it explicitly opt-in, requiring Windows Hello face scanning to activate and use it, and actually encrypting the database.
There are obviously going to be devils in the details - potentially big ones.
Microsoft needs to commit to not trying to sneak users into enabling it in the future, and it needs to be off by default in Group Policy and Intune for enterprise orgs.