For those who aren’t aware, Microsoft have decided to bake essentially an infostealer into the base Windows OS and enable it by default.
From the Microsoft FAQ: “Note that Recall does not perform content moderation. It will not hide information such as passwords or financial account numbers."
The info is stored locally - but rather than malware like Redline having to steal your local browser password vault, an attacker can now just steal the last three months of everything you’ve typed and viewed, in one database.
Recall uses a bunch of services under the CAP (Core AI Platform) umbrella, enabled by default.
It constantly writes screenshots (the product brands them “snapshots”, but they’re hooked screenshots) into the current user’s AppData as image storage.
The NPU processes them and extracts the text into a database file.
The database is SQLite, and you can access it as the logged-in user, including programmatically. It 100% does not need physical access and can be stolen remotely.
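To illustrate what “programmatically, as the user” means in practice, here’s a minimal sketch of reading OCR’d text out of a Recall-style SQLite database. The table and column names (`captures`, `timestamp`, `window_title`, `ocr_text`) are assumptions for illustration - the real Recall schema isn’t publicly documented - and the demo runs against a mock file standing in for the real one under AppData:

```python
import sqlite3

# Hypothetical schema: table/column names are illustrative assumptions,
# not Microsoft's actual Recall schema.
def dump_captured_text(db_path):
    """Return every (timestamp, window_title, ocr_text) row."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT timestamp, window_title, ocr_text FROM captures "
        "ORDER BY timestamp"
    ).fetchall()
    con.close()
    return rows

# Demo against a mock database standing in for the real file, which
# lives under the user's AppData and is readable as that user.
con = sqlite3.connect("mock_recall.db")
con.execute("CREATE TABLE IF NOT EXISTS captures "
            "(timestamp INTEGER, window_title TEXT, ocr_text TEXT)")
con.execute("DELETE FROM captures")
con.execute("INSERT INTO captures VALUES "
            "(1717000000, 'Online Banking', 'Balance: $5,230.10')")
con.commit()
con.close()

for ts, title, text in dump_captured_text("mock_recall.db"):
    print(ts, title, text)
```

No privilege escalation, no exploit - just a file read and a SELECT as the logged-in user.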
And if you didn’t believe me.. found this on TikTok.
There’s an MSFT employee in the background saying “I don’t know if the team is going to be very happy…”
They should probably be transparent about it, rather than telling BBC News you’d need to be physically at the PC to hack it (not true). Just a thought.
I wonder whether Microsoft’s engineers are following the SQLite Code of Ethics, since they’re using it in the Windows OS with Copilot+ Recall? :D https://sqlite.org/codeofethics.html
So the code underpinning Copilot+ Recall includes a whole bunch of Azure AI backend code, which has ended up in the Windows OS. It also has a ton of API hooks for user activity monitoring.
Apps themselves can also search Recall, and can make their own content more searchable within it.
It opens a lot of attack surface.
The semantic search element is fun.
They really went all in with this and it will have profound negative implications for the safety of people who use Microsoft Windows.
Say you deal with a sensitive matter on your Windows PC - e.g. an email you then delete. Does Copilot+ Recall still store the deleted email?
Answer: yes. There's no feature to delete screenshots of things you delete while using your PC. You would have to remember to go and purge screenshots that Recall makes every few seconds.
If you or a friend use disappearing messages in WhatsApp, Signal etc, it is recorded regardless.
It comes up a lot as people are rightly confused, but if you wonder what problem Microsoft are trying to solve with Recall:
It isn't them being evil - it's middle-aged business leaders who can't remember what they're doing driving the decisions about which problems to solve.
A huge amount of business leaders are dudes who have no idea what the fuck is happening. This leads to the Recall feature.
Managed to find out how BBC News came to print in a headline story that it was not possible to steal Recall data without being physically at the device (which is false) - this is from the journalist:
Just to clarify, I can access it without SYSTEM too. Microsoft are about to set cybersecurity back a decade by empowering cyber criminals via poor AI safety. Feature ships in a few weeks.
The latest Risky Business episode on Recall is good, but one small correction - it doesn’t need SYSTEM rights.
Here’s a video of two MSFT employees gaining access to the Recall database folder - with the SQLite database right there. Watch their hacking skills. (You don’t need to go to these lengths as an attacker, either). Cc @riskybusiness
I’m not being hyperbolic when I say this is the dumbest cybersecurity move in a decade. Good luck to my parents safely using their PC.
This is the out-of-box experience for Windows 11’s new Recall feature on Copilot+ PCs. It’s enabled by default during setup and you can’t disable it directly here. There is only an option to tick “open Settings after setup completes so I can manage my Recall preferences” instead.
You allow BYOD so people can pick up webmail and such. It’s okay, because when they leave you revoke their access, and your MDM removes all business data from the machine ✅
What the employee does: opens Recall, searches their email, files etc and pastes the data elsewhere.
Nothing is removed from Recall, as it is a photographic memory of everything the former employee did.
A recent DHS-published report handed to the US President said it had "identified a series of Microsoft operational and strategic decisions that collectively pointed to a corporate culture that deprioritized enterprise security investments and rigorous risk management".
Microsoft: let’s use AI to screenshot everything users do every 5 seconds, OCR the screenshots, make it searchable and store it in AppData!
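The “make it searchable” step of that pipeline is, at its simplest, just full-text indexing the OCR output. A minimal sketch of that idea using SQLite’s FTS5 module - purely illustrative, since the real Recall schema and search implementation aren’t public:

```python
import sqlite3

# Illustrative sketch of "OCR text -> searchable index" with SQLite's
# FTS5 full-text index; not Microsoft's actual implementation.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE snapshots USING fts5(window_title, ocr_text)")
con.executemany(
    "INSERT INTO snapshots VALUES (?, ?)",
    [
        ("Signal", "disappearing message: see you at 6pm"),
        ("Outlook", "your password reset code is 901442"),
    ],
)
# A keyword search instantly surfaces anything that was ever on screen:
hits = con.execute(
    "SELECT window_title FROM snapshots WHERE snapshots MATCH 'password'"
).fetchall()
print(hits)
```

Which is exactly why it makes such a convenient one-stop shop for an attacker: one query over everything the user has ever looked at.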
I went and looked on YouTube for Recall coverage, to get out of the echo chamber, and I can only find one positive video. Even the people at the launch event are slating it, including people with media-provided Copilot+ PCs.
There are some content creators who’ve realised it records their credit cards, so they’re making videos of their card details going walkies.
It’s going to be interesting to see how Microsoft get out of this one. They may have contractual commitments to ship Recall with external parties.
I thought they were risking crashing the Copilot brand with this one, but looking at the videos and the comments on them, I was wrong - I think they’re crashing the Windows consumer brand.
The reaction to a photographic memory of what people do at home has - you’ll be surprised to know - not been that it’s a reason to buy a device, but a reason not to.
The funding isn't restricted to tech companies. In 2018, anti-democracy donors suddenly decided AI was the next big thing. Recall's snapshots are a data-gathering tool for CoPilot AI.
Noted GOP megadonor to Trump, Stephen Schwarzman, funded MIT's new AI faculty in 2018.
Sure, and it's a coincidence that the WaPo ousted its editor in favor of someone who wants Tucker Carlson op-eds.
I suppose you believe it was pure incompetence that drove Musk's management of Twitter into the shitter.
Rupert Murdoch marries his ruZZian handler, nothing to see here.
But Copilot's creation has nothing to do with the billions of autocratic petro-dollars being pumped into Microsoft. You're not trying nearly hard enough to stick your head in the sand.
Microsoft has been declining to comment on criticism of Recall for a week - but they have apparently told a journalist at Future, off the record, that changes will be made before Copilot+ devices drop in the coming days.
This may include an attempt to invalidate researcher criticism, we’ll see.
I hadn’t been aware until today of the external reaction to Recall. Holy shit. Tim Apple must be pleased.
Everything from media coverage to YouTube to TikTok is largely negative. All the comments are negative.
These videos have tens of millions of views and hundreds of thousands of comments.
I knew it would be bad but.. it’s worse. I’ve spent hours looking at the sentiment and.. well, they probably would have got better coverage from launching an NFT of pregnant Clippy.
@GossiTheDog It depends on the definition of the word "access", I guess. Microsoft probably meant that, as a user X, you can't "recall" what user Y saw. The SQL database is per-user. But if one user can access (read) another user's files (e.g., by having Admin rights), he can access that other user's SQL database too.
Three Copilot+ Recall questions that keep coming up.
Q. Can you alter the Recall history?
A. Yes. You can change the OCR database and the screenshots as the logged-in user, or via software running as the local user. There is no audit log of changes.
Q. Are they snapshots, as Microsoft says, or screenshots?
A. They are just screenshots, jpegs.
Q. What is to stop apps on your machine accessing your Recall covertly?
A. Nothing. There is no audit log of access.
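The answers to the first and third questions combine badly: because the database is plain SQLite owned by the logged-in user, with no audit log, any process running as that user can silently rewrite history. A sketch, again using hypothetical table/column names and a mock database:

```python
import sqlite3

# Any code running as the logged-in user can edit the history in place;
# with no audit log, nothing records that it happened. Schema names are
# illustrative assumptions, as before.
def rewrite_history(db_path, old, new):
    con = sqlite3.connect(db_path)
    con.execute("UPDATE captures SET ocr_text = REPLACE(ocr_text, ?, ?)",
                (old, new))
    con.commit()
    con.close()

# Demo on a mock database standing in for the real file:
con = sqlite3.connect("mock_recall_tamper.db")
con.execute("CREATE TABLE IF NOT EXISTS captures (ocr_text TEXT)")
con.execute("DELETE FROM captures")
con.execute("INSERT INTO captures VALUES ('Transferred $500 to Alice')")
con.commit()
con.close()

rewrite_history("mock_recall_tamper.db", "Alice", "Mallory")
con = sqlite3.connect("mock_recall_tamper.db")
print(con.execute("SELECT ocr_text FROM captures").fetchone()[0])
con.close()
```

The record now says the money went to Mallory, and nothing on the machine says otherwise - which is why a photographic memory you can silently edit is worse than no memory at all.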
@Jestbill @gangrif @GossiTheDog @hacks4pancakes Yes, if they have remote access as the user. But that's not really relevant here - I was pointing out that you can easily delete all the accumulated data by turning the feature off - no need for a special program.
If anybody is wondering what Microsoft's reaction to the Copilot+ Recall concerns is: they're continuing to decline comment to every media outlet.
I've seen the talking points MS staff have been given for enterprise customers, and they're nonsense handwaving.
Microsoft are making significant changes to Recall, including making it specifically opt in, requiring Windows Hello face scanning to activate and use it, and actually encrypting the database.
There are obviously going to be devils in the details - potentially big ones.
Microsoft needs to commit to not trying to sneak users into enabling it in the future, and it needs to be off by default in Group Policy and Intune for enterprise orgs.
Obviously, I recommend you do not enable Recall, and you tell your family not to enable it too.
It’s still labelled Preview, and I’ll believe it is encrypted when I see it.
There are obviously serious governance and security failures at Microsoft in how this played out that need to be investigated, and it suggests they are not serious about AI safety.
I should be transparent btw that I took Satya and Charlie’s commitment to security at face value too - I even published a blog on it backing that up - and I have concerns (it isn’t just me).
They’re now going to have to win trust back about winning trust back.
A reminder that a few weeks ago at RSA, Microsoft signed CISA's Secure By Design pledge... and then shipped an enabled by design keylogger that OCRs your screen constantly into AppData.
Edit: I should say that's less a reflection on Microsoft and more a reflection on CISA's Secure By Design pledge.. it's a good idea, but the scope is extremely limited.
I think MS are a way off extracting themselves from the Recall situation they've got themselves into.
This is just one YouTube comment section on a video posted since the not-enabled-by-default change - 500k views - but there are loads more, with similar on TikTok.
I imagine it's going to continue through the week and into next week when the laptops ship.
I have heard rumblings MS are discussing trying to take action against me over the whole thing, which a) good luck and b) would be pouring petrol on the flames.
I'm hearing that various MSFT people are furious about how this played out over the past few weeks, which IMHO represents a serious lack of introspection.
If anybody is wondering, with a Copilot+ PC, you can still programmatically access the Recall database as of today with a few commands. Launch is a few days away.
In this bit he talks about Recall (not by name), where he pats himself and Microsoft on the back for “a feature change” and a job well done.
Given it has been a complete cybersecurity and privacy car crash - and as of today the changes (plural) they’re referring to haven’t even been implemented - it seems like Microsoft fails to grasp customer needs: safety.
One other thing - Microsoft's written testimony to the US House says, quoting, bolded by MS:
"Before I say anything else, I think it’s especially important for me to say that Microsoft accepts responsibility for each and every one of the issues cited in the CSRB’s report. Without equivocation or hesitation. And without any sense of defensiveness."
I should say that if Brad is asked about Recall tomorrow, the answers may raise some.. uh... eyebrows here.
I don't know what MS SLT have been told, but expect fun when the feature drops on consumer laptops in a few days.
As I mentioned in my blog, there is some more security hardening there on Copilot+ PCs (this was before MS put out their blog)... but it's still easily bypassable.
There are eight passages describing various ways in which the Biden administration is focusing on security, working to improve computing security, and how they "deserve immense credit" for their public focus on cybersecurity.
And one, at the end, criticising their "relative silence" on Microsoft Recall - relative to other corporate "flawed strategies and harmful practices" that they did call out.
Microsoft President Brad Smith just testified to the US House that Recall is a good example of Secure By Design, and that they have the time to get it right (it’s supposed to launch in 3 working days).
Brad Smith just said Recall was designed to be disabled by default. That is not true. Microsoft’s own documentation said it would be enabled by default - they only backtracked after outcry.
He has somehow got almost every detail about Recall wrong while testifying.
I've been back and rewatched the Recall footage from the US House hearing and I just don't get it - Brad Smith, representing Microsoft, basically did this about Recall's security.. and he had no challenge from the committee members, as they didn't know any details.
I’m being told Microsoft are prepping to fully recall Recall. Another announcement is being prepped for tomorrow afternoon saying the feature will not ship on Copilot+ devices at launch as it is not secure.
Obviously, I’ll wait to see the announcement but it sounds like they’ve finally realised they need to take the time and get the feature right (and frankly consider the target audience - most home users, it ain’t).
They should have announced this before or during the US House hearing.
@GossiTheDog How much space does the whole thing waste? I get it that the SQL database with the texts is small - but what about the screenshots? If they are taken every 5 seconds, they probably waste an enormous amount of space...
@bontchev it uses at least 50GB of space. On a 512GB SSD the default storage allocation is 75GB, and with a 1TB SSD the default allocation is 150GB.
@bontchev it's quite funny as it scales up too - so say with a 1TB drive, it's on by default and allocates 150GB - which is enough space for well over 6 months of screenshots. So, like, a lot of machines are going to have a lot of history.
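To make the scaling concrete, here's the back-of-envelope arithmetic from those figures. The ~25GB/month burn rate below is an assumption I've derived from "150GB holds roughly 6+ months", not a documented number:

```python
# Default allocations quoted in this thread: 512GB SSD -> 75GB,
# 1TB SSD -> 150GB. The per-month rate is an assumption backed out
# of the "well over 6 months in 150GB" figure, so treat the results
# as a lower-bound estimate.
DEFAULT_ALLOCATION_GB = {512: 75, 1024: 150}

def months_of_history(drive_gb, gb_per_month=25):
    """Rough months of screenshot history the default allocation holds."""
    return DEFAULT_ALLOCATION_GB[drive_gb] / gb_per_month

print(months_of_history(512))   # -> 3.0
print(months_of_history(1024))  # -> 6.0
```

So even the smaller default allocation is months of a user's screen history sitting on disk.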
@GossiTheDog Is there any information how exactly the data will be stored? If it is encrypted with a key stored in the TPM module, extracting it might not be trivial.
@bontchev it isn’t encrypted on most Windows 11 editions. Per Microsoft’s FAQ, it’s only encrypted on “Windows 11 Pro or an enterprise Windows 11 SKU”, and only if BitLocker is enabled. It’s just standard BitLocker full-disk encryption, which is transparent to running processes - so if malware is running as the local user, the data is accessible.