Microsoft have recently launched a new feature with its Co-Pilot AI software called “Microsoft Recall”. Recall can search through all of a user’s past activity, including files, photos, emails and browsing history. That in itself isn’t a major issue, as many devices can already do this; however, Recall also takes screenshots every few seconds and searches these too.
This means that everything a user does in their Microsoft account on a device gets captured and stored. Just think about everything you have accessed today. All of that would have been (or may have been, if you have it installed already) captured in the Recall system and storage area.
Microsoft have issued various statements outlining the benefits of the system, how it works to minimise security and privacy risks, and how it is continually being enhanced and secured.
The Information Commissioner’s Office (ICO) is reported by the BBC to be in contact with Microsoft for more information on the safety of the product. This must be a serious issue for the ICO; it cannot simply accept the usual platitudes about ‘not wanting to stop innovation’. This is something that presents too many threats for very little reward.
There are therefore some things that organisations need to be aware of and act upon urgently.
- These images will capture any personal and/or confidential data your users see. The whole point of them is to help the user get back to something they need but have lost or forgotten. It is therefore extremely likely, if not certain, that the Recall system will be capturing and storing Personal Data.
- The system will take duplicates of everything it sees, but clearly won’t (and can’t, as it’s just a screenshot) bring with it any of the controls placed on that data. So, no retention controls, security labels, metadata, access controls, etc – nothing.
- This means you’ll need to search it when you receive a Data Subject Access Request, manually delete from it where legally required to do so, and ensure it is protected; otherwise it is a goldmine of data for someone to steal.
- It is highly likely this will require a Data Protection Impact Assessment (DPIA), and a DPIA can be incredibly useful in framing what the threats are and whether they are worth the limited benefit such an app can bring.
- On the topic of security, a number of Cyber Security experts have also pointed out several security flaws and issues with the system. Microsoft seem to rebut many of these, but the researchers have presented some compelling evidence.
- The advice to Data Controllers (those responsible for the Personal Data they process) is to check whether Co-Pilot is installed on your systems and, if so, turn Recall off. Recall is on by default with Co-Pilot, so you’ll need to act at enterprise level to ensure it is turned off. Once it is off, seek advice from your Data Protection and IT Security teams on what issues this specifically presents for your organisation and network.
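As a starting point for the enterprise-level check above, the sketch below looks for the Windows policy that has been widely reported as disabling Recall snapshots (the `DisableAIDataAnalysis` value under `HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsAI`). The registry path and value name are assumptions based on public reporting, not official guidance reproduced here, so verify them against Microsoft’s current documentation before relying on this in your estate.

```python
# Sketch: check whether the reported "disable Recall" policy is set.
# ASSUMPTION: the policy lives at
#   HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsAI -> DisableAIDataAnalysis = 1
# Verify this against Microsoft's current documentation before use.
import sys

POLICY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"
POLICY_VALUE = "DisableAIDataAnalysis"

def recall_disabled_by_policy():
    """Return True if the disable policy is set to 1, False if it is
    absent or set otherwise, and None on non-Windows hosts where the
    check does not apply."""
    if sys.platform != "win32":
        return None
    import winreg  # Windows-only standard-library module
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, POLICY_VALUE)
            return value == 1
    except OSError:
        # Key or value not present: the policy has not been applied.
        return False

if __name__ == "__main__":
    print(recall_disabled_by_policy())
```

In practice you would push the equivalent Group Policy or Intune setting centrally rather than checking machines one by one; a script like this is only useful for auditing that the policy actually landed.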
Get in touch for more tailored advice and support via the details below.