The ICO is encouraging developers to consider privacy at an early stage when implementing new technologies and, in its first annual Tech Horizons report, examines the privacy implications of some of the most significant technological developments of the next two to five years. The report follows the commitment made in the new ICO25 strategy for the ICO to set out its views on emerging technologies, with the aim of reducing the burden on businesses, supporting innovation and preventing harm.
The Technologies
The ICO report focuses on applied technologies that may impact privacy in the future, rather than on the underlying technologies (such as AI or 5G) on which they are built. The four applied technologies identified are:
- Consumer health technology: this includes wearable devices and software applications that help people assess their health and well-being (including electronic components in smart fabrics). The report distinguishes it from digital medical or therapeutic devices (DTx), which are not covered, noting that products designated as medical devices may be subject to further research and regulatory oversight.
- Internet of Things (IoT): these are physical objects that connect and share information and can also sense, respond to and interact with the external environment. While IoT is not new, it is evolving: edge computing and better hardware, software and interoperability will all enable the next generation of IoT to respond to people’s needs in real time.
- Immersive technology: while this covers a wide range of applications (collectively known as ‘extended reality’, or ‘XR’), the report focuses on augmented and virtual reality. This includes the hardware (headsets, etc.) that creates an immersive experience for the user.
- Decentralized finance (DeFi): DeFi refers to a financial system that removes centralized intermediaries from transactions and financial products/services. The report looks at software that uses blockchain technology to support peer-to-peer financial transactions and notes that personal information is often included in public transaction records.
It is worth noting, however, that when selecting the technologies with the greatest impact on privacy, the ICO found neurotechnology to be the ‘most important’. This fifth area will be the subject of a separate ICO ‘deep dive’ report in spring 2023 (along the lines of its other deep dive reports).
Other areas that, while still important for privacy, are considered to have a less immediate impact are quantum computing, digital transportation, generative AI, synthetic media, digital ID and behavioral analytics. The report provides a brief description of each of these and, in the case of behavioral analytics, echoes the ICO’s warning that organizations should assess the risks of using these technologies (see our blog).
Common Challenges
As well as noting the important opportunities presented by these different technologies, the ICO highlights some common challenges:
- Lack of transparency and control: Many technologies collect personal information in ways that are not always transparent, and people may not have meaningful control over it. For example, an augmented reality device may capture information about a third party (other than the intended user), or a healthtech application may give a third party access to the data it collects (for example, if it uses a software development kit that lets people log in through another platform’s account but gives that platform access to the data without explaining this to the user).
- A complex data ecosystem affects people’s understanding and ability to exercise their rights: Complex ecosystems can make it difficult for people to understand how organizations process their information (and to hold them to account), and organizations must ensure that people can still exercise their information rights. Some technologies may also collect more information than is necessary for their primary purpose. For example, people can be tracked through consumer health technology or virtual reality devices in ways that are neither transparent nor necessary for that purpose.
- Sensitive data requires additional safeguards: Some technologies collect information about sensitive personal characteristics that requires additional safeguards. Organizations need to know when this information is classified as special category data and implement appropriate measures. For example, healthtech organizations need to think carefully about whether they are collecting biometric or health data, and this may be influenced by the source of the information (e.g. is advice to exercise more coming from a doctor or from a fitness wearable?). Organizations should also be aware that observed or inferred data linked to individuals may be personal information. For example, information collected by an IoT device and linked to a person’s account, indicating when a connected appliance is switched on and off, can reveal information about that person’s location (see the illustrative sketch below).
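To make that last point concrete, here is a minimal, hypothetical sketch (not taken from the report): even simple on/off events, once linked to an account, can reveal when someone is likely to be at home. The event log format, the ‘user-42’ account ID and the likely_at_home helper are all illustrative assumptions.

```python
# Hypothetical illustration: on/off events from a connected appliance, once linked
# to an account, reveal presence patterns - observed data becomes personal data.
from datetime import datetime

# Example event log as a device vendor might hold it: (account_id, state, timestamp)
events = [
    ("user-42", "on",  datetime(2023, 3, 1, 7, 5)),
    ("user-42", "off", datetime(2023, 3, 1, 8, 30)),
    ("user-42", "on",  datetime(2023, 3, 1, 18, 45)),
]

def likely_at_home(account_id, log):
    """Infer the hours when the account holder was probably at home from 'on' events."""
    return sorted({ts.hour for acc, state, ts in log if acc == account_id and state == "on"})

print(likely_at_home("user-42", events))  # -> [7, 18]
```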
Challenges were also identified around the accuracy of inferences drawn by some devices, bias, data minimization and cyber security, as well as some technology-specific challenges. An example of the latter is the potential difficulty of exercising the rights to rectification and erasure if information is held on a blockchain. For a more detailed discussion of this point, see our March of the Blocks: GDPR and Blockchain white paper.
What can organizations do?
Privacy by design is key, and the report notes that some organizations are exploring new and innovative ways to build privacy into the design of these technologies. An example given is manufacturers embedding redaction technology into augmented reality devices to minimize unintentional processing of information about bystanders (a minimal illustrative sketch of this idea follows the list below). The report also lists specific steps that organizations can take. For example, it reminds organizations that they must:
- carry out a data protection impact assessment where necessary, and try to minimize data collection (for example, where an IoT device is always ‘on’ and collecting information);
- revisit the privacy notice to make sure it’s still fit for purpose – for example, a long, written policy designed for a 2D environment won’t work in 3D;
- keep on top of the various reports and guidance issued by the ICO – for example, where AI is involved, read the ICO’s various AI guidance, and if children may use the technology, see the Children’s code. New guidance, research and reports are also expected in areas such as IoT and neurotechnology;
- make sure they comply with the new Product Security and Telecommunications Infrastructure Act, as it introduces basic security standards for IoT devices; and
- consider applying to the ICO’s Regulatory Sandbox scheme if they need support with projects involving the technologies covered in the report.
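As a purely illustrative sketch of the redaction idea mentioned above, and not the report’s or any manufacturer’s actual implementation, the snippet below blurs detected faces in a captured frame before any further processing or storage. It assumes OpenCV (cv2) with its bundled Haar cascade face detector, and the redact_bystanders function name is a hypothetical choice.

```python
# Illustrative only: blur detected faces so bystanders' images are not retained or
# processed further. Assumes OpenCV is installed (pip install opencv-python).
import cv2

def redact_bystanders(frame):
    """Return a copy of the frame with any detected faces blurred."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    redacted = frame.copy()
    for (x, y, w, h) in faces:
        region = redacted[y:y + h, x:x + w]
        redacted[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return redacted
```

A production device would rely on far more robust, on-device detection, but the underlying privacy-by-design principle is the same: strip identifying detail before the data leaves the capture pipeline.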
“Some organizations are exploring new and innovative ways to design privacy into these technologies… Other organizations fail to imagine how privacy can be engineered into their ideas. [The ICO] will not allow businesses that do the right thing to be undercut by businesses that fail to comply with data protection laws.” (ICO Tech Horizons Report)