Neuralink, The “Mind Reading” Chip – A Risk Assessment

This insight provides a risk assessment of Neuralink's BCI technology, mainly focusing on data privacy risks and the importance of informed consent throughout the technology's entire life cycle.

Neuralink's device is a cosmetically invisible brain implant that promises to “restore autonomy for those with unmet medical needs”, mainly individuals with neuromuscular disabilities, by recording brain activity for that purpose. Neuralink employs Brain-Computer Interface (BCI) technology, which uses brain signals generated by the central nervous system. This data is wirelessly transmitted to a computer, analysed, and translated into commands, generating an output corresponding to the user’s intentions; the device can also electrically stimulate brain activity. The company claims that this technology is capable of treating disorders such as Alzheimer’s, Parkinson’s, and depression, as well as motor disabilities, and it has been performing multiple human clinical trials. Even though this represents a valuable advance in science, and specifically in medicine, as it can offer a better quality of life to people with certain disabilities and other conditions, it is essential to analyse the potential risks this technology may pose, mainly in terms of data privacy, security, and ethics. This risk assessment will focus on these key areas.

The main legal issue raised by this technology relates to data privacy. As the device has access to brain activity (sensitive data), it is crucial to know how this data will be used, who will have access to it, and what plans are in place to guarantee protection against personal data breaches and hacking. This issue should be taken into account especially during the testing phase, during the introduction of the technology to society, and at the end of the technology’s use. Further issues regarding informed consent arise within the legal domain, both during the testing phase (human clinical trials) and upon the introduction of the technology to society. However, this risk assessment focuses solely on the European framework, analysing only the informed-consent issues that arise before the technology’s introduction to society at the European level. It does not address the human clinical trials, as these occur in the United States (US) and are subject to US law.

As this type of processing uses new technologies and is likely to result in a high risk to the rights and freedoms of natural persons, the data controller shall, before the processing, carry out an assessment of the impact of the processing operations on the protection of personal data, according to Article 35 of the General Data Protection Regulation (GDPR). Firstly, the processing of personal data must respect the principles expressed in Article 5 GDPR and be lawful according to at least one of the prerequisites set out in the subparagraphs of Article 6(1) GDPR. Regarding data privacy (a fundamental right according to the Universal Declaration of Human Rights (Article 12), the European Convention on Human Rights (Article 8), and the Charter of Fundamental Rights of the European Union (Article 7)) and data security, the GDPR states that the processing of special categories of personal data is lawful when the data subject has given explicit consent to the processing of said data, except where Union or Member State law provides that the prohibition may not be lifted by the data subject (Article 9(1), (2)(a), and (4) GDPR). According to Article 12 GDPR, the data controller must take appropriate measures to provide the information referred to in Article 13 GDPR (as the data is collected directly from the data subject), as well as any communication relating to the processing of the data, in a concise, transparent, intelligible, and easily accessible form, using clear and plain language (Articles 15 to 22 and Article 34 GDPR).

Concerning data breaches, it is likely that they will result in a high risk to the rights and freedoms of natural persons, so the controller shall communicate such breaches to the data subject without undue delay (except when the exceptions of Article 34(3) GDPR apply). It is also crucial to ensure that the technology incorporates data protection measures, namely privacy and security safeguards, into the design and operation of its system (data protection by design and by default, Article 25 GDPR). Some measures regarding the security of processing are non-exhaustively listed in Article 32 GDPR. At the end of the technology’s life cycle, the Right to Erasure is also vital, as the data subject has the right to request the erasure of their personal data from the data controller (Article 17 GDPR).

This technology must also comply with Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, as it falls within the subject matter and scope of its Article 1. It is also relevant to comply with the European Medicines Agency’s guidelines regarding compliance standards and obligations for technology developers. The Convention on Human Rights and Biomedicine of the Council of Europe of 4 April 1997 (Oviedo Convention), regarding the protection of human rights and dignity in the context of medicine, is a particularly relevant legal instrument, especially regarding the primacy of the human being, professional standards, consent, private life, the Right to Information, and scientific research (Chapter I, Articles 1, 2 and 4; Chapter II; Chapter III; and Chapter V of the Oviedo Convention, respectively).

Neuralink also incorporates Artificial Intelligence (AI) in its development and functioning, for data interpretation (converting neural patterns into physical/motor actions) and for algorithmic optimisation, as it uses machine learning to improve the technology’s performance and refine data accuracy and interpretation, among other purposes. Therefore, the AI Act, which entered into force on 1 August 2024, must also be considered, along with the Ethics Guidelines for Trustworthy Artificial Intelligence from the European Commission’s High-Level Expert Group on AI. The AI Act is relevant for this matter especially in what concerns High-Risk AI Systems (Articles 6 and 7), which must comply with the requirements established in Section 2 of Chapter III; Data and Data Governance (Article 10); Transparency and Provision of Information to Deployers (Article 13); Accuracy, Robustness and Cybersecurity (Article 15); Post-Market Monitoring (Article 72); as well as liability of the provider in case of harm caused by the technology (Article 60(9)) and ethical guidelines regarding AI (Article 95(2)(b)). Overall, the existing framework can adapt to this technology and its application. Nonetheless, assuring the Right to Data Portability (Article 20 GDPR) would also be relevant in the event of a device change, allowing the transfer of collected data, especially the data that is important for machine learning, to a new device and not only to another data controller as currently prescribed by law. When it comes to data breaches, one appropriate mitigation measure would be data encryption, rendering the data unintelligible to anyone not authorised to access it (as suggested in Article 34(3)(a) GDPR and Recital 69 of the AI Act). Also, considering the data subject’s vulnerability and having their data privacy and security as a goal, it is vital to resort to appropriate Health Information Technology safeguards.
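To illustrate the encryption measure suggested above, the following minimal Python sketch uses the third-party `cryptography` library (Fernet authenticated symmetric encryption) on a purely hypothetical neural-data payload; the key handling shown is simplified for illustration and is not a statement of how Neuralink actually secures its data:

```python
# Minimal sketch: encrypting sensitive recordings at rest so that they are
# unintelligible without the key (cf. Article 34(3)(a) GDPR).
# Assumes the third-party "cryptography" library; the payload is hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in a key-management service
cipher = Fernet(key)

neural_sample = b"hypothetical neural-signal payload"
token = cipher.encrypt(neural_sample)   # ciphertext: unreadable without the key

# Only a holder of the key can recover the original data.
assert cipher.decrypt(token) == neural_sample
```

In such a design, a breach that exposes only the ciphertext would not render the personal data intelligible, which is precisely the condition under which Article 34(3)(a) GDPR lifts the obligation to notify the data subject.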
Finally, to address potential issues with informed consent, it is indispensable to ensure that patients receive enough information to give informed consent. This information should be transparent and include details of the data collected, how it will be used, and the potential risks associated with the device’s implantation, usage, and removal.

The Insights published herein reproduce the work carried out for this purpose by the author and therefore maintain the original language in which they were written. The opinions expressed within the article are solely the author’s and do not reflect in any way the opinions and beliefs of WhatNext.Law or of its affiliates. See our Terms of Use for more information.
