The Responsibility of Technology Developers: Protecting User Data in Brain-Machine Interfaces

Neurological data from Brain-Machine Interfaces (BMIs) raises privacy concerns. Companies must protect data, follow regulations, and ensure ethical use for societal benefit.

Neurological data is gathered directly from brain activity using technologies such as the Brain-Machine Interface (BMI) – a system that connects the brain to an external device by capturing the electrical signals released by neurons, either invasively or non-invasively. The non-invasive method relies on external equipment to record neural activity through electrooculography (EOG) or electroencephalography (EEG), which require the placement of electrical sensors on the scalp or face to record this activity.

The invasive method requires surgical intervention and is thus far more complex, but it is more effective at capturing electrical signals since the electrodes are placed directly in the subject’s brain. Both types of BMI have the potential to help people with physical disabilities or neurological illnesses operate wheelchairs, prosthetic limbs, or other external devices, or to communicate using a computer, all of which would significantly enhance their quality of life.

However, even though BMIs offer several advantages, the privacy and security of the data collected through these interfaces raise concerns, since the technology gathers highly sensitive data that can be used to identify the subject.

This creates a legal obligation for the service provider to implement a strong security system that guarantees the protection of subjects’ data in accordance with the General Data Protection Regulation (GDPR).

Regarding the GDPR, it is crucial to note that the data collected by such technology could fall under both Article 4(1) and Article 9(1): in addition to being personal, the data can be classified as sensitive, because interpreting the electrical stimuli produced by the brain makes it possible to discover, for example, the subject’s political orientation. This has already been demonstrated in a study showing that political preferences – information classified as sensitive – can be inferred from variations in the alpha, beta and gamma frequencies recorded by an EEG.
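To make the technical point concrete, the sketch below is illustrative only: it assumes a single-channel EEG segment sampled at 256 Hz and uses common alpha/beta/gamma band definitions (none of which are taken from the study referred to above). It shows how band power – the kind of feature such inferences rely on – can be extracted from raw EEG with a few lines of standard signal-processing code.

```python
# Illustrative sketch only: estimating alpha/beta/gamma band power from a
# single-channel EEG segment. Sampling rate and band limits are assumptions,
# not values from any specific study cited in the article.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 256                                               # assumed sampling rate in Hz
BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg_segment: np.ndarray) -> dict:
    """Estimate the power in each frequency band using Welch's method."""
    freqs, psd = welch(eeg_segment, fs=FS, nperseg=FS * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = trapezoid(psd[mask], freqs[mask])  # integrate PSD over the band
    return powers

# Example with synthetic noise standing in for a real recording.
rng = np.random.default_rng(0)
fake_eeg = rng.normal(size=FS * 10)                    # 10 seconds of "EEG"
print(band_powers(fake_eeg))
```

The point of the sketch is simply that such features are trivially computable once raw signals are stored, which is why the raw data itself deserves the protection of Article 9.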

Additionally, artificial intelligence is used to manage this entire system: gathering the data, processing it and producing an output. Under Article 7(1)(b) of the AI Act, BMI technology can be classified as high risk because it may pose a risk of harm to health and safety, or a risk of adverse impact on fundamental rights, if the data is leaked or if an error in the interpretation of the data causes the AI to misread a signal and harm the subject. As such, any company using this technology will be obliged to strictly follow Chapter 2, which sets out the requirements that must be complied with when an AI system is classified as high risk.

In fact, it is important that the company developing this technology be very careful in how it programmes the AI, taking into account the sensitivity of the data that will be collected, processed and stored, because this data could eventually be used to manipulate the subject, which would make the use of such AI prohibited under Article 5 of the AI Act.

Companies developing this technology can ensure compliance with the applicable legal requirements by clearly explaining to the subject the risks faced in the event of a data leak. This could involve explaining the system’s cybersecurity features or preparing a contract containing the relevant information, so that the company can prove the subject was warned of the risks. Furthermore, companies must anonymise, pseudonymise or encrypt the collected data to increase protection.
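Purely as an illustration of the last point (the record structure, key handling and library choice are simplified assumptions, not a description of any real BMI product), pseudonymisation and encryption of a stored recording might be combined roughly as follows.

```python
# Illustrative sketch only: pseudonymising the subject identifier and
# encrypting the associated neural recording before storage. Key management
# and salting are deliberately simplified for the example.
import hashlib
import json
from cryptography.fernet import Fernet  # symmetric encryption (pip install cryptography)

def pseudonymise(subject_id: str, salt: bytes) -> str:
    """Replace the real identifier with a salted hash, kept apart from any lookup table."""
    return hashlib.sha256(salt + subject_id.encode()).hexdigest()

def protect_record(subject_id: str, eeg_samples: list, key: bytes, salt: bytes) -> bytes:
    record = {
        "subject": pseudonymise(subject_id, salt),  # no direct identifier is stored
        "samples": eeg_samples,
    }
    return Fernet(key).encrypt(json.dumps(record).encode())

key = Fernet.generate_key()           # in practice, held in a key-management service
salt = b"example-salt"                # in practice, generated and stored securely
token = protect_record("patient-042", [0.12, -0.07, 0.33], key, salt)
print(Fernet(key).decrypt(token)[:60])  # only holders of the key can read the data
```

Measures of this kind do not remove the legal obligations discussed above, but they reduce the impact of a leak, since the stolen records neither identify the subject directly nor are readable without the key.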

It is important to note that this technology has already been applied. One example is a wheelchair model that employs a hybrid EOG-EEG system that enables the user to control the wheelchair’s speed, direction, and movement through different “steering behaviours”. The user only needs to focus on the desired image depicted on the monitor device attached to the wheelchair to respond to this stimulus. It should be emphasised that, in addition to this study, there is also the BrainGate project, which has offered ground-breaking solutions to help people regain communication, mobility, and independence after losing these abilities as a result of neurological disorders, accidents, or limb loss.
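Purely to illustrate the control loop described above (the target names, commands, confidence threshold and classifier interface are hypothetical, not taken from the cited wheelchair study or from BrainGate), the final mapping from a decoded focus target to a wheelchair command could look roughly like this.

```python
# Illustrative sketch only: turning a decoded "focused target" from a hybrid
# EOG-EEG classifier into a wheelchair command. Targets, commands and the
# confidence threshold are hypothetical examples, not a real system's API.
from dataclasses import dataclass

COMMANDS = {
    "arrow_forward": "MOVE_FORWARD",
    "arrow_left": "TURN_LEFT",
    "arrow_right": "TURN_RIGHT",
    "stop_sign": "STOP",
}
CONFIDENCE_THRESHOLD = 0.8  # ignore weak classifications for safety

@dataclass
class Decoded:
    target: str        # which on-screen image the user focused on
    confidence: float  # classifier confidence in [0, 1]

def to_command(decoded: Decoded) -> str:
    """Map a decoded focus target to a wheelchair command, defaulting to STOP."""
    if decoded.confidence < CONFIDENCE_THRESHOLD:
        return "STOP"  # fail safe when the signal is ambiguous
    return COMMANDS.get(decoded.target, "STOP")

print(to_command(Decoded("arrow_left", 0.93)))  # -> TURN_LEFT
print(to_command(Decoded("arrow_left", 0.42)))  # -> STOP
```

The safety-first default in the sketch also hints at why the AI Act treats misinterpretation of such signals as a health-and-safety risk rather than a mere inconvenience.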

Conclusion

With the rapid advancement of technology, the brain-machine interface has emerged as a crucial instrument for enhancing the quality of life of those who suffer from physical impairments or neurological illnesses. However, collecting and maintaining sensitive neurological data poses a difficult cybersecurity and privacy challenge, making it companies’ responsibility to implement new technologies that allow them to guarantee a high level of data protection compliant with the GDPR and the AI Act.

The biggest problem identified is that, given the personal nature of neurological data, a leak can expose other personal information such as personality traits, cognitive skills and preferences, among others, enabling criminals to replicate, with a certain degree of accuracy, an online persona of the person whose data was stolen.

Therefore, the regulation of the collection and protection of neurological data is crucial to ensure the privacy and security of BMI users. The protection of this data must be treated responsibly and ethically so that it can be used for the benefit of society, without harming the individual.




