
A LabVIEW Based Brain-Computer Interface Application for Controlling a Virtual Robotic Arm Using the P300 Evoked Biopotentials and the EEG Bandpower Rhythms Acquired from the GTEC Unicorn Headset


dc.contributor.author RUSANU, Oana Andreea
dc.date.accessioned 2023-11-13T11:08:10Z
dc.date.available 2023-11-13T11:08:10Z
dc.date.issued 2023
dc.identifier.citation RUSANU, Oana Andreea. A LabVIEW Based Brain-Computer Interface Application for Controlling a Virtual Robotic Arm Using the P300 Evoked Biopotentials and the EEG Bandpower Rhythms Acquired from the GTEC Unicorn Headset. In: 6th International Conference on Nanotechnologies and Biomedical Engineering: proc. of ICNBME-2023, September 20–23, 2023, Chisinau, vol. 2: Biomedical Engineering and New Technologies for Diagnosis, Treatment, and Rehabilitation, 2023, pp. 103–112. ISBN 978-3-031-42781-7. e-ISBN 978-3-031-42782-4. en_US
dc.identifier.isbn 978-3-031-42781-7
dc.identifier.isbn 978-3-031-42782-4
dc.identifier.uri https://doi.org/10.1007/978-3-031-42782-4_12
dc.identifier.uri http://repository.utm.md/handle/5014/24774
dc.description Access full text - https://doi.org/10.1007/978-3-031-42782-4_12 en_US
dc.description.abstract The brain-computer interface is a high technology inspired by science fiction, with a strong impact on helping people with neuromotor disabilities who suffer from complete paralysis. Translating the power of thought by processing and classifying the EEG signals acquired from the brain enables the control of mechatronic systems, such as robotic arms or smart wheelchairs, aimed at the medical assistance of disabled persons. A robotic arm is needed for grasping and moving different objects according to the user's intention. Moreover, a simulation based on a virtual robotic arm can precede real experimentation with a complex and expensive physical robotic arm. This paper presents a prototype of a simple brain-computer interface, implemented in the LabVIEW programming environment, for controlling a virtual robotic arm using commands determined by the P300 evoked biopotentials and the EEG bandpower rhythms acquired with the GTEC Unicorn headset and its related official applications. The integration between the proposed LabVIEW instrument and the Unicorn user interfaces is achieved through UDP data transfer. The P300 speller board based on human faces generates the commands needed to animate specific joints of the virtual robotic arm. The EEG frequency bands (delta, theta, alpha, beta, gamma) are mapped to the angle values of each joint (shoulder, elbow, wrist) composing the 3D robotic arm. The purpose of the proposed application is to train people to use a brain-computer interface. It also demonstrates the possibility of integrating the LabVIEW development environment with the Unicorn EEG technology by means of UDP transfer. en_US
dc.language.iso en en_US
dc.publisher Springer Nature Switzerland en_US
dc.rights Attribution-NonCommercial-NoDerivs 3.0 United States
dc.rights.uri http://creativecommons.org/licenses/by-nc-nd/3.0/us/
dc.subject brain-computer interfaces en_US
dc.subject unicorn headset en_US
dc.subject virtual robotic arms en_US
dc.subject LabVIEW en_US
dc.title A LabVIEW Based Brain-Computer Interface Application for Controlling a Virtual Robotic Arm Using the P300 Evoked Biopotentials and the EEG Bandpower Rhythms Acquired from the GTEC Unicorn Headset en_US
dc.type Article en_US
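
The abstract describes streaming the Unicorn EEG bandpower rhythms to the LabVIEW instrument over UDP and mapping them onto the joint angles (shoulder, elbow, wrist) of the virtual robotic arm. The sketch below is a minimal text-based Python stand-in for that idea, not the paper's LabVIEW implementation: the port number, the comma-separated packet layout, and the band-to-joint assignments and angle ranges are all illustrative assumptions, not the format actually used by the Unicorn applications.

```python
# Hypothetical sketch: receive UDP packets assumed to carry five bandpower
# values (delta, theta, alpha, beta, gamma) as comma-separated text and
# rescale three of them to joint angles of a virtual robotic arm.
import socket

UDP_IP = "127.0.0.1"
UDP_PORT = 9000  # assumed local port for the bandpower stream


def to_angle(value, in_min, in_max, out_min=0.0, out_max=180.0):
    """Linearly map a bandpower value onto a joint-angle range (degrees)."""
    value = min(max(value, in_min), in_max)
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((UDP_IP, UDP_PORT))

while True:
    data, _ = sock.recvfrom(1024)
    delta, theta, alpha, beta, gamma = map(float, data.decode().split(","))
    # Illustrative mapping: three rhythms drive the three joints of the arm.
    shoulder = to_angle(alpha, 0.0, 50.0)
    elbow = to_angle(beta, 0.0, 30.0)
    wrist = to_angle(theta, 0.0, 40.0)
    print(f"shoulder={shoulder:.1f} deg, elbow={elbow:.1f} deg, wrist={wrist:.1f} deg")
```

In the paper's setup, the equivalent receiving and scaling logic is realized graphically in a LabVIEW VI, with the UDP transfer providing the bridge between the official Unicorn user interfaces and the proposed instrument.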



This item appears in the following Collection(s)

  • 2023
    6th International Conference on Nanotechnologies and Biomedical Engineering, September 20–23, 2023, Chisinau, Moldova


Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivs 3.0 United States.
