AggiesBCI

A brain-controlled wheelchair that converts thoughts to real-world movement

The AggiesBCI Team: Pranav, Garner, Tejas, Oswin, Yusuf (me), and Daniel. None of this would have been possible without you guys!


Objective: Develop a brain-computer interface system that enables wheelchair control using thoughts as command inputs.

  • Disassembled an e-wheelchair controller and soldered its test points to an Arduino Nano.
  • Trained mental commands using an EMOTIV Insight headset and converted them into movement inputs.
  • Presented our prototype at the Aggies Create Innovation Expo and placed 1st out of 20 teams!
  • Currently transitioning to testing this product with prospective users.



Our Innovation Expo Presentation





A bit of the project's inner workings. On the left is our linear actuator system in action. In the middle is our initial BCI prototype using an OpenBCI Ganglion board. On the right is our system mounted on our wheelchair.


Coding
  • Hacking the Hexbug
    • I coded the Hexbug controls using Arduino C.
    • I used Python and the BrainFlow library to communicate with the OpenBCI Ganglion board and detect neuromuscular commands.
    • I sent all BCI commands over a serial connection to an Arduino Uno connected to the Hexbug controller (see the first sketch after this list).
  • BCI-controlled Wheelchair
    • I worked with the EMOTIV Insight headset to train the mental commands using the EmotivBCI software.
    • I coded in Python using the Cortex API to detect the headset's commands and sent them to the Arduino Nano over a serial connection (see the second sketch after this list).
    • I used C to code the servo movements for the linear actuator and to read BCI commands from the serial port.
  • (Ali, 2024) contains additional EMOTIV/OpenBCI documentation and all the code I developed for this project.
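
To give a feel for the Hexbug pipeline, here is a minimal Python sketch of the host side, assuming BrainFlow's Ganglion support and pyserial. The serial ports, the amplitude threshold, and the single-byte "F" command are illustrative placeholders rather than the exact values from my code; see (Ali, 2024) for the real implementation.

```python
import time

import numpy as np
import serial  # pyserial
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

GANGLION = BoardIds.GANGLION_BOARD.value

# Connect to the Ganglion board (the dongle port is a placeholder).
params = BrainFlowInputParams()
params.serial_port = "COM3"
board = BoardShim(GANGLION, params)

# Serial link to the Arduino Uno wired into the Hexbug controller (placeholder port).
arduino = serial.Serial("COM5", 9600, timeout=1)

board.prepare_session()
board.start_stream()
time.sleep(2)  # let the internal ring buffer fill

eeg_channels = BoardShim.get_eeg_channels(GANGLION)
THRESHOLD = 150.0  # illustrative amplitude threshold -- tune per user and electrode

try:
    while True:
        # Grab the most recent ~1 s of samples (the Ganglion streams at 200 Hz).
        data = board.get_current_board_data(200)
        if data.shape[1] == 0:
            continue
        # Crude neuromuscular detector: a large peak on the first EEG channel
        # (e.g. a jaw clench) is treated as a movement command.
        if np.abs(data[eeg_channels[0]]).max() > THRESHOLD:
            arduino.write(b"F\n")  # hypothetical 'forward' command byte
        time.sleep(0.25)
finally:
    board.stop_stream()
    board.release_session()
    arduino.close()
```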
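
And a similarly hedged sketch of the wheelchair's Python side, assuming the Cortex API's JSON-RPC-over-WebSocket interface: authorize, open a session, subscribe to the "com" (mental command) stream, and relay each confident command to the Nano over serial. The credentials, serial port, and 0.5 power threshold are placeholders, and a first run would also need Cortex's requestAccess approval step.

```python
import json
import ssl

import serial  # pyserial
import websocket  # websocket-client

CLIENT_ID = "your-emotiv-client-id"  # placeholder Emotiv developer credentials
CLIENT_SECRET = "your-emotiv-client-secret"

# Cortex runs locally on a secure WebSocket with a self-signed certificate.
ws = websocket.create_connection(
    "wss://localhost:6868", sslopt={"cert_reqs": ssl.CERT_NONE}
)

def call(method, params, rid):
    """Send one JSON-RPC request to Cortex and return its result."""
    ws.send(json.dumps(
        {"jsonrpc": "2.0", "id": rid, "method": method, "params": params}
    ))
    return json.loads(ws.recv())["result"]

# Authorize, grab the first connected headset, open a session, and subscribe
# to the "com" (mental command) stream.
token = call("authorize", {"clientId": CLIENT_ID,
                           "clientSecret": CLIENT_SECRET}, 1)["cortexToken"]
headset = call("queryHeadsets", {}, 2)[0]["id"]
session = call("createSession", {"cortexToken": token, "headset": headset,
                                 "status": "open"}, 3)["id"]
call("subscribe", {"cortexToken": token, "session": session, "streams": ["com"]}, 4)

nano = serial.Serial("COM4", 9600, timeout=1)  # Arduino Nano port -- placeholder

while True:
    msg = json.loads(ws.recv())
    if "com" in msg:            # one mental-command sample: [action, power]
        action, power = msg["com"]
        if power > 0.5:         # confidence gate -- tune experimentally
            nano.write((action + "\n").encode())  # e.g. "push\n"
```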


What's Next
  • Refining our Design
    • This upcoming semester, our team will work on reinforcing our wheelchair interface and perhaps hacking the joystick directly, as we did with the Hexbug controller.
    • We also aim to make our design more modular so it fits various wheelchair models.
  • Future Project Ideas
    • Being able to control a digital interface (mouse cursor, keyboard, etc.) with mental commands. This would give individuals with mobility issues much more freedom in today's world.
    • A mechanical arm that can grasp and move items based on mental commands. This would essentially be a 'third arm' for individuals working in environments that keep their limbs occupied (construction, mechanical engineering, etc.).


References

2024

  1. Yusuf Ali. AggiesBCI - GitHub Repository, 2024.