PsyLink is experimental hardware for reading muscle signals and using them to control the computer, recognize gestures, play video games, simulate a keyboard, and more. PsyLink is open source, sEMG-based, neural-network-powered, and can be obtained here.
- All posts on one page
2023-08-25: Data Sheets
2023-05-31: Prototype 10
2023-03-22: Enhanced Signal by >1000%
2023-03-06: Sample Signals
2023-02-05: 2022 Retrospective
2022-02-24: Added Bills of Materials
2022-02-23: 3M Red Dot electrodes
2022-02-22: Microchip 6N11-100
2022-02-16: Next Steps & Resources
2022-02-15: Mass production
2022-01-19: Prototype 9 + Matrix Chatroom
2022-01-18: HackChat & Hackaday Article
2021-12-19: Prototype 8 Demo Video
2021-12-18: Prototype 8
2021-12-16: INA155 Instrumentation …
2021-12-15: Power Module 4
2021-11-30: Batch Update
2021-07-17: Neurofeedback: Training in …
2021-07-06: New Frontpage + Logo
2021-06-24: Cyber Wristband of Telepathy …
2021-06-21: Running on AAA battery
2021-06-16: Power Module 3
2021-06-10: Believe The Datasheet
2021-06-04: Back to the Roots
2021-05-31: Website is Ready
2021-05-29: Dedicated Website
2021-05-17: Gyroscope + Accelerometer
2021-05-14: Wireless Prototype
2021-05-09: Power Supply Module
2021-05-07: New Name
2021-05-06: Finished new UI
2021-05-04: Higher Bandwidth, new UI
2021-04-30: PCB Time
2021-04-29: Soldering the Processing Units
2021-04-28: Going Wireless
2021-04-24: First Amplifier Circuit
2021-04-14: Data Cleaning
2021-04-13: Cyber Gauntlet +1
2021-04-11: Adding some AI
2021-04-09: F-Zero
2021-04-08: Baby Steps
2021-04-03: The Idea
Adding some AI
2021-04-11, by Roman
Most neural interfaces I've seen so far require the human to learn how to use the machine: unintuitive rules like "Contract muscle X to perform action Y", and so on. But why can't we just stick a bunch of artificial neurons on top of the human's biological neural network, and make the computer train them for us?
While we're at it, why not replace the entire signal processing code with a bunch more artificial neurons? Surely a NN can figure out how to do a bandpass filter and moving averages, and hopefully come up with something more advanced than that. The more I pretend that I know anything about signal processing, the worse this thing is going to get, so let's just leave it to the AI overlords.
The Arduino Part
The Arduino Nano 33 BLE Sense supports TensorFlow Lite, so I was eager to move the neural network prediction code onto the microcontroller itself, but that would have slowed down development, so for now I just did it all on my laptop.
The Arduino code now just passes through the value of the analog pins to the serial port.
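On the laptop side, reading that stream is straightforward. Here's a minimal sketch of what the receiving end could look like; the line format (comma-separated integers, one sample per line), the port name, and the use of pyserial are my own assumptions for illustration, not the actual PsyLink protocol:

```python
# Laptop-side sketch: read raw analog values streamed by the Arduino.
# Assumed line format: "512,498,530,..." (one comma-separated sample per line).
from typing import Iterator, List

def parse_sample(line: str, channels: int = 8) -> List[int]:
    """Parse one serial line of comma-separated ADC values."""
    values = [int(v) for v in line.strip().split(",")]
    if len(values) != channels:
        raise ValueError(f"expected {channels} values, got {len(values)}")
    return values

def stream_samples(port_name: str = "/dev/ttyACM0",
                   baud: int = 115200) -> Iterator[List[int]]:
    """Yield samples from the serial port (requires pyserial)."""
    import serial  # pip install pyserial
    with serial.Serial(port_name, baud, timeout=1) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore")
            if line.strip():
                yield parse_sample(line)
```

With this, the rest of the pipeline only ever sees lists of channel values, regardless of what the firmware looks like.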
Calibrating with a neural network
For this, I built a simple user interface, mostly an empty window with a menu to select actions, and a key grabber. (source code)
The idea is to correlate hand/arm movements with keys that should be pressed when you perform those hand/arm movements. To train the AI to understand you, perform the following calibration steps:
- Put on the device and jack it into your laptop
- Start the Calibrator
- Select the action "Start/Resume Recording" to start gathering training data for the neural network
- Now, for as long as you're comfortable (30 seconds worked for me), move your hand around a bit. Hold it in various neutral positions, as well as positions which should produce a certain action. Press the key on your laptop whenever you intend your hand movement to produce that key press (e.g. wave to the left, and hold the left arrow key on the laptop at the same time). The better you do this, the better the neural network will understand wtf you want from it.
- Holding two keys at the same time is theoretically supported, but I used Tkinter, which has an unreliable key grabbing mechanism. Better to stick to single keys for now.
- Tip: The electric signals change when you hold a position for a couple seconds. If you want the neural network to take this into account, hold the positions for a while during recording.
- Press Esc to stop recording
- Save the recordings, if desired
- Select the action "Train AI", and watch the console output. It will train for 100 epochs by default. If you're not happy with the result yet, you can repeat this step until you are.
- Save the AI model, if desired
- Select the action "Activate AI". If everything worked out, the AI overlord will now try to recognize the input patterns with which you associated certain key presses, and press the keys for you. =D
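At its core, the recording step above is bookkeeping: pair each short window of sensor samples with the key that was held at that moment, so the network can learn the mapping. A minimal sketch of how such training data could be assembled (the function, window size, and labels are my own illustration, not the Calibrator's actual code):

```python
import numpy as np

def make_windows(samples: np.ndarray, keys: list, window: int = 64):
    """Slice a (T, channels) sample stream into overlapping windows,
    labelling each window with the key held at its last sample."""
    X, y = [], []
    for end in range(window, len(samples) + 1):
        X.append(samples[end - window:end])
        y.append(keys[end - 1])  # label = the key held "now"
    return np.array(X), np.array(y)
```

The resulting `X` (windows) and `y` (key labels, including a "no key" class) are exactly the kind of input/target pairs a classifier network trains on.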
I used this to walk left and right in 2020game.io and it worked pretty well. With zero manual signal processing and zero manual calibration! The mathemagical incantations just do it for me. This is awesome!
Some quick facts:
- 8 electrodes at semi-random points on my forearm
- Recorded signals for 40s, resulting in 10000 samples
- I specified 3 classifier labels: "left", "right", and "no key"
- Trained for 100 epochs, which took 1-2 minutes.
- The resulting loss was 0.0758, and the accuracy was 0.9575.
- The neural network has 2 convolutional layers, 3 dense layers, and 1 output layer.
Still a lot of work to do, but I'm happy with the software for now. Will tweak the hardware next.
Now I'm wondering whether I'm just picking low hanging fruits here, or if non-invasive neural interfaces are really just that easy. How could CTRL-Labs sell their wristband to Facebook for $500,000,000-$1,000,000,000? Was it one of those scams where decision-makers were hypnotized by buzzwords and screamed "Shut up and take my money"? Or do they really have some secret sauce that sets them apart? Well, I'll keep tinkering. Just imagine what this is going to look like a few posts down the line!
F-Zero
2021-04-09, by Roman
The look of the first device was way too unprofessional, so I pulled out my sewing machine and made a custom tailored sleeve from comfortable modal fabric.
On the inside, I attached some recycled studs that served as electrodes. Who needs that expensive stuff they sell as electrodes when a piece of iron suffices?
This time it had 4 electrodes. I targeted the middle and the distal end of two muscles, the Brachioradialis and the Extensor carpi radialis longus. I picked those muscles at random, because I honestly don't know what the fuck I am doing.
Software-wise, I played around with moving averages and got reasonable signals, but it was clear that there was too much noise.
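For reference, the moving average I was playing with is just a boxcar smoothing of the raw signal; a quick numpy sketch (the window width is arbitrary, not what I actually used):

```python
import numpy as np

def moving_average(signal: np.ndarray, width: int = 16) -> np.ndarray:
    """Smooth a 1-D signal by averaging each run of `width` samples."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="valid")
```

It knocks down high-frequency jitter, but it's a blunt instrument: it can't separate muscle activity from, say, mains hum.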
How to filter, though? I'm not going to solder some bandpass filter, that's too slow and inflexible. There are simple algorithms for doing it in software (link 1, link 2), but something seemed off about this method. In the end, I decided to learn how to do a Fourier transform on the Arduino.
With this code (inspired by this post), I took 64 samples at a sampling rate of 1kHz, performed the Fourier transform, cut out anything under 30% and over 50% of my frequency range, and then summed up the amplitudes of the remaining frequencies to generate the output.
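Translated to numpy, those steps look roughly like this (the Arduino version works with a fixed-point FFT library, and the exact bin handling below is my guess at the "30% to 50%" cut, but the math is the same):

```python
import numpy as np

def band_energy(samples: np.ndarray,
                low_frac: float = 0.3, high_frac: float = 0.5) -> float:
    """Sum FFT amplitudes between 30% and 50% of the frequency range.
    With 64 samples at 1 kHz, the bins span 0-500 Hz, so this keeps
    roughly the 150-250 Hz band."""
    spectrum = np.abs(np.fft.rfft(samples))
    n = len(spectrum)
    lo, hi = int(n * low_frac), int(n * high_frac)
    return float(spectrum[lo:hi].sum())
```

One number per window per electrode: crude, but enough to tell different arm positions apart.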
Still very crude, but it allowed me to get distinctive signal patterns for various positions of my arm:
I was genuinely surprised that I got information of this fidelity and usefulness just from hooking up 4 ADCs to semi-random places on my forearm and applying a software bandpass filter. This was good enough to use as a basic input device!
I wondered, can I control a racing car game with this?
To test that, I built this program to read out the signals and convert certain ranges of values to keyboard presses of the keys Left and Right. The value ranges need to be calibrated before each use: I held my left arm like I'm grabbing an invisible steering wheel, moved it left and right, and looked hard at the signal values to find correlations like "signal A is always below X if and only if I steer left". Once the calibration was done, the invisible steering wheel turned into a magical keyboard with 2 keys =D
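The mapping itself is tiny; the hard part is finding the cutoff values by staring at the signals. A toy version of the threshold logic (thresholds and function name invented for illustration):

```python
def steering_keys(signal_a: float, signal_b: float,
                  left_below: float = 120.0,
                  right_above: float = 300.0) -> set:
    """Map two band-energy values to key presses, using hand-picked
    cutoffs: signal A below a threshold means steer Left, signal B
    above another threshold means steer Right."""
    keys = set()
    if signal_a < left_below:
        keys.add("Left")
    if signal_b > right_above:
        keys.add("Right")
    return keys
```

The returned key names would then be fed to a keyboard-simulation library; re-calibration means nothing more than picking new threshold values.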
Right away I tried it out to steer in my favorite racing game, F-Zero:
Note that in addition to the steering wheel, I used my other hand to accelerate.
I loved it, but there is still a lot of work to be done. The calibration is a pain, especially since it needs to be repeated if the electrodes move too much, which happens a lot with this kind of sleeve. Also I want more electrodes, better signal processing, and data transfer via Bluetooth so I can run it off a battery.
Baby Steps
2021-04-08, by Roman
The Arduino arrived. I have no electrodes though. But what are electrodes? Just some pieces of metal taped to your skin, right? Let's improvise that:
There are two pieces of aluminum foil taped to my skin, held together with blue medical wrap.
The educational material about electromyographs that I've seen described a chain of hardware elements to process and clean up the signal:
But I thought, let's focus on the MVP. Why not simply hook the electrodes straight to the analog input pins of the Arduino with some alligator clamps? It worked fine. I did do minimal signal processing in software though; you can find the source code here.
This video shows the myoelectric signal on Arduino IDE's built-in signal plotter:
The Idea
2021-04-03, by Roman
On this day, I got the idea and started researching EMG design and signal processing, the basics of motor neurology, and existing projects.
I settled on the Arduino Nano 33 BLE Sense, mainly for two reasons:
- More analog-to-digital converter inputs
- TensorFlow Lite support, which would allow me to leverage neural networks for signal processing. This is a bit of a stretch; can't wait to get disappointed by this :)
I wish there were a decent battery/UPS shield, but I couldn't find one so far.