PsyLink

Development Log

This is the development log of PsyLink, where I attempt to build an AI-powered myoelectric neural interface, on a budget. It predicts your intentions by scanning your muscle signals and essentially acts like a keyboard.

Why do any of this?

Neural interfaces are the future. I want them to be open and safe, in the user's control, not the other way around. In early 2021, the biggest players in this field required invasive surgery and/or had poor incentives. Some companies invited researchers to build applications for their devices, only to sell them out later to Facebook and Google, while pulling their hardware from the market, leaving their users high and dry.

So I thought, why not build one myself and open-source it? Can't be that hard, it's just bits'n'wires, right?

Table of contents

Intro

We don't have to drill a hole into the skull to get access to your nerve signals. Muscles naturally amplify them, allowing us to easily read them through the skin and use them for useful things like controlling a computer or an artificial limb.

This is not science fiction, check out what's out there already:

You could argue that we use our nerves/muscles to control keyboards already (and pretty much anything else). And for the time being, there are clearly superior human input devices. But there are reasons to do this:

This page documents my process of building one. Note that I'm no expert and I neither have a plan, nor do I know what I'm doing. I just thought, how hard can it be? If the architect of the Internet Exploder can build one, surely I can do it too.

2021-04-03

The Idea

On this day, I got the idea and started researching EMG design and signal processing, motor neurology basics, as well as existing projects.

Soon I realized that I would need a microcontroller to record and process the signals. I considered the Raspberry Pi Pico and the Arduino Nano 33 BLE Sense, and chose the Arduino because:

I wish there were a decent battery/UPS shield; I couldn't find one so far.

2021-04-08

Baby Steps

The Arduino arrived. I have no electrodes though. But what are electrodes, just some pieces of metal taped to your skin, right? Let's improvise that:

photo

There are two pieces of aluminum foil taped to my skin, held together with blue medical wrap.

The educational material about electromyographs that I've seen described a chain of hardware elements to process and clean up the signal:

But I thought, let's focus on the MVP. Why not simply hook the electrodes straight to the analog input pins of the Arduino with some alligator clamps? Worked fine. I did minimal signal processing in software though; you can find the source code here.

This video shows the myoelectric signal on Arduino IDE's built-in signal plotter:

2021-04-09

F-Zero

The look of the first device was way too unprofessional, so I pulled out my sewing machine and made a custom tailored sleeve from comfortable modal fabric.

photo

On the inside, I attached some recycled studs that served as electrodes. Who needs that expensive stuff they sell as electrodes when a piece of iron suffices?

photo

This time it had 4 electrodes. I targeted the middle and the distal end of two muscles, the Brachioradialis and the Extensor carpi radialis longus. I picked those muscles at random, because I honestly don't know what the fuck I am doing.

Software-wise, I played around with moving average and got reasonable signals, but it was clear that there was too much noise.
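For reference, the moving-average part is only a few lines. Here's a rough Python sketch of the idea (not the code I actually used, and the window length is an arbitrary choice):

```python
import numpy as np

def smooth(samples, window=16):
    """Rectify around the mean and apply a simple moving average."""
    x = np.asarray(samples, dtype=float)
    rectified = np.abs(x - x.mean())       # crude EMG "activity" envelope
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="valid")
```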

How to filter, though? I'm not going to solder some bandpass filter, that's too slow and inflexible. There are simple algorithms for doing it in software (link 1, link 2), but something seemed off about this method. In the end, I decided to learn how to do a Fourier transform on the Arduino.

With this code (inspired by this post), I took 64 samples at a sampling rate of 1kHz, performed the Fourier transform, cut out anything under 30% and over 50% of my frequency range, and then summed up the amplitudes of the remaining frequencies to generate the output.
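In Python terms, that filtering step boils down to something like this (a sketch of the logic only, not the actual Arduino code):

```python
import numpy as np

FS = 1000   # sampling rate in Hz
N = 64      # samples per FFT block

def band_energy(samples):
    """Sum the FFT magnitudes between 30% and 50% of the frequency range."""
    spectrum = np.abs(np.fft.rfft(samples, n=N))  # bins cover 0..FS/2 = 0..500Hz
    lo = int(0.3 * len(spectrum))
    hi = int(0.5 * len(spectrum))
    return spectrum[lo:hi].sum()
```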

Still very crude, but it allowed me to get distinctive signal patterns for various positions of my arm:

screenshot

I was genuinely surprised that I got information of this fidelity and usefulness from just hooking up 4 ADCs to semi-random places on my forearm and adding a software bandpass filter. This was good enough to use as a basic input device!

I wondered, can I control a racing car game with this?

To test that, I built this program to read out the signals and convert certain ranges of values into presses of the Left and Right keys. The value ranges need to be calibrated before each use: I held my left arm like I was grabbing an invisible steering wheel, moved it left and right, and looked hard at the signal values to find correlations like "signal A is always below X if and only if I steer left". Once the calibration was done, the invisible steering wheel turned into a magical keyboard with 2 keys =D
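The core of that program is just threshold checks mapped to key events. Here's a hypothetical Python sketch of the idea (the threshold values and the pynput library are stand-ins for illustration, not the actual code):

```python
from pynput.keyboard import Controller, Key

keyboard = Controller()
THRESHOLD_LEFT = 120    # calibrated by staring at the plotted values
THRESHOLD_RIGHT = 300   # calibrated by staring at the plotted values
pressed = set()

def update(signal_a):
    """Press/release Left and Right depending on which range the signal falls into."""
    wanted = {Key.left: signal_a < THRESHOLD_LEFT,
              Key.right: signal_a > THRESHOLD_RIGHT}
    for key, want in wanted.items():
        if want and key not in pressed:
            keyboard.press(key)
            pressed.add(key)
        elif not want and key in pressed:
            keyboard.release(key)
            pressed.discard(key)
```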

Right away I tried it out to steer in my favorite racing game, F-Zero:

Note that in addition to the steering wheel, I used my other hand to accelerate.

I loved it, but there is still a lot of work to be done. The calibration is a pain, especially since it needs to be repeated if the electrodes move too much, which happens a lot with this kind of sleeve. Also I want more electrodes, better signal processing, and data transfer via Bluetooth so I can run it off a battery.

2021-04-11

Adding some AI

Most neural interfaces I've seen so far require the human to learn how to use the machine: memorize unintuitive rules like "Contract muscle X to perform action Y", and so on. But why can't we just stick a bunch of artificial neurons on top of the human's biological neural network, and make the computer train them for us?

While we're at it, why not replace the entire signal processing code with a bunch more artificial neurons? Surely a NN can figure out how to do a bandpass filter and moving averages, and hopefully come up with something more advanced than that. The more I pretend that I know anything about signal processing, the worse this thing is going to get, so let's just leave it to the AI overlords.

The Arduino Part

The Arduino Nano 33 BLE Sense supports TensorFlow Lite, so I was eager to move the neural network prediction code onto the microcontroller, but that would slow down the development, so for now I just did it all on my laptop.

The Arduino code now just passes through the value of the analog pins to the serial port.

Calibrating with a neural network

For this, I built a simple user interface, mostly an empty window with a menu to select actions, and a key grabber. (source code)

The idea is to correlate hand/arm movements with keys that should be pressed when you perform those hand/arm movements. To train the AI to understand you, perform the following calibration steps (a rough sketch of what the "Train AI" step does follows the list):

  1. Put on the device and jack it into your laptop
  2. Start the Calibrator
  3. Select the action "Start/Resume Recording" to start gathering training data for the neural network
  4. Now for as long as you're comfortable (30 seconds worked for me), move your hand around a bit. Hold it in various neutral positions, as well as positions which should produce a certain action. Press the key on your laptop whenever you intend your hand movement to produce that key press. (e.g. wave to the left, and hold the left arrow key on the laptop at the same time) The better you do this, the better the neural network will understand wtf you want from it.
    • Holding two keys at the same time is theoretically supported, but I used Tkinter, which has an unreliable key grabbing mechanism. Better stick to single keys for now.
    • Tip: The electric signals change when you hold a position for a couple seconds. If you want the neural network to take this into account, hold the positions for a while during recording.
  5. Press Esc to stop recording
  6. Save the recordings, if desired
  7. Select the action "Train AI", and watch the console output. It will train it for 100 epochs by default. If you're not happy with the result yet, you can repeat this step until you are.
  8. Save the AI model, if desired
  9. Select the action "Activate AI". If everything worked out, the AI overlord will now try to recognize the input patterns with which you associated certain key presses, and press the keys for you. =D
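For the curious, the "Train AI" step is conceptually not much more than this (a heavily simplified Keras sketch; it assumes the recording has already been sliced into fixed-size signal windows with one-hot key labels, and the real Calibrator code differs in the details):

```python
import numpy as np
import tensorflow as tf

def train(windows, labels, epochs=100):
    """windows: (n, window_len, n_channels) array of signal chunks,
    labels:  (n, n_keys + 1) one-hot array; the extra class means 'no key pressed'."""
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=windows.shape[1:]),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(labels.shape[1], activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(windows, labels, epochs=epochs, validation_split=0.2)
    return model
```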

Results

I used this to walk left and right in 2020game.io and it worked pretty well. With zero manual signal processing and zero manual calibration! The mathemagical incantations just do it for me. This is awesome!

Some quick facts:

Video demo:

Still a lot of work to do, but I'm happy with the software for now. Will tweak the hardware next.

Now I'm wondering whether I'm just picking low hanging fruits here, or if non-invasive neural interfaces are really just that easy. How could CTRL-Labs sell their wristband to Facebook for $500,000,000-$1,000,000,000? Was it one of those scams where decision-makers were hypnotized by buzzwords and screamed "Shut up and take my money"? Or do they really have some secret sauce that sets them apart? Well, I'll keep tinkering. Just imagine what this is going to look like a few posts down the line!

2021-04-13

Cyber Gauntlet +1

So if you ever worked with electromyography, this will come as no surprise to you, but OMG, my signal got so much better once I added a ground electrode and connected it to the ground pin of the Arduino. I tried using a ground electrode before, but connected it to AREF instead of GND, which had no effect, so I prioritized other branches of Pareto improvement.

I am once again confused and surprised that I got ANY useful results before.

For prototype #3, I moved the electrodes further down towards the wrist in the hope that I'd be able to track individual finger movements. It had 17 electrodes: 2x8 going around the wrist, as well as a ground electrode at the lower palm. Only 9 of the 17 electrodes were connected, 8 directly to the ADC pins, and one to 1.65V, which I created through a voltage divider using two 560kΩ resistors between the 3.3V and GND pins of the Arduino, so that the electrode signals would nicely oscillate around the middle of the input voltage range.

It all started out like a piece of goth armwear:

photo

Photo from the testing period:

photo

Soldering wires to the electrodes:

photo

The "opened" state shows the components of the device:

photo

But it can be covered by wrapping around a layer of cloth, turning it into an inconspicuous fingerless glove:

photo

If you look hard at this picture, you can see the LED of the Arduino glowing through the fabric, the voltage divider to the right of it, appearing like a line pressing through the fabric, the ground electrode on the lower right edge of my palm, and the food crumbs on my laptop :)

The signal seems to be much better, and as I move my arm and hand around, I can see distinct patterns using the Arduino IDE signal plotter, but for some reason the neural network doesn't seem to process it as well. Will need some tinkering. I hope it was not a mistake to leave out the electrodes at the upper forearm.

I already ordered parts for the next prototype. If all goes well, it's going to have 33 'trodes using analog multiplexers. The electrodes will be more professional & comfortable as well. Can't wait!

2021-04-14

Data Cleaning

The Arduino code now produces samples at a consistent 1kHz. I also moved the serial read operations of the calibrator software into a separate thread, so that heavy load no longer slows them down, fills up the buffer, and desynchronizes the labeling. I am once again confused and surprised that I got ANY useful results before.

I disconnected analog input pin 7 from any electrode, and used it as a baseline for the other analog reads. By subtracting pin 7 from every other pin, the noise that all reads had in common was cancelled out. Hope this doesn't do more harm than good.
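A rough Python sketch of both changes, the reader thread and the baseline subtraction (the port name and the one-line-per-sample format are assumptions for illustration, not the actual protocol):

```python
import threading, queue
import numpy as np
import serial  # pyserial

samples = queue.Queue()

def read_serial(port="/dev/ttyACM0", baud=115200):
    """Runs in its own thread so a busy UI/training loop can't stall the serial reads."""
    with serial.Serial(port, baud) as ser:
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            try:
                values = np.array([int(v) for v in line.split(",")])
            except ValueError:
                continue  # skip malformed lines
            # pin 7 has no electrode attached; subtracting it cancels common-mode noise
            samples.put(values[:7] - values[7])

threading.Thread(target=read_serial, daemon=True).start()
```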

I also connected the ground line to one of the wrist electrodes rather than to the palm, since the palm electrode tended to move around a bit, rendering all the other signals unstable.

And did you know that the signals look much cleaner when you unplug the laptop from the power grid? :p

screenshot

I'll finish with a video of me trying to play the frustrating one-button jumping game Sienna by flipping my wrist. This doesn't go so well, but maybe this game isn't the best benchmark :D My short-term goal is to finish level 1 of this game with my device.

2021-04-15

Multiplexers

The analog multiplexers (5x DG409DJZ) and other stuff arrived! I almost bought a digital multiplexer, because I didn't know there were various types... But I think that these will work for my use case. The raw signal that I get out of it looks a little different, but when I filter out the low & high frequencies with TestMultiplexer2.ino, the direct signal and the one that goes through the multiplexer look almost identical =D

2021-04-19

Amplifiers

I have the feeling that before building the next prototype, I should figure out some way of enhancing the signal in hardware before passing it to the microcontroller. It's fun to hook the 'trodes straight to the ADC and still get results, but I don't think the results are optimal. So these days I'm mostly researching and tinkering with OpAmps.

2021-04-24

First Amplifier Circuit

I had my head stuck in electronics lectures, datasheets, and a breadboard to figure out a decent analog circuit for amplifying the signal. It sounds so straightforward: just plug the wires into the + and - pins of an operational amplifier, add a few resistors to set the gain of the OpAmp, and feed the output to the analog input pin of the Arduino... But reality is messy, and it didn't quite work out like that.

Here's a list of problems:

I also connected the electrode signal to ground with a 1MΩ resistor which greatly improved the signal, and I have no idea why.

One peculiar thing I noticed was that the signal seemed stronger when my laptop was connected to the power supply. It superimposed noise, but also seemed to increase differences in electrode voltages. I don't quite understand this yet, but 2 things follow from that:

Some of the references I used:

The resulting circuit:

Circuit schematic

And the signals look like:

Signal image 1 Signal image 2

Yellow and green are two electrodes, right after their respective OpAmp, and purple is (yellow-green)*20.

This should be good enough to move forward, but I bought some INA128 instrumentation amplifiers and perhaps I will tinker some more to get an even better signal. Can't wait for the next prototype though :).

In other news, I watched Dr. Gregory House explain forearm muscles, so next time my electrode placement will be better than random!

And since I learned KiCad for creating the above schematic, I thought I'd add schematics for the previous models as well, see circuits.

2021-04-28

Going Wireless

I've been battling with reducing the power line noise for too long, so I thought screw it, let's go off the power line entirely. I put the circuit on two 3V CR2032 coin cells and wrote some code to transmit the signals via BLE (Bluetooth Low Energy) using the ArduinoBLE library.

Since I can no longer plot the signals via the Arduino IDE plotter, I switched to GNURadio and wrote a plugin that establishes the BLE connection and acts as a signal source in the GNURadio Companion software.

My new "electrodes" also arrived: Simple prong snap buttons. They don't have sharp edges like the pyramidal studs I used before, and allow me to easily remove the wires from the electrodes and plug them in somewhere else as needed.

photo of the breadboard

I also employed INA128 instrumentation amplifiers, drastically reducing the complexity of the circuit. It's a tiny SMD chip, which I plan to embed in hot glue, along with the 3-4 capacitors and 3-5 resistors required for processing/de-noising, and place 8 of these processing units across the glove/wristband, connected to two electrodes each.

Circuit schematic

Now I'm battling the problem that I can only get about 1kB/s across the ether. How am I supposed to put 12kB/s worth of signal in there? (8 channels, 1k samples/s, 12 bit per sample) Let's see if I can find some nice compression method, but I fear that it's going to be lossy. :-/

2021-04-29

Soldering the Processing Units

The plan was to split the circuit into:

Here's my attempt at soldering one of those units:

photo

This took me over an hour, during which I began questioning various life choices, started doubting this whole project, poured myself a Manhattan cocktail, wondered how long it would take to complete all eight of these, whether it will even be robust enough to withstand regular usage of the device (NO, IT WON'T), and how I'm going to fix the inevitable broken solder joints when the entire thing is in fucking hot glue...

I gave up, and now my plan is to get PCBs for this instead. I have little experience with this, so I've been putting it off, but how hard can it be?

First draft:

photo

Updated schematic:

photo

I removed the decoupling capacitor between ground and GNDS (signal ground) by the REF pin of the INA128 because, mysteriously, it made the signal worse, not better. I also removed the 1K resistors between electrodes 1+2 and their respective capacitors, because they served no apparent purpose.

Also, I was frustrated that GNURadio doesn't let you get a "rolling" view of a signal. The plot widget buffers as many samples as it can show, and only once the buffer is full does it update the graph, clear the buffer, and wait again. I wanted instant updates as soon as new samples come in, so as a quick&dirty workaround I wrote a GNURadio shift block which keeps filling up the buffer of the plotting widgets.
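For reference, here's a minimal sketch of the trick using GNURadio's Python block API (this is not the actual PsyLink shift block; the gr.interp_block approach and the window size are just one way to do it): for every incoming sample, emit the whole rolling window, so the downstream plot sink's buffer fills up immediately.

```python
from collections import deque
import numpy as np
from gnuradio import gr

class rolling_view(gr.interp_block):
    """For each input sample, output the last `window` samples."""
    def __init__(self, window=512):
        gr.interp_block.__init__(self, name="rolling_view",
                                 in_sig=[np.float32], out_sig=[np.float32],
                                 interp=window)
        self.window = window
        self.history = deque([0.0] * window, maxlen=window)

    def work(self, input_items, output_items):
        out = output_items[0]
        for i, sample in enumerate(input_items[0]):
            self.history.append(float(sample))
            out[i * self.window:(i + 1) * self.window] = \
                np.asarray(self.history, dtype=np.float32)
        return len(out)
```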

I'll finish with a nice picture of a finger snap, as recorded with one electrode pair on my dorsal wrist. Click to enlarge and view the frequency domain as well. (Just one electrode pair because that's all I can squeeze out of the poor Bluetooth Low Energy bandwidth so far.)

screenshot of EMG of a finger snap

2021-04-30

PCB Time

Today I made a new version of the PCB that processes the signals from one electrode pair:

pcb picture

Actually, several versions. This is the 4th iteration, and let's not even look at the previous ones because they were just plain wrong. I stared at this design for a long time though and couldn't find another problem, so I went ahead and ordered 30 pieces of it. Can't wait to find out in what way I messed up :'D And hey, maybe it'll actually work.

Main features:

To avoid having a kilogram of cables on the device, this board supports wiring in a mesh network topology, where the boards share the power lines amongst each other using the redundant power line connector ports. One board can power two other boards, which in turn can power 4, and so on.

The bypass capacitor between ground and V+ will hopefully keep the voltage stable, though I'm a bit worried about the reference signal. If necessary, I can "abuse" the reference signal pin of the power line connector ports to add extra ground electrodes. I considered adding an extra opamp on every board to generate a fresh reference voltage but that would make the circuit too big for my taste.

2021-05-04

Higher Bandwidth, new UI

Hah, I managed to raise the Bluetooth bandwidth from ~1kB/s to 6-7kB/s with this one magic line (the parameters are in units of 1.25ms, so 8 means a 10ms connection interval):

BLE.setConnectionInterval(8, 8);

It raises the power consumption by 4% (3.5mW), but that's totally worth it. I can now get all 8 channels in 8-bit resolution at 500Hz across the aether. Eventually I should aim for 10-bit at 1kHz, but I think that can wait.

signals gnuradio flowgraph

This is the GNURadio flowgraph and the resulting output. (I only have hardware for 2 electrode pairs, so even-numbered and odd-numbered signals are wired to the same input. Still waiting for the PCBs.)

Power ratings:

Surprisingly to me, the LEDs were draining a good chunk of the power, and I saved 16mW by removing the external power LED (see previous photo) and by PWM-dimming the blue LED that indicated Bluetooth connections. It gives me approximately 15 hours run time with 2x CR2032 coin cells.

Also I'm in the process of rewriting the UI:

MyocularUI screenshot

The colorful column graph is a live visualization of the signal. The columns correspond to electrode pairs, while the rows are time frames. The top row shows the amplitude of the signal at the current time, and the rows flow downward, allowing you to view changes back in time, as well as correlations between signals.
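Under the hood, that graph is just a small 2D buffer that shifts down by one row whenever a fresh set of amplitudes arrives, roughly like this (a sketch of the data structure only; how the real widget renders it is a separate matter):

```python
import numpy as np

class RollingAmplitudeView:
    def __init__(self, channels=8, rows=32):
        self.grid = np.zeros((rows, channels))  # rows = time, columns = electrode pairs

    def push(self, amplitudes):
        """Shift history down one row and put the newest amplitudes on top."""
        self.grid = np.roll(self.grid, 1, axis=0)
        self.grid[0] = amplitudes
        return self.grid
```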

You'll also be able to change settings on the fly, view the status of e.g. key recordings or machine learning processes, and more. All of this is in a modular library that will also be usable from e.g. GNURadio.

I was thinking of changing the graphical user interface toolkit from Tkinter to a more modern one, because Tkinter looks a little shabby and has problems determining which keys are currently pressed, but I decided against it: I've had the experience of being unable to run my own software several years after writing it, because setting up the exact version of the GUI toolkit, along with all its dependencies, was too annoying. Tkinter has been around for decades and will probably stay, so I'll stick with it for now. Also, I can easily solve the key pressing issue with an external key tracking library like pynput.
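The pynput side of that is pleasantly short. A minimal sketch, assuming all we need is a live set of currently pressed keys:

```python
from pynput import keyboard

pressed_keys = set()

def on_press(key):
    pressed_keys.add(key)

def on_release(key):
    pressed_keys.discard(key)

# The listener runs in its own thread, so Tkinter's main loop stays responsive.
listener = keyboard.Listener(on_press=on_press, on_release=on_release)
listener.start()
```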

Can't wait to try out the new UI with 8 individual electrode pairs, once the PCBs arrive! (assuming they work :'D)

2021-05-06

Finished new UI

The new user interface now supports all previous features!

MyocularUI screenshot

It's sooo much more pleasant to have a direct view on the state of the application and an instant visualization of the signals. The previous version was literally just a blank window, with a single menu called "File" that contained all the actions. :D I never even bothered to upload a screenshot, but here's one for documentation purposes:

screenshot of old 'Calibrator' tool

Also, this time I used clean & efficient data structures to make the code easier to work with, a more reliable key capturing library (pynput), and threads to prevent one activity from blocking the others. The signals obviously go via Bluetooth instead of a wired serial connection.

I'm also thinking of changing the name for the project, since people are reading it as "my ocular" rather than recognizing the neologism made of "myo" (for "muscle") and "ocular" (from "eye"). But all the good names are taken, of course. -_-

2021-05-07

New Name

After some brainstorming, I changed the working title of this project from Myocular to ✨PsyLink✨. The close second favorite was FreeMayo (thanks to Vifon for the suggestion). Free as in free speech/software/hardware, and mayo as a play on myo (ancient Greek for "muscle"). But somehow I liked PsyLink more. It's inspired by the Psionic Abilities from the 1999 game System Shock 2.

FYI, this is the list of words that I considered, although unfortunately many of the coolest combinations were taken:

2021-05-09

Power Supply Module

I made an updated schematic (circuit 6) that shows more clearly how the modules are connected. Also corrected an error with the feedback of the voltage follower, and changed values of some resistors/capacitors:

Schematic image

I also constructed the power supply module:

Photo of power module #1

but for some reason it didn't work. All the parts seemed to be connected the right way and I couldn't find a short circuit, but the output voltage was ~0.5V instead of ~5V, and the reference voltage was just 0. I blame a possibly broken opamp.

Well, I didn't like the design and length of the circuit board anyway, so it didn't hurt trashing the thing and building this beauty instead:

Photo of power module #2

I'll use female-to-female jumper wires to connect V+ and GND to the Arduino, and 3 more wires to bootstrap the power supply of the mesh network of the signal processing modules.

Notes:

I wonder if someone from Shenzhen will read this, shake their head, and view me as a primate struggling to make fire with sticks. That's what it felt like to construct this thing anyway. Nevertheless, I'm one step closer to the next prototype :)

2021-05-14

Wireless Prototype

Hell yeah! The PCBs arrived:

pcb photo

Soldering & Sewing

I had never soldered such tiny SMD parts before and didn't have proper tools: my solder was way too thick, and so was the tip of my soldering iron. I was also too impatient to order better ones, so after hours of torture, I produced this batch:

photo of 8 soldered pcbs for signal processing

The new prototype was to be a forearm sleeve of modal fabric once again, with snap buttons for electrodes which will also hold the signal processing PCBs in place.

But how to attach the Arduino and the power supply module to the sleeve? I thought, "why not Velcro?" (hook-and-loop fastener) and started sewing it to the circuit boards:

photo of me sewing velcro to the power supply module

(Yes, doing it felt as weird as it looks)

So I sewed the sleeve, assembled one electrode pair along with its processing PCB, and wired everything together. Here's me being overly excited about the first wireless test run:

Electrode Placement

Then there was the question of where to put the electrodes. Using an improvised muscle map along with two flexible electrodes on individual straps, I could find spots whose electrical activity correlated with turning the arm, twisting the wrist, or pressing individual fingers onto the table:

photo of me mapping my forearm

The Flexor Digitorum Superficialis was particularly interesting; I found 3 areas over that muscle which map to the index, middle and ring finger, respectively. For turning the arm and wrist, the muscles with "Carpi" in their name (e.g. Extensor carpi ulnaris) worked pretty well. A huge disappointment was the Extensor Digitorum, which is supposed to be active when the fingers move up, but I could not find such a correlation. Then again, I use snap buttons for electrodes, so I'm not that surprised.

The final layout of the electrodes:

electrode map

This piece is fully separable from the electronics and therefore machine washable.

Here are additional pictures of the inner side, the separated electronics, as well as everything combined. This nicely shows the tree topology of the green signal processing modules, which pass the power supply through to each other to reduce the amount of wiring.

inside electronics only everything combined

I could have had 8 electrode pairs, but only added electrodes for 7. In these pictures, the electrode pair for the middle finger is also missing its circuitry. That's mostly a testament to my laziness.

Actually, I regret where I placed the Arduino, since it covers some spots that would have been great for electrodes, but I noticed that too late. Will try to remove the Velcro and maybe add an 8th electrode pair there.

The final cyb3rware:

photo of the final product

While the signal was quite strong with the test straps, I found that the amplitude of the signal went way down once I had everything attached to the sleeve. Maybe there was some kind of interference from the Arduino or the power supply being closer to my skin, or maybe the modal fabric messes with the signal somehow. I hope I can compensate for this by increasing the signal amplification multiplier, but I leave that for later.

This issue occurred with a single electrode pair already, but was aggravated when attaching more of them. It might help if I add some flux capacitors to the power supplies to prevent cross-interference.

Test Drive

I drove F-Zero with Prototype #2 before, but back then I cheated a little bit. It only recognized 2 keys, left and right, and I accelerated with the keyboard using my other hand.

This time I hoped I could do better, and trained the AI to recognize 3 different keys (left, right, accelerate) from my muscle signals. It even kinda worked!

This was after recording ~2000 muscle signal samples over 1-2 minutes and training a convolutional neural network for 25-50 epochs (<1 minute) on the data using the PsyLink UI. I used 4 electrode pairs, all of which are on the dorsal side of the forearm.

Analysis

In the racing game, I didn't make it to the finish line yet, and it does look pretty clumsy, but I blame that on the software still having some obvious flaws. It doesn't even account for packet loss or packet duplication when handling the Bluetooth packets yet. Hope it will go better once I've fixed them. Also, the test drive was with only 4 electrode pairs.

The raw values as visualized with the GNURadio flowgraph while randomly moving my forearm/wrist/hand show that the correlations between the signals are low enough to be theoretically useful:

graphs of signals

If you enlarge this image, you'll see that especially the black line is considerably different, which I suppose is because it's the only electrode pair that spans several muscles. And that makes me wonder: Am I doing too much pre-processing in hardware before I feed the data into the AI? Sure, the differential amplification of this new prototype enhances small signals that the previous prototypes might not have detected, but a lot of information is lost too, like the voltage differences between electrodes from different electrode pairs.

Maybe I can compensate for this by simply adding some more electrode pairs that span muscles. I'm also thinking of switching to a design with 32-64 randomly placed electrodes -> buffer amplifiers -> multiplexers -> analog to digital converters of the Arduino. That way, the neural network can decide for itself which voltage differences it wants to look at.
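For what it's worth, the "how correlated are the channels?" question can also be checked numerically instead of by eyeballing plots. A quick sketch, assuming the recorded samples sit in an (n_samples, n_channels) array:

```python
import numpy as np

def channel_correlations(samples):
    """Pearson correlation matrix between electrode pairs (channels as rows)."""
    return np.corrcoef(samples.T)

# Entries near +/-1 mean a channel adds little new information;
# entries near 0 mean it is (linearly) independent of the others.
```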

New PCB layout

While soldering the PCB, I found some flaws and made these changes to the previous PCB:

psylink6 PCB

2021-05-17

Gyroscope + Accelerometer

I fixed up the PsyLink UI. It was so broken after the rewrite to Bluetooth Low Energy that I'm once again stunned that I got ANY useful results before. But now it receives the transmissions from the Arduino properly.

PsyLink UI screenshot

I also added 6 more signal channels: The x/y/z-axes from the Gyroscope and from the Accelerometer that are built in to the Arduino Nano 33 BLE Sense.

All put together finally allowed me to singlehandedly drive through the finish line of my favorite racing game F-Zero! \o/

2021-05-29

Dedicated Website

The PsyLink project now has its own website: psylink.me, and this is where I will continue the development log, as soon as I finish the basic structure of the website.

2021-05-31

Website is Ready

The website is more fleshed out now, with a nice black/green design, neurons in the background, and a logo that is based on the logo of the fictional TriOptimum Corporation from the System Shock game series.

Videos are now hosted on a PeerTube Channel, allowing me to upload higher quality videos in the future while keeping the git repository of the website small.

I also catalogued individual components (circuit schematics, circuit boards, textiles, software) that resulted from this project so far, and documented how they all fit together in the prototype overview. Each component has an individual ID now, that I can easily write, print or sew on the hardware so I don't mix everything up. For example, prototype 4 has the ID "p4" and can be reached directly via https://psylink.me/p4, while the signal processing board of p4 has the ID "b1" and can be reached via https://psylink.me/b1.

Here's a screenshot, for a future time when the design has changed:

screenshot of the new website

2021-06-04

Back to the Roots

While uploading the old videos to the new PeerTube channel, I viewed the first video once again, which shows a pretty good signal from just two pieces of aluminum foil taped to the skin. And I wondered, why do I even bother with such a complicated set-up like in Prototype 4? It was really annoying to assemble, and the device is clunky and fragile.

Let's go back to the roots and build something more simple. Plenty of reasons:

Circuit 7

Circuit 7 shows a simplification of the signal processing module to a simple non-inverting amplifier per electrode with a gain of 221x. There's also a 560kΩ bias resistor towards Vref so the voltage we measure isn't too far off center. In Circuit 6 I had used 1MΩ, but here it produced mysterious oscillations, and going down to 560kΩ mysteriously fixed it.

This circuit also features a rechargeable 1.2V AAA battery with a TPS61220 step-up converter boosting the voltage to 5V, because I don't hate nature, and I burned through enough CR2032 coin cells. Coin cells also aren't exactly optimized for currents of 20mA, and thus get drained too quickly.

circuit 7

I measured a signal while pressing down a finger onto the table with two electrodes along the Flexor Digitorum Superficialis. Blue is electrode 1, red is electrode 2, and green is an amplified difference:

signals

Circuit 8

But let's cut even more out of this circuit. Here's one that is designed to be a shield to the Arduino Nano 33 BLE Sense, containing just the power supply, and an array of pass-through pins:

circuit 8

The signal that I'm getting is weaker, but certainly usable: (again, I measured a signal while pressing down a finger onto the table with two electrodes along the Flexor Digitorum Superficialis. Blue is electrode 1, red is electrode 2, and green is an amplified difference)

signals

PCB

I also built a PCB that implements this power supply/pass-through shield, and I figured, even if the device ends up not very useful, I'll still be able to use this for experiments later on, thanks to the pass-through pins.

bp2

I just hope that the PCB/circuit will work at all. I still haven't figured out how to simulate it, and I don't really know the best practices for PCB design. The PCB footprint for the AAA battery clips (Keystone 82) is my first custom-made PCB footprint too. Hope it all works out.

Next Prototype

It will be a relatively small forearm band with 8 electrodes (+ 1 ground electrode), which I plan to place around the Flexor Digitorum Superficialis for detecting what individual fingers are doing. The information from the gyroscope + accelerometer should cover the rest.

2021-06-10

Believe The Datasheet

Today the order of Power Module 2 arrived!

And with relief I saw that the battery clips fit nicely onto the board, as does the Arduino (with pin strips), and all the other components. Just that simple thing already felt like an accomplishment at my level of expertise with PCB design ;)

After some dreadful time trying to solder on the tiny 1x2mm-sized chip at U1 (I need a microscope for this shit), I had the SMD parts assembled and the circuit was ready for a test drive:

photo of assembled p5

But something was weird. The output voltage was a meager 1.5V, not the expected 5V, even though everything was connected properly. After hours of debugging I flipped over the table and just soldered a fresh board, this time without the Vref-generating OpAmp (U2+R3+R4+C3). But no luck, still just 1.5V. This was lower than the minimal output voltage of the voltage booster, so the chip didn't even finish its start-up phase. How could that be, if there's not even any load on the output voltage?

I desperately tried several different things. One was doubling the inductance at L1 from 4.7µH to 9.4µH by using two SMD inductances in series:

photo

Unfortunately I think I broke the coils while constructing this, since they didn't let any current go through. But I found a regular, big inductance coil with 10µH, manually held its pins down onto the SMD pads, and indeed, the voltage jumped up to 5V!

So I soldered it... onto... the SMD pads.

photo

(Probably you can reconstruct my entire room from all the reflections in this image, along with a biometric picture of my face and at least 3 of my fingerprints...)

But once I connected the OpAmp, the voltage went back down to 2.5V, and with the Arduino connected, it went down further to 2.4V. Adding a second coil in series for a total of 20µH didn't compensate for this, but made it even worse, bringing it all the way down to 1.5V.

Well, clearly whatever is wrong with this construction has something to do with the inductance, and it's not purely the amount of inductance... Which brings us to the title of this post:

The mistake (probably)

Of course the data sheet of the voltage booster CLEARLY STATED that the inductance coil needs to be AS CLOSE AS PHYSICALLY POSSIBLE to the chip. The capacitors C1 and C2 too, by the way. And I even read that. But I thought, what could possibly go wrong if I move it ~1cm away to make some space for the battery? Nothing, right? Well, awesome, I guess it's time for another revision :)

At least the 2.4V were enough to power the Arduino, although it was visibly struggling. I could establish a Bluetooth connection and collect some signals, but the Bluetooth packets were coming in extremely slowly (though still faster than mobile internet in 80% of Germany.)

Pictures

Front:

Front

Somehow the oversized inductance coil adds a nice vibe to it.

Front, with battery:

Side

Back, with attached Arduino:

Back

This picture shows a pin strip socket that will be gone in the final version, where the Arduino will be soldered onto the board, reducing the height from 3cm to 2cm.

Side, with Arduino and battery:

Side

(Yes, the pin strip socket is too long by one pin ;))

2021-06-16

Power Module 3

Since Power Module 2 has the wrong PCB layout for the step-up converter, I built Power Module 3 to fix this issue.

It was still a worthwhile learning experience to build Power Module 2, resulting in the following changes:

Here's the circuit:

circuit 9

And the new PCB:

PCB bp3 front side

PCB bp3 back side

If you're wondering why the sparky, fancy looking power line goes all the way from the power switch on the right through digital pin D10 and into L1 on the left side... Indeed that looks pretty awkward, but L1 is the noisiest component, and I wanted to keep it as far as possible from the analog pins at the bottom, without sacrificing the effectiveness of the boost converter layout. Given the size & time constraints, I didn't find a better solution.

What time constraints, you ask? Well, actually I made a different PCB layout first. Polished every detail, and when it was perfect (according to my crude appraisal), I ordered it. Was already excited about the delivery, started putting the board on the website, and so on. But at some point I noticed that something was wrong... The Arduino pins were inverted. Theoretically, everything would still work, but the Arduino would have to be plugged in from the back side, which is something I wanted to avoid to keep the board laying nice and flat on the forearm...

Thankfully the manufacturing process hadn't started yet, and I could update the board for free. So I started redesigning half of the board and finished just in time for the production to start :D

Let's hope it works this time.

2021-06-21

Running on AAA battery

Wow, it feels like ages since I started working on making PsyLink run on a rechargeable AAA battery. It sounds so simple and straight-forward, but it wasn't :). This ate 2.5 weeks of my time, but finally I succeeded!

photo, tilted perspective

Coming from software engineering, I find the iteration time of hardware prototypes horribly slow. There was a lot of waiting for package deliveries, a lot of time spent assembling, and a lot of wrestling with leaky abstractions. For example, it wasn't enough to just connect the 1.2V->5V boost converter like on the circuit diagram; I had to take special care of the distances between the parts, and the widths of the copper tracks connecting them.

Another problem you never face in software engineering: the package with my electronics parts was stolen, and when the new PCB of Power Module 3 arrived, I had to work with the few remaining (suboptimal) parts that I had.

What almost drove me insane was that I had only one fresh TPS61220 chip left. That's the 5V boost converter at position U1 (in the bottom of the red circle on the photo below), which is so small that I don't really have the tools to solder it on properly. I kept accidentally connecting the pins of the chip with solder. I gave up and started asking around friends for whether they could solder it on for me, when I remembered this soldering tip someone gave me: If you don't have soldering flux (which I didn't have), just use ✨margarine✨ instead ;D. It sounds very wrong, but it actually made a huge difference, and only thanks to the power of margarine I was able to keep the solder exactly where I needed it to be. "If it's stupid and it works, it's not stupid."

Then - for whatever reason - the voltage was going up to 1.5V instead of 5V, just like with the previous PCB layout. I thought I fixed that problem by optimizing the layout around the TPS61220 chip, but apparently that wasn't enough. I figured the 4.7µH inductance at L1 wasn't big enough, so I squeezed in a second coil:

photo, coil acrobatics

(It looks like two tardigrades playing ball :) I wish I made a better picture before I disassembled it again)

This alone didn't help, but when I manually held an additional 10µH inductance in parallel to L1 for 1 second while the device was running, it surprisingly kickstarted and reached 5V!... until I drained some current, which made the voltage collapse back to 1.5V immediately. Curious behavior. :D

Then I tried soldering on the big 10µH coil directly onto the SMD pads, and the voltage reached 5V and stayed at 5V =)

photo, front

No idea why the big 10µH coil worked while the two smaller coils totaling 9.4µH didn't... I doubt that the 0.6µH difference in total inductance turned the tide; probably there's some factor I'm not aware of. I actually had ordered a 10µH SMD inductance coil in anticipation of this, but well, it got stolen... My only consolation is imagining the faces of the package thieves when they open the package, realize that it's just a couple of tiny SMD parts, and wonder WTF this shit is even good for.

P.S.: I found out that the reason why the coils didn't work was that they were not rated for the >200mA that's passing through them. Once I got a 4.7µH coil rated for 280mA, everything was fine.

More pictures:

photo, side photo, back

The lack of parts forced me to adjust the resistance/inductance/capacitance values of Circuit 9:

circuit 9.1

Thankfully I had planned for way too many capacitors on the PCB, just for some extra VROOOM, so the fact that I only had 10µF capacitors instead of the planned 100µF didn't matter too much.

I also changed the 1MΩ resistors in the reference voltage generator to 110kΩ because I read that smaller resistances in a voltage divider make the output voltage more stable, at the cost of more power use, but I think we can sacrifice some power for signal accuracy. No idea whether it's actually going to help though. I chose 110kΩ instead of the more common 100kΩ and 220kΩ because the boost converter already requires a 110kΩ resistor, and that way this prototype requires fewer different parts. But the exact values don't matter, as long as both resistances are identical.

Tomorrow I'll start working on the electrodes and wristband. :)

2021-06-24

Cyber Wristband of Telepathy +2 [UNIQUE ITEM]

Power Module 3 finally found a home: Sleeve 4. Looks a bit like a Pip-Boy from the Fallout Series :)

photo, closed

Under the hood there are 9 electrodes (1x ground, 8x signal):

photo, open

2021-07-06

New Frontpage + Logo

The front page now looks a little more "modern", and I changed the logo from

old logo

to

new logo

Once again, this was inspired by the System Shock 2 Trioptimum logo.

2021-07-17

Neurofeedback: Training in both directions

For now, the training of neural networks mostly happens on the AI side. The human makes arm gestures and presses keys on the keyboard. This provides the input (electrode and IMU signals) as well as the output/labels (keyboard actions) to the artificial neural network, which then learns the correlation between the two through supervised gradient descent.

Now I'm looking into how to train both the neural network of the AI and the nervous system of the user through neurofeedback, that is, by making the user more aware of their neural signals, which in turn allows them to fine-tune those signals.

My hope is that this will make up for the low quality of information that's available to the AI, due to noise, attenuation, and the low number/quality of electrodes. The user neither knows what signals the electrodes can access, nor how to willingly produce movements that create these signals. Some gestures work well, while others can't be detected at all, so the best bet is to use forceful gestures with maximal muscle activation. But if there was some sort of feedback to the user, like a visualization of the data that the neural network is extracting, the user could focus on the movements that work, and gradually lower the intensity, perhaps to the point where no actual movement is required anymore.

Of course there is already some feedback about the signals: The PsyLink UI shows the amplitude of each electrode in a rolling graph, and the GNURadio application shows detailed plots of the raw signals, both of which already help determining which movements will work for gestures and which will not. But the AI can of course combine, cross-correlate, filter, convolve and deconvolve the signals, which enables it to extract information that a human won't see in the raw signal data.

Ultimately, the goal is that the user learns to, on demand, fire off just enough neurons that PsyLink can pick up the signal and trigger the intended key press without any visible movement of the arm.

Approach

As described above, simply presenting the user with the raw electrode data is insufficient. A machine-learning approach will likely be optimal here, to overcome the preconceptions of a top-down designer. Since we already have an artificial neural network, why not use that one to generate the visualizations too?

In my current version of this idea (a rough code sketch follows at the end of this post):

  1. The user needs to invent some arbitrary gesture that should correspond to the action "Press key 'A'".
  2. The user is repeatedly asked to perform the gesture by the UI
    • At random intervals
    • For random durations
    • With 2-3 seconds of heads-up warning to account for reaction time
    • In between the gestures, the user should perform random other activities, but never do the gesture without being asked by the UI
  3. The AI is trained on the fly with
    • Electrode signals as input
    • A binary label of "Key 'A' pressed" vs. "Key 'A' not pressed" as output
    • Each data point is added randomly (80:20) to the training or validation dataset
    • After X seconds of collecting data, the AI is trained for Y epochs
  4. Every Z milliseconds, the AI is asked to predict the output from the current input, and the neural activations of the last non-output layer of the NN are presented to the user visually, along with the predicted output.
    • The visualization could be a heatmap or a scatterplot, for example
    • The visualization should cover a large dynamic range (both small changes and large changes to the values should be easily visible)
  5. Using the feedback, the user can tweak their gesture as desired, to e.g.
    • Minimize the movement required to trigger the key
    • Maximize the reliability with which the key press is predicted
  6. Over time, old data is dropped from the NN training to refine the visualization and to keep the training time short.

Once the user is ready, they can add a second action like "Press key 'B'" and so on.
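To make steps 3 and 4 a bit more concrete, here's a rough Keras sketch of how one could expose the activations of the last non-output layer to the user (the layer sizes, names and shapes are made up for illustration; none of this is implemented yet):

```python
import numpy as np
import tensorflow as tf

def build_model(window_len=128, n_channels=8):
    inputs = tf.keras.Input(shape=(window_len, n_channels))
    x = tf.keras.layers.Flatten()(inputs)
    x = tf.keras.layers.Dense(32, activation="relu")(x)
    hidden = tf.keras.layers.Dense(16, activation="relu", name="feedback")(x)
    outputs = tf.keras.layers.Dense(2, activation="softmax")(hidden)  # key pressed / not pressed
    return tf.keras.Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="categorical_crossentropy")

# A second model that returns both the prediction and the activations of the
# last non-output layer -- those activations are what the user gets to see.
feedback_model = tf.keras.Model(
    inputs=model.input,
    outputs=[model.output, model.get_layer("feedback").output],
)

def feedback_step(window):
    """Called every Z milliseconds: returns (prediction, activations for the heatmap)."""
    prediction, activations = feedback_model(window[np.newaxis, ...])
    return prediction.numpy()[0], activations.numpy()[0]
```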