Iteration 3 | Jack | HRV

Iteration 3 - HRV Relative to Breathing

To kick off the third iteration of this project, we spent Monday's class integrating the new auto-calibration code block into our existing Micro:bit design.

The Setup

The setup was simple enough, and didn't change much from the previous iteration. It consisted of:
  • A Micro:bit v2
  • A Micro:bit extension breakout board, housing the Micro:bit
  • A heart rate sensor, connected to the breakout board
  • Our MakeCode codebase

The Use of Local Display

An important topic discussed during the class was just how valuable a local display can be.
When building a project, especially one like this, life is considerably easier when you can gauge things with a quick glance at the Micro:bit's LED screen, rather than opening your browser, loading up MakeCode, logging in, selecting the right project, hooking everything up, and so on.
In this project, we use the basic 25-LED screen to:
  • Indicate that the Micro:bit is on and the project loaded onto it correctly,
  • Measure and give feedback on the quality of the user's finger placement on the heartbeat sensor,
  • Indicate peak/trough detection after auto-calibration, so the user can tell whether the calibration was successful.
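To make the last point concrete, here's a rough sketch (in Python, not our actual MakeCode blocks) of the kind of peak/trough detection the Micro:bit could be indicating. The thresholds and the neighbour-comparison rule are illustrative assumptions: a sample counts as a peak when it's above an upper threshold and higher than its neighbours, and a trough is the mirror case.

```python
def detect_peaks_troughs(samples, upper, lower):
    """Return (peak_indices, trough_indices) for a list of sensor readings."""
    peaks, troughs = [], []
    for i in range(1, len(samples) - 1):
        prev, cur, nxt = samples[i - 1], samples[i], samples[i + 1]
        if cur > upper and cur >= prev and cur > nxt:
            # local maximum above the upper threshold -> heartbeat peak
            peaks.append(i)
        elif cur < lower and cur <= prev and cur < nxt:
            # local minimum below the lower threshold -> trough
            troughs.append(i)
    return peaks, troughs

# Example: one pulse-like bump followed by a dip
readings = [600, 650, 720, 800, 760, 690, 620, 560, 540, 580, 640]
peaks, troughs = detect_peaks_troughs(readings, upper=700, lower=560)
```

On the real device, each detected peak/trough would flash an LED so the user can see at a glance that calibration worked.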

The Code

Let's start with the easy (but vitally important) bit: the breath pacer. It works by means of a forever loop that alternates the breath_pacer variable between 800 and 550, holding one value for 4000 milliseconds and the other for 6000 milliseconds. These values were chosen because they sit in a similar range to the HRV values, so the pacer can be overlaid directly on top of the HRV graph.
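The pacer logic above can be sketched as a pure function of elapsed time rather than a forever loop. The 800/550 values and the 4 s/6 s timing come from the post; the assumption that 800 corresponds to the 4-second inhale is mine.

```python
INHALE_MS = 4000   # 4 s inhale (per the breathing exercise)
EXHALE_MS = 6000   # 6 s exhale
HIGH, LOW = 800, 550  # chosen to sit in the same range as the HRV values

def breath_pacer(elapsed_ms):
    """Return the pacer value at a given time: HIGH while inhaling, LOW while exhaling."""
    t = elapsed_ms % (INHALE_MS + EXHALE_MS)  # position within the 10 s cycle
    return HIGH if t < INHALE_MS else LOW
```

Plotting this alongside the HRV produces the square-ish pacer line the user breathes along to.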


The serial values of the delta_t and breath_pacer variables are also output.
The next forever block reads the user's pulse value from the pulse sensor and plots it as a bar graph on the Micro:bit's local LED display, as discussed previously.
The harder the user presses down on the sensor, the higher the bar graph climbs, so the display can be used in real time to gauge the user's finger placement on the sensor.
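For illustration, here's roughly what that bar-graph scaling does, in the spirit of MakeCode's led.plotBarGraph: the raw analog reading (assumed here to be the micro:bit's 0-1023 ADC range) is mapped to a number of lit LEDs on the 25-LED screen.

```python
ADC_MAX = 1023   # micro:bit analog read range (assumption for this sketch)
NUM_LEDS = 25    # the 5x5 display

def bar_graph_leds(reading):
    """Map a raw sensor reading to how many of the 25 LEDs light up."""
    reading = max(0, min(reading, ADC_MAX))  # clamp out-of-range readings
    return round(reading * NUM_LEDS / ADC_MAX)
```

A weak connection gives a short bar; a firm, well-placed finger fills most of the screen.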



Findings & Analysis

With everything set up, the next job was to put the heart rate sensor on my finger and get some data!
It took a while (and many failed video attempts), but I eventually managed to get a good video, which can be viewed below.
In case the video can't be viewed, here's a description of what's happening.
  • I have the heart rate sensor pressed lightly against my index finger. We've found sensor placement to be quite important: it's very easy for the sensor to shift slightly, which can significantly throw off the readings. As such, I'm trying my hardest in the video to keep my finger as still as possible and the sensor firmly in place.
  • The LED screen on the Micro:bit is put to good use, with a bar graph indicating both my heart rate and the quality of my finger placement on the sensor. Once placement is adequate, I press the 'A' button on the Micro:bit, which calibrates the reading to my individual case.
  • There's a block in my code that takes this calibrated heart rate and converts it to HRV, which is then output in graph form to the device data output screen. In the video below, this is the blue line.
  • Simultaneously, a simple breathing pacer is being overlapped against this graph. I then begin inhaling/exhaling in a 4 second/6 second manner, in accordance with the pacer.
  • You can see how my HRV naturally peaks and troughs in a sine-wave-like manner, and as I continue the breathing exercises, this wave begins to move into phase with the pacer.
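A note on that heart-rate-to-HRV conversion: the delta_t variable mentioned earlier suggests the "HRV" being plotted is the inter-beat interval, i.e. the time between successive detected heartbeats. Under that assumption, the conversion is a simple difference over beat timestamps:

```python
def inter_beat_intervals(peak_times_ms):
    """Return the delta_t series: time in ms between consecutive heartbeats."""
    return [b - a for a, b in zip(peak_times_ms, peak_times_ms[1:])]

# Beats slowing down then speeding up produce the rising/falling blue line:
beats = [0, 700, 1450, 2250, 3000, 3680]
deltas = inter_beat_intervals(beats)  # [700, 750, 800, 750, 680]
```

This is a sketch of the idea, not the exact MakeCode block.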


Next Iteration Brainstorm

In the final segment of Monday's class, we huddled together and discussed where the future of the project lay. A handful of problems were outlined, and potential solutions suggested.

'In Phase' HRV

An important point discussed was that we're not looking merely to see the HRV value rise and fall before our eyes - that already happens naturally. What we want to see is how, over time, the HRV sine wave comes 'into phase', i.e. in sync, with our breathing.

Pulse Sensor Robustness

One of the biggest problems we all encountered this iteration related to the pulse sensor. Getting your finger to make the right connection on the sensor, and then keeping it there long enough for a good reading, is incredibly tedious. In fact, I sometimes found myself concentrating so hard on keeping my finger still that I forgot about my breathing exercises.
Improvements such as the local LED display gauging finger placement and the auto-calibration helped mitigate these problems, but even so the problem persisted.
We brainstormed the idea of some kind of finger mount for the sensor: something that reduces the chance of a poor connection and makes the reading more robust.

DFRobot, the company our sensors are from, showcases putting the sensor on one's wrist. It's very possible the image is just for show, but in theory it should work, and might provide a more robust reading than the finger.

I found a post on the DFRobot community forums from a user who decided to build their own heart-rate sensor; in the post they design and print a finger shell housing with a very cool LED screen on top.

They do, however, use a different sensor from ours: a rectangular 'heart-rate and oximeter' sensor that fits perfectly in the shell.

In their final iteration, they show their finger in the housing and a heart-rate value being returned.


Other things we discussed included reworking the current 'manual' auto-calibration to be fully automatic, and figuring out how to calculate the phase difference between the pacer and the HRV (the number of degrees the HRV is out of phase) to use as a future resonance measurement.
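One way that phase calculation could work (an illustration of the brainstormed idea, not anything we've implemented) is to find the lag that best aligns the two signals via a simple cross-correlation, then convert that lag into degrees of the 10-second breathing cycle:

```python
def phase_difference_deg(pacer, hrv, period_samples):
    """Lag (in samples) maximising correlation, expressed as degrees of one cycle."""
    best_lag, best_score = 0, float("-inf")
    n = len(pacer)
    for lag in range(period_samples):
        # Correlation between the pacer and the HRV shifted by `lag` samples
        score = sum(pacer[i] * hrv[(i + lag) % n] for i in range(n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag * 360 / period_samples
```

Zero degrees would mean the HRV wave is fully in phase with the pacer, which is exactly the resonance state we're after.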
Unfortunately, when I went to work on these ideas I couldn't get my HRV reading to work, which was incredibly frustrating. Looking back, perhaps this was because I had literally every port of my already underpowered laptop connected to something external, including a 24" monitor. I hope somebody else on the team was more successful with this than I was.

Output Sonification

The last discussion topic was how we could sonify the output. The idea of resonance frequency came up, which took me back to my secondary-school physics days, using tuning forks to find resonant frequencies.
I immediately thought of the almost melodic sound a tuning fork makes when struck. We could recreate that sound, or something similar, and use it as audio feedback.
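As a toy sketch of the idea: the phase error could be mapped onto a tone around a tuning-fork-like pitch, so being perfectly in phase plays the pure note and drifting out of phase detunes it. The 440 Hz anchor and the 1 Hz-per-degree mapping are entirely my own assumptions here.

```python
BASE_HZ = 440          # concert-pitch A, the classic tuning-fork note
DETUNE_HZ_PER_DEG = 1  # hypothetical: 1 Hz of detune per degree of phase error

def feedback_tone_hz(phase_error_deg):
    """Return the feedback tone frequency for a given phase error (0 = in phase)."""
    error = abs(phase_error_deg) % 360
    if error > 180:
        error = 360 - error  # phase error is symmetric: 270 deg out == 90 deg out
    return BASE_HZ + error * DETUNE_HZ_PER_DEG
```

Hearing the tone settle onto the pure pitch would tell the user they've reached resonance without them having to watch a graph.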



MQTT

In a future iteration we'll be taking what we've got and adding cloud functionality. We'll look to take the HRV value and beam it up to the cloud using the Beebotte MQTT server.
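As a first sketch of what that publish might look like: Beebotte's MQTT interface uses a "channel/resource" topic and a JSON payload carrying the reading under a "data" key, with "write": true asking Beebotte to persist it. The channel/resource names here ("hrv", "delta_t") are placeholders, and the actual network publish would go through an MQTT client such as paho-mqtt; this only builds the message.

```python
import json

def beebotte_message(channel, resource, value, persist=True):
    """Build the (topic, payload) pair for a Beebotte MQTT publish."""
    topic = f"{channel}/{resource}"
    payload = json.dumps({"data": value, "write": persist})
    return topic, payload

# e.g. publishing one HRV (inter-beat interval) reading of 742 ms
topic, payload = beebotte_message("hrv", "delta_t", 742)
```

From there it's one `client.publish(topic, payload)` call once the client is connected and authenticated with a channel token.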


