Lock-In

Introduction

For the remaining hours in the module, we had collectively decided to dedicate two full days (11AM - 5PM) to the IoT Applications module. Within these two days, the task was to create an Internet of Things-related technological artifact that would use the BBC Microbit platform for data acquisition. Building on the work completed by the previous year's cohort of IoT Apps students, the aim was to create a system that would visualise human heart rate. The previous year's group had opted for a more physical 'wave' visualisation: a bottle containing dyed blue water and transparent oil, tilted by a servo motor driven by heart rate data from the Microbit to mimic ocean-like waves. This year's approach aimed to use more contemporary web application technologies to visualise heart rate instead.

Background

In the class that had taken place before the lock-in, several technology demonstrations were showcased, each aiming to ingest analogue input data from the heart rate sensor (attached to a Microbit) and output the data to various services and endpoints.

Within that class, we discussed the speed at which heart rate could be sampled by the Microbit's heart rate sensor and displayed for the wider world to see on the internet. One vision shared by Jason was for the system to use a Fast Fourier Transform (FFT) graph to visualise several heartbeats. As an FFT chart is primarily used to visualise audio signals in music visualisers, I came up with the idea of first converting the heartbeat sensor data into audio and streaming it to a web application using the WebRTC protocol.
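As a sketch of the FFT idea: if heart rate samples arrive at a steady rate, a discrete Fourier transform of a window of samples should show a peak at the dominant beat frequency. The window length and the simulated 75 BPM signal below are illustrative assumptions, not values from our sensor:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: 4 s of samples at 50 Hz (roughly the rate our sensor reached)
fs = 50
t = np.arange(0, 4, 1 / fs)

# Simulate a 1.25 Hz heartbeat (75 BPM) as a sinusoid plus a little noise
signal = np.sin(2 * np.pi * 1.25 * t) + 0.1 * rng.standard_normal(len(t))

# Real-input FFT: magnitude of each frequency bin
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

peak_hz = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC (0 Hz) bin
print(f"Dominant frequency: {peak_hz:.2f} Hz (~{peak_hz * 60:.0f} BPM)")
```

With a 4-second window the frequency resolution is 0.25 Hz (15 BPM), which is why a live visualiser would want a longer, sliding window of samples.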


Reflection on Previous Iteration

One of the technology demonstrations from this class was a recovered version of the technical demo from the previous iteration, with a few tweaks. The previous demonstration had been a collaboration between Shane and me in which we used a React web application with a component that interfaced with an MQTT broker over WebSockets. A custom, locally run MQTT broker (Mosquitto) had been configured on a Raspberry Pi 4 to start on boot, listening on port 8080. A locally running React web application on the same Raspberry Pi subscribed to this broker over its internal, localhost network. A Microbit connected to the Raspberry Pi over USB sent JSON-formatted serial messages to the Pi, each containing a heart rate sample.
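A minimal sketch of the serial-ingest side of this setup, assuming each serial line carries one JSON object (the field name `heartrate` is a hypothetical example; our actual message schema may have differed):

```python
import json

def parse_sample(line: str):
    """Parse one JSON serial line from the gateway Microbit.

    Returns the heart rate sample as an int, or None for malformed
    or partially received lines (common when reading a live serial port).
    The 'heartrate' field name is an assumption for illustration.
    """
    try:
        msg = json.loads(line)
        return int(msg["heartrate"])
    except (json.JSONDecodeError, KeyError, TypeError, ValueError):
        return None

print(parse_sample('{"heartrate": 512}'))  # -> 512
print(parse_sample('{"heartra'))           # -> None (truncated line)
```

Silently skipping bad lines matters here: at tens of messages per second, a single torn line should not crash the reader loop that feeds the broker.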

To test how far we could push the limits of locally hosted MQTT, Shane had removed the delay within the sensor Microbit, which caused it to send roughly 50 samples per second.

The Microbit and Raspberry Pi had been configured to use the fastest available serial messaging speed (baud rate) of 115200 in order to test how low the latency could become with such an approach. A separate Microbit, not connected to the Pi but fitted with the heart rate sensor, was configured to send heart rate samples over radio to the 'gateway Microbit' attached to the Pi.
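Some back-of-the-envelope arithmetic shows why 115200 baud comfortably accommodates 50 samples per second; the ~30-byte message size below is an assumed figure for a short JSON line, not a measured one:

```python
# 115200 baud with standard 8N1 framing puts 10 bits on the wire per byte
baud = 115200
bytes_per_second = baud // 10          # 11,520 bytes/s of payload

# Assume each JSON serial line is roughly 30 bytes including the newline
message_size = 30
max_messages_per_second = bytes_per_second // message_size

print(max_messages_per_second)  # -> 384, well above the ~50 samples/s we sent
```

So the serial link itself was never the bottleneck; any latency had to come from the processing and publishing stages further along the pipeline.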

I had experimented briefly with opening up the Raspberry Pi to make certain ports available to the wider internet using Cloudflare Tunnels. While I was successful in configuring the web application hosted on the Raspberry Pi to be accessed from the internet via a domain name I had purchased roughly a year earlier (brokenstack.tech), I was unsuccessful in making the WebSockets MQTT connection publicly available over Cloudflare Tunnels. Another observation from this MQTT setup was the excessive CPU load the Raspberry Pi was under while simultaneously running the Python serial-read script, the Mosquitto MQTT broker and the hosting of the React application.

The result of this implementation had been a surprising success at first glance: the React web application displayed a gauge that reacted to the MQTT data, and its movement was extremely smooth. However, there was still one major issue. The live gauge was only usable from the browser of the Raspberry Pi itself; the webpage did load when accessed from an external network, but no data from the Microbit reached it.

Lock-in Iteration

After some discussion with other teams regarding different implementations of MQTT, we reached the conclusion that MQTT is not the optimal protocol for sending over 50 samples per second, as it was not designed for such high-throughput data streaming applications.

For this two-day lock-in, we agreed to build upon the React web application but replace the MQTT protocol with a new experimental approach: using the WebRTC protocol to synthesise an audio stream from the heart rate sensor samples on the Raspberry Pi, then stream this audio to a React application and a simple Unity client application (as some members of the group had a background in game development).

To ease some of the computational load on the Raspberry Pi, we decided to shift the backend layer away from the Pi and onto a Linux EC2 instance running on Amazon Web Services. This instance would run a backend service to relay the WebRTC audio stream from the Raspberry Pi and Microbit setup to clients, which could access the server from any location in the world.

I had unexpectedly taken on the role of a product manager when I pitched the idea in the class before the lock-in. In the first hour of the lock-in, we were tasked with splitting ourselves into teams to tackle various aspects of the technology stack. To make the most of each individual's skillset, I suggested splitting into four teams:

  1. Shane and me to revive the old iteration, port it onto Shane's more powerful Raspberry Pi 5 and adapt it to synthesise low-frequency audio based on heart rate sensor samples.
  2. Daniel, Jay, Alex and Dean to research WebRTC data transmission and the creation of cloud infrastructure, given Daniel's experience with WebRTC from his final year project and the other members' backgrounds specialising in Cloud Computing.
  3. Brendan and Mark to implement a simple heart visualiser in the Unity game engine, which would expand and contract based on audio, and to integrate it with a WebRTC stream.
  4. Arthur to continue the Microbit MQTT to Google Sheets implementation as a crude solution for storing the heartbeat samples in a spreadsheet-style database.

We had chosen to use a Kanban-style planning approach on the whiteboard to keep track of tasks.

The following diagram depicts the architecture which we had attempted to implement:

Results

Over the course of the lock-in we achieved many aspects of the proposed implementation; however, we were unable to complete some aspects due to time constraints.

The infrastructure team had great success with the deployment of the WebRTC signalling server and of the React application I had created earlier. The team was able to send images, audio and files from the signalling server to their local machines using the protocol.
Brendan and Mark's team succeeded in creating a Unity mobile application demo which visualised a beating heart image based on the modulation of an audio wave; however, there were difficulties integrating the WebRTC protocol, as the library made to handle WebRTC had been deprecated and was incompatible with the current version of Unity.

Shane and I successfully ported the previous iteration with the MQTT gauge to the Raspberry Pi, which we had working locally in the Pi's browser and from clients on the local network. Over the course of the second day of the lock-in, I teamed up with Daniel to focus on altering the Python script on the Raspberry Pi to synthesise an audio wave by modulating a sine wave with samples collected from the gateway Microbit. We were ultimately unsuccessful in creating a modulated sine wave, as we had channelled most of our effort into sending a basic audio wave to the signalling server on AWS using the WebRTC protocol. In the end, the Python script successfully synthesised a basic sine wave and transmitted it from the Raspberry Pi to the signalling server on AWS and on to the client over WebRTC. The following video shows this in action:

While it may not sound impressive, this demonstration is an excellent proof of concept that infrastructure and tooling created for streaming live audio can be adapted for Internet of Things applications.
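For reference, the basic (unmodulated) sine synthesis can be sketched as below. The sample rate, pitch and frame size are assumptions for illustration, and pushing the PCM frames into a WebRTC audio track is left to whichever WebRTC library is in use:

```python
import numpy as np

SAMPLE_RATE = 48000  # a common sample rate for WebRTC audio (assumed here)
TONE_HZ = 220        # assumed fixed pitch for the basic, unmodulated tone

def sine_frame(n_samples: int, start_phase: float = 0.0):
    """Generate one frame of 16-bit PCM sine audio at half full-scale.

    Returning the end phase lets consecutive frames join without clicks,
    which matters when frames are pushed into a live audio track.
    """
    phase = start_phase + 2 * np.pi * TONE_HZ * np.arange(n_samples) / SAMPLE_RATE
    pcm = (np.sin(phase) * 32767 * 0.5).astype(np.int16)
    end_phase = (start_phase + 2 * np.pi * TONE_HZ * n_samples / SAMPLE_RATE) % (2 * np.pi)
    return pcm, end_phase

frame, phase = sine_frame(960)  # 20 ms at 48 kHz, a typical WebRTC frame size
print(len(frame), frame.dtype)
```

Carrying the phase between frames is the one subtlety: restarting each frame at phase zero produces an audible click at every frame boundary.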
Had we been given more time, I am confident that we would have been able to successfully demonstrate the architecture I had proposed, complete with Jason's vision of a live FFT chart to visualise heart rates.

Message to Future Students

For the 2026 group of IoT Applications students (if you do decide to build on top of this iteration), the implementation of sine wave modulation, varying some aspect of the sine wave based on the Microbit serial samples, remains the biggest piece left unsolved. Within this lock-in our group proved that this is a viable architecture, and I wholeheartedly (no pun intended) believe that you are capable of finishing what we started. Best of luck!  - Andrew Koval
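For anyone attempting that missing modulation step, one plausible approach is frequency modulation via phase accumulation: map each sensor sample to an instantaneous pitch and integrate the phase so successive chunks join without clicks. Everything here (sample rate, pitch range, sensor values normalised to 0..1) is an assumption for illustration, not part of our implementation:

```python
import numpy as np

SAMPLE_RATE = 8000  # illustrative; use your stream's actual audio rate

def modulated_sine(samples, base_hz=200.0, span_hz=100.0, samples_per_chunk=160):
    """Synthesise audio whose pitch tracks a stream of sensor samples.

    Each sensor sample (assumed normalised to 0..1) sets the pitch for one
    chunk of audio. Accumulating phase across chunks avoids the clicks you
    get by naively recomputing sin(2*pi*f*t) from t=0 for every chunk.
    """
    phase = 0.0
    out = []
    for s in samples:
        freq = base_hz + span_hz * float(s)    # sensor sample controls pitch
        step = 2 * np.pi * freq / SAMPLE_RATE  # phase increment per audio sample
        phases = phase + step * np.arange(1, samples_per_chunk + 1)
        out.append(np.sin(phases))
        phase = phases[-1] % (2 * np.pi)       # carry phase into the next chunk
    return np.concatenate(out)

audio = modulated_sine([0.0, 0.5, 1.0])  # pitch sweeps 200 -> 300 Hz
print(len(audio))  # -> 480 audio samples for 3 sensor samples
```

The same phase-accumulation idea works for amplitude modulation too; only the mapping from sensor sample to waveform parameter changes.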











