Blog 6 - The Lock In

IoT Apps - Heart Monitor Project & Real-Time Audio Streaming

Introduction

This final project for the IoT Apps module brought together everything we have learned so far. Building on concepts from throughout the semester, such as analog in/analog out and HRV (heart rate variability), we took on the challenge of developing a live heart monitor system.

The idea? Use a Microbit and a heart rate sensor to record heartbeats, stream that data in real time to the cloud, and visualise it both audibly and visually through a Unity app.

We tackled this across four structured sprints, dividing into teams and managing progress via a Kanban board.


Figure 1 - Kanban pt.1

Figure 2 - Kanban pt.2


Figure 3 - Backlog


Figure 4 - Tasks

Teams

We broke into the following teams in order to split up the work:

Unity - Mark and Brendan
WebRTC/FFmpeg - Daniel, Shane and Andrew
Microbit + Streaming - Arthur
Cloud Infrastructure - Alex, Jay and Dean

This was decided among us based on our college streams: the people doing game development were happy to use Unity, the cloud students took on AWS, and the IoT students took on WebRTC, as they had used it in some of their personal projects. The teams were not very strict, as we all ended up helping each other with different aspects of the project.

Design Diagrams



Figure 5 - Intended System Design

Figure 6 - End of Sprint System Design

Day 1

Sprint 1

We started off the first sprint with a meeting with Jason where we planned out how we wanted the app to work. We followed an agile methodology, breaking the work up into sprints, and designed a Kanban board to complement this. Teams were decided based on each person's strengths. The Kanban board proved extremely useful, as we could all see which tasks were currently being worked on and what their status was, such as whether one team had completed a task or whether we were blocked somewhere else.

Sprint 2

Starting sprint 2, everybody was assigned to a set team and knew what tasks they had to do. I was assigned to the cloud team, which involved setting up EC2 instances for both the signal server and the React front end we planned to create. This did not take the cloud team too long, and our remaining work was dependent on the other teams finishing their tasks so we could push the work up to the cloud.

As a result, I began working with Daniel on the WebRTC team. The WebRTC setup worked as follows:

  • Sender Script - A Python script Daniel would run to send data, such as an audio file, to whoever was running the receiver script.
  • Receiver Script - Another Python script, which would accept the incoming connection and handle the incoming data.
  • Signal Server - A JavaScript server that acted as a middleman, using WebSockets to allow communication between the sender and receiver scripts.
The code for the scripts can be found here: https://github.com/dlaw4608/IoT_Apps
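As a rough illustration (not our actual server, which was written in JavaScript and used real WebSockets), the signal server's job boils down to relaying signalling messages (SDP offers/answers, ICE candidates) between two registered peers without ever looking at the media itself:

```python
import json

class SignalRelay:
    """Toy model of the signal server's role: it only forwards
    signalling messages between registered peers; the media itself
    flows peer-to-peer once the connection is negotiated."""

    def __init__(self):
        self.peers = {}  # peer_id -> queue of pending messages

    def register(self, peer_id):
        self.peers[peer_id] = []

    def send(self, from_id, to_id, payload):
        # Wrap the payload so the recipient knows who sent it.
        if to_id not in self.peers:
            raise KeyError(f"unknown peer: {to_id}")
        self.peers[to_id].append(json.dumps({"from": from_id, "data": payload}))

    def poll(self, peer_id):
        # Drain the peer's queue (a WebSocket push in the real server).
        messages, self.peers[peer_id] = self.peers[peer_id], []
        return [json.loads(m) for m in messages]

# The sender posts an SDP offer; the receiver picks it up and answers.
relay = SignalRelay()
relay.register("sender")
relay.register("receiver")
relay.send("sender", "receiver", {"type": "offer", "sdp": "..."})
print(relay.poll("receiver"))
```

Everything here (class and method names included) is just a sketch of the pattern; the real scripts in the repo use aiortc-style peer connections and a WebSocket server.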

Using this infrastructure, we were able to host the signal server in the cloud on an EC2 instance. We then proceeded to test the sender and receiver scripts. While I had the receiver script running on my laptop, Daniel was able to transfer both image and audio files from the sender script using WebRTC. The next goal on the WebRTC side was to stream live audio, as this would be how our heart rate would be recorded.

On the Unity team, Brendan and Mark were able to create an app in which a heart grew bigger and smaller in reaction to a soundtrack being played. This would prove extremely useful, as once the live audio of our heartbeat was streamed in, the heart could react to it.
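The Unity side was written in C#, but the core idea — map the loudness of the current audio window to a scale factor for the heart sprite — can be sketched in a few lines (the function names, base size and gain here are made up for illustration):

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one window of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def heart_scale(samples, base=1.0, gain=0.5):
    """Map a window's loudness to a sprite scale: silence keeps the
    heart at its base size, a loud beat grows it up to base + gain."""
    return base + gain * min(rms(samples), 1.0)

# A quiet window barely grows the heart; a loud beat grows it noticeably.
quiet = [0.01 * math.sin(i / 5) for i in range(100)]
loud = [0.9 * math.sin(i / 5) for i in range(100)]
print(heart_scale(quiet), heart_scale(loud))
```

In the actual app this per-frame scale would be applied to the heart's transform each update, giving the pulsing effect in time with the audio.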




At the end of the day, we held a retrospective to see what we did right and wrong and whether anyone was blocked on a task. We also discussed the plan for Day 2 and how we would finish the app within the next two sprints.

Day 2

Sprint 3

We were immediately set back on day 2: when we arrived at the lab room, it was occupied by students from another course who had removed our Kanban board. This was a nuisance, as it meant we no longer had a visual representation of which tasks were to be done and which were completed.

I began the second day helping across both the cloud team and the WebRTC team. Andrew and Shane were working on a feature to convert the heart rate variability readings into a sine wave that could then be sent as audio. The goal was to combine this with the work Daniel and I had done in sprint 2. We would then be able to set up a React web app, alongside the Unity app, to have a front end displaying the HRV data.
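One way to sketch the HRV-to-audio idea (this is my own simplified take, not Andrew and Shane's exact code; the 44.1 kHz sample rate, tone length and base frequency are assumptions): map each inter-beat interval to a short sine burst whose pitch tracks the heart rate, so a faster heart sounds higher.

```python
import math

SAMPLE_RATE = 44100  # assumed CD-quality sample rate

def hrv_to_sine(ibi_ms, duration_s=0.25, base_freq=220.0):
    """Turn one inter-beat interval (in ms) into a burst of sine
    samples in [-1, 1]. 60 bpm maps to base_freq; a faster heart
    rate (smaller interval) gives a proportionally higher pitch."""
    bpm = 60000.0 / ibi_ms
    freq = base_freq * (bpm / 60.0)
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

samples = hrv_to_sine(800)  # an 800 ms interval is 75 bpm
print(len(samples))
```

Bursts like this, concatenated per beat, would give the audio stream the receiver and Unity app react to.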

Unfortunately, this proved a lot harder than intended: streaming the generated audio live was much trickier than the file-based send and receive we already had working. Most of the team then came to assist Daniel and Andrew, as we seemed to be bottlenecked on this one issue.

Sprint 4

During the final sprint, we were all focused on resolving the issue with the WebRTC and the sine wave. With limited time remaining, we made a compromise: instead of streaming the HRV data, a hardcoded placeholder sine wave was created to prove the concept could work. Coming up to the end of the sprint, Andrew and Daniel eventually found the source of the issue that had blocked us: the format of the generated sine wave was incompatible with what the receiver was expecting, resulting in nothing being sent through. Once fixed, we were able to hear clear audio being streamed through on the React app side.
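To illustrate the kind of mismatch that can cause this (a hypothetical simplification, not our exact bug): a generator that produces float samples in [-1, 1] will send silence or garbage if the receiving end expects 16-bit little-endian PCM, so a conversion step is needed in between.

```python
import struct

def floats_to_pcm16(samples):
    """Convert float samples in [-1.0, 1.0] to 16-bit little-endian
    PCM bytes, the format a typical audio receiver/player expects."""
    clipped = [max(-1.0, min(1.0, s)) for s in samples]
    return struct.pack("<%dh" % len(clipped),
                       *(int(s * 32767) for s in clipped))

pcm = floats_to_pcm16([0.0, 0.5, -0.5, 1.0])
print(len(pcm))  # 2 bytes per sample
```

The real fix in our scripts was along these lines: making sure both ends agreed on the sample format before anything was pushed over the WebRTC track.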


As we reached the end of the day, we were happy to finish on a good note by resolving the sine wave issue. While we didn't have the time to combine all of the individual components to work in tandem, we felt we were leaving the project at a good starting point for next year's team to finish.

Future Iterations

While the app isn’t fully integrated, it’s now in a place where all the core components have been built and tested individually. The Microbit can read pulse data, the WebRTC setup can stream audio, and both the Unity and React front ends can receive and display output. The most obvious next step would be combining these elements into one continuous pipeline, from sensor input to visual/audio output. Small improvements like syncing the audio timing more accurately with the heartbeats, or packaging the whole system into a single React-based dashboard with embedded Unity and data visualisation, would make the app feel more complete. There's also room to clean up the data formatting between components, which was a key issue during integration. All in all, the project is at a solid prototype stage that just needs refinement rather than a full rebuild.

Reflection & Conclusion

This project was a great way to apply everything we learned throughout the IoT Apps module in a real, collaborative setting. Although we didn't manage to fully integrate all the components into one seamless system, we got parts such as the Microbit sensor, WebRTC streaming, Unity animation and React front end working independently, which was in itself a major achievement.

One of the most valuable aspects of the project was how we operated as a team. By following agile principles and using sprints, we were able to clearly define roles, track progress and most importantly, respond quickly when things went wrong. Whether it was losing our Kanban board or hitting a technical bottleneck with the audio stream, we adapted by reshuffling responsibilities, pairing up across teams and tackling problems together. 

The flexible structure meant that nobody was boxed into one task and everyone contributed across the board, which made a huge difference. I'm proud of how far we got and confident that we have laid a solid foundation for future development.


