At the University of Sydney, the CS society (SYNCS) runs an annual hackathon where you have ~24 hours to build an open-ended project.
In 2020, I competed with a team of six in the hackathon to build what is essentially an audio version of QR codes, called QR Tones. The idea is that when you don't have access to a widespread visual transmission method for QR codes, like a projector or printed sheets of paper, you can instead transmit the code to everyone at once by playing a sound. There are a couple of flaws with the idea (noisy rooms make everything hard to hear), but hackathons are hackathons after all. 🤷
In the end we got second place, yay! 🎉
The app was only ever locally hosted, but you can still view the devpost and source code.
This was the first entirely online hackathon due to the 2020 COVID lockdowns, so it was quite a different experience to the previous in-person hackathon I attended in 2019.
We also went into the entirely open-ended hackathon with no real plans, which I have since learned was a mistake. Without a plan, the ideation process feels rushed because it cuts into valuable hack time, and you can end up with many people unsatisfied with the idea. Fortunately everything turned out fine in the end as we came around to the idea, but it was still a lesson learned.
As there were six of us (also not really recommended), I spent a good amount of time ensuring the work we did would actually fit together; in particular, organising the programmatic interfaces between modules developed by different sub-teams. This was further complicated by the event being fully virtual, so I couldn't just walk over and chat with people.
The coding I did do was mostly on the actual audio decoding algorithm. Interface-wise, my job was to take an audio signal as waveform samples and return a message encoded using a communication protocol defined by the encoding team.
The primary challenge here was the extreme level of noise that scrambled the signal, making it very hard to decode correctly. Thankfully, there was a tradeoff between transmission speed and accuracy, so we were able to keep sacrificing speed until the accuracy was satisfactory.
Our first working prototype was completed less than 4 hours before the demonstration was due (not recommended). Below is a video of that prototype in its final form.
A hint at our encoding method exists in the above visualisation. Each distinct tone holds a single byte of information, and each spike represents the amplitude of the waveform at one of 8 possible frequencies (one for each bit in the byte).
The first, slightly longer signal was actually a header, used to indicate to the decoder that a message was about to be received. The header is why you hear 6 tones for the 5-byte message "hello".
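To make the scheme concrete, here's a minimal sketch of that encoding in Python. The frequencies, sample rate, and one-second tone length are hypothetical stand-ins — I haven't preserved the real values from the project — but the structure is the same: one sinusoid per set bit, summed into a single tone.

```python
import math

# Hypothetical parameters, for illustration only; the project's real
# frequencies and sample rate are not recorded here.
SAMPLE_RATE = 8000                              # samples per second
BIT_FREQS = [600 + 200 * i for i in range(8)]   # one frequency per bit

def encode_byte(value: int) -> list[float]:
    """Render one byte as a one-second tone: sum a sinusoid for each
    set bit, scaled by 1/8 so the mix stays within [-1, 1]."""
    return [
        sum(math.sin(2 * math.pi * BIT_FREQS[bit] * t / SAMPLE_RATE)
            for bit in range(8) if value & (1 << bit)) / 8
        for t in range(SAMPLE_RATE)
    ]

# "h" is 0b01101000, so its tone mixes the frequencies for bits 3, 5 and 6.
tone = encode_byte(ord("h"))
```

A full message would then just concatenate one such tone per byte, preceded by the longer header tone.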
During decoding, a Fourier transform was used to extract the amplitude at each of these 8 frequencies, which could then be used to reconstruct each byte of the signal.
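Since only 8 frequencies matter, the extraction can be sketched by correlating the samples against each frequency directly — effectively computing one DFT bin per bit rather than a full FFT. The frequencies, sample rate, and threshold below are illustrative assumptions, not the project's real parameters.

```python
import math

# Hypothetical parameters, for illustration only; the project's real
# frequencies and sample rate are not recorded here.
SAMPLE_RATE = 8000
BIT_FREQS = [600 + 200 * i for i in range(8)]   # one frequency per bit

def bin_amplitude(samples: list[float], freq: float) -> float:
    """Amplitude at one frequency: correlate the samples against a
    cosine and a sine at that frequency (a single DFT bin)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / SAMPLE_RATE)
             for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
             for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

def decode_byte(samples: list[float], threshold: float = 0.05) -> int:
    """Set each bit whose frequency is present above the threshold."""
    value = 0
    for bit, freq in enumerate(BIT_FREQS):
        if bin_amplitude(samples, freq) > threshold:
            value |= 1 << bit
    return value

# Round trip on a clean one-second test tone with bits 1 and 4 set
# (one sine per set bit, scaled to stay within [-1, 1]):
tone = [(math.sin(2 * math.pi * BIT_FREQS[1] * t / SAMPLE_RATE) +
         math.sin(2 * math.pi * BIT_FREQS[4] * t / SAMPLE_RATE)) / 8
        for t in range(SAMPLE_RATE)]
assert decode_byte(tone) == 0b00010010
```

On a clean signal the threshold is easy to pick; the real difficulty, as described above, was choosing tone lengths and thresholds that survived real-world noise.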
We managed to get QR Tones to work reliably at a speedy 1 byte per second. A typical Version 4 QR code for a URL may only need to hold up to 50 characters, which translates to around a minute of transmission time for QR Tones.
I expect that with some time and effort (compressed encoding, noise cancellation, error correction) we could have increased the rate of reliable transmission by an order of magnitude, bringing a 50-character message down to a few seconds — genuinely comparable to the visual alternative.