For my next iPhone app I’m working on a metronome. I know what you’re thinking – another metronome?!? Yes. Another one. Before you run away, let me explain… OK, I guess there’s not much to explain. There are a lot of metronome apps out there on the store, but as you might imagine, a huge percentage of them are just plain crap. New ones keep coming and going, and yet Tempo by Frozen Ape continues to stand tall above the rest. Don’t get me wrong, Tempo is a great metronome. I just think it’s time to take a shot at gaining that spotlight for a while. Plus, I think I’ve got a few things I can bring to the table that will set mine apart too. We’ll see. Anyway, the reason for this post is that I’ve been working today on getting a Tap Tempo feature up and running, and I think I’ve found a pretty good implementation.
Averaging it Out
The basic idea behind a tap tempo for a metronome is that the user taps a button along with the beat of a song, and the metronome figures out the beats per minute and sets itself to that tempo. So all you really have to do is keep track of when the user taps, calculate the average time between taps, and then convert that length of time to BPM. Simple, right?

The problem is that you’re trying to measure a very precise amount of time, you only get a handful of samples to measure and average, and most importantly – it’s a human pressing that button, not a machine. That, combined with any slight latency in the device, makes for a tricky situation when deciding what BPM the person is trying to tap. Tiny differences in the interval between taps can have a big effect on the BPM calculated from them. At exactly 120 BPM there are exactly 2 beats per second, so the interval between taps is 0.5 seconds. At 130 BPM, the interval is about 0.4615 seconds – so a difference of less than 5/100ths of a second between taps will yield a BPM that’s completely different! I don’t care how good a musician you are, nobody can tap consistently with that kind of accuracy. And when it comes to trying to match the tempo of a song, even a difference of 1 BPM will be noticeably “off” within just a few beats, or at most a few measures. Sticking with the example of 120 BPM: the difference in tap interval between that and 121 BPM is less than 1/200th of a second, and yet musically, even one BPM is a large difference.

Obviously, the trick here is to get as many samples as possible and average them together – the more taps you can collect, the closer you can hopefully zero in on that desired tempo. To implement this in Objective-C, I’m just using good old NSDate and an NSMutableArray. Each time the user taps, the current date is added to the array. Once at least two taps have been registered, use [NSDate timeIntervalSinceDate:] to find the interval between them.
When three or more taps have been recorded, loop through the dates and store each interval, then average them together. More taps = better average = closer to the desired tempo. Not too hard, right?
What’s the Catch?
The problem, then, is how and when to break out of the cycle. You want to collect as many taps as possible so the user can really zero in on the desired tempo, but you can’t keep adding and averaging intervals forever. What if they want to set a new tempo? What if they mis-tap once or twice and you need to clear out those bad taps? What if they start tapping quarter notes and then realize they’d rather tap eighth notes? For my implementation, I’ve addressed each of these problems like this:
1. Setting a new tempo. This one’s pretty easy and is commonly done with a simple timer. Each time the user taps, an NSTimer is scheduled with a specific interval; if no new tap arrives before the timer fires, we assume the user has found their desired tempo and stopped tapping, and we clear out the cache of intervals. So if they start tapping again, we’ve got a clean slate and are ready to calculate any tempo they might want. The timer interval technically only has to be slightly longer than the tap interval of the slowest tempo we want to be able to calculate. For example, 30 BPM = 2 seconds between each tap, so as long as we wait at least 2 seconds before clearing out the cache, we can calculate tempos as slow as 30 BPM.
2 & 3 go together: mis-taps (or just plain bad tapping), and quickly and drastically changing tempo – e.g. switching from quarter notes to eighth notes. Although these may seem like edge cases, I think they’re actually quite common. I’ve run into these problems myself when using other metronome apps, and I haven’t found one yet that addresses them very well; hence, my own implementation. For example, what if you’re trying to find a tempo to record your own song at? There’s no music playing to follow along with, just music in your head – or live from your instrument. As you go, you may be tapping a little faster now, a little slower now, trying to find the right speed, or you might even start out WAY too slow or fast and want to adjust your tapping on the fly. Or perhaps a song has a very slow tempo: you start tapping along to the quarter notes, but realize that’s too inaccurate, so you immediately start tapping eighth notes. Well, if our only escape mechanism is the timer, then the only way to get the metronome to calculate a new tempo is to stop tapping completely, wait until the timer fires (and two seconds really is a LONG time to wait), and then start again at the new tempo. Otherwise, all those old taps get used in the calculation, and the average won’t be anywhere close to what you’re looking for. This waiting is completely unintuitive and potentially very frustrating to the user. There’s no clear way to tell them, “sorry, but you’re going to have to stop tapping until my timer fires, and then you can start again.” The most common way to deal with this is to limit the number of taps used to calculate the average. Start out too slow? Just keep tapping, and eventually those old, bad taps will be pushed out by the new, correct ones.
The problem here is that we want to use as many taps as possible to get a good average, but if we use too many, the user can run into those situations where the metronome is really slow to respond to large changes in tapping speed. How can you compensate for these limitations while keeping the whole thing intuitive and invisible to the user? My solution: as each new tap comes in, compare it to the previous average. If it falls outside a certain range of acceptable variation, throw out all the old taps and start over from scratch. Now we’ve got the best of both worlds: if the taps are nice and steady, just keep collecting intervals, and the average keeps getting more and more accurate. But if a rogue tap comes in, you don’t have to wait for the 2-second timeout or for the old taps to “fall off” the end of the array; just keep tapping, and the metronome will adjust to follow your new tempo! It’s one of those situations where the little details can make the difference between frustration and pleasure for the user – trying to make sure the mechanics of the interface fade into the background and “just work.”