You'd have to have a very reliable rate of accumulation of those "neutral variations," by which I suppose you mean mutations, right? But how reliable could such a number be?
It doesn't need to be that reliable. It's an average over quite a long time - short-term deviations lasting even thousands of years would be smoothed out.
Think of it like this. Let's say I eat 2 steaks a year. I'm in my late 30s, and we can safely ignore my childhood since the average then was likely different, so let's say I've eaten about 70 steaks in my life, and that's how I derived the average. If I say 'I've eaten 3 steaks since last we met', you could guess that it's been 18 months, but this would be unreliable: I could eat 5 steaks one year and none the next.
However, if I say 'I've eaten 30 steaks since last we met', you'd be on safer ground to say it must have been about 15 years.
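Here's a minimal simulation of that intuition, treating steak-eating as a Poisson process at the stated 2 per year (the Poisson model itself is my assumption; the numbers are just the ones from the example). It shows how the time estimate 'count divided by rate' gets relatively tighter as the count grows:

```python
import random

RATE = 2.0  # steaks per year, from the example above

def typical_error_years(true_years, trials=20_000):
    """Average absolute error, in years, of the count/RATE time estimate."""
    total_err = 0.0
    for _ in range(trials):
        # Count Poisson arrivals in [0, true_years] by summing
        # exponential waiting times between steaks.
        count, t = 0, random.expovariate(RATE)
        while t <= true_years:
            count += 1
            t += random.expovariate(RATE)
        total_err += abs(count / RATE - true_years)
    return total_err / trials

for years in (1.5, 15):
    err = typical_error_years(years)
    print(f"true gap {years:>4} yr: estimate typically off by ~{err:.2f} yr "
          f"({100 * err / years:.0f}% of the gap)")
```

The 18-month guess is typically off by nearly half the interval, while the 15-year guess is off by well under a fifth of it.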
It's like flipping a coin. On one or two flips, it'd be difficult to say with any confidence what fraction will come up heads. After a thousand flips, you probably wouldn't be too far wrong if you guessed it was about 500.
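For the coin version, here's a quick sketch (a plain fair coin, nothing assumed beyond that) showing how the heads fraction tightens around 50% as the number of flips grows:

```python
import random

def typical_heads_error(flips, trials=5_000):
    """Average distance of the heads fraction from 0.5, for a fair coin."""
    total = 0.0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(flips))
        total += abs(heads / flips - 0.5)
    return total / trials

for flips in (2, 10, 1_000):
    print(f"{flips:>5} flips: heads fraction typically misses 50% by "
          f"{100 * typical_heads_error(flips):.1f} percentage points")
```

Two flips land, on average, 25 percentage points away from 50%; a thousand flips land barely more than one point away, i.e. very close to the 500 guessed above.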
So, from observation and from our understanding of the mechanisms, we calculate how many mitochondrial mutations occur per generation. When we're only examining 100 generations, the error margin is likely to be high. When we're examining 10,000 generations (i.e., for humans, about 200,000 years), the error margins are smaller. The more trials, the smaller the error margin.
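To make the error-margin claim concrete, here's the back-of-envelope arithmetic under a Poisson accumulation model. The per-generation rate below is invented purely for illustration (it is not a measured mitochondrial figure); the point is just the 1/sqrt(count) scaling:

```python
import math

MU = 0.5            # assumed mutations per generation (hypothetical value)
YEARS_PER_GEN = 20  # rough human generation time

for generations in (100, 10_000):
    expected_k = MU * generations        # expected mutation count
    rel_err = 1 / math.sqrt(expected_k)  # Poisson counting error, ~1/sqrt(K)
    print(f"{generations:>6} generations (~{generations * YEARS_PER_GEN:,} yr): "
          f"expect ~{expected_k:.0f} mutations, "
          f"dating error margin roughly ±{100 * rel_err:.0f}%")
```

With these made-up numbers, a 100-generation comparison carries roughly a ±14% margin, while 10,000 generations (about 200,000 years) brings it down to around ±1%.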
As ever, there are always complications and confounding factors, but that's the gist of it.