Let's see if anyone likes to solve simple physics problems.
We all know (I hope) that the standard Newtonian formula for the gravitational attraction between two point-like masses is:
F = G*M*m/r^2, where G is the gravitational constant, M is the mass of one object, m is the mass of the other, and r is their separation.
So far so good.
Now imagine two point masses as above, separated by an initial distance 2r (it helps the algebra to make it 2r), and initially at rest.
They are then 'let go' and allowed to move together under gravity alone. I could couch this in some fancy language about assuming the space is maximally symmetric, blah blah blah, but the initial conditions are as stated above; it doesn't need GR or anything like that.
Now the question is:
How long does it take them to collide?
Yep - that simple a question. Seems easy, doesn't it? After all, it's so basic it must be in any introductory mechanics text - except it isn't in those books, or at least not in any I remember. In fact it is curiously absent; I wonder why?
It really isn't that difficult, but it is subtle enough to make you think a tad.
Hint:
There are two ways of solving this that I can think of. One uses a very fundamental theorem that dates back 300 years or so; the other is a clever recasting of the problem that actually dates back 400 years or so.
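(If you want to check whatever answer you get without spoiling either trick, here's a minimal numerical sketch. It assumes convenient units G = M = m = r = 1, and uses only the energy-conservation relation for the relative speed at a given separation; the function name and the quadrature scheme are my own choices, not part of the puzzle.)

```python
import math

# Check the collapse time numerically, in units where G = 1,
# M = m = 1, and the initial separation is 2r with r = 1.
# Energy conservation gives the relative speed at separation s:
#   v(s) = sqrt(2*G*(M+m)*(1/s - 1/(2r)))
# so the collision time is the integral of ds / v(s) from s = 0 to s = 2r.

G, M, m, r = 1.0, 1.0, 1.0, 1.0
mu = G * (M + m)      # gravitational parameter of the relative motion
s0 = 2.0 * r          # initial separation

def collapse_time(n=200_000):
    """Midpoint-rule quadrature of dt = ds / v(s); midpoints avoid the
    integrable singularities at s = 0 (v -> infinity) and s = s0 (v = 0)."""
    h = s0 / n
    t = 0.0
    for i in range(n):
        s = (i + 0.5) * h
        v = math.sqrt(2.0 * mu * (1.0 / s - 1.0 / s0))
        t += h / v
    return t

print(collapse_time())
```

Run it, and compare the printed number against your closed-form answer evaluated in the same units.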
(I don't know really what forum this belongs in - I thought Big Bang and Cosmology might be best just because most of the physics types post there but use your judgement.)