That they no longer apply means the problem is in our tools' ability to measure, not that the laws of nature cease to operate the way they normally do at times of 10^-43 seconds and later.
Historically, that's been one interpretation of quantum uncertainty: that it's just a measurement problem.
Now, in ways that I don't understand and couldn't possibly explain, it's possible to test that. That is, a model where uncertainty is just an engineering problem makes different predictions than a model where uncertainty is a fundamental constraint on the universe. On that second view, not only do we not know for sure the exact position of a given particle; the particle doesn't "know" either. Its precise position is uncertain because it does not have a precise position.
We can test the difference between the predictions made by these two competing models, and the models that explain uncertainty as merely a measurement constraint are never as accurate. To the best of our ability to measure, they're wrong.
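The original text doesn't name the test, but the canonical version of this experiment is a Bell/CHSH inequality test, so here's a sketch of that under that assumption. Any "it's just a measurement problem" model (a local hidden-variable model, where each particle secretly carries a definite state) is mathematically bounded by |S| ≤ 2 for a certain combination of correlations S, while quantum mechanics predicts |S| = 2√2 ≈ 2.83 for entangled particles. The hidden-variable model below is one illustrative choice, not the only one; the bound holds for all of them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Quantum mechanics predicts correlation E(a, b) = -cos(a - b) for spin
# measurements along angles a and b on an entangled singlet pair.
def quantum_E(a, b):
    return -np.cos(a - b)

# One example of a local hidden-variable model: each pair carries a
# hidden angle lam, and each detector just reports the sign of the spin
# projection along its own axis. Estimated by Monte Carlo sampling.
def hidden_variable_E(a, b, n=200_000):
    lam = rng.uniform(0, 2 * np.pi, n)
    A = np.sign(np.cos(lam - a))        # Alice's +1/-1 outcome
    B = -np.sign(np.cos(lam - b))       # Bob's +1/-1 outcome
    return np.mean(A * B)

# The CHSH combination of correlations at the standard test angles.
def chsh(E):
    a, a2, b, b2 = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

print(chsh(quantum_E))          # 2*sqrt(2) ~ 2.83: violates the bound
print(chsh(hidden_variable_E))  # stays at or near 2, within sampling noise
```

Real experiments consistently measure values above 2, matching the quantum prediction, which is why the hidden-variable picture is considered ruled out (up to well-studied loopholes).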
The best scientific conclusion is that randomness and uncertainty exist in the universe at a fundamental level, not simply as a measurement problem. We can't measure precise positions because at that tiny scale there are no precise positions, only statistical positions (i.e., "this particle is 93% here").
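To make "statistical positions" concrete: quantum mechanics assigns a particle a wavefunction ψ(x) rather than a location, and |ψ(x)|² is a probability density — integrating it over a region gives the chance of finding the particle there. A minimal sketch, using the textbook ground state of a particle in a one-dimensional box (the specific system and units are my illustrative choice, not from the original text):

```python
import numpy as np

L = 1.0                                   # box width (arbitrary units)
x = np.linspace(0.0, L, 100_001)
dx = x[1] - x[0]

# Ground-state wavefunction of a particle in a box of width L.
psi = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)
density = psi**2                          # probability density |psi|^2

# Riemann-sum integration of the density over regions of the box.
total = np.sum(density) * dx              # whole box: probability ~ 1
mask = x <= L / 2
left_half = np.sum(density[mask]) * dx    # left half: ~ 0.5 by symmetry

print(total, left_half)
```

The particle isn't 100% anywhere; the theory only ever yields probabilities like these, which is exactly the "93% here" kind of statement in the text above.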
This could be computed if only there were a way to compute all the forces involved.
Again, no, it couldn't. The fundamental nature of the universe makes this impossible. It's not just a measuring problem; the universe itself is random on a very fundamental level.
If there is only matter/energy and laws, which of these is going to act contrary to the only way it can act under the circumstances?
What people are telling you is that there's no such constraint - there's no situation where there's only one outcome of a given configuration or interaction of matter according to physical laws. The laws themselves incorporate randomness and nondeterminism at a very fundamental level.