Topic: A definition of infinity?
Son Goku
There are questions about two things here, so I'll answer them in order.

The original poster writes:

So it appears to me if we view the set of positive integers solely as that, a set, it may be defined as infinite. However, if we start to think of the set in terms of a sequence, it may be defined as semi-infinite, as there is a point beyond which it is impossible to go in one "direction".

Formally that quality is expressed by saying that the positive integers are bounded below. They are still infinite in the sense that they do not have a finite collection of elements. You've spotted the difference yourself though: as a set (as you said) the positive integers are infinite, but when you think of them another way they are only "semi-infinite" (bounded from one end). This other way of thinking is topology, where being bounded is an important property. If something is bounded from both ends and is also closed (if you know what closed means), it is called compact, and compactness is probably one of the most important topological properties.

The original poster writes:

There's another interesting point regarding positive integers and infinity. It concerns what happens when you pair each positive integer with its square, i.e. 1 with 1, 2 with 4, 3 with 9, etc. The original set will be a complete set of positive integers, but the set of squares is not, yet both sets are exactly the same size! I believe this apparent paradox has been resolved, but I cannot remember by whom, and am not aware as to what the explanation is. Maybe somebody on here can help us.
It was resolved by Cantor in the late nineteenth century. Any infinite set of objects that can be put into one-to-one correspondence with the positive integers is called countably infinite, with cardinality aleph-null. (Believe it or not, the set of all fractions is countable.) Any set which cannot be put in correspondence with the positive integers is called uncountable; an example would be the real numbers. The "paradox" dissolves once you take such pairings as the definition of same size: the squares are a proper subset of the positive integers, yet the pairing of n with n squared shows the two sets have exactly the same cardinality.

Now with regard to the double slit experiment, let's call the slits hole A and hole B. When you attempt to detect which hole the particle went through, you must set up experimental equipment which can measure which hole the particle went through. (This might sound like an obvious tautology, but it's a very important point that I'll explain later.) This means the equipment can return two results, A or B. Corresponding to each of these two measurements there is a quantum mechanical state/wavefunction. I'll call these wavefunctions |A> and |B>; being in one of these states means the particle is localized (or concentrated) at hole A or hole B. Measurement of A or B by the equipment means the particle is now in state |A> or state |B>. (To sum up, if that sounded muddled: if I detect the particle at B, the particle-wave is now concentrated at B and not spread out all over the place.)

So I let the experiment run and my equipment detects the particle at hole A, which localizes the particle to A (puts it in the state |A>). After the measurement the particle spreads out like a wave from A and hits the detector screen. However, I get no interference pattern. Why? Compare this with a picture of the experiment where interference is occurring.
The reason I got no interference pattern is that I localized my particle to A, which meant there was no wave coming from hole B and therefore nothing for my particle-wave from A to interfere with. And it doesn't matter what method of detection I use, because every method has {A, B} as its set of possible results, and therefore localizes the particle to A or B and prevents interference. So the result holds for any experimental equipment that can measure which hole the particle went through, regardless of the method it uses.
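The effect of the which-hole measurement can be sketched numerically. This is only a toy model of my own (the function name and the phase convention are illustrative, not from any physics library): without a measurement the two amplitudes add before squaring, so the screen intensity oscillates; with a measurement only one wave is present at a time, so the squared magnitudes simply add to a flat distribution.

```python
import cmath

def intensity_at(x, measured):
    # Toy model: the waves from the two holes differ at screen
    # position x only by a relative phase proportional to x.
    a = 1 / 2**0.5                     # amplitude of the wave from hole A
    b = cmath.exp(1j * x) / 2**0.5     # amplitude of the wave from hole B
    if measured:
        # A which-hole measurement localizes the particle, so only one
        # wave reaches the screen: the probabilities just add.
        return abs(a)**2 + abs(b)**2
    # No measurement: amplitudes add first, then we square.
    return abs(a + b)**2

# Unmeasured: intensity swings between 0 and 2 (fringes).
# Measured: flat 1.0 everywhere (no fringes).
for x in (0.0, cmath.pi / 2, cmath.pi):
    print(round(intensity_at(x, False), 3), round(intensity_at(x, True), 3))
```

At x = pi the unmeasured intensity is exactly zero (total destructive interference), while the measured intensity stays at 1.0 for every x.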
Son Goku
I'll get back to your post tomorrow; I'm stuck for time at the moment.
However I will give you something to chew on, the difference between regular everyday probability and quantum probability. The example I'll choose is a ball/particle (ball for the classical case, particle for the quantum) being kicked from A to B along three different tunnels.
At the classical scale, the ball simply takes one definite tunnel each time, with some chance for each. In fact it'll usually be an n% chance for one tunnel and an m% and l% chance for the other two, with m% + n% + l% = 100%.

At the quantum mechanical scale, probabilities come in only after I've obtained a complex number associated with each path. I won't go into the details, but basically the rules of the mathematics sometimes mean I have to subtract probabilities. An example: I get 33.3% for path 1, 33.3% for path 2 and 66.6% for path 3, but the maths says I have to subtract the last probability from the other two. So I get a 33.3% + 33.3% - 66.6% = 0% chance of the particle arriving at point B, a result you never get at the classical scale, where adding more paths to a location never reduces your chance of getting there. (To those who know the details of the mathematics: I know I'm skipping a lot, but I think this captures one of the major differences between standard probability and QM.)
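The subtraction above comes out of the arithmetic of complex amplitudes. Here is a minimal sketch; the three relative phases (0, 120 and 240 degrees) are my own hypothetical choice, picked so that the cancellation is total, matching the 0% outcome in the example.

```python
import cmath

# Three paths, each giving a 33.3% chance (amplitude magnitude sqrt(1/3))
# when open on its own. The phases are a hypothetical illustration.
mag = (1 / 3) ** 0.5
amps = [mag * cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

# Classical rule: add the probabilities of the individual paths.
classical = sum(abs(a) ** 2 for a in amps)

# Quantum rule: add the amplitudes first, then square the magnitude.
quantum = abs(sum(amps)) ** 2

print(f"classical: {classical:.3f}")  # 1.000 (certain to arrive)
print(f"quantum:   {quantum:.3f}")    # 0.000 (never arrives)
```

With different phase choices the quantum total lands anywhere between 0 and 1, which is exactly the interference the post is describing.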
Son Goku
Sorry for the delay, I was a bit busy.
I also apologize in advance if I ramble a bit and repeat things. I'll expand on my earlier message (No. 28) and answer Javaman's question first.
Javaman writes:
Could you explain this a little further? Classical probability is just a statistical thing: if path X has a probability of 0.2, all this means is that in 100 cases of the ball travelling from A to B, the ball is likely to follow path X in 20 of them. What does the probability of 0.2 mean in the quantum case?

It means the exact same thing in the quantum case. Take a look at the picture above and consider the circumstance where two of the paths are absent, leaving only one, let's say the top one. In both the classical and quantum case, if I send a ball down the path it has a probability of 33.3% of reaching B. Simple enough. Now consider the case where only the middle path is present: again, in both the classical and quantum case, a ball sent down that path has a 33.3% chance, and likewise for the scenario with only the bottom path. So far no difference between the classical and quantum cases.

Now what happens when I consider the case where all three paths are present? In the classical case I start with one path and open up the other two. The process being:

Starting probability: 33.3%
Open one path: +33.3%
Open the next path: +33.3%
Total: 100.0%

In the quantum case:

Starting probability: 33.3%
Open one path: +33.3%
Open the next path: -66.6% (even though it was 33.3% when open on its own)
Total: 0.0%

What happened when I opened the final path? Basically, in the classical case I don't have to take into account the presence of other paths when I open another one; it just adds to the total probability. In the quantum case, opening a new path can cause interference with the other two. When the path is open by itself, this interference is absent, so it gives a regular 33.3% probability. With the interference from the other paths it gives -66.6%. Now the thing is, in no quantum mechanical situation will we ever end up with a negative probability overall.
Even though some paths will give negative contributions, there will always be a greater or equal contribution from the non-negative ones. If we were to try to get an overall negative probability, we'd have to shut off the positive-probability paths, but doing so kills off some of the interference that makes the negative-probability paths negative. So they turn into positive-probability paths, ensuring the whole system always has a non-negative probability.

Anyway, on to dogrelata's post. dogrelata writes:
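The "never negative overall" claim is actually guaranteed by the arithmetic: the total probability is the squared magnitude of a complex sum, and a squared magnitude cannot be negative. A quick brute-force sketch over random toy amplitudes (my own illustration, not a physical model of any particular system):

```python
import cmath
import random

random.seed(1)

# Individual interference terms can be negative, but the total
# |a1 + a2 + a3|^2 is a squared magnitude, so it never drops below zero,
# whatever magnitudes and phases the three paths happen to have.
for _ in range(10_000):
    amps = [cmath.rect(random.random(), random.uniform(0, 2 * cmath.pi))
            for _ in range(3)]
    total = abs(sum(amps)) ** 2
    assert total >= 0.0

print("no trial produced a negative total probability")
```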
I can get my head around the Schrödinger's cat thought experiment. As far as I understand it, there is no way of knowing whether the cat is alive or dead until the box is opened. Although I have seen this characterized in some places as the cat being simultaneously alive and dead, my standard worldview kicks in and dismisses this as purely hypothetical. A different analogy might be: a football match has taken place, but I am unaware of the result, so until I learn of it, the outcome remains a matter of probability-based conjecture.

Since a cat has the unfortunate property of being mostly classical, I will instead consider the case of an atom on its own, which starts in a state of having a 50/50 chance of being measured as decayed or not decayed. I'll write the quantum state of the atom in the following form:

(a, b)

where:
a = current chance of being measured as decayed
b = current chance of being measured as not decayed

So when we start off, the atom is in the state (50,50). I make a measurement to see if the atom has decayed. Let's say I find the atom has not decayed, so the state right after measurement is (0,100). Obviously, because I've measured it to be not decayed, it has a 100% chance of being found not decayed if I measure it again straight away. Then I leave my equipment and stop measuring the atom. Over time the probability returns to 50/50, so over a time of about a tenth of a second the following happens to the state of the atom:

(0,100) -> (10,90) -> (20,80) -> (30,70) -> (40,60) -> (50,50)

Now I can introduce the Zeno effect, probably one of the weirdest things in QM. With the atom back in the 50/50 state, I get a piece of equipment that emits one frequency of light, let's say a pure red laser. When the atom is decayed it absorbs the red light; when it's not decayed the light passes straight through it. So I emit the light at the atom. It passes straight through, meaning the atom is not decayed and is therefore in the (0,100) state.
As before, the atom will start to evolve back into the 50/50 state. However, let's say I measure the atom again really quickly, so quickly that it's only gotten to (1,99). Since it has a 99% chance of being measured as undecayed, the odds are I will measure it as not decayed by seeing the light pass through the atom, which returns it to (0,100). If I keep measuring it really fast, so that it only gets to about (0.0001, 99.9999) each time, the odds are I won't measure it as decayed for quite a while, because the odds are much greater that I measure it as not decayed. In this way I can freeze the atom in an undecayed state for quite a long time. However, this is weird when you think about it. The way I measure it as not decayed is if light passes through it, i.e. if the atom and the light don't interact. So I've managed to stop an atom from decaying by not interacting with it, in other words, by doing nothing to it. (And this effect has been demonstrated in a lab.)
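The freezing can be sketched with a toy calculation. The rate constant below is made up purely for illustration (chosen so that after one unit of time an unmeasured atom is back at 50/50), and the cosine-squared drift is the standard two-level toy behaviour, not an exact model of any real atom:

```python
import math

# Toy model: a time t after being measured "not decayed", the chance of
# again being found not decayed is cos^2(omega * t).
# omega is a made-up rate: at t = T = 1 the state is back to 50/50.
omega = math.pi / 4
T = 1.0

def p_frozen(n):
    """Chance the atom is found "not decayed" at every one of n equally
    spaced measurements over the interval [0, T]."""
    dt = T / n
    p_step = math.cos(omega * dt) ** 2  # one measurement finds it undecayed
    return p_step ** n

for n in (1, 10, 100, 1000):
    print(n, round(p_frozen(n), 4))
```

With a single measurement at the end the survival chance is 0.5, but as the measurements get more frequent the chance of the atom staying undecayed over the whole interval climbs toward 1: the watched atom never decays.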
Son Goku
Unfortunately I agree with everything you've written; I just have trouble getting this stuff into English, particularly Zeno freezing. You are also correct to pull me up on it.
I also apologize in advance to dogrelata: I'm going to take cavediver's advice on board and write two much longer posts once I'm satisfied with my attempt to reword my explanations more in the spirit of the actual mathematical material*, so if you think you're confused now, just wait.

*I won't write actual mathematics, just some pseudo-maths to appeal to intuition. I mean more that I'll try to cut down on the gee-whiz stuff that leaked into my previous post.
cavediver writes:
1) I think you've introduced a bit of confusion with your negative probabilities by (unintentionally) suggesting that a specific path can be associated with a -ve prob. I bring forth dogrelata as my first evidence.

To be honest, I think this is the mistake I can correct the easiest, since I'm sort of half-way there: I simply need to explain that it is the infinite limit of summing over all paths that defines the probability, and then explain what is being summed. Still, I just don't know how to go about explaining the "real thing" (i.e. integrals over the space of classical paths). I suppose I was just trying to get across that, unlike classical probability, more options doesn't necessarily increase the probability. Maybe I should try it from an "action-weighted" path point of view?

cavediver writes:

Think of how you used to view black holes as these truly mystical objects, baffling beyond belief in terms of their ability to screw around with space and time. And then you saw the simplicity of Schwarzschild... were you disappointed or blown away by the elegance and beauty? It is that elegance and beauty I wish we could express better... but could we make anyone else appreciate it?
First of all, the Zeno freezing is an aspect of measurement acting as a projector down onto eigenkets (I'm only saying this to set the stage, since I know you know it). The unfortunate thing is: how do I explain this physically? I can't use environmental decoherence to make it seem less mysterious, as I would like to, because the ontology of environmental decoherence isn't complete (problems with the density matrix, etc.). I can't honestly use any of the other interpretations either, because they all have their flaws. So all I'm left with is what happens directly in experiment, without any aid from any interpretation, or what happens in Dirac's bra-ket formalism. I can't use the latter, because that would amount to bringing somebody up to speed on linear algebra and its applicability to spaces of functions.

So all I'm left with is the cold, flavourless "turn on the machine and see what happens" experimental fact. Experimentally, light passing straight through an object does nothing to it; there is no interaction. And so, experimentally, the object freezes because of no interaction. However, this is a lame explanation, one that can be made less lame by picking an ontology, but which one? And why that one above any of the others? Or am I left only with Dirac? Or, even worse, am I left with only my original explanation, knowing that there is something incorrect about it but unable to justify a more detailed one?

cavediver writes:
I must admit I've never liked the "classicalisation" explanations of quantum phenomena (look what happens to this *particle* or consider this *path*) as they perpetuate much of the voodoo nonsense. Quantum behaviour is fascinating but it quickly leads down the road to the confusion we often see expressed by certain individuals here at EvC. The truth is that classical behaviour is the weird stuff, and quantum the mundane. The fact that a classical regime emerges is what amazes me. And trying to explain the quantum world by reference to the classical is like explaining the chemistry of liquid water by reference to oceanic currents. But can we do better?

This I have no idea how to do. I know I should do it, but what way would I go about it? All I can see is something like this happening:

Me: So our world isn't really anything but a diagonalization.......jargon, jargon........and also.....jargon, jargon, jargon.......lim (jargon -> infinity)
Person: Eh, what?

(I'm reminded of the time somebody asked me where electromagnetism came from, what caused it.)
This message is a reply to: Message 35 by cavediver, posted 01-06-2007 9:36 AM (cavediver has replied)
Replies to this message: Message 37 by cavediver, posted 01-06-2007 11:50 AM (Son Goku has not replied)

Message 39 of 41 (375405), posted 01-08-2007 3:04 PM
This message is a reply to: Message 38 by JavaMan, posted 01-08-2007 8:55 AM (JavaMan has replied)
Replies to this message: Message 40 by JavaMan, posted 01-09-2007 8:15 AM (Son Goku has replied)

Message 41 of 41 (375817), posted 01-10-2007 4:14 AM
This message is a reply to: Message 40 by JavaMan, posted 01-09-2007 8:15 AM (JavaMan has not replied)
Copyright 2001-2023 by EvC Forum, All Rights Reserved
Version 4.2
Innovative software from Qwixotic © 2024