Topic: What exactly is ID?
PaulK Member Posts: 17826 Joined: Member Rating: 2.3 |
quote: That is not in contention. The question is whether the mutated version had NO function. And we do not know that.
quote: Since the probabilty you are calculating relies not only on the number of proteins in the E Coli flagellum, but also their structure it seems very unlikely that it can be separated to the degree required by Dembski's TRACT condition. But please, if you think otherwise then give the specification.
quote: So long as you give me no reason to bring up your error I am happy to stop talking about it. On the other hand if, for instance, you try to use the result of your erroneous calculation it is perfectly reasonable for me to point out that it is wrong. (I will add that anyone with even a basic understanding of probability theory should be capable of working out how to calculate the number without too much difficulty). quote: As we both know that isn't true. Your calculation is based on using details which are completely absent from the specification "bidirectional rotary motor-driven propeller".
quote: Basically I am saying that we cannot know if something is CSI unless we use the method for working out whether it is CSI or not. Which requires looking at all the possible explanations of how it arose.
quote: I have already explained this. Firstly the sickle-cell trait is beneficial while hair and eye colour are neutral. So, the frequency of hair and eye colour is dominated by drift, while selective effects dominate in the case of sickle-cell. That is one difference. Secondly - as I have pointed out more than once - the sickle cell allele is unusual in that the heterozygous state is quite strongly beneficial in malarial areas, while the homozygous state is strongly deleterious even there. Thus selection acts to hold the frequency in balance.
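PaulK's point about the heterozygous state being held in balance can be sketched with the standard overdominance model. This is an editorial illustration, not part of the thread; the fitness costs below (15% for malaria in normal homozygotes, 80% for sickle-cell disease) are illustrative assumptions, not measured figures.

```python
# Toy sketch of balancing selection at the sickle-cell locus.
# Relative fitnesses: AS (heterozygote) = 1, AA = 1 - s, SS = 1 - t.
def equilibrium_freq(s, t):
    """Equilibrium frequency of the sickle allele S under
    heterozygote advantage: q = s / (s + t)."""
    return s / (s + t)

# Illustrative (assumed) costs: malaria costs AA homozygotes 15% fitness,
# sickle-cell disease costs SS homozygotes 80%.
q = equilibrium_freq(0.15, 0.80)
print(round(q, 3))  # ~0.158: selection holds the allele in balance, far from fixation
```

This is why the allele can neither spread to fixation nor be eliminated: selection pushes the frequency back toward the equilibrium from either side.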
quote: It is not fixed in ANY human population (unless you mean some small and soon to be extinct group). As I have already explained, it can't be, because the homozygous state is strongly deleterious and for it to be fixed the entire population must be homozygous for sickle-cell. And while sickle-cell did spread in the past, it is no longer spreading - which was your claim.
quote: This only shows that you badly fail to understand my argument. My argument is not based on individual cases at all. Instead it is based on the long term aggregate effect over all the population.
quote: The monster beat genetic entropy BY decreasing in size. By decreasing in size it increased its fitness so that it could outcompete all the other variants.
Vacate Member (Idle past 4626 days) Posts: 565 Joined: |
Notice what the article claims. It says that during the course of mammalian evolution, the body size had increased. Therefore, the accumulation of slightly deleterious mutations increased also. And then they finish it off by saying that this could contribute to the extinction of large mammals. No, that is not what the article says. "consequently, Ne tends to decline" - I realize you missed that part but you missed it again when Wounded King explained what the article actually says, better than I could, above on this very thread. Please read more carefully. ***ABE: I just noticed that with PaulK you indicate a decrease in size shows genetic entropy, but you quote an article and argue with me that an increase in size produces "slightly deleterious mutations" and that is an indicator of genetic entropy. So do you swing both ways or is there something else afoot?
Please read carefully. I will if you will. Did you miss the whole point of my post yet again? That is twice now. Are you simply not reading very carefully or are you avoiding it because you cannot support your claim? I bet you can avoid it once again by not even making it this far into the post. Edited by Vacate, : What's size got to do with it? Edited by Vacate, : Fixed title
Larni Member Posts: 4000 From: Liverpool Joined: |
Notice what the article claims. It says that during the course of mammalian evolution, the body size had increased. Therefore, the accumulation of slightly deleterious mutations increased also. And then they finish it off by saying that this could contribute to the extinction of large mammals. This is because the gene pool is smaller and allows for less genetic variation as a result of sexual selection. More individuals mean more variation within the gene pool. This makes the organism less vulnerable to catastrophic change, since when a creature with a deleterious mutation dies (as it normally does) its death has a proportionally large impact on a small population. As an aside, larger organisms tend to have longer generations, so they recover more slowly from population catastrophe.
Taq Member Posts: 10072 Joined: Member Rating: 5.2 |
It doesn't matter if it lost affinity specifically to streptomycin. It probably can bind to something else too. The original structure is the original information content. Now, if it loses the affinity, to anything, it got degraded, and it lost information. Then evolution necessarily requires a loss in information according to your definition. For example, both the chimp and human lineages lost information found in the original common ancestor. Chimps and humans are degraded, according to you. The same for all mammals (including humans) as they have degraded from the common ancestor of mammals. The same for all amniotes, all vertebrates, all eukaryotes, and every single species since the last universal common ancestor. By extension, you are arguing that when an organism evolves from a simple state to a complex state this is a degradation event because the information for making a simple organism has been degraded. You have argued your way out of the debate. You claim that evolution requires new information, and yet observed instances of evolution do not produce this entity. Therefore, evolution does not require new information. You are out of the game.
I see no reason why a bunch of mutations couldn't build a fully functional ATP synthase. I mean, it won't happen. But if it did, it would be an increase in FSC for sure. No it wouldn't, at least according to you. Any change from the original DNA sequence is a degradation.
Taq Member Posts: 10072 Joined: Member Rating: 5.2 |
Notice what the article claims. It says that during the course of mammalian evolution, the body size had increased. Therefore, the accumulation of slightly deleterious mutations increased also. And then they finish it off by saying that this could contribute to the extinction of large mammals. The article claims that mammalian evolution trends towards larger body size in many lineages. Obviously, this isn't so in all lineages. Mice, for example, are quite small and they are . . . hmm, let me think . . . oh yes . . . MAMMALS. According to the paper, their population size prevents the accumulation of deleterious mutations compared to larger mammal lineages.
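The point about population size and deleterious mutations can be sketched with Kimura's diffusion approximation for the fixation probability of a new mutation. This is an editorial illustration, not from the thread; the selection coefficient and population sizes below are illustrative assumptions.

```python
import math

def fixation_prob(s, N):
    """Kimura's diffusion approximation for a new mutation starting at
    frequency 1/(2N) with selection coefficient s in a population of size N."""
    if s == 0:
        return 1.0 / (2 * N)  # neutral case
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

# A slightly deleterious mutation (s = -0.001, an assumed value) fixes far
# more easily in a small population than a large one:
for N in (100, 1000, 10000):
    print(N, fixation_prob(-0.001, N))
```

The fixation probability drops by orders of magnitude as N grows, which is the mechanism behind "Ne tends to decline" mattering for large-bodied lineages.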
Taq Member Posts: 10072 Joined: Member Rating: 5.2 |
Do you have evidence for evolution which is not evidence for evolution? Our detector, in this case, is phylogenetic comparisons of DNA. This detector is capable of telling us "evolved" and "not evolved". For example, we can compare the genes of Glofish (link), jellyfish, and trout. What we should find, if evolution is true, is that the Glofish genes should more closely resemble those of trout than of jellyfish. What do we find? We find that this is false for certain genes. Why is that? Because these genes DID NOT EVOLVE. Glofish contain genes which are almost exact copies of genes found in jellyfish, but not found in trout. How did this happen? Through intelligent design. Humans moved genes from jellyfish into Glofish to make them fluoresce under UV lights. There you have it. A detector that can detect both evolution and not evolution. So what detector will tell us ID or not ID?
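The "detector" described above amounts to comparing sequence similarity. A minimal editorial sketch, with entirely made-up toy strings standing in for the real genes:

```python
# Naive percent identity between equal-length toy sequences.
# All sequences here are invented for illustration, not real gene data.
def identity(a, b):
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

trout_gene   = "ATGGCCTTGAAC"
glofish_gene = "ATGGCATTGAAC"   # inherited: nearly identical to the trout copy
gfp_insert   = "ATGAGTAAAGGA"
jelly_gfp    = "ATGAGTAAAGGA"   # transgene: identical to the jellyfish copy

print(identity(glofish_gene, trout_gene))  # high: evolved by descent
print(identity(gfp_insert, jelly_gfp))     # 1.0: moved by design, not descent
```

Inherited genes track the species tree; the transgene matches a distant lineage exactly, which is the "not evolved" signal.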
PaulK Member Posts: 17826 Joined: Member Rating: 2.3 |
quote: I thought that I would let Smooth Operator reply before giving the real answer. Strictly speaking the specified information measure refers to the specification, rather than the actual event - in this case the ribosome. In fact it has to, for the obvious reason that only that information is specified. I can think of only one sensible way to get a measure of the specified information for the event, and that is to use the highest value for all the valid specifications that include the ribosome. An increase in specificity would allow us to draw a tighter specification (i.e. if the specification includes a minimum level of specificity) which would be expected to have a higher amount of specified information (it cannot be lower, since the new specification is contained within the older one). Thus it is possible for the specified information measure of the ribosome to increase - but it may not - and if the change has any other effects the specified information level might even decrease. (Technically we could do much the same with a specificity decrease, but unless very low specificity values are rare it's unlikely to be significant) I'll also comment on this:
quote: In fact this does seem to be a flaw in Dembski's methodology. Once streptomycin came into existence, specificity to streptomycin became a valid specification even though - in my assessment - it would not have been before. And that would make that specificity into specified information. Which, as you correctly point out, raises the possibility of a false positive.
Taq Member Posts: 10072 Joined: Member Rating: 5.2 |
As the title implies, is this change in enzyme specificity an example of an increase in CSI?
Meddle Member (Idle past 1296 days) Posts: 179 From: Scotland Joined: |
It doesn't matter if it lost affinity specifically to streptomycin. It probably can bind to something else too. The original structure is the original information content. Now, if it loses the affinity, to anything, it got degraded, and it lost information. Of course the ribosome binds to something else. It binds to mRNA and the anti-codons of tRNA carrying amino acids, in its role in protein synthesis. Streptomycin interferes with the normal role of the ribosome by irreversibly binding to it. This is why streptomycin is an antibiotic, since without protein synthesis the bacterial cell dies, and why it is ridiculous to describe the failure to bind streptomycin as a loss of function. The relevant mutation allows the ribosome to continue functioning in protein synthesis even in the presence of streptomycin, which can be described as a gain in function.
Wounded King Member Posts: 4149 From: Cincinnati, Ohio, USA Joined: |
You can't make it sensible to talk about binding sites which have evolved/arisen to bind a particular protein representing CSI in the binding target in all cases. Using your logic every time an antibody is raised to a different epitope on a protein the information content should rise! Every animal with an adaptive immune system is increasing the genetic CSI content all the time!
But now you talk about 'original information content' which is quite different. I put it to you that the 'original information content' would have arisen before the full streptomycin biosynthesis pathway, so in fact all you are losing is, as I suggested before, the 'free' informational value imparted by the development of streptomycin biosynthesis rather than any of the 'original information content'. But what you really seem to be saying is, once again, that any change from the first sequence derived from a gene is a loss of information, or in Durston et al.'s approach essentially any deviation from the consensus from an alignment of related sequences. The binding specificity, whatever it is for, seems totally unrelated to what you are saying; you are trying to hang some element of functionality on it when there simply isn't any. The only functional effect the mutant has is to allow the bacteria to survive in the presence of streptomycin. It hasn't lost the function of binding to streptomycin because that was never its function. The ribosome is arbitrary in some ways; it is not 100% conserved amongst all species, so clearly there is some allowable variation at different positions. You have decided to arbitrarily decree that any mutation changing the sequence from its initial state is a loss of information. Similarly Durston et al. arbitrarily decree that any change away from their consensus sequence will be a loss of information. You talk about multiple mutations giving rise to a functional ATP synthase gene, but that misses my point about novel mutations. If you take a random sequence and put it through multiple rounds of mutation and selection until you eventually produce a sequence which matches that of a consensus ATP synthase, then using Durston et al.'s method we will have arguably increased the information for that sequence, but only to match a sequence we already had; we haven't generated truly novel information.
Durston et al.'s method won't let us measure the information we have created if we experimentally evolve an ATP synthase enzyme with a radically different underlying sequence. All that adding our new enzyme to the alignment would do, in fact, would be to reduce the functional specified complexity for the whole gene family of existing ATP synthases, or more probably the program would fail to return a meaningful result because the sequences wouldn't align at all. Similarly, if we were to produce an ATP synthase with a truly novel mutation, i.e. one not extant in the family of functionally related genes, which improved the rate of ATP synthesis, Durston et al.'s method would again tell us that our new sequence has less functional information than the consensus sequence and would again reduce the FSC for the whole family of ATP synthase genes. My point was that Durston et al.'s approach sets an arbitrary maximum for FSC based on the consensus sequence. It will not allow you to measure an increase in FSC even if a novel beneficial mutation arises. In contrast, Hazen et al.'s method uses actual measures of functionality as an important element, so it would allow you to quantify an informational increase based on such mutations. Again we see how the IDist route of ignoring actual details of biological function for vague proxies makes their approach fruitless. TTFN, WK Edited by Wounded King, : No reason given.
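For reference, Durston et al.'s FSC measure works roughly like this: each alignment site contributes the ground-state entropy log2(20) minus the observed column entropy, so any deviation from an invariant consensus lowers the per-site value. An editorial toy sketch under that reading (the alignment is made up):

```python
import math
from collections import Counter

def site_fits(column, alphabet_size=20):
    """Functional bits at one alignment site: ground-state entropy
    log2(20) minus the observed Shannon entropy of the column."""
    n = len(column)
    h = -sum((c / n) * math.log2(c / n) for c in Counter(column).values())
    return math.log2(alphabet_size) - h

def fsc(alignment):
    """Sum of per-site fits over a (toy) aligned protein family."""
    return sum(site_fits(col) for col in zip(*alignment))

# Tiny made-up alignment: site 1 invariant, site 2 variable.
family = ["MK", "MK", "MR", "MK"]
print(round(fsc(family), 2))
```

Note how the variable site scores lower than the invariant one: under this measure, a novel variant can only pull the family total down, which is exactly the ceiling described above.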
Taq Member Posts: 10072 Joined: Member Rating: 5.2 |
The ribosome is arbitrary in some ways, it is not 100% conserved amongst all species so clearly there is some allowable variation at different positions. You have decided to arbitrarily decree that any mutation changing the sequence from its initial state is a loss of information. Similarly Durston et al. arbitrarily decree that any change away from their consensus sequence will be a loss of information. You talk about multiple mutations giving rise to a functional ATP synthase gene, but that misses my point about novel mutations. If you take a random sequence and put it through multiple rounds of mutation and selection until you eventually produce a sequence which matches that of a consensus ATP synthase then using Durston et al.'s method we will have arguably increased the information for that sequence, but only to match a sequence we already had we haven't generated truly novel information. Adding to this, when sequences are treated in this arbitrary manner you also commit the Sharpshooter Fallacy. ATP synthase was not the target or goal. Rather, increased fitness was the goal. It is entirely possible that an enzyme catalyzing a different chemical reaction could have been found. Calculating the odds of a specific enzyme arising through evolution misses the boat. You also need to add in every single amino acid sequence that would have increased fitness through a new metabolic pathway. The same applies to the flagellum. You need to find every single possible motility system in order to calculate the probabilities. Using an analogy: under the ID version of CSI, the probability of anyone winning the lottery is 1 in 150 million or so. According to ID logic it should take 150 million lottery drawings before anyone wins.
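The lottery analogy can be made concrete: the probability that one named ticket wins is tiny, yet the probability that someone wins is large. This is an editorial illustration; the ticket counts below are assumptions.

```python
# Sharpshooter fallacy in miniature: odds of a named ticket winning
# versus odds that *any* sold ticket wins.
p_ticket = 1 / 150_000_000       # one specific ticket (assumed odds)
tickets_sold = 100_000_000       # assumed number of tickets sold

# Probability that at least one of the sold tickets wins:
p_anyone = 1 - (1 - p_ticket) ** tickets_sold
print(p_ticket)   # ~6.7e-9 per named ticket
print(p_anyone)   # ~0.49: a winner is unsurprising despite long per-ticket odds
```

Calculating the odds of one specific outcome (a named ticket, or ATP synthase exactly) ignores every other outcome that would have counted as a "win".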
Smooth Operator Member (Idle past 5139 days) Posts: 630 Joined: |
quote:Creationism is not a mechanism. How exactly is creationism supposed to be a mechanism? Anyway, your argument is nonsensical however you look at it. I'm claiming to have a method of design detection. And when I apply it, you claim that I can't use it because that's "creationism". Well, that's like saying that you can show me evidence for evolution, except when you use evidence for evolution. How exactly is somebody then supposed to show evidence for evolution, if not with evidence for evolution? And how is somebody supposed to show evidence for a working design-detecting method, without actually referencing that design-detecting method?
quote:Those two things you are comparing are totally different. And if you want to call the advancement of transportation "evolution", then fine by me. But do not, and I repeat, do not try and confuse the mechanism which brought about those changes with what is going on in nature. The advancement of transportation was brought about by an intelligence. It's a case of intelligent design, over a more or less large period of time. And no, there is no way to compare this to what you are claiming is going on in nature. You are claiming that NO INTELLIGENCE was there in the process of biological evolution. You are claiming that natural selection and random mutations did the job all by themselves. So what you are basically doing is a common bait and switch method. You show me a case of intelligent design (advancement of transportation), you call that evolution, and then you claim that this is evidence for evolution in nature, which is supposed to be a non-intelligent darwinian process.
quote:Change over time caused by what? Random mutations, or intelligent input? Obviously intelligent input. The changes were caused not by random changes, but by intelligent changes. This is why this example is not applicable to biological evolution. Unless you want to claim that natural biological evolution is also designed. quote:This is also a great example of intelligent design. Such programs are designed to evolve. Dembski explained this perfectly using the AVIDA program. The changes are more or less random, but the selection isn't. When the selection happens, information is being transmitted from the environment into the robots. And this cutoff is called a fitness function. It selects certain robots and removes others. But this fitness function has been designed. So the same amount of information that was in the fitness function from the start has been transmitted into the robots later on. No new information was created.
quote:Which is why it is so easy for small populations to go extinct. Yes, I agree with that. And the same goes for large populations, only not as much, because selection is more effective in larger populations. But the same principles apply to all populations. Edited by Smooth Operator, : No reason given.
Smooth Operator Member (Idle past 5139 days) Posts: 630 Joined: |
quote:No, that's not the question. That was not the point of the experiment. The point was to show how much mutational load an enzyme can take before it loses its function. The one function that was known to exist was measured. And it went away after some time. So now we know how many changes there can be on average before a certain function is lost. quote:The only relevant question is: does the flagellum describe this pattern: "bidirectional rotary motor-driven propeller"? The answer is - yes. So since we know of other objects that exhibit this pattern, we conclude that it's a specification, and not a fabrication. quote:I'm willing to do the "correct" calculation. I'm simply waiting for you to tell me what to do. quote:Let's see what Dembski has to say about that... quote:http://www.designinference.com/.../2005.06.Specification.pdf Let me explain what he is saying here briefly. The ϕs(T) part of the equation describes the specificational resources. That is, the amount of all possible specifications relevant to the specification T that is exhibited by the event E. Imagine now a descriptive language D* which we will say is the English language, consisting of approx. 100,000 words. That's 10^5 basic concepts. Using the definition of CKS information theory, the flagellum describes the pattern "bidirectional rotary motor-driven propeller". This pattern consists of 4 basic concepts, that is, 4 words. The total complexity of this specification is thus 10^5 × 10^5 × 10^5 × 10^5, which equals 10^20. This is the complexity of the specification as calculated by Dembski himself. After that we multiply it by 10^120, and by the probability of the event, and then we get the final number...
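The arithmetic described above can be sketched directly as Dembski's context-dependent specified complexity, chi = -log2(10^120 · ϕs(T) · P(T|H)). This is an editorial illustration; the event probability P used below is a placeholder assumption, not a value from the thread.

```python
import math

def specified_complexity(phi, p, replicational_resources=1e120):
    """chi = -log2(replicational_resources * phi_S(T) * P(T|H)),
    the quantity from Dembski's 2005 'Specification' paper."""
    return -math.log2(replicational_resources * phi * p)

phi = 1e20    # "bidirectional rotary motor-driven propeller": (10^5)^4 concepts
p = 1e-300    # hypothetical probability of the event E (placeholder)
print(specified_complexity(phi, p))  # positive: Dembski's design threshold is chi > 1
```

Note that the result is entirely driven by the assumed P: the calculation cannot proceed without a probability estimate for the event, which is the very point under dispute in the thread.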
quote:I agree. The problem I have with your interpretation is that you confuse growth mechanisms with high probability events. quote:But this is not a difference. Remember. Mutations are neutral/beneficial/deleterious depending on the environment. Therefore, in some environment, blue eyes could be beneficial, or deleterious. So no, there is no difference, the same natural selection is operating on all traits. In all cases it depends on the environment and the mutations if they will be removed or spread and at which extent. There is nothing special in this case.
quote:Neither are a lot of other traits. So what? Are blue eyes fixed in the human population? No, obviously they are not. Only a small minority has them. Does that mean they are deleterious? No, obviously not. quote:That's my argument also. I was just making an example of an individual. Are you telling me that other traits that get selected besides genes are not working on the level of population? Are you telling me they do not interfere with genetic selection on the level of population over long periods of time? Of course they do! And that is why natural selection is inefficient. quote:No, it's the genetic entropy that is causing its shortness. And the most important question is, how is that chain supposed to evolve into something more complex if it is constantly decreasing? It's not! And that's why evolution doesn't work. If it did, the chain would be getting longer. But it's not getting longer.
Smooth Operator Member (Idle past 5139 days) Posts: 630 Joined: |
quote:No I didn't miss that part. What's so special about it? The slightly deleterious mutations keep accumulating. That is the point of the article, and that's my point. What am I missing? quote:No, either an increase or a decrease in size itself has nothing to do with genetic entropy. Nothing, absolutely nothing. Neither a reduction in size nor an increase causes genetic entropy. What causes genetic entropy is the degradation of biological functions.
Smooth Operator Member (Idle past 5139 days) Posts: 630 Joined: |
quote:Okay, I don't really care why this is. My point is simply that it's happening. It's causing genetic entropy. If you consider genetic meltdown a catastrophe, then no, the population is not going to recover.
Copyright 2001-2023 by EvC Forum, All Rights Reserved