Author Topic:   What exactly is ID?
PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 810 of 1273 (544173)
01-24-2010 1:31 PM
Reply to: Message 806 by Smooth Operator
01-24-2010 12:17 PM


CSI & Genetic Entropy discussions
quote:
We know it lost all KNOWN functions. That is the only thing I'm interested in anyway. Like I said before, it may very well be that it's useful for something else. I'm not disputing that. Maybe it is, maybe it isn't. But what we do know is how many mutations it takes for an enzyme to lose its known function.
I will only comment that you were interested enough in the claim that it had lost ALL function to try to dispute the point.
quote:
And D* = bidirectional rotary motor-driven propeller = 10^20.
Wrong! D* = the set of "bidirectional rotary motor-driven propellers". Your 10^20 is Dembski's estimate of the number of four-level concepts, as your quotes showed.
quote:
Is the probability of a "bidirectional rotary motor-driven propeller" that consists of 10 proteins equal to the probability of a "bidirectional rotary motor-driven propeller" that consists of 100.000.000.000 proteins?
No, and the probability of getting some "bidirectional rotary motor-driven propeller" - the probability that we actually want - is different from both of them.
quote:
Yes, exactly, D* here is the "a bidirectional rotary motor-driven propeller", which let me repeat, means that: D* = bidirectional rotary motor-driven propeller = 10^20. So I was right from the start.
Where "Yes, exactly" means "No". All you have done is repeat the same grossly erroneous claim. 10^20 is Dembski's estimate of the number of 4-level concepts, of which
bidirectional rotary motor-driven propeller is one.
quote:
I know it doesn't! But it does matter to the probability of E! Which is also important.
No, the probability of E is not important. Only the specified information matters. And that is derived from the probability of D*.
quote:
Do you, or do you not, understand that the probability of getting the number 6 after throwing a die is 1/6? If you have two dice, and you want to get the number 6 on both, the probability decreases, and now it's 1/12. Therefore, a flagellum consisting of 50 proteins has a higher probability of forming by chance than a hypothetical one consisting of 10.000.000 proteins. Therefore its complexity is relevant to the calculation!
You can't even get the probability of rolling two 6s on 2 dice correct. It's 1/36. Besides that, your whole argument deals with irrelevancies. We want to know the probability of getting ANY "bidirectional rotary motor-driven propeller", not one using a particular number of proteins, because the number of proteins is not part of the specification.
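To put numbers on this, here is a minimal Python sketch (my own illustration of the arithmetic, not anything taken from Dembski): independent probabilities multiply, and the probability of hitting ANY member of a target set is the sum over its members - which is why the probability of getting some "bidirectional rotary motor-driven propeller" is at least as high as that of any particular one.

```python
from fractions import Fraction

# Probability of one die showing a 6.
p_six = Fraction(1, 6)

# Independent events multiply: both dice showing 6.
p_both = p_six * p_six
print(p_both)  # 1/36, not 1/12

# The probability of hitting ANY member of a target set is the sum
# over its members, so it can only be >= the probability of one
# particular member. E.g. the (hypothetical) target set below:
target = {(6, 6), (6, 5), (5, 6)}
p_any = Fraction(len(target), 36)
print(p_any)  # 1/12 >= 1/36
```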
quote:
Dear God! I already told you! It's in the NFL, which I already said is what Dembski calculated.
Do you want my help or not? If you want it then all you have to do is to provide the information I ask for. Remember I'm only doing this as a favour to you. If you can't be bothered to look up the details of the calculation then there's no reason why I should.
quote:
A regular pattern does not equal specification.
I'm afraid that it does. Look up Dembski's definition of specification again.
quote:
WTF am I dodging? Nothing! You are the one who is pretending not to understand what I'm talking about.
I DO NOT CARE IF THE MUTATION IS UNUSUAL!!!!!!
I will observe again that you cared enough to try to argue against it. Even if you did not care enough to actually address the reason why it is unusual (which is the dodging). And in fact there is a reason why you should care. If sickle-cell is not a typical example of a beneficial mutation it cannot be used as such.
quote:
The point is that natural selection selects anything, including the "unusual" mutations, if they confer reproductive fitness. Even those mutations that reduce biological functions, such as the sickle cell mutation. Therefore, it contributes to genetic entropy.
Of course this argument is absurd since it equates increasing fitness with declining fitness. And your main example - sickle-cell - is atypical, and so can't be used.
quote:
No! Stop repeating this crap over and over again. There is no noise averaging.
The field of statistics would disagree with you. It is a fact that given a large population (of samples) noise will tend to average out - because it is random.
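A quick simulation sketch of that averaging effect (my illustration only, with made-up noise terms): the mean of independent random noise shrinks toward zero as the sample grows, with no need for an infinite population.

```python
import random

random.seed(1)

def mean_noise(n):
    """Average of n independent noise samples drawn uniformly from [-1, 1]."""
    return sum(random.uniform(-1, 1) for _ in range(n)) / n

# Larger samples -> the average noise gets closer to zero.
for n in (10, 1_000, 100_000):
    print(n, round(mean_noise(n), 4))
```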
quote:
For a full removal of all mutations you would need an infinitely large population. Since you don't have one, mutations accumulate. I never said ANYTHING about some correlation. The effects, that is, the noise, are too strong to be averaged out, precisely because genetic changes are not strong enough, and they are just a tiny part of what is being evaluated by natural selection.
I'm not arguing for a "full removal of mutations". Just a dynamic equilibrium where the number of deleterious mutations maintained in the population falls short of mutational meltdown. The other problem is that you are now assuming that drift dominates to the point where there is no selection at all. I suggest that you produce evidence for this bold claim.
(And yes, I know that you didn't mention correlation because I know that I brought it up. Because you need correlation to avoid the averaging effect of large populations.)
quote:
LOL! This is simple math! Anything less than 100% equals accumulation of mutations!
Wrong again. 100% efficiency would produce guaranteed removal of all deleterious mutations when in fact some can reach fixation. Unfortunately accumulation requires more than that - it requires that there cannot be a balance point, where the rate of removal of deleterious mutations equals the rate at which more enter the gene pool. However, by your own admissions all surviving organisms must at least be close to such a point (because the rate of accumulation must be very, very slow to explain why life still survives). And if it is that close the evidence offered so far cannot tell us that the balance point has not been reached in at least some species.
I will reply to your comments on the monster only to point out that I make no admissions. I simply decline - in deference to the preferences of the site owner - to add another subject to this discussion.
Edited by PaulK, : No reason given.

This message is a reply to:
 Message 806 by Smooth Operator, posted 01-24-2010 12:17 PM Smooth Operator has replied

Replies to this message:
 Message 911 by Smooth Operator, posted 01-27-2010 4:36 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 876 of 1273 (544436)
01-26-2010 12:24 PM
Reply to: Message 875 by traderdrew
01-26-2010 12:11 PM


Re: addition, subtraction, addition, subtraction, where does it end?
quote:
Actually I think this quote (just below) from the Pandas and People article is more relevant to me than the one you posted. It would seem logical to me to measure the specificity of an object first and then compare those objects to others. After you have many comparisons and references, then you can start to determine what is complex and what is not and the possible grey areas in between.
Dembski's writing is conflicted on the subject of whether the system's specification comes before or after the determination that the system is complex. For example, in his infamous explanatory filter it is quite explicit that the determination of the system's complexity comes before the determination that it is specified. In other places, however, specification comes first.
Actually it doesn't tell you much. (And it tells you nothing unless you know what Dembski means - something you have been highly resistant to learning). In fact the two could be done in any order, but there are practical reasons why you would almost always check that the observed pattern is a valid specification first.

This message is a reply to:
 Message 875 by traderdrew, posted 01-26-2010 12:11 PM traderdrew has replied

Replies to this message:
 Message 878 by traderdrew, posted 01-26-2010 12:28 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 880 of 1273 (544442)
01-26-2010 12:40 PM
Reply to: Message 878 by traderdrew
01-26-2010 12:28 PM


Re: addition, subtraction, addition, subtraction, where does it end?
quote:
In other words, you agree with me.
No, I'm warning you that you don't know what you are talking about. There is no "specificity of an object" in Dembski's method, nor is complexity calculated by comparing objects. The quote doesn't help you because it is talking about a method that is quite different from whatever it is that you mean.
If you can't be bothered to learn what Dembski's method actually involves then you really really ought to stop talking about it - and attacking people who try to explain it to you.

This message is a reply to:
 Message 878 by traderdrew, posted 01-26-2010 12:28 PM traderdrew has not replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 920 of 1273 (544651)
01-27-2010 6:19 PM
Reply to: Message 911 by Smooth Operator
01-27-2010 4:36 PM


Re: CSI & Genetic Entropy discussions
quote:
For the trillionth time - ALL KNOWN FUNCTIONS.
No, that is the position you retreated to, after it became clear that you did not have the evidence to rule out all function.
quote:
LOL, I can't believe this! D* is not a set of any kind. D* is the descriptive pattern. There is no instance of the word "set" anywhere! Stop making things up! Do you even know, do you have even the faintest idea of what these "four-level concepts" are? No, obviously you don't.
No, D is the descriptive pattern. Strictly speaking D* is the pattern considered as an event - or put another way the event of matching the pattern. But given that we have a physical object rather than the event, in this case regarding it as the set of objects which match the pattern is quite reasonable. Certainly more reasonable than equating it to a number as you did.
Let me remind you:
TDI p165
...the event that needs to have a small probability to eliminate chance is not E, but D*
The four-level concept is also easy to understand. It is a concept made up of four elements. In this case "bidirectional", "rotary", "motor-driven" and "propeller". Dembski allows for 10^5 possible elements, therefore estimating the number of four-level concepts as (10^5)^4 = 10^20.
Look at your quote in Message 688
For a less artificial example of specificational resources in action, imagine a dictionary of 100,000 (= 10^5) basic concepts. There are then 10^5 1-level concepts, 10^10 2-level concepts, 10^15 3-level concepts, and so on. If bidirectional, rotary, motor-driven, and propeller are basic concepts, then the molecular machine known as the bacterial flagellum can be characterized as a 4-level concept of the form bidirectional rotary motor-driven propeller. Now, there are approximately N = 10^20 concepts of level 4 or less, which therefore constitute the specificational resources relevant to characterizing the bacterial flagellum.
"Specificational resources" is essentially the number of possible specifications. It in no way compensates for the fact that the specification describes many things that are not the E Coli flagellum.
quote:
quote:
2.) A descriptive language D.
4.) A pattern D whose correspondence * maps D to E.
TDI - Page 144.
There is no word "set" anywhere. D is the descriptive language. D* is the pattern that describes the event E. Remember that.
If you actually READ the quote you will see that the pattern is D, and D* is the correspondence of the pattern.
quote:
quote:
... but rather for formulating simultaneously both a description D and a map * that maps D to E (i.e. a pattern that delimits E) so that there is no question...
TDI - Page 155.
Again, it's a pattern. D* is a pattern. Not a SET of patterns, but a pattern.
In this quote the pattern is (D, *), whereas in the first it was D. Neither is D*. D* is not described as a pattern. It is, however, described as an event on p165 quoted above.
quote:
The "bidirectional rotary motor-driven propeller" is the pattern D*. It consists of 4 concepts. As I quoted, they are "bidirectional", "rotary", "motor-driven" and "propeller". Each of those concepts has the complexity of 10^5. And since there are 4 of them, their full complexity is 10^20.
Therefore, D* = bidirectional rotary motor-driven propeller = 10^20.
As the quote I have produced above makes clear, 10^5 is the number of basic concepts. It is not the "complexity". And it certainly is not the probability of D*, which is what Dembski says must be calculated.
quote:
Exactly, it's different. So tell me, why in the world would you want to include the complexity of some other hypothetical flagellum in the calculation for the known flagellum that consists of the known 50 proteins? When it's obvious that their differences in complexity are going to yield different results. A flagellum that had a complexity of less than 400 bits, but corresponded to the pattern bidirectional rotary motor-driven propeller, would not even make it into the calculation, because its complexity is too low, and it would be automatically attributed to chance, not design.
Yes, the probability you want to calculate is different from the one that Dembski says that you should calculate. And I think that it should be obvious why you should calculate the probability required by Dembski's method - the probability of D* - rather than some other probability of an event which isn't even fully specified.
quote:
That means that it doesn't matter that some flagellum has 50 proteins, and another has 1.000.000 proteins. Their complexity is the same according to you.
As I have told you before the complexity figures are associated with the specifications rather than the raw events. This is because the relevant probability is the probability of meeting the specification - i.e. the event D*, as Dembski says.
quote:
Wrong example. Yes, 1/36 is the correct number. Anyway, by claiming that their complexity is not important you are clearly wrong. Because that's like saying that it's the same probability of getting one 6 and getting two 6s. Or that the probability of a 50 protein flagellum is the same as the probability of a 1.000.000 protein flagellum. But it's obviously not. The lower the probability, the higher the complexity. Which means the overall CSI is higher.
As I keep pointing out the probability we want is the probability of getting ANY "bidirectional rotary motor-driven propellers". See the quote from TDI above.
quote:
We discussed this before. I don't care anymore because you keep pretending you don't understand what I'm talking about.
I understand what you are talking about. I'm just not going to do the work for you. You would need that information to do the calculation anyway. Of course as I have informed you more than once it is a complete waste of time because it's the wrong probability anyway.
quote:
Simple patterns are not specifications. A snowflake is a simple recurring pattern, which is not a specification. Because you have to first look at the snowflake to know the pattern. If the snowflake was in the shape of a car, then it would be a specification.
Simple patterns ARE specifications. They are very good specifications. Consider the Caputo case. If the Democrats had been placed first on the ballot EVERY time, wouldn't that give more reason to suspect tampering, rather than less?
With a snowflake the problem is that we don't have a specification that provides a detailed description of a particular snowflake. Again, check the definition of specification.
quote:
It doesn't matter if it is typical or not. The point is, it got selected by natural selection, and by doing this, the genetic entropy increased. Showing that natural selection is not perfect, and does not remove ALL mutations that degrade the genetic information.
However that was not the point being argued. You wanted to argue that beneficial mutations in general increased genetic entropy. Which you can't do by relying on an atypical example.
quote:
But in your case, the large enough population is infinity. Which you do not have. Therefore, there is no noise averaging.
If you've run the numbers - and you would have to, to make such a claim - let's see the calculations.
quote:
I already cited articles that show how slightly deleterious mutations cause genetic meltdown. And no, there is no such thing as a dynamic equilibrium. Because that would mean that at some point natural selection would have to work at 100% efficiency, and at some point under 100%. And we know that it NEVER gets to 100%.
You've cited articles saying that mutational meltdown can happen in small populations. Which implies that it is unlikely to be a problem for larger populations. And you are quite wrong to say that the balance requires 100% efficiency. It doesn't. All it requires is that the efficiency is high enough to hit the balance point before mutational meltdown.
quote:
No, I need no correlation. What would correlation do for me?
A correlation would mean that we could not assume that the noise will tend to average out.
quote:
The balance point is where natural selection removes all mutations that come into the gene pool. Which means that it works at 100% efficiency at that specific point in time. Which we know is not possible.
Wrong. The balance point is where deleterious mutations leave the population at the same rate as they arrive. Removing a deleterious mutation that has hung around for 50,000 years - or longer - is as good as removing one that appeared last week. We don't need 100% efficiency for that.
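A toy sketch of such a balance point (my illustration, with invented rates - not a real population-genetics model): mutations enter at a fixed rate and each one has some modest per-generation chance of being lost to selection or drift, yet the count levels off instead of growing forever.

```python
import random

random.seed(1)
NEW_PER_GEN = 10   # deleterious mutations entering the gene pool per generation
P_REMOVE = 0.05    # chance any given mutation is purged in a generation

count = 0
for gen in range(1, 501):
    count += NEW_PER_GEN
    # Each mutation is independently removed with probability P_REMOVE.
    count -= sum(1 for _ in range(count) if random.random() < P_REMOVE)
    if gen % 100 == 0:
        print(gen, count)
# The count settles around 190 (roughly NEW_PER_GEN / P_REMOVE) rather
# than accumulating without bound - removal is nowhere near 100% efficient.
```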

This message is a reply to:
 Message 911 by Smooth Operator, posted 01-27-2010 4:36 PM Smooth Operator has replied

Replies to this message:
 Message 925 by Smooth Operator, posted 01-27-2010 7:27 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 934 of 1273 (544741)
01-28-2010 2:59 AM
Reply to: Message 925 by Smooth Operator
01-27-2010 7:27 PM


Re: CSI & Genetic Entropy discussions
quote:
NO! Because I already said that we are not going to invent functions that we do not know about. We know of one function it had. It lost it, therefore it lost all functions.
So now - after trying to pretend that you didn't say it - you are going back to the same illogical argument. In fact you don't know whether the mutated version had lost all function or not. The tests weren't done. And even if they had been they would only apply to the version tested, and 20% difference allows for a LOT of different versions.
quote:
No, D is the descriptive language! I just quoted you the part where it said that!
However the quote I was actually talking about said that D was the pattern. Expecting me to ignore the quote I am talking about and look at some other quote instead is really stupid.
quote:
The flagellum is the event. Physical object equals the event. Coin tosses could be an event also. Anything is regarded as an event. And since the word "set" is not used anywhere, please do not use it. Do not invent notions that Dembski never did use.
I'm not inventing notions that Dembski never used. And you are forgetting the quote from p165 where it describes D* (which is NOT D, remember !) as an event.
quote:
Which means that D* = bidirectional rotary motor-driven propeller = 10^20.
No, it doesn't mean that. D is "bidirectional rotary motor-driven propeller". D* is the correspondence of D and 10^20 is the number of four-level concepts.
quote:
Which specification describes that?
Which specification describes what? The specificational resources aren't described by a specification. The specification I referred to - as you ought to know by now - is "bidirectional rotary motor-driven propeller".
quote:
Which means that D* is the pattern described by bold D. There is a bold and a regular D. One is a descriptive language, the other is a pattern.
Since it specifically says that the pattern is D, it doesn't mean that D* is the pattern.
In fact neither quote says that D* is the pattern. They don't agree on what the pattern is, but that's Dembski for you.
quote:
I never said it was. Obviously it's not. But 10^20 is the complexity of the specification.
Well you're still wrong, it's the numerical value used to quantify the specificational resources.
quote:
But do you not understand that a flagellum that describes the pattern bidirectional rotary motor-driven propeller and consists of 50 proteins has an IDENTICAL complexity of the specification as one that consists of 1.000.000 proteins? So by your logic their probability of occurring by chance is the same! Which is totally FALSE! Since they describe the same pattern, the complexity of what they describe is IDENTICAL! It's obvious you need to also include their own complexity in the calculation.
I can only repeat that the complexity belongs to the specification, not to the event. As Dembski tells us, we calculate the probability of meeting the specification, not the probability of a partially specified event. Also an event may have many possible specifications which have different probabilities.
And I have to add that MY logic doesn't say that the probability of getting each of the events described by the specification is the same as getting any particular one. Excepting the degenerate case where there is only one possible event that fits the specification, that will always be false. My logic says that the events will often have different probabilities - but the difference is down to unspecified details which are not relevant to determining if the event is CSI.
quote:
But it's HARDER to meet the specification with MORE proteins! Therefore the probability is LOWER. And it's EASIER to meet the specification with FEWER proteins, therefore the probability is HIGHER! And that also means that every instance of an event is calculated separately!
But you will still have to combine them to get the probability we want. Remember we want the probability of getting ANY of these, not the probability of getting a specific one.
quote:
And those events that consist of a complexity that is under 400 bits are not calculated at all! What would be the point if they are automatically claimed to be non-designed!?
If any one of the results comes to less than 2^-400 you had better abandon the whole thing since the probability of getting a "bidirectional rotary motor-driven propeller" is at least as high as the probability of getting a particular "bidirectional rotary motor-driven propeller". Time to give up on that specification, rather than fiddling the figures (which is what your suggestion amounts to).
quote:
NO WE DO NOT!!!! What would be the point!?
The point is following Dembski's method. Arguing that we shouldn't do it because it doesn't guarantee getting the result you want is just silly.
quote:
Simple patterns that also conform to an independently given pattern are specifications. It's not enough that it's just simple.
The pattern IS the specification. Simple patterns are usually the best specifications.
quote:
Nope. I'm arguing that beneficial mutations are not going to reduce the effects of deleterious and slightly deleterious mutations, because they also CAN, not always, but sometimes CAN reduce genetic information. Which is true. They can reduce genetic information. Therefore, just by occurring, they will not save a population from extinction.
In fact you originally argued that nearly all beneficial mutations contributed to genetic entropy - using sickle-cell as your main example. And as I have pointed out before your argument fails because genetic entropy is about reducing fitness and beneficial mutations (by definition) increase it.
quote:
There is no calculation here, only simple logic.
Take the number 100 to represent the amount of genetic information. How small would the reduction have to be for this number never to reach zero? Is it one? Well obviously not, because in 100 steps you will reach 0, and the population is dead. Is it 0.5? Nope, that just means that you will reach 0 in 200 steps. So tell me, how small does it have to be for you never to reach zero? Obviously infinitely small. Or, the starting number would have to be infinitely large.
I could point out a number of flaws in the premises and reasoning, but the really fatal problem in your response is that you are addressing the wrong problem. The question is about the averaging effect of a large population in reducing the effects of "noise". You claim that there is no averaging effect (presumably meaning no significant averaging effect). However since the effect must be there to some extent, the only way to find out how significant it is is to use the numbers as they apply to real populations. Even if you were attempting to answer the actual problem instead of a completely different issue, you would need data and calculations - not theoretical speculations.
quote:
The tests were performed on small populations because they experience the effects of genetic meltdown faster! What would be the point of producing an experiment that lasted for 10.000.000 years!?!?
Unfortunately we aren't talking about the tests. We are talking about a general statement giving background information. A statement which specifically identifies it as a potential problem for small populations.
quote:
A correlation of what? Be more specific.
I was specific. A correlation between a mutation and one of the other factors you identified as influencing the selective process, such that a deleterious mutation would have a consistent advantage which would not average out across the population.
quote:
YES YOU DO! How else are ALL mutations that get in going to get out if natural selection is not working at 100% efficiency!?
The dynamic equilibrium does not require that "ALL mutations that get in, get out" or even all deleterious mutations. As I have said, in this case deleterious mutations are held at a fixed level. It is only the number of deleterious mutations that matter. The fate of individual mutations - whether lost immediately or remaining in the population indefinitely is irrelevant. Therefore 100% efficiency (which would be the immediate removal of all deleterious mutations) is not required.
quote:
9 out of 10, or 49 out of 50, or 799 out of 800 is less than 100% efficiency. And in that case genetic entropy increases. Only during 100% efficiency does it not increase, and you have a BALANCE! Only if you remove ALL mutations that get in do you have a BALANCE, but that means that natural selection is working at 100% efficiency! Which we know is not true.
Unfortunately for your argument the balance also includes mutations lost by genetic drift. As the number of deleterious mutations goes up, the number of deleterious mutations lost through drift also goes up. And as I state above the balance is not about the fate of individual mutations, it does not require that we remove all mutations, only that the numbers (etc) remain roughly constant.
Edited by PaulK, : No reason given.

This message is a reply to:
 Message 925 by Smooth Operator, posted 01-27-2010 7:27 PM Smooth Operator has replied

Replies to this message:
 Message 945 by Smooth Operator, posted 01-28-2010 4:21 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 947 of 1273 (544844)
01-28-2010 5:17 PM
Reply to: Message 945 by Smooth Operator
01-28-2010 4:21 PM


Re: CSI & Genetic Entropy discussions
quote:
My argument was the same from the start. Only one function was known, it was lost, therefore all functions were lost. Yes, there could be some unknown ones left, but we are not going to invent them just for fun. The point remains that the KNOWN function is lost. That is my point. Please continue from this, don't go back to what you think I was saying.
Your argument that all function was lost is not logically valid. That the measured function was lost is not disputed. I don't know why we keep having to go around and around on this point with you continually shifting your position when you aren't even making anything of the point.
quote:
If you are talking about the regular D then you are right. But if you are talking about the bold D then no. It's a descriptive language.
Given that I wrote the plain D rather than the bold D, and given that my statement is correct if I meant the plain D and not the bold D, it seems clear that I meant the plain D and not the bold D.
quote:
And D* is surely not a correspondence of D and 10^20 is the number of four-level concepts. You see, on page 136 it clearly says that regular D is just a shorthand for D*.
I am afraid that you are misreading it. It says that D may be used to represent (D, *), not D*. I can see the sentence that you mean, but the following sentence makes it clear:
It will be clear from the context whether D signifies the full pattern (D, *) or merely its first component.
quote:
And * maps regular D to E, not to "bidirectional rotary motor-driven propeller". E in this case is the flagellum that consists of 50 proteins.
Again you misread it. Just as there are two Ds, there are two Es. The correspondence maps D to E (p144). And the result of applying * to D is D*. NOT E unless E=D*. And in this case the problem is that the E you want to use is NOT D*.
quote:
You said that: "It in no way compensates for the fact that the specification describes many things that are not the E Coli flagellum". Which are those specifications?
The specification is "bidirectional rotary motor-driven propeller". There are many things that are "bidirectional rotary motor-driven propellers" that are not E Coli flagella. Got it ?
quote:
It's both! We need both the complexity of the specification, and the complexity of the event. Because it is much harder for a lower probability event to match the same pattern than it is for a higher probability event. That is why we need to include both complexities.
You are making no sense. We don't need an improbability inflated by unspecified details. Why would we ?
quote:
But it is important because a flagellum consisting of 300 bits will not be regarded as CSI, but a flagellum that describes the same pattern and consists of 500 bits will! Obviously we need to take into account their complexities.
It's not obvious to me. In fact it is obvious to me that Dembski is right on this point and that the probability of D* is the only one that matters. Unspecified information is irrelevant to identifying design - and anything outside of D* is outside of the specification D.
quote:
NO WE DO NOT! Why the hell would we want to combine them!? A 300 bit flagellum is not CSI in the first place!!! What's there to combine? Where did Dembski say anything about combining!?
Here's that quote again (TDI p165)
...the event that needs to have small probability to eliminate chance is not E, but D*.
Since D* includes BOTH the flagella (and more) you would need to combine their probabilities to get to D*. (And if you find ONE with less than 400 bits of information, the probability of D* will not be low enough so there is no need to continue).
quote:
NO WRONG! It's obvious to me now that you misunderstand the whole concept. Why the hell do you think the UPB of 10^120 even exists!? By your logic it does not even have to exist! Obviously it doesn't according to you because you are only taking into account the complexity of the specification!
Every time you write "by your logic" you mean some crazy idea that you have. Can you please stop doing that. And, in fact, the position you are disagreeing with is Dembski's.
quote:
Yet the point is to compare whether the probability of the event that happened is high or low in matching the complexity of the specification! That is why we also need the complexity of the event. How do you do that if you only have one complexity, that of the specification? Obviously, you don't, because you can't!
Here's that quote again (TDI p165)
...the event that needs to have small probability to eliminate chance is not E, but D*.
So THAT is the probability we want to compare to the UPB.
quote:
If and only if it is detachable from the event in question.
Which is one of the reasons why simple patterns are better. They are easy to detach from the observed event.
quote:
No. I never said that.
Unfortunately you did indeed claim that nearly all beneficial mutations increased genetic entropy, and offered sickle-cell as your first piece of evidence.
quote:
And you need to be more specific about the fitness. Genetic entropy is primarily about genetic information. The reduction of genetic information. Which I already said is not proportionally correlated with REPRODUCTIVE fitness. Something like a sickle cell mutation can increase REPRODUCTIVE fitness, yet reduce genetic information, thus increasing genetic entropy. So you need to be more specific. What fitness are you talking about?
You are simply wrong about genetic entropy. Genetic entropy is about reproductive fitness, not some poorly-defined and unquantifiable concept of "genetic information".
quote:
So you are telling me that 1+1 does not equal 2, right? A reduction is a reduction. Mutations on average reduce genetic information. More in smaller populations, less in large populations. But in any population, the reduction still exists. You can not completely remove the reduction by invoking larger population sizes.
What I am telling you is that the point you were meant to be supporting was your assertion that the population needed to be infinite for the statistical effects of a large population to significantly reduce the impact of "noise". That has nothing to do with your ideas about "genetic information".
quote:
And we extrapolate this to larger populations too. Because there is no reason to think that larger populations will fully remove the effects of genetic entropy. They will reduce them, but never fully remove them. Thus entropy continues to increase.
That is simply your opinion. The experts working in the field don't seem to agree. Which is why you never found a paper that actually supported your claim.
quote:
The only help for you would be if there was a correlation of all those traits with the genetic traits. If it was correlated so that those who have all the good traits, plus the beneficial mutations, always got selected, then the noise would not matter. But on average it doesn't happen. There is no correlation on average. And bad and good traits from every source are, on average, equally spread through the population. Therefore, noise exists.
And in large populations - as I keep pointing out - the effects of noise will be reduced. That is why genetic drift - your "noise" - is weaker in larger populations. That is why I don't need to appeal to correlations - and it would be to your advantage if you could.
quote:
The removal of all deleterious mutations is 100% efficiency. If it's not 100% deleterious mutations accumulate.
Of course I have already pointed out a case where the removal through selection is not 100% efficient and where not all deleterious mutations need be removed - and yet they do not accumulate. A dynamic equilibrium where the NUMBER of deleterious mutations removed from the population - by either selection or drift - equals the NUMBER of deleterious mutations added to the population.
It is not necessary that all be removed, nor is it necessary that selection should do all the work of removal when drift can also assist.

This message is a reply to:
 Message 945 by Smooth Operator, posted 01-28-2010 4:21 PM Smooth Operator has replied

Replies to this message:
 Message 956 by Smooth Operator, posted 01-31-2010 8:59 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 960 of 1273 (545101)
02-01-2010 2:54 AM
Reply to: Message 956 by Smooth Operator
01-31-2010 8:59 PM


Re: CSI & Genetic Entropy discussions
quote:
By all functions I mean all known functions. Since there is one known function, and it was lost. Meaning, one function is lost, zero are left, meaning all functions were lost.
In fact we know that you didn't mean that. But we can agree that the only function actually tested for was lost.
quote:
I see no problem with that, I just assumed you meant the bold D since you said that D* is not the pattern. But you see, it is; both D* and D are the pattern, because D is short for D*.
D is not used as an abbreviation for D*. It is used as an abbreviation for (D,*). They are not the same thing.
quote:
quote:
I am afraid that you are misreading it. It says that D may be used to represent (D, *), not D*. I can see the sentence that you mean, but the following sentence makes it clear:
Again, page 136.
quote:
Formally, a pattern may be defined as a description-correspondence pair (D,*) where the description D belongs...
...
(D,*) is therefore a pattern relative to the descriptive language D and the collection of events E.
As you can clearly see, (D,*) is the pattern. And D* is the event that describes that pattern. Which is basically the specification.
As you can clearly see the pattern is (D,*). D is a description (not an event). D* is the event described. And D is sometimes used as a shorthand for (D,*) - but never for D*.
quote:
You don't apply * to the regular D but to the bold D.
You only have to look at it to see that that is wrong. It is D* (with no bolding). Therefore * is being applied to the description D.
quote:
Exactly! So why in the world, if we wanted to calculate the complexity of the flagellum, would we want to include all other objects that specify the same pattern yet are not the flagellum? It makes no sense.
It makes perfect sense, and I already explained why. If we use the specification "more heads than tails" for a given run of coin tosses we want the probability of getting any of the sequences that fit that specification. We don't want the probability of that particular sequence.
As Dembski says (yes, it's p165 again !)
...the event that needs to have small probability to eliminate chance is not E, but D*.
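Here is a minimal sketch of that calculation (my own worked example for the coin-toss specification, nothing more): P(D*) sums the probabilities of every sequence matching "more heads than tails", rather than taking the probability of one particular sequence.

```python
from fractions import Fraction
from math import comb

def p_more_heads(n):
    """Exact probability of strictly more heads than tails in n fair tosses."""
    favorable = sum(comb(n, k) for k in range(n // 2 + 1, n + 1))
    return Fraction(favorable, 2**n)

print(p_more_heads(10))  # 193/512 - compare 1/1024 for any ONE sequence
```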
quote:
I'm not inflating anything! I'm comparing one probability to another. How in the world are you going to tell if the probability of hitting the target, that is, specifying the pattern, is small enough to infer design? HOW!? If you only have the complexity of the specification, then you can't. You need both the complexity of the specification and the event (in this case the 50 protein flagellum) to see if their ratio is less or more than 1/2.
The target event is D* (that is the whole point of the specification - to define the target). So if we want to calculate the probability of hitting the target we want the probability of D*. And that is what we need to infer design:
As Dembski says (TDI p165)
...the event that needs to have small probability to eliminate chance is not E, but D*.
quote:
The S(T) is the complexity of the specification, the P(T|H) is the complexity of the event. You need both of them!
S(T) is the available specificational resources. P(T|H) would be the probability of meeting the specification (i.e. P(D*|H)). Here's what it says about T (p18):
...T in P(T|H) is treated as an event (i.e., the event identified by the pattern).
quote:
That means you want to combine both objects, one with a complexity of 300 bits and the other with a complexity of 500 bits. That's illogical. Only one is CSI, the other is not.
Wrong again. On the basis of this specification, neither would be CSI since there are less than 300 bits of specified information.
quote:
Please tell me, do you know why the UPB exists, and how do we use it?
Yes, I do know. It is supposedly a probability set so low that we cannot expect a single specified event of this probability to occur in the lifetime of the universe. Unspecified events - and more importantly sequences of events - of arbitrarily low probability can and will occur. That is why Dembski says (TDI p165):
...the event that needs to have small probability to eliminate chance is not E, but D*.
quote:
For all I care, it can even be a minority of beneficial mutations that increase genetic entropy. It's not important to me. My whole argument and point was to show you that you can't simply invoke beneficial mutations to offset the effects of deleterious ones. And that is what I have shown you by showing you sickle cell, HIV resistance, and antibiotic resistance. All those mutations are considered beneficial, yet they increase entropy.
And of course your argument was completely wrong. How can increases in fitness fail to offset decreases in fitness ? Genetic entropy is about reducing fitness, beneficial mutations increase fitness.
quote:
But why listen to me? Listen to the professionals. They agree with me, you know?
Contamination of the genome by very slightly deleterious mutations: why have we not died 100 times over? - PubMed
quote:
In many vertebrates Ne ≈ 10^4, while G ≈ 10^9, so that the dangerous range includes more than four orders of magnitude. If substitutions at 10% of all nucleotide sites have selection coefficients within this range with the mean 10^-6, an average individual carries approximately 100 lethal equivalents. Some data suggest that a substantial fraction of nucleotides typical to a species may, indeed, be suboptimal. When selection acts on different mutations independently, this implies too high a mutation load. This paradox cannot be resolved by invoking beneficial mutations or environmental fluctuations.
See? You can't invoke beneficial mutations to save a population. For three very good reasons. First, obviously, as I've already said, not all beneficial mutations can offset deleterious ones. Because they also can cause genetic entropy to increase. The second reason is the noise. And the third one is because there are just so damn few of them. Nearly neutral and deleterious mutations are much more numerous. Therefore, the population goes extinct sooner or later.
If you really listened to them you would know that they didn't agree with you. Firstly, Kondrashov says that (effective) population size is important. That is what Ne is. Secondly, Kondrashov does not say anything about beneficial mutations contributing to genetic entropy. He is (correctly) talking about the accumulation of mildly deleterious mutations.
Here are the first three sentences of the abstract - omitted from your quote - which make it clear:
It is well known that when s, the selection coefficient against a deleterious mutation, is below ≈ 1/4Ne, where Ne is the effective population size, the expected frequency of this mutation is ≈ 0.5, if forward and backward mutation rates are similar. Thus, if the genome size G, in nucleotides, substantially exceeds the Ne of the whole species, there is a dangerous range of selection coefficients, 1/G < s < 1/4Ne. Mutations within this range are neutral enough to accumulate almost freely, but are still deleterious enough to make an impact at the level of the whole genome.
The only point regarding beneficial mutations is that they are not sufficient to offset the problem.
So no, the professional clearly doesn't agree with you.
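Plugging the numbers quoted from the abstract into the stated inequality is straightforward (a quick check, using only the Ne and G values given there):

```python
import math

Ne = 10**4              # effective population size (from the abstract)
G = 10**9               # genome size in nucleotides (from the abstract)

s_low = 1 / G           # below this, too weak to matter at genome level
s_high = 1 / (4 * Ne)   # below this, mutations accumulate almost freely
print(s_low, s_high)                # the "dangerous range" 1/G < s < 1/4Ne
print(math.log10(s_high / s_low))   # ~4.4, i.e. "more than four orders of magnitude"
```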
quote:
Do you have the book? No you do not. I do. I have Sanford's book, and I know what he is talking about. He compared the genome to a manual and how small random changes to a manual might not do much harm, but over time they will destroy the information in it.
In other words you think that your interpretation of an analogy dictates Sanford's meaning ?
You're going to need to do better than that if you want to claim that Sanford is talking about anything other than the same accumulation of deleterious mutations that the Kondrashov paper refers to. None of your other quotes offer any support for your position either. In fact it seems like you are actually avoiding any quote that would clearly state what Sanford means.
quote:
Things tend to degrade on average precisely because of chance. The biologically relevant sequences are such a tiny minority of all existing sequences that the genome can describe. So it's obvious that the mutation will on average always mutate the genome to a position where it will be less and not more functional. The more individuals you have, the more chance there will be that you will get fewer deleterious mutations. But always, on average, you will get more deleterious mutations. The only way this could be removed is if you had an infinite amount of individuals to mutate.
Of course the issue here is not chance "making things worse"; the question here is whether the "noise" interfering with selection tends to even out over large numbers. The first sentence of the Kondrashov abstract referred to above clearly indicates the importance of population size in controlling deleterious mutations. Listen to the professionals. They agree with me, you know. So your argument makes no sense and the conclusion is contradicted by a reference you yourself put forward.
quote:
Actually I did. Just because you don't bother to read my other posts, that doesn't mean I didn't find them.
That paper contradicts you. It explicitly points out the importance of population size. Your only quote relating to infinite populations only states that there is a balance point that can be more easily calculated given an infinite population.
quote:
Yes, it's true that it is reduced! But not REMOVED! It's REDUCED, not REMOVED!!!
Notice the difference much!?
If it was removed, then there would be no increase in entropy. If the noise is just reduced, then entropy still increases! Less than before, but it still increases!
That is just assertion. Remember to listen to the professionals. They agree with me, you know.
quote:
And I have shown you in my previous post, with NUMBERS, why this is wrong. You still end up with the genetic entropy increasing and with genetic meltdown waiting for you.
No, you haven't. In fact you said that you didn't need the numbers. And you were wrong.

This message is a reply to:
 Message 956 by Smooth Operator, posted 01-31-2010 8:59 PM Smooth Operator has replied

Replies to this message:
 Message 967 by Smooth Operator, posted 02-05-2010 8:51 AM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 970 of 1273 (545784)
02-05-2010 10:06 AM
Reply to: Message 967 by Smooth Operator
02-05-2010 8:51 AM


Re: CSI & Genetic Entropy discussions
quote:
Since you can't read my mind, you do not know what I meant.
If I have to read your mind to understand your posts there is no point in you posting anything.
The fact is that you have explicitly argued for the claim that ALL function was lost, not just the known function and from that we can conclude that that is really what you meant.
quote:
Yup, I agree. Therefore, (D,*) = "bidirectional rotary motor-driven propeller" = 10^20.
"bidirectional rotary motor-driven propeller is just plain D. And neither it. not D*, nor (D,*) equal 10^20 which is the estimate of the specificational resources given a four-part concept.
quote:
Description as in descriptive language, or a pattern? Which one do you have in mind?
When I refer to the description, D, I do not mean the descriptive language (the bold D) or the pattern (D,*). So the answer is that I mean neither.
quote:
Okay, but if you take the amount of throws as the complexity of the event, say 10, then it is obvious that all those events are equally probable.
This does NOT, however, apply to the flagellum that has 50 proteins and the one that has 1.000.000 proteins. Because it is harder for the more complex one to match the pattern. Therefore, we do the calculation separately.
I do not take the number of throws as the complexity of the event. And in fact it doesn't have much effect on the probability unless the number of throws is both even and low (the probability is 0.5 for any odd number of throws and the lowest probability is 0.25 for 2 throws - a difference of 1 bit in the extreme case).
What is more, the number of proteins in the flagellum is NOT a number of attempts. It is an unspecified detail of that particular "bidirectional rotary motor-driven propeller", just as the exact sequence of heads and tails is an unspecified detail in my coin-toss example.
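Those two figures are easy to verify (a quick check of my own, using the same "more heads than tails" specification):

```python
from fractions import Fraction
from math import comb

def p_more_heads(n):
    """Exact probability of strictly more heads than tails in n fair tosses."""
    favorable = sum(comb(n, k) for k in range(n // 2 + 1, n + 1))
    return Fraction(favorable, 2**n)

for n in (1, 2, 3, 4, 10, 11):
    print(n, p_more_heads(n))
# Odd n -> exactly 1/2 (ties are impossible); the minimum is 1/4 at n = 2.
```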
quote:
I agree. E is irrelevant. D* is what we are looking for. In this case, D* is the flagellum consisting of 50 proteins. And its complexity is 10^2954.
I thought that the specification was "bidirectional rotary motor-driven propeller". There is no mention of 50 proteins there. D* - the specification considered as an event - would be something like "getting a bidirectional rotary motor-driven propeller".
Either you are using some other specification you haven't mentioned (and one that smells of fabrication) or that isn't D*. Which is it ?
quote:
Yes, I totally agree. But you also have to compare how hard it is for that complexity of the event that matches the pattern, that is, the event D*, to hit the pattern D*. And you have to make sure that it's more than 1/2. Check out the chapter "The magic number 1/2" on page 190. It explains that you have to use both the complexity of the pattern and the event that matches that pattern and compare them to see if it's over or under the number 1/2.
So then we need to calculate the probability of D*. Which means that either you need a valid specification for the calculation you want to use, or you need to do the calculation for the specification we agreed - "bidirectional rotary motor-driven propeller". Whichever you prefer.
quote:
Listen, we have already established that S(T) is actually the complexity of the pattern (D,*) which is 10^20.
It's the specificational resources, no matter what else Dembski or you call it.
quote:
Yup. S(T) is the complexity of the pattern. The pattern is "bidirectional rotary motor-driven propeller" = 10^20. We already established that. Let's not go over this again.
The pattern is not the specificational resources and the description is not the pattern - and it isn't the specificational resources either.
quote:
And yes, P(T|H) would be P(D*|H), which is the event that matches the above mentioned pattern. D* in this case is the 50-protein flagellum, whose complexity is 10^2954.
No, it wouldn't for reasons we've already gone into.
quote:
The one with 500 bits would be CSI, because you need 400 bits in order for an event to qualify as CSI.
Wrong, you need 400 bits of SPECIFIED information to be CSI. Unspecified events aren't CSI no matter how many bits of "complexity" they have.
quote:
Great. Tell me, how do we then use the UPB when we want to infer design?
To give the simple answer. You find a valid specification that includes the event. You calculate the probability of meeting the specification. If that probability is less than the UPB (2^-400) then you infer design.
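As a bare sketch of those steps (my paraphrase in code, using the 400-bit figure from this discussion; it is not Dembski's own procedure in any official form):

```python
import math

def infer_design(p_meeting_spec, bound_bits=400):
    """Infer design only if the specified information clears the bound."""
    bits = -math.log2(p_meeting_spec)  # specified information in bits
    return bits > bound_bits

print(infer_design(2.0**-500))  # True: 500 bits of specified information
print(infer_design(2.0**-300))  # False: only 300 bits
```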
quote:
Again, you are not specific enough. You are dancing around the point. Don't do that. Which fitness are you talking about? Reproductive fitness? Yes, it can be increased by beneficial mutations. Beneficial mutations like sickle cell that increase reproductive fitness in Africa, yet at the same time reduce genetic information.
Of course we are talking about reproductive fitness. And since genetic entropy is about reproductive fitness and not some vague notion of "genetic information" fitness gains from beneficial mutations can and do counteract the fitness loss of deleterious mutations. Just how hard is that to understand ?
quote:
1.) You are wrong.
2.) Size is important, because the larger the population size, the less of an increase there is in genetic entropy.
3.) I never said he said anything about beneficial mutations increasing genetic entropy. I said that he said that you can't just invoke beneficial mutations to reduce genetic entropy. And he said just that. Read his last statement...
1) What am I supposedly wrong about ?
2) I am glad that you admit the importance of size, however you still have to deal with the fact that the experts do not think that genetic entropy is a problem for large populations. (All your quotes from experts deal with small populations)
3) Kondrashov does not say that beneficial mutations play no role, simply that they are not sufficient to deal with the problem, given the numbers he is using for effective population size and mutation rate etc.
quote:
LOL. But that was my point! My point is, as stated above, that you CAN NOT simply invoke beneficial mutations to remove genetic entropy! And Kondrashov said just that. And even you said so, right? You agree with me that this is true?
My point was that beneficial mutations did play a role, and that your argument ignored that. Kondrashov does not deny that.
quote:
Actually I quoted the parts where he says that genetic entropy is about the reduction of genetic information. And now I'm going to quote the part where he said that beneficial mutations also degrade the genome.
Actually you didn't because not one of your quotes mentioned "genetic entropy" at all. They were just the old creationist "information loss" argument (which is best described as meaningless).
The point is that "loss of information" without loss of fitness is not going to force a species into extinction. "Loss of information" with a gain of fitness is more likely to save a species from extinction.
quote:
How does it contradict me? Tell me exactly how this paper contradicts me?
The paper said that the balance can be reached when we have an infinite population. And it NEVER EVER mentioned any calculation. It plainly says that it's about an equilibrium that exists when the population has an infinite amount of individuals. Stop inventing words.
Because it explicitly states that the risk of extinction comes from the effect of fragmentation lowering the effective population size.
Here we show that metapopulation structure, habitat loss or fragmentation, and environmental stochasticity can be expected to greatly accelerate the accumulation of mildly deleterious mutations, lowering the genetic effective size to such a degree that even large metapopulations may be at risk of extinction.
It does NOT say that equilibrium can only be achieved with infinite population size, only that the equilibrium level is independent of the mutational effect with infinite populations size.
And in fact they do calculate this equilibrium level in their work.
(Text for figure 3)
Simulations of populations with mutation accumulation (open symbols) start with the mutational load of an infinite population at mutation-selection balance
quote:
No, it's not. It's primary school math. Either something decreases, increases, or is in perfect equilibrium. There is no fourth choice.
So, either the mutations increase, as they always do, or they are removed totally if you have perfect selection, or they are in equilibrium if you have an infinite population. You DO NOT have perfect selection, and you DO NOT have an infinite population. Therefore, they increase.
Or they are in equilibrium with a finite population. You haven't offered anything to rule that out yet.
quote:
Maybe you missed them. Let me repost them again. Here you go:
Let's say that 50 is the threshold to genetic meltdown. The population starts out with 0 mutations.
No, I didn't miss that. But it doesn't show anything because it begs the question. It simply assumes that less than 100% effectiveness equals accumulation (which is what it is supposed to show). However, even delayed (but certain) removal is less than 100% effectiveness, and you also need to count the loss of deleterious mutations due to drift. As I said, to deal with the issue you need real numbers, because they control the equilibrium level.

This message is a reply to:
 Message 967 by Smooth Operator, posted 02-05-2010 8:51 AM Smooth Operator has replied

Replies to this message:
 Message 972 by Smooth Operator, posted 02-05-2010 12:58 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 975 of 1273 (545805)
02-05-2010 1:36 PM
Reply to: Message 972 by Smooth Operator
02-05-2010 12:58 PM


Re: CSI & Genetic Entropy discussions
quote:
And I explained that that is because only one function was known to exist, and it was lost.
So much for letting it drop!
Anyway, thanks for admitting that I was right about what you said, and that I did NOT need to read your mind.
quote:
"bidirectional rotary motor-driven propeller is the four-part concept. It consists of 4 concepts, those four words. And since D is short for (D,*). Than "bidirectional rotary motor-driven propeller = (D,*) = 10^20.
Just because Dembski uses D sometimes to mean the D component of (D,*) and sometimes to mean (D.*) does not mean that they are the same thing. So, no, D does not equal (D,*). "bidirectional rotary motor-driven propeller is just D, not (D,*). And neither is equal to 10^20
quote:
There is no such thing as the description D. There is a pattern D.
Yes there is - it's the D in the pattern "(D,*)".
quote:
What? Of course that's the complexity of the event. 2 throws are more complex than 1 throw. 10 throws are more complex than 2 throws.
Do I need to remind you that we want the probability of D*, not the probability of the unspecified event ? Do I need to point out that this example proves exactly that ? Do I need to repeat that the probability of meeting the specification I gave - P(D*) - is at least 0.25 regardless of the number of throws ?
quote:
Unless it matches the pattern.
Since we're talking about the details which AREN'T part of the pattern, they can't match it.
quote:
Yes, I'm using the same specification as always. And the 50 proteins are mentioned in NFL.
OK, then 50 proteins are NOT part of D*. D* is "getting a bidirectional rotary motor-driven propeller" - no mention of 50 proteins there. Whether it is mentioned in NFL doesn't matter if it isn't mentioned in the specification.
quote:
We already have everything.
D = "bidirectional rotary motor-driven propeller = 10^20.
D* = 50 protein flagellum = 10^2954.
That doesn't even make sense. 10^20 considered as an event is 10^2954 ?
quote:
I'm sorry, but it doesn't matter what I or you say. But it matters a lot what Dembski says. And he says it's the complexity of the pattern.
Dembski also says that it's the specificational resources.
quote:
Yes, I know that. I'm trying to explain to you that there is no combining going on.
But the combining is to eliminate the unspecified information. So if you agree that we shouldn't count it then you have to agree with the combining.
quote:
Great! How do we find the probability of the event?
We don't want the probability of the event, just the probability of meeting the specification. And in the case of the flagellum I have no idea of how to calculate it. And neither does Dembski.
quote:
But you are wrong. Sanford said it's about degradation of genetic information. That can, but does not in every case, decrease reproductive fitness.
In that case can you quote him actually saying that ? Because you didn't.
quote:
2.) I have ALWAYS said that larger sizes do help, but do not remove entropy completely. And no, I specifically showed you where it says that it's a problem for large populations as well. Once more.
And it says that it is only a problem when fragmentation REDUCES the effective population.
Therefore it doesn't support you.
quote:
And again, you are wrong. I never said that. Again, for the trillionth time. Beneficial mutations do play a role. But they are not enough. Do you understand me now?
If you were agreeing with me all along, then why were you arguing ?
quote:
They do not have to mention it! The whole book is about genetic entropy! It doesn't have to be mentioned in every statement!
They do have to mention it if they are saying that this information loss IS genetic entropy.
Which is the point you were supposedly trying to argue.
quote:
Wrong! Biological information is what performs all the biological functions. Without biological functions, living organisms can't do what they do. When they lose enough of the functions, they die! How can non-functional lungs save a population from extinction?
Which only covers "losses of information" that negatively impact fitness. Not those that increase fitness.
quote:
That doesn't contradict me. It still means that equilibrium exists only when the population is not fragmented and is infinite in size.
It does contradict you because it makes it clear that the problem only exists for low effective population sizes.
quote:
How would that happen. Explain how?
By selection and drift removing deleterious mutations from the population at the same rate as they arrive. Thus we have an equilibrium without selection being 100% effective.
quote:
And the drift does not help you! The drift is random. And since there are more deleterious than beneficial mutations, while drift is in operation, even as deleterious mutations are lost, more will accumulate.
Wrong. The more deleterious mutations in the population the faster drift will remove them. That is one of the factors that your "example" didn't take into account.

This message is a reply to:
 Message 972 by Smooth Operator, posted 02-05-2010 12:58 PM Smooth Operator has replied

Replies to this message:
 Message 994 by Smooth Operator, posted 02-10-2010 12:14 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 999 of 1273 (546370)
02-10-2010 1:11 PM
Reply to: Message 994 by Smooth Operator
02-10-2010 12:14 PM


Re: CSI & Genetic Entropy discussions
quote:
I can't drop something that you are insisting on while, at the same time, that thing is not true. And it ain't.
You indicated that the only important thing was the known function - which we agree occurred. All I did was make it clear that you earlier had argued for loss of ALL function. And you turn around and start trying to argue for the loss of ALL function again.
quote:
However you look at it, "bidirectional rotary motor-driven propeller" is the pattern. For which you claim that it's D. Fine, let it be D. But its complexity is 10^20.
It's the description D, not the pattern (D,*). And the "complexity" is just the specificational resources. It certainly ISN'T the probability of D*, which is what we want.
quote:
Yes, I know that. You do not have to remind me. I'm just saying that more throws are more complex than fewer. Do you agree with that or not?
If you are referring only to the completely irrelevant unspecified complexity, then yes. But that only supports my point - we do not want nor care about that complexity. It is only the specified complexity which is never less than 0.25 for the specification - which is why that specification can never indicate design.
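As an illustration, here is a sketch using a stand-in specification (hypothetical - the specification actually given earlier in the thread is not re-quoted here): take "the first two throws both come up heads". The probability of any one exact sequence collapses as throws are added, while the probability of meeting the specification never moves.

```python
from fractions import Fraction
from itertools import product

# Stand-in specification (hypothetical; the one actually used earlier in
# the thread is not re-quoted here): "the first two throws are both heads".
def meets_spec(seq):
    return seq[:2] == ('H', 'H')

for n in (2, 5, 10):
    outcomes = list(product('HT', repeat=n))
    p_exact = Fraction(1, 2 ** n)        # any ONE exact sequence of throws
    p_spec = Fraction(sum(meets_spec(o) for o in outcomes), 2 ** n)
    print(n, p_exact, p_spec)
# n=2: 1/4 and 1/4; n=5: 1/32 and 1/4; n=10: 1/1024 and 1/4.
# The unspecified complexity grows with every throw; P(D*) stays at 1/4.
```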
quote:
Which details are those?
The 50 proteins and their structure in the case of the flagellum. Probably some details of their arrangement, too. For my example with the coins the exact sequence - which is what your "complexity" above refers to.
quote:
quote:
p(T|H) would be the probability of meeting the specification (i.e. p(D*|H)). Here's what it says about T (p18)
These are your words. You agreed that the probability P(T|H) is P(D*|H). And I agree also. And as we all SHOULD know, P(T|H) is the 50-protein flagellum. How do I know? Because Dembski says so.
quote:
It follows that −log2[10^120 · ϕS(T) · P(T|H)] > 1 if and only if P(T|H) < (1/2) × 10^−140, where H, as we noted in section 6, is an evolutionary chance hypothesis that takes into account Darwinian and other material mechanisms and T, conceived not as a pattern but as an event, is the evolutionary pathway that brings about the flagellar structure (for definiteness, let's say the flagellar structure in E. coli). Is P(T|H) in fact less than (1/2) × 10^−140, thus making T a specification? The precise calculation of P(T|H) has yet to be done. But some methods for decomposing this probability into a product of more manageable probabilities as well as some initial estimates for these probabilities are now in place.[33]
See, he says that a precise calculation has yet to be done, but a manageable one already exists. And he points to reference number 33. Which, when you go and look it up, is chapter 5.10 of his book No Free Lunch. And guess what that chapter is called? It's called "Doing The Calculation". Yes, and in this chapter he finally discusses how to get the number 10^2954, and clearly says that the flagellum consists of 50 parts.
In other words Dembski didn't completely botch the calculation in NFL - because Dembski says so. Unfortunately for you, he did botch it. The calculation in NFL is NOT the probability of getting a "bidirectional rotary motor-driven propeller" or even close, as should be quite obvious. The fact that it uses details which clearly aren't in the specification is a dead giveaway.
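For reference, here is the arithmetic of the criterion quoted above, a sketch with the numbers that appear in this thread plugged in (an assumption: the 10^2954 figure quoted from NFL is read as a probability of 10^-2954, which - as just argued - is not the probability of D*):

```python
import math

# chi = -log2(10^120 * phi_S(T) * P(T|H)); T is a specification iff chi > 1,
# i.e. iff P(T|H) < (1/2) * 10^-140 when phi_S(T) = 10^20.
log10_threshold = -math.log10(2) - 120 - 20
print(log10_threshold)        # about -140.3: the (1/2) * 10^-140 bound

# Plugging in the NFL figure, read as P = 10^-2954 (log space, since the
# number is far too small for floating point):
chi = -(120 + 20 + (-2954)) * math.log2(10)
print(chi)                    # about 9348 bits, far beyond the threshold
```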
quote:
quote:
But the combining is to eliminate the unspecified information. So if you agree that we shouldn't count it then you have to agree with the combining.
WHAT!?
It's quite simple. If we want p(D*) we want the probability of getting ANYTHING which satisfies the description D. So we combine the probabilities of everything which satisfies D.
So either you agree to the combining, or you disagree that we want the probability of D*.
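A toy illustration of the combining (an invented example, not Dembski's): take the description D = "at least one six in two die rolls". p(D*) is the combined probability of every outcome satisfying D, and it is nothing like the probability of any single outcome.

```python
from fractions import Fraction
from itertools import product

# D = "at least one six in two die rolls" (an invented toy description).
matching = [roll for roll in product(range(1, 7), repeat=2) if 6 in roll]
p_Dstar = Fraction(len(matching), 36)
print(len(matching), p_Dstar)   # 11 matching outcomes -> P(D*) = 11/36

# Any single matching outcome, e.g. (6, 3), has probability only 1/36.
# Using one outcome's probability in place of 11/36 is exactly the mistake
# of counting detail that the description D never mentions.
```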
quote:
LOL WRONG! Of course you need the probability of the event!? How are you going to know if the probability of matching the pattern with the specified event is small enough to infer design!? That's the whole point of design detection and you're missing it!
The probability of matching the pattern - which is what Dembski's method uses to infer design - is the probability of D*.
As Dembski says - TDI p165
...the event that needs to have a small probability to eliminate chance is not E, but D*
So you're just laughing at Dembski.
quote:
It seems as though I was right. Genetic information is not perfectly correlated with reproductive fitness and noise averaging is not going to effectively remove genetic entropy.
But only if the noise is increasing. So THAT "noise" is not genetic drift. Do pay attention to the context.
quote:
Those things always happen in nature. It's like saying that something only happens when people eat or walk. They always do that.
No, it doesn't always happen. Populations can increase in size. They don't have to get fragmented to the extent that mutational load is a problem.
quote:
I'm not agreeing with you. This is what you said:
YOU: "You are claiming that beneficial mutations dont' help!"
ME: "No. I said that they do help, but they do not remove genetic entropy completely."
YOU: "No you said that they do not hel at all."
ME: "No, I said that they DO help, but not enough to completely stop genetic entropy."
YOU: "No, you said they simply don't help at all..."
ME: "No, I did not!"
YOU: "Yes you did..."
etc...
etc...
See?
Yes. I see that you are now agreeing with me that beneficial mutations DO help and that you were wrong to leave them out of your diagram.
Perhaps instead of trying to cover up your mistakes you should try harder to avoid making them in the first place ?
quote:
Ahem. The article talks about LARGE populations.
With a LOW effective population size. Which is exactly the point I made.
quote:
But that by definition is natural selection at 100%. Which is impossible.
So what you are saying is that less than 100% effectiveness is by definition 100% effective.
I don't think that makes much sense. It's quite clear that not all deleterious mutations are removed, and not all of those that are removed are removed by selection, so selection is obviously less than 100% effective.
quote:
I'll tell you. The whole genome. That is the unit of selection. Therefore, some organisms that are more fit than others might as well carry some deleterious mutations with them. And on average they will. Natural selection does not look at the single nucleotide to evaluate the organism. It looks at the whole genome. So even those who are more fit are going to carry their deleterious mutations into the next population. And genetic entropy increases.
With a large population size and genetic mixing from sexual reproduction there will be selection for and against individual alleles. Only in the case of pure clonal reproduction would it make sense to say that the whole genome was the unit of selection. Because without that the whole genome doesn't survive. It is just a feature of one individual.
quote:
But on average it won't remove all. Neither will there be an equilibrium; they will accumulate. Look, natural selection is invoked in the first place to remove deleterious mutations. Without it, genetic entropy is even faster. You can't turn the table and say that genetic drift is better than natural selection. It's not. Without natural selection a species is doomed even faster.
But I am not saying that drift is better than selection. I am saying that drift AND selection both remove deleterious mutations. Obviously together they will remove more than selection alone !
Edited by PaulK: No reason given.

This message is a reply to:
 Message 994 by Smooth Operator, posted 02-10-2010 12:14 PM Smooth Operator has replied

Replies to this message:
 Message 1029 by Smooth Operator, posted 02-13-2010 10:20 AM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 1034 of 1273 (546754)
02-13-2010 12:30 PM
Reply to: Message 1029 by Smooth Operator
02-13-2010 10:20 AM


Re: CSI & Genetic Entropy discussions
quote:
1 - 1 = 0, does it not? If the enzyme had one known function, and it lost it, then how many are left? Obviously, the correct answer is ZERO. Meaning, no functions are left; meaning, all functions are lost.
And an unknown number plus one minus one plus an unknown number is an unknown number.
quote:
I know it's not. So stop mentioning it. Do you, or do you not, agree that the "bidirectional rotary motor-driven propeller" is the pattern D, whose complexity is 10^20, just as Dembski said?
I only agree that the number 10^20 is Dembski's estimate of the specificational resources.
quote:
Okay, so you agree that more throws are more complex than fewer throws. Fine, let's move on from here.
If you agree that more complex objects have higher complexity, then you also agree that a flagellum consisting of 1.000.000 proteins has a higher complexity than one consisting of 50 proteins, right?
No, I don't agree because the "complexity" is produced from a probability calculation and we don't know what the results of the two calculations would be. It is likely that the one needing more proteins would be less probable, but it isn't certain.
quote:
What the hell are you talking about? What did he do wrong?
He didn't calculate the correct probability. Or even anything resembling the correct probability. That's what he did wrong.
quote:
But you do understand that the probability of a 50 protein flagellum will be different from that of a 1.000.000 protein flagellum?
But do you understand that since both will fit the pattern, we need the probability of getting either of them ? Or any other flagellum.
quote:
Again, WHAT?
The "noise" in the quote is not the "noise" of genetic drift that we were talking about earlier.
quote:
Do you not remember this picture I made!?!?!?! I specifically said that green numbers are beneficial mutations. And I have ALWAYS been saying that they help somewhat. But that you can't invoke them to completely remove genetic entropy. You are either insanely senile, or you are deliberately lying.
Except that you DIDN'T allow them to offset the fitness loss. Which they do.
quote:
How exactly does that help you? Show me the quote in the article that agrees with you.
It helps me because it shows that the article agrees exactly with what I said. Mutational load is only a problem with a low effective population size.
quote:
This is physically impossible. Do you honestly think that individuals pass just portions of their genome to the offspring? No, they pass on everything. So the unit of selection is the whole genome. Natural selection can not select a single nucleotide.
You obviously don't know much about reproductive biology. Sexually reproducing species get only HALF the genome of each parent.
quote:
Exactly. Yes they will. But not ALL of them. And that is the reason genetic entropy exists. Because natural selection, and any other mechanism you can come up with, are not perfect. So the information must decline.
Of course, since you have no sensible measure of "genetic information" any such statement is pure speculation.

This message is a reply to:
 Message 1029 by Smooth Operator, posted 02-13-2010 10:20 AM Smooth Operator has replied

Replies to this message:
 Message 1039 by Smooth Operator, posted 02-16-2010 12:44 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 1051 of 1273 (547150)
02-16-2010 6:28 PM
Reply to: Message 1039 by Smooth Operator
02-16-2010 12:44 PM


Re: CSI & Genetic Entropy discussions
quote:
Yet I specifically said that I'm only talking about known functions.
You also specifically said all functions and argued that we could infer the loss of all function from the loss of the known function.
quote:
Since bidirectional rotary motor-driven propeller consists of 4 parts, whose individual probability is 10^5, multiplying that up leads us to 10^20. This is, as previously stated, the complexity of the pattern. And at the same time the relevant specificational resources for specifying the pattern in question. So it's both, just as I've been saying all along.
It's the specificational resources, which Dembski also for some reason calls the complexity of the pattern. The point of calling it the specificational resources is that it is a more meaningful name, so you won't confuse it with anything relevant to the probability of matching the specification (the probability of D*).
quote:
Explain the logic behind saying that getting a 6 with one die is less complex than getting 1.000.000 6s with 1.000.000 dice. And then in turn saying that getting a 50 protein flagellum is NOT less complex than getting a 1.000.000 protein flagellum.
The logic was explained. The two cases are simply not comparable - not least because in the case of the dice we know how to calculate the probability. We don't for the two hypothetical flagella, and there could be aspects which make the one with 1,000,000 proteins more likely than the one with 50 (which would depend on things like the proteins).
quote:
Why? What's wrong with the calculation found in NFL?
It doesn't use a valid specification and it ignores relevant information.
quote:
No we do not. We need only one. In this case the 50 protein one. If we were interested in the million protein one, we would calculate only its probability.
It doesn't matter which noise it is. It's still noise and it's not averaging out. Any noise is bad. Some will scale with fitness, some will not. And the noise that is not scaling increases genetic entropy.
It may not matter to you what the material you quote actually says, but it does to anyone honestly interested in understanding the issues. This other noise IS the "genetic entropy".
quote:
Take a look at this article. It specifically talks about large sexually reproducing organisms. Which, as you can see, are also in danger of mutational meltdown. Note however that the authors are only concerned with reproductive fitness. They do not take into account that genetic entropy is building up because of deterioration of genetic information. They are only concerned with reproductive fitness itself. Yet, they show that large populations can also go extinct.
Congratulations on finding one paper that actually supports your point - to a degree. A definite step up on using papers that contradict your claims. Of course it is purely theoretical - and published in a physics journal - and still presented only as a possibility depending on the parameters. It is still not the inevitability that you claim.
quote:
Where does it say that?
Did you not read that it is specifically dealing with the problems faced by a fragmented population ? Did you not read that it says that fragmentation lowers the effective population size ? Do you not understand that it is effective population size that matters ?
quote:
I KNOW! The point is that while reproducing, the parents DO NOT pick and choose which nucleotides they will pass on to their offspring !
So you didn't mean what you said, and what you did mean was a complete irrelevance. (Here's a hint: random mixing of the parental genomes supports my point.)
quote:
...the offspring is the mix of those two genomes. Therefore, natural selection selects on the level of the genome. Not on the level of the single nucleotide. Which means that even if a certain individual has beneficial mutations, the deleterious ones that he might also have go right to the offspring together with the beneficial ones. Natural selection can not select out individual deleterious nucleotides.
Of course I never made reference to single nucleotides (or do you not know about genes or chromosomes ?) The fact is that in a typical sexually reproducing species the whole genome will appear in very few individuals, and so it will not have much of an opportunity to be selected. It would take an extreme case to have much impact. However, genes are usually passed on intact (and genes on the same chromosome have a tendency to stick together). So genes are better as a unit of selection because they can spread more widely in the population and persist over the generations.
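A minimal sketch of that point (every number hypothetical): put a beneficial allele B on the same haplotype as a deleterious allele d, let the population recombine, and selection pulls the two apart - which could not happen if whole genomes were inherited and selected as blocks.

```python
import random

# Minimal sketch (all numbers hypothetical): a beneficial allele B starts
# out on the same haplotype as a deleterious allele d. With recombination,
# selection favours B and purges d separately - the allele, not the whole
# genome, is what spreads.
N, GENS, r = 1000, 300, 0.5      # population size, generations, recomb. rate
s_B, s_d = 0.05, 0.05            # benefit of B, cost of d
random.seed(2)
pop = [(True, True)] * (N // 10) + [(False, False)] * (N - N // 10)

def fitness(h):
    return (1 + s_B if h[0] else 1.0) * (1 - s_d if h[1] else 1.0)

for _ in range(GENS):
    w = [fitness(h) for h in pop]
    p1 = random.choices(pop, weights=w, k=N)
    p2 = random.choices(pop, weights=w, k=N)
    # with probability r the child takes locus 1 from one parent and
    # locus 2 from the other; otherwise it inherits one parent's haplotype
    pop = [(a[0], b[1]) if random.random() < r else a for a, b in zip(p1, p2)]

print(sum(h[0] for h in pop) / N, sum(h[1] for h in pop) / N)
# Typically B's frequency has risen sharply while d's has collapsed:
# recombination let selection evaluate the alleles separately.
```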
quote:
Except that CSI is a fine measure.
CSI is a binary measure (either something is CSI or it isn't) which makes it a poor choice. And it can't be measured for any gene which makes it a totally useless choice.
quote:
Anyone who claims that a 50 protein flagellum is the same in complexity as a 1.000.000 protein flagellum can't understand CSI, nor Shannon information for that matter. Even in Shannon information those two are distinct.
Since Dembski's "complexity" is a probability measure which does not depend purely on length, anyone who claims that any flagellum based on 1,000,000 proteins MUST be more complex than any flagellum based on 50 doesn't know what they are talking about. And Shannon information isn't even relevant to that.

This message is a reply to:
 Message 1039 by Smooth Operator, posted 02-16-2010 12:44 PM Smooth Operator has replied

Replies to this message:
 Message 1054 by Smooth Operator, posted 02-18-2010 6:10 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 1056 of 1273 (547369)
02-18-2010 7:13 PM
Reply to: Message 1054 by Smooth Operator
02-18-2010 6:10 PM


Re: CSI & Genetic Entropy discussions
quote:
No, I said that we can say it lost all functions because we don't know of any other.
Which is what I said. And it's wrong - and you know it's wrong.
quote:
NO! Proteins are proteins. The proteins are identical. 50 proteins is LESS, and ALWAYS LESS, than 1.000.000 proteins. Therefore, the 50 protein flagellum is LESS complex than a 1.000.000 protein flagellum. Therefore, a 50 protein flagellum is MORE probable than a 1.000.000 protein flagellum.
Of course that isn't true. You can't work out the probabilities just from the number of proteins.
quote:
You do NOT get to invent some additional ideas that would increase the probability of the 1.000.000 protein flagellum. They consist of identical proteins. Therefore, one is less probable than the other.
I'm not inventing anything. I'm just pointing out that there can be factors that you simply haven't taken into account.
quote:
This is thoroughly explained in Dembski's newest article "Bernoulli's Principle of Insufficient Reason and Conservation of Information in Computer Search", where he explains that if we have no prior knowledge about the search space, we impose the uniform probability distribution. Anything else is going to lead us to even worse results. So if we know that a certain kind of protein has a probability of forming P, then 50 of those will have the probability of forming at P/50. And it logically follows that anywhere else in that same search space the probability of forming 1.000.000 of those same proteins will be P/1.000.000.
Obviously you don't know what you are talking about. We aren't talking about the design of search algorithms, we are talking about calculating the actual probabilities.
quote:
Explain everything in full detail!
It seems clear enough. Dembski's method requires the calculation of the probability of meeting the specification (the probability of D*). Dembski didn't do that. What more is there to say ?
quote:
Genetic entropy is the deterioration of the information in the genome over time. As I have quoted Sanford already. Stop inventing your own notions of what genetic entropy is. The noise is one of the causes, wherever it comes from.
Since the "noise" IS the "genetic entropy" it can''t be a cause of it. This is what happens when you don't care about understanding the material you are quoting.
quote:
My claim is that it can happen in large populations. You claimed that it CERTAINLY CAN NOT. I was right. You were wrong. And since I was right, we can extrapolate the empirical results from other experiments that show genetic meltdown to larger sexually reproducing populations. If you disagree, explain why in detail.
Well no. I've never argued that it couldn't under some theoretical conditions be a problem for a large effective population. If you're prepared to accept that it might possibly, in theory, be a problem for some large populations - and no more, then we can close this down.
quote:
Quote the relevant part and explain in detail why it supports your view. You know, like I've been doing all along.
Well, no. Usually the material you quote DOESN'T support you.
But really you are missing the whole point of the paper. It's arguing that conservationists need to try to avoid letting a population fragment, because that can significantly increase the risk of extinction. If extinction was inevitable anyway, and the fragmentation of the metapopulation was not relevant, there wouldn't be any point in the paper at all.
quote:
No, it does not support you. For two reasons.
1.) Genetic recombination DOES NOT EQUAL natural selection. We are talking about whether natural selection selects on the level of the nucleotide.
2.) Genetic recombination is random. Therefore it does not pick out the deleterious mutations. It could pick them out. But it can just as well pick out the beneficial ones and leave the deleterious ones.
Both "reasons" are completely bogus and have nothing to do with my argument. I suggest that you go back and read it again since you obviously didn't understand it..
quote:
And again, this has nothing to do with natural selection selecting either on the level of a single nucleotide, or a gene, or a chromosome. It still evaluates the whole genome, and then selects.
As I have pointed out, this is incorrect. Using the genome as the unit of selection is simply silly, for the reasons I have already given.
quote:
All information that we measure is binary. Yes, something is either CSI or it's not. But you can still have 400, 500, 1000, or 10.000 bits of CSI.
Strictly speaking, you can't. As I said CSI is binary - either something is CSI or it isn't. You can have bits of information or even specified information but not CSI, because the C is the probability bound. Anything over the bound is Complex, anything under it is not. And that's all there is to it.
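A sketch of the distinction, assuming Dembski's universal probability bound of 10^-150 (about 498 bits) as the "C" threshold (an assumption: other versions of his argument use other bounds):

```python
import math

# Sketch of the binary nature of CSI described above. Assumes Dembski's
# universal probability bound of 10^-150 (about 498 bits) as the "C"
# threshold; other versions of the argument use other bounds.
BOUND_BITS = -math.log2(10 ** -150)    # about 498.3

def specified_info_bits(p_Dstar):
    # specified information is measured in bits...
    return -math.log2(p_Dstar)

def is_CSI(p_Dstar):
    # ...but CSI itself is a yes/no verdict against the bound
    return specified_info_bits(p_Dstar) > BOUND_BITS

for p in (0.25, 1e-120, 1e-160):
    print(p, round(specified_info_bits(p), 1), is_CSI(p))
# 0.25 -> 2.0 bits, not CSI; 1e-120 -> ~398.6 bits, not CSI;
# 1e-160 -> ~531.5 bits, CSI. There are no "bits of CSI", only a verdict.
```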
quote:
Of course it can. The fact that you don't accept it is not my fault.
Of course it can't be done and it hasn't been done. That's why you can't come up with a valid example.

This message is a reply to:
 Message 1054 by Smooth Operator, posted 02-18-2010 6:10 PM Smooth Operator has replied

Replies to this message:
 Message 1070 by Smooth Operator, posted 02-21-2010 5:09 PM PaulK has not replied
 Message 1071 by Smooth Operator, posted 02-21-2010 5:10 PM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 1078 of 1273 (547692)
02-21-2010 6:57 PM
Reply to: Message 1071 by Smooth Operator
02-21-2010 5:10 PM


Re: CSI & Genetic Entropy discussions
quote:
For this to be wrong, we would have to invent new unknown functions. Why would we want to do that?
No, it must merely be possible that there are unknown functions. Since that *is* possible it is not valid to infer that loss of the one function tested for is loss of all function in every case.
quote:
Uh, yes you can. That's like saying you can't work out the probabilities of what number you are going to get just from the number of dice you have. Yes you can.
No, it isn't like that at all. We know how to do the calculation for dice. We don't know how to do the calculation for completely unspecified proteins - because it can't be done without more information.
quote:
YES I KNOW! There can be an invisible pink unicorn playing with proteins when we are not looking!!!! There can be a light blue invisible unicorn magically increasing the probability of the flagellum forming every time the amount of protein reaches 1.000.000. There can even be a green witch that adds wings to the flagellum when we are not looking, and takes them off every time we look.
SO THE HELL WHAT!?
I was thinking more of evolutionary relationships between the proteins - with each other and with other proteins in the organism or that might be acquired by the organism - and the lengths of the proteins. 1,000,000 slightly different proteins might be more probable than 50 hugely long proteins, all completely unrelated to each other and anything else. I could probably think of more factors if you actually bothered to show the calculations.
quote:
LOL! The algorithms are used to search a sequence space. They are the ones that increase the probability of an event occurring.
And in the example we are considering, which algorithms would they be ?
quote:
The point of the article is that when we are calculating a probability at any point in the search space, we are supposed to use uniform probability unless we have knowledge otherwise. And since we don't, we go with uniform probability. Meaning that the result of throwing 1 die is less complex than throwing 5 dice, and a 50 protein flagellum has a higher probability than a 1.000.000 protein flagellum. Anything else is plain lunacy.
In other words you take the article as support for your position that we shouldn't bother trying to do things right, we should just do them your way. Unfortunately he is talking about the performance of search algorithms, not the calculation of probabilities in a specific case.
quote:
Then what did he do?
A completely bogus and irrelevant calculation.
quote:
WTF are you talking about!?!? Noise is NOT the genetic entropy!!! Are you insane!? I gave you a quote from Sanford himself. Here, I'll do it again.
In other words by giving a quote which doesn't include the word "noise" you think that you can show that a different quote didn't use the word "noise" to describe genetic entropy.
quote:
You were constantly saying that it's NOT a problem for large populations. Obviously it is.
No, that isn't obvious at all. Theoretical models aren't reality.
quote:
Because if life originated in a form of abiogenesis, the initial population of ONE would obviously be a small population. Whatever that hypothetical thing would be, it wouldn't last long. And even if it did develop a few replicated instances of itself, being that it's still a small population, it would be killed by genetic entropy. So however you look at it, if you agree that genetic entropy destroys small populations, then it's obvious that those small populations would never even have the chance to become large in the first place. Let alone evolve into something else! That's my main argument. That Darwinian evolution is not suited for increasing biological complexity.
But it doesn't inevitably destroy even small populations. The cheetah population has suffered from a severe genetic bottleneck (at one point probably reduced to a single pregnant female). And subsequent hunting has made their problems worse. But they're still around.
quote:
Once more, with style. Please, would you be so kind, to point out the lines in the article that support your position?
Would you like to explain the relevance of a fragmented metapopulation in the article if it does NOT make a species more vulnerable to mutational meltdown ?
quote:
What's wrong with my arguments?
The fact that they don't address what I'm saying at all.
quote:
You gave no reason whatsoever! You simply asserted it! How in the world are you going to tell me that natural selection PICKS OUT THE DELETERIOUS MUTATIONS ONE BY ONE FROM THE GENOME!!?!? This is not sane!
It's also not what I said. What I said is that the gene is a better choice for the "unit of selection" than the genome. And I gave reasons. Now if you want crazy we can take your assumption that the unit of selection must either be the whole genome or individual nucleotides. Anybody who knows about genes would know that that was wrong.
quote:
And the more complex something is, the more bits it has. Therefore, you can have 400, 500 or 1000 etc... bits of CSI. But not 300.
No, they are not bits of CSI, because CSI is having more bits than the threshold. That's what the "Complex" refers to (I know it's misleading but that's Dembski for you).
quote:
You simply say that my valid examples are not calculated the way they should be, and that's your whole argument.
Because your "valid examples" obviously aren't.

This message is a reply to:
 Message 1071 by Smooth Operator, posted 02-21-2010 5:10 PM Smooth Operator has replied

Replies to this message:
 Message 1089 by Smooth Operator, posted 02-25-2010 12:30 AM PaulK has replied

PaulK
Member
Posts: 17919
Joined: 01-10-2003
Member Rating: 6.7


Message 1093 of 1273 (548043)
02-25-2010 2:49 AM
Reply to: Message 1089 by Smooth Operator
02-25-2010 12:30 AM


Re: CSI & Genetic Entropy discussions
quote:
By ALL I mean ALL KNOWN functions.
That's obviously false for a start.
quote:
Since it had ONE KNOWN function. You don't get to invent new functions that we do not know of. If you don't know about them, don't make them up. We are talking only about what we do know about.
I'm not making up anything. I'm just keeping it to what we DO know about. The known function, the one tested for was lost. Unknown functions - whether existing before or gained as a result of the mutation - were not tested for and obviously could occur in some cases even if they did not appear in this particular set of experiments. So we don't know anything about those other than that they could be present.
quote:
We are talking about IDENTICAL proteins. Do you, or do you not understand that? IDENTICAL proteins. There are just more of them. The same thing with dice. Same dice, just more of them.
The 50 proteins in the E coli flagellum are all distinct. And if the proteins WERE identical there would likely be very little difference in "complexity". (It would be more like all the dice being dice rather than all the dice coming up with a particular number !)
quote:
Isn't it obvious!? Darwinian evolution.
Since you don't actually use evolution in any of your calculations - no, it isn't obvious.
quote:
Okay. Look, yes he is talking about search algorithms. His previous article was called "Conservation of Information in Search: Measuring the Cost of Success". But this new article is the foundation for the previous one. Get it? The COI that he is addressing in this new article is short for Conservation of Information.
In his previous article Dembski is trying to show that on average all search algorithms perform equally well over all search spaces. But for this to be true, you have to have uniform probability.
So, according to you, Dembski makes this assumption because the argument needs it. That's not a good reason. And his real reason (he's attempting to provide an estimate of *general* performance with limited information) makes it inapplicable to a situation where we need the real numbers for a specific case.
quote:
Explain why.
I already did.
quote:
In the definition of genetic entropy, Sanford said that it's the loss of genetic information. He did not specifically say that genetic entropy is not noise. Why should he? He also didn't say that genetic entropy is not a HOUSE, or a PLANE, or a BOAT! So by your logic, if he didn't say that genetic entropy is not something, we assume it is?
I'm afraid that is another silly argument, since the English language allows for more than one way to describe a thing. In fact deleterious mutations could easily be described as "noise" in the genome. So merely NOT using the word "noise" in the definition does NOT mean that "noise" can't be used to refer to genetic entropy.
quote:
I know they are not. But we have FACTS. The facts that small populations die out because of genetic entropy. And we have models that show the same applies to large populations. Why should I not accept them? It's an extrapolation from FACTS. Why shouldn't I do that?
Because you only have models that state that large populations can, under some conditions, die out from mutational meltdown. Those models can't tell you if those conditions apply to any real populations.
quote:
And they are going to be for a very long time. Yet they have a LARGE, and I do mean LARGE, amount of genetic information. Unlike ONE RNA chain somewhere in a pond. Genetic entropy would destroy it, and it would have never spread around. NEVER. But let's use a bit of magic and say that it would.
Of course the relevant measure is not any "absolute" measure of information. A non-viable embryo cheetah, doomed through mutational meltdown, would still have far more "genetic information" than the single RNA strand. And it wouldn't make the slightest difference.
quote:
Let us make ourselves insane and actually accept the idea that such a chain would actually survive and spread around. Okay, what then? What would have happened? What would Darwinian evolution do? Make it more complex? Create new functions? No, it wouldn't. Genetic entropy would not allow that. We have no reason to suppose that it would. Do you have even the faintest, smallest bit of evidence that in such populations of RNA chains, Darwinian evolution can increase their functionality?
I'm not an expert on RNA life, but there's no clear barrier against increases in functionality. In reality, genetic entropy doesn't even stop increases in functionality occurring and being selected.
quote:
Okay than. Please do make a list of your arguments and I'll address them. Something like this:
1.) I say that...
2.) And also...
3.) etc...
Since you couldn't be bothered to read it the first time, or go back and read it when I asked you to, why exactly should I repeat it again ? If you really want to read it, just go back through the chain of posts.
quote:
HOW!? HOW!? HOW!? How is a gene a better choice for a selection unit than a genome?
I already explained. You responded by asserting that I must be assuming that the nucleotide was the unit of selection and that I must be ignoring the random mixing of genes. Neither of which was true at all.
quote:
Population genetics makes a few assumptions to define the population as a gene pool. They claim that genes can be selected for. For this to be true, the following assumptions have to be true. None of them are. So the gene pool view of populations is wrong. Which means there is no selection on the level of the gene.
1.) No genetic linkage blocks.
2.) No epistasis.
3.) Infinite population size.
4.) Unlimited time for selection.
5.) Ability to select for an unlimited number of traits.
Of course, you are exaggerating here. While idealisations make for an easier mathematical treatment, none of these have to be absolute.
quote:
What?
What I said. CSI is not measured in bits. Information and Specified Information is measured in bits. Complex Specified Information is any Specified Information with more bits than the bound (or has a probability below the bound - which is the same thing).
quote:
Yet you don't explain why...
In fact I keep explaining why. Your calculations keep trying to pull in irrelevant details which aren't part of the specification. That's obviously invalid.

This message is a reply to:
 Message 1089 by Smooth Operator, posted 02-25-2010 12:30 AM Smooth Operator has replied

Replies to this message:
 Message 1094 by Smooth Operator, posted 02-25-2010 5:05 AM PaulK has replied
