Topic: Quality Control the Gold Standard

Wounded King Member (Idle past 358 days) Posts: 4149 From: Cincinnati, Ohio, USA
to say nothing of Von Neumann's mathematical analysis. Which is exactly what you do: say nothing about it. You say that Von Neumann wished to design a self-replicating machine himself and ended up throwing in the towel. You haven't presented a scrap of evidence that he concluded that such a system could not be produced by any mechanism; all you have done is say that Von Neumann gave up on producing such a system himself. You certainly haven't shown that he ruled out random mutation and natural selection as a method of generating such a system. Indeed, Von Neumann himself explicitly discusses the operation of a Von Neumann machine in terms of random mutation (Von Neumann, 1966).
One of the difficulties in defining what one means by self-reproduction is that certain organizations, such as growing crystals, are self-reproductive by any naive definition of self-reproduction, yet nobody is willing to award them the distinction of being self-reproductive. A way around this difficulty is to say that self-reproduction includes the ability to undergo inheritable mutations as well as the ability to make another organism like the original. So Von Neumann saw a requirement for heritable mutation in a self-reproductive system more complex than trivial systems such as crystalline growth. TTFN, WK
Evopeach Member (Idle past 6939 days) Posts: 224 From: Stroud, OK USA
The problem is you are again practicing obfuscation.
The DNA 7-sigma replication scheme and machinery had to EVOLVE from non-life to life and from the simplest possible, highly error-prone replicator (which no one can define or demonstrate) to the present 7-sigma replicator by random mutation and natural selection. The issue is that your example is a "toy" and did not evolve but was designed, created de novo, by an intelligent designer. So it has no evolutionary application unless and until it is developed by random "mutations" from, say, a one-device string length to at least forty devices. Please, no more red herrings or straw-man analogies.
Modulous Member (Idle past 311 days) Posts: 7801 From: Manchester, UK
The present Six Sigma paradigm attempts to design and operate complex processes so accurately that only 3.4 errors per million operations are realized over the long run. That is the six sigma level of performance. Six Sigma is for human-driven processes. If humans were driving evolution then you'd have a point, but we aren't.
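The arithmetic is easy to check, by the way. A rough Python sketch, assuming the conventional 1.5-sigma long-term shift that the Six Sigma literature uses:

```python
# Rough sketch: defects per million opportunities (DPMO) at a given
# sigma level, assuming the conventional 1.5-sigma long-term shift.
from statistics import NormalDist

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    # One-sided defect probability beyond the shifted specification limit.
    p_defect = 1 - NormalDist().cdf(sigma_level - shift)
    return p_defect * 1_000_000

for s in (3, 4, 5, 6):
    print(f"{s} sigma: {dpmo(s):,.1f} DPMO")
# 3 sigma comes out near 66,800 DPMO; 6 sigma near 3.4 DPMO.
```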
The cell/DNA replication process is operating at about 7 sigma... an undreamed of level of accuracy and quality performance. If it were a human process it would be. However, I have written a program which replicates itself perfectly every time. It does this through checksum algorithms. The biological system is very good, but it isn't as perfect as a designed system. Which is good for us, because otherwise we wouldn't have evolved. I'd imagine a rigidly perfect quality control system would be selected out, because life is best when it is adaptable. It is adaptable when it doesn't replicate perfectly. Ironically perhaps, airplane fatality rates are approaching 7 sigma levels, so it's hardly undreamed of.
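There is nothing exotic involved. A rough sketch of that checksum-verified copying in Python (file names hypothetical): a copy is only accepted when its digest matches the original's, so accepted copies are perfect by construction.

```python
# Sketch: replication with checksum verification. A copy is only
# accepted if its SHA-256 digest matches the original's.
import hashlib
import shutil

def sha256(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def replicate(src: str, dst: str, retries: int = 3) -> bool:
    for _ in range(retries):
        shutil.copyfile(src, dst)
        if sha256(dst) == sha256(src):
            return True   # verified, bit-for-bit perfect copy
    return False          # persistent corruption: reject the copy

# Hypothetical usage: replicate("replicator.py", "replicator_copy.py")
```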
At no stage is the improvement sought by introducing a source of random error, operating, seeing if the market accepts the new result, keeping those that are accepted and discarding those that are unworkable or inefficient or otherwise unmarketable. Why? Because it would absolutely never work in the real world. Well of course not; one wouldn't want airplane designers to build real planes that get 'selected out' of the design circulation. However, evolutionary algorithms can simulate this process so no planes have to be lost. It's cheaper. Time and again, optimum results are found using evolutionary methods as inspiration. Funny, that.
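A toy sketch of that mutate-and-select loop in Python; the fitness function here is a made-up stand-in for "what the market accepts", not a real design problem:

```python
# Toy evolutionary loop: introduce random error, keep what scores
# better, discard what scores worse. No designer steers the mutations.
import random

def fitness(design):
    # Made-up objective: the closer to all-ones, the "more marketable".
    return -sum((x - 1.0) ** 2 for x in design)

design = [random.uniform(-5, 5) for _ in range(10)]
for generation in range(10_000):
    mutant = [x + random.gauss(0, 0.1) for x in design]  # random error
    if fitness(mutant) > fitness(design):                # selection
        design = mutant

print(fitness(design))  # climbs toward 0, the optimum
```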
No such R&D effort would ever result in a new or higher-quality profitable marketable product... not ever, and the enterprise would simply go bankrupt. Falsified. They wouldn't do it using real-life product, but they would do it using simulations (which aren't cheap).
Yet evolutionists suppose that a seven sigma replicator arose by a random error generator and an accept/reject "market" mechanism, namely random mutation and natural selection. Not all evolutionists accept abiogenesis. Many do, but not all. Simple replication has been achieved, but it is not the kind of replication that would truly class as life. It is a good step, though. This message has been edited by Modulous, Wed, 08-February-2006 04:06 PM
Wounded King Member (Idle past 358 days) Posts: 4149 From: Cincinnati, Ohio, USA
Perhaps it is worth pointing out that, since you yourself were mentioning the many corrective systems you said were necessary for a Von Neumann machine, it seems a bit strange to complain when someone points out that such systems can produce the sort of error levels you were calling for. Especially when living cells also have all these sorts of error-correcting and error-detecting systems.
The basic replication machinery on its own is much, much less effective, as can be seen when elements of it are knocked out or malfunction, as in many cancers which show hypermutation and genetic instability. TTFN, WK
Evopeach Member (Idle past 6939 days) Posts: 224 From: Stroud, OK USA
... living organisms are very complicated aggregations of elementary parts, and by any reasonable theory of probability or thermodynamics highly improbable. That they should occur in the world at all is a miracle of the first magnitude; the only thing which removes, or mitigates, this miracle is that they reproduce themselves. Therefore, if by any peculiar accident there should ever be one of them, from there on the rules of probability do not apply, and there will be many of them, at least if the milieu is reasonable.
John von Neumann, Theory of Self-Reproducing Automata. Imagine that: Von Neumann stating the problems of probability and thermodynamics, miracles, highly improbable, and surely he was semantically accurate, being one of the great intellects in a semantically precise world and very knowledgeable of science in general. The fact that he determined that the error rates and self-correction devices outlined previously pose a "mathematically intractable problem" in the actual design and operation of anything approaching life processes is simply reality. No one could imagine the creation of such a device with error rates in the 10**-9 range being achieved by some random process. A mathematical analysis by Von Neumann is not subjective and carries enormous weight. Of course his assumption of perpetuity after the fact is unconvincing, because it assumes the first replicator could perform at the required accuracy. The probability of a replicator happening de novo capable of the required extant accuracy or quality in performance is smaller than 10**-150, the commonly used definition of impossibility. This has been accepted by your own people and is well documented in the literature. The counter-argument is that very poor replicators with high error rates could somehow be protected from opposing natural forces over eons of time as they evolved the required protective characteristics and error-correcting capacities; a premise unproven, undemonstrated and phantasmagorically improbable itself.
Yaro Member (Idle past 6822 days) Posts: 1797
Your engineering analogies don't work.
They presuppose your conclusion. You assume that biology is intelligently designed, therefore you try to prove this by applying engineering methodologies to biology. Unfortunately, you are bobbing for oranges. Engineering theories are not applicable to biology because, even if organisms were intelligently designed at one point, they aren't subject to intelligent design today and therefore can't be expected to perform to some sort of man-made quality-control standard. This message has been edited by Yaro, 02-08-2006 11:38 AM
crashfrog Member (Idle past 1793 days) Posts: 19762 From: Silver Spring, MD
The issue is that your example is a "toy" and did not evolve but was designed, created de novo, by an intelligent designer. So, what you're saying is, you didn't even read the article?
The DNA 7-sigma replication scheme and machinery had to EVOLVE from non-life to life and from the simplest possible, highly error-prone replicator (which no one can define or demonstrate)... Why would it be both simple and error-prone? Spoken like someone who doesn't know anything about design.
Wounded King Member (Idle past 358 days) Posts: 4149 From: Cincinnati, Ohio, USA
Imagine that: Von Neumann stating the problems of probability and thermodynamics, miracles, highly improbable, and surely he was semantically accurate, being one of the great intellects in a semantically precise world and very knowledgeable of science in general. The question is how similar to a modern living organism, or a hypothetical Von Neumann-capable system, the original self-replicator needs to be.
The probability of a replicator happening de novo capable of the required extant accuracy or quality in performance is smaller than 10**-150, the commonly used definition of impossibility. This has been accepted by your own people and is well documented in the literature. Why not provide some references to go with those claims? As far as I am aware this has barely any profile in the scientific literature, but is rather a favourite of the ID crowd, especially William Dembski, who has popularised the concept. This is all just an argument from incredulity tarted up with some numbers you made up off the top of your head. That a problem is 'mathematically intractable' has absolutely nothing to do with its likelihood of occurring. N-body problems are mathematically intractable, and yet few people resort to supernatural, or intelligent if you prefer, intercession to explain how the various bodies of the solar system interact. TTFN, WK
Evopeach Member (Idle past 6939 days) Posts: 224 From: Stroud, OK USA
Human-driven has nothing to do with it. It's analysis and mathematics, pure and simple physical reality, which is applicable and being applied in nanoscience every day. See, that's why our only hope of realizing nano-machines is to use molecules, DNA, etc., because we have no hope of building them ourselves or "waiting for them to evolve".
Think about it... if there were one airplane crash per billion flights in all aviation (military, public and private), we would essentially never have a crash. Yet we have reported crashes somewhere in the USA every day... check the national database if you care to. It's more like 4 sigma, even for the airlines. As RR used to say, "there they go again". Your program is designed, has an OS with error-correcting processes, none of which evolved... period. This is the same old "look, we made life using only a very small piece of RNA or DNA"... blah blah blah. Assuming the answer you want or asserting that something is true is not rigorous science... it's a form of metaphysics. Having spent several years as an OR analyst in the defense industry and energy industry, I can assure you that design was never approached by random trial-and-error componentry perturbation. I am quite familiar with Monte Carlo methods, etc., and the modeling of error propagation. Never was there any attempt to design from random trial and error using simulation... not ever.
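For anyone unfamiliar, Monte Carlo error propagation looks roughly like this in Python (the component model and tolerances are made-up numbers). Note what the randomness is for: assessing the error spread of a fixed design, not generating the design itself.

```python
# Sketch of Monte Carlo error propagation: sample the input tolerances,
# push each sample through a fixed model, read off the output spread.
import random

def model(r1, r2):
    # Made-up fixed design: parallel resistance of two resistors.
    return (r1 * r2) / (r1 + r2)

samples = [model(random.gauss(100, 1), random.gauss(220, 2))
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
print(f"output: {mean:.2f} +/- {std:.2f} ohms")
```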
Evopeach Member (Idle past 6939 days) Posts: 224 From: Stroud, OK USA
Genetic ENGINEERING is the precise attempt to apply ENGINEERING principles to biological R&D to effect characteristics not otherwise extant. LOL!
Engineering, math, physics, chemistry and information theory are absorbing the biological sciences at light speed, because the soft, mushy, qualitative evolutionary framework is simply an anachronism to progress. Nanoscience is the application of ENGINEERING principles to the design of nano-particle manufacturing, etc. Check it out for yourself: every nanoscience degree program is set up in engineering divisions... period.
Evopeach Member (Idle past 6939 days) Posts: 224 From: Stroud, OK USA
Crashfrog,
This is supposed to be a forum restricting personal attacks... how about following the guidelines? The RNA-world theory is currently the hot theory for how abiogenesis might have occurred... period. Yet although it is a replicator of sorts, very small and simplistic in size and complexity, it has very high error rates (words of the community, not mine); thus it is quite problematical to even envision how it could evolve anything before dying.
Chiroptera Inactive Member
Hey, Evopeach. You are still avoiding the problem in your OP.
Your argument seems to go like this: The error-correction mechanism in DNA replication of the human cell is very efficient. Therefore, there must be an intelligent designer. First of all, your premise is wrong; "very efficient" is a subjective term, and unless you give some sort of criteria for "very efficient" that is relevant to your argument, your premise is meaningless. Second, you seem to be missing the entire deductive-reasoning part, where you explain how your conclusion follows from your premise. I think you might want to work on this a bit before you start calling everyone a moron. "Intellectually, scientifically, even artistically, fundamentalism -- biblical literalism -- is a road to nowhere, because it insists on fidelity to revealed truths that are not true." -- Katha Pollitt
Chiroptera Inactive Member
...thus it is quite problematical to even envision how it could evolve anything before dying. Problematical for whom? Why? "Intellectually, scientifically, even artistically, fundamentalism -- biblical literalism -- is a road to nowhere, because it insists on fidelity to revealed truths that are not true." -- Katha Pollitt
AdminJar Inactive Member
This is supposed to be a forum restricting personal attacks... how about following the guidelines? Sorry, but I reviewed the post and there was no personal attack. What Kermee said was...
Why would it be both simple and error-prone? Spoken like someone who doesn't know anything about design. That is attacking the content of the post and not the poster.
Evopeach Member (Idle past 6939 days) Posts: 224 From: Stroud, OK USA
Well, it's for sure the evo community knows it's a lot closer to what we see working than any theory they can advance, since chemical predestination, RNA world, DNA world, silicon life, "the dog ate my homework", "the magic genie", the intelligent comet, and karma Bob, visitor from the planet Nazbot, have all been discarded. Except for the "life force" camp of Nobel Prize winners... which camp are you in?
You might start with Robert Shapiro's "Origins", and works by Hoyle, Crick and Morowitz, but maybe a big boy like you can operate Google. Mathematically intractable is a form of falsification and has everything to do with it. The multi-body gravitational attraction problem is strictly deterministic and can be solved by iterative approximation methods to a high degree of accuracy. The abiogenesis and early evolution of the so-called simple replicator theories are by definition not determinative... see, that's sort of the definition of RANDOM MUTATION. See, the operative term is RANDOM.
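That iterative approximation is nothing more than stepping the deterministic equations forward in small time increments. A bare-bones Python sketch, with unit constants and made-up initial conditions:

```python
# Bare-bones N-body integration (Euler method): same inputs always
# produce the same trajectory, because nothing here is random.
G, dt = 1.0, 0.001                       # unit gravitational constant
pos = [[0.0, 0.0], [1.0, 0.0]]           # made-up initial positions
vel = [[0.0, -0.5], [0.0, 0.5]]          # made-up initial velocities
mass = [1.0, 1.0]

for step in range(10_000):
    acc = [[0.0, 0.0] for _ in pos]
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += G * mass[j] * dx / r3
            acc[i][1] += G * mass[j] * dy / r3
    for i in range(len(pos)):
        vel[i][0] += acc[i][0] * dt
        vel[i][1] += acc[i][1] * dt
        pos[i][0] += vel[i][0] * dt
        pos[i][1] += vel[i][1] * dt

print(pos)  # deterministic: rerunning gives exactly the same result
```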