

Understanding through Discussion




  
Author Topic:   Software Maintenance, Intelligent Design, and Evolution
lpetrich
Inactive Member


Message 1 of 12 (33676)
03-05-2003 1:30 AM


This issue was brought up in a recent thread. That message mentioned an example of an 8,000-line COBOL reporting program that had been revised 69 times and had become such a big mess that the thought of fixing it makes people "busy".
Most such examples of great bulks of clumsy, tangled code are likely to be in-house or proprietary commercial software, whose source code is seen only by its official maintainers. I wonder what major open-source projects have similarly tangled code; the main example I know of is Mozilla -- which was once closed-source. Its developers had to do a major rewrite of its HTML-layout engine in the middle of the project, because the original one was becoming almost impossible to maintain.
This fits well into my comments on a multi-design inference; one can sometimes see evidence of multiple programmers at work by evidence of different programming styles.
Much biological evolution works in similar fashion, with features being piled on top of earlier features and workarounds being a common practice. Consider the case of cetaceans, which are completely aquatic air-breathers. Yet like land vertebrates, their trachea departs from the esophagus in the ventral direction, while their nostrils are in the dorsal direction. But the top of the trachea plugs into the nasal cavities, making the air path completely separate from the food path.
Another example is the case of the Hox protein Ultrabithorax (Ubx). In the fruit fly Drosophila and other insects, it interferes with the formation of legs where it is expressed -- the abdominal segments. However, the shrimp Artemia has a version of Ubx that is expressed in the same region -- and that does not interfere with limb growth. Experiments in transplanting Artemia Ubx into Drosophila embryos reveal that the Artemia version causes the flies to start growing abdominal legs. So the disabling of leg growth by insect Ubx is a sort of workaround.
Other examples are various distortions, which often give the appearance of being workarounds. Cetacean nostrils do not open just above the mouth, the typical location for most vertebrates, but instead open at the top of the head. Interestingly, fossil cetaceans reveal intermediate-position nostrils. Flounders and similar fish have both eyes on one side of the head, as an adaptation to lying on one side on the ocean floor; as they grow, one eye moves to the other side of the head. Interestingly, other flattened fish are flattened on the dorsoventral axis; they rest on the ocean floor on their bellies. Vertebrate eyes usually point sideways; in a few groups, they point forwards -- but their early embryos have sideways-pointing eyes which later move forward.
Embryonic development features other workarounds of this sort. Land vertebrates and their aquatic descendants have no gills. However their early embryos grow fishlike gill bars and aortic arches paralleling them -- which end up either being used for different purposes or else resorbed. Also, land-vertebrate hearts develop from a fishlike two-chambered form to become either three-chambered or four-chambered; in the latter state, the heart has essentially split itself into two sub-hearts.
As to sonnikke's pique over computer programmers and engineers not accepting his theology, despite their being creators and designers, it could be that the sort of "design" that they tend to see is not some single great from-scratch design, but instead the sort of design that human designers are known to do -- piece-by-piece, with kludges and workarounds tending to accumulate over time.

Replies to this message:
 Message 2 by Peter, posted 03-05-2003 5:58 AM lpetrich has not replied
 Message 6 by DanskerMan, posted 03-18-2003 12:28 AM lpetrich has replied

  
Peter
Member (Idle past 1479 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 2 of 12 (33685)
03-05-2003 5:58 AM
Reply to: Message 1 by lpetrich
03-05-2003 1:30 AM


quote:
As to sonnikke's pique over computer programmers and engineers not accepting his theology, despite their being creators and designers, it could be that the sort of "design" that they tend to
see is not some single great from-scratch design, but instead the sort of design that human designers are known to do -- piece-by-piece, with kludges and workarounds tending to accumulate
over time.
For myself, it's that I don't see it as design at all.
The line of reasoning which has led to this thread being opened
started with the suggestion that simplicity, rather than
complexity, is the hallmark of design: over time designed
objects become more simple, elegant, and efficient, and
really good designs start that way.
Computer programs were brought up as an example: they often
become highly complex and inter-dependent due to successive
updates and 'fixes' by a number of different authors. This
process was likened to evolution, and in the main I feel the
analogy is quite accurate (excepting the 'author' part from my POV).
In terms of ID arguments I see these observations as problematic,
particularly for those who accept the Christian God as the designer.
1) IC.
There are programs which, through iterative update, include
functionality that if removed would stop the whole program
from working. BUT there is a revision history that shows
that the 'current' system was not designed that way but reached
that point via a development process.
I cannot give any specific examples on IP & confidentiality
grounds (not with my current employer, I will hastily add).
My feeling is that this indicates that a large number of revisions
can develop features that later become essential to normal
operation.
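A toy sketch of that kind of revision history (everything here is invented for illustration, not one of the confidential examples): a "cosmetic" fix introduced in one revision becomes something the next revision silently depends on, so removing it breaks the program.

```python
# Hypothetical revision history of a small reporting script.
# All names and the "revision" story are invented.

def normalize(record):
    # Revision 2: workaround for an upstream feed that began emitting
    # inconsistently capitalised, space-padded field names.
    # It looked purely cosmetic at the time.
    return {key.strip().lower(): value for key, value in record.items()}

def monthly_total(records):
    # Revision 3: written *assuming* normalized records, so it looks up
    # the lowercase key only. Remove normalize() from the pipeline and
    # this raises KeyError on real-world input.
    return sum(r["amount"] for r in records)

raw = [{"Amount": 10}, {" AMOUNT ": 5}]
clean = [normalize(r) for r in raw]
print(monthly_total(clean))   # 15 -- works only via the "optional" fix
```

The later code cannot function without the earlier workaround, yet the revision history shows the system was never designed that way; it merely reached that point.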
2) Design?
If simplicity is the hallmark of good design, and animals are
complex then we have a situation in which, should they have actually
been designed, they were designed poorly.
There are animal features (such as the human arrangement of
air and food intake) which offer a single point of failure
for two independent systems ... either one of which could
lead to the demise of the individual should they fail.
This is not a necessary design as pointed out in the OP,
since cetaceans have these tracts isolated.

This message is a reply to:
 Message 1 by lpetrich, posted 03-05-2003 1:30 AM lpetrich has not replied

  
lpetrich
Inactive Member


Message 3 of 12 (34020)
03-10-2003 3:04 AM


However, some software has kludginess that is difficult to conceal, notably software that other software depends heavily on, like operating-system software. I will now discuss an example I am rather familiar with.
The Macintosh Operating System was introduced by Apple in 1984, as the OS of its initial Macintosh model. It had taken some heroic software engineering to get an OS and a GUI shell inside of 128K of RAM with room to spare for application software, so the MacOS design team made several compromises.
* A very small screen with only 1-bit color depth. Thus, each byte contained 8 pixels.
* Single-tasking. This made it easier to fit apps into that small memory, and it also simplified OS design -- there was no need to keep apps from stepping on each other's toes.
* A single memory space without memory protection. There was no point in supporting multiple memory spaces, and in place of virtual memory the original MacOS used an ingenious hack: each file had not only a Unix-like "data fork" but also a mini-database "resource fork", which held code segments, GUI-widget definitions, etc. that could be loaded and purged as needed.
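The 1-bit packing in the first compromise above can be sketched in a few lines (illustrative Python only, not Apple's actual QuickDraw code; the leftmost-pixel-in-high-bit ordering is the usual convention and an assumption here):

```python
# Eight 1-bit pixels per byte, as on the original 512x342 Macintosh screen.

WIDTH, HEIGHT = 512, 342          # the original Mac's screen dimensions
ROW_BYTES = WIDTH // 8            # 64 bytes per row of pixels

framebuffer = bytearray(ROW_BYTES * HEIGHT)

def set_pixel(x, y, on):
    index = y * ROW_BYTES + x // 8
    mask = 0x80 >> (x % 8)        # leftmost pixel lives in the high bit
    if on:
        framebuffer[index] |= mask
    else:
        framebuffer[index] &= ~mask & 0xFF

def get_pixel(x, y):
    index = y * ROW_BYTES + x // 8
    return (framebuffer[index] >> (7 - x % 8)) & 1

set_pixel(10, 3, True)
print(get_pixel(10, 3))           # 1
```

Every pixel operation pays for this packing with shift-and-mask arithmetic, which is part of why moving to deeper color later meant new API, not just bigger buffers.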
However, the Macintosh line was successful enough for Apple to continue with it, adding more RAM and faster CPU's, and ultimately improving the MacOS.
By the late 1980's, Apple had introduced the MultiFinder, which was capable of running several apps at once and switching among them while they all kept running. However, there was a certain difficulty: the OS had no way to keep those apps from stepping on each other's toes, so Apple's programmers implemented cooperative multitasking -- apps are only interrupted at "safe" times, when they voluntarily yield control. Preemptive multitasking was limited to device-driver tasks and the like, and those could not do anything "unsafe", like allocate memory or do GUI work.
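That cooperative scheme can be sketched in a few lines: generator "apps" yield at their safe points, and a scheduler round-robins among them. (On the real MacOS the yield points were event calls such as WaitNextEvent; the rest here is invented for illustration.)

```python
# Minimal cooperative multitasking sketch. Each "app" runs until it
# voluntarily yields; nothing can preempt it in between.

from collections import deque

log = []

def app(name, steps):
    for i in range(steps):
        log.append(f"{name} step {i}")
        yield                      # the "safe" point: control returns here

def run(tasks):
    queue = deque(tasks)
    while queue:
        task = queue.popleft()
        try:
            next(task)             # run until the task's next yield
            queue.append(task)     # still alive: reschedule it
        except StopIteration:
            pass                   # task finished; drop it

run([app("Finder", 2), app("MacWrite", 3)])
print(log)
```

Note the scheme's weakness, which bedeviled the MacOS for years: an app that never yields freezes every other app, and the scheduler can do nothing about it.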
And though virtual memory was added at that time, the MacOS continued to support only one memory space. More than one would have required creating big shared blocks that essentially defeat the purpose of multiple memory spaces.
The move from 1-bit color to 8-bit indexed color to 32-bit full color was, by comparison, relatively easy. The original graphics API, QuickDraw, was still functional; Apple simply added some API features for handling color tables and full color. Moving to larger screen sizes, and even multi-monitor support, was also easy.
Moving onward, in the mid-1990's, Apple tried to create an improved version of the MacOS called "Copland". It had a new OS kernel, one which supported preemptive multitasking and multiple memory spaces. However, that did not fit in very well with the existing MacOS, so the existing MacOS was to be confined to a single memory space, the "Blue Box", and run as a single app, with existing MacOS apps being sub-apps of it.
Copland failed because of grotesque feature creep and some awkward features that were difficult to get working properly, but that's mostly another story. One of those awkward features, however, was its way of handling a new GUI-widget set that Apple had devised. These were to live in the Blue Box, and Apple's old GUI widgets were to be implemented in terms of them. That was rather difficult to get working properly, especially when some apps had manipulated GUI-widget internals.
The "Blue Box" approach to running the MacOS inside of another OS was to be adopted by some amateur programmers, who created apps that enabled the MacOS to be run inside of various other OSes, like the AmigaOS, the BeOS, and Linux -- apps called "Basilisk", "ShapeShifter", "SheepShaver", and "Mac-on-Linux".
Apple's acquisition of NeXT allowed it to succeed with MacOS X where it had failed with Copland -- and it supports the "Classic" MacOS in exactly that Blue-Box fashion.
But many of Apple's developers were reluctant to do all the rewriting necessary to get from the old MacOS to the NeXTian app framework renamed Cocoa. So Apple's programmers thought of a halfway house called "Carbon". Many of the old MacOS GUI widgets and other API features would still be supported in Carbon, but not all of them, and a Carbon app would have its own memory space and not have to live in Classic's Blue Box.
One can point to other examples of OS kludginess, notably DOS and the various flavors of Windows, but I am not as familiar with those, and I'm not sure I'd want to be.

Replies to this message:
 Message 4 by Peter, posted 03-12-2003 2:27 AM lpetrich has not replied

  
Peter
Member (Idle past 1479 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 4 of 12 (34157)
03-12-2003 2:27 AM
Reply to: Message 3 by lpetrich
03-10-2003 3:04 AM


If I'm understanding the intent of your post, it kind
of sums up my objections to complexity being related to
design.
The OS that you mention has a very weak design process,
as is often the case in software projects. Re-use,
when software components are designed for re-use, can present
some elegant solutions in rapid time scales. Re-using legacy
code requires a certain amount of shoe-horning.
The complexity and interaction levels of kernels with this
revision history indicate (almost yell) that the up-front
design effort was lacking.
If that wasn't the point you were making, or I have misread
your intent in some other way, I apologise.

This message is a reply to:
 Message 3 by lpetrich, posted 03-10-2003 3:04 AM lpetrich has not replied

  
lpetrich
Inactive Member


Message 5 of 12 (34166)
03-12-2003 6:21 AM


Peter:
If I'm understanding the intent of your post, it kind
of sums up my objections to complexity being related to
design.
That's right. A once-and-for-all designer would likely have done things differently than what we see.
The OS that you mention has a very weak design process,
as is often the case in software projects. Re-use,
when software components are designed for re-use, can present
some elegant solutions in rapid time scales. Re-using legacy
code requires a certain amount of shoe-horning.
As an example, the "Classic" MacOS API features direct access to various OS features like "low-memory globals" and GUI-widget data structures like window definitions.
It is hard to justify this "design decision" for a multitasking OS, because there is the serious problem of different tasks having access to the same OS data.
By comparison, Apple's "Carbon" API, though closely following the "Classic" API, avoids this problem by using accessor functions. Such functions allow such exclusive access to be enforced much more easily, as well as giving more room for future growth. Looking back, Apple's OS designers could have saved themselves a lot of trouble by making the original MacOS more Carbon-like.
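A minimal sketch of the two API styles, in Python rather than the Pascal/C of the actual MacOS APIs, with all names invented:

```python
# Illustrative contrast between exposing a raw structure and hiding it
# behind accessor functions. Not Apple's actual API.

import threading

# "Classic" style: the OS exposes the raw record and every app pokes
# at it directly. Nothing mediates concurrent access, and the record's
# layout can never change without breaking every app.
window = {"title": "Untitled", "visible": True}
window["title"] = "Report"       # any task can do this at any time

# "Carbon" style: the record is hidden behind accessors, so the OS can
# enforce exclusive access and evolve the layout without breaking callers.
_lock = threading.Lock()
_window = {"title": "Untitled", "visible": True}

def set_window_title(title):
    if not isinstance(title, str):
        raise TypeError("title must be a string")
    with _lock:                  # exclusive access, invisible to callers
        _window["title"] = title

def get_window_title():
    with _lock:
        return _window["title"]

set_window_title("Report")
print(get_window_title())        # Report
```

The accessor version costs a function call per access, but it is the only one of the two that can later add locking, validation, or a new internal layout without touching a single caller.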
The MacOS API is comprehensively discussed in Apple's online documentation. QuickDraw is a rather obvious example of design evolution, while the file-system API ("File Manager") is bewilderingly complicated, and has the look of multiple design teams trying to implement similar functionality.
The complexity and interaction levels of kernels with this
revision history indicate (almost yell) that the up-front
design effort was lacking.
That's right. The original MacOS had had some design features that had made it non-expandable in certain ways, so its successors have to be run as an app inside of another OS.
If that wasn't the point you were making, or I have misread
your intent in some other way, I apologise.
No, you got it right. There is no need to apologize.

  
DanskerMan
Inactive Member


Message 6 of 12 (34599)
03-18-2003 12:28 AM
Reply to: Message 1 by lpetrich
03-05-2003 1:30 AM


As to sonnikke's pique over computer programmers and engineers not accepting his theology, despite their being creators and designers, it could be that the sort of "design" that they tend to see is not some single great from-scratch design, but instead the sort of design that human designers are known to do -- piece-by-piece, with kludges and workarounds tending to accumulate over time.
It doesn't matter how you look at it, computer programs, buildings, cars, they are *all* designed by *intelligence*, not random, un-intelligent "forces", that simply is impossible. The more complex a design gets, the more involved it gets, and the more obvious it becomes that intelligence is involved. Your analogy doesn't work because if you take out your designer(s), *nothing* would get created.
S
------------------
"We arrive at the truth, not by the reason only, but also by the heart."
Blaise Pascal

This message is a reply to:
 Message 1 by lpetrich, posted 03-05-2003 1:30 AM lpetrich has replied

Replies to this message:
 Message 7 by PaulK, posted 03-18-2003 2:35 AM DanskerMan has replied
 Message 10 by lpetrich, posted 03-20-2003 2:42 AM DanskerMan has not replied
 Message 11 by Peter, posted 03-20-2003 7:16 AM DanskerMan has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 7 of 12 (34604)
03-18-2003 2:35 AM
Reply to: Message 6 by DanskerMan
03-18-2003 12:28 AM


The point of this thread is that you are quite wrong. Evolution has got a lot in common with the tinkering that produces unnecessary complexity while creationism postulates a design-from-scratch scenario that should keep complexity down. The more complex life is the more likely it evolved, rather than being designed.

This message is a reply to:
 Message 6 by DanskerMan, posted 03-18-2003 12:28 AM DanskerMan has replied

Replies to this message:
 Message 8 by DanskerMan, posted 03-19-2003 12:44 AM PaulK has replied

  
DanskerMan
Inactive Member


Message 8 of 12 (34654)
03-19-2003 12:44 AM
Reply to: Message 7 by PaulK
03-18-2003 2:35 AM


The point of this thread is that you are quite wrong. Evolution has got a lot in common with the tinkering that produces unnecessary complexity while creationism postulates a design-from-scratch scenario that should keep complexity down. The more complex life is the more likely it evolved, rather than being designed.
I obviously totally disagree with you. And you can't escape the human designer(s): how would you create a computer program if you could *only* sit in front of the computer and stare at it without touching a single key? You obviously couldn't. Your *intelligent* intervention is req'd, and the more complex it gets, the more intelligence is req'd. It's simply an inescapable fact!
------------------
"We arrive at the truth, not by the reason only, but also by the heart."
Blaise Pascal

This message is a reply to:
 Message 7 by PaulK, posted 03-18-2003 2:35 AM PaulK has replied

Replies to this message:
 Message 9 by PaulK, posted 03-19-2003 2:39 AM DanskerMan has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 9 of 12 (34659)
03-19-2003 2:39 AM
Reply to: Message 8 by DanskerMan
03-19-2003 12:44 AM


Well, there is one big problem in your argument: it assumes that evolution doesn't happen. However, evolution does happen, and -- as we have been pointing out -- it fits the observed data much better than "design-from-scratch". And evolutionary principles have been used to produce designs, so design is certainly possible without intelligent intervention.

This message is a reply to:
 Message 8 by DanskerMan, posted 03-19-2003 12:44 AM DanskerMan has not replied

  
lpetrich
Inactive Member


Message 10 of 12 (34731)
03-20-2003 2:42 AM
Reply to: Message 6 by DanskerMan
03-18-2003 12:28 AM


Sonnikke:
It doesn't matter how you look at it, computer programs, buildings, cars, they are *all* designed by *intelligence*, not random, un-intelligent "forces", that simply is impossible. The more complex a design gets, the more involved it gets, and the more obvious it becomes that intelligence is involved. Your analogy doesn't work because if you take out your designer(s), *nothing* would get created.
First, "un-intelligent" forces are NOT random. Pick up a pen and then let go. Watch it drop. Is there a little elf that grabs the pen and pulls it down? Look at crystals. Is there some big team of tiny gnomes that arrange their molecules in crystal-lattice patterns? And consider moons and planets moving in approximately elliptical orbits around their primaries. Are there angels pushing them around?
Also, complexity does not necessarily imply design. Look at some snowflakes some time -- look at how complex they are. Are they designed by some fairies?
My point is that if "intelligent design" was involved in the evolution of the Earth's biota, then that intelligent design has worked in much the same way as human designers, who are multiple, finite, and fallible, and who leave abundant evidence of that in their designs. The Earth's biota does not look as if it was designed from scratch in one swell foop, which causes trouble for the hypothesis of a single omnipotent and otherwise perfect designer -- why create a big mass of designs that looks like the result of multiple, finite, and fallible designers?
Or was that an intended created appearance, in the fashion of Philip Gosse's Omphalos?

This message is a reply to:
 Message 6 by DanskerMan, posted 03-18-2003 12:28 AM DanskerMan has not replied

  
Peter
Member (Idle past 1479 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 11 of 12 (34742)
03-20-2003 7:16 AM
Reply to: Message 6 by DanskerMan
03-18-2003 12:28 AM


You've hit the point of the discussion fairly well.
You argue that:
'The more complex a design gets, the more involved it gets, and the more obvious it becomes that intelligence is involved.'
Skipping over the fact that you've already called it a design, on
the assumption that you were being loose with your wording ...
Some here have pointed out that the best 'designed' computer
programs are the most concise and simple of things, while those
that have had many designers inputting over an extended time frame
(or those where commercial constraints have narrowed time-scales
so that management is unwilling to pay for up-front design
rigour) are exceedingly complex beasts.
If we look at biological systems in this light, we must conclude
that there is little design effort put in (even if design is the
correct hypothesis). Biological systems are extremely complex,
show 'legacy code' that is still in place, but unused etc. etc.
If God were the designer then he clearly had no commercial limits
or time constraints (he chose to do it in six days; it wasn't
a work deadline with penalty clauses).
This leads to the possibility that God is either a poor designer,
that (S)He is not omnipotent, or that perhaps intelligent design
is not what happened.

This message is a reply to:
 Message 6 by DanskerMan, posted 03-18-2003 12:28 AM DanskerMan has not replied

  
crashfrog
Member (Idle past 1467 days)
Posts: 19762
From: Silver Spring, MD
Joined: 03-20-2003


Message 12 of 12 (34812)
03-20-2003 6:07 PM


Genetic Programming
I'm glad someone brought up the relationship of computer software, evolution, and intelligent design hypotheses. I was pondering starting a new thread but I'm glad I don't have to. (I'm new to the forum, so hello everybody.)
The technique of so-called "genetic" or "evolutionary" programming is not new; however, a recent article in Scientific American highlights its use in creating patentable electronics. To put it simply, genetic programming is a computer-driven problem-solving technique that applies the basics of evolutionary models (heritable variation, descent with modification, survival of the fittest) to general problems, such as (in the case of the article) filtering out high- or low-frequency signals. The computer builds random electronic circuits from standard parts, puts them to the test -- do they do any filtering, for example? -- and selects the most successful circuits. Then processes like mutation and genetic crossover are applied to generate new circuits, which are tested again, and so on.
The gist of the article is that not only does this process give rise to successful circuits, it often creates circuits more efficient than the same circuit designed by a human. Also, while the article doesn't make this comparison, such circuits often bear similarities to living systems, in that they possess elements that are redundant or serve no purpose. Many of the resulting circuits even defy our understanding of how they operate; they're simply too complex.
Basically, if evolutionary processes (random chance plus natural selection) can give rise to complex systems - in fact, can make them better than humans can design them - what does that say for "intelligent design" theories? Ignoring the fact that the whole genetic programming system is human-made - if you need God to set up evolution in the beginning, that's fine - doesn't this spell defeat for a theory based on the premise that complexity can't come from chance?
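The loop described above can be sketched as a toy genetic algorithm. Here a bit string evolved toward all 1s stands in for a circuit evolved toward good filtering (purely illustrative; the article's actual system is far more elaborate):

```python
# Toy genetic algorithm: random candidates, a fitness test, selection,
# then mutation and crossover, repeated over many generations.

import random

random.seed(1)                              # deterministic for the demo
LENGTH, POP, GENERATIONS = 20, 30, 60

def fitness(genome):
    return sum(genome)                      # "how well does it filter?"

def mutate(genome, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in genome]

def crossover(a, b):
    cut = random.randrange(1, LENGTH)       # single-point crossover
    return a[:cut] + b[cut:]

# Generation zero: pure random variation.
population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP)]

for _ in range(GENERATIONS):
    # Keep the fitter half, then breed it back up to full size.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP // 2]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP - len(survivors))]
    population = survivors + children

best = max(population, key=fitness)
print(fitness(best), "out of", LENGTH)      # at or near all 1s
```

No step in the loop "knows" what a good genome looks like; selection acting on blind variation does all the work, which is exactly the point of the circuit-evolution results.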
(P.S. I'm sorry the article is subscription-based; I suggest you get to your library. It's the February issue. Great diagrams.)
------------------
Epimenides Signature: This is not a signature.
{Note by edit - This message has been spun off as its own topic, at "Genetic Programming as evidence against ID" - Adminnemooseus}
[This message has been edited by Adminnemooseus, 03-23-2003]
Edited by Adminnemooseus, : Fixed link to new topic.

  


Copyright 2001-2023 by EvC Forum, All Rights Reserved
