What is the connection between the late beloved songwriter ... and Cray Supercomputers?
It was a pun. You need for us to explain a pun?
Man beseeching God: Why did you create the world like you did?
God, impatiently: Well, if I have to explain the joke then it's no longer funny.
Read Message 1204
and Message 1205
and think. BTW, I'm the one who mentioned the Cray-1S (our class from the UND Computer Science Dept toured their manufacturing facility in Chippewa Falls around 1980).
xongsmith was citing a pun, which I developed further.
Think about it:
- Despite both CDR Spock and Han Solo erroneously using it as a unit of time (the standard Physics for Poets college class should be augmented with Science for Screenwriters), the light-year is a unit of distance: the distance that light travels in one year = 9,460,730,472,580.8 km (exactly) ≈ 5.8786×10^12 statute miles.
Subsequent quotes in this message are from that Wikipedia page.
The nearest known star (other than the Sun), Proxima Centauri, is about 4.24 light-years away.
- This has spawned other units, most of them informal, for the distance traveled by light in smaller units of time; eg, the light-day, the light-hour, the light-minute, the light-second, and the light-nanosecond.
For example, the distance from the sun to the earth is about eight (8) light-minutes, since it takes sunlight about 8 minutes to reach the earth. This was used in a sci-fi short story dramatized on the first revival of The Twilight Zone (or Night Gallery) in which a young boy (I keep thinking it was Bill Mumy) with the ability to predict the future was asked on live TV to predict the future of humanity and he predicted the most incredibly wonderful future for us. Immediately off-camera all the adults were ecstatic but the boy was inconsolably sad. When asked what was wrong, he explained that the sun had just exploded but we wouldn't know that for another eight minutes so he gave us joy before our sudden deaths. Final shot was the camera following an adult's gaze to the window and the sun.
Communication with deep space probes is delayed by distance:
Reflected sunlight from the Moon's surface takes 1.2–1.3 seconds to travel the distance to the Earth's surface (travelling roughly 350,000 to 400,000 kilometres).
New Horizons encounters Pluto at a distance of 4.7 billion kilometres, and the communication takes 4 hours 25 minutes to reach Earth.
Voyager 1 is, as of October 2018, nearly 20 light-hours (144 au, 21.6 billion km, 13.4 billion mi) from the Earth.
- A light-second follows directly from the definition of the speed of light:
... the light-second, useful in astronomy, telecommunications and relativistic physics, is exactly 299,792,458 metres ...
Therefore, a light-nanosecond is the distance light travels in one nanosecond, which is about 29.98 cm, about 3/16-ths of an inch short of a foot.
- CAVEAT: the speed of light depends on the medium through which it is traveling. The highest speed is through a vacuum (ie, the absence of a medium); light is slower through any material medium (eg, glass, air, water). Electrical signals likewise propagate at a substantial fraction of the speed of light and are also slowed down by the medium through which they travel.
Fun fact: When light transits from one medium to another, its speed changes thus causing the light to refract. This is why lenses and prisms work.
- While terms such as "light-<unit of time>" are used to measure the distance that light travels in that unit of time, we can also use terms such as "light-<unit of distance>" to measure how long it takes light to travel that distance.
Light travels approximately one foot in a nanosecond; the term "light-foot" is sometimes used as an informal measure of time.
Per the last, since a "light-foot" refers to one nanosecond of time, then through algebraic substitution Gordon Lightfoot's name becomes "Gordon Nanosecond".
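The arithmetic behind the pun checks out; here is a quick back-of-the-envelope sketch using the exact SI constants (only the "3/16 of an inch" figure is a rounded approximation):

```python
# Checking the light-nanosecond and light-year arithmetic above.
# Pure Python, no dependencies; the two constants are exact by definition.

C_M_PER_S = 299_792_458      # speed of light in vacuum, metres per second (exact)
METRES_PER_FOOT = 0.3048     # international foot in metres (exact)

# Distance light covers in one nanosecond, in centimetres:
light_ns_cm = C_M_PER_S * 1e-9 * 100
print(f"light-nanosecond = {light_ns_cm:.2f} cm")   # ~29.98 cm

# How far short of a foot is that, in inches?
shortfall_in = (METRES_PER_FOOT * 100 - light_ns_cm) / 2.54
print(f"shortfall = {shortfall_in:.3f} in")         # ~0.197 in, close to 3/16 (0.1875)

# And one light-year in kilometres (using the Julian year of 365.25 days):
light_year_km = C_M_PER_S * 365.25 * 86_400 / 1000
print(f"light-year = {light_year_km:,.1f} km")      # 9,460,730,472,580.8 km
```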
QED / QEF
The Power of the Pun!
Circa 1977 in the USAF Electronic Computer Systems Repairman Course (305x4), we kept encountering delay lines without fully understanding their importance. The basic explanation is that they were needed to synchronize the arrival of signals at a gate. A logic gate (eg, AND gate, OR gate, inverter AKA "NOT gate") uses input signals to generate an output signal, but all those input signals had to be present at the same time (and be held there to compensate for propagation delays within the gate circuitry). An example we were given was an AND gate requiring 5 inputs, four of which were generated nearby but the fifth came from a circuit at the other end of a six-foot cabinet, so while four signals were present almost instantaneously, that last signal took at least six nanoseconds to show up. Hence those other signals needed to be delayed by six nanoseconds in order for all five input signals to arrive at that gate at the same time (actually, input levels were not a problem outside of having to be held; it was the pulses that needed to be delayed). As a result, the switching speed of large computers was constrained by the length of the longest conductor (plus other factors, of course).
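The delay-matching arithmetic in that AND-gate example can be sketched in a few lines. The wire lengths and the flat 1 ns-per-foot figure are illustrative assumptions (real wire is a bit slower than the light-foot rule of thumb), not values from the actual course:

```python
# Sketch of the delay-line matching problem described above: a 5-input AND
# gate where one input travels ~6 feet of wire. Assumes signals propagate
# at roughly 1 ns per foot; all values here are illustrative.

NS_PER_FOOT = 1.0   # simplifying assumption: ~1 ns of delay per foot of wire

def arrival_ns(wire_length_ft: float) -> float:
    """Time for a pulse to reach the gate over a wire of the given length."""
    return wire_length_ft * NS_PER_FOOT

# Four inputs generated nearby, one from the far end of a six-foot cabinet:
wire_lengths_ft = [0.5, 0.5, 0.5, 0.5, 6.0]
arrivals = [arrival_ns(ft) for ft in wire_lengths_ft]

# Each "nearby" input needs a delay line so all five pulses arrive together:
latest = max(arrivals)
delay_lines_ns = [latest - t for t in arrivals]
print(delay_lines_ns)   # the four short runs each need 5.5 ns of added delay
```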
We also studied signal waveforms, especially the rise and fall times, which were surprisingly slow (remember, this was 60's/70's discrete-component tech). The designers of those circuits had to take all those delays into account by waiting a little extra in order to ensure that the signal had time to arrive. The technical term for those extra delays was "sloppage factor". And of course having to include those sloppage factors every time you turned around further contributed to slowing down the entire computer system.
Another problem with long conductors was the rapid degradation of the square-wave signals -- digital signals are all square-wave signals since you're switching between two discrete voltage levels. If you use Fourier analysis to look at a square wave's power spectrum (the amplitudes of the waveform's harmonics; eg, for fundamental frequency f, the odd harmonics 3f, 5f, 7f, etc), you will find that a square wave has very strong harmonics. Since conductors and all electronic devices have inherent "stray" capacitance and inductance, just running a conductor creates an unintended filter circuit (as well as inductive coupling between wires creating "cross-talk" -- the longer the conductors the greater the effect). Those unintended filters would filter out whole sections of a square wave's higher harmonics, the ones that give the rise and fall edges their sharpness, thus reducing those signals to crap. That means that if a digital signal has to travel any appreciable distance then it will degrade into meaningless mush -- the rule of thumb for RS-232 data cables was to keep them shorter than 25 feet. And of course, that would increase the need for greater sloppage factors.
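That harmonic structure can be sketched numerically. This is only an illustration under simplifying assumptions (an ideal unit square wave, filtering modelled crudely as truncating the Fourier series), not a model of any particular circuit:

```python
# An ideal square wave of fundamental f contains only odd harmonics
# (f, 3f, 5f, ...) with amplitudes falling off as 1/n. Stripping the
# high harmonics (here modelled by simply truncating the series) is
# what rounds off the sharp rise and fall edges. Pure Python.

import math

def harmonic_amplitude(n: int) -> float:
    """Amplitude of the nth harmonic of a unit square wave (0 for even n)."""
    return 4 / (math.pi * n) if n % 2 == 1 else 0.0

def square_wave(t: float, max_harmonic: int) -> float:
    """Partial Fourier sum of a square wave with a 1 Hz fundamental."""
    return sum(harmonic_amplitude(n) * math.sin(2 * math.pi * n * t)
               for n in range(1, max_harmonic + 1))

def slope_at_edge(max_harmonic: int, dt: float = 1e-6) -> float:
    """Steepness of the waveform at its zero crossing -- edge 'sharpness'."""
    return (square_wave(dt, max_harmonic) - square_wave(-dt, max_harmonic)) / (2 * dt)

print(slope_at_edge(5))     # only f, 3f, 5f survive: a gentle, rounded edge
print(slope_at_edge(101))   # many harmonics: a much steeper, sharper edge
```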
A large part of my father's enjoyment of being a general contractor was meeting clients from other professions and learning about their work experience. When he did a home remodel for a data processing supervisor, he was given a tour of the data center (circa 1972). For every piece of equipment, he was told which mental hospital the designer had ended up in. The story he was given was that because of the mental exertion needed to get all those signals synchronized exactly right, the designers ended up having nervous breakdowns. What I learned in tech school confirmed Dad's story.
Shortening those distances, taken almost to its extreme by integrated circuits, has contributed enormously to switching speed and signal quality. Ever since tech school, as I have watched computer tech become ever faster, disk density ever more compact, and systems ever more reliable, knowing the inherent problems I am still amazed that any of it works at all, let alone moves such immense amounts of data so reliably.
Now back to the Cray-1S. Around 1981 at the University of North Dakota, I went on a Computer Science Dept field trip. In Minneapolis we first visited Sperry Univac and the offices of Cray Research, then we traveled to Cray's manufacturing facility in Chippewa Falls, where they had just completed a Cray-1S for delivery to Japan.
As I already described, viewed from above the computer looks like a "C". Our guide asked us why that was and one student offered, "Because 'C' for 'Cray'?" The guide chuckled, saying that he hadn't thought of that. Rather, it was because they needed to keep all interconnecting wires as short as possible in order to increase the speed of the computer (along with other measures, such as their choice of very fast IC chips). He told us that there was not a single wire in that computer longer than two feet, which is to say with a delay longer than two nanoseconds. And when they still needed a delay line for a signal, they would simply lay a little extra trace on the circuit board, like a zig-zag.
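The guide's two-feet / two-nanoseconds equivalence can be checked with the same light-foot arithmetic. The flat speed figure and the sample velocity factor below are rules of thumb for illustration, not Cray specifications:

```python
# Rough numbers behind the guide's claim: with no wire over two feet and
# signals covering roughly a foot per nanosecond, no interconnect adds
# much more than ~2 ns. A needed delay can be bought as extra trace
# length (the zig-zag). Velocity factors here are illustrative only.

C_FT_PER_NS = 0.9836   # light in vacuum travels ~0.98 ft per nanosecond

def wire_delay_ns(length_ft: float, velocity_factor: float = 1.0) -> float:
    """Propagation delay over a wire; velocity_factor < 1 for real media."""
    return length_ft / (C_FT_PER_NS * velocity_factor)

def zigzag_length_ft(needed_delay_ns: float, velocity_factor: float = 1.0) -> float:
    """Extra trace length needed to delay a signal by the given amount."""
    return needed_delay_ns * C_FT_PER_NS * velocity_factor

print(wire_delay_ns(2.0))          # worst-case two-foot wire: ~2 ns
print(wire_delay_ns(2.0, 0.66))    # ~3 ns if the medium slows signals to 0.66c
```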