Cyborgs

Part 3 of 5 in the series Biotech

Science has given us a whole new toolbox for tinkering with life, and we have the power to modify animals in profound new ways. — Frankenstein’s Cat (2013)

Claude Shannon

“Claude Elwood Shannon is considered as the founding father of [the] electronic communications age,” writes New York University. “While working at Bell Laboratories, he formulated a theory explaining the communication of information and worked on the problem of most efficiently transmitting information”:

Shannon joined Bell Telephone Laboratories as a research mathematician in 1941. He worked on the problem of most efficiently transmitting information. Soon he discovered the similarity between Boolean algebra and telephone switching circuits. By 1948, Shannon turned his efforts toward a fundamental understanding of the problem and had evolved a method of expressing information in quantitative form. The fundamental unit of information is a yes-no situation.i Either something is or is not. This can be easily expressed in Boolean two-value binary algebra by 1 and 0, so that 1 means “on” when the switch is closed and the power is on, and 0 means “off” when the switch is open and power is off. Under these circumstances, 1 and 0 are binary digits, a phrase that can be shortened to “bits.” Thus the unit of information is the bit. More complicated information can be viewed as built up out of combinations of bits. For example, the game of “twenty questions” shows how quite complicated objects can be identified in twenty bits or less, using the rules of the game. Also, something much more elaborate, such as what is seen by the human eye, can be measured in bits, since each cell of the retina might be viewed as recording “light” or “dark” (“yes” or “no”), and it is the combination of these yes-no situations that makes up the complete picture.1
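
The arithmetic behind the “twenty questions” remark is easy to check: each yes/no answer supplies one bit, so twenty answers can distinguish up to 2^20, roughly a million, objects. A minimal sketch in Python (the variable names are ours, not the source’s):

import math

# Each yes/no answer supplies one bit, so n questions can single out at most 2**n objects.
questions = 20
print(2 ** questions)  # 1048576 distinguishable objects

# Conversely, picking one object out of N equally likely candidates needs about
# log2(N) bits, rounded up to whole questions.
candidates = 1_000_000
print(math.ceil(math.log2(candidates)))  # 20 questions are enough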

“In a landmark paper written at Bell Labs in 1948, Shannon defined in mathematical terms what information is and how it can be transmitted in the face of noise,” notes Scientific American:

Shannon defined the quantity of information produced by a source – for example, the quantity in a message – by a formula similar to the equation that defines thermodynamic entropy in physics. In its most basic terms, Shannon’s informational entropy is the number of binary digits required to encode a message. Today that sounds like a simple, even obvious way to define how much information is in a message.…

As well as defining information, Shannon analyzed the ability to send information through a communications channel. He found that a channel had a certain maximum transmission rate that could not be exceeded. Today we call that the bandwidth of the channel. Shannon demonstrated mathematically that even in a noisy channel with a low bandwidth, essentially perfect, error-free communication could be achieved by keeping the transmission rate within the channel’s bandwidth and by using error-correcting schemes: the transmission of additional bits that would enable the data to be extracted from the noise-ridden signal. Today everything from modems to music CDs relies on error-correction to function.2
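
The “number of binary digits required to encode a message” has a standard formula, H = -Σ p·log2(p), summed over the probabilities of the source’s symbols. The expression itself is not spelled out in the article above, and the coin probabilities below are our own toy example:

import math

def shannon_entropy(probabilities):
    # H = -sum(p * log2(p)): the average number of bits per symbol needed
    # to encode a source whose symbols occur with these probabilities.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit per toss: a fair coin is maximally unpredictable
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a heavily biased coin carries less information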

New York University goes on to explain that “the basic elements of any general communications system include”

  1. a source of information which is a transmitting device that transforms the information or “message” into a form suitable for transmission by a particular means.
  2. the means or channel over which the message is transmitted.
  3. a receiving device which decodes the message back into some approximation of its original form.
  4. the destination or intended recipient of the message.
  5. a source of noise (i.e., interference or distortion) which changes the message in unpredictable ways during transmission.

It is important to note that “information” as understood in information theory has nothing to do with any inherent meaning in a message. It is rather a degree of order, or nonrandomness, that can be measured and treated mathematically much as mass or energy or other physical quantities are.… To a large extent the techniques used with information theory are drawn from the mathematical science of probability.3
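
The five elements above, together with the error-correcting idea quoted from Scientific American, can be put into a toy pipeline. The sketch below is our illustration only; the 5% bit-flip rate and the crude three-fold repetition code are arbitrary choices, far simpler than the codes real modems or CDs use. A source message is padded with redundant bits, sent through a simulated noisy channel, and majority-voted back into an approximation of its original form:

import random

def encode(bits):
    # Transmitter: repeat each source bit three times; the copies are the
    # "additional bits" that let the receiver out-vote channel noise.
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob=0.05):
    # Channel plus noise source: each transmitted bit may be flipped unpredictably.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    # Receiver: a majority vote over each group of three bits recovers an
    # approximation of the original message for the destination.
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

source = [random.randint(0, 1) for _ in range(1000)]     # source of information
delivered = decode(noisy_channel(encode(source)))        # what the destination sees
errors = sum(s != d for s, d in zip(source, delivered))
print(errors, "residual errors in", len(source), "bits")  # typically a handful, not ~50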

Robert I. Watson, Sr., and Rand B. Evans explain in The Great Psychologists: A History of Psychological Thought, 5th ed., that Information Theory “arose from communications research. Just as with stimulus-response psychology, the”

inputs and outputs of a communication system, it soon became apparent, could not be dealt with exclusively in terms of the nature of these inputs and outputs alone nor even in terms of such internal characteristics as channel capacity and noise. The coding and recoding of inputs — how incoming signals are sorted and organized — turns out to be the important secret of the black box that lies athwart the communication channel [Jerome Bruner, In Search of Mind: Essays in Autobiography (New York: Harper and Row, 1983), pp. vii-viii].

Norbert Wiener

It was from this background of communications research that information processing theory originated. In 1948 Norbert Wiener at MIT coined the word cybernetics in his book Cybernetics: Or Control and Communication in the Animal and the Machine. The notion of feedback mechanisms was particularly influential in the later cognitive science.4

“The very word ‘cybernetics’ is a useful clue to the central meaning of the electronic revolution,” notes Arthur Porter in Cybernetics Simplified:

The speed-up of information movement creates an environment of “information overload” that demands pattern recognition for human survival. It was natural, therefore, for the first explorers of this field to use a term from navigation.5

“As happens so often to scientists, we have been forced to coin at least one artificial neo-Greek expression,” writes Norbert Wiener in Cybernetics:

We have decided to call the entire field of control and communication theory, whether in the machine or in the animal, by the name Cybernetics which we form from the Greek κυβερνήτης or steersman. In choosing this term, we wish to recognize that the first significant paper on feed-back mechanisms is an article on governors, which was published by Clerk Maxwell in 1868, and that governor is derived from a Latin corruption of κυβερνήτης. We also wish to refer to the fact that the steering engines of a ship are indeed one of the earliest and best developed forms of feed-back mechanisms.

Although the term cybernetics does not date further back than the summer of 1947, we shall find it convenient to use in referencing earlier epochs of the development of the field.6

“Cybernetics is the science of control and communications, with special reference to self-controlling or adaptive systems,” notes F.H. George in Cybernetics:

It does not draw an absolute distinction between organisms and inanimate, or man-made, systems in this context, since either can be self-controlling and adaptive in behavior.7 In practice, cybernetics cuts across the so-called established sciences such as physics, chemistry, zoology, etc. by abstracting those common features which contribute to an integrated theory of control and communication.… The theoretical properties of the more mathematical features of cybernetics are embodied in fields known as automata theory, recursive function theory, Turing machines (part of automata theory), and metamathematics generally.8 Cybernetics tells us that all control and classification systems, all communication systems, have certain common characteristics, which allow us to describe them in terms of their feedback and control mechanisms.9

Wiener explains:

It has long been clear to me that the modern ultra-rapid computing machine was in principle an ideal central nervous system to an apparatus for automatic control; and that its input and output need not be in the form of numbers or diagrams, but might very well be, respectively, the readings of artificial sense-organs such as photo-electric cells or thermometers, and the performance of motors and solenoids. With the aid of strain-gauges or similar agencies to read the performance of these motor organs and to report, to “feed back,” to the central control system as an artificial kinaesthetic sense, we are already in a position to construct artificial machines of almost any degree of elaborateness of performance.10
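
Wiener’s loop, a sensor reading fed back to a central controller that drives a motor, can be sketched in a few lines. The thermostat-style numbers below (target temperature, proportional gain, heat-loss rate) are invented purely for illustration:

def thermostat(target=21.0, temperature=15.0, gain=0.4, steps=15):
    # Each step the "artificial sense-organ" (a thermometer) reports the temperature,
    # the controller compares it with the target, and the error drives the "motor organ"
    # (a heater); the loop is closed because the heater's effect is sensed next step.
    for step in range(steps):
        error = target - temperature                        # the fed-back signal
        heater = gain * error                               # proportional control action
        temperature += heater - 0.1 * (temperature - 15.0)  # room warms, loses heat outdoors
        print(f"step {step:2d}: {temperature:5.2f} degrees")

thermostat()

Because the control here is purely proportional, the simulated room settles slightly below the target; that steady-state offset is itself a standard observation in feedback theory.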

Those of us who have contributed to the new science of cybernetics thus stand in a moral position which is, to say the least, not very comfortable. We have contributed to the initiation of a new science which, as I have said, embraces technical developments with great possibilities for good and for evil. We can only hand it over into a world that exists about us, and this is the world of Belsen [Nazi concentration camps] and Hiroshima [nuclear bomb targets]. We do not even have the choice of suppressing these new technical developments. They belong to the age, and the most any of us can do by suppression is to put the development of the subject into the hands of the most irresponsible and most venal of our engineers. The best we can do is to see that a large public understands the trend and the bearing of the present work, and to confine our personal efforts to those fields, such as physiology and psychology, most remote from war and exploitation.11

“There can be no turning back,” notes George, “and the only way in which we can be successful is for more and more people to have a more and more complete understanding of what is virtually a science of sciences: cybernetics.” 12


i The HowStuffWorks.com website notes that today’s computers “work by manipulating bits that exist in one of two states: a 0 or a 1,” but that quantum computers “aren’t limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition.”

Qubits represent atoms, ions, photons or electrons and their respective control devices that are working together to act as computer memory and a processor. Because a quantum computer can contain these multiple states simultaneously, it has the potential to be millions of times more powerful than today’s most powerful supercomputers.

– Kevin Bonsor and Jonathan Strickland, “How Quantum Computers Work,” HowStuffWorks.com, at http://computer.howstuffworks.com/quantum-computer.htm (retrieved: 29 October 2013); see also Kevin J. Crosby, “Qubits,” SkewsMe.com, at http://skewsme.com/blog/2013/10/qubit/ (retrieved: 11 March 2015).
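
As a rough sketch of what superposition means here (a toy model of our own, not of real quantum hardware), a qubit can be written as a pair of amplitudes whose squared magnitudes give the probabilities of reading out 0 or 1:

import math
import random

# A single qubit can be written as a pair of amplitudes (alpha, beta) with
# abs(alpha)**2 + abs(beta)**2 == 1; here, an equal superposition of 0 and 1.
alpha = beta = 1 / math.sqrt(2)

def measure():
    # Measurement yields 0 or 1 with probabilities abs(alpha)**2 and abs(beta)**2.
    return 0 if random.random() < abs(alpha) ** 2 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # roughly [5000, 5000]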


1 “Claude Shannon,” New York University, at http://www.nyu.edu/pages/linguistics/courses/v610003/shan.html (retrieved: 5 March 2015).

2 Graham P. Collins, “Claude E. Shannon: Founder of Information Theory,” Scientific American, 14 October 2002, at http://www.scientificamerican.com/article/claude-e-shannon-founder/ (retrieved: 10 March 2015).

3 “Claude Shannon,” New York University.

4 Robert I. Watson, Sr., and Rand B. Evans, The Great Psychologists: A History of Psychological Thought, 5th ed. (New York: HarperCollins Publ., Inc., 1991), p. 623.

5 Arthur Porter, Cybernetics Simplified (New York, NY: Barnes & Noble, Inc., 1969), p. v.

6 Norbert Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine (New York, NY: The Technology Press, 1949, 1948), p. 19.

7 F.H. George, Cybernetics (London: Teach Yourself Books, 1971), p. 3.

8 Ibid., p. 4.

9 Ibid., p. 128.

10 Wiener, p. 36.

11 Ibid., pp. 38-39.

12 George, p. 189.


