Showing posts with label computers. Show all posts

Friday, January 28, 2011

The Origins of Artificial Intelligence in English Romantic Poetry

by Len Hart, The Existentialist Cowboy

Ada Byron, daughter of the English Romantic poet Lord Byron, is arguably the world's first computer programmer. She is otherwise known to history as Lady Lovelace. The case on her behalf is so strong that the U.S. Department of Defense commissioned a programming language in her honor: Ada. Decades later, Ada 95 would become one of the first internationally standardized object-oriented languages, joining a family now dominated by C++, Java, etc. But that gets ahead of the story by some one hundred years or more.

Ada Byron was born in 1815. Separated from Lord Byron, Lady Byron brought up her daughter fearful that Ada might suffer Romanticist, "poetical" tendencies as did her father. Her education, therefore, consisted of mathematics and science. Predictably, Ada's understanding of mathematics, though profound, was nonetheless influenced by simile, analogy, and metaphor.

It is not surprising that Ada was fascinated and intrigued when, in 1834, she encountered Charles Babbage's idea for a "calculating machine", about which Babbage offered a daring conjecture: a machine acting upon foresight.
Most of what we know of Babbage's Analytical Engine we have learned from Ada Byron. Inspired by the "universality" of Babbage's ideas, Ada wrote notes on the Engine that ran longer than the paper she was annotating; indeed, she left us more on the subject than Babbage himself ever published. Her fascination with Babbage's "engine" is noteworthy for at least two outstanding developments.

It was Ada, inspired by Babbage's calculating machine, who first articulated, if not invented, the very concept of "software" –a set of instructions to be carried out by a "universal" machine – a machine capable of acting meaningfully upon those instructions. The obvious progeny of this concept is the multitude of software packages that now drive everything from desktops to mainframes.

Of even greater interest to physicists and cosmologists is that Ada's ideas and hopes for the 'computing machine' led inexorably to Claude Shannon's concept of information as the inverse of entropy, a western version of the yin and yang. Entropy is associated with the Second Law of Thermodynamics, a general principle constraining the 'direction' of heat transfer; in the vernacular: things run down. Chaos increases. Organization becomes disorganization and disorder. Hot things cool down in the absence of new infusions of energy. Eventually all movement ceases entirely. Some have called it the 'heat death' of the Universe, a final and eternal 'thermodynamic state' in which there no longer exists sufficient 'free energy' to sustain motion or life.
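Shannon would make that inverse relation precise: information, measured in bits, is the uncertainty a message removes. His measure can be sketched in a few lines of Python (the function and the example probabilities here are mine, for illustration, not drawn from the sources above):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Maximum disorder (all outcomes equally likely) maximizes H."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 -- a fair coin toss carries one bit
print(entropy([0.25] * 4))   # 2.0 -- four equally likely outcomes, two bits
```

The more uniformly 'run down' the distribution, the higher the entropy; a certain outcome carries no information at all.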

Shannon then spent 31 years at Bell Labs, starting in 1941. Among the many things Shannon worked on there, one great conceptual leap stands out. In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal; it was reissued as a book the following year with an introductory essay by Warren Weaver. This surprisingly readable (for a technical paper) document is the basis for what we now call information theory--a field that has made all modern electronic communications possible, and could lead to astounding insights about the physical world and ourselves.

Names like Einstein, Heisenberg, or even Kurt Goedel are better known among the people who have led twentieth-century science into the limits of knowledge, but Shannon's information theory makes him at least as important as they were, if not as famous. Yet when he started out, his simple goal was just to find a way to clear up noisy telephone connections.

--Heroes of Cyberspace; Claude Shannon
Shannon wrote A Mathematical Theory of Communication [PDF] for Bell Labs in 1948 --more than a hundred years after Ada wrote what is considered to be the world's first computer program, a plan that she shared with Babbage: a method by which his Analytical Engine might calculate the Bernoulli numbers.
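Ada's famous Note G laid out, step by step, how the Engine could produce the Bernoulli numbers. A modern sketch of that same computation, using the standard recurrence rather than her exact tabular method, might read:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0 .. B_n as exact fractions,
    via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Solve the recurrence for B_m in terms of B_0 .. B_{m-1}
        B[m] = -Fraction(1, m + 1) * sum(comb(m + 1, k) * B[k] for k in range(m))
    return B

print([str(b) for b in bernoulli(6)])  # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```

Exact fractions matter here: Ada's Engine, like this sketch, worked with exact arithmetic, and the Bernoulli numbers grow in ways floating point would quickly mangle.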

The second development began the debate about Artificial Intelligence. In his paper Computing Machinery and Intelligence, Alan Turing devoted several paragraphs to "Lady Lovelace's Objection" to the very concept of A.I., a concept which Ada discounted in her notes:
The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to [do]. It can follow analysis; but it has no power of anticipating any...truths. Its province is to assist us in making available what we are already acquainted with.
Some 100 years later, Turing ascribed to her a cautious "logical positivism":
It will be noticed that he [ D. R. Hartree ] does not assert that the machines in question had not got the property [ A.I.], but rather that the evidence available to Lady Lovelace did not encourage her to believe that they had it.
The point is not whether Lady Lovelace or Turing is correct with regard to A.I., but rather that it was Ada who foresaw the playing field and terrain, the scope of the debate. It was Turing, however, who defined 'artificial intelligence'. Turing said that we may consider a machine intelligent if, in a blind test, we cannot differentiate the computer's responses from those of a living, breathing person. By that standard, some computers already pass for intelligent. IBM's 'Deep Blue' defeated chess champion Garry Kasparov, who charged that human beings had directed the machine. I often share that attitude toward my own computer's chess program. It actually seems to learn from its mistakes.

Ada understood that the meaning of a machine is what it does. Her contribution was to see that this meaning may be shaped by what we now call 'programmers', who literally instruct the machine, providing it a well-planned list of discrete tasks, a program --in other words: software.

Computer technologists speak of "state". Each state is particular, analogous in some way to a particular task that may be accomplished while in that "state." Before computers, machines may have processed information, but in a crude way: the instructions were built in. Consider the lever, a simple machine that nevertheless may be said to have two "states", up or down. The meaning of either state is to be found in the work done by (or in) that state. As a general principle, the meaning of a given state is the utility it creates, the work that it does.

In Ada's wake, early computers were mere assemblages of electrically controlled "levers" called "relays." The origin of binary languages may be found in relays, which are either 'on' or 'off'. Mechanical relays would be replaced by vacuum tubes and, later, by transistors. Simple circuits were called "and-gates" or "or-gates" or, when wired to hold a bit, 'flip/flop' circuits. Even now, the largest supercomputers, reduced to their smallest components, are capable of processing just two states: 0 and 1. But upon this basic alphabet, patterns of increasing complexity have grown exponentially. The computer has become an electronic loom in which each pattern represents a "state."
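The 'flip/flop' idea, a circuit that remembers which state it was last put in, can itself be simulated in a few lines of Python. This sketch (the gate and function names are my own) settles a cross-coupled NOR latch, the textbook set/reset memory cell:

```python
def nor(a, b):
    """A NOR gate: output 1 only when both inputs are 0."""
    return 1 - (a | b)

def sr_latch(set_bit, reset_bit, q=0):
    """Settle a cross-coupled NOR latch and return the stored bit q.
    With both inputs 0, the latch simply remembers its last state."""
    qb = 1 - q                       # the complementary output
    for _ in range(4):               # a few passes suffice to settle
        q, qb = nor(reset_bit, qb), nor(set_bit, q)
    return q

q = sr_latch(1, 0)        # "set"   -> q becomes 1
q = sr_latch(0, 0, q)     # "hold"  -> q stays 1: one remembered bit
q = sr_latch(0, 1, q)     # "reset" -> q returns to 0
```

One stored bit, two states, and nothing but levers underneath: the whole loom is woven from cells like this one.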

That Ada glimpsed this future almost a century and a half ago is remarkable.


Thursday, September 04, 2008

The Four Horsemen of Digital Apocalypse

by Len Hart, The Existentialist Cowboy

I have just survived the worst and most insidious 'virus/trojan' attack since I got my first PC. Despite the fact that I was loaded to the gills with anti-malware, this attacker ran the gauntlet, survived it, set up shop and had the nerve to threaten to shut me down. The ensuing epic struggle was a 'virtual' Armageddon, a final, desperate battle between me and the Four Horsemen of Digital Apocalypse.

The offending malware pretended to be a Windows alert. It informed me that my computer was 'unprotected'. It demanded that I 'download' their latest whatever. To do so, I was required to click on an 'I AGREE' button. I could not click that button in good conscience. I rarely agree with anything --let alone a 'button', let alone a button that was daring to 'threaten' me.

Never, ever click on a button unless you know where it comes from and trust it. Likewise, suspect any prompt that does not give you an 'I DECLINE' option.

Given the deceptive nature of these programs, I am suspicious of any label given to a button. Many are indistinguishable from legitimate Windows buttons. In this case, no alternatives were given me: no 'X' in a corner; no 'decline' option; no 'maybe later'; no way to get out! Even CTRL-ALT-DELETE had apparently been disabled. There seemed to be no way to get this piece of crap off my screen and out of my computer.

In the conditional logic of this rudimentary program, you are simply not given an exit from the loop. It is this fact that gives the game away. Only a crooked or incompetent (or both) programmer would write such a conundrum. It was deliberate. It was crooked. It was intended to wrest from me control over my computer.

I found in this experience a parable and a rule of thumb: any proposition (save life itself) is an evil bargain if there is no escape save death. From a programmer's standpoint, the 'logic' of this code was a very simple matter of omitting a critical 'else' or 'else if' branch here and there. If you click 'agree', you allow this 'trojan' free rein over your computer. But there is no 'decline' or 'later' button of any kind. Doing nothing at all is, likewise, not an option. Your computer is effectively lost to you.
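From that standpoint, the whole trap reduces to one missing branch. A hypothetical sketch in Python (the function and message names are invented for illustration, not taken from the actual malware):

```python
def trojan_prompt(response):
    """The trap: 'agree' installs the payload; every other input
    falls through to the same prompt again. No exit branch exists."""
    if response == "agree":
        return "payload installed"
    return "prompt shown again"      # the omitted 'else' that should close the dialog

def honest_prompt(response):
    """What a legitimate dialog provides: a way out."""
    if response == "agree":
        return "installed"
    elif response == "decline":      # the branch the trojan deliberately omits
        return "dialog closed"
    else:
        return "ask again later"
```

The difference between the two is exactly one `elif`: the conditional logic of an honest bargain always includes an exit.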

This 'spammer' --whom I suspect resides in either China or Florida --has much in common with the mob. It's an offer you cannot refuse. It might have been worse. At least, I didn't awaken in bed with a dead horse. "Offers that cannot be refused" are commonly used not only by spammers but by politicians and, most insidiously, religions. The Christian religion, primarily, puts a 'box' on the computer screen of your mind (or soul) with a choice: hit the 'accept' button or lose the use of your 'computer', in this case, your 'soul'.

Elsewhere in Christianity, however, it is said that such a choice must be made freely. But a coerced choice is anything BUT free! Thus --Christian theology --by enforcing a decision through blackmail --has 'spammed' your mind, denied you free will. By definition, there is no free will without choices. Christianity, by violating its own principles, nullifies itself as valid religion, philosophy, or moral guide. As a theology, as a philosophy, as a 'program', it is, therefore, fallacious and, perhaps, deliberately misleading. Any 'formal system' in violation of its own premises is false! I daresay most religions are of this form. Most, if not all, theology is false.

Decisions made under threat of death are not morally valid. Even 'confessions' made under the threat of death are not admissible in court. In the case of the Christian religion, the threat is not merely one of death, but after death an everlasting hell fire the existence of which no one can prove. No wonder much of the history of the human race is a record of bad decisions and equally bad consequences. Though I am no expert on the fine points of every religion on earth, I will venture this: of the world's major religions, only Buddhism seems free of indefensible and unsupportable dogma. I have yet to hear anyone threaten me with eternal hell fire should I decide not to follow the 'path' of the Buddha.

Of a slightly different logical structure was the 'indulgence' scam perpetrated by the Catholic Church, nothing more than Pope Leo X's scheme to raise enough money to pay off his debts, build the new St. Peter's, finance his orgies, underwrite the art. Life is always a mixed bag. Amid the waste, evil, and debauchery, the monies raised by this criminal fraud paid for the lasting works of Michelangelo and Raphael.

Historically, 'religion' purports to explain everything. Those observed phenomena not explained by science or common sense are 'explained' theologically, in terms of the 'supernatural'. But a 'supernatural' explanation is, in fact, no explanation at all. Something is unexplainable if it cannot be explained in terms of 'natural' phenomena. There are, therefore, by definition, no 'super natural' explanations, only natural ones. Thus religion is tautological in its inception. Scientific explanations are only 'natural', by definition, not 'super'! Put yet another way: 'supernatural' is an oxymoron.

Interestingly, there is salvation of sorts to be found in the limits of logic itself. Rudy Rucker, a mathematician gifted with a redeeming sense of humor, wrote of Kurt Gödel's Incompleteness Theorem that it was '... so simple, and so sneaky, that it is almost embarrassing to relate.'
His basic procedure is as follows:
  1. Someone introduces Gödel to a UTM, a machine that is supposed to be a Universal Truth Machine, capable of correctly answering any question at all.
  2. Gödel asks for the program and the circuit design of the UTM. The program may be complicated, but it can only be finitely long. Call the program P(UTM) for Program of the Universal Truth Machine.
  3. Smiling a little, Gödel writes out the following sentence: "The machine constructed on the basis of the program P(UTM) will never say that this sentence is true." Call this sentence G for Gödel. Note that G is equivalent to: "UTM will never say G is true."
  4. Now Gödel laughs his high laugh and asks UTM whether G is true or not.
  5. If UTM says G is true, then "UTM will never say G is true" is false. If "UTM will never say G is true" is false, then G is false (since G = "UTM will never say G is true"). So if UTM says G is true, then G is in fact false, and UTM has made a false statement. So UTM will never say that G is true, since UTM makes only true statements.
  6. We have established that UTM will never say G is true. So "UTM will never say G is true" is in fact a true statement. So G is true (since G = "UTM will never say G is true").
  7. "I know a truth that UTM can never utter," Gödel says. "I know that G is true. UTM is not truly universal."

    With his great mathematical and logical genius, Gödel was able to find a way (for any given P(UTM)) actually to write down a complicated polynomial equation that has a solution if and only if G is true. So G is not at all some vague or non-mathematical sentence. G is a specific mathematical problem that we know the answer to, even though UTM does not! So UTM does not, and cannot, embody a best and final theory of mathematics ..

    Although this theorem can be stated and proved in a rigorously mathematical way, what it seems to say is that rational thought can never penetrate to the final ultimate truth ... But, paradoxically, to understand Gödel's proof is to find a sort of liberation. For many logic students, the final breakthrough to full understanding of the Incompleteness Theorem is practically a conversion experience. This is partly a by-product of the potent mystique Gödel's name carries. But, more profoundly, to understand the essentially labyrinthine nature of the castle is, somehow, to be free of it.

    --Rudy Rucker, Infinity and the Mind
The work of Alan Turing, proving that certain propositions in a closed logical system cannot be decided within that system, is, of course, a corollary to Kurt Gödel's famous proof. Both have had enormous consequences in academia, computing and philosophy. It is hoped that one day the impact of this work will be felt in the field of politics.
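Rucker's seven steps can even be acted out in code. The following toy is a sketch, not a proof, and every name in it is invented here; it hands any claimed truth-decider a sentence that quotes the decider itself:

```python
def goedelize(utm):
    """Given a claimed truth machine utm(sentence) -> True/False,
    construct G: 'utm will never say that G is true'."""
    def g():
        return not utm(g)    # G is true exactly when utm does not affirm G
    return g

# A cautious UTM that affirms nothing is trumped: G is true, yet unsaid.
g = goedelize(lambda sentence: False)
print(g())    # True

# A reckless UTM that affirms everything has affirmed a falsehood.
g = goedelize(lambda sentence: True)
print(g())    # False
```

Either way the machine fails: it stays silent about a truth, or it utters a falsehood. That is steps 5 and 6 of Rucker's procedure in miniature.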

Both Gödel and Turing were concerned with the inherent limits of any formal system. The question raised is this: if a single trojan could very nearly take over my computer, might a much better, wider and highly co-ordinated attack seize the internet itself? Is our salvation to be found in Gödel and/or Turing? What lesson is learned from the defeat of chess genius Garry Kasparov by IBM's 'Deep Blue'?
Chess is a game of guile and strategy. Chess means putting your emotional engines out of sight and choosing moves with cold calculation. In the end, Kasparov's cool cracked. He angrily resigned -- charging, at first, that IBM had let a human call the moves. I doubt anything of the kind, just because the computer's eventual victory was predictable.

Two generations ago, Alan Turing gave us an important thought model for all this. Turing said, suppose you go into a room with a keyboard and a monitor. You type in questions and receive answers. Then you try to determine whether the answers are being given by a human or by a machine. Ever since then, we've said that a computer which can't be told from a human passes the Turing test.

Most of us have assumed that no one could ever create a Turing Machine because that veers close to creating sentient intelligence. Here the argument over Deep Blue heats up because of Kasparov's initial belief that he was dealing with humans. Deep Blue really did pass the Turing Test as far as Kasparov was concerned.

That's why I think this strange little chess game was significant -- not because the outcome was a surprise, but because Kasparov thought Deep Blue might be human.

--Dr. John Lienhard: Kasparov and Deep Blue
It does not follow, however, because Deep Blue defeated Kasparov, that computers will eventually render the human being obsolete, or that there may eventually be no defense against a computer-generated 'virus' or trojan by which a 'federation' of networked computers will assume complete and total dictatorial control and thus rule the world.

I love Gödel's proof and, as well, the related work of Alan Turing. The very language of computers is like that of human beings: 'flawed' or, more precisely, incomplete. We need never fear computers taking over. But it is not only because Gödel's Proof is 'logical' that it is compelling. It is positively liberating. 'Incompleteness', itself, is liberating. 'Incompleteness' should be celebrated. In 'incompleteness' is our salvation from a dictatorship imposed upon us by computers and/or inflexible systems.

Whenever a fundie, or a Nazi, or a Republican comes peddling an all-embracing system or purports to have all the answers, a complete and unassailable ideology, system or Weltanschauung, or tries to blackmail me into swallowing it, I have Gödel's proof that NO system is complete, that NO one person or organization has all the answers.

There is no holy writ!

There is no voice high or low that will replace my own conscience and my own abilities to work out the truth as best I can. There is no way that I may be blackmailed with offers I dare not refuse for fear of either hell fire or, worse, the loss of my computer forever!

There is no blackmailing me into any system or cult that, like the GOP, presumes to have all the answers but is, in fact, wrong about everything! I am a free man! I make my own choices and live with the consequences.

I am immune to coercion. No one living can make meaningful statements (let alone truthful ones) about an afterlife that may or may not exist. Therefore, I choose to base the only life of which I have knowledge upon matters about which meaningful knowledge is, at least in theory, obtainable. It is nonsensical to base my life upon the expectation of an afterlife about which no meaningful statement can be made whatsoever.
"Why Richard, it profits a man nothing to lose his soul for the whole world... but for Wales?"

--Character of St. Thomas More, A Man for All Seasons, by Robert Bolt
I submit that it profits one nothing to compromise his/her integrity in this life in the expectation of rewards in a hypothetical afterlife the existence of which may not be known and about which nothing meaningful may be said.

For those interested in avoiding 'Armageddon' with trojans:
As that very real war goes on in the Middle East, back here at home we continue to wage a virtual war against a different kind of spam. And of course, it's not just in the U.S. Just a few days ago, the French government announced a new project by which Internet users could alert their ISPs when they receive spam messages:
http://www.wxpnews.com/Q85JLJ/071009-Signal-spam

And we're hearing that Japanese users are getting an increasing amount of spam mail from Chinese servers, most of it advertising online dating services and adult-oriented web sites:
http://www.wxpnews.com/Q85JLJ/071009-Spam-in-Japan

Spam web sites are causing trouble for Google, as many of these sites are coming up in search results and some of them are downloading malware onto users' computers when they follow the links.

Meanwhile, the Securities and Exchange Commission (SEC) is cracking down on the recent deluge of "pump and dump" spam messages that attempt to inflate the prices of stocks issued by small companies. Recently this has become the second largest category of spam, with as many as 100 million of these messages being sent every week, many of them in the form of PDF attachments. The SEC has reported a 30 percent drop, however, since they initiated an aggressive program that includes freezing the trading on some of these companies:

http://www.wxpnews.com/Q85JLJ/071009-Crackdown

Another, more malicious variety of spam that has popped up in the last few months exploits the popularity of social networking. These messages claim to be from an "old school friend" or a "childhood friend" and contain a link that's supposedly for the sender's MySpace (or other social networking) homepage. However, clicking the link takes you to a site that downloads a Trojan which can gather personal information such as account numbers and passwords and send them back to the spammer:

http://www.wxpnews.com/Q85JLJ/071009-Social-spam

Spammers and email scammers are great at taking advantage of whatever's currently in the news and trends in public opinion. Shortly after September 11, there was a spate of spam messages appealing to Americans' patriotic feelings. As the public tide turned, we now see spam messages that hook into anti-war sentiments. The recent downturn in the housing market and the subprime loan scandals have resulted in a new flood of spam messages pertaining to home financing.

You might even be a spammer yourself and not know it. Thousands of computers are infected with malicious software called 'bots that turn them into "zombies" that can be controlled by spammers and used to send spam messages (and hide the true origins of the spam).

--War Against Spammers Goes On


Friday, April 13, 2007

Surviving the Epilogue: A Farewell to Vonnegut

The world remembers Kurt Vonnegut, whose passing this week is a painful reminder that the world my generation tried to create may have been stillborn with the dawning of this new millennium. Even as a rabid, paranoid GOP tried to impeach the best President in a generation, there were, amid progress in Palestine, real hopes for lasting peace.

It was a time when Bush had not yet stolen the American presidency. Paul Gigot of the Wall St. Journal had not yet gloated of a GOP coup d'etat. Where are those hopes now? Did my generation fail its ideals? Do those hopes lie bleached on the deserts of Iraq, or awash in Gigot's amoral cynicism and his utter lack of intellectual integrity? As Vonnegut himself asks in the following video: is the story over?

Harvey Wasserman of the Free Press wrote "...let's not forget one of the great engines driving this wonderful man - he hated war." Most recently, Vonnegut hated the war in Iraq and the men who planned it and started it. Those men survive to threaten our future, to start another war. Meanwhile, a lonely voice of sanity is gone. Vonnegut is already missed.


This plot has a long back-story - some 1000 years. In 1066, William, Duke of Normandy, crossed the English Channel, conquered the Anglo-Saxon kingdom of Harold, and asserted his claim to the English throne. Though he ruled by force and forts, William stayed, imposing his rule upon an unwilling population that absorbed his native French but would not cease to be "English". Incidentally, it is almost possible to date the assimilation of a French word by its English pronunciation. "Beef", for example, most certainly dates back to the invasion. "Rendezvous", little changed from the original French, was not assimilated until much, much later; I wouldn't want to hazard a guess.

In a little more than two centuries, Norman "Kings" would be referred to as "English" and would assert their right to rule over Normandy - now thought of as "foreign". Still, William's crossing of the English Channel, a feat unequaled for nearly nine hundred years, has become a bookend for a millennium only recently ended.

It seems like yesterday that the world celebrated the end of a millennium. But historians will most probably mark the end of that era with another event. Nearly one thousand years after William's daring channel crossing, American, British and allied soldiers mirrored his feat in reverse by invading Normandy, an event that may yet prove to be of equal historical importance. It may be tempting to think of William's invasion as the beginning of an era and the conclusion of World War II as its end.

World War II changed the world in profound ways. It was a dramatic culmination of issues easily traced to 1066, and it defined the Twentieth Century even as it summed up the millennium. It was an event that shaped the lives of Vonnegut's generation.

First of all, World War II sobered the world. When the Americans exploded the first atomic bomb in the desert near Alamogordo in July 1945, the American scientist Robert Oppenheimer was moved to quote the Bhagavad Gita:
"I have become death, the destroyer of worlds".
That blinding flash in the desert was the reductio ad absurdum of a process of technological warfare that began with William's victory over the English at Hastings and ran through the English victory over the French at Agincourt, the American victory over the Lakota Sioux in the Black Hills, and the American genocide of the horrific Trail of Tears, an event my own ancestors barely survived. Warfare became unthinkable and, in becoming unthinkable, became never-ending: a cold war of fifty years followed now by the unceasing struggle against world terrorism. George Orwell's perpetual war.

Secondly, the computer, itself a product of World War II, has changed the way we think about the universe. Information is seen to be the very warp and weft of space-time. Not ENIAC but Colossus was the first electronic computer. Colossus was the product of the code-breakers headquartered at Bletchley Park, who cracked the Nazi Enigma traffic and built Colossus to break the still more complex Lorenz cipher. Yet the Allies would not reveal the eastern Nazi troop build-up to their Russian allies: doing so would have blown the code-breaking advantage, and the Russian people paid with millions of lives.

Inspired by Bertrand Russell and Alfred North Whitehead, who had sought in Principia Mathematica to ground mathematics upon a foundation of pure logic, Alan Turing envisioned a machine that could write symbolic theorems derived from symbolic axioms. Such a machine could, and did, automate the code-breaking process. Turing has forever established the criterion by which we may judge artificial intelligence, i.e. if a computer's responses to our queries cannot be distinguished from those of a human being, then that computer may be said to be "thinking." Mankind will have created "consciousness" in a machine.

Some thinkers have put forward the idea that at the end of the first millennium - the year 1000 - human consciousness was raised. In fact, the first stirrings of a Renaissance are in evidence a mere two centuries later. Its full flowering, of course, came in the Fifteenth and Sixteenth centuries.

Arguably, the 20th was the bloodiest of centuries, and, until Iraq descended into chaos, it had been said, erroneously, that we had been at peace for over 50 years. That was not so, of course. What are called "isolated" conflicts --Korea, Viet Nam, Persian Gulf War I --were but continuations and aftershocks of a conflagration that engulfed the world. Can it be said that the legacy of that conflagration will raise human consciousness yet again, as happened some one thousand years earlier? We must hope. There is no alternative. Perhaps humankind, surviving yet another one thousand years under the threat of nuclear annihilation, will so conclude. Bluntly, however, unless men like Bush are forever forbidden any power at all, mankind will be fortunate to survive another 50 or 100 years, let alone a millennium.

Lest we despair Wasserman adds:
Now he's (Vonnegut) having dinner with our beloved siren of social justice, Molly Ivins, sharing a Manhattan, scorching this goddam war and this latest batch of fucking idiots.
Vonnegut lives and rocks on. You can find a good list of his major works at the usual reference sites: Wikipedia and the Herald Tribune.