Topics in Biological Physics Seminar Information and Computation – Session B: A. Turing, 1950 – Computing Machinery and Intelligence. J. Von-Neumann, 1951 – Design of Computers, Theory of Automata and Numerical Analysis. By: Adam Lampert


Page 1:

Topics in Biological Physics Seminar

Information and Computation – Session B:

A. Turing, 1950 – Computing Machinery and Intelligence.

J. Von-Neumann, 1951 – Design of Computers, Theory of Automata and Numerical Analysis.

By: Adam Lampert

Page 2:

Can Computers Become as Intelligent as We Are?

• Fundamental part – Are we simply complicated computers?

• Practical part – How do we build “intelligent” computers?

Page 3:

Fundamental Part - Are We Simply Complicated Computers?

Page 4:

What is a computer?

• Turing Machine (Computers)

– Finite Automata (Control)

– Infinite tape (Store)

– Executive unit
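
To make this control/store/executive split concrete, here is a minimal sketch (not from the slides) of a Turing machine in Python; the example transition table, which accepts binary strings ending in 1, is purely illustrative.

```python
# Minimal Turing machine: a finite control (transition table), an
# unbounded tape (a dict), and an executive loop that reads, writes, moves.

def run_tm(transitions, tape_input, start, accept, reject, blank="_"):
    tape = dict(enumerate(tape_input))   # "infinite" tape, blank by default
    state, head = start, 0
    while state not in (accept, reject):
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return state == accept

# Example control: accept strings over {0,1} that end in 1.
transitions = {
    ("scan", "0"): ("scan", "0", "R"),
    ("scan", "1"): ("scan", "1", "R"),
    ("scan", "_"): ("check", "_", "L"),
    ("check", "1"): ("acc", "1", "R"),
    ("check", "0"): ("rej", "0", "R"),
    ("check", "_"): ("rej", "_", "R"),
}

print(run_tm(transitions, "10110", "scan", "acc", "rej"))  # True
print(run_tm(transitions, "1010", "scan", "acc", "rej"))   # False
```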

Page 5:

Universality of Turing Machines

• Church-Turing thesis (informally): any effectively realizable discrete computation can be carried out by an equivalent Turing Machine.

• All of the following are equivalent to a TM:

– TM

– TM with many tapes

– Cellular automata

– Usual computers*

– Logic gates

– Neural networks

– Physical realizations

Page 6:

Are We Simply Complicated Computers Then?

• Conjecture (Turing): Computers may become indistinguishable from humans.

• Why should we agree with Turing?

– Universality of computational devices (Church-Turing thesis)

– Consistent with our intuition about the physical world

• Why shouldn't we agree with Turing?

– Consciousness

– Limitation of computation

– Continuous computation

– Self replication

Page 7:

Arguments Against Turing Conjecture - Consciousness

• Argument: Computers, as automated devices, do not have consciousness, and are therefore distinguishable from humans.

• Answer 1: Sufficiently complicated computers might be capable of consciousness.

• Answer 2: Even if computers cannot be conscious, that does not mean they are empirically distinguishable from humans.

Page 8:

Arguments Against Turing Conjecture – Computers Are Limited

• Background:

– Is there a problem that no computer can ever solve?

– Yes!

– The halting problem.

– Gödel's theorem.

• Argument: Computers are fundamentally limited, so we can do better than them.

• Answer: we are also fundamentally limited!

Page 9:

Proof of the Halting Problem

• The problem: is there an algorithm that decides whether a given TM accepts a given input or not?

• Define <M> as the (string) representation of a TM M.

• Assume that the algorithm exists: H(<M>,w) = True if M returns True on input w, False otherwise.

• Define: S(<M>) = H(<M>, <M>)

• Define: T(<M>) = ¬S(<M>), i.e., False if M returns True on input <M>, True otherwise.

• In particular, T(<T>) = False if T returns True on input <T>, True otherwise; that is, T(<T>) is True exactly when it is False.

• A contradiction!

• Why did we get the contradiction?

– We have represented a TM in terms of its own language.

– Therefore, we could effectively announce: “this statement is false”.
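
The same diagonal argument can be sketched in code (illustrative only): H is the hypothetical decider assumed above, and T does the opposite of what H predicts when a program is fed its own text.

```python
import inspect

# Hypothetical decider (cannot exist): H(prog, inp) == True exactly when
# the program text `prog` returns True when run on input `inp`.
def H(prog, inp):
    raise NotImplementedError("no such algorithm can exist")

# T does the opposite of what H predicts a program does on its own text
# (the ¬S step above).
def T(prog):
    return not H(prog, prog)

# Feeding T its own source text recreates the contradiction:
# T(<T>) would be True exactly when H says it is False, and vice versa.
T_source = inspect.getsource(T)
# T(T_source)   # would have to be both True and False at once: impossible
```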

Page 10:

Arguments Against Turing Conjecture - Continuous Computation

• Background:

– The Church-Turing thesis applies to discrete machines.

– Continuous machines may be capable of computations beyond the Turing limit, and may solve the halting problem for Turing machines.

– Among such machines are certain (theoretical models of) neural networks and chaotic systems.

• Argument: Our brain is continuous, hence capable of computations beyond the Turing limit.

• Answer: Our world is noisy, so nearby states of the brain are indistinguishable.

Page 11:

Arguments Against Turing Conjecture – Self Replication

• Argument: if A constructs B, then A must be more complicated than B.

• Therefore, a computer cannot self-replicate, but we can.

• Answer: A can be just as complicated as B, and a computer can indeed self-replicate (Von-Neumann, 1951).

• Furthermore, this does not impose any limitation on the machine's other abilities.

Page 12:

Self-Replicating Machine – Von-Neumann Construction

• Machine A – constructs machine T from its description I_T.

• Machine B – generates a copy of any given instruction I_T.

• Machine C – receives the instruction I_T, operates A to create T, operates B to copy I_T, and supplies T with I_T.

• Machine D is composed of the triplet A + B + C. It generates T + I_T from I_T.

• In particular, D generates D + I_D from I_D.

• Machine E is composed of D + I_D.

• E is self-replicating!
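
In software, the same trick of separating a passive description from the machinery that uses and copies it shows up as a quine, a program that prints its own source. A minimal Python sketch (not part of Von-Neumann's construction, just an analogy): the string d plays the role of the description I_D, and the surrounding code plays A + B + C.

```python
# The two code lines below reproduce themselves exactly (comments aside):
# the string d is the "description" I_D, and the print statement both
# *constructs* the program text from d and *copies* d into it, like A + B + C.
d = 'd = %r\nprint(d %% d)\n'
print(d % d)
```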

Page 13:

Self-Replicating Machine - Langton Loops - Background

• Cellular automata – at each time step, the value of each cell is determined by its own current value and the values of its neighbors (a minimal 1D sketch follows below).

[Illustrations: a 1D and a 2D cellular automaton]
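
For the 1D case, a minimal sketch (illustrative, not Langton's rule) of one update step, using the elementary rule 110:

```python
# One step of a 1D binary cellular automaton: the next value of each cell
# is looked up from its (left, self, right) neighbourhood.
RULE = 110  # elementary rule number; bit k gives the output for pattern k

def step(cells):
    n = len(cells)
    nxt = []
    for i in range(n):
        left, me, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (me << 1) | right
        nxt.append((RULE >> pattern) & 1)
    return nxt

# Start from a single live cell and print a few generations.
cells = [0] * 31
cells[15] = 1
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```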

Page 14:

Self-Replicating Machine - Langton Loops

• Langton loops are loop-shaped objects within a certain 2D cellular automaton, introduced by Langton in 1984.

• These loops are self replicating.

Page 15:

Cell Replication

• Very simplified description of cell replication:

– The DNA is composed of two strands.

– DNA polymerase and other enzymes separate them and build up a new DNA out of each strand.

– A always pairs with T, C always pairs with G.

– Each strand therefore contains all the information.

– Two identical DNA molecules are produced.

– Then the cell divides into two cells such that each one contains a DNA molecule and about half of the other cell contents.

• Computational perspective:

– Neither the DNA nor DNA polymerase contains any information about itself.

– DNA polymerase can simply duplicate any given DNA (see the sketch below).

– The DNA only contains the information for constructing DNA polymerases.
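
A toy sketch (purely illustrative) of the computational point above: the "copier" needs only the pairing rule A-T, C-G and no information about the particular sequence it duplicates.

```python
# Base-pairing rule: the copier knows only this table, nothing about
# the particular sequence it is duplicating.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Build the complementary strand from one template strand."""
    return "".join(PAIR[base] for base in strand)

def replicate(dna):
    """Separate the two strands and rebuild a full double strand from each."""
    strand1, strand2 = dna
    return (strand1, complement(strand1)), (strand2, complement(strand2))

dna = ("ATGCCGTA", complement("ATGCCGTA"))
copy1, copy2 = replicate(dna)
# Both copies carry the same double strand (up to strand order).
print(copy1 == dna, copy2 == (dna[1], dna[0]))  # True True
```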

Page 16:

Practical Part - How Do We Build “Intelligent” Computers?

Page 17:

How Do We Examine Whether a Machine Is Intelligent? An Empirical Test

• Standard Turing test: a human interrogator holds a text conversation with a machine and a human, and must decide which is which.

• Imitation game (Turing's original formulation): the machine takes the place of one of two human players, and the interrogator must identify who is who.

Page 18:

Weaknesses of the Turing test

• May not cover all aspects of intelligence.

• The interrogator must be quite skilled in order to expose a sophisticated computer.

• Many unintelligent human behaviors must also be simulated by the machine.

• The original proposed test is mostly textual, although some aspects of intelligence may not be so.

• Does not examine real-time responses.

Page 19:

Turing’s Claim (1950)

• “I believe that in about 50 years’ time it will be possible to program computers … to make them play the imitation game so well that an average interrogator will not have more than 70 percent chance of making the right identification after five minutes of questioning.”

Page 20:

Historical Background

• 1936 – The Turing machine.

• 1939-40 – The Bombe, a machine for Enigma decryption during World War II.

• 1943 – J. Eckert and J. Mauchly begin construction of ENIAC, considered the first electronic computer; it was used to calculate ballistic firing tables during World War II.

Page 21:

Historical Background - Continued

• Late 1930s and 1940s – research in neurology showed that the brain is an electrical network of neurons that fire in all-or-nothing pulses.

• 1943 – W. Pitts and W. McCulloch showed that networks of idealized artificial neurons can perform logical functions.

• 1947 – The invention of the transistor (which later replaced the vacuum tube). Note that Von-Neumann (1951) estimated a vacuum tube to be less efficient than a neuron by a factor of about a million, comparing performance per unit volume and energy consumption.

• 1948-49 – G. Walter's analogue robots, capable of phototaxis: they found their way toward light.

• 1949 – EDSAC, inspired by J. Von-Neumann's design and constructed by M. Wilkes and his team in England. It computed arithmetic, differential equations, power series, etc.

Bottom line: computers were used mostly for numerical purposes, but some inspiration could come from neurology.

Page 22:

A Bit More About EDSAC (1949)

• Weight: 1 ton

• Area: 45 m²

• Storage: 2K bytes

Page 23:

Human Intelligence vs. Machine Intelligence

• In certain problems humans have a clear advantage over today’s computers:

– Visual recognition.

– Language.

• In certain other problems today’s computers have a clear advantage over humans:

– Numerical calculations.

– Searching over large amounts of data.

Page 24:

Machine Learning

• Child computer

• Teacher

• Adult computer

Page 25:

Machine Learning – Example: Generalization

• Goal: generate a computer that returns y = ax + b for any given x.

• Equip the child computer with the rule “the answer is a straight line”.

• This is called the inductive bias.

• The teacher (or the environment) provides it with the value of y for two values of x.

• The computer generalizes from these examples and can calculate y for any x (a minimal sketch follows below).
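
A minimal sketch of this generalization step (illustrative; it assumes the two training points have distinct x values): with the inductive bias "the answer is a straight line", two examples determine a and b.

```python
def learn_line(p1, p2):
    """Inductive bias: the answer is a straight line y = a*x + b.
    Two (x, y) examples from the teacher are enough to fix a and b."""
    (x1, y1), (x2, y2) = p1, p2
    a = (y2 - y1) / (x2 - x1)   # assumes x1 != x2
    b = y1 - a * x1
    return lambda x: a * x + b  # generalizes to any x

f = learn_line((0.0, 1.0), (2.0, 5.0))   # samples of y = 2x + 1
print(f(10.0))                           # 21.0
```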

Page 26:

Machine Learning – Example: The Perceptron

• The perceptron is a very simple model of neural network.

• Goal: give the correct answer y for any input x.

• Learning program – receives x as input and δ as the correct output, and on each mistake changes the weights w and the threshold b (output 1 if w·x ≥ b) according to, with learning rate α:

w_i ← w_i ± α·x_i,  b ← b ∓ α

• The perceptron is a linear classifier: it converges if and only if the data set is linearly separable, and in that case it makes only a bounded number of mistakes.

• Today there are much more sophisticated models of neural networks (a code sketch of this perceptron follows below).
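
A minimal sketch of the perceptron just described, assuming a threshold unit (output 1 when w·x ≥ b) and the ±α update used in Example 1 on the next page; the AND training set and the initial values are taken from that example.

```python
# Threshold perceptron: output 1 if w1*x1 + w2*x2 >= b, else 0.
# On each mistake, nudge each weight by alpha*x in the direction of the
# target and move the threshold by alpha in the opposite direction.
def train(samples, w, b, alpha=0.1, epochs=50):
    for _ in range(epochs):
        mistakes = 0
        for (x1, x2), target in samples:
            y = 1 if w[0] * x1 + w[1] * x2 >= b else 0
            if y != target:
                w[0] += alpha * (target - y) * x1
                w[1] += alpha * (target - y) * x2
                b    -= alpha * (target - y)
                mistakes += 1
        if mistakes == 0:          # converged: the data is linearly separable
            break
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(AND, w=[0.47, 0.51], b=0.22)   # initial values from Example 1
print(w, b)
```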

Page 27:

Perceptron – Example 1: And

Learning rule, applied for each wrong classification (here α = 0.1):

W1 = W1 ± α × X1
W2 = W2 ± α × X2
b = b ∓ α

Initial randomized configuration: W1 = 0.47, W2 = 0.51, b = 0.22.

First training sample: X1 = 1, X2 = 0, δ = 0.
Result: the perceptron outputs 1 (wrong), so W1 = 0.37, W2 = 0.51, b = 0.32.

Second training sample: X1 = 0, X2 = 1, δ = 0.
Result: the perceptron again outputs 1 (wrong), so W1 = 0.37, W2 = 0.41, b = 0.42.

Page 28:

Perceptron – Example 2: Xor

Our perceptron is not able to represent the XOR function, since XOR is not linearly separable.

Page 29:

Where Are We Today?

• The winner of the 2008 Loebner Prize managed to fool 3 out of 12 judges into thinking it was human, in a short textual conversation (you can talk with “Frank” at www.artificial-solutions.com; try also http://www.abenteuermedien.de/jabberwock, the winner of the 2003 prize).

• Great progress in many problems of AI.

• Humans are still much better at many tasks (recognition of objects within pictures, translation of text).

• Today’s computers out-compete humans at games with few choices and complete information (Checkers, Chess(?)).

• Expert humans are still better at games with many choices or incomplete information (Go, Bridge).