
Page 1: Thoughts about the Way of Machine Learning -- with EBL and Bayesian as examples WANG Yi, ZHAN Dan, YEI Ruo-fen, LIU Lu June 2005

Thoughts about the Way of Machine Learning

-- with EBL and Bayesian as examples

WANG Yi, ZHAN Dan, YEI Ruo-fen, LIU Lu

June 2005

Page 2:

What Are We Doing

● Input
  – Prior knowledge: our bias on the matter
  – Examples: data supporting the facts

● Output
  – Posterior knowledge: stronger than the prior knowledge
    ● (EBL) A compact form of the prior knowledge, closer to the facts
    ● (Bayesian) A modified form of the prior knowledge, supported by the facts

Page 3:

How EBL Achieves It

● Data
  – f(x) is either 0 (we don't care) or 1 (we want)
  – EBL cares only about the positive examples

● Prior knowledge
  – Must be complete and correct

Page 4:

How EBL Achieves It

● Posterior knowledge is only a restatement of the prior knowledge
  – For each unexplained sample xi:
    ● Construct an inference tree to explain why f(xi) is 1
    ● Generalize this inference tree into a rule
    ● Delete the other samples that the rule also explains
    ● Add the generalization to the posterior knowledge set

● The generalization
  – Expands the prior knowledge
  – Confines the feature space
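The per-sample loop above can be sketched in Python. Everything here is an illustrative assumption: `explain` stands in for constructing an inference tree from the prior knowledge, and `generalize` for turning that tree into a rule.

```python
def ebl(prior_knowledge, positives, explain, generalize):
    """Sketch of the EBL loop: explain, generalize, drop covered samples."""
    posterior = []                  # the learned rule set
    unexplained = list(positives)   # EBL looks only at positive examples
    while unexplained:
        x = unexplained.pop(0)
        tree = explain(prior_knowledge, x)   # inference tree: why f(x) = 1
        rule = generalize(tree)              # generalize the tree into a rule
        # delete every remaining sample that the new rule also explains
        unexplained = [s for s in unexplained if not rule(s)]
        posterior.append(rule)
    return posterior
```

Each iteration compresses all the samples a rule covers into that single rule, which is why the output is compact but never goes beyond what the prior knowledge already entailed.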

Page 5:

How Bayesian Achieves It

● Why Bayesian is stronger than EBL
  – What if the prior knowledge is not perfect?
  – What if f(x) takes values other than 0 and 1 (uncertainty)?

● Prior knowledge
  – How probable it is that a sample x has f(x) = 1/0, without seeing its features:

    p(0) = p0, p(1) = 1 - p0

● Explanation by prior knowledge
  – How likely x has f(x) = 1/0 when the features d(x) of x are known:

    p(d(x) | 0) and p(d(x) | 1)
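As a concrete illustration of these two pieces of knowledge, here is a toy model with invented numbers (p0 = 0.7 and a single discrete feature with values "small" and "large" — all values are hypothetical):

```python
# Prior belief before seeing any features: p(0) = p0, p(1) = 1 - p0
prior = {0: 0.7, 1: 0.3}

# Class-conditional likelihoods p(d(x) | class) for a discrete feature d(x)
likelihood = {
    ("small", 0): 0.8, ("small", 1): 0.1,
    ("large", 0): 0.2, ("large", 1): 0.9,
}

def joint(d, c):
    """Unnormalized weight of class c after observing feature value d."""
    return likelihood[(d, c)] * prior[c]
```

The prior encodes the bias before any feature is seen; the likelihood is how the prior knowledge "explains" an observed feature value.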

Page 6:

How Bayesian Learning Achieves It

● Posterior knowledge is not only a restatement of the prior knowledge, but the prior modified by the explanation (the likelihood)

● Apply the learned posterior knowledge to a new x whose d(x) has been measured
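The modification on this slide is just Bayes' rule: p(1 | d(x)) = p(d(x) | 1) p(1) / [p(d(x) | 0) p(0) + p(d(x) | 1) p(1)]. A minimal sketch, with the quantities from the previous slide passed in as plain numbers (the function name is illustrative):

```python
def posterior_of_1(p0, lik_given_0, lik_given_1):
    """Bayes' rule: probability that f(x) = 1 given the features d(x).

    p0          -- prior p(0); the prior p(1) is 1 - p0
    lik_given_0 -- likelihood p(d(x) | 0)
    lik_given_1 -- likelihood p(d(x) | 1)
    """
    num = lik_given_1 * (1.0 - p0)          # p(d(x) | 1) * p(1)
    den = lik_given_0 * p0 + num            # total probability of d(x)
    return num / den
```

When the two likelihoods are equal, the data explains nothing and the posterior equals the prior; the further they differ, the more the evidence moves the posterior away from the prior.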

Page 7:

EBL vs. Bayesian

● Similarities
  – Both are deductive; both use prior knowledge to explain the training samples

● Differences
  – Bayesian does not require perfect prior knowledge
  – Bayesian does uncertain reasoning
  – Bayesian does not reduce the dimensionality of the feature space, but it can learn weights on the feature dimensions to indicate their importance

Page 8:

An Example about Fishing

● A lake with several kinds of fish
● We love a certain kind, namely GoodFish
● So we want to build a machine to
  – Measure the features (weight and length) of each fish caught
  – Use the posterior knowledge to judge whether it is a GoodFish

● The question is: how should we train that machine?
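One way to train such a machine, following the Bayesian recipe above, is a tiny Gaussian naive-Bayes classifier over the two features. All numbers below are invented for illustration; the slides do not commit to this particular model.

```python
import math

def gaussian(x, mean, var):
    """Density of a normal distribution N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def train(samples):
    """samples: list of ((weight, length), label) pairs, label 0 or 1.

    Returns, per class, the prior and per-feature (mean, variance).
    """
    model = {}
    for c in (0, 1):
        feats = [f for f, y in samples if y == c]
        prior = len(feats) / len(samples)
        stats = []
        for i in range(2):  # feature 0: weight, feature 1: length
            vals = [f[i] for f in feats]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals) + 1e-6
            stats.append((mean, var))
        model[c] = (prior, stats)
    return model

def is_goodfish(model, weight, length):
    """Judge a newly caught fish: True if p(1 | features) > p(0 | features)."""
    score = {}
    for c, (prior, stats) in model.items():
        p = prior
        for v, (mean, var) in zip((weight, length), stats):
            p *= gaussian(v, mean, var)
        score[c] = p
    return score[1] > score[0]
```

Training estimates the prior and the likelihoods from the catch; judging a new fish is exactly the Bayes'-rule comparison of the previous slide.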

Page 9:

Page 10:

Are EBL and Bayesian Really Learning?

● No
● The essence of learning is development
● The essence of development is to enlarge the hypothesis space and to accumulate knowledge
● EBL only proves knowledge from the given observations, without any increment of knowledge
● Bayesian learning is
  – to adjust the prior knowledge to better fit the observations, or
  – to constrain the explanation of the data by the prior knowledge

Page 11:

How Should We Implement Development?

● I believe development is an iteration of:

1. Generate new hypotheses by association of ideas

2. Prove the new hypotheses, taking the old ones as prior knowledge

3. Keep the proven ones, and discard the unproven

4. Direct the generation of new hypotheses by learning from previous successful generations

5. Go to step 1
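The five steps read like a generate-and-test loop. A hypothetical sketch, where `associate`, `prove`, and `update_generator` are stand-ins for the genuinely hard parts the slide leaves open:

```python
def develop(knowledge, associate, prove, update_generator, rounds=10):
    """Generate-and-test loop for the five steps above (illustrative only)."""
    generator_state = None
    for _ in range(rounds):
        # 1. generate new hypotheses by association
        candidates = associate(knowledge, generator_state)
        # 2-3. prove each against the old knowledge; keep only the proven
        proven = [h for h in candidates if prove(knowledge, h)]
        knowledge.extend(proven)
        # 4. learn from the successful generations to direct the next round
        generator_state = update_generator(generator_state, proven)
        # 5. the loop returns to step 1
    return knowledge
```

Unlike EBL or Bayesian updating, each round can admit hypotheses that were not entailed by the starting knowledge, so the hypothesis space actually grows.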

Page 12:

Yet Another Example

● An example you might not like
● But an example about the development of human beings
● Also an example that combines the essentials of learning
  – Development
  – Social interaction
  – Embodiment
  – Integration