
RICHARD E. GRANDY

INFORMATION-BASED EPISTEMOLOGY, ECOLOGICAL EPISTEMOLOGY AND EPISTEMOLOGY NATURALIZED

Only a very short time after the formal birth of information theory in 1948 (Wiener, Shannon), its potential value in psychology began to be realized (Miller and Frick, 1949). Quine's influential suggestion (1968) that epistemology be naturalized, i.e., be viewed as a branch of psychology, might have been expected to bring information theory into epistemology but has not. Philosophers have regularly looked at the Shannon and Wiener definition and then declared that philosophers ought to be concerned about a different concept of information (Bar-Hillel, 1952; Hintikka and Suppes, 1970). More recently, Dretske (1981) makes information central to epistemology, but not in the sense defined by Shannon and Wiener. Instead he introduces his own alternative definition. Meanwhile in psychology, Gibson (1966) has proposed an "ecological theory of perception" which makes information central, but he too rejects the S&W (Shannon and Wiener) definition (1966, p. 245).

I will review both Dretske's and Gibson's reasons for rejecting the standard definition and argue that their reasons are defective. I conclude, however, by arguing that from the point of view of a naturalized ecological epistemology one must go beyond the standard definition and include other considerations.

1. A STANDARD DEFINITION

Perusing works on information theory, one finds a multiplicity of distinct information concepts and various notations and distinct (though equivalent) definitions of the recurring concepts.1 In this brief essay I will not attempt to survey the alternatives nor to provide the persuasive background for the definition I will use in the rest of the paper. In most presentations it is introduced in a derivative and opaque manner, which may explain why it has not received the attention I argue it deserves.

I will use as the central concept that of average mutual information between two sets of possibilities. We must begin with some preliminaries.

Synthese 70 (1987) 191-203. © 1987 by D. Reidel Publishing Company


Let Bi and Cj be two sets of alternatives such that each set is finite (i.e., 0 < i ≤ n and 0 < j ≤ m for finite n, m) and such that each set is disjoint and exhaustive (i.e., on any particular occasion exactly one member of the set Bi and one member of the set Cj obtain).

I assume that there is a well defined probability distribution2 over the joint possibilities for B and C (the usual situation is that the events are repeatable event types; if you understand probabilities of singular events then they can be encompassed as well). This means that for each conjunction Bi & Cj there is a well defined probability. Conditional probabilities, P(Bi/Cj), the probability of Bi given Cj, can be defined from the joint probabilities in the usual way.3

If the occurrences of Bi and Ci are correlated, i.e., if P(Bi/Ci) > P(Bi), then Ci conveys some information about the likelihood of Bi. The most useful definition of the amount of information conveyed proves to be one that averages over the various possibilities, weighting them according to their likelihood, and which assesses the amount of information by applying the log (always base 2) function. Thus, officially, the Average Mutual Information4 between the Bi and the Cj is:

$$\sum_{i}\sum_{j} P(B_i \,\&\, C_j)\,\log_2\frac{P(B_i \,\&\, C_j)}{P(B_i)\cdot P(C_j)}.$$

It should be noted and emphasized that the definition is totally symmetric between B and C.

In the particular examples I will be using to illustrate various points in the remainder of the paper, most of the cases involve only pairs of alternatives Bi and Cj, and so the relevant equation in those cases is the special case where i ∈ {1, 2} and j ∈ {1, 2}. For simplicity and perspicuity in those cases I will use "B" for "B1" and "-B" for "B2". Since the indices i and j are otherwise arbitrary labels, I will assume for simplicity of statement that same-subscripted Bs and Cs are nonnegatively correlated, according to a natural convention, so that

P(Bi/Ci) ≥ P(Bi/Cj) for all j ≠ i.5
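Readers who want to check the numerical claims in later sections can do so with a few lines of code. The following Python sketch is mine, not the author's; the function name and layout are arbitrary choices, and it simply implements the definition just given for a finite joint distribution.

```python
from math import log2

def mutual_information(joint):
    """Average mutual information, in bits, of a finite joint distribution.

    joint[i][j] is P(Bi & Cj): rows are the B alternatives, columns the C
    alternatives.  Zero-probability terms are skipped, which matches the
    convention (note 5) that such terms contribute nothing.
    """
    p_b = [sum(row) for row in joint]           # marginals P(Bi)
    p_c = [sum(col) for col in zip(*joint)]     # marginals P(Cj)
    total = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                total += p * log2(p / (p_b[i] * p_c[j]))
    return total
```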

2. GIBSON AND ECOLOGICAL EPISTEMOLOGY

Divide and solve has been a successful strategy in all sciences and seemed promising in this case as well, but of course how successful it is depends on how one divides. Gibson on the other hand emphasized


that the senses are systems that pick up information about the environment of the subject.6

Gibson's work arose in opposition to a climate of research in perception which emphasized the study of the effects of arbitrary and simplified stimuli. Gibson is explicit in several places that he is not using information in the S&W sense, but is not explicit as to what his sense is or how it differs. Reading between the lines, I think we can discern four (somewhat related) reasons why he might want to distinguish his notion of information from the S&W version (1966, p. 245).

(a) Passivity

In standard presentations of the theory, matters are usually formulated in terms of a source of information sending signals to a passive receiver. Given Gibson's stress on the active character of the senses, this would seem out of keeping with his approach. However, the mathematical definition of information, as opposed to the heuristic setting in which it is often presented, makes no mention of 'source' or 'receiver'. As noted earlier, the definition makes the informational relation quite symmetric. Thus if the definition is otherwise satisfactory for the ecological approach, we should consider keeping it and simply disavow the unwanted connotations.

(b) Channels

Since information theory was developed for engineering purposes, specifically for communication through electromagnetic channels, it is not surprising that again much of the presentation is put in terms of channels. But the mathematical definition says nothing of 'channels'; it only requires certain conditional probabilities. In applications such conditional probabilities may be the result of channels, but they need not be. Again I think that Gibson was jousting with connotations.

(c) Specification

The most precise statement of what information is, in Gibson or his followers, is to my knowledge that the proximal stimulus (the incident optic array for Gibson) specifies the distal stimulus (the environment for Gibson). One of his recurring themes is that there is sufficient information in the incident light to totally determine the character of the surrounding objects. Thus it may be that Gibson rejects the S&W definition because it allows information to be nonzero in cases where a less than perfect specification obtains, i.e., where some ambiguity about the surrounding objects remains.

From a methodological point of view it seems preferable to adopt the definition and to put forward as an additional hypothesis the claim that the mutual information between distal and proximal stimulus always equals the entropy of the distal stimulus. The general approach to perception would survive even if this empirical hypothesis were disproved, as it seems almost certain to be.
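To make the content of that hypothesis explicit, it can be restated using a standard identity of information theory (my gloss, not a formula in the original):

$$I(B;C) \;=\; H(B) - H(B \mid C),$$

so the claim that the mutual information between distal (B) and proximal (C) stimulus always equals the entropy of the distal stimulus amounts to the claim that H(B | C) = 0, i.e., that the proximal stimulus leaves no residual uncertainty about the distal one.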

(d) Digitalization

The standard presentations of the theory are typically phrased in terms of signals consisting of 0s and 1s, and of sources that have easily discriminated discrete states. But neither internal states nor the world come with those kinds of structure. Or so would run another objection to using the S&W definition. It is true that the mathematical analysis requires a partitioning into sets of disjoint and exhaustive states, and that there is often no easy or obvious way to divide the environment. But that is a challenge to the theory, and one that ought to be met in some form in any event. If we cannot say what kind of information one picks up, or how it is stored or used, the theory would be vacuous.

Gibson himself is quite explicit and detailed on some of the kinds of information that are picked up (e.g., about distance, velocity). In some cases partitioning is easy (prey present, prey not present). For the final stage of analysis of human epistemology we would presumably want an account of the relation between beliefs sententially expressed and the world; in this case the sentential expression would provide the partitioning. In other cases we would clearly want to understand the relation between continuous quantities in the environment (position, distance, velocity, mass) and the essentially continuous internal states to which they relate. But there is a form of the mathematical definition by S&W that is applicable to continuous variables, so this is no obstacle.


(e) Discrimination Versus Perception

The most explicit and extended discussion (Gibson, 1966, pp. 245-46) contrasts his sense of information, which allows "perception of", with "perception considered as discrimination" (his emphases). His point is well-taken if one formally identifies the information in the stimulus with some of the concepts in mathematical information theory, but not if one identifies it with the one that I have been discussing, which explicitly makes the (mutual) information a relational matter.

In summary, a reasonable (modified?) Gibson program should embrace (my formulation of) the S&W definition.

3. DRETSKE AND NATURALIZED EPISTEMOLOGY

After a lucid presentation of the S&W approach, Dretske explains that it is not useful for his epistemological purposes because it only concerns the average information in a set of messages, whereas epistemology is concerned with whether a person knows a particular fact on the basis of a particular signal. This is true if by epistemology one means a suitable naturalized construal of traditional epistemology, but I want to argue that it does not apply to appropriately naturalized epistemology.7

To make the situation clearly ecological, let us consider not a human subject but a frog, and let the informational question be whether there is an edible bug present. That is, we divide the possible states of the environment into two kinds: B = "bug present" and -B.8 To avoid questions about the nature of beliefs, let us consider the internal states of the frog, specifically the state C which activates bug catching behavior (at least absent any additional overriding states, such as might indicate predator presence) and -C.9

In the ideal case for the frog B and C are perfectly correlated (via the usual perceptual causal chain). This would give the following joint probability distribution:

P(B & C) = 0.1        P(B & -C) = 0
P(-B & C) = 0         P(-B & -C) = 0.9


This is also the case in which the frog would always know, on Dretske's account, that there was (or was not) a bug present, and the case that maximizes mutual information according to the definition in section 1; the precise number is 0.47 bits. Let us call this ideal situation Frog I.

However, if we are to be ecologically realistic, the ideal situation rarely obtains, so let us consider and compare some less than ideal situations. In the next case, Frog II, the internal state C always correctly indicates that a bug is present when C obtains, but in ten percent of the cases in which a bug is actually present -C occurs.

P(B & C) = 0.09       P(B & -C) = 0.01
P(-B & C) = 0         P(-B & -C) = 0.9

In this case the frog always knows (according to Dretske) that a bug is present when C occurs, but does not know that no bug is present when -C obtains, only that there is probably not a bug.10 The mutual information in this case is 0.39 bits.

Let the case of Frog III be the converse, one in which the probability of -B given -C is one, but the bug detectors are overzealous and one in ten of the C signals is erroneous. More formally:

P(B & C) = 0.1        P(B & -C) = 0
P(-B & C) = 0.011     P(-B & -C) = 0.889

In this case the frog knows there is no bug present whenever there is not, so that in almost ninety percent of the cases he has knowledge, but in the remaining cases he only knows that there is probably a bug present. In this case the mutual information is 0.42 bits.

From the purely information theoretic point of view Frog III is better off. From an epistemological point of view things are less clear cut. On the one hand it would seem that Frog III is better off, for it more often has conclusive, rather than merely probable, knowledge. But on the other hand what Frog II's knowledge is about is the presence rather than the absence of bugs. From an ecological point of view we can readily calculate which frog is better off if we make some reasonable assumptions. Let us suppose that the net energy gain in catching a bug is 1 bu (bug unit) and that the net energy loss in attempting to catch a bug when none is present is 0.1 bu. Then our ideal Frog I will average 0.10 bu per event, Frog II 0.09 bu, and Frog III 0.099.11 Clearly Frog III fares better.

To sharpen the contrast a little further we need Frog IV, who makes a few errors in both directions. For Frog IV let

P(B & C) = 0.095      P(B & -C) = 0.005
P(-B & C) = 0.005     P(-B & -C) = 0.895

Frog IV never has definitive knowledge, though it often (99 percent of the time) has knowledge whether a bug is probably (not) present. The mutual information of the system is 0.40 bits, and the net value is 0.0945 bu. Thus we can see that on information theoretic and ecological grounds Frog IV fares better than Frog II and worse than Frog III, while on epistemological grounds he is worse off than both II and III. My conclusion is that Dretskean epistemology, which is one variation of information-based epistemology, is not ecological epistemology.
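The frog figures can be checked with the mutual_information sketch given in section 1; the short script below (again mine, not the author's) reproduces the 0.47, 0.39, 0.42 and 0.40 bit values and the bu payoffs, up to rounding (the Frog III payoff comes out as 0.0989 from the table as printed, which the text rounds to 0.099).

```python
# Joint distributions for each frog: rows are B (bug present) and -B,
# columns are C (strike) and -C.  Payoff: +1 bu per bug caught,
# -0.1 bu per strike at nothing.
frogs = {
    "I":   [[0.1,   0.0  ], [0.0,   0.9  ]],
    "II":  [[0.09,  0.01 ], [0.0,   0.9  ]],
    "III": [[0.1,   0.0  ], [0.011, 0.889]],
    "IV":  [[0.095, 0.005], [0.005, 0.895]],
}

for name, joint in frogs.items():
    bits = mutual_information(joint)
    value = 1.0 * joint[0][0] - 0.1 * joint[1][0]   # gain on catches, loss on false strikes
    print(f"Frog {name:>3}: {bits:.2f} bits, {value:.4f} bu per event")
```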

4. INFORMATION-BASED EPISTEMOLOGY AND ECOLOGICAL EPISTEMOLOGY

In the cases above, ecological value and amounts of information matched nicely and this might suggest that ecological epistemology could be pursued entirely in terms of information maximization. However, this would be wrong. One way to see this is to consider a little more closely the two kinds of information loss involved in situations like those of Frogs II and III. Any deviation from perfect correlation entails a loss of information. Thus we can consider an improved version of Frog II whose quantity of information matches that of III and then compare their ecological situations. Thus for Frog II'

P(B & C) = 0.095      P(B & -C) = 0.005
P(-B & C) = 0         P(-B & -C) = 0.9


"In this situation Frog II' has 0.424 bits of information, slightly greater than the 0.421 of Frog III, but will have an average ecological value of 0.095 compared to the 0.099 of Frog III. Thus greater information and greater ecological value diverge. Let us consider why.

I have been implicitly using a formula for calculating expected ecological value for an organism that should be made more explicit. Let us assume that the states Bi are states of the environment and the Ci are states of the organism, and that the Ai are actions that the organism can perform. We can let Vij be the ecological value of performing action Ai when Bj obtains.12 If we assume that each internal state Ci produces the corresponding action Ai, then the expected value is the sum of the values of each action Ai in circumstance Bj times the probability of Ai & Bj, i.e.:

$$\text{Expected value} \;=\; \sum_{i}\sum_{j} V_{ij}\,P(C_i \,\&\, B_j),$$

a formula familiar from decision theory where it gives the expected utility of a strategy.

If we further assume that the organism's actions are appropriate, i.e., that the action Ai is the best action given Bi,13 then the expected value is maximized if P(Ci & Bj) = 0 for all i ≠ j. But this is equivalent to maximizing the mutual information between B and C. Thus whatever the distribution of values, the optimal expected value derives from maximizing information.
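Continuing the earlier Python sketches (still mine, with a hypothetical payoff matrix), the expected-value formula can be written directly; applied to Frog IV it returns the 0.0945 bu figure of section 3.

```python
def expected_value(values, joint):
    """Sum over i, j of V[i][j] * P(Ci & Bj).

    values[i][j] is the payoff when internal state Ci (and so action Ai)
    co-occurs with environmental state Bj; joint[i][j] is P(Ci & Bj).
    """
    return sum(v * p
               for v_row, p_row in zip(values, joint)
               for v, p in zip(v_row, p_row))

# Hypothetical frog payoffs: a strike (C) gains 1 bu if a bug (B) is present
# and costs 0.1 bu if not; doing nothing (-C) neither gains nor loses.
V = [[1.0, -0.1],
     [0.0,  0.0]]

print(expected_value(V, [[0.095, 0.005], [0.005, 0.895]]))   # Frog IV: 0.0945
```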

However, if the information is less than maximal then the dis- tribution of values matters as we have seen above. In section 6 we will consider a particular way in which it can frequently matter.

5. NATURALIZED EPISTEMOLOGY AND ECOLOGICAL EPISTEMOLOGY

The phrase, if not the concept, of "epistemology naturalized" was introduced in Quine (1968) with considerable fanfare but little detail. Epistemology should be seen as a part of psychology, we were told, but we were not told which branch, nor indeed whether it was part of present day psychology or of some future development. My own view, just a little less sketchy than Quine's, is that epistemology should be seen as a part of ecological psychology as developed by Gibson, but that the concept of information involved should be identified with that of Shannon and Wiener.14

What about the notion of value, which I have also argued has a central place? In the examples above I chose a case in which we could identify the value of an action in terms of the contribution it made to the survival of the organism. But if we consider other kinds of action, especially those involved in sexual behavior, there is another kind of value that enters: the contribution that the action makes to the survival of the species. In general there can be divergence between value to the individual and value to the species. Furthermore, when we extend the theory up the evolutionary scale to organisms with a richer repertoire of inner representations, we can also consider a third notion of value, the preferences of the organism. These preferences need not always conform to either of the previous two notions of value. I defer a fuller discussion of these issues to another time.

One major objection to the naturalizing of epistemology was recently given a particularly articulate formulation by Kim (1985). Epistemology is a normative subject, he argues, and psychology is not. Therefore (ignoring some important subtleties in his argument), epistemology cannot be a part of psychology. It might be superseded by psychology but it cannot be a part of it. My answer to this objection is already implicit in the above discussion, but it is worth making more explicit.

Given a precise formulation of the epistemological issues in terms of information, we can pose questions about whether an organism maximizes information. More relevantly to a naturalized epistemology, we can ask whether an organism maximizes information given particular biological constraints and certain environmentally relevant tasks. The mathematical characterization of optimality begins to give a normative aspect to this kind of psychology, and the observation that such optimality maximizes the organism's success (in whichever of the three notions of value is relevant) gives the ecological psychologist at least conditional imperatives.

A related argument that might well be raised against my conception of naturalized epistemology based on the S&W definition of information is that epistemology is concerned with our relation to propositions, entities each of which is unique and whose truth values are eternal and unchanging, whereas information concerns frequencies, matters that can be repeated, and signals, which can be correct on one occasion and not on another. My response is to reject that conception of the objects of epistemology. I would be happy to replace the propositional attitudes with sentential ones, and their objects (sentences) typically have the kind of variability in truth value that we require (cf. Grandy, 1985).

The naturalistic information-based approach also gives us a useful perspective on both Dretskean and causal theories. In effect, for Dretske a belief qualifies as knowledge if it could only have been caused by the state of affairs that renders it true, while for causal theories a belief qualifies as knowledge only if it was caused by the state of affairs that renders it true. Each is requiring a conditional probability of one, but in different directions. To require a conditional probability of one in both directions would be far too restrictive, but if we are willing to forego the fascination with knowledge, then we can consider the mutual information between the agent's beliefs and the state of the world. The naturalized epistemologist ought not to ask what organisms know but how knowledgeable they are. The latter is a matter of degree that can be measured information theoretically.

Similarly, from this point of view reliability theories are praiseworthy but incomplete attempts to naturalize epistemology. Since reliability is a matter of degree and varies with context, the reliability theories end with a definition of knowledge that fudges with a term such as "sufficiently reliable" and which is relativized to context. Better to acknowledge that reliability is a matter of degree and compare the degrees. Context enters explicitly and formally if one conditionalizes all of the probabilities in the definition of mutual information on some assumed background: replace each P(Bi & Cj) with P(Bi & Cj given E), which gives a definition of the mutual information of Bi and Cj given the assumption that E.15
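Spelled out in the notation of section 1 (my formulation of that prescription, not a formula in the original), the conditionalized quantity is

$$I(B;C \mid E) \;=\; \sum_{i}\sum_{j} P(B_i \,\&\, C_j \mid E)\,\log_2\frac{P(B_i \,\&\, C_j \mid E)}{P(B_i \mid E)\,P(C_j \mid E)},$$

which measures how knowledgeable the agent is against the assumed background E.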

6. AN APPLICATION

In section 4 we noted that given two equal but less than maximal information situations there can be a significant difference in their ecological value. I want to consider the more general question of when and why.

Suppose that among the possible situations Bi and actions Aj there is an action whose appropriate performance has a considerably greater value than performance of any action (same or other) in any circumstance (other or same). More formally, there is a Vii such that Vii ≫ Vij and Vii ≫ Vji for all j ≠ i. It is not difficult to show that, given equal information situations, i.e., distributions of joint probabilities that yield equal amounts of information, a higher value will attach to the one that maximizes the probability of Ai. In other words, if the most valued outcome is to perform action Ai in circumstance Bi, then it is better to produce false signals that indicate Bi than to omit signalling Bi when it in fact occurs (given that the information is equal in omission and commission).
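A rough numerical illustration (mine, reusing the Frog II' and Frog III tables from section 4 and a hypothetical inflated payoff): if a successful catch is worth 10 bu rather than 1, so that Vii dominates every other entry, the commission-prone distribution clearly beats the omission-prone one of (nearly) equal information.

```python
# "hit" is P(B & C), a bug present and struck at; "false" is P(-B & C),
# a strike at nothing.  The catch payoff is inflated to 10 bu so that the
# catch-when-present outcome dominates all others.
payoff_catch, cost_false_strike = 10.0, 0.1

cases = {
    "II' (omission errors)":   {"hit": 0.095, "false": 0.0},
    "III (commission errors)": {"hit": 0.1,   "false": 0.011},
}

for name, d in cases.items():
    value = payoff_catch * d["hit"] - cost_false_strike * d["false"]
    print(name, round(value, 4))    # II' -> 0.95, III -> 0.9989
```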

If we consider what kinds of situations and actions have such dominant value differences, at least for species relatively high in the food chain, we can see that two particularly significant such events will be the presence of food and of predators. For organisms high in a food chain opportunities for a meal are relatively rare and it is important to make the most of them. Similarly, opportunities to be a meal are also relatively rare and appropriate action is equally imperative. Thus, given the assumptions about frequency just listed, it is better, if an organism is going to err, to err on the side of overrating the low probabilities associated with preying and being preyed upon.

On the other hand, overrating low probabilities in circumstances where no significant difference results has low cost. Thus as a general survival strategy, if one cannot be exactly sure about the state of the environment, it may be better to consistently err by overestimating small probabilities. Whether this constitutes an explanation, let alone a justification, of the phenomenon is unclear, but Tversky and Kahneman (1982) observe that exactly this kind of systematic misjudgment occurs in human subjects under a variety of conditions. One of the virtues of naturalized, information-based epistemology might be that it can not only explain why plausible strategies (believing basic perceptual reports) are indeed appropriate but also explain why apparently irrational strategies may actually be appropriate given the constraints that obtain in the world.

NOTES

1 I am indebted to Don Johnson and Joe Schatz for helpful discussions of the topics of this section.
2 An objective distribution.
3 P(Bi/Cj) = P(Bi & Cj)/P(Cj).


4 See Gallagher (1968, 16ff.), Jones (1979, 18ff.) or Reza (1961, 104ff.) for details and justification.
5 I also assume without further repetition that the value of log 0/0 is 0.
6 I am indebted to Jerry Gratch for helpful suggestions regarding this section.
7 I am uncertain whether Dretske intends to be doing naturalized epistemology or not, so I am uncertain whether the following is in any way a criticism of his approach.
8 See Lettvin et al. for a discussion of the importance of this particular distinction in the frog's visual system.
9 Obviously in reality much more precise information about the location of the bug is required and available, but no significant point is lost in our context by this simplification.
10 Dretske has only the briefest discussion of knowledge of statements involving probabilities.
11 0.1 for the bug caught in the 0.1 cases when one is present, minus 0.001 (the energy lost, 0.1, times the probability of the failed attempts, 0.01) for failed attempts.
12 That is, Vij = Value(Ai given Bj).
13 More formally, Vii ≥ Vji for all i and j where j ≠ i.
14 Given my emphasis on the S&W definition of information, which rules out orthodox Gibsonians, about the only work in psychology which falls clearly within epistemology naturalized is that of Marr (1982) and his co-workers.
15 For a more detailed pessimism about reliability theories, see Grandy (1980).

REFERENCES

Attneave, F.: 1959, Applications of Information Theory to Psychology, Henry Holt and Co., N.Y.

Bar-Hillel, Y.: 1952, 'An Outline of a Theory of Semantic Information', in Language and Information, Addison-Wesley, Reading, Mass., 1964.

Dretske, F.: 1981, Knowledge and the Flow of Information, Bradford Books/MIT Press, Cambridge.

Gallagher, R. G.: 1968, Information Theory and Reliable Communication, Wiley, N.Y.

Gibson, J. J.: 1966, The Senses Considered as Perceptual Systems, Houghton Mifflin, Boston.

Grandy, R. E.: 1980, 'Ramsey, Reliability and Knowledge', in D. H. Mellor (ed.), Prospects for Pragmatism, Cambridge University Press, Cambridge.

Grandy, R. E.: 1985, 'Some Misconceptions About Belief', in R. Warner (ed.), Philosophical Grounds of Rationality: Intentions, Categories, Ends, Oxford University Press, Oxford.

Hintikka, J.: 1970, 'On Semantic Information', in J. Hintikka and P. Suppes (eds.), Information and Inference, D. Reidel, Dordrecht.

Kim, J.: 1985, 'What is "Naturalized Epistemology"?' Paper presented at the Western Division APA, Chicago, April, 1985.

Jones, D. S.: 1979, Elementary Information Theory, Oxford University Press, Oxford.

Lettvin, J. Y. et al.: 1959, 'What the Frog's Eye Tells the Frog's Brain', Proc. IRE 47, 1940-51.

Marr, David: 1982, Vision, W. H. Freeman & Co., San Francisco.


Miller, G. A. and F. C. Frick: 1949, 'Statistical Behavioristics and Sequences of Responses', Psych. Rev. 56, 311-24.

Quine, W. V. O.: 1968, 'Epistemology Naturalized', reprinted in Ontological Relativity, Columbia University Press, N.Y., 1969.

Reza, F.: 1961, An Introduction to Information Theory, McGraw-Hill, N.Y.

Shannon, Claude E.: 1948, 'The Mathematical Theory of Communication', reprinted in The Mathematical Theory of Communication with Warren Weaver, University of Illinois Press, 1949.

Tversky, A. and D. Kahneman: 1982, Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge.

Wiener, N.: 1948, Cybernetics, Wiley, N.Y.

Department of Philosophy
Rice University
Houston, TX 77251
U.S.A.