Sunday, March 4, 2012

7-Mar-12 Knowledge - True belief plus evidence?

This coming Wednesday at the Ideas Cafe, we are going to discuss what knowledge is.

How do we come to the conclusion that we "know" something? 

The following quote from Bertrand Russell's "Theory of Knowledge" illustrates the difficulty in defining knowledge. "The question how knowledge should be defined is perhaps the most important and difficult of the three with which we shall deal. This may seem surprising: at first sight it might be thought that knowledge might be defined as belief which is in agreement with the facts. The trouble is that no one knows what a belief is, no one knows what a fact is, and no one knows what sort of agreement between them would make a belief true." 

When does a thought become a belief?

What is a fact?

How would we know that they agree?

If and when we do prove that there is agreement between the belief and the fact, what are the assumptions under which this agreement is valid?

Will this agreement continue to be valid in the future? How do we "know" that without having the ability to foretell the future?

If we are not sure that this agreement will be true in the future, what is the point?  What good is knowledge that may not be true tomorrow?  After all, the real reason we want knowledge is so that we can put it to good use in the future, so is it not pointless if we are not sure it will apply after today?

Will we ever be able to know all there is to know about anything? Since the answer is likely no, how will we ever be able to tell that what we "know" today will not be proven invalid tomorrow?

I am reminded of the turkey story. To the turkey on a turkey farm, the farmer is its best friend. Every day the farmer comes to feed the turkey and clean the barn to make life comfortable for it. The turkey has every fact to confirm its belief that the farmer is its best friend and benefactor. Then two weeks before Thanksgiving, the farmer comes and wrings the turkey's neck.

Are we still sure that what seems good to us so far is not going to turn on us in the future?

What about facts?

We can perform repeatable experiments to show a certain scientific rule or principle. Ever since Archimedes jumped out of his bath to run down the streets yelling "Eureka!", we believe that we know why things float. It seems pretty certain and elemental.

Even ship disappearances in the Bermuda Triangle have been explained by Archimedes' principle, on the basis that bubbling gases lowered the water's density and sank the ships. But how do we know that the principle is not flawed and we are just hanging on to a trusted belief? How do we know that there will not be a future Einstein who shows us under what conditions Archimedes' principle may not hold?
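To make the gas-bubble explanation concrete, here is a minimal sketch of the arithmetic behind it (all of the figures are hypothetical, chosen only for illustration): a ship floats while the water its hull can displace weighs more than the ship itself, so if escaping gas lowers the effective density of the surrounding water far enough, the very principle that keeps the ship afloat predicts that it sinks.

```python
# Illustrative sketch of Archimedes' principle (hypothetical numbers, not measurements).
# A ship floats when the fully submerged hull can displace a mass of water
# greater than the ship's own mass, i.e. when the ship's average density is
# below the density of the surrounding water.

def floats(ship_mass_kg, hull_volume_m3, water_density_kg_m3):
    """True if the hull can displace more mass of water than the ship weighs."""
    max_displaced_mass = water_density_kg_m3 * hull_volume_m3
    return max_displaced_mass > ship_mass_kg

ship_mass = 5.0e6       # a 5,000-tonne ship (assumed)
hull_volume = 10_000.0  # cubic metres of displaceable hull volume (assumed)

seawater = 1025.0       # kg/m^3, ordinary seawater
aerated = 400.0         # kg/m^3, water heavily laced with gas bubbles (assumed)

print(floats(ship_mass, hull_volume, seawater))  # True  -> the ship floats
print(floats(ship_mass, hull_volume, aerated))   # False -> the ship sinks
```

The point of the sketch is only that the same rule gives opposite answers under different assumptions about the water, which is exactly the kind of boundary condition the paragraph above asks about.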

If black-and-white scientific principles can leave room for doubt, then everything else that has fuzzy boundaries and interpretations will be that much more uncertain.

How do we draw the line and decide that we are confident enough to say we know, and file the information away as knowledge?

When should we have enough doubt to bring it back out to reexamine that particular belief against more "facts"?

""Theory-induced blindness" is a term used to describe the tendency that once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.  If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing.  You give the theory the benefit of the doubt, trusting the community of experts who have accepted it."  (from Thinking Fast and Slow,  Daniel Kahneman)

I am sure the philosophers among us will enlighten us on the rich history of philosophical discussions on knowledge. Please also bring your own experiences and anecdotes to share about when the light bulb turned on for you and you "knew" something.

Looking forward to your ideas this Wednesday.

4 comments:

  1. An interesting 20 minutes of Kahneman. It suggests to me that when
    thinking about what I "know", it would be a good idea to check whether
    it is based on the relatively long period of "experiencing" or only on
    the remembered part, which is largely the most recent part of the
    experiencing.

    http://www.ted.com/talks/lang/en/daniel_kahneman_the_riddle_of_experience_vs_memory.html

    "Nobel laureate and founder of behavioral economics Daniel Kahneman
    reveals how our "experiencing selves" and our "remembering selves"
    perceive happiness differently. This new insight has profound
    implications for economics, public policy -- and our own
    self-awareness."
    Dan

    1. Hi Dan,

      Great video on TED, although I did not quite get his point about the thought experiment of what happens if we have no pictures and no memory of the vacation.

      From the TED talk page there is a link to Kahneman's site at Princeton, and a few more past lectures are linked on his web page. I am looking forward to watching those later on.

      Oliver....

  2. I guess Kahneman is a social psychologist (social cognition) or a political psychologist. The stock requirements for "knowledge" are:
    (1) You believe it to be true.
    (2) You have adequate evidence that it is true.
    (3) It actually is true (is there a real state of affairs?).
    (4) More controversially, there is no fairly well-known probability that it might be otherwise.

    The fourth condition has been described as follows: you parked your car downstairs in the underground garage. It is a certain colour, and you know it was that colour when you parked it. However, there are vandals repainting cars in the garage, and because of this you cannot be sure that it was not repainted another colour. (Courtesy of a professor from Douglas College, based on a paper in a journal.)

    What are all the theories of truth? Consensus theory, correspondence theory, pragmatic theory, and there must surely be others.

    As for theories, they can be evaluated on a number of grounds: validity (both internal and external), heuristic value (leads to further discovery), scope, parsimony (maybe both simplicity and beauty), maybe something about verifiability and falsifiability. I am not so sure about theories as a basis for knowledge.

    Anyway that's all for now. CL

  3. One other comment: truth theory may include a utility theory of truth -- if it is true, then it must be useful. Likewise, a good theory must be useful. Is this like rationally useful? But there is the problem that "rational control" of human nature can lead to perverse results. Humans need not always be rational, nor even essentially so. So do truth and theory need to be human / humane too? CL
