The Grice Club

The club for all those whose members have no (other) club.

Is Grice the greatest philosopher that ever lived?

Sunday, June 13, 2010

Reductive and Eliminationist Analyses of "C causes E"

by JLS
for the GC

DOCTOROW is referring to a Lebesgue-measure approach to causation -- which is VERY welcome. This from the Stanford entry (a) prompted me to look closely at Dupre (b). The general philosophical point seems to be the one made in (a): to analyse 'cause' we need ANOTHER 'concept' to do the job. The question then may arise as to whether Dupre is offering an eliminationist analysis (I don't think he is) or a merely 'reductive' one.

Notes.

(a) From

http://plato.stanford.edu/entries/causation-probabilistic/

"The original hope of Reichenbach and Suppes"

--- [Reichenbach is discussed at some length by Grice
in "Actions and Events", Pacific Philosophical Quarterly,
vol. 67, but basically via Davidson's interpretation --
and in connection with the "Reichenbach" 'fact' operator;
Suppes, who had worked with Davidson from as early as 1957
on subjective-probability accounts of rational
choice theory, contributed to the Grice festschrift
ed. by Grandy/Warner]

"was to provide a reduction of causation to probabilities. To
what extent has this hope been realized within the causal modeling
framework? Causal modeling does not offer a reduction in
the traditional philosophical sense; that is, it does not offer
an analysis of the form"

‘X causes Y if and only if…’

"where the right hand side of the bi-conditional makes no reference to causation. Instead, it offers a series of postulates about how causal structure constrains the values of probabilities."

(b) From

http://plato.stanford.edu/entries/causation-probabilistic/supplement4.html

"C causes E if and only if ΣB P(E | C & B) × P(B) > ΣB P(E | ~C & B) × P(B)"

20 comments:

  1. Thank you, JLS for the JC!

    I think that Grice's view of Reasoning as constructing steps rather than "making previous premises explicit" fits in well with our reasoning regarding Measure, including volume, area, and length. G. W. Mackey in his 1957 through 1963 papers introduced "questions" and "answers" into Quantum Logic's axiomatic approach, which were later developed by Jauch of Switzerland into "yes-no" experiments or "yes-no" propositions that opened up (with C. Piron) a whole new era of Quantum Logic by the early 1970s. A good reference (very important!) is:

    1) Jammer, Max (1974): "The Philosophy of Quantum Mechanics," Wiley: N.Y., London.

    Quantum Logic declined later because (in my opinion) it followed Heisenberg's Uncertainty Principle too much and failed to focus on the logical conditional, except for a few people like Peter Mittelstaedt of U. Koln (Cologne), Germany, but it was revived in the 1990s, although it is still in something of a quagmire (and "There but for the grace of God go we all.").

    Osher Doctorow

    ReplyDelete
  2. Most philosophical writing on probability tends to be an abstracted, "what if" construct, as if science and research (like hypothesis testing) could be done "a priori" without reference to any empirical phenomena or data whatsoever.

    In laboratory contexts, at a molecular or atomic level, the "quantum" logic (usually Kolmogorovian) might apply at times--yet the formalization of probability cannot even begin to account for the natural world. Even basic chemistry involves many different types of reactions, some very predictable (say salt dissolving in H2O), but many not (there are many unstable compounds, reactive elements like potassium, and random processes such as the decay of various isotopes).


    Considering many macro- or complex phenomena--say weather, earthquakes, solar activity--probability does not offer any complete account of science (or a Laplacian one if you will--see BRICMONT on this issue), nor does it usually apply to the human world (say, the stock market, politics, the war-o-meter, or a pennant race), however hard a Reichenbach or Kolmogorov (really the founder of "quantum" probability) or Suppes tried. Suppes knows as little about what led to, say, Katrina as you or I...hold on to yr hat, Doc!


    Diogenes J.

    ReplyDelete
  3. "Cause" is just a word, so, even withing the confines of customary usage, its precise meaning depends on its user's intention. I know why a negligence jury needs to determine causation, so I can frame an instruction to that jury for assessing causation for that purpose. But what is the philosopher's purpose? What fork in what decision tree turns on how causation is determined, and why would other paths resulting from other criteria be invalid?

    From the point of view of abstraction for abstraction's sake, I can feel (rather than defend) an analogy between causation and information:

    Causation is to probability as information is to possibility.

    Since I don't understand the symbolic proposition at the end of the main post, I may be saying something consistent with that proposition or I may not. But, if my analogy is correct, is it useful? I mean, what's the question?

    ReplyDelete
  4. ... the symbolic proposition at the end of the main post

    It's basically saying C causes E if the probability of E given C (er, averaged over the set of all background conditions B!) IS greater than the probability of E given ~C (likewise averaged over the B's!), assuming all variables can actually be known and/or defined.

    Ergo, it's a sophisticated form of begging the question (j-k JLster). Even Fatty Hume, however ham-fisted, would say, well, where does the causa proxima...C, say, assuming said entity can even be defined...actually...begin?? (initial conditions, as I think the dweebs say). Was it...Twinkies for Breakfast when Mr White offed Mr Milk, or....his mama beating him when he was leetle, or perhaps he was shell shocked from Nam..? (and human intention complicates things greatly). Prob. equations look good, and actually might work with dice or sodium atoms (or nuke weapons testing), but are pretty much meaningless as applied to human affairs (or earthquakes and tectonic plates for that matter...)

    J., temporarily channelin' ghost of Paulie Feyerabend

    ReplyDelete
  5. Diogenes J., thank you for expressing the views of many philosophers and scientists, engineers and medical people, with whom I disagree. It is a good opportunity for me to argue against your and their views on each issue.

    I have perhaps one advantage - I formerly believed in your type of views when I was young, and I changed later, so I have some idea how "both sides" think.

    The "naive view of probability", as I call the general opposition to it among both many students and researchers, is roughly similar to Einstein's view that roughly maintains that probability is the "unknown" and that we will ultimately KNOW and thereby overcome it.

    Some of the "Naivists" (as I will call them for brevity) think that Statistics does not suffer from these problems, or that if it does then it will also ultimately be overcome. Their reasoning is that Statistics simply inputs raw data from the real world and so is like science in adding real world evidence.

    The counter-argument against that view is quite remarkable.

    1) The main mathematical tool used by Statistics is Probability. There is no Statistics without Probability.

    2) There is a small thin book called "How to Lie with Statistics" from the 1970s or 1960s, whose author I have forgotten but which I will attempt to locate, which illustrates or dissects the fallacies of (1). For example, political polls which leave out probabilistic information (almost absurdly referred to as "margin of error" when they are really referring to probabilities) are almost meaningless. If I tell you that 2/3 of my sample is for or against someone or something, that alone is almost meaningless information. It would be like Sir Isaac Newton reporting that 2/3 of the time, gravitation follows his laws in a certain sample - what does that mean for you, me, or the falling object for example? If I tell you only that my sample consists of 200 people, what does that alone mean? If my population is 201 people, then wonderful! I have an almost "complete population sample". But if my population is 6 billion people, should I be as sure of the relevance of 200 people? I have left out Probability in these cases, which takes into consideration the population as well as the sample, the randomness of the sample, the size of the sample, the Probability of obtaining a sample result of the type obtained or larger or smaller sample results, the hypothesis being proposed or alternative hypothesis being rejected, and so on.
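
    A rough sketch of the kind of probabilistic information the bare "2/3 of a sample of 200" figure leaves out -- a 95% margin of error computed with the standard normal approximation; the numbers are just the ones from the example above:

        import math

        # 95% margin of error for a sample proportion (normal approximation).
        p_hat = 2 / 3      # observed sample proportion
        z = 1.96           # z-value for ~95% confidence

        for n in (200, 20):
            margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
            print(f"n = {n}: 2/3 +/- {margin:.3f}")   # ~0.065 for n=200, ~0.207 for n=20

    The same observed fraction carries a much wider interval from the smaller sample, which is exactly the information a headline percentage omits.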

    While Sir Isaac Newton was fortunate in studying gravity which near the Earth holds EXACTLY in experiments no matter how many experiments you make (which by the way is the same as Probability 1 on a scale of 0 to 1 inclusive, with Probability measures by decimals or fractions or whatever), there are relatively few laws of this type in the universe. Such laws are ESPECIALLY mostly lacking in the weather, earthquakes, solar activities, human world or life which you mentioned as examples. There, Probability becomes of considerable importance in many scenarios.

    Osher Doctorow

    ReplyDelete
  6. It is also relevant to mention Probability in Quantum Mechanics (QM). QM is IMPOSSIBLE with Probability. This was discovered by Max Born in coordination with Werner Heisenberg, who with Erwin Schrodinger founded QM, although Einstein and Planck and De Broglie helped pave the way at more elementary levels.

    To make a long story short, the main variable in QM, which I will write as w (the Schrodinger wave function), turns out to be mathematically directly related to a Probability as follows:

    1) ww* = P(finding a particle in a particular volume of space), where w* involves w with addition replaced by subtraction, and ww* is w multiplied by w*.

    Those familiar with complex numbers or complex variables can notice that w = a + ib, w* = a - ib, but this isn't necessary to get the rough idea here.
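
    A quick numerical illustration of ww* for a made-up amplitude (the particular a and b are arbitrary); the point is only that the product with the conjugate is a real, non-negative number, which is what lets it play the role of a probability once normalized:

        # Hypothetical amplitude w = a + ib and its conjugate w* = a - ib.
        w = complex(0.6, 0.8)
        w_conj = w.conjugate()

        p = (w * w_conj).real          # ww* = a^2 + b^2, always real and >= 0
        print(p)                       # 1.0 (up to floating point) for this a, b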

    By the way, in my previous post I referred to "the fallacy of (1)". I meant "the fallacy of the Naivists", not of (1), which is accurate.

    Osher Doctorow

    ReplyDelete
  7. I meant to type in my first sentence of the previous post "impossible without Probability", not "impossible with Probability".

    Osher Doctorow

    ReplyDelete
  8. Before I try to reply to Lawrence J. Kramer, I should mention another misunderstanding about Probability.

    Many people believe that "The Probability of event A" is just, roughly speaking, the fraction of cases in which A occurs. So the Probability of obtaining a 10 in a deck of 52 cards (with random shuffling, etc.) is 4/52, because there are 4 10s in a deck (4 cards with the number 10 on them). This reduces to 1/13. Although the example is correct, the idea of Probability as roughly the fraction of cases where something occurs is somewhat less valuable than regarding Probability as a VOLUME in many cases. For example, in this example, we know that card decks sold by particular companies have a certain volume, let us say V (a particular number in terms of cubic inches, for example). Now calculate the volume of the 4 cards which have the number 10 on them. It will turn out that this volume is one-thirteenth of the total volume V! So the Probability of obtaining a 10, which is 1/13 from the above discussion, is also the volume of the four 10-cards (stacked on top of each other in a pile) relative to the total volume V.

    If readers can keep their focus on the VOLUME viewpoint which is geometric in origin, rather than the idea of fractions of cases in a random sample, then they will arguably come closer to the REAL nature of Probability in the REAL world. We do not "rebel" against the notion of a physical object having a volume, and it is equally mistaken to rebel against the notion of a physical object having a Probability.
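
    A minimal sketch of the card example in the volume idiom; the per-card dimensions are invented, and only the ratio matters:

        # "Probability as volume": P(drawing a 10) as a ratio of volumes.
        card_volume = 3.5 * 2.5 * 0.012         # hypothetical card dimensions, cubic inches
        deck_volume = 52 * card_volume          # the deck's total volume V
        tens_volume = 4 * card_volume           # the four 10-cards stacked in a pile

        print(tens_volume / deck_volume)        # 0.0769... = 1/13
        print(4 / 52)                           # the counting answer, same number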

    Osher Doctorow

    ReplyDelete
  9. While Sir Isaac Newton was fortunate in studying gravity which near the Earth holds EXACTLY in experiments no matter how many experiments you make (which by the way is the same as Probability 1 on a scale of 0 to 1 inclusive, with Probability measures by decimals or fractions or whatever), there are relatively few laws of this type in the universe. Such laws are ESPECIALLY mostly lacking in the weather, earthquakes, solar activities, human world or life which you mentioned as examples. There, Probability becomes of considerable importance in many scenarios.

    Yes I agree--were that not the case, humans would probably not operate aircraft--and that holds for supposed quantum "anomalies" as well--if they were really a serious issue for technology--or say, metallurgy--it would not be prudent to allow planes, trains, rockets, bridges, etc. So they're not...the uncertainty principle becomes negligible above atomic scales.

    But few natural events are as predictable and orderly as gravity--quakes, canes, global warming, weather, diseases mostly defy human attempts at understanding, even when researchers use very high-powered stats. programs. NCAR with millions of dollars worth of gear could not forecast Katrina in any precise fashion. Same holds for USGS---otherwise, they'd be held guilty for not accurately predicting killer quakes. Sort of obvious examples, but those disasters say something about "official" science institutions. I'm not anti-science, and in a sense hold to deterministic views (Einstein himself did not completely reject classical mechanics), but I'm against bad or grandiose applications of science. When the academic science establishment insists great progress has been made across the board usually that means the Defense industry has new death-machines. Feyerabend was on to their games.

    ReplyDelete
  10. J, well said! In fact, "progress" is mostly in terms of speed rather than depth, so to speak. We can speak faster to distant people, but usually not any more wisely.

    I also should point out a "disclaimer", namely that I have been roughly translating Probability into simple English. There are "Probability Spaces" where our notion of volumes in Euclidean spaces does not apply. I have been referring to Probabilities on Euclidean Space or Probabilistic analogs of Euclidean space in order to relate things more to Intuition. Also, volumes have to be "normalized" or "standardized" so that Probabilities stay between 0 and 1, such as by dividing all volumes by a super-gigantic volume and not considering anything larger. There are also complications on certain discrete Probability spaces.

    Osher Doctorow

    ReplyDelete
  11. Good. Just a reminder. For Dupre:

    "C causes E" is true iff

    "ΣB P(E | C & B) × P(B) > ΣB P(E | ~C & B) × P(B)"

    --- which, as J. remarked, possibly 'begs' (or asks, as I prefer) for some kind of principle (rather than a question). I find begging hardly respectable, but petitioning a principle sounds sort of OK. So we do need to analyse the analysans in some more detail -- or not.

    I like Kramer's point about the use of 'cause' by philosophers. I think this was one of Grice's examples in "Philosopher's Paradoxes" ("Charles I's decapitation willed his death"). It may also relate to Russell's animosity against animism (he thought 'cause' belonged to stone-age metaphysics).

    Kramer's point is particularly apt. Reviewing the literature, there is indeed some evidence for a sort of localism that goes against the spirit of Grice (wherever THAT is -- he was cremated): there's cause in Newtonian mechanics, cause in biology, cause in psychology, and cause in law (Hart/Honore usually cited). The idea that there is "cause" simpliciter is thus challenged. Grice would disagree on moral grounds: "Philosophy, like virtue, is entire"! --

    I will work on Doctorow's idea of thinking in terms of 'measure'. Yes. I understand the variables at play here: volume, length, etc. --.

    There is also the problem of singular versus general causation, which may relate. P(cancer/smoking) seems to be an, er, sort of 'general' statement. We are not saying that HIS smoking causes HIS emphysema (we know who -- Grice died of emphysema -- and not, as some online sources have it, of 'non-natural causes'). In fact, Lady Susan Walton died recently, and I was offended by the obituary:

    "She died of natural causes". Why not just state something MORE obvious! If anything, an obituarist's obigation is to provide a causal probabilistic account of why people die, or stuff.

    ReplyDelete
  12. There is also the problem of singular versus general causation, which may relate. P(cancer/smoking) seems to be an, er, sort of 'general' statement. We are not saying that HIS smoking causes HIS emphysema...

    In that case, I would say the inference of causation smoking --> emphysema IS warranted, at least as a sufficient cause (with --> as "results in", or "more often than not results in" [to please Dupre..let's see how JL..or Dupre would use that with his sigma and CP formula], not just semantic "implies").

    A normal study of smoking and cancer would, however, probably make use of frequentism (however detested by many new probability types)--the researchers look at the "normal distribution," examine/tabulate data, compare smokers to non-smokers, etc. And a percentage might apply (with a margin of error, etc.). But that has little to do with logic per se, and people were doing that pre-Bohr (or pre-Kolmogorov). Insurance companies still rely on frequentism (or a posteriori methods, as it were).
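
    A toy version of that frequentist comparison, with invented counts purely for illustration (not real epidemiological data):

        # Compare observed emphysema rates between two hypothetical groups.
        smokers     = {"cases": 180, "n": 1000}
        non_smokers = {"cases": 30,  "n": 1000}

        rate_s  = smokers["cases"] / smokers["n"]
        rate_ns = non_smokers["cases"] / non_smokers["n"]

        print(rate_s, rate_ns)             # 0.18 vs 0.03
        print(rate_s / rate_ns)            # relative risk: about 6x
        # This is the raw P(E|C) vs P(E|~C) comparison, before any
        # partition into background contexts B a la Dupre.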

    ReplyDelete
  13. Good. I should learn more about the emphysema since it killed him. Grice I mean. He was 75. I would go counterfactually, and say that the cause was his mother. Apparently, she said that he looked 'very sophisticated' holding those cigarettes. He picked up the habit early in his life and gave it up only too late (a box of cigarettes was untouched at the time of his death).

    So we would have:

    P(Grice's DEATH from emphysema/Grice's smoking)

    I suppose I should be interested in various other famous cases to build a frequency, or stuff?

    ReplyDelete
  14. As I said, I cannot defend the analogy I offered between causation and information, and I am now unoffering it. On reflection, but still without any real authority, it seems to me that information is a form of causation. Causation reduces entropy; information reduces a certain kind of entropy.

    From this perspective, it simply makes no sense to inquire, formally, whether C "causes" E. We can ask to what extent C reduces the probability that E will not happen, and if the answer is "to zero," we can say, in natural language, that C causes E. (And we can say it in the past tense, too, I suppose.)

    How do you know that the Prince of Wales has two sons named Harry and William?

    My son told me that he has a son named William, and my wife told me that he has two sons, and one son's name is Harry.


    Leaving aside quibbles over "know," it's clear that my son and wife each supplied information. But can we say that each "caused" me to know that the Prince had two sons named William and Harry? The question doesn't compute. We can ask if each caused me to know more, i.e., to need less information to know what I wished to know. We can even say that my wife "caused" me to know the answer to my question by removing all open matters, but that hardly seems fair to my son, except, of course, that it doesn't matter, as no emoluments, privileges or immunities attach to the honor.

    Anyway, this is the idea of which I would like to be disabused next.

    ReplyDelete
  15. I think it is a good idea. I believe Dretske, Stampe, and what Floridi calls the "classical theory of semantic information" see it along those lines.

    I will meditate on the point about "informing" and "causing to know".

    There are quibbles I would have as to the information being "true". I.e. REAL information, rather than "information". If your wife tells you that there's life on Jupiter and your son tells you that life on Jupiter is supported by the heat provided by the sun, they both have "informed" you, but I'm not sure I would like to consider them the "causes" of your "knowledge".

    In general, if p, only p can cause the "state" of what we call "knowledge" that p. I would not think "sources of information", they being contingent, matter.

    According to some theorists (my aunt), if you don't know it, it doesn't exist. Or stuff.

    ReplyDelete
  16. There are quibbles I would have as to the information being "true".

    As in "Leaving aside quibbles over 'know,' it's clear that my son and wife each supplied information."?

    I was going to use "believe" instead of "know," but it seemed overly cautious for the context. Information theory is not concerned with truth - it's the entropy in my beliefs that is being reduced.

    ReplyDelete
  17. Well, I don't think it was being overly cautious, but will come back to your point, I hope.

    ReplyDelete
  18. Perhaps 'content' is a good term. When I was writing my PhD -- a section of which IS on informativeness -- I was onto bytes, byte by byte as it were.

    We do speak of 'informational CONTENT' or 'informative content' and so on. So 'content' seems to be a good term. It also relates to the term as used by 'philosophical psychologists' to refer to the 'that'-clause.

    So, in Kramer's scenario, the content of his belief,

    "Prince Chas has a son called Henry and a son called William" was 'provoked' by having been 'told' (if you are not happy with 'informed') about that content by, respectively, his wife and his son.

    Each agent had a role -- a causal role, or co-causal role -- in the production of the total 'content', and Kramer was wondering if it was okay to 'reduce' the conjunction of the causal roles. I guess I like the scenario.




    The definition of the information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities p_i:

    H = - Σ_i p_i log2(p_i)

    In the case of transmitted messages, these probabilities were the probabilities that a particular message was actually transmitted, and the entropy of the message system was a measure of how much information was in the message. For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message.
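
    A minimal sketch of that definition, assuming nothing beyond the formula quoted above:

        import math

        # Shannon entropy H = -sum(p_i * log2(p_i)), in bits.
        def shannon_entropy(probs):
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # Eight equally probable messages: exactly 3 bits, i.e. three
        # yes/no questions pin the message down.
        print(shannon_entropy([1/8] * 8))                      # 3.0

        # A skewed distribution over the same eight messages carries
        # far less entropy than log2(8):
        print(shannon_entropy([0.7, 0.1, 0.1, 0.05, 0.03, 0.01, 0.005, 0.005]))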

    ReplyDelete
  19. the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message.

    Isn't it the sum of the weighted number of such questions, so that trivial binary questions add very little entropy?
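
    A quick numerical check of that weighting, using the standard binary-entropy function (not anything in the wiki passage itself):

        import math

        # Entropy, in bits, of a single yes/no question with P(yes) = p.
        def binary_entropy(p):
            return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

        print(binary_entropy(0.5))    # 1.0 bit  -- a genuinely open question
        print(binary_entropy(0.99))   # ~0.08 bits -- an almost-foregone conclusion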

    And doesn't that then tie back to causation in general as that which reduces entropy? If I shoot at a target, and my aim is true, a host of binary questions remain about the functioning of the weapon. But the probability of the target being struck is very high because the probability of the binary gun-operation events not happening is very low, so, in common parlance, I can be said to have caused the target to be hit.

    Likewise in communications. We may leave to implicature low-probability binary questions, the ones to which the answer is so obvious as not to need stating. Aren't important words usually longer than unimportant ones so that if they are garbled, there is less likelihood that the message will be corrupted?

    ReplyDelete
  20. Good. For the record, I must say that the last two passages in my commentary were taken directly from wiki -- "entropy (communication)" -- and I was mentioning them to check their use of "content", which seemed to me a good term that is 'neutral' as regards its being defined in 'bytes' (or bits) without its being 'true' -- but cfr. 'informational' or 'informative' content.

    Yes, I suppose a lot of yes/no questions would be so trivial that they have to be relegated to the level of the implicature -- never to be explicited, we hope!

    I was also thinking of Stephen Neale (a Gricean) and his paper on "Content and Grain". I would think that if you get to believe that p and q via your son and your wife, one has to be careful as to whether they are 'informing' you about the SAME things. There is fine-grained content, and then there is the other kind.

    You would need to trust that when your wife uses "offspring of Prince Chas" she means the same as what your son does. Only THEN can you combo the two sources of 'info' into a more complex 'content' that merges both bits or bytes, etc. But I'll re-read what you say.

    ReplyDelete