[Editor’s Note: Here’s a submission from Derick, guest from our Saussure episode.]
“Plurality should not be posited without necessity.” -Duns Scotus
“Entities must not be multiplied beyond necessity.” -William of Ockham
Here’s a philosophical thesis that should be obvious but apparently isn’t: Ockham’s Razor is not an ontological rule, nor even a necessary rule of logic. It is a heuristic that is generally useful. But complexity in one category can mask simplicity in another, and there is actually no empirical reason for us to favor elegance, which is, after all, an aesthetic preference. It is therefore impossible to ascribe the razor to reality as a principle or a law the way one would the law of non-contradiction (itself actually logically questionable) or the fundamental theories of matter. Or, as Seth Paskin put it in a recent episode on Quine, which inspired this line of reflection: why should we care?
Few scientists would deny that Ockham’s presumption of parsimony is only a heuristic, and they would also admit that “simplest” here is defined only in relation to explanatory power itself. It entails no obvious ontological commitments and may not even entail a specific approach to epistemology. This is more obvious in the vulgar (and incorrect) paraphrase of the razor as “other things being equal, a simpler explanation is better than a more complex one.” This is easy enough to see through: simplicity here is ill-defined. Epistemic simplicity is not verbal simplicity is not simplicity in ontology is not physical simplicity. The maxims kicking off this post, however, don’t include such conflations of category, so this misreading of the razor is due to bad paraphrasing and an aesthetic preference.
Formalizing the razor doctrine is problematic: “necessity” in Ockham’s maxim implies that it is already known which entities are necessary to an explanation. In short, there is a circularity to the razor’s edge. When expanded in a way that makes its formalization less circular and more concrete, we get Ray Solomonoff’s theory of inductive inference. While the mathematics is above my skill set, I can say that it rests on the theory of computation (universal Turing machines), and that it is predictive of languages and not just integers. The breakdown, however, still occurs when one throws Hume’s monkey wrench into the mix: the infamous problem of induction still applies. I cannot justify any empirical standard on logically sound grounds, only on probabilistic ones, and those rest on something like a gambler’s fallacy. The chances of things not being the way I see them may be small, but they exist and cannot be adequately accounted for. So how can I know what is necessary, in order to avoid adding unnecessary entities?
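For the curious, the core of Solomonoff’s construction can be stated compactly (this is the standard textbook formulation, not something drawn from this post): the prior probability of a string of observations is a weighted sum over all programs that could generate it, with shorter programs weighted exponentially more heavily.

```latex
M(x) \;=\; \sum_{p \,:\, U(p)\text{ begins with } x} 2^{-\ell(p)}
```

Here \(U\) is a universal prefix Turing machine, \(p\) ranges over programs whose output begins with the observed string \(x\), and \(\ell(p)\) is the length of \(p\) in bits. The factor \(2^{-\ell(p)}\) is a formal razor: simpler (shorter) hypotheses receive exponentially greater prior weight. Hume’s problem survives intact, though: the choice of \(U\) is conventional, and the prior itself is uncomputable.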
Other attempts to answer this problem have been equally problematic. Karl Popper asserts that it is neither simplicity nor empiricism that justifies the razor, but rather that simpler theories are preferable “because their empirical content is greater; and because they are better testable” (from Popper’s The Logic of Scientific Discovery, 1992 edition, Chapter 7). “Their” here refers to hypotheses created with Ockham in mind. This answer from Popper nods to the fact that empiricism or simplicity alone isn’t justified without completion or some unacknowledged aesthetic criterion, but then says that the razor makes the falsification criterion easier to apply. Is this dependence on falsification necessary for the validity of the razor as a heuristic? Hardly, as anyone who has tried to prove the validity of a statistical inference without recourse to another methodological heuristic (such as the Bayesian information criterion) can tell you. Mathematically, a statistical inference is never completely falsifiable, as there is always a chance that you haven’t seen the last instance of possible disconfirmation.
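To make the point about methodological heuristics concrete, here is a minimal sketch (in Python, with toy data invented for illustration) of how the Bayesian information criterion mentioned above trades goodness of fit against the number of posited parameters. Note that nothing here falsifies either model outright; the criterion only scores them:

```python
import math

def bic(rss, n, k):
    """BIC = k*ln(n) - 2*ln(L_hat) for a Gaussian model,
    where ln(L_hat) = -n/2 * (ln(2*pi*rss/n) + 1)."""
    log_lik = -0.5 * n * (math.log(2 * math.pi * rss / n) + 1)
    return k * math.log(n) - 2 * log_lik

# Toy data: roughly constant with small noise (values are made up).
y = [2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9]
x = list(range(len(y)))
n = len(y)

# Model A: constant mean (k = 2 parameters: mean and variance).
mean = sum(y) / n
rss_a = sum((v - mean) ** 2 for v in y)

# Model B: straight line by least squares (k = 3: slope, intercept, variance).
xbar = sum(x) / n
slope = (sum((xi - xbar) * (yi - mean) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = mean - slope * xbar
rss_b = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))

bic_a, bic_b = bic(rss_a, n, 2), bic(rss_b, n, 3)
print(bic_a < bic_b)  # lower BIC wins; here the constant model does
```

The extra slope parameter of the line buys a slightly better fit, but the complexity penalty outweighs it on data that is really just noise around a constant. That is the razor functioning as a scoring rule, not as a law.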
Furthermore, leaving the realm of physics where a criterion of elegance is more generally applied, one can look at pragmatic uses of the razor in medicine. We can see the preference for the rules of parsimony being trumped by complex systems: Hickam’s Dictum makes it clear, “Patients can have as many diseases as they damn well please.” Now, this seems like a hypochondriac’s missive, but what is at stake here is that sometimes statistically it is more probable for several simple or common diseases to produce a complicated condition than one super-rare one. So we see the circular nature of positing entities: we have to take into account the domain when figuring out what is necessary, and there are few simple ways to decide which domain of parsimony is needed.
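The statistical point behind Hickam’s Dictum can be shown with back-of-the-envelope arithmetic. The prevalence figures below are invented purely for illustration, not real epidemiology:

```python
# Hypothetical prevalences, invented for illustration only.
p_common_a = 0.05   # common disease A
p_common_b = 0.04   # common disease B
p_rare = 0.001      # one rare disease explaining the same presentation

# Naively assuming A and B occur independently:
p_both_common = p_common_a * p_common_b  # 0.002

print(p_both_common > p_rare)  # True: two common diseases together
                               # beat the single rare one
```

Of course, whether the independence assumption holds is itself a domain-specific judgment, which is exactly the circularity the post describes: deciding what counts as parsimonious already requires knowing the domain.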
When you try to apply Ockham’s Razor to the doctrine of the razor itself, it seems to shave a bit too close for any simple guideline to emerge; and yet Ockham’s Razor has emerged nonetheless and been a useful heuristic. So this leaves us with a foundation that is hard to shore up with regard to the usefulness of the razor, yet we know it obviously is useful, as the simplification of entities involved in an explanation does tend to yield clearer results… at least so far.
How close this razor will shave is yet to be determined.
-C. Derick Varn
The reasoning you use to question the utility of Ockham’s razor, “there is actually no empirical reason for us to favor elegance” is contradicted by your conclusion, “…we know it obviously is useful, as the simplification of entities involved in an explanation does tend to yield clearer results… at least so far.” In fact, the utility of the Ockham’s razor idea is empirically demonstrated time and again, with the most striking examples coming from fundamental disciplines like physics and mathematics; in these cases, it is apparent that the most universal phenomena, those with the furthest reach and of the most consequence, are governed by profoundly elegant rules. That such enormous complexity can arise out of such frugal simplicity gives us direct empirical evidence that we should expect to find logical elegance at the basis for most any phenomenon, as Ockham’s razor suggests, and suspect that any convoluted theoretical stand-in we have reached is only temporary, awaiting a deeper insight that will reveal a more elementary order. The razor would suggest that next up for this kind of cutting revolution in understanding are the standard model and our interpretation of quantum mechanical systems.
Essentially, Ockham’s razor suggests that if there is any way to put a complex theory in simpler terms and still have the theory explain the same resultant set of consequences, then that complex theory reduces to the simpler; the simpler explanation cuts closer to the true explanation, because it leaves out any unrelated or unnecessary logical forms (and the actual phenomena described by the theory certainly do not embody the logical complications added by the inelegant theory). The simpler theory is always closer to the true explanation, because the phenomena the theory describes are embodied by the true reasons those phenomena exist (namely, the true explanation which the theory tries to grasp).
Really, what would it mean if the explanation for, or the logical underpinnings to any phenomenon were more complicated than the phenomenon itself, as would surely be the case if Ockham’s razor were an incorrect axiom? Would this not suggest that the logical underpinnings to that over-complex explanation should be more complex than the explanation itself, as well? If this were the case, why should we find the exact opposite, that every complex phenomenon arises from a set of simpler phenomena and principles, which in turn (if they are not of fundamental simplicity) arise from sets of yet simpler phenomena and principles? You can take for an example literally any phenomenon described by science or mathematics. And still you would suggest that Ockham’s razor is without any empirical evidence in its favor?
It would mean that the phenomenon should not be erroneously logically reduced to the being of its own explanation. We find that complex systems are made up of simple parts because that is the way we happen to think about the world, not because there is some preordained order that has decided both how the world should elegantly be and that we will experience it identically with the way that it is. Whether it is this way will never be decided by further applications of Ockham’s Razor, since any such application will either be an accurate use, resulting from a world that already always was this way in itself regardless of our measurements, or else an error, resulting from a world that is not. But for instance when it comes to general relativity and quantum mechanics, our finite surface-of-the-earth skeptical intuitions have only been incredibly misleading.
Derick, the way you are able to circle around the historical uses of this logic in a critically negative manner suggests to me that it has not always been Ockham’s Razor that works to cut away at unnecessary explanations, but rather some other form of dialectic within which this kind of apparent aesthetic pragmatism could otherwise be logically (paradoxically?) subsumed.
“It would mean that the phenomenon should not be erroneously logically reduced to the being of its own explanation.”
I don’t see how this could be erroneous in any case; what is the “being of its own explanation” if not the actual reason for the phenomenon itself? What instantiates a phenomenon if not the full set of reasons for why it is instantiated?
“We find that complex systems are made up of simple parts because that is the way we happen to think about the world…”
I’d say this is perfectly backwards: we think about the world in this way because we empirically find that complex systems are made up of simple parts. Empirical evidence aside, it seems to me an impossible logical inversion to try to figure out some way that simpler ideas could be made up of more complex ideas (you might protest that this is just the way that we happen to reason, but I think there is a pretty decent case to be made for the idea that logical truths are discoverable a priori, and that one could discover the Pythagorean theorem, for instance, without ever taking out a ruler and measuring a right triangle).
“But for instance when it comes to general relativity and quantum mechanics, our finite surface-of-the-earth skeptical intuitions have only been incredibly misleading.”
In the case of general relativity, it would be better to say “had been incredibly misleading, before Einstein noticed paradoxes given the known laws of electromagnetism in the context of the relativity principle, and discovered the phenomenon by which the Universe eliminates those paradoxes (spacetime stretching), along with discovering the elegant mechanism underlying gravitation.” Quantum mechanics is due for a similar breakthrough, whether it comes in the next century or whether such knowledge is truly beyond human reach.
Well, two wildly different things are being conflated here: there is the reason for some phenomenon occurring, in the formal sense that every physical thing must have some cause, and there is also the reason why we have an explanation for there being such a phenomenon (say, having gone about performing some scientific experiment). Whether or not we have an explanation for some phenomenon occurring has potentially nothing to do with its actual occurrence. What instantiates a phenomenon then is always something other than our explanation for its appearance, although the latter should obviously follow immediately from the former were Ockham’s Razor somehow revealed to be true a priori.
Did not Aristotle, the father of practical intuition, remark that the whole is greater than the sum of its parts? You might have a better understanding of general relativity than I do (I would love to see this alleged intuitive explanation of its theory), but it seems this kind of excess is inherent to our findings of quantum-mechanical phenomena, and not just to our personally diminished capacity for measurement.
This supposes a teleology to evolution that does not occur. Were the world a complex system made up of simple parts, and were it also advantageous to think about the world in this way, we could have the capacity to think as such; and yet, like the vast majority of animal species, we might not. And moreover, the capacity for thought itself could then very well be an expression of the same divisive incision into the world that prevents it from being logically unified in any way at all.
What about love? We long for it to be so simple and we even have this sweet little idea for it called “love” which everybody can understand and yet it couldn’t be more difficult to really follow its consequences all the way through. Or in the case of science, it seems that it all should just coincide with some logical state of affairs and follow exactly from it, and yet we cannot ever seem to set aside this grueling labor of back-and-forth discourse that should always be unnecessary given some ideal application of Ockham’s Razor.
“What instantiates a phenomenon then is always something other than our explanation for its appearance”
Yes, of course; I never proposed that our explanation for a phenomenon is the reason for its appearance. The phenomena we theorize about clearly exist independently of our theorizing, and our theorizing always seeks to cut closer to the true explanation.
“You might have a better understanding of general relativity than I do (I would love to see this alleged intuitive explanation of its theory)”
I’m glad you asked- take a look at these two chapters on general relativity (chapters 1 and 2). I think you’ll find the discussion quite interesting: http://www.scribd.com/doc/87848420/The-Fates-Unwind-Infinity#outer_page_214
“What about love? We long for it to be so simple and we even have this sweet little idea for it called “love” which everybody can understand and yet it couldn’t be more difficult to really follow its consequences all the way through”
We have the simple word love, and its definition, but of course, the word and definition are not the actual phenomenon “love”. Like many nouns, “love” is a simple label which references a complex phenomenon, in this case perhaps one of the most complex phenomena we know of. We have no comprehensive explanation for love (mostly because we have no comprehensive explanation for consciousness), though neuroscience reveals that the state corresponds to the systematic release of certain neurotransmitters in certain brain regions; clearly love arises as a global neural phenomenon out of simpler neural exchanges occurring simultaneously and in a complexly coordinated manner.
If Ockham’s razor implies that it is best to eliminate as much as possible of the irrelevant, the illogical, the non-communicative, then I think we get the best of both worlds including the fullness of expression, relevance and logic as well as the conciseness of expression. Somewhere between the chaos of naming everything and the strictness of only naming the formula emerges the process of complexity theory.