"Plurality should not be posited without necessity." -Duns Scotus
"Entities must not be multiplied beyond necessity." -William of Ockham
Here's a philosophical thesis that should be obvious but apparently isn't: Ockham's Razor is not an ontological rule, nor even a necessary rule of logic. It is a heuristic that is generally useful, but given that complexity in one category can mask simplicity in another, and that there is actually no empirical reason for us to favor elegance, which is, after all, an aesthetic preference, it is impossible to ascribe the razor to reality as a principle or a law the way one would the law of non-contradiction (itself logically questionable) or the fundamental theories of matter. Or, as Seth Paskin put it in a recent episode on Quine which inspired this line of reflection: why should we care?
Few scientists would deny that Ockham's presumption of parsimony is only a heuristic, and most would also admit that "simplest" here is defined only in relation to explanatory power itself. It entails no obvious ontological commitments and may not even entail a specific approach to epistemology. This is more obvious in the vulgar (and incorrect) paraphrase of the razor as "other things being equal, a simpler explanation is better than a more complex one." This is easy enough to see through: simplicity here is ill-defined. Epistemic simplicity is not verbal simplicity is not simplicity in ontology is not physical simplicity. The maxims kicking off this post, however, don't include such conflations of category, so this misreading of the razor is due to bad paraphrasing and an aesthetic preference.
Formalizing the razor doctrine is problematic: "necessity" in Ockham's maxim implies that we already know which entities are necessary to an explanation. In short, there is a circularity to the razor's edge. When the maxim is expanded in a way that makes its formalization less circular and more concrete, we get Ray Solomonoff's theory of inductive inference. While the mathematics is above my skill set, I can say that it rests on the theory of computation, and that it makes predictions about strings in a language, not just about integers. The breakdown, however, still occurs when one throws Hume's monkey wrench into the mix: the infamous problem of induction still applies. I cannot justify, on any logically sound basis other than probability itself, an empirical standard that wouldn't in turn rest on something like a gambler's fallacy. The chances of things not being the way I see them may be small, but they exist and cannot be adequately accounted for. So how can I know what is necessary, in order to avoid adding unnecessary entities?
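For readers who want to see the shape of that formalization, here is a minimal sketch of Solomonoff's universal prior (glossing over technicalities such as prefix-free machines): every hypothesis is construed as a program p for a universal Turing machine U, and the prior probability of observing a string x sums over all programs whose output begins with x, each weighted by its length in bits:

```latex
% Solomonoff's universal prior (sketch): sum over every program p that
% makes the universal machine U print a string beginning with x,
% each weighted by 2 to the minus its length |p| in bits.
M(x) = \sum_{p \,:\, U(p) = x\ast} 2^{-|p|}
```

Shorter programs, i.e., simpler hypotheses, dominate the sum, which is the razor rendered as arithmetic. Notably, M is not computable, so even the cleanest formalization of parsimony is an ideal limit rather than a usable method, which only sharpens the circularity worry above.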
Other attempts to answer this problem have been equally problematic. Karl Popper asserts that it is neither simplicity nor empiricism that justifies the razor, but the fact that simpler hypotheses are preferable "because their empirical content is greater; and because they are better testable" (The Logic of Scientific Discovery, 1992 edition, Chapter 7), "their" referring to hypotheses created with Ockham in mind. This answer nods to the fact that empiricism or simplicity alone aren't justified without supplementation or some unacknowledged aesthetic criterion, but then says that the razor makes the falsification criterion hold more easily. Is this dependence on falsification necessary for the validity of the razor as a heuristic? Hardly, as anyone who has tried to prove the validity of a statistical inference without recourse to another methodological heuristic (such as the Bayesian information criterion) can tell you. Mathematically, a statistical inference is never completely falsifiable, as there is always a chance that you haven't seen the last instance of possible disconfirmation.
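To make the statistical point concrete, consider the standard form of the Bayesian information criterion, where k is the number of free parameters, n the sample size, and L̂ the maximized likelihood of the model; lower scores are preferred:

```latex
% Bayesian information criterion: lower is better. Fit (the likelihood
% term) is traded against complexity (the k*ln(n) penalty).
\mathrm{BIC} = k \ln n - 2 \ln \hat{L}
```

Nothing here is falsified; the parsimony penalty k ln n is written in as an assumption of the criterion, not derived from testability. The razor shows up as a premise of the method, which is just the circularity again.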
Furthermore, leaving the realm of physics, where a criterion of elegance is more generally applied, one can look at pragmatic uses of the razor in medicine. There we see the preference for parsimony trumped by complex systems: Hickam's Dictum makes it clear that "patients can have as many diseases as they damn well please." Now, this sounds like a hypochondriac's missive, but what is at stake is that it is sometimes statistically more probable for several simple or common diseases to produce a complicated condition than for one super-rare disease to do so. So we see the circular nature of positing entities: we have to take the domain into account when figuring out what is necessary, and there are few simple ways to decide which domain of parsimony is needed.
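A toy calculation (with illustrative numbers, and assuming the conditions occur independently) shows how the arithmetic can favor Hickam. Suppose two common conditions each have a prevalence of 5%, while a single rare syndrome that would explain the same presentation has a prevalence of 1 in 100,000:

```latex
% Two common diseases co-occurring (assuming independence):
P(A \cap B) = P(A)\,P(B) = 0.05 \times 0.05 = 0.0025
% One rare disease explaining everything:
P(R) = 1/100{,}000 = 0.00001
```

On these numbers the "complex" explanation, two diseases at once, is 250 times more probable than the "simple" one. Parsimony over diagnoses and parsimony over probability point in opposite directions here.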
When you try to apply Ockham's Razor to the doctrine of the razor itself, it seems to shave a bit too close for any simple guideline to emerge; and yet Ockham's Razor has emerged nonetheless and has been a useful heuristic. This leaves us with a foundation that is hard to shore up with regard to the razor's usefulness, even though the razor obviously is useful: simplifying the entities involved in an explanation does tend to yield clearer results... at least so far.
How close this razor will shave is yet to be determined.
-C. Derick Varn