Verisimilitude (or truthlikeness) is a philosophical concept that distinguishes between the relative and apparent (or seemingly so) truth and falsity of assertions and hypotheses. The problem of verisimilitude is the problem of articulating what it takes for one false theory to be closer to the truth than another false theory.
This problem was central to the philosophy of Karl Popper, largely because Popper was among the first to affirm that truth is the aim of scientific inquiry while acknowledging that most of the greatest scientific theories in the history of science are, strictly speaking, false. If this long string of purportedly false theories is to constitute progress with respect to the goal of truth, then it must be at least possible for one false theory to be closer to the truth than others.
Karl Popper on verisimilitude
Popper assumed that scientists are interested in highly informative theories, in part for methodological reasons: the more informative a theory, the easier it is to test and the greater its predictive power. But informative power by itself is easy to come by, and we do not want to gain content by sacrificing truth. So Popper proposed that closeness to the truth is a function of two factors, truth and content: the more truths a theory entails (other things being equal), the closer it is to the truth.
Intuitively at least, it seems that Newton's theory of motion entails a great many more truths than does, say, Aristotle's theory, notwithstanding the fact that both are known to have flaws. Even two true theories can have differing degrees of verisimilitude, depending on how much true information they deliver. For example, the claim "it will be raining next Thursday," if true, is closer to the truth than the true yet logically weaker claim "it will either be raining next Thursday or it will be sunny."
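The rain example can be made precise in a toy possible-worlds model. The sketch below is illustrative only and not Popper's own formalism: it assumes three mutually exclusive weather outcomes, treats a proposition as the set of outcomes at which it is true, and compares the two claims by how many truths each entails.

```python
from itertools import combinations

# Toy model (illustrative assumption): three mutually exclusive outcomes
# for next Thursday. A proposition is the set of outcomes where it holds.
WORLDS = ("rain", "sun", "snow")
PROPS = [frozenset(s) for n in range(len(WORLDS) + 1)
         for s in combinations(WORLDS, n)]
ACTUAL = "rain"  # suppose it does in fact rain

def truth_content(claim):
    """True consequences of a claim: the true propositions it entails,
    i.e. supersets of its world set that contain the actual outcome."""
    return {p for p in PROPS if claim <= p and ACTUAL in p}

stronger = frozenset({"rain"})        # "it will be raining next Thursday"
weaker = frozenset({"rain", "sun"})   # "it will be raining or it will be sunny"

# Both claims are true, but the stronger one entails strictly more truths,
# so on Popper's content intuition it is closer to the truth.
print(truth_content(weaker) < truth_content(stronger))  # True
```

The strict-subset comparison on the last line captures the "other things being equal" clause: the weaker disjunctive claim's true consequences are all among the stronger claim's, but not conversely.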
Popper's formal definition of verisimilitude was challenged by Pavel Tichý and David Miller, who argued that it has an unintended consequence: no false theory can be closer to the truth than another. This result gave rise to a search for an account of verisimilitude that did not make progress toward the truth impossible.
Post-Popperian theories of verisimilitude
Some of the new theories (e.g. those proposed by David Miller himself and by Theo Kuipers) build on Popper's approach, guided by the notion that truthlikeness is a function of a truth factor and a content factor. Others (e.g. those advanced by Gerhard Schurz in collaboration with Paul Weingartner, by Mortensen, and by Ken Gemes) are likewise inspired by Popper's approach but locate what they believe to be the error of Popper's proposal in his overly generous notion of content, or consequence, proposing instead that the consequences that contribute to closeness to truth must be, in a technical sense, "relevant." A different approach (already proposed by Tichý and Risto Hilpinen and developed especially by Ilkka Niiniluoto and Graham Oddie) takes the "likeness" in truthlikeness literally, holding that a proposition's likeness to the truth is a function of the overall likeness to the actual world of the possible worlds in which the proposition would be true. Giangiacomo Gerla has proposed an approach that uses the notion of a point-free metric space. There is currently a debate about whether, or to what extent, these different approaches to the concept are compatible.
Verisimilitude and methodology
Another problem in Popper's theory of verisimilitude is the connection between truthlikeness as the goal of scientific progress, on the one hand, and methodology, on the other, as the means by which we can to some extent ensure that scientific research actually approaches this goal. Popper conceived of his definition as a justification of his own preferred methodology, falsificationism, in the following sense. Suppose theory A is closer to the truth than theory B according to Popper's qualitative definition of verisimilitude. In that case we will have (or should have, had the definition been logically sound) that all true consequences of B are consequences of A, and that all false consequences of A are consequences of B. This means that, if A and B are so related, all known false empirical consequences of A should also follow from B, and all known true empirical consequences of B should follow from A. So, if A were closer to the truth than B, then A should be better corroborated than B by any possible body of empirical evidence. Finally, this simple theorem makes it possible to interpret the fact that A is actually better corroborated than B as a corroboration of the hypothesis (or 'meta-hypothesis') that A is more verisimilar than B.