# Does the Many-Universes Hypothesis Really Explain the Fine-Tuning?

The last thirty years have witnessed a major revival in the philosophical, theological, scientific, and popular literature of the traditional design argument for theism. Probably the most convincing and widely discussed of these arguments is based on the so-called “fine-tuning” of the cosmos, which refers to the fact that the parameters of physics and the initial conditions of the universe are balanced on a razor’s edge for life to occur. For example, calculations by Brandon Carter indicate that if the force of gravity had been stronger or weaker by one part in 10^{40}, then life-sustaining stars could not exist (Davies, 1984, p. 242); similarly, calculations indicate that if the strong nuclear force, the force that binds protons and neutrons together in an atom, had been stronger or weaker by as little as 5%, life would be impossible (Barrow and Tipler, 1986, p. 322). As the eminent Princeton physicist Freeman Dyson notes, “There are many . . . lucky accidents in physics. Without such accidents, water could not exist as liquid, chains of carbon atoms could not form complex organic molecules, and hydrogen atoms could not form breakable bridges between molecules” (1979, p. 251) — in short, life as we know it would be impossible.

Many people take this extraordinary fine-tuning of the parameters and initial conditions of the universe as strong evidence for some sort of designer, such as the God of classical theism. For example, theoretical physicist and popular science writer Paul Davies–whose early writings were not particularly sympathetic to theism–claims that with regard to the basic structure of the universe, “the impression of design is overwhelming” (Davies, 1988, p. 203). Similarly, in response to the life-permitting fine-tuning of the nuclear resonances responsible for the synthesis of oxygen and carbon in stars, the famous astrophysicist Sir Fred Hoyle declares that “A commonsense interpretation of the facts suggests that a superintellect has monkeyed with physics . . . and that there are no blind forces worth speaking of in nature” (quoted in Davies, 1982, p. 118).

In response to the design explanation of the fine-tuning, however, many atheists have offered an alternative explanation, what I will call the atheistic many-universes hypothesis, where the adjective “atheistic” is included to distinguish it from the theistic many-universes hypothesis. According to the many-universes hypothesis, there are a very large–perhaps infinite–number of universes, where by the term “universe” I mean any region of space-time that is disconnected from other regions in such a way that the constants or laws of physics in that region could differ significantly from the other regions. Furthermore, these universes are postulated to be produced by some “universe generator,” and the fundamental parameters of physics are postulated to vary randomly from universe to universe. Of course, in the vast majority of these universes the parameters of physics would not have life-permitting values. Nonetheless, in a small proportion of universes they would, and consequently under this hypothesis it is no longer improbable that universes such as ours exist that are fine-tuned for life to occur.

Although several models have been proposed as to what this “universe generator” could be, currently the most popular one is what I will call the “inflationary many-universes model.” According to the inflationary universe model of the big bang–a model currently in vogue among many cosmologists–the universe arose as a rapidly expanding “bubble” out of a so-called “false vacuum” state of a hypothesized group of interacting quantum fields, called Higgs fields. According to the many-universes version of this model, which is claimed to be a natural extension of it, a very large, if not infinite, number of these bubbles have formed in this false vacuum. Moreover, each of these “bubble universes” is postulated to form by symmetry breaking of the Higgs fields, yielding randomly varying values for the initial conditions of the universe and the parameters of physics. Imaginatively, one can think of this superspace of Higgs fields as an infinitely extending ocean full of soap, with each universe as a soap bubble that spontaneously forms on the ocean.

## II. A Critique of the Many-Universes Hypothesis

In the rest of this paper, I will focus on one key problem with the many-universes hypothesis as an ultimate explanation of the fine-tuning: namely, it seems that the “many-universes generator” would itself need to be fine-tuned, and hence the hypothesis seems merely to transfer the problem of explaining cosmic fine-tuning up one level, to that of the many-universes generator itself.

In support of this claim, we begin by noting that in all currently worked-out proposals for what this “universe generator” could be, the “generator” itself is not only governed by a complex set of physical laws that allow it to produce the universes, but also requires a set of fine-tuned parameters. Even the so-called “chaotic inflation” many-universes model, which attempts to eliminate some of the fine-tuned initial conditions of the standard inflationary models by hypothesizing that these initial conditions vary at random over the superspace of Higgs fields, cannot avoid the fine-tuning of its parameters. As philosopher John Earman has recently pointed out, “The inflationary model can succeed only by fine-tuning its parameters, and even then, relative to some natural measures on initial conditions, it may also have to fine-tune its initial conditions for inflation to work” (1995, p. 156).

At present, therefore, the chaotic inflation model simply seems to transfer the atheist’s problem of accounting for the fine-tuning of our universe up one level. And this is not surprising. After all, even my bread machine has to be made just right–fine-tuned, if you will–in order to work properly, and it only produces loaves of bread, not universes! Or consider a device as simple as a mousetrap: it requires that all the parts, such as the spring and hammer, be arranged just right in order to function. Thus, at present it seems doubtful that the atheistic many-universes hypothesis can provide an adequate ultimate explanation of cosmic fine-tuning.

Nonetheless, it is at least conceivable–though I think unlikely–that in the future a many-universes-generator model could be developed which does not require a fine-tuned set of parameters. This hypothesis of a non-fine-tuned many-universes generator, however, seems to face two major problems, which we will now examine.

### Problem #1: The “Natural Extrapolation Rule.”

Although one cannot completely rule out the hypothesis of a non-fine-tuned many-universes generator, it does stretch the limits of plausibility and conceivability. Thus, in regard to this hypothesis, I suggest that we should invoke a general rule of reasoning: namely, *everything else being equal, we should prefer hypotheses for which we have independent evidence or which involve natural extrapolations from what we know by experience or from reasonably well-established theories.* Let’s first illustrate and support this principle, and then apply it to the hypothesis under consideration.

Most of us take the existence of dinosaur bones to count as very strong evidence that dinosaurs existed in the past. But suppose a dinosaur skeptic claimed that she could explain the bones by postulating a “dinosaur-bone-producing field” that simply materialized the bones out of thin air. Moreover, suppose further that, to avoid objections such as that there are no known physical laws that would allow for such a mechanism, the dinosaur skeptic simply postulated that we have not yet discovered these laws or detected these fields. Surely, none of us would let this skeptical hypothesis deter us from inferring the existence of dinosaurs. Why? Because although no one has directly observed dinosaurs, we do have experience of other animals leaving behind bones and other remains, and thus the dinosaur explanation is a *natural extrapolation* from our common experience. In contrast, to explain the dinosaur bones, the dinosaur skeptic has invented a set of physical laws, and a set of mechanisms, that are clearly *not* a natural extrapolation from experience or any well-established theory.

In the case of the fine-tuning, we already know that minds often produce fine-tuned devices, such as Swiss watches. Postulating God–a “supermind”–as the explanation of the fine-tuning, therefore, is a natural extrapolation from what we already observe minds to do.

Now, the inflationary many-universes model could be argued to be a natural extrapolation from reasonably well-established scientific ideas: for example, the many-universes version of the inflationary model is arguably a natural extrapolation of the inflationary model, and the inflationary model, though speculative, could be considered a natural extrapolation of reasonably well-established ideas, such as that of symmetry breaking, in modern particle physics. The hypothesis of a *non-fine-tuned* many-universes generator, however, not only fails to be a natural extrapolation of any well-established theory, but actually goes against what we know regarding the need for fine-tuning in all currently developed many-universes models and what we know about the need for fine-tuning from common experience, such as the bread machine example given above. Accordingly, just as with the dinosaur-bone-producing-field hypothesis, the mere conceivability of a non-fine-tuned many-universes generator is not a sufficient reason to take it seriously. By the “natural extrapolation” principle, therefore, we should prefer the theistic explanation of the fine-tuning over the non-fine-tuned many-universes-generator explanation, everything else being equal.

### Problem #2: The Apparent Design of the Laws of Physics

Even if such a many-universes model could be developed that dispensed with the need for fine-tuned parameters, the atheist would still need to account for the apparent fine-tuning of the laws of physics: just as the right values for the parameters of physics are needed for life to occur, the right set of laws also seems to be needed. If, for instance, certain laws of physics were missing, life would be impossible. For example, without the law of inertia, which guarantees that particles do not shoot off at high speeds, life would probably not be possible (Leslie, 1989, p. 59). Another example is the law of gravity: if masses did not attract each other, there would be no planets or stars, and once again it seems that life would be impossible. Yet another example is the *Pauli Exclusion Principle*, the principle of quantum mechanics that says that no two fermions–such as electrons or protons–can share the same quantum state. As Freeman Dyson points out (1979, p. 251), without this principle all electrons would collapse into the nucleus and thus atoms would be impossible. In terms of the laws governing the inflationary scenarios themselves, one not only needs the basic laws of quantum mechanics (such as the Schrödinger equation, the commutation relations holding between the operators for position and momentum, and the eigenvalue/eigenvector rule which says that the only allowed values of an observable are the eigenvalues of its associated operator), but one also needs a large number of specially constructed fields. Indeed, according to Alan Guth, the simplest inflationary model requires 24 interacting Higgs fields, a set of associated algebraic operators for the strong, weak, and electromagnetic interactions, and a set of additional quantum fields corresponding to the various types of particles in the universe (1997, p. 139).
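The basic quantum-mechanical laws just mentioned can be stated compactly. The following is a standard textbook rendering of those three items; nothing specific to inflationary models is assumed:

```latex
% Schrödinger equation: governs the time evolution of the state \psi
i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\,\psi

% Canonical commutation relation between the position and momentum operators
[\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar

% Eigenvalue/eigenvector rule: a measurement of the observable O can
% yield only the eigenvalues o_n of its associated Hermitian operator
\hat{O}\,|o_n\rangle = o_n\,|o_n\rangle
```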

How might advocates of the atheistic many-universes hypothesis respond to this “fine-tuning of the laws” problem? Let’s look at three responses they could give.

*Response 1:*

First, they could hypothesize the existence of a “super many-universes generator” that allows for a random variation of the laws of physics themselves in some superspace. As John Polkinghorne notes, atheists could speculate that “the laws of nature themselves fluctuate so that a vast portfolio of conceivable or (to us) inconceivable worlds rise and fall in a relentless explosion of random possibility…” (1998, p. 9). Polkinghorne quickly dismisses this suggestion as a “rash and desperate” claim that has “moved beyond anything that could be called scientific…” (p. 9). I would like to consider the problems confronting this proposal in a little more depth, however, since even though it seems implausible to us today, we cannot simply rule it out: after all, quantum mechanics would have seemed enormously implausible to a classical physicist of the 19th century, yet it is the cornerstone of modern physics.

The first problem with this proposal is that the universe generator itself would have to obey some set of laws, at least if it is to be meaningfully thought of as a physical thing in any sense analogous to the physical things in our universe. So we would still have the problem of apparent design: why a set of laws that allows for such fluctuations instead of laws that would be sterile, such as those of classical mechanics? Suppose, however, that we ignore this problem. A further problem with this proposal is the simplicity, beauty, and elegance of the laws of physics in our universe. Consider, for instance, the simplicity of Newton’s law of gravity, F = Gm_{1}m_{2}/r^{2}. Among other things, the exponent of r, the distance between the two masses, is “2,” which clearly has a simpler form than, for instance, an irrational exponent.^{(1)}

Yet, since the irrational numbers are infinitely more numerous than the integers (or the rationals), if laws were being randomly generated, one would expect our laws to typically have irrational exponents, not integer or rational ones. Similar things could be said about the simplicity of the Schrödinger equation, or of Einstein’s field equation of general relativity, which is often recognized as having been chosen by Einstein because of its simplicity. So it seems that to get the superuniverse-generator hypothesis to account for the simplicity of the laws of nature, one would further have to hypothesize that it was “constructed” in such a way that the laws were much more likely to fluctuate over simple laws than complex laws. Moreover, to account for the beauty, elegance, harmony, and seeming ingenuity of nature recognized by such prominent physicists as Albert Einstein, Paul Dirac, and Steven Weinberg (e.g., Weinberg, 1992, chapter 6, “Beautiful Theories”), one would have to postulate that the laws also tended to fluctuate over those that were particularly elegant and harmonious. These assumptions, however, seem very *ad hoc*.
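The claim that irrational exponents vastly outnumber rational ones can be given a precise measure-theoretic sense. The following sketch assumes, for illustration, that the exponent is drawn from a continuous distribution over the reals:

```latex
% The rationals \mathbb{Q} are countable, so they have Lebesgue measure zero:
\lambda\bigl(\mathbb{Q} \cap [a,b]\bigr) = 0 \quad \text{for any interval } [a,b].

% Hence, for an exponent n drawn from any absolutely continuous
% probability distribution on an interval of real numbers,
P(n \in \mathbb{Q}) = 0,
% i.e., a "randomly generated" exponent would be irrational with probability 1.
```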

A final problem with this superuniverse-generator hypothesis is that, unlike in classical physics, many of the “laws” of quantum mechanics are not expressible as equations governing the relation between various physical quantities, but rather as “rules.” One can at least conceive of how the equations, such as Einstein’s field equation of general relativity, could actually express real physical relations between various quantities, and thus exist in some sort of multi-dimensional physical space. Specifically, one could imagine a space of functional relationships linking various quantities, such as mass and spatial curvature, with the superuniverse fluctuations occurring over that space. Although one probably would run into severe mathematical problems–such as the existence of a natural, non-arbitrary ordering of the functional relationships, and a natural measure for them–one could conceive of these being surmountable. It is much more difficult, however, to conceive of how a “space of rules” could be physically meaningful. Consider, for instance, the quantum mechanical rule that all states are to be represented as vectors in Hilbert space, and that to each observable there corresponds a unique Hermitian operator; or consider the eigenvalue/eigenstate rule that says that the only allowed values of an observable are the eigenvalues of its corresponding operator; or consider the probability rule that says that the probability of the measurement of an eigenvalue E of an observable O is proportional to the square of the coefficient of its corresponding eigenstate in a spectral expansion of the state for that observable; or finally, consider the various so-called “superselection” rules that restrict the allowed quantum states, such as that underlying the Pauli Exclusion Principle, which says that the joint state of any two identical fermions must be antisymmetric under exchange. Each of these rules, and others like them, places constraints on the possible states, the form of the operators, the form of the equations, and the relations between the states, operators, and results of measurement. It is difficult to see how one could even go about formulating these rules in such a way that we could meaningfully conceive of a physically existing space of such rules, with each rule representing some physical relationship over that space. This, however, is what would be needed in order for the superuniverse generator to select the rules.
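For concreteness, two of the rules just mentioned can be written symbolically (standard quantum mechanics, abbreviated):

```latex
% Born (probability) rule: for a state expanded in eigenstates of O,
% |\psi\rangle = \sum_n c_n |E_n\rangle, the probability of measuring E_n is
P(E_n) = \frac{|c_n|^2}{\langle \psi | \psi \rangle}

% Exchange antisymmetry underlying the Pauli Exclusion Principle:
% the joint state of two identical fermions changes sign under exchange,
\Psi(1,2) = -\,\Psi(2,1)
```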

Admittedly, one cannot completely rule out the above superuniverse-generator hypothesis. Nonetheless, it does stretch the limits of plausibility and conceivability. Hence, it seems that the “natural extrapolation” rule explicated above applies to this hypothesis with particular force.

*Response 2:*

Perhaps, however, advocates of the many-universes explanation of the fine-tuning could simply deny that the laws are “fine-tuned” in any way that is relevantly analogous to the fine-tuning of the parameters of physics. So, whereas they would admit that the existence of life-permitting values for the parameters of physics needs an explanation, they could deny this in the case of the laws. Specifically, they could argue that in the case of the fine-tuning of the parameters of physics, we can quantitatively estimate the degree of fine-tuning–for example, one part in 10^{40} in the case of the gravitational constant. Thus, we have an objective basis for saying that such fine-tuning is in itself improbable and in need of explanation. In contrast, atheists could argue, we have no way of quantitatively estimating the degree of fine-tuning of the laws of physics, and thus our basis for saying that it is improbable, or in need of explanation, is less objective.

Although admittedly our basis is less objective, it is still significant. First, we do have an objective basis for claiming that the laws are fine-tuned: we can make solid, scientific arguments that certain laws–such as the *Pauli Exclusion Principle* mentioned above–are necessary for intelligent life. Second, we can often judge that a system that requires many interrelated elements in order to function is unlikely to have arisen by chance, even though we have no *quantitative* basis for our judgment of unlikelihood. For example, most people–whether atheists or theists–would agree that it is very unlikely that complex body organs such as the heart, eye, or wing could have arisen merely by chance (without natural selection). And they would agree with this assessment whether or not we have a quantitative basis for it. This is why, as Richard Dawkins notes, before Darwin it was impossible to be “an intellectually fulfilled atheist” (1986, p. 6), for it seems part of human rationality to demand explanations of complex, interrelated systems, many of whose elements are necessary for some seemingly meaningful purpose. The system of laws uncovered by physicists, however, seems to be just such a system of interrelated elements.

*Response 3:*

A third way advocates of the many-universes hypothesis could respond to the apparent “fine-tuning” of the laws of physics is by claiming that, as far as we know, there could be a single, fundamental law underlying the hypothesized many-universes generator, even though current models of the universe generator must hypothesize a seemingly “well-designed” set of interrelated laws.

Besides being entirely speculative, such a postulated law faces the problem that it simply moves the improbability of the fine-tuning of the laws of physics up one level, to that of the postulated physical law itself. Under this hypothesis, what is improbable is that, of all the conceivable, simple fundamental physical laws there could be, there just happens to exist the one that results in a universe generator that produces life-sustaining universes. Thus, trying to explain the fine-tuning of the laws by postulating this sort of fundamental law is like trying to explain why the pattern of rocks below a cliff spells “Welcome to the mountains Robin Collins” by postulating that an earthquake occurred and that all the rocks on the cliff face were arranged in just the right configuration to fall into the pattern in question. Clearly this explanation merely transfers the improbability up one level, since now it seems enormously improbable that, of all the possible configurations the rocks could be in on the cliff face, they are in the one which results in the pattern “Welcome to the mountains Robin Collins.” And this holds whether or not we include as part of our hypothesis that the configuration of rocks on the cliff face was “simple” or “complex.”

## III. Conclusion

In light of the above considerations, I conclude that it is doubtful for a variety of reasons that the many-universes hypothesis can avoid simply transferring the problem of explaining the cosmic fine-tuning up to the level of the universe generator itself.

## Bibliography

Barrow, John and Tipler, Frank. *The Anthropic Cosmological Principle*. Oxford: Oxford University Press, 1986.

Davies, Paul. *The Accidental Universe*. Cambridge: Cambridge University Press, 1982.

_________. *God and the New Physics*. New York: Simon and Schuster, 1983.

_________. *Superforce: The Search for a Grand Unified Theory of Nature*. New York: Simon and Schuster, 1984.

_________. *The Cosmic Blueprint: New Discoveries in Nature’s Creative Ability to Order the Universe*. New York: Simon and Schuster, 1988.

Dawkins, Richard. *The Blind Watchmaker*. New York: W.W. Norton, 1986.

Dyson, Freeman. *Disturbing the Universe*. New York: Harper and Row, 1979.

Earman, John. *Bangs, Crunches, Whimpers, and Shrieks: Singularities and Acausalities in Relativistic Spacetimes*. Oxford: Oxford University Press, 1995.

Guth, Alan. *The Inflationary Universe: The Quest for a New Theory of Cosmic Origins*. New York: Addison-Wesley, 1997.

Leslie, John. *Universes*. New York: Routledge, 1989.

Polkinghorne, John. *Belief in God in an Age of Science*. New Haven, CT: Yale University Press, 1998.

Wald, Robert. *General Relativity*. Chicago: The University of Chicago Press, 1984.

Weinberg, Steven. *Dreams of a Final Theory*. New York: Vintage Books, 1992.

1. As physicist Arnold Sikkema pointed out to me, one might argue here that there are reasons of internal consistency why the exponent in Newton’s law must be “2,” and hence that of necessity the universe generator could not produce universes in which the gravitational force did not obey something like an inverse square law. The answer to this objection begins by first noting that the exponent of “2” can be thought of as following from the fact that the gravitational field can be represented in terms of a potential that obeys Poisson’s equation. Since the gravitational field, or potential, can be represented in this way, it follows from the divergence theorem and the fact that we live in a three-dimensional space that the exponent of “r” must be “2.” Hence, if the exponent of “r” were something other than “2,” it follows that gravity could *not* be represented in terms of a potential that obeys Poisson’s equation. But it does not follow from the inability to represent gravity as such a potential that it is inconsistent for the exponent of “r” to be something other than “2.”
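This reasoning can be sketched in equations. The following is a standard derivation, with the d-dimensional generalization added purely for illustration:

```latex
% Poisson's equation for the gravitational potential \phi:
\nabla^2 \phi = 4\pi G \rho

% Outside a point mass M we have \nabla^2\phi = 0; integrating over a ball
% of radius r and applying the divergence theorem in three dimensions:
\oint_{S_r} \nabla\phi \cdot d\mathbf{A} = 4\pi G M
\quad\Longrightarrow\quad
|\nabla\phi| \cdot 4\pi r^2 = 4\pi G M
\quad\Longrightarrow\quad
F = m\,|\nabla\phi| = \frac{G M m}{r^2}.

% In d spatial dimensions the area of S_r scales as r^{d-1}, so the same
% argument gives F \propto 1/r^{d-1}: the exponent "2" reflects the fact
% that d = 3; it is not forced by Poisson's equation alone.
```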

Changing the exponent of “r,” of course, would also have consequences for other equations in physics. For example, if we consider general relativity to be the correct theory of gravity, then the fundamental equation of general relativity relating mass with space-time curvature (Einstein’s equation) would have to be different if some other exponent than “2” occurred in Newton’s law. The reason is that it follows from Einstein’s equation that, in the Newtonian limit–that is, in those ordinary situations where Newton’s law is known to be valid–the force of attraction can be represented in terms of a potential that obeys Poisson’s equation (see Wald, 1984, pp. 76-77, for the derivation). No doubt, if the exponent of “r” were something other than “2” in Newton’s law of gravity, the fundamental equation of general relativity would probably need to be much more complex than it is. As it stands, however, the equations governing gravity are “doubly simple”: Einstein’s equation is well known for its simplicity, and when the Newtonian limit of Einstein’s equation is taken, it yields a further equation that is also simple. Thus, one has a harmonious network of simple equations.

Finally, it should be noted that changing Newton’s law might also have further physical consequences for other equations in physics, thus affecting the whole network of equations of physics. (Of course, whether this is the case will have to await a theory of quantum gravity, something we do not yet have.) From this larger perspective, in which each equation is seen as part of an interlocking network of equations, however, the problem of simplicity still remains: namely, why does the super-many-universes generator produce a physically consistent network of equations, each of which displays such simplicity?