Via Ed Feser I was led back to Alvin Plantinga’s Where the Conflict Really Lies: Science, Religion, and Naturalism, which has a lot about probability. I’ll not do much of that today, but there is one point about assigning probabilities to a proposition that I want to make in the context of infinity.
Plantinga was speaking in the context of “fine tuning” of physical parameters, such as the speed of light c. It is a “parameter” because we only know its value by observation and estimation, which implies there is always uncertainty in its value (which should be, but usually isn’t, considered in equations which use the parameter).
Now fine tuning is a huge topic which I’ll only gloss. Some claim the light-speed parameter could have taken any positive value. By “any” they mean any real number greater than 0, or perhaps bounded from below by 0. That’s a lot of points! An infinite number of them. And not just a countably infinite number, like 1, 2, 3, …, but the infinity of the continuum, which includes all the numbers between 1 and 2, between 2 and 3, …, and which therefore cannot be counted.
There are several questions. About physical parameters, the goal is to find simpler arguments from which we can deduce the values of higher-order parameters. That means there might be a set of axioms (which we know via a version of induction) that allow us to deduce c must be this-and-such value and none other. This could be true for c and for every other physical parameter, such as Planck’s constant and so forth.
Either way, it is not true that c could be “any” value in the continuum. Something actual caused the value of c. The equations we might discover might give insight into this cause, and then again they may only allow us to determine the value of c. What’s the difference? Knowing the cause means knowing the essence of the active power that made c what it is and, just as important, what it isn’t. Mighty task, especially considering we’re skirting along ancient arguments about why there is something rather than nothing. This is the point at which physics and metaphysics meet.
To determine a parameter is much easier. For instance, we have several formulas that allow us to determine the value of π or of any of its digits. Now π is transcendental, meaning its digits go on forever, in the same spirit as the continuum. That means we’ll never know all the digits of π. Anyway, these formulas allow us to determine, in the sense of know, particular digits, but none of these formulas tell us why π takes the value it does and not some other. In other words, we don’t know, and I’m guessing we can’t know, what caused π to take the value it does.
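For the curious, here is one of those determining formulas in action: a minimal sketch using the Bailey-Borwein-Plouffe series (the function name and the choice of ten terms are mine, purely for illustration). It pins down the digits, but it says nothing about why π has them.

```python
from fractions import Fraction

def pi_bbp(terms):
    """Partial sum of the Bailey-Borwein-Plouffe series for pi,
    kept as an exact rational number."""
    total = Fraction(0)
    for k in range(terms):
        total += Fraction(1, 16**k) * (
            Fraction(4, 8*k + 1)
            - Fraction(2, 8*k + 4)
            - Fraction(1, 8*k + 5)
            - Fraction(1, 8*k + 6)
        )
    return total

# Ten terms already agree with pi to roughly fourteen decimal places.
print(float(pi_bbp(10)))   # 3.14159265358979...
```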
You have the idea by now. Fine tuning arguments about parameters will say nothing about what caused any parameter value. That means fine tuning arguments are about our knowledge of parameters and not their cause. This is important. If we knew what was causing the values of parameters, we wouldn’t need to, and shouldn’t, speak of the “chances” parameters take certain values. We could just speak of causes.
But if we don’t know the cause(s), then probability is appropriate. Since probability is always conditional on the premises assumed, we have to figure out which premises are useful in fine-tuning arguments. Which premises are “best”? I don’t know. Which premises are good or useful? Let’s examine one. Is the premise that, for example, c can take any value in the continuum north of 0 a good one? I don’t think so. Here’s why.
The first problem is the end point, which is infinity. Are we right, in our premise, to entertain that light speed can be infinite? What knowledge do we have that makes this assumption plausible? Well, I don’t know of any, but that’s far (far) from proof there isn’t any, so assume that some exists. All right, so c can be “any” number between 0 and infinity inclusive. Wrap your mind around this. Try. You can’t. Nobody can. It is impossible to think of the continuum in the sense that you can think of all the numbers in it. Since probability is a matter of epistemology, i.e. our thoughts, we can’t then know all the probabilities of all the numbers in the continuum. We are left in a state of mystification. That being so, probability is of no help to us.
This is why assigning probabilities to the continuum when infinity is involved doesn’t work (or doesn’t work without lots of provisos), and is the reason (cause) of any number of paradoxes that have been discovered.
Here’s another way to think of it. Suppose we knew (somehow) c could be any number in the set {s_1, s_2, …, s_n}, where s_n can even be infinity (don’t argue with me that infinity isn’t a number), but where n itself is finite. Given just that premise, and none other, the probability that c takes value s_i is 1/n. No problem.
Now let n grow, but remain finite. Let n = googol, or let n = a googol tetrated a googol times. You don’t have to understand that except to realize this number is hugeous. It’s big, baby, big. But as big as it is, the probability that c takes value s_i is still 1/n (now a teeny tiny number, but not 0). Let n grow as large as you like, but keep it just this side of infinity, and again, probability is not damaged.
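A minimal sketch of that arithmetic (the function name is mine; nothing here depends on what the s_i actually are):

```python
from fractions import Fraction

def uniform_prob(n):
    """Probability that c equals any particular s_i, given only the
    premise that c is one of n possibilities and nothing else."""
    return Fraction(1, n)

print(uniform_prob(6))                 # 1/6
print(float(uniform_prob(10**100)))    # 1e-100: teeny tiny, but not 0
```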
Large n are obviously no practical limitation, at least as far as measuring values of any parameter goes. If we create instruments that allow us to peer closer at c, then however large our n, the probability is still computable. This is important because we need some kind of measurement to inform us of c.
Without understanding the finer mathematical details, you can see that large numbers are no trouble, but the continuum is. Infinity (at either end) isn’t the real problem, but the uncountable nature of the continuum is. Think: how many numbers are there between 0 and 1? More than a countable infinity. We have another continuum! And if we try and assign probabilities to all the numbers in [0,1], paradoxes arise just as if we had [0, infinity].
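For the record, here is what the standard textbook assignment on [0,1] looks like (a sketch of the usual calculation, not of anything argued for above): intervals receive their length, while every single point receives probability zero. Whether one finds that assignment acceptable or paradoxical is exactly the issue.

```latex
% Uniform density on [0,1]: intervals get their length, points get nothing.
f(x) = 1 \ \text{for}\ 0 \le x \le 1, \qquad
\Pr(a \le X \le b) = \int_a^b 1\, dx = b - a, \qquad
\Pr(X = x) = 0 .
```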
That means we have to think of these things in a different way. We can’t in probability start with infinity or the continuum. We have to start with something finite and comprehensible, and then head off to infinity. But it turns out the path you take to the continuum or infinity matters. Infinity is a big place! You can end up in different corners of it depending on the road you use to get to it. This is why we can’t glibly speak of probabilities of parameters taking “any” value. There’s just too many values!
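A minimal sketch of how the path matters, using the familiar example of the limiting frequency of even numbers (the generator and the block counts are mine, purely illustrative): the same set of naturals, listed in two different orders, gives two different limits.

```python
# The limiting relative frequency of even numbers depends on the order in
# which the naturals are listed, even though the underlying set is the same.
def freq_even(seq):
    seq = list(seq)
    return sum(1 for x in seq if x % 2 == 0) / len(seq)

# Usual order 1, 2, 3, 4, ...: evens have limiting frequency 1/2.
usual = range(1, 600_001)

# Reordered as two odds then one even (1, 3, 2, 5, 7, 4, ...): the limit is 1/3.
def two_odds_one_even(blocks):
    for k in range(1, blocks + 1):
        yield 4 * k - 3
        yield 4 * k - 1
        yield 2 * k

print(freq_even(usual))                       # 0.5
print(freq_even(two_odds_one_even(200_000)))  # ~0.333
```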
Incidentally, I have a greater discussion of all this in my upcoming book, Uncertainty.
Update: Had a good question about measurement from Geert de Vries on Twitter.
Length (a meter) is defined as a function of c. If we take c = some definite value (with 0 uncertainty), then meter will be fixed because the value of c is also fixed by assumption. There is no problem with this because “meter” is arbitrary. We can define any number of things based on assuming a fixed c or any other parameter.
But if we really want to know the actual distance from here to there (and light is involved), then we must take the uncertainty in c into account because the meter inherits that uncertainty. Simple as that. Of course, for many applications this uncertainty will be negligible. But negligible does not mean zero. It only means that the true inherent uncertainty does not change any decision we make based on the distance.
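A minimal sketch of how that inheritance works when a distance is computed as c times a travel time (every number below is hypothetical, chosen only to show the arithmetic). Whether the resulting plus-or-minus matters is then a question of decision, not of uncertainty.

```python
# How an uncertainty in c flows into a distance computed as distance = c * time.
# All numbers are illustrative only.
c = 299_792_458.0        # m/s, the conventionally fixed value
sigma_c = 1.0            # m/s, a purely hypothetical uncertainty in c
t = 0.0203               # s, a hypothetical one-way light travel time

distance = c * t
sigma_distance = sigma_c * t     # the distance inherits c's uncertainty

print(distance, "+/-", sigma_distance, "meters")
```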
This brings up a whole other discussion of how to decide. But one thing should be clear: decision and uncertainty are not the same thing.
“there is always uncertainty in its value (which should be, but usually isn’t, considered in equations which use the parameter).”
Any actual scientist knows that the uncertainty in c is *always* used whenever it is relevant. I stopped reading here.
Thanks Briggs! That helps clear up the way I was trying to think about fine-tuning.
Lee,
“…whenever it is relevant.”
Which is, of course, always. And, just as obviously, I mean all parameters all the time.
That you “stopped reading” does, however, indicate the behavior of many actual scientists these days.
We can never know the value of pi because our heads will explode. I learned this in the movie “Pi”. Something about seeing God at that point…..As for what caused pi to take the value it does, that would seem to be a function of geometry, which is something humans created. We might actually figure that one out.
Infinity is a number when used in mathematical calculations. Beyond that, in metaphysics, it may or may not be. Since we’re talking math, infinity is a number.
Lee: Another comment from someone who couldn’t bother to read the article. Let me put this bluntly: Who cares? Read the article all the way through or be ignored.
Sheri, I caught that pi statement too. Seems that the cause of pi having the (unknowable) value it does simply and completely depends on the rules of plane geometry. All true circles must have true diameters and the ratio is always the same. What else could it be?
Gary, Sheri,
Well, no. π is much more important than simple geometry. It exists not as a Platonic universal, but as an Aristotelian one. There must be some reason that it takes the value it does, and is not something else. Even if it were true that its value was entirely because of simpler axioms, then we’d still have to explain why these and not other axioms are true.
And usually we can’t. Why is this axiom true? is something we can’t answer. How we know it is true is another matter; we can often tell if something is true or false. But knowing its cause is an entirely separate question.
“we can’t then know all the probabilities of all the numbers in the continuum”
It’s difficult to understand what Dr. Briggs is driving at. We all know that probability values in continua are routinely infinitesimal, which is why folks apply probability densities to them rather than probabilities. If we don’t have any idea what time it is, for example, we can still assign probability densities to the various values that can be taken by the tangent of the angle that the hour hand forms with the vertical.
If he’s saying we can’t know the probabilities because they’re all infinitesimal, that’s pretty much just a way of looking at things. We could just as accurately say we know all of them since we know that all are infinitesimal; if someone gives you any real number, you can give the probability density of that number’s being the hour-hand angle’s tangent.
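For concreteness, a minimal sketch of the hour-hand example under the usual assumption that the angle is uniform, in which case the tangent follows the standard Cauchy density f(x) = 1/(π(1 + x²)) (the Monte Carlo check and its seed are illustrative only):

```python
import math, random

# If the hour-hand angle is uniform, its tangent is standard Cauchy with
# density f(x) = 1 / (pi * (1 + x^2)). A quick Monte Carlo check:
random.seed(1)
draws = [math.tan(random.uniform(-math.pi / 2, math.pi / 2))
         for _ in range(200_000)]

# Fraction of draws landing in [-1, 1] versus the Cauchy prediction of 0.5.
empirical = sum(-1 <= x <= 1 for x in draws) / len(draws)
predicted = (math.atan(1) - math.atan(-1)) / math.pi
print(empirical, predicted)
```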
Joe Born,
“…we can still assign probability densities to the various values…”
Therein lies the problem. How they are “assigned.” This is usually quite arbitrary and ad hoc and done without a full understanding of measurement. And that’s when the paradoxes crop up. The premises which drive these assignments are the real problem. Not the downstream math, which is fine.
Another problem with infinitesimals is that the probability, conditional on assigning a density, that the hour hand takes any particular value is itself infinitesimal (or 0). But this isn’t so in reality, because in reality the hand will take a value not along the continuum, but at some actual place. And even if this weren’t so, we can only ever measure that value to finite, discrete precision. We need in science to align our probabilities to measurement.
A final problem, even if you accept infinitesimals, is assigning probabilities to intervals like [0, infinity) or [0, infinity] (pick whichever closing punctuation mark accords with your understanding of infinity). These lead to “improper” probabilities, which is to say, numerical creations which are not probabilities at all. But they are treated like probabilities to save the math!
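For concreteness, the usual way the impropriety shows up: no flat density on [0, infinity) can be normalized, so whatever the resulting numbers are, they are not probabilities (a standard observation, stated here only as a sketch).

```latex
% For any constant k > 0 the "flat" assignment on [0, \infty) cannot integrate to 1:
\int_0^\infty k \, dx = \infty ,
% whereas a genuine probability density must satisfy
\int_0^\infty f(x)\, dx = 1 .
```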
All,
This paper (and references) by Stephen Simpson on infinities in mathematics is interesting (I’m not saying I agree with all of it; I’m saying it’s interesting). It requires a PostScript viewer.
The speed of light can be any number you want it to be. But if you are smart, you want it to be equal to 1, because that is the easiest number to multiply it with.
The problem then is that the relation of the speed of light with other distance and time measuring units is very inconvenient. The distance between New York and Amsterdam is a tiny number when expressed in light-speed terms and the number of vibrations of some atom. It is much more conveniently put in how many standard parts of the day it takes to fly, or in days and parts of the day when you go by ship.
Metaphysics and physics have only one thing in common – the word “physics.”
JMJ
@Briggs:
“Now π is transcendental, meaning its digits go on forever, in the same spirit as the continuum.”
Nitpicking, but a transcendental number is a number that is not the root of a non-zero polynomial with rational coefficients. Trivially, transcendental numbers are irrational but not conversely. The square root of 2 is irrational but not transcendental; pi is transcendental.
Briggs, there are some relations that intrude on your discussion:
First, from physics, 1/c^2 = ε_0 μ_0, i.e. the permittivity times the permeability of empty space, which follows from the solution to Maxwell’s equations. (A quick numerical check of this relation appears below.)
So constraints on c are implicitly due to constraints on those physical quantities. One could regard h as a more fundamental constant than c, and one might consider setting up a range for it such that we do not (generally) observe quantum effects on macroscopic objects: no “real” Schrödinger’s cat.
Second, the Euler relation e^(iπ) = -1 is not just geometrical (although it can be interpreted as such) and is not a measurement property.
In the same way, the transcendental number e is not geometrical.
Those are mathematical quantities and, albeit irrational, they are fixed by mathematics. The string of digits following the 3 in pi is not a random number because you can always calculate the next digit. Likewise, the string of digits following the 2 in e is not a random number.
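A quick numerical check of the 1/c^2 = ε_0 μ_0 relation mentioned above, using the CODATA values (a sketch only; the particular digits are not the point):

```python
import math

# Check that 1/sqrt(eps0 * mu0) reproduces the speed of light.
eps0 = 8.8541878128e-12   # F/m, vacuum permittivity (CODATA)
mu0  = 1.25663706212e-6   # N/A^2, vacuum permeability (CODATA)

c = 1.0 / math.sqrt(eps0 * mu0)
print(c)   # ~2.99792458e8 m/s
```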
Well, no. π is much more important than simple geometry. It exists not as a Platonic universal, but as an Aristotelian one. There must be some reason that it takes the value it does, and is not something else. Even if it were true that its value was entirely because of simpler axioms, then we’d still have to explain why these and not other axioms are true.
Briggs, isn’t this making too much of it? Pi takes its value from the relationship of circumference to diameter. No need to go deeper to find a cause. The geometric units (circle, line) are completely defined. What caused those units to exist in the first place, of course, is a different and non-trivial matter.
“The premises which drive these assignments are the real problem. Not the downstream math, which is fine.”
I guess this is all too metaphysical for me. Yes, one can assign probabilities badly, but I don’t see how Dr. Briggs has identified any real-world problem here that defies appropriate probability (-density) assignments merely because uncountable infinities are involved.
Briggs or anyone else: If we cannot say why the axiom is true and thus “explain” it, then does that mean squares having an area of Side A x Side B and all other such concepts are not knowable as to “why” they are true? Are we then being forced back to a First Cause? How do definitions differ from other things, and how can definitions have a “why”? It seems some of the values discussed are definitions.
This sounds a lot like you can never divide something in half over and over because each half can be subdivided and subdivided and you go on to infinity and there is no end. You never run out of halves.
JMJ: Are you sure? A lot of physics looks pretty metaphysical. Just saying……
The late Peter Tosh clarified similarly significant distinctions:
“I don’t smoke marijuana, man. Marijuana is a girl from Cuba…Ganja…is a bird from Australia, I smoke HERB.”
That’s actually sensible, perhaps entertaining, but basically useless drivel. But it is sensible.
Consider this: “…these formulas allow us to determine, in the sense of know, particular digits, but none of these formulas tell us why π takes the value it does and not some other…”
Imputing such a philosophical issue (‘formulas that describe don’t explain’) is akin to asking “how high is up?” or “why is a carrot more orange than an orange?”
Which leads to this philosophical conundrum: Is this profundity:
….”decision and uncertainty are not the same thing” ….
…more or less profound than…
…smoking marijuana vs. smoking HERB…
or,
How many licks does it take to reach the center of a Tootsie Pop?
http://www.youtube.com/watch?v=O6rHeD5x2tI
It is good that you are thinking about fine-tuning, but your example is chosen badly. The speed of light is not an example of a fine tuned parameter. There is a huge difference in physics between dimensionless and dimensional parameters.
A dimensionless parameter, such as the fine structure constant alpha or the ratio of the proton to electron mass is just a number. It’s fixed in physics (neglecting complications from specifying re-normalisation schemes etc.). We can either measure it (with uncertainty, as you say), or derive it from more fundamental dimensionless coefficients (such as couplings of electrons and quarks to the Higgs Boson, the EW theta angle, strong coupling constant etc.) which again we can only measure (again, to within a certain precision). This is, basically, just a fact of nature which (from the point of view of physics) we can only find by measurement, and if your post had discussed these quantities rather than the speed of light, then I would agree with it.
However, the speed of light is dimensional. That means that its value depends on our choice of units. We can choose it to be any value we like just by redefining what we mean by the `meter’ or by the `second’. We can only measure its value in reference to some `standard meter’ or `standard second’. Indeed, this is particularly problematic today, because the `standard meter’ is defined in terms of the speed of light. We define the second from the frequency of a particular spectral line of caesium, and the meter by using that definition and a fixed value of the speed of light. So the speed of light in a vacuum is precisely 299792458 meters per second by definition. (Or in particle physics, the speed of light in a vacuum is precisely 1 length unit per time unit). The statistical uncertainty in measuring it is then just added into the systematic uncertainty in any measurement of length, because we are not quite certain if we have our measuring rod properly tuned.
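A minimal sketch of that definitional chain (the arithmetic is only illustrative; both constants below are exact by definition in the SI, not measurements):

```python
# The second is defined by the caesium hyperfine frequency, and the meter by
# fixing the speed of light; distances then follow from those two numbers.
c_fixed = 299_792_458            # m/s, exact by definition
cs_freq = 9_192_631_770          # Hz, exact by definition (caesium-133 hyperfine)

t_one_metre = 1 / c_fixed        # time for light to cross one metre
print(cs_freq * t_one_metre)     # ~30.66 caesium periods per metre of light travel
```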
Planck’s constant is a similarly defined quantity, which is used to define our units to measure energy.
Interesting take, and it has furthered my understanding of fine-tuning. The problem statistically is not the infinite bound of whole numbers, but the infinitude of numbers between whole numbers. However, it appears the difference between one number and the next in this infinitude could become so infinitesimal that the general probability of fine-tuning between such numbers would become nugatory.
Cause and effect may be more central to the “speed of light” than simply asking what caused the value to be what it is. Not just light but any zero rest mass field propagates with a “speed” that will be measured to be the same as that of light. In particular, gluons and probably gravitational radiation (or gravitons) also propagate with the same speed. But the notion of the value of speed is, as BB mentions, linked to the relationship (i.e. coordinates) one chooses to define between space and time.
The connection to causation is more immediate if one considers the universe (space + time) to be occupied by events, each of which is identified by a particular point in space and time. A postulate, supported so far by every observation (except perhaps for spooky action at a distance, but that issue is not closed), is that subsets of space and time can be demarcated into sets, call them C (for causes) and E (for effects), such that events in C can cause events in E and events in the complement of C cannot cause events in E. For our intuitive notion of cause and effect to hold, for any specific bounded region E, the complement of C cannot be empty, i.e. we have trouble if, say, events in the future could cause events in the past. That means that the boundary between C and E is not empty.

Now physics involves the connections between events in space and time, and in particular the distance, s, between them, calculated from both the space and time separating them via a metric (which contains the dynamics) stated in differential form ds^2 = dx^2 - dt^2, where x is a space coordinate and t is a time coordinate and we are assuming the simplest metric of special relativity, i.e. (1, -1). With this choice of metric, events in C are connected to events in E by paths that have a total integrated space-time distance separating them less than zero, which are called timelike intervals. Events in the complement of C are connected with events in E by intervals greater than zero, called spacelike. Events on the boundary between C and E are connected by paths that have zero space-time intervals, called null paths or trajectories.

For null trajectories ds^2 = 0, so that dx^2 = dt^2, i.e. dx/dt = 1*u. Here u represents the units we choose to work with. As BB said, we can choose any u we wish, so the numerical value of dx/dt for null trajectories can be anything we want, but not either zero or infinity. The value of dx/dt in two dimensions can be viewed as the slope of the boundary, in the 2-dimensional space time, between C and E. If that slope were infinite, then no entity evolving forward in time could cause any other entity on a different line (we are assuming the path of an entity here is a straight line in time, going upward). If the slope (i.e. dx/dt on the causal boundary) were zero, then every event could cause every other event. Both of these give us great problems in physics, so we conclude that dx/dt cannot be either zero or infinity. That means the “speed” of things on the causal boundary must have some finite value and we can choose that value to be anything we wish by our choice of units.
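A minimal sketch of that classification under the stated convention ds^2 = dx^2 - dt^2 with units chosen so c = 1 (the function and the example numbers are illustrative only):

```python
# Classify the separation between two events from their coordinate differences,
# using the sign convention ds^2 = dx^2 - dt^2 and units with c = 1.
def interval(dx, dt):
    ds2 = dx**2 - dt**2
    if ds2 < 0:
        return "timelike (the events can be causally connected)"
    if ds2 > 0:
        return "spacelike (the events cannot be causally connected)"
    return "null (on the causal boundary)"

print(interval(dx=1, dt=2))   # timelike
print(interval(dx=2, dt=1))   # spacelike
print(interval(dx=1, dt=1))   # null
```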
So the “speed” we associate with light (or gluons or gravitons) is more precisely viewed as the “speed” associated with distances between events that can cause one another and in particular, the “speed” associated with events on the boundary between causally connected events. So asking what “causes” the speed of light to be some value is more appropriately viewed as what “causes” some events to be causally connected and others not to be causally connected. If the standard classical relativistic dynamics are assumed, and we choose units so that the speed of light is 1, then that value is not 1 plus or minus some uncertainty, it is exactly 1. It is as precisely defined as the slope of a line in two dimensions. Any uncertainty associated with the value is an uncertainty in the validity of the theory, at least for relatively near neighboring events.
Asking whether the “speed” of light varies is equivalent to asking whether the relationship between cause and effect varies. For events separated by short paths and paths where the metric is not pathological (i.e. no singularities or horizons) it would be nice if causality behaved well. Most of the measurements we do are within this well behaved region. For long paths involving the global (not local) structure of space time causality becomes more complicated, and conjectural, especially if there are exceptional gravitational features.
Erratum: The interesting and relevant boundary is the boundary between C complement and E not between C and E above.
This post was interesting and helpful.
“One of the most successfully cultivated branches of philosophy in our time is what is called inductive logic, the study of the conditions under which our sciences have evolved. Writers on this subject have begun to show a singular unanimity as to what the laws of nature and elements of fact mean, when formulated by mathematicians, physicists and chemists. When the first mathematical, logical, and natural uniformities, the first laws, were discovered, men were so carried away by the clearness, beauty and simplification that resulted, that they believed themselves to have deciphered authentically the eternal thoughts of the Almighty. His mind also thundered and reverberated in syllogisms. He also thought in conic sections, squares and roots and ratios, and geometrised like Euclid. He made Kepler’s laws for the planets to follow; he made velocity increase proportionally to the time in falling bodies; he made the law of the sines for light to obey when refracted; he established the classes, orders, families and genera of plants and animals, and fixed the distances between them. He thought the archetypes of all things, and devised their variations; and when we rediscover any one of these his wondrous institutions, we seize his mind in its very literal intention.
But as the sciences have developed farther, the notion has gained ground that most, perhaps all, of our laws are only approximations. The laws themselves, moreover, have grown so numerous that there is no counting them; and so many rival formulations are proposed in all the branches of science that investigators have become accustomed to the notion that no theory is absolutely a transcript of reality, but that any one of them may from some point of view be useful. Their great use is to summarise old facts and to lead to new ones. They are only a man-made language, a conceptual shorthand, as some one calls them, in which we write our reports of nature; and languages, as is well known, tolerate much choice of expression and many dialects.”
William James, What Pragmatism Means
I came back to this post to mention to Bob Kurkland that I have a problem with why it is h bar in all the equations. As in “who asked the mathematicians to get involved”
Now it looks like I might need to read FAH’s contribution.
The Global Circulation models have parameters that can be adjusted. That we can adjust these parameters means we don’t know what the actual parameter value is. Claiming to be pessimistic or optimistic in the setting of the parameters is irrational, because if there are parameters to be adjusted, that means we aren’t actually adjusting the parameter, we are only adjusting what we hope needs adjusting. Not only do we have an infinite range of possible adjustments within the parameter, we have an infinite variety of adjustments as to what the parameter is adjusting.
When I hear the masters say “We adjusted our parameters for maximum pessimism”, I actually hear “We don’t know what the hell we are tweaking, but we know we can tweak it!”