**Zeno Phobia**

Counting day. How many ways can you skin a cat if the number of sharp versus dull knives is this and such. Simple stuff, easy to get the hang of. There were no cats present—I saw a chipmunk—thus no practicum. Maybe next year.

Speaking of counting, the discrete solution to Zeno’s so-called paradox came up. That conundrum goes: if you start a journey from A to B by going half way, then going half way from there, and so on, you’ll never get to B because no matter how long a distance is left, you’ll only get half way to the destination on the next step.

This has an analytic solution, which requires calculus to understand, and which is fine. But it requires the premise that the universe can be infinitely divided in space. And there’s no indication that that’s true.

But if we abandon that premise and embrace a quantum, which is to say discrete, universe, then Zeno is easily answered. Between A and B lie a discrete number of steps, call it n. You can go halfway to B if n is even, or something other than halfway if n is odd. For instance, if you are at A and there is only one step to B (n = 1), it is not possible to go half way. You go to B or you go nowhere. Movement cannot be infinitely graduated. You must move at least one step at a time.
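For those who like to see it run, here is a toy sketch (Python, with a hypothetical minimum step of one indivisible unit):

```python
# Toy model of Zeno's journey in a discrete universe: positions are
# integers, and each move must cover at least one indivisible step.
def zeno_journey(n_steps):
    """Walk from 0 to n_steps, trying to halve the remaining distance,
    but never moving less than one whole step. Returns number of moves."""
    position, moves = 0, 0
    while position < n_steps:
        remaining = n_steps - position
        # Half the remaining distance, rounded up to at least one step.
        position += max(1, remaining // 2)
        moves += 1
    return moves

print(zeno_journey(1024))  # the traveler arrives in finitely many moves
```

However large the distance, the walker arrives in finitely many moves; the infinite regress never gets started.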

**Missing Infinities**

This is a reminder that no measurement can be infinitely graduated, or have infinite precision. Everything we can do for real that can be quantified is necessarily discrete. But, of course, we can approximate the discrete with the continuous. Fine. As long as we remember it is always an approximation, lest the deadly sin of reification appear. And, even more of course, is that not everything is quantifiable, nor should we seek for quantification where it is not necessary to do so.

The number 10 is just as far from infinity as 10^{100}, a googol, is. And a googol is just as far from infinity as a googolplex is. A googolplex is 10^{googol}. And if you were to take that creature to the power of 10, it too would be just as far from infinity.
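Python's exact integers make at least the googol concrete (a small illustration; the googolplex itself is beyond constructing):

```python
# A googol is exactly representable as a Python integer.
googol = 10 ** 100
print(len(str(googol)))  # 101 digits: a 1 followed by 100 zeros

# A googolplex, 10**googol, would have googol + 1 digits. We cannot
# even write it down, let alone hold it in memory -- and it is still
# no closer to infinity than 10 is.
```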

Infinity is big. BIG. It is really far away. It is infinitely far away. Unimaginably far away. So far away you’ll never get there. Even if you really want to.

**It’s Magic!**

Normal distributions are ubiquitous and always an approximation, usually a crude or awful one. The probability of *any* observable given a normal is 0, forever 0. Normals say nothing can happen, because why? Because normals, and other continuous distributions, attempt to give probability to infinite numbers of events. And we’ve just seen this is an awful lot of events.

Imagine a bag with an infinite number of white marbles and one black one. What’s the probability (given these premises) you pull out the black? Zero.
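The infinite bag can only be approached as a limit. A sketch with n white marbles and one black, in exact fractions:

```python
from fractions import Fraction

def p_black(n_white):
    """Chance of drawing the one black marble from a bag that also
    holds n_white white marbles: exactly 1/(n_white + 1)."""
    return Fraction(1, n_white + 1)

for n in (10, 10**6, 10**100):
    print(n, float(p_black(n)))
# As the number of white marbles grows without bound, the probability
# drains away toward 0: in the infinite bag it is 0, full stop.
```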

But math says normals are pretty, and many statisticians think statistics is a branch of math, instead of philosophy, and so off we go down the bell-shaped road, memorizing equations and not meanings.

Enter the most Deadly Sin Of Reification!

Raise your hands. Who’s heard a sentence like, “Temperature is normally distributed”? Everybody! That’s the sin right there, in all its ugliness. Temperature is *not* normally distributed. Nothing is, and nothing can be.

We *can* say, “Our *uncertainty* in temperature is quantified by a normal with this and such parameters.” But there is nothing *causing* temperatures to line up in the shape of a normal. It is impossible that temperature can, since it is finite.

Reification happens when the temperature is forgotten and the normal representing it *becomes* as real or realer than the temperature. Of course, it isn’t just normals and it isn’t just temperature. It happens every time the model—and the normal is a model—becomes realer than reality. This is magic: and not the only time we meet magical thinking in probability.

Example? Time series are a perpetual source of magical thinking. How many times have we seen plots of the actual real this-is-it hey-it’s-me data overlaid with “running averages” or other smoothers (this includes straight-line regressions)? Directly this is done, the original data is all but forgotten and the smoother becomes the real thing.
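A toy illustration (hypothetical data, a simple centered running mean) shows the swap happening:

```python
import random

random.seed(1)
# Hypothetical "temperature" series: a flat truth plus noise.
data = [20 + random.gauss(0, 2) for _ in range(50)]

def running_mean(xs, k=5):
    """Centered k-point running average -- the pretty non-data."""
    half = k // 2
    out = []
    for i in range(len(xs)):
        window = xs[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

smooth = running_mean(data)
# The smoother is not the data: it is a new, invented series that
# coincides with essentially none of the original observations.
print(smooth == data)  # False: the smooth curve is non-data
```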

So powerful is the spell of this smoothing miracle that you cannot convince an observer that the non-data is the non-data. The non-data, so pretty so pure and smooth and free of defect, *must* be real!

I haven’t yet found anything to counter this evil charm. There isn’t enough holy water to go around and every time I try to stuff garlic into somebody’s computer, the d***ed thing freezes up. Suggestions welcomed.

18 June 2014 at 8:07 am

I’m only half kidding here but imagine how it feels to be the 0.5th kid in a family and no one thinks you’re real.

18 June 2014 at 8:28 am

Well, you could make wagers with you taking the real data and them taking the magical data. They might learn with real money on the line. Or not, but at least you would have their money. And you wouldn’t even have to bump them off in a blog play script to get it.

18 June 2014 at 12:44 pm

Apropos of not using the actual data, it’s Junk Science Week at Canada’s Financial Post. Today’s topic (http://business.financialpost.com/2014/06/17/grab-that-after-dinner-mint-while-you-can-canada-is-in-full-blown-sugar-panic/) is sugary drinks and how the WHO (World Health Organization) wants us to cut our consumption in half.

The new battle cry is that sugary drinks cause heart disease. This followed from a study in the Journal of the American Medical Association that claimed such a link. JAMA came to this conclusion despite the fact that the data in its own study showed 468 deaths from heart disease in a group that consumed less than 1 can of soda per week, while 183 deaths occurred in a group that consumed more than 1 can of soda per week. JAMA’s conclusion was that those who got 25% of their daily calories from added sugar tripled their risk of dying of heart disease versus those who got 5% of their daily calories from added sugar.

I realize that this is not necessarily a direct comparison, but it’s difficult to imagine how so many could die of heart disease while consuming such a low quantity of sugary drinks. Granted, they could have other sources of significant added sugar in their diets that increase their percentage intake of daily sugar. But if that were the case, then this jihad against soft drinks becomes absurd.

18 June 2014 at 1:51 pm

Over at The Reference Frame, Lubos Motl argues that a universe in which space was discretized would violate Special Relativity.

18 June 2014 at 2:31 pm

My favorite example of reification is when epidemiologists calculate a risk ratio and then, through the magic of attributable risk, turn that into a body count which they claim is real.
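For the curious, the arithmetic behind that magic is short. A sketch using the standard attributable-fraction formula, with made-up numbers for illustration only:

```python
# Standard epidemiology: a relative risk (RR) is computed from exposed
# and unexposed groups, then "attributable" cases are conjured from it.
def attributable_cases(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    rr = risk_exposed / risk_unexposed
    # Attributable fraction among the exposed: (RR - 1) / RR.
    af = (rr - 1) / rr
    # The "body count": a model output, often reported as if observed.
    return af * cases_exposed

# Made-up numbers: 30 cases in 1000 exposed, 10 in 1000 unexposed.
print(round(attributable_cases(30, 1000, 10, 1000)))  # 20 "attributable" deaths
```

The twenty bodies exist only inside the ratio; no one ever counted them.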

18 June 2014 at 2:51 pm

I think Paul Simon summed it up well – a man hears what he wants to hear and disregards the rest.

Take “economic growth.” I once argued with someone that economic growth, in and of itself, was not necessarily a good thing. He was flabbergasted. Of course, the devil is in the details (or in this case, in the averages). Germany had incredible economic growth during the rise of the Third Reich. China, with an economy still 90% controlled by the government, has had incredible growth for a generation now. So, does that mean violent fascism or totalitarian communism is the way to go? People see what they want to see.

JMJ

18 June 2014 at 2:59 pm

I have not heard of a quantum theory of probability.

The physicists say that there are minimum units of distance, time and electrical charge. But I don’t see why there cannot be an infinite dimension of probability.

One way of looking at it is the “many worlds” model. If I flip a coin, in some set of universes it comes up heads and in another set of universes it comes up tails. These universes may be completely theoretical, but there can be infinitely many of them.

And it is my understanding that the quantum mechanists use this infinite nature of probability to explain physical phenomena. The photon travels through all paths… and the cat exists in a superposition of states…

18 June 2014 at 3:11 pm

Know how you feel. Try starting with uncertainty in the measurement of known things and then working toward estimates made about things we don’t know.

For example, ask your class members how old they are and then ask them to compute the average age of the class. You’ll get answers in years but somebody will give you an average to four decimal places. If you now ask for ages in years and months… weeks… days… you’ve got the basis for an easily understood rant about Excel – no, I mean, about averaging producing bad data by adding imaginary precision. Since all smoothers are averagers…
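A quick sketch of that false precision (hypothetical ages, each known only to the nearest whole year):

```python
# Hypothetical ages, each reported only to the nearest whole year.
ages = [19, 20, 20, 21, 22, 22]

mean = sum(ages) / len(ages)
print(mean)            # 20.666..., already more precision than the data
print(f"{mean:.4f}")   # "20.6667": four decimals nobody ever measured
```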

18 June 2014 at 4:19 pm

Paul, when you say a “quantum theory of probability” do you mean probability using discrete variables rather than continuous? As far as quantum mechanics being limited to discrete variables, that isn’t so. Physicists talk about discrete smallest units–Planck length, etc.–that are defined essentially by an uncertainty principle, but these are convenient mathematical entities, not necessarily “real” entities.

18 June 2014 at 5:43 pm

Impossibility implies zero probability, but not vice versa.

Think graphically! One can see the probability of an interval as the area under the probability density function (pdf) curve bounded by the interval. The area corresponding to a point under the pdf curve becomes the area of a line, which is zero. A line is a nonempty set, but it has an area of 0. So, zero probability does not imply impossibility.

It is an often-asked question, and the above is my usual answer. (No need to mention measure theory!)
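That picture can be computed directly. A sketch using the standard normal CDF, written with the error function:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def p_interval(a, b):
    """Probability of the interval [a, b]: the area under the pdf."""
    return norm_cdf(b) - norm_cdf(a)

# Shrink an interval around the point x = 1 down to the point itself:
for width in (1.0, 0.1, 0.001, 0.0):
    print(width, p_interval(1 - width / 2, 1 + width / 2))
# The area dwindles toward 0 and is exactly 0 for the bare point --
# zero probability, yet not impossible in any ordinary sense.
```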

You can teach how to properly use smoothing for forecasting, which is the main purpose of time series analysis. A simple explanation of what it means exactly to say “temperature has a normal distribution” will do. If a normal distribution is not appropriate, offer solutions. You are the teacher!

Why do you seem gloomy and doomy?

18 June 2014 at 7:30 pm

Briggs,

“This has an analytic solution, which requires calculus to understand, and which is fine”. Not just long division?

JMJ,

“I once argued with someone that economic growth, in and of itself, was not necessarily a good thing. He was flabbergasted. ” Maybe it was all the qualifiers packed into such a short sentence that confused him.

18 June 2014 at 10:16 pm

Practically speaking, if you place the middle of your foot on the next halfway point, then eventually the toes will reach the destination…

19 June 2014 at 12:22 am

Sounds like counting polar bears, and then counting drowned polar bears.

I like the wiki entry “Reification (computer science), the creation of a data model”

19 June 2014 at 1:54 am

Economic growth is usually a good thing, but there are always wide variations in such complex systems. Smoothing can make these variations seem subtle to an economist; it can make class lines seem to blur for the sociologist. But they’re not subtle to real live human beings, and those lines are quite stark and real too. So yes, the qualifiers do confuse some people. Without them, though, you can’t break through the bi-chromatic ring of political debate. Maybe that’s why you don’t see a lot of math philosophers in higher office. ;)

JMJ

19 June 2014 at 4:21 am

Bob Sykes,

Space itself wouldn’t have to be discretized for the discrete proof to work. All we need is that any step we take must cover at least some minimum distance. And that seems to be true.

19 June 2014 at 9:36 am

JMJ,

You may be missing my point, as from Briggs’ last post the statement is a tautology. The term “economic growth” can be replaced by any other, e.g. “drinking water”, and the statement still has as much or as little meaning as before. You may also be thinking of bubbles and not economic growth.

20 June 2014 at 9:08 am

Curious that you should say that no measurement can have infinite precision, while claiming reality is discrete, when a discrete variable *is* infinitely precise.

The Normal distribution is more usually defined by specifying the probability of an interval. You never specify the probability of the outcome being x, you always specify the probability of the outcome being between x1 and x2.

Measurements, likewise, are never infinitely precise. When you say that the outcome observed was 3.61435, what you actually mean is that it was between 3.614345 and 3.614355. The probability of such an event, because it is of an interval, is non-zero. Add more decimal places, and the interval narrows, but still does not become zero. The probabilities predicted by the Normal distribution have *never* been zero. Not because the Normal distribution doesn’t exist, but because the infinitely precise *observation* that you are applying it to does not exist – *cannot* exist, as it would require you to write down an infinite number of decimal places when you stated it. There’s not enough paper in all the world for that.

Asserting that an observation can be a single, infinitely precise number is the approximation here. Starting from such a bizarre premise, is there any wonder that one can get some strange answers?

Incidentally, quantum mechanics doesn’t claim space-time to be discrete. The issue is that space-time and energy-momentum bear the same relationship to one another as do frequency and duration. To have a well-defined frequency, you need to allow enough space for the oscillations to occur. The more you constrain it, the more uncertain the frequency becomes. Likewise, the more you constrain events in space-time, the larger and more uncertain does the energy-momentum become. And as general relativity teaches, energy-momentum curves space-time, meaning that to describe events on length scales smaller than the Planck length involves energies high enough to twist space and time into knots. The large-scale concept of ‘distance’ no longer applies, or follows the rules we know.

It’s like driving through a globe-spanning mega-city. On the large scale, driving from one district to another, distances behave roughly as one expects. It takes you twice as long to get to a place twice as far away. But on the small scale, you have to drive round twisting streets and down back-alleys, dodge the traffic, pause at signals, and it is no longer true that places twice as far away are twice as far to get to. You might have to go there by a more roundabout route.

When you get down to the Planck scale, our familiar concepts of space and time are gradually replaced by something else – something a lot stranger and more complicated. Nobody is entirely sure what, yet, because we cannot as yet make any observations that would tell us. But whatever it is, it’s unlikely to make answers to Zeno’s paradoxes any more obvious, or comfortable.