William M. Briggs

Statistician to the Stars!


Why Truth Isn’t Absolute: A Defense Of Relativism — Guest Post by Luis Dias

Regular readers will recognize frequent commenter and foil Luis Dias, who today offers us his defense of relativism. As the Stanford Encyclopedia of Philosophy notes, “Although relativistic lines of thought often lead to very implausible conclusions, there is something seductive about them…”

I know, dear reader, that just by reading the title you will raise your eyebrows in irritated frustration. How could one possibly defend a position like this? How could one defend nihilism, moral relativism, and the other most vile depravities that mankind has ever produced? I can already smell your growing ennui over the usual stubborn liberal idiocies… Well, I’ll be glad to try, not to dissuade you from your absolutist beliefs, but at least to give you the tools to judge Relativism and its merits properly, apart from the usual caricatures. So please indulge me; I’ll try to be as brief and as clear as I can be.

I propose to go directly to the juice here and try not to derail too much. Relativism is so false, you will shout, due to the following obvious undeniable truths:

  • A philosophy that states that the Truth is that there is no Truth is nonsensical and self-contradictory;
  • A philosophy that states that morality is not objective reduces morality to a “fad,” neither right nor wrong, just whatever people at a given time feel is “right”; in such a situation even “rape” could be considered a “good thing”.

These criticisms may sound robust enough to end the conversation, until you actually ponder for a moment and try to judge the alternative with the same scalpel, the same rigor. Once you start doing this, you see the cracks opening, an entire edifice shattering before your eyes. The alternative I am talking about is Absolute or Objective truth: a Truth that is “independent” of the human mind, that is absolutely true irrespective of anything else. Intuitively you’d guess this is a much more robust philosophy. Things are either Right or Wrong with a capital letter, and it’s merely our fault for not getting it; after all, Errare Humanum Est and all that.

That’s where the problem lies; that’s where the cracks open. We have to ask ourselves this simple question: is there any Truth about the World that we can be 100% sure of? Aren’t all truths either conjured by ourselves or plain hearsay? Isn’t everything we can utter just a conjecture hinged upon other conjectures? Is an absolutist philosopher capable of producing the Absolute Truth about anything at all (other than tautologies, that is)? You may think it rather easy to come up with at least one proposition of the sort (you may even try the Cartesian one, for the sake of the Tradition you may be so enamored with), but even into those we can easily inject the poison of doubt and ambiguity. Cogito Ergo Sum is filled with assumptions about how the world works which are accepted without questioning. What if we question them? What is left of this absolute truth but ashes?

My position is that this Truth hasn’t been established at all. Mr. Briggs will tell you that there are some Objective Truths that we “just know” intuitively with our “gut,” and that the “null hypothesis” is that these truths exist; those who are sceptical of them are welcome to try to prove they do not. To me this is not only a terrible cop-out, but it brings huge problems. To a Socratic inquisition, “how do you know this to be true, then?”, such people will just irritatedly answer “I just KNOW, ok? Get off my lawn!” Sorry, not good enough. What if my gut tells me a different story than your gut? What then: will you simply deny my gut’s “authority” over yours? You can see from these simple questions the deluge of silliness that comes from assuming we have such a direct connection with the Truth. Why is this important, you’ll probably ask. Men and women may not know when they actually stumble upon absolute truths, but they exist nevertheless, don’t they?

But if you have no tools to assess when you have stumbled upon those truths, how do you know that you ever do? More importantly, if you cannot know an absolute truth, what makes you any different from a relativist? From the omniscient point of view, you are wandering just as cluelessly among silly pseudo-truths as the perverted are. The only difference is that the latter aren’t blinded by some righteous posturing on the issue.

But that’s not…“What about morality!,” you will cry. If Relativism were “true”, wouldn’t we all be rapists and criminals? Wouldn’t it be possible to create a moral rule under which you could do whatever you wanted to anyone else? Well, dear reader, if that is an empirical test of a hypothesis, then clearly Relativism wins. Even an absolutist people like the Hebrews raped, tortured, killed, and committed genocide against many others, not against but in the name of their god, as the most just thing to do. IOW, what you consider Relativism’s worst nightmare, were it “true”, has already happened many times in History. Absolutism didn’t stop it from happening; it actually condoned it all. Slavery was deemed ok. Beating children was deemed moral. Eating beef is still deemed awesome and juicy instead of barbaric, as it will be in a hundred years (my prediction!).

Absolutism is, ironically, more relativist than Relativism itself, for it does not even recognize its own temporality, and, like in an Orwellian nightmare, is always insisting that Eurasia was always and will always be a good thing to bomb (we just didn’t know it before, and will probably forget it in the future, my bad).

“Why, now you are just being beyond silly!”

No, I am not. Ponder: which is worse? A philosophy that accepts the fragility and limited point of view of its wisdom and knowledge, or a philosophy that really thinks there is an ultimate point of view of absolute wisdom worth possessing? A philosophy that is humble enough to understand the temporality of its judgements, or a philosophy that arrogantly judges everything around it with a scent of the atemporal forever, always forgetting how in the past other absolutists judged with the same arrogance but with wildly different moralities?

Last but not least, the so-called inconsistency. It’s a myth, friends. No Relativist will claim to absolutely know there are no absolutes. Please give us more credit than that. It’s very easy to understand: we are claiming that we cannot see how one could possibly assert anything absolutely. We are not saying that no one could ever do such a thing. Such a statement is a strawman. Bury it; leave it alone. We do not absolutely know there are no absolutes. We, like Poincaré, simply do not care about that hypothesis. We simply require none of it; we can live without all of those absolutist requirements. We live in the Sea of Limitations, the Valley of Finitude, the Mountains of Humility. We have the gall to say we ought to be humbler. And so can you.

I’ll be ready to further these thoughts and more in the comments below. Fire away, friends.

Interest In On-Line Statistics Courses?

You might have heard what happened when Stanford professor Andrew Ng put his machine learning course (a practical kind of statistical modeling) on line. He expected mild interest. One hundred thousand students signed up.

This was enough to excite even the New York Times, which dispatched Tom Friedman to investigate. He called Ng’s success a “breakthrough”, which it certainly is, in its way.

Friedman also puts us on to a new company called Coursera, which offers courses by professors from well known universities, in much the same spirit as Ng offered his. “The universities produce and own the content, and [Coursera is] the platform that hosts and streams it.” Too, many universities already have on-line courses housed on campus.

Several things. Stanford pays Ng’s salary. This money is only partly for teaching, and more so for Ng to publish papers, which contain the material which will eventually be taught. If Stanford didn’t pay him, he would have little time to think of what new to say. Also, Stanford rightly owns the content of Ng’s lectures. Pay for work, etc. Same thing at Coursera, which hires out the professors for a cut of the pie. All well and good.

Some courses are ideal for the web. The closer the course content is to cookbook recipes, the apter. I mean no disrespect. A course which shows you how to install and run a certain program is nothing but a series of recipes, a well marked path with milestones and a known destination. Basic machine learning fits this scheme. As do the courses offered at Coursera: Algorithms, Calculus, Introduction to Logic, Vaccines, and Securing Digital Democracy (electronic voting schemes). Recipe does not mean easy.

But maybe not statistics. Unless you want a course in classical frequentist thinking, which is cookbook all the way. Coursera has one just like this. Taught, like it is in many places, by a psychologist (who I’m sure is a nice guy).

I don’t see any courses in history, poetry, literature, high-level philosophy, and the like; all classes which are not amenable to multiple-choice testing. These are courses which don’t necessarily have an end, or have different possible ends depending on the mixture of students, or which require students to do a lot of writing and talking.

On-line, it’s just as easy for a professor to grade one multiple choice or only-one-correct-answer test as it is to grade 100,000 such tests. But only if the class is recipe-based. One professor could not read through 100,000 essays, or listen to as many presentations. In fact, for these kinds of “free form” courses, he may not be able to handle as many students on-line as he could in person, since the material delivered remotely introduces some level of ambiguity and slows down interaction.

Student interaction for recipe courses, which are (and should be) more popular, is greater, because if the recipe says that “at time T, add X cups,” then it’s likely many students will have this information at hand when they are queried at some forum. This is not as likely in the free form class, where the answers are rarely as firm.

The way I run my introductory class is free form. It is also not a regular course in statistics, but Bayesian from the get-go, and in the predictive sense (“all parameters are a nuisance”) regular readers will understand. This marks it an oddity. I get away with it because of a certain rare confluence of events. But it would never fly at most universities, where professors are, in their professions, more conservative than Rick Santorum (“What! Teach Bayesian probability before frequentist? Never!”). Just ask any professor how easy it is to introduce a new course into the system.
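The predictive sense can be illustrated with a minimal sketch (my example, not from the course): in a Beta-Binomial model with a uniform prior, the unobservable parameter is integrated out entirely, so the answer is a probability for the next observation rather than an estimate of a parameter.

```python
from fractions import Fraction

def predictive_prob_success(k, n):
    """Posterior predictive probability that the next trial succeeds,
    after observing k successes in n trials under a Beta(1, 1) prior.
    The parameter theta is integrated out (it is a "nuisance"), giving
    P(success | data) = (k + 1) / (n + 2), Laplace's rule of succession."""
    return Fraction(k + 1, n + 2)

# The plug-in estimate would simply be k/n; the predictive answer
# differs most when data are scarce.
print(predictive_prob_success(0, 0))   # 1/2: no data yet, uniform uncertainty
print(predictive_prob_success(9, 10))  # 5/6, versus the plug-in 9/10
```

Note the first call: with no data at all, the plug-in estimate is undefined, while the predictive answer sensibly reports even odds.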

I lecture, but not in contiguous blocks of time. I ask the students lots of questions, and frequently. The answers they give provide direction for the course. Students come to the board and work out various matters among themselves, guided by me. I have students collect data (which fits a given paradigm) on any subject which interests them. This is a wonderful way to maintain interest, but it limits the use of canned examples, which would free up time.

I also have to spend a lot of time walking people through basic computer tasks, especially R. As a final exam, each student presents a talk on their subject as if to an audience presumably unaware of statistics (many use data from their workplace, which they later show their bosses, reportedly to good effect). They must describe their interest, the data, the pertinent questions, show pictures, explain how they quantified their uncertainty, and finally detail how they would check it all when new data arises.

Could this work on line? I’m skeptical, but intrigued. Places like the University of Phoenix push thousands of students through their pipes, and not all the classes are recipe-like, so maybe it can be done. All ideas welcomed.

The last difficulty is “credit.” Some courses earn credit as normal classes. But the free courses universities and Coursera offer come with nothing except a pat on the back or perhaps a letter stating that the student made it through. Of course, I love this trend away from formal “certification” and towards actual love of learning. Seems to work best for recipe courses, though, whose students actually want to bake a cake.

The “free” part doesn’t hurt when attracting students: a fine plan for behemoth institutions; wouldn’t work well for little guys like me.


Richard Dawkins Turns Gideon?

Today a story which falls under Bets You Would Have Lost. Angry atheist (sounds like a board game) Richard Dawkins said he would have “contributed financially to the scheme” of passing out free King James bibles to U.K. students had somebody contacted him about it. Not only that, but he said he was “a little shocked” that the tots did not already have the bibles.

And not only that, but he, in a rare non-sarcastic voice, hopes that the kiddies actually read the bibles!

All this from a man who likened religion to child abuse. But don’t fret about that judgment or worry about hypocrisy because Dawkins conjoins his religion-equals-child-abuse meme with the belief that rape, child abuse, and other sins against the body are morally “arbitrary.”

By which he means that the raw emotions which tell us these acts are “wrong” were placed in our brains by selfish genes operating under the pressures of evolution, and as such these emotions are able to be suppressed (as he would have religious impulses suppressed) with the application of Reason. Women—let her that readeth understand—should not ride in elevators with Dawkins in case he forgets to switch on his Reason Center.

His stance does make it sound odd that he would have England’s children read the bible to learn about morality, however. He said he would have the bible read because the bible “is not a moral book and young people need to learn that important fact because they are frequently told the opposite.”

According to the Guardian, Dawkins says, “I have heard the cynically misanthropic opinion that without the Bible as a moral compass people would show no restraint against murder, theft and mayhem. The surest way to disabuse yourself of this pernicious falsehood is to read the Bible itself.”

Two things may be gleaned. The first is that Dawkins believes the bible is filled with advice on how to murder, steal, and run amok, or that, while it contains stories on these matters, it does not condemn them. The second is that these actions are all, to him, morally arbitrary anyway, belonging to a non-Enlightened, and possibly earlier evolutionary, past.

But never mind! As it is he supports fully the plan created by Education Secretary Michael Gove to distribute the books and to have the children learn from them. Not all his co-anti-religionists agree with Dawkins. The National Secular Society is—are you ready?—outraged (isn’t everybody?) and says the plan favors Christianity. They have (selectively? purposely? in pure ignorance?) forgotten their culture also historically favored that religion.

An old story has a lady saying to a prominent philosopher that she did not see why anybody should read the bible or Shakespeare since they were both filled with cliches. Like this lady, Dawkins is aware that much of our language was shaped by King James’s effort, just as he knows how important that translation is to Western history. He would have the bible read for these reasons.

Dawkins was spot on when he said, “Ecclesiastes…is one of the glories of English literature,” though the New Testament is (mostly) a closed book for him. It’s selective (snarky) quotations from the Old Testament all the way. He even managed yesterday to work in an allusion that ancient Jews were Nazis. Don’t believe this? He said, “It was perfectly fine — indeed strongly encouraged throughout the Pentateuch — to kill Canaanites, Midianites, Jebusites, Hivites etc, especially if they had the misfortune to live in the Promised Lebensraum.”

Ah, Richard. Your selfish memes have taken over what you falsely believed to be your consciousness. Don’t worry. It’s not your fault. There’s no “you” there to carry fault anyway. “You” are just a mass of unthinking proteins, blood, and no guts (Dawkins still hides from a man called William Lane Craig). Besides, given the state of our world, nobody will notice “your” faux pas.

But never mind! We should celebrate that an enemy to religion like Dawkins is now friendly to the idea of kids reading the bible. Some of these kiddies may adopt Dawkins’s imaginatively lurid theology, it’s true, but more (I predict) will take away something better. Some will even come across this verse, which has eluded Dawkins’s glance: “This is my commandment, That ye love one another, as I have loved you.”

Update from Briggs: Remember we are all ladies and gentlemen in the comments. No name calling is allowed.

Scientific Truths Are Not Better Truths Than Just-Plain Truths

One of the key fallacies of scientism, in the sense of being the most destructive to common sense and personal wellbeing, is to suppose that any theory put forth in the name of science is therefore true, or certain enough to believe as true. The posited theory is, after all, scientific and, so scientism says, there is no better recommendation to truth than this.

This fallacy is field dependent, cropping up in some areas much more frequently than in others. It is rare, though still frequent enough in a mild sense, to find the speculations of chemists being refuted each new generation. But it is common as daylight to find the hypotheses put forth by sociologists, economists, and psychologists refuted not a generation after they are published, but often in the next issue of the same journal.

For example, the Weekly Standard’s Andrew Ferguson reviews the bad science and scientism behind the recent spate of experimentation “proving” conservatives are dumber/more inflexible/less compassionate/etc. than liberals, theories which are collected in Chris Mooney’s hagiography to scientism, The Republican Brain. (We began a collection of these studies here; please contribute. Also see Mike Flynn’s take on this.)

The studies rely on the principle that has informed the social sciences for more than a generation: If a researcher with a Ph.D. can corral enough undergraduates into a campus classroom and, by giving them a little bit of money or a class credit, get them to do something—fill out a questionnaire, let’s say, or pretend they’re in a specific real-world situation that the researcher has thought up—the young scholars will (unconsciously!) yield general truths about the human animal; scientific truths.

Although he didn’t intend it, Ferguson shows us another fallacy: that, in virtue of their being generated by scientists, “scientific” truths are better than other kinds of truths, say metaphysical or logical truths. Stating it so plainly makes it obvious that if a truth is a truth, then it is a truth, and a truth is not more “truthy” because it comes from a scientist than a truth which comes from (say) a theologian.

The main problem with this summary is that scientists often use the word truth to mean “a belief which is probably, but not 100% no-matter-what certainly, true.” That latter creature is not a truth at all; it is a conjecture and nothing more. A conjecture which is “almost” true, or “for all practical purposes” true, is still a conjecture and not a truth. A truth is only true when it always is, when it can be deduced, when it arises as the end result of a valid argument. That is, conjectures, while they are conjectures, are not truths; but conjectures might become truths as new evidence arises.

Physicists make the mistake of confusing truth and conjecture just as often as sociologists and psychologists, only the physicists’ conjectures more often turn out to be truths as that new evidence arrives, so their error is of less consequence. Note that it is an error (a fallacy) to say, given evidence less than deductive, that a conjecture is a truth. The error will turn out to be more or less harmful depending on to what use the conjecture is put. If one is betting that a protein will fold a hypothesized way, one has turned a conjecture into a forecast, which is a term that acknowledges a conjecture is less than a truth.

If the conjecture turns out true, because of new evidence from an experiment, then the conjecture turns into a truth and gains are made. If the conjecture turns out false, we again know this from new evidence, and losses are suffered. The losses are of particular interest: they are smaller the closer the conjecture is to the truth. This is why the losses are greater in the soft sciences: their conjectures are much more often farther from the truth.

One reason for the difference is that physicists more often than sociologists test their conjectures against reality. Another reason is that the evidence for a conjecture for the hard sciences is not just statistical, as it often is for soft-science conjectures. And any conjecture which relies primarily on statistics—given, that is, how statistics is practiced today—should not be trusted.

I’ll insert my usual plea that soft scientists act more like their hard (knock) brothers. Do not just assemble a one-time shot of data and compute some statistical model and tell us how well that model fits your data, and then assume because this fit is “good” that therefore your conjectures are true. This is formally a fallacy and is the weakest kind of evidence there is, but (almost universally) the only kind which is offered.

Instead do two things: (1) make every effort to discover evidence which refutes your conjecture (and then tell us about it). And (2), as hard scientists (often) do, make predictions of data you have never before seen. If you do both these things, then you can ask us to believe your conjectures. Otherwise, keep quiet.

————————————————————————————

Thanks to the many readers who pointed me to Ferguson’s article, including Mike Flynn and Doug Magowan.

What Organic Boors, Swedish Pronouns, And The Exorcist Have In Common

Not much, except to demonstrate that the natural state of modernity is something closely resembling mass lunacy. To explain.

Swedish Pronouns

Via Sam Schulman (Twitter: @Sam_Schulman) we infer that 1984 has not been translated into Swedish. This is an inference, mind, and not a direct claim. But you’ll agree it is a likely one after you learn that in Sweden “it is now considered a distinct discrimination if one is addressed as a man or woman.” So reports Kopp Online.

Sweden is angling to de-genderify their pronouns so that use of he or she is officially discouraged, to be replaced by something resembling it (hen). Not only does this move strip useful information from its language, the Swedes have made an important step in subtracting from a person’s humanity, since to be called an “it” is to be equated to a chair or a bug. Now that, dear reader, is true equality. And it is under the banner of Equality that these changes are being made.

Ho hum, you say? Then consider that “Nyamko Sabuni, currently Minister for Integration and Gender Equality, is now trying politically to free children from the constraints of gender roles.” Sabuni is leading an effort to de-genderify names, so that if he succeeds Sweden will no longer have the equivalent of Bills and Janes, but will be a nation of only Pats and Chrises.

There is in Sweden a kindergarten with the telling name of Egalia where “gender-free” children are taught the joys of homosexuality and to play house, imagining, for instance, that there are “two or three mothers.” In a separate article, the wardens of this institution justify their experimentation by claiming that “gender” is not something which you are born with, but is something which can be “changed at any time.” This being so, the little tots should learn early how to do this morphing.

Organic Boors

Today reports of new “research” which confirms what everybody who has shopped in an urban farmer’s market already knew (“research” is needed because everybody did not have a p-value to accompany their belief). The news is that organic food boors are often bullies, that they are often self-satisfied “snotty and arrogant” moralists.

“There’s a line of research showing that when people can pat themselves on the back for their moral behavior, they can become self-righteous,” says author Kendall Eskine, assistant professor of the department of psychological sciences at Loyola University in New Orleans. “I’ve noticed a lot of organic foods are marketed with moral terminology, like Honest Tea, and wondered if you exposed people to organic food, if it would make them pat themselves on the back for their moral and environmental choices. I wondered if they would be more altruistic or not.”

Eskine found that “organic people judged much harder” than ordinary humans, that when “it came to helping out a needy stranger, the organic people also proved to be more selfish.” The money quote:

“There’s something about being exposed to organic food that made them feel better about themselves,” says Eskine. “And that made them kind of jerks a little bit, I guess.”

The only surprise is that Eskine found his results surprising: “You’d think eating organic would make you feel elevated and want to pay it forward,” he said.

He (I assume Eskine is a “he” because his first name is Kendall) should have gone to any Whole Foods…

HT Hot Air.

The Exorcist

Stay with me for a long story made short: Health and Human Services Secretary and self-labeled Catholic Kathleen Sebelius was invited to speak at one of Georgetown University’s commencement ceremonies, which was, or is, kind of, affiliated with the Catholic Church. This was controversial because Sebelius instituted a “mandate” which said that Catholic employers must provide (via “health” “insurance”) their employees abortion-inducing drugs and contraception. Recall poor Sandra Fluke and her plea for somebody—anybody but herself—to fund her birth control.

Abortion and birth control, whether you are for them or not, are against Catholic doctrine, meaning that any Catholic institution had no business honoring a woman like Sebelius who knowingly “mandated” a removal of religious freedom. Sebelius, in a weak attempt to justify her curious behavior and stick in the eye of her critics, in her speech said,

[President John] Kennedy talked about his vision of religion and the public square, and said he believed in an America, and I quote, “where no religious body seeks to impose its will directly or indirectly upon the general populace or the public acts of its officials — and where religious liberty is so indivisible that an act against one church is treated as an act against us all.”

Like many other things in the Obama government, Sebelius got it exactly backwards. There was no fear the Catholic church would foist its views on an innocent and unwilling public, but instead the reality of an omnipotent government forcing the Church to abandon its core beliefs. Shame on her for misusing Kennedy’s words.

Enter William Peter Blatty, author of The Exorcist and alumnus of Georgetown, who is organizing an official protest against Georgetown, which Blatty believes (if I may) is possessed by a spirit of idiocy.

Even if you are not Catholic and earnestly desire abortion for free, Blatty’s fight is yours. Kennedy was right: “an act against one church” is “as an act against us all.” If you let the government encroach upon our freedom because you lust for free contraception, you must know that this will not be the end and the government will soon come for what you treasure.


© 2014 William M. Briggs
