“I Can See The Future!”

The future begins tomorrow. This being so, it doesn’t seem especially difficult to say what will happen in that uncertain land. Weathermen, and their noble sisters the weatherwomen, daily dispatch dependable forecasts for tomorrow.

Weatherpersons are not especially gifted soothsayers. Tomorrow can be seen by many—but only just. Peering twenty-four hours into the future is like a near-sighted man who has forgotten his glasses trying to identify a friend in front of a building at fifty yards. The face is blurred, but the building takes shape.

Big things are easier to see than small, but even the large grows small with distance, and the view grows murkier. Yet these difficulties don’t dissuade some from telling us of their visions. This only becomes a nuisance when the person telling the tale swears he can see where others cannot. I’m thinking of cocksure climatologists who claim to be able to call the temperature to within a tenth of a degree a decade hence.

And then there are the “futurologists.” Robert Nisbet says futurology is “one of the more pretentious of the pseudo-sciences of the twentieth century and is fully deserving of the neologism by which it is known, comparable to labeling the study of the past ‘pastology.’”

It’s easy to set up shop as a futurologist: get somebody to pay you to make predictions of what will be in ten years. As long as you are confident, bold, and pay heed to fears and desires, your success is guaranteed. Your mistakes will rarely be held against you.

There is a hefty share of self-publicity hounds among futurologists: the Faith Popcorns who beguile businessmen into forgetting the pain of paying for poor predictions. But there are also, according to the Boston Globe, “serious futurologists” like “Nick Bostrom, an Oxford University philosophy professor who heads the school’s Future of Humanity Institute.”

One debate within this community is when the “singularity” will hit and what one should be wearing when it does. The singularity can be roughly defined as the point at which life emulates the plot of the movie Terminator: technology becomes so advanced that humans are superfluous, mere cogs in the world machine.

Actually, there is no precise or universally agreed-upon definition of what this mystical point is. Some say it will mean the rise of “Ems,” a.k.a. “future whole brain emulation robots,” machines into which we frail, death-plagued people will download our “selves.”

With an ample supply of spare parts, we would live forever! This idea thrills proponents, but I see it as dystopian. Isaac Asimov predicted that if life can last millennia, future people will spend most of their time avoiding all risks, trying not to wear out and die. Who would volunteer to take a risk? What rewards could there be?

Machine-people aren’t likely, either. First, despite all the progress, modern computer chips are, compared to the wiring of our nervous system, like tinker toys bought at a garage sale that lasted one day too long. Calculating the digits of π at blazing speed is trivial next to computing how many shirts to pack on your next business trip. The difference in complexity is orders upon orders of magnitude.

Second, we are not our brains, but our entire bodies. We aren’t just the synapses in our skulls, but nerves, sinew, bone, and muscle. All these things are one interacting system. We can’t just invent a database for our brains; we must create an entire life-like machine, one whose intricacy would rival our own bodies.

Third, we very probably aren’t smart enough to do it. It has been a bare 100 years that humans—who have existed for 250,000–400,000 years—have been able to invent toys sufficiently entertaining to distract us from our real work. Who says progress will continue? The only reason we got this far is that we figured out how to make food cheap and plentiful.

By “we”, I mean the species; thus, I speak nonsense. Only actual people know how to farm. When these people go to meet their grandfathers, some will leave their wisdom behind in books. But those that come after them will have to read and understand those books.

Each new generation must spend time learning what came before it. Once knowledge reaches a certain level, our progeny will spend most of their lives assimilating what came before. They won’t have time to create.

We are not infinitely intelligent, therefore we can’t think ourselves into every corner of the universe. Some things will always remain a mystery. Our great-etc. children might be able to see farther than we, but even their vision won’t penetrate into the future.

Do Americans Still Dislike Atheists?

The answer is: Americans hate atheists just as much as they hate theists. Residents of these grand United States have always disliked prigs of any stripe and are not shy about saying so.

What riles is not a man’s theology or the lack of it, but the way that man non-humorously and forcefully pushes his belief on another. When a zealot, in the manner of an insurance salesman behind on his quota, says you are a fool or a dupe for not thinking as he does, it’s only natural to want to see that man’s tongue stung by fire ants.

Take the academic philosopher Daniel Dennett, a leading proselytizer of atheism—a fellow not entirely unrepresentative of that breed. This tenured gentleman says that a mother teaching her daughter to believe in God is “child abuse.” He doesn’t mean Bible reading is akin to abuse, but that it actually is abuse. Now I ask you, if a pollster queried you about your fondness for this man, what would you say?

Gregory Paul and Phil Zuckerman, both academic sociologists, fret that you might not like Dennett. Even worse, they feel your dislike would violate Dennett’s “rights” in some vague sense. In the Washington Post the pair wonder if Americans’ “knee-jerk dislike of atheists [is] warranted”.

After all, there’s plenty to love about atheists:

A growing body of social science research reveals that atheists, and non-religious people in general, are far from the unsavory beings many assume them to be. On basic questions of morality and human decency — issues such as governmental use of torture, the death penalty, punitive hitting of children, racism, sexism, homophobia, anti-Semitism, environmental degradation or human rights — the irreligious tend to be more ethical than their religious peers…

An unfortunate injury to which many professors are prone is selective deafness: academics can hear what others are saying but the words which issue forth from their own mouths and pens fail to be registered. This sad malady is the natural consequence of lecturing too many snoozing students and writing too many unread papers. If academics actually listened to themselves, they would go crazy.

This disease has struck our pair with force. Consider their statement that “research reveals” atheists are “more ethical” when it comes to considering the death penalty, gay marriage, “environmental degradation,” and so forth.

If they had been able to read what they wrote, they would have realized that their statement implies there is only one indisputably correct answer to each of these questions. They assumed that the (politically) correct positions are so well known to readers that the positions need not be given. Do all atheists concur on the subject of “environmental degradation”?

Atheists are better people:

As individuals, atheists tend to score high on measures of intelligence, especially verbal ability and scientific literacy. They tend to raise their children to solve problems rationally, to make up their own minds when it comes to existential questions and to obey the golden rule. They are more likely to practice safe sex than the strongly religious are, and are less likely to be nationalistic or ethnocentric. They value freedom of thought.

Gosh, what cads theists must be, what rustic rubes! Why would anybody want to be one of these things when they could be an atheist? Atheists are brainy, rational, scientific. The kind of folk that know how to pronounce “existential”, people who are careful while rutting to not spread disease nor their distrustful genes; the kind of intelligent beings that look upon patriotism with the disdain that that outmoded concept deserves.

Most importantly, atheists value freedom of thought. Dennett’s grasp of freedom is so strong that he preaches prison for parents who drag their offspring to church.

Why, given their puissant philosophical purity and obvious ethical superiority, aren’t atheists loved and acknowledged as leaders in thought and deed? Paul and Zuckerman think that it’s because “Psalm 14 claimed that atheists were foolish and corrupt, incapable of doing any good”, and that this notion stuck.

Further, surveys “find that most Americans refuse or are reluctant to marry or vote for nontheists” and that Americans “intrinsically suspect” atheists. “Negative stereotypes of atheists are alive and well. Yet like all stereotypes, they aren’t true”.

Here, as Bertie Wooster might have said, is where our pair have made their bloomer. For stereotypes are almost always true. It is the explanations about stereotypical behavior that are often false. As Steven Goldberg1 has shown, “Stereotypes reflect a population’s nearly always correct observation that certain groups exhibit certain temperamental or behavioral tendencies that set them apart from the rest of the population.”

Atheists aren’t disliked because of their philosophy. They are disliked because the most vocal of them (like the unknown artist responsible for today’s graphic) aren’t likable.

——————————————————————————————–

1. Steven Goldberg, When Wish Replaces Thought: Why So Much of What You Believe is False. Prometheus Books, Buffalo, New York, 1991, p. 151.

College Graduates Are Academically Adrift

Repost: I meant for this to run two days, but events overtook it. Therefore, I’m restoring it to the top for the remainder of the day.

Academically Adrift: Limited Learning on College Campuses by Richard Arum and Josipa Roksa

In his justly famous Introduction to Logic, Irving Copi asks us as homework to identify the fallacy in this argument:

Our nation is a democracy and dedicated to the proposition that all people are created equal. We believe in equality of opportunity for everyone, so our colleges and universities should admit every applicant, regardless of his or her economic or educational background [7th ed., pp. 127–128].

I will not be surprised if you do not see it; indeed, this argument is universally taken as sound. But the fallacy, hidden by layers of unconsciously assimilated politics, shines if you replace “colleges and universities” with (say) “major league sports franchises” and “educational background” with “athletic ability.”

One reason the fallacy passes for truth is because people (and professors) assume they know what college means. They often do not. College provides three functions: research, job training (majors like “business” and “nursing”), and imparting wisdom (what is thought of as a classical education). These functions, though they have little in common, are usually housed in the same facilities so the confusion about which is meant when using the word “college” is natural.

Arum and Roksa define by implication: college is a mixture of job training and (what I will call) traditional college. This is important because the majority of kids entering college do so for job training, and they arrive to find the strange and unfortunate mixture of training and rigor which characterizes undergraduate education in America.

A “growing proportion of high school graduates are entering higher education” because of the increase in “access.” But the problem is that too many kids are coming. Whether it is because of inability or inadequate preparation, many can’t hack the work. These facts explain the authors’ thesis: “We find disturbing evidence that many contemporary college academic programs are not particularly rigorous or demanding.” This is so because if the traditional rigor were not slackened at the same rate as the increase in enrollment, too many kids would be flunked out (taking with them valuable diversity and money).

The authors never make this connection, but they do say that students “expect”—this is the key word—”to enroll in college and complete bachelor’s degrees, even when they are poorly prepared to do so.” Since enrollments are still increasing, it is a safe prediction that the performance of the average graduate will decrease (this can be so even if the number of top performers remains fixed).
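To see the arithmetic behind that parenthetical, here is a minimal sketch with made-up scores (the numbers are hypothetical, not from the book): hold the count of top performers constant, let expansion add only weaker entrants, and the average must fall.

```python
# Hypothetical scores: the same 100 top performers appear in both
# cohorts; the expanded cohort adds only weaker entrants.
top = [90] * 100                                # fixed number of top performers
small_cohort = top + [70] * 100                 # earlier, smaller enrollment
large_cohort = top + [70] * 100 + [55] * 200    # expanded enrollment

def mean(xs):
    return sum(xs) / len(xs)

print(mean(small_cohort))  # 80.0
print(mean(large_cohort))  # 67.5: the average falls, top count unchanged
```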

The authors offer no theories nor recommendations: the closest they come is to suggest tying school-level grants to student outcomes. Their goal is instead to provide statistics about who is doing well and who poorly. They excel at this; but the prose is so dry that the reader is advised to extinguish all flames while reading: this is a book by specialists aimed at the same; a third of the bulk is taken up by footnoted charts.

Most statistics come from a pre- and post-administration of the Collegiate Learning Assessment (CLA), given upon matriculation and again after two years, a period during which it is presumed students learn something. The CLA works by providing a jumble of materials surrounding a theme, which students must plow through to write a memo. The memo is graded for its analytical reasoning, writing effectiveness and mechanics, and problem solving.

Whatever weaknesses the CLA has—there are critics and all tests are imperfect—it surely mimics a routine task of cubicle dwellers, i.e. those ex-students hired because they have “degrees.” Surely kids should become better at these tasks after paying two full years of tuition. However, kids only average about a “seven percent gain.” And many students’ scores decrease: “With a large sample of over 2,300 students, we observe no statistically significant gains in critical thinking, complex reasoning, and writing skills for at least 45 percent of the students in our study.”

The real kicker is this: “There is some evidence that college students improved their critical thinking skills more in the past than they do today.” I would only quibble with the word some; it should be plenty. Anyway, the results are in line with the increasing-enrollment-decreasing-rigor theory.

There are no surprises in the results, not even a mild one: students with better educated parents did better, students who were better prepared did better, whites and Asians did better, students with better SAT scores did better, students who had more contact with their professors outside the classroom did better, students who studied alone did better.

In every measure, Business students did worst, followed closely by Education or Social Work majors. Science and Math students did best (and quite well), followed by Humanities majors. Health, Communications, Engineering, and Computer Science students were in the middle. Blacks studied less than whites, and were more likely to enroll in easier courses. Students with educated parents studied more.

My experience teaching adds to the evidence that Business students are the poorest and least motivated: for many (not all!) of these kids, college is the place to receive their “degree” without which they cannot progress. The idea that they should be made to learn anything that isn’t immediately useful mystifies most of them.

But credit hours must be earned! So students avoid difficult courses and instead take ones which are “soooo easy” (a direct quote). “Students often embraced a ‘credentialist-collegiate orientation’ that focused on earning a degree with as little effort as possible…12 percent of coursework was devoted to other subjects that included courses on their transcripts in areas as diverse as golf, tennis, and ‘ultimate Frisbee.’” More “courses on student transcripts had the terms race, gender, or sex in their titles…[others] had the words cinema, film, or movies in their course titles…and [more] had the term sexuality in their titles.”

They quote studies from Mary Grigsby, who found “that 70 percent of students reported that social learning was more important than academics…which they referred to as ‘work’ in contrast to social learning, which was regarded as ‘fun.’” Only an academic could prefer the euphemism “social learning” over “partying,” but let that pass: these books and papers must get past peer review.

Some trivia: a survey of time spent showed 24% sleeping (estimated), 9% working at jobs, 51% socializing, and only 9% in class and 7% studying. Yes, only 16% on schoolwork. “Consistent with other studies, we find that students are not spending a great deal of time outside of the classroom on their coursework….37 percent of students reported spending less than five hours per week preparing for their courses.” That’s self report, incidentally, so it would not shock to learn the real number was even less. In banal language, they announce that the “limited number of hours students spend studying is consistent with the emergence of a college student culture focused on social life and strategic management of work requirements.”

Why are so many kids going to college instead of finding a job or searching for a trade school? The authors cite James Rosenbaum’s research on high school guidance counselors: “[Kids] receive ‘motivational platitudes’ that emphasize a ‘warm, fuzzy approach’ focused on ‘personal growth.’ Students are told to ‘believe in themselves,’ ‘put forth more effort,’ or ‘establish themselves a little more as a person.’”

Seeing college as a business and students in the role of “consumers” does “not necessarily yield improved outcomes”, and there is no reason to expect that consumers will “prioritize learning…. Growing numbers of students are sent to college at increasingly higher costs, but for a large proportion of them the gains in critical thinking, complex reasoning and written communication are either exceedingly small or empirically nonexistent.” Students “might graduate, but they are failing to develop the higher-order cognitive skills that it is widely assumed college students should master.”

There is not one lever to pull to fix the situation, but many. First, research and graduate apprenticeship training should be separated from undergraduate education, which itself should be split in two: traditional classical education, reserved for the demonstrably best and brightest, and technical training, where the great mass of students would flow. The latter would train all those whose main purpose in attending college is to get a job.

This training, depending on the difficulty of the subject, should be for one to three full years (counting summers): business students would be out earning money in a year, nurses and engineers who have to accumulate more information would take three. We have to stop asking students to do what they cannot do and have no interest in doing.

Osama Bin Laden Dead: Time To Quit Afghanistan

Update below

The man whose mind was a slave to evil and who planned the attacks of 9/11, the man who spent the last decade of his pathetic life cowering in fear, the man who championed pain and grief and misery is dead. May he find no peace.

His body, complete with the bullet that killed him, was unceremoniously and fittingly dumped at sea. His corpse will feed the sharks, his bones will mingle with the effluvia of cruise ships. The odd burial was done to discourage pilgrimages of evil, and was an action in accordance with at least one Hadith, which says, “Graves should not be marked or built” for apostates.

Further, the Noble Qur’an (At-Tauba 9:84) demands that we do not attend the funeral of a hypocrite or disbeliever. No believer would have coolly murdered Muslims, Christians, Jews and others merely to scar some buildings. It was bloodlust, not faith, which drove Bin Laden.

We celebrate his death. We feel pride in our men who tracked him down and risked their lives and shot him in the head. None of our soldiers were injured. This is a good day.

And we are not alone. In an apparent reversal of its stance on the death penalty, the New York Times wrote approvingly that “the mood on the street was jubilant” after the announcement in Times Square. Its editorial even allowed that Bin Laden was a “failure,” a sentiment to which we can say amen.

The Daily Kos, in a fit of eloquence, said that putting down Bin Laden was a “BFD.” Although one unhappy writer on that unhappy site inadvertently confessed a bizarre sin, “[Y]eah, on a certain level, I get the mocking of Republicans because Bush didn’t get Bin Laden and ‘we’ did.” We?

Mr Obama said that “justice was done”, which must mean that killing Bin Laden was legitimate, moral, lawful, the very stuff of justice. Amen again.

Pondering the president’s words, Steve Clemons of the Huffington Post wrote that, “The President of the United States has checked off the box in bringing Osama bin Laden and al Qaeda to justice — and probably assured his reelection in 2012.”

This might be so, but Clemons and the writers at Daily Kos should remember that this was neither Democrat nor Republican justice. Nor was it American justice. It was justice, plain and simple. It was the right thing for Mr Obama, as it would have been the right thing for Mr Bush had Bin Laden met his demise earlier. This was not the right thing for Americans only. It was the right thing for all.

Bin Laden was found in Pakistan: it was said he was there for some time. Mr Obama, making full use of his “I”s and “my”s, the victor’s traditional prerogative, said that the Pakistanis claimed Bin Laden “declared war on Pakistan itself”, and this is why Pakistani officials led our troops to Bin Laden’s hideout. Further, we are assured that we “received clearance to strike from Pakistan.”

The message is that we should not hold the Pakistani government culpable for harboring a murderer for almost a decade. At least, not to the extent that we held Afghanistan culpable.

Even though Bin Laden was within shopping distance of Islamabad, it is doubtful we will hear calls to “Hold Pakistan accountable!” There may be investigations in certain sub-committees, held on Friday afternoons in July, but that will be it.

People are weary of war and don’t want to start another one with Pakistan—a country which has nukes and has said it would use them. Besides, there are other ways to “punish” Pakistan. Arming India, for one.

Killing Bin Laden is the action that could lead us out of Afghanistan. Our military could organize one final, grand push, claim victory, then pack up and leave. We would not be the first country to fail to tame this rugged land. Besides, Bin Laden was on holiday in Pakistan, not Afghanistan.

Calls for actions like this are already being made. Marwan Bishara of Al Jazeera said killing Bin Laden kills the “alibi”, that is, “Washington has less reason or justification to wage a war in Afghanistan now that bin Laden is no more.”

We should embrace this reasoning, particularly since (as Bishara continues) “for the Muslim world, bin Laden has already been made irrelevant by the Arab Spring that underlined the meaning of peoples [sic] power through peaceful means.”

Whether or not this is true, Bin Laden’s killing provides us with a casus pace, of which we should make full use and quickly.

Update

Some are suggesting that we remain in Afghanistan for three reasons: Pakistan, Iran, and money. Since it is obvious that some in Pakistan, the very definition of an unstable country, were complicit in the care and feeding of that dog OBL, there is sure to be more trouble to come from that quarter.

Bases in Afghanistan would have the Pakistani Army/Government swiveling their skulls, looking west to our bases and east to India. The same is true with Iran: it would be boxed in by Iraq on one side and Afghanistan on the other.

But Afghanistan is only useful as a base to project air power. The land is too rugged to stage heavy artillery and troop launches into either Iran or Pakistan. Afghanistan must be supplied by us from the air, flying over either Iran or Pakistan. While it’s true that in any long-term engagement American air power would keep the skies clear, it is probably more trouble than it is worth to supply air bases so far from the sea and quick exits.

It is much easier to maintain forces in southern Iraq, which is accessible by sea through the Persian Gulf. Naval air cover from ships stationed in the Arabian Sea can watch over Pakistan.

We should embrace India more closely as an ally. Let’s not forget that both Pakistan and India have nukes, which nobody wants to see loosed. The stalemate must be encouraged. And we don’t have to maintain an actual presence in India, which is cheaper. Finally, a stronger India is a better buffer against a growing China.

The Non-Candidacy Of Donald Trump

Donald Trump is not, has not been, and will not run for president of the United States. Nor will he campaign for the Republican nomination. He is a cynic and egomaniac whose only purpose in hinting at a run is to garner adulation and attention to feed his thin soul.

Consider: his platform thus far has been nothing more substantial than to insist that there was something nefarious in President Obama ignoring asinine requests to release his “long-form” birth certificate. “You wouldn’t believe what my investigators have found,” he said, implying a deep conspiracy between…whom? He never said, because he could not flesh out his childish claim. And reporters, often unable to hold the thread of an argument for any length of time, have not pressed him.

Only two things are possible here: either Trump believed what he was saying, or he was engaged in a farcical, pettifogging ploy to draw attention to himself. Either disqualifies him from becoming president.

The worst is if he believed himself. Assuming that only I know the real truth, that the populace has been duped by a dark cabal, is a specialty of the conspiracy nut. He lives in a land where denials are proof of complicity, where the most mundane occurrence is deeply laden with occult meaning that only he can see. Does Trump believe himself one of the chosen few who Know The Real Truth?

And if Trump didn’t believe what he was saying, if he was playing to the crowd, then he was not intelligent or savvy enough to know that the “birther controversy” taints whoever touches it. Of course Mr Obama is a citizen; the evidence has long been available.

Two things: first, even having to write “the evidence has long been available” makes one sound foolish, as if there is something worth discussing. Trump forced sober people to waste their time on nonsense, and in so doing he has made it appear that the Republican party houses a substantial minority of scatterbrained lunatics.

And second, he should have realized that it wouldn’t have mattered even if our president was born on a UFO and deposited in a Hawaiian pineapple field in an alien plot to make Mr Obama our leader. Forget having a birth certificate: if Barack Obama came running naked and screaming out of an orphanage wielding a bloody machete, our starry-eyed press would say, “He must have had his reasons.”

Did Trump or any of his followers actually believe Mr Obama could be made to resign (only to have Joe “Hair Plugs” Biden ascend to the throne)? The mind boggles.

Charles Krauthammer called Trump the “Al Sharpton” of the Republican party, and this is close enough. Although I cannot recall Mr Sharpton haranguing audiences that Chinese men were serial, keep-it-in-the-family fornicators. If Trump had any sense of Chinese history or culture, he would understand how horrific his slur was, how dangerous it is for relations between our countries.

Any president or sitting politician who used the words artfully chosen by Trump would be considered infamous. Never mind that his protectionist scheme is economically unsound and more aligned with Democrat than Republican theory—he would fix a 25% tariff on Chinese goods—his behavior is boorish and un-statesmanlike. It is not—and cannot be seen to be—presidential.

Once again, we have a dichotomy: either Trump’s vocabulary, manners, and common sense are stunted, or he used the words he did in cold calculation, merely so that he could thrill to the crowd crying, “Oh, Donald!” Either way, he is disqualified from office.

His “reality” television show is said to suffer from poor and falling ratings. It will end its run soon. All TV shows end eventually, so this is not like the times Trump has declared bankruptcy, foisting the burdens of his imprudent decisions onto the backs of others. However, the show’s end is important because Trump claims he will make a grand statement at that time.

He said, “I think you will be surprised at a number of things, but I think you will be surprised at what my announcement is.” It will be no surprise when he says, “I am not running”, for it is clear that he never was.

He will instead claim that he wanted to “inform the debate”, or that he wanted to “force the true candidates to talk about the serious issues I brought up”, or something similar. Pure rot. The only things Trump brought up are the lunches of serious citizens forced to behold the horrific spectacle of his unbridled vanity.

Hayek Versus Keynes

The video “Fight of the Century: Keynes vs. Hayek Round Two” (via Econstories.tv) allows us to discern one reason why reporters side with the forces of large government.

Hayek’s corner man is von Mises, while Keynes strangely has Say (Galbraith must have been off skiing). Hayek clearly wins the round on points, scoring blow after unanswered blow. The only point on which Keynes bests Hayek is at the lip. Keynes’s ’stache is bold, serious, starched, Bismarckian. It makes John Bolton’s walrus-like appendage appear anemic. Hayek’s fuzz is half-hearted, cut off too soon, a foreign interloper. As a soup-strainer it is a failure. Hand that man a glass of milk!

But Hayek doesn’t even break a sweat in the battle over the public ledger. Even though Hayek wins the fight proper, it is Keynes’s gloved hand the referee lifts at the end, not Hayek’s. Why?

It’s obvious that the fight was organized by some Don King-like entity, the outcome decided in advance by the connected, made men who depend on bailouts, payouts, and regulations foisted on the other guy; the connected who move in and out of the government that feeds them. But why do reporters tout the loser?

Ego. The reporters throng around the man whose theory gives their lives meaning. Oh, sure, there are plenty of other reasons, like indoctrination in college and ignorance of economics. But no less important is self-esteem. I mean, the larger the government, the more important the reporter’s job, and thus the more magnificent his opinion of himself.

Caution: the video is set to poor music. The lyric is clever, but the beat grates: it is, however, tolerable. The noise which accompanies the credits must have been a mistake uncorrected in production. Be ready to hit mute.

The fight isn’t over. There are more rounds to come, as evinced by this morning’s editorial in the Wall Street Journal, which reminds us of Mr Obama’s Keynesian thinking:

First came $168 billion in one-time tax rebates in February 2008 under George W. Bush, then $814 billion in spending spread over 2009-2010, cash for clunkers, the $8,000 home buyer tax credit, Hamp to prevent home foreclosures, the Detroit auto bailouts, billions for green jobs, a payroll tax cut for 2011, and of course near-zero interest rates for 28 months buttressed by quantitative easing I and II.

Scratchy growth, loose money, a tanking dollar, boasts—boasts!—of “Leading from behind”, and inflation coming. Are we ready for five more years of this, or just one?

Government outlays: The Canadian Experience — Guest Post by David Ipperciel

In the post “Taxing the Rich Always Fails Eventually,” Briggs states that, “It is rational to believe that these years [of decreasing deficits] will continue to be infrequent.” I believe this to be the case only if one’s world is restricted to the US. However, when one looks at the experience of other countries, like Sweden or Canada, a different conclusion arises.

The following graph shows the Inflation-adjusted Outlay Per Capita up until 2007 for the Canadian federal government (constant 2007 dollars).

[Figure: Canadian federal inflation-adjusted outlay per capita, 1961–2007]

The same situation can be illustrated with the debt-to-GDP graph, which peaked around 75% in 1995 and moved below 30% in the following years:

[Figure: Canadian federal debt-to-GDP ratio]

The situation is similar in Sweden, where debt to GDP peaked in 1995 around the same levels, followed by a dramatic turnaround (the graph starts in 1880 and shows the debt-to-GDP ratio):

[Figure: Swedish debt-to-GDP ratio, from 1880]

Briggs’s graph starts in 1900, whereas my graph starts in 1961. Prior to 1961, there is no reason to expect a situation much different from the US’s: government started with modest goals and modest tax grabs, with peaks during the two World Wars followed by peacetime corrections, in a general uptrend as the state became more omnipresent. The interesting story is what happens after the 1991 recession. While US outlays continue to push higher, Canadian outlays not only stabilize, but even move lower. What happened?

Beyond public outcry and tax hikes to pay for the increasing debt, the first real blow came from Moody’s 1994 downgrade of Canada’s foreign debt, which lost its AAA rating. Then an article in the Wall Street Journal shook Canadians’ perceptions of themselves: it called Canada an honorary Third World country because of its out-of-control debt. Something had to be done, and the government reacted swiftly: government spending was cut by 20% and 40,000 public jobs were eliminated.

The political landscape also changed: the conservative Mike Harris was elected in Ontario (the largest province, with 38% of the country’s population) in 1995 on his “Common Sense Revolution” platform, which called for large tax and spending cuts, and the elimination of the province’s huge debt. One provincial government after the other voted for balanced budget laws.

In order to implement those laws, contingent reserves were set up as buffers to make sure spending did not surpass income. Establishing conservative estimates of revenues was seen as sound fiscal planning. Year after year, the surplus came out larger than the government forecast. This situation lasted until the last financial crisis, but there is no reason to doubt that Canada will regain fiscal balance in the near future.
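A rough sketch of that budgeting rule (with hypothetical figures, not actual Canadian budget numbers): plan spending against a deliberately low revenue estimate minus a reserve, so that forecast errors land on the surplus side of the ledger.

```python
# Hypothetical figures only -- a sketch of budgeting against a conservative
# revenue estimate plus a contingency reserve, as described above.
expected_revenue = 100.0                          # central revenue forecast
conservative_estimate = 0.95 * expected_revenue   # plan against 95% of it
contingency_reserve = 3.0                         # further buffer set aside
planned_spending = conservative_estimate - contingency_reserve  # 92.0

actual_revenue = 101.5   # revenues tend to come in above the low estimate
surplus = actual_revenue - planned_spending
print(surplus)           # 9.5 -- the forecast error shows up as surplus
```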

The Canadian situation shows that governments can change the tide when pushed to the wall. Often, an excuse is needed to enact these changes. The Standard and Poor’s rating agency put the US debt on negative outlook this month, with a potential downgrade in the next two years. Will that bring about the needed changes? Will the unavoidable tax hike push the population to the edge and lead to change? The future is always hard to predict, but one thing is certain: the possibility for change exists, and the experience from Canada (and Sweden and probably other countries we know little about) is there to show it.