William M. Briggs

Statistician to the Stars!

Uncertainty Is An Impossible Sell

Yessir, this sweet baby of an algorithm can predict anything from NFL games to the Spanish GDP.

I’ve never seen the show, but I’ve heard that the protagonist on the X Files used to have a desk sign which read, “I want to believe.” That sentiment characterizes most buyers and users of predictive or explanatory algorithms, i.e. statistics.

Really, no scientific promise is so large that it will not at least be hoped for. It doesn’t matter how many failures or unrealized dreams pile up; the newest thing always excites. We saw something of this Tuesday with a new algorithm that claimed to forecast the stock market two years in advance.

Experience suggests that this purported marvel, or any new gee-whiz algorithm, will liquefy the rationality centers of those people in charge of securing predictions (financial services, governments, marketers, academics, etc.). The authors of that paper now have a window of opportunity to cash in on their method.

Meanwhile, grumpy naysayers warning against enthusiasm will meet with a lesser fate.

A tale. I was dealing with a potential client who believed they had discovered a means of increasing their predictive accuracy to astonishing heights. R-squareds (a poor measure; don’t use them) northward of 95% were seen in tests! I was to automate the process.

Turns out they were smoothing two sets of time series which originally had no relationship to one another, and then correlating the smoothed series—which suddenly showed a remarkable correlation! Regular readers will know this is a huge no-no. Smoothing artificially boosts correlation and predictive accuracy. I tried showing the client how this works, proving it with several examples using made-up data that looked like theirs.
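The trick is easy to reproduce. Here is a minimal sketch in plain Python with made-up data (nothing like the client’s actual series): generate pairs of completely independent noise series, whose correlation should hover near zero, then smooth each with a moving average and correlate the smoothed versions. The smoothing wipes out the independent information in neighboring points, so the sample correlation balloons in magnitude even though the series still have nothing to do with one another.

```python
import random

def pearson(x, y):
    """Sample Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def smooth(x, window):
    """Simple moving average: the classic way to manufacture 'signal'."""
    return [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]

def trial(rng, n=500, window=50):
    # Two series of pure, independent Gaussian noise: no relationship at all.
    a = [rng.gauss(0, 1) for _ in range(n)]
    b = [rng.gauss(0, 1) for _ in range(n)]
    return abs(pearson(a, b)), abs(pearson(smooth(a, window), smooth(b, window)))

rng = random.Random(0)
results = [trial(rng) for _ in range(200)]
raw, smoothed = zip(*results)
# The average |correlation| of the smoothed pairs is typically several times
# larger than that of the raw pairs, despite zero true relationship.
print(sum(raw) / len(raw), sum(smoothed) / len(smoothed))
```

The window length and series length are arbitrary choices for illustration; the effect only gets worse as the smoothing gets heavier.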

Funny thing is that I convinced a business type I was right, but he was overruled by their algorithm person. This algorithm person was basing his excitement on a sterling certified peer-reviewed paper by an academic from an institution of world renown. Regular readers know all about peer-reviewed papers from credentialed academics.

In what is now the theme of my career, I didn’t get the job.

Another anecdote, necessarily vague because I cannot betray any confidences. This didn’t happen to me, but to somebody I know. Major company wanted to understand how “social media”, specifically Twitter data, predicted their income. My colleague correctly noted how noisy this data is.

My colleague thus warned that, while something might be learned, whatever it would be wouldn’t be earthshaking. Certainly not much money should be spent on the idea. This advice was rejected and the major company sought bids from algorithm firms which could take on the job. One was found. I cannot tell you how much money was asked for or given, but if I did you would faint dead away. I can only say that whoever runs this algorithm company could easily find a position in government.

I offered to do the job for an order of magnitude less. My bid was rejected, but then I, like my colleague, cautioned that not much would come of the analysis.

The sequel? You already know what happened so there’s no point going into it.

If you want to set up business as a data scientist (the newfangled term by which statisticians are beginning to call themselves), the lesson is this: promise the moon and charge like you’re actually going there. Failure is rarely punished and never remembered.

Uncertainty is the same tough sell in science. The way to do statistics (or machine learning, or AI, or whatever) properly, like I’m always saying, is to use whatever model you have to predict new, never-before-seen data. If your model works, you’ll make good predictions. If not, not. Problem is, this method is necessarily less certain than the old ways of doing things.
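A toy illustration of why in-sample performance flatters (hypothetical data and a deliberately silly model, not anyone’s real method): a model that simply memorizes its training data looks flawless on the data it has already seen, and only reveals its true skill on never-before-seen points.

```python
import random

rng = random.Random(42)

def sample(xs):
    # "True" process: y = x plus noise. The model never sees the noise-free truth.
    return [(x, x + rng.gauss(0, 1)) for x in xs]

train = sample([i * 0.1 for i in range(100)])          # x = 0.0 .. 9.9
test = sample([i * 0.1 + 0.05 for i in range(100)])    # new, never-before-seen x

def predict(x, data):
    # 1-nearest-neighbour "model": it memorizes the training set outright.
    return min(data, key=lambda p: abs(p[0] - x))[1]

def rmse(data, fitted_on):
    errs = [(predict(x, fitted_on) - y) ** 2 for x, y in data]
    return (sum(errs) / len(errs)) ** 0.5

in_sample = rmse(train, train)   # exactly zero: looks like a miracle
out_sample = rmse(test, train)   # roughly the noise level: the honest number
print(in_sample, out_sample)
```

The in-sample error is exactly zero because each training point is its own nearest neighbour, while the out-of-sample error is on the order of the noise in the process. Only the second number tells you anything about whether the model works.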

Doing it the right way makes it look like you know a hell of a lot less. Fireworks are rare. Everybody hates this. Science is supposed to make us more, not less, certain!

It also does no good to prove that if you get uncertainty right, then even though you will be less sure of yourself, you will and must make better decisions, and that better decisions mean greater rewards. The allure of certainty is too strong. People want easy answers and can’t abide the fogginess which attends uncertainty.

I have only given you two anecdotes, but they can be multiplied indefinitely (especially in academia). It’s thus rational to believe that nothing will ever change and that people will continue to be over-certain.

Update Breaking news! I have just developed a zero-point energy super computerized big data artificial intelligent learning prediculator. Investors should use my contact page and send me money.

23 Comments

  1. It was a poster: http://ep.yimg.com/ca/I/yhst-128366933912873_2270_100225894

    My two favorite X-file quotes are:
    “Trust no one.”
    “The truth is out there.”

  2. In some ways, and I preach to the choir here, statistics is the new alchemy. We wish to transmogrify uncertainty into certainty. “Data scientists” are the miller’s daughter, trying to spin anything into gold to avoid punishment.

    There are certainly occasions where this is possible. When the aggregate behaviors of the world behave linearly or quadratically within our tolerances, many methods work quite well, but the context and scope are always more limited than we want to admit. In my own line of work, statistics and regression provide significant benefits, but I was taught to always test models against unseen data.

    Maybe in engineering statistics works better, because many of us already don’t believe our causal, physical models to more than a couple of significant figures. The statistics and regressions on such things do not magically improve our trust, and so reification is avoided (I hope, anyway). There are still times when I am asked about the accuracy of a regression and I must remind my colleagues that we are arguing about decimal places that no one would trust or should trust. Is the weight of the vehicle 15,123 lbs or 15,124 lbs?

    I wonder if the growth in data science is partially due to an apprehension of (for marketers, say) finding out that your ideas won’t work, or people don’t like them. Uncertainty can provide lots of mental wiggle room to justify failure as success and make yourself feel better. “Did my marketing plan work?” we ask, and the answer comes back “we failed to reject that it worked!”, and everyone feels warm and fuzzy. Even better if a mixed-effects chaotic random-walk logistic analysis that a half dozen PhD’s says so!

  3. Briggs: You’ve never seen the X-files! How sad…….Oh, and I wish you luck with your prediculator. It won’t compete with my psychic business, of course. 🙂

    Actually, predictive algorithms and psychics are a lot alike. Both sell people what they want to hear and rely on people having the memory capacity of gerbils (no, wait, gerbils might be too smart for some of the predictions……). It rarely occurs to people that since the dawn of time humans have wanted to know the future and more importantly, to know the future will be what they want it to be. In many thousands of years, such predictability has never happened. An algorithm won’t change that. Too many variables, many unknown. Timing is one of the hardest variables. One may predict the overall direction, but rarely can you predict when something will happen. Reminds me of Heisenberg on a macro scale.

  4. I need to fire my copy editor. That last line should be “Even better if a mixed-effects chaotic random-walk logistic analysis that a half dozen PhD’s peer-reviewed says so!”.

  5. hey, I have a buzz-word that hasn’t been used here: neural network learning.

  6. See it all the time from the side of the guy writing the systems that generated the data for the “data scientists”. Marketers are insane for “attribution” – what caused someone to buy my product? 100 reasons, each as likely in combination as individually. Did your advertisement push them over the edge? Did it get them to come into your store instead of your competitors? Maybe they got a bonus from work and finally bought the widget they’ve been eyeing for some time?

    The crazier thing is that this data that they’re using to do the analysis? It’s incredibly dirty (in other words, real). Sales data is fraught with mistakes, cashier overrides, unrelated discounts, sharing of “loyalty” cards, etc etc. The data’s aggregated, a trend is found based on some self-reported or vendor-provided customer attributes, and then predictions are made. In theory, the models are then tested and tweaked. This rarely happens, since by now it’s a year later and we have a new set of models and a new strategic direction.

    Also, is it me, or is “data science” a stupid term? It’s not the study of data – it’s the study of something else – sales, marketing, operations, etc.

  7. Bob,

    Let’s join as many buzzwords together as possible! In the words of Ian Malcolm, “before you even knew what you had, you patented it, and packaged it, and slapped it on a plastic lunchbox, and now you’re selling it, you wanna sell it.”

  8. Also, this is great, from Peter Shankman.

    http://shankman.com/i-will-never-hire-a-social-media-expert-and-neither-should-you/

    Statistics don’t sell products – stories do.

  9. “I was to automatic the process. ”

    automatic -> automate.

    🙂 Your enemies continue there efforts to infiltrate and corrupt your blog.

  10. Briggs

    December 5, 2014 at 12:07 pm

    MattS,

    Curses! Foiled again!

  11. @Matts: “Your enemies continue there efforts to infiltrate and corrupt your blog.” ->Your enemies continue their efforts to infiltrate and corrupt your blog.”

    @Sheri: “It won’t compete with my psychic business, of course.” How do you know? 😉

  12. @Rob: Because I’m a psychic!!

  13. ” Failure is rarely punished and never remembered.”
    Of course not. If you were the manager that hired the algorithm guy and later discovered that what you bought did not work as advertised, would you call attention to your mistake? Would you want to be known as the dumb ass that was conned and wasted millions of dollars? You just keep quiet and pretend it never happened.

  14. I, like my colleague, cautioned that not much would come of the analysis.
    The sequel? You already know what happened so there’s no point going into it.

    Unsurprising. You already gave them the answer. Free milk and cows, etc. You should have charged double, then given them the answer. OTOH, nobody wants to have their fortune teller say “Who knows? The future is vague.”

  15. That new model of the market fails because the results of the model influence the future. How can the model predict future prices when those prices are influenced by individual actions that are based on the model? Ex. The model says prices will fall in a year, leading investors to short the market, which drives prices lower today, an outcome not predicted by the model.

  16. The best way to predict the success of a market model is to first analyze the success of all previous market models.
    Using statistics as a means to predict systems you don’t understand the actual inner workings of is rarely successful.
    Using statistics to measure the noise or uncertainty in systems you do fundamentally understand the inner workings of is more useful.
    It’s a good thing to know the difference. This magical thinking is common in areas like neural networks where the belief is dumping big data into a black box will derive the correct inner workings.

  17. Rob Ryan,

    They are even targeting the comments now. 🙂

  18. Dav,

    Yoda

    Difficult to see. Always in motion the future is.

    /Yoda

  19. Is future prediction different from finding data anomalies as they happen?

  20. I’m sometimes accused of being a “data scientist”, though I’m not. No Ph.D. in statistics, or even computer science. I read Briggs’s blog to learn. Primarily what I learn is that overconfidence is ubiquitous and pernicious, and is to be guarded against and shunned.

    I manage a group of people engaged in (engaged to?) Data Science’s formerly fashionable sister, “Predictive Analytics”.

    Predictive Analytics isn’t young any more. While Data Science courts Fortune 50 executives with the ecstatic promise of new techniques like Deep Learning, Predictive Analytics has learned what Mother Statistics tried to tell it long ago. “Reality is complicated”, she said, “make the best decisions you can, and hope for a bit of luck.”

    For those of us engaged in this work, that means spending weeks or months to show where we might be able to shave a percent off costs in this corner of operations, to improve results by a fraction over there, to increase revenues by a staggering two percent in this arena. That’s where the hard part begins – the part where we try to convince someone that it’s (and we’re) worth it.

    The thing is, while she’s older and less likely to inspire dreams, I like the older girl better (and can I admit a fondness for her mother?) She’s learned some lessons about the way the world really works. Seen some modest success, some shattering disappointments. Her goals today are more down-to-earth, her desires more modest, her vision more practical.

    But she’s not sexy enough to pimp out anymore.

    Any advice you can offer?

  21. I can sympathize with the surreal experience of sitting in a board room with a bunch of company directors and the ‘hot new kids on the block’ selling their new whiz bang technology, and I guess I’m the old grumpy guy. The old grumpy guy is there because I may need to connect up my stuff (which works) to the hot new bangs and whistles. Except I explain to everyone that it’s more bang than whistle, that there is zero probability of any of it working, and that the people sitting across from me are completely clueless or incompetent or worse, so my attendance at said meeting is something of a moot point. Of course I get dismissed as that sad guy living in the past who couldn’t keep up with the times, and I remember walking out of that room thinking, ‘wow that is going to be one hell of a train wreck, but I did what I could do.’ I didn’t get invited back and about a year later I bumped into the managing director who volunteered the information (not that I asked since there was no point) that the project turned into a Grade A catastrophe and that they had lost their money and damaged their business.

    Regarding the X-Files, I have one up on Mr Briggs. I did watch one episode. As far as I could work out a demon possessed chicken pecked some unfortunate person to death and it was up to the X Team to work out the whys and wherefores. I don’t recall getting around to watching subsequent episodes.

  22. Will: I don’t remember any possessed chicken in the X-Files. Must have missed that one. Usually, it was UFO’s, genetic manipulation, radiation and that sort of thing. My favorite was the chupacabra (Mexican goat sucker). It actually showed up all over the place in television. Mostly I think I just liked the word!

    However, since we are on the subject of certainty, one could always count on Scully and Mulder to lose their evidence in every single episode. That was 100% certain.

  23. The genius of the X-Files was to build the suspense/mystery and not go over the top with gore. That and the interweaving of the various conspiracy threads with the quirky characters in the overall story arc. The show also incorporated several minuscule references that I found amusing, such as the name of the seaside community where Mulder’s sister was taken by the aliens, which is quite familiar to me. Why the writers picked it is just another mystery.


© 2016 William M. Briggs
