Pessimism regarding upcoming artificially intelligent personal trainers (part 7)

Posted on April 14, 2017



Artificial Intelligence Can’t Tell You Why

Say we have our AI program picker that beats our human coach. You, the user of the program, look at it and go “Ok, but

  • why are we doing this many sets?”
  • why does my shoulder need this type of work?”
  • why is my knee bothering me when I do lunges?”

The AI cannot explain this to you. The tradeoff we make with machine learning is this: the same quality that gives us insights humans didn’t have also gives us insights that can’t be explained. We can guess or estimate why an AI is recommending a course of action, but we don’t truly know how it got there.

Train an AI on a big set of labeled photos. It may learn to discern when a dog is in a photo. We can say “Hey! The AI learned dogs have x, y, z features.” But it’s possible the image set we trained the algorithm on happens to overwhelmingly have grass in the dog photos. We think the AI is recognizing dogs, but really it’s been recognizing grass, in a scenario where grass happens to equal dog.
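To make the grass-equals-dog trap concrete, here’s a toy sketch. The feature names and “photos” are invented for illustration; a real vision model learns its own features from pixels rather than being handed a checklist, but the failure mode is the same:

```python
# Toy "learner": pick whichever single feature best predicts the label
# on the training data. (Hypothetical features and data, for illustration.)
def best_single_feature(examples):
    # examples: list of (features_dict, label) pairs
    features = examples[0][0].keys()
    def accuracy(f):
        return sum(1 for x, y in examples if x[f] == y) / len(examples)
    return max(features, key=accuracy)

# Training photos: every dog photo happens to contain grass,
# but not every dog has floppy ears.
train = [
    ({"floppy_ears": 1, "grass": 1}, 1),  # hound on a lawn -> dog
    ({"floppy_ears": 0, "grass": 1}, 1),  # pug in a park   -> dog
    ({"floppy_ears": 0, "grass": 0}, 0),  # cat indoors     -> no dog
    ({"floppy_ears": 0, "grass": 0}, 0),  # empty room      -> no dog
]

# Grass predicts the label perfectly here, so the learner latches onto it.
chosen = best_single_feature(train)

# A dog indoors now fools the model: no grass, so it says "no dog."
test_photo = {"floppy_ears": 1, "grass": 0}
prediction = test_photo[chosen]
```

The learner never did anything wrong by its own lights; grass really was the best predictor in the data it saw. Nothing in its output tells you it keyed on the lawn instead of the dog, which is exactly the “can’t explain why” problem.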

This was eloquently captured in this article about machine learning algorithms taking dermatology jobs,

A.I. VERSUS M.D. What happens when diagnosis is automated?

“The most powerful element in these clinical encounters, I realized, was not knowing that or knowing how—not mastering the facts of the case, or perceiving the patterns they formed. It lay in yet a third realm of knowledge: knowing why.”

 

Imagine you’ve given birth to your first child. It’s a few hours after delivery. Dad is feeling solid. He didn’t faint and have to be taken to the ER for head trauma. (This happens!) Mom is feeling good. Tired, but healthy.

Baby is doing well too. Breastfeeding is coming along. Apgar score is apparently good.

Then eight hours after birth you see the nurse is looking at your kid much more inquisitively.

Then twelve hours after birth you’re told your baby needs to go into observation.

Then sixteen hours after birth you’re told baby is fine.

Then eighteen hours after birth you’re told the baby needs to go to the NICU now, where in your mind she’s ripped out of the room in a manner that makes you question whether you’re a psychopath, given how violent you suddenly feel. You can see her in four hours.

When you see her, half of one hand is covered by a tube sticking out of it, four electrodes are on her stomach, and the beeping is incessant.

 

What goes through your mind as a parent? 

  1. “Hey, guess the algorithm picked something out. Where’s the cafeteria?”
  2. “WHAT THE FUCK IS GOING ON I WANT INFORMATION RIGHT NOW BEFORE I FLIP THIS HOSPITAL UPSIDE DOWN WITH THE HAMMER OF THOR!!!!”

The second, naturally.

The above happened to me when my kid was born. At every interval, most parents are going to want to know why something is being done, and “the algorithm tells us to do it” is not going to fly. If that had been said to me at the hospital, my gut reaction may have been “So you have no idea what you’re doing?” What do we regularly consider the hallmark of knowledge? Being able to explain that knowledge.

Furthermore, while unfortunate, most people do not want to hear anything about math at any point in their lives after age thirteen. Using math as the sole rationale is going to cause havoc. Machine learning algorithms will cause more of it than others, as they’re math an average person has no chance of understanding.

Instead you want to hear,

  • “we are monitoring your baby’s breathing rate. When it’s elevated like hers is, we get concerned about infection.”
  • “after a while we don’t trust our eyes and ears enough. We want to hook her up to a monitor to truly see how heavy her breathing is.”
  • “we go to the NICU to administer antibiotics. We can’t know for sure if she has an infection for 24-48 hours. We have to grow the bacteria and see if anything comes from it. In that time we run antibiotics in case something is brewing.”

The ability to communicate is something most patients don’t think about until they’re in the middle of trying to do it. It’s something most doctors don’t think about until their online reviews come in. And not coincidentally, it’s something computer / math oriented people are notoriously bad at.

Yes, there is a segment of the population to whom you can say “Odds are 51% in this scenario, but 49% in that scenario, so we go with the first one,” and they respond “Makes sense. Thanks.”

Yes, there are some who don’t care about the why of something. They simply want to know what to do.

But most of us, for some time, are going to want some semblance of rationale thrown in. One a person of any background can grasp. After a while we may trust a source enough to go with it, but we tend not to blindly run with recommendations about our health. How many Amazon reviews did you go through before buying your last consumer product?

This may be most true in America where our healthcare system is still tightly bonded with fee-for-service. Perform action = make money from action.

Imaginative scenario-

“Why do we give vaccines at this time?”

  • English doctor and patient-
    • “We’re currently studying 20,000 people as they go through the vaccine process, and this is what we’ve found to work best so far. You can follow the study through the NHS if you wish.”
    • “That makes sense. Thank you.”
  • American doctor and patient-
    • “Through a bunch of different studi”
    • “Like the ones funded by big Pharma?”
    • “Well…”
    • “Don’t you just get paid more the more shots you give??”
    • “I mean,”
    • “How much will this cost?”
    • “That depends.”
    • “Jenny McCarthy says this. What say you?”
    • [doctor burnout]

(When it comes to America’s DIStrust of doctors? Near the top, baby. Nearly 50% of people don’t think doctors can be trusted!)

This happened when my daughter was born. We had the nicest nurse in the world, but when it came to the epidural, it was impossible not to have in the back of your mind, “How much more money do they get with an epidural?” If that nurse couldn’t have given a sound rationale (one that needs to be better than it would under a system without fee-for-service), or if she had been at all pushy or impatient about it, what was in the back of our minds would have rushed to the forefront. After all, a top predictor of whether a woman has a c-section is the hospital. Not the woman or baby’s situation.

In America, with AI and healthcare, we’re putting healthcare workers in a position of needing to communicate something that’s harder, if not impossible, to understand, in a system where the patients are starting with some of the least trust a population can have in those communicators.

“Yeah, they claim the research algorithm says I need this treatment, but I think they just want more money.”

At least one can go read the research a doctor references. But reading the algorithm?

 

Nine part series-

 
