Nuance, AI, Cancer, and Leadership

I read an article on AI and leadership this morning, where the concern is that if the workers are all AIs, we won’t learn how to lead, as humans. It is an interesting consideration, and not one I’d have come to without it being raised by someone else. But I don’t want to talk about that; I am more interested in the selection of sources, and in whether an AI that replaces a human in cancer diagnosis and treatment can realistically be a successful option.

The article, without noting dates, uses old sources: a video from 2014 which makes some interesting assumptions (“Replacing human muscle with mechanical muscle frees people to specialize and that leaves everyone better off. This is how economies grow and standards of living rise.”) to bolster the argument that the thinking machines will take our thinking jobs. The focus is on the robots, so a statement like the one above is just supposed to slide past as obviously true. I don’t agree carte blanche with that statement, or with many others in the video.

The author also cites the Watson PR piece about Watson diagnosing a rare leukemia and recommending a treatment that ultimately saved the life of a woman in Japan. This, too, is over a year old, which is not noted in the article. I am not suggesting this invalidates the OOOH factor, but let’s just note that this is a single instance that has been all over the press as a sign of the future of Watson in medicine, and of Watson being better than humans at diagnosing cancer. And this is what I actually want to comment on.

Watson is fed data: millions of records of cancers, tests, treatments, outcomes, and any genetic information on the cancer and the patient. Watson is also trained over years to get the right answers, so that it can keep refining its recommendations. Given that massive amount of data, it is not surprising that Watson can not only pick up a rare instance, but also recommend an unusual treatment which saves a life. I do agree that there are instances like this, where data and processing power will win. What we never hear of, of course, are the instances where patients die, the ones where we thought Watson would have done better.

And it may be that there will be a greater variety of treatments offered to patients because of the machine’s broad view of all treatments for that cancer to date, in particular populations, and their success rates. BUT the full set of recommended treatments may also be small: it will be the set of treatments which have already worked, because that is what an AI will choose from. It won’t come up with something new and novel; it will apply what already works. This is fundamentally limiting.
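To make that limitation concrete, here is a deliberately naive sketch (in Python, with made-up data and a hypothetical `recommend` function; it is not how Watson or any real clinical system works): a recommender that scores treatments purely by their historical success rate can rank what it has already seen, but it has no mechanism for proposing a treatment that isn’t in its records.

```python
# A toy, purely illustrative recommender -- NOT Watson, and not real medicine.
# It can only ever recommend a treatment that already appears in its records.
from collections import defaultdict

def recommend(records, cancer_type):
    """Pick the treatment with the best historical success rate for this cancer type.

    records: list of (cancer_type, treatment, survived) tuples -- hypothetical data.
    Returns the best-known treatment, or None if this cancer has never been seen.
    """
    outcomes = defaultdict(lambda: [0, 0])  # treatment -> [successes, total]
    for ctype, treatment, survived in records:
        if ctype == cancer_type:
            outcomes[treatment][0] += int(survived)
            outcomes[treatment][1] += 1
    if not outcomes:
        return None  # no precedent, no recommendation -- the limitation in a nutshell
    return max(outcomes, key=lambda t: outcomes[t][0] / outcomes[t][1])

# Made-up example: the recommender can rank chemo_A against chemo_B,
# but it can never propose a therapy that isn't already in the history.
history = [
    ("rare_leukemia", "chemo_A", True),
    ("rare_leukemia", "chemo_A", False),
    ("rare_leukemia", "chemo_B", True),
]
print(recommend(history, "rare_leukemia"))  # -> "chemo_B"
```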

The NCI has already done this: created a set of protocols with fixed recommendations for treatments. I am told it takes years to change these. But at the moment, you can discuss these recommendations with your doctor, specific to your case. I’d hate to imagine a world where there is no dialogue, but perhaps that is just me. I have seen studies which show that the machine is better than humans at diagnosing lung cancer. (This is a different role: not the oncologist who decides the treatment path, but the pathologist who diagnoses your cancer and determines its stage, which of course feeds heavily into treatment options.)

I’d like to point out, explicitly, that the end goal of treatment in these instances, in particular in the American medical system, is the continuation of life, and there is _nothing_ about the quality of life, just the extension. If you look at how hospitals and doctors are rated, this is also obvious. Quality of life post-treatment isn’t on the list.

I am reminded of the dogs that can smell cancer, and other means of diagnosis that are novel but perhaps should be used in concert with humans.

I’d argue that cancer diagnosis and treatment is as much art as science, and if the machines are the science, we shouldn’t drop the art. Or the humanity, which, for the moment, the humans still corner the market on.
