Unspoken communication: thoughts on music, Watson, and medicine

Hooray, a new blog post less than three weeks after the last one! Guess I'm slowly making progress. My days have been filled with lots of holiday cheer lately. I went home for Thanksgiving and joined the Stanford Band as an Old Fart for the Notre Dame game that weekend. This past weekend was also the 85th annual Galens Tag Days, where medical students stand out in the Michigan cold with red ponchos and buckets to collect money for Mott Children's Hospital and other organizations benefiting the children of Washtenaw County. This year Auscultations tagged as a group during Friday's Midnight Madness event on Main Street, singing Christmas carols and selections from our repertoire. It's not too late to help, so please visit the Galens website to make a donation: http://www.umich.edu/~galens/tagdays.shtml

[Singing John Mayer's "Heartbreak Warfare"]

As part of the Christmas spirit, I recently attended a performance of Handel's "Messiah" at Hill Auditorium with Clay, Joanna, and Steph. For anyone who has never seen "Messiah" live, hearing four soloists accompanied by an orchestra and full choir is absolutely astounding, and singing the Hallelujah chorus is a staple of the Christmas season.

[What I hope to be singing when my Air Force match results are released on December 14.]

During the concert, I started noticing aspects of the music that went beyond the notes being played. There was an incredible amount of unspoken beauty that could only be appreciated during a live performance. After all, what really drives us to go to concerts in the first place? As audio technology continues to improve, electronically synthesized MIDI files could conceivably mimic the pitches, tones, and dynamic changes that we hear. Even an artist's CDs and music videos aren't enough for the fans who pay good money to see him on stage. Watching "Messiah," I realized that I enjoyed watching the musicians subtly interact with each other. The violin player's entrance wasn't determined solely by his sheet music; he was also watching the vocalist finish her solo before drawing his bow across the strings. Members of the choir bobbed their heads from side to side as they ran up and down strings of notes. There were even times when I wished I didn't understand the English lyrics; I sometimes got distracted when the words of a verse were repeated multiple times, albeit to different notes. Although the Bible verses were indeed inspiring when put to music, the significance of the piece was not exclusively in the strict meaning of the words, but also in the way a singer's body prepared itself to carry through an impressive arpeggio.

I see parallels between this type of nonverbal communication in music and the type found in medicine. It's a kind of communication that can't be picked up very easily by a machine, despite what IBM might have you believe.

[The 30-second commercial.]

[The 2-minute commercial.]

I can appreciate the statement in the 2-minute commercial (about 22 seconds in) about how Watson will never replace a trained doctor or nurse. If Watson were to be used in healthcare, its role would still be based on human-human communication. Doctors would need to know what data to input into Watson, data that would have to be generated by taking patient histories. Theoretically, the process of taking patient histories could also be computerized. I was once asked the following question during a med school interview: "With all the advances in artificial intelligence, with websites that allow users to type in symptoms and receive a list of possible diagnoses alongside the corresponding treatment, what will be the role of physicians in the future?" At the time I probably bumbled through the question. I was expecting the interviewer to ask me to elaborate on my CV, not delve into an existential discussion.

Past drama aside, I now know that taking a patient history is more than just going down an algorithmic decision tree. Patients don't present cases; they tell stories. As in all stories, certain sections can be emphasized, and others downplayed. For a machine whose interactions are rooted in digital, black-or-white choices of yes-or-no and 1-or-0, Watson's grasp of the analog world of unspoken communication might be too limited to generate its own diagnoses. I find it hard to imagine Watson being able to differentiate between a patient confidently denying a symptom and a patient whose initial denial is actually a sign to continue further down that line of questioning. Just like that violin player taking his cue from the soloist, I plan to take cues from my patients to help determine how best to pursue a diagnosis and arrive at a plan.