I have no doubt that in the future computers will be able to answer the subtle - and serious - questions posed by real people with health issues and concerns about their medication. But as our guest blogger Amy Tudor discovered, that time is not quite here, and Google is having none of it.
Amy is an author and teacher from Louisville, Kentucky. She is best known for her poetry, with her first collection, A Book of Birds, winning the Liam Rector First Book Prize for Poetry in 2008.
A while ago I was visiting family and was barely out of the car before the matriarch, 75-year-old Nano, had a question for me about her new osteoporosis medication. “I can’t bend over or lie down or anything right after I take it,” she said, unnerved. “Why is that?”
I’m not a doctor but, as the saying goes, I do play one on the Internet. In 2006, I was hired to write about prescription medications for a new website, covering 12 medical conditions for the site. I had worked as a science writer before, but in marine science and aeronautics, so I had a lot of work to do.
It took me over a year for the conditions and their meds to become second nature, to the point where I could tell Zocor from Zyprexa and hypercholesterolemia from hypoglycemia. Reading clinical trials, talking to doctors, going to medical conferences, and reading hundreds of news stories every day go a long way.
Around 2010, after our company had been bought and sold twice in the fallout of the U.S. recession, our new owners decided to try what many companies were experimenting with at the time: auto-generated content pages. Users would type in questions about symptoms or drugs, and our content management system would produce pages that addressed the concern to the best of its algorithms’ ability. I helped them choose relevant keywords, even knowing that it would likely cost me my job. The CMS worked a lot more cheaply than I did, after all, and it didn’t require pesky things like health insurance or vacation days.
The layoffs came just before Christmas in 2011 and half of the writers were gone. I stayed on for a while, but it was clear that the ease of the auto-generated content was preferable in this Brave New World of Google algorithms and information on the Internet. I went back to teaching college.
“I used to go to that website you worked for,” Nano said to me, walking toward the house from the car. “But I just can’t get hold of anything that makes sense from it any more.”
The company’s CEOs also noticed - as did Google, which decided to label these auto-generated pages “inferior content.” Google also decided not to count the page views on them for sites all around the web, in an SEO bloodbath.
Shortly afterwards my phone rang with a request that I come back as a contractor to help with the fix. “There’s just not a replacement for ‘intelligence-driven content,’” the exasperated Editor-in-Chief said. He knew that now. To be fair, he’d known it then. Now we go back and replace the auto-generated content with, well… content.
Algorithms are wonderful tools for searching, but Google’s recent move to downgrade the click-chasing pages they create is a big win for expertise and good old-fashioned common sense. An algorithm can tell you that cherries fight cancer and link to a clinical trial that few will read and even fewer will understand. A human being can tell you that the clinical trial showing “strong evidence” that cherries fight cancer was done on only 10 people and was funded by the Cherry Growers’ Council of one U.S. state.
And though Nano could have looked up the patient information on the drug she was taking, it wouldn’t have made much sense to her. That’s why she was out on the gravel drive to find me. She wanted an answer that she could trust and that made sense to her, and sometimes - most of the time, in fact - an algorithm just won’t do the trick.