Learning to distrust technology

Emily Mongan

We live in a society that relies on technology for just about everything. This isn't my opinion as someone who was raised in the advent of the internet and smartphones. This is a fact.

Technology shapes the way we live and work, often in ways that we don't realize or take for granted. That's why it's almost startling when these systems we rely on so heavily don't work the way we expect them to.

Take, for example, the simple online flight tracker. By knowing the flight number of whoever I'm supposed to pick up, I can follow their journey online, see whether their flight has been delayed and save myself countless loops driving around the airport.

But it has its flaws, as I found out last week when my flight to visit a friend in another state showed up as leaving exactly on time, despite the fact that we were grounded for an extra 45 minutes (not due to a tech issue, as so often happens — we were delayed because Air Force One gets first dibs on departing the airport over a commercial flight to North Carolina).

So my friend, who had been diligently tracking my flight online, arrived at the airport earlier than I did. Inconvenient? Sure. But not fatal.

In the healthcare world, we've put a large emphasis on moving toward electronic systems for entering, managing and sharing health data. We like to assume that these systems will help improve patient care with as few hiccups as possible. That's why findings like a recent report from the Leapfrog Group can be so disconcerting.

The group found that computerized physician order entry systems aren't always reliable in detecting potentially harmful or fatal medication errors. The systems didn't flag 39% of potentially harmful drug orders and missed 13% of potentially fatal orders, according to the study. The most commonly missed errors included prescriptions for the wrong drug, incorrect dosages and incidents where follow-up reminders never appeared.

Leapfrog's findings primarily centered on hospitals, but there's a lesson here for all providers. There are bound to be errors in technology every once in a while, like a cell phone alarm clock failing to ring or (as we're currently experiencing here at McKnight's) a coffee machine that just decides to stop working.

But in the healthcare realm, these errors can cause more than just a little frustration.

As one McKnight's reader aptly commented, electronic systems may help point out errors, but they don't prevent them all. Due diligence is needed.

As technology becomes more prevalent in your facilities, it's crucial not to see these systems as an infallible, magic solution to everyone's data woes. Instead, view these systems as you would a co-worker or team member. It's important to trust them, and more often than not they'll do their job correctly. But just like humans, there will be occasions where they mess up, and it's important to have a backup plan in place for when they do.

Emily Mongan is Staff Writer at McKnight's. Follow her @emmongan.



Daily Editors' Notes

McKnight's Daily Editors' Notes features commentary on the latest in long-term care news and issues. Entries are written by Editorial Director John O'Connor, Editor James M. Berklan, Senior Editor Elizabeth Newman and Staff Writer Marty Stempniak.