Tuesday, March 28, 2017

Disobey, Please

The M.I.T. Media Lab is taking nominations for its Disobedience Award, which was first announced last year.  As the award's site proudly quotes Joi Ito, the Lab's Director and the person who came up with the idea: "You don't change the world by doing what you are told."

I love it.

The site, and the award's proponents, make clear that they are not talking about disobedience for the sake of disobedience.  It's not about breaking laws.  They're promoting "responsible disobedience," rule-breaking that is for the sake of the greater good.  The site specifies:
This award will go to a person or group engaged in what we believe is an extraordinary example of disobedience for the benefit of society.
In Mr. Ito's original announcement, he elaborated:
The disobedience that we would like to call out is the kind that seeks to change society in a positive way, and is consistent with a key set of principles. The principles include non-violence, creativity, courage, and taking responsibility for one's actions.
Given all that, $250,000 hardly seems enough, and it's a shame there can be only one winner.

There are two ways to participate: you can either nominate the eventual winner, or you can recruit someone else to nominate the winner.  Either way, you get to be flown to the awards ceremony on July 21, 2017.  Nominations close May 1 (and, no, it doesn't appear that being late to apply is the kind of disobedience they'll reward).


The award is funded by Reid Hoffman, a co-founder of LinkedIn, who recently wrote about it on that platform.  His article is titled "Recognizing Rebels With A Cause," which is probably a good way to think of it.  Although one often thinks of disobedience in terms of Thoreau-style civil disobedience, Mr. Hoffman ties it closely to innovation:
In the realm of entrepreneurship, almost every great triumph has its roots in disobedience or contrarianism of one kind or another. And ultimately this impulse doesn't just create new products and companies, but also new industries, new institutions, and ultimately new cultural norms and expectations.
The Media Lab, he points out, serves as an example of how this can work: "researchers with widely varying areas of expertise are encouraged to collaborate and improvise in ways that become not just multi-disciplinary but antidisciplinary – disobedient."

Ethan Zuckerman, the Director of M.I.T.'s Center for Civic Media, told The New York Times: "In a lot of large institutions there’s really two ways you make progress. You make progress when people follow the rules and work their way through the processes, and then sometimes you make very radical progress by someone who essentially says, ‘Look, these processes don’t work anymore, and I need to have a radical shift in what I’m doing.’”

It just takes someone to stand up.

The creators of the award are probably not thinking much about health care -- despite disavowing that the award is about civil disobedience, many of the examples they've given revolve around people resisting what they believe are improper government actions -- but they should be.

If there's a field where lots of stupid, or even bad, things happen to people, through design, indifference, or inaction, health care has to be it.

Every day, in every type of health care setting, things happen that aren't in the best interests of the people getting care.  People realize that they are happening, and, in many cases, they're happening because the rules say that is what is supposed to happen.

The list of disobedient acts in health care that would serve society is longer than my imagination can produce, but here are some examples:
  • The nurse who says, no, I'm not going to wake up our patients in the middle of the night for readings no one is going to look at (or blood samples that can wait until morning).
  • The doctor (or nurse) who knows a doctor that they believe is incompetent and decides, I'm going to speak up about it.  I'll make sure patients know.
  • The billing expert who decides, no, I'm not going to keep up the charge master, with this set of charges that aren't based on actual costs and which almost never actually get used (except by those unfortunate people without insurance).  Instead, we'll have a set of real prices, and, if we give anyone any discounts, they will be based on ability to pay, not on type of insurance.
  • The EHR developer who realizes that it's silly that this institution's EHR can't communicate with that institution's EHR, even though they use the same platform and/or the same data fields.  Data should go with the patient.
  • The insurance executive who vows, I'm tired of selling products that are full of jargon, loopholes, and legalese, so that no one understands them or knows what is or isn't covered.  We're going to sell a product that can be clearly described on one page using simple language.
  • The practice administrator who understands that patients' time is valuable too, and orders that the practice will limit overbooking and will not charge patients if they have to wait longer than 15 minutes. 
  • The medical specialty that commits to being for patients, not its physician members, by developing measures, specific to patient outcomes, in order to validate ongoing competence.   The results of measures would be made public, reported to licensing boards for action, and used to de-certify their specialty designation for physicians not meeting required performance.
Going back to the award's principles of non-violence, creativity, courage, and taking responsibility for one's actions -- well, the above would all seem to fit.  They're all achievable.  It only takes someone to stand up and decide to do them.

I just wouldn't hold my breath waiting to see any of them happen.


Break the rules.  Do the right thing.  Change the world, even if it is "just" the health care world.

And, who knows, maybe even win $250,000 for doing so.

Wednesday, March 22, 2017

On the Road to Ubiquity

Are you reading this on your PC?  How very 1980's of you.  Or are you looking at it on your tablet or smartphone?  Better, but still so ten years ago.  Are you an early adopter, viewing it in virtual reality (VR), perhaps set on your favorite beach as the text scrolls through the sky like the opening exposition of Star Wars?  Now we're getting somewhere, but, even so, you still probably have to wear a clunky headset that's attached to a computer.

If you're aware of your device, that's the past.  Welcome to ubiquitous computing.
You may know it as "pervasive" computing, "invisible" technology, or as represented by the wearables craze -- not crude wearables like fitness trackers or Google Glass, but the ones already on the horizon that allow computing anywhere and everywhere.  Tech guru Walt Mossberg describes the goal as "information appliances" -- "dead-simple to use, without training or the need for a manual."  Or even anything that resembles a device.

Some examples:
  • DuoSkin, developed by the MIT Media Lab, uses gold metal leaf devices that attach directly to the skin.  It can sense touch input, display output, and support wireless communication.
  • SkinMarks, developed by Saarland University (Germany), are "electronic tattoos" that are as flexible as skin.  They allow for touch input and display.
  • Smart clothing, such as the smart jacket developed by Levi's and Google as part of Project Jacquard.  The jacket -- which will go on sale this fall (for $350) -- allows users to control selected features on their smartphone with gestures made at the jacket's cuff.
  • Smart jewelry, such as the LEDA smart gemstone developed by MetaGem.  It can display various colors based on the kind of notification received, and MetaGem claims it can also do fitness tracking, offer an SOS mode, serve as a remote selfie control, and even be used for gesture-controlled games.

IDC estimates that the "wearables and hearables" market will grow from roughly 102 million units shipped in 2016 to 237 million by 2021, with smart clothing accounting for almost 10% of that market (smart watches/bands still dominate in their estimates).  IDC warns that: "Tech companies will be forced to step up their game and offer a wider selection of sizes, materials, and designs in order to appeal to a broader audience."

Similarly, Tractica estimates that smart clothing shipments will grow from 140,000 in 2013 to 10.2 million in 2020, and Gartner projects that smart garments could reach 26 million by 2020. 

There's more.  With all these embedded devices, you'll still want something you can easily look at, and you probably won't want to be carrying around something with a screen.  No problem.  Sony, for example, has been working on projected screens that still have touchscreen capabilities, sensing hand motions well enough to, say, type or play the piano.  It can even morph into augmented reality.

You probably don't want to be lugging around a projector any more than you do a PC.  Sony's projectors are fairly small, and Serafim's iKeyBo has a keyboard projector that can "fit in your pocket."  It's only a matter of time before projectors get small enough to also become embedded into everyday items, like your new smart clothing.

Of course, input is only part of what we want screens to do; we also want them to display.  The future may be in holograms, which, as SingularityHub recently proclaimed, "aren't the stuff of science fiction anymore."  Various firms, such as Transparency Market Research and IndustryARC,  expect huge increases in the holographic display market, with the former company specifically citing demand for medical imaging as a major driver of that growth.  
  
Why would you want a print-out or a screen if you could look at a hologram, especially when it comes to the workings of our bodies?

This is the world we'll soon be in. Anything can be the input device, anything can do the processing and communication, and anything can be the display.  Devices become "invisible."  As tech columnist Greg Gascon describes it:
When using a piece of technology that has become invisible, the user thinks of using it in terms of end goals, rather than getting bogged down in the technology itself. The user doesn’t have to worry how it is going to work, they just make it happen.
Our current devices will look as old-fashioned and clunky as rotary dial landlines look to today's teenagers (that is, if they know what the latter are).

Especially in health care.  

Go to the doctor's office and they're listening to your chest with stethoscopes, taking your blood pressure with a cuff, measuring your temperature with a thermometer.  Sure, some of those may be digital now, but they're still all based on technology that is decades or even centuries old.  Go to the hospital and it's even worse: all the wires make it hard to move and the beeping of all the associated monitors make it hard to sleep.   

It doesn't have to be this way.  

Instead of all those monitors with all those wires, slap an e-tattoo on.  It could act as the sensor and the display, while updating your records wirelessly.  Instead of the intermittent, crisis-driven contact we now have with our physicians, our invisible monitors could keep track of us 24/7.  They'll alert us and our providers when something is off.

Instead of splitting attention between you and an EHR screen, you and the physician could view a holographic image of you that serves as your electronic record.  It can be updated with hand gestures and voice, help both you and your physician understand the issue(s) and your history, and help you understand what is happening with your health.

Of course, there will also be the nanobots working inside us.  Talk about ubiquitous, talk about invisible!

We're going to have to get past our fascination with the latest and greatest devices -- a new iPhone! a 4K television! -- and let their technology fade into the background.  As it should.

It's going to be very different, very exciting -- and sooner than many of us will be ready for. 

Tuesday, March 14, 2017

Health Care in a Post-Privacy World

Someone knows you are reading this.

They know what device you are using.  They know if you make it all the way to the end (which I hope you do!).  They may be watching you read it, and listening to you.  They know exactly where you are right now, and where you've been.

As FBI Director James Comey recently proclaimed, "there is no such thing as absolute privacy in America."
Director Comey was speaking about legal snooping, authorized by the courts and carried out by law enforcement agencies, but, in many ways, that may be the least of our privacy concerns.

Your phone knows where you are, all the time.  Go outside and chances are you'll show up on surveillance cameras at some point.  Facial recognition software can now easily identify you (e.g., Facezam), as can supposedly de-identified data.

Think about what Google knows about you.  Think about what Facebook knows about you.  Think about what Amazon knows about you, including anything you may have told Alexa.  Think about what your mobile phone carrier or your cable/internet providers know about you.

It's pretty staggering.

We all know, in theory, that all these organizations are collecting information on us, and even that they're using it, ostensibly to "help" serve us better.  Again, in theory, we've given permission for them to collect and use our information -- in some cases, to even sell or share it with other organizations, with whom we may have no other relationship.

And these are all from the "good guys" -- law enforcement agencies or well-known, usually publicly traded companies we're electing to get services from.   There's a whole world of hackers and cybercriminals who are after our data, for fun or for profit, and they're pretty damn good at getting it.

I'd be remiss if I didn't note the recent WikiLeaks disclosure about how pervasive the CIA's surveillance capabilities are.  Whether they only use them per their mission, whether they can actually absorb and analyze all the information they collect, whether this is the whole iceberg or just the tip -- I don't know, but I'm pretty sure the C.I.A. is not the only one with these kinds of capabilities.

And if we think things are bad now, wait until the vaunted Internet of Things (IoT) really takes hold, when virtually everything may be subject to attack.

The Pew Research Center has been following the digital privacy issue for several years, and concludes that:

  • 91% think they've lost control over their personal information;
  • Few have confidence that any organization will protect their personal information;
  • At most, only about half think they understand what happens with their information;
  • Most claim to have taken actions to protect their personal information, but most also admit they'd like to do more;
  • Perhaps most telling, our attitude about privacy is "it depends" -- e.g., it is OK to use our information if it is used to combat terrorism (or perhaps to make shopping easier).
Interestingly, younger respondents paid more attention to digital privacy -- but also were more likely to have shared personal information online.

What does all this have to do with health care?  After all, we have HIPAA to protect our data, right?  

Not so much, as it turns out.  Health care data breaches were up some 40% since 2015.  Accenture says 26% of Americans have had their health data breached -- and half of those were victims of medical identity theft, costing them, on average, some $2,500 in out-of-pocket costs.
Despite that, Accenture found that consumers still trusted health care providers and payors with that data much more than they did health technology companies or the government.  That confidence may be badly misplaced, according to IBM's Paul Roemer, who asserts that the average hospital has 100,000 unsecured (data) entry points, and large hospital systems 1,000,000.  

Indeed, Avi Rubin, the head of Johns Hopkins University Health and Medical Security Lab, told NPR that the health care sector was the "absolute worst" in its cybersecurity problems, because: "Their data security practices were so far below every other industry."

When all of our records were on paper, when none of our medical devices and equipment were connected, security was not very good either, but at least the exposure risk was limited by proximity.  In an almost fully digital, connected world, though, we should all feel very exposed.

Yes, certainly people -- the biggest weakness for data breaches -- could be more vigilant.  Yes, certainly, all organizations should beef up their privacy policies and their efforts to protect our data.  Perhaps blockchain or other alternative approaches to security can mitigate the risks of our data being exposed.

But the genie is not going to go back in the bottle.

We leave digital footprints.  Lots of them. We've implicitly or explicitly decided that the advantages of being digital outweigh the disadvantages.  It may be time to revisit our attitudes and approaches to privacy, in health care and elsewhere.

It is supposed to be "our" health data, but if, as they say, possession is nine-tenths of the law, you'd have to say that the institutions that house it own it.  They are the ones who are failing to protect it, who are already sharing it -- for research and for commercial purposes -- without us even knowing it (or profiting from it), and they are the ones who sometimes charge us to get copies of it (usually delivered in paper form!).  And yet they seem to have a hard time sharing it when we show up in an ER or at a new doctor. 

The new era of Big Data won't happen without all our little data, yet we haven't figured out how our "little" should relate to the "Big."
 
HIPAA was literally passed in the previous century, when the Internet was still feeling its way and few of us relied on it.  Now, though, as Evan Schuman writes in Computerworld, "true online privacy is not viable."  We urgently need to revisit ownership of our data, what sharing it means, to whom, and what privacy is realistic to expect in the 21st century.

Like it or not, there is no absolute privacy, not even for our health information.  

Tuesday, March 7, 2017

Your Smartphone or Your Life

Rep. Chaffetz's recent remarks suggesting that some Americans should invest in their health instead of in a new iPhone reminded me of nothing so much as the old Jack Benny bit, where Benny is accosted by a robber who threatens "your money or your life."  When Benny doesn't immediately respond, the robber prompts him, and the supposedly miserly Benny snaps back, "I'm thinking it over."


I suspect that, like Mr. Benny, many of us would have a tough choice between our smartphones (and our other devices) and our health.  It may not be so much that we're miserly as that we're addicted.

To be fair, Rep. Chaffetz subsequently walked back -- to some degree -- his comments, after taking merciless criticism on social media.  However, his point is not entirely wrong: we do have to make choices, and make our spending and lifestyle choices carefully.  Based on our health habits, most of us are not making choices that optimize our health.  And you can blame our various devices at least in part for that.

We watch a billion hours of YouTube alone every day.  We spent another billion hours a month playing mobile games, and that was back in 2015.  In the U.S., we spend some 5 hours per day on our mobile devices, eclipsing the time we spend watching TV.

Lest you protest that this is all just harmless entertainment at worst and important connectivity at best, keep in mind that the CDC reports that (in 2013!) more than 8 people die and another 1,161 are injured every day due to "distracted driving," which is most commonly attributed to use of mobile devices.

Social psychologist Adam Alter, in a new book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, asserts that our love of our devices is not just passion but addiction -- literally.  It's a behavioral addiction, but one with a biological basis in many cases, such as when the brain's reward mechanisms are triggered by, say, online games or online shopping.

As Dr. Alter told The New York Times: "The people who create video games wouldn’t say they are looking to create addicts. They just want you to spend as much time as possible with their products."  The same applies to Facebook or Snapchat or Amazon; they want your eyeballs and they want to hold them for as long as possible, as many times a day as possible.

Indeed, Wall Street likes to measure them by how many users they have, how often those users engage, and for how long they stay engaged.  The tech companies tweak their offerings in the way that a candy bar maker might adjust the sugar content -- or a cigarette maker might change the nicotine levels.

Dr. Alter notes that our increasing use is not a matter of lack of willpower, quoting Tristan Harris: "there are a thousand people on the other side of the screen whose job it is to break down the self-regulation you have."

This is an issue that Mr. Harris has been outspoken about for some time.  He spent several years as a "design ethicist" at Google, and bills himself as a design thinker, philosopher, and entrepreneur.  He wrote a long, thoughtful post on Medium last year, detailing how technology "hijacks" our minds.

Mr. Harris compares what technology does to what magicians do: "They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose."  They do this in part, by controlling the menu, so that we rarely think about the choices we're not being offered.

He also compares our reliance on our devices to the allure of slot machines: every time we check it, there is a chance for us to "win," whether that is a new email, Facebook post, or other notification.  Our brains light up and crave another try.  For some of us, that is checking our email, for others, updating our Facebook status, and still others posting on Instagram or tweeting.  

All in all, Mr. Harris details 10 "hijacks" that technology companies use to exploit our psychological vulnerabilities.  As he says, "Once you know how to push people’s buttons, you can play them like a piano."

And all this is before augmented reality (AR) and virtual reality (VR) have become widespread.  Imagine how they will enhance our addiction.

Some are fighting back.  The Wall Street Journal's Joanna Stern detailed efforts to "lock up" smartphones at concerts, dinner parties, and schools.  Yondr, for example, makes pouches for smartphones to create phone-free spaces; their slogan is "Be Here Now."

Wired, of all publications, just published an interview with Florence Williams, author of The Nature Fix, the thrust of which was that we only spend 5% of our time outdoors, and that's not nearly enough.  And, no, being outdoors in VR or simply being on your smartphone while outdoors doesn't count.

We've been talking a lot about the opioid epidemic in the U.S., which is, indeed, a very real, very serious health care crisis.  But people like Dr. Alter and Mr. Harris would assert that our addiction to technology may impact even more of us and pose an equally grave threat to our long-term health.

Mr. Harris has founded Time Well Spent, "a movement to align technology with our humanity."  We're not going to get rid of our smartphones or other technology that we've come to depend upon, but we can do better about making it serve our purposes, not vice versa.

And one of those purposes should be our health.

It's not so much that smartphones are a "luxury item" that should be forgone to make better health choices, as Rep. Chaffetz was trying to assert, as it is that we're not using our technology very effectively to improve our health.  Exercise more, eat better, pay more attention to our health, especially any existing conditions.  Those are the kinds of things we should all be doing -- and that technology can help us with.

Or we can let them dictate our behavior, going for more time, more clicks, more immersion.

What we end up actually doing will determine if we're using technology, or are simply addicted to it.

Tuesday, February 28, 2017

Health Care Should Get "Smart"

Admit it: you're worried about your online privacy.  Admit it: your personal health information is one of the things you worry most about getting hacked.  Admit it: you don't understand why your health care providers seem to have a hard time sharing key information about you.  And admit it, you're not quite sure what health insurers really do, except for always saying no and for getting between you and your health care providers.

This is why blockchain is the new hope -- or hype -- for health care.  What intrigues me most about it, though, are its "smart contracts."

The GAO recently cited health as a key area of cybersecurity weakness, and TrendMicro profiled why cybercrime is a particular threat for health care.  The 2017 Xerox eHealth Survey found that 44% of Americans were worried about their personal health information being stolen, and one has to wonder if the other 56% are asleep or just don't care.

So it is no wonder that blockchain, with its touted higher level of security, is the new darling for health care pundits.

A previous post attempted to convey the blockchain basics and the hype (some of which interested readers were quick to try to deflate).  A recent article in NEJM Catalyst attempted to "decode" the hype, analyzing both the potential applications of blockchain in health care and some of the challenges it would face.  The authors warned that:
the challenges and realities of health care and health care data may be insurmountable — even as some argue that blockchain could revolutionize how we share health care data.
The financial services industry, for one, is paying attention to blockchain.  A group of big banks and tech companies just announced a consortium to expand blockchain technology.  Improved security would be nice, but what may be catching their eyes most is the prospect of saving money.

Accenture estimated that blockchain could reduce investment banks' infrastructure costs by 30%, some $8b - $10b annually for the 10 largest such banks.  It could slash the cost of many back office functions, such as reporting, compliance, and clearance/settlement.

Smart contracts are one of the features blockchain offers to help achieve these savings.  Smart contracts are, essentially, automated programs that self-execute and self-enforce, based on satisfaction of the underlying terms. They can work between two parties, or for complex multi-party agreements, and do so without any middlemen -- no lawyers or other third parties.

Smart contracts have been compared (for better or worse) to a vending machine: put your money in, get your desired choice out.  Nick Szabo, who may or may not have invented Bitcoin, uses the vending machine analogy, and elaborates:
Think of blockchain as an army of robots checking up on each others' work.  Where traditionally you have accountants and lawyers, there are now a wide variety of things we can do with the vending machine-like mechanism to replace the job of traditional contracts plus added cryptographic mechanisms for integrity.
Electronic health records and exchange of patient data more generally are often cited as obvious potential uses for blockchain in health care, as the data is not stored in silos and can be shared by trusted partners.  Those may prove valid uses, but blockchain's killer app may be smart contracts -- such as to cut out health plans.

You get services from your health care provider, but your health plan decides if your contract covers the care, and how much, if anything, it will pay towards it.  Your provider may or may not be in your health plan's network, may or may not have to contact your health plan for "permission" for some services, may or may not submit a claim on your behalf, and may or may not receive payment directly from your health plan (as well as from you for your portion).

So, yes, they are in the middle.

Your health plan is really a contract between you and your health insurer.  If you pay the required amount, they are obligated to provide payment for a specified set of health care services -- under a specified set of conditions: application of deductibles/coinsurance/copayments, use of provider networks, limits on certain services, medical necessity, excluded services, etc.  Like many insurance contracts, to understand it you probably need a law degree.
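To make the point concrete, here is a toy sketch of the cost-sharing arithmetic buried in those terms; the numbers, names, and the simple way the copay is applied are illustrative assumptions, not any real plan's rules:

# Illustrative only: a toy model of how a typical plan applies cost sharing
# to a single in-network claim.  The terms below are assumptions, not any
# actual plan's benefit design.

def patient_share(allowed_amount, deductible_remaining, coinsurance_rate,
                  copay=0.0, out_of_pocket_remaining=float("inf")):
    """Return (patient_pays, plan_pays) for one claim."""
    deductible_part = min(allowed_amount, deductible_remaining)
    coinsurance_part = (allowed_amount - deductible_part) * coinsurance_rate
    patient = min(deductible_part + coinsurance_part + copay,
                  out_of_pocket_remaining)
    return patient, allowed_amount - patient

# Example: a $2,000 allowed charge, $500 left on the deductible, 20% coinsurance
print(patient_share(2000, 500, 0.20))   # (800.0, 1200.0)

Even this stripped-down version leaves out networks, excluded services, and medical necessity -- which is rather the point.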

Imagine a smart contract between you and your doctor.  She promises to, say, fix your broken leg and, if she does, you promise to pay her $X.  The two of you would agree how and when to decide if the leg is, in fact, fixed.  All that goes in the smart contract.

No ICD-10 or CPT codes, no unknown charges, no payment for care that doesn't work; just mutual agreement about what each party will do.   You fund a virtual currency account, she provides the services, the smart contract monitors when the conditions are met, and issues payment once they are.
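For the curious, here is a minimal sketch of that broken-leg agreement, written in plain Python rather than an actual smart contract language; the class, the escrow mechanics, and the follow-up X-ray check are all assumptions made for illustration:

# A minimal sketch of the broken-leg smart contract described above.  The
# names, the outcome check, and the escrow mechanics are assumptions only.

class SmartCareContract:
    def __init__(self, patient, provider, price, outcome_check):
        self.patient = patient
        self.provider = provider
        self.price = price
        self.outcome_check = outcome_check   # the agreed-upon test that the leg is fixed
        self.escrow = 0.0
        self.paid = False

    def fund(self, amount):
        # The patient funds the virtual currency account up front.
        self.escrow += amount

    def settle(self):
        # Self-executing and self-enforcing: pay the provider only if the
        # agreed outcome is met and the account holds the agreed price.
        if not self.paid and self.escrow >= self.price and self.outcome_check():
            self.escrow -= self.price
            self.paid = True
            return "paid %s to %s" % (self.price, self.provider)
        return "conditions not yet met; no payment made"

# Example: both parties agree a six-week follow-up X-ray is the test.
followup_xray_shows_healed = lambda: True    # stand-in for the agreed verification
contract = SmartCareContract("you", "Dr. Smith", 5000, followup_xray_shows_healed)
contract.fund(5000)
print(contract.settle())

In a real blockchain implementation, the outcome check would come from an agreed-upon oracle and the escrowed funds would live on the chain, not in a Python object, but the logic is that simple.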

None of that looks like the current health care system.

The broken leg is, admittedly, a relatively simple example, and even it would require some work to define the contract.  How badly is the leg broken?  How do you know if the price is reasonable?  How do you evaluate if the leg is fixed to your satisfaction?  And all this is supposed to happen while you are in pain in the ER or doctor's office?

However, none of that is insurmountable, nor alien to advocates of reference pricing or value-based care.  Defining expectations about care, price, and outcomes prior to receipt of care would be a real change, but one that would go a long way to changing the way our health care system works.

It is easiest to see smart contracts working for conditions with clear expected outcomes, costs that are relatively affordable, and between two parties.  However, they don't have to be so limited, and could be designed for use with multiple parties, be they multiple providers or multiple funding sources (e.g., using a Kickstarter campaign or even a catastrophic health plan).

Blockchain may prove to not fit the structure we've evolved in health care.  Smart contracts may not be made smart enough to understand what happens in our health care system.  But maybe, just maybe, blockchain could be the next big thing in health care, and smart contracts may be its killer app.

Tuesday, February 21, 2017

The Good, the Bad, and the Ugly in Health Care

I hate being a patient.

I have to admit that, although I write about health care, I am typically what can be described as a care-avoider.  My exposure to the health care system has mostly been through my professional life or through the experiences of friends and family.  The last few days, though, I unexpectedly had an up-close-and-personal experience as a hospital inpatient.

I want to share some thoughts from that experience.

Now, granted, any perceptions I gained are those of one person, in one hospital, in one medium-sized mid-western city.  Nonetheless, I offer what I consider the Good, the Bad, and the Ugly of the experience.

The Good:  The People

The various people involved in my care, from the most highly trained physician to the person who delivered meals, were great.

I loved my nurses; they fit all those great stereotypes people have about the profession.  Attentive, caring, cheerful, knowledgeable, hard-working -- the list goes on and on (full disclosure: I'm married to a nurse, so none of this came as a surprise).

I liked my doctors a lot.  Each of them spent literally hours with me -- answering my (many!) questions, discussing what they thought was going on with me, describing the various tests or procedures, developing care plans to fit me.  They were super-smart and a pleasure to talk with.

The aides, the lab techs, the imaging tech, the transportation specialists -- all of them doing jobs that I wouldn't be able to do -- were each friendly and helpful, taking pride in what they did and how it helped my care.

Whatever you might say about our health care system, you cannot say that it is not filled with people who care about the patients in it.

The Bad: The Processes

On the other hand, on the lists of criticisms about our health care system, many of its rules and processes truly do deserve a place.  They're like part of an arcane game no one really understands.

I'll offer three examples:

  • Check-in: I was literally on a table in a procedure lab -- still wondering how the hell I'd ended up there and not quite sure what was about to happen to me -- when I was asked to electronically sign several forms (Privacy Policy, Consent to Treat, Consent to Bill) that I could neither see nor was able to question.  No court of law could call that informed consent, but that's what the process required before I could actually receive care.
  • NPO:  At one point it was thought that, on the following day, I might have a procedure, so I had to be NPO (no food or water allowed) for at least 4 hours -- but starting at midnight.  I pointed out that it was highly unlikely that they'd be doing anything at 4 am, and even mid-morning was unlikely since nothing was yet scheduled, but that was not persuasive.  As it turned out, I'd gone something like 16 hours NPO when they finally listened to my concerns: by putting me on a saline solution IV.  I think they understood the physical problem but not the human one.  (It ended up I didn't have the procedure anyway.)
  • Discharge:  On my final day, the doctor told me around 1 pm that I was being discharged.  Around 3 pm his nurse practitioner told me she'd personally written the discharge orders.  Around 5:30 pm my nurse gave me all my discharge papers, but told me I had to wait for Transport to escort me out in a wheelchair (even though I was perfectly capable of walking).  Finally, around 6:30 pm my wife simply commandeered a wheelchair and we made a break for it.    
The rules and processes are all undoubtedly in place for good reasons, but we need to un-handcuff all those great people when rules and processes get in the way of better patient care.  

The Ugly: The Technology

Oh, health care technology.  It is equally capable of delighting and of frustrating.  It is truly remarkable that the doctor could go up my arm to perform a procedure in my chest, just as the detail an MRI provides is simply astonishing.

On the other hand, those gowns...

Let's start with the perennial whipping boy, EHRs.  All of the staff used them, seemed to accept them, and even (grudgingly) acknowledged their value.  But no one liked them.  Even the youngest users, to whom technology is a given in their personal lives, were frustrated by the interface.  And, on many occasions, having an EHR didn't spare people from dragging in other electronic equipment, or even paper, in order to do their jobs.

EHRs could be better, should be better -- and better get better.

MRIs are a wonderful technology, but as I was lying in that claustrophobic tube getting imaged, I kept thinking: what the heck are all those clanging noises?  We can make stealth submarines, but we can't make an MRI that is quiet, so that anxious patients don't have more to worry about?

I was on various forms of monitoring devices, the smallest of which was the size of a 1980's cell phone and still required countless wires attached to numerous leads.  I kept looking at the set-up and wondering, hmm, have these people heard of Bluetooth?  Do they know about wearables?

My favorite example of ugly technology, though, came when I had to fill out a form (which looked like it dated from the 1970's), so that it could be faxed to the appropriate department.  That's right, faxed.  To a department in the same institution, in the same building.  I couldn't fill it out online?  A paper form couldn't be scanned and sent securely to the other department?

I'd love to be the boss of the guy who has to request a new fax machine, just so I could look at him with my best "you've-got-to-be-kidding-me?" expression.


No health care system is perfect.  Every system has its own version of the Good, the Bad, and the Ugly.  No one wants to have health problems, and no one wants to need to be in health care settings.  When we do, and when we have to be, though -- well, our system can do better.  Let's give all those great people working in health care a better chance to help us.

If any of the above strikes home for you, perhaps you'll Like/Recommend, tweet, or otherwise share with your circle.

Monday, February 13, 2017

Ask Better Questions

I've been thinking about questions.

A few things I read helped spur this.  The first was a blog post entitled "Asking the Wrong Questions" by Benedict Evans, of VC firm Andreessen Horowitz.  Mr. Evans looked at a couple of long-range technology forecasts (from as long ago as 1964 and as recently as 1990), and pointed out how they both managed to miss several key developments.  He attributed this to "this tendency to ask the wrong questions, or questions based on the wrong framework."

And we're still at it.  Mr. Evans, whose background is mobile technologies, said that people are now doing a lot of speculating about what comes "after mobile," such as AR and VR.  There are lots of good questions being asked, he noted, "But every time I think about these, I try to think what questions I'm not asking."

That, my friends, sounds like some pretty good advice, especially if you fancy yourself an innovator.

Then there was an interview with Warren Berger in Singularity Hub.  Mr. Berger labels himself a "questionologist" -- a line of work I wish I'd thought of! -- and wrote a 2014 book A More Beautiful Question.

You have to admire his ability to turn a phrase; I love the notion of a "beautiful" question.

Mr. Berger defined a beautiful question as "an ambitious yet actionable question that can shift the way we think about something and may serve as a catalyst for change."  As he further explained:

  • “Ambitious” because we have to ask bold questions to innovate.
  • “Actionable” because big questions we can’t do anything about don’t lead to change.
  • Critically, the question has to cause a mental shift—it makes you step back and say, “Hmmm, that’s interesting. I hadn’t really thought about this question before, and I want to explore it further.”

He sees these kinds of questions as important not just for technological innovation, but even for basic questions like "what business am I in?" (something, for example, the folks at Snap have recently been asking, with some surprising answers).  He further suggests organizations should turn mission statements into mission questions, to remind people to keep questioning.  And, of course, he urges that leaders foster a culture of inquiry, without demanding immediate answers to every question.

One of Mr. Berger's favorite examples is how the Polaroid instant camera came about because founder/CEO Edwin Land's three-year-old daughter asked why they had to wait to see the picture he'd just taken.  As Land later recounted, “I thought, ‘Why not? Why not design a picture that can be developed right away?’”

One could argue that Polaroid's downfall came because it stopped asking "beautiful" questions.

In 2015, Tom Pohlmann and Neethi Mary Thomas, of decision science firm Mu Sigma, wrote in Harvard Business Review that we need to "relearn the art of asking questions."  They claim that "proper questioning" has become a "lost art."  They lament that, while most of small children's conversations consist of questions, the opposite is true for adults, and blame this on educational and workplace environments that reward answers, not questions.

They categorize questions into a four-quadrant matrix: clarifying, adjoining, funneling, and elevating questions.

I suspect that too many questions in most organizations would be considered "clarifying," and not very many at all would be classified in any of the other three quadrants.  The authors agree with Mr. Berger that leaders need to encourage people to ask more questions, because: "In order to make the right decisions, people need to start asking the questions that really matter."

Let's turn this to health care.

Patient engagement is one of the hot topics in health care.  Improve patient engagement, the theory is, and all sorts of good things will happen.  Patient compliance with instructions would improve, we'd do a better job managing chronic conditions, and patients would pay better ongoing attention to their health.  The questions being asked often revolve around how we can use technology to improve patient engagement.

Certainly technology could improve patient engagement, but let's start from a different point.  I would be willing to bet that all or almost all providers have mechanisms to track payments, have statistics on late payments to which they pay close attention, and have procedures in place to reach out to patients whose payments are considered late.

On the other hand, I'd similarly bet that very few providers have mechanisms to track patient status post-visit/procedure/prescription, other than perhaps a simple follow-up phone call.  If the patient doesn't contact them to complain, all is considered good.  As a result, there's no tracking mechanism, no statistics, and no procedures to escalate anything if a patient's status is not going as expected.

So we have a form of patient engagement already, but it is built around money, not patient well-being.  Putting in a patient portal or an app doesn't change that underlying focus.  It's addressing the wrong question.
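By way of contrast, here is a minimal sketch of what tracking patient well-being the way practices track balances might look like; the check-in schedule, the pain-score threshold, and the escalation rule are all assumptions, not features of any actual system:

# Purely illustrative: a simple post-visit status tracker, built the way a
# billing office tracks overdue balances.  The schedule, scores, and threshold
# below are assumptions for the sake of the sketch.

from datetime import date, timedelta

def expected_checkins(discharge_date, days=(2, 7, 30)):
    """Dates on which the practice should confirm the patient is on track."""
    return [discharge_date + timedelta(d) for d in days]

def needs_escalation(reported_pain_scores, missed_checkins, pain_threshold=7):
    """Flag a patient for outreach, much as an overdue balance gets flagged."""
    worsening = any(score >= pain_threshold for score in reported_pain_scores)
    return worsening or missed_checkins >= 2

print(expected_checkins(date(2017, 2, 13)))
print(needs_escalation([3, 5, 8], missed_checkins=0))   # True: pain is worsening

None of this is hard; it just requires deciding that recovery is as worth tracking as revenue.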

Or let's look at the big question that confounds everyone -- why does the U.S. spend so much more than every other country on health care (and yet only has mediocre health results)?  Maybe we're simply counting the wrong things.

Most of us have, by now, probably seen a version of this chart, detailing the drivers of our health care spending.

Basically, medical care gets the lion's share of the money, but is in itself not a major driver of health.

We're somewhat unique in this.  We invest in medical care, while most other developed countries spend more of their money on "social care" -- better housing, education, income support, etc.

The AHPA charts it as follows:

When you combine the OECD's health spending figures and their social expenditures, the gap in "health" spending between the U.S. and other developed countries narrows dramatically (we're still higher, most likely due to our insanely high prices).

We're not asking the right questions -- or even looking at the right problems.

I'll close with two applicable quotes: