Tuesday, September 19, 2017

Not Just Better Tech, Nicer Tech

We are surrounded by our technology.  We're glued to our smartphones, and when we're not on them we're looking at our tablets, computers, televisions, or gaming systems.  We're turning our cars into mobile technology platforms and our houses into "smart" homes, complete with Internet-of-Things (IoT) connectivity and always-on-call virtual assistants like Amazon's Alexa.  Most of our jobs are increasingly infused with technology, even ones historically considered low-tech.

We're addicted to technology, but we're not all that happy with it, and nowhere is this more evident than in health care.


For example, a recent post-mortem of HITECH by John Halamka and Micky Tripathi looked at the "miraculous" success of the program in moving health care providers to electronic health records (EHRs).  Still, the authors admitted: "Along the way, however, we lost the hearts and minds of clinicians."

In their great analogy, "we gave clinicians suboptimal cars, didn't build roads, and then blamed them for not driving."

Indeed, EHR tasks are said to consume half of primary care physicians' time, and nearly two-thirds of health care professionals in another survey said the ROI on EHRs has been terrible or poor; only 10% rated it positive.

As Jody Medich warns in Singularity Hub, our interfaces are killing us.  According to Ms. Medich, the "human-machine interface" (HMI) we've been relying on is all too often based on a time when we sat at a desk, looked at a terminal, and did things like math.  It wasn't intended for now, when our computing devices are with us everywhere and expected to be always on call, handling a variety of everyday tasks in ways we can immediately process.

Our cars as mobile technology platforms are a good example: driving at 70 mph is not the best time to have to read text on a small screen or decipher confusing icons.

Ms. Medich believes we are about to go into an era of cognitive or perceptual computing, which "recognizes what is happening around it (and you) and acts accordingly...This means technology will be everywhere, and so will interface."

David Webster, a partner at design firm Ideo, frames the coming technology revolution differently.  He writes: "The key is to design experiences around emotional value rather than rational value."

That may be the problem: our technology has always been written by hyper-rational coders, aiming at "rational" tasks, while much of what we do every day is driven by more emotional reasons.

Mr. Webster gives the health-related example of a "smart" scale that chided a woman for gaining a few pounds -- not realizing she was four months pregnant.  Getting such alerts can help motivate people, but they need to be appropriate and in context in order to be effective.

Just ask any health care professional about "alert fatigue."
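The smart-scale story suggests what context-aware alerting could look like in practice.  Here is a minimal sketch -- every name and threshold is hypothetical, not drawn from any real device:

```python
from dataclasses import dataclass


@dataclass
class UserContext:
    """Context a context-blind device wouldn't have."""
    pregnant: bool = False


def weight_alert(current_kg: float, baseline_kg: float, ctx: UserContext):
    """Return an alert message only when it is likely appropriate.

    A context-blind scale flags any gain; here the alert is suppressed
    when the user's context (e.g., pregnancy) explains it.
    """
    gain = current_kg - baseline_kg
    if gain < 2.0:       # small fluctuations aren't worth an alert
        return None
    if ctx.pregnant:     # expected gain -- chiding would be wrong
        return None
    return f"You've gained {gain:.1f} kg since your baseline."


# The scale from the article, with context: no chiding.
print(weight_alert(68.0, 64.0, UserContext(pregnant=True)))   # None
print(weight_alert(68.0, 64.0, UserContext(pregnant=False)))
```

The point isn't the particular rule; it's that the device consults the user's situation before speaking, which is exactly what the chiding scale failed to do.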

Mr. Webster goes on to say:
The fundamental role of designers is to use creativity to bridge the gap between rational and emotional—to make new technology engaging and appealing by having it meet humans on their terms. We’ve found the best way to get people to integrate new products or behaviors into their lives is to connect with them emotionally, which encourages adoption.
If there is any sector that needs to think about the emotional, it is health care.

People turn into patients when they enter health care settings.  They turn into patients once they're diagnosed with a health issue.  As patients, they're forced to rely on health care professionals, they're bombarded with unfamiliar jargon, they're often asked uncomfortable questions or put through unpleasant treatments and/or procedures.  They may be scared, worried, angry, uncertain, or even delighted (a new baby!).

Talk about emotional.

Much effort has been put into giving patients access to their health records, yet Ambra Health reports 31% of consumers can't easily access them, and other research suggests that well below 30% of patients who have such access actually use it.  And how many understand them?

Meanwhile, we're also collecting data from other sources, such as wearables.  We're able to track our steps, monitor our blood pressure and heart rate, measure our blood glucose levels.  We can see all those resulting numbers, and get alerts about them.  But more numbers are not what we need.

We're already floundering in data we don't easily understand, and we're making it worse.

If health care were strictly rational, placebos wouldn't work and we'd be eager to replace our human doctors with artificial intelligence (AI) ones.  But they do and we aren't.

EHRs shouldn't just be data collection vehicles for clinicians, and they shouldn't be primarily data reporting mechanisms for them, or for us.  We are not data, and our health can't be reduced to it.

Similarly, it's very clever to create "dashboards" for our various health information from our many devices, but we care less about what the numbers are than what they mean for us.
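One way a dashboard could move from numbers to meaning is to translate each reading into a plain-language interpretation.  A hypothetical sketch, using illustrative blood-pressure cutoffs (not clinical guidance):

```python
def interpret_bp(systolic: int, diastolic: int) -> str:
    """Turn raw blood-pressure numbers into what they mean for the user.

    Cutoffs here are illustrative, loosely following widely published
    category boundaries; a real product would use vetted guidelines.
    """
    if systolic < 120 and diastolic < 80:
        return "Your blood pressure is in the normal range."
    if systolic < 130 and diastolic < 80:
        return "Your blood pressure is elevated -- worth watching."
    if systolic < 140 and diastolic < 90:
        return "This reading falls in the stage 1 hypertension range."
    return "This reading is high -- consider contacting your clinician."


print(interpret_bp(118, 76))
print(interpret_bp(135, 85))
```

A "120/76" tile tells most people nothing; "in the normal range" tells them what they actually wanted to know.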

A previous post argued for the importance of data visualization, pointing out: "Let's face it: most of us are not good with numbers.  Most of us don't think in numbers.  Most of us think in pictures."  Rasu Shrestha, MD, MBA, the UPMC chief innovation officer, gently disagreed, saying that most of us think in stories.

Pictures or stories -- either way, if we want tech to be effective, it has to engage us emotionally, not just rationally.

In a Wall Street Journal opinion piece, Mark P. Mills -- a senior fellow at the Manhattan Institute and engineering professor at Northwestern -- says the cyber age has hardly begun, as we have yet to truly integrate software into hardware "so that it becomes invisible and reliable."

Further, "the U.S. now stands at the equivalent of 1920 for ubiquitous cyberphysical systems," he believes, and "the dominant players of the cyberphysical age have yet to emerge."

Our 1960s/1970s approaches to technology have been very successful, but it is now the 21st century and it is past time for the next era of technology.  Whether that is cognitive computing, emotional design, or cyberphysical systems -- or a combination of all three -- our technology needs to, and is going to, act very differently.  It needs to "know" us and react to us appropriately.

We are building technology with ever-higher IQs, when what we really need is technology with EQ.

Where better to start than in health care?

Tuesday, September 12, 2017

The World Is Not (Going To Be) Flat

Too many of us have come to believe that the world is flat.  No, I'm not talking about the Flat Earth Society or the random celebrities who purport to believe it is (although both are troubling).  I'm talking about the rest of us, who increasingly see the world through the prism of our various screens, be they smartphones, computer screens, or TVs.  Americans admit to almost 11 hours of screen time daily, and one has to suspect that is understated.

That's going to change.  Soon.  And digital cinema camera maker RED may be showing us how with its new Hydrogen One smartphone.

In a partnership with Leia Inc., an HP spin-off, RED announced the Hydrogen One, which it claims is "the world's first lightfield 'holographic' smartphone."  It is expected to be on the market in the first half of 2018, and will retail for $1,200 ($1,600 for the titanium version).

Their press release says: "The Hydrogen program will feature stunning holographic content and 3D sound for movie viewing, interactive gaming, social messaging and mixed reality."  That's all very nice, but it is the holographic display that has people's attention (well, that and the price).

RED's background has been in cameras, aiming to "build the world's best cameras," which began with 2007's RED ONE, a breakthrough 4K digital camera (they now claim products with "8K resolution").  Leia has similarly grand ambitions, stating that:
Our proprietary Diffractive Lightfield Backlighting (DLB™) solution adds nanostructures to a conventional display and gives them almost magical properties while preserving their standard imaging capabilities.
Got that?  A 2015 Leia video helps illustrate their display:
OK, so it's not quite like the holograms we expect from science fiction television shows and movies, and content is going to be an issue for some time, but it is at least a break from the flat screens we're used to, even with 3D renderings in 2D.  And they're not alone.

Apple, for example, has filed a patent application for an "interactive three-dimensional display system," according to CB Insights.  Holus has a Kickstarter campaign for its "interactive tabletop holographic display," while Holoxica claims "several generations of holographic technologies, which span from static images to motion video displays with interaction."

There are published papers in Nature and Optics Express that promise, respectively, "Holographic displays generate realistic 3D images that can be viewed without the need for any visual aids" and a "360-degree tabletop electronic holographic display."

Writing for NPR last month, Glenn McDonald asserted that for true holographic displays we're not quite there yet: "we're getting awfully close, though."  He cites a laser display from the University of Rochester and a laser-plasma approach from Aerial Burton as examples.  Other versions are still, he believes, more like optical illusions than "genuine" holograms.

If we only think we're seeing a hologram, does it really matter, as long as we do see them?

The point is, holographic displays are not only feasible with existing technology but are starting to be commercialized.  RED may have gotten a jump on the market, and may be early in what the experience can yet deliver, but its state of the art will not remain the state of the art for very long, nor will they be the only ones.

There will be some fast followers, and they will, indeed, follow fast.

It won't just be about smartphones.  Anything that uses a screen could be augmented, or replaced, by a holographic image.  I've written before about the coming world of "ubiquitous computing," where your device could be just about anything and your display could show up anywhere you desire.

You may not care about a holographic display of text, for example, but you might about images, especially if they are interactive.

Entertainment and gaming, of course, are two industries where holographic displays should find early uptake.  Health care, on the other hand, is rarely a fast follower of new technology, but the industry needs to be thinking about the possibilities.

Health care is about people, but it is full of words and data.  Your medical chart is full of words you don't know, drugs you can't even pronounce, numbers that have no obvious meaning.  Health educators do their best to come up with illustrations, simplified explanations, videos, and visual aids, but most of us have health literacy levels well below our general literacy.

Even health professionals struggle to take in all the information, and that problem will grow exponentially as that data does, such as through sensors in wearables and elsewhere.  We need more pictures, and some of those pictures should be holograms.

Picture a hospital room, with the poor patient hooked up to various monitors (all beeping away constantly).  A doctor or nurse coming into the room has to look at the screens and try to make sense of what they are saying about the patient's health.  They may be very good at it, from years of practice, but perhaps it doesn't have to be so hard.

A holographic display -- perhaps of the patient, or trend lines -- could help more easily illustrate problem areas or indicators that are trending in the wrong direction.  It doesn't have to be holographic, of course; it's just that we are visual beings.  A holographic image might make the situation more real and the comprehension faster.

I long for the EHR that is based on holographic displays, allowing the clinician to not only visualize a patient's history and current status but also interactively annotate them.  The display could also be much easier for the patient to view and understand, as well as allowing for on-the-spot visualization of any diagnoses or proposed treatments.

If, as the saying goes, a picture is worth a thousand words, then a holographic display might be worth ten thousand words of medical jargon.

Augmented reality (AR) and virtual reality (VR) are equally exciting, but, to the extent that they require us to view them through devices, may have different uses.  At some point, though, the three technologies may, for all practical purposes, merge into a unified "hands-free" interactive experience.

So, kudos to RED and Leia for showing us the way forward with smartphones.  The future of holographic displays can't come fast enough.

Tuesday, September 5, 2017

Health Care's Juicero Problem

Bad news: if you were still hoping to get one of the $400 juicers from Juicero, you may be out of luck.  Juicero announced that they were suspending sales while they seek an acquirer.  They'd already dropped the juicer's price from its initial $700 earlier this year and had hoped to find ways to drop it further, but ran out of time.

I keep thinking: if they'd been a health care company, they not only might still be in business but also would probably be looking to raise their prices.
Juicero once was the darling of investors.  It raised $120 million from a variety of respected funding sources, including Kleiner Perkins, Alphabet and Campbell Soup.  They weren't a juice company, or even an appliance company.  They were a technology company!  They had an Internet-of-Things product!  They had an ongoing base of customers!

Juicero's founder, Doug Evans, saw himself as a visionary, telling Recode: "I'm going to do what Steve Jobs did.  I'm going to take the mainframe computer and create a personal computer.  I'm going to take a mainframe juice press and create a personal juice press."

The market seemed promising.  All those people willing to pay $5 for a cup of Starbucks or $200 for their own Keurig would certainly see the value in their own juicer, especially with Juicero's own, IoT-connected Produce Packs.  Indeed, Juicero claimed to have sold over a million of the Produce Packs alone.

The ridicule started almost as soon as the hype.  $700 -- even $400 -- for a juicer?  Even for Silicon Valley, that was a bit much.  Mr. Evans was replaced as CEO last fall, but their woes continued.  The negative publicity probably reached its nadir in April, when Bloomberg reported people could produce almost as much juice almost as fast just by squeezing the Produce Packs directly.
Moral of the story: if you want to introduce products that have minimal incremental value but at substantially higher prices, you're better off sticking to health care.

Take everyone's favorite target, prescription drugs.  The pharmaceutical industry has learned how to play the system for higher prices, and profits.  They can take existing drugs and tweak them to justify higher prices, or even buy rights to existing drugs and jack up the price, as we saw with Daraprim ($13.50 per tablet to $750) and EpiPens ($57 to over $500).  Former Turing Pharmaceuticals AG CEO Martin Shkreli, who raised the Daraprim price, may be hated for his actions but he's not alone.

Consider this: studies suggest that only about 10% of new drugs are actually clinically superior to existing treatment options.

As Donald W. Light charged in Health Affairs, "Flooding the market with hundreds of minor variations on existing drugs and technically innovative but clinically inconsequential new drugs, appears to be the de facto hidden business model of drug companies."

As with prescription drugs, we regulate medical devices looking for effectiveness but not cost effectiveness -- and we don't even do a very good job evaluating effectiveness in many cases, according to a recent JAMA study.

And, as Elisabeth Rosenthal pointed out in her remarkable An American Sickness, medical device manufacturers have figured out how to game FDA regulation by claiming products were "substantially equivalent" to existing devices; "The surprising result is that today there is generally far less careful scrutiny of new devices than new drugs."

Take robotic surgery, hailed as a technological breakthrough that was the future of surgery.  A robotic surgical system, such as da Vinci, can cost as much as $2 million, but, so far, evidence that they produce better outcomes is woefully scarce.

As Dr. John Santa, medical director at Consumer Reports Health, said, "This is a technology that is costing the healthcare system hundreds of millions of dollars and has been marketed as a miracle -- and it's not."

It is, of course, much more expensive.  

Proton beam therapy?  It's one of the latest things in cancer treatment, an alternative to more traditional forms of radiation therapy, and is predicted to be a $3 billion market within ten years.  The units can easily cost over $100 million to buy and install, and cost patients significantly more than the alternatives, yet -- guess what? -- they don't produce measurably better results.

The number of proton beam centers is growing rapidly, of course -- especially in the U.S.

But, in our health care system, with its crazy-quilt system of financing and delivering care, we don't need new drugs or fancy new devices to cost us more.  We pay more for pretty much everything than pretty much everyone else.

Last year Vox used 11 charts to illustrate how much more we pay for drugs, imaging, hospital days, child birth, and surgeries than other countries.  Their conclusion, which echoes conclusions reached by numerous other analyses: "Americans spend more for health care largely because of the prices."

We not only don't get a nifty new juicer from all of our health care spending, we don't even get better health outcomes from it.

Health care's "best" Juicero example, though, may be electronic health records (EHRs).  Most agree on their theoretical value to improve care, increase efficiency, and even reduce costs.  But after tens of billions of federal spending and probably at least an equal amount of private spending, we have products that, for the most part, frustrate users, add time to documentation, and don't "talk" to each other or easily lend themselves to the hoped-for Big Data analyses.

Many physicians might, on a bad day, be willing to trade their EHR for a Juicero.

Jonathan S. Skinner, a professor of economics at Dartmouth, pointed out the problem several years ago: "In every industry but one, technology makes things better and cheaper.  Why is it that innovation increases the cost of health care?"

Essentially, he believes, our health care system pays for too many things that are of too little value, because, "unlike many countries, the U.S. pays for nearly any technology (and at nearly any price) without regard to economic value."

So we can make fun of Juicero all we want, but when it comes to overpriced, under-performing services and devices: health care system, heal thyself first.

Tuesday, August 29, 2017

We Must All Be Healthy Together...

Is there anything our microbiome can't do?

We're starting to get a better picture of how our microbiome impacts our health, and I'll get to that shortly, but at the recent Biohack the Planet conference some clever folks at Biota Beats figured out how to turn the microbiome into music:    

Click here to check out what it actually sounds like (especially the full symphony).  

OK, so it's not Beethoven or even Jay-Z (although David Sun Kong, an MIT biologist who presented the work, says DJ Jazzy Jeff is going to put a track from his microbiome on his next record), but it is pretty cool.  

Imagine that one day we might diagnose our health by checking the sounds of our microbiome.

For anyone who hasn't been paying attention, the microbiome is the collection of microbial organisms living in, on, and around us.  They are literally everywhere, and their genes outnumber our genes by 100-to-1, perhaps more.  They even have more cells than we do.

That probably sounds terrible to many people.  We're a nation that demands antibiotics at the slightest sniffle, that puts antibiotics in its food chain, that uses antibiotic soaps.   Ever since we discovered penicillin, we've decided that if we can kill off "foreign" invaders to our bodies, we should.

And, certainly, much good has come from that.  We don't usually die of infections any more.  The trick, though, is understanding what is "foreign," and what is "invading" us, rather than simply at home in us.   

We start acquiring our microbiome in the birth canal (in fact, being born via Cesarean section can adversely impact your microbiome), and its composition is constantly changing from then on.  Children eating dirt, for example, is generally frowned upon by modern parents, but it actually is a great way for them to boost their microbiome.  There is a "hygiene hypothesis" that links the increasing incidence of autoimmune and allergic diseases to our efforts to avoid such "germs."

The Human Microbiome Project has been studying our microbiomes since 2008, and, as with our own genome, it seems that the more we learn, the more we realize how much more there is to learn.  Links between our microbiome and our health seem to be everywhere we look.  

The microbiome is deservedly becoming a big research focus.  Indeed, the Cleveland Clinic listed the microbiome as the top medical innovation of 2017.


IBM, Harvard, MIT, Mass General, UCSD, and the Flatiron Institute are teaming up to map 3 million genes found in the gut microbiome (you can donate spare computing time to the effort).  This is only one of several efforts in the field, TechCrunch reports.

It has been well documented that our gut microbiome not only helps us digest food but also influences how we store fat, gain weight, and regulate blood glucose, and even what foods we crave (e.g., chocolate!).  The following chart illustrates the complex interactions:
Boulange, et al.
Researchers Jasenka Zubcevic and Christopher Martyniuk assert that "There's growing evidence of a link between the brain and our microbiota as well."  Their research found that the brain communicates with the gut microbiome via bone marrow immune cells, which suggests connections to our immune responses and immune diseases.

In addition, the gut microbiome has been linked to stress, anxiety, and other mental health issues.  Which, as Professors Zubcevic and Martyniuk put it, gives "a whole new meaning to the term 'gut feeling.'"

And we're still being surprised.  A new study of the Hadza, an African hunter-gatherer tribe, found that their microbiome varied seasonally, possibly based on changing diet throughout the year.  The degree of the changes was unexpected, with some of the microbiota dying off entirely, then reemerging.

Their microbiome was also dissimilar to those found in more industrialized societies.  As to why, or what it meant for our health, the researchers could only say: "That’s a huge question — it’s the elephant in the room." 

Justin Sonnenburg, the lead author of the Hadza study, admitted to The New York Times, "We don’t have a good grasp of what these seasonally varying microbes even do."
Two 2016 studies tried to correlate a host of factors with people's microbiome, and could only explain 8-16% of the variation.  As one of the researchers said, "It's very humbling."

Two researchers -- Rebecca Vega Thurber and Jesse Zaneveld -- have proposed what they call the Grand Unification Theory of Unhealthy Microbiomes.  It theorizes that when microbiomes become unhealthy, they do so in unpredictable ways.  They become more varied, but in every direction.

That's why Dr. Zaneveld likes to call it the "Anna Karenina" hypothesis, as in Tolstoy's famous opening: "All happy families are alike; each unhappy family is unhappy in its own way."

The theory is still in the early stages, but it is drawing attention.  The fact that our microbiomes vary both from each other's and over time even when healthy complicates our understanding of when it is not.  As The Atlantic's Ed Yong points out,  "if the microbiome is ruled by randomness, then it might be hard to determine whether a particular community is unhealthy, and to develop standardized, effective ways of steering it back on course."

We're proud of 21st century approaches to medicine like breakthroughs in immunotherapy or gene therapy, but, in some ways, we're still where we were when we first started understanding the role of bacteria and viruses.  Dr. Sonnenburg concluded: "We have to think of ourselves as these composite organisms, with microbial and human parts."

Professors Zubcevic and Martyniuk came to a similar conclusion, especially as we increasingly rely on pharmacological interventions that may impact the microbiome:  
Much like the chicken-or-the-egg scenario, however, this complex interplay warrants further investigation to fully understand the consequences (or benefits) of perturbing one single component of the gut microbiota.
The fundamental change to thinking about our health that we have to make is that, when it comes to our microbiome, it isn't "us" versus "them."

It's all us.

When he was urging his fellow citizens towards independence, Benjamin Franklin famously said, "we must, indeed, hang together or, most assuredly, we shall all hang separately."  So it is with our health and the health of our microbiome.  We're in this together.

Meanwhile, I'm looking forward to hearing my microbiome!

Tuesday, August 22, 2017

We Need More Dumb Ideas

Over the years I've listened to many new-to-health-care entrepreneurs pitch their great new idea.  They're so excited: health care is so inefficient!  People are so frustrated by the system!  It will be so easy to improve it!

I usually end up thinking, "Oh, you poor people.  You really don't know much about health care, do you?"  They don't fully grasp the strange way it is bought and sold, the convoluted financing, or the layers of regulation.  So I wish them well and wait to hear about their eventual failure.

But now I'm thinking, maybe it is experts like me who are part of health care's problem.

In Harvard Business Review, Ayse Birsel suggests that companies need to do more "reverse thinking," deliberately thinking up wrong ideas.  As she says:
Wrong thinking is when you intentionally think of the worst idea possible — the exact opposite of the accepted or logical solution, ideas that can get you laughed at or even fired — and work back from those to find new ways of solving old problems.
"Wrong" ideas force us to think differently, and to identify exactly what about them is actually wrong.  Doing so can open up new ideas or new avenues to investigate.  One of her examples of this is biochemist Fred Sanger, who was trying to sequence DNA back in the 1970's.  He went at this by trying to do the opposite, building DNA instead of breaking it down.  His resulting insights garnered him his second Nobel Prize.

Ms. Birsel has several suggestions for how companies can spur such wrong thinking, but the most powerful one may be "Be the Beginner."  She cites the famous Zen quote from Shunryu Suzuki: "In the beginner's mind there are many possibilities, but in the expert's mind there are few."

In other words, sometimes we can know too much, and that knowledge can cloud our thinking about what the possibilities really are.  Sometimes we need all those possibilities.

Be the Beginner, indeed.

We've all been in meetings where everyone is trying to show off how much they know and what they think should be done.  People often talk past each other, not really listening and certainly not being open to any dumb ideas that might end up not being so dumb after all.  It's more about showing how smart they are rather than solving the problem.

Thom Crowley, in Fast Company, thinks that we'd all be more effective if we'd just stop trying to be the smartest person in the room.  Knowing when to say "I don't know," he believes, "might make you the most useful person in the room, which is way more valuable than being the most knowledgeable."

Don't always try to be the "expert."

Dumb ideas are sometimes what we need.  Panos Mourdoukoutas, a Professor of Economics at LIU, asks: "What do Amazon, Dell Computer, Home Depot, Airbnb, and Uber have in common? They all started with a “dumb” idea."

Smart ideas, he says, are easier, in a sense.  They address conventional markets, with well-defined ideas about the products.  But these markets usually have lots of competitors and slow growth.

On the other hand, so-called dumb ideas serve non-conventional markets, where neither products nor competitors are well-established.  That opens up all sorts of possibilities.

Admit it: the first time you heard the business models for Uber or Airbnb you probably thought they sounded like pretty dumb ideas, but the venture capitalists who let that stop them from an early investment are probably kicking themselves now.  As Professor Mourdoukoutas concludes: "The bottom line:  “Dumb” ideas can bring a fortune, provided that they are executed right."

Source: Anuja Shukla, Ideo
Leif Huff, a partner at Ideo, recently outlined "5 Ways to Think Like a Designer," which were: collaborate, get into a question-framing mindset, get lost in dreams, be curious, and -- my favorite -- challenge assumptions.  Those are tips we should all take to heart.

Health care is full of constraints.  Physicians guard clinical matters closely; no one can do anything that resembles the "practice of medicine" without joining their guild, with all its attendant training, rules, and ways of doing things, even ones that seem counter-productive (e.g., 100 hours a week for residents).

Health insurers are similarly wary about other entities being in "the business of insurance," although they can't stop self-insured plans from competing with them.  Both  now operate under federal rules about what their products can even look like.

As is well known, drug companies charge much more for drugs in the U.S. than anywhere else in the world, and are opposed to importing not just drugs approved in other countries but ones already approved for sale here.  They use that, patent protection, and FDA approval as barriers to competition, ostensibly to protect patients but, many believe, to maximize profits.


Take any aspect of our health care system, and you'll find professionals who want to preserve (and, if possible, expand) their role, and regulators who help provide barriers to entry to anyone who might want to usurp some or all of that role.  It is not a system where new ideas, especially "dumb" ideas, are welcomed.

Look, I get it.  Health care is very complicated.  It is about people's health, one of the things we (supposedly) value most.  Make a mistake, pick the wrong idea, and people could die, or, at the very least, not get better.

But health care comes up with enough of its own dumb ideas, and its own ways of harming people.  It shouldn't be immune to having to listen to other dumb ideas.

Sometimes we need new thinking, even in health care.  Sometimes we need to think of a bunch of wrong ideas to understand what might make them "right."  Sometimes we need to take a chance on a "dumb" idea that maybe isn't really so dumb.  Sometimes we need ideas that are, for lack of a better word, dangerous, if only to the status quo.

Some dumb ideas are, in fact, dumb.  Many will fail; others would make things worse.  But you can't let that keep you from coming up with ideas that are outside people's comfort zones.  Oscar Wilde had it right: "An idea that is not dangerous is unworthy of being called an idea at all."

So next time someone pitches me an idea that sounds naive at best and dumb at worst, I promise I'll try to be more patient, listen a little harder, and try to get past my own "expertise" in order to be open to its possibilities.

Dare to think of dumb ideas.  Don't be afraid of dangerous ideas.  Be the beginner.

Tuesday, August 15, 2017

11 Things About Health Care I'm Dying to Redesign

The folks at Ideo recently published 19 Things We're Dying to Redesign, covering a wide range of products, services, and systems, both big and small.  It's very thought-provoking, but only one of them addressed a health care topic (oddly enough, incontinence).  If there is an area of our lives that badly needs redesign, it would be health care.

And not redesigning it sometimes literally results in us dying.

Let's start with a clean slate.  I'm not as ambitious as Ideo, in terms of the breadth or number of topics, but here are 11 things about health care that I'm dying to redesign:

  1. Assure affordability: We don't expect that everyone can buy a Mercedes, or even a car, but we do have federal programs that try to ensure poor people can get food (SNAP) and housing (Section 8 and other programs).  When it comes to health care, though, we're wildly uneven, both in terms of who gets help and what that help looks like.  That is not the mark of a civilized country.
  2. Share high costs broadly:  It is well known that a small percentage of the population accounts for a majority of health care spending.  Unfortunately, while some of those people fall under broad social programs like Medicare or the VA, most do not.  They may be covered by an employer plan or a small health plan, and their cost can be catastrophic to that plan and the other people covered by it.  The highest cost patients should be a broadly spread social burden.
  3. Count: Sadly, we don't know the actual effectiveness of many things we do in health care.  Even for the ones we do, we know many don't work for many, if not most, of the people they're done to, although we don't know which people or why.  We know there is more unnecessary care and there are more medical mistakes than there should be, although, again, not which care, done to which people by which providers.  Even counting mistakes is frowned upon, due to malpractice fears.  Health care is not voodoo, and $3 trillion is too much to pay for art.  It should be more of a science, and that requires not just better data but better use of it.
  4. Health, not medical: We don't have a health care system.  We have a medical care system, and it shows.  We need to treat health habits and social determinants of health at least as seriously as we treat medical care, if not more so.
  5. Recognize who "we" are: We talk about "our" health, but we are becoming increasingly aware that our health is heavily dependent on the health of our microbiome.  We don't fully understand how it impacts us, but we know there are more of "them" than of "us," and treatments that affect our microbiome affect us.  We need it to be healthy for us to be healthy.
  6. Reinvent health insurance: Insurance is supposed to protect us against unexpected and catastrophic expenses.  Somehow it now also is used to encourage preventive care, pay for budgetable expenses, subsidize lower income members, negotiate payment rates with providers, dictate our choice of providers, and try to manage our health.  Plus, we've decided to treat dental, vision, and most long term care separately, not to mention health portions of auto insurance or workers compensation.  None of this makes sense; time to start over.  
  7. Get rid of the mystique: Medical care is complicated.  It has lots of codes, lots of jargon, and uses highly trained professionals to dispense it.  This is not a system designed for us to understand, and so we put our health in the hands of the people who have helped it be so complicated.  We defer to the degree and the white coat.  We each should be the best expert about our own health, and others in the health care system should help us achieve that.  
  8. Encourage responsibility: Few of us are maximizing our health.  We don't eat right, we don't exercise enough, we're under too much stress, we weigh too much, we drive too distracted.  Then we delegate taking care of the consequences of our behavior to our various health care professionals.  We're not responsible for everything that happens to our health, but we should take responsibility for much more of it than we do now.
  9. Move health back home:  Too much of our care happens outside the home -- in a doctor's office or outpatient facility, or, if we're more unlucky, in a hospital or long term care facility.  Indeed, hospitals now own increasing portions of the delivery system.  This is backwards.  We want to be at home, with our families and living our lives.  We should recognize that as the locus of our health, deliver more care there, and see a stay in a facility not only as last resort but as a failure. 
  10. Remember whose health it is: The health care system is not oriented around us.  It is designed around health care providers, and oriented around their views of and interactions with us.  As a result, our data is siloed, incomplete, and often incomprehensible to us.  No one who has had to wait hours for an appointment or a procedure can think it is about them.  It should be about us.  
  11. Better, not just more: Health care is like the defense industry, where technological advances get progressively more expensive without necessarily having the corresponding effectiveness.  New drugs add minor health improvements but cost tens of thousands more.  Technology should drive costs down and productivity up, and make our experience better.  Where is the iPhone of health care -- delighting consumers at a price they are more than willing to pay? 
These are not little asks.  These would not be small changes.  They are, indeed, suggesting that we basically rethink everything about health care.  They may not be possible.  

And yet.

We're spending almost 20% of GDP, for mediocre results, and with neither patients nor health care professionals happy about the system we've built, or, at least, allowed to develop.  It's only going to get worse, unless we drastically change the course we are on.  

This is not a time for tweaks.  

Maybe your dying-to-redesign list for health care would include things like better hospital gowns or slicker apps, but I'd prefer to think bigger.  


Tuesday, August 8, 2017

Make My Genes Better...Than Theirs

The age of gene editing is upon us.  Specifically, the use of CRISPR.   Amazing things are happening, proving again how clever humans are.

Whether we're smart remains to be seen.  


For those who are unfamiliar with it, CRISPR -- more accurately, CRISPR-Cas9 -- is a new technique for gene editing.  It has allowed faster, more precise, and less expensive gene editing.  It can already do more than you may realize.

CRISPR has been much in the news lately, due to a new study published in Nature.  Researchers successfully corrected a DNA mutation that causes a common heart disease that is sometimes fatal, especially for young athletes.  In what is believed to be a first, the researchers repaired viable embryos.  Moreover, they repaired most (72%) of the embryos, which is much better than previous efforts.

The mutation would have been fixed not only for the individual but also for their descendants.

The results produced some surprises.  For example, of the 42 embryos where the mutation was corrected, 41 of them corrected it using the mother's (correct) version of the gene, rather than the inserted template DNA.  That was not expected, and points to the unknowns involved in the process.

Development of embryos after co-injection of a gene-correcting enzyme 
Photo: Oregon Health & Science University 
There are several important caveats.  These were embryos in a lab, not a womb.  They were not then implanted to verify that they were viable.  There were only a few dozen embryos, each with just a handful of cells rather than the trillions that even an infant has.  And, perhaps most importantly, the researchers focused on a single mutation.

As Dr. Izpisua Belmonte, one of the researchers, told The Washington Post, "I don’t want to be negative with our own discoveries, but it is important to inform the public of what this means.  In my opinion the percentage of people that would benefit from this at the current way the world is rather small."

Most diseases, physical characteristics, talents and traits are genetically complex, involving numerous genes (height, for example, may involve some 93,000 genetic variants!), and so are not easily amenable to this approach.  Still, some 10,000 medical conditions are believed to be caused by specific mutations, including Huntington's disease, cystic fibrosis, and sickle cell anemia.

Robin Lovell-Badge, a professor of genetics at London's Francis Crick Institute, told The New York Times, "You could certainly help families who have been blighted by a horrible genetic disease.  You could quite imagine that in the future the demand would increase. Maybe it will still be small, but for those individuals it will be very important.”

The research team has already said that the BRCA gene mutation, associated with breast cancer, may be one of their next targets.

Despite this, some articles heralded the study as the prelude to "designer babies," while more balanced articles explained why that was not true -- yet.  Alta Charo, a professor of law and bioethics at the University of Wisconsin, told The Atlantic:  "This has been widely reported as the dawn of the era of the designer baby, making it probably the fifth or sixth time people have reported that dawn.  And it's not."

The researchers and many others are keenly aware of the ethical issues involved.  The Executive Director of the Center for Genetics and Society warned about people being willing to pay for genetic upgrades:
Once those commercial dynamics kick in, we could all too easily find ourselves in a world where some people’s children are considered biologically superior to the rest of us. We need to ask ourselves whether we want that new kind of excuse for extreme social disparities we already tolerate.
In many countries, there are restrictions on this kind of research.  15 out of 22 Western European countries, as well as Canada, ban attempts to change the human germline.  In the U.S., federal funding is not allowed, nor are clinical trials, but other research is permitted, under some voluntary guidelines.

Shoukhrat Mitalipov, one of the lead researchers in the recent study, hopes there will soon be some consensus on how to regulate the field, as otherwise, "this technology will be shifted to unregulated areas, which shouldn’t be happening."  There is an international summit on gene editing scheduled for early 2018 in China.

China is already a leader in the field.

The technique has made tremendous progress in just a few years, and it is going to make even more in the future.  However, we're going to find that directed changes to our genome are even harder than we suspect.

But they are going to happen.  We can use technology for good or for bad, but history has taught us that, once developed, we will use it, one way or another.  CRISPR and whatever comes next in gene manipulation will be no exception.

There are several ways this could go:

  1. We walk before we run: We could first focus on single-mutation diseases, figuring out how to prevent them from occurring, possibly even reversing them, and only then carefully move on to more complex conditions.
  2. We widen inequality: We already know that more money means better health and a longer life.  Genetic engineering could drastically further widen all of these inequalities -- permanently. Remember Brave New World?  
  3. We weaponize it:  We know China is working on gene editing.  Who knows what North Korea or Iran is doing.  If a country could create smarter, stronger, more creative citizens, they'd have an economic or even military advantage.  If used on other populations, it could make chemical or biological weapons seem trivial.  It has already been called a potential weapon of mass destruction.
  4. We goof: We are just beginning to really understand the complex interplay of genes, their environment, and other factors, not to mention the impact that our microbiome -- whose genes vastly outnumber our own -- has on us.  We're going to have more surprises, and they may not all be good.  As Kevin Esvelt, a prominent researcher in the field, has admitted: "My greatest fear is that something terrible will happen before something wonderful happens."
The Pandora's Box of gene editing is now open.  We have high hopes for it, but we should remember that, in the original myth, hope was the last thing in the box, not the first.

Let's hope we're smart about it.