But, if you're like most of us, you don't really know how or why they work.
Maybe that's OK. Most of us don't know how our cars work either, couldn't explain how heavier-than-air flight is possible, have no idea what the periodic table means to our daily lives, and would be in trouble if our lives depended on us making, say, bricks or glass.
Still, though, as Captain Kirk once said (in a very different context), you have to know why things work.
Welcome to computational thinking.
The concept was introduced by Jeannette Wing in a seminal paper in 2006. She suggested that it was a fundamental skill that should be considered akin to reading, writing, and arithmetic -- learning how to solve problems by "reformulating a seemingly difficult problem into one we know how to solve, perhaps by reduction, embedding, transformation, or simulation."
She further clarified that it has the following characteristics:
- Conceptualizing, not programming
- Fundamental, not rote skill
- A way that humans, not computers, think
- Complements and combines mathematical and engineering thinking
- Ideas, not artifacts
- For everyone, everywhere.
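To make the idea of "reduction" concrete, here is a minimal sketch in Python (my own toy illustration, not an example from Wing's paper): picking the largest set of non-overlapping appointments sounds hard if you try every combination, but it reduces to sorting, a problem we already know how to solve, followed by a single greedy pass.

```python
# Toy illustration of reduction: a seemingly hard scheduling question
# becomes "sort, then make one greedy pass."

def max_non_overlapping(appointments):
    """appointments: list of (start, end) tuples; returns a compatible subset."""
    chosen = []
    last_end = float("-inf")
    # Reduction step: sort by end time, a problem we already know how to solve.
    for start, end in sorted(appointments, key=lambda a: a[1]):
        if start >= last_end:  # doesn't overlap anything already chosen
            chosen.append((start, end))
            last_end = end
    return chosen

print(max_non_overlapping([(9, 11), (10, 12), (11, 13)]))
# -> [(9, 11), (11, 13)]
```

That is the kind of reformulation Wing is describing: not writing code for its own sake, but recognizing that an unfamiliar problem has the shape of a familiar one.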
Dr. Wing believes we've come a long way since her manifesto, and she may be right. For example, Microsoft sponsors Carnegie Mellon's Center for Computational Thinking, and Google offers Exploring Computational Thinking, "a curated collection of lesson plans, videos, and other resources on computational thinking (CT)." It includes an online course for educators.
A new initiative, Ignite My Future, wants to train 20,000 teachers to help make computational thinking a fundamental skill, hoping to engage a million students over the next five years. One of the last initiatives President Obama announced was the Computer Science for All Initiative, providing $4 billion to improve K-12 computer science education (how it survives the new Administration remains to be seen).
A recent New York Times article notes that, while the number of computer science majors has doubled since 2011, non-CS majors are increasingly eager to learn about computer science as well: "Between 2005 and 2015, enrollment of non-majors in introductory, mid- and upper-level computer science courses grew by 177 percent, 251 percent and 143 percent, respectively."
There is now an Advanced Placement course, Computer Science Principles, that "introduces students to the foundational concepts of computer science and challenges them to explore how computing and technology can impact the world."
The Times also profiled a number of ways that "non-techies" can learn elements of computational thinking, because "Code, it seems, is the lingua franca of the modern economy." The options include CS+X initiatives in college, intensive "boot camps," and a growing array of online courses from providers such as Coursera, edX, and Udacity.
Sebastian Thrun, co-founder and chairman of Udacity, argues that this kind of thinking is important for everyone: "It’s a people skill, getting your brain inside the computer, to think like it does and understand that it’s just a little device that can only do what it’s told."
Still, computational thinking is not a panacea; as Shriram Krishnamurthi, a computer science professor at Brown, warned The Times, in our current culture, "we are just overly intoxicated with computer science."
One of my favorite approaches to demystifying programming, Raspberry Pi, has sold some 12.5 million of its ultra-cheap, ultra-simple computers, making it the third best-selling "general purpose computer" ever.
Do a search for Raspberry Pi and you'll find thousands of examples of things people are doing with them, from simple tasks that children can do to sophisticated hacks. Heck, someone has made a LEGO Macintosh Classic using a Raspberry Pi.
There's a new industry in toys that help teach children to code, as The New York Times also reported, including Cubetto (which, at $225, is a lot more expensive than a Raspberry Pi).
All of which is to say, there are fewer and fewer reasons not to learn computational thinking.
And health care sure could use some more of it.
Health care likes to think of itself as a science, and it has many trappings of science, but even in the 21st century it remains much more of an art. After all, this is the industry in which it was just reported that 20% of patients with serious conditions have been misdiagnosed -- in fact, most people are likely to experience at least one diagnostic error in their lifetime -- and in which we have an "epidemic" of unnecessary care.
It is an industry in which the technology often frustrates both the providers and the patients (e.g., EHRs and mammograms, respectively), and where design is confusing at best and harmful to patients at worst. It is an industry in which the coding has gone beyond arcane to incomprehensible.
And it is an industry where there is surprisingly little data on efficacy, even less agreement about how to measure quality or value, and little training to help clinicians interpret or explain the data that does exist. It is an industry that is bracing for its era of Big Data, and may not be at all ready.
So, yes, some computational thinking in health care certainly seems like it would be in order.