Inside the Bubble – Insecurity
If you reveal your secrets to the wind, you should not blame the wind for revealing them to the trees.
-- Kahlil Gibran
The topic at this year’s Patient Engagement Advisory Committee (PEAC) meeting was cybersecurity. This has become my favorite domestic regulatory meeting, and I was especially looking forward to this one, because I thought the topic might bring out more experts and fewer wide-eyeds. I found that, when it comes to cybersecurity, these are not mutually exclusive groups, but the cybersecurity wide-eyeds tend to run a bit deeper (geekier?) than some others.
A CDRH presentation described patients as an “asset.” Merriam-Webster defines an asset as “an item of value owned.” The presenter defined assets as “things that we care about” and “things that are worth protecting,” apparently “things” being the operative word here. The other things cited as assets were medical devices and patient data. I wasn’t especially surprised by this perspective, but I was a little surprised to hear it said out loud.
The same presenter advised the Committee that “Providing probabilistic estimates of a threat exploiting a vulnerability is unknowable.” This struck me as more of a position firmly taken than a conclusion thoughtfully arrived at. It was a departure from the material in the slides, and the syntax made no sense. At first I thought he meant to say (or that someone else meant for him to say) that the probability that a threat will exploit a vulnerability is unknowable. But that didn’t seem to mesh with whatever he was trying to say about providing probabilistic estimates, which was…I think…that, given the unknowable, you couldn’t provide them. But isn’t this what estimates are for? If you don’t know the probability that something will happen, you estimate it. If you do know the probability, then you don’t need to estimate it, right?
I subscribe to the long-standing and variously attributed wisdom that “it’s not what you don’t know that gets you into trouble, but what you think you know that you don’t know.” My impression was that CDRH thinks it knows something about the probabilities associated with other types of risks, and, further, that it thinks this knowledge can’t be appropriately applied to cybersecurity risks, because of that whole unknowable thing. I wondered if the problem might lie more with what CDRH thinks it knows about other types of risks, than with what it thinks it doesn’t know about cybersecurity risks. Or maybe CDRH is painfully aware that it doesn’t know much about other types of risks, either, but would find it rather awkward to say so after all these years. Maybe, with cybersecurity risks, it’s trying to curb expectations from the get-go.
One of the Committee members pressed CDRH on patients’ access to their own data, noting that one of CDRH’s slides showed a lot of players circled around the patient, communicating and coordinating, but the patient had been left out of the loop. The CDRH response agreed that patients should have access, but this access seemed limited to data as it pertains to cybersecurity breaches, not to all the data their device might have to offer, which is what the patients seemed to want.
There was only one presentation by a medical device company. It was a nice presentation about the company’s response to reports of cybersecurity vulnerabilities involving its devices. It seemed reminiscent of the CDRH slide, with a lot of collaborating, but no reference to patients being engaged in that process.
AdvaMed offered a list of cybersecurity principles it had adopted. I thought they were okay principles, but somehow I had been expecting something more substantive from the industry’s largest trade association. One of the Committee members asked what light these principles might shed on how AdvaMed would recommend that companies include patients in decision making and frame their communications to patients. The answer was it would recommend following FDA’s guidance. So I guess its members really got their dues’ worth there.
Most of the public speakers were either cyber MDs or cyber patients, which is to say, they knew far more about cybersecurity than most MDs and patients. Several had hacked their own devices. They seemed to be pretty uniform in their priorities, which were more transparency, more communication, and more control. In one patient’s words, “my device, my body, my life, my choice.”
A cyber MD currently associated with a university medical center gave a nicely thought-out presentation on informed consent and cybersecurity. I paid close attention, and saw no indication that he viewed this as an academic exercise. He really seemed to think that informed consent was part of medical practice. This puzzled me until he made a reference to doctors with “an informed consent document that they're going to do two dozen times that day in the OR.” Ah. One of the Committee members picked up on it too and asked if this wasn’t a little late to be seeking a patient’s consent. I wondered if the Committee member was thinking what I was thinking, which was that, call it what you will, a form that a patient signs in the OR is not a consent form. It’s a release form.
Based on the public presentations and the roundtable discussions, the non-cyber patients struck me as more pragmatic than the experts. Perhaps that’s because for patients this is only a matter of life and death, while for other players it’s a matter of financial, legal, political, and professional liability. These patients wanted their doctors to be their first point of contact in the event of any issue with their device, but they did not expect their doctors to be cybersecurity experts, nor did most of them want to become cybersecurity experts themselves. They simply wanted timely notification about any cybersecurity threat, practical information about what options were open to them in the event of a threat, and to be advised of sources for further information. Most were not amenable to the notion that anyone should sit on the information until everything got reviewed, analyzed, figured out, and maybe also resolved. (In other words, they weren’t interested in being kept in the dark until everyone else involved had had ample opportunity to C their A’s.)
Most thought manufacturers should be the best sources of information about their devices, and about the actions patients might take in the event that a device they manufactured was breached. They thought doctors should be the best sources of information regarding the clinical ramifications of these options for individual patients. They did not think that either manufacturers or doctors were where they needed to be, but generally seemed to accept that this was still a brave new world for everybody.
Where there was dissatisfaction, it seemed to be directed primarily at healthcare providers, and secondly at manufacturers, which I thought was appropriate. The dissatisfaction with healthcare providers seemed a bit more intense, probably because patients expect their healthcare providers to act in their patients’ best interests, but they had no such expectations of manufacturers. From manufacturers, they seemed to want competence. Don’t we all. No one seemed to be unhappy with CDRH, which I found refreshing. Perhaps this was because, unlike almost everyone else involved, the patients themselves didn’t seem to have a lot of expectations of CDRH, which I thought was also appropriate.
A number of the presenters were affiliated with one organized group of hackers or another. This took me back to my grassroots days, when I learned how easily such groups are taken over and manipulated to address agendas other than the ones they think they are pursuing. Who might want to bend a hacker group to their own agendas, and what those agendas might be…quite the intriguing question, no?
There was an undertone of concern emanating from CDRH and other players related to cyberattacks, national security, and nation-states, which not many patients seemed to share. My takeaway on this topic was that, inside the bubble, some “assets” are more worthy of protection than others. Perhaps rightly so, but maybe not nearly as much as they might like to think.