What a 'backdoor' to iPhones could mean for health care

Joe Infantino, Senior Staff Writer

The FBI has found a way to access a heavily encrypted iPhone, representing a short-term win for the bureau.

But in the longer term, experts say the bureau's access method, along with proposed policies requiring backdoors in encrypted devices, may create new vulnerabilities that hackers could exploit to access the health data stored on millions of users' devices.


For months, the FBI tried every technique in its arsenal to access a single iPhone 5c: the phone used by Syed Rizwan Farook, who, along with his wife, Tashfeen Malik, killed 14 people at the Department of Public Health in San Bernardino, California.

Eventually, the FBI contacted Apple, seeking help in bypassing encryption on Rizwan Farook's phone. Apple refused.

In doing so, Apple CEO Tim Cook said the FBI had asked for "something we consider too dangerous to create." Cook said a unique operating system built to bypass the phone's security features could, "in the wrong hands," threaten the privacy and security of Apple's customers.

"This software—which does not exist today—would have the potential to unlock any iPhone in someone's physical possession," Cook said, adding that it would be similar to "a master key, capable of opening hundreds of millions of locks."

The controversy appeared on track to be resolved in court. But the FBI last week said it gained access to the iPhone without Apple's help, prompting the bureau to drop the Apple case.

The FBI so far has declined to publicly disclose how it unlocked the iPhone, and it seems poised to test the unknown method on other iPhone models. This "ability to now unlock an iPhone through an alternative method raises new uncertainties, including questions about the strength of security in Apple devices," Katie Benner and Eric Lichtblau write in the New York Times.

Threats to mobile health data

These new uncertainties surrounding the privacy and security of iPhones have caught the attention of health data experts, who note that iPhones have become troves of health data in recent years.

In addition to Apple's flagship Health app, its ResearchKit framework, and its new CareKit platform, there are third-party apps, such as Fitbit and MyFitnessPal, that also record information about users' health.


"There are lots of things that we store on our phone that are personal and that we expect to be private and secure," David Harlow, a health care attorney and author of the HealthBlawg, told me.

Data stored on smartphones may already be vulnerable because of software flaws. A study published in March found that a majority of 211 diabetes apps leaked data, including information beyond the condition itself. And "this is hardly limited to diabetes apps," Eric Boodman writes for STAT News.

But a technique that could unlock any iPhone, such as the one apparently used by the FBI in the San Bernardino case, would represent a new vulnerability that hackers could potentially exploit.

And some lawmakers are proposing requiring all technology vendors—not just Apple—to deliberately create such "backdoors" in all encrypted devices.

The backdoor debate

The concept of requiring backdoors in encrypted devices has been championed by some lawmakers for nearly a decade, and has gained renewed interest in light of the FBI vs. Apple case. 

Members of the House Homeland Security Committee in late February proposed legislation that would establish a commission to recommend a new encryption policy for the government. Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.), the top-ranking Republican and Democrat on the Senate Intelligence Committee, also have signaled an interest in requiring backdoors.

Supporters of backdoors, including FBI Director James Comey, say that law enforcement needs a way to bypass encryption in case of a national security threat.

However, technology and data experts have raised concerns. "Backdoors make it possible to virtually track your every movement and to know your every thought," Patrick G. Eddington, a homeland security and civil liberties policy analyst at the Cato Institute, told me. By creating a backdoor, "You're making things that much easier for any malicious actor" who wishes to hack into a device, Eddington said.


In the health care space, a backdoor would create another potential route for hackers to access exploitable health information.

"It's possible ... to use this information to impersonate someone who has really good health insurance in order to get some big-ticket medical procedure," Harlow said. "And that's disastrous for the person who's been hacked because we want those records to be accurate."

Once a phone is hacked, "You can't put the genie back in the bottle; the information is out," Harlow said. "That's why the first line of defense"—in this case, preventing the use of a backdoor—"is so important here."

Implications for health privacy laws

Another unanswered question relates to liability. If a technology company builds a backdoor into its devices to help law enforcement, and if hackers later figure out how to use that backdoor illicitly, who bears liability for the resulting data loss?

If—and that's a big if, according to Eddington—Congress were to mandate a backdoor to all encrypted devices, "There's no question that every manufacturer of software would be concerned about" liability under HIPAA.

Harlow said the privacy law probably would not kick in every time an encrypted device is lost or stolen because "the FBI could argue that the backdoor is"—at least theoretically—"only available to law enforcement, and therefore loss of an encrypted device would not be a breach."

But if hackers used a backdoor for unauthorized access to a health care app, and if the app developer noticed, then HIPAA could come into play.

And Harlow said, "If you're talking about an app developer or promoter that is a covered entity or business associate under HIPAA, then there are very prescribed protocols that you go through."

That often means:

  • Breach notification requirements;
  • Exposure to fines; and
  • The obligation to fix the gap in security.

However, both Harlow and Eddington agreed that a legislative mandate for backdoors into encrypted devices is unlikely any time soon.

Neither Burr nor Feinstein has introduced specific legislation yet. And the Homeland Security Committee's bill appears to be viewed "more as a kind of a cooling mechanism" to temporarily abate stakeholders' concerns, according to Eddington.

But even if lawmakers never require manufacturers to create deliberate backdoors into their devices, hackers will likely keep finding new ways to access iPhones and their apps—so health care data will never be fully safe.

