As part of our National Cyber Security Awareness Month (NCSAM) activities, bTrade’s MFT Nation blog will post a series of what we call “case studies of what not-to-do” in the world of data security. We offer these data security case studies in hopes they can serve as a preventive of sorts. In our experience, few things are more fundamental to successful data security than learning from what others did poorly, and then taking steps to ensure that you don’t go down the same path. Through documents filed in public cases and proceedings, we have the unique ability to examine how others drew scrutiny from regulators, and hopefully to learn from their mistakes.
So let’s take a look at our first case study involving a company named LabMD. The material recited below comes from documents filed in court cases and related administrative law proceedings.
LabMD conducts clinical laboratory tests on urological specimens from consumers and reports test results to physicians. In the course of its business, LabMD collected and retained the most personal of all personally identifiable information (“PII”) for over 750,000 consumers, including Social Security numbers, bank account and credit/debit card account numbers, and “sensitive health information, such as health testing codes that can reveal the consumer was tested for sexually transmitted diseases.”
Under circumstances that remain hotly disputed by the parties, the Federal Trade Commission (FTC) learned of a possible breach involving patient information and began an investigation into LabMD’s data security practices. The investigation persisted for three years before the FTC filed an administrative complaint against the company. The administrative proceeding is nearing completion, as the parties have filed post-trial briefs and related documents.
Healthcare Clients Beware: You May Well Be Obligated to Ensure Your Data Security Practices Comply with Both HIPAA and the FTC Act
LabMD asked that the proceeding be dismissed, contending that HIPAA preempts the FTC Act. The Commission rejected this argument because HIPAA and the FTC Act do not conflict and are, in fact, “largely consistent.” MFT Nation readers in the healthcare industry should pay special attention here, especially those involved in compliance, because the Commission found that:
nothing in the FTC Act compels LabMD to engage in practices forbidden by HIPAA, or vice versa. It is not unusual for a party’s conduct to be governed by more than one statute at the same time, as “‘we live in an age of overlapping and concurrent regulatory jurisdiction.’” LabMD and other companies may well be obligated to ensure their data security practices comply with both HIPAA and the FTC Act. But so long as the requirements of those statutes do not conflict with one another, a party cannot plausibly assert that, because it complies with one of these laws, it is free to violate the other.
FTC Alleges the Existence of “Multiple and Systemic” Data Security Failures
In a post-trial brief exceeding 100 pages, the FTC claims that it introduced at trial a “comprehensive mountain of evidence” that proves “multiple and systemic” data security failures for a business “holding a vast amount of sensitive data.” We will spare you from a discourse on all the details contained in the 100+ pages. Instead, we want to share the following key areas where the FTC finds fault with LabMD’s data security practices:
- LabMD did not patch and update operating systems and other programs. For example, in 2006 some LabMD servers were running Windows NT 4.0, an out-of-date, unsupported version of the operating system.
- LabMD gave many, if not most, employees administrative rights over their computers, which allowed them to change security settings and download programs and files. One result was an unauthorized file-sharing application installed on the computer used by LabMD’s billing manager, through which certain PII was compromised. Further, LabMD kept files containing highly sensitive PII on employee desktop computers, and those desktops were connected to the internal network where PII was stored; this left the PII vulnerable, because an employee could inadvertently expose it to malicious software, unauthorized software, unauthorized individuals, or unauthorized changes. In addition, some employees could remotely access the network, including PII stored on it, from home computers, and LabMD imposed no security requirements on those home computers.
- LabMD did not provide its employees with tools or training to encrypt sensitive information in emails. As a result, between 2004 and at least October 2006, an IT employee transmitted sensitive consumer information from LabMD’s network to the private AOL email account of LabMD’s owner and CEO without encrypting the information.
- LabMD did not require employees to use unique, hard-to-guess passwords. For example, one employee used the password “labmd” from 2006 to 2013; passwords used by physician-clients’ offices often included the users’ initials; the username and password could be the same; and many users shared passwords. Worst of all, LabMD’s employees set up the FTP program so that anyone could log in anonymously, that is, without using any password at all.
- LabMD had no policy addressing encryption of the sensitive information it received and generated, and it did not implement file integrity monitoring. LabMD compounded the risk with policies directing a manager to perform daily backups of sensitive billing information about thousands of consumers onto a workstation with unfettered internet access, and directing other employees to store business documents on their computers.
- LabMD kept far more information than it needed, including the PII of more than 100,000 consumers for whom it never performed testing, and it failed to purge that PII regularly even though regular purging is standard practice among IT practitioners.
- LabMD did not properly configure its firewall to block IP addresses and unnecessary ports.
- LabMD did not deactivate the login access of former clients, and one employee’s credentials remained valid a year after she left LabMD.
- LabMD’s IT employees used low-quality products lacking full functionality; LabMD had no established IT budget; and its IT employees had no discretion to purchase IT equipment, applications, or training.
- As part of its investigation into how one of its internal documents appeared on a file-sharing network, LabMD removed the hard drive from the billing computer and allowed it to be destroyed by an outside security firm, which, according to the FTC, “contravenes the first rule of forensic research – work with a copy of the drive and keep the original safe.” I’m sure this didn’t sit well with FTC regulators, as it smacks of cover-up rather than cooperation.
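Several of the practices flagged above are easy to check programmatically. As a rough illustration only (not drawn from the case record, and with a made-up common-password list), a minimal policy check like the sketch below would have rejected credentials such as “labmd”:

```python
import re

# Illustrative common-password list; real deployments should use an
# established policy library and a breached-password dataset instead.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "labmd", "admin"}

def password_is_acceptable(password: str, username: str = "") -> bool:
    """Reject passwords that are short, common, or derived from the username."""
    if len(password) < 12:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    if username and username.lower() in password.lower():
        return False
    # Require at least three of four character classes.
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"]
    return sum(bool(re.search(c, password)) for c in classes) >= 3

print(password_is_acceptable("labmd"))                  # False: too short, too common
print(password_is_acceptable("Corr3ct-Horse-Battery"))  # True
```

A check like this addresses only one of the alleged failures, of course; it does nothing about shared passwords or anonymous FTP logins, which require configuration and policy changes rather than code.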
Except for destroying the hard drive from the billing computer, the data security flaws the FTC alleges against LabMD are all too common. The good news for LabMD is that all of them can easily be prevented. MFT Nation has written extensively about the inherent dangers of running out-of-date software, and our message to LabMD would be the same as we’ve delivered in the past: don’t be penny wise and pound foolish. LabMD would also benefit from investing in a managed file transfer software solution. For example, bTrade’s TDXchange contains easy-to-use features that would allow LabMD to authenticate users, restrict user access to information based on need, purge and archive data, assess risk associated with the movement of its data, encrypt information at rest and in transit, log access to information and system components, ensure system and information integrity, and protect network gateways.
Contact bTrade for Assistance
To learn more about the aforementioned case study, as well as the steps you can take to avoid going down the same path, send a confidential email to me or our other data security experts at email@example.com.
bTrade, the industry-leading compression and managed file transfer (MFT) provider, announced today that it once again scored well in an ISO 9001:2008 supplier evaluation, this time from its customer, Telered, S.A.
Telered is a Panamanian company that provides the electronic payments network for Panama. Telered’s shareholders are all the major Panamanian banks, which demand infrastructure supported by state-of-the-art technology, including bTrade solutions. Telered maintains connections with many entities, both private (local and international) and public, through a communications infrastructure that transmits information between those entities and the affiliated financial institutions in a secure and reliable manner.
Telered conducted the ISO 9001:2008 supplier audit for bTrade in June 2015. In this audit, bTrade was rated on key metrics such as product quality, expertise and experience of employees, timeliness and effectiveness of problem resolution, return on investment, and overall performance in the competitive market space. According to Yolany Leon, a Process and Quality Analyst at Telered, bTrade continues to score highly as a Telered supplier. Telered has been a bTrade customer for more than a decade.
“It is extremely gratifying to receive this recognition from our customer, Telered, S.A.,” said Steve Zapata, President and CEO of bTrade. “bTrade has always focused on continued quality, and this supplier rating reconfirms our mission to provide our customers with the highest quality products and services possible,” added Zapata.
For more information on bTrade’s solutions and services, please visit bTrade.com.
If So, Take Some Encryption and Call the Doctor in the Morning
Earlier this year, a business acquaintance received a data breach notification letter from Anthem, an insurer in the healthcare industry. The letter begins with the standard assurances that Anthem “works hard to protect your personal information.” But the mood quickly changes in the next sentence as Anthem advised: “Unfortunately, we were the victim of a cyber attack and your personal information may have been accessed.”
The same person had the misfortune of thereafter receiving a similar type of letter from another company in the healthcare industry, UCLA Health. The letter from UCLA Health conveyed the same type of canned message: assurances about “working hard” to protect personal information; followed by a claim of being a “victim” of hackers; and then the “unfortunate” result that personal information may have been compromised.
This person was concerned, and she asked me a lot of questions. How could so many healthcare companies become “victims” of hackers? Don’t these “victim” companies have data security protections in place to prevent this sort of thing? Have they ever heard of encryption? She asked me for answers, so I did some research into the state of data security in the healthcare field, and I want to share some of it with readers of bTrade’s MFT Nation.
Health Insurers Aren’t Required to Encrypt Data???
One of the first items I came across is this article from The Wall Street Journal online entitled “Health Insurer Anthem Didn’t Encrypt Data in Theft.” The article discusses data breaches involving Anthem and another health insurer, Humana, and suggests that the hackers were successful because the companies weren’t encrypting data. According to the authors, health insurers aren’t required by law to encrypt data:
Health insurers don’t always encrypt members’ data, and aren’t required by the federal Health Insurance Portability and Accountability Act to encrypt data.
Under HIPAA, doctors, hospitals, health plans and others must “address” encryption in their operations, but don’t have to scramble data if they determine doing so would impose an unreasonable burden, the likelihood of disclosure is low and they have implemented alternative security measures.
The authors are technically correct, but in practice, health insurers should consider encryption to be a required part of their data security practices.
The Healthcare Industry is Subject to Strict Data Security Laws
If you want to know the data security standards for the healthcare industry, you need to understand the Health Insurance Portability and Accountability Act, or “HIPAA” for short. HIPAA is a comprehensive federal law, and when I say “comprehensive,” I mean it’s got a lot of data security standards that, if followed, will help protect the confidentiality of personal information when it is stored, maintained or transmitted by healthcare entities such as Anthem and UCLA Health.
As the authors of the WSJ article stated, HIPAA does not explicitly require encryption. In fact, it doesn’t explicitly require the implementation of any specific security technology. However, this doesn’t mean what people sometimes think it means, and the misunderstanding can cause healthcare folks some real heartburn.
Encryption is Reasonable and Appropriate, Right?
In a section of HIPAA captioned “Technical Safeguards,” there is an “Implementation Specification” dealing with encryption. But the Implementation Specification for encryption is categorized as “addressable” rather than “required.” Why are some things “addressable” rather than required? Because there is not always a “one-size-fits-all” solution for data security issues. For example, a healthcare entity with only a local network and no electronic connectivity to any person or entity outside of the organization may not need to encrypt. Also, encryption is just one method of rendering electronic data unreadable to unauthorized persons.
It should be noted, though, that “addressable” does not mean “optional.” A healthcare entity must still determine whether an addressable Implementation Specification is a “reasonable and appropriate” data security measure to apply within its particular environment. The key phrase here is “reasonable and appropriate.” In other words, encryption is required if it’s reasonable and appropriate to do so.
Whether a particular security measure is “reasonable and appropriate” depends on a balancing of factors such as an entity’s size, complexity, capabilities, infrastructure, hardware and software, as well as the costs of the security measures and the probability and criticality of potential risks. Basically, a company doesn’t “have to” encrypt, but if it chooses not to, it had better be prepared to demonstrate clearly, in the event of an audit by HHS’s Office for Civil Rights, that its risk analysis is accurate.
In summary, health insurers are required to encrypt personal information whenever it is “reasonable and appropriate” to do so. In what situations would it be considered reasonable and appropriate to not use encryption? Except for the local network example mentioned above, I can conceive of very few, if any. Even if you think that such a situation exists in your organization, you had better be prepared to convince HHS’s Office for Civil Rights that you are right, and remember that this office generally considers encryption to be both necessary and appropriate. For these reasons, I believe that encryption is effectively required for most, if not all, health insurers.
Let us know whether you agree or disagree by posting a comment below, or by sending a confidential email to firstname.lastname@example.org.
Federal Appeals Court Says FTC has Authority to Regulate Data Security Practices
MFT Nation has written before about the FTC vs. Wyndham Worldwide Corp. case. The FTC sued Wyndham after three separate hacking incidents led to more than $10 million in fraud losses for customers. Wyndham allegedly did not use encryption, firewalls, and other commercially reasonable methods for protecting consumer data.
In our previous posts we did not address a corollary issue raised by the FTC vs. Wyndham case: whether companies that fail to provide customers with reasonable data security protections can be punished by federal regulators. Earlier this week, a federal appeals court confirmed that the FTC can punish companies for unreasonable data security practices. The appeals court ruling is interesting on several levels, but for purposes of this post, I want to share some of the humorous and biting ways the court dealt with some of Wyndham’s arguments.
Reductio Ad Absurdum Invites a Tart Retort
For example, Wyndham asserted that a business shouldn’t be held liable if “the business itself is victimized by criminals.” Basically, Wyndham tried to deflect attention from its alleged wrongdoing by pointing the finger at another wrongdoer, the hackers. Wyndham suggested that if it can be held liable in such circumstances, then the FTC could conceivably sue supermarkets that are “sloppy about sweeping up banana peels.” I kid you not; Wyndham actually proffered this argument.
The appeals court had fun with it, though, describing the argument as reductio ad absurdum. For non-lawyers and those who don’t speak Latin, the court is telling Wyndham that its argument is absurd. And the court used the banana peel analogy to explain the absurdity of Wyndham’s argument: “It invites the tart retort that, were Wyndham a supermarket, leaving so many banana peels all over the place that 619,000 customers fall hardly suggests it should be immune from liability.” Generally speaking, you’re not doing well when the court feels your argument “invites a tart retort.”
In Some Instances, the Third Time Isn’t a Charm
Wyndham also argued that it was not given adequate notice of “what specific cybersecurity practices are necessary to avoid liability.” To this argument, the court quickly offered another tart retort: “We have little trouble rejecting this claim.” The court explained:
As the FTC points out in its brief, the complaint does not allege that Wyndham used weak firewalls, IP address restrictions, encryption software, and passwords. Rather, it alleges that Wyndham failed to use any firewall at critical network points, did not restrict specific IP addresses at all, did not use any encryption for certain customer files, and did not require some users to change their default or factory-setting passwords at all.
In other words, Wyndham’s practices would fail under any cybersecurity standard imposed by the FTC. And the court went on to say that Wyndham’s case was made “even weaker given it was hacked not one or two, but three times. At least after the second attack, it should have been painfully clear to Wyndham that a court could find its conduct failed.”
Beaten but Not Broken
What did Wyndham say about the ruling? It remained defiant, saying that ultimately “the facts will show the FTC’s allegations are unfounded.” But in a more conciliatory tone, Wyndham did acknowledge that “safeguarding personal information remains a top priority for our company, and with the dramatic increase in the number and severity of cyberattacks on both public and private institutions, we believe consumers will be best served by the government and businesses working together collaboratively rather than as adversaries.”
Stay tuned to MFT Nation for further developments in the FTC vs. Wyndham case, as well as others affecting the world of data security.
You may know that some people, referred to euphemistically as “dumpster divers,” search through garbage for items they find useful/valuable. But did you know that some dumpster divers are much more sophisticated and focused on harvesting data stored on discarded hard drives, full workstations, old mainframe tapes, and now cell phones? With everyone seemingly worried about hackers on the wild internet, we forget about this other security risk. And it is real. These sophisticated dumpster divers can recover data from such discarded items.
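Deleting a file rarely removes its bytes from the disk, which is exactly what these data-harvesting dumpster divers rely on. As a minimal illustrative sketch (not a substitute for proper media sanitization), the function below overwrites a file with random bytes before unlinking it:

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with random bytes, then unlink it.

    Illustrative only: on journaling filesystems and SSDs, in-place
    overwriting is NOT a reliable wipe. Full-disk encryption from day
    one, or physical destruction of retired media, is the standard
    answer for drives that held sensitive data.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents with random data
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the device
    os.remove(path)
```

Even a best-effort wipe like this does nothing for drives that are simply pulled from a workstation and tossed in a dumpster, which is why sanitization policies for retired hardware matter as much as network defenses.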
In a previous post about the long running Oracle vs. Google case, we noted that “Oracle has a federal appeals court precedent on its side, which it can use as a sword against all other Java API users who Oracle believes may be violating its copyrights. So, unless and until another court decides otherwise, the IT community should be aware that Oracle may be on the hunt for violators.”
I’m pleased to report that bTrade has been acknowledged by Info-Tech Research Group as an “experienced and recognized leader in the MFT industry.” The acknowledgement follows a period of evaluation by Info-Tech of bTrade and its MFT products, and is memorialized in a vendor landscape report appropriately titled, “Select and Implement a Managed File Transfer Solution.”
The United States Supreme Court (aka, the “Supremes”) left today for summer vacation, and on the way out the door they posted a list of cases they will hear in the fall term. We thought MFT Nation readers would want to know that the Oracle v. Google case was not on the list. So what does this development mean?