
Business Officer Magazine

Foiling Data Thieves

When technology managers at MIT reviewed data storage risks, they found sensitive personal information in unexpected places. A new process omits unnecessary details from records, thwarting identity theft.

By Allison F. Dolan and Deborah Fisher

Developments on the legislative front from 2007 through 2008, combined with high-profile data-loss incidents, made data protection a front-burner concern for institutions in Massachusetts, including the Massachusetts Institute of Technology (MIT), Cambridge. A breach of security at TJX, a major retailer based in the Northeast, compromised millions of customer records. That incident prompted Massachusetts to pass its version of a data-breach law. Although the 39th state to do so, it was only the 5th state to include paper documents as well as electronic records within the scope of the law. In 2008, Massachusetts took the matter a significant step forward by promulgating regulations, generally recognized as the nation's most prescriptive, that require anyone holding specific personal information about state residents to implement a written information security program.

In parallel with these legislative efforts, members of MIT's audit committee began asking business management about MIT's risk profile in handling Social Security numbers (SSN), which were known to have been used extensively, at least historically, within the enterprise. These questions coincided with corroborating findings from past audit division reviews, as well as a few relatively minor data incidents.

Know Your Data Protection Regulations

FERPA—Family Educational Rights and Privacy Act (federal law, enacted in 1974). Provides students with a right of access to education records and a right of privacy in those records. New regulations released in December 2008 included recommendations for safeguarding education records (73 Fed. Reg. 74806, 74843-44, Dec. 9, 2008).

Federal Trade Commission “Red Flags Rule” (regulations issued as part of the Fair and Accurate Credit Transactions Act of 2003, and in force for banks and financial institutions since 2008). Intended to cover any “creditor” (i.e., an entity that allows for deferred payment of services). Most higher education institutions are covered because of student loans. Regulations include data protection.

HIPAA—Health Insurance Portability and Accountability Act (federal law, enacted in 1996). Defines covered entities, which include health-care providers, health plans, and health-care clearinghouses. Protected health information includes any information held by a covered entity that concerns an individual's health status, provision of health care, or payment for health care.

HITECH Act—Health Information Technology for Economic & Clinical Health Act (federal act signed into law in 2009 as Title XIII of the American Recovery & Reinvestment Act). Makes many changes to HIPAA Privacy and Security rules; extends HIPAA's obligations to “business associates,” requires the U.S. Department of Health and Human Services to promulgate data protection standards (effective February 2010), increases financial penalties, adds Federal Trade Commission oversight of non-HIPAA entities (e.g., Google Health Records service offering), allows state attorneys general to bring civil actions. Also introduces first federal breach notification requirement.

Massachusetts Data Breach Law (M.G.L., c.93H & 93I) (enacted in 2007). Requires consumer notification if certain personal information (e.g., SSN, credit card number) was lost, stolen, or at risk of unauthorized access. Unlike most other states, Massachusetts law includes paper as well as electronic files. The law also required promulgation of data-protection regulations.

Massachusetts Data Protection Regulations (201 CMR 17) (effective March 1, 2010). Require any person with defined personal information to develop, implement, maintain, and monitor a comprehensive written information program, which defines administrative, physical, and technical safeguards. Such safeguards include requirements for contractual language for third parties with access to personal information, encryption of personal information on laptops, and requirements for managing access control.

PCI-DSS—Payment Card Industry Data Security Standard. Worldwide industry standard for technical and operational requirements to help prevent fraud and misuse of credit card information. Compliance with rules required to be a “merchant” and accept credit cards.

In December 2007, MIT's institute auditor and the vice president for information services and technology (IS&T) agreed to sponsor a project to determine MIT's risks associated with handling SSN. The program was initially staffed with a manager, who had more than 10 years of MIT experience in human resources, administration, and information technology, and an internal auditor with an IT background, who provided data analysis and built the database used to record the findings.

We had three significant advantages as we began this work. First, although MIT did not have a chief privacy officer or chief security officer, we did have a culture oriented toward privacy and security of information, as evidenced by MIT Policies and Procedures, which state that “the privacy of individuals must be protected, regardless of the form or the location in which the information about them is stored, including computer media.”

Second, and most significantly from the perspective of scoping the problem, MIT had not been using SSN as a personal identifier for more than a decade. In the mid-1990s, farsighted members of the IT department had introduced the concept of an MIT identification number to be used in lieu of SSN for internal processing. This early work meant that our primary systems, such as human resources, were not using SSN as a key field, and data about students or employees could be shared between systems or departments without any mention of SSN. 

Third, we had already centralized the systems for finance (including procurement), human resources, student services, and reporting. This allowed us to put together a profile of where SSN were stored, how they were used, and who had access to the data.

At the same time, MIT departments that accepted credit cards were working toward compliance with the Payment Card Industry Data Security Standard. In addition to these data-related issues, we needed to respond to the Federal Trade Commission requirements, known as the Red Flags Rule, for documenting an identity theft prevention program. Finally, all of this new activity was occurring in the context of the existing FERPA regulations related to student information and HIPAA regulations related to medical information (see sidebar, “Know Your Data Protection Regulations”). A new era for data management had arrived.

Capitalizing on Our Culture of Collaboration

Consistent with current best practices for data-related projects, we approached this initiative primarily as a business issue, not an IT problem. We engaged with administrators to understand where and how SSN was currently being used, and from that, we determined the data protection issues and made recommendations for changes. Although the focus was on SSN, we didn't ignore the other data elements that Massachusetts defined as protected personal information—driver's license or state-issued ID, as well as financial account numbers, including credit card numbers. In cases where someone identified one of the other elements, we would note that in our inventory.

As the project developed and the requirements of the various laws and regulations became clearer, we established regular meetings that included the project team, a representative of the office of general counsel, and the manager of IT security services. Other individuals were invited to the meetings as their expertise was needed.

We recognized early in this work the importance of leveraging the grassroots culture of MIT to communicate and campaign for protection of data. We held numerous meetings with departmental administrative, financial, and human resources staff. The meetings included presentation of the regulatory requirements, as well as sharing stories of how data loss occurs and steps to take to protect such information. We described the challenges in terms of a risk framework based on the data life cycle: collection, use, storage, and destruction (see Figure 1, “Risk Management Framework”). We then invited them to let us know where they saw SSN and other sensitive information. The formal presentation was designed to run about 20 minutes, but the sessions typically lasted an hour as colleagues compared notes on when and where they saw the different kinds of information.

The high-touch approach has proven to be vital to the success of the initiative. Although attendees had been exposed to the information as a result of e-mail and hard-copy mailings, those conventional approaches had minimal impact. However, in the meetings, there was typically lively discussion that most could relate to on a personal level. After all, who has not heard about, if not experienced, identity theft?

And MIT, like many peer institutions, has distributed data flows. As a consequence, individual departments often are the collection point for personal information, but once submitted to a central unit, the data are no longer needed locally. Many presentations led to departmental managers making decisions to change local data handling or storage practices.

We found that the message to protect this information was not a hard sell. In fact, many people were already concerned and were looking for direction on how to protect sensitive data. For instance, staff in undergraduate student services (admissions, financial services, and registrar) had already mapped out a number of steps to reduce the exposure of student information, including eliminating SSN from the admissions review package, keeping paper files under lock and key, and implementing a new, shorter retention period for the records of nonadmitted applicants.

There were also a number of situations where SSN had been part of a process in the past, but was no longer required. We were generally able to define the time period of risk and encourage people to review their paper and electronic files prior to that date. In some cases, our presentations were the first time individuals recalled hearing about the change in procedure, and they were able to incorporate the new information into their local practices.

In addition to these departmental meetings, we periodically offered sessions open to the community. Staff, faculty, and students all took advantage of the opportunity to learn more about this topic. These sessions were well received, and on several occasions, resulted in an invitation to speak to a department. In fact, one participant believed the information was so important that she recommended that we make attendance mandatory for all staff.

Learning the Landscape

Looking for Sensitive Information

Knowing where information is located is crucial to understanding your options for secure storage and destruction. Regardless of the process, look for data in or on:

  • Paper, electronic, fiche, and other media files.
  • Local areas and central systems.
  • Current operational files and archived files, including e-mail.
  • Physically secure and publicly accessible spaces.
  • Locally owned spaces and third-party locations.
  • Computers, printers, copiers, and other equipment queued up for redeployment.
  • Portable devices and media, such as smart phones, PDAs, CDs, and USB drives.
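The discovery step the checklist above describes can be partly automated for electronic text files. The sketch below is a minimal, hypothetical illustration (not MIT's actual tooling): it scans a directory tree for strings formatted like SSN. Real data-discovery tools would also need to handle databases, images, archived e-mail, and the other media listed above, and any match would still require human review, since a nine-digit pattern is not proof of an actual SSN.

```python
import re
from pathlib import Path

# Matches SSN-formatted strings such as 123-45-6789.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_ssn_candidates(text: str) -> list:
    """Return all SSN-formatted strings found in a block of text."""
    return SSN_PATTERN.findall(text)

def scan_directory(root: str) -> dict:
    """Map each .txt file under root to its count of SSN-like matches."""
    results = {}
    for path in Path(root).rglob("*.txt"):
        count = len(find_ssn_candidates(path.read_text(errors="ignore")))
        if count:
            results[str(path)] = count
    return results
```

A scan like this only surfaces candidates for follow-up; deciding whether a file truly contains protected data, and what to do about it, remains a business-process decision.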

As a result of these meetings and subsequent conversations, we were able to profile the SSN landscape in different ways. We knew, of course, that SSN was required for payroll as well as a number of other employee-related processes, such as benefits administration. We also knew that SSN comes into play in student-oriented processes. For example, SSN is requested, albeit not required, on a student's application form, to help ensure data from different sources (e.g., ACT scores) are correctly linked with the right applicant. It is also required for government reporting of admitted students, as well as in the student loan process, along with other protected personal information, such as parents' financial information.

What caught us somewhat by surprise was the number of situations where SSN was showing up in the accounts payable process. The first example we observed was independent contractors' SSN on file because of the need to report their compensation to the IRS. That led to the realization that we needed SSN for other miscellaneous payments, such as honoraria. Even in cases where SSN was not required, we noticed that sensitive information, such as personal credit card numbers, would creep into the process as part of the documentation for a transaction. Many individuals were unaware that submitting their bank or credit card statements as backup for a reimbursement meant their personal account numbers could be scanned into our financial system and potentially be viewable by anyone with system access.

We encountered one department that tracked visitors on a local database—the date of the visit, the topic, and the guest's SSN. When we mentioned that they would need to notify those honored guests if there was a breach, they were happy to remove that element from their records.

Sharing concrete examples such as these was very helpful in heightening the community's sensitivity to how personal information can be inadvertently exposed.

We also uncovered SSN in some unexpected places, such as checkout cards in library books that predated the MIT ID number implementation. In such cases, we noted the scenarios in our inventory and discussed possible risk mitigation steps that the department could consider (see sidebar, “Looking for Sensitive Information,” and Figure 2, “The Sensitive Data Iceberg”).

Throughout the discovery phase, we were particularly mindful of the quantity of records. The risk profile increases dramatically for files, usually electronic, that contain tens of thousands of SSN (or other protected data). While we did not ignore systems or processes that could collect and maintain this data, we prioritized risk mitigation based upon large volumes of data (see Figure 3, “Risk Landscape”).

Winning Ways

Once we completed the discovery phase, we communicated with specific departments about how they could help us reduce risk. Early in this process, we had a number of wins.

Reducing and monitoring access to SSN. Most notably, the IS&T staff were able to make a system change, transparent to most users, which sharply reduced access to SSN in our data warehouse from several hundred people to about a dozen. This was an immediate and dramatic reduction of risk, as the data warehouse is the primary reporting tool for the enterprise. We also began to log those who accessed SSN, so we could monitor for patterns of use as well as misuse.
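The pattern behind that warehouse change, restricting the full SSN to a short authorized list while logging each access, can be sketched in a few lines. This is a hypothetical illustration with assumed role names, not MIT's actual implementation; in practice the masking and audit logging would live in the data warehouse's view and security layers rather than in application code.

```python
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("ssn_access")

# Hypothetical authorized roles; the article reports access was reduced
# from several hundred people to about a dozen.
AUTHORIZED_USERS = {"payroll_admin", "benefits_admin"}

def ssn_field(user: str, ssn: str) -> str:
    """Return the full SSN only to authorized users, logging the access;
    everyone else sees a masked value (transparent to most reports)."""
    if user in AUTHORIZED_USERS:
        # Each access is logged so patterns of use (or misuse) can be reviewed.
        audit_log.info("SSN accessed by %s", user)
        return ssn
    return "***-**-" + ssn[-4:]
```

Because most reports never needed the full number, masking it by default is the change that is "transparent to most users" while still leaving an audit trail for the few who do see it.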

Modifying forms to eliminate SSN. As we identified forms that included SSN as a required field, we talked with the form owner about the requirements. As a result, a number of forms were modified, including the paper version of the MIT job application. Several of our third-party benefits providers had forms with SSN; our human resources department worked with those vendors to allow use of the MIT ID instead. Our internal reimbursement form was another document that included SSN; although SSN was required for certain payments, it wasn't needed for many others. Working with a staff person in accounts payable, we were able to identify the different scenarios and make improvements in the instructions for using the form.

Increasing departmental awareness of secure paper destruction. Many areas had no idea that strip-cut shredders were not considered sufficiently secure. We evaluated shredders, as well as shredding services, using the National Association for Information Destruction's guidance on what to look for in a vendor. Although we didn't require departments to use a single vendor, we did find it helpful to establish agreements with two vendors, including one with a significant educational and institutional cooperative discount. When we reviewed reports from our procurement system, we saw a significant increase in the number of shredders purchased as well as the number of shredding service engagements.

Coping With Challenges

Any project of this scope will encounter challenges. In our case, we sought the right balance between the risk of inadvertent (or malicious) exposure of data and the degree of protection we wanted to require. For instance, the Massachusetts regulations require encryption of certain devices, such as laptops, if they contain the protected data elements. The question for us was, to what extent should encryption of laptops be carried out? Our program addresses this need with a risk-based principle in mind. For example, staff members in central units with roles that regularly involve accessing sensitive information are expected to have encryption; on the other hand, individuals with less process-intensive responsibilities are less of a priority.

Another challenge, not unique to this initiative in this period of history, was engagement of senior staff throughout the enterprise during a time of resource constraints. Individuals were extremely willing and cooperative and could implement many improvements without significant cost, but any task requiring time or financial resources had to be carefully weighed against other demands. In particular, many of the tools that could be applied in the IT realm not only had acquisition costs, but ongoing support and maintenance requirements that needed to be prioritized relative to other items on the already-oversubscribed IT project queue.

Perhaps our biggest challenge was that a lack of precision in our data definitions caused some confusion. When we used terms such as “high-risk data,” “sensitive data,” or even “personally identifiable information,” people thought of date of birth, home address, mother's maiden name, or even salary information. (Many people were surprised that the latter is not protected by law.) Although it was certainly desirable to have people thinking broadly about sensitive data, it muddied the waters when it came to actions that were necessary if there was loss or unauthorized access of regulated data. To reduce the confusion, as well as address the clumsiness associated with reiterating the specific set of Massachusetts protected data elements, we coined the term “personal information requiring notification” (PIRN). At this point, our definition of PIRN is based on Massachusetts law.

This term has also simplified our written information security program. We can now talk about the responsibility of business process owners, such as the payroll director, to determine when PIRN is required, who is authorized to see or handle the information, and what the record retention should be. Department heads are then responsible for knowing when processes include PIRN and ensuring that the staff dealing with those processes have access to, and have taken advantage of, the relevant training, including specific IT requirements. And finally, we have asked all individuals to be aware anytime PIRN crosses their desks. If they don't understand why they are seeing PIRN, they are encouraged to ask questions. The mantra we have repeated ad nauseam is, “You can't lose what you don't have.”

Finally, we were challenged to maintain momentum. This work began in late 2007 and continues today. We are grateful to all our colleagues who have remained steadfast in working toward our goal of security and compliance.

Taking the Next Step at Your Institution

Although the Massachusetts data protection regulations provided an impetus that not every institution faces, there are many attributes of our journey that would be applicable to business officers at other institutions—large or small—as well as steps you can take to make sure that this important issue is fully addressed on your campus:

  • Recognize that data protection starts as a business issue, rather than solely an IT issue. It is the business process owner who is responsible for articulating why certain data is needed, who needs it, and how long it needs to be retained. Whether this data analysis becomes codified into institutional policies or is primarily reflected in procedures and practices depends on the prevailing culture.
  • Work with general counsel to understand how federal data regulations apply and whether there are relevant state regulations. Based on this, it would be appropriate to incorporate language in third-party contracts so that there is agreement about roles and responsibilities in the event of a breach. 
  • Engage the IT department in providing tools for access control, data discovery, computer security, and encryption of data while in transmission or sitting on a computer. Even if there are no regulatory requirements, it is a good practice to document the minimum IT security standards for protecting sensitive information.
  • Establish a defined data incident-response process. Virtually every state has some form of a data-breach notification law, and other data-breach requirements may also apply (such as HITECH). We identified a cross-functional team, which is facilitated by the manager of IT security services and includes representatives from the offices of general counsel, campus police, corporate relations, and internal audit. We have also researched options for forensics and breach notification services, so we have some resources to turn to if needed.

As a result of these activities, we know what we will do if, despite all our risk mitigation, the unexpected occurs.

ALLISON F. DOLAN is program director for protecting personally identifiable information, and DEBORAH FISHER is institute auditor at the Massachusetts Institute of Technology, Cambridge.