One Company’s Path to 21 CFR 11 Validation

This case study describes how ISI Group selected and validated a document management system for compliance with the United States Food and Drug Administration (US FDA) electronic record and electronic signature (21 CFR part 11) regulations. It was written by an employee of the ISI Group for FileHold Systems.


ISI Group, LLC is a small virtual organization of consultants and contractors. They develop and sell medical image storage, display, and management software for use in hospitals and other healthcare facilities.

In the United States, most varieties of medical image-handling software are classified as medical devices by the US FDA, and companies that sell the medical device software must comply with US FDA regulations. In broad terms, compliance is achieved by registering with the US FDA, establishing a quality system, and providing evidence (records) that demonstrate that US FDA regulatory requirements have been addressed.

ISI Group elected to use a document management system as the primary storage system for the records that support US FDA regulatory compliance, and to use that system to automate document approval and distribution. These choices mean that the document management system is also subject to US FDA regulations.

The regulations involved, and why each is relevant, are summarized below:

  • 21 CFR 820.70(i) – Covers the overall requirement that a medical device company “shall validate computer software for its intended use according to an established protocol” when such software is used to automate a process.
  • 21 CFR 820.40 – Regulations for document control, including document approval, distribution, and document change management. These are the processes being “automated” by the document management system.
  • 21 CFR 11 (all) – Details the characteristics that a document management system needs to demonstrate if it is used as the primary recordkeeping system and/or for electronic signatures in lieu of paper-based records.

Selecting a System

The document management system needed to be:

  • Compliant with the regulations covered in the previous section
  • Affordable
  • Simple enough for reluctant users to adopt (high usability)

Selection Criteria


Of the three criteria, the most challenging one to assess was 21 CFR Part 11 compliance. The vendors of some systems (usually the more costly ones) indicated that 21 CFR Part 11 compliance was “built in” (but would still need local validation). Often these systems were part of a larger suite of software covering quality system automation in general. Other vendors indicated that compliance could be achieved by using an add-on (plus local validation). Still other vendors did not assert compliance or demonstrate awareness of 21 CFR Part 11 requirements.

A few document management system vendors offered “validation toolboxes” for a fee, but that fee typically was close to or higher than ISI Group’s budget for the system as a whole.

Since any document management system would have to be validated by the buyer for its “intended use according to an established protocol”, and since most of the specific regulatory requirements for validation are based on common sense, they decided to consider systems that appeared broadly compliant even if the vendors had no stance relative to 21 CFR Part 11 requirements.


Cost of the systems that were evaluated varied widely. Some open source systems were effectively $0 for the software itself (though most of those charged for training or support). Other systems cost upwards of $50,000.

While ISI Group had no firm budget, as a small business they wanted to keep the cost as low as possible. As a rule of thumb, they decided to semi-arbitrarily exclude systems with an effective cost of over $20,000 (both out of pocket and in terms of their own time to implement/support).


Usability was assessed primarily based on vendor-provided demos. Vendor documentation and user reviews were also considered when available.

Narrowing the Field

After an informal survey of dozens of document management systems, they chose to look at 15 systems more closely. Of the 15 systems, 13 were rejected for the following reasons:

  • Too expensive (despite meeting preliminary usability and compliance assessment) – 2 systems
  • Overall architecture/usability did not “feel” right – 2 systems
  • Inquiries not returned – 3 systems
  • Electronic signatures limited to PDF format and/or required outside add-on – 6 systems (see below)

Of the 15 candidate systems, the remaining two were flagged for the detailed investigation described below.

Refining their E-signature Needs

As they looked at the 15 systems noted above, they realized that they should focus more on some aspects of electronic signatures and less on others. Because their primary goal was to use electronic signatures to indicate approval of internal quality system records, they decided that:

  1. The signature status (signed or not) should be indicated in the system without having to open the document itself.
  2. The ability to sign a document should not require a format change.
  3. The signature did not have to be transferrable outside the document management system. This meant that they did not have to get a signature system that was platform independent, in the short term, so long as the document system they chose could accommodate “outside” signed documents should their needs move in that direction.

Choosing the Finalists

The two systems they decided could meet their needs were FileHold, a general-purpose document management system, and NextDocs, a platform that incorporates document management as part of a modular “suite of integrated quality management software solutions”.


NextDocs presented as a quality management system first and as a document management system second. It had modules for various quality system areas and seemed very “US FDA-oriented”.

FileHold presented as a document management system that could be used as part of a quality system solution but that was not purpose-built for that task. The document management and signature capabilities seemed adequate for their specific needs, but there was no advertised support for or awareness of the US FDA 21 CFR 11 regulation. (At the time.)

Advantage: NextDocs


Overall, NextDocs was significantly more costly than FileHold. While the extra cost did translate into extra capabilities, and while those capabilities were desirable, they were not certain the capabilities were truly essential for an organization of their size. FileHold appeared able to meet their “must have” needs at a lower initial cost plus a lower maintenance cost. Plus, FileHold could be set up with regulated and “non-regulated” areas and could be used for general document storage.

Advantage: FileHold


In general, FileHold seemed more intuitive and approachable than NextDocs. The on-premises version of NextDocs is SharePoint-based, which was considered a significant negative in their organization. The on-premises version of FileHold includes both a general web client and a locally installed thick client.

The implementation model for FileHold was also much more straightforward than that of NextDocs, and the FileHold sales and support group were more responsive during the demo periods for each product.

Advantage: FileHold

The Decision

FileHold was chosen after an informal (but careful) check to make sure that its signature capabilities were a defensible choice for their specific intended use. The decision was also based on the fact that FileHold could be used for general document management in areas beyond their regulatory needs, and on the overall opinion of stakeholders that usability trumped a purpose-built system that was less approachable.

Validating the System

First Steps

US FDA regulations state that for software used to automate processes or parts of the quality system, companies “shall validate computer software for its intended use according to an established protocol” (21 CFR 820.70(i)) [emphasis added].

As this was the first of several systems they would need to validate, they spent a substantial amount of time trying to learn what would need to be done to satisfy this regulation in an efficient way. The best resources they found for general validation information are listed below:

The US Medical Devices forum at Elsmar.com: http://elsmar.com/Forums/forumdisplay.php?f=180

The US FDA’s General Principles of Software Validation guidance document: http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/ucm085281.htm

The Validation of software for regulated processes Technical Information Report (AAMI TIR 36:2007): http://my.aami.org/store/detail.aspx?id=TIR36

Medical Device Software Verification, Validation, and Compliance, especially Part III

Establishing a Protocol

As stated in the Validation of software for regulated processes Technical Information Report (AAMI TIR 36:2007), “Although the word protocol is used with different meaning throughout the medical industry, here the word implies a plan. The plan for validation must be a formal, approved document or documents.”

Based on that, they wrote up a short document that summarized the scope, goals, and risks of adopting the document management system, and then described what they would do to ensure the goals were met and that the risks were understood and managed. In broad terms, the plan (protocol) helps with the validation effort because:

  • Understanding the scope helps determine the intended uses and whether or not validation under 21 CFR 820.70 is necessary.
  • Understanding the goals helps determine the criteria used to judge validation success or failure.
  • Defining the risks helps determine the amount of effort and rigor that should be applied to the overall validation effort.

Defining Intended Use

An often cited US FDA definition of “intended use” for a medical device is included in the US FDA Labeling Regulation (21 CFR 801.4). For non-medical device software, a high-level intended use statement should at a minimum cover the functions being automated, who will be using the software, when the software is used, and so on. Defining the intended use is critical because it provides the standard by which validation success or failure is judged.

The Validation of software for regulated processes Technical Information Report (AAMI TIR 36:2007) has a good explanation of defining intended use for non-device software.

They found that as they worked through the validation process, they had to revisit and refine their intended use statement as their understanding of what they had to do to validate the document management software evolved. These lessons were captured in either the Validation Plan (see previous section) or in the Validation Requirements document (described below).

Validation Activities

AAMI TIR 36:2007 provides a long list of activities that can be used to support validation. The report also provides insight as to how to choose particular activities based on the state (internally developed vs purchased), risk, and use of the software being validated.

Because they elected to purchase finished software, certain validation activities (code reviews, unit testing, etc.) had no value. The activities they elected to use are summarized below.

Note that there was a fair amount of overlap in the execution of these activities. They are listed in the order that they were started, but in most cases they happened at the same time and had a tendency to bleed into each other.

Informal Vendor Audit

The process that they used to eliminate most of the known document management systems also functioned as an informal vendor audit. Some systems were just too expensive or didn’t have a feature set that worked for them. Other systems that might have had potential were eliminated if vendors were not responsive to information requests, or if their demonstrations, documentation or marketing materials seemed low quality. This process also helped them refine their own understanding of what they needed their chosen document management system to do.

Validation Requirements

They drafted an intended use statement and used it as their highest level validation requirement. This statement helped them define the scope of what they would validate, and, just as important, what they would NOT validate.

They also defined more detailed requirements based on specific needs they had (for example, the ability to display signature status without displaying a document) and on the applicable requirements of the US FDA’s Electronic Records; Electronic Signatures regulation (21 CFR 11).

They also documented the rationale for not electing to address specific 21 CFR 11 regulations (for example, they did not address requirements for biometric signatures since their chosen document management system did not use biometric signatures).

All of this was combined into a formally reviewed and approved document. If they could demonstrate that the system fulfilled the requirements, they would deem the system validated.

Vendor-Supplied Documentation

They used vendor-supplied documentation as a way to demonstrate that some of the more mechanical requirements in 21 CFR 11 were met. For example, for the US FDA requirement that the system implement means for “limiting system access to authorized individuals”, they simply referenced the user documentation that described how user accounts, passwords, and document access were managed.

Defining Specific Processes to Support Validation

Certain US FDA requirements for a document management/electronic signature system are tied both to a document management system’s capabilities and to how the system is to be used. One 21 CFR Part 11 requirement calls for:

“Protection of records to enable their accurate and ready retrieval throughout the records retention period.”

To fulfill this requirement, in addition to referencing the product documentation that describes how older document versions are retained by the system, they set up external processes to ensure that records were not deleted prematurely, and to ensure that the data stored by the document management system was backed up.
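The deletion-prevention part of such a process can be sketched as a simple retention check. The seven-year retention period and the `may_delete` function below are hypothetical illustrations, not ISI Group's actual procedure:

```python
from datetime import date, timedelta

# Hypothetical retention rule: a record may only be deleted once its
# retention period (here, roughly seven years from approval) has elapsed.
RETENTION = timedelta(days=7 * 365)

def may_delete(approved_on: date, today: date) -> bool:
    """Return True only if the record's retention period has elapsed."""
    return today >= approved_on + RETENTION

# A record approved in 2010 is past retention by 2020;
# a record approved in mid-2019 is not.
print(may_delete(date(2010, 1, 15), date(2020, 1, 16)))  # True
print(may_delete(date(2019, 6, 1), date(2020, 1, 16)))   # False
```

A check like this could run before any purge of old document versions, with the backup process handled separately.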

Combining User Training with Validation

They chose to treat certain 21 CFR 11 requirements as training items. For example, to demonstrate that they fulfilled the US FDA requirement for:

 “establishment of, and adherence to, written policies that hold individuals accountable and responsible for actions initiated under their electronic signatures…”

…they created a training presentation that included an explanation of signature accountability, and had users sign a “read and understood” memo as a part of the training.

Another US FDA requirement specifically called out user training. They used vendor-provided training for those who would be administering the document management system, and then their internal administrators created training packages specific to how the document management system is used within their company. The evidence that training was performed was referenced in their validation records.

Traceability Matrix

ISI Group used a traceability matrix as a verification step. The matrix referenced the US FDA’s regulatory requirements, their specific requirements, and also the risks they identified in their validation plan. For each of these items, they cited the record(s) that showed how each item had been addressed. Some items were supported by only one record, others by multiple records.
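As a rough illustration, a traceability matrix is essentially a mapping from each requirement or risk to the evidence records that address it, plus a completeness check that nothing is left uncovered. The requirement IDs and record names below are invented, not ISI Group's actual entries:

```python
# Hypothetical traceability matrix: each requirement or identified risk
# maps to the record(s) demonstrating it was addressed.
matrix = {
    "21 CFR 11 record retention":       ["Backup SOP", "Retention SOP"],
    "21 CFR 11 access limits":          ["Vendor user guide, ch. 4"],
    "Risk: premature record deletion":  ["Retention SOP"],
    "Signature status visible in list": [],  # not yet covered
}

# Verification step: flag any item with no supporting evidence record.
uncovered = [item for item, records in matrix.items() if not records]
for item in uncovered:
    print(f"No evidence recorded for: {item}")
```

Finalizing the matrix then amounts to driving the uncovered list to empty before sign-off.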

Executing the Validation

Executing the validation itself was the shortest step of the overall process. It involved:

  • Having stakeholders sign off on the Validation Plan and Validation Requirements documents
  • Preparing, executing, and documenting the training of end users
  • Verifying that all records referenced in the trace matrix were finalized, and then finalizing the trace matrix itself.
  • Summarizing the overall validation effort for use in future cycles (such as when they want to install a new version of the document management software).

Lessons Learned

In retrospect, the most difficult part of this process was sorting through the vast number of potential systems, and coming to terms with the terminology surrounding validation.

For the first part, ISI Group had to work several times through a list of what they wanted vs what the US FDA required, and come up with a subset that they thought would work. This took much longer than they wanted. But ultimately they were able to fold some of this work into their validation process as part of the informal vendor audit.

For the second part (terminology), they had to come to terms with what “validation” actually meant in their context, and make sure they were reasonably in alignment with what the US FDA regulations indicated was needed. An excessively literal approach would mean a lot of work for no tangible result. But not taking enough time to drill down and extract what the US FDA’s expectations would be, given their intended use and level of risk, would also result in wasted effort.

In retrospect, they probably over-engineered what they needed to do. But even that was useful, as they used what they learned to validate other systems. They plan on reassessing what they have done and streamlining it when it is time to revalidate a newer version of their document management software.