Eighty-one percent. That's the share of all GLP 483 observations in FY2023 that traced back to training failures. Twenty-nine out of thirty-six observations. In a year when FDA issued 10 Form 483s across 28 inspections, the single dominant finding — not one of many, the dominant one — was that personnel weren't adequately trained for the work they were doing.
FY2022 told the same story at a different ratio: 41 of 78 observations, 53%, tied to training. FY2024 pulled it down to third place behind SOPs and QAU deviations, but it was still there. Training isn't an occasional inspection theme. It's the structural weakness that shows up year after year, facility after facility.
Here's the part that confuses people, though: training failures almost never generate warning letters. In our analysis of 17 GLP warning letters from 2019-2026, only 18% cited training as a factor. Compare that to Study Director accountability (88%) or data integrity (53%). FDA's triage logic is clear — training gaps are fixable. You retrain people, update SOPs, document the correction, QAU signs off. Problem addressed. The data from the studies isn't compromised by the fact that someone's training record was incomplete.
So training won't get your studies rejected. But it will get you a 483. And 483s, while not catastrophic, create a compliance narrative around your facility. A CRO with repeated training-related 483 findings is a CRO whose corrective actions aren't sticking. Sponsors doing due diligence will notice that pattern. It's worth getting right, and it's one of the easier things to get right if you understand what FDA actually expects.
What the regulation says
21 CFR 58.29 is short. Two paragraphs. Here's what they require:
(a) Each person engaged in the conduct of or responsible for the supervision of a nonclinical laboratory study shall have education, training, and experience, or combination thereof, to enable that individual to perform the assigned functions.
(b) Each testing facility shall maintain a current summary of training and experience and job description for each individual engaged in or supervising the conduct of a nonclinical laboratory study.
That's it. No specified curriculum. No required hours. No certification program. FDA doesn't tell you how to train people — they tell you to demonstrate that training happened and that it was adequate for the work performed.
The ambiguity is intentional. A histopathology technician at a tox CRO needs different training than a formulation chemist at a bioanalytical lab. A regulatory framework that prescribed specific courses would either be too narrow to apply or too broad to enforce. Instead, FDA leaves "adequate" to the facility's judgment and then checks whether that judgment holds up during an inspection.
The operational question isn't "what course should we buy?" It's "can we prove that every person who touched this study was qualified to do what they did?"
What inspectors actually check
I've talked to enough people who've been through GLP inspections to know the pattern. Training isn't a standalone module in the inspector's playbook — it runs through everything else. When an inspector finds an SOP deviation, they look at whether the person was trained on that SOP. When they find a data recording error, they check whether the person was trained on the data system. Training is the root cause investigation tool that inspectors apply to every other finding.
Specifically, here's what they pull:
Personnel training files. The inspector asks for the training file for a specific individual — usually someone whose name appears on a study record. The file should contain: current CV or resume, job description, records of all training completed (dates, topics, trainer, method of assessment), and documentation of ongoing training.
Training-to-task match. Did the person performing the necropsy have documented training in necropsy procedures? Did the person operating the HPLC have training on that specific instrument? Not "HPLC operation generally" — training on the Agilent 1260 in Lab 3 that generated the data for this study. The specificity matters. I've seen 483 findings where the training record showed "chromatography training" but didn't specify the instrument or method. That's not adequate.
SOP training currency. When an SOP is revised, everyone who works under that SOP needs to be retrained. The inspector will check the SOP revision date against the training record dates. If the SOP was revised in March and the study ran in June, training records should show the affected personnel were trained on the new version before — not after — they performed work under it. This sounds obvious. It's the single most common training gap I've encountered. SOPs get revised. The training matrix doesn't get updated. Nobody notices until the inspector does.
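The SOP-currency check an inspector performs by hand is easy to automate against your own records. Here's a minimal sketch, assuming hypothetical data structures (a dict of SOP revision dates and a dict of latest training dates per person and SOP) — not any particular LIMS or training system's format:

```python
from datetime import date

def retraining_gaps(sop_revisions, training_records):
    """Flag personnel whose latest training on an SOP predates its
    current revision date.
    sop_revisions:    {sop_id: revision_date}
    training_records: {(person, sop_id): latest_training_date}
    """
    gaps = []
    for (person, sop_id), trained_on in training_records.items():
        revised_on = sop_revisions.get(sop_id)
        if revised_on is not None and trained_on < revised_on:
            gaps.append((person, sop_id, revised_on, trained_on))
    return gaps

# Example: SOP-012 revised in March; dates pulled from the matrix.
sops = {"SOP-012": date(2026, 3, 1)}
records = {
    ("J. Smith", "SOP-012"): date(2026, 1, 10),  # trained on the old version
    ("A. Lee", "SOP-012"): date(2026, 3, 5),     # retrained after the revision
}
print(retraining_gaps(sops, records))
# J. Smith is flagged; A. Lee is current.
```

Running a check like this whenever an SOP revision is issued is exactly the "training matrix gets updated" discipline described above, expressed as code rather than a manual review.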
Training assessment. FDA doesn't require exams, but they want evidence that training was assessed — not just delivered. "John Smith attended GLP training on 3/15/2026" is attendance, not training. "John Smith completed GLP training on 3/15/2026, demonstrated competency in data recording procedures via practical assessment, assessed by Jane Doe, Training Coordinator" is training. The difference matters to inspectors because attendance without assessment doesn't demonstrate that the person can actually do the work.
The training nobody does well
Every GLP lab trains on SOPs. Most train on GLP awareness. The gap I see consistently is training on why things matter, not just how to do them.
A technician trained on "how to record body weights in the LIMS" knows the procedure. A technician trained on "why contemporaneous data recording matters, what happens when it doesn't happen, and what a 483 finding for non-contemporaneous recording looks like" understands the stakes. The second technician is less likely to batch-enter data at end of shift because they understand what that creates. The first technician follows a procedure until it's inconvenient.
I'm not saying every lab needs a regulatory philosophy seminar. But the facilities I've seen with the cleanest inspection histories are the ones where technicians can explain why the procedures exist, not just recite them. That understanding doesn't come from a PowerPoint deck delivered once during onboarding. It comes from study directors and QA staff who explain the regulatory context when they train people on specific tasks.
By the way, this connects directly to data integrity. The ALCOA+ principles only work in practice if the people generating data understand what attributable, contemporaneous, and complete actually mean in their daily work. Training on ALCOA+ at the bench level — not as abstract concepts but as "here's what it looks like when you record this body weight" — is the bridge between having a data integrity policy and having data integrity.
Building training records that survive an audit
The training record is the deliverable. Your training might be excellent: engaging instructors, practical exercises, real competency assessment. But if the record doesn't capture it, then from FDA's perspective it didn't happen.
Here's what a defensible training record contains:
For each training event:
- Date of training
- Topic (specific enough to link to an SOP, instrument, or procedure)
- Method (classroom, on-the-job, self-study, practical demonstration)
- Trainer name and qualification
- Assessment method (written test, practical demonstration, verbal Q&A, observed performance)
- Assessment result (pass/fail, or competency verified)
- Trainee signature acknowledging completion
- Trainer signature confirming assessment
For each person's file:
- Current CV or resume (updated when qualifications change)
- Job description listing assigned functions
- Training matrix showing all required training and completion status
- Documentation of continuing education or refresher training
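The two lists above map cleanly onto a simple data structure. This is an illustrative sketch — the field names are mine, not a required or standard format — showing one way to make the "assessment, not just attendance" distinction checkable:

```python
from dataclasses import dataclass, field

@dataclass
class TrainingEvent:
    # Fields mirror the "for each training event" list above.
    date: str                # e.g. "2026-03-15"
    topic: str               # specific SOP, instrument, or procedure
    method: str              # classroom, on-the-job, self-study, practical demo
    trainer: str
    trainer_qualification: str
    assessment_method: str   # written test, practical demo, Q&A, observed work
    assessment_result: str   # pass/fail, or "competency verified"
    trainee_signed: bool
    trainer_signed: bool

@dataclass
class PersonnelFile:
    # Fields mirror the "for each person's file" list above.
    name: str
    job_description: str
    cv_current_as_of: str
    events: list = field(default_factory=list)

    def is_complete(self, event: TrainingEvent) -> bool:
        """A defensible record has both signatures and a documented
        assessment; attendance alone fails this check."""
        return (event.trainee_signed and event.trainer_signed
                and bool(event.assessment_method)
                and bool(event.assessment_result))
```

An event with an empty `assessment_method` fails `is_complete` even if everyone signed it, which is the point: the structure forces the attendance-versus-assessment question to be answered for every record.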
The training matrix is the single most useful document in your training system. It's a grid: personnel on one axis, required training topics on the other, completion dates in the cells. When an inspector asks "was this person trained on this SOP?", the training matrix gives the answer in seconds. Without it, someone has to dig through individual files, cross-reference dates, and piece together a narrative. Under inspection pressure, that's not where you want to be.
Keep the matrix current. When a new SOP is issued, add it to the matrix. When a new person joins, add them. When an SOP is revised, update the required training date and flag anyone who hasn't been retrained. This is the kind of administrative work that nobody gets excited about. It's also the kind of administrative work that addresses the failure mode behind 81% of FY2023's 483 observations.
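In its simplest form, the matrix is just a nested mapping, and the two questions it exists to answer — "was this person trained on this topic?" and "who still has open gaps?" — are one lookup each. A minimal sketch, with invented names and topics:

```python
# Personnel on one axis, required topics on the other, completion
# dates in the cells (None = not yet trained).
required_topics = ["GLP awareness", "SOP-007 necropsy", "LIMS data entry"]
matrix = {
    "J. Smith": {"GLP awareness": "2026-01-12",
                 "SOP-007 necropsy": "2026-02-03",
                 "LIMS data entry": None},
    "A. Lee":   {"GLP awareness": "2026-01-12",
                 "SOP-007 necropsy": None,
                 "LIMS data entry": "2026-02-20"},
}

def trained(person, topic):
    """Answer the inspector's question in one lookup."""
    return matrix.get(person, {}).get(topic) is not None

def open_gaps():
    """Everyone still missing a required topic."""
    return [(p, t) for p, row in matrix.items()
            for t in required_topics if row.get(t) is None]

print(trained("J. Smith", "SOP-007 necropsy"))  # True
print(open_gaps())
# [('J. Smith', 'LIMS data entry'), ('A. Lee', 'SOP-007 necropsy')]
```

Whether this lives in a spreadsheet, a database, or a validated system matters less than the property shown here: the answer comes from a lookup, not from digging through individual files under inspection pressure.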
Training for electronic systems
This is the gap that's growing. Labs are moving to LIMS, ELN, and electronic data capture, but training programs haven't kept pace. A technician who was trained on paper data recording five years ago is now expected to use a validated LIMS with audit trails, electronic signatures, and role-based access controls. The system is more complex. The training often isn't.
What FDA expects for computerized system training:
- How to log in and log out properly (not shared accounts — see our data integrity checklist for why this matters)
- How to enter data correctly the first time
- How to make corrections (the proper procedure, not just "edit the field")
- How the audit trail works and why they shouldn't try to circumvent it
- What to do when the system is unavailable (downtime procedures)
- How electronic signatures work and what signing means legally
The Jiangsu Kerbio warning letter I mentioned in the data integrity article is relevant here. Inspectors observed staff completing 26 days of study records at once. That's not just a data integrity failure — it's a training failure. Either nobody trained those staff on contemporaneous recording requirements, or the training didn't take. Either way, the training system failed before the data system did.
CRO training: what sponsors should ask
If you're outsourcing repeat-dose toxicity or safety pharmacology studies to a CRO, the CRO's training program is your concern. Not because you're responsible for running it — the CRO manages their own training — but because training adequacy at the CRO directly affects the quality of data in your IND.
During CRO qualification, ask specifically:
"Can you show me a training matrix for the staff assigned to my study?" Not the facility-wide training SOP. The matrix for the people who will actually handle your compound, run your assays, record your data. If they can't produce this before the study starts, that's information.
"How do you handle SOP revisions mid-study?" If an SOP is revised while your study is running, how do they retrain affected staff? Do they pause study activities during retraining? How is the gap between SOP revision and retraining documented? This is exactly the scenario that generates 483 findings.
"What was cited on your most recent FDA inspection, and what was the CAPA?" If training was cited (and statistically, there's a good chance it was), ask what they changed. Did they add assessment components? Increase refresher frequency? Update their training matrix system? The answer tells you whether they treated the 483 as a paperwork exercise or a real improvement opportunity.
"How many concurrent studies does a typical technician work on?" This isn't strictly a training question, but it's related. A technician working on 6 concurrent studies across different protocols is more likely to make procedure errors than one working on 2. Overloaded staff is a leading indicator of training-related findings because the training that worked at 2 studies doesn't scale to 6.
The frequency question
How often should training happen? FDA doesn't specify. "Current" is the word in the regulation — training must be current. But current relative to what?
My practical rule: training expires when something changes. An SOP revision triggers retraining. A new instrument triggers training. A new study type that the person hasn't performed before triggers training. A 483 finding against the facility triggers refresher training in the cited area. Annual GLP refresher training is standard practice and probably the right floor, but it's not sufficient on its own.
The labs that get this right build training into their change control system. SOP revised? Change control record includes a line item: "identify affected personnel and schedule retraining before effective date." New equipment installed? Equipment qualification includes a training deliverable. It's not a separate training program — it's training embedded in every process that could affect study quality.
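The "training as a change control deliverable" idea can be sketched as a gating rule: the change record can't close, and the revision can't go effective, until everyone affected has been retrained. The structure and field names below are hypothetical, not drawn from any particular QMS:

```python
from dataclasses import dataclass, field

@dataclass
class ChangeRecord:
    """Training embedded in change control: the revision is blocked
    until every affected person's retraining is recorded."""
    change_id: str
    description: str
    affected_personnel: list
    retrained: set = field(default_factory=set)

    def record_retraining(self, person):
        self.retrained.add(person)

    def can_take_effect(self):
        # Effective-date gate: all affected personnel retrained.
        return set(self.affected_personnel) <= self.retrained

cr = ChangeRecord("CC-041", "SOP-012 rev 3", ["J. Smith", "A. Lee"])
cr.record_retraining("A. Lee")
print(cr.can_take_effect())  # False: J. Smith not yet retrained
cr.record_retraining("J. Smith")
print(cr.can_take_effect())  # True: revision can go effective
```

The design choice worth copying is that the gate lives in the change record, not in the training program: nobody has to remember to check the matrix, because the SOP revision itself can't proceed without it.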
The labs that get it wrong treat training as an annual event. Once-a-year GLP awareness training, sign the attendance sheet, done. Twelve months later, do it again. Meanwhile, 8 SOPs were revised, 2 instruments were replaced, and 3 new people joined the team. The annual training covered none of that.
Checklist
Personnel qualification
- Every person performing or supervising GLP work has a current CV on file
- Job descriptions exist for all GLP roles and list specific functions
- Education, training, and experience are documented and adequate for assigned functions
- New personnel are trained before performing GLP work (not during, not after)
Training delivery
- Training covers specific SOPs, instruments, and procedures (not just generic GLP)
- Training includes the "why" — regulatory context, not just procedural steps
- Training on computerized systems covers login, data entry, corrections, and audit trails
- ALCOA+ principles are covered at the bench level with practical examples
- Study-specific training is provided for unique protocols or procedures
Training assessment
- Competency is assessed, not just attendance recorded
- Assessment method is documented (test, practical demo, observed performance)
- Assessment results are documented with pass/fail or competency confirmation
- Trainers are qualified to train and assess on the topics they cover
Training records
- Individual training files are maintained for all GLP personnel
- Training matrix links personnel to required training with completion dates
- Training records include: date, topic, method, trainer, assessment, signatures
- Records are retained for the required period under 21 CFR 58.195
Training currency
- SOP revisions trigger retraining for affected personnel before the effective date
- New equipment triggers training before the equipment is used in GLP studies
- New personnel are trained before assignment to GLP work
- Refresher training occurs at defined intervals (annual minimum for GLP awareness)
- Training matrix is updated when SOPs, equipment, or personnel change
- Change control records include training as a deliverable
CRO oversight (for sponsors)
- CRO provides training matrix for study-assigned personnel before study start
- CRO's process for mid-study SOP revision retraining is documented
- CRO's most recent inspection history is reviewed, with training findings noted
- CRO's concurrent study load per technician is within reasonable limits
Related reading:
- GLP Compliance Checklist for Preclinical Studies — the full 21 CFR Part 58 walkthrough
- GLP Data Integrity: ALCOA+ Checklist — electronic systems, audit trails, and what warning letters reveal
- FDA GLP Inspections: What 483 Data Reveals — enforcement patterns and the training-to-warning-letter gap
RegFo checks your nonclinical study package against FDA/ICH requirements — including genotoxicity battery, repeat-dose toxicity, and safety pharmacology — before you submit.