The proposed NIMS Training Plan: What's going on?
NIMS Training Plan sets DHS up for a reprise of its ironic role as an obstacle to preparedness
By Jeff Rubin, PhD, CEM
In January, the Department of Homeland Security released the draft NIMS Training Plan for comment. The version of the plan that's adopted will supersede the Five-Year NIMS Training Plan of 2008, making this review process important for all levels of government and all response sectors.
Unlike many NIMS documents over the past several years, this one offered a reasonable amount of time for review (just over a month), and there is an appropriate reference to training by function rather than title, but it's tough to find much upside beyond that.
For the impact it would have on a huge spectrum of organizations, for the sudden and unexplained philosophical shift it represents, and for the apparent lack of critical thought that went into it, I'd have to call the draft both disappointing and alarming. Not only does it represent a giant step backward for NIMS implementation, it sets DHS up for a reprise of its ironic role as an obstacle to preparedness.
The document is not well written; there is no evidence of careful consideration or review. It contains internal contradictions, contradictions with NIMS itself, and superficial definitions, much like a term paper rushed to completion just before a deadline. Given the potential impact of any set of NIMS requirements, the end-users deserve better.
Training is not enough
For one thing, the document needs to distinguish between training, compliance and capability. The notion that training equals capability is misdirected. Training can generate capability: it is necessary, but not sufficient.
This is particularly true when training is built around a series of online classes. The online content is actually pretty good for what it is (IS-702.A and IS-703.A in particular), but by itself is unlikely to provide more than a basic orientation at best.
The assumption that the NIMS curriculum "adequately trains emergency response personnel to all concepts and principles of each NIMS component" misses badly.
And what exactly does taking two baseline online courses (ICS-100 and IS-700.A) actually provide beyond measurable compliance? If someone completing ICS-100 can correctly match titles with ICS levels, and distinguish between "base" and "camp," does that generate a capability (or a core competency)?
What capability gaps does IS-700.A address, and why is it so important for every responder (field or EOC) to complete it? No credible emergency manager, incident commander or educator could believe that a couple of one-time, introductory, online classes generate an actual capability, so the driver must be ease of measurement, where success equates solely to numbers.
Despite the bang-on statement that "personnel qualification as a whole is not only a function of training, but a combination of training, operational experience … job shadowing, and administrative requirements," there is little in the plan beyond ICS training, which makes it appear to be little more than a five-year ICS training plan (limited largely to Command and General Staff focus at that).
Compliance is based on measurable criteria such as course completion, rather than actual proficiency, which is more important but more difficult to measure. NIMS is supposed to be more than that.
Three more concerns
Speaking of ICS, the determination that it resides solely in the field may satisfy purists, but it leaves the critical EOC roles hanging, with little more than 100/700 training as recommendations.
Don't like the idea of Command and Operations in an EOC? Let's put that aside and consider Planning, Logistics, and Finance: critical support functions that demand coordination between an incident command post and EOC if the latter is to effectively support the former. Despite the NIMS document's recognition of EOCs' critical role, there has long been a dearth of effective standardized EOC training; that seems rather unlikely to change.
If you're a field IC, do you consider someone supporting you in your EOC with a couple of online classes to be qualified? Do you want your EOC staff to understand the system they're supporting? ICS-200 is the meat and potatoes of ICS; any EOC or scene practitioner in a supervisory role should not only take this class but actually understand it, and that's not a likely outcome from a single online exposure.
In fact, the draft refers to the need to incorporate adult-learning concepts, which is entirely appropriate, yet the emphasis on online classes seems to belie this approach. Between online classes and mass 300/400 training, there has been little to address the needs of adult learners. Where's the evidence of adult learning models in the vast overproduction of minimally qualified ICS-300 and 400 instructors over the past several years?
Acceptable though the online classes may be, it's a stretch to call them truly interactive or reflective of adult-learning techniques. They're still standardized classes for large audiences, focused on assessing short-term retention of objective information that is "testable" but not necessarily important.
Perhaps the most bizarre component of the draft training plan is the concept of tying training requirements to projected incident complexity. Scenario-based planning has substantial limitations, but planning based on potential incident complexity (an artificial concept in itself) just doesn't make sense. The philosophy behind this deviates not only from best practices, but also from intuition.
The apparent implication is that small/rural jurisdictions are OK with two online classes, one of which provides information on a document and no actual capability. Are core ICS content (ICS-200) and incident escalation and the planning process (ICS-300) not important here? More to the point, is the implied plan for those jurisdictions to wait for help to arrive? Is there no intention to provide anything for local EOCs, or Type 4 IMTs?
An unclear goal
This leads to one of the key questions about the draft plan: Is the primary goal of NIMS compliance, or at least this training plan, to enhance mutual aid capabilities? If so, that's an odd priority, not to mention a questionable plan for achieving it.
The draft strongly suggests that the goal of NIMS (or at least this plan) is to qualify practitioners for extrajurisdictional response rather than internal proficiency. The All Hazard Incident Management Team (AHIMT) program has many positive aspects, but it was (I hope) never intended to replace local capability.
Task books, a key AHIMT/deployment tool, serve a specific purpose: They describe competencies and behaviors and document qualifications for a specific position. The standardized functions and tasks provide a component for a universal credentialing system, which is particularly important for deployment or other participation outside a responder's primary area.
FEMA has put a great deal of emphasis on the AHIMT program; this isn't a problem in itself, but AHIMTs are neither a panacea nor a universal template.
There's no question that those who wish to deploy must meet clear, rigorous and recognized standards, but that's not a core need within an organization. The draft training plan refers to competency development, which is indeed a good way to go, but ideally competencies are the source for learning objectives in training.
In addition, defining core competencies is a major challenge in itself, not to mention whether the objective is functionality or ability to deploy. Core competencies should focus on ability to function within one's own organization, which means knowing how things are supposed to work within it. Task-book completion does not necessarily accomplish that.
The overall feel is that there are a bunch of existing classes (online, ICS and AHIMT), and the goal is to fit them into a compliance program. Compliance seems to be the primary objective, as course completion is measurable, and "metrics" seems to be all the rage — regardless of whether what's being measured is actually meaningful.
Unfortunately, some of the most useful aspects of compliance are more subjective: efficacy of the planning cycle; development, follow-up and results of corrective action plans (CAPs); assessing recurrent issues in exercises and real incidents, etc. Tougher to measure, but far more meaningful than how many people take IS-700.