The Complicated Process of Simplification: The Evolution of the 2012 Standards
by DeAnne Maxwell, RCAP Chair 2017-2019
The Old Way
My first Self-Study submission experience occurred in the fall of 1990. I had been a clinical instructor for about a year and was new to the accreditation process. Our program had a file box with hanging file folders. Within each folder, a label was taped to the inside cover: An Essential and its corresponding compliance requirements (the Essentials are what NAACLS used to call the Standards). Twenty-two Essentials with twenty-two folders, and over the course of a year, my colleagues and I packed each folder with printed papers for documentation.
When submitting the Self-Study, we assembled all the paperwork, hand-numbered each page, made three copies of the fully assembled document, hole-punched everything, and inserted each copy into a 3-inch binder. There were no jump drives or electronic submissions, just three three-ring binders and an uncountable number of paper cuts. The postal service delivered the box of binders to the NAACLS office. NAACLS sent one copy to the self-study reviewer and one copy to the site visit team leader and kept one for their records. From what I understand, the NAACLS office had a room dedicated exclusively to storing self-studies.
Before the 2012 Standards, NAACLS focused on inputs, as many accreditors did at the time. Specifically, they ensured program and course objectives were consistently used throughout the program. I remember printing each page of objectives for every lecture taught (leveled, of course), along with every unit and final exam given to students. Each exam question had to be leveled and matched to the accompanying objective for each course. Examples of objectives and exams, lists of the labs I had on the course schedule (with all the needed materials for each one), and the topics for the case studies and homework materials all went into the self-study binder. At the time, to be compliant with the Standards, NAACLS required each instructor to provide a binder with objectives, quizzes, and exams for each course they taught. There was so much information – I remember feeling so sorry for the site visitors. The day of the site visit arrived in the spring of 1991, and even though we had carted all the binders to their hotel room the night before, the visitors still had to spend the next day looking at information in binders in the classroom because of the overwhelming amount of documentation provided.
I became Program Director in 1993 and did three more self-studies like that: 1998, 2005, and 2012. For each one, all the paper (if not more), all the lists, and all the big binders. In 2005, I participated in my first NAACLS site visit and realized what being on the other side was like. We had an early afternoon check-in, and when we arrived, the program had a big box of additional documents waiting for us. Before I became a site visitor, I assumed the work began on the first full day of the site visit. This was not the case. We grabbed a quick bite to eat and then hung out in someone’s hotel room to go through and check objectives and exam questions, look for progression of content throughout the courses, and review labs, case studies, and homework to confirm educational activities–a binder (or two) for every course taught. I always tried to do everything the night before so that I could spend my on-site time with program officials and students. However, the site visit report form itself was full of checkmarks for “submitted information.” It was impossible to break free from that structure when I really wanted to talk to the program faculty and students to see their perspective of the program.
I honestly began to think that just because a program submitted everything appropriately didn’t mean its graduates could meet entry-level competencies. As I went on more visits (and the required lists of information continued to increase), I realized I was not the only PD or site visitor who felt this way.
The Revision
NAACLS is required by its policies and procedures to fully review its standards every ten years. In 2008, the Board of Directors (BOD) started revising the Standards but mandated a shift in focus to outcomes orientation. I was on the NAACLS Review Committee for Accredited Programs (RCAP) from 2010 to 2019 as an MLS Educator. I was very passionate about changing the Standards, so I was asked to work on the revision. Educators from all NAACLS-accredited disciplines with varying experiences joined to reorganize and evaluate the 22 Standards. This revision resulted in establishing six Core Standards that crossed all disciplines and two discipline-specific Unique Standards. This process was, as you can expect, lengthy. First, we looked at each Standard and determined its purpose. The second step was deciding what documentation would best demonstrate that purpose.
We looked at all the material NAACLS required programs to provide and, in doing so, struggled to determine what was merely information and what actually demonstrated program quality. We realized we were looking for documentation of analysis and evaluation of outcomes, but we weren’t seeing it in the current model. “Ongoing program analysis” and “program effectiveness” became our guiding forces, and suddenly, the revision became easier. We had calls every couple of weeks and exchanged countless emails discussing each Standard with document edits and comments. The task force dedicated a substantial amount of time to ensuring the Core Standards considered all disciplines and possibilities. Creating the initial draft of Core Standards that could cover all program disciplines was rough; the Unique ones were a bit easier. Discipline-specific volunteers created their content and standardized the language throughout the document. We finalized a draft for the BOD in April of 2010 and a revised draft that we shared with the review committees in July of 2011. The work was far from over.
Over the next year, we incorporated review committee feedback as we prepared a draft for public comment. During the summer of 2012, we entertained public comment and continued to revise the Standards. As expected, the shift to ongoing assessment and documented evaluation was just as difficult for other programs and program officials to embrace as it had been for our core group. I think programs performed ongoing assessment in many cases, but it was never documented or recorded. The NAACLS Board of Directors approved the Standards, with several revisions based on public comment, in September of 2012 – four years after their initial request for revision. It seems like a long time now, but at the time, we were amazed at how quickly that time passed.
The Compliance Guide
Standards are the basic requirements an accreditation agency sets for quality assurance. The review committees and the board soon learned that these requirements were just the first step and that additional discussions and decisions would quickly follow.
Work began on developing a Standards Compliance Guide. The previous set of standards included a sparsely detailed chart to help steer programs toward compliance. However, Program Directors indicated they needed more guidance, so we set out to create a living, changing, and clarifying document that walked Program Directors and faculty through each component of every Standard without over-prescription. The Compliance Guide would focus on what programs should include in the self-study narrative and what a program should incorporate as documentation in the self-study report. In addition to the self-study, the Standards Compliance Guide would cover what programs should provide on the day of the site visit to show compliance with each Standard.
An all-volunteer group from all disciplines worked on the contents of the Compliance Guide so that all programs could utilize the document. I worked on the unique Standards VII and VIII for MLS programs. This document was crucial because the board could make minor tweaks and clarifications without the formality of a Standards revision. Once all compliance guides were ready, the last step was to create a single self-study review and site visit report that could cross all disciplines. Gone were the cumbersome discipline-specific checklists, which made things much easier for other disciplines to assist in the review processes.
The Benchmarks
While the revision of the ongoing assessment and evaluation Standard required considerable thought and debate, one of the most significant changes in the 2012 Standards was the emphasis on benchmarks.
Benchmarks were a new requirement, included in Standard II. Of course, every program has always looked at its certification scores and pass rates. Seeing graduates pass the exam made every Program Director beam with pride. However, choosing which other outcomes NAACLS would require programs to report, and developing benchmarks associated with those outcomes, was an enormous task. For a group of educators so concerned with our lists and inputs, flipping the page to look at how all those inputs translated into measurable outcomes was challenging. Carefully selecting numbers that reflected a program’s effectiveness at educating students and upholding the profession’s integrity, without disproportionately penalizing smaller or larger programs, required long and, at times, heated discussions.
For years, the Standards had been so… well, “standard” that there was no room to accommodate variation and individual program nuances. Graduation rate/attrition data would indicate the program’s ability to retain qualified students. However, the review committees and board wrestled with the issue that some programs select their students, and others accept all who wish to participate. Thus, the “final half of the program” solution was created: the point in each individual program that its faculty has deemed the “halfway point.” For me, it was about 2/3 of the way through the program – after students completed all didactic and student lab time prior to clinicals. For some university/college programs, it was completing a particular semester or set of courses. The “final half of the program” decision was one of the first times we could see the possibilities for programs to make individual choices but remain in compliance with the Standards.
The last benchmark was placement data for graduates. Binders and lists of labs do not tell anyone about the ability of a graduate from a particular program to get a job or advance their education. Data collected on graduates getting hired or accepted to further their education gave programs (and NAACLS) better insight into the quality of preparation of future practitioners. Standard II, including these benchmarks, each program’s plan for evaluating those outcomes, and its own goals for assessment, continues to be a significant component of the Standards.
Closing the Loop
The Board of Directors rolled everything out following their September 2012 meeting, but when would programs need to start using the new materials? In the fall of 2013, NAACLS staff notified programs that their self-studies, due in the fall of 2014 with site visits scheduled for the spring of 2015, would be reviewed using the new 2012 Standards materials. That gave us at least a year to present the 2012 Standards “road shows.” These road shows were a series of workshops presented across the country.
Attendee questions mainly centered around the new benchmarks, of course, but streamlining the affiliation agreement, introducing additional sponsorship models, the clinical liaison inclusion, and the teach-out plan also received significant attention. Feedback on these topics resulted in some minor tweaks to the Standards Compliance Guide – the first example of the Compliance Guide providing an efficient way to implement outside feedback. Many of these changes continue to be discussed by program directors and refined by review committees and the board to this day.
The process was long, and aligning the documents with one another took many stabs and attempts. People joined and left the committees, and even the task forces changed over the years. The defining realization of the 2012 Standards revision was that standards are still standards; yet in today’s evolving healthcare education landscape, there must be flexibility to do things differently to achieve the same outcomes. This process works only with the effort and hard work of the NAACLS volunteers and staff.
As NAACLS celebrates fifty years of clinical education, it is only fitting that the task of revising standards begins again. It is a daunting task, given the many changes over the past ten years, but it was one of my most rewarding experiences.
It truly defined to me what collaboration meant.