
Dedication and Commitment: "The Guts to Do More"

By Douglas Cline posted 06-01-2018 10:06

  

As instructors, we have a duty to provide the highest quality of service and instruction. We must be our students' inspiration to strive for excellence. The question is: just how dedicated and committed are you? Stop and take a look in the mirror, because the future of the fire service rests on today's instructors' shoulders.

Organizations, and instructors themselves, must determine how well training is being delivered. There are a variety of methods to carry out this task: students can provide feedback through head nods, smiles and reaction questionnaires; subject matter experts or senior trainers can audit sessions; post-test scores and on-the-job performance evaluations can be reviewed; and instructors can conduct self-evaluations and peer assessments.

The optimum time to evaluate an instructor's work is while he or she is actually delivering instruction, so direct observation is recommended. However, observation is only effective if it is driven by objective, comprehensive, reliable and accurate standards.

Follow these steps to create and use instruments to evaluate the delivery of training:

Step 1: Identify and define the objectives of the evaluation, and determine the instrument type(s) to be used in gathering assessment data. Determine why the evaluation is being conducted: it may be to give an instructor feedback on a specific delivery problem, to evaluate an instructor's overall competence during the delivery of a program, or to identify specific circumstances of a delivery that may require the redesign or modification of materials and logistics.

Step 2: Consider how the information will be summarized and to whom it will be reported. Evaluation data can serve many purposes and can be interpreted in many different ways, so clear decisions must define why, when and from whom data is being collected. It is also important to evaluate what information is collected and how it relates to the original objectives of the evaluation.

Step 3: Identify and define the specific competencies and performance to be measured. First, determine which competencies will serve as the basis of the evaluation; a detailed evaluation typically involves no more than three competencies, while a more general evaluation may cover many. Second, the objectives of the evaluation must be clearly specified so that both the evaluator and the instructor being evaluated understand what is being measured.

Step 4: Determine the sources of data. Evaluation data can be obtained from a number of sources. The most common are evaluations by trained evaluators, co-instructors, peers, learners, clients and training managers or other responsible parties, along with self-evaluations. It is important to remember that the varying skill levels of the evaluators can influence the data.

Step 5: Write the questions. For quality control, every question must be linked to a specific, desired outcome of the evaluation. As questions are written, control the specificity or generality of each item; these controls are essential to keep the evaluation instrument practical, manageable, reliable and valid.

Step 6: Design the format and layout of the instrument. Evaluation instruments must state clearly and concisely what is to be measured, contain unambiguous directions for use, and present the questions or items to be evaluated in a logical order. Finally, instruments must be user friendly: easy to read and use, with enough space to document answers and observations.

Step 7: Pilot-test the instrument, and obtain feedback. Before a document is used for program evaluation, pilot-test it so that others can provide feedback on the instrument's adequacy and usefulness. The pilot test helps evaluators determine how well the instrument's design and layout meet the desired objectives and whether it produces the desired results. Because instrument development is time-consuming and costly, it is imperative to evaluate the tool to ensure it will provide the best information possible.

Step 8: Create the final instrument, and implement the evaluation. The final instrument must provide the data needed to ensure that the training achieves its objectives, or job performance requirements. Instruments may be used to assess a variety of training aspects: the instructor's performance, logistics, the usefulness of instructional methods, course materials and content, media, and the design adequacy of courses.

Effective fire service organizations must recognize their responsibility to assist in the professional development of their instructors. Instructors, too, must recognize that they have areas needing development. As the leaders of the fire service, we as instructors must have the guts to do more. We must set a precedent for the future, and we begin that precedent with the instructor in the mirror. We have an obligation of dedication and commitment to educating the future of the fire service.

 

Article originally published in Volume 29, Issue 6 (June 2000) of "The Voice".
