This article is one of a 7-part series on “Technology Auditing Strategies” co-authored by Roberto Calderon, Eric Stewart and Dr. Paul Eder.

Evaluations are systematic assessments of the operations and/or outcomes of a program, intervention, or policy. They involve the collection and analysis of data and the use of the resulting information for the purpose of:

  • Making judgments about the effectiveness and efficiency of the program, intervention, or policy; and
  • Giving recommendations for areas of improvement.

Before starting an evaluation, the evaluating team should determine a methodology for conducting it. Planning should consider all circumstances surrounding the program, including budget, schedule, data availability, and the size of the evaluating team, to name a few.

Six Steps of Program Evaluation

Conducting an evaluation can be done in six main steps:

  • Establish stakeholders
  • Explain what the program is about
  • Select the design of the evaluation
  • Collect data
  • Generate conclusions based on data analysis
  • State findings and provide recommendations

In the federal government, besides establishing a methodology and following the six general steps for conducting program evaluations, evaluators must also comply with policy and standards. This set of guidelines is called the Quality Standards for Inspection and Evaluation (QSIE). Since 1993, QSIE has been used consistently by Federal employees and Federal contractors, and evaluation-related work initiated by an Office of Inspector General (OIG) is expected to refer to the standards. The updated version of the QSIE, released in January 2012, was published by the Council of the Inspectors General on Integrity and Efficiency (CIGIE) on its website.

CIGIE was established as an independent entity within the executive branch to address integrity, economy, and effectiveness issues that transcend individual Government agencies. One of its main goals is to provide manuals, guidelines, training, and quality standards for Government inspections, evaluations, audits, investigations, and operations related to Federal OIGs.

The main objective in publishing the 2012 QSIE document was to ensure that Federal agencies and Federal contractors conduct inspections and program evaluations under guidelines that emphasize the collection of factual information. A major priority was to share best practices on the level of quality and professionalism required to assess fraud, waste, and mismanagement. The following key terms are defined in the QSIE:

  • Competency: Have knowledge of different evaluation methods; the assumptions, concepts, and processes of the evaluated program; oral and written communication; and qualitative and quantitative analysis. Maintain a working relationship with the organization that owns the program being evaluated. Engage Subject Matter Experts (SMEs) when technical language or topics fall outside the evaluator's expertise. Per OIG requirements, evaluators should complete at least 40 hours of training annually.
  • Independence: Evaluating organizations should not provide services outside of the evaluation that involve managerial functions, decision making, or any work materially related to the evaluation. Evaluating organizations shall not evaluate their own work. Consider personal, external, and organizational impairments that could affect the nature of the evaluation. If the work cannot be declined when these criteria are not met, the impairments should be disclosed in the reporting document.
  • Professional Judgment: Apply professional skepticism by assessing the validity and reliability of evidence collected throughout the evaluation. Evidence shall be sought using the data-gathering methodology most appropriate to the program being evaluated. Follow organizational and ethical standards when conducting the evaluation. Findings and recommendations are delivered in an unbiased manner and in coordination with OIG staff.
  • Quality Control: Maintain an internal quality control method that operates alongside the QSIE. Documentation of that method should be retained for review against the QSIE. Supervision shall ensure that the evaluation has a plan and the plan is followed, that objectives and goals are met, and that findings and recommendations are supported by documented evidence.
  • Planning: Consider the relevance of the evaluated topic as well as the significance and impact of potential outcomes, and revisit these points continuously throughout the inspection. Coordination, research, and a work plan should all be addressed to meet this standard. Take proper precautions when dealing with classified or sensitive information. Prepare a document that states the evaluation plan, including the methodology to be used, the objectives and goals of the evaluation, the schedule of evaluation tasks and deliverables, and potential outcomes.
  • Data Collection and Analysis: Data sources should be described in detail in the supporting documentation. Information collected should be reliable, valid, relevant to the evaluation, and responsive to the target objectives. Confidentiality and security shall be enforced for all information collected. Supporting qualitative and quantitative information should be appropriately presented and documented. Data analysis should be properly presented and clearly connected to the desired outcomes, findings, and recommendations.
  • Records Maintenance: Supporting documentation should provide a record of the nature and scope of the evaluation work. Safeguarding procedures and document tracking shall exist and follow administrative and legal requirements.
  • Timeliness: Evaluations should be conducted in a timely manner, with flexibility when priorities change or external factors intervene. Interim briefings should be provided to bring significant matters to officials' attention.
  • Fraud, Other Illegal Acts, and Abuse: The following conditions might indicate a risk of fraud: lack of internal controls, inadequate safeguarding and control of resources, unfamiliar transactions that lack a record, lack of proper documentation or insufficient evidence, past reviews of findings that involve suspicious or criminal activity, or misleading/biased presentation of findings/recommendations.
  • Reporting: Documentation should be retrievable. The evaluation report shall include the scope of the evaluation, its goals and objectives, the methodology used, a statement of compliance with the QSIE, the impact and significance of the program being evaluated, findings and conclusions, and recommendations. Findings, conclusions, and recommendations should be supported by evidence, and that evidence should be documented in the report. Language should be clear, concise, and proofread.
  • Follow Up: Follow-up procedures are guided by each OIG's resolution policies. Follow-up actions should be performed to ensure recommendations are being implemented; confirmation that an agreed-upon action has been implemented allows that action to be "closed" in the evaluation. Using previous evaluations in future evaluations of the same topic is also considered a follow-up activity.
  • Performance Measurement: Focus on both outputs and outcomes. Measurement should verify that the evaluation's objectives were completed and indicate any complications. Performance indicators shall show whether targets were fully met, partially met, or not met, with an explanation.
  • Working Relationships and Communication: Maintain communication at all levels by establishing methods of communication to provide and receive regular feedback. Recognize priorities. Communication should be objective, professional, and fair.

Standards for evaluation always carry the caveat that professional judgment should be applied when finalizing any plans. While standards are important to consult, evaluators must still consider the unique context and needs of each organization. As a result, your evaluations will be grounded in Government-wide standards while being implemented in the manner most consistent with helping the evaluated organization achieve its mission.



Roberto Calderon is an associate consultant at The Center for Organizational Excellence, Inc., a management consulting firm based out of Rockville, MD specializing in organizational effectiveness, human capital, data management and information technology solutions. He has authored numerous insights on topics such as data standards and technology audits.