Monday 9 January 2012

Assignment 1


     The following is my assessment of the Review of Programs/Services for Gifted Students prepared by the Ontario Quality Assurance Department for the Ottawa-Carleton District School Board (OCDSB) in 2001. The review of the gifted programs was part of a larger Ontario Ministry of Education project with the goal of “developing standards for each exceptionality in order to improve the understanding of what is the most effective way to provide special education programs across the province.” According to Scriven, formative evaluation "is research-oriented vs. action-oriented," and since there are few calls to action in the recommendations, this assessment is primarily formative.
      Researchers used three methods for gathering their data: qualitative (interviews with various stakeholders), quantitative (surveys completed by stakeholders), and a compilation of findings from previous research done by similar groups. I found it interesting that they invited individuals/groups to submit up to five areas of concern that they would like to see evaluated and, from these, they received 17 responses with a total of 119 suggestions, clearly exceeding the limit of five each.
      Although they did not use the exact terminology, the broad information headings suggest that the researchers were using a CIPP (Context, Input, Process, Product) model. The researchers selected the following headings to categorize the 119 responses (I have added the CIPP category that each would most likely fall into; there would obviously be some overlap):
  1. Students (Context)
  2. Budget (Context, Input, Process)
  3. Administrative Responsibility (Context, Input, Process)
  4. Facilities (Context, Input, Process)
  5. Needs of Students (Input)
  6. Staffing (Input)
  7. Qualifications (Input)
  8. In-service (Input)
  9. Material Resources (Input, Process)
  10. Delivery Models (Process)
  11. Activities (Process)
  12. Goals of the Program (Product)

What I liked about the evaluation:
  1. It was requested and done. I think that too many programs are run without ongoing assessment and evaluation.
  2. The researchers acknowledged that the recommendations for improvement would be “within the confines of responsible fiscal management.” Often government programs make great promises which are not feasible financially – this recognized the limitations that school boards face.
  3. Some data was provided (and more can be requested in hard copy but was not included in online version of the report). This gives support for the researchers' judgements. Conversely, it also enables readers to assess and challenge the conclusions and recommendations of the researchers.
  4. Specific recommendations were made in each CIPP category.

What I did not like about the evaluation:
  1. Goals of the program are vague or non-existent. Even though the researchers included a category of “Goals” when organizing responses, not one of the 119 suggestions fell into that category. I believe that assessing any program must start with alignment to goals: is the program doing what it is supposed to do?
  2. Similar to #1, the goal of assessing all exceptionalities was to ensure that “all students have access to curriculum, teaching, and learning environments, that will enable them to reach [provincial] standards.” As stated, this would virtually eliminate all gifted programs, since gifted students, by definition, already meet or exceed provincial standards.
  3. In their recommendations, the researchers did not comment on the alarming fact that 100% of the respondents wanted to see an issue in Context, Input, or Process addressed, but not one of the 119 suggestions was concerned with the Product, or outcome, of the program. This could mean that everyone is happy with the program's outcomes, but I find this doubtful since nowhere in the evaluation are the specific goals written or even referenced.

      Considering the goals of the Ministry when they requested the report, the Stufflebeam CIPP model seemed like a logical place to start. The Ministry seemed more concerned with documenting what was being done than with improving the program. Had they been more interested in improvement, the Provus Discrepancy model would have been more practical. The other model that was touched on was Scriven's: the researchers did analyze roles, but they certainly did not address goals in any significant way. Overall, the report seemed to be a political document to testify to the government's concern for gifted children without actually having to do anything more to support them.

The complete report can be found online at http://www.abcontario.ca/pdf/ocdsb_gifted_review/Gifted.pdf

Thanks,
Joel

1 comment:

  1. Joel, I agree with your analysis of the program. The most startling aspect of the report is asking for input on what should be evaluated. I understand that this makes the method more of a participatory or action research model, but it should not happen at the expense of program goals. That being said, there are a number of well-crafted recommendations for the future of the program. They also involved a wide selection of stakeholders. I am not sure if having the number of consultants and advisors on the team was a help or a hindrance.