Saturday 14 January 2012

ECUR809.3 Assignment #1

After some struggle selecting a completed evaluation to review and analyze, I finally decided upon one that concerns a particular special education program.  The evaluation, “An Evaluation of the Structured Success Program: From the Students’ Point of View,” centers upon an alternate education program offered and implemented in Saskatoon.  Once I stumbled upon the article, it piqued my interest because the Structured Success Program is also offered within the school division I am currently working for and is run out of the building where I teach.  The program is an alternate program for students who have serious behavioral and/or emotional disorders (B/ED).  Students who are unable to operate within the regular classroom environment, and whose behaviors are a significant disruption to the learning environment for classmates, are placed in the Structured Success Program.  The goal of the program is to teach and model appropriate behavior and social skills so that the students will eventually be integrated back into the regular classroom setting.
Although the evaluation does focus somewhat on the overall student perception of the Structured Success Program, the title is a little misleading.  A considerable amount of the evaluation is focused on the students’ perception of their classroom environment and its relationship with their self-concept.  The evaluation aimed to examine these two perceptions and then compare the findings with results from the regular classroom.  The aim of the comparison was to fill a gap in the research on the relationship between classroom environment and self-concept for students with B/ED.  This aim does still link with the overall purpose of the evaluation, but the connection between the two is not explicitly stated.  The comparison is made, it seems, because the goal is to get an overall understanding of students’ perceptions of their classroom environment and their self-concept, as well as students’ general impressions of the program.
The methods used to evaluate students’ perceptions and impressions of the Structured Success Program (S2P) were all student-completed surveys and questionnaires.  To determine the students’ perception of their classroom environment and self-concept, the CES (Classroom Environment Scale) and MSCS (Multidimensional Self-Concept Scale) were used.  More specifically, 17 students currently enrolled in the S2P from five schools in Saskatoon participated in the study.  The subjects were all males ranging from 11 to 14 years of age; eleven were enrolled full time in the program and six only part time.  The length of time that these students had been enrolled in the program also varied, from 3 to 27 months.  The data collection for the evaluation was completed in two phases.  During phase one, participants completed the two self-report questionnaires (CES and MSCS).  The results from these two tests were then compared with results from students in regular classroom environments.  In the second phase, students participated in semi-structured interviews with the author of the evaluation.  The interview included some limited-response questions as well as more open-ended questions that required more detailed responses from the participants.  The aim of the second phase was to ascertain the general perceptions and sentiments of students in relation to their enrollment in and experiences within the program.
Perhaps the most evident strength of the methods employed is simply that they match the intended purpose: to provide information on students’ perceptions of their classroom environment, their self-concept, and the program in general.  The goal was to show how regular classroom students’ perceptions of environment and self-concept compare with those of students with B/ED, and the administering of the CES and MSCS serves that purpose.  The evaluation also included an overview of the similarities and differences found in comparing the two sets of results.  The comparison is presented in a clear, concise manner, and the reasons for the variances in responses are also explained.  The nature of the interview process likewise aligns with the goal of providing insight into general perceptions of the program.  The open-ended nature of the interview questions allows students to provide detailed, personal responses about their experiences in the program.  In considering alternative formats for the open-ended responses, the evaluator also notes, rightly so, that a focus group format would likely have proved ineffective with this group of students.
Although the overarching purpose of the evaluation seems to be achieved at first glance, there are some weaknesses and limitations.  The nature of the S2P student lends itself to general inaccuracies in attempting to establish empirical data on which to base conclusions.  The fact that these students usually harbor a general resentment for testing and mandatory work means there is a possibility that students were not really thinking about the questions as they answered them, but simply trying to “get it over with.”  The evaluation itself provides evidence to suggest that this may very well be the case.  The interview phase of the data collection only included 5 of the 17 participants.  The evaluator could not get the other students to participate in this section of the evaluation, and also had to offer a $10 incentive just to get those 5 to complete the interview.  The question here is an obvious one: how can you draw conclusions about a program from the responses of 5 out of 17 participants?  For that matter, 17 participants isn’t exactly a large number to base conclusions upon either.  The small number of participants in the data collection process is an obvious weakness, one that the evaluator chooses to mention as well.
The number of participants is a concern, but in addition, the participants were also all males.  This is troubling, as it seems plausible that the perceptions of females in the program may vary from those of their male counterparts.  Beyond the problems with numbers in the interview portion of the evaluation, there is also little information offered about the purpose of the interviews or what the information will be used for.  The evaluator states that the interview process provided students with the opportunity to offer their opinions and experiences concerning the program, but to what end?  What will this information be used for?  These questions are not addressed.  The conclusion notes that understanding students’ perceptions within special education programs is needed because it may have implications for the future development and redevelopment of those programs.  It seems, then, that despite some limitations, this evaluation still provides some research and conclusions in an attempt to shed light on an underrepresented group of students.


Source:
Da Silva, Trudy A. (2003). "An Evaluation of the Structured Success Program: From the Students' Point of View." Canadian Journal of School Psychology, 18(1-2), 129.  http://cjs.sagepub.com/content/18/1-2/129.

1 comment:

  1. Hi Gemma

    Interesting choice. Helpful for you to gain more understanding around the program in your building. I think the report you share is indeed full of holes. It does not include a comprehensive enough sample of those impacted by the program. If the main purpose is to hear from those in a program, you truly need to hear from those in the program. I am curious as to why no parent data was collected to support the other findings. I am also, like you, suspicious of results where people are paid and response rates are dismally low. Are there any recommendations for modifying or improving the program? Not including any advice really defeats the purpose of calling the report a program evaluation.