Assessment, evaluation, and analysis for the first year of onsite school counseling services (Year One, Program Two)
“As of 2018, the SCP (school counseling services program) as implemented with TalkifUwant expertise has shown similar outcomes across multiple years of evaluations (published here), between different districts, under different school boards, with different school administrations and teachers, and even with different onsite providers. SCP is likely one of the few programs in the US, if not the only one, even in 2018, to be evaluated in different settings, in different years, and with different providers. More unique still is the use of the digital overlay services. I’ve been a part of the SCP program from the ground up and am excited to help others along the way!”
INTRODUCTION: The following information is a compilation of data from the counseling services program that operated out of a Steinhatchee County school during one contract service year (the academic calendar of the school). From 2005 to 2018, the program expanded to include as many as six different schools in the district. The program was designed, developed, and implemented by Kurt LaRose, who was also the direct service provider for the duration of the intervention and evaluation period. In other evaluation years, similar outcomes were realized under different school district administrations, different school administrators, across demographically different districts, and with counselors other than LaRose. This data analysis and interpretation represents a multivariate compilation of information obtained from multiple informants, including youth interviews, administrative and school personnel surveys, and self-administered evaluations by the direct service provider. Data was also obtained from independent stand-alone sources such as academic records, attendance records, and report cards.
The author of this report created the survey instrumentation that was used for this analysis, while the exit interview questionnaires for the youth mirror those developed and designed by the Florida State University Multidisciplinary Center, an organization with which LaRose once worked as a counselor and therapist in various school settings. Survey instruments designed by LaRose have not been evaluated to establish psychometric properties.
The structure and organization of this assessment is divided into five general categories. It begins with a discussion of independent data sources, such as demographics, program census information, and youth attendance and absence documentation from the school and from the counseling program records of attendance; grade reports are compared at time one and time two. Second, the analysis reviews surveys of the school personnel, with an extrapolation of the information provided by the respondents who participated with, and returned, surveys. Third, the analysis discusses youth exit interviews, comparing and contrasting the strengths and weaknesses of counseling services from the client-centered perspective. In the fourth segment of the analysis, a brief scoring by the counselor who assisted the youth for the academic year is provided, looking at issues of “clinical significance” and comparing pre- and post-intervention variables based upon degrees of psycho-social functioning. And finally, part five of this analysis compares the actual cost of contract services with non-contract fees in private practice settings. Cost savings are noted, if realized.
Information in each of the five sections is generally explained using pie charts and graphs created after the raw data was transposed from original source documents and entered into spreadsheets. Each graph and pie chart includes a brief explanation that ends with transitional statements to lead the reader from one segment of the analysis to the next. Thus, the pie charts and graphs can be viewed holistically in the order they are presented and organized in this report, and/or each graph and chart can be independently viewed and interpreted by the reader as a separate, stand-alone data set. Either way, there is value in the ecological connectivity of one graph and chart to the next, both for evaluation purposes and for fluidity in reading the report; there is also value in viewing the report in random, non-linked ways, likewise for evaluation purposes.
The report ends with a summary section, addressing funding sources, evaluation and report limitations, professional and personal affiliations, collaborations, and expressions of thanks and appreciation. Contact information, website links, and other indirectly related information is found in the summary section as well.
As of 2017, the author of this report has facilitated and supervised the implementation of SCP in other districts, has consulted on program implementation in the Bahamas and in Africa, and has transitioned the program, after a 14-year history, to an in-house services model. Currently, in 2018, LaRose provides consultation services to districts seeking onsite mental health services, with fully functioning, district-branded digital services platforms for managing mental health communications about youth, distinguishing mental health records from FERPA educational records and from right-to-information requests, depending on the setting. Professional development, evaluation support, and training for onsite counselors are provided as part of onsite services.
PART I: INDEPENDENT PROGRAM STATISTICS
The Youth Referred vs. Youth Served chart reflects how many youth were referred for counseling services (blue) during the course of the contract period, how many parents gave written consent with the child’s assent (maroon), the number of those who were referred but not served (light blue), and the total number of youth served in the counseling services program (yellow). The youth who were referred but not served were those who did not return written consent forms, even when the child gave assent. Referrals were made to the counseling program via the guidance department, the principal and assistant principals, the school resource officer, teachers, and parents. Some of the youth who were referred (blue) for counseling services were not served (light blue) due to a lack of parental consent. For those youth who were served in the counseling program, demographics provide helpful information as to the general population identity of the youth seen each week.
The Demographics of Youth Served graph indicates the number of youth grouped within certain categories of race, gender, and grade level. While the grade level of the youth can usually offer indications of chronological age, the ages cannot be assumed to be consistent with grade level, particularly in a population that has been identified as in need of counseling services. The demographics of the youth served during the course of the counseling services program changed as the census fluctuated, and as such, the numbers listed in the graph are averages.
The Counseling Services Census graph highlights the shift in the number of youth who were served in the counseling services program at the program’s beginning, mid-point, and at the program’s end. The census numbers shifted during the course of the program due to the number of referrals made (highlighted earlier), but also the census shift can be attributed to terminations.
The Terminations: Voluntary & Involuntary chart shows the total number of terminations that occurred during the contract period for the counseling services program at the school. Some youth were “voluntarily terminated,” meaning the youth stopped coming to counseling for a number of reasons: moving to a new area, expulsion from the school, and/or treatment goals were achieved and counseling services were no longer needed. Other terminations were involuntary: “involuntarily terminated” means that the termination occurred for clinical reasons, that is, reasons other than external variables. Some terminations are positive, and some are negative. Program impact can also be assessed by reviewing how many participating youth DID NOT come to their weekly counseling sessions and why the no-shows occurred.
The Reasons Youth Missed Sessions graph looks at the total number of times youth missed weekly counseling sessions, and for what reason. The no-show tally does not include days when the counselor was absent, as those days automatically excuse youth from their sessions. The reasons youth missed sessions are helpful in determining if the counseling program is something the child is avoiding for reasons related to the sessions, and not some external influence. Likewise, the source for missing sessions is tracked in order to monitor the whereabouts of youth at all times. Regardless, if a youth misses sessions too often, the type of counseling session they have been placed in may need to be reconsidered. Session types, for example individual sessions, group sessions, or family sessions, are used in different ways for different youth, in order to facilitate social, emotional, behavioral, and academic success.
The Sessions Provided by Type chart includes the tally of various types of sessions that youth attended: individual sessions (blue), group sessions (maroon), and/or family sessions (yellow). Numbers in the chart reflect youth who attended the same type of session, multiple times, over the course of the contract year. For example, a child might attend a group session 30 times per year (one group each week). If six children did the same thing, the total group session count for the year would be tallied by taking 6 (children) x 30 (weeks) for a total of 180 group sessions. Similarly, if a child attended one individual session per week for 30 weeks, and four other children did the same thing, the total individual session count for the year would be tallied by taking 5 (children) x 30 (weeks) for a total of 150 individual sessions. Family sessions were held only as needed, according to the specific familial needs of some youth; thus this number for the entire year is very small. The number of times a youth attends certain session types, in addition to other support services that are given to the school, the teachers, to parents, and others, indicates what additional interventions were needed to assist youth with various services on a day-to-day basis.
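To make the tallying arithmetic concrete, here is a minimal sketch, not the program's actual tooling, of how per-type session totals like "6 children x 30 weeks = 180 group sessions" fall out of attendance records. The record layout and `child_` identifiers are invented for illustration.

```python
from collections import Counter

# Each attendance record is (child_id, session_type): six children attending
# 30 weekly group sessions, plus five children attending 30 weekly
# individual sessions, mirroring the worked examples in the report.
attendance = (
    [(f"child_{i}", "group") for i in range(6) for _ in range(30)] +
    [(f"child_{i}", "individual") for i in range(5) for _ in range(30)]
)

totals = Counter(session_type for _, session_type in attendance)
print(totals)  # Counter({'group': 180, 'individual': 150})
```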
The Daily Service Breakdown chart reflects the average number of youth served per day (over the course of the contract year) with a breakdown of the session type that the youth attended each day (group, individual, or family). The three session types and the number of youth served per day are related to one another, but they will not be equal. Because children were seen individually or in groups, and because family sessions occurred after school involving the same children who were also seen earlier in the same day, the combined averages of “individual sessions,” “group sessions,” and “family sessions” exceed the average number of “youth served per day.” The fourth column, “average support services per day,” is a daily average of a different service provision typology; this number is an average that includes meetings with principals, teachers, parents, school resource officers, guidance counselors, and case managers, and it includes counselor attendance at IEP and study team meetings, counted in 15-minute time segments. A common question in school counseling programs is whether or not youth measurably improve over the course of counseling services. Measuring success in the school setting often, and logically, leads evaluators to obtain grade differentials.
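The 15-minute segmentation of support services lends itself to a simple computation. This is a hedged sketch of how the "average support services per day" figure could be derived; the daily minute logs below are invented, since the actual values appear only in the chart.

```python
# Support contacts (meetings with principals, teachers, parents, etc.) are
# counted in 15-minute segments, then averaged over the days of service.
support_minutes_per_day = [45, 60, 30, 75, 15]  # hypothetical daily logs

segments_per_day = [minutes / 15 for minutes in support_minutes_per_day]
average_segments = sum(segments_per_day) / len(segments_per_day)
print(f"average support services per day: {average_segments:.1f} segments")
```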
The Grade Differential was established by obtaining grades from the school’s computer database of recorded quarterly grades. Grades were obtained at the first nine weeks (9-1) of the school year and then contrasted with the grades at the third nine weeks (9-3). Calculating the overall difference between time 1 and time 2 is the method by which all scores in all classes were compared. The grade differential is an averaged overall score; it does not reflect a course-by-course improvement or decline, and it does not weight core classes differently than elective classes. The selection of time 1 and time 2 is based upon three considerations: 1) 9-1 is a logical beginning assessment period, as the first nine weeks is the initial point when grades are available, 2) 9-3 is the next logical assessment period, since final 9-4 grades are not entirely posted at the end of the counseling services program, and 3) 9-3 might better reflect an internal locus of control measure for youth who either improved or declined, as the students are less likely to be motivated by a semester pass/fail scenario, which would be the case if improvements/declines were assessed at the last nine weeks of the school year. Grades serve as only one variable to consider in looking at program success. Other considerations related to program efficacy include youth exit interviews, school personnel evaluations, and counselor assessments.
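As a worked illustration of the differential described above, here is a minimal sketch assuming a simple per-student grade layout; the student names and scores are invented, and the school's actual database schema is not reproduced here.

```python
# Grade differential: each student's unweighted average across all classes
# at 9-3 minus the same average at 9-1 (no core/elective weighting).
grades_9_1 = {"student_a": [72, 80, 65], "student_b": [90, 85, 88]}
grades_9_3 = {"student_a": [78, 84, 70], "student_b": [88, 86, 90]}

def average(scores):
    return sum(scores) / len(scores)

for student in grades_9_1:
    diff = average(grades_9_3[student]) - average(grades_9_1[student])
    print(f"{student}: {diff:+.1f}")  # student_a: +5.0, student_b: +0.3
```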
PART II: PERSONNEL EVALUATIONS
DISTRIBUTION: Personnel evaluations were distributed to the school principal, who organized the logistics of survey distribution and collection. The school was able to copy as many evaluations as needed, and evaluation forms were sent electronically via email to the principal. Evaluation distribution and collection was encouraged to involve as many participants as possible, particularly those personnel who were directly involved with the youth attending weekly counseling services. The evaluations were distributed to the school principal one week prior to the ending day of the program.
COMPOSITION: The assessments consisted of seventeen items, designed to elicit both quantitative and qualitative responses. Respondents answered fourteen 6-point Likert-type questions (“strongly agree,” “agree,” “somewhat,” “disagree,” “strongly disagree,” and “unable to answer”) and two open-ended questions, with a final item that simply stated “Other Comments.”
Response sets in the charts and graphs that follow are labeled differently than they appear on the questionnaire. “Strongly agree” and “agree” responses from the evaluation were lumped into one response set for the charts and graphs that follow, just as “strongly disagree” and “disagree” were lumped into a different response set. The “somewhat” response and the “unable to answer” response remained isolated response sets, because neither agreement nor disagreement could be inferred from those two responses. One other response category in the charts and graphs, labeled “declined to answer,” was not an option on the 6-point Likert-type questionnaire, but it is included in the graphic analysis to indicate that a respondent chose (intentionally or inadvertently) not to answer a survey item. The survey distribution indicates the sample size from which the principal sought feedback on behalf of the counseling services program, whereas the survey return rate is indicative of respondents’ intention to provide feedback, and possibly can serve as a measure of program interest.
The Survey Distribution & Return Rate chart provides the number of evaluation forms that were distributed by the school principal, one week prior to the end of the counseling services program and it indicates the number of evaluation forms that were returned, one week later. The rate of return is useful in obtaining a percentage of return, in order to assess the ability of the evaluation to accurately reflect the overall opinion of the respondents. Another consideration of program success is not only the distribution and return rate, but also more specifically if the school personnel observed youth improvement during the course of counseling services.
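The two transformations described above, collapsing the six printed response options into the charted response sets and computing the return rate, can be sketched as follows. The response values and the distributed/returned counts are invented for illustration; the actual counts appear only in the chart.

```python
# Collapse rule from the report: agree-family and disagree-family responses
# are merged; "somewhat" and "unable to answer" stay isolated; a blank item
# (None) becomes "declined to answer," which was not a printed option.
COLLAPSE = {
    "strongly agree": "agree/strongly agree",
    "agree": "agree/strongly agree",
    "strongly disagree": "disagree/strongly disagree",
    "disagree": "disagree/strongly disagree",
    "somewhat": "somewhat",
    "unable to answer": "unable to answer",
    None: "declined to answer",
}

responses = ["agree", "strongly agree", "somewhat", None, "disagree"]
print([COLLAPSE[r] for r in responses])

distributed, returned = 12, 9  # hypothetical counts
print(f"return rate: {returned / distributed:.0%}")  # 75%
```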
The Youth Improvement chart highlights improvement of the youth in counseling, by asking school personnel to respond to the following statement: “The youth who were served by the program improved throughout the year.” Respondents either circled or checked their answers on a pre-printed form. The majority of those surveyed indicated that the youth improved. Another question to consider is whether or not any of the youth did worse, or regressed, during the course of the counseling program.
PROGRAM EVALUATION NOTE: When the youth improvement responses noted above are compared to the converse of this same question in the “Youth Regression” chart below, inter-item reliability can be minimally assessed. When the youth improvement item and the youth regression item are both contrasted with the “Grade Differential” chart, validity can be minimally assessed. The validity of the overall program evaluation is further indicated when staff reports of improvement and regression, along with independent grade differentials, are compared and contrasted with the degrees of psychosocial functioning reported by the counselor who was the direct service provider.
The Youth Regression chart highlights reported regression of the youth who completed the counseling program, by asking school personnel to respond to the following statement: “The youth who were served by the program worsened throughout the year.” Most of those surveyed either strongly disagreed or disagreed that youth who were served regressed. However, for youth to improve related to school counseling services, another variable to factor into the outcomes is the students’ whereabouts during the days that services were provided.
PROGRAM EVALUATION NOTE: When the youth regression responses are compared to the converse of this same question in the “Youth Improvement” chart, inter-item reliability can be minimally assessed. When the youth regression item and the youth improvement item are both contrasted with the “Grade Differential” chart, validity can be minimally assessed. The validity of the overall program evaluation is further indicated when staff reports of improvement and regression, along with independent grade differentials, are compared and contrasted with the degrees of psychosocial functioning (part IV of this report) reported by the counselor who was the direct service provider.
The Youth Whereabouts Known chart assesses whether or not school personnel believed that the counseling program monitored the whereabouts of the served youth for their attendance. As noted earlier, students missed sessions for a number of reasons, but in order to track the reasons, youth whereabouts must be monitored and recorded each time a youth is called out of class to attend weekly sessions. Respondents were asked to assess whether or not “the counselor made sure to keep the whereabouts of the youth monitored each week.” School personnel believed that the counseling program effectively monitored and reported student whereabouts, by an 8:1 ratio. If the youth improved, did not regress, and were monitored effectively, did school personnel also believe that the program was needed this year?
PROGRAM EVALUATION NOTE: If youth whereabouts are known, as suggested by this survey item, data that was maintained on the youth who missed sessions, the reasons they missed sessions, and when/if they skipped sessions, should arguably be consistent with personnel beliefs that the counseling program effectively knew where its students were on the days the counseling services program was at the school.
The Counseling Program Needed This Year chart assesses whether or not school administration, teachers, and support staff believed that counseling services were needed in the first place. It is possible for students to benefit from services, and for the services to be comprehensive, yet for the services nonetheless not to be indicated due to various other unknown considerations. To assess those possibilities, even in the absence of such details, school personnel responded to the statement “the counseling program is needed at this school [this] year,” agreeing by an 8:1 ratio. It is one thing to say that a program is needed this year, but conversely, is the program possibly NOT needed next year?
PROGRAM EVALUATION NOTE: When the Counseling Program Needed This Year chart above is compared to the question assessing whether or not the program is NOT needed next year (chart below), inter-item reliability can be minimally assessed. When the reliability of these two items is considered in connection with program efficacy noted in prior staff responses, and in review of counselor interpretations of youth progress, the overall instrument validity can be minimally assessed.
The Counseling Program Not Needed Next Year chart assesses whether or not school administration, teachers, and support staff believe that counseling services are NOT needed next year. School personnel disagreed with the statement that said “the counseling program is not needed at this school next year” by an 8:1 ratio. This outcome is consistent with the earlier finding that suggested services are/were needed at the school. But what about whether or not the school desires to have a program in their school next year?
PROGRAM EVALUATION NOTE: When the counseling program NOT needed next year chart is compared to the item assessing whether or not the program is needed this year, inter-item reliability can be minimally assessed. When reliability of these two items is considered in connection to program efficacy noted in prior staff responses, and in review of counselor interpretations of youth progress, the overall instrument validity can be minimally assessed.
The Counseling Program Desired Next Year chart looks at the wishes and desires of school personnel based upon the recommendations of survey respondents. All of the respondents expressed a desire for the counseling program next year, nine to zero, in response to this survey statement: “I would recommend that this program continue in the future.” If the program is continued next year, the next consideration is whether it contains services that seem unnecessary and/or whether it omitted services that would have been more helpful.
PROGRAM EVALUATION NOTE: When the counseling program desired next year chart is compared to other survey items that assess program efficacy, program needs, youth psychosocial changes, and service provisions, inter-item reliability can be minimally assessed. When the reliability of these items is considered in relationship to other program evaluation questions, an argument for instrument and program evaluation validity can be asserted.
The Services Provided Were Not Necessary question was an open-ended, qualitative question that invited respondents to list program variables that they believed were not important or not needed. In general, when this question was left unanswered, it suggests that the respondent did not believe any program components were unnecessarily provided. In cases where items were listed as not necessary, the comments were categorized into the “strongly disagree / disagree” response set. The item read: “I would recommend that the program discontinue (list aspects of the program that you think are not needed or that are unnecessary).” The majority of the respondents did not list program components to discontinue. The next question logically follows: are there any services that could be added in order to improve the program next year?
PROGRAM EVALUATION NOTE: This item can be correlated with two other items: “counseling program needed this year” and “counseling program NOT needed next year.” The higher the correlation among the three questions, the higher the reliability coefficient that could be anticipated.
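The report anticipates, without computing, that correlated items imply a higher reliability coefficient. One standard way to quantify this is Cronbach's alpha; the sketch below is illustrative only and is not a computation performed in the original evaluation. The numeric response codes (1 = strongly disagree ... 5 = strongly agree) and the response matrix are invented.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric Likert codes.
    Reverse-keyed items (e.g. 'NOT needed next year') should be
    flipped before scoring so all items point the same direction."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

responses = np.array([
    [5, 5, 4],  # one row per respondent, one column per related item
    [4, 4, 4],
    [5, 4, 5],
    [2, 3, 2],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # ~0.92 for this toy data
```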
The Add Services Next Year chart assesses whether school personnel believe that services could be expanded in some manner. The respondents indicated their preferences with write-in answers, in response to the statement, “I would recommend that the program add (list aspects that you think are needed for the program).” This question, like many of the open-ended questions, was usually left blank; however, when a respondent listed a service they wanted to see added, regardless of its content, the response was placed into the “strongly agree/agree” response set. The add services chart does not clearly indicate what is needed to improve the program specifically; thus a specific in-service question was needed, and was already included in the survey.
The De-Escalation In-service Needed (Next Year) chart assesses whether respondents believe they would benefit from an in-service geared toward the processes of escalating and de-escalating behaviors in the classroom. Behavioral issues are a common factor leading to counseling program referrals. Some behavioral issues may be resolvable without counseling services and/or administrative interventions, but to determine if such training is perceived as potentially beneficial, the survey item stated, “I would like the counseling services program to add a one-day workshop addressing ‘The Issues of Escalating and De-escalating Behavior in the Classroom.’” Respondents supported the statement by an 8:1 ratio. The next program variable to consider is whether or not the counseling services program was able to present in-services to the staff with a degree of success, based upon the current year’s program.
PROGRAM EVALUATION NOTE: If the current program proposes future in-service trainings to the school, the proposition would be less likely to gain support if the counseling program had failed to provide in-services successfully in its current year of service. The comparison of the proposed de-escalation in-service item to the evaluation of the in-services already provided partially addresses social desirability variables.
The Intro & Wrap Up Seminars Helpful chart assesses the value (or lack of value) of the two in-services that were provided to school personnel as a component of the counseling services program. These seminars address the logistics, legalities, and purposes of the counseling program at the school (in the Intro Seminar) and offer a forum for feedback and closure discussions with teachers and administrators about program likes and dislikes (in the Wrap Up Seminar). Of those who were able to answer the statements that “the ‘Intro to Counseling Services Seminar’ at the beginning of the year” and “the ‘Counseling Services Wrap-up Seminar’ at the end of the year” were helpful, most respondents believed the seminars were helpful, at a 6:1 ratio. The next question in the survey addressed whether or not school personnel believed overall counseling services were beneficial to them.
The Personnel Benefits chart assesses whether or not school administration, teachers, and support staff believed that the counseling services program was helpful for them – overall. It is presumptive to suggest that because children improved in social, psychological, academic, and behavioral areas that the staff inherently benefited. To assess personnel benefits based upon personnel responses more directly, respondents evaluated the statement that said: “the counseling program was helpful to school personnel.” Consistent with reports of student improvement and student regression, personnel benefits were noted by the majority of the respondents. Another program consideration is whether or not school personnel believed the on-site counselor was accessible to them, professional, and courteous.
The Counselor Professionalism & Accessibility chart is a compilation of four different questions on the personnel survey; thus the number of respondents in the chart reflects the total number of people who completed the survey, multiplied by the four questions that address issues of professionalism and accessibility. Professionalism with the staff was assessed with the statement, “the counselor was professional, courteous and cooperative with school personnel,” while professionalism with the students is reflected in the statement, “the counselor was professional, courteous and cooperative with the students.” Professionalism in communication was measured via this statement: “the counselor was professional on the telephone, in the use of email, and in other forms of communication.” The availability and accessibility of the counselor for the various people who interacted with the counselor was assessed in the statement, “the counselor was accessible each week to ask and answer questions (for teachers, administrators, and parents).” Professionalism and accessibility help in evaluating the interactional nature between the onsite counselor, the school personnel, parents, and students, but these questions may or may not address how well the program functioned and operated in a logistical manner.
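Because the chart's N is respondents multiplied by the four items, the pooling step is worth making explicit. This is a small sketch under assumed data: the survey dictionaries and answer values below are invented, not the actual returns.

```python
from collections import Counter

# Each returned survey contributes four answers (staff, students,
# communication, accessibility), so two surveys yield 2 x 4 = 8 pooled
# answers in the combined professionalism/accessibility chart.
surveys = [
    {"staff": "agree", "students": "agree",
     "communication": "agree", "accessibility": "agree"},
    {"staff": "strongly agree", "students": "agree",
     "communication": "agree", "accessibility": "somewhat"},
]

pooled = Counter(answer for survey in surveys for answer in survey.values())
print(sum(pooled.values()))  # 8
print(pooled)
```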
PROGRAM EVALUATION NOTE: In multiple school evaluations, these four questions, when lumped into one response-set graph, have fallen into the same response categories 100% of the time. The reliability issue is evident in the 100% agreement rates between survey respondents in the same survey, while the reliability and validity properties of these items are strengthened in other surveys, completed by other respondents, in different schools. Whether this 100% agreement rate continues will be known with future evaluations using the same indicators.
The Program Operation & Logistics chart indicates how well the school administration, teachers, and support staff believed the program functioned for the contract year. A component of the logistics and operation of the counseling services program is that, with the exception of referrals and follow-up, the program runs with as little interruption to the normal routine of the school day as possible, making program accommodation easier to facilitate between the school and the counseling services program. “The counseling program appeared to run smoothly” was the statement included in the school evaluation form to indicate “operation & logistics.”
PART III: YOUTH EXIT INTERVIEWS
Independent demographics and stand-alone reports, in conjunction with administrative and personnel evaluations, are helpful in the review of program success or failure. Another consideration for program efficacy can be based upon the interpretations of the youth who were the direct recipients of the services, even as others may believe the program has been helpful “for” the youth. But what do the recipients of the services themselves believe, and is their interpretation consistent with the stand-alone data, the school surveys, and the counselor assessments?
Seven interview questions were asked of the youth who were being served by the program, one week prior to the last session (most questions appear in the titles of the graphs that follow; otherwise they are listed in the accompanying summary). All qualitative responses were assigned to, and subsequently grouped by, specific and relative categories so that a tally of the interviews could be made. The youth exit interview section of this report begins with a count of those youth who completed the program at year’s end versus the number of youth who were available to answer the exit interview questions.
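The grouping-then-tallying step can be sketched as follows. Note the hedge: the report states this grouping was done subjectively by the evaluator, not by keyword matching; the categories, keywords, and sample answers below are all invented to illustrate the tally mechanics only.

```python
from collections import Counter

# Illustrative category keywords; the evaluator's real groupings were
# judgment calls on free-text answers, not string matches.
CATEGORIES = {
    "topics": ["topic", "talked about"],
    "counselor": ["counselor"],
    "missing class": ["class"],
    "expressing feelings": ["feel"],
}

def categorize(answer: str) -> str:
    for category, keywords in CATEGORIES.items():
        if any(keyword in answer.lower() for keyword in keywords):
            return category
    return "other"

answers = ["I liked the topics", "Getting out of class", "My counselor"]
print(Counter(categorize(a) for a in answers))
```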
The number of Youth in Program vs. Exit Interviews Completed chart highlights the number of those who were attending counseling sessions at the end of the program compared to the number of youth who participated in the exit interview. Some youth did not participate at the time of the interviews (one week before the program ended) due to absences. Each assessed exit interview question follows:
Youth responses ranged from liking the topics and missing class, to liking the counselor and liking a place to go each week to express feelings. The youth responses were then compared and contrasted to what they disliked, looking for themes between the two interview questions.
PROGRAM EVALUATION NOTE: The comparing of the two items of “likes” and “dislikes” about the counseling program serves to, in part, address social desirability responses. Social desirability is further addressed when youth were asked to identify what they found hard about counseling.
It’s interesting to note that while the majority of the youth interviewed indicated that they enjoyed the topics discussed in weekly sessions (see exit interview chart 2), and while none of them stated that they disliked the topics (see exit interview chart 3), in this question most of the youth indicated that it was the topics that they found “hard.” The finding suggests that the topics were somehow difficult for the youth, but that they nevertheless benefited from discussing them.
The literal question posed to the youth was “what would you like to see changed next year in counseling?” As with all of the exit interview questions, the youth spontaneously provided their own responses. Three of the youth were “unsure” (shown in orange) and two others would change “nothing.” Five others would have different (or more) activities next year.
PROGRAM EVALUATION NOTE: The similarities between this question and the question of what the youth disliked about counseling continue to provide evaluation validity based upon the inherent address of social desirability when the responses are compared.
The literal question was “what would you like to see stay the same next year?” An interesting finding here, contrasted with the previous question in which five students would change “activities” next year: in this question, five of the students reported that they would keep “activities” the same next year.
PROGRAM EVALUATION NOTE: The similarities between this question and the question of what the youth would like to see changed, in addition to the similarities already addressed when the youth answered the question of what they disliked about counseling, combine to provide further support for evaluation validity.
The open-ended, qualitative question that asked whether students learned anything about themselves drew agreement from every student that they did learn something. Not all of the students reported learning something they believed was positive (see the positive category in blue): some students commented that what they learned was negative (“negative” responses were categorized as such by the author of this report based on the content of what the student stated), or they offered self-descriptions that were derogatory (noted in maroon). The positive and negative comments often addressed internal processes and as such were linked to a category identified as “self-esteem” related.
The literal question presented to each youth read: “if you could say anything to the people who created or developed the counseling program, what would that be?” Not all of the students responded verbatim by saying it was a “good program,” but the majority made that statement or one similar to it, indicating that the program was perceived as a good one. This question was the final exit interview item.
PART IV: COUNSELOR PRE / POST EVALUATION
The counselor evaluation of the youth’s level of psycho-social functioning (pre and post intervention) is a self-administered assessment (completed by the counselor) that identifies to what degree social, occupational, and academic functioning existed on the first day of counseling services, compared to the degree of functioning on the last day of counseling services. The youth assessed were those who were in the program at the end of the contract period. The counselor evaluation is based upon clinical interpretations at the program begin date and the program end date, linked to individual case records, which include historical clinical observations, assessments, and interventions for each case for the duration of the contract period. These clinical case reviews were indexed into one of six areas in the counselor assessment.
The “counselor rating index” (CRI) is a program-specific 6-point Likert response set, developed in relationship to the Global Assessment of Functioning (GAF) scores commonly used by U.S. mental health professionals. The GAF is outlined in the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision, published by the American Psychiatric Association (DSM-IV-TR, 2000, p. 34). It is important to note that the GAF Scale was not directly used in this evaluation; rather, a trimmed-down comparative “counselor rating index” (CRI) was designed and used. The author of this report acknowledges that GAF categories are broken down into 10-point segments; the 20-point ranges (seen below in the left-hand column) mean that the author merged pairs of GAF categories for the sake of an equitable comparison with the CRI. The comparison of the CRI with the GAF Scale is highlighted below.
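The report's actual CRI/GAF crosswalk table is not reproduced here, so the following is an illustrative sketch only, under stated assumptions: it assumes a CRI of 0 to 5 in which CRI 1 through 5 map onto the five merged 20-point GAF ranges and CRI 0 is reserved for functioning below the instrument's lowest band. The real CRI was a clinical rating, not a computed conversion.

```python
# Hypothetical crosswalk: (gaf_low, gaf_high, cri). The merged 20-point
# bands follow the report's description; the CRI values assigned to each
# band are an assumption for illustration.
GAF_BANDS_TO_CRI = [
    (81, 100, 5),  # superior to good functioning
    (61, 80, 4),
    (41, 60, 3),   # report: CRI 3 or below indicates need for intervention
    (21, 40, 2),
    (1, 20, 1),
]

def gaf_to_cri(gaf: int) -> int:
    for low, high, cri in GAF_BANDS_TO_CRI:
        if low <= gaf <= high:
            return cri
    return 0  # assumed placeholder for functioning below the lowest band

print(gaf_to_cri(55))  # -> 3
```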
Psychosocial functioning addresses multiple areas of clinical concern in the provision of mental health services. Scores are not necessarily indicative of mental disorders, even when the scores (GAF or CRI) are low. Biological factors, substance use, and situational and environmental variables are useful in assessing for mental illness, but these scores are not the only variables used to do so. For the purposes of this evaluation, mental illness was not necessarily the variable measured by the CRI, though it was not excluded either; rather, degrees of functioning were measured pre-intervention and post-intervention.
PROGRAM NOTE: In October 2015, the DSM-5 (first published in 2013) discontinued its use of the GAF.
The pie chart above reveals the number of youth who began the counseling services program (pre-counseling) and their counselor-assessed levels of psychosocial functioning. It is important to note that a score of 3 or below would indicate the need for professional intervention. Youth who scored a zero were likely in need of more intensive services than those provided in the school setting. Pre-counseling numbers are useful, especially when compared to post-counseling data, to highlight psychosocial changes before and after treatment.
PROGRAM EVALUATION NOTE: Equivalency has not been assessed between the CRI and the GAF, in part because the CRI was developed for convenience and ease of use rather than adopting the well-known GAF Scale. Regardless of this equivalency limitation, when pre and post psychosocial functioning from the counselor perspective is compared and contrasted with the students’ grades at time one and time two, when the CRI functioning scores are compared and contrasted with the administration’s survey responses related to youth improvement and youth regression, and when the students’ exit interview answers about likes and dislikes are reviewed, program evaluation and instrument reliability and validity are strengthened. Further, validity is strengthened when these findings are duplicated in another academic setting, involving different youth in a different community, with a different administration.
The pie chart above indicates the changes, if any, in the counselor-assessed levels of psychosocial functioning at the end of the counseling program (post-counseling). It is possible for some youth to regress, thus lower numbers from pre to post test are not necessarily surprising. However, overall improvement percentages might be strengthened when compared to the personnel reports of improvement (by percent) in Part II of this report.
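The pre/post comparison behind the two pie charts reduces to two tallies: the distribution of CRI scores at intake and at program end, and a per-youth count of who improved, stayed the same, or regressed. This minimal sketch uses invented scores and youth identifiers.

```python
from collections import Counter

pre =  {"y1": 2, "y2": 3, "y3": 1, "y4": 4}  # hypothetical intake CRI scores
post = {"y1": 4, "y2": 3, "y3": 3, "y4": 3}  # hypothetical end-of-year scores

print("pre:", Counter(pre.values()), "post:", Counter(post.values()))

change = Counter(
    "improved" if post[y] > pre[y] else
    "unchanged" if post[y] == pre[y] else "regressed"
    for y in pre
)
print(change)  # Counter({'improved': 2, 'unchanged': 1, 'regressed': 1})
```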
PROGRAM EVALUATION NOTE: The CRI/GAF equivalency limitation, and the reliability and validity considerations described in the note following the pre-counseling chart above, apply equally to this post-counseling chart.
PART V: PROGRAM COST COMPARISON
Program efficacy is a good indication of the need for services; however, affordability is a consideration as well. The next two graphs indicate what mental health services cost in the community when provided by private practitioners using fees that the market allows, what the services actually cost under the counseling services program contract agreement, and what additional costs might be included if “support services” were provided at the full private practice billing rate. These “support services” costs were calculated based upon a daily average of the times such services were actually provided in the school during the contract period, using the private practice rate that would be charged for such support services in the private sector; to reach the daily fee, these numbers were then averaged in relationship to how many days the services were actually provided over the life of the contract. In summary, the graph is a comparison of actual contract costs to possible private sector costs. One of the dollar values (blue) is actual, whereas the others (maroon and yellow) are figurative, based upon certain community-based scenarios.
The Contract Rate vs. Private Practice Rate chart includes three columns: 1) the actual rate charged (blue) to the district for each day of service provided under the terms of the contract, 2) the average daily rate (maroon) that would normally be billed for similar services if they were provided in private practice (including an average daily amount for support services), and 3) a brief breakdown of the average daily amount for support services (yellow), based upon the average number of times that similar services were provided to the school during the existing contract period. The private rate was figured by calculating the average number of hours spent doing various tasks each day while at the school during the past year, multiplied by the hourly rate for individual, group, and family sessions, with the hourly rate for support services also factored into the private party average daily rate. Recall that “support services” was defined in Part I of this evaluation as “…meetings with principals, teachers, parents, school resource officers, guidance counselors, case managers,” including counselor attendance at IEP and study team meetings (see Part I, “Daily Service Breakdown” section). In the final estimate, the daily cost for services provided under the contract to the district is $350, whereas if the same services were provided in the private practice sector, the cost would realistically jump to nearly $700 per day. The next variable to consider in a cost analysis of the counseling services program is the annual cost difference based upon the academic year that just ended.
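The annual comparison is simple arithmetic over the daily rates. In this hedged sketch, the $350 contract rate and the roughly $700 private-sector rate come from the report, while the number of service days is invented, since the actual figure appears only in the chart.

```python
CONTRACT_RATE_PER_DAY = 350.0   # actual contract rate from the report
PRIVATE_RATE_PER_DAY = 700.0    # approximate private-sector daily rate
service_days = 30               # hypothetical count of contract service days

contract_total = CONTRACT_RATE_PER_DAY * service_days
private_total = PRIVATE_RATE_PER_DAY * service_days
savings = private_total - contract_total

print(f"contract: ${contract_total:,.0f}  private: ${private_total:,.0f}  "
      f"savings: ${savings:,.0f}")
```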
PROGRAM NOTE: In 2018, these numbers would need to be adjusted to reflect the local market, depending on the location of the school(s) and the contract fee for services required at the time of a current program implementation.
How much was the overall contract cost for the academic year (maroon)? How much would these same services cost in the private, non-contracted sector (blue)? And what is the cost savings between the actual contract amount and the comparable private party amount (yellow)?
PROGRAM SUMMARY, CREDITS, AND LIMITATIONS, ETC.
Basis for Counseling Services Program. The counseling services program was provided based upon the contents of an extensive written proposal that was directed to the superintendent of schools in the county where the services were provided. The content of the proposal for counseling services is a detailed and lengthy description of the service provisions that are/were provided to the district, the school, the students, the parents, the teachers and the administration.
Proprietary Program Aspects. The program proposal document is proprietary in the sense that the program components are explicated and detailed by LaRose, and they are unique to the program that was designed, developed, and implemented by LaRose. Thus the proposal identifies the program and labels it in its entirety as the “counseling services program,” for which LaRose is the program developer, designer, implementation administrator, and direct service provider. Treatment methods, assessment and diagnosis methods, and any of the theory on which such program aspects are/were based are not proprietary, as these are academically and professionally known, published, researched, and acquired.
Program Limitations & Strengths. While this program has been duplicated in multiple school settings, modeled in part after those that have existed for some time through FSU’s College of Social Work (at the FSU Multidisciplinary Center), the evaluation instruments used for this analysis were created and designed by the author. Where issues of reliability and validity have been considered, in spite of the absence of true psychometric assessment, and where foundational aspects of psychometrics have been incorporated into the instruments used, specific program evaluation notes have been included throughout. Most of the limitations of the instrumentation used for this analysis are listed in the appropriate item-by-item “evaluation note[s].”
Because of the infancy of the counseling services program as evaluated in this report, and in consideration of other program duplications with similar outcome evaluation results, the strengths of the program are noted throughout this report and consistently elsewhere. Repetitive, similar, and cross-community evaluations and outcomes give credence to the inputs and outputs that facilitate the overall success rate of the counseling services program and its evaluation component.
Survey Disclosures & Additional Limitations. The inherent bias of the author of this report should be considered in the interpretation of the findings noted here, and such a bias is disclosed herein. True statistical analysis has not been performed, even as foundational structuring for such an assessment is evident in the program evaluation notes that appear periodically throughout the report. Areas of limitation include:
1) Grading information, census information, and demographics are generally accepted as independent data references and as such are usually believed to be free of researcher subjectivity.
2) The administrative and school personnel surveys were distributed by the principal of the school at the request of the author of this report, and the principal used distribution and collection methods that were entirely autonomous, without input from the author. The author did not investigate how the distribution and collection methods may have factored into the survey return rate.
3) The youth exit interviews were completed face to face with the program evaluator (who is also the counselor). The youth who participated and answered exit interview questions did so voluntarily. Complicating this evaluation component is the variable of social desirability, which may be heightened given the power differential that inherently exists in the counselor/patient dynamic. Lastly, the grouping of qualitative exit interview responses was necessary in order to tally youth reports; however, the process of grouping is admittedly a subjective one. Even in relationships where rapport is fully developed and the interpretations of the interviewer are believed to be representative of youth responses (and they are/were), the evaluator’s grouping of responses may not accurately reflect the intent of the youth.
4) The pre and post self-administered CRI is based upon the subjective opinion of the author of this report, who is also the counselor who provided the counseling services program to the assessed youth. The subjective nature of self-administered surveys, not to mention those designed by the program administrator and evaluator, is an inherent limitation; however, the use of case notes could serve to mitigate bias, because case notes were recorded at the close of each session, week after week, not at the time of the evaluation. The CRI comparison to the GAF is not indicative of statistical equivalency between instruments. The self-administered pre/post youth evaluation by the counselor, with biases noted, was needed to correlate other evaluation aspects in this report and to factor into the overall equation of program success or failure issues of value in evaluation: clinical significance, practice wisdom, and psycho-social-occupational functioning that is not limited to the observable and measurable constructs that can be operationalized in the purest forms of statistical evaluation.
Reliability & Validity with Program Limitations Discussion. Equally important to mention, in addition to bias disclosure, is that this assessment was developed using the highest standards of program evaluation and outcome measures that could be reasonably and affordably developed to compile the data explicated in this report. Issues related to psychometrics have been addressed in the limitations of this report and in various program evaluation notes, and in the outcome graphs and charts of this report the highlights of the potential strengths in psychometric considerations are labeled: the potential for inter-item reliability, test/re-test reliability (between two different programs), and construct validity that is further strengthened by reliability indicators. The goal in disclosing bias and validity limitations is not to negate the findings of the evaluation or the efficacy of the counseling services program, but rather to address the potential limitations in reliability and validity so as to defuse reservations about ongoing duplication of the counseling services program in other school settings. Similar and overall positive results have been realized in other school program evaluations that have been published on the author’s website (to find research and analysis information, go to the “Site Map & Index” page for specific links). Hopefully, the limitations are addressed when the various forms of data gathering and reporting are compared and contrasted, so that collectively the symbiotic outcomes reveal the true successes and failures of the counseling services program at this school and other schools like it.
Partnerships, Collaborations, and Affiliations. The efficacy of the program was assessed for multiple reasons: 1) it is an academic and professional standard in the field of clinical social work to evaluate whether or not a program is helping the people who depend on the profession for human services interventions, 2) evaluation is necessary in order to improve, adjust, and terminate various program components, 3) if program efficacy is measured and outcomes warrant ongoing support, the counseling services program can continue to obtain increased funding, and 4) similar services can hopefully be duplicated to a more global degree as evidence-based practice becomes clearer in the counseling services program evolution process. This evaluation and analysis will also be used to further the counseling services program in multiple school settings, for as long as the services can be provided in the interests of the school districts who sponsor the services, the school settings that serve the youth enrolled in counseling services, and the students themselves, who have the most to lose, and the most to gain, if/when they succeed. All of the original documentation for this analysis and interpretation report is on file at the office(s) of LaRose, and queries related to such records can be directed to the author.
Clinical Acknowledgements. It is important to note and credit other people who have directly and/or indirectly contributed to the successful design, development, and implementation of the counseling services program. Much of the technique and methodology used in the counseling sessions was co-developed with the advice and guidance of a child services expert, Terry Abell, Licensed Mental Health Counselor, who works with the FSU Multidisciplinary Center and who trained the author of this report in direct clinical counseling practices designed for youth in the school setting. Likewise, a special debt of gratitude is extended to Alison Otter, a professional Art Therapist who has been instrumental in teaching LaRose (me) methods of reaching children with techniques that are not entirely linked to the author’s preferred (and biased-toward) cognitive/behavioral perspective. LaRose is greatly indebted to these two professionals and continues to benefit from an ever-expanding collaboration with Ms. Abell and Ms. Otter. Andrew Miller, Licensed Clinical Social Worker, provided clinical supervision for the counseling component of this and other school counseling programs offered by LaRose.
“This program evaluation, first written in June 2006, is one of the first ever completed in the 14-year history of onsite school counseling services implementation. More importantly, it (and the others like it over the years) comes from one of the few onsite school counseling programs in the United States, even in 2018, that has program evaluation data. As schools in the US are increasingly mandated to provide onsite counseling services, gun violence being one factor, this 14-year program model is turnkey for schools seeking such services. With the digital overlay program component added, and with successful transitioning of the program to various districts for program continuity, SCP (School Counseling Program) can be implemented with new or existing personnel supported by the TalkifUwant model(s) and expertise. The digital overlay can also be used on its own by districts, branded to the district, using new or existing forms in a totally paperless platform, including video/chat and remote-access sessions (homeward bound) along with face-to-face onsite and in-school sessions. Digital mental health in SCP may help resolve the overlapping and at times difficult-to-sort requirements of right-to-information requests for public records, FERPA educational records law, and HIPAA mental health privacy law. This is particularly helpful to sort out now, with experts who have gone through the growing pains of dealing with conflicting information in onsite mental health services programming, particularly in the school setting. When schools are mandated to treat youth in a mental health capacity, a lot has to be done, not only to meet one mandate but to collaborate and meet others as well. I’ve been a part of the SCP program from the ground up and am excited to help others along the way!”
Program Credit & Funding Source. The school superintendent and the county school board that provided the resources to fund the counseling services program are also to be commended. Their insight, wisdom, and goal-driven motivation to ensure that the district provides its students the services that will help them succeed in the academic setting are innovative, far-reaching, and curative. At a time when mental health funding is limited, constrained, and seasonally unpredictable, a venture that fosters mental health program implementation, in which students directly and almost immediately benefit, cannot be valued too highly or praised enough. This kind of innovation and effort to reach youth also would not have been feasible without the direct assistance and facilitation of the ESE Program Director and the accompanying support staff in that department, who all pursued the counseling services program at the district level on behalf of youth who might otherwise go without services altogether. A special expression of gratitude is extended to the district.
Logistical Support. At the local school level, the counseling services program would not exist without the input, assistance, and support of the many people who enable the day-to-day operations of the school itself. The support staff, teachers, guidance counselor(s), principal, assistant principal(s), coaching staff, and others all contributed time and energy in various ways and degrees so that troubled youth could be helped to better succeed. Such professionals are often perceived as oppositional parties to the very youth they hope to serve; in my experience, that perception is erroneous. The communication that occurs when people begin to address problems and solutions concerning the children they have in common, in partnership with various trained professionals and disciplines, is the foundational function of curative mental health services, and it holds true whenever professionals team up to meet the needs of youth. Teachers, principals, guidance counselors, and support staff are seldom paid enough for the work they do; their work is motivated not merely by the paycheck but by a passion for seeing youth succeed. “Thank you” is likely an overused term that understates the appreciation for what front-line school personnel do every day in the classroom, in the office, and out on the playgrounds, in the gymnasiums, and on the football fields.
Report Preparation and Electronic Application. All graphs were developed using standard MS Office® software, such as Excel®, Word®, and Access®. The publication of this document on the web was made possible, in part, by the web publishing features found in MS Publisher®, PowerPoint®, Excel®, and similar products. This technology is incorporated into the updated website platform for TalkifUwant.com, which as of 2018 uses WordPress, where Office-based and Google-based products make such interchanges possible.
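As a concrete illustration of the charting workflow described throughout this report (raw tallies transposed from source documents into a simple table, then rendered as a pie chart and a graph), the sketch below uses Python's matplotlib in place of the MS Office tools actually used; the category labels and counts are invented for the example.

# Illustrative sketch only: hypothetical data; the report's actual charts
# were produced in MS Excel, not with this script.
import matplotlib.pyplot as plt

# Raw counts transposed from source documents into a simple table.
categories = ["Improved", "No change", "Declined"]   # hypothetical labels
counts = [14, 5, 3]                                  # hypothetical tallies

fig, (ax_pie, ax_bar) = plt.subplots(1, 2, figsize=(8, 3))
ax_pie.pie(counts, labels=categories, autopct="%1.0f%%")  # pie chart view
ax_pie.set_title("Hypothetical outcome shares")
ax_bar.bar(categories, counts)                            # bar graph view
ax_bar.set_title("Hypothetical outcome counts")
fig.tight_layout()
fig.savefig("outcomes_chart.png")                         # export for web publication

Presenting the same tallies as both a pie chart (shares) and a bar graph (counts) mirrors the report's practice of letting each chart stand alone while still connecting to the next.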
About the Author. Kurt LaRose is licensed in Washington DC and in Florida as an LCSW and LICSW. He holds credentials as a qualified clinical supervisor, has presented at the NASW-FL conference on digital private practice, and is a former professor and an author. He provided direct services for the counseling services program in this and other schools and counties, and ultimately transitioned to program implementation services for districts needing them. A professional and personal bio can be found on the LaRose website, along with other private practice information.
The continuity of the counseling services program is determined via private contract negotiation that occurs each year between LaRose and the various county school districts. Service availability is limited by the number of schools being served simultaneously and by the availability of other counselors who will collaborate to meet demand. The counseling services program is also limited by the availability of funding.
Author Contact. Questions about this assessment, the raw and transposed data on which it is based, specific examples and/or copies of the survey instruments, and the school counseling program's design, development, and implementation, including data tracking, intervention methodologies, and supporting intervention research, can be directed to Kurt LaRose. Information regarding the counseling services program in the school setting, the assessment, diagnosis, and treatment of youth, and other program implementation, research, and evaluation in various areas of mental health and mental illness, along with the private practice methods and techniques used by LaRose, can be found at the TalkifUwant.com website. Visitors to the website are advised to use the link titled “Site Map & Index” for ease of access to information relevant to specific interests.
Limited Permission to Duplicate & Copyright Information. Permission to use the content of this report, its text and graphics, is expressly granted herein to the school district where the contract services were provided and/or to the school personnel who facilitated program processes. This limited permission to duplicate, extended only to the contract authority that funded the counseling services program identified herein, requires use and distribution of the report in its entirety; all other uses are strictly prohibited. US copyright law protects the content of this report, and the report may not otherwise be used in any manner without the express written permission of the author.
<<<<< REPORT ENDS >>>>>