In This Issue

Evidencing efficacy in graduate-level supervisory practices

An approach that describes methods of data collection and assessments

By Richard T. Marsicano, MA

Abstract: This paper provides an example of, and a rationale for, demonstrating effectiveness as a graduate-level supervisor. A five-pronged approach to data collection is discussed, and an example of a possible arrangement of supervisory data is provided.

The National Association of School Psychologists (NASP; 2000) considers supervision a process that is vital to the growth of the beginning school psychologist. In fact, practicum and internship are the first extended experiences in which students can begin to comprehensively transfer knowledge to practice.

Practicing school psychologists collect data, analyze data, and use data to make decisions. Researchers in the field evidence their contributions as scholars by presenting data (i.e., scholarly works). Teachers in the field present data in the forms of student evaluations, self-reflection, student work, and student grades. Evidencing supervision effectiveness is no different. Consider the following dialogue:

  1. A: What are your strengths and weaknesses? 
    B: My strength is that I am a hard worker. This can also be my weakness because sometimes I just end up working too much! I am working on establishing better boundaries.

  2. A: How would your last coworkers/boss describe you? 
    B: I’m not a mind reader, but hopefully as hard working and easy to get along with.

  3. A: Have you ever supervised others? Tell me about your supervisory experience. 
    B: Well, I’m a people person…

Based on this exchange, it is likely safe to assume that person B is not getting the job. Few situations in life require you to demonstrate your value in a smaller allotment of time; a job interview requires you to stand out. While the first two answers – or some variant of them – are firmly entrenched in our brains, the last question requires a bit more thought and preparation (okay, a lot more). As we transition from the student role to that of a professional, it is imperative to be able to answer these important questions. Whether your future holds a faculty position, practice in a school, or an outside consulting or clinical role, there is one commonality: the supervision of others. The ability to present valid, meaningful supervision outcomes is vital to demonstrating your skills as a supervisor. How do we do this?

Data, Data, and more Data!

An axiom I have adopted is that it is better to have too much data than too little. This is not meant to condone meaningless data collection; rather, having data from multiple sources can often increase the power of your efficacy conclusions. In accordance with best practices, conceptualizing assessment as an ongoing, wide-ranging system yields more accurate and reliable data than one-time measures (Prus & Waldron, 2008; Waldron & Prus, 2006). But how do we collect these data?

Use Pertinent and Sensitive Assessments

When deciding on a means of assessment, consider three things. First, spend some time deciding which specific skills epitomize your model of supervision. Consult the research and your program's internship assessment form(s) for guidance. Second, consider the scale of your assessment measures. A true-or-false analysis of your supervision style won't cut it; absurdity aside, a method of assessment that is insensitive to change will not give you an accurate depiction of your supervisory skills. Third, decide on your sources of information. For instance, consider this five-pronged approach:

  • Supervisor self-evaluation 

  • Supervisee self-evaluation 

  • Supervisor evaluation of supervisee 

  • Supervisee evaluation of supervisor 

  • Student outcomes

Whereas examining each of these sources in isolation yields only tentative conclusions, combining them makes a more convincing argument. For example, presenting data that show supervisee skill improvement is good. Adding supervisee self-evaluation data showing the same improvement is great. Adding improved student outcomes is the icing on the proverbial cake: it is the value-added piece that demonstrates socially significant outcomes. Combine these sources of data (see Figure 1) to form a data-driven, cogent evaluation of your supervisory skills.

Use Formative Assessments

Banta (1999) defined assessment as "the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student [i.e., supervisee] learning and development" (p. 4). While this definition was intended to refer to assessment in general, one could argue that it pertains especially to formative assessment. In contrast to summative assessment, which is used to judge content mastery, formative assessment is given to improve performance. Gathering frequent assessment data from multiple sources also adds a measure of reliability. Tables 1 and 2 present supervisor and supervisee evaluation summaries that can be used to reliably show improvement over time.

Figure 1: Methods of Assessment


Table 1: Supervisor Evaluation



                                                                 Fall            Winter           Spring
My supervisor/I...                                            Spvse   Self    Spvse   Self    Spvse   Self
Provided appropriate support during assessment                  6       6       7       7       8       8
Provided appropriate support during code development            7       7       8       8       9       9
Provided appropriate support during intervention planning       5       5       7       6       8       9
Provided appropriate support during parent meetings             7       6       6       7       8       9
Provided appropriate support during teacher meetings            6       7       7       9       7       9
Provided appropriate support during technical adequacy writing  8       8       8      10       9      10
Responded to emails promptly                                    9       5       8       9       9      10
Provided relevant info or person who could answer questions     6       9       9       9      10       9
Is approachable                                                 7       6       9       7       8       8
I'm satisfied with the support I received/provided              6       7       7       6       9       8
Overall                                                        6.7     6.6     7.6     7.8     8.5     8.9

Note. Ratings were given on a scale of 1 to 10, where 1 = strongly disagree and 10 = strongly agree. Spvse = supervisee rating of the supervisor; Self = supervisor self-rating.
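
The Overall row of such a summary is simply the arithmetic mean of the ten item ratings in each column. As a minimal illustration (not part of the original article; the variable and function names are hypothetical), a short Python sketch using the Fall ratings from Table 1:

```python
# Hypothetical sketch: computing the "Overall" row of a supervisor-evaluation
# summary as the mean of the item ratings collected each term.
# The ratings below mirror the Fall columns of Table 1.

fall_supervisee = [6, 7, 5, 7, 6, 8, 9, 6, 7, 6]  # supervisee ratings of supervisor
fall_self = [6, 7, 5, 6, 7, 8, 5, 9, 6, 7]        # supervisor self-ratings

def overall(ratings):
    """Mean rating across all items, rounded to one decimal place."""
    return round(sum(ratings) / len(ratings), 1)

print(overall(fall_supervisee))  # 6.7
print(overall(fall_self))        # 6.6
```

Repeating this per term (Fall, Winter, Spring) and per source yields the Overall rows shown in Tables 1 and 2, making the trend over time easy to read at a glance.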
Form a Plan to Address Weaknesses

Why collect data if you’re not going to use it? An honest self-evaluation prior to becoming a supervisor should provide an initial indication of possible areas needing improvement. Once you begin to analyze your data, develop a specific action plan targeting your weaknesses. Aside from the obvious benefit of improving your skills, this will allow you to demonstrate your commitment to improvement. Speaking of demonstrating, let’s try that again:

  1. A: What are your strengths and weaknesses?
    B: As indicated by my data, my biggest strengths are ____________. I was told some time ago that I needed to work on ___________. I feel I have addressed this issue by…

  2. A: How would your last coworkers/boss describe you?
    B: As a supervisor, I value how the people I supervise perceive my competence. According to past supervisees I am ___________. This compares to my self-analysis by ______________.

  3. A: Have you ever supervised others? Tell me about your supervisory experience.
    B: Yes. I was able to improve my skills, my supervisee’s skills, and make meaningful differences in the lives of students. According to my data…

Whom would you hire?

Table 2: Supervisee Evaluation



                                                                                                                           Fall        Winter      Spring
Self-assessment statement                                                            PES statement(s)                   PES  Self   PES  Self   PES  Self
Demonstrates knowledge of scientifically-based instruction/intervention approaches   1.1; 1.3; 1.4; 2.1; 2.7; 3.1; 3.6    3    4      5    4      7    6
Demonstrates knowledge of code development and selection of key targets              1.16                                 4    3      5    5      8    7
Demonstrates knowledge of scientifically-based approaches for PBS                    1.2; 2.2; 3.2                        3    5      5    6      7    8
Demonstrates knowledge of data-based decision-making practices                       1.5; 2.3; 3.3                        4    4      6    5      9   10
Applies concepts of technical adequacy in decision-making                            1.6; 2.4; 3.8                        3    3      5    4     10   10
Plans, engages in, and evaluates staff development activities                        1.7                                  5    4      4    5      7    9
Demonstrates use of information technology for activities                            1.13; 2.8; 3.10                      3    5      5    7      6    7
Demonstrates strong writing skills                                                   5.16                                 2    4      6    3      7    8
Demonstrates use of collaboration and involvement of key stakeholders                2.5; 3.4                             4    3      5    5      5    7
Personal and professional behavior                                                   All section 5 statements             3    3      7    7      6    7
Overall                                                                              n/a                                 3.4  3.8    5.3  5.1    7.2  7.9

Note. PES = Program Evaluative Statement. Self-assessment statements were created by the author to correspond with programmatic evaluative statements. Ratings were given on a scale of 1 to 10 for all areas, where 1 = not competent, 5 = minimally sufficient competence, and 10 = mastery.


References

Banta, T. W. (Ed.). (2004). Hallmarks of effective outcomes assessment. San Francisco, CA: John Wiley & Sons.

National Association of School Psychologists. (2000). Standards for training and field placement programs in school psychology. Bethesda, MD: Author.

Prus, J., & Waldron, N. (2008). Best practices in assessing performance in school psychology graduate programs. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 1943-1956). Bethesda, MD: National Association of School Psychologists.

Waldron, N., & Prus, J. (2006). A guide for performance-based assessment, accountability, and program development in school psychology training programs (2nd ed.). Bethesda, MD: National Association of School Psychologists.

About the Author

Richard Marsicano is in his fourth year as a doctoral student in the University of Cincinnati school psychology program. All correspondence pertaining to this article may be emailed.