Introduction
Policy AC23 requires that the evaluation of teaching effectiveness for purposes of promotion and tenure be based on both peer review input and at least two types of student input. Policy AC40 states that both annual and extended performance evaluations are based on the same criteria as used in tenure and promotion decisions and therefore require both peer review input and at least two types of student input for the teaching effectiveness criterion. The Altoona College Guidelines for the Promotion of Full-time, Non-Tenure Track Faculty requires evidence of teaching excellence that should include peer review input and two types of student input.
Despite the emphasis of these documents on evaluating teaching effectiveness using multiple measures, some faculty may worry that too much emphasis is placed on just one of the required measures: the Student Rating of Teaching Effectiveness, or SRTE. These concerns have merit: student rating forms have limitations and, by themselves, are sufficient neither for personnel evaluations nor for the improvement of teaching. Moreover, the SRTE is not specifically designed to provide information about enhancing the effectiveness of instruction, and twenty years of scholarship have demonstrated that numerical student ratings are influenced in both positive and negative directions by a number of variables, making comparisons across broad groups of faculty difficult and potentially inaccurate or misleading.
The driving philosophy behind these guidelines, therefore, is to resituate the SRTE as one of many measures for evaluating teaching effectiveness. These guidelines outline a number of methods for obtaining information on teaching effectiveness, the procedures for gathering and reporting that information, and some suggestions for interpreting and weighing such information.
Peer Review of Teaching
Peer reviews of classroom sessions and curricular materials are especially important because student ratings alone do not provide sufficient evidence of the extent of student learning in a course. In addition, peer reviews are better suited than numerical student ratings to offering suggestions for the improvement of teaching. Therefore, peer reviews should focus on providing the type of feedback that cannot easily be obtained or reasonably interpreted from other measures. This feedback should consider such issues as the pedagogical soundness of teaching approaches, apparent instructor expertise in the subject area, the quality of student work, the quality of faculty/student interactions in the classroom, and areas for improvement.
Whenever possible, peer review should be conducted by faculty of the same discipline. To achieve this goal, divisions may seek peer review committee members from outside the Altoona College.
For faculty undergoing tenure review, policy AC23 requires that the evaluation of teaching effectiveness be based on both peer and student input. For provisional faculty (i.e. tenure track faculty who do not yet have tenure), peer reviews of teaching will be conducted for the second- and fourth-year reviews and for the tenure review, and during other years at the faculty member’s request. A peer review letter will be included in a candidate’s tenure and promotion dossier and could also be considered at a faculty member’s annual review. Each division has its own procedures and guidelines, developed by the faculty in the division, for handling this requirement. Divisional guidelines should be reviewed and updated periodically and should be posted in a location accessible to all divisional faculty. It is the Division Head’s responsibility to ensure that faculty serving on peer review of teaching committees are informed of the divisional guidelines and procedures for peer review of teaching.
For full-time, non-tenure-track faculty undergoing promotion review, Altoona College guidelines require an evaluation of all Faculty Annual Reports, including SRTE data, for the eight or more academic years prior to the review, as well as “a representative number of peer evaluations.” Faculty under review should plan in advance to obtain a peer evaluation of teaching in the semester prior to the promotion review; peer evaluations may also be obtained in other years at the faculty member’s request. Each division has its own procedures and guidelines, developed by the faculty in the division, for handling this requirement. Divisional guidelines should be reviewed and updated periodically and should be posted in a location accessible to all divisional faculty. It is the Division Head’s responsibility to ensure that faculty serving on peer review of teaching committees are informed of the divisional guidelines and procedures for peer review of teaching.
For post-tenure faculty, the AC40 extended review process requires an evaluation of all SRTE data collected since the last review, but faculty should consider gathering and submitting additional materials relevant to a teaching evaluation. (See sections B and D for suggestions on additional materials.) Faculty who desire other forms of teaching assessment should work within the appropriate procedures for their divisions and plan in advance to obtain a peer evaluation of teaching in the semester prior to the extended review. If the faculty member under extended review requests peer involvement per the Altoona College AC40 guidelines, then all teaching materials should also be submitted to the divisional promotion and tenure committee.
“Second Form” of Student Evaluations of Teaching
Policy AC23 requires that the evaluation of teaching effectiveness for purposes of promotion and tenure be based on both peer input and at least two types of student input. The Altoona College Guidelines for the Promotion of Full-time, Non-Tenure Track Faculty requires evidence of teaching excellence that should include peer review input and two types of student input. Of those two types of student input, one specific measure—the SRTE—is mandated by the University for use across the entire University. The other type of student input—often called “the second form”—is left up to the discretion of individual units.
Each division should have its own procedures and guidelines, developed by the faculty in the division, for handling this “second form” requirement. Divisional guidelines should be reviewed and updated periodically and should be posted in a location accessible to all divisional faculty. It is the Division Head’s responsibility to ensure that faculty in the division are informed of the divisional guidelines and procedures for handling this “second form” requirement.
AC23 mentions three possible choices for “second form” evaluation that divisions can discuss as they develop divisional policies.
- Written evaluations administered with SRTEs. Faculty may use a combination of division-approved forms for written student comments to be handed out at the same time the SRTEs are administered (e.g., a lab-specific form for lab sections, a discussion/seminar-specific form for discussion and seminar courses, a lecture-specific form for lectures). The Schreyer Institute provides a number of possible forms that can be used for comment collection or adapted as templates; appendix 1 also offers an example of a written evaluation form. Whatever form(s) the divisions choose to approve, after administration, the forms with written student comments should be placed in an envelope separate from the SRTE forms and then delivered to the appropriate staff assistant’s office. Divisional policies should address appropriate methods of document storage and of comment sharing with faculty after course grades have been submitted.
- A major advantage of this “second form” method is that comment collection can occur routinely during any or all administrations of the SRTE, providing faculty with more feedback that can help improve teaching and create a more helpful context for the useful interpretation of SRTE results.
- This method generally has the highest rate of return for student input among the three methods under discussion here.
- This method also provides non-tenure-line faculty (who currently often receive minimal if any additional teaching assessment beyond SRTE results) with more teaching feedback, which can be useful for personal development and during annual review meetings.
- Letters/emails soliciting evaluations from former students, representing a range of courses taught during previous semesters. As per divisional policy or at the request of the faculty member under review, the Division Head will solicit letters from a random sample of students who have completed a course taught by the faculty member. (Students undertaking work with the faculty member during the semester the solicitations go out should not be contacted.) The Division Head will work with the faculty member to ensure that letters are solicited from a wide range of students. Clear divisional policies should describe who receives the letters/emails and how a summary/assessment of the letters/emails is generated. (A brief illustrative sketch of the sampling step appears after this list.)
- Exit interviews. As per divisional policy or at the request of the faculty member under review, the Division Head and the divisional peer review of teaching committee may arrange for formal interviews with students at the end of the semester. Clear divisional policies should describe who attends these interviews and how a summary/assessment of the interviews is generated.
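For divisions that wish to formalize the random-sampling step described above, the following is a minimal illustrative sketch, not a prescribed tool. The function name, student identifiers, and sample size are hypothetical; only the two requirements from the text (sample from former students; exclude students currently working with the faculty member) come from these guidelines.

```python
# Illustrative sketch only (not part of divisional policy). Draws a random
# sample of former students to receive solicitation letters/emails. All names
# here are hypothetical; the requirements taken from the guidelines are
# (1) sample from students who have completed a course with the faculty
# member, and (2) exclude students working with the faculty member during the
# semester the solicitations go out.
import random

def sample_former_students(completed, current, k, seed=None):
    """Return up to k randomly chosen former students, excluding current ones."""
    eligible = sorted(set(completed) - set(current))  # sorted for reproducibility
    rng = random.Random(seed)
    return rng.sample(eligible, min(k, len(eligible)))

# Example usage with hypothetical student IDs
former = ["s101", "s102", "s103", "s104", "s105"]
current = ["s103"]
print(sample_former_students(former, current, k=3, seed=42))
```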
The Student Rating of Teaching Effectiveness (SRTE)
The University mandates the use of the Student Rating of Teaching Effectiveness (SRTE), an in-class rating survey for student evaluation of teaching. This survey may be supplemented by other forms of student evaluation at the discretion of the faculty of the unit (see sections B and D).
The SRTE is a “cafeteria” system with a fixed pool of items from which divisions and disciplinary units select items most appropriate to their courses. It consists of three sets of questions—a University core, a departmental core, and individual faculty items rating the quality of the course and the quality of the instructor.
- The University core consists of two global questions that are included in all survey forms, asking students to give an overall rating of the course and an overall rating of the instructor.
- The departmental core consists of five to fifteen additional items from the pool, selected by the faculty of the division and/or disciplinary unit. These items should be selected to reflect the nature of the discipline, type of class, and other factors the divisional/disciplinary faculty deem to be appropriate. A single disciplinary unit can have up to three different forms for different course types, i.e., discussion, lecture, or lab courses.
- A Schreyer Institute report of recommendations for use of the SRTE states that only 32 questions out of a pool of 177 receive regular use. Divisions and disciplines are strongly encouraged to review the SRTE question pool in order to make the SRTE a more responsive and useful tool. Teaching methods have changed considerably over the last fifteen years, generally shifting away from lectures toward more collaborative learning practices and greater use of technology, but most units have not updated their SRTE section B questions to reflect those changes.
- Individual faculty members may add up to five additional items from the pool to supplement the two global questions and the departmental core.
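As a purely illustrative aid, the composition rules just described can be summarized in a short validation sketch. The function and item names below are hypothetical, and the sketch checks only the item counts; it does not model the up-to-three-forms-per-unit allowance.

```python
# Illustrative sketch only: checks that a proposed SRTE form follows the
# composition described above (two University-core global items, five to
# fifteen departmental-core items, and at most five individual faculty items).
# Function and item names are hypothetical.

def validate_srte_form(university_core, departmental_core, individual_items):
    """Raise ValueError if the proposed item counts violate the SRTE structure."""
    if len(university_core) != 2:
        raise ValueError("University core must be exactly the 2 global items")
    if not 5 <= len(departmental_core) <= 15:
        raise ValueError("Departmental core must contain 5 to 15 items")
    if len(individual_items) > 5:
        raise ValueError("Faculty may add at most 5 individual items")
    return True

# Example with hypothetical item IDs drawn from the fixed question pool
validate_srte_form(
    university_core=["A1", "A2"],
    departmental_core=["B10", "B12", "B31", "B44", "B52"],
    individual_items=["C3", "C7"],
)
```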
Appropriate controls for the confidentiality of information shall be implemented by all units in distributing, collecting, and maintaining the surveys.
Results of the SRTE surveys shall belong to the unit which administers them, not to the individual faculty member who was rated. Results shall be returned to the Division Head for inclusion in tenure and promotion dossiers and for use in faculty annual review discussions. The faculty member shall be furnished with a copy of all survey results.
Administration of the SRTE
Administration guidelines for the SRTE are published with the survey form. According to AC23, specific procedures for the administration and collection of surveys are to be developed by the faculty of the unit. These guidelines establish the specific procedures for the administration and collection of surveys for Altoona College.
- Responses to survey items must remain anonymous.
- The SRTE should be administered during the last two weeks of a course but not in the same class period in which a test is given. Faculty are cautioned against waiting for the last class meeting, due to the possibility of class cancellation (especially during the fall semester) and generally lower course attendance on the last day.
- Faculty should plan SRTE administration in order to maximize returns as much as possible. As a general goal, at least two-thirds of the registered students should be present for the evaluation to be administered in order to achieve valid results. According to Dr. Angela Linse, Executive Director of the Schreyer Institute for Teaching Excellence, small classes (fewer than twenty students) should aim for an 80% return rate in order to maximize validity, while very large classes (more than one hundred students) can obtain valid results with a minimum 50% return rate.iii (A brief illustrative sketch of this response-rate check follows this list.)
- Students may need to learn how to give careful evaluations of teaching. Faculty who teach first-year courses, particularly First-year Seminar as a stand-alone course or as a course component, should consider developing class activities or tutorials with the goal of training students to become careful evaluators.
- Directions to students should be uniform across administrations of the survey. Faculty should consider the use of a scripted statement to convey the importance of ratings and thoughtful feedback during the administration of the SRTE. For an example of such a scripted statement, see appendix 2.
- When faculty members provide directions to students regarding the SRTE, they should reiterate that students are expected to complete the SRTE based on their individual perceptions and experiences. Students should not use the SRTE administration session to norm experiences or to persuade classmates to rate an instructor in a particular way.
- The course instructor shall not be present in the classroom while students are completing the evaluation, but other individuals (such as another faculty member with no connection to the course) can be present to proctor the administration.
- The responses should be collected and returned to Instructional Services by the person administering the evaluation (student volunteer or faculty proctor).
- The course instructor shall not participate in the collection or compilation of the survey results.
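The response-rate targets mentioned above lend themselves to a simple check. The sketch below is illustrative only; the function name and strict boundary handling are assumptions, and only the 80%, two-thirds, and 50% figures come from these guidelines.

```python
# Illustrative sketch only: checks whether an SRTE administration met the
# response-rate targets quoted above. Only the 80% (fewer than twenty
# students), two-thirds (general goal), and 50% (more than one hundred
# students) figures come from these guidelines; everything else is assumed.

def meets_response_target(enrolled: int, respondents: int) -> bool:
    """Return True if the response rate meets the suggested validity target."""
    if enrolled <= 0:
        raise ValueError("enrolled must be positive")
    rate = respondents / enrolled
    if enrolled < 20:        # small classes: aim for an 80% return rate
        target = 0.80
    elif enrolled > 100:     # very large classes: 50% minimum
        target = 0.50
    else:                    # general goal: at least two-thirds responding
        target = 2 / 3
    return rate >= target

# Example: the Appendix 1 course (32 respondents of 60 enrolled, about 53%)
print(meets_response_target(60, 32))  # False: 53% falls short of two-thirds
```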
Frequency of reviews
According to AC23, the specific procedures for determining the frequency of reviews for the faculty members within a unit shall be determined by the college. Such procedures must be developed in consultation with the faculty of the college. These guidelines endorse the following principles about the frequency of reviews:
- Whenever possible, evaluations for all faculty members should be conducted over a period of years and in a variety of courses.
- For provisional faculty (i.e. tenure track faculty who do not yet have tenure), all sections of all courses shall be evaluated by the SRTE every semester the faculty member teaches. The results from each of these evaluations must be included in the candidate’s tenure dossier.
- If there is some reason to explain the absence of results in a particular case, the appropriate Division Head shall make a note to that effect in the dossier. For example, in advance of a course being taught for the first time in an experimental way, a Division Head and a faculty member might agree not to include SRTE results in tenure or promotion dossiers. Note that the SRTE is still administered in this example; the results are simply withheld from the dossier. Such agreements should be in writing.
- Other circumstances that might merit faculty consultation with the Division Head, and potentially a Division Head’s note in the dossier, include the possible impact of academic integrity cases on a particular course’s SRTE results, the introduction of new pedagogical methods, or any unusual circumstances that a faculty member believes warrant discussion.
- For all other faculty, the University strongly recommends that all sections of all courses should be evaluated. Such evaluations have value in allowing faculty members and Division Heads to track individual performance over time, identify developing strengths and possible weaknesses, and reward exceptional teaching.
- The College understands that it might not be possible to administer the SRTE in all sections every semester due to unusual circumstances (illness, weather, etc.). Faculty members who occasionally do not administer the SRTE in a course should not be sanctioned.
- If there is some reason to explain the absence of results in a particular case, the appropriate Division Head shall make a note to that effect in the appropriate location, typically the annual review. For example, in advance of a course being taught for the first time in an experimental way, a Division Head and a faculty member might agree not to include SRTE results in the annual review or future promotion dossiers. Such agreements should be in writing.
- Other circumstances that might merit faculty consultation with the Division Head, and potentially a Division Head’s note in an official record, include the possible impact of academic integrity cases on a particular course’s SRTE results, the introduction of new pedagogical methods, or any unusual circumstances that a faculty member believes warrant discussion.
- At a minimum, all faculty must administer the SRTE in at least one course every semester that they teach. (Tenure-track faculty in provisional status are instead subject to the every-section, every-semester requirement described above.)
- Faculty who choose not to administer the SRTE more frequently should recognize that faculty lacking significant SRTE evaluation records are at a disadvantage when undergoing consideration for promotion or University teaching awards, particularly if they fail to provide their Division Heads with other evidence of teaching effectiveness.
- Moreover, faculty who choose not to administer the SRTE more frequently may be at a disadvantage in their annual reviews if they fail to provide their Division Heads with other evidence of teaching effectiveness, such as those outlined in Sections B and D of this document.
- Faculty members who choose not to administer the SRTE more frequently but who DO provide other adequate evidence of teaching effectiveness to their Division Head should not be sanctioned. Sections B and D of this document discuss alternate forms of evidence that could be considered by Division Heads as valid substitutes for or supplements to the SRTE.
- Alternate measures that follow the requirements in section C.5.a (Administration of the SRTE) of this document regarding student anonymity and proper data management should be considered equivalent to the SRTE for purposes of evaluating teaching for annual reviews, AC23 reviews, and AC40 extended reviews, whereas measures that do not adhere to all the C.5.a requirements should be considered useful as supplements.
Appropriate Use of SRTE Data
University Use
As evaluation for promotion and tenure: The SRTE was developed for use in determinations about tenure and rank that require student input as evidence of teaching effectiveness. According to AC23, an individual faculty member’s scores for courses taught over a number of years are included as part of a dossier along with peer review and at least one other form of student input on teaching effectiveness.
College/Divisional Use
As evaluation for performance reviews: According to AC40, the SRTE scores are included in the evaluation of teaching effectiveness for both annual reviews and extended faculty reviews, in addition to instructional and advising activities, instructional improvements, teaching awards, and other evidence. These reviews are used not only for personnel decisions such as salary increases but also as opportunities to recognize good work and for faculty self-evaluation.
Faculty Use
- As evaluation: The global ratings used by departments and promotion and tenure committees can also give instructors an overall picture of how the students perceive their teaching. In addition, faculty can look at the rating data over time to look for changes and progress. Finally, they can use the divisional or disciplinary “B” items and the individual “C” items to help them gauge student perception of their teaching effectiveness in certain aspects of teaching, such as preparation for class.
- To drive improvement: Divisions/disciplines have the option to give out comment sheets along with the SRTE that ask for general comments such as “what was the best thing about this class?” These comments are the most useful for faculty interested in improvement because students often give concrete suggestions about what works well and what doesn’t.
Interpretation of SRTE data
Student rating forms have limitations. Because student evaluations commonly elicit numerical responses, it is easy to assign them a precision that they do not possess.iv
Several factors have been found to have some relation to student ratings: expected grade, student workload, students’ level of motivation (influenced by such factors as whether the course is in a student’s major, is being used to fulfill a requirement outside of a student’s major, or is a University-mandated requirement), type of course (lecture, discussion, lab), class size, and faculty enticements on the day of SRTE administration (bringing cookies to class, etc.).v
SRTE results appear most accurate and potentially useful when tracking an individual faculty member over at least five and preferably ten administrations of the SRTE.vi
Faculty members should be given an opportunity to respond to evaluation results. In conversations with their Division Heads, faculty should be invited to discuss the objectives of the course, how the teaching methods were used to meet that objective, and how circumstances might have affected evaluations. See appendix 1 for faculty strategies in providing context for the interpretation of SRTE results. Furthermore, other evaluation information gained from a given course (see sections B and D of this document) can aid with the interpretation of rating results.
Divisions/disciplines that believe intra-disciplinary comparisons to be potentially useful need to develop sound methods for making them. According to a 2004 Schreyer Institute report, an early trial period of reporting norms ended with negative results.vii Variables such as expected grade, student workload, students’ level of motivation, course type, and class size could all influence student perceptions of teaching and SRTE results. But for a discipline using the same “B” questions on the SRTE, developing relative standards for particular courses should be possible. Relative standards should be discussed and decided upon by the faculty of the discipline.
According to a 2004 Schreyer Institute report, comparisons of course and instructor ratings between divisions/disciplines cannot be made when differing SRTE forms with different sets of questions are in use.viii The flipside of having a flexible division- or discipline-specific form for evaluating courses within a division/discipline is a loss of comparability between divisions/disciplines. Because “B” and “C” questions can influence student thinking when assigning ratings to the “A” questions, even comparing data from the “global” questions across divisions and disciplines is problematic.
The use of SRTE results to create explicit rankings among divisions, disciplines, and individual faculty runs counter to the Schreyer Institute’s 2004 report on the limitations of using SRTE results for extra-disciplinary comparisons. Such misuse of SRTE data fosters an inappropriately competitive campus culture and undermines the collegiality of faculty relationships. In addition, the use of the SRTE to create explicit rankings could increase pressure on faculty to become less rigorous with regard to course workloads and to inflate course grades as methods to inflate SRTE results.
Other Measures of Teaching Effectiveness
In addition to formal student evaluation measures and peer evaluations, faculty are strongly encouraged to become knowledgeable about and to make use of other measures of teaching effectiveness. Such measures could include any or all the items in section B, as well as a number of other methods. Alternate measures that follow the requirements in section C.5.a regarding student anonymity and proper data management should be considered equivalent to the SRTE for purposes of evaluating teaching for annual reviews, AC23 reviews, and AC40 extended reviews, whereas measures that do not adhere to all the C.5.a requirements should be considered useful as supplements. The following list is not by any means exhaustive.
- The Student Evaluation of Educational Quality is available as a machine-scannable form or as a web-based application. While most instructors use it to collect mid-semester feedback, it can also be used at the end of the semester in addition to the SRTE.
- The Student Assessment of Learning Gains instrument is designed for instructors from all disciplines who wish to learn more about how students evaluate various course elements in terms of how much they have gained from them. Feedback from the instrument can guide instructors in modifying their courses to enhance student learning. It may be used at any point in the semester.x
- A customized feedback measure allows faculty to produce a completely individualized instrument, whether the old-fashioned way, by creating a paper survey (generally useful only for smaller courses), or electronically: built-in tools on ANGEL, such as the polling tool or the survey tool, allow faculty to create a custom feedback instrument in the ANGEL environment.
- A Teaching/Learning Portfolio is a coherent set of materials that usually includes sample syllabi and assignments with connected reflective commentary about the work of teaching. A portfolio is an evidence-based description of one’s approach to teaching: a scholarly argument that presents a case for an instructor’s accomplishments and professional development.
- Unsolicited student comments via correspondence (e.g., “Thank you for helping me get into grad school” emails, or “You are the greatest teacher ever” holiday cards), while not appropriate for tenure and promotion reviews, can be of use in annual reviews.
- Faculty-solicited student comments might be useful in annual reviews, such as a letter from a student in an independent study where the SRTE is not administered (the SRTE is not to be administered in courses where the enrollment is fewer than four students). Solicited students should be instructed to send their comments directly to the Division Head.
- Faculty who wish to explore and implement alternative student assessments of teaching for official use in AC23 and AC40 reviews should do so in consultation with their Division Heads and with an awareness of the requirements listed in section C.5.a of this document for valid administration, collection, and compilation of the data.
Revised:
Altoona College Faculty Senate, January 30, 2018
Previous Versions:
Altoona College Faculty Senate, 2008
Appendix 1: Guidelines for Reporting Student Ratings for Review
Angela R. Linse, Ph.D.
Executive Director, Schreyer Institute for Teaching Excellence, Penn State
Annotate Your Student Ratings (~ ½ page)
Course Data
- Course Title & Number
- Instructor
- Term(s) and year(s)
- Enrollment
- Respondents (#, %)
Course Description
- brief description of course content, goals, etc. (1 short paragraph)
- primary teaching methods (1-2 lines)
- class format (# sessions/week; duration of each session)
- brief description of students (e.g., % juniors/seniors, % non-majors, etc.)
Student Ratings
- Is the response rate representative?
- What were the primary issues raised by students?
- Identify themes from the summary data report and student comments. This is your opportunity to direct reviewers' attention to particular results or comments that are most useful or informative. Help reviewers read and interpret your results rather than leaving it up to them to identify significant themes and appropriate responses.
- strengths (2-3 themes)
- challenges (2-3 themes)
- What changes did/will you make to address student concerns?
Analyzing your Results
Identify areas that students see as needing improvement in your quantitative results. Compare these to themes that appear in students' written comments. A quick approach for identifying themes is to build a list of topics that repeatedly arise as you read students' comments. Keep a cumulative tally of the comments that could be assigned to each theme. Let the frequency of the comments under each theme guide your course revisions.
Another method is to create an electronic document with all of the students' answers to each question. Reading students' responses in electronic form rather than handwritten comments can help create the distance necessary to focus on the underlying content rather than personal criticisms. Sort student comments into groups based on similarity and label the group with a subject heading. Then rank the groups based on the frequency of comments in each. Some common themes include labs, homework, group work, lecture, instructor style, availability, textbook, and exams.
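For those who prefer to automate the tallying step, the following minimal sketch illustrates the approach, assuming Python is available. The theme keywords are examples only, loosely based on the common themes named above; real themes should emerge from reading the comments themselves.

```python
# Minimal sketch of the tally-by-theme approach described above. The theme
# keywords are illustrative only (loosely based on the common themes named in
# the text: labs, homework, lecture, availability, textbook, exams).
from collections import Counter

THEMES = {
    "labs": ["lab"],
    "homework": ["homework", "problem set"],
    "lecture": ["lecture"],
    "availability": ["office hours", "available", "e-mail", "email"],
    "textbook": ["textbook", "book"],
    "exams": ["exam", "test", "midterm", "final"],
}

def tally_themes(comments):
    """Count how many comments touch each theme; a comment may hit several."""
    tally = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in THEMES.items():
            if any(kw in text for kw in keywords):
                tally[theme] += 1
    return tally.most_common()  # ranked by frequency, as suggested above

comments = [
    "Office hours & e-mail help a lot; lots of communication with students",
    "More time in lecture to practice problems like the homework",
]
print(tally_themes(comments))
```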
Student Ratings Annotation Example
ME 3000: Advanced Mechanical Engineering Analysis
Fall Semester 2004
- Enrollment: 60
- Respondents: 32 (53%)
Course Description
Mathematical modeling, analysis, and design of physical dynamic systems involving energy storage and transfer by lumped-parameter linear elements. Time-domain response by analytical methods and numeric simulation. Laboratory experiments. Prerequisites: Linear Algebra, Differential Equations, Probability & Statistics, Engineering Dynamics.
This is a 15-week advanced lecture and laboratory course that meets in three 1-hour time blocks and one 2-hour lab (taught by TAs). The 1-hour sessions include lectures about the primary theoretical material of systems dynamics, with derivations of fundamental principles, followed by worked examples similar to assigned homework problems. The lab sessions include 7 lab assignments and 7 discussion sessions. The lab assignments require students to conduct hands-on experiments relating to problems discussed in the large class sessions. Students are also required to devote time outside of class to assigned readings, lab write-ups, and homework.
Students: The course is a required undergraduate course for mechanical engineering majors and is a prerequisite for many of the required capstone sequences. About 50% of the students were juniors, 45% seniors, and 5% new graduate students.
Student Ratings
Students appreciated that expectations were clear and grading processes were systematic and implemented fairly. They also took advantage of my frequently scheduled office hours and those of my Teaching Assistants. Students' written comments provide similar information. For example: "Availability of Prof & TA is good," "Office hours & e-mail help a lot; lots of communication with students," "very approachable, very positive attitude."
Students wanted more opportunities to practice analysis and evaluation. In their written comments, students requested more time in class to practice solving problems similar to those in their homework assignments. For example: "More interaction, but not as intense/involved as lab" and "More interaction w/ lecture notes prior to class, so we can expect more out of lecture."
Changes
One change I plan to make in this course is to decrease the amount of time I spend lecturing and provide time at the end of each session for student questions. Rather than solving every derivation in class, I will leave a portion of it incomplete and revisit it during the next class when I will ask students to help complete the solution. A number of the topics covered in this course are particularly challenging for students, thus I will occasionally provide opportunities for students to work tough problems in class, when the TAs and I are there to provide guidance.
Midterm Feedback Form
I. Things that help me learn | Explanation/Example (i.e., why?)
---|---
 |
 |
 |
 |

II. Things that could be changed | How could this be done differently?
---|---
 |
 |
 |
 |

III. What changes could you make to improve your own or other students’ learning?
Student Comments
Note: the questions and themes below are only examples. Different questions are asked by different departments/divisions, and different themes will emerge for each course and/or instructor.
What helps you learn in this course? OR The strengths of the course/instructor are:
- Instructor Knowledge
- Class Discussion
- Teaching Methods
- Instructor Style/Instructor Enthusiasm
- Projects
- Course Content
- Readings
- Supporting Materials
- No Strengths
- Misc.
What were the greatest challenges of this course?
- Organization
- Workload & Assignments
- Grading
- Lectures
- Unclear Expectations
- No Weaknesses
- Misc.
What suggestions do you have for improving this course? OR What could be done differently to improve your learning?
- Clarify Expectations
- Provide Grading Criteria
- Organization
- Misc.
Appendix 2: Possible Scripted SRTE Administration Announcement
What information should be given to students regarding the evaluations? It would be best if the person who distributes the evaluations could make a short statement to students prior to handing out the evaluations. The final format is up to that individual, but the following is a suggested announcement:
"I just want to take a few moments before handing out the evaluation forms to let you know how valuable your input is as a means of gauging the instructor's and this course's effectiveness. The evaluations that you are about to fill out are not only important sources of information for the instructor's and this course's effectiveness. The evaluations that you are about to fill out are not only important sources of information for the future improvement of courses and teaching, but are also taken into account in instructor annual reviews and promotion reviews. Because the information provided on these forms is so important, I encourage you to take the time to fill out all three sections of the form, each of which covers significant areas of evaluation. You will have fifteen minutes in which to fill out these forms."
Notes
iii This issue was discussed at a Schreyer Institute videoconference on SRTEs that took place on April 10, 2008.
iv “Statement of Practices for the Evaluation of Teaching Effectiveness for Promotion and Tenure.” 2006. The Faculty Affairs Committee of the University Faculty Senate. Pennsylvania State University. Section D4. Accessed March 1, 2008. In addition to this specific reference, the overall organization of this “Statement of Practices” and large segments of its actual text were used as a basis for these present guidelines.
v For a range of materials that discuss various potential influences on student evaluation results see:
- Braskamp, Larry A. and John C. Ory. Assessing Faculty Work: Enhancing Individual and Institutional Performance. San Francisco: Jossey-Bass, 1994.
- Langbein, Laura I. “The Validity of Student Evaluations of Teaching.” PS: Political Science and Politics, Vol. 27, No. 3 (1994): 545-53. The other studies referenced in her Notes provide a fairly broad sample of scholarly work done on student evaluations prior to 1994.
- Theall, Michael et al. The Student Ratings Debate: Are They Valid? How Can We Best Use Them? San Francisco: Jossey-Bass, 2001.
- Youmans, Robert J. and Benjamin D. Jee. “Fudging the Numbers: Distributing Chocolate Influences Student Evaluations of an Undergraduate Course.” Teaching of Psychology, Vol. 34, No. 4 (2007): 245-7.
vi Cross et al., page 7.
vii Cross et al., page 7.
viii Cross et al., page 7.