Questions? We have answers.
- Where can instructors find USRI Instructor results?
- Where can I find USRI results?
- What does the USRI report consist of?
- USRI Reference Data
- What does the USRI instrument look like?
- When will I receive results from USRIs?
- What provisions are made for student anonymity in the USRI system and process?
- How can I customize my USRI questionnaire?
- Does USRI support ratings if there are multiple instructors for a course, as in the case of team-teaching?
- How do USRIs work?
- Who should I contact if information on the USRI email announcement is incorrect?
- When do USRIs occur?
- What should I do if I have problems logging into the USRI system?
- How can instructors check response rates during the USRI rating period?
Collect feedback that counts
USRIs gather feedback from classes to help instructors, departments and faculties improve curriculum and instruction. The results also serve as one important factor in decisions affecting the career of your instructor.
The collection of students' ratings of instruction is regulated by the GFC policy. The current version of this policy may be viewed by linking to Section 111.3 of the GFC Policy Manual.
Where can instructors find USRI Instructor results?
To access numerical data:
If your class had at least ten students, a report of your student rating will be available through: https://tsqs.srv.ualberta.ca/cgi-bin/indusri/ind_usri.pl.
To access comments:
A summary of comments will be available through:
To access previous comments:
- After login, click on "Reports" (second icon top-left)
- Click on "Individual Reports"
- Set top drop-down boxes (Year, Period, Status, Survey) to "Show All"
- The system will display your courses and you will be able to access the comments.
If you have comments or concerns about this process, please let us know at firstname.lastname@example.org.
Where can I find USRI results?
On October 12, 1993, the General Faculties Council (GFC) of the University of Alberta modified its policy concerning Teaching Evaluation and Student Evaluation of Instruction to include the requirement for collection of students' ratings of instruction on a University-wide basis using a basic set of mandated questions. The policy also made provision for releasing the associated results to the Students' Union and the Graduate Students' Association. Currently, results are not made "public" unless there have been at least 10 completed questionnaires for a class.
In 2011, online access to results was restricted to those who are registered as Students. This was followed by allowing instructors to see results for their own classes and, later, providing access to Deans, Chairs and Directors to view results for their employees. This latter access includes the Interim and Acting roles but not Associate or Assistant. Beginning in July 2012, Deans, Chairs and Directors may extend this access to individuals whom they designate by sending an e-mail to Test.Scoring@ualberta.ca which provides:
- their name and employee number
- the name and employee number of the person to whom they wish to grant designated access
The current version of this policy may be viewed by linking to Section 111 of the GFC Policy Manual. Included in this policy is the following cautionary note concerning the results obtained from students' ratings.
Student questionnaires form an important part of evaluating teaching effectiveness but cannot be taken alone as a complete assessment of an instructor or course. Factors other than an instructor's teaching ability may influence ratings. These factors include class size, class level, Faculty, time of class, required versus optional course, grade expectations, student GPA, and the gender, race, ethnicity and age of both students and instructors. Small differences in evaluation should not be considered meaningful.
What does the USRI report consist of?
A one-page report is generated for each class from which students' ratings have been collected. The Instructor Report contains the text of each of the rating questions appearing on the questionnaire. The questions are reported in the sequence that they were printed on the questionnaire. Following the text of each question, the numbers of students responding to the rating scale Strongly Disagree (SD), Disagree (D), Neutral (N), Agree (A), and Strongly Agree (SA) are reported. These frequencies are followed by the median of the responses and reference data.
What does the USRI instrument look like?
USRI Reference Data
The columns of reference data display statistics from Tukey's box-and-whisker plot analysis (John W. Tukey, Exploratory Data Analysis, Addison-Wesley Publishing Company, Inc. 1977). The values displayed are derived from all the classes in the indicated reference group. These statistics are chosen to achieve two main objectives:
- summarize skewed distributions of data, and
- identify outliers from the general population if they exist.
The median value (the middle of a ranked set of numbers) is generally preferred over the mean to identify the centre of a skewed distribution of scores. This is the value below which 50 percent of the medians from other classes lie. Please note that data for the items in the current set of mandated questions are accumulated from Academic Year 2005/06 and beyond. If an item (question) has not been used at least 15 times by the indicated reference group since then, the reference data cells will be filled with the text "too few uses". It is theoretically possible for all median scores in a single year to be above, or below, the Reference Group median.
The 25th and 75th percentiles provide information about the spread of scores around the median. By definition, twenty-five percent of the scores are above the 75th percentile and twenty-five percent are below the 25th percentile. Since this occurs by definition, these values should not be used to determine whether a particular score is good or bad.
The lower Tukey Fence, which is the 25th percentile minus 1.5 times the distance from the 25th to the 75th percentile, defines a reasonable limit beyond which a score can be considered an outlier. Outliers are scores that appear to be outside the usual distribution of scores for the population being tabulated, i.e., for the indicated reference group. Given the nature of the USRI data, the upper Fence will usually be above 5.0 and, therefore, need not be reported.
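As a concrete illustration, the statistics described above (median, 25th and 75th percentiles, and lower Tukey Fence) can be computed in a few lines. This is only a sketch: the class medians below are invented sample values, not real USRI data.

```python
import statistics

# Invented sample data: one median rating per class in a hypothetical reference group.
class_medians = [3.1, 3.8, 4.0, 4.2, 4.3, 4.4, 4.5, 4.6, 4.7, 4.8]

# Middle of the ranked set of class medians.
median = statistics.median(class_medians)

# statistics.quantiles with n=4 returns the three quartile cut points:
# the 25th percentile, the median, and the 75th percentile.
q1, _, q3 = statistics.quantiles(class_medians, n=4)

# Lower Tukey Fence: 25th percentile minus 1.5 times the
# distance from the 25th to the 75th percentile.
lower_fence = q1 - 1.5 * (q3 - q1)

print(f"median={median}, 25th={q1}, 75th={q3}, lower fence={lower_fence}")
```

With these sample values the lower Fence falls near 2.9, so a class whose median sits below that would be flagged as an outlier relative to this hypothetical reference group.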
Please note that some items can be expected to elicit higher ratings because they are closer to "apple pie" items, i.e., items we would expect to be rated quite positively. This is illustrated by the campus-wide results accumulated in the years 2000-2004 for the two items shown below.
| Item | Lower Fence | 25th Percentile | Median | 75th Percentile |
|---|---|---|---|---|
| The instructor treated students with respect. | 3.4 | 4.3 | 4.6 | 4.8 |
| Overall, the quality of the course content was excellent. | 2.9 | 3.8 | 4.1 | 4.3 |
This suggests that the median obtained for the first item in a particular class can be expected to be 0.5 of a rating point above that for the second item, simply because that has been found to be the case in results from thousands of classes surveyed at the University of Alberta. Note that the 25th percentile for the first item corresponds to the 75th percentile for the second item.
Also, the reference group used for a particular class consists of all classes in the indicated department or faculty. One of the most consistent findings of researchers studying students' ratings of instruction is that the ratings obtained for items such as those addressing general satisfaction with a course or instructor depend on the discipline in which the course is taught. Franklin and Theall (1995) reported that "Professors in fine arts, humanities, and health-related professions are more highly rated than their science, engineering and math-related colleagues." There appears to be a combination of reasons for these differences, including differences in the characteristics of the students, in the nature of the subject matter, and in the course objectives that are emphasized in different disciplines. The sizes of the differences and the conclusion that they are not necessarily related to characteristics of the instructors in the different disciplines leads to the advice that "we must continue to be very cautious about--if not prohibited from--using the results of student evaluations to make comparisons across disciplines" (Marincovich, 1995).
For example, the item "Overall, this instructor was excellent." illustrates that results at the University of Alberta are consistent with the research studies. The reference data from some of the departments in which a large number of classes have been surveyed appear in the following table.
| Department | Lower Fence | 25th Percentile | Median | 75th Percentile |
|---|---|---|---|---|
| Electrical & Computer Engineering | 2.7 | 3.9 | 4.2 | 4.6 |
| Mathematical & Statistical Sciences | 2.8 | 3.9 | 4.2 | 4.6 |
| Earth & Atmospheric Sciences | 3.0 | 4.0 | 4.3 | 4.6 |
| Modern Languages & Cultural Studies | 2.9 | 4.0 | 4.4 | 4.8 |
| History & Classics | 3.4 | 4.2 | 4.5 | 4.7 |
When will I receive results from USRIs?
Instructors will receive results from the Universal Student Ratings of Instruction within twenty working days after the course is completed and once the Chair, Director or Dean has signed the grade sheet.
What provisions are made for student anonymity in the USRI system and process?
GFC Policy 111.3 D states the importance of student anonymity in completing course and instructor survey questions. Free expression of views in the ratings is important, so long as the safety of the members of the University community is upheld. Additional details follow below.
1. The survey administrator cannot identify the student through the survey tools unless the student self-identifies.
2. The survey tools are truly anonymous.
3. Your ID/username does not get tagged on the survey results.
4. You must log in for verification that you have taken, partially taken, or not taken some or all of your surveys. Again, your answers to survey questions are completely separate from this verification.
5. Circumstances that warrant overriding anonymity are spelled out in GFC Policy 111.3 D (see above). Threats to the safety or well-being of members of the University community will not be tolerated and will result in action, in accordance with GFC policy, to identify the author of the statements.
6. These surveys or ratings are conducted so that "the results help instructors and departments or faculties to initiate constructive change in curriculum and instruction. In addition, the results are one important factor in decisions affecting the career of your instructor." (GFC policy 111.3 C).
How can I customize my USRI questionnaire?
The USRIs are generated using the Instructor Designed Questionnaire (IDQ) system. This system allows for the inclusion of items relevant to the individual instructor, the university, the faculty and the department, and provides normative as well as individualized feedback on the quality of instruction. Please visit our IDQ system page to obtain more information about the IDQ system.
Does USRI support ratings if there are multiple instructors for a course, as in the case of team-teaching?
Yes! In a case such as this, the student rating can be configured with numerical and open-ended questions that repeat for each instructor. The questionnaire is arranged so that the questions that apply to the overall course appear first, followed by the instructor-related questions, which are repeated for each instructor involved in the course.
When results are compiled and ready for viewing, each instructor will receive the overall course questions together with their own individual results. This method eliminates the need to generate separate questionnaires for each instructor in a team-taught class in order to maintain confidentiality among instructors.
How do USRIs work?
Step 1 - Prior to the start of the rating period, instructors will receive an email announcement with further information and important dates for the rating period.
Step 2 - Once the announcement email is received, instructors should notify students that they will be receiving an email with instructions and encourage participation.
Step 3 - When student ratings become available, students will receive an email with instructions and appropriate links to the rating.
Step 4 - During the rating period instructors will receive a helpful reminder to encourage student participation.
Step 5 - If students have not yet completed the rating they will receive an initial reminder via email.
Step 6 - A secondary reminder via email will be sent to any students who have not yet completed the rating.
Step 7 - The rating period is now complete! Instructors will be able to view results online through the USRI Personal Report website.
Who should I contact if information on the USRI email announcement is incorrect?
If any information appears incorrect on the email announcement such as instructor name, course information, or if your course has been canceled or is missing student ratings, please contact TSQS as soon as possible by phone 780-492-2741 or by email email@example.com.
When do USRIs occur?
Generally, USRIs are available for students to complete once the withdrawal deadline for classes has passed, and will be available until the last day of classes.
What should I do if I have problems logging into the USRI system?
If you have difficulty logging into the ratings system please try the following:
- Clear your browser cache. For assistance on clearing your browser cache, see the following help article: How to Clear Browser Cache and Cookies.
- Check the status of your CCID to ensure it is valid and functioning properly by accessing https://myccid.ualberta.ca/
If you still have problems logging in, please contact IST at firstname.lastname@example.org.
How can instructors check response rates during the USRI rating period?
If you would like to check the response rate during the USRI course rating period, log into the USRI system; your homepage will provide a status overview and the current response rates for your courses.