
    Evaluation software sent to Provost for approval

    A Faculty Senate-approved teaching evaluation software designed to increase student confidentiality and bolster feedback for professors will be piloted this semester before being sent to the Provost for final approval, Cathy Coghlan, director of Institutional Research, said.

    SmartEvals!, which would replace Class Climate, the current Student Perception of Teaching (SPOT) software, was approved March 1 and will be used by four departments this semester. The Department of English, College of Education, Department of Biology and Department of Social Work all volunteered to use the new software.

    Another part of the pilot is a new teaching evaluation rubric, which the Senate also passed March 1. Only the departments testing SmartEvals! will use the new document this semester.

    The current rubric is divided into three sections: course information, student information and instructor information. There are also spaces for general and specific comments. Questions are answered on a four-point Likert scale, which gives the student the option of “strongly disagree”, “disagree”, “agree” or “strongly agree”.

    The rubric approved by the Senate is different in that it is divided into five sections: student information, assignments, faculty-student interaction, organization and learning, with two to five five-point Likert scale questions per section. The new document also includes “Yes” or “No” questions and about two open-ended questions per section.
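
    As a rough illustration only, the outline below sketches the shape of the new rubric described above. The section names come from the Senate-approved document; the per-section question counts, the scale labels and the field names are hypothetical placeholders, not the committee's actual items.

        # A minimal sketch of the new rubric's structure, for illustration only.
        # Section names are from the article; question counts and scale labels
        # are assumptions, not the committee's real items.
        new_rubric = {
            "student information":         {"likert_5pt": 3, "yes_no": 1, "open_ended": 2},
            "assignments":                 {"likert_5pt": 4, "yes_no": 1, "open_ended": 2},
            "faculty-student interaction": {"likert_5pt": 5, "yes_no": 1, "open_ended": 2},
            "organization":                {"likert_5pt": 2, "yes_no": 1, "open_ended": 2},
            "learning":                    {"likert_5pt": 3, "yes_no": 1, "open_ended": 2},
        }

        # The current four-point scale forces a choice among these options; the
        # new five-point scale presumably adds a middle option (assumed label).
        likert_4pt = ["strongly disagree", "disagree", "agree", "strongly agree"]
        likert_5pt = ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]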

    The new rubric was written by a SPOTs review committee last semester, Judy Groulx, chair of the University Evaluation Committee, said.

    One of the main goals of the committee, which was composed of 16 professors representing each college on campus, was to generate discussion on what constitutes an effective SPOT, Groulx said. Plus, revising the document allowed the committee to structure the rubric in a way that maximized the potential of the evaluation system.

    “We realized we had a chance, as a committee, to try and expand the conversation around how do you evaluate teaching that’s good and how do you use that?” Groulx said. “It got to be a habit. You give them and you’re not really satisfied with them. Some people use it one way, some people use it another way.”

    Clarifying questions on the rubric so that professors and students are on the same page was another goal of the committee, which put in nearly a year’s worth of research before making changes to the current document.

    “They would always get information back they weren’t quite sure how to interpret because they didn’t think the students were all necessarily interpreting the question the same way,” Groulx said. “There were some items that we knew were getting regularly misinterpreted.”

    Both the SmartEvals! recommendation and the new evaluation rubric would mark the beginning of the university’s campus-wide move to a permanent, online evaluation system.

    Last fall, 40 percent of departments used the online format. All departments are using the system, known as eSPOTs, this semester, Coghlan said.

    The benefits of moving to a permanent online system, especially SmartEvals!, include making student feedback more confidential and more valuable to professors, Coghlan said.

    SmartEvals! would make evaluation results more accessible to professors, something that isn’t the case with Class Climate or the traditional in-class paper format.

    “Faculty will be able to log into the system,” Coghlan said. “They’ll have a longitudinal history of all their SPOTs.”

    Professors currently cannot log into Class Climate, and before 2010 they had no electronic database where they could review evaluation results over time, as SmartEvals! would allow.

    The university decided to use Class Climate as the pilot program for the online system since it allows for paper use, too, and would help smooth the transition between the two formats, Coghlan said.

    And, as Coghlan pointed out, because the format would be completely online, professors would not be able to recognize a student’s handwriting, as they might with the paper format.

    But not all faculty members are completely confident in SmartEvals!, and their hesitation has little to do with the software itself.

    The proposals passed with a near unanimous decision in the Senate, but prior to the vote some professors had expressed concern over the response rate and available completion window of the online format in general.

    The system saw a 71 percent response rate in the departments that used the new format last fall, Coghlan said. Students were notified of the survey via email and had a three-week window near the end of the semester to complete it.

    That window, and the lower response rates compared to the paper format, have some faculty worried results could be negatively affected.

    Ranga Ramasesh, a professor in the Neeley School of Business, said he realizes the response rate of the online system was not bad but was still lower than that of the in-class format. Ramasesh said one of his courses has two sections with 80 students each. The response rate for the in-class format is around 90 percent, Ramasesh said.

    And while the online system still saw a 75 percent response rate in his class, Ramasesh said the non-responses add up, especially in large classes.
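
    To put those percentages in concrete terms, here is a rough back-of-the-envelope calculation based on the two 80-student sections Ramasesh describes; the rounding, and the assumption that both sections respond at the same rate, are ours.

        # Approximate response counts for two 80-student sections (160 students),
        # using the rates cited above; figures are illustrative, not reported data.
        students = 2 * 80                  # 160 students total
        in_class = round(students * 0.90)  # about 144 completed paper evaluations
        online   = round(students * 0.75)  # about 120 completed online evaluations
        print(in_class - online)           # roughly 24 fewer responses per course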

    “[A three-week window] is not good,” Ramasesh said after a town hall meeting to discuss the proposals last month. “From my side, I want them to wait two more weeks. Give me the chance to stick through the whole course, then make a judgement. That way, you get the big picture.”

    Coghlan said she understands some faculty’s concerns over the response rates. Still, steps have been taken to ensure those numbers will rise. Professors will be able to log into the system to check how many of their students have completed the survey, giving them a chance to encourage students to fill the surveys out before the deadline passes.

    Also, while the 71 percent rate might be lower than what is typically turned in with the paper format, the response to the online format last year is a solid starting point, Coghlan said.

    “In the long run, the positives will outweigh any negatives in terms of response rates,” Coghlan said. “We have been very proactive in giving faculty ideas about how to improve their response rate. The fact that we had a 71 percent response rate with Class Climate last semester says a lot about the potential [of SmartEvals!].”