
Laura Osterman Implements Just in Time Teaching

German and Slavic Languages Associate Professor Laura Osterman implements Just in Time Teaching techniques in her courses with the support of the Fall 2014 ASSETT Teaching with Technology Seminar.

Introduction

Through observing myself, my students, and fellow students and scholars in my field (Russian Studies, including literature, film studies, and folk culture studies), I have come to realize that we learn best through a combination of thinking and interacting. That is, we need to sit quietly with a text in order to see its patterns and understand its deeper meaning, but we also need to talk with others about it, and/or to read stimulating material related to it. I have been seeking ways to work these two processes into my pedagogy. Specifically, I want a tool that will help students think about assigned material before coming to class, and I would like to use that tool as a springboard for group discussion of the material.

Shortly after I arrived in 1996, I began to use technology to meet these goals: at first I had students in upper-division courses participate in an email listserv to discuss the materials we were reading, and later, starting in 2001, I had them use the discussion forum feature in WebCT. I have struggled with the difficulty of reading and grading all of these posts, and students also complained about the difficulty of keeping up with an online discussion requirement. To remedy these issues, I began to use online quizzes as an assessment tool in 2004. However, online quizzes were not giving students the degree of active, thoughtful participation I was seeking, so during various semesters since that time I have experimented with requiring guided written work, due weekly or before each class period and submitted online in a dropbox, by email, or in person in class. I have continued to use online quizzes because they are “self-grading” and therefore “easy,” but I have been very frustrated with my inability to write a quiz that actually tells me anything about how students have processed the reading.

Intervention: Just in Time Teaching Technique

This past summer, I attended the COLTT conference and was particularly interested in the Just in Time Teaching (JiTT) technique, which was receiving a lot of buzz. With this technique, the professor assigns specific questions for the students to answer the night before they will discuss the material in class, and then uses the student responses to help inform and structure class the next day. The technique uses long-answer (or short-answer) quizzes, called Warmups, which are graded for effort on a two-point scale. Grading for effort only (the student either receives 2 points for a good effort or does not) is a significant improvement that solves the problem of instructor burnout I had experienced with graded online discussions. The instructor can assess the class’s take on the subject in a matter of ten minutes by reading a sampling of the responses, and can grade the responses en masse: you have D2L automatically assign the highest grade to all, and only change the grades of those students who do subpar work, don’t answer the question, and so on.

I chose to test out Just in Time Teaching (JiTT) in order to address three pedagogical goals this semester: 1) to foster student thinking, 2) to facilitate student interaction with the class as an interpretive group, and 3) to improve professor-student communication.

For feedback from student to professor, I also used anonymous surveys through D2L, and for feedback from professor to student, I used screencasts to comment on papers (made with Screencast-o-matic).

I implemented these interventions in my upper-division, cross-listed class, RUSS 4471/WMST 4471/GSLL 5471, Women in 20th-21st Century Russian Culture, in which I had 15 students this semester. This class, while small, had a few issues: the course counts for the A&S core and thus had students registered who had little interest in the subject matter. It also combined students with quite varied areas of expertise: students who know Russian culture well and those who don’t, students who know feminist theory well and those with no background in it. I noticed a distinct division between the engaged students and those who were less engaged in the course. This was especially palpable in class discussions on the days when I had not assigned a Warmup.

I assigned weekly Warmup quizzes (17 per semester, of which 15 “count”) and one survey (for no credit), and provided screencast feedback on one take-home midterm and the final exam.

How-To for JiTT in D2L

There are some technical issues with the Warmup technique on D2L, which I addressed with the help of OIT consultants. For faculty who might be interested in implementing this technique in their classes, here is the workaround which OIT provided me:

  1. Create a quiz and create questions using the short-answer option.
  2. Write the question. For the answer blank, increase the number of rows to 6 (students will be able to write as much as they want anyway; this just increases the size of the blank).
  3. In the answer blank, place a single character: a period. Change the weight to 100%. Change the radio button next to the answer to “Regular Expression.”
  4. That is it for the assessment, but you must also create a new report for the quiz. Select the tab called “Reports,” name your report, and under Report Type choose “Attempt details.”
  5. After you get some responses, go into Manage Quizzes, click the down arrow, choose “Reports,” click on the new report you named, and choose the HTML version; you will see the responses grouped by student.

There is no way that I know of to group responses by question and still have D2L show the student names (if anyone finds one, would you please let me know!). For me the report opens in a very small window, so I do Control-A, save a copy of all the responses, and paste them into a Word document. These instructions are the result of a semester-long struggle with the technical issues involved in configuring D2L to make grading and viewing student responses easy. By sharing them, I hope to make use of the JiTT technique possible for other faculty.
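Since the saved report text is grouped by student, faculty comfortable with a little scripting may want to regroup it by question before class. The snippet below is not part of the OIT workaround, just a minimal Python sketch: it assumes a hypothetical plain-text format in which each student’s block begins with a line like “Student: Jane Doe” and each answer begins with a line like “Q1:”, so the two patterns would need to be adjusted to match whatever the actual pasted report looks like.

```python
import re
import sys
from collections import defaultdict

# Hypothetical sketch: regroup saved Warmup responses by question instead of
# by student. Assumes the report has been copied (Control-A) into a plain-text
# file in which each student's block starts with a line like "Student: Jane Doe"
# and each answer starts with a line like "Q1:", "Q2:", and so on. Both patterns
# are assumptions -- adjust them to match the actual saved report.
STUDENT_PATTERN = re.compile(r"^Student:\s*(.+)$")
QUESTION_PATTERN = re.compile(r"^(Q\d+):\s*(.*)$")


def group_by_question(lines):
    """Return {question label: [(student, answer), ...]} from the report lines."""
    by_question = defaultdict(list)
    student = None      # name of the student whose block we are inside
    question = None     # label of the question currently being read
    answer = []         # accumulated answer lines for that question

    def flush():
        # Record the answer gathered so far, if we have both a student and a question.
        if student and question:
            by_question[question].append((student, " ".join(answer).strip()))

    for raw in lines:
        line = raw.strip()
        student_match = STUDENT_PATTERN.match(line)
        question_match = QUESTION_PATTERN.match(line)
        if student_match:
            flush()
            student, question, answer = student_match.group(1), None, []
        elif question_match:
            flush()
            question, answer = question_match.group(1), [question_match.group(2)]
        elif question:
            answer.append(line)
    flush()
    return by_question


if __name__ == "__main__":
    with open(sys.argv[1], encoding="utf-8") as report:
        grouped = group_by_question(report)
    for label in sorted(grouped):
        print(f"\n=== {label} ===")
        for student, answer in grouped[label]:
            print(f"{student}: {answer}")
```

Run as, for example, python group_warmups.py saved_report.txt (both file names are placeholders); it prints each question followed by every student’s answer to it.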

Besides the technical aspects, I had an additional issue with this technique: I had lingering feelings that students should be reading each other’s posts rather than (or in addition to) having me present and comment on selections from their responses in class. However, in an anonymous survey around midterm, I asked whether students liked my in-class, teacher-conducted use of their posts, and the response was yes. Later, I varied this technique and got better at using their responses creatively.

Assessment of the Techniques

I could see a distinct difference between the class meetings when a Warmup was due and those when none was due; when there was a Warmup, students were more engaged in class discussion and easier to draw out. During the semester, I fine-tuned the way I used Warmups in class. For most of the semester, I discussed a sample Warmup or two on a PowerPoint slide, saying what was good or interesting in the answer and what was thin or incorrect. What I didn’t like about this classroom technique was that it felt awkward; doing it without mentioning the student’s name prompted me to do more of the talking, whereas I wanted to bring student voices into the classroom. However, a discussion can be fostered with Warmups: a few times I summarized ideas contributed by students on Warmups (on PowerPoint slides or orally) and asked students to fill in the details of their ideas in class, which worked as a springboard for further discussion. Doing this means the professor can’t really critique the Warmup answers substantively, but one can ask for more clarification.

I also tried online discussions for one week in lieu of Warmups with this group (and have used them in the past). I found that online discussions still need to be brought into the classroom and “activated” just as Warmups do, because the majority of students don’t read the responses to their posts. Perhaps using online discussions with a goal (such as a group or individual blog or presentation) would induce students to really work together and become more invested in the process of group interpretation.

Surveys measure student perception of how effective a pedagogical technique is (how well they like the technique), which might be different from how effective the technique is in motivating or engaging students. Nonetheless, surveys at least help gauge what students are noticing about their learning in class. To assess student perception of their learning, at the end of the course I used a survey with some open-ended questions, in order to cast a broad net, and some more focused questions. I used the questions below; in particular, questions 3 and 4 were intended to compare two different techniques for using Warmups in class, the first more teacher-driven and the second a springboard to student discussion:

  1. What is helping your learning in this class?
  2. Talk about how Warmups are being used in this class and how that influences your learning.
  3. On a scale of 1-10, how useful did you find it to have a sample Warmup discussed by the professor in class (professor says what was good or interesting in the answer and what was thin or incorrect)?
  4. On a scale of 1-10, how useful did you find it to have the professor summarize ideas contributed by students on Warmup or online discussion and ask students to fill in the details of their ideas in class as a springboard for further discussion?
  5. As best as you can, hypothetically, compare the effectiveness for you of doing a weekly Warmup vs. doing a weekly online discussion (in a class similar to this one).
  6. As best as you can, hypothetically, compare the effectiveness for you of doing a weekly Warmup vs. doing no weekly preparatory writing (in a class similar to this one).
  7. What suggestions do you have for improving the effectiveness of the Warmup technique?
  8. What suggestions do you have for improving the class/the teaching of the class?

Question 3, assessing professor-driven in-class reports on Warmups, received an average score of 7.5; question 4, assessing Warmups as a springboard to discussion, received an average score of 8.1. The results are roughly equivalent, and they suggest that while most students found this technique useful, it might profitably be used in a variety of ways in order to benefit a variety of different learners. For #6, comparing having Warmups to not having them, the most common answer was that students preferred a Warmup because it forces them to do the reading: if there is just a bimonthly quiz, “you are able to fake ways through discussion.” One student noticed that Warmups got them to “really think about the readings instead of just reading it.” Again for #6, among the reasons for not preferring a Warmup were the time it took the student and the time it took up in class.

I observed that Warmups fostered student thinking about the class material, but for some students the quality of the thinking waned toward the end of the semester. Perhaps that is inevitable. However, I have seen very high-quality, higher-order thinking in some students’ Warmups.

One idea for the future is to use a few Warmup questions as exam questions. I would be interested in seeing whether students do better on an exam question if they have previously completed a Warmup on that same question (or a version of it). Another idea is to encourage students to use Warmups as a springboard for papers.

This semester I also tested screencasts for feedback on selected student papers (a total of 7). I used Screencast-o-matic, which I found easy to use and which allowed 15 minutes for comments (which I sometimes used every second of). I did this only for those papers on which I had a significant amount of feedback to give. Although in class I told students to look at their feedback and wrote the URL on the back of their papers, initially only 3 out of 7 students looked at their videos, and only one gave me feedback on this technique. When I reminded them individually by email (giving the link), all of the remaining 4 did look at their videos and gave me feedback. All the feedback was positive: they felt they could better understand my critiques and what they could do better next time. I also felt the screencasts “freed” me to give the kind of feedback I would like to give. Certainly it is a time investment (it took me about twice as long as usual, often an additional 20 minutes beyond reading and thinking about the papers), and I suspect it works best in smaller, upper-division classes, but I will definitely use it again. I will continue to assess whether it is worth the time investment.

My final assessment is that studying teaching-with-technology techniques this semester has helped me to enjoy teaching more and find it more rewarding, which I am sure has helped make me a better teacher. As Amanda and Caroline taught us this semester, technology is not a goal in and of itself, but a solution to a problem: how to get students to think and share their thoughts. As it turns out, technology can provide beautiful and simple solutions toward these goals.