V. Krishna Kumar, PhD
In the early 1960s, some university and technical training centers in the USA and the former USSR employed electronic student response systems during lectures to measure students’ comprehension via multiple-choice items. Students responded by pressing response keys, usually five or fewer, and the percentages of students selecting the various options were displayed either to the instructor or to the entire class, depending on the system (Rogers, 1975). Contemporary electronic student response systems, fairly common in today’s classrooms, allow polling, quizzing, tracking attendance, commenting and responding anonymously as needed, providing feedback, and storing grades.
James Rogers and I (Kumar & Rogers, 1976, 1978) conducted an exploratory study in the 1970s to test an early student response system in my graduate statistics course at Case Western Reserve University (CWRU). The system used student response units (keysets) with 12 response keys. When a student pressed a key, its associated meaning was displayed on a terminal on my desk in a matrix, with keysets as columns and, at the intersections, the type of response made and the time in seconds when it was made. It was possible for me to know which student had pressed a key, but the demands of lecturing did not permit this easily; I could manage only a glance at the matrix to see that someone had responded. Additionally, this wired classroom allowed everything in the classroom, the lectures as well as the student responses, to be videotaped for later analysis.
A student assistant observed the interactions between students and me in my previous semester’s statistics class, which met twice a week, to help determine the labels assigned to the response unit keys (Kumar & Rogers, 1978). The responses were of two types: student initiated and instructor initiated. The former included: “Don’t Erase, Question or Comment, I Disagree, Slow Down, Repeat, Please Summarize, Am I Following, and I Understand.” The latter included “Yes” and “No” as responses to my queries. To the student-initiated list, I added “Go Faster” and “I Am Getting Bored.” The challenging material in the statistics course did seem to require repeating explanations in response to students’ queries, and I was curious whether the repetitious presentations bored some students. My main intention in using these responses was to have students help me determine the pace at which I covered the material.
“Repeat” was the most frequently used response, and “Bored” was never used, possibly because the class had only 13 students and they knew that their responses were not really anonymous. Or perhaps, because I knew the students well, they were simply kind to me and held off on the “bored” response key!
The CWRU Newsletter Image (1977) reported the study in some detail. The article included the following comment from me: “Kumar said that instructors could also improve their lectures by reviewing the student comments. ‘Many professors don’t know if they are going too fast or if the class is bored’” (p. 5). A radio station’s reporter read the article and called to interview me.
As I recall, the reporter focused on the “I am bored” response key rather than on the overall intent of the study. She expressed serious concern that I was giving students an opportunity to “harass” their schoolteachers by indicating on their keypads that they were getting bored during class. Her concern took me by surprise, since the newsletter article had indicated that I used my own college classroom for the study, not a school classroom. I, of course, clarified to her that no schoolteachers were involved in the study and that I had experimented with my own university class. Perhaps a reference in the CWRU Image article to the Cleveland Board of Education, in connection with using the wired classroom to improve the quality of classroom instruction, gave her the wrong impression that my response-key labels were being used for schoolteachers too.
I have sometimes wondered whether my interview was ever aired. I have also wondered whether my clarification that the study was conducted only in my own class deflated her potential story that a professor was trying to create a classroom environment in which students could “harass” their teachers by pressing the “I’m bored” response key during class.
Kumar, V. K., & Rogers, J. L. (1978, December). Student response behaviors in an instrumented feedback environment. Topics in Instructional Computing, Special Volume: Evaluation of the Use of Computers in Education, 34-54.
Kumar, V. K., & Rogers, J. L. (1976). Instructional uses of the Olin experimental classroom. In R. Coleman & P. Lorton (Eds.), Computer science and education (SIGCUE Bulletin, Vol. 8(1); SIGCSE Topics, Vol. 2; pp. 189-191). New York: Association for Computing Machinery.
Rogers, J. L. (1975). A computerized classroom for instructor’s experimentation and training. In O. Lecarme & R. Lewis (Eds.), Computers in education (pp. 591-593). Amsterdam: IFIP/North-Holland Publishing Company.
Students set pace in pushbutton classroom. (1977, June). Images, The Monthly News Magazine of Case Western Reserve University, IV(6).