Dec. 18, 2009 Agenda

  1. optical scanning vendor updates (Marshall)
  2. LESC feedback on survey questions, attached below (Jozefowicz)
  3. review of survey questions, attached below (Smith, McElroy)
  4. new business

LESC Feedback

These are points raised by the LESC, courtesy of Stephanie.

1) people would like to be able to indicate exam version numbers directly (on the current sheets, the only way to do this is to use the seat number for this purpose)

2) people would like a better/clearer/more efficient way to attach different weights to questions

3) people would like better/more thorough exam question item analysis (this point actually comes from my dept. chair in conversation...he seems to remember that the scanning technology that preceded the current version gave more detailed item analysis that was useful for those who wanted it)

4) desire to be able to designate multiple answers as correct for the same question easily

5) desire to be able to have some subset of questions designated as bonus questions, such that missing them wouldn't count against a student but getting them correct would increase the student's score (as far as I know, there is no good way to do this currently, since the only reporting is the total number of questions correct; a minimal scoring sketch follows this list)

6) discussion regarding whether it would be desirable and/or feasible to offer a slate of different-size scantron sheets...a smaller size for quizzes (say 25 questions or fewer), a medium size (say 50-100 questions), and a large size (say up to 200 questions)...perhaps that would drive up costs by reducing bulk buying and/or make the technical issues more difficult on the IT side, but if it is irrelevant on the IT side and there are no real bulk-buying savings to lose, then with smaller sheets priced at, say, $0.03/sheet, medium sheets at $0.04/sheet, and large sheets at $0.05/sheet, departments may consider it worthwhile to buy a range of sizes as a way to cut expenditures (for example, a 100-sheet quiz run would cost $3.00 on small sheets vs. $5.00 on large ones).

7) discussion regarding balancing security issues...there should be a mechanism, such as a batch code, that allows a faculty member or a designated representative to pick up exams, provided the representative has the batch code...in my department, student workers may be dispatched to deliver and/or pick up exams, and in a world of scarce resources, where increasingly stringent interpretations of FERPA already leave student workers able to do very little for me, I personally would not be happy if some decree went out that only faculty could pick up their exams

8) discussion of what you guys face on the IT side with exams never being picked up, which was surprising to the rest of us for regular semester exams...so perhaps add a survey question to establish whether this is a real storage-capacity issue that happens frequently enough to be problematic, and to solicit input on what is a reasonable length of time for testing services to be expected to hold onto exams

9) an overall concern with the current survey questions in terms of interpreting results--too many questions are binary yes/no questions, which may not really reveal much information...options: a) add an open box to each question for further explanatory comments (as you did with questions I-K), or b) rewrite the yes/no questions so that there is a more thorough range of choices, either to measure intensity of use and/or preferences, or to measure a range of reasonable answers to a "why" question (as you did with question J)...in other words, take a question like E--first, do you want to delineate between regular use during the class weeks of the semester vs. during final exam week? second, there could be a range of possible answer choices that give more information to interpret, such as "always," "most of the time," "some of the time," "almost never," and "never"

10) do you want to try to capture additional demographic and/or use data...how long have the faculty been at IUP, how do their experiences here compare with other institutions, are the testing services being used primarily for principles-level courses or for upper levels as well, how would a given faculty member rate his/her use of testing services today vs. 5 years ago (decreased, same, increased some, increased by a lot) and why...at a minimum, a question to capture which college a faculty member is in, as there may be different and/or consistent needs within and across disciplines
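The scoring ideas in points 2, 4, and 5 could all be handled in one pass. Here is a minimal sketch in Python, assuming a hypothetical Question record; the field names and the score() function are illustrative assumptions, not features of the current scanning software:

    from dataclasses import dataclass

    @dataclass
    class Question:
        correct: set          # one or more acceptable answers (point 4)
        weight: float = 1.0   # per-question weight (point 2)
        bonus: bool = False   # bonus: can add points, never subtracts (point 5)

    def score(questions, responses):
        """Return (earned, maximum). Bonus questions are excluded from
        the maximum, so missing one cannot hurt the student."""
        earned = 0.0
        maximum = 0.0
        for q, answer in zip(questions, responses):
            if not q.bonus:
                maximum += q.weight
            if answer in q.correct:
                earned += q.weight
        return earned, maximum

    exam = [
        Question(correct={"B"}),
        Question(correct={"A", "C"}, weight=2.0),  # either A or C is accepted
        Question(correct={"D"}, bonus=True),       # pure extra credit
    ]
    score(exam, ["B", "C", "E"])  # -> (3.0, 3.0): the missed bonus costs nothing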

Stephanie: Personally, I think the survey instrument is a good start, but it has a ways to go to really generate useful results that can be interpreted in a meaningful way. I would think it needs a minimum of two further iterations within the action team before really launching it. Once the action team feels it's closer to being ready, I'd be happy to send it out to the LESC for further input and/or to use them for a beta-testing round, since that would be an additional set of eyes and we have already talked about this first draft this week.

Possible Survey Questions and Flow

from Dave Smith & Nate McElroy


QUESTIONS
A. Do you currently use the optical scanning services at IUP to administer exams, quizzes, and/or surveys? If your only experience with optical scanning is processing student evaluation sheets, please choose NO. [yes, no]

B. Per semester, what is the average number of scanning jobs you submit (a job is the set of materials placed in a single white envelope)?
radio buttons: <2, 2-5, 6-10, >10

C. For what tasks do you submit scanning sheets?
check boxes: quiz/exam with 1-19 questions, quiz/exam with 20-49 questions, quiz/exam with 50-100 questions, quiz/exam with >100 questions, survey with 1-9 questions, survey with 10-19 questions, survey with 20 or more questions.

D. Do you use the open answer area on the back of the bubble sheet? [yes, no]

E. Are you satisfied with the turnaround time to process your scanning job? [yes, no]

F. Are you satisfied with the instructions on the white envelope in which you submit your scanning job? [yes, no]

G. Are you satisfied with the output(s) produced by the scanning job (emails, reports, etc.)? [yes, no]

H. Please itemize the features that you envision would compel you to use the scanning service at IUP, or to use it more often:
check boxes: weighting per individual question; text recognition to support short answer questions; submission of jobs using my computer's scanner or my department's copier; scoring to account for incorrect answers;
open boxes: other (three boxes)

I. Please indicate the problems, if any, you have encountered when submitting scanning jobs. If you have never used the scanning services, leave this blank. [open box]

J. What reasons, if any, do you have for not using the scanning services at IUP?
check boxes: I tried it once but didn't like the experience; I don't give multiple choice quizzes/exams; the submission instructions and/or process is unclear/difficult/annoying; the current features for question weighting do not suit my needs; the current output/report options do not meet my needs;
open boxes: other

K. Please indicate improvements or features that you would like to see implemented in an updated scanning service [open box].

Possible flow:
1. question A: if YES, go to #2; if NO, go to #3
2. question B, question C, question D, question E, question F, question G; go to #4 (could break these into two pages)
3. question J; go to #4
4. question H; go to #5
5. question I; go to #6
6. question K; go to #7
7. thank you
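To make the branching concrete, here is a minimal sketch in Python, assuming a hypothetical ask() callback that presents a question by its letter and returns the response; this does not reflect any actual survey tool's API:

    def run_survey(ask):
        # non-users skip the usage questions (B-G) and instead explain
        # why they don't use the service (J); everyone sees H, I, and K
        if ask("A") == "yes":
            for q in ("B", "C", "D", "E", "F", "G"):  # could be split over two pages
                ask(q)
        else:
            ask("J")
        for q in ("H", "I", "K"):
            ask(q)
        print("Thank you")

Routing NO answers to #3 rather than straight to #4 ensures that question J (the "why not" question) is asked of non-users, who would otherwise never see it.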