Notes from Nov. 9, 2009 Optical Scanning Group Meeting

attending: McElroy, Marshall, Smith, Lundberg, Beck
not able to attend: Jozefowicz (econ), Gropelli (nursing)

Nate McElroy (chemistry) and Amanda Marshall (IT) met at 3pm in G64 Delaney to discuss the current optical scanning system. Dave Smith (comp sci) joined us at 3:30. We took a tour of the facilities and heard feedback on issues from Richard Beck (IT) and John Lundberg (IT).

The current optical scanning equipment and processes have deficiencies that need to be addressed soon (see list below). Broadly, the issues fall into two categories: 1) back-end and software problems that create hassle and inefficiencies for the personnel who use the equipment on a daily basis, and 2) up-front issues with scan sheet submissions, where the current instructions/forms confuse clients (faculty/staff) and also create problems for the personnel using the equipment.

On average, IT personnel handle 15-20 scoring submissions per day; a submission is one envelope with a header page and a set of scanning sheets. During finals, that load can increase to 50 submissions per day, and heavier periods also occur when faculty evaluation sheets are submitted. Running one set of scan sheets takes 3-10 minutes, depending on the number of software edits and/or corrections that must be made to the header information. Use of the optical scanning system is actually increasing, despite the growing number of online courses. A couple of us thought this might reflect a trend toward larger class sizes, but we have nothing more than anecdotal evidence to support this assumption.

Issues to address include:

  • the software is 10+ years old and has not been supported for 8+ years
  • the software is not very user-friendly; for example, if a faculty member wishes to weight questions at anything more than 1 point each, the IT user must change the weighting manually, one question at a time, even if every question is weighted at 2 points. This leaves room for data-entry error, though errors are infrequent.
  • the reports generated are very static; the raw data can be viewed in only two report types, with no customization available
  • the instructions for filling out the header sheets and the submission folder need to be clarified/rewritten to reduce faculty/staff confusion and frustration and to improve efficiency for the IT users (the scan sheet envelopes were created and are printed by IUP)
  • the submission process and the subsequent method of exam pick-up are not very secure
  • can scanning submissions and reports be better integrated with URSA and/or Moodle?
  • Scantron seems to be the sole vendor of optical scanning for large institutions; are there others, and what are the other PASSHE schools using?
  • can we reduce the amount of paper used/generated?

After identifying these points, the following tasks were assigned:

  • Marshall: look into Scantron products and other possible vendors; find out what products other PASSHE schools use
  • McElroy: get feedback from ACPAC regarding the above issues and proceed with getting other faculty feedback on their needs (survey??); talk to Debbie Wardo about HR's needs for faculty evaluation sheets/data
  • The group will meet again after Thanksgiving break.