
Reporting Design for Data Based on the Assessment of General Education Goals

Data Collection (Assessment Implementation Team)

Once each semester (mid-November and mid-April), the Assessment Implementation Team uses a rubric to assess samples of a cross-curricular project. In 2001-2002 the focus is on writing; in 2002-2003, on mathematical problem-solving. Each sample is scored both holistically and analytically against a previously established rubric.

Data Compilation (Clerical support staff)

The scoring sheets are submitted to the assessment clerical support staff for compilation of results. The compilation includes:

  • using Banner numbers to identify the academic history profiles of the samples in the cohort and setting up a database with the fields identified as significant for each assessment (see the Profiles section below for possible fields);
  • replacing Banner numbers with assessment key numbers (a one-way, irreversible translation) for each sample so that assessment scores will be fully anonymous;
  • computing results for each sample by averaging the scores from its two or three readers, rounded to the nearest .5 or .0 (a code sketch follows this list);
  • combining the assessment results for each sample (now known only by key number) with the academic history profiles.
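
The anonymization and averaging steps can be sketched briefly in Python. This is a minimal illustration, not the College's actual procedure: the salted-hash key scheme, the sample data, and the rounding rule (ties rounded up) are assumptions.

    import hashlib, hmac, math, secrets

    # A per-run secret salt; discarding it afterward makes the Banner-to-key
    # translation one-way and irreversible.
    salt = secrets.token_bytes(16)

    def key_number(banner_number: str) -> str:
        # Translate a Banner number into an anonymous assessment key number.
        return hmac.new(salt, banner_number.encode(), hashlib.sha256).hexdigest()[:10]

    def average_score(reader_scores: list[float]) -> float:
        # Average the two or three reader scores, rounding to the nearest .5 or .0.
        mean = sum(reader_scores) / len(reader_scores)
        return math.floor(mean * 2 + 0.5) / 2

    raw = {"B00123456": [3.0, 4.0, 3.5]}   # hypothetical Banner number and reader scores
    compiled = {key_number(b): average_score(s) for b, s in raw.items()}
    del salt                               # destroy the key material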

See the section below on Student Learning Assessment Data Compilation for detail.

The resulting descriptive reports are returned to the Implementation Team for study (early December and early May). The second report of the year is cumulative, combining fall and spring data for a cohort total of at least 100 students.
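
As a minimal sketch, assuming each semester's compiled, anonymous results are stored as CSV files (the file and column names here are hypothetical), the cumulative report could be assembled with pandas:

    import pandas as pd

    fall = pd.read_csv("fall_compiled.csv")      # key_number, holistic, analytic, ...
    spring = pd.read_csv("spring_compiled.csv")

    cumulative = pd.concat([fall, spring], ignore_index=True)
    assert len(cumulative) >= 100, "cohort total should reach at least 100 students"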

Data Interpretation (AIT & clerical staff in support of Departments)

The descriptive reports are studied to generate appropriate inferential reports for use in developing program improvements.

  • From the descriptive reports, the Assessment Implementation Team selects areas for inter-item analysis in order to clarify results and suggest inferences that departments might find useful. The Institutional Researcher may assist with this selection, and clerical staff sorts the data for correlation summaries, as sketched after this list. (January & mid-May)

  • Both the descriptive and inferential reports are studied and clarified at the May Assessment Workshop and then sent to departments with suggestions for interpretation. (Late May)

  • Department chairs review the reports and share them with department members to identify further interpretive questions, returning to the AIT for data correlations and clarifications if necessary. Departments use the assessment results to design interventions for improving student learning. (Summer and Fall)
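
A correlation summary of the kind described in the first item might be produced as follows; the rubric items, profile fields, and file name are assumptions for illustration:

    import pandas as pd

    # One row per sample: assessment scores already joined to profile fields.
    merged = pd.read_csv("merged_results.csv")

    rubric_items = ["holistic", "organization", "mechanics"]
    profile_fields = ["accuplacer_writing", "credits_completed", "gpa"]

    # Pairwise correlations between rubric items and actionable profile fields.
    summary = merged[rubric_items + profile_fields].corr().loc[rubric_items, profile_fields]
    print(summary.round(2))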

Archiving (AIT and clerical staff)

All scoring sheets and databases are catalogued by key number and stored in appropriate files for retrieval in future years for longitudinal studies and cohort comparisons. (June)

Profiles (Detail)

Fields to be included in profiles are limited to those relevant to each assessment's purposes. They are primarily areas of academic history over which the school exercises some control; they exclude ethnicity, age, gender, and other background categories that might be of interest for general research but not for designing interventions to improve student learning.

Fields included:

  • Accuplacer scores in reading & writing
  • Number of college credits completed
  • Program affiliation
  • Grades in courses related to the assessment goal
  • GPA
  • History of enrollment in a Learning Community or First Year Experience class
  • Other (use of learning support services, completion of tangentially related courses, etc.)
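
One possible shape for the profile database, sketched with Python's built-in sqlite3 module; the column names and types are assumptions drawn from the field list above:

    import sqlite3

    con = sqlite3.connect("assessment_profiles.db")
    con.execute("""
        CREATE TABLE IF NOT EXISTS profiles (
            key_number            TEXT PRIMARY KEY,  -- replaces the Banner number
            accuplacer_reading    REAL,
            accuplacer_writing    REAL,
            credits_completed     INTEGER,
            program               TEXT,
            related_course_grades TEXT,
            gpa                   REAL,
            lc_or_fye             INTEGER,  -- 1 if enrolled in a Learning Community or FYE class
            other_notes           TEXT      -- learning support services, related courses, etc.
        )
    """)
    con.commit()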

Student Learning Assessment Data Compilation (Detail)

The compilation divides its eight steps between the assessment team with clerical support and BANNER support personnel:

1. Pick the assessment samples. (assessment team & clerical support)
2. Provide profile information (by BANNER #) about each sample. (BANNER support)
3. Assign a key number to each sample. (assessment team & clerical support)
4. Generate data through the assessment. (assessment team & clerical support)
5. Compute the averages for each sample. (BANNER support)
6. Connect profile information to assessment data. (assessment team & clerical support)
7. Destroy the keys. (assessment team & clerical support)
8. Maintain the anonymous, aggregate data. (assessment team & clerical support)

This division of functions ensures the anonymity of participants in the assessment activity.
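
Steps 6 and 7 can be sketched under the same assumptions as the earlier examples (pandas available, file names hypothetical, and a temporary Banner-to-key map kept only during compilation):

    import os
    import pandas as pd

    profiles = pd.read_csv("profiles_by_key.csv")   # from BANNER support, keyed anonymously
    averages = pd.read_csv("averages_by_key.csv")   # from the assessment team

    # Step 6: connect profile information to assessment data.
    merged = profiles.merge(averages, on="key_number")
    merged.to_csv("anonymous_aggregate.csv", index=False)

    # Step 7: destroy the keys so no record can be traced back to a student.
    os.remove("banner_to_key_map.csv")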
