Florida State University
Computer Science
Instructor Course Reporting Instructions

This page describes the program assessment information each faculty member is required to submit, each term, for each course taught. The kinds of information that may be required, depending on the specific course, are described in the sections below.

All of the information should be submitted through the website http://abet.cs.fsu.edu/.

Syllabus (for all courses)

Every term, at the start of the term, the instructor of every course is required to provide the students with a syllabus. The Department requires that the instructor provide a copy of this syllabus to the department, no later than when it is provided to the students.

The University imposes several requirements on this syllabus, which are subject to change and so may require the syllabus to be updated each term the course is taught. One of these requirements is that the syllabus include the "approved objectives" for the course. The approved objectives for each course in the Department of Computer Science are accessible via links from http://www.cs.fsu.edu/ugcc/.

Course Report (for all courses)

At the end of each term, in every undergraduate course, you are required to submit a brief report containing your comments on the levels of learning demonstrated by the students in your course(s). This end-of-term report is also advice to the next instructor of the course (possibly yourself) and to the Undergraduate Curriculum Committee. You are free to use your own format for the report and to keep it short if you do not have much to say.

Questions you will need to address in your report include the following:

  1. Which specific assignment, or combination of assignments, did you use to assess the SLO(s)? Please provide the title(s) exactly as they appear in the course Canvas site.
  2. What was the numeric threshold that you used to measure learning success on this assignment?
  3. What particular aspects of the knowledge, skills, and/or values/attitudes that fall under the target SLO did students learn, and what did they struggle with?
  4. Why do you think the students did well on the items you identified in your response to question 3 above?
  5. Why do you think the students struggled on the items you identified in your response to question 3 above?
  6. Which specific changes will you implement in the upcoming term(s) to better support student learning of the target knowledge, skills, and/or attitudes/values?

ABET/SMALC Outcome Assessments (for designated courses)

Additional information is required from the faculty members teaching the courses we have designated for assessment of program outcomes, in order to satisfy the requirements of our accrediting organizations (SACS and ABET) and the State of Florida (State-mandated Academic Learning Compacts, or SMALCs).

These entities want to see evidence that we have a formal feedback process for continuous self-improvement. (The underlying thinking is similar to Total Quality Management (TQM), the Capability Maturity Model (CMM) in software engineering, and ISO 9000.) As such, they are not satisfied if we simply gather data; they expect that we (1) make use of the data to manage the quality of our program, and (2) document this process in a way that they can review.

These assessment requirements are based on the concept of measurable "outcomes" with respect to our students. Our department has defined several specific "student learning outcomes" (referred to by capital letters "A" through "O") that we have promised to assess regularly. These are intended to cover some higher-level "program outcomes" (referred to by lowercase letters "a" through "j") that are imposed by ABET's Computing Accreditation Commission (CAC). For internal purposes, we have mapped the ABET/CAC outcomes to the FSU CS SMALC outcomes; a table showing how these two sets of outcomes relate to one another is provided at http://abet.cs.fsu.edu/.

For each FSU CS SMALC outcome associated with your course, you need to provide the following:

  1. A copy of the instrument used for the assessment
  2. A copy of the rubric used for the assessment
  3. A table of scores on the assessment
  4. A summary of the results of the assessment
  5. Your recommendations for improvements

Each of these items is explained in more detail below.

Outcome

You need to read this section before the start of the term.

Program outcomes are statements that describe what students are expected to know and be able to do by the time of graduation. Student learning outcomes are more specific outcomes that provide evidence of the program outcomes. For a listing and explanation of the student learning outcomes that our department has contracted to monitor and assess, how we have agreed to assess them, and their relationship to the ABET program outcomes, look under the specific year under the heading "Program Outcomes" at http://www.cs.fsu.edu/ugcc/.

1. Instrument

Submit a PDF file containing the instrument you used for the assessment. This will ordinarily be a copy of the assignment, or the exam questions, on which you based your assessment of how well the students achieved the given outcome. The choice of the type of instrument for each outcome is determined by the department. If you are not sure what the department has determined for a given assessment, please check with the Assessment Coordinator.

The kind of instrument you are required to use depends on the specific outcome. See the 2009-2010 Outcomes for specifics.

2. Rubric

Submit a PDF file containing the rubric that you used to assess how well the students achieved the given outcome. This rubric should be "common", that is, the same among all sections of a given course, across all instructors and all campuses. The instructors of the course should agree on this. Once agreed upon, it should be passed on from one instructor to the next, until all the instructors of the course agree to revise it.

A rubric is a scheme for assigning a grade or score to a student or a piece of student work. Its purpose is to break the evaluation process down into a series of smaller decisions, which can be made more objectively and consistently. For a very good example, see the second page of Daniel Chang's rubric for Oral Competency.

Most instructors use rubrics for grading student work. However, please keep in mind that when we are assessing our SMALC/ABET student learning outcomes, we have a different purpose. Therefore, the rubric you use for assessing the SMALC/ABET outcome should be either separate from the one you use for grading or a subset of it, so that it covers only the SMALC/ABET outcome and does not bring in extraneous matters.
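
To make the idea concrete, a rubric can be thought of as a small data structure: a list of items, each judged on a simple scale, whose item-level decisions roll up into one summary rating. The sketch below is only an illustration under assumed conventions (made-up item names, the I/E/H scale, and a rounded-average roll-up rule); it is not a required or department-standard format.

  # Illustrative sketch only: a rubric as a list of items, each scored on the
  # I (ineffective) / E (effective) / H (highly effective) scale.
  # The item names and the roll-up rule are assumptions, not department policy.

  RUBRIC_ITEMS = ["Organization", "Technical content", "Delivery"]  # hypothetical items
  LEVELS = {"I": 0, "E": 1, "H": 2}

  def overall_rating(item_scores):
      """Roll per-item I/E/H scores into a single summary rating.

      Here the summary is the rounded average level; an instructor might
      instead use the minimum item score, a weighted average, etc.
      """
      avg = sum(LEVELS[item_scores[item]] for item in RUBRIC_ITEMS) / len(RUBRIC_ITEMS)
      return ["I", "E", "H"][round(avg)]

  # Example: one student's per-item scores roll up to an overall "E".
  print(overall_rating({"Organization": "E", "Technical content": "H", "Delivery": "E"}))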

3. Table of Scores

Submit a table of the scores of the students on the assessment, one line per student, as a tab-delimited text file or an Excel spreadsheet. Please do not include student names or any other identifying information. Include only students who appeared on the final grade roster (no drops). If you recorded detailed scores separately for each item on your rubric, each row of the table will contain the scores on the individual rubric items, followed by the overall score. The following are two of many possible formats. The first uses letters for the scores ("I" = ineffective, "E" = effective, "H" = highly effective), and the second uses numeric scores for the items and then converts them to the I-E-H scale for the summary. A short script sketch for producing such a table appears after the sample tables.

  Student   Rubric Item 1   Rubric Item 2   ...   Summary
  1         E               H               ...   E
  2         E               E               ...   E
  3         I               H               ...   E
  4         I               I               ...   I
  ...       ...             ...             ...   ...
  25        H               H               ...   H

  How many received a grade of "C-" or better?            24
  Of those, how many achieved E or H on the assessment?    23

  Student   Rubric Item 1   Rubric Item 2   ...   Summary
  1         75              93              ...   E
  2         70              80              ...   E
  3         50              92              ...   E
  4         49              59              ...   I
  ...       ...             ...             ...   ...
  25        90              100             ...   H

  How many received a grade of "C-" or better?            24
  Of those, how many achieved E or H on the assessment?    23
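
If you keep numeric per-item scores in a gradebook export, a short script can produce the tab-delimited table and the I-E-H summaries shown above. The following is a minimal sketch, assuming a 0-100 scale for each rubric item and made-up cutoffs for the conversion (below 60 = I, 60-84 = E, 85 and above = H); your rubric items, cutoffs, and conversion rule may well differ.

  import csv

  # Assumed cutoffs for converting a 0-100 rubric-item average to the I/E/H
  # scale; these thresholds are illustrative, not a department standard.
  def to_ieh(avg):
      if avg >= 85:
          return "H"
      if avg >= 60:
          return "E"
      return "I"

  # Made-up, de-identified scores: (rubric item 1, rubric item 2) per student,
  # limited to students who appeared on the final grade roster.
  scores = [(75, 93), (70, 80), (50, 92), (49, 59), (90, 100)]

  with open("outcome_scores.txt", "w", newline="") as f:
      writer = csv.writer(f, delimiter="\t")
      writer.writerow(["Rubric Item 1", "Rubric Item 2", "Summary"])
      for item1, item2 in scores:
          writer.writerow([item1, item2, to_ieh((item1 + item2) / 2)])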

4. Summary of Results of the Assessment

Submit a summary and analysis of the results of the assessment. Your summary should answer the question: "What progress did you make on the outcome, as evidenced by the assessment?" It should:

  1. Summarize the assessment results
  2. Present the most valuable finding from your analysis of results
  3. Assess relationship to standards established for this outcome by the department
  4. Explain any deviation from the established standards
  5. Identify responsibility for collection and evaluation of the results
  6. Report other insights that you have uncovered through the analysis
  7. Provide some evaluation of the method you used and its continuing status as a tool for assessment of outcomes

The summary should be supported by the data files that you uploaded. In preparing files for upload, remember that individual student records are covered by the university's privacy policies. Social security numbers and student names should not be included in any results stored in the file bank.

The summary of results should contain enough detail to corroborate your assessment and analysis of the data. It may highlight areas of success in addition to areas needing improvement. Generally, your analysis should state its most valuable finding.

Example

The department exam was administered to all sections (3) of EXP 3000 at the end of the fall semester. Results indicated that a mean of 69% of the students (62%, 66%, and 78% per section) scored 70% or better on the 17-item exam. An item analysis of the exam questions was conducted, and it was found that one question had a significant flaw and that there were problems with several others. The exam was modified: 3 questions were replaced, 2 questions were reworded, and 3 new questions were added. The 20-question exam was then administered to two of the three sections of EXP 3000 at the end of the spring semester. One section did not administer the exam because of a miscommunication with the instructor. Across both sections, 92.5% of the students (88% and 97% in the individual sections) scored 70% or better on the exam.

Results: A mean of 69% of the students (62%, 66%, and 78% per section) scored 70% or better on the 17-item exam.

Relationship to Standard: 69% (as compared to the standard of 85%)

Analysis: An item analysis of the exam questions was conducted.

Deviation from Standard: One question had a significant flaw, and there were problems with several others.

Responsibility: Faculty teaching this course.

Insights Uncovered: The exam was modified, i.e., 3 questions were replaced, 2 questions were reworded, and 3 new questions were added.

Evaluation of assessment method: An item analysis and expert review.
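
The section-level and overall pass rates quoted in an example like this are straightforward to compute from the raw exam scores. The sketch below shows one way to do so; the score lists are made up purely to illustrate the calculation and do not correspond to the example's actual data.

  # Illustrative only: compute, per section and averaged across sections, the
  # share of students who scored 70% or better on the exam. Scores are made up.
  sections = {
      "Section 1": [72, 65, 80, 55, 90, 71, 68, 77],
      "Section 2": [60, 75, 82, 69, 73, 91],
  }

  per_section = {
      name: sum(score >= 70 for score in scores) / len(scores)
      for name, scores in sections.items()
  }
  mean_rate = sum(per_section.values()) / len(per_section)

  for name, rate in per_section.items():
      print(f"{name}: {rate:.0%} scored 70% or better")
  print(f"Mean across sections: {mean_rate:.0%}")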

5. Action

Submit your recommendations regarding actions that the department or the next instructor should take in regard to this outcome. The purpose of this section is to show that you have reflected on the results of the assessment and learned something from them that can be applied to improving the quality of our program and/or the assessment process.

This section answers the questions: What are you going to do to improve learning, or the program? Who is responsible?

The results and analysis statement should spur some action to continue improvement.

These improvements can include actions that are indirectly related to improvements in the student learning outcome being assessed.

If your prospective improvements or action plan requires additional resources, their potential budget impact should be noted.

Please remember that if you are already performing at a high level, you should state how you expect to retain that level of performance. Your statements will provide a record over time of your efforts to improve students' learning experiences and our programs, so they should be documented carefully and fully.

Example:

The goal was not met in the fall. This appears to be due to the content validity of several questions on the exam. We replaced these items in the spring, resulting in the goal being met. However, the exam was administered to only two of the three sections, so that we did not assess 100% of the population. The Undergraduate Studies Committee and faculty who teach this course will review the exam early in the Fall 2008 semester to evaluate the validity of the exam and whether it continues to assess the most important elements of the knowledge we wish students to attain. The exam will be revised as necessary and administered to all sections of this course (we will make sure there is no miscommunication with instructors and that they all understand the importance of administering the exam to their students).

Improvement envisioned: Review the exam to evaluate the validity of the exam and whether it continues to assess the most important elements of the knowledge we wish students to attain.

Time Frame: Early in the Fall 2008 semester

Responsibility: The Undergraduate Studies Committee and faculty who teach this course

Actions to retain results: None noted.

Resources: None noted.

Your comments may be read by:

  1. Inside the department: the Department Chair, Director of Undergraduate Studies, and Undergraduate Curriculum Committee.
  2. Inside the College: the Dean and associate deans.
  3. In the University: the Provost and his staff, and students.
  4. Outside the University: auditors and accreditation reviewers from ABET, SACS, and the Florida Board of Education, as well as the general public.

You should comment separately on each SMALC/ABET outcome that you are required to assess. There is no required format. The following are examples of the kinds of things that it would be appropriate to include:

  1. Suggestions for improving the student outcome, if any.
  2. Evaluation of the method you used to do the assessment (i.e., the assignment/exam questions and the rubric). Is the assessment method working? Can it be improved? If so, what do you suggest?
  3. Evaluation of the outcome itself. Is it appropriate? Do you suggest any change in the statement of the outcome?
  4. Other insights you have gained from the assessment you did, if any.

If things are going well, you may not need to write much.

These comments/recommendations are likely to overlap in content with your basic course report. If so, you don't need to duplicate the details. Just write them down once here, and refer to this in your course report.

Computer Competency Assessments

This is a requirement imposed on us by the University and SACS, which applies to the following courses:

For background, see http://liberalstudies.fsu.edu/documents/OCCR_Criteria.pdf.

You must submit:

Note that the table here is not quite the same as that for the SMALC/ABET outcomes. Instead of the I/E/H scale, for this assessment you should report a letter grade on the assessment, and the number of interest is how many achieved a grade of "C-" or better on the assessment. A student should not receive a grade of "C-" or better in the course unless she/he has achieved a grade of "C-" or better on the computer competency component.
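
If you keep numeric grades for the competency assessment, the count of students at "C-" or better is easy to produce. The sketch below is only illustrative; the 70-point cutoff for a "C-" is an assumption, so substitute whatever grading scale your syllabus defines.

  # Illustrative only: count students who earned "C-" or better on the
  # competency assessment. The 70-point C- cutoff is an assumption; use the
  # grading scale defined in your syllabus.
  C_MINUS_CUTOFF = 70

  assessment_scores = [88, 72, 65, 91, 70, 59, 83]  # made-up, de-identified scores

  c_minus_or_better = sum(score >= C_MINUS_CUTOFF for score in assessment_scores)
  print(f"{c_minus_or_better} of {len(assessment_scores)} students earned C- or better")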

Oral Communication Competency Assessments

This is a requirement imposed on us by the University and SACS, and applies only to CIS4250.

For background, see http://liberalstudies.fsu.edu/documents/CC_Criteria.pdf.

You must submit:

Note that the table here is not quite the same as that for the SMALC/ABET outcomes. Instead of the I/E/H scale, for this assessment you should report a letter grade on the assessment, and the number of interest is how many achieved a grade of "C-" or better on the assessment. A student should not receive a grade of "C-" or better in the course unless she/he has achieved a grade of "C-" or better on the oral competency component.