
Assessment Manual

Chapter 1: Introduction

Mission, Vision, and Core Values

Mission Statement

Blinn College is building stronger communities by providing quality, comprehensive education, and empowering students to achieve excellence in their educational careers and personal goals.

Vision Statement

Shaping future academic, workforce, cultural, and economic leaders by providing excellent instruction, resources, services, and innovative partnerships, for students and the community.

Core Values

  • Access
  • Collaboration
  • Diversity
  • Excellence
  • Innovation
  • Respect
  • Service

Assessment of student learning is a critical component of all educational institutions, including the Blinn College District. With effective assessment processes in place, the institution can better evaluate student learning as well as the extent to which students are prepared for future educational, employment, and personal success. The information collected through this process provides significant insights allowing instructors, programs, and administrators to pursue continuous improvement in the accomplishment of our mission, vision, and core values.

The Blinn College District Assessment Manual is an ongoing project of the Blinn College Assessment Council, a permanent body composed of faculty and staff from across the institution. The manual is designed as an introduction and guide to the complete assessment process at Blinn College and serves to give both the theoretical basis of assessment in higher education and the practical steps needed to ensure that assessment is ongoing and institution-wide.

The Blinn College District views identification and measurement of student learning outcomes and improvement of student learning to be essential to its educational programs. The College uses several institution-wide assessment processes that are broad based, interrelated, and linked to its mission. The College uses these assessment measures to strengthen and improve student learning:

  • Institution-wide key performance indicators
  • Assessment of general education outcomes through the Texas Higher Education Coordinating Board
  • Program Review—a process which includes student learning outcomes by program area

The Blinn College District also uses Key Performance Indicators for Student Success, such as retention rates, program completion, transfer rates, and graduation rates.

Categories of Assessment

Course-Level Student Learning Outcomes (SLOs)

Each course offered by the Blinn College District is required to include student learning outcomes, per Board Policy EFA (LOCAL). These outcomes describe what a student should know, feel (attitudes and values), do, or be able to demonstrate as a result of the instruction delivered in the course. High-quality SLOs share a few common features:

  • They clearly relate to course concepts, assignments, and other evaluations.
  • They are measurable.
  • They use action verbs describing what students will be able to do, such as describe, list, distinguish, and perform.

SLOs allow the instructor of record and division leadership to evaluate the success of the curriculum, pedagogy, and assessment methodologies for each course. Analysis and evaluation of data collected are used to introduce changes in course curriculum and methodologies and are critical components of promoting continuous improvement in instruction.

Course SLOs are essential elements of any course and, as such, are the most familiar to faculty members. However, these course SLOs are part of a larger academic unit, or “program,” that must also be assessed. The formal definitions of “program” shape how the College District approaches the concept and, in turn, how the concept impacts the assessment process. Because it is the most specific, the definition provided by the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) serves as the basis for the definition of “program” for the purpose of conducting program assessment.

Program Definition, Texas Administrative Code

The Texas Administrative Code is the source of the regulations governing College District operations. In Rule §9.1 of Title 19, Chapter 9 of these regulations, the State of Texas provides a list of definitions informing educational activity. The list below reproduces the definitions most relevant to defining a program; note that the document does not offer a specific definition for “program.” Rather, Rule §9.1 defines specific types of programs, differentiating between baccalaureate and associate awards as well as between academic and workforce instructional emphases.

  • Academic associate degree — An associate degree that will satisfy the lower-division requirements for a baccalaureate degree in a specific discipline.
  • Applied associate degree — An associate degree intended to lead directly to employment following graduation; it may also satisfy the lower-division requirements for a baccalaureate degree in a specific discipline.
  • Associate degree program — A grouping of courses designed to lead the individual directly to employment in a specific career or to transfer to an upper-level baccalaureate program. This specifically refers to the associate of arts (AA), associate of science (AS), associate of applied arts (AAA), associate of applied science (AAS), and the associate of occupational studies (AOS) degrees. The term “applied” in an associate degree name indicates a program designed to qualify students for immediate employment.
  • Career Technical/Workforce program — An applied associate degree program or a certificate program for which semester credit hours, quarter credit hours, or continuing education units are awarded, and which is intended to prepare students for immediate employment or a job upgrade in a specific occupation.
  • Certificate program — Workforce programs designed for entry-level employment or for upgrading skills and knowledge within an occupation. Certificate programs serve as building blocks and exit points for AAS degree programs.

Program Definition, Board of Trustees

The Blinn College District Board of Trustees also defines several types of programs, in Board Policy EFBA (LEGAL), to guide the institution’s educational offerings and to ensure compliance with external regulators.

Program Definition, SACSCOC

In the interest of promoting a culture of continuous improvement in member organizations, SACSCOC requires that institutions provide evidence of the establishment and monitoring of student learning outcomes for each of their educational programs (Resource Manual for the 2018 Principles of Accreditation: Foundations for Quality Enhancement, Standard 8.2).

Notably, SACSCOC did not specifically define the term Educational Programs, leaving the interpretation of this requirement to member institutions. The resulting ambiguity was resolved in the publication of the updated Substantive Change Policy and Procedures, Revised December 2020:

Program: A coherent course of study leading to a for-credit credential including a degree, diploma, certificate, or other generally recognized credential.

This definition of program is inclusive of all awards or credentials offered by the institution as a result of a student completing courses for college credit, and it applies to general education even though that course of study does not generally result in a credential.

All courses of study leading to an Associate degree, Certificate, or an Occupational Skills Award (OSA) are considered programs by SACSCOC and are subject to the processes described in the Assessment Process Handbook. Only not-for-credit courses of study are exempt from the requirements herein.

Stackable Awards Program Assessment

Many of the awards granted by the College District are closely related to each other and share administration, budgets, and personnel. This is particularly the case for disciplines with stackable credentials, where students can earn awards at different levels within the same instructional discipline; these programs may be considered Stacked Programs. For instance, students interested in a career in the carpentry discipline could earn an Occupational Skills Award (OSA), a Certificate Level I, a Certificate Level II, and/or an Associate of Applied Science (AAS). Each of these credentials is considered a program by SACSCOC, even though they share considerable content and administrative overlap. Similarly, students interested in the field of accounting can earn an Accounting Assistant Specialist (Certificate Level I), an Accounting Technology Certificate (Certificate Level II), an Accounting and Financial Services Pathway (a Certificate Level I specific to dual credit students), or an Associate of Applied Science. These awards also share considerable administrative overlap, though the degree of overlap is reduced compared to the carpentry example above.

Program-Level Student Learning Outcomes (PSLOs)

Each program is required to enact a Program-Level Student Learning Outcomes (PSLOs) assessment plan that measures student learning associated with successful program completion. Metrics for this type of assessment plan may include capstone projects, student portfolios, departmental examinations, external licensing examinations, or other measures that can be attributed to student learning as a result of program completion. PSLOs may also take the form of selected course-level assessments, so long as the program can articulate how these course-level assessments represent or contribute to program-level student learning.

Each program should develop and maintain 2 to 3 PSLOs. Programs may maintain more than 3 PSLOs, but it is not recommended. Each PSLO should be documented in the Strategic Planning Online (SPOL) platform, which is the cloud-based database system used by Blinn College District to aggregate planning, program assessment, and faculty credentialing data for program review, strategic planning, and accreditation.

Each program area has a Curriculum Resource Team (CRT), which is a committee that functions as a recommending body on academic matters to academic leadership. The CRT is composed of faculty members who, as subject matter experts in the discipline, will provide curriculum recommendations to academic deans, curriculum committee, the Vice Chancellor of Academic Affairs, and other relevant administration at the College. Each instructional unit, in collaboration with its CRT and leadership, must decide how often each PSLO is measured. In the case of staggered assessment of PSLOs, the instructional unit must arrange the assessment schedule so that at least 1 PSLO is assessed every year and the nature of the assessment schedule is described in the assessment report.
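A staggered schedule like the one described above can be sketched programmatically. The following Python snippet is illustrative only: the PSLO names, cycle length, and one-per-year rotation policy are hypothetical, and real schedules are set by the instructional unit in collaboration with its CRT and leadership.

```python
from itertools import cycle

def staggered_schedule(pslos, start_year, num_years):
    """Assign one PSLO to each year in rotation, so that at least one
    PSLO is assessed every year (illustrative rotation policy only)."""
    rotation = cycle(pslos)
    return {start_year + i: next(rotation) for i in range(num_years)}

# Hypothetical three-PSLO program assessed over a six-year window.
plan = staggered_schedule(["PSLO 1", "PSLO 2", "PSLO 3"], 2024, 6)
for year, pslo in plan.items():
    print(year, pslo)
```

With three PSLOs each outcome is revisited every third year, while the requirement that at least one PSLO be assessed annually is still met.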

Program Level Outcomes (PLOs)

Program Level Outcomes (PLOs) are required for each program in the College District program inventory, as they help program administrators better understand the program’s accomplishment of its goals. While PLOs are focused on the program level rather than the course level, they are not directly tied to student learning. Rather, PLOs address related goals such as increases in enrollment, graduation or transfer rates of students, student success on licensure examinations, and other indicators of program success.

Program evaluation plans should describe what data will be collected, when the data will be collected, and who will perform the collection. Additionally, the plan should describe who will analyze and evaluate the data, when, and how, as well as who will be charged with decision-making resulting from this analysis and evaluation. In this context, analysis refers to the aggregation and visualization of data to identify possible trends. Evaluation occurs after analysis and is the process used to extract meaning from the analysis. Evaluation should address several questions, such as:

  • What is the nature of the observed trend?
  • Is this trend good or bad for this program?
  • What might be potential causes of observed trends?
  • What actions might the program take to produce more favorable trends for the program, division, institution, students, and other key stakeholders?

The assessment plan should specify which entities will perform the analysis, provide the evaluation, and make recommendations in response. It should also specify the college district official empowered to authorize program changes in response to data evaluation and the process by which that official will be made aware of recommendations. Each program should develop and maintain 2 to 3 PLOs. Programs may maintain more than 3 PLOs, but it is not recommended.
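To make the distinction between analysis and evaluation concrete, the sketch below walks through both steps on fabricated enrollment data. The numbers and the simple net-change trend rule are illustrative assumptions only, not Blinn College data or methodology.

```python
# Hypothetical enrollment counts per academic year (fabricated data).
enrollment = {2020: 118, 2021: 124, 2022: 131, 2023: 127, 2024: 139}

years = sorted(enrollment)

# Analysis: aggregate the data to surface a possible trend.
year_over_year = [enrollment[b] - enrollment[a] for a, b in zip(years, years[1:])]
net_change = enrollment[years[-1]] - enrollment[years[0]]

# Evaluation: extract meaning from the analysis for decision-makers.
trend = "increasing" if net_change > 0 else "decreasing" if net_change < 0 else "flat"
summary = f"Net change of {net_change:+d} students over {len(years)} years; trend is {trend}."
print(summary)
```

The analysis step stops at the numbers; the evaluation step attaches a judgment (here, a crude trend label) that a program official could act on.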

General Education Assessment (Core Curriculum)

The State of Texas mandates that all public higher education institutions maintain a core curriculum of at least forty-two semester hours and assess the effectiveness of the core. Each institution of higher education in the State of Texas is required to submit data reports to the Texas Higher Education Coordinating Board (THECB) every ten years. In 2014, new Core Objectives were developed by the THECB’s Undergraduate Education Advisory Committee (UEAC) as part of the Core Curriculum Revision Project. The Core Objectives are: Critical Thinking Skills, Communication Skills, Empirical & Quantitative Skills, Teamwork, Social Responsibility, and Personal Responsibility. Each Foundational Component Area carries a “Required” or “Optional” designation for each Core Objective, as follows:

Foundational Component Area | Critical Thinking | Communication Skills | Empirical & Quantitative Skills | Teamwork | Social Responsibility | Personal Responsibility
Communication | Required | Required | Optional | Required | Optional | Required
Mathematics | Required | Required | Required | Optional | Optional | Optional
Life & Physical Sciences | Required | Required | Required | Required | Optional | Optional
Language, Philosophy and Culture | Required | Required | Optional | Optional | Required | Required
Creative Arts | Required | Required | Optional | Required | Required | Optional
American History | Required | Required | Optional | Optional | Required | Required
Government / Political Science | Required | Required | Optional | Optional | Required | Required
Social / Behavioral Science | Required | Required | Required | Optional | Required | Optional
Component Area Option** | Optional | Optional | Optional | Optional | Optional | Optional

**Component Area Option MUST contain a minimum of 3 Core Objectives selected by the institution.

REQUIRED = Core Objectives that must be addressed in each course selected for inclusion in the Foundational Component Area.

OPTIONAL = Core Objectives the institution may choose to include for each course selected for inclusion in the Foundational Component Area.
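The matrix above can be treated as data when checking course proposals. The sketch below transcribes the Required cells into a Python mapping and flags any required Core Objectives a proposed core course does not yet address; the function names and workflow are illustrative, not part of any Blinn College system.

```python
# Required Core Objectives per Foundational Component Area,
# transcribed from the table above (Optional cells omitted).
REQUIRED_OBJECTIVES = {
    "Communication": {"Critical Thinking", "Communication", "Teamwork", "Personal Responsibility"},
    "Mathematics": {"Critical Thinking", "Communication", "Empirical & Quantitative"},
    "Life & Physical Sciences": {"Critical Thinking", "Communication", "Empirical & Quantitative", "Teamwork"},
    "Language, Philosophy and Culture": {"Critical Thinking", "Communication", "Social Responsibility", "Personal Responsibility"},
    "Creative Arts": {"Critical Thinking", "Communication", "Teamwork", "Social Responsibility"},
    "American History": {"Critical Thinking", "Communication", "Social Responsibility", "Personal Responsibility"},
    "Government / Political Science": {"Critical Thinking", "Communication", "Social Responsibility", "Personal Responsibility"},
    "Social / Behavioral Science": {"Critical Thinking", "Communication", "Empirical & Quantitative", "Social Responsibility"},
    "Component Area Option": set(),  # no required cells; see the minimum-of-3 rule below
}

def missing_objectives(component_area, addressed):
    """Required Core Objectives the proposed course does not yet address."""
    return REQUIRED_OBJECTIVES[component_area] - set(addressed)

def component_area_option_ok(selected):
    """Component Area Option must carry at least 3 institution-selected objectives."""
    return len(set(selected)) >= 3
```

For example, a proposed Mathematics core course addressing only Critical Thinking and Communication would be flagged as still missing Empirical & Quantitative Skills.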

Blinn College evaluates all courses in the core curriculum each year to ensure that students receive a quality general education. As with course-level SLOs, the respective CRT develops a common assignment and rubric and sets appropriate benchmarks for students’ level of proficiency. Data are collected according to the academic unit’s assessment plan, and results are reported in SPOL.

Chapter 2: Getting Started

What Is Assessment?

  • Assessment is the shared process of gathering purposeful and systematic measurement for documentation, reflection, and improvement of both student learning and institutional practices.
    • Shared reflects a collective responsibility of instructors, students, support staff, and administrators to work together to enhance student learning and institutional practices.
    • Purposeful implies careful planning.
    • Systematic refers to a continuous and organized effort.
    • Measurement refers to quantifying meaningful data.
    • Documentation provides evidence.
    • Reflection encourages the college to consider what and how much our students are learning.
    • Improvement enriches the quality of education.

What Is the Purpose of Assessment?

  • Assessment is the product of ongoing reflection on content and methods in the delivery of information. This key element makes assessment not a rigid formulation of regulations but a process that keeps changing through time. Assessment involves continuously recognizing changes in an institution and in its individual players in the process of accomplishing institutional and individual missions, goals, and objectives.
  • When assessment is used, a comparison is made between “where the individual is” and the standard of “where that individual wants to be.” In so doing, assessment generates information and/or systematic, data-driven outcomes that can be used for further reflection. The data need to be meaningful and presented in a way that enables decision-makers to make appropriate changes and adjustments. The goal is to use the feedback to let go of practices that are not effective while allocating resources effectively.
  • Assessment of student learning evaluates the learners’ understanding of the material being taught in relation to the missions, goals, and objectives of a course, discipline, and institution. Assessment also provides critical insights into the skills and knowledge learned as well as on the learners’ progress and success.
  • Assessment not only maintains and improves the quality of college programs and services but also fosters student success.
  • Assessment examines the consistency between goals at the institution, division, and program levels.
  • Assessment examines the consistency between stated objectives and assessment strategies.
  • Assessment helps meet the standards set by accreditation agencies.
  • Assessment quantifies the quality of instruction, course material, and the students’ comprehension of the material. Since it is important to know what is working well and what needs to be approached differently to ensure that students are mastering the course material, a faculty member may want to ask:
    • Are students learning what you want them to learn?
    • Is your class teaching what students need to know about your subject?
    • Is your division facilitating student learning in your classroom?
    • Are you quantifying and/or defining/describing what a student has learned with reference to a defined goal/outcome?
    • Do you have a means of measuring how well things are being accomplished?
    • Have you established the degree of accomplishment for a given task?

In other words, assessment can 1) identify areas for improvement, 2) validate the instruction or program, and 3) ensure safety and efficiency.

Who benefits from assessment?

  • Students can assess their own learning to determine which educational choices are working or not working for them and how to make better decisions in the future.
  • Prospective students can determine how well this institution helps them reach their learning, professional, and personal education goals.
  • Instructors can assess their teaching strategies to identify areas that can lead to improvement of students’ comprehension and performance.
  • Administrators can assess whether departments and divisions are meeting their stated goals and how to help them meet those goals.
  • Support staff can use assessment to determine how well their programs meet the students’ needs and what sort of action plan will be implemented to make improvements.
  • Accreditation agencies can evaluate the quality of education which must be demonstrated by substantial compliance with relevant standards.

Where Can We Use Assessment?

Assessment can take place within the classroom, program, division, or institution.

  • Within courses, assessment can be used to determine what individual students are learning and how well they are meeting the goals of the course.
  • Across courses, assessment can be used to determine what and how well individual students are learning as they progress through a particular program or over their years at college.
  • Within departments/programs/disciplines, assessment can be used to determine how well a class or course is meeting its stated goals and/or objectives to advance institution-wide goals.
  • At the institutional level, assessment can be used for internal improvement or to meet external demands of accountability.

Assessment Is Not Evaluation!

  • Evaluation analyzes and uses data to make judgments about student performance.
  • Assessment analyzes and uses data to make decisions about improvements in teaching strategies and student learning.

Evaluation Versus Assessment

Evaluation | Assessment
An instructor grades an exam and assigns a grade to a student. | An instructor provides feedback to a student regarding his or her performance on an exam. The student uses that feedback to study differently and improve learning and performance.
Pop quizzes are given in class to determine if students have done their reading before coming to class. Simple pass/fail grades are assigned and totaled at the end of the semester; the quizzes count for 5% of the total grade. | A team of instructors analyzes the exam results of all students and discovers that fewer than 40% of the students demonstrated understanding of an important concept. The team investigates possible causes and makes improvements to teaching/learning strategies.
An instructor grades an essay and assigns a grade to a student. | An instructor develops a grading rubric for an essay assignment to help students understand the grading criteria and encourages them to submit drafts for feedback.
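The team-of-instructors scenario above hinges on a simple computation: the share of students demonstrating understanding of a concept. The sketch below shows one way to compute it; the scores, the 70-point mastery cutoff, and the 40% trigger are all hypothetical.

```python
def mastery_rate(scores, cutoff):
    """Fraction of students whose concept score meets the cutoff (hypothetical rubric)."""
    return sum(score >= cutoff for score in scores) / len(scores)

# Fabricated exam scores for one concept across a course section.
scores = [45, 80, 55, 90, 30, 65, 50, 85]
rate = mastery_rate(scores, cutoff=70)
needs_attention = rate < 0.40  # mirrors the "fewer than 40% demonstrated understanding" finding
```

Here 3 of 8 students meet the cutoff (a rate of 0.375), so the concept would be flagged for the team to investigate.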

Potential Barriers to Assessment

  • Assessment is not new – some faculty do it regularly. However, good assessment takes effort and time, which may lead to faculty resistance to formalizing the process or integrating new pedagogical approaches.
  • Some faculty hesitate to become involved in assessment, thinking that assessment might be used to penalize faculty instead of being used to help them attain their course and institutional goals. This usually shows a misunderstanding of the purpose of assessment, which is to provide feedback to facilitate accomplishing goals and objectives.
  • The most common pitfalls of assessment include:
    • Merely complying with external demands.
    • Triggering resistance and hostility of faculty.
    • Gathering data no one will use.
    • Letting administrators do it.
    • Making the process too complicated.
    • Lacking information and understanding about how assessment works.
  • Factors influencing the success or failure of assessment include:
    • Faculty knowledge of assessment, motivation to conduct assessment, and knowledge of how assessments are used or not used.
    • Understanding of how assessment fits into the larger picture of the college mission, vision, and goals.
    • Institutional support.
    • Training, discussion, and peer observations.
    • Objective oversight of assessment.
    • Sharing best practices at Blinn College.
    • Division leadership training in assessment and its effective use.
    • Guest speakers.
    • Motivation to change.
    • Time and money.
    • Explanation of why assessment is done at all levels.
    • Explanation of how assessment helps faculty and their divisions.
    • Hiring assessment consultants/experts.

How to Enhance the Acceptance of Assessment

  • Educate the members of the institution on what assessment is and on how to use it.
  • Coordinate between individuals involved in direct assessment and OIE&EM staff to make the process of data collection and reporting more effective, simpler, and clearer.
  • Train faculty through workshops, interdisciplinary exchange of ideas, and attendance at conferences and symposiums, which will motivate faculty into greater involvement and less hesitation about working with assessment.
  • Make assessment a faculty-driven process at all levels to give faculty a stake in the process and get them more actively involved.

An excellent introduction to the topic of assessment is:

Barbara E. Walvoord’s Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education, 2nd ed. (Jossey-Bass Higher Education, 2010).

Chapter 3 – Good Practices in Assessment

The Assessment Cycle

Assessment is an ongoing, continuously evolving, data-driven process. The process is circular rather than linear.

The Process

  • First, the Texas Higher Education Coordinating Board (THECB) and/or credentialing boards determine student learning outcomes (SLOs) for most courses. The Curriculum Resource Team (CRT) for the respective program area will develop SLOs where none have been identified.
  • Once the SLOs have been developed, the faculty design common assignments and assessment tools such as rubrics to assess the students’ work.
  • The tools are then implemented as part of the data-gathering process.
  • After implementation, the evidence is collected and evaluated to identify any shortcomings in student learning.
  • Once those problems are identified, faculty develop action plans and monitor what effect (if any) each plan has on student improvement.
  • Those plans are integrated into the assessment cycle and the process continues until satisfactory levels of student learning are achieved.
  • The entire process is documented so faculty can check the historical record.

Developing Course-Level Student Learning Outcomes (SLOs)

Student Learning Outcomes (SLOs) are specific statements about what the student will know, feel (attitudes and values), do, or be able to demonstrate at the completion of the course or program. SLOs must be specific, observable, and measurable. At the Blinn College District, all courses are at the lower-division level, and course-level student learning outcomes are written at the state level or by credentialing bodies. Academic units can add additional course-level SLOs, but the externally developed outcomes are mandatory for the sake of transferability and credentialing. In cases where the state has not identified course-level student learning outcomes, members of the respective CRT will develop the outcomes.

Definitions

  • Goals focus on the general aims of the program or course.
  • Outcomes focus on what students will know, feel (attitude and values), do or be able to demonstrate by the end of the program or course. Outcomes are changes, benefits, or learning that occur because of the output.
  • Objectives are often used interchangeably with outcomes. For the purposes of this document and related training, objectives have a module/chapter-level focus while outcomes have a course-level focus.
  • Outputs are the measurements – how much of something happened?

Identifying Goals and Outcomes

Explicit goals help focus the design and structure of a course or program and guide the development and implementation of specific, measurable SLOs. Remember that goals can be broad and theoretical. SLOs will be more specific. If you have trouble identifying goals and outcomes, try answering these questions:

  • Why do you use your particular course structure?
  • Why do you use your current assignments and activities?
  • What do you want your students to learn from your assignments?
  • In the past, have your goals for students been realistic?
  • Where do students have difficulty; what do they consistently not get?
  • If you ran into a student who had taken your class the previous semester, what would you hope they would say about what they learned from your course? For each of your stated outcomes, what are the specific student behaviors, skills, or abilities that would tell you it is being achieved?
  • What would an outsider (future instructor, employer, program reviewer) need to see to believe that your students are achieving the outcomes set out for them?
  • In your experience, what evidence tells you when students have met these outcomes – how do you know when they’re “getting” it?

Cognitive Domain

One of the elements that can help in the development of student learning outcomes is Bloom’s Taxonomy, first developed in 1956 by educational psychologist Benjamin Bloom. Since then, the taxonomy has been updated by Lorin Anderson and other cognitive psychologists to make it more relevant to the 21st century. Note how the category names have changed from static nouns to active verbs:

Category | Definition | Related Behaviors
Remembering | recalling or remembering something without necessarily understanding, using, or changing it | define, describe, identify, label, list, match, memorize, point to, recall, select, state
Understanding | understanding something that has been communicated without necessarily relating it to anything else | alter, account for, annotate, calculate, change, convert, group, explain, generalize, give examples, infer, interpret, paraphrase, predict, review, summarize, translate
Applying | using a general concept to solve problems in a particular situation; using learned material in new and concrete situations | apply, adopt, collect, construct, demonstrate, discover, illustrate, interview, make use of, manipulate, relate, show, solve, use
Analyzing | breaking something down into its parts; may focus on identification of parts, analysis of relationships between parts, or recognition of organizational principles | analyze, compare, contrast, diagram, differentiate, dissect, distinguish, identify, illustrate, infer, outline, point out, select, separate, sort, subdivide
Evaluating | judging the value of material or methods as they might be applied in a particular situation; judging with the use of definite criteria | accept, appraise, assess, arbitrate, award, choose, conclude, criticize, defend, evaluate, grade, judge, prioritize, recommend, referee, reject, select, support
Creating | creating something new by putting parts of different ideas together to make a whole | blend, build, change, combine, compile, compose, conceive, create, design, formulate, generate, hypothesize, plan, predict, produce, reorder, revise, tell, write
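When drafting SLOs, the verb lists above can double as a quick lookup. The snippet below is an abridged transcription; the helper function and the first-word matching heuristic are illustrative simplifications, not an official drafting tool.

```python
# Abridged action-verb sets per level of the revised Bloom's taxonomy,
# transcribed from the table above.
BLOOM_VERBS = {
    "Remembering": {"define", "describe", "identify", "label", "list", "recall", "state"},
    "Understanding": {"explain", "summarize", "paraphrase", "interpret", "predict"},
    "Applying": {"apply", "demonstrate", "solve", "use", "construct"},
    "Analyzing": {"analyze", "compare", "contrast", "differentiate", "distinguish"},
    "Evaluating": {"appraise", "assess", "judge", "defend", "recommend"},
    "Creating": {"build", "combine", "compose", "design", "formulate", "produce"},
}

def levels_for_slo(slo):
    """Taxonomy levels matching the SLO's leading action verb (a drafting aid, not a rule)."""
    first_word = slo.lower().split()[0]
    return [level for level, verbs in BLOOM_VERBS.items() if first_word in verbs]
```

For instance, a draft outcome beginning with "Compare" maps to Analyzing, while one beginning with "Design" maps to Creating; an outcome whose leading word matches no set may need a stronger action verb.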

The Affective Domain

The affective domain (Krathwohl, Bloom, Masia, 1973) includes the way we deal with things emotionally, such as feelings, values, appreciation, enthusiasms, motivations, and attitudes. The five major categories are listed from the simplest behavior to the most complex:

Category | Description | Examples | Key Words (verbs)
Receiving Phenomena | Awareness, willingness to hear, selected attention. | Listens to others with respect. Listens for and remembers the names of newly introduced people. | acknowledges, asks, attentive, courteous, dutiful, follows, gives, listens, understands
Responding to Phenomena | Active participation on the part of the learners. Attends and reacts to a particular phenomenon. Learning outcomes may emphasize compliance in responding, willingness to respond, or satisfaction in responding (motivation). | Participates in class discussions. Gives a presentation. Questions new ideals, concepts, models, etc. to fully understand them. Knows the safety rules and practices them. | answers, assists, aids, complies, conforms, discusses, greets, helps, labels, performs, presents, tells
Valuing | The worth or value a person attaches to a particular object, phenomenon, or behavior. This ranges from simple acceptance to the more complex state of commitment. Valuing is based on the internalization of a set of specified values, while clues to these values are expressed in the learner's overt behavior and are often identifiable. | Demonstrates belief in the democratic process. Is sensitive to individual and cultural differences (values diversity). Shows the ability to solve problems. Proposes a plan for social improvement and follows through with commitment. Informs management on matters that one feels strongly about. | appreciates, cherishes, treasures, demonstrates, initiates, invites, joins, justifies, proposes, respects, shares
Organization | Organizes values into priorities by contrasting different values, resolving conflicts between them, and creating a unique value system. The emphasis is on comparing, relating, and synthesizing values. | Recognizes the need for balance between freedom and responsible behavior. Explains the role of systematic planning in solving problems. Accepts professional ethical standards. Creates a life plan in harmony with abilities, interests, and beliefs. Prioritizes time effectively to meet the needs of the organization, family, and self. | compares, relates, synthesizes
Internalizing Values (characterization) | Has a value system that controls behavior. The behavior is pervasive, consistent, predictable, and the most important characteristic of the learner. Instructional objectives are concerned with the student's general patterns of adjustment (personal, social, emotional). | Shows self-reliance when working independently. Cooperates in group activities (displays teamwork). Uses an objective approach in problem solving. Displays a professional commitment to ethical practice daily. Revises judgments and changes behavior in light of new evidence. Values people for what they are, not how they look. | acts, discriminates, displays, influences, modifies, performs, qualifies, questions, revises, serves, solves, verifies

The Psychomotor Domain

The psychomotor domain (Dave, 1970) includes the skill-based learning seen in academic as well as technical and workforce education. These five levels represent different degrees of competence in performing a single skill, not multiple different skills. The five levels, in order from most basic to most advanced, are:

  1. Imitation: Learner watches actions of another person and imitates them.
  2. Manipulation: Learner performs actions by memory or by following directions.
  3. Precision: Learner’s performance becomes more exact.
  4. Articulation: Learner can perform several skills together in a harmonious manner.
  5. Naturalization: Learner achieves a high level of performance, and actions become natural with little or no thought about them.
Each level, with its associated psychomotor ability and key words:

  • Imitation: Observes another person's behavior and copies it; replicates the behavior shown by example. Key Words: impersonate, copy, mimic, imitate, repeat, duplicate, reproduce.
  • Manipulation: Performs the skill by following instructions. Key Words: follow, demonstrate, perform, execute, present.
  • Precision: Performs the skill smoothly and accurately, with minimal errors. Key Words: performs skillfully; proficient, becoming expert.
  • Articulation: Adapts and modifies skills to fit new requirements. Key Words: adapt, revise, adjust, customize.
  • Naturalization: Performs the skill without thinking; flawless and perfect.

Types of Assessments

Formative assessment is ongoing assessment intended to improve an individual student’s performance, student learning outcomes at the course or program level, or overall institutional effectiveness. Formative assessment can be used at any time during the semester or year to determine how well current strategies are working and whether those strategies need to be revised. The key elements of formative assessment are:

  • Establishment of a classroom culture that encourages interaction and the use of assessment tools.
  • Establishment of learning goals and tracking the progress of individual students towards those goals.
  • Use of varied instruction methods to meet diverse student needs.
  • Use of varied approaches to assess student understanding.
  • Feedback on student performance and adaptation of instruction to meet identified needs.
  • Active involvement of students in the learning process.

Summative assessment occurs at the end of a unit, course, or program. The purposes of this type of assessment are to determine whether overall goals have been achieved, provide information on performance to an individual student, or obtain statistics about a course or program for internal or external accountability purposes.

Both types of assessment can use direct or indirect measures:

  • Direct assessment provides evidence that learning actually occurred and was related to specific content or a specific skill. Direct measures test a student against a learning criterion directly (for example, via a pop quiz).
  • Indirect assessment reveals characteristics associated with learning, but it only implies that learning has occurred. Indirect measures examine evidence surrounding learning, such as perceptions, self-reflections, and surveys.
  • Although both types are necessary, faculty tend to rely more on direct measures because they are easier to create and produce quantifiable results. However, test questions must be well written to ensure that they truly measure the SLOs; one or two questions addressing a specific SLO probably cannot determine student mastery of that SLO. Indirect measures, such as self-reflections and surveys, provide information that direct measures may not.
Course Level

Direct Assessment Measures:
  • Examinations and quizzes
  • Observations of field work and internships
  • Classroom discussion
  • Oral presentations
  • Class participation
  • Performances
  • Service learning
  • Clinical activities
  • Standardized tests

Indirect Assessment Measures:
  • Course evaluations
  • Focus groups
  • Hours dedicated to service learning
  • Intellectual or cultural activities related to the course
  • Attendance rates

Program Level

Direct Assessment Measures:
  • Capstone projects, senior theses, exhibits, or performances
  • Pass rates or scores on licensure, certification, or subject area tests
  • Employer and internship supervisor ratings of students' performance

Indirect Assessment Measures:
  • Registration or course enrollment information
  • Employer or alumni surveys
  • Student perception surveys
Formative Assessment

How a student is learning and mastering the skill in “real time.”

  • Pop quizzes
  • Reading checks
  • Learning games
  • Homework
  • Discussion questions
  • Music/theater rehearsals
  • Hands-on activities

Summative Assessment

How a student mastered the course, or a given portion of the course (a snapshot in time).

  • Graded tests
  • Standardized tests
  • Research projects
  • Portfolios
  • Final exams
  • Arts/architecture critiques
  • Performance juries

Palomba, C. A., & Banta, T. W. (1999). Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco: Jossey-Bass.

Chapter 4 – Implementing the Assessment Cycle

Assessment Techniques and Tools

Once SLOs are determined, members of the CRT will select an assessment technique (e.g., portfolio analysis, oral presentation, etc.), develop a common assignment, and determine the level of proficiency (benchmark) that will be used by all faculty teaching the course to assess student learning. One excellent way to measure student success on a particular assignment or project is with a rubric, which is simply a standard of performance for a defined population. That standard can be set by the CRT and used for all sections of a course or program.

There are no set rules for designing a rubric – some faculty prefer five levels of proficiency to coincide with the standard grading scale (A, B, C, etc.), whereas other faculty prefer three or four levels. Likewise, there is no rule for how many criteria are “enough,” although more than five or six criteria are probably too many. The rubric should be understandable not only by the faculty member but also by the students, so that they understand how they are being assessed on a particular point.
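As a sketch only (the criteria, scale, and benchmark below are hypothetical examples, not a Blinn College standard), a rubric can be treated as a small data structure: a fixed set of criteria, each rated on a common proficiency scale, with a CRT-set benchmark for success:

```python
# Hypothetical rubric sketch: three criteria rated on a four-level scale.
# The criterion names and the benchmark are illustrative assumptions only.

CRITERIA = ("organization", "content", "delivery")  # hypothetical criteria
LEVELS = (1, 2, 3, 4)                               # 1 = beginning ... 4 = exemplary
BENCHMARK = 3.0                                     # hypothetical CRT benchmark

def score_student(ratings):
    """Average the per-criterion ratings and check them against the benchmark."""
    assert set(ratings) == set(CRITERIA), "every criterion must be rated"
    assert all(r in LEVELS for r in ratings.values()), "ratings must use the scale"
    avg = sum(ratings.values()) / len(ratings)
    return avg, avg >= BENCHMARK

avg, met = score_student({"organization": 4, "content": 3, "delivery": 3})
# avg is about 3.33, so this student meets a 3.0 benchmark
```

Keeping the criteria and scale in one shared structure mirrors the point above: the same standard, set by the CRT, can then be applied across all sections of a course.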

More importantly, each juncture of criterion and proficiency level should be adequately defined so that any assessor could use it and understand the difference between levels. Numerous examples of rubrics are available on the Web, including VALUE rubrics by The American Association of Colleges and Universities (AACU) at https://www.aacu.org/. VALUE (Valid Assessment of Learning in Undergraduate Education) rubrics are open educational resources that enable educators to assess students’ original work.

Additional examples of various assessment techniques and tools are included in the Appendix. For a more comprehensive collection of ideas, consult works such as Angelo and Cross’s Classroom Assessment Techniques, Walvoord’s Assessment Clear and Simple, and Stevens and Levi’s Introduction to Rubrics.

Course-Level (SLO) Assessment Cycle and Data Collection

An assessment cycle describes the complete workflow associated with assessment-related activities at the Blinn College District. When and how often to assess will depend on the outcome under consideration and the specific assessment plans for the program, department, or division. However, it is recommended that each new assessment cycle begin in one of the long semesters (fall or spring) of an academic year and end in the subsequent long semester. For example, a course containing six SLOs would have data collection begin for three SLOs in the fall. Data would be entered and analyzed in the spring, and CRTs would make recommendations about how to improve student learning. Based on these recommendations, common assignments and rubrics would be modified and data would be collected again in the fall, allowing for a one-year follow-up and closing of the loop. The remaining three SLOs would begin data collection in the spring and follow the same cycle. All SLOs should be assessed within a five-year cycle.
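The staggered schedule described above can be sketched as follows; the SLO labels and terms are invented for illustration, not drawn from an actual Blinn course:

```python
# Sketch of the staggered six-SLO cycle described above. Each cohort of
# three SLOs collects data in one long semester, is analyzed the next,
# and is re-collected one year after the start to close the loop.

def plan_cohort(slos, collect_term):
    """Map one cohort of SLOs onto the three phases of an assessment cycle."""
    next_term = {"Fall": "Spring", "Spring": "Fall"}[collect_term]
    return {
        "slos": list(slos),
        "collect data": collect_term,
        "analyze and recommend": next_term,
        "revised collection (one-year follow-up)": collect_term,
    }

slos = [f"SLO {n}" for n in range(1, 7)]    # six hypothetical SLOs
schedule = [plan_cohort(slos[:3], "Fall"),  # first cohort starts in the fall
            plan_cohort(slos[3:], "Spring")]  # second cohort starts in the spring
```

Splitting the SLOs into two cohorts keeps the data-collection workload even across the two long semesters while still fitting every SLO inside the five-year window.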

Each assessment cycle is composed of three major phases, as illustrated in Figure 1: Data Collection, Data Entry/Analysis, and Revised Data Collection/Assessment Reporting.

Data Entry

Data collected as part of the assessment process must be loaded into Strategic Planning Online (SPOL), the College District’s assessment platform. Persons responsible for data entry for the program/department/division should navigate to each measurement specified by the unit’s assessment plan and enter the data collected. The SPOL platform automatically aggregates data as appropriate to provide a summary figure for each learning outcome in each assessment cycle. The number of data points entered in this manner varies across instructional units depending on the number of learning objectives, courses, and measures described in their assessment plans. The Office of Institutional Research and Effectiveness provides training for data entry into the SPOL platform (see Appendix, SPOL User Manual).
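SPOL’s internal aggregation logic is proprietary to the platform, but the general idea of rolling section-level results up into one summary figure per learning outcome can be sketched as follows (the course sections and percentages are invented for illustration):

```python
# Illustrative only: SPOL's actual aggregation is internal to the platform.
# This sketch averages per-section results into one figure per outcome.

from statistics import mean

# Hypothetical entries: (outcome, section, percent of students meeting benchmark)
entries = [
    ("SLO 1", "ENGL 1301.01", 78.0),
    ("SLO 1", "ENGL 1301.02", 84.0),
    ("SLO 2", "ENGL 1301.01", 69.0),
]

def summarize(entries):
    """Group per-section results by outcome and average them."""
    by_outcome = {}
    for outcome, _section, pct in entries:
        by_outcome.setdefault(outcome, []).append(pct)
    return {outcome: round(mean(values), 1) for outcome, values in by_outcome.items()}

summary = summarize(entries)  # {"SLO 1": 81.0, "SLO 2": 69.0}
```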

Interpreting and Reporting Assessment Results

Assessment reporting follows data entry; it generally starts at the beginning of the following term but can begin earlier if all the necessary data have been loaded into SPOL. This phase of the assessment process, which includes Interpretation of Results and Action on Results, should include observations and input from the respective CRT members. Taken as a whole, the process captures what the students should be able to do/know/feel, how those abilities were measured, evidence of student success, and any “actions” taken to improve success or resolve problems. Once the Action Plan has been implemented, data from a One-Year Follow-Up are reported to judge the success or failure of that plan and to make further plans to “close the loop.” This documents efforts toward continuous improvement in course instruction and student learning.

Chapter 5 – Blinn College District Assessment Council and Assessment Structure

Blinn College District Assessment Council

The function of the Blinn College District Assessment Council is to work collaboratively with the college community to sustain innovative, continuous, and reflective assessment practices at the course, program, and institutional levels. The council functions as a reliable and trusted source for providing support, development, and examination of policies and best practices to shape the future of assessment at the Blinn College District. The council is comprised of faculty representing multiple disciplines and campuses, instructional leadership, the Vice Chancellors for Academic Affairs, the Director of Institutional Research and Effectiveness, and the Executive Director for Academic Success.

The Assessment Council is responsible for setting forth a series of goals:

  • Goal #1. Create an assessment manual and plan with policies and procedures that reflect a faculty-driven process at the course, program, and institutional levels.
    • Outcome #1: Produce or update course, program, and institutional-level assessment guides, collectively known as the Assessment Manual.
  • Goal #2. Provide professional development in assessment for faculty and administration.
    • Outcome #1: Develop eCampus faculty assessment trainings.
    • Outcome #2: Track faculty participation in workshops and trainings.
  • Goal #3. Actively reflect on the impact of assessment at the Blinn College District within the course, program, and institutional levels.
    • Outcome #1: Disseminate assessment-related information and analyses at the Fall and Spring faculty and staff professional meetings.

Guiding Principles of Blinn College District’s Assessment Program

The following general principles may be useful in fostering the efforts of the Blinn College District Assessment Council. These Assessment Principles were derived from many sources including a literature review and participation in various conferences on assessment.

  • To improve student learning and development.
  • To develop an effective, valid assessment program as a long-term dynamic process.
  • To develop an assessment program that adheres to Blinn College District’s Mission Statement.
  • To involve a multi-method approach.
  • To separate the assessment of student learning and development from faculty evaluation.
  • To design an assessment program that includes training and related support for faculty and staff who are responsible for assessment activities.
  • To ensure that assessment results will not reflect negatively on students.
  • To seek and utilize the most reliable, valid methods and instruments of assessment.
  • To state the assessment objectives/goals in terms of observable student outcomes that are categorized into one of the following areas:
    • Basic College Readiness
    • General Education
    • Major Areas of Study
    • Career Preparation
    • Personal Growth and Development

Chapter 6 – Program Review

Program Review is an integral part of the institution's overall planning process. The purpose of Program Review is to systematically evaluate progress toward achievement of program strategic goals and outcomes, including how a program can be strengthened. This self-study process focuses on four tenets: the search for the true state of the program area, the identification of accomplished services in the program area and those needing improvement, recommendations for implementing improvements, and use of results and follow-up. The overarching goal of program review is to foster student success and learning.

Program-level assessment is required on a rotational schedule. All campus units, both academic and support, take part in the review process. Separate templates are utilized depending on the type of program. The program director or respective member of division leadership is responsible for overseeing the timely and accurate completion of the Program Review process.

Academic programs (including academic transfer and certification) are reviewed on a five-year cycle, while support programs are reviewed on a three-year cycle. The main difference between the academic and support processes is that the academic units comment on faculty load and student enrollment within their programs in addition to the topics also covered by the support units. The process and timeline are as follows for all programs:

  1. Staff from the Office of Institutional Research and Effectiveness (IR&E) meet with division leadership to explain the process and answer any questions.
  2. IR&E staff prepare templates which include enrollment & faculty load data (for academic programs). Academic Support and other units (e.g., Enrollment Services, Budgeting) will use the CAS (Council for the Advancement of Standards in Higher Education) Guidelines as their template.
  3. Templates and data files are distributed electronically to division leadership.
  4. Division leadership responds to questions and statements in the template, including updates to strategic plans.
  5. All documents are forwarded to members of the Program Review Committee for examination.
  6. All documents are stored on the IR&E server for future reference.

The college understands the importance of program review as part of its institution-wide continuous improvement cycle and is committed to excellence in all aspects of program review. The use of feedback is an integral part of the continuous improvement cycle and a foundation to sound assessment practices.

Appendix

Assessment Instruments and Methods for Assessing Student Learning

A few examples of various assessment techniques are included in this section. For a more comprehensive collection of ideas, consult works such as Angelo and Cross’s Classroom Assessment Techniques, Walvoord’s Assessment Clear and Simple, and Stevens and Levi’s Introduction to Rubrics.

Minute paper

The minute paper is a quick and easy way to assess student learning at a particular point in time. It not only provides helpful feedback but requires little time or effort to administer. Just before the end of class, stop and ask students to spend the remaining time (you decide how long) writing responses to several questions on the day’s material. These questions might include “What is the most important thing you learned in today’s class?” or “Do you still have questions about the material we covered today?” Students hand in their responses before leaving.

Muddiest Point

The muddiest point exercise is a variation of the minute paper. Administered during or at the end of a lecture or class discussion, the muddiest point exercise asks students to think about what went on in class that day and to write about what was the “muddiest” (least clear) point in that day’s class.

Reading Reaction

The reading reaction paper forces students to slow down the reading process and asks them to think about what they have read. It may be administered as a short homework assignment to be completed after the reading has been done or as an in-class assignment to stimulate class discussion. Typically, a reading reaction paper asks students to respond (or react) to the reading (i.e., what did the author say, did you agree with what was written, why/why not, etc.) in one page or less.

Portfolio Analysis

Portfolio analysis is becoming an increasingly popular method of assessment, both at the classroom and the program level. Portfolio analysis looks at student work during a period of time and evaluates the extent of learning based on the progression of the work from the first assignment until the last.

All the assessment methods described above are adapted from Stassen, Doherty, and Poe, Course-Based Review and Assessment (University of Massachusetts Amherst).

Sample Rubrics

The American Association of Colleges and Universities (AACU) https://www.aacu.org/ is an excellent resource for rubrics. VALUE (Valid Assessment of Learning in Undergraduate Education) rubrics are open educational resources that enable educators to assess students’ original work. Please see the following link for a detailed list of rubrics that can be downloaded:

https://www.aacu.org/initiatives/value-initiative/value-rubrics

Strategic Planning Online (SPOL) User Manual

SPOL User Manual
