
Strategic Initiatives

The Multi-State Collaborative
to Advance Quality Student Learning

Overview

The Multi-State Collaborative is an initiative designed to provide meaningful evidence about how well students in public higher education are achieving key learning outcomes: written communication, critical thinking, and quantitative literacy.

Contact

Robert J. Awkward, Ph.D., Director of Learning Outcomes Assessment,
Academic Affairs and Student Success
(617) 994-6908
rawkward@dhe.mass.edu

Duration

2011 – Present

Related Data

Target Populations
DHE Responsibilities
  • Assessment
Partnerships

SHEEO; AAC&U
VALUE Institute

Background

The Multi-State Collaborative (MSC) was formerly sponsored by the State Higher Education Executive Officers (SHEEO) and the Association of American Colleges and Universities (AAC&U). The initiative is designed to provide meaningful evidence about how well students in public higher education are achieving key learning outcomes: written communication, critical thinking, and quantitative literacy.

The MSC has since been subsumed by the VALUE Institute, co-sponsored by the AAC&U. The VALUE Institute includes both public and private institutions that pay a participation fee; the MSC continues as a consortium of public institutions within the VALUE Institute.

Under the VALUE Institute and MSC assessment model, student learning is assessed against specific Essential Learning Outcomes developed by faculty under the auspices of AAC&U’s LEAP initiative. Applying the associated VALUE Rubrics, already in use at a growing number of colleges and universities, to actual student assignments and work products provides evidence of authentic class-based learning and program-based teaching. This evidence, in turn, enables faculty and programs to improve student learning through curricular changes, course design, and more effective teaching. Beyond the institution, the assessment model explores the feasibility of using this same evidence of student learning, carefully analyzed and de-identified, to inform state legislators and other interested parties about the focus and quality of student learning in the context of an institution’s mission.

Guiding Principles

  • Any system of assessment should help build and support a culture of student learning that allows for assessment results to be used by each campus and by larger public systems for improving student learning and for program improvement.
  • Any statewide or campus plan for assessment should be based upon authentic student work and allow for the use of multiple measures of student learning (indirect, direct, and embedded) without a single mandated statewide test.
  • A common framework is needed for any credible statewide system of assessment and accountability. The AAC&U LEAP Essential Learning Outcomes and VALUE Rubrics designed to assess the Essential Learning Outcomes are a useful framework given their broad adoption nationally and their endorsement both within and outside of higher education institutions and systems.
  • Assessment approaches should involve an iterative process, and, as such, be viewed as works in progress.
  • Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time.

Reporting

All annual reports contain recommendations for action derived from the data; these recommendations have been, and will continue to be, shared with key stakeholders and acted upon.

Refinement Year (2016-2017)

This report presents statewide assessment approaches and compares the following:

  • Massachusetts two- and four-year participating institutions by learning outcome;
  • Massachusetts two- and four-year institutions for 2017 versus 2016 by learning outcome; and
  • Massachusetts two- and four-year institutions versus project-level data for 2017 by learning outcome.

Refinement Year (2016-2017)—Revised

This report presents a matched-sample comparison of the same participating institutions in 2016 and 2017 (i.e., an apples-to-apples comparison).

Demonstration Year (2015-2016)

This presentation simply reports the data for that year, as there was no prior statewide data with which to compare.