RESOURCES

Read our latest news and access media resources

Assessment Life Cycle Stage 4: Examination Administration

Categories: Reports & Guides

Exam administration is careful work; a good deal of planning goes into ensuring it is handled correctly. The central concern at this point in the Assessment Life Cycle is making sure that your examination is administered under the proper testing conditions, and that sensitive data is handled appropriately. If done correctly, the tasks of Stage 4 ensure that your examination will be administered in a way that respects the stakes (or ‘seriousness’) of the exam and supports the validity and defensibility of your assessment program. This white paper explores many of the considerations when undertaking this process.

Assessment Life Cycle Stage 3: Assembling Your Examination Forms

Categories: Reports & Guides

Stage 3 of the Assessment Life Cycle is where exam developers take the items that were written and banked in Stage 2 and compile them into one (or more) examination forms. This may sound fairly easy; indeed, sometimes examination assembly is just a matter of finding the right items to assess the required subject areas. However, assembling a valid and well-balanced examination form from an item bank is a careful process, and one that requires meticulous effort. This is especially true when creating licensure or certification examinations, or other high-stakes assessments. This white paper explores many of the considerations when undertaking this process.

Assessment Life Cycle Stage 2: Item Development

Categories: Reports & Guides

Stage 2 of the Assessment Life Cycle is where test questions are created and used to populate an item bank. That may sound simple enough, but in truth, item development is careful work. It requires the coordinated effort of several groups of experts to ensure that the final examination includes the right number and quality of items and is organized into the right categories. This white paper explores many of the considerations when undertaking this process.

Assessment Life Cycle Stage 1: Defining Your Target & Creating Test Specifications

Categories: Reports & Guides

Every assessment program starts with defining targets and creating test specifications. During this stage, the test developers decide what the goal of their assessment is going to be. These goals are formalized and used to create an “examination blueprint,” which specifies the number and type of questions that the assessment requires to soundly measure each knowledge area. This white paper explores many of the considerations when undertaking this process.

For Credentialing Programs: 11 Questions to Ask when Considering Online Proctoring

Categories: Reports & Guides

Finding the right online proctoring partner can be difficult and time-consuming. If you knew the right questions to ask, wouldn’t it be easier to make an informed decision? We think so too! That’s why we’ve outlined some questions that can help guide you in your search. In this white paper, you’ll find: 11 important questions you should be asking every company you’re considering, why the questions matter, and our responses to the questions.

Remote Education 2.0

Categories: Reports & Guides

In the wake of the COVID-19 pandemic, colleges and universities across the world were forced to pivot to remote education. This hasty transition resulted in many ad hoc and make-do teaching and assessment plans that were never designed for long-term use. As the pandemic continues to disrupt traditional higher education, administrators and faculty need sustainable methods to remotely teach and assess students. Jeffrey Selingo and Karin Fischer, two well-respected writers in the field of higher education, collaborated on this white paper addressing three key questions regarding online proctoring.

For Academic Institutions: 9 Questions to Ask Every Online Proctoring Vendor Before Committing

Categories: Reports & Guides

Finding the right online proctoring partner can be difficult and time-consuming. If you knew the right questions to ask, wouldn’t it be easier to make an informed decision? We think so too! That’s why we’ve outlined some questions that can help guide you in your search. In this white paper, you’ll find: 9 important questions you should be asking every company you’re considering, why the questions matter, and Meazure Learning’s responses to the questions.

Setting a Cut Score for a Performance-Based Assessment: The Ebel Method

Categories: Reports & Guides

Setting a defensible cut score through a process called standard setting is an essential component that supports exam validity. For traditional multiple-choice and other selected-response assessments, a wide body of literature supports established standard-setting methods such as the Angoff method or the Nedelsky method. This white paper explores some of the key considerations when setting a cut score for a performance-based assessment using the Ebel method.

The Time Crunch: How Much Time Should Candidates Be Given to Take an Exam?

Categories: Reports & Guides

When developing an assessment, two major decisions a credentialing organization needs to make are: How many items will be on the exam? and How much time will test-takers be given to complete the exam? These choices can have a significant impact on fairness and validity. Often, once an exam has been administered, many test-takers will anecdotally report that they ran out of time and that the assessment was unfair. Therefore, an important question to ask is: What can credentialing organizations do to investigate and address these concerns? We explore this topic in the white paper.

Scoring Models for Innovative Items

Categories: Reports & Guides

As a way to capture the richness of job performance, many credentialing organizations are supplementing traditional multiple-choice questions (MCQs) with innovative item types. Although this view is not unanimous, one theory holds that MCQs offer a somewhat artificial representation of job tasks and that innovative item types provide a more refined way to assess candidate competence. This white paper explores this topic in depth.