How We Revamped Student Assessments for a Digital-Focused World


Academic assessments play a major role in providing high-quality education. They report on the performance of individual students and groups of students, guide teachers in customizing their practice to improve learning, and – in light of the current education landscape – can also help educators understand and address gaps in education that the coronavirus pandemic has made worse.

Our company provides comprehensive interim and formative assessments in reading, math, and early literacy in both English and Spanish to more than one-third of U.S. pre-K-12 schools and districts. Our mission is to accelerate learning for all and to improve academic outcomes nationwide by helping students learn better, teachers teach better, and school administrators lead better. To aid in this mission, we administer more than 80 million assessments each year, which our team of 30 content authors works diligently to create, edit, and design.

As is the case with many small teams faced with monumental tasks, our team grappled with – and overcame – a number of challenges. Here’s how we revamped our content creation and editing approach to develop the best learning assessments for today’s students.

The Challenge: Creating Assessments for Digital Platforms

Student assessments are best when they’re ongoing, consistent, and provide critical feedback to learners. Technology provides an opportunity to continually create, publish, and administer assessment content in our digital-forward (and newly remote) world. The key to creating the best assessments lies in leveraging the right technology.

One of the main technological issues our content authors experienced was tied to their reliance on Microsoft Word templates for creating and updating assessment items. The key problem is that Word is a traditional desktop publishing tool built around the long-standing What You See Is What You Get (WYSIWYG) paradigm. In a nutshell, WYSIWYG means the tool displays the end result as it would appear in paper-based documents. However, when content is delivered only on digital platforms – an increasingly popular option even before COVID – the assessment questions the author sees are not displayed as students will see them online. Desktop publishing tools like Word simply cannot show content authors a “preview” of items as they will be presented on digital platforms until after the content has been transformed into separate online formats.

Furthermore, Word templates have predefined styles and layouts. To map styles and sequences correctly to online formats, content authors must have in-depth knowledge of how to manipulate Word templates and documents. Even a slight deviation from a template, or a little added complexity, may cause assessment items to load or display incorrectly.

The Solution: Leverage the Right Tools to Improve Process Efficiencies

To streamline the editorial and production processes, there are tools that allow authors to create assessments and reliably automate publishing to digital delivery platforms. These tools use an approach colloquially known as What You See Is What You Mean (WYSIWYM). WYSIWYM is a paradigm in which authors create and enrich content in a structured way, allowing them to spend more time working on the assessment itself rather than on how the content looks (i.e., formatting and spacing), as they would with Word. In essence, it aims to categorize the content accurately so the author can be confident in the accuracy of their entry, and so the increased machine readability can lead to immediate automation, improved discovery, better re-use, and multi-channel transformation.

Using a WYSIWYM approach, an author writes assessment item content – such as prompts, questions, and response options – via a forms-based user interface. This significantly reduces the ambiguity of author input by applying formatting automatically through guided entry, and in turn improves the quality of assessments by reducing the number of errors and fix cycles. With this approach, authors can focus on the content itself rather than wasting time on Word styling.
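
As a rough illustration of what that structured capture can look like behind the forms, here is a minimal sketch in Python. The AssessmentItem and ResponseOption classes and their field names are hypothetical – they are not our production schema – but they show the core idea: the author supplies semantic pieces (a prompt, a question stem, answer options, the correct response), and guided validation catches problems before anything is published.

    from dataclasses import dataclass, field

    @dataclass
    class ResponseOption:
        """One answer choice, captured as content only -- no fonts, no spacing."""
        identifier: str   # e.g., "A", "B", "C"
        text: str

    @dataclass
    class AssessmentItem:
        """A multiple-choice item as a forms-based editor might capture it (illustrative only)."""
        item_id: str
        prompt: str                     # stimulus text shown before the question
        question: str                   # the question stem
        options: list[ResponseOption] = field(default_factory=list)
        correct: str = ""               # identifier of the correct option

        def validate(self) -> list[str]:
            """Guided entry can run checks like these before an item is published."""
            problems = []
            if not self.question.strip():
                problems.append("question stem is empty")
            if len(self.options) < 2:
                problems.append("fewer than two response options")
            if self.correct not in {o.identifier for o in self.options}:
                problems.append("correct answer does not match any option")
            return problems

    # The author fills in content; formatting is applied downstream.
    item = AssessmentItem(
        item_id="math-0042",
        prompt="A rectangle is 3 cm wide and 5 cm long.",
        question="What is the area of the rectangle?",
        options=[
            ResponseOption("A", "8 square centimeters"),
            ResponseOption("B", "15 square centimeters"),
            ResponseOption("C", "16 square centimeters"),
        ],
        correct="B",
    )
    assert item.validate() == []   # no problems found -- ready for automated publishing

Because the correct answer and every option are captured as discrete fields rather than styled paragraphs, downstream systems can score, search, and reformat the item without anyone re-keying content.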

To leverage WYSIWYM’s structured approach to authoring, we worked with Copyright Clearance Center, which has expertise in content and knowledge management, as well as editorial and publishing technologies. No matter what solution you choose, we recommend considering the following when optimizing your assessment processes:

  • Implement a forms-based editing tool that guides authors and focuses them on just the content and metadata.
  • Fully automate the transformation to Question and Test Interoperability (QTI) format needed by your digital learning platforms (“lights out publishing”) – see the sketch after this list.
  • Tightly integrate all tools needed to create items to provide a “one stop shop” for authors (e.g., item bank, equations, charting, image and video library, text analyzer for reading level).
  • Enable authors to quickly see how questions, prompts, and answer options will look to students taking online assessments.
  • Adopt open standards wherever possible. Integrating with third-party tools is far easier when they speak common “languages” – for example, SVG (for illustrations, charts, and graphs), QTI (for fine-grained representation of quizzes), MathML (for creating, embedding, and displaying equations), and XHTML (for paragraph-level style hints).
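
To make the “lights out publishing” point concrete, here is a minimal sketch – again in Python, with a hypothetical to_qti function – of how structured fields could be rendered automatically as QTI-style XML. It omits the namespaces and several attributes a conforming QTI file requires; the point is simply that once content is captured as data, no hand formatting is needed to publish it to a delivery platform.

    import xml.etree.ElementTree as ET

    def to_qti(item_id, prompt, question, options, correct):
        """Render a structured multiple-choice item as simplified QTI-style XML.

        `options` maps choice identifiers (e.g., "A") to their text. Real QTI
        markup also requires namespaces and attributes omitted here.
        """
        root = ET.Element("assessmentItem", identifier=item_id,
                          adaptive="false", timeDependent="false")

        # Declare the correct response so the delivery platform can score the item.
        decl = ET.SubElement(root, "responseDeclaration", identifier="RESPONSE",
                             cardinality="single", baseType="identifier")
        ET.SubElement(ET.SubElement(decl, "correctResponse"), "value").text = correct

        # The item body: stimulus, question stem, and answer choices.
        body = ET.SubElement(root, "itemBody")
        ET.SubElement(body, "p").text = prompt
        interaction = ET.SubElement(body, "choiceInteraction",
                                    responseIdentifier="RESPONSE",
                                    shuffle="false", maxChoices="1")
        ET.SubElement(interaction, "prompt").text = question
        for identifier, text in options.items():
            ET.SubElement(interaction, "simpleChoice", identifier=identifier).text = text

        return ET.tostring(root, encoding="unicode")

    print(to_qti(
        item_id="math-0042",
        prompt="A rectangle is 3 cm wide and 5 cm long.",
        question="What is the area of the rectangle?",
        options={"A": "8 square centimeters",
                 "B": "15 square centimeters",
                 "C": "16 square centimeters"},
        correct="B",
    ))

An equation in a question stem would travel the same way – embedded as MathML rather than as a picture of an equation – so it stays machine-readable and renders consistently on any platform that supports the standard.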

By transitioning our authoring processes to a WYSIWYM-based tool, our content authors have gained a range of new capabilities that enable them to produce assessments faster and more efficiently. Not only can they see how assessment questions look to students taking them, but they are also able to prevent formatting errors before assessment items are published.

As student testing continues to evolve, leveraging technology designed for digital testing environments can help test developers maintain high-quality assessments. One digital assessment at a time, these tools can ultimately improve academic outcomes for students.

This post originally appeared in EdTech Digest, republished with permission.

Author: Katie McClarty

Katie McClarty, Ph.D., is the Vice President, Research and Design at Renaissance Learning where she oversees research, psychometrics, and content development. Widely published in the academic community, Dr. McClarty is lead editor of a book on college- and career-readiness as well as recipient of two best paper awards for her work on accelerating gifted learners and the critical role of Algebra II. She presents and consults both nationally and internationally on college readiness, standard setting, gifted and talented education, competency-based education, and computer-based assessment.