Greetings!

The last quarter of 2019 saw many exciting developments at ACTNext. Our team and our capabilities have grown, and we have taken on a new charter at ACT: to lead the design of ACT assessments and the next generation of learning and navigation products.

It is my pleasure to welcome Vice President Michelle Barrett and her Research Technology, Data Science & Analytics teams: Senior Director Sara Vispoel and her Assessment Design team; John Whitmer and his Learning Analytics group; and Dave Shawver and his Research Services group. We also welcome several new members with expertise in automatic item generation and automated scoring to our artificial intelligence and machine learning team, led by Saad Khan.

ACTNext can now drive ACT’s transformation more quickly than ever. Though there is much work to be done, we find ourselves poised to unify learning, measurement, and navigation in the holistic fashion envisioned by CEO Marten Roorda.

Other highlights you’ll find below include the 2019 ETCPS wrap-up from Ada Woo, new podcasts from ACTNext Navigator, a wealth of cutting-edge research, and much more!

Thank you, as always, for reading our newsletter and please share it with your colleagues who might find it of interest.

Best regards,
Alina von Davier
Senior Vice President
ACT


Insights

photo of Michael Yudelson

Michael Yudelson

The focus of modern technology-supported learning is visibly shifting from measuring latent skills to measuring affective states that allegedly influence learning in general. According to a recent publication, boredom is positively correlated with creativity. And boredom detectors have already been built for tutoring systems (consider ACTNext’s HERA project).

Good, right? Not really. Per Ryan Baker’s publications (here and here), in which trained coders tagged videos of students, students were bored only 1% of the time. That, of course, is in the context of students working with a tutor. Students could be bored (read: creative) more or less frequently when being lectured or otherwise academically engaged.

There is a definite plus, though: the correlation of boredom and creativity should make detecting either one easier. Or does it? In any case, the more collinear phenomena we can detect, the better the learning experiences we can engineer.
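To make the idea of a boredom detector concrete, here is a toy sketch of one common approach: a logistic score over interaction-log features. Everything here is invented for illustration; the feature names, weights, and threshold are not taken from HERA or from Baker’s published detectors, which are trained on real labeled data.

```python
import math

# Hypothetical feature weights for this sketch only (not HERA's model).
WEIGHTS = {
    "seconds_since_last_action": 0.08,  # long pauses -> more likely bored
    "rapid_guess_rate": 2.5,            # fast, careless answers
    "hint_abuse_rate": 1.8,             # clicking through hints without reading
}
BIAS = -4.0

def boredom_probability(features: dict) -> float:
    """Logistic score over clickstream-style interaction features."""
    z = BIAS + sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def is_bored(features: dict, threshold: float = 0.5) -> bool:
    return boredom_probability(features) >= threshold

# Two made-up interaction snapshots:
engaged = {"seconds_since_last_action": 5, "rapid_guess_rate": 0.0, "hint_abuse_rate": 0.1}
bored = {"seconds_since_last_action": 45, "rapid_guess_rate": 0.9, "hint_abuse_rate": 0.8}
```

In practice the weights would be fit to human-coded labels (like the video tagging described above), which is exactly where the 1% base rate makes the modeling hard.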

Best regards,
Michael Yudelson
Sr. Research Scientist
ACTNext — Learning Solutions Group


Check out our podcast, ACT Navigator!

Episode 4: Cross-Cutting Capabilities of Creative and Computational Thinking

In this episode, guests Yigal Rosen, Kristin Stoeffler, and Laurel Ozersky talk about the cross-cutting capabilities (CCCs) of creative and computational thinking. CCCs are part of ACT’s Holistic Framework for college and career readiness.

Episode 5: Validating Cognitive Processes with Eye Tracking

In this episode, guest Jay Thomas, Senior Assessment Designer at ACT, discusses how eye tracking can go beyond traditional psychometrics to evaluate and validate assessments and testing.

Episode 6: Explaining the Grade: Auto Essay Scoring and CRASE+

Our guests Erin Yao and Scott Wood discuss ACT’s automated scoring engine CRASE+, along with the challenges presented by automated scoring using natural language processing, and how to address bias.


ETCPS Wrap-Up, with Ada Woo

photo of Ada Woo

Ada Woo

ETCPS ’19 provided two great days of learning, sharing, and networking among researchers, entrepreneurs, and edtech industry leaders. More than 200 attendees took part, including 32 groups of researchers and edtech companies providing hands-on demos and research presentations.

Our speakers and our pre-conference expert panel focused on themes that edtech research must support and our industry must understand. If we are indeed at a moment when many of edtech’s promises are primed to be fulfilled, prioritizing equity and understanding the ground truth of our learners’ needs, means, and motivations will be central not only to leading the edtech revolution, but to realizing it.

Betsy Corcoran cited research showing that teachers in high-poverty districts aren’t asking for edtech products; they’re asking for solutions to help meet their students’ basic needs, like clothing, food, and hygiene products. How can we expect students to learn, or even think about learning, when their basic needs are not being met?

Edtech researchers and developers need to consider equity and access for student populations and identify where our users are starting from. We need to involve teachers and students early in the design process in order to meet our learners where they are and get them engaged in the learning process.

Many of our presenters spoke about personalizing the learning pathways of each individual with the help of cutting edge technology, while also noting that the goal of developing cool tech and doing impressive research is to get these products in the hands of the end users who need them.

This is a monumental task; the stakes are high and the stakeholders varied. We cannot rely on the myth of the lone genius: solving these problems requires smart, dedicated teams collaborating in supportive communities. For this reason, we were especially proud to have Iowa Governor Kim Reynolds on hand to announce that the Iowa Economic Development Authority, along with the Iowa City Economic Development Authority, Higher Learning Technologies, NewBoCo, ACT, and the University of Iowa, has partnered to advance the Iowa EdTech Accelerator (accepting applications through January 15, 2020) and leverage Iowa’s strategic advantages in this area.

Last, but certainly not least, we’d like to thank our sponsors for helping make ETCPS ’19 our best year yet: HCL Technologies, Corridor Business Journal, Ascend Learning, Sana Labs, LearningMate, Kaplan, Duolingo, and the University of Iowa College of Education. Thank you for your support!

Watch the ETCPS ’19 Keynote presentation from Betsy Corcoran:

Read the Iowa Economic Development Authority Report (below)

https://www.iowaeconomicdevelopment.com/UserDocs/news/EDTech_ExecSum_092019.pdf



What We’re Reading

Jørgen Veisdal,
The Mathematics of Elo Ratings

James Baldwin,
Another Country

Søren Kierkegaard,
The Concept of Irony with Continual Reference to Socrates

Bruno Zumbo and Anita Hubley,
Understanding and Investigating Response Processes in Validation Research
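For readers picking up Veisdal’s piece on Elo ratings above, the standard update rule is compact enough to sketch in a few lines. The K-factor of 32 is a common but arbitrary choice; real rating systems vary it by player pool and rating level.

```python
def elo_expected(r_a: float, r_b: float) -> float:
    """Expected score for player A against player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_a: float, r_b: float, score_a: float, k: float = 32.0):
    """Return both players' updated ratings after one game.

    score_a is 1.0 for an A win, 0.5 for a draw, 0.0 for a loss.
    """
    e_a = elo_expected(r_a, r_b)
    new_a = r_a + k * (score_a - e_a)
    new_b = r_b + k * ((1.0 - score_a) - (1.0 - e_a))
    return new_a, new_b

# An upset: the 1500-rated underdog beats a 1700-rated opponent,
# so the underdog gains the larger share of the K points.
new_underdog, new_favorite = elo_update(1500, 1700, 1.0)
```

Note that the two updates are mirror images, so the total rating in the pool is conserved by each game; that zero-sum property is one of the features Veisdal’s essay explores.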



Featured ACTNext Research

ACT Study Shows How Social and Emotional Learning Skills Predict Student Online Learning Activities.

The study, which ACT conducted in partnership with Blackboard, the University of Maryland Baltimore County (UMBC) and VitalSource, sought to understand how social and emotional skills are related to students’ online behaviors and course outcomes within a learning management system (LMS) — an interactive online learning environment — in order to identify ways to help improve student outcomes.

View the full ACT research report and a shortened data byte version to learn more.

Integrating Multiple Sources of Validity Evidence for an Assessment-Based Cognitive Model. Journal of Educational Measurement

An assessment of graphic literacy was developed by articulating and subsequently validating a skills‐based cognitive model intended to substantiate the plausibility of score interpretations. Model validation involved use of multiple sources of evidence derived from large‐scale field testing and cognitive lab studies. Data from large‐scale field testing were evaluated using traditional psychometric methods. The psychometric analyses were augmented using eye-tracking technology to perform gaze pattern and pupillometry analyses to gain better understanding of problem‐solving strategies and cognitive load. Findings from the data sources were integrated to provide strong evidence supporting the model and score interpretations. Implications for using gaze pattern and pupillometry analyses to enhance learning and assessment are discussed.

The Wiring of Intelligence. Perspectives on Psychological Science

The positive manifold of intelligence has fascinated generations of scholars in human ability. In the past century, various formal explanations have been proposed, including the dominant g factor, the revived sampling theory, and the recent multiplier effect model and mutualism model. In this article, we propose a novel idiographic explanation. We formally conceptualize intelligence as evolving networks in which new facts and procedures are wired together during development. The static model, an extension of the Fortuin–Kasteleyn model, provides a parsimonious explanation of the positive manifold and intelligence’s hierarchical factor structure. We show how it can explain the Matthew effect across developmental stages. Finally, we introduce a method for studying growth dynamics. Our truly idiographic approach offers a new view on a century-old construct and ultimately allows the fields of human ability and human learning to coalesce.

What technology can and cannot do to support assessment of non-cognitive skills. Frontiers in Psychology

Advances in technology hold great promise for expanding what assessments may achieve across domains. We focus on non-cognitive skills as our domain, but lessons can be extended to other domains for both the advantages and drawbacks of new technological approaches for different types of assessments. We first briefly review the limitations of traditional assessments of non-cognitive skills. Next, we discuss specific examples of technological advances, considering whether and how they can address such limitations, followed by remaining and new challenges introduced by incorporating technology into non-cognitive assessments. We conclude by noting that technology will not always improve assessments over traditional methods and that careful consideration must be given to the advantages and limitations of each type of assessment relative to the goals and needs of the assessor. The domain of non-cognitive assessments in particular remains limited by lack of agreement and clarity on some constructs and their relations to observable behavior (e.g., self-control versus self-regulation versus self-discipline), and these theoretical limitations must be overcome to realize the full benefit of incorporating technology into assessments.

Gamified Performance Assessment of Collaborative Problem Solving Skills.
Computers in Human Behavior

In this paper we introduce a game-based approach for Collaborative Problem Solving (CPS) skills assessment and provide preliminary evidence from a validation pilot study. To date, educational assessments have focused more heavily on the concrete and accessible aspects of CPS, with a diminished representation of its social aspects. We addressed this issue through the integration of our CPS construct into the game-based assessment “Circuit Runner,” in which participants interact with a virtual agent to solve a series of challenges in a first-person maze environment (von Davier, 2017). Circuit Runner provides an environment that allows for controlled interdependence between a user and a virtual agent, facilitating the demonstration of the broad range of cognitive and social skills required for effective CPS. Tasks are designed to incorporate telemetry-based (e.g., log file, clickstream, interaction-based) and item response data to provide a more comprehensive measure of CPS skills. Our study included 500 participants on Amazon Mechanical Turk, who completed Circuit Runner, pre- and post-game surveys, and a CPS situational judgment test (CPS-SJT). These elements, in conjunction with the game play, allowed for an expanded exploration of CPS skills with different modalities and types of instruments. The findings support and extend efforts to provide a stronger theoretical and empirical foundation for insights regarding CPS as a skillset, as well as the design of scalable game-based CPS assessments.

Computational psychometrics approach to holistic learning and assessment systems. Frontiers in Education

Learning and assessment systems have grown and taken shape to incorporate concepts from both models for assessment and models for learning. In this paper we argue that a third dimension is necessary. Not only is it important to understand what the capabilities of a learner are, and how to grow and expand these capabilities, but we must consider where the learner is headed; we need to consider models for navigation. This holistic perspective of learning and assessment systems is encapsulated in the extended learning and assessment system, a framework for conducting research. Fundamental to this framework is the role of computational psychometrics to facilitate the abstraction from raw data to conceptual models. We provide several examples of research projects and describe how they fit into the described framework.


Sign up to receive the ACTNext newsletter directly by email!

Learn more about ACTNext on:
ACTNext.org
Twitter
LinkedIn


© 2019 ACT, Inc. All rights reserved.
500 ACT Drive, Iowa City, IA 52243