For a full overview of 2019 ETCPS, please visit the event site.
Oct. 9, 2019
Betsy Corcoran (slides)
Perspectives on Edtech: Trends, Teachers’ Role, and Partnerships
Context matters. Over the past decade, we’ve witnessed an explosion of technology and data applied to both formal and informal learning. We’ve also assessed more rigorously our understanding of how students learn, including the factors that shape their ability to learn. Before diving into the complexities of research and the opportunities created by using the latest technology to advance learning, let’s ground ourselves in the realities of contemporary education. This talk will give an overview of salient trends in formal education and learning, the experience of educators, and the ways education technology fits, and at times does not fit, into the work of students and learners. We’ll explore the need for measures of efficacy in edtech products, and what it will take to give educators the confidence and encouragement they need to use those evolving measures.
Lu Ou (slides)
Tackling Heterogeneity and Phase Transitions in Learning Processes
Learning processes often exhibit continuous dynamics interspersed with regime-switching discontinuities. Identifying individual and group-based differences in learning trajectories is a central challenge when dealing with educational multimodal time series data with potential phase transitions. In this presentation, we present advanced statistical models, along with efficient software tools, to address this issue, and showcase applications of these models and algorithms to understanding learning processes in various environments. We will discuss the usefulness of the methods in providing instructional support in teaching and self-learning.
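The kind of process described above, continuous skill growth punctuated by discrete regime switches, can be illustrated with a minimal simulation. The two regimes, their drift rates, and the switching probability below are hypothetical illustrations, not the models presented in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical learning regimes: slow drift vs. rapid gain.
# A Markov chain switches between them (the "phase transition"),
# while skill evolves continuously within each regime.
drift = {0: 0.1, 1: 1.5}   # per-step skill gain in each regime
stay_prob = 0.9            # probability of remaining in the current regime

regime, skill = 0, 0.0
regimes, skills = [], []
for t in range(100):
    if rng.random() > stay_prob:        # occasional discontinuous switch
        regime = 1 - regime
    skill += drift[regime] + rng.normal(0, 0.3)  # continuous noisy dynamics
    regimes.append(regime)
    skills.append(skill)
```

Fitting such a model to real multimodal time series (rather than simulating it) requires estimating the drift parameters and switching probabilities jointly, which is where the specialized statistical software mentioned in the abstract comes in.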
James Sprengelmeyer (slides)
Innovating at the Intersection of Research & Development and Business Needs, or What Batman’s Villains Can Teach Us About Collaboration
Hybrid Models and Their Uses for Modelling Heterogeneous Populations, Detecting Aberrant Responses and Cheating Behaviour
In this talk I will discuss how models with discrete and continuous latent variables provide a general and flexible framework for modelling populations that consist of multiple groups, as well as for detecting outliers and cheating. Responses are usually test outcomes or answers to questionnaire items and can be of any type: categorical, continuous, or mixed. Similarly, the latent variables that explain the inter-relationships among the observed variables can be either continuous or discrete. In the hybrid case, a combination of continuous and discrete latent variables can accommodate flexible distributions for the latent variables (e.g. mixtures of normals), model zero-inflated data through a degenerate class, detect alternative response strategies, and detect cheating behaviour and compromised items. The methods will be illustrated using data from education and social surveys.
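One building block mentioned above, a degenerate class that accounts for zero-inflated responses, can be sketched with a toy simulation. The mixing proportion and score distribution are illustrative assumptions, not the presenter’s models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Zero-inflated responses: with probability pi, a "degenerate" class
# emits exactly 0 (e.g. non-engagement); otherwise a continuous latent
# trait generates a noisy score.
pi = 0.3
n = 1000
is_degenerate = rng.random(n) < pi
scores = np.where(is_degenerate, 0.0, rng.normal(5.0, 1.0, n))

# Under these parameters an exact zero is (almost surely) from the
# degenerate class, so class membership is recoverable from the data.
flagged = scores == 0.0   # fraction flagged is close to pi by construction
```

A full hybrid model would estimate the class proportion and the continuous latent distribution jointly (e.g. by maximum likelihood), rather than relying on exact zeros being separable as in this sketch.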
Eric Engelmann (slides)
Building an EdTech Ecosystem
There’s no rule book for creating and growing innovative new companies. In this session, we’ll explore lessons learned while attempting to do just that, including aligning corporate partners, investment capital, entrepreneurial support, research resources, and governmental entities around one key goal: growing an education technology ecosystem that can thrive.
Oct. 10, 2019
Ashish Rangnekar (slides)
My Proposal to You: Let’s Get Learners Engaged!
If a tree falls in a forest and nobody is around to hear it, does it make a sound? We’re seeing a similar conundrum with today’s self-assessment and learning programs. While the industry is developing modernized assessment strategies informed by new learning science findings, that great work is all for naught if it doesn’t make its way down to our candidates. To have a real impact, design and learner engagement are the key drivers all of us must prioritize. If self-assessments and learning programs are not keeping learners engaged, we’re likely to see increased dropout rates that result in fewer candidates taking exams and benefiting from new learning strategies.
Tools like gamification, social learning, push notifications, and digital badging are now top of mind, but are they actually being applied effectively? In this session, BenchPrep CEO & Co-Founder Ashish Rangnekar will detail three ways to engage candidates, how to best incorporate those examples into learning programs, and what you must consider when designing self-assessments to meet your organization’s needs.
John Whitmer (slides)
Predicting Student Social and Emotional Skills using Learning Analytics
Clickstream data from Learning Management Systems (and other educational technologies) have repeatedly been shown to predict course-level student success; but why are these predictions accurate? What underlying psychological constructs or learning strategies might be latent in this behavioral data? In this presentation, we will present results from a research project conducted across five courses (n=927) at the University of Maryland, Baltimore County to evaluate the relationship between social and emotional (SE) skills (e.g. grit, tenacity, curiosity) assessed by ACT Tessera, student activity in the Blackboard Learn LMS, and use of VitalSource eTextbooks. We found significant relationships between SE skills and activity patterns, demonstrating a strong connection between SE skills and student learning strategies and practices.
This study builds on several recent methodological and conceptual advances in the learning analytics field. We take an evidence-centered design approach that maps the available data to pedagogically relevant concepts and activities in the courses under investigation, addressing the “clicks to constructs” issue noted by Lang et al. (2018). Further, we employ sequential data mining techniques to group individual data elements into patterned sequences common across courses. Finally, we evaluate the impact of student demographic and educational-preparedness factors on our results, ensuring that the findings can be generalized and that any differences between student groups are identified, if not fully explained, through follow-on analyses.
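The sequence-grouping step, turning individual clickstream events into patterned sequences, can be illustrated with a toy bigram count. The event names are hypothetical, and the study’s actual sequential data mining techniques are considerably more elaborate.

```python
from collections import Counter

# Hypothetical clickstream sessions (one event list per student).
sessions = [
    ["login", "read", "quiz", "read", "quiz"],
    ["login", "read", "quiz", "forum"],
    ["login", "forum", "read", "quiz"],
]

def bigram_counts(seqs):
    """Count adjacent event pairs across all sessions."""
    counts = Counter()
    for seq in seqs:
        counts.update(zip(seq, seq[1:]))
    return counts

# The most frequent pair is a candidate "patterned sequence":
# here, reading followed by quizzing recurs across all sessions.
common = bigram_counts(sessions).most_common(1)
```

Patterns surfaced this way can then be related to externally measured constructs (such as the SE skills above), which is the “clicks to constructs” mapping the abstract describes.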
Conversational Assessments and Tutoring with AI
Conversation is at the core of education, training, and the workplace. Conversational assessments involve open-response questions, constructed-response answers, and instant formative feedback, all expressed in natural language. This approach builds on Benjamin Bloom’s seminal work on mastery learning and allows the measurement of active knowledge recall, critical thinking, and problem solving: skills that are important for succeeding in the 21st-century work environment. The Virtual Learning Assistant (VLA) is a new genre of AI-based edtech that supports the scalable implementation of conversational assessments and tutoring. VLA can improve students’ learning outcomes, teachers’ productivity, and schools’ ability to deliver high-quality education at scale. This session will introduce the pedagogy of conversational assessments along with the challenges, solutions, and efficacy studies, and will close with a discussion of the future of educational assessments.
From Conceptualization to Implementation: Enabling a Scalable Voice-Based Learning Assistant
In the lead-up to taking the ACT, students typically review skills through test-preparation activities. It is beneficial to diagnose what the student appears to have mastered and identify specific areas needing more review based on prior assessment results and ongoing practice activity. This diagnosis can then drive a recommendation engine that formulates personalized lists of open educational resources, enabling a pathway for self-directed learning and review based on feedback and access to targeted instructional content.
It has been estimated that the worldwide smart speaker installed base will grow to more than 207 million units by the end of 2019. ACTNext, the innovation group within ACT, leveraged the capabilities of these devices by building a voice-driven learning solution. The learning assistant acts as a coach that can access a student’s mastery diagnostics and provide ongoing advice on what the student should review next. The assistant also knows all about the ACT, enabling conversations about what is on the test and when and where to take it, and can set reminders to ensure the student is on track to complete the assessment.
In this presentation, I will describe how we took this concept of building a voice-based assistant and implemented a free solution that can scale to millions of students and homes. I will introduce the components that enabled the assistant, from the Holistic Framework, ACT Academy and the Recommendation and Diagnostics (RAD) API to the voice elements of the Alexa Skill.
Kristin Stoeffler & Laurel Ozersky
EDU2050: Design and Development of Innovative Learning and Assessment Solutions
Higher-order skills such as creativity, critical thinking, scientific inquiry, and computational thinking transform lives and drive economies. However, learning and assessing these skills with traditional methods is challenging. Recent advancements in technology, learning science, cognitive psychology, and educational assessment enable the development of innovative learning and assessment solutions for these higher-order skills. This presentation will highlight current constructs, concepts, and techniques facilitating the effective design and development of technology-enhanced assessments for higher-order skills at scale. We will also share prototypes being used to explore this space as part of our EDU2050 initiative.
Saad Khan (slides)
Crisis in Space – A New Way to Measure Collaboration Skills Using Online Games
Games are everywhere; in fact, it may be hard to keep up with whatever new game your friends or your kids are playing. ACTNext has been working on collaborative problem solving in our own video game, and we recently wrapped up our first pilot study in area schools. Using AI and machine learning techniques, we are developing non-invasive methods of measuring social-emotional skills in an ecologically valid manner. Specifically, we will look at how team dynamics and team members’ task interactions manifest as observable behaviors, and how these behaviors can be analyzed via passively collected video, audio, and eye-tracking data streams. These data are merged with self- and peer-report measures to provide a holistic representation of individuals’ collaboration skills as they manifest in team settings. You may have heard of this project, or seen some coverage of it on the local news, but come hear directly from the researchers involved in this work to find out what it’s all about!