Ask Me Anything Event on Learning Engineering with Neil Heffernan
Ulrich Boser
19:41
I'm excited to learn more about instrumentation!
Steve Ritter
19:56
Hi - Steve Ritter - Carnegie Learning
Jordi-Ysard Puigbò Llobet
20:08
Hi from Barcelona! I'm an ML engineer interested in EdTech
Ulrich Boser
20:08
All, note we will be recording this AMA.
Katerina Schenke
20:26
Hi! I’m Katerina based in Los Angeles. I am an education research consultant and founder of Katalyst Methods and EdTech Recharge.
David Wiley
20:27
David Wiley from Lumen Learning. We began running a/b tests a few years ago and are interested in connecting with others who have similar interests. Looking forward to learning more about ETRIALS.
Neil Heffernan at WPI and ASSISTments
20:43
Great to see you David Wiley!!
John Whitmer
20:52
Hi - I’m a Senior Fellow with Federation for American Scientists for IES, interested in thinking about the scope and ability to generalize infrastructure to other EdTech applications (and limits).
Roger Taylor
20:53
Good morning from Boston. Roger Taylor - I’m a Learning Scientist working at Emeritus.
David Wiley
21:01
Sarah Ergen - Product Manager at Lumen Learning.
Karen Zhou
21:06
Hi! I'm Karen, a PhD student in cognitive science in education at Teachers College, Columbia University. Hope to learn how I can do my research on ETRIALS.
Julio Guerra
21:36
Hi everybody! Julio Guerra from Universidad Austral de Chile
Vishal Goenka
21:45
Good morning! Vishal Goenka, Founder/CEO at 2Sigma School. Interested in Learning Engineering implementation in an online school.
Alexander Duffy
21:46
Good morning, I’m Alexander Duffy, a data scientist and independent researcher.
Ulrich Boser
22:10
If you have questions for Neil and Kumar, please drop them here.
Rachel Phillips
22:21
Hi, I'm Rachel Phillips from the OER Project. I'm not as steeped in the learning engineering side of LS, but have worked in different product settings, and want to keep learning more! Thank you for hosting this!
Jarl Kristensen
22:23
Hi - Jarl Kleppe Kristensen, PhD student at the Centre for Educational Measurement, University of Oslo, Norway.
April Murphy (Carnegie Learning)
22:51
I’m April Murphy at Carnegie Learning. I’m involved with the UpGrade project, an AB testing system for running large scale RCTs in educational software.
Ashish Gurung
22:51
Ashish Gurung, PhD Student (Computer Science), Worcester Polytech
Neil Heffernan at WPI and ASSISTments
23:17
Directions for learning how to use ETRIALS are here https://www.etrialstestbed.org/ but I will also try to make comments to help others that have their own platforms think about how to put infrastructure into their own worlds
Jim Goodell
25:14
Hi all, Jim Goodell w/ QIP. Vice Chair of the IEEE Learning Tech Standards Committee and on the steering committee for the IEEE. Working on a book based on the ICICLE definition of Learning Engineering.
Sami Baral
26:25
Hi, I'm Sami, a PhD student in Computer Science at Worcester Polytechnic Institute (WPI)
Meg Benner
26:25
For those of you who joined recently, please feel free to introduce yourself and share any questions in the chat.
Mark DeLoura
26:59
Hi all, I'm Mark DeLoura, CTO of Games and Learning, former Sr Advisor in Obama OSTP (Hi Kumar!), helped spin up and shut down GLASS Lab, and worked in game industry technology for around 25 years.
Jim Goodell
28:22
f.y.i. Burr Settles @ Duolingo developed a trainable spaced repetition model that uses machine learning to inform adaptive spaced-repetition instruction on the platform.
Susan Berman
28:24
Hi-Susan Berman, research project manager @ Carnegie Learning.
Owen Farcy
28:40
Hello everyone, I'm Owen Farcy, founder of the Encephalon Learning Group and a long-time consultant in higher education. Thanks Kumar and Co. for hosting this session!
Jenessa Peterson
29:22
Hi from Sacramento! I'm Jenessa Peterson, a data scientist and former teacher. I'm especially interested in large scale research infrastructure and also the important problem of implementing research findings at scale.
Steve Ritter
30:37
Email works better than calling me:)
Ulrich Boser
30:50
Here’s more on Upgrade https://www.carnegielearning.com/blog/upgrade-ab-testing/
April Murphy (Carnegie Learning)
31:21
Our website: https://www.upgradeplatform.org/
April Murphy (Carnegie Learning)
31:57
You can also email us to learn more/ask questions at upgradeplatform@carnegielearning.com
Katerina Schenke
32:24
Question: Would love to know if you can discuss other theoretical orientations/traditions to thinking about learning engineering
Jim Goodell
32:25
That slide shows a great example from CL on improving via A/B testing and understanding what experiences can fill the gaps for individual learners.
April Murphy (Carnegie Learning)
34:24
@Jim thanks! Also note that UpGrade doesn’t just enable visual changes in learning experiences—it can be used to test algorithms, under-the-hood settings, or other aspects of the experience.
Ulrich Boser
41:47
Curious. Are there any folks on here who have already instrumented their platforms?
Luke Eglington
42:32
Hi I'm Luke Eglington. I'm a postdoctoral researcher at University of Memphis. I study how to create adaptive practice systems pairing models of learning and pedagogical decision rules to optimally space practice. Question(s) I have: to what extent can these A/B tests adjust the platform itself? e.g., are these tests for tweaking how feedback is presented, or its duration, or can larger changes be explored, like the decision rule that determines what content is practiced next? Can the experiments be more nuanced than A/B? Many variables interact, and manipulating only one factor may provide misleading results.
John Whitmer
42:49
It's an interesting observation. In my experience in EdTech companies there's no lack of ideas about what to do, but there are bottlenecks in evaluating the impact of those interventions.
Steve Ritter
43:38
@Luke - our UpGrade system can’t do this yet, but it’s on our radar. We should talk.
Luke Eglington
43:58
Yes definitely!
David Wiley
44:33
Interested to learn more about embedded studies "disclosed" in the terms of use and how that pertains to IRB. If there are 100 studies running, are there 100 IRBs?
Ulrich Boser
46:13
I think this study from Erin is fascinating
Katerina Schenke
46:28
Yes! And supports previous theory!
Steve Ritter
51:08
I actually think experimentation can support the platform - running experiments shows a commitment to improvement.
Alexander Duffy
51:23
Q: In your mind what is the role of project based learning in curriculum? Additionally, are there tools / instruments / measurements that would in your mind improve the efficacy? Intuitively, PBL seems like an opportunity for self-generated feedback.
John Whitmer
51:39
I agree with Steve; perhaps you'd want to control experiments if they might challenge your core approach.
David Wiley
52:07
Isn’t the right message for people who don’t beat the “current best” mark to “go to continuous improvement and come back soon”?
John Whitmer
52:26
I've found great willingness from platform providers to present less-than-stellar results, and support from customers for doing so. Marketing dep't doesn't always love that, but higher ed sure does.
Katerina Schenke
56:36
Can you talk about the limitations of only running experiments online versus supplementing those data with other measures (child's demographic information, parent report of SES)?
Jim Goodell
59:00
IEEE ICICLE definition of learning engineering ieeeicicle.org
Steve Ritter
01:01:00
Neil - can you talk about how you balance making ETRIALS open to researchers without your assistance vs. also setting standards for what you'll allow to happen within ASSISTments?
Jim Goodell
01:01:00
Learning Engineering is a process and practice that applies the learning sciences using human-centered engineering design methodologies and data-informed decision making to support learners and their development. - IEEE ICICLE
John Whitmer
01:02:53
Should each “instrumentation” approach be unique, or are there some fundamental components that span EdTech platforms? I wonder about our ability to scale out these systems across different applications with some consistency and maybe efficiency (vs. doing it all separately).
John Whitmer
01:04:33
And, what do you think about these platforms as serving internal researchers vs. external researchers; are these infrastructures intended primarily for external audiences?
Jim Goodell
01:04:37
Building on @John’s question:
Jim Goodell
01:04:42
Even though the term “learning engineering” is 50+ years old, the field of practice is just emerging. Mature engineering domains have adopted technical and practice standards. Learning engineering has de facto standards for platform instrumentation data like LearnSphere/Datashop and emerging standards like xAPI. How might further development and adoption of standards help mature the practice of learning engineering?
David King
01:08:03
great to learn more about your A/B testing platforms @Steve and @Neil; thanks!
Luke Eglington
01:09:35
Are there plans to try to have surveys be part of "normal instructional practice"? A student's answer to the Q "are you tired today?" is probably pedagogically meaningful in terms of practice scheduling, and is probably something that real teachers take into consideration (i.e., a part of normal educational practice)
Jim Goodell
01:09:52
There are some PBL platforms that could be used to instrument other aspects of PBL.
Ulrich Boser
01:13:53
Always happy to chat. ulrich@the-learning-agency.com
Rachel Phillips
01:13:56
This was great, super interesting and useful, thank you!!
John Whitmer
01:14:03
My hope is that we can begin to mature the demands of teachers, administrators, and parents so that they think having this kind of research is required, not "optional".
Meg Benner
01:14:16
Thank you!