What is it about?

Teachers have scarce time to customize or improve instructional materials in AI tutors. We designed a teacher-guided crowdsourcing pipeline to improve the adaptive math hints of an AI-based tutoring system so they fit teachers' preferences, while requiring minimal expert guidance. We conducted two experiments involving 144 math teachers and 481 crowdworkers. We found that such an expert-guided revision pipeline could save experts' time and produce crowd-revised hints that teachers rated more highly than those from two comparison conditions; however, the crowd-revised hints did not improve on the original hints. The main challenge for crowdworkers may lie in understanding teachers' brief written comments and implementing them as effective edits without introducing new problems. We also found that teachers preferred their own revisions over hints from other sources, and that teachers exhibited varying preferences for hints. Overall, the results confirm that there is a clear need for customizing hints to individual teachers' preferences. They also highlight the need for more elaborate scaffolds that give the crowd specific knowledge of the requirements teachers have for hints.

Why is it important?

This work was motivated by a practical problem: how to leverage crowdsourcing to help time-strapped teachers improve and customize instructional materials in AI tutors, without requiring too much of the teachers' time. Theoretically, crowdsourcing pipelines have traditionally focused on content generation. It remains an open question how a pipeline might be designed so the crowd can succeed in a revision and customization task, a question our results shed light on.

Perspectives

Writing this article was a great pleasure, as it is my first mixed-methods study that aims to solve a practically driven problem, and it may be of interest to audiences in different research areas (e.g., HCI, AIED, and crowdsourcing). We aimed to leverage crowdsourcing to help time-strapped teachers improve and customize instructional materials in AI tutors, without requiring too much of the teachers' time. In this work, we conducted randomized controlled experiments with different conditions, performed quantitative analysis to evaluate the pipeline's results, and performed qualitative analysis to uncover the challenges in this expert-guided crowdsourcing pipeline.

Kexin Yang
Carnegie Mellon University

Read the Original

This page is a summary of: Can Crowds Customize Instructional Materials with Minimal Expert Guidance?, Proceedings of the ACM on Human-Computer Interaction, April 2021, ACM (Association for Computing Machinery),
DOI: 10.1145/3449193.
