Educators are adopting Claude across a range of instructional tasks; curriculum development and academic research are among the most common uses, and some teachers are automating grading workflows. Anthropic analyzed 74,000 anonymized conversations from Free and Pro accounts tied to higher-education email addresses from May–June 2025 and matched each interaction to an educational task in the O*NET database. The analysis was supplemented with survey data and qualitative interviews with 22 early-adopter faculty at Northeastern University. The findings show educators weighing the use of AI as a copilot against outsourcing work to automation, and indicate that AI companies are expanding tools tailored to education while educators' choices vary by task and context.
Much of the focus on AI in education is on how students will be affected by AI tools. Many worry that the temptation to cheat and the erosion of critical thinking skills will diminish the quality of students' education. However, Anthropic's latest education report focuses on educators' outlook on AI in the classroom, and it finds some surprising ways teachers are implementing the tech.
To conduct this analysis, Anthropic examined anonymized conversations between its Claude.ai chatbot and Free and Pro accounts associated with higher-education email addresses, filtering for education-specific tasks from May and June of 2025. Within that time period, Anthropic identified 74,000 conversations involving tasks such as creating syllabi and grading assignments. The company also matched each conversation to the most fitting educational task in the US Department of Labor's Occupational Information Network (O*NET) database.