
AI-slop and education

OUWB researchers urge caution following their study of AI-generated content

March 20, 2026

By Andrew Dietderich

High view counts for AI-generated educational videos do not always equate to quality, according to a recent study published by two professors from Oakland University William Beaumont School of Medicine.

“AI-Generated ‘Slop’ in Online Biomedical Science Educational Videos: Mixed Methods Study of Prevalence, Characteristics, and Hazards to Learners and Teachers” was recently published in JMIR Medical Education.

Co-authors were Eric Jones, Ph.D., associate professor, and Jane Newman, Ph.D., assistant professor — both of OUWB’s Department of Foundational Medical Studies. Additional authors were Boyun Kim, a graduate student in Oakland University’s Early Childhood Education program, and Emily Fogle, Ph.D., professor, California Polytechnic Institute.

The researchers said they want to help learners steer clear of bad content while assisting educators in identifying what to avoid if they are considering using AI to teach.

“If I could summarize everything in just one sentence it would be this: If you use AI, don’t let it do the thinking for you,” said Jones. “If anybody sees AI as a shortcut, they’re making a mistake.”

Study origins and AI-slop

The study originated with Jones wanting to systematically understand and evaluate the AI-generated educational videos that medical students use outside the classroom on platforms like YouTube and TikTok.

Understanding how students learn outside the classroom is critical to their success, said Jones.

“(Students) are spending dozens of hours a week on their own, on their devices, looking at other resources that we know nothing about,” he said.

AI-generated content racks up millions of views daily. But not all of it can be trusted, the study states – especially videos that purport to be educational.

“The recent growth in generative artificial intelligence (AI) tools … has resulted in low-quality, AI-generated material (commonly called ‘slop’) cluttering these platforms and competing with authoritative educational materials,” the study states.

“AI agents are not intelligent…the term is a misnomer,” Jones further explained. “They’re pattern matching machines and they’re very useful for certain things, but they don’t have logic, expertise and experience, or the motivations and values in a way that a human does.”

Newman — whose main research theme is game-based learning and other types of nontraditional study techniques — added that generating views and engagement appears to be the primary motivation behind AI-slop.

“The bigger danger is not in the initial viewing of content,” but in the perpetuating of bad knowledge, she said.

Methodology and findings

To start, Jones said the research team generally defined AI-slop to mean “material created by AI without any sort of specific intentional human oversight or intervention to make sure that it met its intended needs.” They also put emphasis on the amount of “human care” that appeared to be used in video production.

Once they defined AI-slop, the team embarked on a qualitative analysis of more than 1,000 AI-generated videos posted to TikTok or YouTube. They looked at videos for 10 medical science terms, such as biochemistry and cell biology, related to OUWB’s own Biomedical Foundations of Clinical Practice (BFCP) courses.

Every video was watched multiple times. Detailed notes were logged to track factual errors, discontinuities, linguistic problems, errors of explanation, and more – all with the intent of classifying each video as either legitimate or AI-slop.

It took about two months in early 2025.

Of the more than 1,000 videos, 57 were deemed “probably AI-generated and low-quality.” (Newman noted that the proportion of AI-slop on YouTube Shorts was “much higher.”)

Some were downright strange, like the video of Barney the Dinosaur — the same character that teaches kids lessons about sharing and the ABCs — being used to explain scientific terms while the song “Yankee Doodle” played in the background.

“One of the most common things we saw were analogy after analogy that never actually made sense,” said Newman. “It would just keep going through this cyclical, non-explanation of whatever the topic was.”

“Or they would give these grand, sweeping declarations about how important something like metabolism is without ever explaining what metabolism is,” added Jones.

Jones said he had actually expected more videos to be identified as AI-slop, but explained that the team was essentially looking for the “worst of the worst.”

Further, he said, the team believes the number would be much higher now, as it has been about a year since the initial video review.

What’s next?

By publishing the study results, Jones said, he hopes to offer a kind of “cautionary guide.”

For students, it’s a warning to treat AI-generated content — especially short videos — with a healthy amount of skepticism.

For educators, it’s a warning to not take the easy way out.

“I see a lot of instructors who really are going all-in on AI as a time saver and I think this study should tell you to slow down and realize that some things can go very, very wrong,” he said.

Newman said the study should also help educators understand that they “can’t just tell students to go on TikTok or YouTube to learn about a topic.”

“You need to provide direct guidance,” she said.

Jones, for example, has started a short list of recommended videos for every session he teaches.

The team has also turned its attention to a project it considers more “positive” — especially as the use of AI-generated content continues to grow.

“We’re working on a set of guidelines and rules for how to responsibly make educational materials with AI using a discrete set of auditable steps to make sure what you turn out will not be slop,” said Jones.

“That is probably the most positive piece of this,” said Newman.