Toxicity is one of the leading reasons players leave online games. It’s more than just a community management issue—it affects player retention, brand perception, and ultimately, the success of your game.
In this session, Dr. Ewa J. Antczak—an expert in human behavior with over 20 years of experience—offers a thoughtful approach to addressing toxicity by combining insights from psychology with the capabilities of emerging technologies like AI. Her talk explores how developers can begin to design systems that help reduce harmful behavior and create more positive, lasting player experiences.
Rather than relying solely on moderation tools that react after the damage is done, Dr. Antczak will outline how to identify early signs of toxicity, implement AI that supports healthier interactions in real time, and apply behavior-informed design strategies that help keep communities engaged and respectful. She will also introduce concepts for scalable support systems and a plugin aimed at involving parents and educators in age-appropriate gaming environments.
Additionally, the session will touch on how developers can align their community strategies with growing expectations around compliance, digital safety regulations, and platform policies. As industry standards evolve and pressure increases from platforms, governments, and advocacy groups, building trust and safety into your game isn’t just good practice—it’s becoming a requirement.
Attendees will leave with practical ideas for supporting community health, reducing moderator burnout, meeting compliance expectations, and building trusted, sustainable gaming experiences.