PROBLEM:
Cyberbullying, toxic language, and online harassment negatively impact students’ mental and emotional well-being. Harmful messages often go unnoticed, and existing systems focus mainly on blocking or punishment rather than on understanding and emotional support.
SOLUTION:
Kind Guardian is an AI-powered content safety and emotional support system that analyzes text and screenshots from online platforms to detect toxicity, harassment, hate speech, threats, and emotional harm. The system responds in a caring, non-punitive way by explaining why content is harmful and offering emotionally supportive guidance to the user.
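The detect-then-support flow described above can be sketched in a few lines. This is a minimal illustration, not the actual implementation: the keyword table and category labels are placeholders standing in for a trained toxicity classifier, and the function names (`analyze_message`, `supportive_response`) are invented for this example.

```python
# Minimal sketch of the Kind Guardian flow: classify a message,
# then respond with an explanation and supportive guidance.
# The keyword lookup below is a stand-in for a real toxicity model.

TOXIC_KEYWORDS = {
    "hate": "hate speech",
    "stupid": "insult",
    "kill": "threat",
}


def analyze_message(text: str) -> dict:
    """Flag harmful categories found in the text (placeholder logic)."""
    lowered = text.lower()
    found = {label for word, label in TOXIC_KEYWORDS.items() if word in lowered}
    return {"is_harmful": bool(found), "categories": sorted(found)}


def supportive_response(analysis: dict) -> str:
    """Explain why the content is harmful and offer caring, non-punitive guidance."""
    if not analysis["is_harmful"]:
        return "This message looks safe."
    categories = ", ".join(analysis["categories"])
    return (
        f"This message contains {categories}, which can hurt someone deeply. "
        "If you received it, remember it reflects on the sender, not on you. "
        "Talking to a trusted adult or friend can help."
    )


result = analyze_message("You are so stupid, I hate you")
print(supportive_response(result))
```

In a full system, the keyword check would be replaced by a model that scores toxicity, harassment, threats, and emotional harm, while the response layer keeps the same explain-and-support shape rather than simply blocking the sender.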