Child Safety Standards

Child and Youth Protection on wedium: Safety from the Start

At wedium, the protection of children and youth is our top priority. The platform is built on a multi-layered safety concept that combines structural safeguards with transparent moderation. During the current beta phase, minors are not yet admitted; the official launch this summer will introduce a comprehensive protection system developed in close coordination with our Advisory Board for Child and Youth Protection.

 

Our Safety Measures at a Glance:

 

1. Safety by Design: Architecture for Secure Interactions

  • Asymmetric Contact Rules: Adults cannot send direct messages to minors unless there is a mutually confirmed personal connection.
  • Privacy by Default: Youth profiles are set to the highest privacy level by default and are protected from unwanted visibility.
  • Zero Tolerance for Grooming: Targeted contact with sexual intent or requests for private images result in immediate bans and reports to authorities.
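The asymmetric contact rule above can be sketched as a simple permission check. This is a minimal illustration only; the names (`User`, `can_send_direct_message`, `confirmed_connections`) are hypothetical and do not reflect wedium's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class User:
    user_id: str
    is_minor: bool


def can_send_direct_message(sender: User, recipient: User,
                            confirmed_connections: set) -> bool:
    """Asymmetric contact rule (sketch).

    An adult may send a direct message to a minor only if both
    sides have confirmed a personal connection. Messages that do
    not target a minor are not restricted by this particular rule.
    """
    if sender.is_minor or not recipient.is_minor:
        return True
    # Adult -> minor: require a mutually confirmed connection,
    # stored here as an unordered pair of user IDs.
    pair = frozenset((sender.user_id, recipient.user_id))
    return pair in confirmed_connections
```

The unordered `frozenset` pair models that the connection must be confirmed by both sides, not granted by one party alone.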

 

2. Prevention Against Digital Violence

  • Intervention in Digital Violence: Our architecture makes cybergrooming and sextortion more difficult. For peer-to-peer violence (e.g., cyberbullying), we provide low-threshold reporting channels and rapid intervention processes.
  • Protection from Sharenting: Children have a right to digital privacy. We protect their data and integrity—even from their guardians, if necessary.

 

3. Professional Moderation and Triage System

  • Human Expertise: A specialized team evaluates reported content and acts in cases of danger—supported by technology but guided by human judgment.
  • Triage System for Rapid Response:
      • Level 1: Automated filters block known abusive content (CSAI, child sexual abuse imagery).
      • Level 2: Expert moderators review reports for context and nuance.
      • Level 3: Trauma-competent specialists handle severe incidents and cooperate with authorities.
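The three-level triage flow could be sketched as a routing function. This is an illustrative outline only; the field names (`content_hash`, `severity`) and the function `triage` are hypothetical and not taken from wedium's systems:

```python
def triage(report: dict, known_csai_hashes: set) -> int:
    """Route a report to a triage level (hypothetical sketch).

    Level 1: automated hash-match blocking of known abusive content.
    Level 2: human moderator review for context and nuance.
    Level 3: trauma-competent specialists for severe incidents.
    """
    # Level 1: known abusive content is blocked automatically,
    # without waiting for human review.
    if report["content_hash"] in known_csai_hashes:
        return 1
    # Level 3: severe incidents go straight to specialists who
    # can escalate to authorities.
    if report["severity"] == "severe":
        return 3
    # Level 2: everything else gets contextual human review.
    return 2
```

The key design point is that automation only handles the unambiguous case (known content hashes); everything requiring judgment is routed to humans, matching the "supported by technology but guided by human judgment" principle above.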

 

4. Transparency and Responsibility

  • Clearly Defined Netiquette: Respect, verification requirements, and documentation obligations under the Digital Services Act (DSA) create binding frameworks.
  • No Compromises on Safety: We invest in human moderation and trauma support—because safety is not an option, but our obligation.

 

5. Collaboration with Authorities and Organizations

  • In cases of acute danger (e.g., child abuse), we immediately initiate official reporting channels and cooperate with institutions such as the BKA or Interpol.

 

Why This Matters:
Digital violence is multifaceted—ranging from sexual harassment to organized peer violence. wedium relies on preventive architecture, human expertise, and legal compliance to create a safe environment for young users. Safety is not a feature; it is the foundation of our network.

Note: We consistently invest in human moderation and legal compliance (DSA). In the current beta phase, we are optimizing these systems with adult users to provide youth with a space at launch that consistently protects their rights and well-being.

wedium is the first social media platform for real people only.