How to Report Bad Users on Video Chat
The report button only works if you use it correctly. Here is how to file effective reports and what happens after you report someone.
Why Most Users Do Not Report
Most users never report bad behavior. They skip, move on, and convince themselves it was just part of the experience. This normalization is exactly what bad actors count on. If no one reports, platforms have no signal that something is wrong, and the harasser continues unchecked.
Shame plays a role too. Users who have been harassed often feel embarrassed, as if they did something wrong by using the platform in the first place. Nothing could be further from the truth. Reporting is not an admission of weakness — it is how communities protect themselves.
Many users also simply do not know how to report. They see the flag icon but are not sure what happens when they click it, whether their identity will be exposed, or whether anything will even come of it. The truth is that reports are almost always worth filing, and your identity is typically protected.
What Information to Include in a Report
Effective reports are specific. A report that says "this person was mean" gives moderation teams almost nothing to work with. A report that says "this user asked me to remove my clothing within 30 seconds of connecting, then became aggressive when I declined" gives clear, actionable information.
Include the username, the date and time (with time zone if possible), and a description of the specific behavior that violated the rules. If you have screenshots, note that in the report — some platforms let you attach images directly.
Avoid emotional language in the report. Stick to factual descriptions of what happened. "The user repeatedly asked for my phone number after I said no three times" is more useful than "this guy was a total jerk and ruined my experience."
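Put together, a strong report is short and factual. A hypothetical example: "Username: StarGazer_88. Date/time: March 14, approximately 9:40 PM Eastern. Within 30 seconds of connecting, the user asked me to remove my clothing and became verbally abusive when I declined. I have two screenshots of the chat and can provide them on request." Everything a moderator needs (who, when, what, and what evidence exists) fits in a few sentences.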
How Long Platforms Take to Act
Most platforms take 24-48 hours to review a report and act on it. During this window, the reported account may still be active. This is why you should block the user at the same time you report them: the block protects you immediately while the moderation team works.
Coomeet's moderation team is faster than most. For serious violations like threats or explicit content, Coomeet often acts within hours. Repeat offender policies mean that accounts with prior violations receive escalated penalties faster than first-time offenders.
If a week passes and you see no action on a clearly serious report, consider following up. Some platforms have support ticket systems where you can check the status of your report or escalate if it was not handled properly.
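A follow-up does not need to be long. Something like this (details hypothetical) is enough: "On March 14 I reported user StarGazer_88 for sexually explicit behavior on camera. The account is still active and the behavior has continued in new chats. Can you confirm whether my report was reviewed?" Include your original ticket or report number if the platform gave you one.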
What Happens to Reported Accounts
Minor first offenses typically receive a warning. The user is notified that their behavior was inappropriate and that further violations will bring harsher penalties. A warning is the most common outcome of a valid report against an account with no prior history.
Temporary bans are the next escalation. Accounts that accumulate multiple valid reports or commit a more serious violation may be suspended for 24 hours to 7 days. During this period, the user cannot access the platform.
Permanent bans are reserved for repeat offenders and serious violations like threats, harassment, blackmail, or sharing illegal content. Some platforms issue permanent bans without prior warnings for severe violations.
Escalating to Platform Management
If your initial report does not result in action and the behavior continues, you can escalate. Most platforms have a trust and safety email address or a support ticket system. Use these to provide additional context and evidence for serious incidents.
Social media escalation can work for high-priority cases. If a platform has an official Twitter or Facebook account, a public mention with details can sometimes accelerate the response. Never include your own private information, or anyone else's, in a public post.
For extreme cases involving threats, extortion, or illegal content, escalate to law enforcement in addition to platform reports. Provide law enforcement with your evidence and let them know what platform the incident occurred on. Platforms cooperate with valid legal requests.
Coomeet's moderation team responds quickly and takes reports seriously. Their 94% real-user rate means fewer bad actors on the platform overall. Full Coomeet review →