Facebook announced new steps today to reduce bullying and harassment across its platforms. The company added tools that identify harmful content faster, using artificial intelligence to spot offensive posts and comments. The technology detects bullying based on words and images, and it works in many languages.
(Facebook Expands Its Fight Against Cyberbullying)
Facebook also improved its reporting system, so users can now report bullying more easily. Trained teams review reports quickly and remove content that breaks Facebook’s rules, which ban threats, hate speech, and targeted harassment. People who break the rules may lose access to their accounts.
New controls give users more power. People can decide who can comment on their public posts; options include friends, specific lists, or no one. Users can also hide comments containing certain words, which helps manage unwanted interactions. A feature called “Limits” automatically hides comments and messages from people outside a user’s friends list. This is useful during times of high attention.
Facebook is working with experts on this issue, including safety organizations and researchers, whose advice helps shape the company’s policies and tools. Facebook also provides resources for people experiencing bullying: guides offer tips for handling harassment, and support features connect users to help organizations.
The company stated its commitment to user safety. It believes these updates will make its platforms safer. Facebook wants people to feel secure online. These efforts are part of ongoing work against online abuse. The goal is to foster respectful communication. Facebook will keep updating its tools as needed.