Creating a Safe Community with Guidelines & Data Privacy

We spend so much time planning for and thinking about all of the wonderful things that will happen in our community that we often gloss over potentially harmful situations. One of the most important things you can do for your community is to set the ground rules early on and create a safe place for everyone to engage. Limiting the risk of a bad situation protects the members as well as the company. By this point in your getting-started journey, you should be well aware of who your intended audience is, where they are going to live online, and where you want this community to go in the future. A critical next step is to define the rules that will keep your community running like a well-oiled machine and make people feel comfortable and engaged.

Community guidelines aren’t meant to be a legal & compliance throwaway document that no one ever reads (like a Terms of Service contract). Guidelines should be visible to everyone on your site, written in plain language, and have a human tone to them; after all, people come to your community to have a good time, not to be bogged down with rules & regulations. Having guidelines shows that you take safety seriously, provides clear boundaries for members, and helps ensure you’re following the law.

Data Privacy Laws

Guidelines are not only nice to have; often they’re required by law. For example, if your organization caters to children under the age of 13, you must be aware of the Children’s Online Privacy Protection Act (COPPA), a United States law that took effect in 2000. An organization operating online and catering to children (think something like LEGO) must have a privacy policy that explains when consent from a parent or guardian is required, how it plans to protect children’s privacy and safety online, and how it limits marketing to them. This is one reason why social networks typically don’t allow children under the age of 13 on their sites.

More regulation has come out in recent years surrounding data privacy for internet users. Most notably, the General Data Protection Regulation (GDPR) took effect in May 2018 for residents of the European Union. This was a massive change for companies operating in Europe (including US-based companies), as it requires them to be compliant in the following areas:

  • Obtaining Consent
  • Timely Security Breach Notification
  • Right to Data Access
  • Right to be Forgotten
  • Data Portability
  • Privacy by Design

While GDPR covers all online activity, it applies squarely to communities as well. To sign up for a community, members typically hand over a certain amount of Personally Identifiable Information (PII), such as name, email, phone, location, and birthday, and this is kept in a database or CRM. If you are evaluating a third-party vendor, ask about their compliance setup.
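For a concrete sense of what servicing those requests can look like, here is a minimal Python sketch of the Right to Data Access/Portability and Right to be Forgotten flows against a hypothetical SQLite members table (the table and field names are illustrative, not any specific platform’s schema). A real implementation would also have to cover backups, logs, and any third-party processors holding copies of the data.

```python
# Minimal sketch of servicing GDPR data-subject requests.
# "members", its columns, and PII_FIELDS are hypothetical examples.
import json
import sqlite3

PII_FIELDS = ["name", "email", "phone", "location", "birthday"]

def export_member_data(db: sqlite3.Connection, member_id: int) -> str:
    """Right to Data Access / Portability: return a member's PII as JSON."""
    row = db.execute(
        f"SELECT {', '.join(PII_FIELDS)} FROM members WHERE id = ?",
        (member_id,),
    ).fetchone()
    if row is None:
        raise KeyError(f"No member with id {member_id}")
    return json.dumps(dict(zip(PII_FIELDS, row)), indent=2)

def erase_member_data(db: sqlite3.Connection, member_id: int) -> None:
    """Right to be Forgotten: delete the member's stored PII."""
    db.execute("DELETE FROM members WHERE id = ?", (member_id,))
    db.commit()
```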

GDPR in Europe was only the starting point; California followed suit quickly thereafter with the California Consumer Privacy Act (CCPA), which went into effect in January 2020. It is the first law of its kind in the United States, but it applies only to companies that do business in California. It’s not as broad-sweeping as GDPR, but it shares many of the same tenets of protection. All of these laws come with steep fines if they aren’t followed.

Creating Safe Community Guidelines

If you’re not fully aware of all of these laws and their details as a community manager, that’s OK! You do want to be generally aware of them, as you may get requests from your users to manage their data. Depending on where your company operates, you may want to include some of these options in your guidelines, or at the very least link from them to your company’s privacy policy. Ask your legal & compliance department how to handle this subject.

If you search for “Community Guidelines” on Google, you will be presented with links to almost every major social network’s guidelines page and then some. There are myriad examples out there for you to peruse and adapt, but at a minimum you should include information on:

  • Scams 
  • Harassment, Hate Speech, Bullying, Violence, and other Illegal Actions
  • Sexual Content  (Is this a G, PG, R, or X-rated site?) 
  • Acceptable Language (define the actual words not allowed, if any; see the filter sketch after this list)
  • Sharing the Personal Information of Others (i.e. Doxxing) 
  • Acceptable Content  (are self promotion posts allowed? Can people share links? etc.) 
  • Do’s and Don’ts  (Do share stories about X,  Don’t include swear words, etc.) 
  • Repercussions: What Happens if These Guidelines Aren’t Followed
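On the Acceptable Language point above, most platforms back the written rule with an automated word filter. Here is a rough Python sketch of that idea; the banned-word list is a placeholder, and production filters also have to handle misspellings, spacing tricks, and context.

```python
# Rough sketch of a banned-word check; BANNED_WORDS is a placeholder list.
import re

BANNED_WORDS = {"examplebadword", "anotherbadword"}

def violates_language_policy(post_text: str) -> bool:
    """Return True if any whole word in the post is on the banned list."""
    words = re.findall(r"[a-z']+", post_text.lower())
    return any(word in BANNED_WORDS for word in words)
```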

One option that doubles as a great engagement tool when you launch is to have your first members work with you to create the rules. Getting input from members on how they want to shape their space can be incredibly empowering and allows them to feel ownership of it.

Safe Community Moderation

Once you have created your guidelines and they’re posted for all to see, you’ll want to set up a process for monitoring activity. As the community manager, this is your responsibility; however, you are not available 24 hours a day, 7 days a week, and you should not be expected to be on all the time. First, answer this question:

Do you want your community to be pre-moderated or post-moderated? 

Pre-moderation means that anything posted to your site is reviewed before it is published. Post-moderation means that content is published immediately and reviewed after the fact. There are pros and cons to both, including the amount of moderator time each requires: pre-moderation is safer but slows conversation down, while post-moderation keeps discussion flowing but means harmful content may be visible until a moderator gets to it.
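To make the difference concrete, here is a small, hypothetical Python sketch of the two flows (the class and method names are illustrative, not any particular platform’s API). The only structural difference is whether a post waits in the review queue before publication or is published immediately and reviewed afterward.

```python
# Hypothetical sketch contrasting pre- and post-moderation flows.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    published: bool = False

@dataclass
class Community:
    pre_moderated: bool                  # True = review before publishing
    review_queue: list = field(default_factory=list)
    feed: list = field(default_factory=list)

    def submit(self, post: Post) -> None:
        if self.pre_moderated:
            # Pre-moderation: hold the post until a moderator approves it.
            self.review_queue.append(post)
        else:
            # Post-moderation: publish immediately, review after the fact.
            post.published = True
            self.feed.append(post)
            self.review_queue.append(post)

    def approve(self, post: Post) -> None:
        post.published = True
        if post not in self.feed:
            self.feed.append(post)
        self.review_queue.remove(post)

    def reject(self, post: Post) -> None:
        post.published = False
        if post in self.feed:
            self.feed.remove(post)
        self.review_queue.remove(post)
```

Either way, the moderator works the same review queue; the choice simply decides whether members see a post before or after that review happens.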

Next – will you be moderating in-house or will you hire an outside agency? 

Do you have enough community managers or moderators inside your company to handle all of the content you’ll be reviewing? If not, hiring an outside agency is a smart choice, as you will also be getting their expertise. There are agencies dedicated to this that may even help you beyond moderation; The Social Element is one full-service option.

Lastly – if you are moderating in-house, but don’t quite have the staff to cover the volume, will you create a volunteer moderation team from within the community? 

Much like asking your community for help in creating the guidelines, you can also ask members to volunteer as moderators. Think of it like an online neighborhood watch, where everyone looks out for everyone else. A great example is Reddit, where individual communities are run by volunteer mods.

Ultimately, online safety isn’t the most fun or stimulating project, but it’s an incredibly important cornerstone of a thriving community. The internet is full of bad actors and trolls out to ruin a good thing. Take a few steps to learn about the relevant protections, work with your community to create rules that make sense for your brand, and then figure out how to consistently keep an eye on how things go day to day.


Looking for information about creating a safe place for your community? Reach out to Honeycommb today to get started and chat with our team about how to strategically connect your users. Book a Demo to see what we can build together for your people!

Jeremy Ross
