Acceptable Use Policy
Template — South Africa
An attorney-drafted Acceptable Use Policy template designed specifically for South African digital platforms, websites, and online services. This comprehensive, legally compliant document defines permitted and prohibited uses, content standards, enforcement mechanisms, and reporting obligations — ensuring compliance with the Cybercrimes Act 19 of 2020, the Electronic Communications and Transactions Act 25 of 2002, the Consumer Protection Act 68 of 2008, and the Protection of Personal Information Act 4 of 2013.
Drafted by qualified South African attorneys
Reviewed for compliance with current legislation · Last updated April 2026
Why Your Business Needs This Agreement
No Legal Basis for Removing Harmful Content
Without an AUP that specifically prohibits categories of harmful content and establishes enforcement procedures, the platform has no clear contractual basis for removing content — even content that is illegal, defamatory, or dangerous. Users who have their content removed can claim the platform acted arbitrarily if there are no published rules governing content standards. The AUP provides the legal foundation for every content moderation decision.
Criminal Liability Under the Cybercrimes Act
The Cybercrimes Act 19 of 2020 imposes reporting obligations on electronic communications service providers under Section 54. Platforms that become aware of criminal activity on their service must report it to the SAPS within 72 hours. Without an AUP that informs users of these obligations, establishes monitoring procedures, and defines the platform's reporting workflow, the platform risks non-compliance with its statutory obligations — which is itself a criminal offence under Section 54(3).
Loss of ECTA Safe Harbour Protection
ECTA Chapter XI provides limited liability protection for platforms hosting user-generated content — but only if the platform complies with the notice-and-takedown procedure. Without an AUP that implements this procedure, designates a receiving agent, and publishes their contact details, the platform has no safe harbour protection. This means the platform can be held directly liable for infringing content uploaded by its users — as if the platform had created the content itself.
Disproportionate Enforcement Challenged Under the CPA
The CPA requires that contractual terms — including enforcement provisions — be fair, just, and reasonable. An AUP that imposes immediate permanent termination for minor violations, provides no appeals process, or applies enforcement inconsistently across users may be challenged as containing unfair terms under Section 48. A graduated enforcement approach with clear, documented procedures and an appeals mechanism demonstrates the fairness that the CPA requires.
Platform Abuse Degrading Service Quality
Without an AUP that prohibits resource abuse — excessive API calls, bandwidth consumption, storage misuse, cryptocurrency mining — a small number of abusive users can degrade service quality for the entire user base. The AUP provides the contractual authority to enforce technical limits, throttle abusive accounts, and terminate users who persistently abuse shared resources.
Data Scraping Destroying Competitive Advantage
Competitors and data brokers can systematically scrape platform content — product listings, user reviews, pricing data, business information — without consequence if the AUP does not prohibit automated access and data extraction. The AUP, combined with the Cybercrimes Act's prohibition on unauthorised access (Section 2) and the Copyright Act's protection of original content, provides multiple enforcement mechanisms against scraping.
What is an Acceptable Use Policy?
An Acceptable Use Policy (AUP) is an essential governance document for any South African digital platform, website, or online service that provides user accounts, hosts user-generated content, or offers access to shared computing resources. The AUP establishes the rules of engagement — defining what users can and cannot do on the platform, the content standards they must meet, and the consequences of violations. Without an AUP, platforms lack a clear legal basis for moderating content, suspending abusive accounts, or cooperating with law enforcement — leaving them exposed to both user abuse and regulatory liability.
The South African regulatory environment for digital platforms involves multiple intersecting statutes that directly shape the content and enforcement provisions of an AUP. The Cybercrimes Act 19 of 2020 is the most recent and significant addition. It creates criminal offences for a range of online activities that an AUP must address: unlawful access to computer systems (Section 2), unlawful interception of data (Section 3), unlawful interference with data or computer programmes (Section 5), unlawful interference with a computer system (Section 6), cyber fraud (Section 8), cyber forgery and uttering (Section 9), cyber extortion (Section 10), and the distribution of malicious communications (Section 14). Crucially, Section 54 of the Cybercrimes Act imposes reporting obligations on "electronic communications service providers" — which the Act defines broadly to include persons who provide electronic communications services. SaaS platforms, social media services, marketplace operators, and cloud providers may fall within this definition, triggering an obligation to report certain offences to the South African Police Service (SAPS) within 72 hours of becoming aware of them.
The Electronic Communications and Transactions Act 25 of 2002 (ECTA) provides the framework for content moderation and intermediary liability. Chapter XI of ECTA establishes limited liability protections for service providers who host third-party content — but these protections are conditional on compliance with the notice-and-takedown procedure. When a rights holder notifies the platform of infringing content through a compliant takedown notice, the platform must "expeditiously" remove the content. The platform must designate an agent to receive takedown notices and publish the agent's contact details. Failure to comply with the notice-and-takedown procedure exposes the platform to liability for the infringing content it hosts.
The Consumer Protection Act 68 of 2008 (CPA) requires that the AUP — as part of the platform's contractual terms — be fair, just, and reasonable (Section 14), in plain language (Section 22), and not contain terms that are unfair, unreasonable, or unjust (Section 48). This means the enforcement provisions must be proportionate: the platform cannot impose disproportionate penalties for minor violations, and the user must have reasonable notice of what constitutes a violation and what the consequences are.
POPIA intersects with the AUP in several ways. Content that contains personal information of third parties (doxxing, identity theft, non-consensual sharing of personal information) violates POPIA and should be prohibited. The platform's enforcement actions — reviewing reported content, investigating violations, sharing user data with law enforcement — must comply with POPIA's processing requirements. And the AUP itself, as part of the platform's terms, must be consistent with the platform's Privacy Policy regarding data collection and processing.
This attorney-drafted template covers every critical area: permitted uses, prohibited content categories (illegal content, hate speech, defamation, IP infringement, harmful content), prohibited activities (hacking, scraping, spam, malware, DDoS), system and resource abuse, graduated enforcement with clear consequences, reporting and appeals mechanisms, law enforcement cooperation under the Cybercrimes Act, content moderation procedures, ECTA notice-and-takedown compliance, and modification provisions with CPA-compliant notice. Whether you operate a SaaS platform, marketplace, social community, cloud service, or any digital platform with user accounts, this AUP provides the governance framework your platform requires.
Who Needs This
The Cybercrimes Act 19 of 2020 Section 54 requires electronic communications service providers to report certain criminal offences to the SAPS within 72 hours — failure to report is itself a criminal offence under Section 54(3)
ECTA Chapter XI provides limited liability protection for platforms hosting user-generated content, but only if the platform complies with the notice-and-takedown procedure — non-compliance exposes the platform to direct liability for infringing content
The CPA Section 48 requires that enforcement provisions in the AUP be fair, just, and reasonable — disproportionate penalties for minor violations may be declared void by a court or the National Consumer Tribunal
Hate speech is defined in Section 10 of PEPUDA; in Qwelane v SAHRC (2021) the Constitutional Court severed the overbroad "hurtful" element but confirmed the remainder of the definition. Platforms should reference this legal definition rather than creating their own
POPIA applies to the platform's content moderation activities — reviewing reported content, investigating users, and sharing data with law enforcement constitute processing of personal information and must comply with the conditions for lawful processing
Key Clauses Included
This Acceptable Use Policy template covers 11 essential sections, each drafted by South African attorneys.
Permitted Uses
Establishes the legitimate, authorised uses of the platform — what the service is designed for and how users may properly use it. Sets the baseline expectations for normal, authorised activity and provides the context against which prohibited uses are assessed. Includes the requirement that users comply with all applicable South African laws and regulations in their use of the platform.
Prohibited Content
Comprehensive list of content that users may not upload, share, store, or distribute through the platform. Categories include: content that violates South African criminal law (including the Cybercrimes Act and the Films and Publications Act 65 of 1996), hate speech as defined in Section 10 of the Promotion of Equality and Prevention of Unfair Discrimination Act 4 of 2000, defamatory material, content that infringes intellectual property rights (copyright, trademarks, patents), child sexual abuse material, non-consensual intimate images, content promoting violence or terrorism, and personal information shared without the data subject's consent in violation of POPIA.
Prohibited Activities
Activities that constitute platform misuse and may also constitute criminal offences under the Cybercrimes Act 19 of 2020. Includes: unauthorised access to computer systems or accounts (Section 2), unlawful interception of communications (Section 3), data scraping beyond authorised API limits, spam and unsolicited bulk communications, phishing and social engineering, distribution of malware, ransomware, or viruses, denial-of-service attacks or attempts to degrade service performance (Section 6), circumvention of security measures, impersonation of other users, and automated access (bots, crawlers, scrapers) without prior written authorisation.
System & Resource Abuse
Technical restrictions to prevent activities that degrade service quality for other users. Covers API call limits and rate limiting, bandwidth consumption caps, storage quota enforcement, excessive CPU or memory usage, automated access patterns that resemble denial-of-service behaviour, and the use of the platform's infrastructure for cryptocurrency mining, torrenting, or other resource-intensive activities not related to the platform's intended purpose.
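The rate limiting described above is commonly implemented as a token bucket. The sketch below is a minimal, illustrative version (class name, rate, and capacity are hypothetical, not part of the template); the clock is passed in explicitly so the limiter is easy to test, and a production deployment would track one bucket per account or API key.

```python
class TokenBucket:
    """Minimal token-bucket rate limiter: each account accrues `rate`
    tokens per second up to `capacity`; each API call consumes one token."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)  # start with a full bucket
        self.last = 0.0                # timestamp of the previous call

    def allow(self, now: float) -> bool:
        # Refill tokens for the time elapsed since the last call, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # the caller would return HTTP 429 and log the event
```

Exceeding the limit then becomes an objectively logged event, which supports the documented, consistent enforcement the CPA expects.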
Enforcement & Graduated Consequences
The graduated enforcement process that the platform will follow for violations. Tier 1: Warning — for first-time minor violations, the user receives a written warning identifying the violation and the applicable AUP provision. Tier 2: Content removal — offending content is removed with notification to the user. Tier 3: Temporary suspension — the user's account is suspended for a defined period (typically 7-30 days). Tier 4: Permanent termination — the user's account is permanently terminated and they are prohibited from creating new accounts. Severe violations (criminal activity, distribution of CSAM, imminent threats of violence) result in immediate termination without prior warning.
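The four tiers and the severe-violation fast path can be expressed as a small escalation function. This is an illustrative sketch only: the category strings and the one-tier-per-prior-violation rule are assumptions, and a real platform would map categories to tiers exactly as its AUP specifies.

```python
from enum import IntEnum

class Tier(IntEnum):
    WARNING = 1
    CONTENT_REMOVAL = 2
    TEMPORARY_SUSPENSION = 3
    PERMANENT_TERMINATION = 4

# Hypothetical category labels; severe violations bypass the graduated ladder.
SEVERE = {"csam", "imminent_threat", "criminal_activity"}

def next_tier(category: str, prior_violations: int) -> Tier:
    """Severe violations go straight to termination; otherwise escalate
    one tier per prior violation, capped at permanent termination."""
    if category in SEVERE:
        return Tier.PERMANENT_TERMINATION
    return Tier(min(1 + prior_violations, Tier.PERMANENT_TERMINATION))
```

Encoding the ladder this way keeps enforcement consistent across moderators, which is part of demonstrating CPA Section 48 fairness.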
Content Moderation Procedures
The platform's content moderation approach — whether automated (AI-based content filtering), manual (human content reviewers), or hybrid. Specifies the response timeframes for reported content (typically 24-48 hours for initial review), the criteria for prioritising reports (severity of potential harm), the qualifications and training of content moderators, and the documentation maintained for moderation decisions. Addresses the limitations of automated moderation and the right to human review.
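Prioritising reports by severity of potential harm, as described above, maps naturally onto a priority queue. The severity scores and category names below are hypothetical placeholders an operator would tune to match their AUP categories.

```python
import heapq
from itertools import count

# Hypothetical severity scores: lower number = reviewed sooner.
SEVERITY = {"csam": 0, "imminent_threat": 0, "hate_speech": 1,
            "ip_infringement": 2, "spam": 3}

class ReportQueue:
    """Min-heap keyed on severity so the most harmful reports surface
    first; the counter breaks ties in arrival order."""

    def __init__(self):
        self._heap = []
        self._seq = count()

    def submit(self, report_id: str, category: str):
        severity = SEVERITY.get(category, 3)  # unknown categories go to the back
        heapq.heappush(self._heap, (severity, next(self._seq), report_id))

    def next_report(self) -> str:
        return heapq.heappop(self._heap)[2]
```

A queue like this also produces an auditable review order, supporting the documentation of moderation decisions mentioned above.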
Reporting Violations & Appeals
How users can report AUP violations — including the reporting mechanism (in-platform button, email, webform), the information required in a report, the review process for reports, timeframes for enforcement decisions, and confirmation of action taken. Also establishes the user's right to appeal content removal or account suspension — including the appeal mechanism, the review process (conducted by a different reviewer than the original decision-maker), the timeframe for appeal decisions, and the reinstatement procedure for successful appeals. Fair appeals processes are important for CPA Section 48 compliance.
ECTA Notice-and-Takedown Compliance
Implements the notice-and-takedown procedure required by ECTA Chapter XI for platforms hosting user-generated content. Identifies the designated agent for receiving takedown notices (with name, email, physical address, and phone number), specifies the information required in a compliant takedown notice, the timeframe for removing reported content (typically 24-48 hours from receipt of a compliant notice), the counter-notice procedure for users who believe their content was wrongly removed, and the put-back process if the original complainant does not pursue legal action within the prescribed period.
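An intake form can check a takedown notice for facial compliance before it reaches the designated agent. The field names below paraphrase the notice elements in ECTA section 77(1) and should be verified against the Act's exact wording; this is a sketch, not the template's definitive checklist.

```python
# Field names paraphrase the take-down notice elements in ECTA s 77(1);
# confirm the exact statutory wording before relying on this check.
REQUIRED_FIELDS = (
    "complainant_full_name", "complainant_address", "signature",
    "right_infringed", "material_identified", "remedial_action",
    "contact_details", "good_faith_statement", "truth_statement",
)

def missing_fields(notice: dict) -> list[str]:
    """Return the notice elements absent or empty in a takedown notice;
    an empty list means the notice is facially compliant."""
    return [f for f in REQUIRED_FIELDS if not notice.get(f)]
```

Rejecting incomplete notices with a list of missing elements keeps the removal clock from starting on non-compliant submissions.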
Law Enforcement Cooperation & Cybercrimes Act Reporting
The platform's obligations under Section 54 of the Cybercrimes Act 19 of 2020. Electronic communications service providers must report certain offences to the SAPS within 72 hours of becoming aware of them. The AUP informs users that: illegal activities conducted through the platform will be reported to law enforcement, user data may be disclosed to the SAPS or other authorities pursuant to lawful investigation requests (search warrants, subpoenas, Section 27 preservation orders), the platform will preserve evidence as directed by law enforcement under Section 29 of the Cybercrimes Act, and cooperation with law enforcement is a legal obligation, not a discretionary decision.
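Because the Section 54 window runs from the moment the provider becomes aware of the offence, the awareness timestamp should be recorded immediately and the deadline computed from it. A minimal sketch, assuming the 72-hour window described above; function names are illustrative.

```python
from datetime import datetime, timedelta, timezone

# 72-hour window per the Section 54 obligation described above.
REPORTING_WINDOW = timedelta(hours=72)

def saps_report_deadline(became_aware: datetime) -> datetime:
    """Latest time by which the report must reach the SAPS, measured
    from when the provider became aware of the offence."""
    return became_aware + REPORTING_WINDOW

def is_overdue(became_aware: datetime, now: datetime) -> bool:
    return now > saps_report_deadline(became_aware)
```

Storing timestamps in UTC avoids ambiguity when the 72-hour deadline is later scrutinised.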
Intellectual Property Protection
Provisions protecting the intellectual property rights of both the platform and third parties. Prohibits users from uploading content that infringes copyright (under the Copyright Act 98 of 1978), trademarks (under the Trade Marks Act 194 of 1993), or other IP rights. Establishes a repeat infringer policy — users who receive multiple valid takedown notices will have their accounts terminated. Addresses the ECTA safe harbour protections and the platform's responsibilities as an intermediary.
Modifications & Notice
How the AUP may be updated — the platform's right to modify the policy with reasonable notice (typically 30 days for material changes), the notification mechanism (email and in-platform notice), and how continued use after the effective date constitutes acceptance of the modified terms. Subject to CPA Section 14 requirements that material changes to terms be fair, just, and reasonable. Includes a version history with effective dates for transparency and audit purposes.
South African Law Compliance
Cybercrimes Act 19 of 2020
The Cybercrimes Act is the primary criminal legislation relevant to acceptable use policies. It creates offences for unlawful access to computer systems (Section 2), unlawful interception of data (Section 3), interference with data (Section 5), interference with a computer system (Section 6), cyber fraud (Section 8), cyber forgery (Section 9), cyber extortion (Section 10), and malicious communications (Section 14). Section 54 imposes reporting obligations on electronic communications service providers — requiring them to report certain offences to the SAPS within 72 hours of becoming aware of them. Section 27 provides for preservation of evidence orders, and Section 29 provides for seizure orders. The AUP should prohibit all activities that constitute cybercrimes and inform users of the platform's reporting obligations.
Electronic Communications and Transactions Act 25 of 2002
ECTA Chapter XI is critical for platforms hosting user-generated content. It provides limited liability protection for service providers who act as intermediaries — meaning they host, cache, or transmit third-party content without initiating or modifying it. This protection is conditional on compliance with the notice-and-takedown procedure: when notified of infringing content through a compliant takedown notice, the platform must expeditiously remove the content. Section 75 protects service providers from liability for content they did not create or modify. Section 77 provides the takedown procedure. The platform must designate an agent to receive takedown notices and publish the agent's contact details. Without ECTA compliance, the platform may be held directly liable for user-generated content that infringes third-party rights.
Consumer Protection Act 68 of 2008
The CPA requires that the AUP — as part of the platform's contractual terms — be fair, just, and reasonable (Section 14), in plain language (Section 22), and not contain unfair, unreasonable, or unjust terms (Section 48). The enforcement provisions must be proportionate: the platform cannot impose disproportionate penalties for minor violations, and the user must have reasonable notice of what constitutes a violation and what the consequences are. Section 48(2)(d) specifically provides that a term is unfair if it requires the consumer to waive any rights, assume any obligation, or waive any liability on terms that are so adverse to the consumer that their agreement was not truly voluntary.
Protection of Personal Information Act 4 of 2013
POPIA intersects with acceptable use policies in multiple ways. Content that exposes third-party personal information without consent (doxxing, identity-based harassment, non-consensual sharing of personal details) violates POPIA and should be prohibited. The platform's enforcement actions — reviewing reported content, investigating users, sharing data with law enforcement — constitute processing of personal information under POPIA and must comply with the conditions for lawful processing. Section 11 provides the right to object to processing, Section 22 requires breach notification, and Section 72 restricts cross-border transfers of evidence shared with international law enforcement agencies.
Promotion of Equality and Prevention of Unfair Discrimination Act 4 of 2000
Section 10 of PEPUDA prohibits hate speech — defined as speech that advocates hatred based on race, gender, sex, pregnancy, marital status, ethnic or social origin, colour, sexual orientation, age, disability, religion, conscience, belief, culture, language, or birth, and that constitutes incitement to cause harm. The AUP should prohibit hate speech as defined under PEPUDA, which provides a clearer and broader definition than the common law. The Equality Courts have jurisdiction over hate speech complaints and can order content removal, issue interdicts, and award damages.
South African businesses are lining up for My-Contracts — be first in when we launch
Create Your Acceptable Use Policy in Minutes
Our guided wizard walks you through every clause — no legal knowledge required. Attorney-drafted, South African law compliant.
Identify your platform's risk profile and content categories
Assess what types of content users can create, upload, or share on your platform, and what activities they can perform. Identify the highest-risk content categories (user-generated content, financial transactions, personal information sharing) and the highest-risk activities (automated access, resource-intensive operations, peer-to-peer interactions). This risk assessment determines the scope and detail of your AUP.
Define prohibited content and activities with specific examples
For each prohibited category, provide a clear definition and specific examples. Users should be able to understand what is prohibited without legal interpretation. Cross-reference the Cybercrimes Act offences (Sections 2-14), the PEPUDA hate speech definition (Section 10), and the Copyright Act infringement provisions. Generic prohibitions like "no inappropriate content" are unenforceable — specificity is essential.
Design the graduated enforcement process
Map each prohibited content and activity category to a specific enforcement tier (warning, content removal, temporary suspension, permanent termination). Severe violations should receive immediate termination; minor violations should receive warnings. Document the process clearly in the AUP, including timeframes for each stage, the format of notifications, and the user's right to appeal. Test the process against CPA Section 48 fairness standards.
Implement the ECTA notice-and-takedown procedure
Designate an agent to receive takedown notices and publish their contact details prominently on your website and in the AUP. Create a standardised takedown notice form that collects all information required by ECTA Chapter XI. Establish internal procedures for reviewing notices, removing content, notifying affected users, and processing counter-notices. Set response timeframes (24-48 hours for initial action).
Deploy and integrate with your Terms of Service
Publish the AUP as a standalone document on your website, clearly linked from your Terms of Service. Include a statement in the Terms of Service that the AUP is incorporated by reference and that violations constitute a breach. Configure your platform to present the AUP during user onboarding. Implement the reporting mechanism (in-platform reporting button), the appeals process, and the content moderation workflow. Train your team on the enforcement procedures and the Cybercrimes Act reporting obligations.
Frequently Asked Questions
What is an Acceptable Use Policy, and why does my platform need one?
An Acceptable Use Policy (AUP) is a governance document that defines the rules and boundaries for using a digital platform or service. It specifies what users can do, what they cannot do, the content standards they must meet, and the consequences of violations. Your platform needs an AUP for several legal and practical reasons. First, without one, you lack a clear contractual basis for moderating content, suspending abusive accounts, or terminating users who violate community standards. Second, the Cybercrimes Act 19 of 2020 creates criminal offences for online activities that your platform must prohibit and may be obligated to report to the SAPS under Section 54. Third, ECTA Chapter XI provides liability protection for platforms hosting user-generated content — but only if you comply with the notice-and-takedown procedure, which the AUP implements. Fourth, the CPA requires that your enforcement procedures be fair and reasonable, which means they must be documented and proportionate. An AUP is not optional — it is the legal and operational foundation for platform governance.
What You Get With This Template
Drafted specifically for South African digital platforms — compliant with the Cybercrimes Act 19 of 2020, ECTA Chapter XI, the CPA, and POPIA
Comprehensive prohibited content categories aligned with the Cybercrimes Act offences (Sections 2-14), PEPUDA hate speech provisions (Section 10), and Copyright Act infringement standards
Full ECTA notice-and-takedown procedure implementation with designated agent provisions, compliant notice requirements, counter-notice mechanisms, and response timeframes
Cybercrimes Act Section 54 reporting workflow including the 72-hour SAPS notification timeline, evidence preservation obligations, and law enforcement cooperation procedures
CPA-compliant graduated enforcement process with proportionate consequences, clear notice, and a meaningful appeals mechanism
System and resource abuse provisions to protect shared infrastructure from bandwidth, storage, and compute abuse
Customisable template with clearly marked decision points for content categories, enforcement tiers, response timeframes, and appeal procedures
Plain language drafting meeting the CPA Section 22 standard for accessibility and comprehension