Google helps to enable a free and open web by helping publishers monetise their content and advertisers reach prospective customers with useful, relevant products and services. Maintaining trust in the ads ecosystem requires setting limits on what we'll monetise.
When you monetise your content with Google ad code, you're required to adhere to the following policies. By 'content', we mean anything present on your page or app – including other advertisements and links to other sites or apps. Failure to comply with these policies may result in Google blocking ads from appearing against your content, or suspending or terminating your account.
These restrictions apply in addition to any other policies governing your use of Google publisher products.
Illegal content
We do not allow content that is illegal, promotes illegal activity or infringes on the legal rights of others.
Intellectual property abuse
We do not allow content that:
- infringes copyright. It is our policy to respond to notices of alleged infringement that comply with the Digital Millennium Copyright Act (DMCA).
- sells or promotes the sale of counterfeit products. Counterfeit goods contain a trademark or logo that is identical to, or substantially indistinguishable from, the trademark of another.
Endangered or threatened species
We do not allow content that promotes the sale of products obtained from endangered or threatened species.
Examples: Sale of tigers, shark fins, elephant ivory, tiger skins, rhino horn, dolphin oil
Dangerous or derogatory content
We do not allow content that:
- incites hatred against, promotes discrimination of or disparages an individual or group on the basis of their race or ethnic origin, religion, disability, age, nationality, veteran status, sexual orientation, gender, gender identity or other characteristic that is associated with systemic discrimination or marginalisation. Examples: Promoting hate groups or hate group paraphernalia, encouraging others to believe that a person or group is inhuman, inferior or worthy of being hated
- harasses, intimidates or bullies an individual or group of individuals. Examples: Singling out someone for abuse or harassment, suggesting a tragic event did not happen or that victims or their families are actors or complicit in a cover-up of the event
- threatens or advocates physical or mental harm to oneself or others. Examples: Content advocating suicide, anorexia or other self-harm; threatening someone with real-life harm or calling for the attack of another person; promoting, glorifying or condoning violence against others; content made by or in support of terrorist groups or transnational drug trafficking organisations, or content that promotes terrorist acts, including recruitment, or that celebrates attacks by transnational drug trafficking or terrorist organisations
- exploits others through extortion. Examples: Predatory removals, revenge porn, blackmail
Enables dishonest behaviour
We do not allow content that:
- helps users to mislead others. Examples: Creating fake or false documents such as passports, diplomas or accreditation; sale or distribution of term papers, paper-writing or exam-taking services; information or products for passing drug tests
- promotes any form of hacking or cracking, and/or provides users with instructions, equipment or software that tampers with or provides unauthorised access to devices, software, servers or websites. Examples: Pages or products that enable illegal access of mobile phones and other communications or content delivery systems or devices; products or services that bypass copyright protection, including circumvention of digital rights management technologies; products that illegally descramble cable or satellite signals in order to get free services; pages that assist or enable users to download streaming videos if prohibited by the content provider
Misrepresentative content
We do not allow content that:
- promotes content, products or services using false, dishonest or deceptive claims. Example: 'Get Rich Quick' schemes
- falsely implies having an affiliation with, or endorsement by, another individual, organisation, product or service. Examples: Impersonating Google products, misusing company logos
Malicious or unwanted software
We do not allow content that:
- contains malicious software or 'malware' that may harm or gain unauthorised access to a computer, device or network. Examples: Computer viruses, ransomware, worms, trojan horses, rootkits, keyloggers, diallers, spyware, rogue security software and other malicious programs or apps
- violates Google's Unwanted Software policy. Examples: Failure to be transparent about the functionality that the software provides or the full implications of installing the software; failing to include Terms of Service or an End User Licence Agreement; bundling software or applications without the user's knowledge; making system changes without the user's consent; making it difficult for users to disable or uninstall the software; failing to properly use publicly available Google APIs when interacting with Google services or products
Sexually explicit content
We do not allow content that:
- includes graphic sexual text, images, audio, video or games. Examples: Sex acts such as genital, anal and/or oral sex; masturbation; cartoon porn or hentai; graphic nudity
- contains non-consensual sexual themes, whether simulated or real. Examples: Rape, incest, bestiality, necrophilia, snuff, lolita or teen-themed pornography, underage dating

Compensated sexual acts
We do not allow content that may be interpreted as promoting a sexual act in exchange for compensation. Examples: Prostitution, companionship and escort services, intimate massage, cuddling sites
Mail order brides
We do not allow content that facilitates marriage to a foreigner. Examples: Mail order brides, international marriage brokers, romance tours
Adult themes in family content
We do not allow content that is made to appear appropriate for a family audience but contains adult themes, including sex, violence or other depictions of children or popular children's characters, that are unsuitable for a general audience.
Child sexual abuse imagery and paedophilia
We do not allow content that sexually exploits or abuses children, or content that promotes the sexual exploitation or abuse of children.
Examples: Child sexual abuse imagery or other content that visually depicts, encourages or promotes sexual attraction by adults towards minors
Google absolutely prohibits monetisation of content related to child sexual abuse imagery or paedophilia. Google has always been at the forefront of the fight against online child abuse and an avid supporter of family safety online. Under United States federal law, child sexual abuse imagery is defined as a visual depiction of a minor (i.e. under 18) engaged in a sexual act such as intercourse, oral sex or masturbation, as well as lascivious depictions of the genitals (covered or uncovered). This definition extends to photographs, videos, cartoons, drawings, paintings and sculptures, and the image can involve a real child or a computer-generated, morphed, composite or otherwise altered image that appears to be a child (think 'Photoshop'). The prohibition also covers soliciting minors for sexual acts, known as 'enticement'. Paedophilia is any content or behaviour (images, text, videos, etc.) that depicts, encourages or promotes sexual attraction by adults towards minors (i.e. under 18).