February 1, 2024
SafeSearch is a technology used within Google Search, YouTube, and Google Images that automatically filters pornography and potentially offensive content. When a thumbnail is likely to contain explicit content, the video may not show up in search results, neither on YouTube nor on Google. The video also has a much lower chance of appearing in YouTube's Suggested Videos section. That means your video will get far fewer views.
Thumbnails are screened for the following categories of explicit content:
- Adult content likelihood: Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities.
- Spoof likelihood: The likelihood that a modification was made to the image’s canonical version to make it appear funny or offensive.
- Medical likelihood: Likelihood that this is a medical image.
- Violence likelihood: Likelihood that this image contains violent content.
- Racy likelihood: Likelihood that the image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas.
For each of these five areas, your thumbnail receives one of five likelihood ratings (see the sketch after this list for how to check them yourself):
- VERY UNLIKELY – It is very unlikely that the image is explicit in this area.
- UNLIKELY – It is unlikely.
- POSSIBLE – It is possible.
- LIKELY – It is likely that the image is explicit.
- VERY LIKELY – It is very likely that the thumbnail features explicit content.
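These five categories and likelihood values match what Google exposes publicly through the Cloud Vision API's SafeSearch detection, so you can score a candidate thumbnail yourself before uploading it. Below is a minimal sketch, assuming the google-cloud-vision Python package is installed and your Google Cloud credentials are configured; the file name "thumbnail.jpg" is just an example:

```python
# Minimal sketch: score a thumbnail with the Cloud Vision API's SafeSearch detection.
# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a valid service-account key.
from google.cloud import vision

def check_thumbnail(path: str) -> None:
    client = vision.ImageAnnotatorClient()

    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    annotation = client.safe_search_detection(image=image).safe_search_annotation

    # Each field is a Likelihood enum: VERY_UNLIKELY .. VERY_LIKELY.
    risky = (
        vision.Likelihood.POSSIBLE,
        vision.Likelihood.LIKELY,
        vision.Likelihood.VERY_LIKELY,
    )

    for area in ("adult", "spoof", "medical", "violence", "racy"):
        likelihood = getattr(annotation, area)
        flag = "  <-- consider a different thumbnail" if likelihood in risky else ""
        print(f"{area:>8}: {likelihood.name}{flag}")

check_thumbnail("thumbnail.jpg")  # example path
```

If any of the five areas prints POSSIBLE or higher, treat that image the same way tubics' analysis does: swap it out and re-check.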
If Google’s SafeSearch technology rates your thumbnail as possibly, likely, or very likely to contain explicit content, you should definitely use another image. By changing your thumbnail, you increase your chances of ranking higher in search results and getting more views for your video. In tubics’ analysis section you can see whether your thumbnail is flagged as explicit, change it, and re-check.