
NSFW-Sense

The ‘NSFW’ (Not Safe For Work) model returns probability scores for the likelihood that an image contains nudity. It is useful for anyone who needs to automatically moderate or filter offensive content on their platform.

REQUEST
You can call the Predict API with the 'NSFW' model by passing in an image input, either as a publicly accessible URL or by sending the image bytes directly.
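A minimal sketch of such a request against the Clarifai v2 REST Predict endpoint, using the `requests` library. `CLARIFAI_API_KEY` and `NSFW_MODEL_ID` are placeholders: look up your own key and the NSFW model's ID in the Clarifai portal or documentation before running this.

```python
import base64
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"   # placeholder: your Clarifai API key
NSFW_MODEL_ID = "NSFW_MODEL_ID"     # placeholder: the NSFW model's ID from the Clarifai docs

PREDICT_URL = f"https://api.clarifai.com/v2/models/{NSFW_MODEL_ID}/outputs"
HEADERS = {"Authorization": f"Key {CLARIFAI_API_KEY}"}


def predict_by_url(image_url):
    """Send a publicly accessible image URL to the Predict endpoint."""
    resp = requests.post(
        PREDICT_URL,
        headers=HEADERS,
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
    )
    resp.raise_for_status()
    return resp.json()


def predict_by_bytes(path):
    """Send raw image bytes instead of a URL (base64-encoded, per the v2 API)."""
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("ascii")
    resp = requests.post(
        PREDICT_URL,
        headers=HEADERS,
        json={"inputs": [{"data": {"image": {"base64": b64}}}]},
    )
    resp.raise_for_status()
    return resp.json()
```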

RESPONSE
The Predict API returns probabilities for nsfw (Not Safe For Work) and sfw (Safe For Work) that sum to 1.0. Generally, if the nsfw probability is less than 0.15, the image is most likely Safe For Work; if the nsfw probability is greater than 0.85, it is most likely Not Safe For Work.
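A sketch of reading those probabilities and applying the 0.15 / 0.85 rule of thumb above. The response layout (`outputs` → `data` → `concepts`) follows the Clarifai v2 JSON shape; the `example_response` below is illustrative only, so verify the exact structure against the Clarifai documentation.

```python
def extract_scores(response):
    """Return a {concept name: probability} dict from a Predict response."""
    concepts = response["outputs"][0]["data"]["concepts"]
    return {c["name"]: c["value"] for c in concepts}


def classify(nsfw_prob, low=0.15, high=0.85):
    """Map the nsfw probability onto the thresholds described above."""
    if nsfw_prob < low:
        return "sfw"        # most likely Safe For Work
    if nsfw_prob > high:
        return "nsfw"       # most likely Not Safe For Work
    return "uncertain"      # in between: consider human review


example_response = {  # illustrative shape only
    "outputs": [{"data": {"concepts": [
        {"name": "nsfw", "value": 0.07},
        {"name": "sfw", "value": 0.93},
    ]}}]
}

scores = extract_scores(example_response)
print(scores, classify(scores["nsfw"]))  # -> {'nsfw': 0.07, 'sfw': 0.93} sfw
```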

API USED
This project uses the Clarifai API.
