
Detect nudity and adult content with granular classification: mild, partial, explicit. AI-powered NSFW moderation with 94% accuracy for platforms.
The Bynn Nudity & Adult Content Detector analyzes images to identify nudity levels and sexual activity content using advanced AI vision analysis. This model provides granular classification enabling platforms to enforce nuanced content policies.
Platforms face a complex balancing act with adult content. Too restrictive, and legitimate artistic, educational, or health-related content gets wrongly censored. Too permissive, and users—including minors—are exposed to inappropriate material, advertisers flee, and regulatory penalties loom.
Simple binary "nude or not" classifiers fail in practice. A swimwear photo at the beach is appropriate; the same exposure in a sexualized context is not. Artistic Renaissance paintings contain nudity but serve educational purposes. Medical imagery requires clinical treatment. Platforms need nuanced understanding that considers both what is shown and how it is presented—distinguishing revealing clothing from explicit nudity, suggestive posing from explicit acts.
In physical security contexts, nudity detection enables monitoring of sensitive environments. CCTV systems in schools, workplaces, or public facilities can detect inappropriate exposure or indecent behavior. Hospitality and entertainment venues can monitor for policy violations. The same detection that protects online users can ensure safety and compliance in physical spaces.
When provided with an image, the detector classifies content across two primary dimensions: nudity level and sexual activity level. Unlike simple binary classifiers, this model provides detailed analysis with specific indicators of what was detected, enabling more nuanced moderation decisions.
Achieving 94.0% accuracy, the model uses Bynn's Visual Language Model technology trained on document forensics and physical understanding to perform contextual visual reasoning rather than simple pattern matching.
The model employs sophisticated visual reasoning to analyze images holistically.
The API returns a structured response containing:
| Nudity Level | Description |
|---|---|
| no_nudity | Regular clothing; no visible bare breasts, nipples, genitals, or buttocks |
| mild_nudity | Revealing content: underwear, lingerie, swimwear, see-through clothing without visible nipples/genitals, shirtless (typical male), cleavage, or suggestive clothing without explicit nudity |
| nudity | Non-genital nudity: visible nipples/areola, bare buttocks, or nude body where genitals are NOT visible (occluded by pose, blur, object, or cropping). Includes non-sexual contexts like art or medical imagery |
| explicit_nudity_nsfw | Explicit nudity: visible genitals (vulva/penis/testicles) or clear close-up of genital region |
| Sexual Activity Level | Description |
|---|---|
| none | No sexual act implied |
| suggestive | Overtly sexual posing, focus on intimate areas, sexualized context, sex toys present without action |
| implied | Implied or simulated act: groping, explicit intimate contact through clothing, simulated sex act without visible genitals/penetration |
| explicit | Explicit act: visible oral sex, masturbation, penetration, or clearly explicit sex act |
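As one illustration of how the two classification dimensions can drive enforcement, the sketch below maps them to a moderation action. The thresholds and action names are hypothetical, not part of the API; each platform should set its own policy.

```python
# Hypothetical policy mapping: ranks are illustrative, not part of the API.
NUDITY_RANK = {"no_nudity": 0, "mild_nudity": 1, "nudity": 2, "explicit_nudity_nsfw": 3}
ACTIVITY_RANK = {"none": 0, "suggestive": 1, "implied": 2, "explicit": 3}

def moderation_action(nudity_level: str, sexual_activity_level: str) -> str:
    """Combine both detector dimensions into a single moderation action."""
    n = NUDITY_RANK[nudity_level]
    a = ACTIVITY_RANK[sexual_activity_level]
    if n >= 3 or a >= 2:       # explicit nudity, or implied/explicit acts
        return "block"
    if n == 2 or a == 1:       # non-genital nudity or suggestive content:
        return "human_review"  # may be artistic/medical, so route to a person
    return "allow"
```

Routing the middle band to human review reflects the point above: non-genital nudity may be artistic or medical, so automated blocking there would over-censor.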
| Metric | Value |
|---|---|
| Classification Accuracy | 94.0% |
| Average Response Time | 15,000ms |
| Max File Size | 20MB |
| Supported Formats | GIF, JPEG, JPG, PNG, WebP |
Important Considerations:
This model provides probability-based classifications, not definitive content judgments.
Best Practice: Combine detection results with human review and platform-specific policy guidelines for optimal content moderation outcomes.
Vision Language Model for image/video understanding with reasoning
Request Parameters:

- `media_type` (string): Type of media being sent: `'image'` or `'video'`. Auto-detected if not specified.
- `image_url` (string): URL of an image to analyze, e.g. `https://example.com/image.jpg`
- `base64_image` (string): Base64-encoded image data
- `video_url` (string): URL of a video to analyze, e.g. `https://example.com/video.mp4`
- `base64_video` (string): Base64-encoded video data
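Since exactly one media field makes sense per request, a small helper can validate the body before sending. This is a sketch; only the field names documented above and the `"model"` value from the example request are assumed.

```python
def build_request(image_url=None, base64_image=None,
                  video_url=None, base64_video=None, media_type=None):
    """Assemble a detector request body, enforcing exactly one media field.

    `media_type` is optional: the API auto-detects it when omitted.
    """
    media = {
        "image_url": image_url,
        "base64_image": base64_image,
        "video_url": video_url,
        "base64_video": base64_video,
    }
    provided = {k: v for k, v in media.items() if v is not None}
    if len(provided) != 1:
        raise ValueError(
            "provide exactly one of image_url, base64_image, "
            "video_url, base64_video"
        )
    body = {"model": "nudity-detection", **provided}
    if media_type is not None:
        body["media_type"] = media_type
    return body
```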
Structured Nudity & Adult Content Detector response
- `response` (object): Structured response from the model
  - `nudity_level` (string): one of `no_nudity`, `mild_nudity`, `nudity`, `explicit_nudity_nsfw`
  - `visible_elements` (object): boolean flags `buttocks_visible`, `genitals_visible`, `see_through_clothing`, `nipples_or_areola_visible`, `underwear_or_swimwear_visible`
  - `sexual_activity_level` (string): one of `none`, `suggestive`, `implied`, `explicit`
- `thinking` (string): Chain-of-thought reasoning from the model (may be empty)
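Reading the classification fields out of a response of this shape might look like the sketch below. Only the keys documented here are assumed; the `needs_review` heuristic is illustrative.

```python
def summarize_result(api_response: dict) -> dict:
    """Extract the classification fields from a completed detector response."""
    result = api_response["result"]["response"]
    visible = result["visible_elements"]
    return {
        "nudity_level": result["nudity_level"],
        "sexual_activity_level": result["sexual_activity_level"],
        # Any True flag names a specific element the model detected.
        "flagged_elements": [name for name, seen in visible.items() if seen],
        # Illustrative heuristic: anything beyond the lowest level of
        # either dimension is worth a closer look.
        "needs_review": (result["nudity_level"] != "no_nudity"
                         or result["sexual_activity_level"] != "none"),
    }
```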
Example request:

```json
{
  "model": "nudity-detection",
  "image_url": "https://example.com/image.jpg"
}
```

Example response:

```json
{
  "inference_id": "inf_abc123def456",
  "model_id": "nudity_detection",
  "model_name": "Nudity & Adult Content Detector",
  "moderation_type": "image",
  "status": "completed",
  "result": {
    "response": {
      "nudity_level": "no_nudity",
      "visible_elements": {
        "buttocks_visible": false,
        "genitals_visible": false,
        "see_through_clothing": false,
        "nipples_or_areola_visible": false,
        "underwear_or_swimwear_visible": false
      },
      "sexual_activity_level": "none"
    },
    "thinking": ""
  }
}
```

If you exceed your rate limit, the API returns a 429 HTTP error code along with an error message. You should then retry with an exponential back-off strategy: wait 4 seconds, then 8 seconds, then 16 seconds, and so on.

Integrate the Nudity & Adult Content Detector into your application today with our easy-to-use API.
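The back-off schedule described above (4s, 8s, 16s, ...) can be sketched as a generic retry helper. The `call` parameter stands in for your actual HTTP request, and responses are modeled as dicts with a `status_code` key; both are assumptions for illustration.

```python
import time

def call_with_backoff(call, max_retries=5, base_delay=4.0, sleep=time.sleep):
    """Retry `call` on HTTP 429, doubling the wait each time: 4s, 8s, 16s, ...

    `call` is any zero-argument function returning a response-like dict
    with a "status_code" key; `sleep` is injectable for testing.
    """
    delay = base_delay
    for _ in range(max_retries):
        response = call()
        if response["status_code"] != 429:  # success or non-rate-limit error
            return response
        sleep(delay)
        delay *= 2  # exponential back-off
    return call()  # final attempt after exhausting retries
```

Injecting `sleep` keeps the helper testable and lets callers cap the total wait if needed.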