
Nudity & Adult Content Detector

Detect nudity and adult content with granular classification: mild, partial, explicit. AI-powered NSFW moderation with 94% accuracy for platforms.

Accuracy: 94%
Avg. Speed: 15.0s
Per Request: $0.0075
API Name: nudity-detection

Bynn Nudity & Adult Content Detector

The Bynn Nudity & Adult Content Detector analyzes images to identify nudity levels and sexual activity content using advanced AI vision analysis. This model provides granular classification enabling platforms to enforce nuanced content policies.

The Challenge

Platforms face a complex balancing act with adult content. Too restrictive, and legitimate artistic, educational, or health-related content gets wrongly censored. Too permissive, and users—including minors—are exposed to inappropriate material, advertisers flee, and regulatory penalties loom.

Simple binary "nude or not" classifiers fail in practice. A swimwear photo at the beach is appropriate; the same exposure in a sexualized context is not. Artistic Renaissance paintings contain nudity but serve educational purposes. Medical imagery requires clinical treatment. Platforms need nuanced understanding that considers both what is shown and how it is presented—distinguishing revealing clothing from explicit nudity, suggestive posing from explicit acts.

In physical security contexts, nudity detection enables monitoring of sensitive environments. CCTV systems in schools, workplaces, or public facilities can detect inappropriate exposure or indecent behavior. Hospitality and entertainment venues can monitor for policy violations. The same detection that protects online users can ensure safety and compliance in physical spaces.

Model Overview

When provided with an image, the detector classifies content across two primary dimensions: nudity level and sexual activity level. Unlike simple binary classifiers, this model provides detailed analysis with specific indicators of what was detected, enabling more nuanced moderation decisions.

Achieving 94.0% accuracy, the model uses Bynn's Visual Language Model technology trained on document forensics and physical understanding to perform contextual visual reasoning rather than simple pattern matching.

How It Works

The model employs sophisticated visual reasoning to analyze images holistically:

  • Contextual analysis: Evaluates the overall scene context, not just isolated body parts
  • Evidence-based classification: Only classifies based on clearly visible content, avoiding speculation
  • Conservative uncertainty handling: When uncertain, defaults to less explicit classifications with lower confidence
  • Multi-person handling: Reports the most explicit visible content when multiple people are present
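
The multi-person rule above can be sketched as a maximum over an ordered severity scale. This is an illustrative Python helper, not part of any official SDK; the level names come from the Classification Levels section below:

```python
# Severity order taken from the Classification Levels section of these docs.
NUDITY_ORDER = ["no_nudity", "mild_nudity", "nudity", "explicit_nudity_nsfw"]

def most_explicit(levels):
    """Mirror the multi-person rule: report the most explicit level present."""
    return max(levels, key=NUDITY_ORDER.index)

most_explicit(["no_nudity", "mild_nudity", "no_nudity"])  # -> "mild_nudity"
```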

Response Structure

The API returns a structured response containing:

  • nudity_level: Classification of nudity present in the image
  • sexual_activity_level: Classification of sexual activity or suggestive content
  • visible_elements: Object with boolean flags indicating specific detected elements

Visible Elements Breakdown

  • nipples_or_areola_visible: Whether nipples or areola are clearly visible
  • buttocks_visible: Whether bare buttocks are visible
  • genitals_visible: Whether genitals are visible
  • underwear_or_swimwear_visible: Whether underwear, lingerie, or swimwear is visible
  • see_through_clothing: Whether see-through clothing is present
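
As a sketch, the boolean flags can be collapsed into a list of triggered elements, e.g. to annotate an item in a moderation queue. The field names come from the breakdown above; the helper itself is illustrative:

```python
def triggered_elements(visible_elements):
    """Return the names of all flags set to true, for logging or queue annotation."""
    return sorted(name for name, flagged in visible_elements.items() if flagged)

# Example visible_elements object, shaped as in the API response.
elements = {
    "nipples_or_areola_visible": False,
    "buttocks_visible": False,
    "genitals_visible": False,
    "underwear_or_swimwear_visible": True,
    "see_through_clothing": False,
}
triggered_elements(elements)  # -> ["underwear_or_swimwear_visible"]
```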

Classification Levels

Nudity Level (0-3)

  • no_nudity: Regular clothing; no visible bare breasts, nipples, genitals, or buttocks
  • mild_nudity: Revealing content: underwear, lingerie, swimwear, see-through clothing without visible nipples/genitals, shirtless (typical male), cleavage, or suggestive clothing without explicit nudity
  • nudity: Non-genital nudity: visible nipples/areola, bare buttocks, or a nude body where genitals are NOT visible (occluded by pose, blur, object, or cropping). Includes non-sexual contexts such as art or medical imagery
  • explicit_nudity_nsfw: Explicit nudity: visible genitals (vulva/penis/testicles) or a clear close-up of the genital region

Sexual Activity Level (0-3)

  • none: No sexual act implied
  • suggestive: Overtly sexual posing, focus on intimate areas, sexualized context, or sex toys present without action
  • implied: Implied or simulated act: groping, explicit intimate contact through clothing, or a simulated sex act without visible genitals/penetration
  • explicit: Explicit act: visible oral sex, masturbation, penetration, or a clearly explicit sex act
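
The two scales can be combined into a single moderation action. The thresholds below are hypothetical; map the levels to your own platform policy rather than treating this as a recommended cutoff:

```python
# Ranks follow the 0-3 ordering of the two classification scales above.
NUDITY_RANK = {"no_nudity": 0, "mild_nudity": 1, "nudity": 2, "explicit_nudity_nsfw": 3}
ACTIVITY_RANK = {"none": 0, "suggestive": 1, "implied": 2, "explicit": 3}

def moderation_action(nudity_level, activity_level):
    """Hypothetical policy: block explicit content, age-gate the middle band."""
    if NUDITY_RANK[nudity_level] == 3 or ACTIVITY_RANK[activity_level] >= 2:
        return "block"
    if NUDITY_RANK[nudity_level] == 2 or ACTIVITY_RANK[activity_level] == 1:
        return "age_gate"
    return "allow"
```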

Performance Metrics

  • Classification Accuracy: 94.0%
  • Average Response Time: 15,000 ms
  • Max File Size: 20 MB
  • Supported Formats: GIF, JPEG, JPG, PNG, WebP

Use Cases

  • Platform Content Moderation: Enforce community guidelines with nuanced nudity policies that distinguish between artistic, revealing, and explicit content
  • Age-Gated Content: Automatically categorize content for appropriate age restrictions
  • Advertising Compliance: Ensure ad-served pages meet advertiser brand safety requirements
  • Dating Platforms: Apply tiered moderation policies based on platform guidelines
  • E-commerce: Filter inappropriate product images from marketplaces
  • Educational/Medical Platforms: Distinguish clinical or educational nudity from explicit content

Known Limitations

Important Considerations:

  • Artistic Context: Cannot distinguish intent (art vs. pornography) beyond visible content
  • Cultural Variations: What constitutes "revealing" varies by culture; model uses Western conventions as baseline
  • Clothing Ambiguity: Very tight or form-fitting clothing may be classified differently than intended
  • Image Quality: Low resolution, heavy blur, or extreme cropping may affect classification accuracy
  • Animated/Illustrated Content: Performance on cartoon, anime, or illustrated content may differ from photographic content

Disclaimers

This model provides probability-based classifications, not definitive content judgments.

  • Screening Tool: Use as part of a broader content moderation strategy, not as the sole decision factor
  • Human Review Recommended: Edge cases and borderline content should be reviewed by trained moderators
  • Policy-Dependent: Classification results should be mapped to platform-specific policies; what's acceptable varies by platform
  • Context Matters: The same level of nudity may be appropriate in different contexts (medical, artistic, adult platform)
  • Regular Updates: Content trends evolve; model effectiveness should be periodically validated

Best Practice: Combine detection results with human review and platform-specific policy guidelines for optimal content moderation outcomes.

API Reference

Version: 2601 (Jan 3, 2026)
Avg. Processing: 15.0s
Per Request: $0.0075
Required Plan: trial

Input Parameters

Vision Language Model for image/video understanding with reasoning

  • media_type (string): Type of media being sent: 'image' or 'video'. Auto-detected if not specified. Example: image
  • image_url (string): URL of the image to analyze. Example: https://example.com/image.jpg
  • base64_image (string): Base64-encoded image data
  • video_url (string): URL of the video to analyze. Example: https://example.com/video.mp4
  • base64_video (string): Base64-encoded video data
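
For images that are not hosted at a URL, a request body can be built with the base64_image parameter instead. The `build_payload` helper below is illustrative, not part of an official SDK; the endpoint, authentication, and HTTP client are left to your integration:

```python
import base64

def build_payload(image_bytes: bytes, media_type: str = "image") -> dict:
    """Build a request body using base64_image (see Input Parameters above)."""
    return {
        "model": "nudity-detection",
        "media_type": media_type,
        "base64_image": base64.b64encode(image_bytes).decode("ascii"),
    }

# Stand-in bytes; in practice, read the actual file contents.
payload = build_payload(b"\x89PNG\r\n\x1a\n")
```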

Response Fields

Structured Nudity & Adult Content Detector response

  • response (object): Structured response from the model
      • nudity_level (string): Possible values: no_nudity, mild_nudity, nudity, explicit_nudity_nsfw
      • visible_elements (object): Boolean flags: buttocks_visible, genitals_visible, see_through_clothing, nipples_or_areola_visible, underwear_or_swimwear_visible
      • sexual_activity_level (string): Possible values: none, suggestive, implied, explicit
  • thinking (string): Chain-of-thought reasoning from the model (may be empty)
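
The response object maps naturally onto typed structures. A sketch using Python TypedDicts, with field names taken from the list above (these class names are illustrative, not part of an official SDK):

```python
from typing import TypedDict

class VisibleElements(TypedDict):
    nipples_or_areola_visible: bool
    buttocks_visible: bool
    genitals_visible: bool
    underwear_or_swimwear_visible: bool
    see_through_clothing: bool

class DetectorResponse(TypedDict):
    nudity_level: str            # no_nudity | mild_nudity | nudity | explicit_nudity_nsfw
    sexual_activity_level: str   # none | suggestive | implied | explicit
    visible_elements: VisibleElements
```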

Complete Example

Request

{
  "model": "nudity-detection",
  "image_url": "https://example.com/image.jpg"
}

Response

{
  "inference_id": "inf_abc123def456",
  "model_id": "nudity_detection",
  "model_name": "Nudity & Adult Content Detector",
  "moderation_type": "image",
  "status": "completed",
  "result": {
    "response": {
      "nudity_level": "no_nudity",
      "visible_elements": {
        "buttocks_visible": false,
        "genitals_visible": false,
        "see_through_clothing": false,
        "nipples_or_areola_visible": false,
        "underwear_or_swimwear_visible": false
      },
      "sexual_activity_level": "none"
    },
    "thinking": ""
  }
}
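
As a sketch of consuming this response in Python, the helper below reduces the structure shown above to a one-line summary. `summarize` is an illustrative helper, not part of any SDK:

```python
import json

def summarize(raw: str) -> str:
    """Reduce a detector response to a one-line moderation summary."""
    data = json.loads(raw)
    if data.get("status") != "completed":
        return "pending"
    inner = data["result"]["response"]
    flags = [k for k, v in inner["visible_elements"].items() if v]
    return f"{inner['nudity_level']}/{inner['sexual_activity_level']} flags={flags}"

# Trimmed version of the example response above.
example = """
{
  "status": "completed",
  "result": {
    "response": {
      "nudity_level": "no_nudity",
      "visible_elements": {
        "buttocks_visible": false,
        "genitals_visible": false,
        "see_through_clothing": false,
        "nipples_or_areola_visible": false,
        "underwear_or_swimwear_visible": false
      },
      "sexual_activity_level": "none"
    },
    "thinking": ""
  }
}
"""
print(summarize(example))  # no_nudity/none flags=[]
```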

Additional Information

Rate Limiting: If we throttle your request, you will receive a 429 HTTP error code along with an error message. Retry with an exponential back-off strategy: wait 4 seconds, then 8 seconds, then 16 seconds, and so on.
Supported Formats: gif, jpeg, jpg, png, webp
Maximum File Size: 20MB
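
The back-off strategy described above can be sketched as follows. The `call` signature (a zero-argument function returning a status code and body) is an illustrative stand-in; adapt it to your HTTP client:

```python
import time

def with_backoff(call, max_retries=5, base_delay=4.0, sleep=time.sleep):
    """Retry `call` while it returns HTTP 429, doubling the wait: 4s, 8s, 16s, ...

    `call` must be a zero-argument function returning (status_code, body).
    `sleep` is injectable so the delay can be stubbed out in tests.
    """
    status, body = call()
    for attempt in range(max_retries):
        if status != 429:
            break
        sleep(base_delay * (2 ** attempt))
        status, body = call()
    return status, body
```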
Tags: nsfw, adult-content, safety, vlm, ai-analysis

Ready to get started?

Integrate Nudity & Adult Content Detector into your application today with our easy-to-use API.