This algorithm detects Child Sexual Abuse Material (CSAM). For general information about the capabilities and limits of all algorithms, see the generic “Content Moderation” method.
What is Child Sexual Abuse Material (CSAM) detection?
This method is intended to prevent this type of content from being distributed over the Internet.
For child pornography detection, we first run the “soft_nudity” and “hard_nudity” tasks. If both methods indicate the presence of obscene content together with a child’s face in a frame, the video is marked as obscene. Frames are labelled with the age category of the identified children.
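To make this decision rule concrete, below is a minimal Python sketch. It assumes the per-frame outputs of the “soft_nudity” and “hard_nudity” tasks and the detected child faces are already available as simple data structures; the function name and data shapes are illustrative assumptions, not part of the API or of the service’s internal implementation.

# Illustrative sketch of the decision rule: a frame is flagged only when
# both nudity tasks report obscene content in it and a child's face was
# identified in the same frame. Data shapes are assumptions for illustration.

def mark_csam_frames(soft_nudity_frames: set, hard_nudity_frames: set, child_faces: list) -> dict:
    """child_faces items look like {"frame_number": int, "age": "3-9", ...}."""
    flagged = [
        face for face in child_faces
        if face["frame_number"] in soft_nudity_frames
        and face["frame_number"] in hard_nudity_frames
    ]
    return {
        "child_pornography_detected": bool(flagged),
        "detection_results": sorted({face["age"] for face in flagged}),
        "frames": flagged,
    }

# Example: frame 407 contains obscene content (both tasks) and a child's face.
result = mark_csam_frames(
    soft_nudity_frames={407, 913},
    hard_nudity_frames={407},
    child_faces=[{"frame_number": 407, "label": "FACE_FEMALE", "age": "3-9"}],
)
print(result["child_pornography_detected"])  # True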
How to use?
For each detection, the response contains the number of the video frame in which the child’s face was found and the estimated age range.
Nudity detection is performed using AI, so each detected object is assigned a probability; objects with a probability of at least 30% are included in the response.
Video processing speed is approximately 1:10.
Example of a response with detected nudity:
{
  "child_pornography_detected": true,
  "detection_results": ["3-9"],
  "frames": [
    {
      "frame_number": 407,
      "label": "FACE_FEMALE",
      "confidence": 0.78,
      "age": "3-9",
      "age_confidence": 0.65
    },
    ...
  ]
}
Example of a response when no such content is found (the arrays are empty):
{
  "child_pornography_detected": false,
  "detection_results": [],
  "frames": []
}
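For illustration, here is a minimal Python sketch of how a client could read a result in the format shown above. The field names come from the examples; everything else (the function name, the printed output) is an assumption for illustration.

import json

def summarize_result(raw_response: str) -> None:
    # Parse a content-moderation result in the format shown above.
    result = json.loads(raw_response)

    if not result.get("child_pornography_detected"):
        print("No child sexual abuse material detected.")
        return

    print("Detected age categories:", ", ".join(result.get("detection_results", [])))
    for frame in result.get("frames", []):
        # Objects below 30% probability are already filtered out by the service.
        print(
            f"frame {frame['frame_number']}: {frame['label']} "
            f"(confidence {frame['confidence']:.2f}), "
            f"age {frame['age']} (confidence {frame['age_confidence']:.2f})"
        )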
Request parameters:

- Authorization header: API key for authentication. Make sure to include the word apikey, followed by a single space and then your token. Example: apikey 1234$abcdef
- Task name: name of the task to be performed. Value: content-moderation
- Video URL: URL to the MP4 file to analyse. The file must be publicly accessible via HTTP/HTTPS.
- Algorithm category: AI content moderation with the child pornography detection algorithm. Value: child_pornography
- Meta parameter, user identifier (maximum length 256): designed to store your own identifier. Can be used by you to tag requests from different end-users. It is not used in any way in video processing.
- Meta parameter, entity data (maximum length 4096): designed to store your own extra information about a video entity: video source, video ID, etc. It is not used in any way in video processing. For example, if an AI task was created automatically when you uploaded a video with the AI auto-processing option (nudity detection, etc.), the ID of the associated video for which the task was performed is indicated here.

Response:

- Returns the ID of the created AI task, from which you can get the execution result. Using this AI task ID, you can check the status and get the video processing result; see the GET /ai/results method.
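As a usage illustration, below is a minimal Python sketch that creates such a task and polls for the result. The GET /ai/results path and the Authorization header format come from the description above; the base URL, the task-creation path, the JSON field names (task_name, category, url, task_id) and the status values are assumptions for illustration only and should be checked against the API reference.

import time
import requests

API_BASE = "https://api.example.com/streaming"   # assumed base URL, replace with the real one
API_KEY = "1234$abcdef"                          # your API key token

HEADERS = {"Authorization": f"apikey {API_KEY}"}  # the word apikey, a single space, then the token


def create_csam_task(video_url: str) -> str:
    # Create a content-moderation task with the child_pornography algorithm.
    # The endpoint path and body field names are assumptions for illustration.
    body = {
        "task_name": "content-moderation",   # name of the task to be performed
        "category": "child_pornography",     # CSAM detection algorithm
        "url": video_url,                    # publicly accessible MP4 over HTTP/HTTPS
    }
    resp = requests.post(f"{API_BASE}/ai/tasks", json=body, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["task_id"]            # assumed field holding the created AI task ID


def wait_for_result(task_id: str, poll_seconds: int = 30) -> dict:
    # Poll GET /ai/results until the task finishes; status values are assumed.
    while True:
        resp = requests.get(f"{API_BASE}/ai/results", params={"task_id": task_id},
                            headers=HEADERS, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        if data.get("status") in ("SUCCESS", "FAILURE"):
            return data
        time.sleep(poll_seconds)


if __name__ == "__main__":
    task_id = create_csam_task("https://example.com/video.mp4")
    print(wait_for_result(task_id))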