Learn how to implement a nudity detector using PHP in Symfony 3.

Public applications often fall victim to troublemakers who think it's funny to upload adult content to a website that is meant for the entire family. Many companies solve this problem by hiring people to verify whether uploaded images contain obscene content. That is an acceptable solution for companies with resources; however, startups usually can't afford such personnel. That's why today we want to share with you a handy PHP library that detects (mostly, not 100% accurately) whether an image contains nudity. We'll also show you how to implement it in your Symfony 3 project.

Important

The nudity detection algorithm implemented by the library is not bulletproof, unlike many commercial services out there. Unless you use a deep learning or computer vision framework, some pictures may be misclassified as safe (false negatives). The false positive rate of this implementation is low, though, so the script is useful for detecting clearly pornographic or nude images.

If you really want a bulletproof implementation, you may want to use a third-party service (an API) based on neural networks. We have a collection of 5 of the best APIs that you can use for this goal.

1. Install Nudity Detector

To detect whether an image contains nudity, we'll use the Nudity Detector PHP Script. Nudity Detector is a library published on Packagist and installable with Composer. The library was originally written by the guys at FreebieStock, who devised the underlying algorithm (explanation and technical details here) and ported it into PHP.

To install this library with Composer, run the following command:

composer require necrox87/yii2-nudity-detector "dev-master"

As you can see, the package published on Packagist doesn't handle versioning, as it is just a copy of the original PHP project from this repository; that's why it is required at dev-master. For more information about the package available on Packagist, visit its repository here.
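For reference, after running the command your project's composer.json should contain a require entry like the following (the rest of the file will vary per project):

{
    "require": {
        "necrox87/yii2-nudity-detector": "dev-master"
    }
}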

2. Using the Nudity Detector

As mentioned, this library is pretty easy to use, both in controllers and in Symfony services; you just need to import the NudityDetector class from its namespace and that's it. Create an instance of the detector class, passing as the first argument the path to the image file that you want to analyze, then call the isPorn method on the instance to determine whether the image contains obscene content. Here's a controller example (a service-based sketch follows right after it):

<?php

namespace AppBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\Request;

// Reference to the NudityDetector class
use necrox87\NudityDetector\NudityDetector;

class DefaultController extends Controller
{
    public function homepageAction(Request $request)
    {
        // Path to the image file that you want to analyze
        $imagePath = $this->get('kernel')->getRootDir() . '/../web/images/nude6.jpg';

        // Create an instance of the NudityDetector
        $nudityChecker = new NudityDetector($imagePath);

        // The isPorn method verifies whether the provided image contains
        // obscene content and returns a boolean accordingly
        if ($nudityChecker->isPorn()) {
            // The image has adult content
        } else {
            // The image is suitable for the whole family
        }

        // Rest of your code ... (a controller action must return a Response)
        return new Response('Image analyzed.');
    }
}
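Since the NudityDetector class has no framework dependencies, you can also wrap it in a small Symfony service so the check is reusable across controllers. Below is a minimal sketch; the ImageModerator class, its namespace and the isSafe method are our own naming choices, not part of the library:

<?php

namespace AppBundle\Service;

use necrox87\NudityDetector\NudityDetector;

// Hypothetical wrapper around the library; only NudityDetector
// and its isPorn method come from the package itself
class ImageModerator
{
    /**
     * Returns true when the given image is considered safe.
     *
     * @param string $imagePath Absolute path to the image file
     * @return bool
     */
    public function isSafe($imagePath)
    {
        $detector = new NudityDetector($imagePath);

        // isPorn returns true for images flagged as obscene
        return !$detector->isPorn();
    }
}

Register the class as a service in app/config/services.yml and you can inject it wherever image uploads are handled.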

As mentioned at the beginning of the article, the algorithm may fail in some cases; however, with images that clearly contain obscene content, the filter will be applied. If you want a more reliable solution, we recommend that you check out the third-party APIs mentioned above.
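In practice, you will usually run this check on files that users upload, before persisting them. Here is a minimal sketch, assuming a form with a file input named image (the field name, controller and target directory are our own examples):

<?php

namespace AppBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use necrox87\NudityDetector\NudityDetector;

class UploadController extends Controller
{
    public function uploadAction(Request $request)
    {
        // Retrieve the uploaded file from a file input named "image"
        $file = $request->files->get('image');

        if ($file === null) {
            return new Response('No image was uploaded.', 400);
        }

        // Analyze the temporary file before moving it anywhere
        $detector = new NudityDetector($file->getPathname());

        if ($detector->isPorn()) {
            return new Response('This image is not allowed.', 403);
        }

        // The image looks safe: store it under a generated name
        $fileName = uniqid() . '.' . $file->guessExtension();
        $file->move($this->get('kernel')->getRootDir() . '/../web/images', $fileName);

        return new Response('Image uploaded successfully.');
    }
}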

Happy coding!


