PHP running Asynchronous processes

I have been programming in PHP for quite a while now. Even though at work and for my own projects I mostly use PHP to build websites, I actually started programming in PHP with daemons, and I still continue to do so.

My main development for PHP daemons has been for Maniaplanet controllers like eXpansion.

Particularly for those who don't like PHP, it does sound dumb to use PHP for a daemon, right? Well, it isn't as dumb as it sounds. The advantage of using a scripting language for the controller is that it makes it easily extensible: we ask people to copy-paste files to the right place and, just like that, they have new functionality. (We actually use composer.)

But we are not here to discuss whether using PHP as a daemon is a good idea or not.

The main thing PHP lacks is threading. When building a website you don't need it, because you are running one instance of PHP per request anyway, and you don't expect any of those instances to run for more than half a second (theoretically at least; practice is not always so kind...). Threading a process that takes half a second is not such a good idea. If a part of the page is very heavy or needs special treatment, we can always use Javascript or ESI blocks.

But when running daemons, you may well need to run parallel processes.

In my case I needed it for our Maniaplanet server controller. In this situation, one instance of PHP serves hundreds of players on the server. One of those players, an admin, may wish to add a map to the server. He will do so by using the in-game interface and giving the controller a URL from which to download the map.

And there comes the freeeeze. 

While the PHP instance is downloading the file, every other player that needs the controller is left waiting. It looks like the controller has crashed, but it is simply downloading this huge new map. We had a few other cases as well, such as reading all the maps already on the hard disk that are accessible to the server and extracting all the necessary information from the files. That can sometimes be very slow.

So we needed a way to handle these cases. 

pthreads

The best thing would have been to use the pthreads extension. The issue with that is that it's not part of core PHP, and it is not that easy to install. Our solution needs to be easy to install for people that have little to no experience in programming. It's already hard enough to write a tutorial where they install a simple WAMP server.

So we needed another solution, something simpler.

PHP-AsynchronousJobs

It is a little library of mine that does exactly that. It is not fast, and was never intended to be. It is not meant to be used as a replacement for threads. It should be used to execute big jobs asynchronously, not small jobs that execute in less than a second.

How does it work?  

It works using PHP's exec function. It then works in the worst manner possible: by creating files and putting/removing locks on them. Using these files, it can pass data as parameters to the job and send data back to the parent process.

So the overhead is quite significant, and that is why it doesn't replace threads; it is just a practical library for running relatively big jobs asynchronously.
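
To give an idea of the general mechanism, here is a minimal sketch of the exec-plus-lock-file approach. This is not the library's actual code; the worker.php script name, the Unix-style backgrounding and the exact polling logic are just illustrative assumptions.

// Illustrative sketch only, not the library's actual implementation.
// The idea: start a child PHP process with exec() and use a lock file
// to know when it has finished.

$lockFile = sys_get_temp_dir() . '/job-' . uniqid() . '.lock';
touch($lockFile);

// Launch the worker in the background (Unix-style); the worker script
// ("worker.php" is a hypothetical name) removes the lock file when it is done.
exec(sprintf('php worker.php %s > /dev/null 2>&1 &', escapeshellarg($lockFile)));

// The parent keeps running, and can later poll the lock to wait for the child,
// much like waitForAll(1) checks once per second.
while (file_exists($lockFile)) {
    sleep(1);
}
echo "Child process finished\n";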

Besides that, it works pretty well and is very easy to use.

How it should have been done

Of course, in a perfect world it should never have existed, but well. Ideally I should have used redis, or at least added the possibility to use redis in the code. I do intend to add it at some point, or at least add the abstraction layer that would allow it to be added.

I did not add redis for an obvious reason: the users of the tools that rely on this library would never install them if I asked them to install redis as well.
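
For what it's worth, here is a hypothetical sketch of what such an abstraction layer could look like. The interface name and methods below are my own invention and do not exist in the library:

// Hypothetical sketch of the abstraction layer mentioned above;
// this interface does not exist in the library today.
interface JobStorageInterface
{
    /** Store data produced by the child process for a given job. */
    public function write($jobId, $data);

    /** Read back whatever the child process produced, if anything. */
    public function read($jobId);

    /** Tell whether the job's lock has been released. */
    public function isFinished($jobId);
}

// A file-based implementation would keep the current zero-dependency behaviour,
// while a redis-based one could be offered to users who already have redis.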

How to use it. 

It's very easy. 

The library comes with a job that allows making curl requests, so let's look at that:

$curlJob = new Curl();
$curlJob->setMethod('GET');
$curlJob->setUrl('http://jsonplaceholder.typicode.com/posts');

$curlJob->start();
JobRunner::getInstance()->waitForAll(1);

$info = $curlJob->getCurlInfo();
$response = $curlJob->getResponse();

So what are we doing here?

  1. We create a new Curl job,
  2. set the method to use, here GET,
  3. and the URL to use.
  4. Then we start the job.
    • From this point onwards we have 2 processes running.
  5. And we wait for all jobs to finish. We could wait for a single job as well.
    • The 1 here means it is going to check every second whether the lock has been removed, and idle the rest of the time.
  6. Once the jobs are all finished, we get the curl info and the response back from the job (see the example just after this list).
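
For completeness, here is a small follow-up of my own (not one of the library's examples) showing one way the response could be used once the job has finished. It assumes getCurlInfo() returns a curl_getinfo() style array and that the endpoint returns JSON, as jsonplaceholder does:

// The job has finished, so its result is available in the parent process.
$posts = json_decode($response, true);

// 'http_code' assumes getCurlInfo() exposes the usual curl_getinfo() keys.
echo 'Fetched ' . count($posts) . ' posts, HTTP status: ' . $info['http_code'] . "\n";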

Let's create a slightly more interesting execution. 

First let's create a job that sleeps. For that you just need to create a class that extends the Job class.

 

class Sleep extends Job
{
    public $time = 1;

    /**
     * Method called by the new instance to run the job.
     *
     * @return mixed
     */
    public function run()
    {
        sleep($this->time);
    }

    /**
     * Method called by the original instance when the job has finished.
     *
     * @return mixed
     */
    public function end()
    {
        $time = $this->time;
        echo "I end after : $time!";
    }
}

We could have made the time variable protected (not private) and added getters & setters, but we wish to keep things simple (a variant with a setter is sketched just below).
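
For reference, a version with a protected property and a setter might look like the sketch below. This is my own variation, not one of the library's examples, and the rest of this article keeps the simple public property:

class Sleep extends Job
{
    // Protected (not private), as noted above, instead of a public property.
    protected $time = 1;

    // Setter so the caller can configure the job before starting it.
    public function setTime($time)
    {
        $this->time = $time;
        return $this;
    }

    public function run()
    {
        sleep($this->time);
    }

    public function end()
    {
        echo "I end after : {$this->time}!";
    }
}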

Now let's prepare 2 jobs with different sleep times

$job1 = new Sleep();
$job1->time = 3;

and a second faster job

$job2 = new Sleep();
$job2->time = 2;

So after 3 seconds of running, our first job should display I end after : 3!

And after 2 seconds of running, our second job should display I end after : 2!

So let's start both jobs and wait for them to end

$job1->start();
$job2->start();

// And wait for the end
JobRunner::getInstance()->waitForAll(1);

If the jobs ran without the library, we would have to wait 5 seconds to see both messages: 3 seconds for the first and 2 more seconds for the second message. But here we will see both messages within 3 seconds. We will first see the I end after : 2! message, even though that job was started second. Then we will see the I end after : 3! message.

You can find the library here with more examples on how to use it : https://github.com/oliverde8/PHP-AsynchronousJobs
