Hi all,

I have an issue with a loop because it involves communication between my server and remote ones. If a remote server has an issue, the loop takes forever to complete and the execution times out.

What it currently is:

$servers = mysql_query("SELECT * FROM server_table WHERE s_enabled = 1 AND s_active = 1;");
$num_rows = mysql_num_rows($servers);

if ($num_rows > 0) {
    while ($server_ip = mysql_fetch_array($servers)) {
        echo($server_ip['server_name'] . ': ');
        communicationQuery::query($server_ip['ip']);
        echo('<br /><br />');
    }
}
echo "All communications completed. Waiting for META refresh...";

The process I want:
1) open process
2) complete the query for ip 1
3) store results
4) close process
then repeat for IP 2, 3, 4 and so on.

Any ideas?

A lot is left vague... what is communicationQuery? Is that a SOAP call to a WSDL, or does it open a socket or a cURL request? It seems like your bottleneck is somewhere in communicationQuery; I'd check the performance of that object. Also, how many servers are you calling? Can you cache the results or requests somehow? Perhaps you can limit the number of requests, or even make the requests asynchronously with AJAX.

Thanks for your reply. It's a function that contacts a gaming server and asks for the number of players online. I've tried everything you've mentioned above - I just want to try and make it run individually for each server IP. There are hundreds of servers.

The code you posted would likely do exactly what you are trying to do; however, wouldn't it be better if you could just gather all the IPs at once and pass them off as one array or struct to the communicationQuery class to run in bulk? Without seeing the documentation of that class, I wouldn't know why you get any issues. Can you take the results and only show a small number of them? What happens when you run just 10 server IPs with a LIMIT 10? Also, do you need the star operator? Wouldn't your SELECT run more efficiently if you were just selecting the IP? Also, have you considered memcache?
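For a quick test, the narrowed query might look something like this (just a rough sketch, reusing the table and column names from your first snippet):

// Sketch only: fetch just the IP column and cap the batch at 10 rows while testing.
$servers = mysql_query("SELECT ip FROM server_table WHERE s_enabled = 1 AND s_active = 1 LIMIT 10;");

while ($row = mysql_fetch_assoc($servers)) {
    communicationQuery::query($row['ip']);
}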

I'll do all of that now and get back to you. memcache?

http://memcached.org/ just for reference. I don't know if your problems are related to hundreds of queries on the database or to remote API calls in communicationQuery, but any caching could benefit you.
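To give a rough idea, wrapping a remote lookup with the Memcache extension looks something like the sketch below. It assumes memcached is running locally on the default port 11211, and query_player_count() is a made-up stand-in for whatever communicationQuery does:

// Sketch only: cache a remote player count for 60 seconds per IP.
$cache = new Memcache();
$cache->connect('127.0.0.1', 11211);

$key   = 'players_' . $ip;              // $ip comes from your servers query
$count = $cache->get($key);

if ($count === false) {
    // Cache miss: ask the game server, then remember the answer for a minute.
    $count = query_player_count($ip);   // hypothetical helper
    $cache->set($key, $count, 0, 60);
}

echo $count;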

Okay so here it is.

The code below is my player counter. It connects to a server and brings back two values: 1) the number of players currently online, and 2) the maximum number of player slots.

Then it stores these two values into SQL against the server so that I can use them anywhere on the website, like 'Player: 40/100'.

Now the problem is that when one server takes far too long to reply, the whole thing drops out. 8-10 servers are updated but the rest are left. What would be great is a method of starting the process, getting one IP, getting the results, storing the results, closing the process, and repeating. At the moment it does them all in one execution. Any ideas?

<meta http-equiv="refresh" content="60">
<?php

if ($_SERVER['SERVER_ADDR'] != '::1') {
    echo "You are not authorised to execute this script";
} else {

    include('../config.php');

    class MinecraftQuery
    {
        public static function query($address, $port, $timeout)
        {
            // fsockopen()'s third and fourth parameters are $errno and $errstr;
            // the connect timeout is the fifth, so it has to be passed there.
            $socket = @fsockopen($address, $port, $errno, $errstr, $timeout);

            if (!$socket) {

                echo "Error.";
                mysql_query("UPDATE servers SET cur_players='0' WHERE ip='".$address."';");

            } else {

                // Apply the timeout to reads too, so one slow server
                // cannot stall the read loop below indefinitely.
                stream_set_timeout($socket, $timeout);

                // 0xFE is the legacy Minecraft server list ping.
                fwrite($socket, chr(254));

                $response = "";

                while (!feof($socket)) {
                    $line = fgets($socket, 1024);
                    $meta = stream_get_meta_data($socket);
                    if ($line === false || $meta['timed_out']) {
                        break; // stop on read error or timeout instead of hanging
                    }
                    $response .= $line;
                }

                // Strip the null bytes and the two-byte header, then split
                // the remaining fields on the section sign (chr 167).
                $response = str_replace(chr(0), "", $response);
                $response = substr($response, 2);
                $query = preg_split("[".chr(167)."]", $response);

                $players = isset($query[1]) ? (int) $query[1] : 0;
                $max     = isset($query[2]) ? (int) $query[2] : 0;

                echo $players."/".$max;
                mysql_query("UPDATE servers SET cur_players=".$players.", max_players=".$max." WHERE ip='".$address."';");

            }

        }
    }

    $servers  = mysql_query("SELECT server_name, port, ip FROM servers WHERE enabled = 1 AND active = 1;");
    $num_rows = mysql_num_rows($servers);

    if ($num_rows > 0) {
        while ($serv = mysql_fetch_array($servers)) {
            echo $serv['server_name'].' : ';
            MinecraftQuery::query($serv['ip'], $serv['port'], 2);
            echo '<br /><br />';
        }
    }
    echo "Waiting to refresh...";
}

?>

Yeah, if you think you need to do them a chunk at a time, you can easily do so with AJAX. You might run a few results, then push them a chunk at a time with a sleep function. You should also consider cURL instead of fsock, since it's much more efficient. It's also very possible that if you have a lot of IPs, many of them belong to servers that are either really far away and thus going to time out on you anyway, or not accepting connections in the way that you are requesting, leading to other issues long before you get through all of them. Have you ever been able to test the connections of each and every IP manually? Try keeping an error log.
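To sketch the chunking idea (the batch size of 25 and the offset parameter are made up for illustration; the table and column names are taken from the code above):

// Sketch: handle one slice of the server list per page load (or per AJAX call).
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$servers = mysql_query("SELECT server_name, port, ip FROM servers WHERE enabled = 1 AND active = 1 LIMIT " . $offset . ", 25;");

while ($serv = mysql_fetch_array($servers)) {
    echo $serv['server_name'] . ' : ';
    MinecraftQuery::query($serv['ip'], $serv['port'], 2);
    echo '<br /><br />';
}

Each call would then pass the next offset (0, 25, 50, ...) until the list is exhausted.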


Hey thanks for helping me so much :)

This is my new code that does actually do all of the servers without failing or timing out, but I know there are much more efficient methods. I will look into using cURL, but I have no experience with AJAX - where should I start? I have done many manually, and I watched the results of a few cycles yesterday. It's updating the website well now - but I'm only on 14 of the thousands of servers :(

Might need to pay a developer for this one

Could anyone rewrite the code above using cURL? I'd love to see how it works and what the difference is.

Thanks,
MM

This is what the code currently is:

<meta http-equiv="refresh" content="180"/>
<?php

if ($_SERVER['SERVER_ADDR'] != '::1') {
    echo "You are not authorised to execute this script";
} else {

    include('../config.php');

    set_time_limit(0);

    class MinecraftQuery
    {
        public static function query($address, $port, $timeout)
        {
            // fsockopen()'s third and fourth parameters are $errno and $errstr;
            // the connect timeout is the fifth, so it has to be passed there.
            $socket = @fsockopen($address, $port, $errno, $errstr, $timeout);

            if (!$socket) {

                echo "Error.";
                mysql_query("UPDATE servers SET cur_players='0' WHERE ip='".$address."';");

            } else {

                // Apply the timeout to reads too, so one slow server
                // cannot stall the read loop below indefinitely.
                stream_set_timeout($socket, $timeout);

                // 0xFE is the legacy Minecraft server list ping.
                fwrite($socket, chr(254));

                $response = "";

                while (!feof($socket)) {
                    $line = fgets($socket, 1024);
                    $meta = stream_get_meta_data($socket);
                    if ($line === false || $meta['timed_out']) {
                        break; // stop on read error or timeout instead of hanging
                    }
                    $response .= $line;
                }

                // Strip the null bytes and the two-byte header, then split
                // the remaining fields on the section sign (chr 167).
                $response = str_replace(chr(0), "", $response);
                $response = substr($response, 2);
                $query = preg_split("[".chr(167)."]", $response);

                $players = isset($query[1]) ? (int) $query[1] : 0;
                $max     = isset($query[2]) ? (int) $query[2] : 0;

                echo $players."/".$max;
                mysql_query("UPDATE servers SET cur_players=".$players.", max_players=".$max." WHERE ip='".$address."';");

            }

        }
    }

    $servers  = mysql_query("SELECT server_name, port, ip FROM servers WHERE enabled = 1 AND active = 1;");
    $num_rows = mysql_num_rows($servers);

    if ($num_rows > 0) {
        while ($serv = mysql_fetch_array($servers)) {
            echo $serv['server_name'].' : ';
            MinecraftQuery::query($serv['ip'], $serv['port'], 2);
            echo '<br /><br />';
        }
    }

    // Note: output has already been sent (the meta tag and the echoes above),
    // so this header() call will not actually redirect; the meta refresh
    // at the top is what reloads the page.
    header('Location: player_count.php');
}
?>

Well, to answer the question of whether you should use cURL or sockets, and whether you should use AJAX: that depends on what you're aiming to do. As I understand it, you are trying to update a SQL table with the response from each server retrieved from the first table. If it's been really slow, does the initial SELECT take a while? If so, did you index those columns in the database so it's nice and quick? Also, are you trying to do this update in a single page load? I'm not sure the task of updating a SQL server via thousands of requests in one page load is all that feasible; you might consider running it via the command line. I might not have the requirements of your project fully understood, but if it's a utility to update a server, consider using a non-blocking server such as Tornado or node.js versus Apache/PHP; they are able to handle more connections and concurrency with less. A language that can multithread (unlike PHP) would really boost the speed of the processing, but you'd probably still have some bottlenecking in the SQL updates :-/
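As a concrete example of the indexing point (just a sketch; the index name is arbitrary and the columns are the ones the WHERE clause filters on):

// One-off: index the columns the SELECT filters on so it stays fast.
mysql_query("ALTER TABLE servers ADD INDEX idx_enabled_active (enabled, active);");

Running the script from the command line (php path/to/the/script, perhaps from cron) would also sidestep the web server's execution limits, though the $_SERVER['SERVER_ADDR'] check at the top would need adjusting, since that variable isn't set in CLI mode.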
