Howdy guys, I am using PHP to poll a list of streams in a directory. I am looking for the best way to page through the results (using the offset parameter) so I can build a single array with all of the data and then process it in one go, instead of processing each batch of 100 results as cURL pulls it in (there's a rough sketch of what I mean below my current code).
Here is what I am using now:
$url_string = "https://api.twitch.tv/kraken/streams?game=$game&limit=100&language=en&offset=0&client_id=removed";
$ch = curl_init();
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Accept: application/json", "Content-Type: application/json"));
curl_setopt($ch, CURLOPT_URL, $url_string);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
$cerr = curl_error($ch);
$new_array = array();
$array = json_decode($data, true);
if (!isset($array['_total'])) {
    exit;
}
$total = $array['_total'];
$current = 0;
foreach ($array['streams'] as $thing) {
    //process these results
    $current++;
}
while ($current < $total) {
    $url_string = "https://api.twitch.tv/kraken/streams?game=$game&limit=100&language=en&offset=$current&client_id=removed";
    curl_setopt($ch, CURLOPT_URL, $url_string);
    $data = curl_exec($ch);
    $array = json_decode($data, true);
    foreach ($array['streams'] as $thing) {
        //process these results
        $current++;
    }
}
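Here is a minimal sketch of the "collect everything first, then process once" approach I am describing. It is not code I am running yet, and names like fetch_page() and $all_streams are just placeholders I made up; it assumes $game and the same client ID are already available.

function fetch_page($game, $offset) {
    $url = "https://api.twitch.tv/kraken/streams?game=" . urlencode($game)
         . "&limit=100&language=en&offset=$offset&client_id=removed";
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array("Accept: application/json", "Content-Type: application/json"));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($ch);
    curl_close($ch);
    return json_decode($data, true);
}

$all_streams = array();
$offset = 0;
do {
    $page = fetch_page($game, $offset);
    if (!isset($page['streams']) || count($page['streams']) === 0) {
        break; // error response or empty page: stop paging
    }
    foreach ($page['streams'] as $stream) {
        $all_streams[] = $stream;
    }
    $offset += 100; // advance by the page size, not by one
} while (isset($page['_total']) && $offset < $page['_total']);

// process $all_streams in a single pass here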
The problem with my current approach (the first code block above) is that the processing of the initial batch on my end seems to take long enough that the total number of streams in the directory changes, which invalidates my while loop (and the subsequent API calls to Twitch) for pulling the additional records. I have been seeing a lot of duplicated streams show up, even though I filter them by comparing what comes back from the API with what is already in my data set, so I think I must just be going about this incorrectly.
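For reference, the duplicate filtering I mention is essentially keying the combined array by the stream's _id field (using the hypothetical $all_streams accumulator from the sketch above), so a stream that shows up on two pages after the offsets shift only lands in the set once:

foreach ($page['streams'] as $stream) {
    // a stream already seen on an earlier page just overwrites its own entry
    $all_streams[$stream['_id']] = $stream;
}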
Any suggestions will be welcome.