THANKS Peter!
It works!
On Apr 16, 12:12 pm, Peter Denton <petermden...@gmail.com> wrote:
> and you could bump up the while loop to 15, to get 1500 results, like...
>
> while ($page_num <= 15 )
>
> notice in the search URL rpp=100 (results per page) and page=$page_num
> (pagination), so you can get 1500 results
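A minimal sketch of that pagination idea, just building the 15 URLs without fetching them (the endpoint and query parameters are taken from the thread; the old search.twitter.com Atom API no longer exists, so this only illustrates the URL construction):

```php
<?php
// Hypothetical sketch: build the 15 paginated search URLs
// (rpp=100 results per page, page=1..15) described above.
// 15 pages * 100 results per page = up to 1500 results.
$urls = array();
for ($page_num = 1; $page_num <= 15; $page_num++) {
    $urls[] = "http://search.twitter.com/search.atom?q=japan"
            . "&max_id=1529989226&rpp=100&page=" . $page_num;
}
echo count($urls), "\n";   // number of page URLs built
echo end($urls), "\n";     // last URL, with page=15
?>
```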
>
> On Wed, Apr 15, 2009 at 8:08 PM, Peter Denton <petermden...@gmail.com> wrote:
>
> > <?php
> > $page_num = 1;
> > $txtString = "";
> > $statusUpdate = array();
> > while ($page_num <= 2 )
> > {
> > $host = "http://search.twitter.com/search.atom?q=japan&max_id=1529989226&rpp=1...";
> > $result = file_get_contents($host);
> > $xml = new SimpleXMLElement($result);
> > foreach ($xml->entry as $entry)
> > {
> > $statusUpdate[] = $entry->title;
> > }
> > $page_num++;
> > }
> > foreach($statusUpdate as $su)
> > {
> > $txtString .= $su;
> > }
>
> > $myFile = "myTextFile.txt";
> > $fh = fopen($myFile, 'w') or die("can't open file");
> > $stringData = $txtString;
> > fwrite($fh, $stringData);
> > fclose($fh);
> > ?>
>
> >> max_id=1529989226&page=2&q=japan
> >
i...@twibs.com