Okay, let's say I open the URL http://www.nothing.com/ using PHP's fopen function. Then I do this:

<?php
// Query-string fragment appended to the base URL (example value)
$S = "sss";
$url = "http://www.nothing.com/" . $S . "&ql=1";

// Open the remote page for reading
$open = fopen($url, "r");

// Read the page line by line until the end of the stream
while (!feof($open)) {
    echo fgets($open) . "<br />";
}

fclose($open);
?>

The above returns the full contents of the page I'm looking at. My goal is to have some PHP function collect specific information from that page. For example, I want to crawl the page, find where it says "Name:", and then see what comes after it, e.g. "Name: Bob York". Then I want to store "Bob York" in a MySQL database. How can I do this?
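For what it's worth, here is one common way to do that: fetch the page into a string, pull out the value that follows "Name:" with a regular expression, and insert it with a prepared statement. The URL, the regex, the database credentials, and the table name below are all placeholders, so adjust them to your actual page and schema; if the label and value are wrapped in HTML tags, the pattern would need to change accordingly.

<?php
// Fetch the page into a string (placeholder URL)
$html = file_get_contents("http://www.nothing.com/somepage");

// Look for "Name:" followed by the value on the same line
if (preg_match('/Name:\s*(.+)/', $html, $matches)) {
    $name = trim($matches[1]);

    // Store the value with a prepared statement (placeholder credentials and table)
    $db = new mysqli("localhost", "user", "password", "mydb");
    $stmt = $db->prepare("INSERT INTO people (name) VALUES (?)");
    $stmt->bind_param("s", $name);
    $stmt->execute();
    $stmt->close();
    $db->close();

    echo "Stored: " . $name;
} else {
    echo "Name not found on the page.";
}
?>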

Hey, thanks for responding, but how could I use those PHP functions? Can you give me a quick code example?

If your need is pretty simple, then you may get what you want the way that you are doing it.

Even for simple needs, but certainly once you need to access multiple pages, navigate from one to the next and, in some cases, log in first, there are better tools.

cURL can be used for some of this, and I have used a class that helps handle more complicated situations. Overall, the most effective tool I found, especially for complicated situations, is a Windows automation tool called AutoIt. It has Internet Explorer functions that let it automate navigation and extract fields. It is pretty easy to use and can do just about anything a human can do. It has an advantage over the PHP tools in that it works through the browser, so it has no problem accessing secure (https) pages or pages generated by ASP; I could not get any of the PHP tools to process ASP pages.
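For the cURL option, a minimal sketch looks something like the following; the URL is a placeholder, and for pages that require a login you would add the usual cookie and POST options.

<?php
// Minimal cURL fetch (placeholder URL)
$ch = curl_init("http://www.nothing.com/somepage");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects
$html = curl_exec($ch);

if ($html === false) {
    echo "cURL error: " . curl_error($ch);
}
curl_close($ch);

// $html can now be searched the same way as the fopen/file_get_contents output
?>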
