
Use jQuery and PHP to scrape page content

Posted on November 18, 2009

So we have content on another domain that we want to load into a page via AJAX. How can we do this? This was a question put to me the other day at work. More experienced web developers will know that JavaScript doesn’t allow cross-domain XMLHttpRequests, i.e. AJAX (Asynchronous JavaScript and XML) requests. There is a ‘dirty’ way to get around this: use PHP and cURL to pull in the HTML of the page you want the content from, so that JavaScript thinks it’s coming from your own domain. Let me just say, this isn’t an ideal solution, but it’s a useful technique when executed in the right situation.

NOTE: You need to have PHP 5 installed on your server, with the cURL extension enabled, in order to use this technique.

The PHP

In this example we’re taking the community news section from smashingmagazine.com. First, using PHP, we use cURL to fetch the entire contents of the homepage. We can then use JavaScript to specify the particular div we want to grab, as explained below.

$ch = curl_init("http://www.smashingmagazine.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the HTML as a string rather than printing it straight out
$html = curl_exec($ch);
curl_close($ch);
echo $html;
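If the page you’re fetching is one you don’t control, it’s worth being a little more defensive. The sketch below is an expanded curl.php with a timeout, a browser-style user agent and basic error handling; the specific option values and the user agent string are assumptions for illustration, not part of the original example.

$ch = curl_init("http://www.smashingmagazine.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // hand the HTML back as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow any redirects
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);     // give up connecting after 10 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 30);            // give up entirely after 30 seconds
// an assumed browser-style user agent; some sites serve different markup to obvious bots
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36");

$html = curl_exec($ch);

if ($html === false) {
    // fail gracefully so the jQuery .load() call gets a sensible fallback
    echo '<div id="noupesoc">Sorry, the remote content could not be loaded.</div>';
} else {
    echo $html;
}

curl_close($ch);

The fallback div reuses the #noupesoc id so the jQuery snippet below still finds something to load.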

The JavaScript

This must be the simplest couple of lines of JavaScript ever. You can see that within the DOM-ready function we load the contents of the div #noupesoc from curl.php into #content. As simple as that. You can specify any div or element on the page and grab it using this method.

    $("document").ready(function() {
        $("#content").load("curl.php #noupesoc");
    });

The HTML

<h1>Smashing Community News</h1>
<div id="content"><img src="ajax-loader.gif" alt="Loading..." /></div>




Comments
23 discussions around Use jQuery and PHP to scrape page content

  2. Brijesh says:

    Hi,

    Thanks for the nice tutorial.

    Could we create a regular pattern to crawl all pages?

    Thanks
    Brijesh Mishra

  3. Delia says:

    Hi there, this doesn’t seem to work in Internet Explorer. It also doesn’t seem to work for extracting the contents of divs on external sites.

  4. SWATANTRA PRASAD CHOUDHURY says:

    Nice one. Loved going through it.

  5. Steve says:

    Awesome tutorial!
    I had to use curl on my host 1and1.

    http://www.quickscrape.com/ is what I came up with!

  6. Pingback: JSON and PHP product gallery | Papermashup.com

  7. mccormicky says:

    If the content is WP, I can (and have) use SimplePie to do it, but getting this content to show in just one place created another problem.

    In the non-WP part of the site there are only a few templates that can be used effectively; because the templates in question are used by more than one page, the WP RSS content shows up everywhere the template is used.

    The non-WP part of the site uses Qcodo and Ajax, and the Qcodo controls are in files that have been encrypted. So what if I want to show content from the non-WP part of the site in the WP part? This content doesn’t come with RSS.

    So I went looking for other solutions: scraping.

    With scraping I can create a WP page, put whatever I want in it, and then include that page in the non-WP part of the site using the technique from this tutorial or the one from Net Tuts.

    Rubbish: what solution do you propose other than scraping?

    Ashley: is this safe to do if the content is from the same website/domain?

    • Ashley says:

      @mccormicky if you are in control of the content on both domains then it is safe to use this technique (which is exactly why I used it) – if not, you should consider the reputation of the site in question before proceeding.

  8. rubbish says:

    Ashley, I am not referring to content theft (although your argument suggesting that a site’s content is fair game for the taking solely because it is presented on the internet is flat out WRONG).

    The fact is, if the remote page contains scripting and you scrape it and incorporate its HTML (to be parsed with jQuery), that active content will have access to your page, will not be subject to cross-domain restrictions, and will have control over any content or cookies set by your server.

    You will be exposed to cross-site scripting in exactly the same way as if you had an input form which you echoed back to the user.

    I know this because I have done it myself to idiots. :)

    Remember, when you use cURL to request a page, that request is sent with a particular signature that identifies it as a non-browser request (the user agent is all wrong, and there are other hints as well); also, the IP address will be that of YOUR server, not that of a user’s ISP.
    Therefore, the target can recognise you and send back targeted text, which can result in an unfriendly site experience. They could even send back JavaScript that parses and steals the login cookie of whoever is viewing!

    Remember back when hotlinking graphic files was an issue and people used to substitute offensive files in place of the hotlinked ones?

    In any event, the scraped content now constitutes INPUT and will have to be SANITIZED the same way you sanitize ALL input before including it in a page.

    You can do all this with PHP; there’s no need to write serious regex. A short Google for PHP Simple HTML DOM
    (http://simplehtmldom.sourceforge.net/) will provide all you need, and there’s a minimal script-stripping sketch after the comments below.

    • Ashley says:

      @rubbish, as I stated in my last comment, “I’m not condoning users to actively trawl the internet and scrape any content they wish…”

      I see your point about XSS if you don’t control the domain or the content; that wasn’t the case in my original solution, however, so it wasn’t a problem.

      Maybe you should check out this tutorial from the reputable Nettuts.com; it’s a little more complex but has the same outcome.

      As I have said previously, it’s nice to use your name rather than hide anonymously.

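Picking up on the sanitization point raised in the comments: here’s a minimal sketch, using PHP’s built-in DOMDocument rather than the Simple HTML DOM library mentioned above, of stripping script tags from the scraped markup before curl.php echoes it. It illustrates treating scraped HTML as untrusted input; it is not a complete XSS defence (inline event handlers and javascript: URLs would still need attention).

// $html is the string returned by curl_exec() in curl.php
libxml_use_internal_errors(true);                 // keep the parser quiet about invalid real-world HTML
$doc = new DOMDocument();
$doc->loadHTML($html);
libxml_clear_errors();

// copy the live node list first so removing nodes doesn't skip any
$scripts = iterator_to_array($doc->getElementsByTagName('script'));
foreach ($scripts as $script) {
    $script->parentNode->removeChild($script);    // drop the remote page's active content
}

echo $doc->saveHTML();                            // output the scraped markup with scripts removed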



