Caching pictures from a large number of pages


This website has many pages, and the URL naming follows a pattern, for example:

http://thesite.com/page1.html

http://thesite.com/page2.html

... to

http://thesite.com/page1000.html

Each of these pages shows several pictures.

What I want to do is download one specific kind of picture from each page to my computer, so in this example I would end up with 1000 pictures. The picture's URL is partly randomized, but it also has a static part, so it is possible to filter out the right picture on each page.

I'm fairly skilled in PHP. From what I remember, though, this is not possible with PHP (it's not possible to get the source of another webpage?). In C++ all of this should be possible, but I'm not too familiar with that language. So my question is: is there an easy programming language (preferably with roughly the same syntax as PHP) in which this is possible?
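For what it's worth, PHP actually can fetch remote pages (`file_get_contents()` with `allow_url_fopen` enabled, or the cURL extension), so this is doable there too. Below is a minimal sketch of the described approach in Python; the `thesite.com` URL pattern comes from the post, while the `"static_part"` marker, the `<img src="...">` regex, and the output filenames are assumptions standing in for the site's real markup:

```python
import re
import urllib.request

def extract_matching_urls(html, marker):
    """Return the src of every <img> tag whose URL contains the static marker.

    The regex assumes double-quoted src attributes; a real scraper might
    prefer an HTML parser, but a regex matches the filtering idea in the post.
    """
    urls = re.findall(r'<img[^>]+src="([^"]+)"', html)
    return [u for u in urls if marker in u]

def download_all(pages=1000, marker="static_part"):
    # Loop over page1.html .. page1000.html, as described in the post.
    for i in range(1, pages + 1):
        page_url = "http://thesite.com/page{}.html".format(i)
        html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
        # Filter for the one picture whose URL contains the static part,
        # then save it locally (filename scheme is an assumption).
        for picture_url in extract_matching_urls(html, marker):
            urllib.request.urlretrieve(picture_url, "picture{}.jpg".format(i))
```

The filtering step is the only site-specific part: once the static substring of the picture URL is known, the same two-function structure (fetch page, filter image URLs) ports directly to PHP with `file_get_contents()` and `preg_match_all()`.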
