Dogway Posted February 11, 2015
Hello. I'm having trouble backing up this website, specifically the drop-down menu in the top bar (the one with the arrows). The command below is supposed to download the whole page locally, but it doesn't fetch whatever (JavaScript?) makes the menu appear on mouse-over. The gear icons at the top right aren't downloaded either.

wget --wait=1 --page-requisites --html-extension --no-clobber --convert-links --restrict-file-names=windows --no-parent http://segaretro.org/Main_Page
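For reference, a more aggressive variant of that command is sketched below. This is a sketch, not a guaranteed fix: wget can only fetch files referenced statically in the HTML, so a script-generated menu may still break offline. The --span-hosts/--domains values are assumptions about where the static assets live; inspect the page source and adjust them.

```shell
# Hedged sketch: --adjust-extension is the modern name for
# --html-extension (wget >= 1.12). --span-hosts and --domains are
# assumptions; widen --domains if CSS/JS come from another host.
wget --wait=1 \
     --recursive --level=1 \
     --page-requisites \
     --adjust-extension \
     --convert-links \
     --restrict-file-names=windows \
     --no-parent \
     --span-hosts --domains=segaretro.org \
     -e robots=off \
     http://segaretro.org/Main_Page
```

Even with all requisites fetched, behavior that depends on server-side PHP or on scripts loaded at runtime may not survive the mirror.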
jaclaz Posted February 11, 2015 (edited)
Maybe wget is simply not suited to the task at hand and a more "comprehensive" approach is needed, like: http://www.httrack.com/

jaclaz
Dogway (Author) Posted February 11, 2015
Actually, wget is my second choice. I got fed up with HTTrack when it failed to download the whole page yesterday too. It's also painfully slow (capped at 20 kbps) and feels lacking and outdated. Anyway, whatever works is welcome.
jaclaz Posted February 11, 2015
Yep, sometimes it's just a hit-and-miss game and you'll need to try *something else*. Some ideas: http://alternativeto.net/software/httrack/

jaclaz
Dogway (Author) Posted February 11, 2015
"Game"? I don't know what you're getting at, dude. If you check the page you linked, I'm already using the top two programs.
Tripredacus Posted February 11, 2015
The site you are trying to download is written in PHP and uses jQuery. When you download from a site, you only get the rendered HTML, so certain functionality may well be missing. The site is also built with MediaWiki, which specifically imports functions from other parts of the site to handle certain things. For example:

<script>if(window.mw){mw.loader.load(["mediawiki.page.startup","mediawiki.legacy.wikibits","mediawiki.legacy.ajax"]);}</script>

Documentation regarding loading resources: http://www.mediawiki.org/wiki/JQuery

The problem might also be that the code is specifically designed to run only under PHP/MediaWiki on an actual server; maybe your program can fetch all the relevant parts you need to make it work, but you are trying to use them in the wrong environment. Or maybe the method is not correct. Most information I find about backing up a MediaWiki is aimed at site owners. Maybe research specifically how to use wget to download from MediaWiki; here is one thing you can look at: http://webmasters.stackexchange.com/questions/28702/how-to-dump-a-mediawiki-for-offline-use

But it may not get the site 100%, for the reasons outlined above.
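As an illustration of what that loader line pulls in, you can list the module names it requests from a saved page with a quick script. This is only a sketch: the regex handles the JSON-array call form shown in the snippet above, not arbitrary JavaScript, and a real crawler would need a proper parser.

```python
import json
import re

def extract_mw_modules(html: str) -> list:
    """Pull the module names passed to mw.loader.load() out of a
    saved MediaWiki page. Only handles calls whose argument is a
    literal JSON array of double-quoted strings."""
    modules = []
    # Non-greedy match on the bracketed array argument of each call.
    for match in re.finditer(r'mw\.loader\.load\((\[.*?\])\)', html):
        modules.extend(json.loads(match.group(1)))
    return modules

snippet = ('<script>if(window.mw){mw.loader.load(["mediawiki.page.startup",'
           '"mediawiki.legacy.wikibits","mediawiki.legacy.ajax"]);}</script>')
print(extract_mw_modules(snippet))
# → ['mediawiki.page.startup', 'mediawiki.legacy.wikibits', 'mediawiki.legacy.ajax']
```

Each of those modules is fetched at runtime by MediaWiki's resource loader, which is exactly the kind of dependency a static wget mirror misses.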
jaclaz Posted February 11, 2015
Dude? "Hit-and-miss game" = if something doesn't work, try something else [1].

jaclaz

[1] Maybe this *something else* will work, and sometimes it won't; there's no way to tell in advance.