Screaming Frog SEO Spider Update – Version 1.80
Posted 31 October, 2011 by Dan Sharp in Screaming Frog SEO Spider
I am excited to announce an update to the Screaming Frog SEO Spider, version 1.80.
The following new features are included in the update –
- Crawl Speed Control – The ability to control the speed of the crawl is something we have wanted to release for a long time. We have had requests for throttle control because the SEO Spider can crawl too fast for some older servers. Hence, we have introduced a ‘speed’ feature which allows you to limit the number of threads or the number of URI requests per second. This feature also allows you to significantly increase the speed of the SEO Spider, which we just ask that everyone uses responsibly :-).
- XML Sitemaps – We have included the ability to create XML sitemaps in the SEO Spider under the top level menu ‘export’ option. Currently this feature is only for HTML pages, so it does not include images, videos etc (which we will be introducing at a later date). We conform to the standards outlined in the sitemaps.org protocol. If you have over 49,999 URLs the SEO Spider will automatically create additional sitemap files and a sitemap index file referencing the sitemap locations.
- Exclude Is Now Regex – Much like the ‘include’ feature, the ‘exclude’ option in the SEO Spider is now regex. In previous versions it was robots.txt syntax, but regex provides more control and flexibility.
- Bulk Export Missing Alt Text URI – We have included the ability to bulk export all images with missing alt text and the URIs that reference them.
- List Mode Accepts XML Sitemap Files – We added the ability in ‘list’ mode to be able to upload and crawl .xml files. Hence, if you already have an XML sitemap and wish to audit it, simply switch to ‘list’ mode, change the filetype to ‘.xml’ and upload it. We will discover all the URLs automatically and crawl them.
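For reference, the sitemap index file described above follows the sitemaps.org protocol and looks something like this (the domain and filenames here are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per generated sitemap file,
       each holding up to 49,999 URLs -->
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```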
We also fixed up a couple of bugs –
- Level – We did not always find the shortest path. Now we do.
- Https – There was a bug crawling secure pages in version 1.70. This has been fixed!
So, what are you waiting for? Go and download it now!
As always, please just let us know if anyone has any queries or feedback about the above. I will be updating the user guide this week (so please bear with me!).
Thanks again to everyone for all their support of the tool!
Great news chaps. I’d like to know when you are going to release a version which is more affordable for single site webmasters, as opposed to pricing which is obviously being targeted at SEOs and agencies. I’d sign up in a snap for a reasonable price. Am I the only one? Surely not.
Looks great, can’t wait to open her up with a license.
Cheers,
Denny.
Hi Denny,
Thanks for the kind comments. We take on board your thoughts regarding pricing which is something we have under continual review.
Cheers,
Dan
Hey Denny,
Fully understand where you are coming from on pricing for single site webmasters. One thing I can say as a regular user of the product though, is that one full audit of your site is well worth the price of admission in time savings alone.
For example, even if you value your time at a low rate of say $30 per hour, the tool pays for itself in less than a day versus having to find and fix your site’s issues on your own using cheaper or free tools.
As always, thanks for the update! Really love that we can now export the complete list of missing alt text.
On the RegEx section, for your revised user guide, could you include some working examples that would be used to exclude/include pages on http://www.screamingfrog.co.uk, so that we can test, modify and learn from a real life example? I understand the basics of RegEx, but can’t ever seem to get the Exclude/Include to work right when crawling my sites.
Thanks!
Hi Dan,
Great to hear you like the missing alt text feature! Apologies it took a little time for us to update the user guide.
We have just started to update the include/exclude sections with some regex examples which will hopefully help!
Thanks,
Dan
I know I whine a lot about the RegEx, but I finally got the hang of it enough that I can brag to people who understand it.
This site really helped. I nailed down my include/exclude patterns by running a full scan in SF, then dropping all the URLs into this tester and working out the correct RegEx statements.
http://regexpal.com/
To everyone that said RegEx is easy….yeah you’re right, it is pretty simple after you play with it a bit!
Great update guys. The crawl throttling is a fantastic addition! Do you have it in the pipeline to include rel next/previous? Oh and load time per page (in ms) would be FABULOUS.
Hey Richard,
Cheers buddy – And yes, when they announced it we immediately added it to the list…! ;-)
Load time is another one we have on the list as well.
Any other feedback / suggestions, just shout!
I get error message, How to deal with it
Failed to launch Screaming Frog SEO Spider.
Please ensure you have the latest version of Java installed.
Visit http://www.java.com/ to download and install Java.
Further details on this error:
Message: ‘No such file or directory’ Code: ‘2’
Press Enter to exit
Hi Zhanggang,
You just need to do as the message suggests and download the latest version of Java.
If you’re using the PC version, simply visit java.com and download and install. If you’re running the Mac version, then please just update your OS X to the latest version.
More details on the support page here – https://www.screamingfrog.co.uk/seo-spider/support/
Thanks,
Dan
@Denny Kugler if you have a site with more than 500 pages, surely £99 per year is pretty reasonable?
@Dan Nolan – If you want to exclude any URL that contains a certain word, try using .*phrase.*
For example, if you wanted to exclude any url that has the word blue in it, you’d use .*blue.*
BTW, £99 per year is very reasonable, especially as there is a free version for any site under 500 pages
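To illustrate how a whole-URL exclude pattern like `.*blue.*` behaves, here is a small, hypothetical Python sketch (this is not the SEO Spider’s actual code, just a stand-in filter using Python’s `re` module, with made-up example URLs):

```python
import re

def apply_exclude(urls, pattern):
    """Return only the URLs that do NOT fully match the exclude regex.

    The pattern is matched against the whole URL, which is why a
    'contains' exclude needs .* on both sides, e.g. .*blue.*
    """
    compiled = re.compile(pattern)
    return [u for u in urls if not compiled.fullmatch(u)]

urls = [
    "http://www.example.com/blue-widgets",
    "http://www.example.com/red-widgets",
    "http://www.example.com/about",
]

kept = apply_exclude(urls, r".*blue.*")
print(kept)  # the blue-widgets URL has been filtered out
```

Note that a bare `blue` (without the surrounding `.*`) would exclude nothing here, because no complete URL consists only of the word blue.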
Thanks a lot for the update guys, it’s a great tool and I’m sure many users are pleased with the continuance in development. Looking forward to future updates and expansions!
This software works great for on-page analysis, both for my micro sites which have fewer than 500 pages and for big dynamic sites. I have just purchased the software so I can crawl more than 3K URLs.
Very effective and simple tool.