Screaming Frog Log File Analyser Update – Version 3.0
Dan Sharp
Posted 11 December, 2018 by Dan Sharp in Screaming Frog Log File Analyser
I am delighted to announce the release of the Screaming Frog Log File Analyser 3.0, codenamed ‘Yule Log’.
If you’re not already familiar with the Log File Analyser tool, it allows you to upload your raw server log files, verify search engine bots, and analyse search bot data for invaluable SEO insight and understanding about how search engines are really crawling a site.
While this update is fairly small, it includes some significant new features and upgrades to the Log File Analyser based upon user requests and feedback. Let’s take a look at what’s new in version 3.0.
1) Configurable User-agents
You’re now able to completely configure the user-agents you wish to import into a project. You can choose from our pre-defined list of common search engine bot user-agents, or de-select those that are not relevant to you.
This helps improve performance and reduces disk usage by focusing only on bots of interest. You can also add your own custom user-agents, which are then stored and can be selected for projects.
Previously the Log File Analyser only analysed bots for Google, Bing, Yandex and Baidu, so this now allows users to monitor bots from other popular search engines, such as Seznam in the Czech Republic. It also allows users to analyse and monitor other specific user-agents of interest, such as Googlebot-News or AdsBot.
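As a rough illustration of how matching against a configurable bot list can work, here is a minimal sketch. The bot names and patterns below are our own examples, not the tool’s internal list, which isn’t published:

```python
import re

# Illustrative bot list; the tool's actual pre-defined list and matching
# logic are not published, so these names and patterns are assumptions.
BOT_PATTERNS = {
    "Googlebot": re.compile(r"Googlebot", re.IGNORECASE),
    "Bingbot": re.compile(r"bingbot", re.IGNORECASE),
    "SeznamBot": re.compile(r"SeznamBot", re.IGNORECASE),
    "AdsBot": re.compile(r"AdsBot-Google"),
}

def match_bot(user_agent, selected):
    """Return the first selected bot whose pattern matches the user-agent."""
    for name in selected:
        if BOT_PATTERNS[name].search(user_agent):
            return name
    return None

ua = "Mozilla/5.0 (compatible; SeznamBot/3.2; +http://napoveda.seznam.cz/)"
print(match_bot(ua, {"Googlebot", "SeznamBot"}))  # SeznamBot
```

De-selecting a bot simply means its pattern is never checked, which is why trimming the list saves import time and disk space.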
2) Include Functionality
Similar to the SEO Spider include feature, you’re able to supply a list of regular expressions for URLs to import into a project. So if you only want to analyse certain domains or paths, such as the /blog/ or /products/ pages on a huge site, you can now do that to save time and resources, and perform more granular analysis.
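Conceptually, an include list is just a set of regular expressions tested against each URL before import. A minimal sketch, with illustrative patterns (the Log File Analyser’s own matching behaviour may differ in detail):

```python
import re

# Illustrative include patterns; only URLs matching at least one
# pattern are kept for import.
INCLUDE_PATTERNS = [re.compile(p) for p in (r"/blog/", r"/products/")]

def should_import(url):
    """True if the URL matches at least one include pattern."""
    return any(p.search(url) for p in INCLUDE_PATTERNS)

urls = [
    "https://example.com/blog/log-file-analysis/",
    "https://example.com/products/widget-42",
    "https://example.com/about/",
]
imported = [u for u in urls if should_import(u)]
print(imported)  # only the /blog/ and /products/ URLs survive
```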
3) New Log File Format Support
The Log File Analyser now supports the AWS Application Load Balancer log file format and the Verizon EdgeCast format.
All you need to do is drag and drop in log files (or folders) as usual, and the Log File Analyser will automatically detect their format and start to analyse them.
4) JSON Support
You can now upload log files in JSON format to the Log File Analyser. There isn’t a common standard for JSON logs, so we have built support around the customer-provided formats we’ve seen. We’ve found these cover most cases, but do let us know if you have any issues when importing.
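For illustration, here is one common JSON-lines shape and how its fields might map to a log entry. The field names are assumptions, since, as noted, no standard JSON access-log format exists:

```python
import json

# One JSON-lines shape we might encounter; field names are assumptions,
# as there is no standard JSON access-log format.
line = ('{"timestamp": "2018-12-11T10:15:30Z", "method": "GET", '
        '"uri": "/blog/", "status": 200, "user_agent": "Googlebot/2.1"}')

record = json.loads(line)
entry = {
    "time": record["timestamp"],
    "url": record["uri"],
    "status": record["status"],
    "ua": record["user_agent"],
}
print(entry["url"], entry["status"])  # /blog/ 200
```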
Other Updates
We have also included some other smaller updates and bug fixes in version 3.0 of the Screaming Frog Log File Analyser, which include the following –
- The overview graphs are now configurable, so you can easily select the date range and metrics to display.
- The workspace storage location (where you store the database with log files) is now configurable.
- X-Forwarded-For in W3C logs is now supported.
- Time-taken in W3C logs is now supported.
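The two W3C fields mentioned above live in columns named by the log’s #Fields directive. A minimal sketch of reading them (the field names and sample values here are illustrative only):

```python
# The #Fields directive names each column of a W3C extended log; the
# field names and sample values here are illustrative only.
header = "#Fields: date time c-ip cs-method cs-uri-stem sc-status time-taken cs(X-Forwarded-For)"
line = "2020-01-15 10:15:30 10.0.0.1 GET /blog/ 200 125 203.0.113.7"

fields = header.split(":", 1)[1].split()
values = line.split()
record = dict(zip(fields, values))
print(record["time-taken"], record["cs(X-Forwarded-For)"])  # 125 203.0.113.7
```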
We hope you like the update! As always, please do let us know via support if you experience any issues at all.
Thanks to the SEO community and Log File Analyser users for all the feedback and suggestions for improving the Log File Analyser.
If you’re looking for inspiration on analysing log files and using the Log File Analyser, then check out our guide on 22 Ways To Analyse Log Files using our Log File Analyser.
Now go and download version 3.0!
Small Update – Version 3.1 Released 3rd April 2019
We have just released a small update to version 3.1 of the Log File Analyser. This release is mainly bug fixes and small improvements –
- Allow import of JSON logs with [] around timestamps.
- Indicate if regexes entered in Include dialog are valid.
- Fix crash expanding tree view.
- Fix issue with protocol being ignored when URL is entered during import.
- Fix issue with leading 0’s preventing import of Apache logs.
- Fix issue when starting up with locale of es_419.
- Fix crash on start up caused by unclean shutdown.
- Fix crash importing logs.
- Fix crash exporting from the Referers tab.
- Fix Include feature to use full URL to check against when importing.
- Fix import failure due to duplicate timestamps in JSON logs.
- Fix crash when importing URL Data.
- Fix crash searching for certain characters in table views.
- Fix issue with incorrectly using JRE supplied by SEO Spider on Ubuntu.
- Fix greyed out search box when search returns no results.
- Fix crash when comparing Log File with URL Data.
Small Update – Version 3.2 Released 9th May 2019
We have just released a small update to version 3.2 of the Log File Analyser. This release is mainly bug fixes and small improvements –
- Fix crash when importing log files.
- Fix crash deleting import history.
- Fix crash opening a project that was closed in the tree view with a sort applied.
Small Update – Version 3.3 Released 15th January 2020
We have just released a small update to version 3.3 of the Log File Analyser. This release is mainly bug fixes and small improvements –
- Fix stall importing log files.
- Fix crash importing HAProxy log file.
- Fix crash on macOS when plugging/unplugging external monitors.
- Fix crash closing project quickly after opening.
- Fix stall importing multiple log files without a licence.
I was only recently wondering whether we’d see a new update soon, as it’s been a while since the last one, so I’m glad to see this update rolled out. There are some handy features in there I’ll definitely be giving a try.
Sadly, it looks like the larger projects I had set up don’t open; the app just crashes each time, though smaller projects open with no issues. Hopefully this gets sorted soon.
Hey Adam,
Thanks for the comment!
Would you be able to pop through the details of the crash you’re experiencing, and your log files (Help > Debug > Save Logs) to us via support?
https://www.screamingfrog.co.uk/log-file-analyser/support/
We can then help!
Cheers.
Dan
Hi there,
Last time I checked, the Log File Analyser did not have the same regex filter options as the Spider tool. Is it planned to add this anytime soon?
Thanks
Hi Pete,
Which regex filter options are you referring to?
It has the same include feature as the SEO Spider now (as above, in point 2), but it doesn’t have an ‘exclude’ for URLs. But did you mean something else?
Cheers.
Dan