Screaming Frog T-Shirt Giveaway

Posted 29 June, 2023 by Mark Porter in Screaming Frog

Longtime users of the SEO Spider will know that over the years, we occasionally offer up the chance for you to get your hands on our highly coveted merch. We receive countless requests for SF swag (which our support team can absolutely attest to), and we’re pleased to announce that the time has come once again.

However, we’re shedding our usual ‘mysterious black’ attire and taking things up a notch, with the ingeniously named: SF Summer Collection.

NOTE: Entries are now closed.


Screaming Frog’s Summer Collection T-Shirts – Because Life’s Too Short for Boring Colours

Our latest batch offers t-shirts that could give the Google logo a run for its money, with colours such as:

  • Indigo blue, for those who love a deep dive into the SERPs, unafraid of rank 100 and beyond.
  • Sky blue, for the daydreamers who hope for a day when the SEO spider not only finds broken links, but also fixes them.
  • Light pink, for the romantics who blush every time their regex works first try.
  • Mint green, for those savvy SEOs who like to keep their strategies fresh.
  • Sunny yellow, for the perpetual optimists, who, no matter how many issues the SEO Spider uncovers, relish the challenge.
  • Violet, for the spreadsheet sorcerers, those brave souls who aren’t afraid to export Screaming Frog SEO Spider data into Excel and dive headfirst into the cells.

Bonus: You can find some smouldering shots of our Head of Strategy, OB, modelling these below:

Before you ask, no, unfortunately we don’t offer product photography as a service line.

But wait, there’s more! We’re also throwing in our iconic Screaming Frog stickers. Just because your laptop shouldn’t feel left out of the party.


How to Win

This giveaway is exclusive to licensed SEO Spider users. We have over 100 of these up for grabs, and all you need to do to be in with a chance of winning is one of two things:

  • Share your favourite Screaming Frog SEO Spider tip in the comments of this post. Be sure to use your real email address so we can reach out if you win.

OR:

  • Share your favourite Screaming Frog SEO Spider tip on social media, tagging us on LinkedIn or Twitter (or both!). Be sure to use the hashtag: #SFTipsForTees so we can find your entry!

While winners will be randomly selected, our favourite tips will receive some extra merch on top of a t-shirt and stickers. As mentioned, only entries from licensed SEO Spider users will be counted.

Unfortunately, we are not able to guarantee what colour you will receive, but we will do our best to accommodate any requests.

The competition will run for 14 days, ending on the 13th July.

We can’t wait to see what tips you come up with!

Mark is a self-confessed geek who has always had a love/hate relationship with blogging, coupled with an addiction to buying domain names. Outside of SEO, Mark loves Manchester United, gadgets and games. If anyone can beat him at FIFA he will happily buy them a drink.

114 Comments

  • Brendan OConnell 1 year ago

    The Screaming Frog Custom Extraction tool is a game-changer that has personally revolutionized my workflow, saving me countless hours of laborious work. To understand this incredible tool, I highly recommend reading the informative blog post available at https://www.screamingfrog.co.uk/web-scraping/ first.

    Begin your exploration by taking the first step: trying it out with a single sample page. This initial test will help determine if the tool accurately retrieves the desired data. If it performs as expected, you can proceed with confidence. However, if the results are not satisfactory, fear not! Simply return to the tool and fine-tune your extraction settings until you achieve the desired outcome. Remember, thorough testing is vital before executing your crawl.

    Embrace the potential of the Screaming Frog Custom Extraction tool and unlock a world of streamlined web scraping. You’ll wonder how you ever managed without it!

    Reply
    • Justin Harney 1 year ago

      The crawl visualization tool. It makes non-SEOs understand how important site structure is with a click of a button and an export of an SVG!

      Reply
    • Ewelina Westcott 1 year ago

      Too many to choose from:
      – scheduling and integration with Google Sheets & Google Looker to get a site crawled, data exported and visualised ready for comparison
      – custom extraction to speed up internal linking process. Combined with a Python script, it saves hours of manual work

      Reply
    • Sam Penny 1 year ago

      A simple tip from me:

      In some instances you may want to crawl URLs with a # in them (scraping URLs with a #, analysing jumplinks, etc.)

      This may cause poor results, as by default Screaming Frog ignores jump links. You can ensure that SF crawls links with a # in them by changing the spider’s behaviour in the settings:

      Configuration > Spider > Advanced > Crawl Fragment Identifiers

      Voilà! Problem solved.

      Reply
    • Yash 1 year ago

      Using custom extraction by adding XPath for checking OG & Twitter tags on web pages. This is something that is very insightful for me as an SEO professional.

      Reply
  • Valentin Broion 1 year ago

    Hey SF family!

    My favorite tip with SF is custom extraction: an easy way to find out if something (some code, a word, some links…) is present on certain pages!

    Reply
    • Rasa 1 year ago

      List mode, GSC API, JS rendering (for some websites, JS rendering + the Google user agent is a must to see correct crawl results). However, I have a website with many difficult integrations, and even SF can’t identify some of its issues (nor can other popular SEO tools). This is why I love manual analysis + log file analysis (with the SF Log File Analyser).

      Reply
  • So, I’m using Screaming Frog when I’m looking for price drops in e-shops.
    I set up scheduled scans on the product pages I want to buy from and, with custom extraction, I scrape the price – upload it to a Google Sheet and run an automatic check to see if the price has dropped…

    Great when you’re looking for vacations…
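
    A minimal sketch of the checking step in Python (the file names and the “Price 1” column are illustrative – the price column is whatever you named your custom extraction):

    # Compare the latest scheduled-crawl export against the previous one and
    # print any product whose extracted price has dropped.
    import csv

    def load_prices(path):
        # Assumes an export with an "Address" column and a custom extraction column "Price 1".
        with open(path, newline="", encoding="utf-8") as f:
            return {row["Address"]: float(row["Price 1"].strip("€$£ ").replace(",", ""))
                    for row in csv.DictReader(f) if row.get("Price 1")}

    old = load_prices("crawl_yesterday.csv")
    new = load_prices("crawl_today.csv")

    for url, price in new.items():
        if url in old and price < old[url]:
            print(f"Price drop: {url} {old[url]} -> {price}")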

    Reply
  • Liam 1 year ago

    My favourite tip is to pull all inlinks to conduct an inlink gap analysis for high-value pages!

    Reply
    • Josh 1 year ago

      The ability to link SF with PageSpeed Insights has helped tremendously with site speed and technical audits

      Reply
  • Pete Mindenhall 1 year ago

    The ability to run bulk domain crawls in list mode whilst also attaching a rollup instance in GA (and now GA4) for all instances/streams.

    Reply
  • Lyndsay 1 year ago

    One of my favorite things to use Screamingfrog for is internal link scanning for orphaned pages, and checking that your pillars are as internally-linked as you want them to be. Managing site structure with SF is awesome :)

    Reply
    • Olexander 1 year ago

      Working with redirects – one of the most frequent tasks: finding, detecting and correcting URLs. Also checking the placement of links on forums through a mass scan of a URL list, and then filtering by domain in the External links tab.

      Reply
  • Simon Cox 1 year ago

    My favourite Screaming Frog Tip is to archive old crawls on a regular basis so that when you open the crawls up (Cmd+O on Mac , Ctrl+O on PC and double tap button one on my converted Google Stadia Controller) it doesn’t take longer than tadpoles spawning to open. :)

    Reply
  • Dean Cruddace 1 year ago

    My tip is don’t forget to crawl using the smartphone user agent. As we know, big G primarily uses mobile-first crawling.

    Reply
    • Thomas Vav 1 year ago

      Oldie but goldie:
      JavaScript sites are sometimes difficult to crawl.
      I try different user agents under the Configuration menu to check which pages and how they are being crawled.

      Reply
  • Barry Schwartz 1 year ago

    Some computers need cooling, just like frogs. :P

    Reply
  • Jan Caerels 1 year ago

    Oh boy this is a hard one. Definitely custom search/extraction have loads of possibilities. I’ve used it for spotting where certain words were mentioned in the copy of a page. If you use some regex you can do this in bulk for multiple keywords. It will return you the page + the keyword that is mentioned in the copy. You can use this to quickly spot internal linking opportunities. The possibilities of custom search/extraction really are endless.

    I also adore the recent “compare crawls” mode. I used it in a migration (with URL mapping to the old domain) to determine if some pages certainly became non-indexable for some reason. Another great tip in migrations: Use the “All redirects” Report to quickly report the old URL with the final new URL after redirection.

    Reply
  • Chris Pauly 1 year ago

    The Configuration -> Custom -> Search term counter while crawling a website is incredible! We just had a need to find what pages some phrases are being used on. I could get a list of items in our CMS (Sitecore) that have that phrase. But the reassembling of those items onto pages is more complicated the way we can use datasources in Sitecore and personalization. Crawling our Sitemap while counting up the term is exactly what the stakeholder wanted.

    Reply
  • Simon Cox 1 year ago

    Sometimes you will find you cannot crawl a site properly, or your crawl has pages that you know have been recently deleted. Some hosting providers use a prerender cache that restricts user agents (hello Netlify – who are wonderful), so try changing the user agent to Chrome, as the hosts tend to treat Chrome differently and your crawl should run properly.

    Reply
  • Joe Leyba 1 year ago

    List Mode + GSC API FTW!

    Reply
    • CASTAING Brice 1 year ago

      So many useful features in SF have been added year after year. I could say my best tip would be the use of Custom Search and Extraction – the best way I know to find price or stock issues on an e-commerce site, to find unencrypted emails, or to get a full view of Hn usage on a page, for example. But what I like most about SF is the fact that your team keeps working on the scraper after so many years, and you always find ways to make a part of SEOs’ work so much easier!
      SF is any SEO’s best friend!

      Reply
  • Juan Reino 1 year ago

    Custom extraction, it has to be it. It’s just the best way to do competitors analysis and get ideas to improve your own content.

    Reply
  • Matt Tutt 1 year ago

    Use the Custom Search function to search for tracking code instances (eg GA4 tracking code) to ensure a site has correctly installed the code on every page of the site.

    Reply
  • Lauren Turner 1 year ago

    I love seeing I’m not the only one who uses the custom search/extraction tool! I can’t tell you how many times I’ve used it to identify all pages experiencing a particular issue with a component or a bug. Being able to provide an estimated traffic or revenue impact to these bugs has made me a better SEO.

    Being able to auto export a crawl into a custom built Looker Studio template is a close second for me. For years I’ve worked in a silo from my devs. With these automated export features we have a shared dashboard that makes Tech SEO more visible to the business and brings SEO into every dev conversation.

    Keep it up Screaming Frog!

    Reply
  • Ryan Webb 1 year ago

    Screaming Frog picks up the pieces that the Wayback Machine can’t help with. Using historic crawl analysis, I was able to pinpoint the exact date a change was made, and what that change involved, that saw a dip in performance (we are talking early 2022). WB only offered two dates that were 3 weeks apart.

    It was a lifesaver and cemented to our Finance Dept the usefulness of, and need for, Screaming Frog.

    Reply
    • Ash Nallawalla 1 year ago

      +1 for using the compare function in large companies where many teams influence the content. Often provides an educational opportunity.

      Reply
    • Md Jalal 1 year ago

      Sure, here is one of my favorite Screaming Frog SEO Spider tips:

      Use the “List Mode” to crawl a list of URLs. This can be helpful if you want to crawl a specific section of your website, or if you want to crawl a list of external websites. To use List Mode, simply click on the “List Mode” tab and enter the list of URLs that you want to crawl.

      Here are some other Screaming Frog SEO Spider tips that I find helpful:

      Use the “Configuration” tab to customize your crawl. You can choose which pages to crawl, how deep to crawl, and what data to extract.

      Use the “Filters” tab to narrow down your results. You can filter your results by status code, page type, and other criteria.

      Use the “Reports” tab to generate reports on your crawl data. These reports can help you identify SEO issues and track your progress over time.

      Use the “Export” tab to export your crawl data to a file. This can be helpful if you want to share your data with others or if you want to use it in another tool.

      I hope these tips help you get the most out of Screaming Frog SEO Spider!

      Reply
  • Florian Elbers 1 year ago

    Use XPath in Custom Extraction to scrape more content, e.g. “//h4” for all H4 headings, or even count extractions, e.g. “count(//h4)”.
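
    For testing such expressions outside the SEO Spider first, a quick sketch with Python and lxml (the URL is just a placeholder):

    # List all H4 headings on a page and count them, mirroring //h4 and count(//h4).
    import requests
    from lxml import html

    tree = html.fromstring(requests.get("https://example.com/").content)
    print([h.text_content().strip() for h in tree.xpath("//h4")])  # every H4 heading
    print(tree.xpath("count(//h4)"))                               # number of H4s (a float)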

    Reply
  • Corinna 1 year ago

    My favorite tip is scheduled crawling – especially when there are many brands with multiple language country versions to be checked at regular time intervals. I love the export feature so I can find and review everything in one place. Incredible time saver – love it!

    Reply
  • James H 1 year ago

    What’s better than Screaming Frog?

    2 Screaming Frogs

    To open SF multiple times on Mac, open ‘Terminal’ and paste: open -n /Applications/Screaming\ Frog\ SEO\ Spider.app

    Your life just got 2 x better ☕️

    Reply
  • Julian Meister 1 year ago

    My favorite tip:
    Visualisations and especially the Force-Directed crawl diagram. Really great to get a good first look/feeling for the website and clients are usually blown away when they see their site displayed that way. IMO one of the most underutilized features of Screaming Frog.

    Reply
    • Maja 1 year ago

      My favorite tip is using the supercool API options that are available with Screaming Frog. You get various URL data plus data from GSC, GA and a tool such as Ahrefs, so everything is in one place and ready to be exported! Perfect for an internal linking audit. Plus, connecting GSC and GA APIs coupled with crawling sitemaps and crawl analysis as the final step will help you get your hands on those poor little orphan pages!

      Reply
  • I love the quick filter for sorting images by kilobyte size (I just found a 5 MB picture in an SEO audit yesterday).
    Replacing these images with web-optimised ones is really one of the fastest ways to boost user experience, save bandwidth and make the site faster.
    It also helps with smaller backup sizes ;)

    Reply
  • Kevin Mui 1 year ago

    I love using the link scoring algorithm to determine the “health” of our on-page SEO! Internal linking helps a lot with our SEO strategy.

    Reply
  • Stephan 1 year ago

    If you’re in e-commerce, use custom extraction to extract the number of products on your category pages.

    Also, list mode + redirect chains is great for site migrations.

    Reply
  • Alejandro Bernal 1 year ago

    Searching with XPath can be intimidating, but the frog @screamingfrog makes your life easier: #SFTipsForTees

    Reply
  • Jess Joyce 1 year ago

    My favourite tip for Screaming Frog at the moment (as it always changes and evolves as the ultimate multi-tool) is using sitemaps to crawl and compare URLs in migrations.
    Upload both sites to Screaming Frog, pull out the status code of every URL, and then match them all up with spreadsheets. I’m sure there’s a faster way to do this, but I love being able to look at all the URLs for spot checks. I just completed one where the client renamed a bunch of URLs that had registered symbols in them (converted to ASCII characters in the URLs), and they were able to update them to be more SEO-friendly as well.
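
    For the matching step, a rough pandas sketch (assuming two “Internal: All” exports that each contain “Address” and “Status Code” columns; older exports put a title row first, so add skiprows=1 if needed):

    # Join the old-site and new-site crawls on URL path so status codes can be
    # compared side by side for spot checks.
    from urllib.parse import urlparse
    import pandas as pd

    old = pd.read_csv("old_site_internal_all.csv")
    new = pd.read_csv("new_site_internal_all.csv")

    for df in (old, new):
        df["Path"] = df["Address"].map(lambda u: urlparse(u).path)

    merged = old.merge(new, on="Path", how="outer", suffixes=(" (old)", " (new)"))
    merged[["Path", "Address (old)", "Status Code (old)", "Address (new)", "Status Code (new)"]].to_csv("migration_check.csv", index=False)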

    (also Screaming Frog I would pay you for a sweater. Give you money for one!! Seriously have been FOMO-ing for one since you ran the last giveaway so hit me up – happy to support you all truly)

    Reply
  • Stefan Persson 1 year ago

    Activating follow redirects and exporting the redirect chains report is a golden timesaver when cleaning up bad migrations!

    Reply
  • Bill Rubosky 1 year ago

    Use Custom Search to find those deeply buried nuggets of outdated text. You can even search PDF text if you check “Store PDF” under Config > Spider > Extraction.

    Reply
  • Murtaza Asgerali 1 year ago

    I use the Screaming Frog tool for technical SEO purposes – especially finding broken links and seeing which anchor text is sending the 404 error message.

    Reply
  • Custom search extraction, and exporting it to a spreadsheet, is also my favourite feature. I’m still learning to work with your tool as I’m not officially in charge of SEO (but somebody has to fix stuff), and these comments gave me some great ideas! (and also a reminder that I need to refresh my knowledge). What I mostly use it for is to check that all the metas, alts, titles, H1s and URL statuses are fine for Google’s taste :) (purple is my fav as I’m the spreadsheet queen) o.O

    Reply
  • Joakim 1 year ago

    My best tip is to make Excel “templates” with important values set via conditional formatting levels, e.g. for the “Crawl Overview” report or the “Internal HTML” export, to quickly visualise errors or possible problems. Just export as CSV and import into your template Excel document.

    Reply
  • Juan Caballero 1 year ago

    We all know (and have likely been impacted by) Google’s increased use of features like “People Also Ask” and other snippets, to retain users on the search engine.

    So the key tip for me right now is to audit the structured data on your website. You can configure SF to extract structured data during the crawl and compare your implementation against Google’s and Schema.org’s guidelines. Pretty cool feature.

    Reply
  • Lexy Hannon 1 year ago

    My favorite tip is to make sure you connect the free APIs before crawling! Simple but effective. I also love to export my crawl and then grab the “easy wins/fixes” first – like sorting the H1-2 column by character count, largest to smallest; any pages you find there should be fixed, since it’s best to only have one H1. Easy win!

    Reply
  • Edwin Acevedo 1 year ago

    I’m not a very complicated user, so I likely won’t win. I do love this tool, though. Do you have a store where I can just buy your shirt? How do I give you my money?

    I just did this, so: To find out which pages contain images that need optimization, run a crawl and click the Images tab in the top-left pane, then the Over 100 KB dropdown. Select all images. In the lower-left pane, select the Inlinks tab. Export, open in Google Sheets and save To, From and Alt Text columns and delete everything else. Give the list to graphic designer for optimization.

    Reply
    • Sarah Harradine 1 year ago

      Edwin, I’ve been using Screaming Frog for years and I’ve never done this. We can always learn off each other, no matter how complicated we think our work is, or isn’t!

      Reply
      • Edwin Acevedo 1 year ago

        Ah, thanks so much, Sarah! Everyone else seems so fancy with their APIs and custom extractions and whatnot.
        I just would love the shirt haha. (If you’re reading this, Screaming Frog, size large, please.)

        Reply
  • Raghavarao 1 year ago

    I appreciate the custom search feature. I’ve used it a couple of times to crawl webpages containing certain text or groups of words. The other feature I love is finding broken links on a website: I use the bulk outlinks report, filter the status code to 404, and filter for the domain’s pages only. That way we can find broken internal links present on a page; this option also helps to find webpages with internal links pointing to x or y landing pages.

    Reply
  • My favourite Screaming Frog tip used most recently has been to crawl JavaScript content and find pages with the highest JavaScript content percentages!

    Reply
  • Gaetano Romeo 1 year ago

    My favorite tip for using Screaming Frog SEO Spider is to utilize the custom extraction feature. This powerful functionality allows me to extract specific data from web pages, making it incredibly useful for various SEO tasks.

    Reply
  • Joachim Almeke 1 year ago

    Possibly the different API connections. My favorite is the connection to GSC, to easily find out which pages are not indexed, but also to be able to see which pages have problems. Another frequently used API connection is with PageSpeed Insights, to get the status of Core Web Vitals at page level and improve performance.

    Reply
  • Shantanu Ramteke 1 year ago

    My favorite tip on Screaming Frog SEO Spider is the List mode.
    You can switch easily in list mode and upload your selected URLs and get them crawled in a span of seconds.
    This mode has always helped me get the desired data quickly, and you can also generate a custom XML sitemap with selected URLs.
    Isn’t this an amazing feature?

    Reply
  • Nick 1 year ago

    SERP crawls to get up-to-date snapshots of Google results

    Reply
  • Extract the parts of a site you want with Xpath! Process, interpret or use the extracted data!

    Reply
  • One of my favorite tips is using the Include/Exclude configurations. Go to the Configuration section of Screaming Frog, and choose Include or Exclude. This allows you to enter a specific URL or directory to include or exclude in your crawl. If you have a large website or only want to dig into certain sections, this functionality saves crawling time and uses less resources, giving you only the insights you need to dig deeper.

    Reply
  • Hi there! When using Screaming Frog SEO Spider, I make sure to utilize the ‘Custom Extraction’ feature. It allows me to extract specific data from web pages, such as meta titles, descriptions, or even structured data like Schema markup. This is extremely useful for auditing and analyzing our website’s on-page optimization. Happy crawling!

    Reply
  • Sarah Harradine 1 year ago

    If your custom extraction isn’t working as expected, then check JavaScript crawling is switched on. The xpath you copied might have been sat within some script. Would looooove a minty green or yellow tee

    Reply
  • Lok Sam 1 year ago

    If you don’t have the greatest computer for handling crawls of big (e-commerce) sites, a pro tip is to set up a remote desktop and to use Screaming Frog there. Say bye to computing strain on your physical laptop!

    Screaming Frog specific tip as many have already called out is Custom Extraction. I love using this to extract page type specific elements in order to quickly identify different page types (e.g. reviews element for PDPs, sort-button for PCPs, and author for Blogs). Just one of many ways of using this versatile feature.

    Reply
    • Stephanie Miller 1 year ago

      I like to use SF for website restructure or migration projects of all kinds. I get a full crawl beforehand, during staging, and after launch so I can compare. This helps me map redirects and make sure all meta data and resources are migrated. It’s also great to know which problems existed before the project versus only after.

      Reply
  • Ash Nallawalla 1 year ago

    I run SF on an AWS virtual machine instance inside our corporate perimeter, so that it does not suffer the barriers we place on bad actors. The excluded zone includes work-from-home people like me, so the VM lets me crawl faster.

    Reply
  • Mazen Aloul 1 year ago

    My favourite SF tip is to run two different crawls, one with JavaScript rendering turned on and the other without JavaScript rendering. Once both crawls are complete, I use the “Compare” mode of SF and see all the differences between the HTML and JavaScript versions of the website! Helps uncover issues that are not visible in a single crawl.

    Reply
  • Ralf 1 year ago

    How to extract or count almost every content element from websites:
    https://www.screamingfrog.co.uk/web-scraping/

    #SFTipsForTees #Scrapingfrog
    #Countofcounting #Xtraxtraction
    #Frogfashion #Geilesshirt
    #Screamingfrog

    Reply
  • Korab Emši 1 year ago

    My favourite Screaming Frog feature is API access.
    I can do so much for clients when showing them data from Google Search Console and Google Analytics.
    So instead of looking into GSC and GA, I can get them a list of pages with traffic, clicks, indexation status, last crawl date, backlinks and so much more.
    It would be good if you could add the SEMrush API as well, so we can gather all backlinks from all tools in one place :)

    Reply
  • Sean Devlin 1 year ago

    Use the redirect chain report to audit 301 redirect maps. Really handy for verifying 301 redirect maps to ensure there are no redirect loops, 404 errors etc. being produced. You can cross-reference your intended redirect destinations with the final URLs within the report to confirm everything matches and get the response code. Brill for validating implementation!

    Reply
  • Nicolas Montanares 1 year ago

    There are numerous powerful features in Screaming Frog, but one often overlooked gem is the Visualisations > Force-Directed Directory Tree Diagram. It’s a game-changer! Customize it to highlight 404s, redirects, or nonindexable URLs for a comprehensive bird’s eye view of any website. Impress your peers with your mad scientist skills while exploring websites for your own amusement (looks really cool – will for sure impress others)!

    Hope I win! If I do any other color but pink please :)

    Reply
  • Maddie Deane 1 year ago

    The crawl comparison feature has been a godsend! I often see user-agent sniffing happening on websites, and before, it was quite a manual task to review the differences between HTML and JavaScript; now you can view it all straight away, with all the stats! It has made analysis so much easier.

    Reply
  • Bhushan Dhake 1 year ago

    One of my favorite features of SF is the custom extraction, which provides an effortless method to determine if certain elements (code, words, links, etc.) exist within specific pages.

    Reply
  • Nicolò 1 year ago

    Hi everyone!
    My favourite Screaming Frog SEO Spider tip is about structured data. Structured data needs to be validated and we can use SF to do this. In order to do this, navigate to Spider -> Extraction in the menu. Here you can select JSON-LD, Microdata, and RDFa. After the program has done its work, we can take a look under Overview -> Structured Data to see if there is anything that needs to be fixed. Cheers!

    Nicolò

    Reply
  • Grzegorz 1 year ago

    The way you can extract things with XPath – amazing. Also, big thanks to Jonathan Moore for the talk that inspired me at this year’s Brighton conference.

    #xpathtodiscovery

    Reply
  • Adam 1 year ago

    Easy: Custom extraction. Can’t comprehend how much time this feature saved me :) Thx <3

    Reply
  • Medha Dixit 1 year ago

    My favourite Screaming Frog feature is being able to patch the Google Search Console API Data within a site crawl, and having it mapped to all of a Site’s URLs. This includes metrics such as clicks, CTR, avg position, indexability, rich results status and much more. Helps turbocharge my Site audits. Awesome job with the API integrations SF Team!

    A close second is the Inlinks & Outlinks analysis which can be super insightful for eCommerce client sites. Cheers & thanks for your hard work.

    Reply
  • Miguel 1 year ago

    Custom Extraction -> XPath for breadcrumbs -> Text. Then go to Excel and build a pivot table. Easiest way to group URLs :)

    Reply
  • Erik 1 year ago

    Hi there!
    My favorite tip with Screaming Frog is custom extraction: an easy way to find out if something (some code, a word, some links…) is present on certain pages!

    Reply
  • Patrick 1 year ago

    Screaming Frog SEO Spider is a fantastic tool for SEOs and digital marketers. Here are five of my favorite tips for using it:

    Custom Extraction: The Custom Extraction feature allows you to scrape virtually anything from the HTML of a webpage using CSS Path, XPath, or regex. This can be a lifesaver when you’re looking for specific pieces of information that the standard Screaming Frog crawl doesn’t provide.

    API Integration: Screaming Frog allows you to connect directly to the APIs of Google Analytics, Google Search Console, and Ahrefs. By linking these services, you can pull in valuable data directly into your crawl results, providing a more comprehensive view of each URL.

    Crawl Comparison: This feature is handy for spotting changes before and after site migrations or significant site updates. You can compare two crawls, and Screaming Frog will highlight the differences between them, showing new, missing, or changed elements.

    Visualization: Screaming Frog has a ‘Crawl Path Report’ and ‘Visualizations’ feature that shows you the architecture of a website in a more visually friendly way. These diagrams can be very useful for spotting structural issues or just getting a better understanding of a site’s layout.

    Saving and Re-uploading Crawls: You can save the results of a crawl to review or use later. This is particularly useful if you’re running large crawls that take a long time. You can simply save the results, close Screaming Frog, and re-open the results later when you have more time to analyze them.

    Bonus tip! If you aren’t a coder or anything (like me), utilize this data in excel or sheets and learn how to create reports that show your previous crawl data and how it compares to current data. And always make sure to pull your crawls with the apis connected to Google and save that data. It may come in useful one day.

    Reply
  • Matt 1 year ago

    Quick tip: SF isn’t just for SEO. Need to quickly identify all the elements of a certain type on your website? Use the custom extraction function.

    We’ve used this SF function on a customer’s site in order to correctly identify and track all of the forms via Google Tag Manager.

    Reply
  • Benedetta Romano 1 year ago

    A cool SF tip to extract all the URLs in an XML sitemap in no time:

    List mode > Download XML Sitemap > enter the sitemap URL > the list of URLs is generated as “Found + URL” > don’t press “OK”, but copy and paste the contents of the box > in Excel, use “Find and Replace” to remove the word “Found” before each URL: there you have all the URLs contained in a sitemap in no time!
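
    If you’d rather skip the Excel step, a small Python sketch that pulls the same list straight from the sitemap (the sitemap URL is a placeholder):

    # Fetch an XML sitemap and print every <loc> URL it contains.
    import requests
    import xml.etree.ElementTree as ET

    resp = requests.get("https://www.example.com/sitemap.xml", timeout=30)
    root = ET.fromstring(resp.content)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    for loc in root.findall(".//sm:loc", ns):
        print(loc.text.strip())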

    Reply
  • Mike 1 year ago

    One of my fav uses of SF is looking for link opportunities across a website: doing a custom extraction with an XPath generated in a spreadsheet containing a list of keywords to look for in specific page templates or copy sections, plus another formula to check whether there’s already a link using any of those keywords as anchor text, so those can be excluded. Et voilà.
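
    A minimal sketch of generating those expressions from a keyword list (the XPath pattern is illustrative and case-sensitive):

    # Build one Custom Extraction XPath per keyword: paragraphs that mention the
    # keyword but do not already contain a link whose text mentions it.
    keywords = ["seo spider", "log file analyser", "crawl budget"]

    for kw in keywords:
        xpath = f"//p[contains(., '{kw}') and not(.//a[contains(., '{kw}')])]"
        print(f"{kw}: {xpath}")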

    Reply
  • Axelle 1 year ago

    The continuous enhancements to Screaming Frog over the years have truly made it a standout SEO tool. My go-to feature has to be the Custom Search and Extraction. It’s a lifesaver when you need to identify pricing or inventory discrepancies on an e-commerce site, spot unprotected emails, or get a comprehensive view of a page’s Hn usage.

    However, what truly sets Screaming Frog apart is your team’s unwavering commitment to refining and expanding the capabilities of this scraper, year after year. It’s this commitment that keeps simplifying and streamlining the SEO process for us, making our jobs easier and more efficient.

    In essence, Screaming Frog isn’t just a tool, it’s a trusted companion in the journey of any SEO professional!

    Reply
  • Vasileios Mylonas 1 year ago

    Your UI makes it easy for us to explain the errors discovered to our customers. Thank you!

    Reply
  • David 1 year ago

    Screaming Frog SEO Spider is indeed a remarkable tool in the hands of SEO experts and digital marketers. Here are my five favorite tips for maximizing its potential:

    Embrace Custom Extraction: This invaluable feature empowers you to scrape virtually any data from a webpage’s HTML using CSS Path, XPath, or regex. This proves to be a boon when you need specific information that a standard Screaming Frog crawl doesn’t fetch.

    Harness API Integration: One of Screaming Frog’s strengths is the ability to integrate with the APIs of Google Analytics, Google Search Console, and Ahrefs. By doing so, you can infuse valuable data directly into your crawl results for a more thorough analysis of each URL.

    Utilize Crawl Comparison: This feature is instrumental in identifying changes post-site migrations or major site updates. By comparing two crawls, Screaming Frog brings to light the differences between them, exposing new, missing, or altered elements.

    Leverage Visualization: The ‘Crawl Path Report’ and ‘Visualizations’ feature of Screaming Frog provides a visually engaging depiction of a website’s architecture. These diagrams serve as great tools to detect structural flaws or to gain a more profound understanding of a site’s layout.

    Save and Reload Crawls: The ability to save crawl results for future review or use is particularly handy when running large, time-consuming crawls. You can simply preserve the results, shut Screaming Frog, and reopen the results later for a more relaxed analysis.

    A bonus tip for those who aren’t tech-savvy! Make the most of this data in excel or sheets and learn to formulate reports that contrast your previous crawl data with the current one. Don’t forget to connect your crawls with the APIs to Google and preserve that data. You never know when it might come in handy!

    David from https://www.noiise.com/agences/lille/seo/#https://www.screamingfrog.co.uk/

    Reply
  • Katie Mishner 1 year ago

    We’ve been using ‘Custom Search’ to help clients who are scaling their website internationally and need to correct certain spellings, i.e. from UK to US (colour to color). It helps us to locate which pages we need to update, rather than looking through them manually.

    I’ve also used custom search within page text to demonstrate the difference between client websites and competitors with the rate of specific keyword mentions, e.g. on 50% of your pages, vs the 60% of the client’s website.

    Basically, up the custom search feature!

    Reply
  • Cauê 1 year ago

    One feature of Screaming Frog that I find valuable and very useful is the structured data analysis feature. With this tool, I can easily identify if the structured markup code is correctly implemented on the pages. I use dynamically protected data a lot on programmatic websites and because of that this tool is fantastic.

    Reply
  • Frank van Dijk 1 year ago

    Using Screaming Frog to look for new internal linking possibilities. Love using the custom search option

    Reply
  • Vishal Solanki 1 year ago

    The Screaming Frog SEO Spider is a powerful tool for website analysis and SEO optimization. Here are some tips to help you make the most of it:
    Exporting crawl data: export crawl data in various formats, including CSV, XLSX and Google Sheets. This makes it easier to share and analyse the data using other tools or platforms.
    Crawl JavaScript-rendered content: enable the “JavaScript Rendering” feature to crawl pages that rely heavily on JavaScript. This allows you to identify and optimise any SEO issues that may arise with dynamically generated content. This is one of the most useful features (https://prnt.sc/6486PsnveM5D) and is extremely helpful.

    Reply
  • Thomas Harvey 1 year ago

    My favourite tip has to be rendering pages from the sitemap (or just general crawling) – such a great way of seeing the site in bulk and spotting things you might not otherwise see. Great for UX, SEO, or just generally for areas of a site that are weak.

    Reply
  • Marco 1 year ago

    For a website analysis: First tip for every beginner; make sure you use the API for connecting Analytics and Google Search Console. The data will be so much more useful.
    Export, and upload it to a Google doc.
    At the same time export the internal 301, internal 404 reports, and orphan pages. And add them to new tabs in the spreadsheet.
    Start by checking missing titles and H1s, then duplicate titles and H1s.
    Fix all internal 404 errors, unnecessary 301 redirects, and why there are orphan pages.

    Use the spreadsheet to find pages with missing H2s, and low word count.

    Also make sure you don’t forget to check whether pages are getting any impressions in Google.

    Once you’ve got the hang of this, start looking at other powerful options.

    And if you ever run into crawling issues, try to change the crawlbot to Google Mobile, and see if that fixes the crawling problems.

    From an SEO campaign’s perspective, SF is such a helpful tool.

    Love from a long time user.

    Reply
  • Nayanaba Gohil 1 year ago

    SF is an excellent tech SEO audit tool – a must in your marketing toolset.

    Here’s a quick tip for on-page optimization using SF:

    – Use Screaming Frog to export all pages with short, long, or missing titles & descriptions.
    – Use ChatGPT + Sheets to help generate the metadata in bulk.
    – Make the changes live (Most CMS support bulk updating metadata.)

    Now, that’s a major on-page SEO improvement made possible only using Screaming Frog.

    Reply
  • Quentin 1 year ago

    I use Screaming Frog every day! The visualisations, and in particular the force-directed crawl diagram, are rather interesting for understanding the structure of the site and what is wrong.

    Reply
  • Lee 1 year ago

    So many good tips I could share, but keeping it simple: if you don’t have the resources for a dedicated server running SF to crawl big sites, make sure you uncheck External Links from the crawl and add an exclusion for .*=.*. That will cut the number of URLs being crawled down massively!
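
    To illustrate what that exclude pattern catches, a quick Python check (the URLs are made up):

    # The exclude .*=.* skips any URL containing "=", i.e. parameterised URLs.
    import re

    exclude = re.compile(r".*=.*")
    urls = [
        "https://example.com/shoes/",
        "https://example.com/shoes/?colour=red&size=9",
        "https://example.com/search?q=frogs",
    ]
    for u in urls:
        print("excluded" if exclude.match(u) else "crawled ", u)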

    Reply
  • Ooni 1 year ago

    Always been a huge fan of using the tool for internal linking audits/strategy forming. Visualisations –> Crawl Tree Graph, so simple but gives you an amazing visual cue for how your website is structured!

    Reply
  • Alexandra Zieniewicz 1 year ago

    Looking for missing title tags, meta descriptions and duplicate H1s. As an agency, showing potential clients that they are missing out on the simple stuff is an easy win!

    Reply
  • Best tip: explore the crawl summary and find orphan pages and links between pages that are hard to find otherwise.

    Reply
  • Floris Gouw 1 year ago

    Please check my (Floris Gouw) comment on https://www.linkedin.com/feed/update/urn:li:activity:7081945151626997760?commentUrn=urn%3Ali%3Acomment%3A%28activity%3A7081945151626997760%2C7082040561590775808%29

    There was a large LinkedIn discussion on how to find internal linking possibilities on your website. They were using the “site:” operator, but I suggested using the Custom Search function in Screaming Frog to search for a specific keyword within the page text of all of your sitemap.xml URLs. You can share it with your content team, which gives them a nice overview, and they would be able to insert the links.

    Besides:

    Important comment from Google on this: please check this interesting article below with comments from John Mueller, where he explains that using “site:” doesn’t always cover all of the indexed pages of your website:

    https://www.searchenginejournal.com/google-a-site-search-doesnt-show-all-pages/416662/

    Therefore, I recommend using your sitemap OR checking Google Search Console index coverage, crawling those pages with Screaming Frog, and checking which pages you can use for internal linking.

    Reply
  • Amy 1 year ago

    Too many things to name, but maybe the best is being able to say “Screaming Frog” (as often as I can) in serious conversations with executives.

    Reply
  • Alex Entwistle 1 year ago

    Building an automated Technical Health dashboard in Google Data Studio (now Looker Studio) using automated scheduled Screaming Frog crawls in headless mode, hooked up to Google Cloud

    Reply
  • Michelle 1 year ago

    We love using custom extraction in Screaming Frog for blog inlinks! It can be incredibly useful for understanding the internal linking structure and optimising the SEO of your blog. By extracting specific information such as anchor text, target URLs, or even the number of inlinks to each blog post, you can gain valuable insights into how your content is interconnected and identify opportunities for improvement. We also extract content from the pages, paste this into Google Spreadsheets and then search within on-page content for specific keywords so we can find related content so easily!

    Reply
  • Henning 1 year ago

    One of my favorite tricks is to set existing session cookies from my browser via Configuration > HTTP Header. This allows you to crawl pages in the “logged in” state where other types of login such as basic auth are not possible.

    Reply
  • ohgm 1 year ago

    Screaming Frog SEO Spider helps me cause 503 errors.

    Reply
  • Dennis Jorgensen 1 year ago

    My tip is that automated crawls are a flood of gold that is often overlooked.
    You can pull daily crawls, export data to Sheets, and do all kinds of automation and alerting based on this – especially with a bit of Google Apps Script. Or you can load it into Looker to have some nice dashboards.
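
    As one rough sketch of that sort of alert (in Python rather than Apps Script, with made-up file names and assuming “Internal: All” exports with “Address” and “Status Code” columns):

    # Compare today's scheduled crawl export with yesterday's and flag URLs
    # that have newly started returning 4xx errors.
    import pandas as pd

    today = pd.read_csv("crawl_today/internal_all.csv")
    yesterday = pd.read_csv("crawl_yesterday/internal_all.csv")

    bad_today = set(today.loc[today["Status Code"].between(400, 499), "Address"])
    bad_yesterday = set(yesterday.loc[yesterday["Status Code"].between(400, 499), "Address"])

    new_errors = sorted(bad_today - bad_yesterday)
    if new_errors:
        print(f"{len(new_errors)} new 4xx URLs since yesterday:")
        print("\n".join(new_errors))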

    Reply
  • Georgi 1 year ago

    Set your crawl speed, because the Sysadmin will kick your… :)

    Reply
  • Xavier 1 year ago

    My favorite usage of Screaming Frog is to check the redirects of my site to monitor each change.

    Reply
  • Michael Cortez 1 year ago

    API integration 100%. Being able to quickly extract important metrics from GA, GSC, Ahrefs, and more. This streamlines the process for analyzing a site’s performance and saves so much time. It’s hard to pick one. But API integration for the win!

    Reply
  • Evgeniy 1 year ago

    My favorite tip for using SF as a mighty scraping machine is to use a proxy gateway as the proxy, for unlimited IP rotation.

    Reply
  • Max W 1 year ago

    Thanks guys, the things I would do to own one of these t shirts, honestly… I hope I’ve not missed the deadline!

    My favourite tip / thing to do with SF is: run a site audit, filter by 4xx & 3xx URLs, and export the inlinks section for a list of pages that require internal link fixes.

    Bonus! I’m a big fan of searching for orphan pages by combining SF, GA and GSC data.

    Love your work guys!

    Reply
  • Tyler Wilhelm 1 year ago

    One of my favorite tips that I’ve learned was how to efficiently use the ‘Custom Search’ tool within Screaming Frog, specifically when it comes to finding internal linking opportunities between new or existing pages of a website.

    When it comes to optimizing pages for SEO and determining the primary and secondary keywords for the page, the ‘Custom Search’ functionality allows me to quickly scan an entire website for the term(s) I’m wanting to build internal links towards in an SEO-friendly way.

    #SFTipsForTees

    Reply
  • Nick 1 year ago

    Hi there,

    Locate Orphan Pages: In order to find pages on your website that are not linked internally, you can utilize the “Internal” filter option available in the “Filter” menu. This tool assists you in identifying pages that may lack internal links and enables you to enhance their visibility within your website’s structure.

    I also use Custom Extraction almost every day (in combination with Excel). Custom extraction refers to the process of extracting specific data elements from a website by creating customized rules or patterns. This technique allows you to extract relevant information tailored to your specific needs, which can be particularly useful when dealing with large amounts of data.

    Reply
  • Ben 1 year ago

    Use custom search to build in in-linking workbooks!!

    Reply
  • Charles 1 year ago

    There are dozens of possible uses for Screaming Frog, but the one I prefer is the use of the word counter per page. Coupled with the crawl comparison functionality, it allows you to compare the volume of words on a page before and after migration, for example. Moreover, Screaming Frog is an essential tool for supporting site migration or graphic redesign, which can sometimes have harmful effects on SEO.

    Reply
  • Daniel Beddows 1 year ago

    Update the Frog when the Frog asks you to! You might just be pleasantly surprised by the latest feature.

    Reply
  • Ralf 1 year ago

    With the extraction mode you can scrape and count nearly all website elements. When additionally pulling GSC data via the API, you easily get many columns of valuable data to check for correlations, e.g. word count x clicks.
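
    As a minimal sketch of that check (assuming an “Internal: All” export where the GSC API has added a “Clicks” column alongside “Word Count”):

    # Correlate on-page word count with Search Console clicks across crawled URLs.
    import pandas as pd

    df = pd.read_csv("internal_all.csv").dropna(subset=["Word Count", "Clicks"])
    print("Pearson r:", df["Word Count"].corr(df["Clicks"]))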

    Reply
  • Can 1 year ago

    Turn on Javascript rendering to get more information

    Reply
  • Suzie Schmitt 1 year ago

    There are SO many features I could talk about my love for, but the easy tie-in to data studio is truly amazing. It immediately gives us an actionable dashboard that can double as a presentation tool when explaining technical issues to clients.

    Reply
