If you actively work on your own website or maintain websites for clients, you will occasionally run into broken links. This can happen when you unintentionally change the addresses of subpages and forget to update the links pointing to them from other parts of the site, or when you link to somebody else’s website that has changed its structure in the meantime.
Greenflare, a simple freeware program, will help you reveal such problems. It searches not only for broken links but also for images, JavaScript files and CSS stylesheets that fail to load. In short, it is a flexible tool for checking a website.
There are other similar tools. Screaming Frog SEO Spider, mentioned in the lead paragraph, is considered something of a benchmark in this area. Still, I think Greenflare can be a fully-fledged replacement for most of you: it is free, has no limits, and runs well on Windows, Linux and macOS.
How to check links on a website with Greenflare
After downloading the app from its website, just launch it and, before you crawl a website for the first time, look through the options offered in its four tabs:
Crawl – once you have gone through the settings in the other tabs, return to this tab, enter the URL of the website’s homepage and start the check.
Settings – here you choose which components of the website will be checked. I recommend also turning on checking of images, CSS and JavaScript; at least on macOS these options were not enabled by default.
Exclusions – Greenflare automatically scans the whole website, provided everything is linked correctly. There may still be addresses on the site you don’t want checked, and this is where you list them.
Extractions – here the program offers a rather low-key but interesting feature: extracting data from the website being checked. More on this later.
Once the check has been launched and finished, Greenflare produces a list of the addresses it has scanned. As in rival apps, each entry shows a textual status and the numeric HTTP status code, for example 200 OK or 404 Not Found.
You can sort the list by right-clicking a column heading, or quickly filter it by various criteria using the View menu. The displayed view can be exported to CSV via the File menu, so the data can also be processed in other apps.
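If you want to post-process the export, a few lines of Python are enough. This is only a minimal sketch: the file name greenflare_export.csv and the column names url and status_code are assumptions for illustration, so adjust them to match your actual export.

```python
import csv

# Minimal sketch: print all crawled URLs whose HTTP status indicates a problem.
# Assumes a file "greenflare_export.csv" with "url" and "status_code" columns;
# rename these to match the columns in your own Greenflare export.
with open("greenflare_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        try:
            status = int(row["status_code"])
        except (KeyError, ValueError):
            continue  # skip rows without a usable status code
        if status >= 400:
            print(status, row.get("url", ""))
```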
How to extract data from websites
In the Settings tab, Greenflare lets you turn on checking of the title, meta description, h1 and h2 tags. If you want to capture the content of other page elements, add it to the table of scan results and then, for example, export it to CSV, switch to the Extractions tab.
That is where you use a CSS selector to specify the element whose content Greenflare should put into the report. If the CSS selector matches more than one element, the content of the first match is used, and any HTML tags inside it are stripped.
The extracted data appears in the last columns of the report, and you can of course set up as many extractions as you need. This way you can collect, for example, article authors’ names, publication dates or Open Graph tags for social media.
As easy as CSS selectors are, Greenflare doesn’t limit you to them: you can also specify an element using XPath. That lets you extract practically anything, from a tag to an attribute value, anywhere on the page.
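To get a feel for the difference, here is a small Python sketch (using the lxml library, outside of Greenflare) that runs the same kind of expressions you would type into the Extractions tab. The span.author-name class and the sample HTML are invented for illustration; the og:image meta tag is a standard Open Graph property.

```python
from lxml import html

# Illustrative page fragment; the class name is invented for this example.
page = html.fromstring("""
<html><head>
  <meta property="og:image" content="https://example.com/cover.jpg">
</head><body>
  <span class="author-name">Jane <b>Doe</b></span>
  <span class="author-name">John Roe</span>
</body></html>
""")

# CSS selector (requires the cssselect package): as described above, take the
# first matching element and strip the HTML tags inside it.
author = page.cssselect("span.author-name")[0].text_content()
print(author)  # "Jane Doe"

# XPath can reach attribute values directly, e.g. the Open Graph image URL.
og_image = page.xpath("//meta[@property='og:image']/@content")
print(og_image[0] if og_image else "not found")
```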
I strongly recommend trying Greenflare out.