
TECHNICAL

SEO AUDIT
www.cedar-rose.com

Ross Britten

Cedar Rose | March 2017


Table of Contents
Introduction
1. On-Page and Hygiene
   Title Tags
      Missing H1 Tags
      Duplicate H1 Tags
      H2 – H6 Tags
   Canonical Tags
      Missing Canonical Tags
   Page Titles
      User and SEO-friendly Page Titles
   Meta Descriptions
   Alt Text
      Image Indexing and Keyword Targeting
2. Crawlability
   CSS Delivery
      Multiple CSS Files
      Render-Blocking CSS
         Critical Render Path
         Labelling CSS
         Combining Multiple CSS Files
      Inline Above-Fold CSS
      Minify CSS
   JavaScript Delivery
      Render-Blocking JavaScript
      Multiple JavaScript Files
      Minify JavaScript
      Deferring JavaScript
3. Technical
   URL Structure and Taxonomy
      Duplicate URLs
      Parameter URLs
      Company List URLs
      Product URLs
         Solutions Navigation
         Core Products
         Solutions: Companies vs Individuals
   Empty Subdirectories
   Homepage Cannibalisation
   Internal Linking
   Log File Analysis
      Uncrawled Pages
      Wasted Crawl Budget
   404 and 500 Errors
   XML Sitemap
   Robots.txt
   Location Declaration
   Non-HTTPS Content
   Site Speed
      Desktop
      Mobile
   Backlink Analysis
4. Other Technical Considerations
   Language Targeting
   Schema and Structured Data
5. Content
   Thin and Duplicate Content
   Blog Content
6. Google Analytics
   Incorrect GA Implementation
   Event Tracking
   Ecommerce Tracking
Prioritised Actions

Introduction
This technical SEO audit is intended to highlight areas on www.cedar-rose.com that will be
having a negative effect on visibility in organic search engine results pages (SERPs). By
fixing these issues, you will greatly improve how search engine bots crawl and index your
site, meaning it will be easier for them to find your content and display it in search results
pages. This will help boost your organic traffic.

I have broken these areas down into six main categories: On-Page and Hygiene, Crawlability, Technical, Other Technical Considerations, Content, and Google Analytics. For each issue, I have provided a recommendation in keeping with current SEO best practice.

To prioritise these recommendations and to ensure improved organic visibility, I have added
the following legend to each of the tasks:

Priority: HIGH | Difficulty: HIGH

A breakdown of these ratings is below. They reflect both the impact I expect the changes to
have and how much time and effort I anticipate the fixes will require.

Priority:

HIGH PRIORITY – high impact on organic search performance
MEDIUM PRIORITY – medium impact on organic search performance
LOW PRIORITY – low impact on organic search performance

Difficulty:

HIGH DIFFICULTY – technically demanding or very time consuming
MEDIUM DIFFICULTY – reasonably demanding or relatively time consuming
LOW DIFFICULTY – easy or quick fix

A summary of prioritised tasks can be found at the end of this document.

1. On-Page and Hygiene

Title Tags

Title tags signal to search engines what a page is about and are an important on-page ranking factor. As such, we should optimise pages by using H1 tags to mark up page titles.

As a general rule, an H1 tag should be unique to its page, contain the keyword you want to target, and be limited to one per page.

Missing H1 Tags
Priority: LOW | Difficulty: LOW

The homepage is currently missing an H1 tag. This is common for homepages, as they usually have a logo in place of a title.

An H1 tag on the homepage helps ensure that the homepage appears in search results when someone searches for your company's name.

Cedar Rose currently ranks in position one for the search term ‘cedar rose’ without an H1. However, I would still recommend including an H1 on the homepage, as this will help maintain that visibility and is part of SEO best practice.

Recommendation: Add an H1 tag containing ‘Cedar Rose’ to the homepage.
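A minimal sketch of what this could look like in the homepage template (the surrounding markup and heading wording are illustrative, not taken from the live site):

```html
<!-- Hypothetical homepage header: the H1 carries the brand name as real text -->
<header>
  <a href="/"><img src="/images/logo.png" alt="Cedar Rose logo"></a>
  <h1>Cedar Rose</h1>
</header>
```

If the design has no room for a visible heading, the H1 can be styled to fit the layout, but it should remain real text rather than an image.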



Duplicate H1 Tags
Priority: MED | Difficulty: LOW

There are currently 2,868 pages with the same H1 tag. This creates confusion for search engines, as they cannot work out which page to display in search results when someone searches for one of those pages.

For example, someone searching for ‘Cedar Rose bankruptcy check’ will see one of the following pages, as they both contain the same H1 tag, ‘Bankruptcy Check’:

• http://www.cedar-rose.com/Product/Detail/399
• http://www.cedar-rose.com/Product/Detail/400

Instead of choosing one to display, search engines will usually demote both pages, either by pushing them down search results pages or by not displaying them at all for the search term ‘Cedar Rose bankruptcy check’. This results in less traffic to these pages.

These duplicate H1 tags are caused by three separate, wider issues:

1. Multiple product pages targeting the same keywords
2. URL duplication
3. URL parameters

I have addressed each of these issues separately in this audit, as they will also have wider SEO implications. Please refer to the URL Structure and Taxonomy section to address these issues and fix the duplicate H1 tags.

H2 – H6 Tags
Priority: MED | Difficulty: LOW

As well as the H1 tag, H2 – H6 tags can be used to help break up content in the form of subheadings. In HTML, heading tags from H1 to H6 form a top-down hierarchy. The most important heading should be marked up as an H1, and subsequent subheadings should be marked up using H2, H3, H4 tags and so on. These subheadings can be used to target secondary keywords.

This helps create association around a particular subject matter. For example, by creating a
page with the H1 ‘Bankruptcy Checks in Algeria’, and then creating subheadings with further
information, such as H2 ‘What is the bankruptcy process in Algeria?’, you can create more
association between your page and bankruptcy in Algeria. This means that you are more
likely to appear for the search term ‘bankruptcy in Algeria’.

There can be multiple H2 – H6 tags on the same page, as long as they follow the top-down hierarchy described above.

Recommendation: Update all pages where appropriate to include H2 – H6 tags to target secondary keywords.
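The Algeria example above could be marked up as follows; the second H2, the H3 and the placeholder copy are illustrative, not taken from the site:

```html
<h1>Bankruptcy Checks in Algeria</h1>
<p>Introductory copy about the service…</p>

<!-- Secondary keyword targeted with an H2 subheading -->
<h2>What is the bankruptcy process in Algeria?</h2>
<p>Supporting content answering the search query…</p>

<h2>How to order a bankruptcy check</h2>
<h3>Required company details</h3>
<p>Further detail nested one level deeper in the hierarchy…</p>
```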

Canonical Tags

Missing Canonical Tags
Priority: HIGH | Difficulty: LOW

Canonical tags are used to help search engines understand the relationship between similar
pages or variations of the same page. This is to make sure search engines display the correct
version in SERPs.

At present, there are no canonical tags on any page on www.cedar-rose.com. Each unique
page should have a self-referencing canonical tag as follows:

<link rel="canonical" href="http://example.com/page" />

Variations of the same page should have canonicals referencing the main page.

For example, the following two pages are localised versions of the same page
http://www.cedar-rose.com/product/detail/284

• http://www.cedar-rose.com/product/detail/284?countryId=49
• http://www.cedar-rose.com/product/detail/284?countryId=48

However, search engines will see these as three separate pages. This has a negative impact on how search engines crawl and index the site. It can also make search engines treat this as duplicate content, which can lead to a penalty and removal from Google's index.

These kinds of pages should reference the original page to avoid duplication and
cannibalisation in SERPs, i.e. both localised versions should have the following canonical
tag:

<link rel="canonical" href="http://www.cedar-rose.com/product/detail/284" />

Recommendation: Add self-referencing canonical tags to all unique pages and add
canonical tags to variation pages referencing the original page.

Page Titles

User and SEO-friendly Page Titles
Priority: HIGH | Difficulty: LOW

Page titles are the blue links that appear in search results for your pages.

While page titles are not a direct ranking factor for organic search, they are the first thing a
user sees about your site and do influence user click-through rates (CTR) which can increase
organic traffic. As such, they should be both user and SEO-friendly to maximise CTR.

Current best practice for page titles adheres to the following guidelines:

• They should be unique
• Contain the target keyword
• Contain the brand name
• Be 50 – 60 characters long

At present, page titles on www.cedar-rose.com do not follow these guidelines. There are currently 2,885 pages with the same page title, ‘Cedar Rose’. This does not give the user any information about these pages, which makes users less likely to click on them.

Recommendation: Optimise page titles to current SEO best practice using the above guidelines.

Meta Descriptions
Priority: HIGH | Difficulty: LOW

Similar to page titles, meta descriptions are not direct ranking factors but can have a
positive effect on organic click-through rates.

www.cedar-rose.com is not currently optimising meta descriptions on site. Instead, unspecified meta descriptions are being displayed in SERPs. Unspecified meta descriptions are generated by search engines automatically pulling on-page content into search results. This can lead to truncated descriptions or inappropriate content, which in turn can lead to lower CTR and less traffic.

For example, one indexed page's meta description is pulled directly from its on-page content. This auto-generated meta description gives very little information about that particular service, which means fewer people are likely to click on the link.

Current best practice for writing SEO-friendly meta descriptions is as follows:

• Contain a call-to-action
• Contain the target keyword
• Contain the brand name
• Be 150 – 160 characters long

Recommendation: Create unique meta descriptions for all HTML pages following the SEO best practice guidelines above.
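For example, a product page's head could include a description written to the guidelines above; the wording here is purely illustrative:

```html
<!-- Unique description of roughly 150–160 characters, with keyword, brand and call-to-action -->
<meta name="description" content="Order a company bankruptcy check from Cedar Rose. Verified insolvency data for businesses across the Middle East and North Africa. Get your report today.">
```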

Alt Text

Image Indexing and Keyword Targeting
Priority: MED | Difficulty: LOW

Current search engines are unable to understand or contextualise images. As such, many
search engines rely on alt text to understand what images portray. Alt text should be a brief
description of what the image contains. It should also include relevant keywords or answers
to search queries. This is to help search engines crawl and index images, but also to help
target specific keywords on a page to help rank that page for those keywords.

By including alt text, www.cedar-rose.com can target secondary keywords on a page to help
increase visibility for those keywords.

For example, on the homepage you could include secondary keywords such as ‘Credit reports in United Arab Emirates’ in the alt text of the image below, to help create an association between your site and credit reports in the UAE. This will help increase the visibility of your site when someone searches for credit reports in the UAE.

Recommendation: Include alt text on every image to target secondary keywords and help boost visibility for those terms.
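Using the homepage example above, the alt attribute might look like this (the filename is hypothetical):

```html
<!-- Descriptive alt text that also targets a secondary keyword -->
<img src="/images/uae-skyline.jpg" alt="Credit reports in United Arab Emirates – Cedar Rose">
```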

2. Crawlability

CSS Delivery

Multiple CSS Files
Priority: MED | Difficulty: HIGH

Multiple CSS files can have a negative impact on site load times. Page speed has become an increasingly important ranking factor, particularly on mobile devices, with faster sites favoured over slower sites in organic SERPs.

www.cedar-rose.com currently has four separate CSS files. Each time a user visits a page, a browser must call each of these files before displaying the page. This increases page load time.

Best practice is to combine CSS files into one main file so that browsers only have to spend
time calling and serving one resource.

SOURCE: VARVY.COM

Recommendation: Combine CSS files into one CSS file. Inline small CSS into HTML
where necessary to load the page.

Render-Blocking CSS
Priority: MED | Difficulty: HIGH

Render-blocking CSS is where a page cannot render until all the CSS has been called and served. This slows down page load times and can have a negative impact on organic visibility, especially for mobile searches.

By default, CSS is render-blocking, as a user-friendly page cannot load without it. However, there are measures to reduce the amount of time it takes a browser to render CSS.

Critical Render Path

Unless specified, browsers will call multiple resources, such as html, CSS and JavaScript, at
the same time. A page will not fully render until all these resources have been processed.
This can lead to slow load times. Often, these resources do not all need to be called at the
same time.

This can be avoided by optimising a page’s critical render path. That is to say, prioritising
resources that are required to load content on a page above-the-fold, and deferring the rest.
That way, the user, and the search bot, see the content they came for but are not hindered by
scripts running in the background or below the fold.

SOURCE: VARVY.COM

We can optimise the critical render path to concentrate on above-the-fold content by ensuring
we deliver resources in a logical order, i.e., html first, then CSS, then JS:

SOURCE: VARVY.COM

For CSS, this would involve:

• Labelling CSS files
• Combining multiple CSS files
• In-lining CSS for above-fold content

Labelling CSS

Incorrectly labelling CSS files can result in browsers unnecessarily calling all CSS files at the
same time, even those that are not needed. This can be avoided by correctly labelling CSS
files so a browser only renders the correct CSS files.
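In practice, ‘labelling’ here means using the media attribute on each stylesheet link, so the browser only treats the relevant file as render-blocking for the current context. A sketch (the filenames are hypothetical):

```html
<!-- Render-blocking for all devices -->
<link rel="stylesheet" href="/css/main.css">
<!-- Only render-blocking when printing -->
<link rel="stylesheet" href="/css/print.css" media="print">
<!-- Only render-blocking on small viewports -->
<link rel="stylesheet" href="/css/mobile.css" media="(max-width: 600px)">
```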

Combining Multiple CSS Files

See Multiple CSS Files above

Inline Above-Fold CSS

Small CSS can be included in the HTML of a page to prioritise above-the-fold content. This
is useful because it reduces the number of external CSS files a browser has to call, speeding
up page load times.

Recommendation: Optimise critical render path by labelling CSS files correctly, combining
multiple CSS files into one, and in-lining above-fold CSS.
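Putting these steps together, the head of each page might be structured like this; the selectors and filename are illustrative:

```html
<head>
  <!-- Critical, above-the-fold rules inlined so the page can render immediately -->
  <style>
    header { background: #fff; }
    .hero  { font-size: 2em; }
  </style>
  <!-- The single combined stylesheet for everything else -->
  <link rel="stylesheet" href="/css/site.min.css">
</head>
```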

Minify CSS
Priority: MED | Difficulty: LOW

Minifying CSS refers to compressing CSS files. While the above measures will improve page
load times, minifying CSS files can save many bytes of data and speed up download and
parse times.

Recommendation: Minify CSS files onsite.
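Minification simply strips whitespace, comments and redundant characters without changing what the rules do. For example (an illustrative rule, not taken from the site):

```css
/* Before minification */
.product-title {
    color: #336699;
    margin-top: 0px;
}

/* After minification */
.product-title{color:#369;margin-top:0}
```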

JavaScript Delivery

Render-Blocking JavaScript
Priority: MED | Difficulty: HIGH

Similar to render-blocking CSS, JavaScript can block browsers from rendering pages until all the JS files have been called and parsed. These files are often not needed to display user-focused content.

There are currently 12 JavaScript files on www.cedar-rose.com.

As with render blocking CSS, this can be combatted by optimising the critical render path for
above-the-fold content.

For JS, this would involve:

• Combining multiple JS files
• Minifying JS files
• In-lining small JS into the HTML
• Deferring JS until other resources have been parsed

Multiple JavaScript Files
Priority: MED | Difficulty: HIGH

There are currently 12 JavaScript files on site. At the moment, browsers have to call and
parse each of these files separately. Combining all JS files into one can reduce the amount of
time it takes to load a webpage.

Recommendation: Combine multiple JavaScript files into one file

Minify JavaScript
Priority: MED | Difficulty: MED

At present, the total JavaScript on www.cedar-rose.com is 831kb. This places a significant toll on page load times. JavaScript can be compressed to reduce the overall file size.

Recommendation: Minify JavaScript files to improve page speed.

Deferring JavaScript
Priority: MED | Difficulty: HIGH

A lot of JavaScript is not essential to display content above-the-fold, yet browsers will call JS
files and parse them at the same time as HTML and CSS files. This slows down page load
times.

Non-essential JavaScript, that is to say JavaScript that is not needed to display content above
the fold, can be deferred until after the rest of the page has loaded.

Recommendation: Allow HTML and CSS to be parsed first, then run an internal script to call external JavaScript once the content has fully loaded.
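One common way to implement this is a small inline script that injects the external JS file after the load event, so HTML and CSS are parsed first. A sketch, assuming a single combined JS file (the filename is hypothetical; the simpler defer attribute on a script tag achieves a similar effect):

```html
<script>
  // Load external JavaScript only after the rest of the page has finished loading
  window.addEventListener('load', function () {
    var script = document.createElement('script');
    script.src = '/js/site.min.js'; // hypothetical combined JS file
    document.body.appendChild(script);
  });
</script>
```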

3. Technical

URL Structure and Taxonomy

Duplicate URLs
Priority: HIGH | Difficulty: LOW

URLs are case sensitive. For example, the following page has two URLs:

• http://www.cedar-rose.com/Registration
• http://www.cedar-rose.com/registration

Search engines will see this as two separate pages and will crawl and index them separately. This can lead to duplicate content penalties. It also wastes crawl budget, whereby Google spends resources crawling these two identical pages rather than crawling other pages on the site. This can prevent some pages from being indexed.

There are currently 97 duplicate URLs on the site.

Recommendation: Use a regex rule to 301 redirect all upper case URLs to their lower case
counterparts.
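How this is implemented depends on the web server. If the site runs on ASP.NET/IIS (which the /Product/Detail/ URLs suggest), the rule would go in web.config via the IIS URL Rewrite module; on Apache, the equivalent would be a mod_rewrite rule along these lines (a sketch, not taken from the current server configuration):

```apache
# 301 redirect any URL containing upper case letters to its lower case form
RewriteEngine On
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```

Note that RewriteMap must be declared in the server or virtual host configuration rather than in .htaccess.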

Parameter URLs
Priority: HIGH | Difficulty: LOW

There are 2,817 URLs with added parameters. For example, the following URLs are all generated from a master URL by appending query parameters:

• http://www.cedar-rose.com/CompanyList?&page=42
• http://www.cedar-rose.com/product/detail/284?countryId=157
• http://www.cedar-rose.com/Product/DownloadProductPdf?ProductId=412

Search engines see these as individual pages rather than variations of a master page. This can
have a negative impact on your crawl rate as search bots have to crawl each page individually
to analyse their content. This means that search engines have to crawl 2,800+ nearly identical
URLs each time they visit your site. The result is that search bots reduce how many times
they visit your site as it is too resource heavy. In addition, as the content is very similar,
search engines may also penalise you for duplicate content. This can lead to your site being
removed from search results.

Recommendation: As an interim fix, I would recommend adding a canonical tag to each of these parameter pages referencing the master page. This will help with your crawl budget and duplicate content issues.

However, we should look at a longer-term fix to address the URL structure of these pages to be more user and SEO friendly. Please find these recommendations below.

Company List URLs
Priority: HIGH | Difficulty: MED

There are 2,754 URLs that sit under http://www.cedar-rose.com/CompanyList/

I understand that this is a directory of all the contacts in your database, allowing people to search for specific companies. However, it offers very little benefit to users in organic search.

Most of these pages are not being crawled or indexed by Google as they are seen as
duplicates with very little content to benefit the user (refer to Parameter URLs above).

Those that are indexed also appear to be generic pages with very little information about the companies in question. See the image below:

As these pages are so similar, they have no discerning information to distinguish them from one another. They also do not have specified page titles or meta descriptions, so Google auto-generates these instead (please refer to User and SEO-friendly Page Titles and Meta Descriptions). This results in very low click-through rates, as users are unlikely to click on these links.

However, there is a lot of search opportunity around people searching for lists of companies
in particular markets. For example, people do search for ‘companies in Bahrain’. We could
boost traffic by restructuring the Company List pages to rank for these kinds of searches,
rather than for individual companies.

Recommendation: Update the Company List pages into the following hierarchical
structure:

1. www.cedar-rose.com/company-register/
2. www.cedar-rose.com/company-register/companies-in-bahrain/
3. www.cedar-rose.com/company-register/companies-in-bahrain/page2
4. www.cedar-rose.com/company-register/companies-in-bahrain/page3 etc.

Explanation:

1. More people search for ‘company register’ than ‘company list’. Change the name of
this page from Company List to Company Register. This page would contain links to
company registers for each of the markets you operate in, i.e. a link to a list of
companies in Bahrain, Saudi Arabia, Algeria etc.
a. www.cedar-rose.com/company-register/companies-in-bahrain/
b. www.cedar-rose.com/company-register/companies-in-saudi-arabia/ etc.
2. Each market would then have its own list of companies in its own subdirectory
3. Use rel=prev/next navigation to navigate to subpages: page 2, 3 and so on.
4. Add a canonical tag to all subpages, e.g. page 2, 3 and so on, referencing the original
page, in this case www.cedar-rose.com/company-register/companies-in-bahrain/
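For steps 3 and 4 above, page 2 of the Bahrain register would carry link elements along these lines (the URLs follow the proposed structure):

```html
<!-- Pagination signals on /company-register/companies-in-bahrain/page2 -->
<link rel="prev" href="http://www.cedar-rose.com/company-register/companies-in-bahrain/">
<link rel="next" href="http://www.cedar-rose.com/company-register/companies-in-bahrain/page3">
<!-- Canonical back to the first page, as recommended above -->
<link rel="canonical" href="http://www.cedar-rose.com/company-register/companies-in-bahrain/">
```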

Product URLs
Priority: HIGH | Difficulty: MED

Similar to the Company List pages, product pages have very little distinguishable information
in the URL. They are also very hard to navigate to both from search results pages and
internally.

For example, the URL www.cedar-rose.com/product/detail/284 contains no information about the product or service it offers. It also follows no hierarchical structure, which makes it hard for search engines to work out how important this particular page is within your site structure.

To make these pages easier for users and search engines to understand, I would introduce a hierarchical structure to these URLs.

For example, www.cedar-rose.com/product/detail/284 would become:

• www.cedar-rose.com/solutions/business-credit-report

You could then break this structure further to target credit report searches for specific
markets, i.e. ‘company credit reports in Algeria’ or ‘credit reports Saudi Arabia’

• www.cedar-rose.com/solutions/business-credit-report/business-credit-reports-algeria
• www.cedar-rose.com/solutions/business-credit-report/business-credit-reports-saudi-arabia

This format can be rolled out across all your products and markets to maximise organic
traffic.

Please note, this will require further keyword research to ascertain a URL naming convention based on search volumes across all products and markets.

Solutions Navigation

I would also recommend creating the page www.cedar-rose.com/solutions to help with page hierarchy and crawl efficiency. At the moment, your solutions are only available through a drop-down menu. See the image below:

This kind of navigation is hard for search engines to crawl and index, as they cannot understand the hierarchy of these pages. I would recommend creating a standalone page at www.cedar-rose.com/solutions, as this will help Google understand that these links are all the services you offer.

This would work similarly to, and replace, www.cedar-rose.com/products, but I would not
advise a filter function on this page as search engines need to be able to access these links.

I would also recommend working with a UX specialist to make sure this navigation works
for users and not just search bots.

Core Products

I would also recommend creating individual pages for your core products:

• Investigative Due Diligence
  o www.cedar-rose.com/solutions/investigative-due-diligence
• Database/ID Verification
  o www.cedar-rose.com/solutions/database-id-verification
• Business Credit Reporting
  o www.cedar-rose.com/solutions/business-credit-report
• Electronic Identity Verification
  o www.cedar-rose.com/solutions/electronic-identity-verification

At the moment, these categories are only available via drop-down menus and are not accessible as pages themselves. You would improve visibility for these search terms if they had their own pages.

Each of these pages could then link to the individual products within each category. You could then arrange these into a hierarchical structure for each market, e.g.

www.cedar-rose.com/solutions/business-credit-report/business-credit-report-algeria
(see Product URLs above).

There is no need to remove your current drop-down navigation for Solutions, as it can work in conjunction with these pages.

Solutions: Companies vs Individuals

At the moment, there are two links for ‘Due Diligence Report’, ‘Bankruptcy Check’, ‘Court
Records Check’ etc. in the drop-down Solutions navigation (see above image). One is for
companies, the other is for individuals. This is difficult for search engines to understand as it
doesn’t sit within a URL structure and creates conflict.

As these two links have the same anchor text, but link to different product pages, you are
creating confusion for crawlers.

Rather than having two different navigations for these, I would direct users to one product
page. For example:

• www.cedar-rose.com/solutions/bankruptcy-check

The user can then choose whether this is for a company or an individual when ordering the
product. This can also work for market level product pages:

• www.cedar-rose.com/solutions/bankruptcy-check/bankruptcy-check-algeria

This solution would also help consolidate organic searches around a single product page,
rather than having one product page for companies and one for individuals, which can cause
confusion for both users and search engines.

Empty Subdirectories
Priority: HIGH | Difficulty: LOW

Empty subdirectories are folders within a URL that return a 404 error. For example:

• http://www.cedar-rose.com/product/detail/399 << contains content
• http://www.cedar-rose.com/product/detail << does not contain content
• http://www.cedar-rose.com/product << does not contain content

These empty subdirectories are causing crawling issues as search engine bots will expect to
see accessible content in these directories. As there is no content, or internal linking, in these
subdirectories, search bots will often cease their crawl here and not continue to crawl all the
remaining subdirectories.

• http://www.cedar-rose.com/product/ << No content. Will stop crawling here
• http://www.cedar-rose.com/product/detail << Less likely to get crawled

• http://www.cedar-rose.com/product/detail/399 << Unlikely to get crawled

This can prevent search engines from indexing your product pages. It also places them lower
in your site’s hierarchy, as search engines treat these directories as unimportant.

Recommendation: Update URL structure as recommended in Product URLs (see above).


A temporary solution would be to redirect /product/ and /product/detail/ to the homepage
or /products/ page to prevent 404 errors and dead ends.
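As a sketch, the temporary redirects could be added to the site's .htaccess file. This assumes an Apache server; the target URL and exact configuration should be verified before use:

```apache
# Temporary 301s so the empty directory URLs no longer return 404s.
# The anchored patterns leave real product pages such as
# /product/detail/399 untouched. Remove once the URL structure is updated.
RedirectMatch 301 ^/product/?$ http://www.cedar-rose.com/
RedirectMatch 301 ^/product/detail/?$ http://www.cedar-rose.com/
```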

Homepage Cannibalisation
Priority: HIGH | Difficulty: LOW

SEO cannibalisation is where two or more pages compete for the same keywords or search
queries. This can lead to duplicate content, in which case the pages may be penalised and
drop in rank, or to reduced visibility, where both pages are demoted because search engines
cannot decide which one to display.

This is currently what is happening to the homepage. The following pages are all indexed,
duplicate versions of each other:

§ http://www.cedar-rose.com/Index
§ http://www.cedar-rose.com/

There is also a similar issue involving pages with the following format as highlighted above
in Duplicate URLs:

§ http://www.cedar-rose.com/about
§ http://www.cedar-rose.com/About
§ http://www.cedar-rose.com/Product/Detail/402
§ http://www.cedar-rose.com/product/detail/402

Recommendation: 301 redirect each duplicate page to its main version. Add canonical tags
if a duplicate page needs to stay live. Write a regex rule to redirect uppercase URLs to
lowercase.
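As an illustrative sketch of the lowercase redirect (assuming an Apache server with mod_rewrite), the conventional approach uses a RewriteMap. Note that the RewriteMap directive must be declared in the main server or virtual host configuration, not in .htaccess:

```apache
# Declared once in the server or vhost config (RewriteMap is not
# permitted inside .htaccess):
RewriteMap lc int:tolower

# Then, in the vhost or .htaccess:
RewriteEngine On
# Redirect any URL containing an uppercase letter to its lowercase form
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^ ${lc:%{REQUEST_URI}} [R=301,L]
```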

Internal Linking
Priority: MED | Difficulty: MED

Internal linking is an important ranking factor for search engines. It helps identify the most
important page on a website. This is usually the homepage.

At present, www.cedar-rose.com has many pages with more internal links than the
homepage, which suggests the site is not optimising its crawl budget. When search bots
crawl a website, they navigate downwards from what they consider the most important page
in a hierarchical manner. If too many internal links point to a relatively unimportant page,
search engines will treat that page as the most important and crawl it more frequently. More
important pages, such as the homepage or product pages, may then be crawled less often,
decreasing their overall visibility.

Ideally, we would like internal linking to be in a hierarchical order with the homepage, About
page, Contact page and Solutions pages at the top, and then subsequent product pages in
order of importance.

I have used two different sets of data to assess internal linking. The first looks at how
many internal URIs point at a single page:

Screaming Frog Data (Internal Linking)

Address Inlinks Notes
http://www.cedar-rose.com/Contact 25987 Page with most internal links
http://www.cedar-rose.com/product/detail/5000 19722 Second most important page to search bots
http://www.cedar-rose.com/product/detail/400 17313
http://www.cedar-rose.com/product/detail/274 17313
http://www.cedar-rose.com/Faq 11555
http://www.cedar-rose.com/product/detail/291 11551
http://www.cedar-rose.com/product/detail/294 11551
http://www.cedar-rose.com/product/detail/285 11551
http://www.cedar-rose.com/product/detail/380 11551
http://www.cedar-rose.com/ 11549 Page that should have most links
http://www.cedar-rose.com/product/detail/290 11543
http://www.cedar-rose.com/product/detail/297 11543
http://www.cedar-rose.com/product/detail/299 11543
http://www.cedar-rose.com/product/detail/288 11543
http://www.cedar-rose.com/About 11543
http://www.cedar-rose.com/product/detail/287 11543
http://www.cedar-rose.com/product/detail/406 11543
http://www.cedar-rose.com/product/detail/401 11543
http://www.cedar-rose.com/product/detail/403 11543

The second takes into account the importance of those pages based on how many internal
links they have and gives them each an internal page rank. This is how search bots will
interpret the hierarchy among these pages.

R Program Language Data (Internal linking weighted by internal page rank)

URL Page Rank Notes
http://www.cedar-rose.com/Contact 0.030579378 Page with highest Page Rank
http://www.cedar-rose.com/product/detail/400 0.019977158 Second most important page
http://www.cedar-rose.com/product/detail/274 0.019977158
http://www.cedar-rose.com/product/detail/5000 0.013681297
http://www.cedar-rose.com/product/detail/294 0.013560555
http://www.cedar-rose.com/product/detail/285 0.013560555
http://www.cedar-rose.com/product/detail/380 0.013560555
http://www.cedar-rose.com/product/detail/291 0.013560362
http://www.cedar-rose.com/ 0.013532299 This page should have highest PR
http://www.cedar-rose.com/Faq 0.013502961
http://www.cedar-rose.com/product/detail/423 0.013338832
http://www.cedar-rose.com/product/detail/290 0.013337122
http://www.cedar-rose.com/product/detail/297 0.013337122
http://www.cedar-rose.com/product/detail/299 0.013337122
http://www.cedar-rose.com/product/detail/288 0.013337122
http://www.cedar-rose.com/product/detail/406 0.013337122
http://www.cedar-rose.com/product/detail/401 0.013337122
http://www.cedar-rose.com/product/detail/403 0.013337122
http://www.cedar-rose.com/product/detail/374 0.013337122
http://www.cedar-rose.com/product/detail/301 0.013337122

In both sets of data, the contact page has a disproportionately high internal page rank. This
means that there are more links pointing to this page than any other. Search engines will
therefore see this as the most important page and will crawl it more frequently than the other
pages. It will also prioritise any subdirectories that sit under /Contact/.

From an SEO perspective, it would make more sense to make the homepage the most
crawled page. This will ensure all subdirectories are crawled.

Recommendation: Updating internal linking can be difficult and depends on URL
structure. I would recommend working through the recommendations in URL Structure and
Taxonomy above, as many of them will also fix this issue.
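To illustrate how the weighted figures above are derived, the internal page rank calculation (performed here in R) can be sketched in a few lines of Python. The link graph below is invented purely for illustration and is not real crawl data:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Estimate PageRank from an internal link graph.

    links maps each page to the list of pages it links to.
    """
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page keeps a base share, then receives a portion of the
        # rank of every page that links to it.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: every page links to /Contact-style hubs, mirroring the
# pattern seen in the audit data (illustrative only).
links = {
    "/": ["/About", "/Contact"],
    "/Contact": ["/"],
    "/About": ["/Contact"],
}
ranks = pagerank(links)
```

On this toy graph, /Contact receives links from every other page and therefore out-ranks the homepage, which is exactly the imbalance the tables above show on the live site.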

Log File Analysis

Uncrawled Pages
Priority: MED | Difficulty: MED

There are currently 246 pages that should be accessible to search bots but have not been
crawled or indexed in the last 30 days. These are all Company List pages.

Google has judged these pages to be low priority and therefore crawls them infrequently or
not at all. This is likely due to crawl budget: Google assigns a set number of URIs it will
crawl in one go and prioritises them based on internal page rank. It treats low priority pages
such as these as a waste of resource and does not crawl them.

Ideally, all accessible pages should be crawled by search bots. This helps with indexing the
whole site, and encourages search bots to revisit the site frequently. We can improve crawl
rate by creating a clear URL structure that helps search bots navigate the site efficiently. This
can be achieved by working through the recommendations suggested in URL Structure and
Taxonomy above.

Wasted Crawl Budget
Priority: MED | Difficulty: MED

Fortunately, despite the internal page rank issues cited above, Googlebot still seems to
crawl the homepage most often. However, internal page rank is affecting how often other
pages are crawled.

The following pages are the top 20 most crawled pages over the last 30 days:

URL Googlebot Notes
http://www.cedar-rose.com/ 169 Most Crawled Page
http://www.cedar-rose.com/robots.txt 75
http://www.cedar-rose.com/css/site_default.css 52
http://www.cedar-rose.com/CompanyList?&page=467 50 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList 43
http://www.cedar-rose.com/CompanyList?&page=406 38 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=2681 35 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=1505 34 Wasting Crawl Budget
http://www.cedar-rose.com/Contact 29
http://www.cedar-rose.com/CompanyList?&page=1270 27 Wasting Crawl Budget
http://www.cedar-rose.com/?author=1 26 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=2678 25 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=2726 23 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=1397 22 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=2048 22 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=647 21 Wasting Crawl Budget
http://www.cedar-rose.com/CompanyList?&page=1069 20 Wasting Crawl Budget

http://www.cedar-rose.com/scripts/bootstrap/css/bootstrap.css 20
http://www.cedar-rose.com/scripts/nprogress/nprogress.css 20
http://www.cedar-rose.com/CompanyList?&page=2405 18 Wasting Crawl Budget

There seems to be a big disparity between the number of times the homepage is crawled and
the number of times other pages are crawled. For example, the homepage has been crawled
169 times in the last month, but the Contact page has only been crawled 29 times. None of
the other top level pages, such as About Us, FAQ or any of the product pages are in the top
20.

This is because Googlebot is placing too much emphasis on the Company List pages. As
there are 2,754 Company List URLs, Google is spending a lot of its crawl budget to try and
crawl and index all these pages. This will reduce how many times product pages are crawled.

In addition, the vast majority of Company List pages have only been crawled once in the last
30 days. This is indicative of a low crawl rate. Ideally, pages should be crawled several times
a week to ensure organic visibility in search results pages.

Recommendation: Update URL structure and taxonomy to create a page hierarchy and
boost crawl rate of priority pages across the domain.

404 and 500 Errors
Priority: HIGH | Difficulty: LOW

There are currently 837 pages on the domain returning 404 or 500 errors. These are pages
that search bots have tried to crawl but can no longer find, or where the server cannot be
reached. This is usually because the content has been deleted or is no longer available.

This will be having a negative impact on crawl rate as Google is wasting resource trying to
find these pages.

Recommendation: 301 redirect these pages to the homepage (see accompanying Excel document)

XML Sitemap
Priority: HIGH | Difficulty: LOW

The current XML sitemap located at www.cedar-rose.com/sitemap.xml only contains six
URLs.

This sitemap should contain all URLs on the site that you want search bots to crawl and
index. Sitemaps should also be updated regularly to make sure the URLs are up to date.

Recommendation: Most CMSs have plugins that auto-generate sitemaps on a regular
basis. I would recommend creating an XML sitemap that includes every HTML page on the
site.
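For reference, a minimal sitemap entry follows the sitemaps.org protocol; the URLs and date below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.cedar-rose.com/</loc>
    <lastmod>2017-03-01</lastmod>
  </url>
  <url>
    <loc>http://www.cedar-rose.com/about</loc>
  </url>
</urlset>
```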

Robots.txt
Priority: LOW | Difficulty: LOW

It is best practice to include a reference to your XML sitemap in your robots.txt file to help
search engines locate your sitemap. E.g.:

Sitemap: http://www.cedar-rose.com/sitemap.xml

Recommendation: Add a reference to the XML sitemap in the robots.txt file.

Location Declaration
Priority: HIGH | Difficulty: LOW

Domains that target specific markets or countries can declare their target location in Google
Search Console. For global domains, or domains that target more than one country, I would
not recommend using this feature in Google Search Console as you are telling Google to
restrict visibility of the domain to that particular market only.

At the moment, www.cedar-rose.com is targeting users located in Lebanon.

Recommendation: Select ‘Unlisted’ to ensure you target a global audience and do not limit
visibility to one country. Please note, you cannot select market regions such as MENA,
EMEA, etc. For this it is best to target a global audience.

Non-HTTPS Content
Priority: HIGH | Difficulty: HIGH

Site security is becoming an increasingly important ranking factor for search engines, with
https pages favoured over their http counterparts. Google also tends to index https before
http where both are available.

At the moment, www.cedar-rose.com is served over insecure http. This means Google is
more likely to favour other sites with https protection when ranking your domain.

Recommendation: Install an SSL certificate on www.cedar-rose.com. 301 redirect http to
https in the .htaccess file and update all internal linking to https. Sitemaps will also need to
be rebuilt to include https. You will also need to add the https version of the site to Google
Search Console and manage the migration from http to https.
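Once the SSL certificate is installed, the http-to-https redirect can be sketched in .htaccess as follows (assuming Apache with mod_rewrite enabled; test on a staging copy first):

```apache
RewriteEngine On
# Send every http request to its https equivalent with a permanent redirect
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```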

Site Speed
Priority: MED | Difficulty: HIGH

Site speed is becoming an increasingly important ranking factor, with Google favouring
faster sites over slower ones. This is especially true on mobile devices.

Desktop

The desktop version of www.cedar-rose.com currently scores 48/100 for speed. This will be
having a negative impact on rank as well as user experience. (Data from Google Page Speed
Insights tool).

Mobile

The mobile version of www.cedar-rose.com currently scores 46/100 for speed. This will be
having more of a negative effect as mobile algorithms take page speed into account when
displaying search results. Faster sites will tend to have higher organic rank. (Data from
Google Page Speed Insights tool).

Recommendation: Improve site speed by:

• Optimising the critical render path
• Combining CSS files
• Combining JavaScript files
• Removing render-blocking CSS
• Removing render-blocking JavaScript
• Deferring non-essential JavaScript below the fold
• Minifying CSS
• Minifying JavaScript
• Compressing images

Backlink Analysis
Priority: HIGH | Difficulty: MED

Backlinks are links from other domains that reference your domain. Google uses these
backlinks to work out how important your website is in relation to other websites. In general,
the more backlinks a domain has, the more important it is.

However, many webmasters manipulated their backlinks by artificially creating large
numbers of them to trick Google’s algorithms into thinking a site was more important than
it was. As such, Google began penalising sites that did this, taking into account the
authority, or quality, of the referring domain. If a domain has a lot of backlinks from low
quality sources, such as directory listings or spam sites, Google will penalise the domain
being linked to.

It is now best practice to periodically audit your domain’s backlink profile to make sure there
are no low quality or spammy backlinks pointing to your site.

At the moment, there are 5,222 backlinks from other domains pointing to
www.cedar-rose.com.

Recommendation: I would highly recommend conducting an analysis of these 5,222
backlinks to ensure they are not from low quality or spammy sources. Any low quality
backlinks that are found should be disavowed to ensure that Cedar Rose does not receive a
penalty.

4. Other Technical Considerations

Language Targeting
Priority: MED | Difficulty: MED

The site is predominantly optimised for English language speakers. This limits potential
traffic to people searching for your services in English. I would strongly recommend
creating an Arabic language version of the site to capture search traffic in that language.
E.g.

• www.cedar-rose.com << English version
• www.cedar-rose.com/ar << Arabic version

At the moment, the Company List does target searches for these companies in Arabic.
However, as the Company List pages have several crawl issues, they are currently having
little impact on driving traffic. By combining Arabic and English companies in one Company
List section, you are also creating confusion for Google’s crawlers, as mixed-language
sections make it harder to determine which language each page targets.

If resources do not allow for translation of the whole domain, I would recommend
prioritising a separate Arabic version of the Company List section. For example:

English Content:

• www.cedar-rose.com/company-register
• www.cedar-rose.com/company-register/companies-in-bahrain

Arabic Content:

• www.cedar-rose.com/ar/company-register
• www.cedar-rose.com/ar/company-register/companies-in-bahrain

Separating this out would help with crawling and indexing the Arabic content. It will
also create a more consistent user experience, and you should be able to increase visibility
for company searches in Arabic.

Please note, I would also recommend using the hreflang attribute if creating content in
different languages so that search engines better understand the relationship between these
pages.
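For example, each English page and its Arabic counterpart would reference one another in the page head (the URLs below are the proposed ones above, not live pages):

```html
<link rel="alternate" hreflang="en" href="http://www.cedar-rose.com/company-register/companies-in-bahrain" />
<link rel="alternate" hreflang="ar" href="http://www.cedar-rose.com/ar/company-register/companies-in-bahrain" />
<link rel="alternate" hreflang="x-default" href="http://www.cedar-rose.com/company-register/companies-in-bahrain" />
```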

Schema and Structured Data
Priority: MED | Difficulty: MED

Schema.org (often called Schema) is a specific vocabulary of tags (or microdata) that you can
add to your HTML to improve the way your page is represented in SERPs. It is a way for
webmasters to provide the information search engines need to understand the context of your
content and provide the best search results possible.

Although schema is not a ranking factor, marking-up data can have a positive effect on
organic visibility and CTR.

For example, reviews and ratings can be marked up using schema to create star ratings in
SERPs, which can encourage more click-throughs.

At present, there is no structured data on www.cedar-rose.com. However, marking up the
following could have a positive effect on organic visibility and boost traffic:

• Organisation information
• Blog Posts
• Products and services

Recommendation: Mark up the above data using the microdata vocabulary to encourage
higher CTR and increased organic visibility.
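As a minimal sketch, the organisation details could be marked up with microdata as follows (values illustrative and to be confirmed against the live page content):

```html
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Cedar Rose</span>
  <link itemprop="url" href="http://www.cedar-rose.com/" />
</div>
```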

5. Content

Thin and Duplicate Content
Priority: HIGH | Difficulty: HIGH

Thin and duplicate content can result in penalties from search engines. This could lead to
demoted visibility in organic search.

Thin content refers to pages with very little useful information for the user, such as:

http://www.cedar-rose.com/product/detail/284?countryId=90

This kind of content offers a poor user experience. Google takes this into account when
crawling the site. If there is too much content like this, it can penalise the site and remove
these pages from its index.

Similarly, duplicate content can also lead to penalties. Duplicate content is where more than
one page seems to have identical content. As well as offering a poor user experience, search
bots can interpret this content as an attempt to trick Google’s algorithms by creating lots of
similar content to improve organic visibility.

In addition to this, search bots will find it difficult to distinguish between pages with similar
content. This can lead to a loss of visibility in search results for these pages.

As an example, the following pages all have duplicate and thin content:

• http://www.cedar-rose.com/product/detail/284?countryId=90
• http://www.cedar-rose.com/product/detail/288
• http://www.cedar-rose.com/product/detail/284?countryId=49
• http://www.cedar-rose.com/product/detail/402

This affects all Company List and Product pages and leaves them open to penalty.

Recommendation: Create rich, user-focused content that explains each product in detail to
avoid penalties.

Blog Content
Priority: HIGH | Difficulty: MED

At the time of this audit, there is no long-tail content on the domain. Long-tail content can be
used to target new traffic and increase overall organic visibility. Regularly publishing content
also encourages search engines to revisit your domain to crawl and index new content. One
of the best ways to target long-tail keywords is through a regular blog.

By creating long-tail content strategies, I have seen clients’ organic traffic increase by up to
150% year-on-year. I would highly recommend using content as a strategy to boost traffic to
your domain.

Recommendation: Create blog on domain to target long-tail keywords based on keyword
research and search volume. Separate keyword research would be required to identify
traffic opportunities and create content strategy.

6. Google Analytics

Incorrect GA Implementation
Priority: HIGH | Difficulty: LOW

It appears that Google Analytics has not been correctly implemented on the domain. As such,
there is very little data around traffic, users, navigation and site usage that can help inform
future web strategies.

Without correct GA implementation, it also makes it difficult to measure ROI from SEO or
UX strategies you have invested in. I would highly recommend fixing GA implementation
before investing any further in SEO or UX to make sure you can accurately measure ROI for
these strategies.

Event Tracking
Priority: HIGH | Difficulty: MED

There is also no event tracking enabled in your GA set up. Event tracking allows us to
measure how users interact with the site by assigning ‘events’ to certain interactions. For
example, it would be useful to know how many users click on particular links, add products
to the cart, download certain content etc. This can help provide valuable data and insights for
future digital strategies such as Conversion Rate Optimisation.
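As an example, a download link could fire an event using the standard analytics.js event command (the file name, category, action and label below are all illustrative):

```html
<a href="/brochure.pdf"
   onclick="ga('send', 'event', 'Downloads', 'click', 'Company Brochure');">
  Download brochure
</a>
```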

Ecommerce Tracking
Priority: HIGH | Difficulty: MED

Similarly, there is no ecommerce data available in your GA set up. This makes it difficult to
assign an accurate ROI for any digital strategies you have invested in.

Ecommerce tracking allows you to measure how much revenue you generate on particular
pages, which products perform better than others, and which channels (such as organic
traffic, PPC traffic, social traffic) perform the best. This enables you to direct resource
towards the most appropriate strategies.
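Once the analytics.js ecommerce plugin is enabled, a completed order would be recorded roughly as follows (the transaction ID, product name and values are illustrative):

```html
<script>
  ga('require', 'ecommerce');
  ga('ecommerce:addTransaction', {
    id: '1234',            // illustrative transaction ID
    revenue: '120.00'      // illustrative order total
  });
  ga('ecommerce:addItem', {
    id: '1234',
    name: 'Business Credit Report - Algeria',
    price: '120.00',
    quantity: '1'
  });
  ga('ecommerce:send');
</script>
```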

Recommendation: I would highly recommend implementing all of the above before
embarking on any advanced digital strategies such as SEO or UX.

I am happy to provide a quote and summary of what would be needed to set this up
correctly.

Prioritised Actions
I would recommend tackling the above issues in the following order:

Action Order Priority Difficulty
Incorrect GA Implementation 1 HIGH LOW
Event Tracking 2 HIGH MED
Ecommerce Tracking 3 HIGH MED
Duplicate URLs 4 HIGH LOW
Parameter URLs 5 HIGH LOW
Product URLs 6 HIGH MED
Missing Canonical Tags 7 HIGH LOW
Company List URLs 7 HIGH MED
Empty Subdirectories 8 HIGH LOW
Homepage Cannibalisation 9 HIGH LOW
404 and 500 Errors 10 HIGH LOW
XML Sitemap 11 HIGH LOW
Location Declaration 12 HIGH LOW
User and SEO-friendly Page Titles 13 HIGH LOW
Non-HTTPS Content 13 HIGH HIGH
Meta Descriptions 14 HIGH LOW
Backlink Analysis 15 HIGH MED
Thin and Duplicate Content 16 HIGH HIGH
Blog Content 17 HIGH MED
Wasted Crawl Budget 17 MED MED
Uncrawled Pages 18 MED MED
Duplicate H1 Tags 19 MED LOW
H2 – H6 Tags 20 MED LOW
Internal Linking 21 MED MED
Site Speed 22 MED HIGH
Multiple JavaScript Files 23 MED HIGH
Multiple CSS Files 24 MED HIGH
Minify CSS 25 MED LOW
Minify JavaScript 26 MED MED
Render Blocking JavaScript 27 MED HIGH
Render-Blocking CSS 28 MED HIGH
Deferring JavaScript 29 MED HIGH
Schema and Structured Data 30 MED MED
Language Targeting 31 MED MED
Image Indexing and Keyword Targeting 32 MED LOW
Robots.txt 33 LOW LOW
Missing H1 Tags 34 LOW LOW
