
@yongqianme
Forked from nicolasdao/seo_sem_guide.md
Created November 22, 2023 20:20

Revisions

  1. @nicolasdao nicolasdao revised this gist Mar 19, 2022. 1 changed file with 4 additions and 1 deletion.
    5 changes: 4 additions & 1 deletion seo_sem_guide.md
    Original file line number Diff line number Diff line change
    @@ -403,6 +403,8 @@ Simply paste the URL in the search bar at the top.

    > - List of all Google SEO tools: https://support.google.com/webmasters/topic/9456557
    > - JS minification techniques: [Optimizing JavaScript bundle size](https://www.debugbear.com/blog/reducing-javascript-bundle-size)
    > - [Article: Small Bundles, Fast Pages: What To Do With Too Much JavaScript](https://calibreapp.com/blog/bundle-size-optimization)

    | Topic | Description | Link |
    |:------|:------------|:-----|
    @@ -590,4 +592,5 @@ Please refer to the [Finding the historical keywords ranking for a domain](#find
    # References
    - [Introducing Progressive Web Apps: What They Might Mean for Your Website and SEO](https://moz.com/blog/introducing-progressive-web-apps)
    - [Mobile-first indexing best practices](https://developers.google.com/search/mobile-sites/mobile-first-indexing)
    - [Rendering on the Web](https://developers.google.com/web/updates/2019/02/rendering-on-the-web)
    - [Small Bundles, Fast Pages: What To Do With Too Much JavaScript](https://calibreapp.com/blog/bundle-size-optimization)
  2. @nicolasdao nicolasdao revised this gist Mar 19, 2022. 1 changed file with 4 additions and 1 deletion.
    5 changes: 4 additions & 1 deletion seo_sem_guide.md
    @@ -401,7 +401,8 @@ Simply paste the URL in the search bar at the top.

    # Tools

    > - List of all Google SEO tools: https://support.google.com/webmasters/topic/9456557
    > - JS minification techniques: [Optimizing JavaScript bundle size](https://www.debugbear.com/blog/reducing-javascript-bundle-size)
    | Topic | Description | Link |
    |:------|:------------|:-----|
    @@ -411,6 +412,8 @@ Simply paste the URL in the search bar at the top.
    | `sitemap.xml` | Create a sitemap.xml online | https://www.xml-sitemaps.com/ |
    | `sitemap.xml` | Validate a sitemap.xml online | https://www.xml-sitemaps.com/validate-xml-sitemap.html |



    # Tips and tricks
    ## Red flags
    ### Red flags - Google Search Console
  3. @nicolasdao nicolasdao revised this gist Mar 19, 2022. 1 changed file with 2 additions and 0 deletions.
    2 changes: 2 additions & 0 deletions seo_sem_guide.md
    @@ -401,6 +401,8 @@ Simply paste the URL in the search bar at the top.

    # Tools

    > List of all Google SEO tools: https://support.google.com/webmasters/topic/9456557
    | Topic | Description | Link |
    |:------|:------------|:-----|
    | `robots.txt` | Create a robots.txt online | https://www.seoptimer.com/robots-txt-generator |
  4. @nicolasdao nicolasdao revised this gist Aug 8, 2021. 1 changed file with 8 additions and 0 deletions.
    8 changes: 8 additions & 0 deletions seo_sem_guide.md
    @@ -56,6 +56,7 @@
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    > - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
    > - [How to request Google to recrawl your website?](#how-to-request-google-to-recrawl-your-website)
    > * [Annex](#annex)
    > - [JSONLD examples](#jsonld-examples)
    > - [ahrefs recipes to rank](#ahrefs-recipes-to-rank)
    @@ -427,6 +428,13 @@ This renders all the HTML, but unfortunately, it won't render a full image of th

    Please refer to the [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain) section.

    ## How to request Google to recrawl your website?

    1. Login to the Google Search Console (https://search.google.com/search-console).
    2. Choose one of the two options:
    1. Upload a new sitemap.xml with a new `lastmod` date for the URLs you wish to refresh. That's the fastest way to perform a batch re-crawl.
    2. Paste a URL in the `Inspect` search bar at the top, then click on the `REQUEST INDEXING` button.

    # Annex
    ## JSONLD examples

  5. @nicolasdao nicolasdao revised this gist Aug 8, 2021. 1 changed file with 4 additions and 0 deletions.
    4 changes: 4 additions & 0 deletions seo_sem_guide.md
    @@ -186,6 +186,8 @@ Crawling your website is not effortless. This means that search engine companies

    ## `robots.txt` or how to block pages with no marketing value to waste your crawl budget

    > [Shopify Robots.txt Guide: How To Create & Edit The Robots.txt.liquid](https://gofishdigital.com/shopify-robots-txt/)
    > - To create a robots.txt online, please refer to https://www.seoptimer.com/robots-txt-generator
    > - To test a robots.txt file, use the [Google robots.txt tester tool](https://www.google.com/webmasters/tools/robots-testing-tool).
    > - The [`X-Robot-Tag` with `noindex`](#x-robots-tag-with-noindex) will not block crawling. It just prevents the GoogleBot from indexing the page. Pages with `noindex` will still consume crawl budget.
    @@ -205,6 +207,7 @@ Where:
    - All other user agents are allowed to crawl the entire site. This could have been omitted and the result would be the same; the default behavior is that user agents are allowed to crawl the entire site.
    - The site's sitemap file is located at http://www.example.com/sitemap.xml.


    ### Making Google aware of the robots.txt

    Manually submit it to the Google Search Console.
    @@ -401,6 +404,7 @@ Simply paste the URL in the search bar at the top.
    |:------|:------------|:-----|
    | `robots.txt` | Create a robots.txt online | https://www.seoptimer.com/robots-txt-generator |
    | `robots.txt` | Test the validity of a robots.txt | https://www.google.com/webmasters/tools/robots-testing-tool |
    | `robots.txt` | Test URLs against an inline robots.txt | https://technicalseo.com/tools/robots-txt/ |
    | `sitemap.xml` | Create a sitemap.xml online | https://www.xml-sitemaps.com/ |
    | `sitemap.xml` | Validate a sitemap.xml online | https://www.xml-sitemaps.com/validate-xml-sitemap.html |

  6. @nicolasdao nicolasdao revised this gist Aug 8, 2021. 1 changed file with 1 addition and 0 deletions.
    1 change: 1 addition & 0 deletions seo_sem_guide.md
    @@ -402,6 +402,7 @@ Simply paste the URL in the search bar at the top.
    | `robots.txt` | Create a robots.txt online | https://www.seoptimer.com/robots-txt-generator |
    | `robots.txt` | Test the validity of a robots.txt | https://www.google.com/webmasters/tools/robots-testing-tool |
    | `sitemap.xml` | Create a sitemap.xml online | https://www.xml-sitemaps.com/ |
    | `sitemap.xml` | Validate a sitemap.xml online | https://www.xml-sitemaps.com/validate-xml-sitemap.html |

    # Tips and tricks
    ## Red flags
  7. @nicolasdao nicolasdao revised this gist Aug 8, 2021. No changes.
  8. @nicolasdao nicolasdao revised this gist Aug 7, 2021. 1 changed file with 151 additions and 0 deletions.
    151 changes: 151 additions & 0 deletions seo_sem_guide.md
    @@ -14,6 +14,19 @@
    > - [Optimizing a web page for keywords](#optimizing-a-web-page-for-keywords)
    > - [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain)
    > - [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain)
    > * [Crawl budget or how to use `robots.txt` `sitemap.xml` and `noindex`](#crawl-budget-or-how-to-use-robotstxt-sitemapxml-and-noindex)
    > - [What is the crawl budget](#what-is-the-crawl-budget)
    > - [Factors that improve the crawl budget](#factors-that-improve-the-crawl-budget)
    > - [Understanding how the crawl budget is spent](#understanding-how-the-crawl-budget-is-spent)
    > - [Non-marketing pages](#non-marketing-pages)
    > - [Dealing with duplicate content](#dealing-with-duplicate-content)
    > - [Canonical URL](#canonical-url)
    > - [`robots.txt` or how to block pages with no marketing value to waste your crawl budget](#robotstxt-or-how-to-block-pages-with-no-marketing-value-to-waste-your-crawl-budget)
    > - [Making Google aware of the robots.txt](#making-google-aware-of-the-robotstxt)
    > - [sitemap.xml](#sitemapxml-overview)
    > - [Making Google aware of the sitemap.xml](#making-google-aware-of-the-sitemapxml)
    > - [`X-Robots-Tag` with `noindex`](#x-robots-tag-with-noindex)
    > - [`robots.txt` vs `noindex` or the difference between crawling and indexing](#robotstxt-vs-noindex-or-the-difference-between-crawling-and-indexing)
    > * [Website to-do list](#website-to-do-list)
    > - [Standard](#standard)
    > - [Metadata in the `head`](#metadata-in-the-head)
    @@ -36,6 +49,10 @@
    > - [Analysing a specific URL](#analysing-a-specific-url)
    > - [Tips](#google-search-console-tips)
    > * [UX and SEO](#ux-and-seo)
    > * [Tools](#tools)
    > * [Tips and tricks](#tips-and-tricks)
    > - [Red flags](#red-flags)
    > - [Red flags - Google Search Console](#red-flags---google-search-console)
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    > - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
    @@ -96,6 +113,126 @@ When you click on that bar, you can see the details of those keywords.

    This is achieved by following the same steps as the [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain) section. However, you'll need to have the Semrush Guru tier at minimum (almost USD200/month). In the `Organic Keywords Trend` chart, click on any bar in the chart to see the keywords ranking details for that point in time.

    # Crawl budget or how to use `robots.txt` `sitemap.xml` and `noindex`

    TL;DR: These three techniques aim to optimize your crawl budget:
    - Use a `robots.txt` to prevent non-marketing pages from consuming your crawl budget.
    - Use one or many `sitemap.xml` files to make sure that marketing pages are crawled, to make the best out of your crawl budget.
    - Use the `noindex` value in the HTML head to keep certain pages that can't be listed in the `robots.txt` out of the index. This technique is used to:
    - Prevent duplicate content.
    - Deal with faceted navigation.
    - Handle soft 404 error pages (i.e., pages that return a 200 status code while saying that the page is not found) instead of an explicit 404 status HTML page.
    - Handle infinite spaces (e.g., a calendar page where the URL contains the date).

    ## What is the crawl budget

    > WARNING: Optimizing the crawl budget is only worth it if your website contains at least a few thousand web pages. Otherwise, it is a waste of time. That being said, nurturing good SEO habits doesn't hurt and will make it easier to grow.
    Crawling your website is not effortless. This means that search engine companies don't allocate an infinite amount of resources to crawl your precious website. Instead, they allocate it a specific _budget_ called the _crawl budget_. This budget is usually denominated in the number of pages that the search engine will crawl. It depends on many factors that are left to the discretion of each search engine company, though [some factors have become public](#factors-that-improve-the-crawl-budget). Without knowing exactly what your budget is, you should do your best to configure your website to prioritize the pages you want indexed and de-prioritize the pages that should not consume any of your precious crawl budget. The pages you should block from consuming your crawl budget are:
    - [Duplicate content](#dealing-with-duplicate-content).
    - [Pages that are important to users but present no marketing value](#robotstxt-or-how-to-block-pages-with-no-marketing-value-to-waste-your-crawl-budget) (e.g., admin panel, settings page).
    - Soft 404 pages (i.e., pages that return a 200 status code while saying that the page is not found) instead of an explicit 404 status HTML page.

    ## Factors that improve the crawl budget

    - Fast web pages, even under pressure: if Googlebot notices that your pages load very quickly even with a lot of traffic, it may decide to increase the number of pages it schedules to crawl.
    - No crawl errors.
    - JS and CSS files: Every resource that Googlebot needs to fetch to render your page counts toward your crawl budget. To mitigate this, ensure these resources can be cached by Google. Avoid using cache-busting URLs (those that change frequently).
    - Avoid long redirect chains. Each redirect counts as an additional page to crawl in your budget.
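    As a sketch of the caching point above: fingerprinted JS/CSS bundles can be served with long-lived cache headers so Googlebot can reuse them across crawls instead of re-fetching. This assumes nginx and hashed filenames (e.g., `app.3f2a1c.js`); both are illustrative assumptions, not part of the original guide.

    ```nginx
    # Illustrative nginx snippet: long-lived caching for fingerprinted assets.
    # Because the filename changes whenever the content changes, a one-year
    # max-age is safe and no cache-busting query string is needed.
    location ~* \.(js|css)$ {
        add_header Cache-Control "public, max-age=31536000, immutable";
    }
    ```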

    ## Understanding how the crawl budget is spent

    > For a detailed explanation of the Google Search Console's `Coverage` report, please refer to https://support.google.com/webmasters/answer/7440203?hl=en.
    - Use the `Coverage` section of the Google Search Console.
    - Review the URLs in the `Valid` category to confirm they are listed as expected. Unexpected pages are:
    - Duplicated content (often due to faceted URLs).
    - Soft 404s.
    - Non-marketing pages.

    ## Non-marketing pages

    - __Thank you page__. Those pages could rank for long-funnel keywords.
    - __User settings__.

    ## Dealing with duplicate content

    - Determine whether duplicate pages have already been indexed:
    - Login to the [Google Search Console](https://search.google.com/search-console).
    - Select the correct property.
    - Click on the `Coverage` section in the menu.
    - Review all `Valid` URLs and look for duplicate URLs.
    - For all duplicate URLs:
    1. Do not block them in the `robots.txt` yet. Otherwise, this won't give Google a chance to deindex them first.
    2. Make sure they have a canonical URL set in the `head`.
    3. Add the [`noindex`](#x-robots-tag-with-noindex).
    4. Wait until the effect of the previous steps shows the duplicate page in the `Excluded` URLs category (this could take a couple of days).
    5. Block that page in the `robots.txt`.
    6. Optionally, if that page is useless and can be deleted in favor of the canonical version, delete it. Then make sure to create a 301 redirect from the duplicate link to the canonical one.
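    Step 6 above can be sketched at the web-server level. A hedged example assuming nginx; the URLs reuse the guide's `green-dresses` canonical example and are placeholders:

    ```nginx
    # Illustrative: permanently redirect a deleted duplicate URL to its
    # canonical version so links and crawls consolidate on one page.
    location = /dresses/green {
        return 301 https://example.com/dresses/green-dresses;
    }
    ```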

    ### Canonical URL

    ```
    <link rel="canonical" href="https://example.com/dresses/green-dresses" />
    ```

    - A canonical URL impacts both indexing and crawlability:
    - Indexing: When all duplicate pages use the same canonical URL, only the canonical URL is indexed.
    - Crawlability: Once the page has been crawled and indexed once, Google will know which page is a duplicate. This means that subsequent crawls will only crawl the canonical URL and skip the duplicated content, which avoids wasting the crawl budget. Also, by making sure that only the canonical URLs are added to the sitemap.xml, we can implicitly improve the crawl budget (as opposed to listing the duplicated links in the sitemap).
    - Both `rel="canonical"` and `content="noindex"` will prevent the page from being indexed by Google.
    - Do not mix a canonical URL with `noindex`. This confuses Googlebot. If it sees both, it will choose to follow the canonical URL signal (ref: [Google: Don’t Mix Noindex & Rel=Canonical](https://www.searchenginejournal.com/google-dont-mix-noindex-relcanonical/262607/)).
    - A canonical URL has the same effect as a 301 permanent redirect. In fact, the canonical URL was originally made for situations where a 301 redirect was not possible.
    - Use only the URLs that are canonical for your sitemap.

    ## `robots.txt` or how to block pages with no marketing value to waste your crawl budget

    > - To create a robots.txt online, please refer to https://www.seoptimer.com/robots-txt-generator
    > - To test a robots.txt file, use the [Google robots.txt tester tool](https://www.google.com/webmasters/tools/robots-testing-tool).
    > - The [`X-Robot-Tag` with `noindex`](#x-robots-tag-with-noindex) will not block crawling. It just prevents the GoogleBot from indexing the page. Pages with `noindex` will still consume crawl budget.
    ```
    User-agent: Googlebot
    Disallow: /nogooglebot/
    User-agent: *
    Allow: /
    Sitemap: http://www.example.com/sitemap.xml
    ```

    Where:
    - The user agent named Googlebot is not allowed to crawl any URL that starts with http://example.com/nogooglebot/.
    - All other user agents are allowed to crawl the entire site. This could have been omitted and the result would be the same; the default behavior is that user agents are allowed to crawl the entire site.
    - The site's sitemap file is located at http://www.example.com/sitemap.xml.

    ### Making Google aware of the robots.txt

    Manually submit it to the Google Search Console.

    ## sitemap.xml overview

    > To create a sitemap.xml online, please refer to https://www.xml-sitemaps.com/
    > - [Everything you need to know about multilingual and multinational sitemaps](https://skryvets.com/blog/2018/05/01/everything-you-need-to-know-about-multilingual-and-multinational-sitemaps/)
    Pages that are neither listed in the sitemap.xml nor blocked by the robots.txt will still eventually be crawled, but they won't receive as much attention from Google.
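    For reference, a minimal sitemap.xml could look like the following sketch (URLs and dates are placeholders). The `lastmod` value is also what you update when asking Google to re-crawl a batch of URLs:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2022-03-19</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/dresses/green-dresses</loc>
        <lastmod>2022-03-01</lastmod>
      </url>
    </urlset>
    ```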

    ### Making Google aware of the sitemap.xml

    There are two ways to make Google aware of your sitemap.xml:
    1. Include it in the robots.txt. To see an example, please refer to the [robots.txt](#robotstxt-overview) section.
    2. Manually submit it to the Google Search Console.

    > #1 is considered a best practice.
    ## `X-Robots-Tag` with `noindex`

    The `robots` meta tag in the HTML `head` and the `X-Robots-Tag` HTTP response header both carry the `noindex` directive. The meta tag form:

    ```
    <meta name="robots" content="noindex" />
    ```
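    The `X-Robots-Tag` form is an HTTP response header, which is the only option for resources that cannot carry a meta tag (e.g., PDFs). A hedged sketch assuming nginx; the PDF pattern is illustrative:

    ```nginx
    # Illustrative: send noindex as an HTTP header for PDF files, which
    # cannot contain a <meta name="robots"> tag.
    location ~* \.pdf$ {
        add_header X-Robots-Tag "noindex";
    }
    ```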

    ## `robots.txt` vs `noindex` or the difference between crawling and indexing

    In short: `robots.txt` blocks crawling (though a blocked page can still end up indexed if other sites link to it), while `noindex` blocks indexing but the page still gets crawled and therefore still consumes crawl budget.

    # Website to-do list

    To double-check that the list below is correctly implemented, refer to successful websites that tick all the SEO boxes:
    @@ -258,6 +395,20 @@ Simply paste the URL in the search bar at the top.

    [Rendering on the Web](https://developers.google.com/web/updates/2019/02/rendering-on-the-web)

    # Tools

    | Topic | Description | Link |
    |:------|:------------|:-----|
    | `robots.txt` | Create a robots.txt online | https://www.seoptimer.com/robots-txt-generator |
    | `robots.txt` | Test the validity of a robots.txt | https://www.google.com/webmasters/tools/robots-testing-tool |
    | `sitemap.xml` | Create a sitemap.xml online | https://www.xml-sitemaps.com/ |

    # Tips and tricks
    ## Red flags
    ### Red flags - Google Search Console

    - Sudden spike in valid URLs in the `Coverage` section. This is usually due to misconfigured faceted pages.

    # How to
    ## How to test how your page is seen by Google?

  9. @nicolasdao nicolasdao revised this gist Aug 6, 2021. 1 changed file with 49 additions and 0 deletions.
    49 changes: 49 additions & 0 deletions seo_sem_guide.md
    @@ -31,6 +31,10 @@
    > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
    > * [Videos](#videos)
    > - [Key moments](#key-moments)
    > * [Google Search Console](#google-search-console)
    > - [Understanding how your pages are performing](#understanding-how-your-pages-are-performing)
    > - [Analysing a specific URL](#analysing-a-specific-url)
    > - [Tips](#google-search-console-tips)
    > * [UX and SEO](#ux-and-seo)
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    @@ -205,6 +209,51 @@ The first two points are the most important as the last two points are good prac

    https://developers.google.com/search/docs/data-types/video#clip

    # Google Search Console

    This online tool from Google gives you insight into how your website is being crawled by Googlebot. It can also submit pages for crawling.
    - Submit a new sitemap.xml or explicit new URLs.
    - Get alerted on issues.
    - Understand how Google sees your pages.
    - Test:
    - Mobile usability
    - Rich results

    ## Understanding how your pages are performing

    This is mainly detailed under the `Performance` section.

    - __`Queries`__: Details which keywords drive the most traffic.
    - __`Pages`__: Shows which pages receive the most traffic.

    How to use this section:
    - __Improve conversion__: Use the `Pages` section to identify the pages that receive a lot of traffic but do not convert into clicks.
    - __Optimize your website for the best keywords__. Use the `Queries` section to understand which keywords are driving the most traffic and create dedicated pages just for those keywords.
    - __Compare your page performance from one period to another__:
    - Select filter at the top (e.g., Query with keywords)
    - Click on the `Date` filter at the top and select `Compare` rather than `Filter`
    - You may see an increase of traffic due to:
    - Seasonality
    - Better content optimization for specific keywords.
    - Improvements in Web Vitals and fixed issues.
    - You may see a decrease of traffic due to:
    - Seasonality
    - Page errors (jump to the [Analysing a specific URL](#analysing-a-specific-url) section to diagnose issues)
    - Content is less popular
    - You've cannibalized that page with a new optimized landing page

    ## Analysing a specific URL

    Simply paste the URL in the search bar at the top.

    ## Google Search Console Tips

    - Link your property in Google Analytics with the one in Google Search Console:
    - Open your property in Google Analytics.
    - Click `Admin`.
    - Under the `Property` section, under `PRODUCT LINKING`, click `All Products`.
    - Link Google Search Console to feed new valuable data into your Google Analytics.

    # UX and SEO

    [Rendering on the Web](https://developers.google.com/web/updates/2019/02/rendering-on-the-web)
  10. @nicolasdao nicolasdao revised this gist Jul 13, 2021. 1 changed file with 2 additions and 0 deletions.
    2 changes: 2 additions & 0 deletions seo_sem_guide.md
    @@ -4,6 +4,8 @@
    > 1. [Beginner SEO](https://developers.google.com/search/docs/beginner/get-started)
    > 2. [Advanced SEO](https://developers.google.com/search/docs/advanced/guidelines/get-started)
    > If you're just interested in performance, please refer to the [Web Vitals document](https://gist.github.com/nicolasdao/fad8bb808970805ff2fef6a84ee61af0).
    # Table of contents

    > * [Concepts](#concepts)
  11. @nicolasdao nicolasdao revised this gist Jul 12, 2021. 1 changed file with 2 additions and 1 deletion.
    3 changes: 2 additions & 1 deletion seo_sem_guide.md
    @@ -132,7 +132,8 @@ At a minimum, the page's `head` tag must contain:
    - http://www.example.com
    - https://example.com
    - https://www.example.com

    Then it is technically duplicated 4 times, which confuses the Google bot and dilutes your SEO efforts.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manufacturing` rather than `/shoe`.
    - Use hyphens in your pathname (no underscore). Google treats a hyphen as a word separator, but does not treat an underscore that way.
    - Use an __*absolute path*__ in the `canonical` URL on all pages and make sure there is a trailing slash. Also, make sure that the search params are included if they matter.
  12. @nicolasdao nicolasdao revised this gist Jul 12, 2021. 1 changed file with 6 additions and 0 deletions.
    6 changes: 6 additions & 0 deletions seo_sem_guide.md
    @@ -127,6 +127,12 @@ At a minimum, the page's `head` tag must contain:

    ### Pathname and information architecture

    - USE ABSOLUTE PATH IN ALL LINKS AND MAKE THE ROOT DOMAIN REDIRECT TO WWW. USE WWW. IN ALL YOUR ABSOLUTE PATH. The reason behind this is to create unique content. Otherwise, if your content is accessible from:
    - http://example.com
    - http://www.example.com
    - https://example.com
    - https://www.example.com
    Then it is technically duplicated 4 times, which confuses the Google bot and dilutes your SEO efforts.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manufacturing` rather than `/shoe`.
    - Use hyphens in your pathname (no underscore). Google treats a hyphen as a word separator, but does not treat an underscore that way.
    - Use an __*absolute path*__ in the `canonical` URL on all pages and make sure there is a trailing slash. Also, make sure that the search params are included if they matter.
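    One way to implement the root-domain-to-www rule above is a server-level permanent redirect. An illustrative sketch assuming nginx; `example.com` is a placeholder:

    ```nginx
    # Illustrative: send the apex domain (http or https) to the canonical
    # https://www host so the content is reachable from a single URL only.
    server {
        listen 80;
        listen 443 ssl;
        server_name example.com;
        return 301 https://www.example.com$request_uri;
    }
    ```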
  13. @nicolasdao nicolasdao revised this gist Jul 7, 2021. 1 changed file with 2 additions and 0 deletions.
    2 changes: 2 additions & 0 deletions seo_sem_guide.md
    @@ -192,6 +192,8 @@ The first two points are the most important as the last two points are good prac
    # Videos
    ## Key moments

    <img src="https://user-images.githubusercontent.com/3425269/124760540-30089000-df74-11eb-885c-b1b40f27c35e.jpg">

    https://developers.google.com/search/docs/data-types/video#clip

    # UX and SEO
  14. @nicolasdao nicolasdao revised this gist Jul 7, 2021. 1 changed file with 11 additions and 0 deletions.
    11 changes: 11 additions & 0 deletions seo_sem_guide.md
    @@ -1,5 +1,9 @@
    # SEO & SEM GUIDE

    > Google released the best document for SEO practices here: https://developers.google.com/search/docs. There are 2 sections:
    > 1. [Beginner SEO](https://developers.google.com/search/docs/beginner/get-started)
    > 2. [Advanced SEO](https://developers.google.com/search/docs/advanced/guidelines/get-started)
    # Table of contents

    > * [Concepts](#concepts)
    @@ -23,6 +27,8 @@
    > - [`sitemapimages.xml`](#sitemapimagesxml)
    > - [Multi-languages](#multi-languages)
    > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
    > * [Videos](#videos)
    > - [Key moments](#key-moments)
    > * [UX and SEO](#ux-and-seo)
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    @@ -183,6 +189,11 @@ That being said, there are a series of caveats to avoid in order to not be penal

    The first two points are the most important as the last two points are good practices for any websites in general.

    # Videos
    ## Key moments

    https://developers.google.com/search/docs/data-types/video#clip

    # UX and SEO

    [Rendering on the Web](https://developers.google.com/web/updates/2019/02/rendering-on-the-web)
  15. @nicolasdao nicolasdao revised this gist Jul 7, 2021. 1 changed file with 4 additions and 0 deletions.
    4 changes: 4 additions & 0 deletions seo_sem_guide.md
    @@ -85,6 +85,10 @@ When you click on that bar, you can see the details of those keywords.
    This is achieved by following the same steps as the [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain) section. However, you'll need to have the Semrush Guru tier at minimum (almost USD200/month). In the `Organic Keywords Trend` chart, click on any bar in the chart to see the keywords ranking details for that point in time.

    # Website to-do list

    To double-check that the list below is correctly implemented, refer to successful websites that tick all the SEO boxes:
    - https://www.nytimes.com/

    ## Standard
    ### Metadata in the `head`

  16. @nicolasdao nicolasdao revised this gist Jul 6, 2021. 1 changed file with 30 additions and 6 deletions.
    36 changes: 30 additions & 6 deletions seo_sem_guide.md
    @@ -10,6 +10,12 @@
    > - [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain)
    > * [Website to-do list](#website-to-do-list)
    > - [Standard](#standard)
    > - [Metadata in the `head`](#metadata-in-the-head)
    > - [Pathname and information architecture](#pathname-and-information-architecture)
    > - [JSON-LD](#json-ld)
    > - [Culture](#culture)
    > - [Images](#images)
    > - [Content](#content)
    > - [`robots.txt`](#robotstxt)
    > - [sitemap](#sitemap)
    > - [`sitemap.xml`](#sitemapxml)
    @@ -80,7 +86,9 @@ This is achieved by following the same steps as the [Finding the keywords rankin

    # Website to-do list
    ## Standard
    ### Metadata in the `head`

    At a minimum, the page's `head` tag must contain:
    ```html
    <title>Example Title</title> <!-- Keep it between 50 and 60 characters. Use your targeted keywords as well as long-tail keywords. -->
    <link rel="canonical" href="https://your-website.com/"> <!-- Don't forget the trailing slash -->
    @@ -106,15 +114,31 @@ This is achieved by following the same steps as the [Finding the keywords rankin
    <meta name="twitter:image" content="Path to image">
    <meta name="twitter:title" content="Page title">
    ```
    - Add the `lang` attribute on the `html` tag: `<html lang="en">`

    ### Pathname and information architecture

    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manufacturing` rather than `/shoe`.
    - Use hyphens in your pathname (no underscore). Google treats a hyphen as a word separator, but does not treat an underscore that way.
    - Use an __*absolute path*__ in the `canonical` URL on all pages and make sure there is a trailing slash. Also, make sure that the search params are included if they matter.
    - Use `alt` attribute on all images. Favor proper description, rather than SEO keywords. Use 50-55 characters (up to 16 words) in the alt text.
    - If you know the language of a link, try to add `hreflang` on it.
    - Add a trailing `/` on all internal links and make sure that all web pages are using `/`; otherwise, Google may think the 2 versions are duplicated content.

    ### JSON-LD

    - Use JSONLD on all pages (please refer to Annex in the [JSONLD examples](#jsonld-examples) section).

    ### Culture

    - Add the `lang` attribute on the `html` tag: `<html lang="en">`
    - If you know the language of a link, try to add `hreflang` on it.
    - Explicitly set up the `hreflang`, even if you only use a single language. Use an absolute path for that URL.
    - Add a trailing `/` on all internal links and make sure that all web pages are using `/`; otherwise, Google may think the 2 versions are duplicated content.
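    The `hreflang` advice above can be made concrete in the `head`, even for a single-language site. An illustrative example; the URLs are placeholders:

    ```html
    <!-- Illustrative: declare the language variants of this page, including
         a self-reference and an x-default fallback. -->
    <link rel="alternate" hreflang="en" href="https://www.example.com/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
    ```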

    ### Images

    - Use the `alt` attribute on all images. Favor a proper description rather than SEO keywords. Use 50-55 characters (up to 16 words) in the alt text.

    ### Content

    - Find a way to organize your text content so that important keywords are in `h1` tags and less important keywords are in `h2`.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manufacturing` rather than `/shoe`.
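
    As a sketch of the two points above, a page for a hypothetical shoe manufacturer might organize its headings like this (the keywords are illustrative):

    ```html
    <h1>Custom Shoe Manufacturing</h1>   <!-- most important keyword in the single h1 -->
    <h2>Leather Sourcing</h2>            <!-- secondary keywords in h2 tags -->
    <h2>Small-Batch Shoe Production</h2>
    ```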

    ## `robots.txt`

  17. @nicolasdao nicolasdao revised this gist Jun 30, 2021. 1 changed file with 65 additions and 22 deletions.
    87 changes: 65 additions & 22 deletions seo_sem_guide.md
    Original file line number Diff line number Diff line change
    @@ -9,6 +9,13 @@
    > - [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain)
    > - [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain)
    > * [Website to-do list](#website-to-do-list)
    > - [Standard](#standard)
    > - [`robots.txt`](#robotstxt)
    > - [sitemap](#sitemap)
    > - [`sitemap.xml`](#sitemapxml)
    > - [`sitemap.html`](#sitemaphtml)
    > - [`sitemapimages.xml`](#sitemapimagesxml)
    > - [Multi-languages](#multi-languages)
    > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
    > * [UX and SEO](#ux-and-seo)
    > * [How to](#how-to)
    @@ -72,34 +79,70 @@ When you click on that bar, you can see the details of those keywords.
    This is achieved by following the same steps as the [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain) section. However, you'll need to have the Semrush Guru tier at minimum (almost USD200/month). In the `Organic Keywords Trend` chart, click on any bar in the chart to see the keywords ranking details for that point in time.

    # Website to-do list

    - Add all the must have `meta` tags in the `header`:
    - `<meta name="robots" content="index,follow"> <!-- Very important, otherwise, Google might not be able to index your page -->`
    - `<link rel="canonical" href="https://your-website.com/"> <!-- Don't forget the trailing slash -->`
    - `<link rel="alternate" href="https://your-website.com/" hreflang="en"> <!-- Don't forget the trailing slash -->`
    - `<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">`
    - `<!-- SEO, Meta and Opengraph -->`
    - `<meta itemprop="description" content="Clear page description">`
    - `<meta itemprop="name" content="Page title">`
    - `<meta name="description" content="Clear page description">`
    - `<meta name="keywords" content="SEO keywords">`
    - `<meta name="og:keywords" content="SEO keywords">`
    - `<meta name="twitter:card" content="summary">`
    - `<meta name="twitter:description" content="Clear page description">`
    - `<meta name="twitter:image" content="Path to image">`
    - `<meta name="twitter:title" content="Page title">`
    - `<meta property="og:description" content="Clear page description.">`
    - `<meta property="og:image" content="Path to image">`
    - `<meta property="og:title" content="Page title">`
    - `<meta property="og:type" content="website">`
    - `<meta property="og:url" content="https://your-website.com/"> <!-- Don't forget the trailing slash -->`
    ## Standard
    - At a minimum, the page's `head` tag must contain:
    ```html
    <title>Example Title</title> <!-- Keep it between 50 and 60 characters. Use your targeted keywords as well as long-tail keywords. -->
    <link rel="canonical" href="https://your-website.com/"> <!-- Don't forget the trailing slash -->
    <link rel="alternate" href="https://your-website.com/" hreflang="en"> <!-- Don't forget the trailing slash -->

    <!-- SEO, Meta and Opengraph -->
    <meta name="title" content="Example Title">
    <meta name="description" content="This is meta description Sample."> <!-- Keep it between 50 and 160 characters. -->
    <meta name="robots" content="index,follow"> <!-- Very important, otherwise, Google might not be able to index your page -->
    <meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">
    <meta itemprop="description" content="Clear page description">
    <meta itemprop="name" content="Page title">
    <meta name="description" content="Clear page description">
    <meta name="keywords" content="SEO keywords">
    <meta name="og:keywords" content="SEO keywords">
    <meta property="og:description" content="Clear page description.">
    <meta property="og:image" content="Path to image">
    <meta property="og:title" content="Page title">
    <meta property="og:type" content="website">
    <meta property="og:url" content="https://your-website.com/"> <!-- Don't forget the trailing slash -->
    <meta name="twitter:card" content="summary">
    <meta name="twitter:description" content="Clear page description">
    <meta name="twitter:image" content="Path to image">
    <meta name="twitter:title" content="Page title">
    ```
    - Add the `lang` attribute on the `html` tag: `<html lang="en">`
    - Use an __*absolute path*__ in the `canonical` URL on all pages and make sure there is a trailing slash. Also, make sure that the search params are included if they matter.
    - Use the `alt` attribute on all images. Favor a proper description rather than SEO keywords. Use 50-55 characters (up to 16 words) in the alt text.
    - If you know the language of a link, try to add `hreflang` on it.
    - Use JSONLD on all pages (please refer to Annex in the [JSONLD examples](#jsonld-examples) section).
    - Use an absolute path in the `canonical` URL and make sure there is a trailing slash.
    - Explicitly set up the `hreflang`, even if you only use a single language. Use an absolute path for that URL.
    - Add a trailing `/` on all internal links and make sure that all web pages use `/`; otherwise, Google may think the two versions are duplicate content.
    - Find a way to organize your text content so that important keywords are in `h1` tags and less important keywords are in `h2`.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manufacturing` rather than `/shoe`.

    ## `robots.txt`

    `robots.txt` does not have more authority than the `robots` meta tag, nor vice versa, but a `noindex` in either of the two places will stop the bot from indexing.
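
    To illustrate the split of responsibilities, here is a minimal `robots.txt` sketch (the domain and paths are hypothetical) that allows crawling, blocks one section, and points the bot at the sitemap:

    ```txt
    # Applies to all crawlers.
    User-agent: *
    # Block crawling of a private section (hypothetical path).
    Disallow: /admin/
    # Tell the bot where the sitemap lives.
    Sitemap: https://your-website.com/sitemap.xml
    ```

    Note that `Disallow` only stops crawling; to stop indexing, use the `noindex` robots meta tag on the page itself.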

    ## sitemap

    There are 3 types of sitemap files:

    - [`sitemap.xml`](#sitemapxml)
    - [`sitemap.html`](#sitemaphtml)
    - [`sitemapimages.xml`](#sitemapimagesxml)

    > WARNING: The info in the sitemap.xml must be the same as in the actual page; otherwise, it will confuse the crawler bot, which might result in worse results than no sitemap at all (e.g., an outdated hreflang or canonical URL).
    ### `sitemap.xml`
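
    A minimal sketch of a `sitemap.xml` following the sitemaps.org protocol (the URLs are hypothetical; note that they match the canonical URLs, trailing slash included):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://your-website.com/</loc>
        <lastmod>2021-06-30</lastmod>
      </url>
      <url>
        <loc>https://your-website.com/about-us/</loc>
        <lastmod>2021-06-30</lastmod>
      </url>
    </urlset>
    ```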

    ### `sitemap.html`

    ### `sitemapimages.xml`
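
    A sketch of an image sitemap using Google's image sitemap extension (the URLs are hypothetical):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://your-website.com/</loc>
        <image:image>
          <image:loc>https://your-website.com/images/logo.png</image:loc>
        </image:image>
      </url>
    </urlset>
    ```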

    ## Multi-languages

    - If your page is duplicated in another language, add an `alternate` link in the `head`:
    ```html
    <link rel="alternate" href="https://your-website.com/" hreflang="en"> <!-- Don't forget the trailing slash -->
    ```
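
    If the page exists in several languages, each version should list all the alternates, itself included; a sketch with hypothetical URLs:

    ```html
    <link rel="alternate" href="https://your-website.com/" hreflang="en">
    <link rel="alternate" href="https://your-website.com/fr/" hreflang="fr">
    <link rel="alternate" href="https://your-website.com/" hreflang="x-default"> <!-- fallback for unmatched languages -->
    ```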

    # Progressive Web Apps aka PWA & SEO

    As of 2019, PWA are all the rage and Google has made a lot of progress to index them properly. To test how Google sees your PWA, please refer to the [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google) section.
  18. @nicolasdao nicolasdao revised this gist Jun 26, 2021. 1 changed file with 6 additions and 0 deletions.
    6 changes: 6 additions & 0 deletions seo_sem_guide.md
    @@ -10,6 +10,7 @@
    > - [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain)
    > * [Website to-do list](#website-to-do-list)
    > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
    > * [UX and SEO](#ux-and-seo)
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    > - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
    @@ -111,6 +112,10 @@ That being said, there are a series of caveats to avoid in order to not be penal

    The first two points are the most important as the last two points are good practices for any websites in general.

    # UX and SEO

    [Rendering on the Web](https://developers.google.com/web/updates/2019/02/rendering-on-the-web)

    # How to
    ## How to test how your page is seen by Google?

    @@ -274,3 +279,4 @@ Please refer to the [Finding the historical keywords ranking for a domain](#find
    # References
    - [Introducing Progressive Web Apps: What They Might Mean for Your Website and SEO](https://moz.com/blog/introducing-progressive-web-apps)
    - [Mobile-first indexing best practices](https://developers.google.com/search/mobile-sites/mobile-first-indexing)
    - [Rendering on the Web](https://developers.google.com/web/updates/2019/02/rendering-on-the-web)
  19. @nicolasdao nicolasdao revised this gist Apr 11, 2021. 1 changed file with 1 addition and 0 deletions.
    1 change: 1 addition & 0 deletions seo_sem_guide.md
    @@ -273,3 +273,4 @@ Please refer to the [Finding the historical keywords ranking for a domain](#find

    # References
    - [Introducing Progressive Web Apps: What They Might Mean for Your Website and SEO](https://moz.com/blog/introducing-progressive-web-apps)
    - [Mobile-first indexing best practices](https://developers.google.com/search/mobile-sites/mobile-first-indexing)
  20. @nicolasdao nicolasdao revised this gist Mar 14, 2021. 1 changed file with 22 additions and 2 deletions.
    24 changes: 22 additions & 2 deletions seo_sem_guide.md
    @@ -72,12 +72,32 @@ This is achieved by following the same steps as the [Finding the keywords rankin

    # Website to-do list

    - Add all the must have `meta` tags in the `header`:
    - `<meta name="robots" content="index,follow"> <!-- Very important, otherwise, Google might not be able to index your page -->`
    - `<link rel="canonical" href="https://your-website.com/"> <!-- Don't forget the trailing slash -->`
    - `<link rel="alternate" href="https://your-website.com/" hreflang="en"> <!-- Don't forget the trailing slash -->`
    - `<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">`
    - `<!-- SEO, Meta and Opengraph -->`
    - `<meta itemprop="description" content="Clear page description">`
    - `<meta itemprop="name" content="Page title">`
    - `<meta name="description" content="Clear page description">`
    - `<meta name="keywords" content="SEO keywords">`
    - `<meta name="og:keywords" content="SEO keywords">`
    - `<meta name="twitter:card" content="summary">`
    - `<meta name="twitter:description" content="Clear page description">`
    - `<meta name="twitter:image" content="Path to image">`
    - `<meta name="twitter:title" content="Page title">`
    - `<meta property="og:description" content="Clear page description.">`
    - `<meta property="og:image" content="Path to image">`
    - `<meta property="og:title" content="Page title">`
    - `<meta property="og:type" content="website">`
    - `<meta property="og:url" content="https://your-website.com/"> <!-- Don't forget the trailing slash -->`
    - Use JSONLD on all pages (please refer to Annex in the [JSONLD examples](#jsonld-examples) section).
    - Use an absolute path in the `canonical` URL.
    - Use an absolute path in the `canonical` URL and make sure there is a trailing slash.
    - Explicitely set up the `hreflang`, even if you only use a single language. Use an absolute path for that URL.
    - Add a trailing `/` on all internal links and make sure that all web page are using `/`, otherwise, Google may think the 2 version are duplicated content.
    - Find a way to organize your text content so that important keywords are in `H1` tags and less important keywords are in `h2`.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manafacturing` rather than `/shoe`.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manufacturing` rather than `/shoe`.

    # Progressive Web Apps aka PWA & SEO

  21. @nicolasdao nicolasdao revised this gist Jan 11, 2021. 1 changed file with 137 additions and 0 deletions.
    137 changes: 137 additions & 0 deletions seo_sem_guide.md
    @@ -8,11 +8,13 @@
    > - [Optimizing a web page for keywords](#optimizing-a-web-page-for-keywords)
    > - [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain)
    > - [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain)
    > * [Website to-do list](#website-to-do-list)
    > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    > - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
    > * [Annex](#annex)
    > - [JSONLD examples](#jsonld-examples)
    > - [ahrefs recipes to rank](#ahrefs-recipes-to-rank)
    > * [References](#references)
    @@ -68,6 +70,15 @@ When you click on that bar, you can see the details of those keywords.

    This is achieved by following the same steps as the [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain) section. However, you'll need to have the Semrush Guru tier at minimum (almost USD200/month). In the `Organic Keywords Trend` chart, click on any bar in the chart to see the keywords ranking details for that point in time.

    # Website to-do list

    - Use JSONLD on all pages (please refer to Annex in the [JSONLD examples](#jsonld-examples) section).
    - Use an absolute path in the `canonical` URL.
    - Explicitly set up the `hreflang`, even if you only use a single language. Use an absolute path for that URL.
    - Add a trailing `/` on all internal links and make sure that all web pages use `/`; otherwise, Google may think the two versions are duplicate content.
    - Find a way to organize your text content so that important keywords are in `h1` tags and less important keywords are in `h2`.
    - Use keywords in your URL paths. For example, if you're a shoe manufacturer, you may want to use a path similar to `/shoe-manafacturing` rather than `/shoe`.

    # Progressive Web Apps aka PWA & SEO

    As of 2019, PWA are all the rage and Google has made a lot of progress to index them properly. To test how Google sees your PWA, please refer to the [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google) section.
    @@ -94,6 +105,132 @@ This renders all the HTML, but unfortunately, it won't render a full image of th
    Please refer to the [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain) section.

    # Annex
    ## JSONLD examples

    - [General website description](#general-website-description)
    - [Describing a home page structure](#describing-a-home-page-structure)
    - [Describing the position in the website](#describing-the-position-in-the-website)

    ### General website description

    ```html
    <script type="application/ld+json">
    {
    "@context" : "http://schema.org",
    "@type" : "Organization",
    "legalName" : "Australian Barnardos Recruitment Services",
    "alternateName" : "ABRS",
    "url" : "https://www.abrs.net.au/",
    "contactPoint" : [{
    "@type" : "ContactPoint",
    "telephone" : "(02) 9218 2334",
    "Email" : "[email protected]",
    "contactType" : "Sydney Office"
    }],
    "logo" : "https://www.abrs.net.au/images/abrs-logo-hd.png",
    "sameAs" : "https://www.linkedin.com/company/abrs---australian-barnardos-recruitment-service/"
    }
    </script>
    ```

    ### Describing a home page structure
    ```html
    <script type="application/ld+json">
    {
    "@context":"http://schema.org",
    "@type":"ItemList",
    "itemListElement":[
    {
    "@type":"SiteNavigationElement",
    "position":1,
    "name": "Home",
    "description": "{{ page.homeSiteNavDescription }}",
    "url":"https://www.abrs.net.au/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":2,
    "name": "About Us",
    "description": "{{ page.aboutUsSiteNavDescription }}",
    "url":"https://www.abrs.net.au/about-us/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":3,
    "name": "Job Types",
    "description": "{{ page.aboutUsSiteNavDescription }}",
    "url":"https://www.abrs.net.au/job-types/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":4,
    "name": "Industry Sectors",
    "description": "{{ page.aboutUsSiteNavDescription }}",
    "url":"https://www.abrs.net.au/industry-sectors/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":5,
    "name": "Clients",
    "description": "{{ page.clientsSiteNavDescription }}",
    "url":"https://www.abrs.net.au/clients/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":6,
    "name": "Jobs",
    "description": "{{ page.candidatesSiteNavDescription }}",
    "url":"https://www.abrs.net.au/jobs/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":7,
    "name": "Blog",
    "description": "{{ page.aboutUsSiteNavDescription }}",
    "url":"https://www.abrs.net.au/blog/"
    },
    {
    "@type":"SiteNavigationElement",
    "position":8,
    "name": "Contact",
    "description": "{{ page.contactSiteNavDescription }}",
    "url":"https://www.abrs.net.au/contact/"
    }]
    }
    </script>
    ```

    ### Describing the position in the website

    ```html
    <script type="application/ld+json">
    {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [{
    "@type": "ListItem",
    "position": 1,
    "name": "Home",
    "item": "https://www.abrs.net.au/"
    }, {
    "@type": "ListItem",
    "position": 2,
    "name": "About us",
    "item": "https://www.abrs.net.au/about-us"
    },{
    "@type": "ListItem",
    "position": 3,
    "name": "Our values",
    "item": "https://www.abrs.net.au/about-us/values"
    }]
    }
    </script>
    ```

    ## ahrefs recipes to rank
    ### General SEO

  22. @nicolasdao nicolasdao revised this gist Aug 8, 2020. 1 changed file with 23 additions and 0 deletions.
    23 changes: 23 additions & 0 deletions seo_sem_guide.md
    @@ -12,6 +12,8 @@
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    > - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
    > * [Annex](#annex)
    > - [ahrefs recipes to rank](#ahrefs-recipes-to-rank)
    > * [References](#references)
    # Concepts
    @@ -91,5 +93,26 @@ This renders all the HTML, but unfortunately, it won't render a full image of th

    Please refer to the [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain) section.

    # Annex
    ## ahrefs recipes to rank
    ### General SEO

    - [How to Get on the First Page of Google](https://ahrefs.com/blog/how-to-get-on-the-first-page-of-google/)
    - [A Simple (But Effective) 31-Point SEO Checklist](https://ahrefs.com/blog/seo-checklist/)
    - [How to Improve SEO: 8 Tactics That Don’t Require New Content](https://ahrefs.com/blog/how-to-improve-seo/)
    - [SEO For Beginners: A Basic Search Engine Optimization Tutorial for Higher Google Rankings](https://www.youtube.com/watch?v=DvwS7cV9GmQ&list=PLvJ_dXFSpd2uHtGoHf8K06ebr-TIrgM0G&index=2)

    ### Keyword research

    - [How To Do Keyword Research for SEO — Ahrefs’ Guide](https://ahrefs.com/blog/keyword-research/)
    - [Keyword Difficulty: How to Determine Your Chances of Ranking in Google](https://ahrefs.com/blog/keyword-difficulty/)
    - [How many keywords can you rank for with one page? (Ahrefs’ study of 3M searches)](https://ahrefs.com/blog/also-rank-for-study/)

    ### Link building

    - [The Noob Friendly Guide To Link Building](https://ahrefs.com/blog/link-building/)
    - [9 EASY Link Building Strategies (That ANYONE Can Use)](https://ahrefs.com/blog/link-building-strategies/)
    - [Guest Blogging for SEO: How to Build High-quality Links at Scale](https://ahrefs.com/blog/guest-blogging/)

    # References
    - [Introducing Progressive Web Apps: What They Might Mean for Your Website and SEO](https://moz.com/blog/introducing-progressive-web-apps)
  23. @nicolasdao nicolasdao revised this gist Aug 7, 2020. 1 changed file with 22 additions and 6 deletions.
    28 changes: 22 additions & 6 deletions seo_sem_guide.md
    @@ -5,8 +5,9 @@
    > * [Concepts](#concepts)
    > - [SERP](#serp)
    > * [Keywords](#keywords)
    > - [Keywords ranking for a domain](#keywords-ranking-for-a-domain)
    > - [Historical keywords ranking for a domain](#historical-keywords-ranking-for-a-domain)
    > - [Optimizing a web page for keywords](#optimizing-a-web-page-for-keywords)
    > - [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain)
    > - [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain)
    > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
    > * [How to](#how-to)
    > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
    @@ -19,7 +20,22 @@
    Stands for `Search Engine Result Page`.

    # Keywords
    ## Keywords ranking for a domain
    ## Optimizing a web page for keywords

    Once you've found the list of keywords you wish a web page to rank for (ideally only a few, because each technique only supports one keyword at a time, so too many different keywords will dilute your results), place them in the following HTML tags (sorted by order of importance):

    1. `title` tag in the HTML head.
    2. `meta description` tag in the HTML head.
    3. `canonical link` tag in the HTML head (e.g., `<link rel="canonical" href="https://example.com"/>`). The Google bot hates duplicate content. This tag tells the GBot which page is the one and only page that should receive SEO love from it. Use it even if you think you don't have any duplicate pages, because in reality, you do. Indeed, as far as the GBot is concerned, https://example.com and https://example.com?refer=facebook are duplicate pages.
    4. `h1`(1) tag in the HTML body.
    5. `h2`(1) tag in the HTML body.
    6. `h3`(1) tag in the HTML body.
    7. `image alts` tag in the HTML body. There are so many missing `alt` attributes in web pages. This is a shame, as each one is a missed opportunity to rank.
    8. `anchor text`. Make sure that the text you use in your `a` tag describes the link as clearly as possible. If that link points to an external website, that website's domain authority will benefit from your good description. The same applies to an internal link. The Google bot loves organized content.

    > (1) A typical mistake is to use an H2 with no H1 because H1 looks too big. The issue is that H1 tags are worth more than H2s when it comes to SEO. If the content of your header contains keywords you wish to rank for, try to use H1. Use CSS to change its style so it matches your design.
    ## Finding the keywords ranking for a domain

    To figure out which keywords a specific domain ranks for, use the `Organic Keywords Trend` chart. To see that chart:
    1. Login to [Semrush](https://www.semrush.com/).
    @@ -46,9 +62,9 @@ The __*Organic Keywords Trend*__ can be read as follow:

    When you click on that bar, you can see the details of those keywords.

    ## Historical keywords ranking for a domain
    ## Finding the historical keywords ranking for a domain

    This is achieved by following the same steps as the [Keywords ranking for a domain](#keywords-ranking-for-a-domain) section. However, you'll need to have the Semrush Guru tier at minimum (almost USD200/month). In the `Organic Keywords Trend` chart, click on any bar in the chart to see the keywords ranking details for that point in time.
    This is achieved by following the same steps as the [Finding the keywords ranking for a domain](#finding-the-keywords-ranking-for-a-domain) section. However, you'll need to have the Semrush Guru tier at minimum (almost USD200/month). In the `Organic Keywords Trend` chart, click on any bar in the chart to see the keywords ranking details for that point in time.

    # Progressive Web Apps aka PWA & SEO

    @@ -73,7 +89,7 @@ This renders all the HTML, but unfortunately, it won't render a full image of th

    ## How to check keywords ranking history for a domain?

    Please refer to the [Historical keywords ranking for a domain](#historical-keywords-ranking-for-a-domain) section.
    Please refer to the [Finding the historical keywords ranking for a domain](#finding-the-historical-keywords-ranking-for-a-domain) section.

    # References
    - [Introducing Progressive Web Apps: What They Might Mean for Your Website and SEO](https://moz.com/blog/introducing-progressive-web-apps)
  24. @nicolasdao nicolasdao revised this gist Aug 7, 2020. 1 changed file with 6 additions and 6 deletions.
    12 changes: 6 additions & 6 deletions seo_sem_guide.md
    @@ -38,12 +38,12 @@ The top horizontal bar can be read as follow:
    The __*Organic Keywords Trend*__ can be read as follows:
    - The legend shows the colors that represent the keyword categories based on how they rank in the SERP (e.g., `Top 3` are keywords that make your domain rank in the top 3 Google pages).
    - Each vertical bar is a snapshot of the keywords ranking. For example, in the image above, hovering on the _March 20_ bar shows that 33 keywords in total ranked your website inside the first 100 SERPs. Amongst those 33 keywords:
    - 0 ranked in the first top 3 SERPs.
    - 5 ranked between the 4th and 10th SERP.
    - 4 ranked between the 11th and 20th SERP.
    - 11 ranked between the 21st and 50th SERP.
    - 13 ranked between the 51st and 100th SERP.
    - 0 ranked in the first top 3 SERPs.
    - 5 ranked between the 4th and 10th SERP.
    - 4 ranked between the 11th and 20th SERP.
    - 11 ranked between the 21st and 50th SERP.
    - 13 ranked between the 51st and 100th SERP.
    When you click on that bar, you can see the details of those keywords.

    ## Historical keywords ranking for a domain
  25. @nicolasdao nicolasdao revised this gist Aug 7, 2020. 1 changed file with 17 additions and 3 deletions.
    20 changes: 17 additions & 3 deletions seo_sem_guide.md
    @@ -2,6 +2,8 @@

    # Table of contents

    > * [Concepts](#concepts)
    > - [SERP](#serp)
    > * [Keywords](#keywords)
    > - [Keywords ranking for a domain](#keywords-ranking-for-a-domain)
    > - [Historical keywords ranking for a domain](#historical-keywords-ranking-for-a-domain)
    @@ -11,6 +13,11 @@
    > - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
    > * [References](#references)
    # Concepts
    ## SERP

    Stands for `Search Engine Result Page`.

    # Keywords
    ## Keywords ranking for a domain

    @@ -24,13 +31,20 @@ To figure out what are the keywords for which a specific domain ranks, use the `
    <img src="https://user-images.githubusercontent.com/3425269/89608523-b1652780-d8b8-11ea-8434-c6a37a8db310.png" width="650px">

    The top horizontal bar can be read as follow:
    - `Keywords`: Current number of keywords that rank withing the first 100 Google pages.
    - `Traffic`: Current number of users that have visited your website this month.
    - `Keywords`: Current number of keywords that rank within the first 100 Google pages.
    - `Traffic`: Current number of users that those keywords have brought to your website this month.
    - `Traffic cost`: How much would it cost to rank the way your keywords do.

    The __*Organic Keywords Trend*__ can be read as follows:
    - The legend shows the colors that represent the keyword categories based on how they rank in the SERP (e.g., `Top 3` are keywords that make your domain rank in the top 3 Google pages).
    - Each vertical bar is a snapshot of the keywords ranking. For example, in the image above, we've hovered the
    - Each vertical bar is a snapshot of the keywords ranking. For example, in the image above, hovering on the _March 20_ bar shows that 33 keywords in total ranked your website inside the first 100 SERPs. Amongst those 33 keywords:
    - 0 ranked in the first top 3 SERPs.
    - 5 ranked between the 4th and 10th SERP.
    - 4 ranked between the 11th and 20th SERP.
    - 11 ranked between the 21st and 50th SERP.
    - 13 ranked between the 51st and 100th SERP.

    When you click on that bar, you can see the details of those keywords.

    ## Historical keywords ranking for a domain

  26. @nicolasdao nicolasdao revised this gist Aug 7, 2020. No changes.
  27. @nicolasdao nicolasdao revised this gist Aug 7, 2020. 1 changed file with 2 additions and 2 deletions.
    4 changes: 2 additions & 2 deletions seo_sem_guide.md
    @@ -21,7 +21,7 @@ To figure out what are the keywords for which a specific domain ranks, use the `

    ### Understanding the `Organic Keywords Trend` chart

    <img src="https://user-images.githubusercontent.com/3425269/89607816-c2149e00-d8b6-11ea-806a-f193251c889b.png" width="650px">
    <img src="https://user-images.githubusercontent.com/3425269/89608523-b1652780-d8b8-11ea-8434-c6a37a8db310.png" width="650px">

    The top horizontal bar can be read as follows:
    - `Keywords`: Current number of keywords that rank withing the first 100 Google pages.
    @@ -30,7 +30,7 @@ The top horizontal bar can be read as follow:

    The __*Organic Keywords Trend*__ can be read as follows:
    - The legend shows the colors that represent the keyword categories based on how they rank in the SERP (e.g., `Top 3` are keywords that make your domain rank in the top 3 Google pages).
    - Each vertical bar is a snapshot of the keywords ranking. For e
    - Each vertical bar is a snapshot of the keywords ranking. For example, in the image above, we've hovered the

    ## Historical keywords ranking for a domain

  28. @nicolasdao nicolasdao revised this gist Aug 7, 2020. 1 changed file with 11 additions and 2 deletions.
    13 changes: 11 additions & 2 deletions seo_sem_guide.md
```diff
@@ -19,9 +19,18 @@ To figure out what are the keywords for which a specific domain ranks, use the `
 2. Select `Domain Analytics/Overview`, enter the domain in the top input and click the `Search` button.
 3. Select the `Organic Research` in the left pane to see the `Organic Keywords Trend` chart.
 
-To read that chart:
+### Understanding the `Organic Keywords Trend` chart
 
-<img src="https://user-images.githubusercontent.com/3425269/89607816-c2149e00-d8b6-11ea-806a-f193251c889b.png" width="450px">
+<img src="https://user-images.githubusercontent.com/3425269/89607816-c2149e00-d8b6-11ea-806a-f193251c889b.png" width="650px">
 
+The top horizontal bar can be read as follow:
+- `Keywords`: Current number of keywords that rank withing the first 100 Google pages.
+- `Traffic`: Current number of users that have visited your website this month.
+- `Traffic cost`: How much would it cost to rank the way your keywords do.
+
+The __*Organic Keywords Trend*__ can be read as follow:
+- The legend show the colors that represent the keywords categories based on how they rank in the SERP (e.g., `Top 3` are keywords that makes your domain rank in the top 3 Google pages).
+- Each vertical bar is a snapshot of the keywords ranking. For e
+
 ## Historical keywords ranking for a domain
```
29. @nicolasdao revised this gist Aug 7, 2020. 1 changed file with 1 addition and 1 deletion.
    seo_sem_guide.md: 2 changes (1 addition & 1 deletion)
```diff
@@ -21,7 +21,7 @@ To figure out what are the keywords for which a specific domain ranks, use the `
 
 To read that chart:
 
-<img src="https://user-images.githubusercontent.com/3425269/89607816-c2149e00-d8b6-11ea-806a-f193251c889b.png" width=450px">
+<img src="https://user-images.githubusercontent.com/3425269/89607816-c2149e00-d8b6-11ea-806a-f193251c889b.png" width="450px">
 
 ## Historical keywords ranking for a domain
```
30. @nicolasdao revised this gist Aug 7, 2020. 1 changed file with 1 addition and 1 deletion.
    seo_sem_guide.md: 2 changes (1 addition & 1 deletion)
```diff
@@ -8,7 +8,7 @@
 > * [Progressive Web Apps aka PWA & SEO](#progressive-web-apps-aka-pwa--seo)
 > * [How to](#how-to)
 > - [How to test how your page is seen by Google?](#how-to-test-how-your-page-is-seen-by-google)
-> - [How to check keywords ranking history for a domain?](#how to check keywords ranking history for a domain)
+> - [How to check keywords ranking history for a domain?](#how-to-check-keywords-ranking-history-for-a-domain)
 > * [References](#references)
 # Keywords
```