Referencing a website: the real answers, the right method

Be highly visible to search engines


Introduction

Many webmasters do not know how to get their websites referenced by search engines. Here is a summary of what you need to do to be properly indexed. SEO is surrounded by myths, because everyone wants to be at the top of the results. I will break those myths, tell you the truth and save you money.

Referencing, indexing, positioning?
What do all these words mean?

Referencing is the registration of your website in the databases of search engines.
Indexing is the action by which a search engine records all the web pages of your site in its database.
Positioning is the ranking your website gets when the search results are displayed. Everyone wants their site to appear on the first page.

1- Referencing: registering with search engines

Here are the direct links to register your website with the major search engines. This registration is free. Just put the URL of your website and that's it.
Google: google.com/webmasters/tools/submit-url
Bing: bing.com/toolbox/submit-site-url
Qwant: help.qwant.com/fr/aide/qwant-com/comment-referencer-mon-site-sur-qwant/
Baidu: ziyuan.baidu.com/linksubmit/url
Yandex: webmaster.yandex.ru

Paying to be registered in 300 indexes and directories is pointless: with the engines above, you already reach 99% of the market.

It takes at least 1 to 4 weeks for a search engine to visit your website, index it and display it in the search results. It is possible to get a first, limited result in 1 to 4 days (not all the pages will have been indexed and included in the database yet). Why does it take so long? Google has thousands of servers around the world; they must synchronize, and that takes at least a few weeks. However, some new pages can be indexed in a few hours if the search engine is notified of the change (see below).

2- Indexing: helping the search engine to save all your pages

The big search engines index billions of pages, and they believe that at least half have eluded them. There is an official and standard method to help them: the “sitemap.xml” file.

Creating the “sitemap.xml” file is fundamental to the success of your SEO and indexing. It has two purposes: provide a complete list of all your pages, and tell the search engine which pages have been modified since its last visit. This way, it will index those pages and not waste time on the others. In my experience, a modification can then appear in the results within a few days; otherwise it is closer to 1 to 4 weeks. Without a “sitemap.xml” file, you may wait up to 6 months before the search engine displays your changes.

Go to the official website sitemaps.org to find out more.

a) Creation of the “sitemap.xml” file

If you use a CMS, blog, forum, e-commerce platform, etc., it is likely that a function already exists to create this file automatically. Otherwise, a plug-in, module or extension should be available; check the website of your CMS, blog or e-commerce platform to find out more.
Software installed either on the web server or on your computer can also do this. Finally, some websites offer automatic file creation; some of these services are free, others require a fee.
For small websites, I use this free service: xml-sitemaps.com , but there are others.
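If none of these options suits you, the file is simple enough to generate yourself. Here is a minimal sketch in Python; the domain, page paths and dates are placeholders to replace with your own:

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(base_url, pages):
    """Build a minimal sitemap.xml body from (path, last_modified) pairs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for path, lastmod in pages:
        lines.append('  <url>')
        lines.append(f'    <loc>{escape(base_url + path)}</loc>')
        lines.append(f'    <lastmod>{lastmod.isoformat()}</lastmod>')
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

# Hypothetical site: replace with your own domain and pages.
xml = build_sitemap("https://www.example.com", [
    ("/", date(2024, 1, 15)),
    ("/contact.html", date(2024, 1, 10)),
])
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(xml)
```

The `<lastmod>` element is what lets the engine skip pages that have not changed since its last visit.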

b) Register the “sitemap.xml” with search engines

Search engines will not find your "sitemap.xml" file by chance; you must tell them it exists.

It is recommended that the “sitemap.xml” file be at the root of your hosting, at the first level, where the index.html or index.php file is located. You don't have to call it “sitemap.xml”, but it is better to respect this convention. The file can be gzip-compressed to take less space; it will then be named “sitemap.xml.gz”. Search engines can read the compressed version.
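Any gzip tool can produce the compressed version. As a sketch, here is how to do it in Python (the sitemap content below is a tiny placeholder, just for the demonstration):

```python
import gzip
import shutil

# A tiny placeholder sitemap, just for the demonstration.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"></urlset>\n')

# Compress it to sitemap.xml.gz, keeping the original file.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
```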

There are 4 methods of registration:
On the first page of your website, put a link to the file “sitemap.xml”. When a search engine finds it, it will read it.

Another official method is to reference it in the “robots.txt” file. If you don't know what that is, see the official page robotstxt.org. In short, it is used to prevent search engines from indexing specific folders and files. Search engines look for this file at every visit, so there is no need to register it anywhere. It must be at the root of your hosting, at the first level, where the index.html or index.php file is found.
So, we add the following line in the file “robots.txt”:

Sitemap: https://www.domaine.tld/sitemap.xml
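For example, a complete “robots.txt” that blocks a private folder and declares the sitemap could look like this (the folder name and domain are placeholders):

```
User-agent: *
Disallow: /private/

Sitemap: https://www.domaine.tld/sitemap.xml
```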
Notify the search engines directly. This complements the registration in point 1; it does not replace it. This method is mainly used to tell them that the “sitemap.xml” file has been updated and that they will find the references of the new pages in it.
Replace https://www.domaine.tld/sitemap.xml with the URL of your sitemap:
Google:
https://www.google.com/ping?sitemap=https://www.domaine.tld/sitemap.xml
Bing:
https://www.bing.com/ping?sitemap=https://www.domaine.tld/sitemap.xml
Baidu:
https://ping.baidu.com/ping.html and submit the sitemap URL in the form.
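The ping can be scripted. This sketch only builds the notification URLs (the sitemap address must be URL-encoded); actually sending the requests is left commented out. Note that Google announced in 2023 that it was retiring its sitemap ping endpoint, so the webmaster interfaces described in the next point are the more durable route.

```python
from urllib.parse import quote

def ping_urls(sitemap_url):
    """Build the sitemap notification URLs for Google and Bing."""
    encoded = quote(sitemap_url, safe="")  # percent-encode ':' and '/'
    return {
        "google": f"https://www.google.com/ping?sitemap={encoded}",
        "bing": f"https://www.bing.com/ping?sitemap={encoded}",
    }

urls = ping_urls("https://www.domaine.tld/sitemap.xml")
# To actually notify the engines, fetch each URL, e.g.:
# import urllib.request
# for u in urls.values():
#     urllib.request.urlopen(u)
```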
Manage the registration of your “sitemap.xml”. Google, Bing and others offer an interface to control indexing. To prove that you are indeed the owner or the webmaster, they will ask you to authenticate the website, either by placing an HTML meta tag in the index.html or index.php file, or by uploading a file containing a verification code to your hosting. I invite you to take this step to speed up your referencing and the updating of your sites. You must open a free account to do so:
Google: search.google.com/search-console/about
Bing: bing.com/toolbox/webmaster/
Baidu Webmaster Tools: ziyuan.baidu.com/
Yandex Webmaster Tool: webmaster.yandex.ru

3- Positioning: being number 1

This is where all the fantasies and frustrations take shape. Let's break the first myth: there is no method that guarantees a place on the first page. For some popular keywords, Google even takes care to shuffle the ranking so that it is not always the same websites that appear first. No SEO company can guarantee a ranking. Do not pay for a service that promises fast referencing and a guaranteed position on the first page: it is a lie, for a simple reason: no company can force these internet giants to do anything. However, there are techniques to gain a better position. As a gift, I give you the conclusions without going into detail; it's up to you to do the work. I only talk about Google, knowing that the others follow the leader's rules and do the same.

To be successful, you have to keep in mind a guiding principle that Google asserts every time it changes its evaluation criteria. First and foremost, you must make a website for humans (fast, accessible, with well-presented and honest information, with a logical layout and outline). It's not up to you to adapt to Google, but it's Google that adapts to humans and their way of reading quality information. If your content is appreciated and relevant to a human, it will also be relevant to the search engine.

a) Optimize your web pages for good indexing:

The <title> tag must be explicit and describe what is in the page. It does not need to be long: search engines truncate titles that exceed about 60 characters (spaces and punctuation included).
The meta description and keywords tags: a lot of nonsense is said about them. Simply put, the meta keywords tag is useless: it is no longer taken into account because of past abuse (lists that were too long, did not match the content, etc.). The meta description tag can be useful: it must be less than about 160 characters (spaces and punctuation included) and will be displayed as the description in the results page, highlighting your keywords. This summary must match the content, otherwise the page risks being penalized. The meta description tag is optional; if it is missing, the search engine displays an extract of the page highlighting the keywords found in the text (which is also a good way to invite the reader to view your page).
When you write your article, highlight important keywords using the tags <i> <b> <strong> <em> <h1> <h2> <h3> <h4>. But don't overdo it: if there are too many of them, or if a keyword is highlighted too often, it will not be taken into account.
Your URL should contain a few keywords, but it must not exceed 120 characters, otherwise the keywords it contains will not be taken into account in the search result (although technically the URL can be much longer).
Page length matters. Avoid texts that are too long and heavy (more than 20 KB): 300 words per page is a minimum, and 10,000 words per page seems to be a maximum.
For indexing robots, content at the top of the page carries more weight than the rest. As in journalistic writing, make the first paragraph a summary of the article so that it contains your keywords. This spot (the introduction or teaser) is the most strategic, because your first keywords carry more weight. Consequently, place the HTML code of the sidebars and menu bars at the end of the HTML file, where it will not interfere with the indexing of the text, and use CSS style sheets to position these elements where you want them.
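Putting these rules together, the head of an optimized page could look like the following sketch (the title and description texts are of course placeholders to adapt to your content):

```html
<head>
  <meta charset="utf-8">
  <!-- Under 60 characters, describes what is in the page -->
  <title>Referencing a website: method and direct links</title>
  <!-- Optional, under 160 characters, must match the content -->
  <meta name="description"
        content="How to register a site with search engines, create a sitemap.xml and improve its positioning.">
</head>
```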

b) Having a good “pagerank” = having a lot of friends.

The more websites cite you and link to yours, the more popular you are (so you are considered interesting and relevant) and the better your site is ranked. That is the key. SEO companies can help by artificially increasing your popularity through their network of websites, and by optimizing your site with their experience.

Otherwise, do it yourself by talking about your site in many forums, blogs, newsgroups, partnerships…

Effective SEO takes a long time: it takes months, and it needs patience.

Or pay for advertising to ensure your visibility, but this will never influence the organic search results.

4- Test and optimize your website

Good, unique content therefore gives good natural SEO. However, if the site is slow, with content that is complicated for a robot to read, indexing will be limited and punished by a poor position, far behind the competitors. It is helpful to know how the indexing robot works and which good practices improve positioning and visibility.

a) Search Engine Optimization:

An article from Wikipedia gives a good overview of search engines' indexing rules, which competing services replicate for consistency. It contains references to other articles, written in English in a jargon that remains relatively accessible to the informed amateur.
Wikipedia: Search engine optimization

b) Test the site and analyse the results:

In addition to the content of a web page, a whole series of technical criteria are taken into account to evaluate the positioning in the search results. Some are easy to fix, others require technical expertise to optimize coding and set up the web server. I invite you to have your website examined at these addresses, which will display a detailed report and an evaluation of the technical efficiency:
Google Developers inspect your site and give you advice: developers.google.com/speed/pagespeed/insights/
Varvy SEO tool tests and gives a lot of tips to speed up your website: varvy.com
Webpagetest examines the performance of a site: webpagetest.org
GTMetrix analyses the speed of a site: gtmetrix.com
Pingdom determines the speed of a site: tools.pingdom.com
