Many search engines, such as google.com, yahoo.com, and bing.com, use their own algorithms to determine the rank of websites and other HTML projects. A robot crawler saves every page, screens the content to build an index, and compiles a huge database, much like the yellow pages. The ultimate goal is that users can find exactly what they are looking for when using a search engine. A good HTML page should therefore have an SEO-friendly structure.
Barisco allows you to create projects that are optimized for search engines. A site map of the project is generated automatically, and the content is separated from the presentation without using inline CSS. In addition, Barisco adds a canonical link to each page.
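A canonical link tells search engines which URL is the preferred version of a page, so duplicate URLs are not indexed separately. As an illustration only (the domain and path here are made-up examples, not real Barisco output), such a link in the page's head looks like this:

```html
<head>
  <!-- Example canonical link; the URL is a hypothetical project address -->
  <link rel="canonical" href="https://example.com/my-edition/page-1">
</head>
```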
To improve the optimization, we advise you to:
- use a relevant edition name
- use a relevant page name
- add a description to each page. This description can be one or two sentences that relate to the content of the page.
To make your publication private, you can deactivate the site map and meta data. Go to Dashboard > Settings > SEO Active.
Keywords are the index keys that crawlers use to classify your website and its content. This means people can find your website through the keywords you use.
Example keywords:
cake in Amsterdam, cakes in Amsterdam, Amsterdam cake, Amsterdam cakes, baking factory in Amsterdam, Amsterdam baking factory, pastry in Amsterdam, Amsterdam pastry
Each keyword must be separated by a comma. The keywords “Amsterdam cake” and “Amsterdam cakes” are not the same. Do not use a keyword like “Amsterdam” or “cake” on its own, because it is too generic; every keyword should be a specific phrase.
To see the page source, right-click on the web page in your browser and choose View source. In the HTML you will see the meta tags “keywords” and “description” defined for your website.
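In the page source, the relevant part might look something like this (the title, keywords, and description below are made-up examples based on the bakery keywords above, not output from a real Barisco project):

```html
<head>
  <title>Amsterdam Baking Factory</title>
  <!-- Keywords: comma-separated, specific phrases -->
  <meta name="keywords" content="cake in Amsterdam, Amsterdam cakes, Amsterdam pastry">
  <!-- Description: one or two sentences that relate to the page content -->
  <meta name="description" content="We bake fresh cakes and pastries in the heart of Amsterdam.">
</head>
```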
To check how a crawler sees your site map, add /sitemap.xml to your URL.
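A site map is an XML file that lists the pages of your project. A minimal sketch of what you might see, using a hypothetical project URL (the actual file Barisco generates may contain more detail), is:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page of the project; the addresses are examples -->
    <loc>https://example.com/my-edition/page-1</loc>
  </url>
  <url>
    <loc>https://example.com/my-edition/page-2</loc>
  </url>
</urlset>
```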
Proper SEO settings help robot crawlers to classify and index your project and improve its ranking in search results.