Best Website Builders For SEO (2024) – Forbes Advisor UK

Here are the definitions of some key SEO terms you’re likely to encounter:

Meta title: The main title of a page, as defined in its underlying HTML code. Search engines such as Google display these as the clickable headlines in their results pages. The meta title of this page, for example, is: ‘Best Website Builders For SEO’.
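
In HTML, the meta title is set with the ‘title’ element inside a page’s ‘head’ section. A minimal sketch, using this article’s own title:

```html
<head>
  <!-- The text inside <title> is what search engines show as the clickable headline -->
  <title>Best Website Builders For SEO</title>
</head>
```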

Meta description: A brief synopsis of a webpage, designed to explain to a user, within search engine results pages, what it contains. Meta descriptions can be used to demonstrate to a user that your site is what they are looking for. The meta description for this page, for example, is: ‘Find the best website builder to help your business appear in Google searches and attract more customers’.
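
The meta description lives in a ‘meta’ tag in the page’s ‘head’ section. A minimal sketch, using this article’s own description:

```html
<head>
  <!-- The content attribute holds the synopsis shown beneath the title in results pages -->
  <meta name="description"
        content="Find the best website builder to help your business appear in Google searches and attract more customers">
</head>
```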

Permalink: A static, unchanging URL for a page that can tell search engines more about what it contains. Some website builders allow you to customise your permalinks to target them for certain keywords.

Responsive design: A responsive website will be as easily readable on a laptop as it will on a mobile device, adapting itself dynamically to fit its space and give the user the best experience.
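
Two common building blocks of responsive design are the viewport meta tag and CSS media queries. A minimal sketch (the 600px breakpoint and the class name are illustrative):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Side-by-side layout on wide screens... */
  .content { display: flex; }

  /* ...stacked into a single column on screens narrower than 600px */
  @media (max-width: 600px) {
    .content { flex-direction: column; }
  }
</style>
```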

301 redirect: A digital signpost that tells browsers and search engines to automatically forward a user from the page they chose to visit to another page. Website owners use 301 redirects when an old or defunct page is replaced with a new one and they want search engines to send users to the newer version.
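
How a 301 is configured depends on your web server or website builder. As one illustration, a rule for the nginx web server (the paths are hypothetical) might look like:

```nginx
# Permanently forward visitors (and search engines) from the old page to its replacement
location /old-page {
    return 301 /new-page;
}
```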

SSL certificate: A digital certificate, verified by a trusted third party, that authenticates a website’s identity and enables an encrypted (HTTPS) connection. Search engines such as Google favour sites that serve their pages securely.

Canonical tag: A small piece of HTML code that tells a search engine which page to index from a set of duplicate pages. For example, your website might have two versions of a page for two different geographies. Adding a canonical tag to the duplicate, pointing at your preferred version, prevents search engines from showing duplicate listings in their results pages.
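
The canonical tag is a ‘link’ element in the duplicate page’s ‘head’ section. A minimal sketch (the URL is illustrative):

```html
<head>
  <!-- Tell search engines to index the preferred version of this page, not this duplicate -->
  <link rel="canonical" href="https://www.example.com/preferred-page">
</head>
```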

Sitemap: A file listing all the pages on your website and how they relate to each other. Search engines use these files to more efficiently index websites.
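
Sitemaps are usually published as an XML file following the sitemaps.org protocol. A minimal sketch with one entry (the URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod tells crawlers when it last changed -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```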

Robots.txt: A file that helps to control web requests from web robots (‘bots’) to your website. For example, a robots.txt file can be used to determine which parts of your site a bot can ‘crawl’ and which it cannot. You might, for instance, want to prevent Googlebot from crawling your login page.
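
A robots.txt file is plain text served from the root of your site. A minimal sketch matching the login-page example above (the paths are illustrative):

```text
# Apply to all bots, and block crawling of the login page
User-agent: *
Disallow: /login

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```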

Schema/structured data: Structured data is a standardised format for web content that search engines can use for categorisation. For example, if you were to search for ‘events in London’, you would see a list of upcoming events that Google has pulled from websites that use structured data to tell it they are events.
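
Structured data is commonly embedded as JSON-LD using the schema.org vocabulary. A minimal sketch of an event marked up this way (the names and dates are illustrative):

```html
<!-- JSON-LD block using the schema.org Event type -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Music Festival",
  "startDate": "2024-07-21",
  "location": {
    "@type": "Place",
    "name": "Example Arena",
    "address": "London, UK"
  }
}
</script>
```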

HTML: HyperText Markup Language, the standard markup language used to build web pages.

URL: A Uniform Resource Locator (URL) is a web page’s human-readable address on the internet.
