Search Engine Optimization: Best Practices, Web Tools for Site Audit
The algorithm for broadcasting content on social media is opaque and relies heavily on artificial intelligence. Its aim is to increase engagement rather than to disseminate information. Without insight into this algorithm, broadcasting content on social media has become increasingly difficult.
Social media platforms favour sponsored content because of its monetary value. The flood of sponsored content obfuscates essential information.
Centralized institutions force social media owners to moderate and censor content. The appeal process for censored content is lengthy and strenuous, and it can lead to unfavourable outcomes in critical situations.
A personal website can solve most of the issues above and provide a reliable platform for content delivery. Although centralized institutions regulate the Internet, moderating and censoring the content of an individual website is difficult; on social media, opaque algorithms make such moderation easy and invisible. Blocking an entire domain to censor content is quickly noticed by users, which increases the accountability of centralized institutions. It is essential to understand that centralized and decentralized technologies are not black and white but lie on a spectrum. On this spectrum, a platform controlled by an opaque social media algorithm is more centralized than hosting your own website for content propagation.
Creating quality content is essential both for social media and for personal websites. What differentiates the two is the mechanism for broadcasting the content. Social media platforms handle content dissemination for the user, but posting on an individual webpage requires extra effort from the creator. To reach the target audience, the content creator needs to enhance their website's visibility through search engines, as users primarily rely on search engines to find specific information on the web. This blog will guide such creators in making their website (1) accessible to search engines for indexing and (2) ranked higher so that it is featured at the top of search results. Indexing allows a search engine to populate its database and refine its search results. Ranking decides the order of the results for a search query; a page's rank is equivalent to its reputation. Building a reputation for a webpage is similar to building a reputation for yourself: you make the page known to others and ask them to vouch for it by linking to it. Technically, this is known as building external backlinks.
This blog primarily focuses on making your webpage accessible to search engines. Specifically:
What are meta tags, and how can they be used to summarize content for search engines?
What are good coding practices for writing a website?
How do you inform search engines about new content?
What site audit tools are available to identify and correct errors?
Head, Meta, OG, Twitter tags
Specify a region for your site. This can affect the site's ranking in search results for location-dependent queries. Put this tag near the top of the head section so that it is processed before other information about the page.
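A minimal sketch using the common geo meta-tag convention (recognized by some engines, largely ignored by Google); the region and place values below are placeholders:

```html
<!-- Region targeting; the ISO 3166 codes here are placeholder values. -->
<meta name="geo.region" content="IN-MH">
<meta name="geo.placename" content="Mumbai">
```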
Add a canonical link to prevent crawlers from treating near-identical URLs as duplicate content. For example, https://www.anuragg.in/ and https://www.anuragg.in/index.html serve duplicate content. Place the canonical link after the meta tags but before the Open Graph tags; otherwise, the page can be excluded from search results because it is treated as belonging to a secondary mirror of the site.
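For example, both URL variants above should point search engines to a single canonical version:

```html
<!-- Declares https://www.anuragg.in/ as the preferred URL for this page. -->
<link rel="canonical" href="https://www.anuragg.in/">
```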
A general recommendation is to keep your page description between 110 and 160 characters, although Google can sometimes show longer snippets.
Keep the title under 70 characters. Make it unique and relevant so it best describes the page content.
Specify the language of the content.
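Putting the three recommendations above together, a minimal head sketch; the title and description text are placeholders:

```html
<html lang="en"> <!-- declares the language of the content -->
<head>
  <!-- Unique, relevant title under the recommended length -->
  <title>SEO Best Practices: Meta Tags and Site Audit Tools</title>
  <!-- Description of roughly 110-160 characters, often shown as the result snippet -->
  <meta name="description" content="A practical guide to meta tags, crawlability, sitemaps, and audit tools that help search engines index and rank your website.">
</head>
```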
Favicon: a favicon is a small image displayed in search-result snippets, next to the site address in the browser address bar, and near the site name in favourites or bookmarks. A favicon makes it easier for users to find your site in search results and among browser tabs, which speeds up their work with the site and simplifies returning to it.
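A typical declaration, assuming the icon files sit at the site root:

```html
<!-- A classic .ico for broad support, plus an SVG for modern browsers. -->
<link rel="icon" href="/favicon.ico" sizes="32x32">
<link rel="icon" href="/favicon.svg" type="image/svg+xml">
```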
Best Practices
Coding guidelines
Title length should be between 50 and 60 characters. Reference: https://moz.com/learn/seo/title-tag
Having duplicated title tag values can negatively impact a website’s SEO performance. Title tags are HTML elements that define the title of a web page and summarize its contents. They are important for both search engines and users, as they help search engines understand the content of the page and help users decide whether to click on the page in search results.
When multiple pages on a website have the same title tag, it can create confusion for both search engines and users. Search engines may have difficulty determining which page is the most relevant for a given query, and users may have difficulty distinguishing between pages in search results. This can result in lower click-through rates, reduced traffic, and lower search engine rankings.
To avoid these issues, it is important to ensure that each page on a website has a unique, descriptive title tag that accurately reflects the content of the page. This can help improve the website's SEO performance and provide a better user experience for visitors.
Keep the meta description between 50 and 160 characters. Reference: https://moz.com/learn/seo/meta-description
Use one, and only one, h1 tag per page. Keep it ideally under 60 characters; there is no technical limit. Reference: https://moz.com/learn/seo/h1-tags
Use full, absolute links including the https:// or http:// scheme (for example, https://www.anuragg.in/blog.html rather than /blog.html) instead of relative links. This is required for crawlability.
Use .htaccess to rewrite URLs. This is useful if you migrate from .html to .php in the future: you won't have to add a redirect rule in .htaccess, and redirects are not favored by search engines.
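A minimal sketch, assuming Apache with mod_rewrite enabled: publish extensionless URLs and let the server pick whichever file exists, so a later .html-to-.php migration needs no redirect.

```apache
RewriteEngine On

# Serve /page from page.html while the site is static...
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]

# ...and from page.php after a migration, with the public URL unchanged.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]
```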
Use proper anchor text for links. External links to authoritative domains can improve SEO; spammy external links to low-quality sources will have negative consequences.
Provide a noscript tag so that crawlers and users without JavaScript still see meaningful content, as sketched below.
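A sketch of the idea: keep a plain-HTML fallback for anything a script normally renders.

```html
<!-- Widget container populated by JavaScript at runtime. -->
<div id="comments"></div>
<noscript>
  <!-- Seen by crawlers and by users with JavaScript disabled. -->
  <p>Comments require JavaScript; the article text itself is plain HTML
     and remains fully readable without it.</p>
</noscript>
```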
Helping the crawlers with indexing
Lazy loading and infinite scrolling can cause crawlers to miss some content. Make sure a static version is available to the crawler. Alternatively, update the URL based on the scroll position and add all the URLs to the sitemap.
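One safe pattern is the browser's native lazy loading, which keeps the image tag in the static HTML that crawlers fetch, unlike images injected later by JavaScript (the file path, alt text, and dimensions below are placeholders):

```html
<!-- The src stays in the initial HTML, so crawlers can still index the image. -->
<img src="/images/chart.png" alt="Search traffic over time"
     loading="lazy" width="800" height="450">
```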
Use server-side rendering to present static content to the crawler. Do not add links or images with JavaScript, as they will be rendered too late for the crawler to see.
If your website's content depends on the user's location, use GeoIP lookup as a fallback for crawlability, since crawlers may not grant the browser's geolocation permission.
Generate a sitemap (for crawlability), JSON-LD (for rich results/structured data), an RSS feed (for subscribers), and a robots.txt file (to guide crawlers).
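For instance, a minimal robots.txt that also points crawlers at the sitemap (both files are assumed to sit at the site root):

```
User-agent: *
Allow: /

Sitemap: https://www.anuragg.in/sitemap.xml
```

And a minimal JSON-LD sketch for a blog post; every field value here is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Search Engine Optimization: Best Practices",
  "author": { "@type": "Person", "name": "Anurag Gupta" },
  "datePublished": "2024-01-01"
}
</script>
```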
How to increase engagement?
Your website should be mobile-friendly (see the viewport snippet after this list).
Internal links help with SEO and also keep users engaged.
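Mobile-friendliness starts with the standard viewport declaration in the head:

```html
<!-- Without this, mobile browsers render at desktop width and scale the page down. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```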
Improving visibility
Set up IndexNow for each search engine to notify it of new or updated content.
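In its simplest form, documented at indexnow.org, a single GET request notifies participating engines of a changed URL; the page path and key below are placeholders, and the key must match a key file hosted on your site:

```
https://api.indexnow.org/indexnow?url=https://www.anuragg.in/new-post.html&key=YOUR_INDEXNOW_KEY
```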
Expand your site traffic by distributing content to Bing, Edge, Windows, Office & Dynamics. Refer to the publisher guidelines at https://www.bing.com/webmasters/help/pubhub-publisher-guidelines-32ce5239
Set up a business profile on Google, Yandex (which also requires uploading a YML file), and Bing. Register your company for authenticity.
SEM: Search Engine Marketing
Web Tools
Site audit
Use the Screaming Frog SEO Spider to check for missing files and other indexability issues.
Check your website's mobile compatibility in Yandex Webmaster.
Use the Yandex Webmaster server-response check to verify that your pages return the expected HTTP responses.
How to improve the visibility and quality of a webpage?
Use the Bing Webmaster keyword research tool to identify keywords for your pages.
Lighthouse is an open-source, automated tool for improving the quality of web pages. You can run it against any web page, public or requiring authentication. It has audits for performance, accessibility, progressive web apps, SEO, and more.
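With Node.js installed, the Lighthouse CLI can audit a page locally; the URL is just an example:

```
npm install -g lighthouse
lighthouse https://www.anuragg.in/ --view   # opens the HTML report when finished
```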
Handling Increased Website Traffic
Consider a VPS or a dedicated server, and weigh a plain Linux setup against cPanel-based hosting.
To avoid the cPanel license cost, use an open-source web host manager, for example:
Virtualmin
Monetizing websites
Google AdSense.
Yandex Turbo Pages.
Contact companies relevant to your content and ask for sponsorship.
Author
Anurag Gupta is an M.S. graduate in Electrical and Computer Engineering from Cornell University. He also holds an M.Tech degree in Systems and Control Engineering and a B.Tech degree in Electrical Engineering from the Indian Institute of Technology, Bombay.