This article compares client-side and server-side scripts for generating website content. A client-side script runs in the user's browser after the web page loads. A server-side script runs on the server before the content is sent to the user's browser.
Client-side scripting is used for user interaction and dynamic user interfaces. It reduces server load, but the generated content may not be discoverable by search engines. Server-side scripting is used for secure, SEO-friendly, and database-driven content.
Mandatory: Website design using HTML, CSS, JavaScript, and PHP.
Optional: Basic knowledge of search engines and crawlers.
A search engine indexes web pages and ranks them; examples include DuckDuckGo, Google, Bing, Yahoo, and Yandex. Search engines use crawlers to discover new web pages and content. Crawling web pages on the internet is analogous to traveling in the real world: it costs time and resources, and crawlers are resource-constrained. Therefore, most crawlers only discover static content, i.e., content available immediately after downloading files from the server.
Scripts are used to make a website user-friendly while reducing coding overhead. Some common use cases for scripting are:
- Handling user interaction and updating the user interface without reloading the page
- Generating secure, database-driven content on the server
- Personalizing content for individual users
- Delivering content efficiently, e.g., lazy loading of images and adaptive video streaming
There are several ways to generate website content on the client side, the most popular being JavaScript. Website content generated by client-side scripting should be limited to content for which crawling by search engines is not important.
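As an illustration, here is a minimal sketch of client-side content generation in JavaScript; the element ID and the greeting text are placeholders, not taken from any specific site:

```html
<!-- The HTML shipped by the server is an empty shell. -->
<div id="greeting"></div>
<script>
  // The content exists only after the browser runs this script, so a
  // crawler that does not execute JavaScript will never see it.
  const hour = new Date().getHours();
  const message = hour < 12 ? "Good morning!" : "Good evening!";
  document.getElementById("greeting").textContent = message;
</script>
```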
There are several ways to generate website content on the server side, the most popular being PHP. Website content generated by server-side scripts is discoverable by crawlers. Server-side scripting is also useful for personalizing content for individual users.
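As a sketch, the PHP fragment below produces the same kind of greeting on the server; because the text arrives as finished HTML, crawlers can index it without executing any script:

```php
<?php
// The greeting is computed on the server before the response is sent,
// so the browser (and any crawler) receives plain HTML.
$hour = (int) date("G");  // current hour, 0-23
$message = $hour < 12 ? "Good morning!" : "Good evening!";
?>
<div><?php echo htmlspecialchars($message); ?></div>
```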
Depending on the scripting mode used to generate website content, one may need additional steps to make the content discoverable by search engines. For client-side scripting, pre-rendering is required for discovery by crawlers. Scripting is also used for efficient delivery of content, e.g., lazy loading of images and DASH video streaming. Though this prevents media discovery by crawlers, one can make the media discoverable as follows:
- Use the native loading="lazy" attribute so the image URL stays in the src attribute of the served HTML.
- List image and video URLs in an image or video sitemap.
- Provide fallback markup, e.g., a <noscript> tag pointing to the media file, when lazy loading is driven by a script (see the sketch below).
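The snippet below sketches both approaches for an image; the file name photo.jpg and the class name lazy are placeholders:

```html
<!-- Native lazy loading: the browser defers the download, but the URL
     remains in the markup, so crawlers can still discover the image. -->
<img src="photo.jpg" alt="A descriptive caption" loading="lazy">

<!-- Script-driven lazy loading with a crawler-visible fallback. -->
<img data-src="photo.jpg" alt="A descriptive caption" class="lazy">
<noscript><img src="photo.jpg" alt="A descriptive caption"></noscript>
<script>
  // Copy data-src into src once the image scrolls near the viewport.
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll("img.lazy").forEach((img) => observer.observe(img));
</script>
```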
Crawlers increase the load on servers. Therefore, depending on how resource-intensive a website is, one may need to trade off discoverability by crawlers against bandwidth conservation. There are several ways to reduce the bandwidth overhead caused by crawlers:
- Disallow crawling of unimportant or resource-heavy paths in a robots.txt file (see the sketch after this list).
- Set a Crawl-delay directive in robots.txt; some crawlers, such as Bing, honor it, while Google ignores it.
- Serve ETag or Last-Modified headers so that crawlers receive lightweight 304 Not Modified responses for unchanged pages.
- Keep sitemaps up to date with lastmod timestamps so crawlers can prioritize pages that have actually changed.
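A minimal robots.txt sketch combining these ideas; the disallowed paths and the sitemap URL are placeholders:

```
# Keep crawlers out of resource-heavy endpoints (example paths).
User-agent: *
Disallow: /search
Disallow: /media/raw/

# Honored by some crawlers (e.g., Bing); Google ignores it.
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```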
Anurag Gupta is an M.S. graduate in Electrical and Computer Engineering from Cornell University. He also holds an M.Tech degree in Systems and Control Engineering and a B.Tech degree in Electrical Engineering from the Indian Institute of Technology, Bombay.