One, how search engines work

Behind a search engine site sits a very large database that stores a huge number of keywords, and each keyword maps to many URLs. These URLs are collected bit by bit from the vast Internet by programs called "search engine spiders" or "web crawlers". As new websites keep appearing, these hard-working "spiders" crawl the Internet every day, following one link after another, downloading the content, analyzing it and extracting its keywords. If the "spider" decides a keyword is not yet in the database and is useful to users, it stores the page in the back-end database; if it decides the page is spam or duplicate information, it discards it and keeps crawling, looking for the newest, useful content to save for users to search. When a user searches, the URLs related to the keyword can then be retrieved and shown to the visitor.

Because one keyword corresponds to many URLs, there is a ranking problem: the URLs that best match the keyword come first. In the process of grabbing page content and extracting keywords, the "spider" faces a question: can it understand the page? If the site's content is Flash, JS and the like, it cannot understand it and will be confused, no matter how appropriate the keywords are. Conversely, if the site's content can be recognized by the search engine, the search engine will increase the site's weight and its friendliness towards the site. This process of making a site recognizable is what we call SEO.

Two, SEO overview

SEO stands for Search Engine Optimization. SEO appeared together with search engines; the two promote each other in a mutually beneficial relationship. SEO exists as the set of optimization practices that improve how many of a site's pages are included in a search engine's natural (organic) results and how high they rank. The purpose of the optimization is to raise the site's weight in the search engine and increase the search engine's friendliness towards it, so that the site ranks near the front when users search.

SEO falls into two categories: white hat SEO and black hat SEO. White hat SEO improves and standardizes website design, makes the site friendlier to both search engines and users, and lets the site obtain reasonable traffic from search engines; this is encouraged and supported by search engines. Black hat SEO exploits and amplifies defects in search engine policies to obtain more visits; most of this behavior cheats the search engine, and search engine companies generally neither support nor encourage it. This article is about white hat SEO, so what can white hat SEO do?

1. Carefully set the title, keywords and description of the website to reflect the positioning of the website and let the search engine understand what the website is about;

2. Website content optimization: match the content to the keywords and increase keyword density;

3. Properly set up the robots.txt file on the website (a minimal sketch follows this list);

4. Generate a search-engine-friendly sitemap;

5. Add external links and publicize the site on other websites.
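
For item 3, a minimal robots.txt sketch might look like the following; the paths and sitemap URL are placeholders, not taken from any real site:

```
# Hypothetical robots.txt placed at the site root
# Applies to all spiders
User-agent: *
# Keep back-end pages out of the index
Disallow: /admin/
Allow: /

# Point spiders to the sitemap mentioned in item 4
Sitemap: https://www.example.com/sitemap.xml
```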

Three, why do SEO

Increase the site's weight and its friendliness to search engines in order to raise its ranking, increase traffic, improve the experience of (potential) users, and boost sales.

Four, front-end SEO specifications

The front end is a very important link in building a website: front-end work is mainly responsible for the page's HTML + CSS + JS, and optimizing these aspects lays a solid foundation for SEO. Through the structural layout of the site and page code optimization, the front-end pages become understandable both to browser users (better user experience) and to the "spider" (better search engine friendliness).

Front-end SEO considerations:

1. Website structure and layout optimization: keep it as simple as possible, get straight to the point, and favor a flat structure

In general, the fewer levels a site's structure has, the easier it is for the "spider" to crawl and the easier it is for pages to be indexed. For small and medium sites, once the directory structure goes beyond three levels, the "spider" is no longer willing to crawl further down. And according to related surveys, if visitors have not found the information they need after three jumps, they are likely to leave. So a three-level directory structure is also what the user experience demands. To achieve this, we need to:

(1) Control the number of links on the home page

The home page carries the highest weight on the site. If it has too few links, there is no "bridge" and the "spider" cannot continue down to the inner pages, which directly reduces the number of pages indexed. But the home page cannot have too many links either: once there are too many links with no real substance, the user experience suffers, the weight of the home page is diluted, and the effect is poor.

(2) Flat directory level

Try to ensure that the "spider" can reach any inner page of the site within three jumps.

(3) Navigation optimization

Navigation should use text as far as possible; it can also be paired with image navigation, but the image code must be optimized: the <img> tags must carry "alt" and "title" attributes to tell the search engine where the navigation points, so that users still see prompt text even if the image fails to display.

Second, every page should include breadcrumb navigation. For the user, it shows where the current page sits within the site, helps them quickly understand how the site is organized, gives a better sense of place, and provides a way back to each level, which is easy to operate. For the "spider", it makes the site structure clear and adds a large number of internal links, which makes crawling easier and reduces the bounce rate.
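
As an illustration only (the URLs, file names and texts here are made up), text navigation with a breadcrumb trail and a described image link might look like this:

```html
<!-- Breadcrumb navigation: tells users and spiders where the current page sits -->
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/frontend/">Front-end</a> &gt;
  <span>Front-end SEO notes</span>
</nav>

<!-- Image navigation: alt and title keep the link meaningful when the image fails to load -->
<a href="/">
  <img src="logo.png" alt="Example site logo" title="Back to the home page" width="120" height="40" />
</a>
```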

(4) The structure of the site layout – a detail that cannot be ignored

Page header: logo, main navigation, and user information.

Page body: the main text on the left, including breadcrumb navigation and the article itself; popular articles and related articles on the right. Benefits: this keeps visitors around and lets them stay longer, and for the "spider" these articles are related links that strengthen the page's relevance and also raise the page's weight.

Bottom of page: copyright information and links.

Pay special attention to how pagination navigation is written. The recommended form is "Home 1 2 3 4 5 6 7 8 9 drop-down box", so the "spider" can jump directly to the corresponding page number and the drop-down box lets users jump straight to a page. The form "Home, Next page" is not recommended: especially when the number of pages is very large, the "spider" has to crawl down many steps to reach everything, which is tiring, and it will easily give up.
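
A rough sketch of the recommended pagination, with direct page-number links plus a drop-down jump; the URLs are hypothetical:

```html
<nav class="pagination">
  <a href="/list/1/">Home</a>
  <a href="/list/1/">1</a>
  <a href="/list/2/">2</a>
  <a href="/list/3/">3</a>
  <a href="/list/4/">4</a>
  <a href="/list/5/">5</a>
  <!-- Drop-down box lets users jump straight to a page -->
  <select onchange="location.href = '/list/' + this.value + '/';">
    <option value="6">6</option>
    <option value="7">7</option>
    <option value="8">8</option>
  </select>
</nav>
```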

(5) Use the layout to put the important HTML code first

Search engines crawl HTML content from top to bottom. Using this characteristic, you can let the main code be read first and push advertising and other unimportant code to the bottom. For example, if the markup for the left and right columns stays the same and only the styles change, using float: left; and float: right; lets you swap which side each column is displayed on, so the important code comes first in the source and the crawler grabs it first. The same applies to multiple columns.
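
A minimal sketch of this trick: the main column sits first in the HTML source (so the spider reads it first), while the floats decide which side each column is displayed on. The class names and widths are invented for the example.

```html
<style>
  .main    { float: right; width: 70%; } /* displayed on the right, but first in the source */
  .sidebar { float: left;  width: 28%; } /* ads / secondary content, later in the source */
</style>

<div class="page">
  <div class="main">Main article content: crawled and indexed first</div>
  <div class="sidebar">Advertising and other less important content</div>
</div>
```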

(6) Control the size of the page, reduce HTTP requests, and improve the site's loading speed.

A single page is best kept under 100 KB; if it is too large, it loads slowly. When loading is slow, the user experience is poor, visitors do not stay, and the spider leaves once the request times out.

2. Web code optimization

(1) Highlight the important content: design the title, description and keywords sensibly

The description meta tag is the page description: it needs to be a highly condensed summary of the page's content. Remember not to make it too long or stuff it with keywords, and it should be different for every page.
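
Putting the three tags named in this heading together, a sketch of a page head might look like this (the texts are placeholders and should differ from page to page):

```html
<head>
  <title>Front-end SEO notes - Example Site</title>
  <meta name="keywords" content="SEO, front-end, website optimization" />
  <meta name="description" content="A short, page-specific summary of what this page is about." />
</head>
```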

(2) Write HTML code semantically, in line with W3C standards

Make the code as semantic as possible: use the right tag in the right place and do the right thing with the right tag, so that both readers of the source code and the "spider" can understand it at a glance. For example, h1-h6 are used for headings.

(3) <a> tags: links within the page should carry a "title" attribute to explain where they lead, so that both visitors and "spiders" understand them. For external links that point to other sites, add rel="nofollow" to tell the spider not to follow them, because once the spider crawls off to an external link it will not come back.

<a href="https://www.360.cn" title="360 security center" class="logo"></a>
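
For the external-link case described above, a hedged example (the URL is a placeholder) would be:

```html
<!-- External link: rel="nofollow" tells the spider not to follow it away from the site -->
<a href="https://www.example.com" title="partner site" rel="nofollow">partner site</a>
```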

(4) Use heading tags for page titles

The "spider" considers them the most important. A page can have at most one h1 tag, placed on the page's most important title, such as the logo on the home page. Subtitles use the h2 tag, and the h heading tags should not be used carelessly elsewhere.
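
A small illustration of this rule, with made-up text: one h1 on the page's most important title (for example the home-page logo) and h2 for subtitles.

```html
<h1 class="logo"><a href="/" title="Example Site">Example Site</a></h1>
<h2>Latest articles</h2>
<h2>Popular articles</h2>
```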

(5) <img> tags should be described with the "alt" attribute

<img src="cat.jpg" width="300" height="200" alt="cat" />

When the network is slow or the image address is invalid, the alt attribute lets the user know what the image is for even though it is not displayed. Setting both the width and height of the image also helps the page load and render faster.

(6) Tables should use the table caption tag

The caption element defines a table's title. The caption tag must immediately follow the table tag, and each table can define only one caption.

<table border="1">
  <caption>Table title</caption>
  <tbody>
    <tr><td>apple</td><td>100</td></tr>
    <tr><td>banana</td><td>200</td></tr>
  </tbody>
</table>

(7) <br/> tags: use them only for line breaks within text content, for example:

<p>
  First line of text<br/>
  Second line of text<br/>
  Third line of text
</p>

(8) <strong> and <em> tags: use them when content needs emphasis. The <strong> tag is given great weight by search engines: it highlights keywords and marks out important content, and the emphasis effect of the <em> tag is second only to <strong>. The <b> and <i> tags are only used for visual display and have no effect on SEO.

(9) Do not indent text with special characters such as the non-breaking-space entity; set indentation with CSS instead. For the copyright symbol, do not use the special entity for ©; you can type © directly with the input method.
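
For the indentation part of this point, the CSS equivalent is simply (a sketch):

```css
/* First-line indentation set in CSS instead of repeated non-breaking-space entities */
p {
  text-indent: 2em;
}
```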

(10) Do not output important content with JS, because the "spider" does not read content generated by JS; important content must be placed in the HTML.

(11) Use iframes as little as possible, because the "spider" generally does not crawl the content inside them.

(12) Use display: none with caution: for text you do not want to show, set the z-index or a negative indent large enough to push it outside the visible browser area, because search engines filter out the content inside display: none.
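
One common way to hide text without display: none is a large negative indent, roughly like this (the class name is made up):

```css
/* Text stays in the document for spiders but is pushed outside the visible area */
.hidden-text {
  text-indent: -9999px;
  overflow: hidden;
  white-space: nowrap;
}
```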

3. Front-end website performance optimization

(1) Reduce the number of HTTP requests

The browser talks to the server mainly over HTTP, and each request involves a three-way handshake between browser and server, which costs time. Moreover, browsers limit how many resource files they will request concurrently (the limit differs between browsers); once the number of HTTP requests reaches that limit, further resource requests have to wait, which is very damaging. Therefore, reducing the number of HTTP requests can greatly improve the site's performance.

CSS Sprites

Commonly known as CSS sprites, this is a solution that combines many small images into one image to reduce HTTP requests; the part of the image you need is displayed through the CSS background properties. This scheme can also reduce the total number of bytes taken up by the images.
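
A minimal sprite sketch, assuming a hypothetical icons.png that packs several 32×32 icons side by side:

```css
.icon {
  display: inline-block;
  width: 32px;
  height: 32px;
  background-image: url("icons.png"); /* one request serves every icon */
}
.icon-search { background-position: 0 0; }
.icon-user   { background-position: -32px 0; }
.icon-cart   { background-position: -64px 0; }
```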

Combine CSS and JS files

There are many front-end build and packaging tools, such as grunt, gulp and webpack. To reduce the number of HTTP requests, you can use these tools to merge multiple CSS files, or multiple JS files, into a single file before publishing.
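
As one possible approach (not the only one), a gulp task using the gulp-concat plugin can merge scripts before publishing; the file paths below are placeholders:

```js
// gulpfile.js - merge several JS files into one bundle to cut HTTP requests
const gulp = require('gulp');
const concat = require('gulp-concat');

function scripts() {
  return gulp.src(['src/js/a.js', 'src/js/b.js', 'src/js/c.js'])
    .pipe(concat('bundle.js'))   // all sources end up in one file
    .pipe(gulp.dest('dist/js'));
}

exports.scripts = scripts;
```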

Use lazy loading (lazyload)

Commonly known as lazy loading: some of the page's content is not loaded at first and no request is sent for it; it is loaded immediately only when the user's actions actually require it. This controls how many resource requests the page fires at once.
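
A sketch of lazy loading with IntersectionObserver; the image file names are placeholders. (Modern browsers also support the loading="lazy" attribute on img as a simpler option.)

```html
<img class="lazy" src="placeholder.png" data-src="photo-1.jpg" alt="article photo" />

<script>
  // Load the real image only when it scrolls into view
  const lazyImages = document.querySelectorAll('img.lazy');
  const observer = new IntersectionObserver((entries, obs) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src; // the request is sent only at this point
        obs.unobserve(img);
      }
    });
  });
  lazyImages.forEach((img) => observer.observe(img));
</script>
```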

(2) Control the priority of loading resource files

When the browser loads the HTML, it parses the content from top to bottom. When it reaches a link or script tag, it loads the content referenced by the href or src attribute. To show the page to the user as early as possible, the CSS needs to be loaded first, and rendering should not be held up by JS loading.

Generally, CSS goes in the head and JS goes at the bottom of the page.

(3) Try to load CSS and JS as external files (separating structure, presentation and behavior). This keeps the page code clean and also makes future maintenance easier.

<link rel="stylesheet" href="asstes/css/style.css" />

<script src="assets/js/main.js"></script>
Copy the code

(4) Use the browser cache

The browser cache stores network resources locally. When a resource is already in the cache, the browser reads it locally instead of requesting it from the server again.

(5) Reduce reflows

Basic principle: a reflow happens when a DOM change affects an element's geometry (its width and height). The browser recalculates the element's geometry, which invalidates the affected part of the render tree, and it then checks the visibility of all the other nodes in the DOM tree; this is also why reflow is inefficient. If reflows happen too frequently, CPU usage rises sharply.

Reduce reflows: when you need to change an element's styles from JS, prefer adding or removing a class rather than setting individual style properties.
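
A small sketch of the class-toggling advice; the class name, selector and sizes are invented for the example:

```html
<style>
  /* All geometry changes grouped in one class */
  .card-expanded { width: 400px; height: 300px; margin: 10px; }
</style>

<script>
  const card = document.querySelector('.card');

  // Avoid: each assignment below can trigger a separate reflow
  // card.style.width = '400px';
  // card.style.height = '300px';
  // card.style.margin = '10px';

  // Prefer: one class change applied in a single pass
  card.classList.add('card-expanded');
</script>
```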

(6) Reduce DOM operations
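
For example (a sketch with made-up data), batching insertions through a DocumentFragment touches the live DOM only once:

```js
const list = document.querySelector('#article-list');
const titles = ['First post', 'Second post', 'Third post'];

// Build the nodes off-document, then insert them in a single operation
const fragment = document.createDocumentFragment();
titles.forEach((title) => {
  const li = document.createElement('li');
  li.textContent = title;
  fragment.appendChild(li);
});
list.appendChild(fragment);
```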

(7) Replace image icons with an icon font (IconFont)

(8) Do not use CSS expressions, as they hurt efficiency

(9) Use CDN caching to speed up access for users and reduce the load on the server

(10) Enable gzip compression: pages load faster, and the amount of information the search engine spider can crawl also increases

(11) Pseudo-static settings

If the site serves dynamic pages, you can turn on the pseudo-static (URL rewriting) feature so that the spider "mistakes" them for static pages, since static pages suit spiders better; it is even better if the URL contains keywords.

Dynamic address: www.360.cn/index.php

Pseudo-static address: www.360.cn/index.html

**Conclusion:** Understand SEO correctly and do not over-optimize; in the end, a site stands on its content.