Search Engine Optimization Frequently Asked Questions.

Nowadays, many people are doing SEO. Everyone understands the big-picture items: content, topics, keywords, external links, and the like.

But if you want to do better than others, you have to look at the finer details. One problem that is common on the technical side is URL casing: many web servers will serve the same page at both the uppercase and lowercase versions of a URL, and it is easy to forget to redirect or rewrite everything to a single lowercase version.

Admittedly, Google has become much better at identifying the canonical version of a URL and ignoring the duplicates, so people often stop caring about this problem. But Google is not infallible, so you should still fix it yourself.

The URL Rewrite module for the IIS 7 web server can help deal with this. In the module's interface you can enforce lowercase URLs, and the rule is then added to the site's configuration file (web.config), which solves the problem.
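As a rough illustration, the kind of rule the module generates looks something like the sketch below (the rule name is arbitrary; adapt it to your site):

    <rewrite>
      <rules>
        <!-- Permanently redirect any URL containing uppercase
             letters to its lowercase equivalent -->
        <rule name="Enforce lowercase" stopProcessing="true">
          <match url="[A-Z]" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>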

Finding these duplicate pages can be a little complicated, because different platforms produce different URL structures, so it can feel like a guessing game.

You can use a crawling tool to simulate a spider crawling your site, export the crawl to a spreadsheet, filter on the meta title column, and search for your homepage's title; duplicate homepages are then very easy to spot.

I tend to use a 301 redirect, pointing the duplicate pages to the one homepage URL that people clearly know. You can also handle the problem by adding a rel=canonical tag.
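On IIS, a sketch of such a redirect might look like the following, assuming (hypothetically) that the duplicate homepage lives at /default.aspx:

    <!-- Hypothetical example: 301-redirect the duplicate homepage
         URL /default.aspx to the canonical root URL -->
    <rule name="Canonical homepage" stopProcessing="true">
      <match url="^default\.aspx$" />
      <action type="Redirect" url="/" redirectType="Permanent" />
    </rule>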

Another approach is to use a tool such as Screaming Frog to simulate the spider crawl and find the links that point at the duplicate pages. You can then rewrite those links to point at the proper URL directly, so you don't have to worry about losing link equity through a 301.

Tip: check the Google cache for each duplicate URL; if every one of them has its own cache, that is a bad sign. When Google has not noticed that the duplicate URLs are the same page, you will see different PageRank values and cache dates for them. This problem is common on database-driven e-commerce sites.

That is not to say other types of site are immune, but e-commerce sites usually offer many product attributes and filtering options, such as colour and size. The filtered URLs customers click can be made quite SEO-friendly, but I often see links that end in a query string like the hypothetical example below, where a particular colour is used as the basis for filtering the product list.
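A hypothetical example of this kind of URL (the parameter name and value are invented for illustration):

    /waterproof-jackets/?colour=12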

This kind of filtering is very good for customers, but not good for Google, especially since customers often do not search by colour when looking for a particular product. For those keywords, a parameterised URL like this is not a good landing page.

When many parameters can be combined, your crawl budget can be exhausted. What's worse, sometimes the parameters are different but the URLs return the same content: even though the paths differ, the two URLs serve one page, as in the sketch below, and Google will treat such pages as duplicate content. Remember, Google allocates crawl budget according to your site's PageRank, so make sure that budget is fully used.
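As a hypothetical illustration, both of the following URLs might return exactly the same product list, differing only in parameter order:

    /waterproof-jackets/?colour=12&size=3
    /waterproof-jackets/?size=3&colour=12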

Before moving on, there is another pervasive problem to deal with: URLs that are unfriendly to search engines because they are database-driven.

In this situation I am not so worried about the duplicate-content issues covered above; I am more worried about wasting crawl budget and about useless pages getting indexed.

The first thing to work out is which pages should be crawled and indexed. That comes down to your keyword research: you need to cross-reference your database's product attributes against the keywords people actually search for.

Your job is to find out which attributes appear in keywords, that is, which attributes customers use to find the product.

It is also important to work out which attributes customers combine. Having done that, you might find that a keyword with high search volume is "North Face + waterproof jackets".

In that case you want a North Face waterproof jackets landing page that is crawled and indexed. Also make sure the database-driven attribute page has a search-engine-friendly URL: not "waterproof-jackets/?Brand=5" but "waterproof-jackets/north-face/". Place it within the site's navigation structure so that PageRank flows to it and customers can find it easily.
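One common way to serve that friendly URL is an internal rewrite that maps it onto the existing parameterised page. A minimal IIS sketch, reusing the example URLs above (the brand ID 5 is taken from that example):

    <!-- Serve the friendly brand URL from the existing
         parameterised page -->
    <rule name="Friendly brand URL">
      <match url="^waterproof-jackets/north-face/?$" />
      <action type="Rewrite" url="waterproof-jackets/?Brand=5" />
    </rule>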

At the same time, you may find that search volume for a keyword such as "North Face + black" is very low, so you would not want the pages combining those two attributes to be crawled and indexed.

Once you know which attribute pages should be indexed and which should not, the next step depends on whether the URLs are already indexed.

If the URLs have not yet been indexed, the effective method is to add the URL pattern to your robots.txt file.

To do this you may need to experiment with pattern matching; make sure the patterns do not accidentally block pages you want crawled, and test them with the Fetch as Googlebot feature in Google Webmaster Tools. Note that adding already-indexed URLs to robots.txt will not get them removed from the index.
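A minimal robots.txt sketch, assuming a hypothetical "colour" parameter you want kept out of the crawl:

    # Hypothetical: block crawling of colour-filtered facet URLs
    User-agent: *
    Disallow: /*?colour=
    Disallow: /*&colour=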

If the URLs are already indexed, use the rel="canonical" tag to handle them. When the site has already been built and you cannot change the URL structure, you cannot fix the underlying problems described above; in that case rel="canonical" buys you time and eases the difficulty a little. Add it to the URLs you do not want indexed, pointing to the corresponding URL you do want indexed.
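A sketch of the tag, placed in the head of the parameterised page and pointing at the friendly URL from the example above (the domain is hypothetical):

    <!-- On /waterproof-jackets/?Brand=5, point Google at the
         friendly URL we do want indexed -->
    <link rel="canonical" href="http://www.example.com/waterproof-jackets/north-face/" />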

A soft 404 is an error page that returns a 200 status code. It means you cannot detect the genuinely broken pages on your site, nor find the areas that hurt the customer experience.

From a link-building perspective, a soft 404 is the worst choice: someone may link to a broken URL, but because the page does not report an error, it is hard to track that link down and redirect it to the appropriate page.

Some features in Google Webmaster Tools can help you find soft 404 pages: it reports the soft 404s it has already detected.

You can also check response codes by hand. I like to use Web Sniffer for this, and if you use the Chrome browser, the Ayima extension works as well.
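If you prefer the command line, a quick check of the status code returned for a made-up missing page (hypothetical domain and path) can be done with curl:

    # Prints just the HTTP status code; a missing page should
    # return 404, not 200
    curl -s -o /dev/null -w "%{http_code}\n" http://www.example.com/no-such-page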

It is very easy for a developer to pick the wrong redirect, typically using a temporary 302 where a permanent 301 is needed. From the customer's perspective there is no difference between the two, but search engines treat them very differently: a 301 tells them the move is permanent and passes link equity, while a 302 signals a temporary move.
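In the IIS URL Rewrite module from earlier, the difference comes down to the redirectType attribute; a sketch (the target URL is a placeholder):

    <!-- Permanent move: signals that the old URL is gone for good -->
    <action type="Redirect" url="new-page" redirectType="Permanent" />

    <!-- Temporary move: search engines keep the old URL indexed -->
    <action type="Redirect" url="new-page" redirectType="Found" />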
