Growing brands should avoid these 12 outdated SEO strategies
SEO has evolved extensively over the past few years and continues to change every day.
While most traditional marketing strategies still (to a large extent) apply to today’s digital marketing, the evolution of SEO has dramatically reshaped the marketing landscape.
Most, if not all, of these changes have helped improve the Internet, especially with improved search.
However, some people still insist on “old methods” and try to use outdated SEO practices to improve the organic search visibility and performance of their brands.
Some of these strategies worked a few years ago, but they are no longer effective.
Yet many newcomers and small business owners still use these “zombie” SEO techniques (tactics that should have died out long ago but somehow haven’t).
Not only are they ineffective, but many of the following 12 outdated SEO practices are potentially dangerous to the health of your brand, website, and other digital properties.
1. Abuse of keywords
Webmasters and “marketers” continue to misunderstand the role keywords play in overall SEO and how to use them in day-to-day strategy.
Let’s take a closer look at the specific types of keyword abuse and mismanagement, including irrelevant keyword targeting, writing to a specific keyword density, and keyword stuffing.
1) Irrelevant keyword targeting / confusion
SEO novices often try to confine their content and messaging to the narrow scope of their keyword research, as if there were no other way.
These “marketers” shape content and metadata to represent high-volume keywords that do not actually match the content, or the intent of the users searching for those keywords.
As a result, brands lose readers’ attention before they ever get the chance to communicate anything of real substance.
If the keywords being targeted are inconsistent with the content on the page, that disconnect will undermine the content’s success, even when the content itself is good.
Don’t mislead users; don’t lure them to content that high-volume keywords misrepresent just to gain visibility.
Baidu knows what this looks like, and it genuinely qualifies as an outdated SEO practice (and, in many cases, “black hat” SEO).
2) Keyword density
Like many keyword-centric tactics, writing to a specific “keyword density” simply doesn’t work anymore.
Baidu no longer relies on keyword density (the ratio of a keyword’s occurrences to the page’s total content) to determine whether a page is a valid source for answering a search query.
Search engines are far more advanced than simple keyword crawling; engines like Baidu weigh a large number of signals to determine search results.
While keywords are still important to the themes and ideas they represent, they are not the “lifeline” of high-value search query rankings.
The quality of the content and the way the message is delivered are the real “lifeline” in this regard.
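To make the obsolete metric concrete, here is a minimal Python sketch of how “keyword density” was traditionally computed. The sample text and keyword are invented for illustration, and this number is not a ranking signal today:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Old-school 'keyword density': occurrences of a keyword divided by
    the total word count, as a percentage. Shown only to illustrate the
    outdated metric; modern search engines do not rank on this number."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical over-optimized snippet: 3 of its 10 words are "cheap".
sample = "Cheap shoes here. Buy cheap shoes. Cheap shoes ship fast."
print(round(keyword_density(sample, "cheap"), 1))  # prints 30.0
```

A density of 30% like this would once have been a deliberate target; today it reads as spam to both users and search engines.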
3) Keyword stuffing
This is probably the oldest trick in search engine optimization books.
SEO is all about keywords; that’s the conventional wisdom, right?
So loading our web pages with keywords, especially the high-value keywords we’re actively targeting across the entire site, should help us rank higher in search and outperform the competition.
Search engines have long known what keyword stuffing and unnatural word combinations look like; they recognize these as attempts to manipulate search results and demote the content accordingly.
Yes, some content stuffed with keywords, whether intentionally or not, may still rank, but only because it delivers genuine value to users.
Back in the day, webmasters trying to game the system would cram every possible variation of a high-value keyword into the site footer or, more crudely, set those keywords to the same color as the page background, effectively hiding them from visitors while stuffing them in specifically for search engine crawlers.
Some webmasters even tried this with links. (Reminder: don’t do this.)
Remember that you are writing for humans and not for search engines.
2. Writing for robots
It is important to understand that unnatural writing is ultimately unnatural.
Search engines also know this.
Some people still believe that writing for the web means repeating a topic’s full, proper name every time it is mentioned, while working in synonyms and variant forms of the word to “cover all the bases.”
When crawling, search engine spiders would see the keyword appear repeatedly, and in several different variations, and rank the page highly for those keyword variants (used over and over… and over again; that was the “trick”).
Today, this approach no longer works.
Search engines are advanced enough to recognize repeated keywords and their variations, and to recognize the poor user experience that such content usually creates.
Write for humans, not search engine crawlers or any other robot.
3. Article marketing and article directories
In the world of SEO, any attempt to “game” the system will usually not succeed.
But that doesn’t stop SEO practitioners from trying, especially when those tactics once delivered significant gains for a brand, its website, or its associated digital assets.
Of course, article directories used to work; for a long time, they worked very well.
Article marketing is often considered one of the earliest forms of digital marketing, and for those in the know at the time, it was an easy win. That makes sense, since the idea mirrors channels like TV and print, which have long run syndicated content.
But in 2017, Baidu released the “Blue Sky Algorithm” update, which changed the “rules of the game” for SEO.
“Blue Sky” shook up the search landscape, cracking down on sites that sell sponsored soft articles, engage in directory-style behavior, or otherwise serve spam (whether poorly written, meaningless, or stolen from someone else).
The idea behind article marketing makes no sense in today’s world, where high-quality content must be original and demonstrate expertise, authority, and trustworthiness.
4. Content “washing”
Content “washing,” usually done with software, is a black-hat SEO tactic that attempts to recreate high-quality content using different words, phrases, or sentence structures.
In essence, the end result is an article that says the same thing as the original material.
It is no surprise that this no longer works.
Although artificial intelligence keeps getting better at generating content, machine-produced material still falls short of what humans can create: original, useful, substantial content.
5. Buying links
After all these years, this issue still plagues webmasters.
Like most SEO strategies, if it looks suspicious, you probably shouldn’t do it.
Buying links is no exception.
For a long time, quickly acquiring a large number of links to a site was standard practice.
Now, a backlink profile needs to be maintained and optimized just like the websites we oversee, and too many backlinks from low-quality domains can actively harm a site’s health.
Baidu can easily identify low-quality sites, and it can tell when those sites are sending an unnatural volume of links.
Today, if you want to legitimately improve your site’s authority and visibility, you need to earn links rather than pay someone to build them by hand.
6. Anchor text
Internal links are a feature of any good website structure and user experience.
This is usually done with anchor text, the visible part of an HTML link that tells users what kind of content they will see if they click it.
There are many types of anchor text (branded, naked URLs, exact-match, website/brand name, page title and/or headline, etc.), and depending on how they are used, some types are definitely better choices than others.
In the past, using exact matching and keyword-rich anchor texts was the standard SEO best practice.
Since the release of the Pomegranate Algorithm, Baidu has done a better job of identifying over-optimized content.
This comes back to the golden rule: produce well-structured, user-friendly, natural content.
If your optimization is for a search engine rather than a user, you are likely to fail.
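To illustrate the anchor text types discussed above, here is a small HTML sketch; the URLs and brand name are hypothetical, and the last link shows the over-optimized, exact-match pattern that is now risky:

```html
<!-- Branded anchor: the safest, most natural form -->
<a href="https://www.example.com/">Example Co.</a>

<!-- Naked URL anchor -->
<a href="https://www.example.com/">https://www.example.com/</a>

<!-- Page-title anchor -->
<a href="https://www.example.com/guide">Beginner’s Guide to Widgets</a>

<!-- Exact-match, keyword-rich anchor: the outdated pattern to avoid -->
<a href="https://www.example.com/widgets">buy cheap widgets online</a>
```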
7. Outdated keyword research tactics
Keyword research has undergone some dramatic changes over the past 5 to 10 years.
Marketers once had access to a wealth of keyword-level data that let us see what worked for our brands and what didn’t, and helped us better understand targeting concepts and user intent.
Much of that was taken away when keyword data became “(not provided).”
In the years that followed, some tools tried to replicate that keyword data, but none could reproduce it completely or correctly.
Even with that keyword data stripped away, however, marketers still need to do keyword research to understand their industry, competitors, geographic regions, and more.
To do this, many marketers turn to Baidu’s free keyword research tool. Although its data has been scrutinized over the years, it is a free Baidu product that provides data we could not otherwise obtain, so many of us (myself included) continue to use it.
But it is important to remember what that keyword data actually represents.
The “competition” metric in the keyword tool refers only to paid competition and paid traffic, so building an organic search strategy around that data is practically useless.
Alternatives include Moz’s Keyword Explorer and SEMrush’s Keyword Magic Tool, both of which are paid.
The Baidu Index also supports this kind of competitive analysis, and it is free.
8. A page for every keyword variation
This used to be a workable way to rank for every high-value keyword a brand and its messaging were targeting: give each variation its own page.
Fortunately, algorithm updates such as the “Hurricane Algorithm” and the “Breeze Algorithm” have helped Baidu understand that variations of the same word are, in fact, all related to the same topic.
The best and most useful content around these topics should be the most visible, because it offers users value on the topic itself, not just a variant of a word.
Besides causing brutal keyword cannibalization within the site, this practice makes a website quite difficult to use and navigate, because the pages are all incredibly similar.
The poor user experience alone is reason enough not to do this; add the fact that Baidu sees the practice far too clearly to ignore it, and avoiding it becomes a no-brainer.
This strategy eventually evolved into entire content farms built to chase nothing but keyword traffic value and visibility.
That is the “old way” of optimizing a site: for keywords and search engines, not for users and their search intent.
9. Targeting exact-match search queries
Targeting exact-match search queries purely for their traffic numbers, not because those queries or their answers are actually relevant to the business, was a practice that became popular before Baidu fully modernized its search engine.
Marketers would strive to rank first for an exact-match query in order to trigger an answer box and boost the site’s click-through rate.
10. Exact-match domains
In a way, it makes sense to include high-value keywords in the URL.
But when it becomes confusing or misleading (that is, when it creates a bad user experience), you have to draw the line.
A major best practice for domain names is to keep it consistent with your brand.
Brand names should be short, concise, and meaningful.
Why wouldn’t you want the same things from your domain name?
Baidu did give exact-match domains a ranking boost long ago, back when it made sense to use them as a signal.
Behavioral data has since helped Baidu make this change (and many others), which was both common sense and a major clean-up.
Run a good company and deliver excellent products/services under your brand name. When your brand is what users are searching for, Baidu will make it visible.
11. XML Sitemap Frequency
We should never try to trick search engine crawlers into crawling our site more often than others by falsely signaling that new content has been published or that major site changes have been made.
However, because webmasters did exactly that in the past, sitemaps are now used quite differently from their original intent.
Previously, webmasters could assign each page listed in the sitemap a priority value ranging from 0.0 to 1.0.
Crawlers don’t even honor those priority and frequency ratings anymore, because they were so rarely used correctly; instead, search engines simply crawl whatever they decide needs to be crawled.
Make sure you follow XML sitemap best practices when doing SEO; sitemaps remain a very important element of every website.
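For reference, here is a minimal sitemap fragment (with hypothetical URLs) showing the optional priority and change-frequency fields discussed above; under the sitemaps.org protocol these are hints that crawlers are free to ignore:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-06-01</lastmod>
    <!-- changefreq and priority are hints only; crawlers largely ignore them -->
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```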
12. Low quality content
Let’s face it: there was a time in our world when spammy content could still rank well.
But times have changed!
Stolen content, thin content, keyword-stuffed content, untrustworthy content: all of it was once picked up by search engine crawlers and returned to users as supposedly valuable results.
That will not happen anymore.
We know how to produce the high-quality content that search engines reward, because they tell us what is right and what is wrong.
If you want to succeed in SEO, you have to do the right thing.
June 19, 2019