New SEO Case Study for Impression Increase

Topical Authority Academy continues to be expanded

Reversing Negative Ranking State: Focus on Impressions and Crawl Stats

Welcome to another blue-hat SEO case study (opportunist SEO). We focus on simple instructions and explanations to teach and help SEOs around the globe with their projects. If you want further support from the Best SEO Community in the world, join the Holistic SEO Community Public and Private groups. To further your training, SEO knowledge, and mindset, join the Topical Authority Academy, which refreshes itself constantly.

The data presented here covers only 4-5 days after the Quality Nodes in the semantic content network were launched. Many Holistic SEO Community members ask me how to understand whether Google has perceived their new content network. This simple opportunist SEO case study explains the early signals of a possible future re-ranking.

  • Re-ranking changes the ranking state after gathering enough historical data, a concept from Koray's framework.

  • Historical Data is a broader concept than "clicks" or "CTR." Google tracks everything on Google SERP, including "query per second," "next query similarity," "mouse over up," "text selection," and more. Historical data collects SERP behaviors from user clusters to understand possible click satisfaction. This definition is from Koray's Framework.

  • Re-evaluation and Re-ranking are not the same. In the future, a new book will be launched to explain the Google API Leak. Google's leaked "Eval" methods and attributes will be merged with the Google Patents and our case studies, and a new, better conceptualization will be added to Koray's Framework.

If you do not know what "Ranking State" means, watch the video below.

This project's industry is Finance and Credit Card Debt Consolidation. The language is English, and the region is the USA. The project's name will be revealed after a certain amount of time; check future video and article case studies to catch the name and perform further analysis.

This is a NASDAQ project example that has been running for over three years. The publication frequency has decreased by over 90% in the last three months, and you can see the slowdown. While reading this SEO case study, keep in mind that the possibilities multiply with a higher publication frequency.

1. Check Crawl Stats to Understand Whether Google Focuses on the New Content

  • The crawl stats in Google Search Console are not complete; thus, always check your web server logs to analyze them. To analyze your crawl stats, use Python or a third-party log file analyzer such as Screaming Frog Log File Analyzer or SEMrush.
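
If you go the Python route, below is a minimal sketch that counts daily Googlebot requests in a raw access log and the share of those hits going to HTML documents. The file name, the combined log format, and the simple user-agent string match are assumptions; adapt the regex to your server's log layout, and verify Googlebot via reverse DNS if you need strict bot validation.

```python
import re
from collections import Counter
from datetime import datetime

# Minimal sketch, assuming a combined-format access log; "access.log" is a hypothetical path.
LOG_PATH = "access.log"
LINE_RE = re.compile(
    r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "(?:GET|HEAD) (\S+) [^"]*" \d{3} \S+ "[^"]*" "([^"]*)"'
)

daily_hits = Counter()       # all Googlebot requests per day
daily_html_hits = Counter()  # Googlebot requests per day that target HTML documents

with open(LOG_PATH, encoding="utf-8", errors="ignore") as log_file:
    for line in log_file:
        match = LINE_RE.search(line)
        if not match:
            continue
        day, url, user_agent = match.groups()
        # String match only; confirm with reverse DNS for strict Googlebot validation.
        if "Googlebot" not in user_agent:
            continue
        daily_hits[day] += 1
        # Treat extensionless paths and .html paths as HTML documents (a simplification).
        last_segment = url.split("?")[0].rstrip("/").rsplit("/", 1)[-1]
        if last_segment.endswith(".html") or "." not in last_segment:
            daily_html_hits[day] += 1

for day in sorted(daily_hits, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    html_share = daily_html_hits[day] / daily_hits[day] * 100
    print(f"{day}: {daily_hits[day]} Googlebot hits, {html_share:.1f}% to HTML documents")
```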

  • After publishing your semantic content network and its "Quality Nodes," the crawl stats should recover by increasing their velocity.

  • The website in this SEO case study only "updated" 12 web documents. After updating and revising these fundamental articles, which had been losing traffic for 15 months and 3 weeks, the website increased its crawl requests by over 400%.

  • This jump in crawl stats and total request count is a clear indication that "re-evaluation" has started on the search engine side.

  • This website was re-evaluated sooner because it is more than 25 years old and has enough PageRank to be considered a candidate.

  • Historical data and PageRank keep a web source as a "candidate" for the Topical Authority ranking state so that you can get faster reactions from the search engine side.

  • Ensure most of the crawl hits go to the HTML Documents.

  • Use only 2-3 CSS files and 1 Font file site-wide.

  • Block any non-content-related JS files from the search engine crawlers.

  • Don't leave 404s behind; clean them from the search engine's crawl agenda.
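
To surface those 404s, the same log can be scanned for Googlebot hits that returned a 404 status; the most frequently crawled ones waste the most crawl quota and should be redirected or removed from internal links and sitemaps first. As above, the file name, log format, and user-agent match are assumptions.

```python
import re
from collections import Counter

# Minimal sketch: 404 URLs that Googlebot keeps requesting, from a combined-format access log.
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

crawled_404s = Counter()

with open("access.log", encoding="utf-8", errors="ignore") as log_file:  # hypothetical path
    for line in log_file:
        match = LINE_RE.search(line)
        if match:
            url, status, user_agent = match.groups()
            if status == "404" and "Googlebot" in user_agent:
                crawled_404s[url.split("?")[0]] += 1

# Redirect or internally unlink the most crawled 404s first.
for url, hits in crawled_404s.most_common(20):
    print(f"{hits:>5}  {url}")
```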

  • The newly published documents increase the web source's crawl quota, which means the old documents are also crawled more.

  • This demonstrates that the quality of a newly published document is affected by and affects the previously published documents.

  • Thus, to exceed Quality Thresholds determined by "cornerstone sources," the web source should be optimized from every angle and level.

  • Store your log files for years. Perform a comparative analysis based on "your moves," "search engine's moves," and "competitors' moves."

  • If crawl stats change, the cause has to be the search engine's systems, your activity, or your competitors' activity. If you have enough data and a note-taking and tracking system, the cause will be clear to you.
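
A minimal sketch of that note-taking side is below: each "move" is a dated annotation, and the daily Googlebot counts would come from a log parser such as the first sketch in this section. All dates and numbers here are hypothetical placeholders.

```python
from datetime import date, timedelta

# Hypothetical daily Googlebot request counts (in practice, load these from your stored logs).
daily_hits = {date(2024, 5, 1) + timedelta(days=i): 900 + i * 40 for i in range(30)}

# Dated "moves": yours, the search engine's, and your competitors'. All entries are hypothetical.
moves = [
    (date(2024, 5, 10), "12 Quality Node articles revised (our move)"),
    (date(2024, 5, 20), "Spam update announced (search engine's move)"),
]

WINDOW = 7  # days compared before and after each move

for move_date, note in moves:
    before = sum(daily_hits.get(move_date - timedelta(days=i), 0) for i in range(1, WINDOW + 1))
    after = sum(daily_hits.get(move_date + timedelta(days=i), 0) for i in range(1, WINDOW + 1))
    if before == 0:
        continue
    change = (after - before) / before * 100
    print(f"{move_date} | {note}: crawl requests changed by {change:+.1f}%")
```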

2. Check the New Query Appearances to Analyze Query Networks

  • Query Networks are defined in earlier BlueHat and Holistic SEO case studies; please check them.

  • Perform query profile analysis every 7 days to understand which new queries are appearing and which are increasing in impressions.
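
Below is a minimal sketch of that weekly comparison, assuming two Search Console query exports saved as CSV; the file names and the "Query"/"Impressions" header names are assumptions, so rename them to match your own export.

```python
import csv

def load_queries(path):
    """Read a query -> impressions mapping from a Search Console CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        # "Query" and "Impressions" are assumed header names; adjust to your export.
        return {row["Query"]: int(row["Impressions"]) for row in csv.DictReader(f)}

previous = load_queries("queries_previous_7_days.csv")  # hypothetical file names
current = load_queries("queries_last_7_days.csv")

new_queries = {q: imp for q, imp in current.items() if q not in previous}
gainers = {q: current[q] - previous[q] for q in current if q in previous and current[q] > previous[q]}

print(f"{len(new_queries)} new queries appeared in the last 7 days")
for query, impressions in sorted(new_queries.items(), key=lambda item: -item[1])[:20]:
    print(f"  NEW {impressions:>7}  {query}")
for query, gain in sorted(gainers.items(), key=lambda item: -item[1])[:20]:
    print(f"  UP  {gain:>+7}  {query}")
```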

  • Focus on impressions to perform a proper relevance and responsiveness analysis.

  • The new queries should be analyzed from the "distributional semantics" and "formal semantics" points of view.

  • Audit which lemmatizations, interrogative terms, plural and singular forms, predicates, nouns, and trigrams appear in the new queries.
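
That audit can start with plain token counting, as in the sketch below; the sample queries are hypothetical, and a proper lemmatization or predicate analysis would need an NLP library on top of this.

```python
from collections import Counter

# Hypothetical new queries; in practice, feed in the new_queries found in the weekly comparison.
new_queries = [
    "how to consolidate credit card debt",
    "what is debt consolidation",
    "how does debt consolidation affect credit score",
    "can retired teachers consolidate debt",
]

INTERROGATIVES = {"how", "what", "why", "when", "which", "who", "where", "can", "does", "is", "should"}

interrogative_counts = Counter()
trigram_counts = Counter()

for query in new_queries:
    tokens = query.lower().split()
    interrogative_counts.update(token for token in tokens if token in INTERROGATIVES)
    trigram_counts.update(" ".join(tokens[i:i + 3]) for i in range(len(tokens) - 2))

print("Interrogative terms:", interrogative_counts.most_common())
print("Most frequent trigrams:", trigram_counts.most_common(10))
```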

  • Always use grammatically correct sentences, always give new, related information in each sentence, and never write a sentence just to include a certain phrase; do not let the gibberish score increase.

  • Always use shorter sentences. Start from the "representative context" and move towards the "specific contexts," such as "How to perform debt consolidation," "How to consolidate debt for a bad credit score," or "How to consolidate debt for retired teachers with debt." Occupations, credit score degrees, and occupation statuses, such as retired or not retired, create new contextual branches that can be deepened.

  • Balance the contextual coverage. If the contextual coverage and weight are too far apart for one of these angles, the others will lose relevance. The relevance radius has to be balanced, or some of these queries will be lost, and further queries will not appear. The contextual vector and coverage should be parallel across the entire query network.

  • The definitions of "contextual vector," "contextual coverage," and "contextual weight" are given in earlier case studies and the Topical Authority course. Please check them to continue.

  • See the "synonym," "modality," "interrogative terms," "trigrams," and "attributes" that are explicitly marked in the image above.

  • Perform the query impressions analysis to close the gap between the Query Vocabulary and the Document Vocabulary.
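
One rough way to measure that gap is to weight each query term by its impressions and check whether it appears in the document's visible text, as in the sketch below; the query list and the document text here are hypothetical placeholders.

```python
import re
from collections import Counter

# Hypothetical impression-weighted query list for one document.
queries = {
    "how to consolidate credit card debt": 3200,
    "debt consolidation loan for bad credit score": 1450,
    "does debt consolidation hurt your credit": 610,
}

# Hypothetical visible text of the document (in practice, extract it from the rendered page).
document_text = """
Debt consolidation combines multiple balances into a single loan with one monthly payment.
It can simplify repayment and, in some cases, lower the total interest paid.
""".lower()

document_vocabulary = set(re.findall(r"[a-z']+", document_text))

missing_terms = Counter()
for query, impressions in queries.items():
    for term in query.lower().split():
        if term not in document_vocabulary:
            # Weight the gap by impressions: terms behind the most impressions matter most.
            # Filter out stopwords if they add too much noise for your analysis.
            missing_terms[term] += impressions

print("Query terms missing from the document vocabulary:")
for term, weight in missing_terms.most_common(20):
    print(f"  {weight:>7}  {term}")
```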

3. Focus on Impressions, not Clicks

  • To understand whether you are heading in the right direction, the impressions column gives stronger signals from the search engine than the clicks column.

  • If a source gathers enough impressions after accumulating a certain amount of historical data, the website elevates its ranking state to further levels.

  • Above, you see a newly published document in this project. The impression increase becomes clear on the 4th day after the document revision, and the previous levels are exceeded.

  • The same document loses its average position while its impressions increase. This is a clear signal that the source is ranking for new and more queries. Thus, the impression levels and average position (AP) can and should be inversely correlated.
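
A quick way to quantify that relationship is sketched below, assuming a Search Console "Dates" export for one URL saved as CSV with "Impressions" and "Position" columns; the file name and header names are assumptions.

```python
import csv
from statistics import correlation  # available in Python 3.10+ (Pearson by default)

# Minimal sketch: correlation between daily impressions and average position for one URL.
with open("url_dates_export.csv", newline="", encoding="utf-8") as f:  # hypothetical file name
    rows = list(csv.DictReader(f))

impressions = [float(row["Impressions"]) for row in rows]
positions = [float(row["Position"]) for row in rows]  # GSC average position number; lower = higher ranking

# A clearly non-zero coefficient on the days around a revision shows the two metrics moving
# together or apart, which is the relationship described in the bullets above.
print(f"Impressions vs. Average Position correlation: {correlation(impressions, positions):+.2f}")
```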

  • Below is another sample document. In four days, the impressions increased from the 1,000 level to the 3,000 level.

  • Below, you can see the 7-day comparison for the specific source. This time, it didn't lose its Average Position and started to bring new clicks to the website directly.

  • The clicks started to come from the "existing query profile," while the impression increase came from the "new query profile." The new queries that the web page ranks for should validate the relevance of the previous queries.

  • Another sample document is below. You can see the 7-day comparison. Impressions increased by over 100% in 4 days, and the Average Position decreased directly.

  • See the 16-month graph for the same URL. It constantly lost impressions for 16 months while getting only 198 clicks. In the last 4 days, for the first time, it increased its impressions twofold while the Average Position decreased. This is a clear signal of exceeding quality thresholds.

  • Another document from the same project is below. The impressions increased by over 100% in 4 days, which is reflected in the clicks with a 36% increase. The average position decreased directly, which shows the expanded relevance radius.

  • Below, you can see the same repeated behavior from the Impressions and Average Position perspective. Protecting the web page's indexation as a constant candidate by gathering historical data should always be a priority for Topical Authority.

  • Catch newly published documents and their performance, and analyze them separately. Indexation should be "constant and continuous." The URL below doesn't have continuous indexation.
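
One way to spot interrupted serving for such URLs is to list the days with zero impressions, as in the sketch below, assuming a per-URL "Dates" export as CSV with "Date" and "Impressions" columns (file name and headers are assumptions). Zero impressions is only a proxy, so confirm actual indexation with the URL Inspection tool.

```python
import csv
from datetime import date, timedelta

# Minimal sketch: flag days with zero impressions inside a URL's date range.
with open("url_dates_export.csv", newline="", encoding="utf-8") as f:  # hypothetical file name
    impressions_by_day = {
        date.fromisoformat(row["Date"]): int(row["Impressions"])
        for row in csv.DictReader(f)
    }

start, end = min(impressions_by_day), max(impressions_by_day)
gap_days = []
day = start
while day <= end:
    if impressions_by_day.get(day, 0) == 0:
        gap_days.append(day)
    day += timedelta(days=1)

print(f"{len(gap_days)} day(s) without impressions between {start} and {end}: {gap_days}")
```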

  • If indexation is interrupted, use internal links to improve the page's PageRank and quality score directly.

  • Support these types of web pages strongly. New documents bring queries that the website didn't rank for before; thus, getting a direct positive reaction is harder compared to the previous examples. Determine the highest-PageRank URLs on the website, create a contextual bridge, and flow the quality signals to these new documents.

  • The "contextual bridge" is explained earlier. To understand and implement it faster, check previous case studies or the Topical Authority course.

  • Below is another newly published document example. In four days, it got 866 impressions, and the indexation was not interrupted. This means that it has a better initial ranking, which signals a brighter re-ranking process ahead.

  • You do not need to prioritize these URLs in the internal linking structure directly, but ensure that you see People Also Ask, AI Overview, or Featured Snippet appearances for the rankings of these URLs. If a URL does not have any of these advanced SERP features, check whether the website lost all of these SERP features earlier or not. If they were lost earlier, it is not about this URL. If they were not lost but the URL doesn't appear for these SERP features, configure the relevance.

  • Relevance configuration is explained in earlier case studies; check them or join the Topical Authority Academy to get all the information quickly.

  • Below is another "revision" example. The feedback is much stronger compared to newly published URLs. The impressions and average position are reverse parallel to each other.

  • To reverse a negative ranking state, you must update at least 80% of a website in a similar direction. If you perform these updates within a very brief time frame, you will increase your chance of recovery. Unfortunately, most website operators and operations are too slow to realize these opportunities, and they lose valuable time between the Spam and Broad Core Algorithm Updates; thus, they can't recover in time.

New SEO case studies will be published. BlueHat SEO Case Studies are simple and digestible, designed for beginner-level SEOs. If you like Holistic SEO Case Studies, I suggest you join the Topical Authority Course to test your expertise.

See you at the next ones; love you all.

Stay with logic and reason.
