You created 40 city-specific landing pages by duplicating a template and swapping the city name, address, and a few localized phrases. You expected each page to rank for its target city. Instead, Google deindexed 35 of the 40 pages within three months and issued a manual action for doorway pages in Search Console. The remaining five pages that survived shared a common trait: they contained genuinely unique local content that could not have been generated by find-and-replace templating. The threshold between a legitimate local landing page and a doorway page is defined by content uniqueness, structural differentiation, and user value, and Google’s detection systems have become increasingly effective at identifying the template-and-swap pattern.
How Google’s Doorway Page Detection Algorithm Identifies Templated Local Pages
Google’s doorway page classifier evaluates local landing pages across multiple signal dimensions simultaneously. No single signal triggers classification on its own. The system looks for patterns across a page directory that collectively indicate mass-produced pages designed to capture search traffic rather than serve user needs.
Textual similarity ratio is the primary detection signal. Google compares the content of pages within the same URL directory or structural pattern. When 30 or more pages share 70 percent or more identical text with only city names, addresses, and minor phrases swapped, the similarity ratio crosses the classification threshold. Google’s John Mueller has warned explicitly against building hundreds of city-based landing pages with this pattern, noting that Google has treated city and state landing pages as doorway pages since 2008.
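The similarity comparison can be approximated with a standard diff ratio. Below is a minimal sketch, assuming Python's `difflib.SequenceMatcher` as a stand-in for Google's unpublished similarity measure, applied to a hypothetical find-and-replace template:

```python
from difflib import SequenceMatcher

def similarity_ratio(page_a: str, page_b: str) -> float:
    """Fraction of matching text between two pages (0.0 to 1.0)."""
    return SequenceMatcher(None, page_a, page_b).ratio()

# Hypothetical template: only the city name changes between pages.
template = ("We proudly serve the {city} area with 24/7 plumbing repair. "
            "Our {city} team handles leaks, clogs, and water heater installs.")
springfield = template.format(city="Springfield")
shelbyville = template.format(city="Shelbyville")

# Near-duplicate pages score well above 0.8 on this measure.
print(round(similarity_ratio(springfield, shelbyville), 2))
```

Running a pairwise comparison like this across your own location directory gives a rough early warning of whether the pages cluster near the duplicate end of the scale.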
URL pattern regularity contributes to the classifier’s confidence score. A directory of pages following an identical URL structure (/locations/city-name/ repeated across 200+ entries) signals programmatic page generation. The uniformity itself is not penalized, but it increases scrutiny on the content those pages contain.
Content-to-template ratio measures how much of each page changes versus how much remains identical across the directory. Google’s system can identify shared template blocks, including headers, footers, navigation elements, sidebar content, and boilerplate paragraphs. Only content that genuinely varies between pages counts toward the uniqueness calculation. A page that appears 1,500 words long but shares 1,200 words with every other page in the directory has an effective uniqueness of only 300 words.
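The effective-uniqueness arithmetic can be reproduced by discarding verbatim-shared blocks before counting words. A rough sketch, assuming pages are plain text with paragraphs separated by blank lines; the paragraph-level intersection here is a simplification of whatever block detection Google actually runs:

```python
def effective_unique_words(pages: list[str]) -> list[int]:
    """Per-page word counts after discarding paragraphs shared by every page."""
    paras = [[p.strip() for p in page.split("\n\n") if p.strip()] for page in pages]
    template_blocks = set.intersection(*(set(p) for p in paras))  # verbatim-shared
    return [sum(len(p.split()) for p in page_paras if p not in template_blocks)
            for page_paras in paras]

boilerplate = "Call us today for a free estimate and same-day service."
page_a = boilerplate + "\n\nSpringfield's pre-war housing stock means galvanized supply lines are a weekly repair."
page_b = boilerplate + "\n\nShelbyville's new subdivisions bring PEX manifold installs and irrigation tie-ins."

# Only the location-specific paragraph counts toward each page's uniqueness.
print(effective_unique_words([page_a, page_b]))
```

The shared boilerplate paragraph is excluded from both counts, mirroring how a 1,500-word page with 1,200 shared words nets out to 300 effective words.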
User engagement metrics provide behavioral confirmation. When users land on multiple city pages and exhibit identical behavior patterns (same bounce rates, same scroll depths, same time-on-page), the engagement uniformity suggests that all pages serve the same function, reinforcing the doorway classification. Genuinely different pages produce different engagement patterns because users interact differently with different content.
The March 2024 Core Update increased the effectiveness of doorway page detection. A regional HVAC company reported that over 80 percent of its suburb-specific pages lost rankings after that update, with analytics showing a 63 percent organic traffic drop within 30 days. Recovery began only after consolidating pages into comprehensive location-specific content with genuine unique elements.
The Minimum Content Uniqueness Requirements That Pass Doorway Page Evaluation
Based on observed enforcement patterns across multiple industries and site sizes, local landing pages need at minimum 60 to 70 percent unique content per page to avoid doorway classification. This percentage represents genuinely different substantive content, not superficial variations.
Content that counts toward the uniqueness calculation includes:

- Location-specific business descriptions that reference the actual community being served
- Staff profiles for team members who serve that area
- Customer testimonials from clients in that specific city (with enough detail to be clearly genuine)
- Case studies or project examples completed in that location
- Service details that vary by location (different availability, different pricing, different equipment used)
- Local regulatory or permit information specific to that jurisdiction
- References to local landmarks, neighborhoods, and community characteristics

Content that does not count toward uniqueness, despite appearing different, includes:

- City name substitutions within otherwise identical paragraphs
- Address and phone number swaps
- Minor phrasing variations that do not change meaning (“serving the Springfield area” versus “serving the Shelbyville community”)
- Dynamically inserted city names within static content blocks
The 60 to 70 percent threshold is a practical guideline, not a confirmed algorithmic cutoff. Google has not published a specific percentage. The threshold is inferred from the pattern of pages that survive versus pages that get deindexed across large-scale local page deployments. Pages below 50 percent uniqueness are consistently flagged. Pages above 70 percent uniqueness rarely face classification issues. The 50 to 70 percent range represents a gray zone where other signals (engagement metrics, structural differentiation, link profile) influence the outcome.
Word count also matters in absolute terms. A 300-word page with 70 percent unique content contains only 210 unique words, which is insufficient to demonstrate substantive local value. Effective local landing pages typically range from 800 to 1,500 words, with the unique content component providing enough depth to genuinely inform a user about the business’s presence and capabilities in that specific market.
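Combining the ratio and absolute-word-count requirements, a pre-publication gate might look like the following sketch. The `min_unique` and `min_ratio` defaults are illustrative assumptions drawn from the inferred ranges above, not Google-published cutoffs:

```python
def page_verdict(total_words: int, unique_words: int,
                 min_unique: int = 400, min_ratio: float = 0.60) -> str:
    """Gate a location page before publication.

    Thresholds are illustrative assumptions based on observed
    enforcement patterns, not confirmed algorithmic cutoffs.
    """
    ratio = unique_words / total_words
    if unique_words < min_unique:
        return "flag: too little unique content in absolute terms"
    if ratio < 0.50:
        return "flag: below 50% uniqueness"
    if ratio < min_ratio:
        return "gray zone: weigh engagement, structure, and link signals"
    return "pass"

print(page_verdict(300, 210))    # 70% unique, but only 210 unique words
print(page_verdict(1200, 840))   # 70% unique on a substantive page
```

Note that the first call fails despite hitting the 70 percent ratio, because the absolute unique word count is too thin to demonstrate substantive local value.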
Structural Differentiation Signals That Separate Legitimate Local Pages From Doorway Patterns
Beyond content uniqueness, structural signals influence doorway classification. Pages that share identical HTML structures, heading hierarchies, section ordering, and layout patterns reinforce the template detection signal even when content differs.
H2 and H3 structures should reflect location-specific topics rather than a uniform outline applied to every page. A Springfield page might include an H2 about the city’s aging water infrastructure, while a Shelbyville page might address that city’s new construction boom. The heading structure communicates to both users and Google that the page was designed for its specific market rather than stamped from a template.
Unique imagery distinguishes legitimate local pages from doorway patterns. Stock photos reused across all location pages signal template production. Original photographs of the actual team, completed projects, or the location itself signal genuine local presence. Google’s image recognition systems can identify duplicate images across pages, and image reuse within a location page directory reinforces the template pattern.
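Auditing your own directory for image reuse does not require image recognition; exact-byte hashing catches the common case of the same stock file uploaded to every page. A minimal sketch using SHA-256 of the file bytes (a perceptual hash would additionally catch resized or re-encoded variants; the URLs and byte strings here are placeholders):

```python
import hashlib
from collections import defaultdict

def find_reused_images(images: dict[str, bytes]) -> list[list[str]]:
    """Group page URLs that embed byte-identical image files."""
    by_hash = defaultdict(list)
    for page_url, image_bytes in images.items():
        by_hash[hashlib.sha256(image_bytes).hexdigest()].append(page_url)
    return [pages for pages in by_hash.values() if len(pages) > 1]

stock_photo = b"...stock crew photo bytes..."    # placeholder, not a real image
original_photo = b"...jobsite photo bytes..."    # placeholder
images = {
    "/locations/springfield/": stock_photo,
    "/locations/shelbyville/": stock_photo,
    "/locations/capital-city/": original_photo,
}
print(find_reused_images(images))
```

Any group in the output is a set of location pages sharing the same image file, which reinforces the template pattern described above.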
Internal link structures should vary by location page. A Springfield page should link to content relevant to the Springfield market (nearby service areas, Springfield-specific blog posts, Springfield case studies) rather than linking to the same set of generic service pages that every other location page links to. Varying internal link targets demonstrates that each page exists within a genuine contextual web rather than a mass-produced directory.
Page length variation signals organic content production. When every page in a directory is exactly 1,200 words, the uniformity suggests template-controlled production. Natural content creation produces pages of varying lengths because some locations have more to discuss than others. Allowing page lengths to vary between 800 and 2,000 words based on the genuine content available for each location produces a more natural pattern.
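The uniformity signal can be quantified as the coefficient of variation of word counts across the directory; a value near zero suggests template-controlled production. A small sketch with illustrative counts:

```python
from statistics import mean, stdev

def length_cv(word_counts: list[int]) -> float:
    """Coefficient of variation of page lengths: near zero suggests
    template-controlled production; higher values, an organic spread."""
    return stdev(word_counts) / mean(word_counts)

print(length_cv([1200, 1200, 1200, 1200]))          # 0.0 — every page identical
print(round(length_cv([820, 1450, 980, 1900]), 2))  # noticeably higher
```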
The User Value Test That Determines Whether a Local Page Justifies Its Existence
The fundamental question Google’s quality systems evaluate is whether a user landing on a specific city page receives value they would not get from the main service page. RicketyRoo’s analysis of location page quality frames it clearly: if you can swap out the location and all the rest of the content still makes sense, the page has not justified its existence.
A page passes the user value test when it answers location-specific questions that the main service page cannot: which team members serve this area, what projects have been completed nearby, what are the common service issues in this neighborhood, how quickly can the business respond to this location, and what do other customers in this city say about the service.
A page fails the user value test when removing the city name makes the page functionally identical to the main service page or to any other city page on the site. The test is straightforward to apply: print two location pages side by side, redact the city names, and evaluate whether a reader can distinguish between them. If not, the pages fail.
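The redaction test can also be automated for a first pass: strip the city names and measure how similar the pages remain. A sketch using `difflib`; the 0.80 similarity cutoff is an illustrative assumption, not a published value:

```python
import re
from difflib import SequenceMatcher

def passes_redaction_test(page_a: str, city_a: str,
                          page_b: str, city_b: str,
                          max_similarity: float = 0.80) -> bool:
    """Redact each page's city name, then compare; page pairs that
    remain near-identical fail. The 0.80 cutoff is an assumed value."""
    redacted_a = re.sub(re.escape(city_a), "[CITY]", page_a, flags=re.IGNORECASE)
    redacted_b = re.sub(re.escape(city_b), "[CITY]", page_b, flags=re.IGNORECASE)
    return SequenceMatcher(None, redacted_a, redacted_b).ratio() < max_similarity

doorway_a = "Top plumbers in Springfield. Springfield homeowners trust our team."
doorway_b = "Top plumbers in Shelbyville. Shelbyville homeowners trust our team."
print(passes_redaction_test(doorway_a, "Springfield", doorway_b, "Shelbyville"))  # False
```

An automated pass like this is no substitute for a human read, but it cheaply flags the pairs that are worth a manual side-by-side review.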
Google’s Quality Rater Guidelines reinforce this standard. Raters are instructed to evaluate whether content demonstrates experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) for the specific topic the page addresses. A city-specific landing page that demonstrates no specific knowledge of or experience in that city fails the expertise component regardless of how well the generic service content is written.
The user value test should be applied before publication, not after classification. Review each page against the test during content production, and reject pages that cannot demonstrate location-specific value. Publishing pages that fail the test in hopes that Google will not detect them is a strategy with deteriorating odds as detection improves with each core update.
Staff Interview and Project Documentation Pipelines for Scalable Local Content
Creating genuinely unique content for 50+ city pages requires a systematic content production pipeline that generates local inputs efficiently rather than writing each page from scratch.
Local staff interviews provide the most authentic content input. A 15-minute structured interview with the team member or crew that serves a specific area generates location-specific insights about common service issues, notable projects, and community relationships. Recording and transcribing these interviews produces raw content that a writer can develop into unique page sections. The interview template standardizes the process while the responses are naturally unique per location.
Project documentation workflows capture case study material as work is completed. Requiring field teams to photograph completed projects and provide brief descriptions (what was the issue, what was the solution, what was unique about this job) creates a library of location-specific examples that populate city pages with genuine content. A single case study adds 200 to 400 unique words to a location page.
Testimonial Collection and Public Data Integration for Location Page Differentiation
Customer testimonial collection by geography provides social proof that is inherently location-specific. Post-service review requests that ask customers to mention their city or neighborhood in their feedback produce testimonials that cannot be templated across locations. Even three to five city-specific testimonials add 100 to 200 unique words per page.
Local market data from public sources (census data, housing statistics, climate information, municipal records) provides factual content that varies by location. Integrating relevant local data points, such as the age distribution of housing stock, local water quality metrics, or seasonal temperature extremes that affect service demand, adds unique, factual content without requiring manual writing for each page.
The combined pipeline produces sufficient unique content for each page at a marginal cost per page that decreases as the system matures. The initial setup of interview templates, documentation workflows, and data sources requires investment, but the ongoing production cost per page is a fraction of commissioning unique articles from scratch.
Can a single high-authority backlink to a local landing page protect it from doorway classification even if content uniqueness is borderline?
No. Doorway page classification evaluates content quality and template patterns at the page and directory level, not backlink strength. A strong backlink may improve a page’s organic ranking for a period, but it does not exempt the page from quality evaluation. If the page’s content falls below the uniqueness threshold and shares template patterns with sibling pages, it remains vulnerable to classification regardless of its link profile.
How does Google treat local landing pages that use AI-generated city-specific content to achieve the uniqueness threshold?
Google evaluates content quality and uniqueness regardless of production method. AI-generated content that contains genuinely differentiated, factually accurate, location-specific information can pass doorway evaluation. AI content that produces superficially varied text without substantive local differences, such as rephrased generic descriptions with swapped city names, fails the same user value test as manual template-and-swap content. The production method is irrelevant; the output quality determines classification.
Does consolidating thin city pages into fewer regional hub pages trigger a temporary ranking drop during the transition?
Typically, yes. Expect a two- to four-week adjustment period after consolidation. During this window, Google processes the 301 redirects, re-evaluates the hub pages, and recalculates rankings, so some ranking volatility is normal. The hub pages typically begin recovering within three to four weeks and exceed the combined pre-consolidation traffic of the thin pages within six to eight weeks, because the consolidated content surpasses quality thresholds that the individual thin pages could not meet.
Sources
- Google Warns Against City Landing Pages as Doorway Pages – Search Engine Roundtable
- Location Pages: What Crosses the Line to Doorway Abuse – RicketyRoo
- Are City Landing Pages Doorway Pages – Google Search Central Community
- Doorway Pages vs Landing Pages: Hidden SEO Risks in 2026 – Big Red SEO
- How to Avoid Google’s Doorway Page Spam Penalty – Orbit Media