Google’s published webspam reports document over 6 million manual actions processed in recent years, each involving a human reviewer evaluating a site against specific spam policy criteria. That volume illustrates the scale of a process most site owners misunderstand. Manual actions are neither algorithmic penalties nor automated SpamBrain classifications: each one requires a human reviewer to confirm a violation. Sites enter the review queue through algorithmic flags from SpamBrain, user spam reports, escalations from quality rater evaluations, and targeted webspam team investigations. The triage system prioritizes review by potential impact: sites with higher visibility, YMYL content, and suspected violations affecting large numbers of queries receive priority. Reviewers follow internal guidelines mapped to published spam policies, evaluating documented criteria rather than making subjective quality judgments. The distinction between algorithmic suppression and manual action determines both the prevention approach and the reconsideration strategy.
How Sites Enter the Manual Review Queue and What Triggers Human Evaluation
Not every site receives manual review. The vast majority of spam is handled algorithmically by SpamBrain and other automated systems. Sites enter the manual review queue through several documented pathways:
Algorithmic flags. SpamBrain and other spam detection systems identify sites that exhibit patterns warranting human verification. These flags may indicate suspected link schemes, cloaking, or content spam that the algorithm detected but routed for human confirmation before action is taken.
User spam reports. Google provides a spam reporting mechanism that allows users and competitors to report suspected spam. While not all reports trigger review, a pattern of reports about the same site increases the likelihood of manual evaluation.
Escalation from quality rater evaluations. Quality raters who encounter suspected spam during their content evaluation work can flag sites for further review. This creates a pathway from routine quality assessment to manual action review.
Targeted investigations. Google’s webspam team conducts proactive investigations into specific spam categories, particularly during spam update cycles. Sites identified during these sweeps enter the review queue.
The triage system prioritizes review based on the potential impact of the suspected spam: sites with higher visibility, sites in YMYL categories, and sites where the suspected violation affects a large number of search queries receive priority review. [Observed]
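The prioritization logic can be sketched as a scoring function. This is a toy illustration, not Google's actual implementation: the fields (`visibility`, `is_ymyl`, `affected_queries`) and all weights are assumptions chosen only to mirror the three factors named above.

```python
from dataclasses import dataclass

@dataclass
class ReviewCandidate:
    """A site flagged for possible manual review (illustrative fields only)."""
    site: str
    visibility: float      # relative search visibility, 0.0-1.0 (assumed metric)
    is_ymyl: bool          # Your Money or Your Life category
    affected_queries: int  # queries the suspected violation touches

def triage_priority(c: ReviewCandidate) -> float:
    """Toy priority score mirroring the three prioritization factors above."""
    score = c.visibility
    if c.is_ymyl:
        score += 0.5                              # YMYL sites jump the queue
    score += min(c.affected_queries / 10_000, 1)  # capped query-footprint weight
    return score

# Higher-impact candidates sort to the front of the review queue.
queue = sorted(
    [
        ReviewCandidate("low-traffic-blog.example", 0.1, False, 200),
        ReviewCandidate("health-advice.example", 0.6, True, 8_000),
    ],
    key=triage_priority,
    reverse=True,
)
```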
What Manual Reviewers Evaluate and the Evidence Standards They Apply
Manual reviewers follow internal guidelines that map directly to Google’s published spam policies. The review is not a subjective quality judgment; it is a policy compliance evaluation against specific, documented criteria.
For unnatural links (the most common manual action type), reviewers examine:
- The backlink profile for patterns consistent with link scheme definitions
- Specific link sources identified by algorithmic flags
- Anchor text distributions that indicate manipulation
- Link acquisition patterns that suggest paid or exchanged links
- Evidence of link selling from the site to others
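The anchor-text check in the list above can be approximated programmatically. The sketch below is a hypothetical heuristic, not a disclosed Google method: the 30% threshold and the branded-anchor allowlist are assumptions, reflecting the common observation that natural profiles are dominated by branded and naked-URL anchors rather than one repeated commercial phrase.

```python
from collections import Counter

def anchor_share(links: list[dict]) -> dict[str, float]:
    """Fraction of backlinks using each anchor text."""
    counts = Counter(link["anchor"] for link in links)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}

def looks_manipulated(links: list[dict], threshold: float = 0.3) -> list[str]:
    """Flag non-branded anchors whose share exceeds a (hypothetical) threshold."""
    branded = {"Example Corp", "example.com"}  # assumed brand terms for this site
    return [
        anchor
        for anchor, share in anchor_share(links).items()
        if share > threshold and anchor not in branded
    ]

# A profile where 40% of links carry one exact-match commercial anchor.
profile = (
    [{"anchor": "buy cheap widgets"}] * 40
    + [{"anchor": "Example Corp"}] * 50
    + [{"anchor": "example.com"}] * 10
)
```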
For thin content manual actions, reviewers evaluate:
- Whether pages provide substantive, unique value
- The proportion of auto-generated or template-based content
- Evidence of content scraping or spinning
- Doorway page patterns where multiple pages target similar queries with minimal content variation
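The doorway pattern in the last bullet, many pages targeting similar queries with minimal variation, is detectable as near-duplicate content. A minimal sketch using word-shingle Jaccard similarity (a standard technique, assumed here rather than documented by Google):

```python
def shingles(text: str, k: int = 3) -> set[tuple[str, ...]]:
    """Word k-grams used for near-duplicate comparison."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Similarity of two texts as overlap of their shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Classic doorway pattern: identical template, only the location token swapped.
page_a = "best plumber in Springfield call today for fast affordable plumbing service"
page_b = "best plumber in Shelbyville call today for fast affordable plumbing service"
```

Pages scoring near 1.0 against many siblings on the same site would be candidates for the doorway classification.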
For pure spam, reviewers check for:
- Cloaking (different content served to Googlebot versus users)
- Aggressive auto-generated content
- Malware or phishing
- Hidden text or links
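Cloaking, the first item above, is verifiable by fetching the same URL with a Googlebot user agent and a normal browser user agent and comparing what comes back. The comparison step can be sketched as below; the tag-stripping and the 0.8 similarity cutoff are simplifying assumptions (a real pipeline would render the page), and the fetching itself is left out.

```python
import difflib
import re

def visible_text(html: str) -> str:
    """Crude tag-stripper; assumes static HTML rather than rendered output."""
    return " ".join(re.sub(r"<[^>]+>", " ", html).split())

def cloaking_suspected(googlebot_html: str, browser_html: str,
                       min_similarity: float = 0.8) -> bool:
    """Compare text served to Googlebot vs a normal browser for the same URL.

    Low similarity suggests different content per user agent, i.e. cloaking.
    """
    ratio = difflib.SequenceMatcher(
        None, visible_text(googlebot_html), visible_text(browser_html)
    ).ratio()
    return ratio < min_similarity
```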
For site reputation abuse, reviewers assess:
- Whether third-party content is published primarily to exploit the host site’s ranking signals
- The editorial relationship between the host site and the third-party content
- Whether the content would rank independently without the host site’s authority
The evidence standard requires confirmation of the violation, not just suspicion. Reviewers must be able to document specific examples that clearly violate published policy. [Confirmed]
The Manual Action Issuance Process and How Scope Is Determined
When a reviewer confirms a policy violation, they determine the manual action scope based on how widespread the violation is:
Site-wide manual actions are applied when the violation pervades the entire site or when the violation is structural (e.g., site-wide cloaking, a site built entirely on scraped content). Site-wide actions suppress all pages on the domain.
Partial manual actions target specific sections or page types where the violation occurs. A site with a legitimate main content section but a spam-ridden forum subsection may receive a partial action that affects only the forum URLs.
The scope determination depends on proportionality. If 5% of pages on a 10,000-page site violate spam policies, a partial action targeting those pages is appropriate. If 60% of pages violate, a site-wide action is more likely because the site’s overall character is spammy.
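The proportionality logic reduces to a simple decision rule. The sketch below follows the examples in the paragraph above; the 50% cutoff is an illustrative assumption, not a published threshold.

```python
def action_scope(violating_pages: int, total_pages: int,
                 structural_violation: bool = False) -> str:
    """Toy scope rule following the proportionality logic described above."""
    if structural_violation:
        return "site-wide"  # e.g. site-wide cloaking, fully scraped site
    ratio = violating_pages / total_pages
    return "site-wide" if ratio >= 0.5 else "partial"
```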
The manual action notification in Search Console specifies the action type and scope. For partial actions, the notification may identify affected URL patterns or sections. For site-wide actions, all pages are affected. Understanding the scope determination logic helps in crafting a reconsideration request that demonstrates comprehensive remediation across the full affected scope. [Confirmed]
The Reconsideration Request Review Process and What Determines Approval or Rejection
After remediation, the site owner submits a reconsideration request through Search Console. A reviewer, often different from the one who issued the original action, evaluates whether the violation has been genuinely resolved.
A successful reconsideration request includes three components:
- Explanation of the issue. Demonstrate that you understand exactly what violation occurred. Vague acknowledgments (“we had some bad links”) perform worse than specific descriptions (“we identified 847 links from 23 guest post networks that violated your link scheme policies”).
- Documentation of remediation. Provide evidence of every step taken: links removed, disavow files submitted, content deleted, cloaking configurations fixed. Include timestamps, screenshots, and specific examples. Reviewers verify remediation claims against the actual current state of the site.
- Prevention measures. Describe specific changes to processes, governance, and monitoring that will prevent recurrence. Generic promises (“we will follow guidelines”) are insufficient. Specific measures (“we implemented automated backlink monitoring that flags new links matching these patterns”) demonstrate credible prevention.
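A prevention measure like the backlink monitoring quoted above can be sketched as follows. The patterns and URLs are hypothetical placeholders; in practice the backlink list would come from a backlink data export and the patterns from the cleanup itself.

```python
import re

# Hypothetical patterns derived from a past link-scheme cleanup.
SUSPECT_PATTERNS = [
    re.compile(r"guest-?post", re.IGNORECASE),
    re.compile(r"/sponsored/"),
]

def flag_new_links(known: set[str], current_backlinks: list[str]) -> list[str]:
    """Return newly discovered backlinks that match a suspect pattern."""
    new = [url for url in current_backlinks if url not in known]
    return [url for url in new if any(p.search(url) for p in SUSPECT_PATTERNS)]
```

Running this on each backlink export and escalating any hits is the kind of concrete, verifiable prevention measure a reconsideration reviewer can credit.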
Common rejection reasons include:
- Incomplete remediation where some violating pages, links, or practices remain
- Evidence that the site plans to continue practices that led to the violation
- Remediation that addresses symptoms rather than the root cause
- Poor documentation that does not allow the reviewer to verify claims
Review typically takes several days to weeks. Link-related reconsideration requests may take longer due to the complexity of verifying link cleanup. Multiple rejections before approval are common for complex violations, particularly inherited manual actions where historical context is incomplete. [Confirmed]
Do user spam reports directly trigger manual actions, or do they only flag sites for review?
User spam reports enter a triage queue where they contribute to review prioritization rather than directly triggering manual actions. A single report rarely initiates review. Patterns of multiple reports about the same site increase the likelihood of human evaluation. The manual action itself requires a reviewer to independently confirm the policy violation against documented criteria before issuance.
Can a site receive a manual action for a violation that only affects a small percentage of its pages?
Yes. Google applies partial manual actions when violations are confined to specific sections or page types. A site with a legitimate main content section but a spam-ridden forum or user-generated content area may receive a partial action targeting only the affected URLs. The proportionality threshold is contextual, but concentrated violations in even 5% of pages can trigger a partial manual action for that section.
How does the reconsideration reviewer verify that remediation claims in the request are accurate?
Reviewers cross-reference the claims in reconsideration requests against the site’s current state. They check whether flagged links still exist or appear in the disavow file, whether violating content has been removed or corrected, and whether cloaking configurations are resolved. Reviewers also examine the site for additional violations beyond the originally flagged issues. Documentation with timestamps, screenshots, and specific examples increases credibility.
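The disavow file checked in that verification step is a plain-text file uploaded through Search Console's disavow tool: one entry per line, a `domain:` prefix to disavow a whole domain, full URLs for individual links, and `#` for comments. The domains below are placeholders.

```
# Links from guest-post networks; removal requests sent, no response
domain:guestpost-network.example
# Individual paid links we could not get removed
https://blog.example/sponsored/widgets-review
```

Dating the removal attempts in the comments gives the reviewer a timeline to cross-reference against the remediation claims in the request.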