Bot Policy & Operational Standards
Official technical and legal specifications for the HumangoBot audit network. This document outlines our commitment to RFC 9309, GDPR compliance, and ethical web auditing.
1. Purpose & Operator Identity
HumangoBot is a specialized security crawler operated by Humango Limited. Our primary mission is the automated identification of technical vulnerabilities and GDPR compliance gaps across global web infrastructure.
Static IP
116.203.3.75
User-Agent
HumangoBot/1.0 (+https://humango.app)
Reverse DNS
bot.humango.app
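Site operators can verify a visitor claiming to be HumangoBot against the identity fields above using forward-confirmed reverse DNS. A minimal sketch in Python's standard library, assuming the published reverse DNS name; the helper name is illustrative, not part of any official tooling:

```python
import socket

# Published reverse DNS name from the identity table above.
EXPECTED_RDNS = "bot.humango.app"

def is_genuine_humangobot(ip: str) -> bool:
    """Forward-confirmed reverse DNS: the IP must PTR-resolve to
    bot.humango.app, and that hostname must resolve back to the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse (PTR) lookup
        if hostname != EXPECTED_RDNS:
            return False
        # Forward-confirm: the claimed hostname must map back to the IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        # Unresolvable IPs are treated as not genuine.
        return False
```

Checking both directions matters: a PTR record alone can be set to any value by whoever controls the IP's reverse zone.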
2. Technical Standards (RFC 9309)
HumangoBot is a "Polite Crawler". We prioritize the server stability of the audited websites over the speed of our indexing.
- REP (Robots Exclusion Protocol): Full adherence to robots.txt. We strictly avoid paths marked as Disallow for our agent or for the * wildcard.
- DELAY (Crawl-Delay & Backoff): We support Crawl-delay. In its absence, we maintain a minimum of 5 seconds between requests. We respect Retry-After headers with exponential backoff.
- DNT (Privacy Headers DNT/GPC): Every request transmits DNT: 1 (Do Not Track) and Sec-GPC: 1 headers, signaling our intent not to track or fingerprint users.
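The REP and crawl-delay behaviour described above can be sketched with Python's standard robots.txt parser. This assumes the robots.txt body has already been fetched; the helper names and the 5-second floor constant are illustrative:

```python
import urllib.robotparser

DEFAULT_DELAY = 5.0  # policy minimum when no Crawl-delay directive exists

def _parser(robots_txt: str) -> urllib.robotparser.RobotFileParser:
    """Build a parser from an already-fetched robots.txt body."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

def allowed(robots_txt: str, url: str, agent: str = "HumangoBot") -> bool:
    """Honour Disallow rules for our agent (and the * wildcard)."""
    return _parser(robots_txt).can_fetch(agent, url)

def polite_delay(robots_txt: str, agent: str = "HumangoBot") -> float:
    """Use the site's Crawl-delay, but never drop below the 5 s floor."""
    delay = _parser(robots_txt).crawl_delay(agent)  # None if absent
    return max(float(delay or 0), DEFAULT_DELAY)
```

A site declaring `Crawl-delay: 10` would be crawled at most every 10 seconds; one declaring `Crawl-delay: 2` (or nothing) still gets the 5-second minimum.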
3. Data Protection & Ethics (GDPR)
Stateless Operation
No cookie storage or session persistence. Each page crawl starts with a clean incognito context. No fingerprinting is performed.
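A stateless, privacy-signalling fetch might look like the sketch below. The header values come from this document; the fetch helper itself is an assumption, not the production crawler:

```python
import urllib.request

# User-Agent, DNT and Sec-GPC values as published in this policy.
HEADERS = {
    "User-Agent": "HumangoBot/1.0 (+https://humango.app)",
    "DNT": "1",
    "Sec-GPC": "1",
}

def fetch_stateless(url: str) -> bytes:
    """Fetch one page with no shared state between calls."""
    # A fresh opener per call: no cookie jar, no session persistence.
    opener = urllib.request.build_opener()
    req = urllib.request.Request(url, headers=HEADERS)
    with opener.open(req, timeout=30) as resp:
        return resp.read()
```

Because each call builds its own opener and no cookie handler is installed, nothing from one page fetch can leak into the next.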
Data Minimization
Our system is hard-coded to ignore and redact emails, phone numbers, and PII found in page source. We only collect technical metadata.
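The redaction step can be sketched as a simple substitution pass. The two patterns below are deliberately simplified assumptions for illustration, not the production detection rules:

```python
import re

# Simplified illustrative patterns; real-world PII detection is broader.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact_pii(page_source: str) -> str:
    """Replace emails and phone-like digit runs with a fixed token,
    so only technical metadata survives into stored evidence."""
    redacted = EMAIL_RE.sub("[REDACTED]", page_source)
    return PHONE_RE.sub("[REDACTED]", redacted)
```

Running redaction before anything is written to disk means raw PII never enters logs or screenshot metadata in the first place.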
Legal Basis (Art. 6(1)(f))
Audits are conducted under Legitimate Interest. This processing is necessary for the security monitoring of the digital ecosystem.
Retention Policy
Audit evidence (screenshots/logs) is stored for 365 days for verification purposes and automatically deleted thereafter.
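A retention sweep enforcing the 365-day window could look like the following sketch. The flat files-on-disk layout and helper name are assumptions for illustration:

```python
import datetime
import pathlib

# 365-day retention window from the policy above.
RETENTION = datetime.timedelta(days=365)

def purge_expired(evidence_dir: pathlib.Path, now=None) -> list[pathlib.Path]:
    """Delete evidence files older than the retention window and
    return the paths that were removed."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    removed = []
    for path in evidence_dir.rglob("*"):
        if path.is_file():
            mtime = datetime.datetime.fromtimestamp(
                path.stat().st_mtime, datetime.timezone.utc
            )
            if now - mtime > RETENTION:
                path.unlink()
                removed.append(path)
    return removed
```

Run on a daily schedule, this keeps deletion automatic rather than dependent on manual review.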
Data Protection Officer
For Article 17 (Right to Erasure) requests, DPA inquiries, or domain exclusion (Opt-Out):
Verification Manifesto v1.1