Scraping commons.wikimedia.org

commons.wikimedia.org requires Camoufox (Tier 2) because the site uses advanced anti-bot protection (DataDome strict mode, Akamai, Kasada). Cheaper engine tiers fail here; Camoufox is the cheapest engine that consistently works.
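The tier escalation implied above can be sketched as a toy router: try the cheapest engine first and stop at the first one that succeeds. This is an illustrative sketch, not the production router; tier names other than Camoufox are assumptions.

```python
# Toy escalation router: cheapest engine first, stop at the first that succeeds.
# Tier names other than "camoufox" are illustrative assumptions.

TIERS = ["http", "headless-chrome", "camoufox"]  # ordered cheapest -> most expensive

def route(domain: str, works_with: set) -> str:
    """Return the cheapest tier known to succeed for this domain."""
    for tier in TIERS:
        if tier in works_with:
            return tier
    raise RuntimeError(f"no engine tier works for {domain}")

# commons.wikimedia.org defeats the cheaper tiers, so routing lands on Camoufox:
print(route("commons.wikimedia.org", works_with={"camoufox"}))  # camoufox
```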

Success rate: 100% (34 of 34 requests)
Avg latency: 8649 ms across all requests
Primary engine: Camoufox, 10 credits per request
Discovered APIs: 0 (none captured yet)

Why Camoufox wins on commons.wikimedia.org

Camoufox is a patched Firefox build with anti-fingerprint protections at the C++ level — properties like `navigator.webdriver`, WebGL vendor strings, and timing signatures all read as a real browser instead of a headless harness.
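The fingerprint surfaces above can be illustrated with a toy check of the kind anti-bot scripts run. The signal names and thresholds here are illustrative assumptions, not any vendor's actual probe list; real detection inspects far more surfaces.

```python
# Toy sketch of a client-side fingerprint check, as anti-bot scripts might run it.
# Signal names are illustrative assumptions, not any vendor's actual probes.

def looks_headless(signals: dict) -> bool:
    """Return True if the fingerprint resembles an automation harness."""
    if signals.get("navigator.webdriver"):       # True in vanilla headless browsers
        return True
    vendor = signals.get("webgl.vendor", "")
    if "SwiftShader" in vendor:                  # software GL renderer: common headless tell
        return True
    return False

# A patched build like Camoufox aims to make these signals read like a real browser:
real_browser = {"navigator.webdriver": False, "webgl.vendor": "Intel Inc."}
headless = {"navigator.webdriver": True, "webgl.vendor": "Google SwiftShader"}

print(looks_headless(real_browser))  # False
print(looks_headless(headless))      # True
```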

Cost math: at 10 credits per request, scraping commons.wikimedia.org costs $1.00–$1.80 per 1,000 requests on the Starter tier. Compare to ScrapingBee ($14.70/1K) or Firecrawl (~$5.33/1K flat). For high-volume workloads on this domain, the credit-based model lands cheaper.
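The per-1K figures can be sanity-checked with quick arithmetic. The per-credit prices below are assumptions back-derived from the $1.00-$1.80 range quoted above, not published rates:

```python
# Back-of-envelope cost per 1,000 requests at 10 credits/request.
# Per-credit prices are assumptions derived from the $1.00-$1.80 range above.

CREDITS_PER_REQUEST = 10

def cost_per_1k(usd_per_credit: float) -> float:
    """Cost in USD of 1,000 requests at the given per-credit price."""
    return 1000 * CREDITS_PER_REQUEST * usd_per_credit

low, high = cost_per_1k(0.0001), cost_per_1k(0.00018)
print(f"${low:.2f}-${high:.2f} per 1K requests")  # matches the quoted $1.00-$1.80

competitors = {"ScrapingBee": 14.70, "Firecrawl": 5.33}  # $/1K, from the text above
for name, price in competitors.items():
    print(f"{name}: {price / high:.1f}x more expensive at the high end")
```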

Try commons.wikimedia.org in the playground

10 free requests per day, no signup. The router picks the engine — you get clean markdown back.

Other domains we route through Camoufox

Sites with similar protection profiles. Each link goes to its own intel page with real production routing data.

Related deep-dives