An autonomous, 11-vector AI swarm for uncovering deep web financial assets. Powered by the TinyFish Accelerator.
Every year, billions of dollars in forgotten financial assets, federal pensions, and class-action settlements go unclaimed. The data is heavily fragmented, protected by hostile government WAFs (Web Application Firewalls), and buried inside unstructured, 500-page PDF documents. Traditional static web scrapers fail immediately when confronted with dynamic DOMs or CAPTCHAs.
ClaimSleuth is a consumer FinTech platform that deploys a live 11-Vector AI Swarm to autonomously navigate the web on the user's behalf.
Instead of relying on brittle HTML parsers, ClaimSleuth utilizes TinyFish AI Agents combined with a proprietary OSINT (Open Source Intelligence) routing system. The orchestrator fires agents in strategic, rate-limited squads to bypass government timeouts, extract unstructured data from the deep web, and return perfectly formatted JSON payloads to a fault-tolerant React dashboard.
- 11-Vector OSINT Dragnet: Scrapes Class Actions, SEC Fair Funds, PBGC Pensions, Retail Bankruptcies, and Municipal Check registries simultaneously.
- Batched Concurrency: A highly optimized Node/Express orchestrator that fires agents in "Squads of 3" with micro-cooldowns to prevent DDoS flags and CAPTCHA walls.
- Fault-Tolerant UI: A React frontend that gracefully isolates agent timeouts without crashing the user's dashboard.
- Day-1 Revenue Funnel: Converts the extracted JSON payloads into a high-contrast data visualization suite, paired with a scarcity timer and a secure Stripe checkout.
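The "Squads of 3" batching pattern above can be sketched roughly as follows. This is an illustrative sketch only: the `runAgent` stub, vector names, and cooldown value are assumptions, not the actual TinyFish API.

```javascript
// Sketch of batched concurrency: fire agents in squads of 3 with a
// micro-cooldown between squads to avoid rate-limit and CAPTCHA walls.
const SQUAD_SIZE = 3;
const COOLDOWN_MS = 250; // hypothetical micro-cooldown between squads

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Placeholder for a real TinyFish agent call (assumed shape).
async function runAgent(vector) {
  return { vector, payload: {} };
}

async function runSwarm(vectors) {
  const results = [];
  for (let i = 0; i < vectors.length; i += SQUAD_SIZE) {
    const squad = vectors.slice(i, i + SQUAD_SIZE);
    // allSettled isolates a single agent's timeout or rejection
    // instead of failing the whole batch -- the same property the
    // React dashboard relies on.
    const settled = await Promise.allSettled(squad.map(runAgent));
    results.push(...settled);
    if (i + SQUAD_SIZE < vectors.length) await sleep(COOLDOWN_MS);
  }
  return results;
}

runSwarm(['classActions', 'secFairFunds', 'pbgcPensions', 'retailBankruptcies'])
  .then((results) => console.log(`completed ${results.length} agent runs`));
```

Each settled result carries a `status` of `'fulfilled'` or `'rejected'`, so failed vectors can be rendered as isolated error cards rather than crashing the dashboard.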
- Frontend: React, Tailwind CSS
- Backend: Node.js, Express
- Database & Auth: Supabase
- AI Orchestration: TinyFish API (Veo/Swarm Models)
- Clone the repository.
- Run `npm install` in both the `/frontend` and `/backend` directories.
- Add your `TINYFISH_API_KEY` and Supabase credentials to the `.env` file.
- Run `npm run dev` to launch the API Gateway and React dashboard.
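The `.env` file referenced above might look like the following. `TINYFISH_API_KEY` comes from the steps above; the Supabase variable names are assumptions based on typical Supabase client setup, so match them to whatever the backend actually reads.

```
TINYFISH_API_KEY=your-tinyfish-api-key
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-supabase-anon-key
```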