Upload
Choose any audio file (WAV, MP3, FLAC). The backend loads it with torchaudio, resamples to 16 kHz, and normalizes for the model.
Voice Deepfake Detection
Upload an audio clip and get a real-or-fake verdict in seconds. RawNetLite plus our meta-learning layer delivers robust, production-ready detection.
RawNetLite plus our meta-learning layer runs on a fixed 3-second waveform and returns P(fake) and a real/fake label in one request.
Each run is stored in your browser. View history on Results and clear when needed. Ready for API-backed storage when you need it.
What makes our system different is a meta-learning layer on top of the base RawNetLite encoder. Instead of a fixed classifier, we train a meta-learner that adapts quickly to new domains or attack types from only a few examples, improving generalization and robustness to unseen deepfakes. This is our key differentiator: learning to learn at inference time.
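One common way to realize such a few-shot inner loop is to clone the classifier head and take a few gradient steps on labelled support examples from the new domain, keeping the encoder frozen. This is a generic sketch of that idea, not our exact training code; the head/feature interface is assumed:

```python
import copy
import torch
from torch import nn

def adapt(head: nn.Module, feats: torch.Tensor, labels: torch.Tensor,
          steps: int = 20, lr: float = 0.5) -> nn.Module:
    """Inner-loop adaptation: fine-tune a copy of the classifier head on
    a handful of support examples. `feats` are encoder features assumed
    to be precomputed by the frozen RawNetLite encoder."""
    adapted = copy.deepcopy(head)         # leave the original head intact
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        logits = adapted(feats).squeeze(-1)
        loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
        loss.backward()
        opt.step()
    return adapted
```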
From file to label: pipeline and model in detail.
Three steps from upload to result.
Select an audio file. Supported formats include WAV, MP3, and more.
We resample, normalize, and run RawNetLite + meta-learning layer.
Receive P(fake) and a real/fake label; results are saved to history.
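The three steps above can be sketched end to end; the model here is a stand-in, and the shape of the history record is an assumption:

```python
import torch
from torch import nn

SR = 16_000
CLIP = 3 * SR  # fixed 3-second input

def run_pipeline(wav: torch.Tensor, model: nn.Module) -> dict:
    """Normalize, fix the length to 3 s, score, and build the record
    that would be saved to history."""
    wav = wav / wav.abs().max().clamp(min=1e-9)          # peak-normalize
    wav = torch.nn.functional.pad(
        wav[..., :CLIP], (0, max(0, CLIP - wav.shape[-1])))
    with torch.no_grad():
        p_fake = torch.sigmoid(model(wav.unsqueeze(0))).item()
    return {"p_fake": p_fake, "label": "fake" if p_fake >= 0.5 else "real"}
```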
We build and share tools to detect voice deepfakes—so the world can catch harmful synthetic audio with better, advanced, and modern systems. Open source, API access, and a community-driven approach to make the digital world safer and easier to live in.
Read our mission
We provide API keys and a hosted endpoint (in progress) so you can integrate our service into your apps without hosting it locally or on a VPS. The docs show how to use the API; we'll announce when the endpoint and key signup go live.
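Once the endpoint is live, a client call might look like the following. The URL, bearer-token auth scheme, request field names, and response shape are all hypothetical until the docs confirm them:

```python
import requests

API_URL = "https://api.example.com/v1/detect"  # hypothetical; real endpoint TBA

def build_headers(api_key: str) -> dict:
    # Bearer-token auth is an assumption; check the docs once keys go live.
    return {"Authorization": f"Bearer {api_key}"}

def detect(path: str, api_key: str, timeout: float = 30.0) -> dict:
    """POST an audio file and return the JSON verdict, e.g.
    {"p_fake": 0.93, "label": "fake"} (response shape assumed)."""
    with open(path, "rb") as f:
        resp = requests.post(API_URL, headers=build_headers(api_key),
                             files={"file": f}, timeout=timeout)
    resp.raise_for_status()
    return resp.json()
```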
How to use the API →
Run locally or host on a VPS. Get started covers local setup; Deployment covers hosting on a VPS (including where to get a server), Nginx, SSL, and security.
Deep dive into the pipeline, model, and meta-learning layer.