AI vs Manual Interview Cheating Detection
AI detection tools analyze 10 or more signals simultaneously (timing, eye gaze, audio, phrasing) that a human interviewer cannot track in real time. Vendor-published detection rates for AI tools range from 85% to 97%. Manual observation catches an estimated 30-50% of cheating attempts, limited to what a single interviewer can notice during a live conversation.
Detection Comparison
| Factor | AI Detection | Manual Observation |
|---|---|---|
| Signals analyzed | 8-20+ simultaneous | 2-3 (what interviewer notices) |
| Detection rate | 85-97% (vendor-reported) | ~30-50% (estimated) |
| Analysis time | Real-time, with results in as little as 9 seconds | During the interview only |
| Cost per interview | $0-5 | No added cost (interviewer time already spent) |
| Consistency | Same criteria every time | Varies by interviewer |
| Eye gaze tracking | ✓ | — |
| Audio pattern analysis | ✓ | — |
| Timing anomalies | ✓ | Limited |
| Phrasing comparison vs AI models | ✓ | — |
| Rapport / gut feeling | — | ✓ |
| Follow-up probing | — | ✓ |
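The "signals analyzed" row above can be made concrete with a minimal sketch of how a multi-signal detector might combine per-interview measurements into one suspicion score. All signal names, weights, and the scoring function here are hypothetical illustrations, not any vendor's actual model:

```python
# Hypothetical example: weighted combination of normalized detection signals.
# Signal names and weights are invented for illustration only.
SIGNAL_WEIGHTS = {
    "gaze_offscreen_ratio": 0.25,   # fraction of time eyes leave the screen
    "answer_latency_zscore": 0.20,  # unusual pauses before answering
    "audio_anomaly_score": 0.20,    # e.g. keyboard clicks, a second voice
    "ai_phrasing_similarity": 0.35, # similarity of answers to LLM output
}

def suspicion_score(signals: dict) -> float:
    """Weighted sum of signals, each clamped to [0, 1]."""
    score = sum(
        SIGNAL_WEIGHTS[name] * min(max(value, 0.0), 1.0)
        for name, value in signals.items()
        if name in SIGNAL_WEIGHTS
    )
    return round(score, 3)

# One interview's measurements fused into a single score -- the kind of
# simultaneous aggregation a human interviewer cannot do live.
print(suspicion_score({
    "gaze_offscreen_ratio": 0.6,
    "answer_latency_zscore": 0.8,
    "audio_anomaly_score": 0.1,
    "ai_phrasing_similarity": 0.9,
}))
```

The point of the sketch is the aggregation itself: each signal is weak alone, but a weighted combination applies the same criteria to every interview, which is the consistency advantage listed in the table.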
Data Sources
- 35% cheating rate: Fabric AI analysis of 19,368 interviews (late 2025)
- 85-97% AI detection rate: vendor-reported figures (Sherlock AI, Fabric)
- Manual detection estimate: industry consensus, no controlled study available
- 8 detection signals, 9-second results: VerifyMeeting product specifications