GPT-5.2 vs. Claude 4.1 Opus
Passed:
GPT-5.2 78.0% (64/82)
Claude 4.1 Opus 61.4% (51/83)
Average request time:
GPT-5.2 6.94s
Claude 4.1 Opus 17.48s
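The headline percentages above are just passed / total, expressed as a percentage and rounded to one decimal place. A minimal sketch (the function name is illustrative, not from the benchmark's codebase):

```python
# Compute a pass rate as a percentage rounded to one decimal place.
def pass_rate(passed: int, total: int) -> float:
    return round(100 * passed / total, 1)

print(pass_rate(64, 82))  # GPT-5.2: 78.0
print(pass_rate(51, 83))  # Claude 4.1 Opus: 61.4
```

Note that the two models have different totals (82 vs. 83), so the percentages are not computed over an identical denominator.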
Summary

| Task | GPT-5.2 | Claude 4.1 Opus |
| --- | --- | --- |
| Annotation Understanding | 1 / 2 | 1 / 2 |
| CAPTCHA | 0 / 2 | 2 / 2 |
| Color Identification | 0 / 1 | 1 / 1 |
| Defect Detection | 14 / 15 | 9 / 15 |
| Document Understanding | 9 / 10 | 9 / 10 |
| Localization | 1 / 1 | 0 / 1 |
| OCR | 9 / 9 | 8 / 9 |
| Object Counting | 3 / 11 | 0 / 11 |
| Object Detection | 2 / 2 | 1 / 2 |
| Object Measurement | 0 / 1 | 0 / 1 |
| Object Understanding | 8 / 11 | 7 / 11 |
| Receipt Reading | 1 / 1 | 1 / 1 |
| Sign Understanding | 2 / 2 | 2 / 2 |
| Spatial Relations | 15 / 17 | 11 / 17 |
| Web Action Understanding | 4 / 4 | 4 / 4 |
Contribute a Prompt
Have an idea for a prompt? Open a pull request on the project repository!