Claude 4.1 Opus vs. GPT-5 Nano
| Model | Passed | Average request time |
| --- | --- | --- |
| Claude 4.1 Opus | 61.4% (51/83) | 17.48s |
| GPT-5 Nano | 61.0% (50/82) | 13.09s |
Summary
| Category | Claude 4.1 Opus | GPT-5 Nano |
| --- | --- | --- |
| Annotation Understanding | 1 / 2 | 0 / 2 |
| CAPTCHA | 2 / 2 | 0 / 2 |
| Color Identification | 1 / 1 | 1 / 1 |
| Defect Detection | 9 / 15 | 11 / 15 |
| Document Understanding | 9 / 10 | 8 / 10 |
| Localization | 0 / 1 | 0 / 1 |
| OCR | 8 / 9 | 6 / 9 |
| Object Counting | 0 / 11 | 1 / 11 |
| Object Detection | 1 / 2 | 1 / 2 |
| Object Measurement | 0 / 1 | 0 / 1 |
| Object Understanding | 7 / 11 | 7 / 11 |
| Receipt Reading | 1 / 1 | 1 / 1 |
| Sign Understanding | 2 / 2 | 2 / 2 |
| Spatial Relations | 11 / 17 | 12 / 17 |
| Web Action Understanding | 4 / 4 | 4 / 4 |
Contribute a Prompt
Have an idea for a prompt? Open a pull request on the project repository!