GPT-5 Nano vs. Claude 4 Opus
Passed:
  GPT-5 Nano      62.7% (52/83)
  Claude 4 Opus   62.7% (52/83)

Average request time:
  GPT-5 Nano      12.74s
  Claude 4 Opus   17.98s
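For reference, the headline "Passed" figure is just passed prompts divided by total prompts, rounded to one decimal place. A minimal sketch of that calculation (the helper name is illustrative; the 52/83 values come from the results above):

```python
def pass_rate(passed: int, total: int) -> str:
    """Format a pass rate the way this comparison reports it."""
    return f"{passed / total * 100:.1f}% ({passed}/{total})"

print(pass_rate(52, 83))  # -> "62.7% (52/83)"
```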
Summary
Category                    GPT-5 Nano   Claude 4 Opus
Annotation Understanding    0 / 2        1 / 2
CAPTCHA                     2 / 2        2 / 2
Color Identification        1 / 1        1 / 1
Counting nodes              0 / 1        0 / 1
Defect Detection            12 / 15      10 / 15
Document Understanding      8 / 10       9 / 10
Localization                0 / 1        0 / 1
OCR                         7 / 9        8 / 9
Object Counting             1 / 10       0 / 10
Object Detection            2 / 2        2 / 2
Object Measurement          0 / 2        1 / 2
Object Understanding        6 / 11       7 / 11
Receipt Reading             1 / 1        1 / 1
Sign Understanding          2 / 2        1 / 2
Spatial Relations           11 / 17      9 / 17
Web Action Understanding    4 / 4        4 / 4
Contribute a Prompt
Have an idea for a prompt? Open a pull request on the project repository!