Claude 4 Opus vs. Mistral Small 3.1 24B
Passed:
- Claude 4 Opus: 62.7% (52/83)
- Mistral Small 3.1 24B: 44.6% (37/83)

Average request time:
- Claude 4 Opus: 17.98s
- Mistral Small 3.1 24B: 15.16s
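As a sanity check on the headline numbers, the pass percentages follow directly from the raw counts. A minimal sketch (the dictionary of counts is just the data reproduced from above, not part of any benchmark API):

```python
# Overall pass rate per model, computed from (passed, total) counts
# taken from the results above.
results = {
    "Claude 4 Opus": (52, 83),
    "Mistral Small 3.1 24B": (37, 83),
}

for model, (passed, total) in results.items():
    rate = 100 * passed / total
    print(f"{model}: {rate:.1f}% ({passed}/{total})")
# → Claude 4 Opus: 62.7% (52/83)
# → Mistral Small 3.1 24B: 44.6% (37/83)
```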
Summary

| Category | Claude 4 Opus | Mistral Small 3.1 24B |
| --- | --- | --- |
| Annotation Understanding | 1 / 2 | 0 / 2 |
| CAPTCHA | 2 / 2 | 2 / 2 |
| Color Identification | 1 / 1 | 1 / 1 |
| Counting Nodes | 0 / 1 | 0 / 1 |
| Defect Detection | 10 / 15 | 8 / 15 |
| Document Understanding | 9 / 10 | 5 / 10 |
| Localization | 0 / 1 | 0 / 1 |
| OCR | 8 / 9 | 7 / 9 |
| Object Counting | 0 / 10 | 0 / 10 |
| Object Detection | 2 / 2 | 1 / 2 |
| Object Measurement | 1 / 2 | 0 / 2 |
| Object Understanding | 7 / 11 | 4 / 11 |
| Receipt Reading | 1 / 1 | 1 / 1 |
| Sign Understanding | 1 / 2 | 0 / 2 |
| Spatial Relations | 9 / 17 | 6 / 17 |
| Web Action Understanding | 4 / 4 | 4 / 4 |
Contribute a Prompt
Have an idea for a prompt? Open a pull request on the project repository!