Models | XQuAD EM↑ | XQuAD F1↑ | MLQA EM↑ | MLQA F1↑ |
---|---|---|---|---|
URA-LLaMa 70B | 0.01 ± 0.00 | 0.17 ± 0.00 | 0.01 ± 0.00 | 0.18 ± 0.00 |
URA-LLaMa 13B | 0.00 ± 0.00 | 0.09 ± 0.00 | 0.00 ± 0.00 | 0.10 ± 0.00 |
URA-LLaMa 7B | 0.00 ± 0.00 | 0.09 ± 0.00 | 0.00 ± 0.00 | 0.10 ± 0.00 |
LLaMa-2 13B | 0.00 ± 0.00 | 0.02 ± 0.00 | 0.00 ± 0.00 | 0.03 ± 0.00 |
LLaMa-2 7B | 0.00 ± 0.00 | 0.02 ± 0.00 | 0.00 ± 0.00 | 0.02 ± 0.00 |
Vietcuna 7B | 0.00 ± 0.00 | 0.06 ± 0.00 | 0.00 ± 0.00 | 0.05 ± 0.00 |
MixSUra 8x7B | 0.00 ± - | 0.11 ± - | 0.00 ± - | 0.12 ± - |
GPT-3.5 | 0.00 ± 0.00 | 0.19 ± 0.00 | 0.00 ± 0.00 | 0.20 ± 0.00 |
GPT-4 | 0.00 ± 0.00 | 0.24 ± 0.00 | 0.00 ± 0.00 | 0.25 ± 0.00 |
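For reference, the EM and F1 columns above follow the standard extractive-QA scoring convention: exact match is all-or-nothing string equality after normalization, while F1 measures token overlap between the predicted and gold answer spans. A minimal sketch (function names and the simple whitespace/lowercase normalization are illustrative, not the paper's exact evaluation code):

```python
from collections import Counter

def exact_match(prediction: str, reference: str) -> float:
    # 1.0 only when the normalized strings are identical
    return float(prediction.strip().lower() == reference.strip().lower())

def token_f1(prediction: str, reference: str) -> float:
    # SQuAD-style token-overlap F1 between predicted and gold spans
    pred_tokens = prediction.strip().lower().split()
    ref_tokens = reference.strip().lower().split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

This overlap-based F1 explains why models can score near-zero EM yet nonzero F1 in the table: verbose answers that contain the gold span earn partial token credit but fail the exact-string test.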