# BERTimbau finetuning - Portuguese Metaphors
This model is a fine-tuned version of [neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) on an unspecified Portuguese metaphor dataset. Its per-epoch results on the evaluation set are listed under Training results below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

More information needed

### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.4548 | 1.0 | 23 | 0.2657 | 0.0000 | 0.0000 | 0.0000 | 0.9197 |
| 0.2106 | 2.0 | 46 | 0.1400 | 0.2727 | 0.1250 | 0.1714 | 0.9510 |
| 0.1216 | 3.0 | 69 | 0.1064 | 0.8462 | 0.4583 | 0.5946 | 0.9687 |
| 0.0646 | 4.0 | 92 | 0.0923 | 0.9231 | 0.5000 | 0.6486 | 0.9714 |
| 0.0347 | 5.0 | 115 | 0.0930 | 0.9286 | 0.5417 | 0.6842 | 0.9741 |
| 0.0218 | 6.0 | 138 | 0.0821 | 0.8824 | 0.6250 | 0.7317 | 0.9796 |
| 0.0166 | 7.0 | 161 | 0.1217 | 0.8125 | 0.5417 | 0.6500 | 0.9741 |
| 0.0116 | 8.0 | 184 | 0.1249 | 0.8750 | 0.5833 | 0.7000 | 0.9769 |
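The F1 column is consistent with the precision and recall columns via F1 = 2PR/(P + R). As a quick sanity check (a standalone illustration, not the card's actual evaluation code, which for this metric set is typically seqeval), the epoch-6 row can be reproduced from hypothetical counts of 15 true positives, 2 false positives, and 9 false negatives:

```python
# Sanity check of the epoch-6 row. The tp/fp/fn counts are hypothetical
# values chosen to match the reported precision (0.8824) and recall (0.625);
# the card does not publish the underlying confusion counts.
tp, fp, fn = 15, 2, 9

precision = tp / (tp + fp)          # 15/17
recall = tp / (tp + fn)             # 15/24
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 4), round(recall, 4), round(f1, 4))
# → 0.8824 0.625 0.7317, matching the epoch-6 table row
```

Note also the epoch-1 row: accuracy is already 0.9197 while precision, recall, and F1 are all zero, which indicates the evaluation set is dominated by non-metaphor tokens. Accuracy alone is therefore not an informative metric for this task.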
### Base model

[neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased)
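The per-token precision/recall/F1/accuracy metric set suggests this fine-tune uses a token-classification head, tagging each token as metaphorical or literal. A minimal sketch of that setup, using a tiny randomly initialized BERT config (so it runs without downloading the checkpoint) and a hypothetical `O`/`B-MET`/`I-MET` tag set that the card does not document:

```python
import torch
from transformers import BertConfig, BertForTokenClassification

# Hypothetical tag set; the card does not publish the real label names.
labels = ["O", "B-MET", "I-MET"]

# Tiny random config so the sketch runs offline; the actual fine-tune
# starts from neuralmind/bert-base-portuguese-cased instead.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=len(labels),
)
model = BertForTokenClassification(config)

input_ids = torch.randint(0, config.vocab_size, (1, 8))  # one 8-token sequence
with torch.no_grad():
    logits = model(input_ids=input_ids).logits  # shape: (1, 8, num_labels)
predictions = logits.argmax(dim=-1)  # one label id per token
print(predictions.shape)  # → torch.Size([1, 8])
```

In practice one would load the published checkpoint with `AutoModelForTokenClassification.from_pretrained(...)` and align word-level metaphor labels to subword tokens (e.g. labeling only the first subword of each word) before training.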