| --admin_code_1 | (Optional) If you wish to train the network on a specific region |
# New model based on BERT embeddings
In recent years, the BERT architecture proposed by Google researchers has outperformed state-of-the-art methods on several NLP tasks (POS tagging, NER, classification). To check whether BERT embeddings improve the performance of our approach, we wrote a script that applies BERT to our data. Our previous model returned two values, each in [0,1]. With BERT, the task shifts to classification (softmax), where each class corresponds to a cell on the globe. We use the HEALPix hierarchical projection model; other projection models such as S2 Geometry (https://s2geometry.io/about/overview) could also be considered.
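To illustrate the idea of turning geolocation into classification, the sketch below maps a (latitude, longitude) point to a discrete cell index that plays the role of one softmax class. It uses a simplified latitude/longitude grid as a stand-in for HEALPix (in practice a library such as `healpy` would compute the actual HEALPix pixel); the function name, grid resolution, and example coordinates are illustrative assumptions, not part of the repository code.

```python
def latlon_to_cell(lat, lon, n_rings=8):
    """Map a (lat, lon) point to a coarse cell index on a simple
    latitude/longitude grid. Simplified stand-in for the HEALPix
    pixelisation: each cell index is one softmax class."""
    # Clamp/normalise inputs to valid ranges.
    lat = max(-90.0, min(90.0, lat))
    lon = lon % 360.0
    # n_rings latitude bands x (2 * n_rings) longitude bands.
    row = min(n_rings - 1, int((lat + 90.0) / 180.0 * n_rings))
    col = min(2 * n_rings - 1, int(lon / 360.0 * (2 * n_rings)))
    return row * 2 * n_rings + col

# Example: Paris (48.85 N, 2.35 E) falls in one of the 128 cells.
cell = latlon_to_cell(48.85, 2.35)
print(cell)  # → 96
```

A finer grid (larger `n_rings`) gives more classes and finer localisation, at the cost of fewer training examples per class, which is the same trade-off faced when choosing the HEALPix resolution.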
To train this model, run the `bert.py` script:

```
python3 bert.py <train_dataset> <test_dataset>
```
The train and test datasets are tabular files with two columns: sentence and label.
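As a sketch of the expected layout, the snippet below builds a tiny two-column dataset in memory. The sentences, label values, and CSV format are illustrative assumptions; only the two-column sentence/label structure comes from the description above.

```python
import csv
import io

# Hypothetical rows: a text sentence paired with the index of the
# globe cell (the softmax class) that serves as its label.
rows = [
    ("Heavy flooding reported near the river in Toulouse", 4211),
    ("Wildfire smoke visible from the coastal highway", 1087),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["sentence", "label"])  # header row
writer.writerows(rows)
print(buf.getvalue())
```

Writing the same content to `train.csv` and `test.csv` files would produce inputs in the shape the script expects, assuming a comma-separated format.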