Commit 37d5dd23 authored by Jacques Fize

debug table in README

parent c9c3c602
@@ -44,11 +44,12 @@ For Anaconda users

## Get pre-trained model

Pre-trained models are available:

| Geographic Area | Description                                                              | URL                                                                       |
|-----------------|--------------------------------------------------------------------------|---------------------------------------------------------------------------|
| FR              | Model trained on populated places and areas in France                    | https://projet.liris.cnrs.fr/hextgeo/files/trained_models/FR_MODEL_2.zip |
| GB              | Model trained on populated places and areas in England                   | https://projet.liris.cnrs.fr/hextgeo/files/trained_models/GB_MODEL_2.zip |
| US              | Model trained on populated places and areas in the United States of America | https://projet.liris.cnrs.fr/hextgeo/files/trained_models/US_MODEL_2.zip |
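To fetch and unpack a model, for example the French one, any standard download tool works; `wget` and `unzip` are shown here purely as an illustration:

```bash
# Download and unpack the French pre-trained model (illustrative commands)
wget https://projet.liris.cnrs.fr/hextgeo/files/trained_models/FR_MODEL_2.zip
unzip FR_MODEL_2.zip -d FR_MODEL_2
```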
## Load and use the model
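The body of this section is not part of this diff, so the project's actual loading API is not shown here. As a rough sketch only, assuming the unpacked archive contains a standard Keras model file (the file name below is hypothetical):

```python
# Illustrative sketch only -- assumes the unpacked archive contains a standard
# Keras model file. The file name is hypothetical; see the repository's
# "Load and use the model" section for the actual API.
from tensorflow.keras.models import load_model

model = load_model("FR_MODEL_2/FR_MODEL_2.h5")  # hypothetical path
model.summary()  # inspect the network architecture
```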
@@ -104,6 +105,7 @@ Use the following command to generate the datasets for training your model.

```bash
python3 generate_dataset.py <geonames_dataset> <wikipedia_dataset> <geonames_hierarchy_data>
```

| Parameter       | Description                                                                   |
|-----------------|-------------------------------------------------------------------------------|
| --cooc-sampling | Number of co-occurrences sampled for each place in the co-occurrence dataset |
@@ -111,6 +113,7 @@ Use the following command to generate the datasets for training your model.

| --adj-nside     | Healpix resolution within which places are considered adjacent               |
| --split-nside   | Size of the zones where the train/test split is done                         |
| --split-method  | [per_pair\|per_entity] Split each dataset by entity (a place cannot appear in both train and test) or by pair (a place can appear in both train and test) |
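For instance, a typical invocation might look like the following; all file names here are placeholders rather than files shipped with the repository, and the parameter values are examples only:

```bash
# Placeholder file names -- substitute your own Geonames and Wikipedia dumps
python3 generate_dataset.py FR.txt FR_cooccurrence.csv FR_hierarchy.txt \
    --cooc-sampling 3 --adj-nside 128 --split-nside 32 --split-method per_pair
```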
### If you're in a hurry
French Geonames data, French Wikipedia co-occurrence data, and their train/test splits can be found here: [https://projet.liris.cnrs.fr/hextgeo/files/](https://projet.liris.cnrs.fr/hextgeo/files/)