diff --git a/README.md b/README.md
index 56ff448af83e25055919dc39bea4d62fa667e9e8..f3d8f3fedebbdd25cda209786871cec010969da6 100644
--- a/README.md
+++ b/README.md
@@ -23,7 +23,7 @@ Workshop AVVision : Autonomous Vehicle Vision
 - [Acknowledgements](#acknowledgements)
 
 # Overview
-This repository contains materials from the paper
+This repository contains the materials presented in the paper
 [DriPE: A Dataset for Human Pose Estimation in Real-World Driving Settings](https://openaccess.thecvf.com/content/ICCV2021W/AVVision/papers/Guesdon_DriPE_A_Dataset_for_Human_Pose_Estimation_in_Real-World_Driving_ICCVW_2021_paper.pdf).
 We provide the link to download the DriPE [dataset](#dataset),
@@ -32,13 +32,13 @@ SBl, MSPN and RSN.
 Furthermore, we provide the code to evaluate HPE networks with
 [mAPK metric](#evaluation), our keypoint-centered metric.
 
 # Dataset
-DriPE dataset can be found [here](http://dionysos.univ-lyon2.fr/~ccrispim/DriPE/DriPE.zip). We provide the 10k images,
+The DriPE dataset can be downloaded [here](http://dionysos.univ-lyon2.fr/~ccrispim/DriPE/DriPE.zip). We provide 10k images,
 along with keypoint annotations, split as:
 * 6.4k for training
 * 1.3k for validation
 * 1.3k for testing
 
-Annotations follow the COCO annotation style, with 17 keypoints.
+The annotation files follow the COCO annotation style, with 17 keypoints.
 More information can be found [here](https://cocodataset.org/#format-data).
 
 ##### **DriPE image samples**
@@ -81,7 +81,7 @@ Paths can be absolute, relative to the script or relative to the respective json
 -h, --help\tdisplay this help message and exit
 ```
 
-We provide in this repo one annotation file and one prediction. To evaluate these predictions, run:
+We provide in this repo one annotation and one prediction file. To evaluate these predictions, run:
 ```
 python eval_mapk.py keypoints_out_SBL_autob_test-repo.json autob_coco_test.json
 ```
@@ -108,8 +108,7 @@ If you use this dataset or code in your research, please cite the paper:
 ```
 
 # Acknowledgments
-This work was supported by the Pack Ambition
-Recherche 2019 funding of the French AURA Region in
+This work was supported by the Pack Ambition Recherche 2019 funding of the French AURA Region in
 the context of the AutoBehave project.
 <div style="text-align:center">
 <img style="margin-right: 20px" src="assets/logo_liris.png" alt="LIRIS logo" height="75" width="160"/>
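
The README changes above state that the annotations follow the COCO keypoint style with 17 keypoints. As a minimal sketch of what that layout means, the snippet below parses one COCO-style person annotation, where `keypoints` is a flat list of `(x, y, visibility)` triplets. The field values here are illustrative placeholders, not taken from the DriPE annotation files.

```python
import json

# A minimal COCO-style keypoint annotation (placeholder values, for
# illustration only). Each annotated person carries 17 keypoints stored
# as a flat list of (x, y, v) triplets, where v is the visibility flag:
# 0 = not labeled, 1 = labeled but occluded, 2 = labeled and visible.
annotation = {
    "image_id": 1,
    "category_id": 1,
    "num_keypoints": 17,
    "keypoints": [0.0, 0.0, 0] * 17,  # 17 placeholder (x, y, v) triplets
}

# Round-trip through JSON, as the annotation would arrive from a file.
doc = json.loads(json.dumps(annotation))

# Regroup the flat list into per-keypoint (x, y, v) triplets.
kps = doc["keypoints"]
triplets = [kps[i:i + 3] for i in range(0, len(kps), 3)]
print(len(triplets))  # → 17
```

Real annotation files add fields such as `bbox` and `area`; see the COCO data-format page linked above for the full schema.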