# vis-pred
<h1 style="text-align:center">
DriPE: A Dataset for Human Pose Estimation in Real-World Driving Settings
</h1>
<div style="text-align:center">
<h3>
<a href="https://liris.cnrs.fr/page-membre/romain-guesdon">Romain Guesdon</a>,
<a href="https://liris.cnrs.fr/page-membre/carlos-crispim-junior">Carlos Crispim-Junior</a>,
<a href="https://liris.cnrs.fr/page-membre/laure-tougne">Laure Tougne</a>
<br>
<br>
ICCV: International Conference on Computer Vision 2021
<br>
Workshop AVVision : Autonomous Vehicle Vision
</h3>
</div>
# Table of contents
- [Overview](#overview)
- [Dataset](#dataset)
- [Networks](#networks)
- [Evaluation](#evaluation)
- [Citation](#citation)
- [Acknowledgments](#acknowledgments)
# Overview
This repository contains the materials presented in the paper
[DriPE: A Dataset for Human Pose Estimation in Real-World Driving Settings](https://openaccess.thecvf.com/content/ICCV2021W/AVVision/papers/Guesdon_DriPE_A_Dataset_for_Human_Pose_Estimation_in_Real-World_Driving_ICCVW_2021_paper.pdf).
We provide the link to download the DriPE [dataset](#dataset),
along with trained weights for the three [networks](#networks) presented in this paper:
SBl, MSPN and RSN.
Furthermore, we provide the code to evaluate HPE networks with the [mAPK metric](#evaluation), our keypoint-centered metric.
# Dataset
The DriPE dataset can be downloaded [here](http://dionysos.univ-lyon2.fr/~ccrispim/DriPE/DriPE.zip). We provide 10k images,
along with keypoint annotations, split as follows:
* 6.4k for training
* 1.3k for validation
* 1.3k for testing
The annotation files follow the COCO annotation style, with 17 keypoints.
More information can be found [here](https://cocodataset.org/#format-data).
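As an illustration (the values below are made up, not taken from the dataset), a COCO-style keypoint annotation stores each person's 17 keypoints as a flat `[x1, y1, v1, x2, y2, v2, ...]` list, where `v` is a visibility flag (0 = not labeled, 1 = labeled but occluded, 2 = labeled and visible). It can be unpacked into triplets like this:

```python
# Hypothetical minimal annotation in the COCO keypoint format.
annotation = {
    "image_id": 1,
    "category_id": 1,
    "num_keypoints": 2,
    # 15 unlabeled keypoints, then two labeled ones (e.g. left/right hip).
    "keypoints": [0, 0, 0] * 15 + [310, 220, 2, 305, 340, 1],
}

def unpack_keypoints(ann):
    """Turn the flat keypoints list into a list of (x, y, v) triplets."""
    kps = ann["keypoints"]
    return [(kps[i], kps[i + 1], kps[i + 2]) for i in range(0, len(kps), 3)]

triplets = unpack_keypoints(annotation)
visible = [t for t in triplets if t[2] > 0]
print(len(triplets), len(visible))  # -> 17 2
```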
##### **DriPE image samples**
![DriPE image samples](assets/dripe_sample.png)
# Networks
We used three architectures in our study:
* __SBl__: Simple Baselines for Human Pose Estimation and Tracking (Xiao 2018) [GitHub](https://github.com/microsoft/human-pose-estimation.pytorch)
* __MSPN__: Rethinking on Multi-Stage Networks for Human Pose Estimation (Li 2019) [GitHub](https://github.com/megvii-detection/MSPN)
* __RSN__: Learning Delicate Local Representations for Multi-Person Pose Estimation (Cai 2020) [GitHub](https://github.com/caiyuanhao1998/RSN)
For both training and inference, we used the code provided by the authors in the three linked repositories.
Weights of the trained models evaluated in our study can be found [here](http://dionysos.univ-lyon2.fr/~ccrispim/DriPE/models).
More details about the training can be found in our [paper](https://openaccess.thecvf.com/content/ICCV2021W/AVVision/papers/Guesdon_DriPE_A_Dataset_for_Human_Pose_Estimation_in_Real-World_Driving_ICCVW_2021_paper.pdf).
##### **HPE on the COCO 2017 validation set.**
AP OKS (%) | AP | AP<sup>50</sup> | AP<sup>75</sup> | AP<sup>L</sup> | AR | AR<sup>50</sup> | AR<sup>75</sup> | AR<sup>L</sup>
:---- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
SBl | 72 | 92 | 80 | 77 | 76 | 93 | 82 | 80
MSPN | __77__ | 94 | 85 | 82 | __80__ | 95 | 87 | 85
RSN | 76 | 94 | 84 | 81 | 79 | 94 | 85 | 84
##### **HPE on the DriPE test set.**
AP OKS (%) | AP | AP<sup>50</sup> | AP<sup>75</sup> | AP<sup>L</sup> | AR | AR<sup>50</sup> | AR<sup>75</sup> | AR<sup>L</sup>
:---- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
SBl | 75 | 99 | 91 | 75 | 81 | 99 | 94 | 81
MSPN | 81 | 99 | 97 | __81__ | 85 | 99 | 97 | __85__
RSN | 75 | 99 | 93 | 75 | 79 | 99 | 95 | 79
# Evaluation
Evaluation is performed using two metrics:
* __AP OKS__, the original metric from COCO dataset, which is already implemented in the [cocoapi](https://github.com/cocodataset/cocoapi)
and in the three network repositories
* __mAPK__, our new keypoint-centered metric. We provide a script in this repository to evaluate network predictions.
Evaluation with mAPK is performed by running the `eval_mapk.py` script.
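For context, the AP OKS metric above is built on the Object Keypoint Similarity score between a predicted and a ground-truth pose. A minimal sketch of the OKS computation, using the standard per-keypoint constants from the COCO keypoint evaluation (an illustration, not the evaluation code shipped with this repository):

```python
import math

# Per-keypoint sigmas from the COCO keypoint evaluation, in the 17-keypoint
# COCO order (nose, eyes, ears, shoulders, elbows, wrists, hips, knees, ankles).
COCO_SIGMAS = [s / 10.0 for s in
               [.26, .25, .25, .35, .35, .79, .79, .72, .72,
                .62, .62, 1.07, 1.07, .87, .87, .89, .89]]

def oks(pred, gt, vis, area):
    """Object Keypoint Similarity between one predicted and one ground-truth pose.

    pred, gt : lists of 17 (x, y) pairs; vis : 17 visibility flags (v > 0 counts);
    area     : ground-truth object segment area in pixels^2.
    """
    total, n = 0.0, 0
    for (px, py), (gx, gy), v, sigma in zip(pred, gt, vis, COCO_SIGMAS):
        if v <= 0:
            continue  # unlabeled keypoints are ignored
        d2 = (px - gx) ** 2 + (py - gy) ** 2
        kappa2 = (2 * sigma) ** 2
        total += math.exp(-d2 / (2 * area * kappa2))
        n += 1
    return total / n if n else 0.0

# A perfect prediction scores OKS = 1.0; any displacement lowers the score.
gt = [(10.0 * i, 5.0 * i) for i in range(17)]
print(oks(gt, gt, [2] * 17, area=1500.0))  # -> 1.0
```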
```
Script to evaluate prediction in COCO format using the mAPK metric.
Usage: python eval_mapk.py [json_prediction_path] [json_annotation_path]
Paths can be absolute, relative to the script, or relative to the respective json/gts or json/preds directory.
  -h, --help    display this help message and exit
```
We provide in this repo one annotation and one prediction file. To evaluate these predictions, run:
```
python eval_mapk.py keypoints_out_SBL_autob_test-repo.json autob_coco_test.json
```
Expected results are:
F1 score: 0.733
Metric | Head | Should. | Elbow | Wrist | Hip | Knee | Ankle | All | Mean | Std
:--- | :---: | :----: | :----: | :---: | :----: | :----: | :----: | :----: | :----: | :-----:
AP | 0.30 | 0.86 | 0.78 | 0.92 | 0.91 | 0.76 | 0.13 | 0.68 | 0.67 | 0.29
AR | 0.87 | 0.92 | 0.93 | 0.96 | 0.88 | 0.61 | 0.05 | 0.80 | 0.75 | 0.31
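As a sanity check (our assumption, not stated in the repository), the reported F1 score is consistent with the harmonic mean of the overall AP and AR columns from the table above:

```python
# Harmonic mean of the overall (All) AP and AR columns; the small difference
# from the reported 0.733 would come from rounding of the AP and AR inputs.
ap, ar = 0.68, 0.80
f1 = 2 * ap * ar / (ap + ar)
print(round(f1, 3))  # -> 0.735
```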
# Citation
If you use this dataset or code in your research, please send us an email with the following details and we will update our webpage with your results.
* Performance (%)
* Experimental Setup
* Paper details
The DriPE dataset is only to be used for scientific purposes. It must not be republished other than by the original authors. Scientific use includes processing the data and showing it in publications and presentations. If you use it, please cite:
```
@InProceedings{Guesdon_2021_ICCV,
author = {Guesdon, Romain and Crispim-Junior, Carlos and Tougne, Laure},
title = {DriPE: A Dataset for Human Pose Estimation in Real-World Driving Settings},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {October},
year = {2021},
pages = {2865-2874}
}
```
# Acknowledgments
This work was supported by the Pack Ambition Recherche 2019 funding of the French AURA Region in
the context of the AutoBehave project.
<div style="text-align:center">
<img style="margin-right: 20px" src="assets/logo_liris.png" alt="LIRIS logo" height="75" width="160"/>
<img style="margin-left: 20px" src="assets/logo_ra.png" alt="RA logo" height="60" width="262"/>
</div>