diff --git a/README.md b/README.md
index 9181da7d362e0f4af407a1cbd719b2d3921adef5..4d63dcd4edd96398139e0cda19b9f6672351e6be 100644
--- a/README.md
+++ b/README.md
@@ -2,10 +2,52 @@
 
 # GDRNPP for FruitBin
 
-This repository is a fork of the official GDRNPP repository TODO link
+This repository is a fork of the official [GDRNPP repository](https://github.com/shanice-l/gdrnpp_bop2022).
 
 
-TODO doc
+### Preprocessing of FruitBin dataset
+
+Training the GDRNPP model on a new dataset requires that dataset to be in the BOP format. Several scripts were created to convert the FruitBin dataset to BOP format.
+
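+For reference, per-image ground-truth poses in a BOP-format dataset are stored in `scene_gt.json` files. A minimal sketch of one entry (the values are illustrative, not taken from FruitBin):
+
+```json
+{
+  "0": [
+    {
+      "cam_R_m2c": [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0],
+      "cam_t_m2c": [0.0, 0.0, 500.0],
+      "obj_id": 1
+    }
+  ]
+}
+```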
+The first step is to resize the bounding boxes: the original boxes must be shrunk for the rest of the pipeline to work correctly. This is done by this [script](https://gitlab.liris.cnrs.fr/gduret/gdrnpp_bop2022/-/blob/main/preprocessing/resize_bbox.py?ref_type=heads). The paths to the input and output folders are hardcoded in the script and can easily be changed.
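+The shrinking step can be pictured as follows. This is a hypothetical sketch (the `factor` value and the `[x, y, w, h]` box convention are assumptions), not the logic of `resize_bbox.py` itself:
+
+```python
+# Hypothetical illustration of shrinking a bounding box around its center.
+# Assumes boxes are given as [x, y, w, h] in pixels; resize_bbox.py may differ.
+def shrink_bbox(bbox, factor=0.9):
+    x, y, w, h = bbox
+    new_w, new_h = w * factor, h * factor
+    # Keep the box centered while reducing its size.
+    return [x + (w - new_w) / 2, y + (h - new_h) / 2, new_w, new_h]
+
+print(shrink_bbox([10, 10, 100, 100], factor=0.5))  # [35.0, 35.0, 50.0, 50.0]
+```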
+
+Next, a [script](https://gitlab.liris.cnrs.fr/gduret/gdrnpp_bop2022/-/blob/main/preprocessing/preprocess_fruitbin.py?ref_type=heads) performs the main part of the preprocessing: it creates the necessary directories, copies the required files into them, and generates the ground-truth JSON files in the required format. The command to run the script:
+
+`python /gdrnpp_bop2022/preprocessing/preprocess_fruitbin.py --src_directory PATH_TO_SRC_DIRECTORY --dst_directory PATH_TO_DST_DIRECTORY --scenario SCENARIO`
+
+- `--src_directory`: the input directory containing the folders of all the fruits;
+- `--dst_directory`: the output directory;
+- `--scenario`: the data-splitting scenario from the Splitting folder. The basic splitting scenarios are:
+```
+_world_occ_07.txt, _world_occ_05.txt, _world_occ_03.txt, _world_occ_01.txt, _camera_occ_07.txt, _camera_occ_05.txt, _camera_occ_03.txt, _camera_occ_01.txt
+```
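+After preprocessing, the output directory is expected to follow the usual BOP layout. A sketch of the convention (the exact scene folders produced for FruitBin may differ):
+
+```
+fruitbin/
+├── models/                  # object meshes (.ply) and models_info.json
+└── test/
+    └── 000001/              # one scene
+        ├── rgb/             # color images
+        ├── depth/           # depth images
+        ├── mask_visib/      # visible-object masks
+        ├── scene_camera.json
+        ├── scene_gt.json
+        └── scene_gt_info.json
+```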
+Because of the specifics of the FruitBin dataset, YOLOX detections did not give good results, so ground-truth detections are used instead of the YOLOX output. A script generates a .json file with the ground truth in the required format. The command to run the script:
+
+`python /gdrnpp_bop2022/preprocessing/generate_gt.py`
+
+After creating the file, make sure that the main config file points to it correctly.
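+The detection files consumed by GDRNPP are typically JSON dictionaries keyed by `"scene_id/im_id"`. A sketch of one entry (the field values are illustrative):
+
+```json
+{
+  "1/0": [
+    {
+      "obj_id": 1,
+      "bbox_est": [120.0, 80.0, 64.0, 64.0],
+      "score": 1.0,
+      "time": 0.0
+    }
+  ]
+}
+```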
+
+The [generate_image_sets_file](https://gitlab.liris.cnrs.fr/gduret/gdrnpp_bop2022/-/blob/main/preprocessing/generate_image_sets_file.py?ref_type=heads) and [generate_test_targets_file](https://gitlab.liris.cnrs.fr/gduret/gdrnpp_bop2022/-/blob/main/preprocessing/generate_test_targets_file.py?ref_type=heads) scripts create two files required for testing:
+
+```
+python /gdrnpp_bop2022/preprocessing/generate_image_sets_file.py
+python /gdrnpp_bop2022/preprocessing/generate_test_targets_file.py
+```
+
+The paths to the input and output directories are hardcoded in these scripts as well. If necessary, the data-splitting scenario can also be changed there.
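+For reference, a BOP test-targets file is a JSON list of per-image evaluation targets. A minimal illustrative sketch (values are not taken from FruitBin):
+
+```json
+[
+  {"im_id": 0, "inst_count": 1, "obj_id": 1, "scene_id": 1},
+  {"im_id": 1, "inst_count": 1, "obj_id": 1, "scene_id": 1}
+]
+```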
+
+### Model evaluation
+The ADD metric is used to evaluate the accuracy of the models trained on the FruitBin dataset. The evaluation script computes the pose accuracy for a single fruit, so it has to be run separately for each fruit. The command:
+
+```
+python /gdrnpp_bop2022/core/gdrn_modeling/tools/fruitbin/eval_pose.py --path_data=/gdrnpp_bop2022/datasets/BOP_DATASETS/fruitbin/ --pred_path=/gdrnpp_bop2022/output/gdrn/fruitbin/convnext_a6_AugCosyAAEGray_BG05_mlL1_DMask_amodalClipBox_classAware_fruitbin/inference_$MODEL/fruitbin_test/convnext-a6-AugCosyAAEGray-BG05-mlL1-DMask-amodalClipBox-classAware-fruitbin-test-iter0_fruitbin-test.csv --class_name=apple2 --symmetry=True
+```
+
+- `--path_data`: path to the dataset;
+- `--pred_path`: path to the .csv file with the estimated poses, created when testing the model;
+- `--class_name`: the fruit for which the pose accuracy is evaluated;
+- `--symmetry`: a boolean indicating whether the fruit is symmetric. In the FruitBin dataset, only the banana and the pear are asymmetric.
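+For background, ADD averages the distances between corresponding model points transformed by the ground-truth and estimated poses; for symmetric objects the ADD-S variant uses the distance to the closest transformed point instead. A minimal NumPy sketch of the two metrics (not the code of `eval_pose.py`):
+
+```python
+import numpy as np
+
+def add_metric(pts, R_gt, t_gt, R_est, t_est):
+    # ADD: mean distance between corresponding transformed model points.
+    p_gt = pts @ R_gt.T + t_gt
+    p_est = pts @ R_est.T + t_est
+    return np.linalg.norm(p_gt - p_est, axis=1).mean()
+
+def adds_metric(pts, R_gt, t_gt, R_est, t_est):
+    # ADD-S: mean distance from each ground-truth point to its closest
+    # estimated point, which makes the metric invariant to symmetries.
+    p_gt = pts @ R_gt.T + t_gt
+    p_est = pts @ R_est.T + t_est
+    dists = np.linalg.norm(p_gt[:, None, :] - p_est[None, :, :], axis=2)
+    return dists.min(axis=1).mean()
+```
+
+A pose estimate is commonly counted as correct when the metric falls below 10% of the object's diameter.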
+
 
 
 # Original README of GDRNPP for BOP2022