In this project, we implement a new method for the space-time registration of a growing plant based on matching the plant at different geometric scales.
The following Python packages are needed to run the sample code:
- pyntcloud
- tqdm
- numpy
- pandas
- open3d
- networkx
- scipy
- matplotlib
- Dijkstar
- scikit-learn
- SpharaPy
- polyscope
- robust-laplacian
- joblib
- imageio
Install these Python packages with `pip install -r requirements.txt`.
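Once installed, a quick sanity check like the one below should run without errors. The module names are assumptions based on each package's usual import name, so adjust them if your environment differs.

```python
# Optional sanity check that the dependencies are importable.
# NOTE: the module names below are assumed from each package's usual
# import name (e.g. scikit-learn -> sklearn); adjust if needed.
import importlib

modules = [
    "pyntcloud", "tqdm", "numpy", "pandas", "open3d", "networkx",
    "scipy", "matplotlib", "dijkstar", "sklearn", "spharapy",
    "polyscope", "robust_laplacian", "joblib", "imageio",
]

for name in modules:
    try:
        importlib.import_module(name)
        print(f"[ok]      {name}")
    except ImportError as exc:
        print(f"[missing] {name}: {exc}")
```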
- Download the data from https://media.romi-project.eu/data/4d_plant_analysis_data.zip
- Unzip the compressed data into the `data/` folder.
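If you prefer to script this step, a minimal sketch is given below. It assumes the URL above, that it is run from the repository root, and that the archive unpacks into a top-level `data/` folder.

```python
# Optional: download and unpack the data set from a script instead of by hand.
# Assumes this is run from the repository root and that the archive unpacks
# into a top-level data/ folder.
import urllib.request
import zipfile

URL = "https://media.romi-project.eu/data/4d_plant_analysis_data.zip"
ARCHIVE = "4d_plant_analysis_data.zip"

urllib.request.urlretrieve(URL, ARCHIVE)   # download the zip archive
with zipfile.ZipFile(ARCHIVE) as zf:
    zf.extractall(".")                     # extract next to the repository code
```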
- Run the registration pipeline with `python3 run_registration_pipeline.py --type XXX --method YYY`.
  Replace `XXX` with the target type of plant: `arabidopsis`, `tomato`, or `maize`. If it is not explicitly specified, it defaults to `tomato`.
  Replace `YYY` with the method. We provide two choices: `local_icp` and `fm`, where `local_icp` is our method and `fm` is the functional map method used as the reference. If it is not explicitly specified, it defaults to `local_icp`.
- Attention: every time you rerun the registration process, please delete the existing `data/{type}/registration_result` directory first (a small helper sketch is given at the end of this list).
- The expected visualization results:
  - registration result of tomato: (figure)
  - registration result of maize: (figure)
- The running time and metric scores may not exactly match those reported in the paper, since the scores are computed on randomly sampled subsets, but they should be of the same order.
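For example, `python3 run_registration_pipeline.py --type maize --method fm` registers the maize sequence with the functional map reference method. Since a stale result directory has to be removed before every rerun, a helper along these lines can be used (a sketch, assuming the `data/{type}/registration_result` layout noted above):

```python
# Helper to clear a previous registration run before re-running the pipeline.
# The data/{type}/registration_result layout is taken from the note above.
import shutil
from pathlib import Path

def clear_registration_result(plant_type: str = "tomato") -> None:
    """Delete data/<plant_type>/registration_result if it exists."""
    result_dir = Path("data") / plant_type / "registration_result"
    if result_dir.exists():
        shutil.rmtree(result_dir)
        print(f"Removed {result_dir}")
    else:
        print(f"Nothing to remove at {result_dir}")

if __name__ == "__main__":
    clear_registration_result("tomato")
```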
- First run the registration framework to get the point-wise correspondences. Check `data/XXX/registration_result` to see whether the registration has finished.
- Run the interpolation pipeline with `python3 run_interpolation.py --type XXX`, with `XXX` the type of plant.
- The interpolated point clouds are shown frame by frame. You can save the images and generate a video from them (see the sketch at the end of this section).
- The expected result: (video)
  compared with the interpolation video produced by the functional map based method: (video)
  We can see that the interpolation produced by our method is smoother and less noisy.
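To turn the saved frames into a video, a sketch using `imageio` (already listed in the requirements) is given below. The `interpolation_frames/` directory and the `frame_*.png` naming are assumptions for illustration.

```python
# Sketch: assemble saved frames into a video with imageio.
# Assumes the frames were saved as frame_000.png, frame_001.png, ... in an
# interpolation_frames/ directory (both names are hypothetical).
# Writing MP4 requires the imageio-ffmpeg backend (pip install imageio-ffmpeg).
from pathlib import Path

import imageio

frames = sorted(Path("interpolation_frames").glob("frame_*.png"))

with imageio.get_writer("interpolation.mp4", fps=10) as writer:
    for path in frames:
        writer.append_data(imageio.imread(path))
```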



