Project: AlphaPose
Repository: https://gitee.com/mirrors/AlphaPose
# AlphaPose

AlphaPose is an accurate multi-person pose estimator, and the first open-source system to achieve 70+ mAP (75 mAP) on the COCO dataset and 80+ mAP (82.1 mAP) on the MPII dataset. To match poses that correspond to the same person across frames, we also provide an efficient online pose tracker called Pose Flow. It is the first open-source online pose tracker to achieve both 60+ mAP (66.5 mAP) and 50+ MOTA (58.3 MOTA) on the PoseTrack Challenge dataset.

AlphaPose supports both Linux and Windows!

Supported output formats:

- COCO 17 keypoints
- Halpe 26 keypoints + tracking
- Halpe 136 keypoints + tracking

## Results

### Pose Estimation

Results on COCO test-dev 2015:
Results on MPII full test set:
More results and models are available in docs/MODEL_ZOO.md.

### Pose Tracking

Please read trackers/README.md for details.

### CrowdPose

Please read docs/CrowdPose.md for details.

## Installation

Please check out docs/INSTALL.md.

## Model Zoo

Please check out docs/MODEL_ZOO.md.

## Quick Start
**Inference**:

```bash
./scripts/inference.sh ${CONFIG} ${CHECKPOINT} ${VIDEO_NAME} # ${OUTPUT_DIR}, optional
```

For the high-level API, please refer to

**Training**:

```bash
./scripts/train.sh ${CONFIG} ${EXP_ID}
```

**Validation**:

```bash
./scripts/validate.sh ${CONFIG} ${CHECKPOINT}
```

**Examples**:

Demo:

```bash
./scripts/inference.sh configs/coco/resnet/256x192_res50_lr1e-3_1x.yaml pretrained_models/fast_res50_256x192.pth ${VIDEO_NAME}
# or
python scripts/demo_inference.py --cfg configs/coco/resnet/256x192_res50_lr1e-3_1x.yaml --checkpoint pretrained_models/fast_res50_256x192.pth --indir examples/demo/
```

Train:

```bash
./scripts/train.sh ./configs/coco/resnet/256x192_res50_lr1e-3_1x.yaml exp_fastpose
```

For more detailed inference options and examples, please refer to GETTING_STARTED.md.

## Common issue & FAQ

Check out faq.md for frequently asked questions. If it cannot solve your problem, or if you find any bugs, don't hesitate to comment on GitHub or make a pull request!

## Contributors

AlphaPose is based on RMPE (ICCV'17), authored by Hao-Shu Fang, Shuqin Xie, Yu-Wing Tai and Cewu Lu; Cewu Lu is the corresponding author. It is currently maintained by Jiefeng Li*, Hao-shu Fang*, Haoyi Zhu, Yuliang Xiu and Chao Xu. The main contributors are listed in doc/contributors.md.

## TODO
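After running the inference demo, AlphaPose typically writes its detections to a JSON results file. The exact schema can vary by version, so the sketch below is illustrative rather than authoritative: it assumes one record per detected person, with an `image_id`, a person `score`, and a flat `keypoints` list of `(x, y, score)` triples (17 keypoints in the COCO format; 26 or 136 for Halpe). The `split_keypoints` helper is hypothetical, not part of AlphaPose.

```python
import json
from collections import defaultdict

# Hypothetical records shaped like AlphaPose's demo output; in practice
# you would read these from the results JSON file the demo produces.
sample = json.dumps([
    {"image_id": "demo1.jpg", "score": 2.9, "keypoints": [100.0, 50.0, 0.9] * 17},
    {"image_id": "demo1.jpg", "score": 1.1, "keypoints": [200.0, 80.0, 0.8] * 17},
])

def split_keypoints(flat):
    """Split a flat [x0, y0, s0, x1, y1, s1, ...] list into (x, y, score) triples."""
    return [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]

# Group detected people by image, highest-scoring person first.
by_image = defaultdict(list)
for person in json.loads(sample):
    by_image[person["image_id"]].append(person)
for people in by_image.values():
    people.sort(key=lambda p: p["score"], reverse=True)

best = by_image["demo1.jpg"][0]
keypoints = split_keypoints(best["keypoints"])
print(len(keypoints))   # 17 keypoints (COCO format)
print(keypoints[0])     # (100.0, 50.0, 0.9)
```

Grouping by `image_id` is useful because the results file interleaves people from all processed images; sorting by score makes it easy to keep only the most confident detections.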
We would really appreciate it if you could offer any help and become a contributor to AlphaPose.

## Citation

Please cite these papers in your publications if AlphaPose helps your research:

```
@inproceedings{fang2017rmpe,
  title={{RMPE}: Regional Multi-person Pose Estimation},
  author={Fang, Hao-Shu and Xie, Shuqin and Tai, Yu-Wing and Lu, Cewu},
  booktitle={ICCV},
  year={2017}
}

@article{li2018crowdpose,
  title={CrowdPose: Efficient Crowded Scenes Pose Estimation and A New Benchmark},
  author={Li, Jiefeng and Wang, Can and Zhu, Hao and Mao, Yihuan and Fang, Hao-Shu and Lu, Cewu},
  journal={arXiv preprint arXiv:1812.00324},
  year={2018}
}

@inproceedings{xiu2018poseflow,
  author={Xiu, Yuliang and Li, Jiefeng and Wang, Haoyu and Fang, Yinghong and Lu, Cewu},
  title={{Pose Flow}: Efficient Online Pose Tracking},
  booktitle={BMVC},
  year={2018}
}
```

## License

AlphaPose is freely available for non-commercial use, and may be redistributed under these conditions. For commercial queries, please drop an e-mail to mvig.alphapose[at]gmail[dot]com and cc lucewu[at]sjtu[dot]edu[dot]cn. We will send the detailed agreement to you.