Hi timtensor.
I apologize for the late reply.
First of all, this repository is not a Unity project for real-time execution. It only loads the static pose data output by tf-openpose and applies it to the 3D model.
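As a rough illustration of the "load static pose data" step (this is not the repository's actual code, and the file layout here is a hypothetical one, not tf-openpose's real output format), pre-recorded per-frame joint positions can be read back as plain arrays before being handed to the model:

```python
import json

def load_pose_frames(path):
    """Load per-frame joint positions from a JSON file.

    Assumed (hypothetical) layout:
    {"frames": [[[x, y, z], ... one entry per joint], ... one per frame]}
    """
    with open(path) as f:
        data = json.load(f)
    return data["frames"]

# Example: write a tiny two-frame, two-joint file and read it back.
sample = {"frames": [[[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
                     [[0.1, 0.0, 0.0], [0.1, 1.0, 0.0]]]}
with open("pose.json", "w") as f:
    json.dump(sample, f)

frames = load_pose_frames("pose.json")
print(len(frames))  # 2
```

Because the data is static, the whole sequence can be loaded once at startup and then played back frame by frame.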
Therefore, to run it live in the editor, you would need to bridge Unity and a Python execution environment by some means. After a little research, it seems to be possible with IronPython2.
To be honest, I have forgotten the details of this repository, but the interpolation of the pose data was a simple one, just linear interpolation. As for applying the pose to the model: in the IK case, the position of each IK target (feet, hands, and so on) is set by normalizing the direction of each bone and multiplying by that bone's length in the model. In the FK case, each bone is simply rotated, with linear interpolation applied between frames. I think that is all the information I can provide.
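The IK-target placement and linear interpolation described above can be sketched as follows. This is a minimal illustration, not the repository's code: joint names, positions, and bone lengths are made up, and in Unity the result would be fed to something like `Animator.SetIKPosition`.

```python
import math

def normalize(v):
    """Return the unit vector in the direction of v."""
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def ik_target_position(parent_pos, estimated_dir, model_bone_length):
    """Place an IK target: normalize the estimated bone direction,
    then scale it by the bone length of the avatar model."""
    d = normalize(estimated_dir)
    return [p + d[i] * model_bone_length for i, p in enumerate(parent_pos)]

def lerp_pose(pose_a, pose_b, t):
    """Linear interpolation between two poses (lists of 3D joint positions)."""
    return [[a + (b - a) * t for a, b in zip(ja, jb)]
            for ja, jb in zip(pose_a, pose_b)]

# Example: a knee-to-ankle bone whose estimated direction is (0, -2, 0),
# but whose length in the avatar model is 0.5 units.
ankle_target = ik_target_position([0.0, 1.0, 0.0], [0.0, -2.0, 0.0], 0.5)
print(ankle_target)  # [0.0, 0.5, 0.0]

# Halfway blend between two single-joint poses.
print(lerp_pose([[0.0, 0.0, 0.0]], [[2.0, 2.0, 2.0]], 0.5))  # [[1.0, 1.0, 1.0]]
```

The normalization step is what makes the estimated pose independent of the person's real limb lengths: only the direction of each bone is kept, and the avatar's own bone lengths determine where the targets end up.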
Hi Keel. I am looking for something similar and saw your post on Qiita. I could not quite tell whether you had managed to map the estimated 3D pose onto an avatar. I am following the GitHub code below; maybe you know it. Link below:
https://github.com/ArashHosseini/3d-pose-baseline
I personally have no idea how to run a game engine, so if you could provide detailed steps that would be really appreciated.
Please let me know how it goes for you.