Timeline for How can I import an AvatarCLIP animation into Unity?
Current License: CC BY-SA 4.0
13 events
| when | what | action | by | license | comment |
|---|---|---|---|---|---|
| Nov 22, 2023 at 7:13 | comment | converted from answer | xu sun | | I also encountered the same problem; how did you solve it? |
| Aug 16, 2022 at 18:20 | comment | added | DMGregory♦ | | I think those videos were very likely captured in a different environment than Unity. Some examples could also be a generated mesh/avatar animated with conventional skeletal animations from Mixamo, since the repo also discusses that pipeline. I have no doubt that the animation tech works, and could be implemented in a Unity plugin, but I don't see such a plugin with a cursory scan of that page — one may exist elsewhere, or you may need to port such a tool made for another environment to work within Unity. |
| Aug 16, 2022 at 17:57 | comment | added | EL_9 | | @DMGregory Thank you. I am just confused because at the top of the repo there seem to be working examples (custom avatars with custom animations). Are you saying that this is not currently feasible using the current state of the repository? |
| Aug 16, 2022 at 13:41 | answer | added | Pikalek | | timeline score: 1 |
| Aug 16, 2022 at 11:42 | comment | added | DMGregory♦ | | From the description, it sounds to me like these .npy files contain the parameters fed into the machine learning model to generate this motion. So it's not conventional skeletal keyframes like we'd usually use in a game engine. You'll either need to convert this animation to keyframes in a format Unity understands, like FBX, or implement this ML model in your Unity project, so you can feed it parameters and get out poses to apply via script at runtime. Both will take a deeper understanding of this ML model than most gamedevs have — can you contact other AI folks using this model? |
| Aug 16, 2022 at 11:26 | history | edited | Philipp | CC BY-SA 4.0 | deleted 11 characters in body |
| Aug 16, 2022 at 11:15 | history | edited | Philipp | CC BY-SA 4.0 | Clarified question based on comments |
| Aug 16, 2022 at 11:04 | comment | added | EL_9 | | @Philipp I understand. I was using the following repo and followed the steps in the "motion generation" part: github.com/hongfz16/AvatarCLIP |
| Aug 16, 2022 at 10:56 | comment | added | Philipp | | Well, a 2D video of a character performing a motion does not include the technical 3D data of the motion itself. I haven't encountered .npy files before. A web search says it has something to do with the Python library NumPy? How exactly did you "use some model" to "generate a motion"? What tools did you use? |
| Aug 16, 2022 at 10:48 | comment | added | EL_9 | | @Philipp You are most likely correct, as I am a complete beginner. I used some model to generate a motion, and as a result what I got was a .mp4 file with a video of said motion and a .npy file, if that helps. |
| Aug 16, 2022 at 10:27 | comment | added | Philipp | | MP4? Isn't that a format for video files? I didn't know those could contain animations for 3D models. Usually it's not a problem to apply animations from one humanoid model in Unity to another, provided that the avatars are set up correctly and that the animations can be imported as Unity animations. |
| S Aug 16, 2022 at 10:24 | review | First questions | | | Completed Aug 16, 2022 at 14:44 |
| S Aug 16, 2022 at 10:24 | history | asked | EL_9 | CC BY-SA 4.0 | |
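
The discussion above turns on what the generated .npy file actually contains. As a first diagnostic step, here is a minimal Python sketch for inspecting the array; the file name `motion.npy` and the example shapes are assumptions for illustration, not anything confirmed by the AvatarCLIP repo:

```python
import numpy as np

def describe_npy(path):
    """Load a .npy file and report its layout.

    A per-frame skeletal pose sequence would typically have a shape like
    (frames, joints, channels) or (frames, pose_dim); anything else suggests
    the file holds model parameters rather than keyframe data.
    """
    arr = np.load(path, allow_pickle=True)
    print("dtype:", arr.dtype)
    print("shape:", arr.shape)
    return arr

# Hypothetical usage, assuming the motion-generation step wrote motion.npy:
# poses = describe_npy("motion.npy")
```

If the array does turn out to be a per-frame pose sequence, it could in principle be re-exported as keyframes in a format Unity imports (e.g. FBX or BVH); if not, it is likely model input, as DMGregory suspects in the comments.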