Animaze

Zepondrax 2 Aug, 2024 @ 5:42am
3 Questions
Over the years I have used this on and off, and now I've come back to it.
So I am wondering a few things.

1. Didn't Animaze use to recognise hand or arm gestures via your webcam? I don't think it does any more; it only seems to recognise head and eye movements.

2. How come characters' mouths move when you're not moving your mouth? It looks a bit off. Is there a setting (free user) to reduce it so that doesn't happen?

3. You can't stream to any other platform like YouTube? It looks like I might have to use Bandicam to record the screen area and then upload to YouTube, I guess? Does this sound right?
Showing 1-6 of 6 comments
devops  [developer] 2 Aug, 2024 @ 6:08am 
Originally posted by Zepondrax:
Over the years I have used this on and off, and now I've come back to it.
So I am wondering a few things [snip].


1. Didn't Animaze use to recognise hand or arm gestures via your webcam? I don't think it does any more; it only seems to recognise head and eye movements.

Animaze supports a multitude of tracking libraries, each with its own capabilities.
Use the ones you like most.
Hands have been supported since the start via the Leap Motion, and new hand trackers have been added since.

The full list of supported motion trackers is here:
https://www.animaze.us/manual/appmanual/trackers

Trackers that move the arms and hands are:
- VMC (Virtual Motion Capture) receiver, which works with webcams (there are plenty of VMC senders, like XR Animator; most of them use Google's MediaPipe webcam-based hand tracking, so the differences between them in core tracking quality are not large)
- Leap Motion (requires a specialized sensor)
- Perception Neuron (requires a specialized mo-cap suit): full body, including fingers
- Sony mocopi (requires specialized sensors): full body with arms, though no fingers
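For anyone curious how the VMC route works under the hood: VMC is an OSC-over-UDP protocol in which the sender app streams bone transforms as OSC messages every frame. Below is a minimal pure-Python sketch (my own illustration, not Animaze's code) of how one such bone message is laid out on the wire; the `/VMC/Ext/Bone/Pos` address and its name-plus-position-plus-quaternion argument layout follow the public VMC protocol docs, and the port in the comment is the commonly used default.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, args) -> bytes:
    """Encode a single OSC message (big-endian, per the OSC 1.0 spec)."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# One VMC bone transform: bone name, position (x, y, z),
# rotation quaternion (x, y, z, w).
packet = osc_message("/VMC/Ext/Bone/Pos",
                     ["RightHand", 0.1, 1.2, 0.3, 0.0, 0.0, 0.0, 1.0])

# A real sender would push this over UDP to the receiver, e.g.:
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM) \
#         .sendto(packet, ("127.0.0.1", 39539))
```

A sender such as XR Animator emits a stream of packets like this per frame; the receiving app decodes them and drives the avatar's skeleton, which is why any webcam-based sender can feed any VMC-capable receiver.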

An internal implementation of Google MediaPipe Hands, which does not require an external VMC sender app and only needs webcam video input, will likely launch in a week or two.
The face-tracking module of Google MediaPipe is already testable on the next-version branch of Animaze here on Steam, if you want to give it a whirl ahead of the official launch.
https://steamhost.cn/steamcommunity_com/app/1364390/eventcomments/4352247346363143622/



2. How come characters' mouths move when you're not moving your mouth? It looks a bit off. Is there a setting (free user) to reduce it so that doesn't happen?

It is a trivial fix: you need to calibrate your motion sensors correctly and set your motion-tracking settings/sensitivity to match your real-world capture conditions.

A grainy video feed (for video tracking), or a microphone without a proper noise-gate setting (for audio tracking), can cause background noise or image grain to be misinterpreted as lip motion.
Set them correctly (using the in-app time filters or noise-gate filters) and unwanted lip motion will be a thing of the past :).
https://www.animaze.us/manual/calibration
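To make the noise-gate idea concrete, here's a toy sketch (my own illustration, not Animaze's actual implementation): split the incoming audio into short windows and mute any window whose RMS level falls below a threshold, so constant low-level hiss never reaches the lip-sync stage.

```python
import math

def noise_gate(samples, window=4, threshold=0.02):
    """Mute audio windows whose RMS level is below the threshold.

    Deliberately simplified: a real gate adds attack/release smoothing
    so it doesn't chatter open and closed at window boundaries.
    """
    out = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        out.extend(chunk if rms >= threshold else [0.0] * len(chunk))
    return out

hiss = [0.01, -0.012, 0.008, -0.01]   # background noise, below threshold
speech = [0.30, -0.25, 0.40, -0.35]   # actual speech, well above it
```

The in-app noise-gate setting plays the same role: tune the threshold so your room's noise floor sits below it and your voice sits above it.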




3. You can't stream to any other platform like YouTube? It looks like I might have to use Bandicam to record the screen area and then upload to YouTube, I guess? Does this sound right?

Oh no, nothing like that. Someone has gravely misled you if they told you that's the only way.
Almost any video platform can accept Animaze output via the virtual webcam (usable on the free tier, but limited to 30 fps; 60+ fps is for paid tiers):
https://www.animaze.us/manual/appmanual/virtualcam (that way it can work with any app that accepts webcam input).

For customers (paying users) there's the unlimited (60 fps) virtual webcam, plus Spout2 (https://www.animaze.us/manual/streaming-transparency/spout),
as well as dedicated window capture (which works with OBS, Streamlabs, or any other video-capture software such as Bandicam).
Last edited by devops; 2 Aug, 2024 @ 6:54am
Zepondrax 2 Aug, 2024 @ 6:14am 
So I can use this in conjunction with Youtube?

I purchased a skeleton-face girl with coins and she doesn't move her arms when I use my webcam, although her head moves, etc. I can, however, use keyboard shortcuts to have her move her arms, but that's it. I obviously need to do something else?

I also added a butterfly-wings prop on her back, and the avatar's arms keep going behind the butterfly wings for a few seconds, even though I shrunk the wings. Is this normal?
devops  [developer] 2 Aug, 2024 @ 6:26am 
Yes, you can use it with YouTube, and with Twitch and TikTok too: any platform or app that accepts webcam input.

The initial free coin allotment does let you stay on the free tier and unlock some avatars without spending a dollar, but the full feature set of Animaze (and the full roster of avatar art) is for people who decide to commit and buy a subscription/paid tier.
The free tier will only broadcast via the virtual webcam at 30 fps; to get 60 fps smoothness or more, you'll need a paid tier.

We do have a varied array of items, but it would have been too limiting to make them all small enough to rule out 3D clipping with every 3D animation, so not all of them can be freely combined with everything (without some clipping occurring in some cases).
If you place a large 3D prop on your avatar, you'll obviously need to be careful about how you place the avatar and which limb animations you trigger, so that they don't clip through that large object.

To use body tracking, you do need to activate and configure one of the body trackers in the tracker list, and ideally read about it in the app manual.

Assuming you want to stay on the free tier, you probably don't want to buy bespoke tracking hardware, so you'll want to use the Animaze free tier as a VMC receiver together with a free VMC sender app such as XR Animator.

Next week or the week after, we'll also launch an internal implementation of the Google MediaPipe hand tracker (the hand tracker used in most free webcam-based hand-tracking apps these days), so you'll be able to skip the VMC-protocol "detour" and the external VMC sender app if you want access to Google MediaPipe hand tracking.
Last edited by devops; 2 Aug, 2024 @ 6:50am
Zepondrax 2 Aug, 2024 @ 6:32am 
Thanks. I did try one of the software installs to see if it moves the arms, but it didn't. That's OK; I'm just one of those people who doesn't like to show their face on webcam, so I'm trying to figure out something better. Trying to come up with an avatar or something.

I see it can use audio from your webcam; I guess it can't convert it to a female voice. All good anyway, I will keep searching and see if I find anything good. Maybe I can just use it in conjunction with Bandicam or something.

Thanks

Originally posted by devops:
Yes, you can use it with YouTube, and with Twitch and TikTok too: any platform or app that accepts webcam input.

The initial free coin allotment does let you stay on the free tier and unlock some avatars without spending a dollar, but the full feature set of Animaze (and the full roster of avatar art) is for people who decide to commit and buy a subscription/paid tier.

We do have a varied array of items, but it would have been too limiting to make them all small enough to rule out clipping with every animation, so not all of them can be freely combined with everything (without some clipping occurring in some cases).
If you place a large prop on your avatar, you'll obviously need to be careful about how you place the avatar and which animations you trigger, so that they don't clip through that object.


To use body tracking, you do need to understand how to use and configure one of the body trackers in the tracker list, per the app manual.

Assuming you want to stay on the free tier, you'll probably want to use Animaze as a VMC receiver with a free VMC sender app such as XR Animator.

Next week or the week after, we'll also launch an internal implementation of the Google MediaPipe hand tracker, so you can skip the VMC protocol and the external VMC sender app if you want.
devops  [developer] 2 Aug, 2024 @ 6:35am 
Animaze focuses on generating the visuals, not the audio (we have a few voice effects, but pretty simple stuff).
Audio output from Animaze, when used, does not go through the Animaze virtual webcam but through an entirely separate virtual audio driver; the virtual webcam is strictly video.

For complex voice filters, you may want to try a voice-focused app like Voicemod, which Animaze can work alongside.
Last edited by devops; 2 Aug, 2024 @ 6:51am
Zepondrax 2 Aug, 2024 @ 8:45pm 
Thanks :)

Originally posted by devops:
Animaze focuses on generating the visuals, not the audio (we have a few voice effects, but pretty simple stuff).
Audio output from Animaze, when used, does not go through the Animaze virtual webcam but through an entirely separate virtual audio driver; the virtual webcam is strictly video.

For complex voice filters, you may want to try a voice-focused app like Voicemod, which Animaze can work alongside.