MetaHuman Creator - an Insane Level of Competition...


Comments

  • drzap said:

    frankiev888_28b3c399af said:

    I've been doing this video stuff for a while (it's not making me smart, just old), and my first impression was "hmm, nice paint job," which goes to what I think Daz is working on: materials. How materials get applied has always been a sore spot, since everyone does it differently. Think about it: the sample file already runs on 4.26, so if you rip the materials, what would a Genesis model look like with a MetaHuman material?

    The "next best thing" is a selling point, so I look on such improvements the same way I'd look at someone selling me a used car. Except for the hair, which is always useful, I'll stick with Genesis, thank you very much. :D

    Final opinion: except for the fact that it is officially supported by Epic, what makes MetaHuman better?

    Oh, let me count the ways:

    1.  A FIRST-class, VFX-quality facial rig. A rig like that is worth many thousands of dollars out in the wild.
    2.  A real body rig with FK/IK and corrective shapes.
    3.  Support from industry heavyweights using industry-standard workflows.
    4.  Industry-standard strand-based hair options. If you're a creator, you can create and simulate your own grooms using your choice of multiple software packages.
    5.  More photorealistic than any other figure on the market.

    If you are a traditional Daz Studio user, these things may mean nothing to you.  For you, MetaHumans may not be better.  But for those who have been struggling to bring Daz figures out of Studio and into the wide world of animation, these features are a bucket list from heaven.


     

    Well, I'm not a traditional Daz user, and while those are impressive specs, I've never had issues harvesting what I need from the Genesis framework. I am a traditional artist who wants to make my own stuff, not have it handed to me on a silver platter. The gem for me here, though, is the materials: as far as looks go, if I can convert the materials for use with Genesis, then Genesis can be made to look just as good in UE4.

    Everything is in the paint job, not in how the geometry is shaped.

    Granted, if I were making a movie or animated film that would be different, but for my needs G3-8 offers the base framework that I can build my stuff on.

    Once again, the benefit is still that it has the full support of Epic and, as such, is considered gospel.

    If someone else finds the assets of use, that's fine, but a Daz killer... I don't think so.

    For me, though, I don't need a Ferrari when a pickup truck does the job of helping a friend move on the weekend. :D

    All Daz has to do is provide a studio-quality material.

  • nonesuch00 Posts: 18,142

    notiusweb said:

    nonesuch00 said:

    notiusweb said:

    MH will probably be subscription-based; that seems to be how Quixel rolls. "Free for use" in Unreal does not mean the creation service is free. If it were, they would have made a big deal about it.

    Indeed, the Sample Human project is free, but the MH Creator part will probably cost money, either by subscription or per character. With that, Daz has a lot of fertile ground for business.

    If Epic were to make this a complete, full-fledged, free-for-all creator, then all bets are off. But they are not going to do that, as that is not the Fortnite way: free to play, but not free to creatively design.

    No, MetaHuman and its editing capability are free to use in the Epic UE engines. Epic also bought Quixel, and the Quixel assets are also free to use in the Epic UE engines. It's only outside the Epic UE engines that any payment applies. So you can make and publish your own UE games, or make your animations and still renders in Epic UE, and pay nothing (unless and until your income from said games exceeds a certain amount of money as stated by Epic).

    We have to parse this carefully - you start off by saying "No", but then everything else you wrote is technically in agreement with everything I said. You yourself only detail that it would be "free for use" in Unreal, which is what I had said, because the CGChannel article had that quote in its article.

    If you tweak the texture in the MH Sample project, technically you are editing a MetaHuman in UE. But this has nothing to do with MH Creator. Editing in UE happens after the character has been exported from MH Creator. So I am not talking about a MetaHuman once it is in UE; I am talking about use of the actual MetaHuman Creator web-based application itself. Now, with this web-based application, there has been no statement as to its price; likewise, it has never been stated that it will be free either. The only things that have involved a definitive "free" are (1) the free Sample Project, and (2) the statement by CGChannel that MetaHumans, once created, are "free for use" in Unreal Engine.

    So at best you can hope it will be free. But based on the release information and the quote in the CGChannel article, this is not at all definitive. I mean, I hope it will be free too... but I am not counting on it.

    I forgot to say: actually, the best you can hope for is for it not to be free, because that would mean you are doing quite well financially from the use of the UE engine and its free assets. That's the practical truth of the matter for 99%+ of the population that uses Epic's assets. If you don't make money, they couldn't care less what you're doing, really, as they can't squeeze blood from a stone. Practically speaking, they'd squander a fortune trying to recoup 'profits' from people that don't have the money to pay for those 'lost profits'.

    They are just making reasonable stipulations to keep their free assets from being resold by middlemen. Almost every digital product, sold or given away free, has a similar stipulation.

  • wolf359 Posts: 3,828
    edited February 2021

    Someone over on the Reallusion forums posted confirmation from a YouTube comment by Epic Games:


    There will be no paid subscription to use the web-based MH figure generator in UE4.

    IMHO, this is all about getting more professional 3DCG/VFX/film companies into the Epic ecosystem, where they will eventually spend money on assets, perhaps even on paid 24-hour tech support services, as Autodesk offers.

    Remember, you start paying for UE after your income gets over a certain high dollar amount. This is the target demographic of the MH figure platform: companies with high revenues.

    This is not about stealing away existing portrait/pinup still artists from the Poser/Daz demographic, who will likely never become paying users of UE4.


    The obvious major beneficiary will be Autodesk Maya, as it and MOBU are still the primary animation software used by game dev companies and Hollywood filmmakers.

    And I don't see Epic trying to turn UE into Maya or MOBU.

    Reallusion should add support for the MH figures to their new online ActorCore motion library system immediately, as they have already done for Maya, Blender, C4D, etc.

    But frankly, anyone else who had hopes of getting the big-dollar Hollywood/VFX companies to adopt their figure platform should probably abandon those hopes.

  • Apparently you can import it into 3DXchange too if you set the LOD to 1.

    Bassline, an iClone 7 user, posted this on their forum.

    I'd hate to do all those bones manually in the Expression Editor though.

    I did do G3 & G8 so I can use entirely bone-based facial animations; it took a while.

  • Well... this has pretty much curb-stomped what remains of Poser, and as for Daz, well... hard to say.

    I do think a lot of Daz's business practices over the years will come back to bite them in the behind when this finally drops, if they don't get it together, because this could be a game changer for those who dabble in realism. My biggest concern is that a lot of games will become very same-y looking asset flips where humans are concerned.

    Either way, Daz better work on making DS 5.0 the best it can be, because hoooo'boy.

  • It is interesting, because while Daz is free, a character model and its use in an Unreal distributed game are not.  RL's CC application is not free, but exported content is "free for use" in an Unreal Engine distributed product.

    Epic has been very specific in its court dealings with Apple over Fortnite. They never leave language vague, so as not to leave too much room for interpretation. But their marketed terminology "free for use" is very vague, because it may apply only to the use of exported content in a distributed game, separate from the cost of using the web-based application in the first place.

    I think at the end of the day, if the released MetaHuman Creator has, say, two download modes, (1) as a UE4 asset, with no restrictions, and (2) as a generic asset, alerting you to restrictions (i.e., "subscription required"), and you can download model after model after model into Unreal, then we'd have definitive confirmation that it is not pay-for when it comes to UE4. But as of this point, the "free for use" terminology is not in fact a demonstrable confirmation, and just because they have a certain arrangement for Quixel Megascans does not mean it is de facto the case for MetaHuman Creator. Epic could easily clarify by saying "no creation or download limitations, and no subscription required for Unreal, etc.", but they have not.

    I do hope everything is free when it comes down to it, but they have not stated that decisively and specifically. Rather, it is being assumed, and while that assumption could turn out to be correct, it is still an assumption as of this point and could turn out to be incorrect.

  • Ellessarr Posts: 1,395

    notiusweb said:

    It is interesting, because while Daz is free, a character model and its use in an Unreal distributed game are not.  RL's CC application is not free, but exported content is "free for use" in an Unreal Engine distributed product.

    Epic has been very specific in its court dealings with Apple over Fortnite. They never leave language vague, so as not to leave too much room for interpretation. But their marketed terminology "free for use" is very vague, because it may apply only to the use of exported content in a distributed game, separate from the cost of using the web-based application in the first place.

    I think at the end of the day, if the released MetaHuman Creator has, say, two download modes, (1) as a UE4 asset, with no restrictions, and (2) as a generic asset, alerting you to restrictions (i.e., "subscription required"), and you can download model after model after model into Unreal, then we'd have definitive confirmation that it is not pay-for when it comes to UE4. But as of this point, the "free for use" terminology is not in fact a demonstrable confirmation, and just because they have a certain arrangement for Quixel Megascans does not mean it is de facto the case for MetaHuman Creator. Epic could easily clarify by saying "no creation or download limitations, and no subscription required for Unreal, etc.", but they have not.

    I do hope everything is free when it comes down to it, but they have not stated that decisively and specifically. Rather, it is being assumed, and while that assumption could turn out to be correct, it is still an assumption as of this point and could turn out to be incorrect.

     

    Man, I feel you're being too worried about it. It's easy to see how things could go: they'll link your MetaHuman account to your Unreal Engine account, so when you log in on the MetaHuman site you're also logged in to Unreal. Then, when you're finished editing the character, you'll have a free option to download it in Unreal's native format and use it. It's not a complicated process. Now, if you want to export to other formats, you'll probably have to pay for it. That's the most basic way; they could come up with others, but it's not hard to make it free to edit and then use in Unreal, as they claim. If you're worried because you want to use it outside Unreal, that's another story.

  • Ellessarr said:

    Now, if you want to export to other formats, you'll probably have to pay for it.

    It's not a matter of worry, but rather a point of accuracy. For example, where are you specifically getting the idea that you will have to pay for another format? Has Epic in fact stated this anywhere, or is this an assumption being made, maybe based on Quixel Megascans? Because unless they said "you will pay for any format other than the Unreal format", this is itself an assumption built on top of another assumption, that the Unreal creator is free. It's a technical point, not a point of worry.

  • Ellessarr Posts: 1,395
    edited February 2021

    notiusweb said:

    Ellessarr said:

    Now, if you want to export to other formats, you'll probably have to pay for it.

    It's not a matter of worry, but rather a point of accuracy. For example, where are you specifically getting the idea that you will have to pay for another format? Has Epic in fact stated this anywhere, or is this an assumption being made, maybe based on Quixel Megascans? Because unless they said "you will pay for any format other than the Unreal format", this is itself an assumption built on top of another assumption, that the Unreal creator is free. It's a technical point, not a point of worry.

    I'm not exactly making any statement or anything like that; I'm just giving an example of how they could do it, because I've already seen others run this sort of promotion, where you get one format for free but have to pay for other formats. It's just to show that it's possible and normal to have a setup where, say, A users can have it for free while B, C, or D must pay. We don't know how they're really going to do it, but they made it clear that you pay nothing to create and use any character as long as you're using it inside Unreal.

     

    Also, someone mentioned animation. I would not be too sure that Unreal doesn't also want to make its own animation tool inside Unreal; actually, they already have one, and most of their cinematic trailers are made entirely inside Unreal. Their native animation tool is just not as well developed or as strong as Maya or MOBU, but they are improving. In the same way, they are building a 3D modeling tool inside Unreal so you don't need things like Maya or Blender.

     

    I mean, it's clear that Epic is trying to create the "ultimate everything tool," where you can do all the steps (modeling, texturing, rigging, maybe retopology if they don't kill off the retopology process, animating/posing, rendering) and create animations, movies, or games all inside Unreal, without having to use a single program outside it. The best fits would be things like MetaHuman, which is still an Unreal asset in the end.

     

    They really are aiming big. I would not be surprised if in the future they also add a full sound editor to create sound effects and music from the ground up, in the same way they already have a tool to create VFX.

  • metaside Posts: 177
    edited February 2021

    Ellessarr said:

    They really are aiming big. I would not be surprised if in the future they also add a full sound editor to create sound effects and music from the ground up, in the same way they already have a tool to create VFX.

    Very interesting thought. As someone using a lot of audio software, such as Bitwig, Live, Reason, RX and so on, I think it would be a very big step to do something similarly easy to use in the Unreal environment. DAWs are complex, and the implementation of new features in well-known DAWs such as Ableton Live often takes a lot of time. It's definitely possible, though. I haven't been diving into it, but I think you can already do a lot of synth stuff in UE; it's just not implemented as user-friendly as in modern DAWs. I think a comparatively simple solution would be to take a single powerful idea, such as the graphical representation of frequencies as seen in Photosounder or RX, and do something similar for the quick creation of SFX. I would not expect UE to be on the same user-friendly level as DAWs for the production of full tracks anytime soon, but I would be happily surprised if they managed to do it somehow...
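    As a concrete illustration of that "frequencies as an image" idea, here's a minimal Python sketch (file names and parameters are made up, and it assumes a mono WAV) of the loop tools like RX are built around: transform the audio into a spectrogram, edit it like a bitmap, and resynthesize.

        # Spectrogram-as-image editing, the core idea behind
        # Photosounder / RX style tools. Illustrative sketch only.
        import numpy as np
        from scipy.io import wavfile
        from scipy.signal import stft, istft

        rate, audio = wavfile.read("sfx.wav")        # hypothetical mono input
        audio = audio.astype(np.float32)

        # Forward STFT: rows are frequency bins, columns are time frames,
        # i.e. exactly the "image" shown in a spectral display.
        freqs, times, Z = stft(audio, fs=rate, nperseg=2048)

        # "Paint" on the image: erase everything above 4 kHz to muffle
        # the sound, the kind of edit RX does with a brush.
        Z[freqs > 4000.0, :] = 0

        # Inverse STFT back to a waveform; phase was kept, so no
        # fancy phase reconstruction is needed.
        _, edited = istft(Z, fs=rate, nperseg=2048)
        wavfile.write("sfx_edited.wav", rate, edited.astype(np.float32))

    The hard part of a DAW is everything around this loop (UI, realtime processing, routing), but the core transform itself is cheap enough that an engine like UE could certainly host it.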

  • Robinson Posts: 751

    Mazh said:

    Daz3D has already made some smart moves: they're positioned in the Unreal market, they have avatar stuff like Tafi, and if they're able to make their clothing work with something like MetaHuman it would be a big deal. (I don't know if there's a "standard human rig" for UE.)

    I've been saying this for a while, actually. Daz's content store is better than anything else out there right now. Leveraging it into engines like Unreal, etc. is the way to go. That implies improving integration, including things like Live Link into Unreal.

  • More details...

  • SnowSultan Posts: 3,596

    Are there any PC webcams that can be used for this sort of motion capture? I really find it hard to believe that we have to use an $800+ Apple phone to do this when, theoretically, a very good webcam should be able to harness the power of a PC and do it for less.

  • Kevin Sanderson Posts: 1,643
    edited February 2021

    You can use a used iPhone X for $300 or so with no contracts, etc. Just use it for the app and the camera. I don't know of any webcams that are depth cams, which the Apple stuff requires.

    https://www.extremetech.com/mobile/255771-apple-iphone-x-truedepth-camera-works

    You can see the infrared beam shooting out of the phone in some of Solomon's videos.

  • Faux2D Posts: 452

    SnowSultan said:

    Are there any PC webcams that can be used for this sort of motion capture? I really find it hard to believe that we have to use an $800+ Apple phone to do this when, theoretically, a very good webcam should be able to harness the power of a PC and do it for less.

    Keep in mind that they did not use an iPhone to get the mocap data for their presentation video.

    Any webcam will do for facial motion capture. Though Apple's depth camera is a nifty gadget, it's not at all necessary for capturing mocap data. The issue is the software used to capture said data, which costs thousands of dollars and requires a lot of experience to get good results. There are, of course, free alternatives, but the mocap data you get from those wouldn't pass any quality test.
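    For anyone curious what the free end of that spectrum looks like, here's a minimal sketch using Google's MediaPipe Face Mesh, a free RGB-only tracker, to pull raw facial landmarks from a plain webcam. No depth camera is involved; the landmark indices and the crude mouth metric are just illustrative. Turning points like these into clean animation curves is the part the commercial packages charge for.

        # Raw facial landmarks from a plain RGB webcam via MediaPipe.
        import cv2
        import mediapipe as mp

        face_mesh = mp.solutions.face_mesh.FaceMesh(
            static_image_mode=False,  # video mode: track across frames
            max_num_faces=1)

        cap = cv2.VideoCapture(0)     # default webcam
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV delivers BGR.
            results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_face_landmarks:
                lm = results.multi_face_landmarks[0].landmark
                # e.g. landmarks 13/14 are the inner upper/lower lip;
                # their vertical gap is a crude jaw-open signal.
                print(f"mouth open: {abs(lm[13].y - lm[14].y):.3f}")
            cv2.imshow("face", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                break
        cap.release()
        cv2.destroyAllWindows()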

  • What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

  • Kevin Sanderson said:

    What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

    Oh, there's a whole bunch of them I have plugins for in my Unreal library.

    All bloody pricey.

    A quick look reveals:

    Dynamixyz Live Link: The Dynamixyz Live Link Plugin provides realtime facial animation for characters using the Unreal Live Link protocol combined with Dynamixyz's Real-Time Facial Motion Capture software.

    Rokoko Studio Live: With Rokoko Studio Live you can sync the motion capture data from your Smartsuit Pro, Smartgloves and Face Capture (or use them separately) and stream it directly to your custom character in Unreal Engine.

    Live Client: Using best-in-class, markerless facial motion capture software, Live Client for Unreal, alongside Faceware Live Server, animates and tracks facial movement from any video source to CG characters, in real time directly inside Unreal Engine.

  • They are expensive! Thank you, Wendy.

  • Faux2D Posts: 452

    Kevin Sanderson said:

    What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

    Most likely Dynamixyz, as it's the best commercially available software out there for facial mocap. It also has an $8k price tag.

    I made a tutorial on how to get mocap data from Dynamixyz into Daz:

    You get all the necessary files after you purchase Genesis Face Controls.

  • Now I need $8k for their software, as I purchased Genesis Face Controls when it hit the store. Thanks, Faux2D.

  • Faux2D Posts: 452

    Kevin Sanderson said:

    Now I need $8k for their software, as I purchased Genesis Face Controls when it hit the store. Thanks, Faux2D.

    Glad I could help. You also need Maya, which is only $5k.

    Joking aside, there's a 30-day trial for Dynamixyz's Performer in case you want to try it out (and for Maya as well). I also attached a documentation PDF which goes into more depth about the process of getting mocap data from Performer. There's a learning curve, so there's also a time investment involved. Depending on the user, the results can vary a whole lot:

    Attachment: Dynamixyz to Daz DOCUMENTATION.pdf (94K)
  • drzap Posts: 795

    Kevin Sanderson said:

    What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

     

    For the MetaHuman promo videos, I am 99% certain that Epic used Cubic Motion's Persona motion capture system: https://beforesandafters.com/2019/04/16/tech-preview-cubic-motions-persona/ . This is the same system used for their Siren tech demo, and I believe they even used the same performer as with Siren. Keep in mind, this system is orders of magnitude more sophisticated than the iPhone's ARKit (which should never be used in a pro production) and comes with an epic price.

  • drzap Posts: 795
    edited February 2021

    WendyLuvsCatz said:

    Kevin Sanderson said:

    What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

    Oh, there's a whole bunch of them I have plugins for in my Unreal library.

    All bloody pricey.

    A quick look reveals:

    Dynamixyz Live Link: The Dynamixyz Live Link Plugin provides realtime facial animation for characters using the Unreal Live Link protocol combined with Dynamixyz's Real-Time Facial Motion Capture software.

    Rokoko Studio Live: With Rokoko Studio Live you can sync the motion capture data from your Smartsuit Pro, Smartgloves and Face Capture (or use them separately) and stream it directly to your custom character in Unreal Engine.

    Live Client: Using best-in-class, markerless facial motion capture software, Live Client for Unreal, alongside Faceware Live Server, animates and tracks facial movement from any video source to CG characters, in real time directly inside Unreal Engine.

    Those solutions are actually cheap compared to what was used in the MetaHuman demo. You would probably have a heart attack if I told you the price.

  • drzap Posts: 795

    Faux2D said:

    Kevin Sanderson said:

    What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

    Most likely Dynamixyz, as it's the best commercially available software out there for facial mocap. It also has an $8k price tag.

    I made a tutorial on how to get mocap data from Dynamixyz into Daz:

    You get all the necessary files after you purchase Genesis Face Controls.

    Cubic Motion's Persona was used. $25k for hardware, $10k setup per character, and $10k/month for the software license.

  • Faux2D Posts: 452

    drzap said:

    Faux2D said:

    Kevin Sanderson said:

    What did they use? Solomon used an iPhone and Live Link for his YouTube tute.

    Most likely Dynamixyz, as it's the best commercially available software out there for facial mocap. It also has an $8k price tag.

    I made a tutorial on how to get mocap data from Dynamixyz into Daz:

    You get all the necessary files after you purchase Genesis Face Controls.

    Cubic Motion's Persona was used. $25k for hardware, $10k setup per character, and $10k/month for the software license.

    Senua's the one that really impressed me; it's on par with Thanos' face from Endgame. Dynamixyz can achieve great results, but it starts falling behind when dealing with dialogue. I assume that is why they kept cutting away whenever the Hulk spoke.

  • I'm a Dynamixyz user and the technical sales rep for them for the US/LATAM, so there are a couple of things I would like to chime in on.

    Yes, the normal price for studios is $8k per year, but there is a version for indie content producers that has basically everything except batch processing (for doing large amounts of tracking/retargeting at once) for $2.2k per year. Yes, that's still expensive if you are doing this just for fun.

    I'm very curious what you are referring to, Faux2D, in regard to it falling behind when dealing with dialogue, as this is something neither I nor my customers have experienced. Could you elaborate?

  • Thanks, Dr. Zap, Bryan Steagall, and Faux2D for the pricing and additional info! All too rich for me, but there's at least some hope for hobbyists as the tech continues to improve. My daydream as a little kid of making my own characters really "come to life" may eventually happen in CG.

  • Faux2D Posts: 452

    Bryan Steagall said:

    I'm a Dynamixyz user and the technical sales rep for them for the US/LATAM, so there are a couple of things I would like to chime in on.

    Yes, the normal price for studios is $8k per year, but there is a version for indie content producers that has basically everything except batch processing (for doing large amounts of tracking/retargeting at once) for $2.2k per year. Yes, that's still expensive if you are doing this just for fun.

    I'm very curious what you are referring to, Faux2D, in regard to it falling behind when dealing with dialogue, as this is something neither I nor my customers have experienced. Could you elaborate?

    I'm talking about lip-synching specifically, as in the quality of it. I took the movie Avengers: Endgame as a benchmark and compared Thanos (animated using Masquerade 2.0) and the Hulk (animated using Dynamixyz). Long story short, if you mute the audio in a scene you can almost lip-read what Thanos is saying, but not so much with the Hulk.

    In my own experience with Performer Single-View, when dealing with dialogue I had to disable the lower-face poses containing FACS shapes and leave just the phoneme poses. I know there are several factors that can cause this overlap between FACS and phoneme poses, like the quality of the video, the expressivity of the face itself, not using Multi-View, and the rig itself. This is why I looked into other CGI characters to see where the problem lies and discovered there's a whole lip-synching rabbit hole.

    I am splitting hairs here a little. As far as indie creators go, Dynamixyz is the best option out there that I know of.
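    To illustrate why that overlap misbehaves (a toy sketch; the shape names and delta values below are invented): in a standard additive blendshape evaluation, a FACS jaw-open shape and an "ah" phoneme shape that both move the lower lip simply sum their deltas and over-drive the mouth, which is why disabling one of the two pose sets helps.

        # Toy additive blendshape rig: result = neutral + sum(w_i * delta_i)
        import numpy as np

        neutral = np.array([0.0])               # stand-in for a lower-lip vertex height
        deltas = {
            "FACS_jawOpen": np.array([-1.0]),   # full weight drops the lip 1 unit
            "phoneme_AA":   np.array([-0.8]),   # the "ah" viseme also drops it
        }

        def evaluate(weights):
            out = neutral.copy()
            for name, w in weights.items():
                out += w * deltas[name]
            return out

        # The tracker fires both poses at once during dialogue:
        print(evaluate({"FACS_jawOpen": 0.7, "phoneme_AA": 0.7}))  # [-1.26], past the rig's max
        # Leaving only the phoneme poses, as described above:
        print(evaluate({"phoneme_AA": 0.7}))                       # [-0.56], back in range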

  • drzap Posts: 795
    edited February 2021

    I have had very mixed results with Dynamixyz (single view). For one, the tracker didn't always do the best job, so I was forced to manually correct it, and since humans are inconsistent creatures, my results varied with each tracking session. In the end, the retargeting is only as successful as the rig quality. I ended up getting a Snappers rig. IMO, the best thing that will come out of MetaHumans is that developers like Faceware and Dynamixyz will be able to democratize high-end facial mocap by concentrating their efforts on a single, known rig logic and topology. Instead of training AI for diverse and disparate rigs, they can focus a product on the MetaHuman rig. Anyone who can deal with the limitations of this arrangement should be rewarded with very responsive and accurate retargeting of their performance in realtime. Both Dynamixyz and Faceware have announced they are working on such a product for MH. I am really anticipating this. A high-end face rig and realistic face painting mean nothing without an equally proficient way of animating them. So far I have been lukewarm on Dynamixyz and Faceware, so much so that I have been saving my pennies for Cubic Motion's Persona. I hope that the next generation of tools changes my mind.

  • @Faux2d

    Ahh, now I understand... and yes, I have experienced that, but I'm not sure if it is necessarily a fault of the software itself or more an issue of the team doing the tracking/retargeting and animation polish (and the actor) not having that in mind. Some years ago I worked with Rochester Institute of Technology's School for the Deaf, who purchased Performer, and in five days I got a whole education in some (and I stress, some) of what is involved when animating with this in mind, which most of us take for granted. Now I think I need to test this!

    I build my own animation rigs in MotionBuilder (like what you've done for Daz) and lately have been experimenting with the FACS-based blendshapes. I'm finding that I agree with you that they are not enough at times, so I'm going back and adding some of the base blendshapes to see if I get better results.

    @drzap

    Were you using an HMC or a static/web camera? And were you calibrating for the specific lens? I've found when consulting with customers that calibration, and inconsistencies in the way they annotate the shapes for tracking, are what cause most of their issues. Yes, we humans are very inconsistent creatures! Which is why I stress two things to customers: first, mark up the faces of the actors when you are learning, so you always hit the same landmarks; and second, always have the same team do the annotation/tracking, so that your results are as consistent as possible.

    I wholeheartedly agree that the end result depends on the quality of your rig. I don't consider myself a modeler/rigger, so for my own stuff I depend on Daz characters, which I then adapt to my rig in MotionBuilder. That's OK, but it is a joy when you work with a good rig (Snappers rigs are some of the best around, but I can't afford them personally).

    I'm also looking forward to their solution (they haven't shared with me yet exactly what they are doing).
