The Best and Worst of the Unreal Connector

Hello Unreal Connector Users!

You are truly a great community, and we love making features and tools that empower you to make some of the coolest imagery and experiences out there. We are working on the next set of features and improvements, and nothing means more to us than your input.

If you could reply with an answer to any or all of these questions, it would help us get you what you want.

  • What would be the killer feature or improvement that you would love to see?
  • What has been your biggest frustration with the Unreal Connector?
  • What is your favorite feature or ability the Unreal Connector gives you?
  • What workflows does the Unreal Connector enable you to have?

Thank you for helping us create the best experience for you and supercharge your workflows.

Hi there. Thanks a lot for all the work you guys are doing!

I’m probably not the best candidate to give feedback since I’m just starting out, but every little helps.

I have struggled a lot with the UE5 connector; my biggest frustration is the documentation and figuring out how to use it. What does a workflow with it look like? People need detailed, step-by-step instructions for actual workflows, from start to finish.

Let me give you an example: I have an existing scene in Maya that I would like to bring into Unreal. I exported an FBX and imported it with Omniverse Create, so that I end up with a USD stage. Now I can open that stage in Maya and Unreal and create a live session. So far everything works. However, the confusion starts when I want to save in Unreal: when I save, I have to save a “Level”, not the USD. And sometimes when I reopen that Level, the whole USD scene is somehow invisible. It is in the outliner, but I cannot see it in the Unreal viewport. Whatever I try, the Level I saved is corrupted.
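
For what it’s worth, here is the kind of sanity check I end up doing when the scene turns invisible. This is just a minimal sketch using the pxr Python bindings (for example from a Script Editor), and the stage path is a placeholder, not a real location:

```python
from pxr import Usd, UsdGeom

# Placeholder path; point this at the stage the live session uses.
stage = Usd.Stage.Open("omniverse://localhost/Projects/my_scene.usd")

# TraverseAll also visits deactivated prims, which Traverse would skip.
for prim in stage.TraverseAll():
    if not prim.IsActive():
        print(f"deactivated: {prim.GetPath()}")
    if prim.IsA(UsdGeom.Imageable):
        visibility = UsdGeom.Imageable(prim).ComputeVisibility()
        if visibility == UsdGeom.Tokens.invisible:
            print(f"invisible:   {prim.GetPath()}")
```

If nothing is reported as deactivated or invisible, then at least the problem is probably not in the USD layers themselves but somewhere in how the saved Level references the stage.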

Let’s assume we have an existing USD file and all I want is to bring it into Unreal: what are the steps I need to take? Let’s assume there are no lights in the USD and we want to use the Unreal Light Mixer to create all the “basic” lighting. I open the USD file, add the lights, then save the “Level”.
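
As a side question, when I want to rule out the Level-saving step entirely, would it be reasonable to add a throwaway light directly to the USD instead of going through the Light Mixer? A rough sketch of what I mean with the pxr bindings (the path and values are arbitrary):

```python
from pxr import Usd, UsdLux

# Placeholder path; replace with the actual Nucleus location of the stage.
stage = Usd.Stage.Open("omniverse://localhost/Projects/my_scene.usd")

# A simple distant light stored in the USD itself, independent of anything
# created in the Unreal Level.
light = UsdLux.DistantLight.Define(stage, "/World/Lights/KeyLight")
light.CreateIntensityAttr(3000.0)
light.CreateAngleAttr(0.53)

stage.GetRootLayer().Save()
```

That way, if the lit stage looks right in Create and Maya but not in the reopened Level, the issue is narrowed down to the Level save.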

Another issue is broken materials. I have tried for two days; the materials are severely broken, and even the “parent” material is full of errors. See: Materials synced from Substance Painter to Unreal Engine are just bright white - #5 by 3dpxl
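
In case it helps whoever looks at this, here is the little script I use to confirm the texture paths actually made it into the USD before blaming the connector. It is only a sketch with the pxr Python bindings, and the stage path is a placeholder:

```python
from pxr import Sdf, Usd, UsdShade

# Placeholder path; point this at the stage exported from Substance Painter.
stage = Usd.Stage.Open("omniverse://localhost/Projects/painted_asset.usd")

for prim in stage.Traverse():
    material, _ = UsdShade.MaterialBindingAPI(prim).ComputeBoundMaterial()
    if not material:
        continue
    print(f"{prim.GetPath()} -> {material.GetPath()}")
    # Walk the material's shader network and list every asset-valued input,
    # which is where the texture file paths live.
    for shader_prim in Usd.PrimRange(material.GetPrim()):
        shader = UsdShade.Shader(shader_prim)
        if not shader:
            continue
        for shader_input in shader.GetInputs():
            value = shader_input.GetAttr().Get()
            if isinstance(value, Sdf.AssetPath):
                print(f"  {shader_prim.GetName()}.{shader_input.GetBaseName()} = {value.path}")
```

That at least tells me whether the texture paths are missing from the USD or whether they are there and something breaks on the Unreal side.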

I’m more than happy to go through these things with someone so I can explain everything in detail. Once I understand everything and it all works as intended, I would be glad to create educational videos.

Hello @3dpxl. Thank you for your feedback! This is something we definitely want to work on. The frustrations you mentioned are a combination of lack of workflow documentation, missing features and what sounds like a couple of bugs. I would very much like to go over all the frustrations you are seeing as well as what you’d like to see or your ideal workflows. I will send you a direct message so we can continue the conversation.

Biggest killer feature:
Blender to Unreal Engine: a way to choose my own custom material instance from my own master material, or to automatically swap the shaders on an incoming mesh for custom Unreal ones of my choosing. I would like to be able to send a USD from Blender containing a mesh with one or more materials applied, each material consisting of a simple Principled BSDF plus textures for color, metal and so on, for a PBR workflow. When it gets into Unreal Engine, I would love to be able to configure which master material is used to create a material instance, have those textures automatically applied to it as parameters, and have those settings stored so that if I made changes elsewhere in the USD and re-imported, I would not need to set up the Unreal shaders again.
At the moment it is difficult to customize and automate the materials workflow non-destructively, and for things like packing multiple material textures into RGB channels, I do not currently see a fast way to automate that with USD. Maybe there is a way, but I do not see it yet.
The problem I have is that I still have to go in and modify the materials to make them more efficient, or to customize them by replacing Blender shaders with Unreal Engine ones such as Substrate shaders. It would be amazing if I could swap out Blender shaders for Unreal ones of my choosing by configuring this once on the first import, and then have Omniverse remember it so that if I changed another part of the model and re-imported, I would not need to do all that work again. This kind of non-destructive workflow could be pretty amazing for level building, as long as I did not have to convert the USD to a static mesh, because that breaks the interoperability between Blender and Unreal and makes the workflow destructive, just like FBX or glTF (except Omniverse is a little bit faster and a lot more organized). I have sketched the kind of mapping step I mean below.
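
This is not an existing connector feature, just my own rough sketch against the pxr Python bindings. It assumes Blender exports a UsdPreviewSurface network, and MASTER_PARAM_MAP, the parameter names and the stage path are placeholders for my own master material:

```python
from pxr import Sdf, Usd, UsdShade

# Placeholder mapping from UsdPreviewSurface inputs to the texture parameter
# names of my own Unreal master material.
MASTER_PARAM_MAP = {
    "diffuseColor": "BaseColorTex",
    "metallic": "MetallicTex",
    "roughness": "RoughnessTex",
    "normal": "NormalTex",
}

def collect_texture_parameters(material):
    """Return {master_material_parameter: texture_file_path} for one USD material."""
    params = {}
    for prim in Usd.PrimRange(material.GetPrim()):
        shader = UsdShade.Shader(prim)
        if not shader or shader.GetIdAttr().Get() != "UsdPreviewSurface":
            continue
        for usd_name, unreal_name in MASTER_PARAM_MAP.items():
            surface_input = shader.GetInput(usd_name)
            if not surface_input or not surface_input.HasConnectedSource():
                continue
            source, _, _ = surface_input.GetConnectedSource()
            texture = UsdShade.Shader(source.GetPrim())
            if not texture or texture.GetIdAttr().Get() != "UsdUVTexture":
                continue
            file_input = texture.GetInput("file")
            asset = file_input.Get() if file_input else None
            if isinstance(asset, Sdf.AssetPath):
                params[unreal_name] = asset.resolvedPath or asset.path
    return params

# Placeholder path; in my head, a connector feature would run something like
# this on import and feed the result into a chosen master material instance.
stage = Usd.Stage.Open("omniverse://localhost/Projects/blender_export.usd")
for prim in stage.Traverse():
    if prim.IsA(UsdShade.Material):
        print(prim.GetPath(), collect_texture_parameters(UsdShade.Material(prim)))
```

The part the connector would need to add is remembering that mapping per mesh, so a re-import just refreshes the textures on the existing material instance instead of making me set everything up again.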

Biggest frustration:
Lack of documentation and tutorials showing the workflow and all settings in depth. The workflow from install to first import could do with someone with “User Journey” in their job title taking a look at it. I have come across several weird blocking issues with settings that are really easy fixes once you know how, but knowing how is not covered in the user interface or documentation at the moment. There are some great video tutorials on the Omniverse YouTube channel, though, which helped a lot.

What workflows does Omniverse enable me to have:
I can now create prefabs in Blender and then very quickly import sections of a large scene into the Unreal Engine level, piece by piece. This is fantastic for prototyping and for testing how PBR materials will look in Unreal in terms of look, feel and color management. Quickly building scenes in Blender and sending the entire thing over to Unreal non-destructively with a few clicks of the mouse is absolutely fantastic.

You should take a look at the new 204.0 release! Lots of improvements including the ability to reparent to custom materials.

Hi, I’m using the Audio2Face app to successfully animate a face in Unreal Engine 5.3 over the LiveLink interface. I would also like to add Audio2Gesture character animation to animate the rest of the body over LiveLink in Unreal Engine. The Unreal Engine side seems to have what it might need on its end of the LiveLink interface, but I do not see a way to send the data over LiveLink from the Omniverse LiveLink node the way I did for the Audio2Face information.

I don’t see any videos demonstrating this LiveLink Audio2Gesture capability, only a bunch that do it by recording the animations from Audio2Gesture and then using the exported file in Unreal Engine.

Is there a way to do this? And if not, could you add it and make a video showing how to hook it up?

Thanks!

I’d like to see a connector for Linux. Omniverse and Unreal are both working pretty well on Linux (Ubuntu).