Discussion in 'Announcements' started by LeonhardP, Mar 23, 2020.
Thanks for the question. I'm afraid we have no current plans to support glTF directly.
Any news for addressables supporting DOTS ?
Hi @Ivan_br ,
1. Is there a plan to be able to export audio and video in different formats than what is currently available? If so, any ETA?
We have recently added an encoding API to the Recorder which lets you plug in custom encoding systems. Here is an example of a custom encoder: https://github.com/keijiro/FFmpegRecorder
2. Any plan for exporting video using different user-defined compression rates?
We are also working on integrating ProRes encoding for the Windows and Mac Editor. We have a first version working and will start Apple's certification process. We are targeting a mid-year release.
3. If I manually manipulate (in the editor) game objects and their properties (animation triggers, object transform, etc.) during a video recording, is there a way to also record/export these manual steps and re-import it back into Unity so it would be possible to further tweak these in-editor manipulations and export another video based on new tweaks to these in-editor manipulations?
The Recorder has an Animation Clip Recorder which lets you record GameObject components as animation clips. You can add both a Video Recorder and an Animation Clip Recorder to record both at the same time.
4. Can you expand more on the chromakeying properties that will be coming? Will the chromakey algorithm be open to code customization? Any ETA on this?
We plan to first provide a simple chroma keying sample to showcase the compositor framework (currently based on https://github.com/keijiro/ProcAmp). This should land in the coming weeks on 2020.1, HDRP 9 Preview.
We are indeed working on some new tools for procedural content creation and placement. These tools are at a very early phase, so I can’t give you any particulars yet, as timelines and even the nature of the tools might change between now and the time we release.
Everything sounds very exciting. Combining a few comments in this thread, I have a question: I created a prefab placer which also supports, e.g., spawning houses along a spline, so you can make an alley, or basically even a village, by dragging spline nodes. I don't see why the new Unity spline wouldn't be extensible for that, so this will definitely be awesome. The one thing that's not working yet is aligning the terrain height to the bottom of the house, i.e. parts of the house currently float in the air depending on the terrain, and one has to adjust the terrain manually. Basically, it would be awesome if it were possible to paint over a terrain so that its height is aligned to the bottom of the lowest GameObject (any GameObject, or rather its vertices). The GameObject might as well be a mesh road. Is terrain levelling and adjustment like this under consideration?
No plan currently to have built-in support for that in the video player.
I have created a thread to propose the current alternatives and discuss more deeply about this:
As I mentioned, I can't say tooooo much on this topic, but I can reasonably say the following: when we asked users what their top use cases were for procedural content creation and placement, terrain creation and its relationship to in-game objects were top-of-the-list. So your request rhymes quite well with the use cases that we are pursuing (you can see some suggestions of this in the in-development section under Environment in the presentation). That said, I want to repeat my prior caveat. Much of this work is early and we don't know exactly when it will be available for you, which is why we didn't include it in the presentation itself.
Is there any ETA for floating point determinism across architectures and platforms? Any description of the current state, challenges and … would be great
Please make this procedural placement available for Terrain Tools 3+ so that it can be integrated into already-started projects in URP/HDRP 19.3 without DOTS.
Hybrid V2 uses GPU-persistent batches and only updates changed entities. Anything not modified in a frame (even dynamic entities) won't be touched. This allows the system to treat dynamic and static objects the same and leads to a massive performance gain compared to Hybrid V1.
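For context, the same change-tracking idea is exposed to user code through chunk change filters in the Entities API. A minimal sketch (`WithChangeFilter` is a real Entities API; the system itself is hypothetical and is not the Hybrid V2 internals):

```csharp
// Hypothetical system: only processes chunks whose LocalToWorld was
// written since this system last ran, skipping untouched entities
// much like Hybrid V2 skips unmodified batches.
using Unity.Entities;
using Unity.Transforms;

public class ChangedTransformsSystem : SystemBase
{
    protected override void OnUpdate()
    {
        Entities
            .WithChangeFilter<LocalToWorld>() // chunk-level version check
            .ForEach((in LocalToWorld localToWorld) =>
            {
                // runs only for entities in chunks where LocalToWorld changed
            })
            .ScheduleParallel();
    }
}
```

Note that the filter operates per chunk, not per entity, which is why grouping rarely-changing entities together pays off.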
Early on in the presentation you mentioned in the section on Reliability & Performance that the team builds internal projects to prove out technology. You have the FPS Sample Game and Mega City in the Released column, the DOTS Sample in Prerelease, and a new open world shooter project In Development.
Let's view these projects in terms of intrinsic Reliability and Performance. First, Editor version support:
FPS Sample Game: Unity 2018.3.8f1
Last Update: 2019/03/11
Mega City: Unity 2019.1.0 Beta 7
Last Update: Unknown; assume "no update since release"
DOTS Sample: Unity 2019.3.6f1
Last Update: Today
The point that I'm trying to make here is that the history of these projects is pretty spotty in terms of how they seem to be handled internally. Unity Editor version compatibility doesn't seem to be high on the list of priorities and pointing to Mega City as an instructive example of some of the features at this point isn't terribly helpful. Many of the projects were built with technology in "Preview" status and were used to dogfood the tech. While that is a good and helpful approach for your internal teams, a much more comprehensive one - one that would help people outside of Unity - would be an approach wherein you ensure that the projects work with future versions of the Unity Editor and continue to update them for compatibility.
Is there any plan on continuing to support these projects as example implementations as Unity evolves? Or are these going to continue to be "build, release, and forget" projects?
(This approach is forehead-smackingly obvious for any Asset Store publisher, where this is the de facto approach to releasing such a project...)
Thanks for the note and feedback. Long notes are fine! It is definitely good to let us know what you are thinking and seeing. Sorry that the roadmap didn't meet your expectations. The specific points of feedback you have are good, and at the risk of sounding like I am deflecting, we are working on a solution for a lot of them.
I can tell you that internally there is a big focus on core product quality. We are also taking a more holistic view of what quality means. For instance, we have expanded the notion of high user pain bugs (which we have blogged about before) to include workflow/QoL issues - which are strictly not a bug by definition, but from a user perspective totally are. That is an example of one of the ways we are making things better.
Something else to add a bit of context around is that Unity is transforming. Take your point about physics engines, for instance: Unity ships with a verified physics implementation (based on PhysX) that has been in the product for many years, and many projects ship successfully on it. As we evolve the engine forward we are looking at new implementations: the two in preview, Unity Physics and Havok Physics, are made to leverage the future DOTS-based architecture of Unity. That area is still a construction site, so providing users with a clear picture while so many systems are in preview is a challenge. It is a challenge we have accepted, though, and we are going to tighten up how we expose preview packages to our users, to ensure that there is safety in their usage and that the expectation is clear between user and Unity.
Point taken on some of the tools; we are believers in the right tool for the right job. We also hear from a lot of users who want to perform some activities in Unity without switching context. In many cases, though (like the examples), there are amazing tools out there for some tasks.
Also, we are investing in production validation for our work, as referenced in the roadmap talk. This is an internal team that builds vertical slices representing the sort of quality, scale, and content that our customers will be building. We'll have different teams working on different areas: one focused on a high-end networked shooter, one on an async mobile network game, etc. The goal is that these teams work in partnership with devs internally and battle-test (in the moment) the features being developed. You are going to start to see the results of this with our 2020 and 2021 tech streams.
Overall, we will continue to listen carefully to the feedback and iterate on our plans. Again, thanks for providing your perspective.
@ans_unity will the experimental drops include the OOP style scripting or is it still strictly DOTS? Btw Thanks for working on a visual scripting tool can't wait to test it out some more!
Stabilisation and bug fixing remain the priority, but we have many other improvements planned. The most immediate and likely ones are shape drawing to ease the modeling of more complex shapes, an improved Cut tool to remove unwanted geometry, a UV toolbar for better texture workflows, and CSG (constructive solid geometry), again to help with complex shapes. Keep in mind that these are in very early development and things may change. What is definite is our commitment to improving ProBuilder and making it a go-to tool for the creation of 3D assets.
Have a great day!
Let me rephrase: where can we put in ideas that Unity can consider or completely ignore, depending on the architecture you develop? It's just that something great is coming from you towards us, and if there is a valuable, easy-to-add idea or feature that goes in the same direction along the path, it might be worth mentioning. I don't hesitate to make a forum topic, but a more general place for ideas without discussion could be beneficial.
Hi! I know this kinda deviates from the topic, but: is there any hope of seeing some official support for Chisel? The work being done there is amazing and the product looks like it's on its way to being a world-class CSG tool. Given that Sander works for Unity, I really hope something can be done to make it an official Unity tool or, at least, to endorse and support the project in some way. Thanks!
Which physics engine is the preferred one for extremely fast-moving GameObjects? I'd like to create a raytracing example project combined with a fast physics engine, so my project of choice is a pinball machine with all the lights and glitter. However, there are various options with regard to physics, as you mention. With the current PhysX implementation, the ball still goes through the flippers when it's rolling fast. Is Havok better? Or is Unity Physics better for that purpose?
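Not an official answer, but for context: with the built-in PhysX integration, tunneling through fast flippers can often be reduced with continuous collision detection before switching engines. A minimal sketch using the standard `Rigidbody` API (the timestep tweak is a project-wide judgment call, shown only as a comment):

```csharp
// Sketch: enable continuous collision detection so a fast pinball is
// swept against colliders instead of teleporting past them each step.
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class PinballSetup : MonoBehaviour
{
    void Awake()
    {
        var rb = GetComponent<Rigidbody>();
        // ContinuousDynamic sweeps against both static and moving colliders
        // (flippers are moving colliders, so Continuous alone is not enough).
        rb.collisionDetectionMode = CollisionDetectionMode.ContinuousDynamic;
        rb.interpolation = RigidbodyInterpolation.Interpolate;
        // Optionally shrink the fixed timestep for fast flipper motion:
        // Time.fixedDeltaTime = 1f / 120f;
    }
}
```

If the flippers are animated by setting transforms directly rather than via rigidbody motion, sweeps can still miss them; driving them with kinematic rigidbodies and `MovePosition`/`MoveRotation` tends to behave better.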
While I'll refrain from exact release dates, there are a few core pieces we're working on. You can expect an upcoming patch release with several bug fixes (thank you to those who have actively reported issues). And we're working towards a public release of the current feature set. That will involve some additional refactoring to ensure architecture and configuration for both AR & VR functionality is aligned. To stay updated on the latest, feel free to join us here.
I just want to add something physics-related. I struggle with rigidbodies/controllers on moving platforms at relatively high speeds; camera and movement are jittery.
There is also a problem when a rigidbody is rotating around one or more axes (Link).
Could you make these issues frictionless to implement for users of Unity?
When do you expect to add common functions to the Unity.Physics package such as AddForce and AddTorque? How high on the list of priorities is that compared to some of the other things mentioned in the presentation? Thanks!
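In the meantime, an AddForce-style impulse can be applied by hand to the preview Unity.Physics components. A hedged sketch (`PhysicsVelocity` and `PhysicsMass` are real preview components; the helper itself is made up for illustration):

```csharp
// Sketch: manual impulse helper for Unity.Physics (DOTS, preview),
// pending built-in AddForce/AddTorque equivalents.
using Unity.Mathematics;
using Unity.Physics;

public static class PhysicsVelocityHelpers
{
    // Rough equivalent of Rigidbody.AddForce(force, ForceMode.Impulse):
    // delta-v = impulse / mass, using the stored inverse mass.
    public static void AddLinearImpulse(ref PhysicsVelocity velocity,
                                        in PhysicsMass mass,
                                        float3 impulse)
    {
        velocity.Linear += impulse * mass.InverseMass;
    }
}
```

Angular impulses are more involved because the inverse inertia is stored in the body's inertia-orientation space, so a hand-rolled AddTorque needs the extra rotation step.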
It's great that you're making more internal vertical slices, but what about the standard assets? The current ones apparently don't work with 2019.3? I don't really need them, but I think they're really valuable to newbies, students etc. I know Unity was making new character controllers, but that kind of fizzled out, no updates for months.
I'm curious about mixing 3D elements with the 2D renderer. It might be a bit of an oxymoron, but are there any plans to support lighting of 3D objects with the 2D lights that come with the package? Last I checked, it only supported Unlit materials.
Thanks for the question. The Character Controller is a revamp of the existing one we have as part of the Unity Physics samples. We've been doing some iterations on it based on feedback we got from our production team who worked on the FPS Sample project. The goal is to provide a couple samples that can be used alongside Unity Physics/Havok Physics for character locomotion but they wouldn't actually be integrated components within Unity, rather samples you can inject in your project and extend as needed.
Need to look into the Physics update manually and get back to you on that one.
The new environment system will be similar to the new SRPs: C#-based, designed for adaptability, and moving away from being a black-box system. That said, we're working on our MVP, and destructible environments are outside that cut line, so it won't be functionality we provide immediately out of the box. Specifics on what we will be providing will come later, and we'll talk about extensibility in more detail.
Support for this already exists in Shader Graph. We could make the UX side of it easier, and that’s something we’ll talk about doing. If you want to render an additional pass you need to either modify the render pipeline or use a ScriptableRendererFeature. Universal Render Pipeline examples here: https://github.com/Unity-Technologies/UniversalRenderingExamples and this post has a helpful example on using a multipass shader in Universal: https://forum.unity.com/threads/unity-gem-shader-and-lwrp.758603/#post-5092100
For HDRP docs here: https://docs.unity3d.com/Packages/c...email@example.com/manual/Custom-Pass.html and HDRP examples: https://github.com/alelievr/HDRP-Custom-Passes.
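For reference, here is a skeleton of the ScriptableRendererFeature route mentioned above. This is a minimal sketch against the Universal RP preview API; exact signatures may shift between package versions:

```csharp
// Sketch: a URP ScriptableRendererFeature that enqueues one extra
// render pass after opaques. The pass body is left empty on purpose;
// fill in your additional draws/blits in Execute.
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ExtraPassFeature : ScriptableRendererFeature
{
    class ExtraPass : ScriptableRenderPass
    {
        public override void Execute(ScriptableRenderContext context,
                                     ref RenderingData renderingData)
        {
            // issue commands for the additional pass here
        }
    }

    ExtraPass pass;

    public override void Create()
    {
        pass = new ExtraPass
        {
            renderPassEvent = RenderPassEvent.AfterRenderingOpaques
        };
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
                                         ref RenderingData renderingData)
    {
        renderer.EnqueuePass(pass);
    }
}
```

Once the class exists in the project, the feature is added to the Forward Renderer asset in the inspector rather than instantiated from code.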
Hi, simple question, will HDRP work well for a Nintendo Switch build? If I want to simultaneously support a PC build and a low-spec build targeting Nintendo Switch, can I have my single project configured for both HDRP (for the PC build), and URP (for the Nintendo Switch)? Or would I be forced to split my project into two copies, one for HDRP and the other for URP?
Can we expect improvements in the PhysX integration, and more exposed features? If so, who should I get in touch with about it?
There are much-needed improvements on these areas:
Expose impulses per contact
WheelCollider: writable sprungMass, suspension sweeps.
Note that Unity Physics is currently a no-go because ECS/DOTS (and Unity Physics itself) are in Preview/non-LTS and require re-designing and/or re-writing almost everything from scratch.
1. How does the future DOTS animation (node-graph-based) authoring workflow compare to the MonoBehaviour animation workflow (state machine/Playables/Animation Rigging/Kinematica/physics integration)? Will it be easy to upgrade? (Right now the DOTS Sample's animation workflow is horrifyingly bad, IMO.)
2. How are the Input System/AI Planner/ML-Agents related to DOTS?
3. Will DOTS Audio be a competitor to or replacement for FMOD/Wwise?
4. Will the Environment System be a competitor to or replacement for the Houdini Engine/Gaea/World Creator workflow?
5. Could you share more about the DOTS open-world shooter sample?
Thanks for the question. We’ve chatted with Sander about Chisel and it certainly represents an interesting direction. While we can’t make any promises on this topic, good ideas are always valued at Unity, so we’ll be exploring how well this aligns with our overall plans for artist tooling.
One more question, if I may- what's the future of TextMesh Pro in the DOTS world?
Hybrid Renderer V2 is still in development and this is the first experimental release. We want feedback, though it is likely you will run into issues and bugs that will be addressed over time, as well as things being changed and refactored. It is not recommended to use Hybrid V2 on a project that will depend on the feature set in the short term. If you want to provide feedback on the feature (with no expectation of extensive short-term support), you'll be able to try it out with 2020.1/SRP 9.0.0-preview.5. More information on what will be there is here: https://docs.unity3d.com/Packages/c...firstname.lastname@example.org/manual/index.html#hybrid-renderer-v2
There are some plans for a broader feedback site, though I don't know much about them. However, right at this moment I maintain a public roadmap for Universal RP & Shader Graph: https://portal.productboard.com/8uf...s/3-universal-render-pipeline-previously-lwrp You can vote on what's posted there and tell us why you need it, and there's a "Submit an idea" button in the top right. I look at everything that goes there and follow up where it makes sense. I will also pass along ideas to other teams if things outside of my realm appear (my realm being graphics; environment is part of it, though we are working with several internal teams on the complete feature set).
These features are important and they will be improved in future versions of Unity as part of our new environment feature set. The team is focused on development and you can always leave us feedback in the world building forum. There are no plans to add new features to the current terrain system that was released in 2019.3.
We are working on a new non-destructive layer-based environment system for terrain and other environment elements which will include grass, we’ll follow up with more detail once we have plans to share. If you are creating grass, right now you can make it in Shader Graph - though this does have the limitation of not being in our terrain system:
What is the time frame for a preview package of DOTS Animation to appear in the package manager? I understand that it is currently experimental and has to be manually added to the manifest. Will it be in preview in one of the 2020 releases?
Will the new non-destructive terrain system break current terrains?
There's no quota on questions you may ask so keep them coming!
TextMesh Pro is the default text rendering technology in the new UI Toolkit, which should be compatible with DOTS by the end of the year.
Will adding/removing labels come to the Package Manager?
More info here:
Ok good, so shaders won't have to be rewritten when switching, only tuned.
What about textures? Will HDRP textures turn into URP textures? Note that I never used URP, my only interest is in deploying the game on lower end hardware but starting with the higher end assets as is customary on productions.
And speaking of the production pipeline, the wizard that handles HDRP -> URP should also have a LOD-building and poly-reduction function. On second thought, all we really need is an API hook so we can add functions to the wizard and handle LODing ourselves. I'm saying that because when you guys take on too much you end up under-delivering where it matters, so let the Asset Store provideth.
Our new environment system (which offers non-destructive workflows - making it easier for people to collaborate), will be nice and separate from the existing one. Nothing will change for those using the current terrain system.
For best performance on the Nintendo Switch we recommend the Universal Render Pipeline. Right now it’s likely easiest to go with two separate projects - as you will need to adjust lighting values, post processing, and shaders. We are working on making it so you can have your Shader Graph shaders take best advantage of both. We are also looking into how we can improve the cross-pipeline experience.
Our main initial use case is asset store developers, so that shader graphs will always compile when built for the different feature sets of each pipeline (regardless of whether or not all of those features are in a given project). For example, if you have a shader that takes advantage of HDRP's iridescence (or another HDRP-only feature), you can have a different fallback for URP. And thank you for the feedback.
FB Instant games requires HTML5/WebGL, running in a Webview on mobile.
Current Unity WebGL build target is not supported on mobile devices - see https://docs.unity3d.com/Manual/webgl-browsercompatibility.html for details.
The Project Tiny WebGL build target is designed to work in mobile browsers and webviews. However, FB Instant (and most ad networks for playable ads) requires that your web build be wrapped in a single HTML file/archive, which Project Tiny does not support yet, but it's on the roadmap. I cannot say when it will land, but we know it's a must-have and we are working on it. Keep an eye on the forum: https://forum.unity.com/forums/project-tiny.151/
Hey, I would like to know what's the rationale behind adding the new 2D renderer to URP instead of creating a dedicated 2D Rendering Pipeline?
To me it seems that SRP brings the perfect opportunity to make focused pipelines that excel in one thing, instead of creating a jack-of-all-trades. In fact I'm a little worried that URP is trying to become too many things at once again. It's true that the original Lightweight/HD pipelines would've left a gap in the middle of the spectrum, but if URP starts getting more and more stuff, doesn't it defeat the purpose of SRP?
Hi @Lars-Steenhoff , yes, we have this Asset Store integration improvement logged in our backlog. We have no specific release ETA yet, unfortunately, but we are actively tracking this for a future release. If you have any other feedback/requests for the Package Manager UI, please keep them coming!
Yes, I have one more:
The "Load next" button.
The problem for me is that it forgets the loaded assets when I do a search, and when I then empty the search field I have to start again with all the "Load next" actions. I wouldn't mind pressing "Load more" 14 times if it happened only once ever, but since that isn't the case, the "Load more" pressing is multiplied by every interaction with the package manager.
I would much prefer to have the whole list of 1400 assets loaded at one time and cached for the next time it's loaded.
On the topic of reliability, the roadmap only mentioned plans for Unity 2020 moving forward. My question is: Why has so much work been put into bug fixes in 2020.1 that aren't backported to 2019.3? If Unity's new focus is on reliability, shouldn't that apply to all current versions, not just to 2020+?
In this roadmap address, there's a big focus on making Unity more reliable. There's a big piece of reliability that has nothing to do with the 2020 cycle: Existing LTS releases of Unity. One of the major complaints about 2019.3 has been its overall reliability. It took quite a long time to become stable enough to be released, pushing its actual release date into 2020. You've been fixing issues in patch releases since then, but there remain a lot of stability issues with the current 2019.3 release, and 2019.3/4 need a lot of attention moving forward.
Now, my concern here is that if you look at the beta notes for 2020.1, it shows over 600 "Fixes" in 2020.1: https://unity3d.com/unity/beta/2020.1.0b3 Looking at the issues fixed, nearly all of them are fixed exclusively in 2020.1, some in 2020.2, and only very rarely are any of those issues fixed in older versions of Unity. These are not bugs that were discovered only during the beta. Most of these are bugs that have been around for years, and have only just now been fixed in 2020.1.
So, again, why have hundreds of bugs been fixed only in 2020.1, and not in older versions? I can understand that some bugs depend on new engine improvements, but I can't imagine that this is the case for 600+ bugs.
My project uses both HDRP and Realtime GI. Given the lack of Realtime GI support moving forward, I don't have the option to migrate my project to Unity 2020.1. So I'll be stuck on 2019.3/4 for the foreseeable future, most likely a couple of years. So I'm very interested in 2019.3/4 receiving a lot of attention as far as bug fixes go. Please don't forget about your LTS customers as you move towards making 2020 more reliable.
Hey, I have one more question (I really should've thought of all of them at the same time).
In the roadmap for the package manager I noticed there's nothing about bringing the UPM format to the asset store. Is that still the intention?
The plan for 2D and DOTS is to start with foundational 2D features. For example, we are looking at sprite rendering, batching and sorting as well as some collision detection features as part of an early in-development package targeting DOTS Runtime/Project Tiny. You can join the discussion on that here.
Following on from that we will explore and develop support for our higher level tools for world-building and animation all while being guided by needs expressed in user projects. This is ongoing work that is planned over the next couple of years.
So, while the roadmap addresses that preview packages will spend a bit more time in the oven, has the current, heavily monolithic way new features are introduced been discussed at Unity? One example is the new Input System, which I've been using for my project since last May or so, and which was first discussed in 2014. I come from a development background, so I understand it can be extremely challenging to iterate on existing tech in some situations, but an awful lot of the issues with the old input system could have been solved pretty fast. How long would it really have taken to allow new bindings at runtime, or to allow querying of existing mappings? For most of Unity's life, the best input solution has been on the Asset Store. A huge number of Unity's core features seem to be lined up for deprecation, but new solutions like URP have some important gaps. Unity has never had a solid networking package (I used HLAPI pretty extensively; it was rough). A lot of the upcoming features seem huge. Is there consideration for the time it will take to iterate on these packages after they get into the hands of users (to get them battle-tested)?