Discussion in 'Unity 5 Pre-order Beta' started by SaraCecilia, Feb 23, 2015.
That's what I figured! No problem.
So last night I finally decided to try lightmapping to see what all the fuss is about, especially after I read what Final Gather actually does... it's almost akin to raytracing. After an hour or so of tinkering, here's what I got. I don't know if it's in bad form or not, but the furniture is from the "Next-gen Furniture Pack" I bought on the Asset Store a few months ago. Added in some bloom. LOVE THE SHADOWS!!!
So I think I read somewhere that Unity Console was supposed to be updating to 5.0 at around the same time as the standard version. Is that still true, or did I misunderstand something?
Baked shadows for semi transparent objects (speed trees) are still broken:
Issues with Loading scenes!
(I thought it might have been Photon at first), but it can't be Photon; it's a Unity bug.
I put the bug report number in it as well.
My Shadows are turning RED
Lighting messes up and makes terrain bright GREEN, DARK RED and a slight Orange color with Red Shadows.
My light bake was going well; it was finishing up 7/11 light transport when I left (at default resolutions, it was working far better than RC2). 24 hours later, my PC's HD was locked up so badly that Unity wouldn't respond again (I think I have to drop my virtual memory; the HD thrashing makes the PC unresponsive, and even hitting Caps Lock takes 30 minutes to turn the light on). After hours and hours, Windows came back and I was able to kill Unity. Now I have so many of the old Unity 5s installed, and I forgot to make a new desktop shortcut for RC3. So I've reloaded what I think is the latest Unity and it says "fatal error". I managed to skip past that somehow, but I'm thinking I may be using an older Unity. Can someone confirm that the RC3 About dialog says: Thu 19th Feb 5.0.0f3? Just so I'm not fighting on with an older one!..
Shadowing for Realtime GI for point and spot lights will not be ready for 5.0 and it is too late to add more features, but we will add it soon in a 5.x release.
OK, so I think I am on the latest, actually. The phase that wreaks havoc on my PC is "12/15 Bake Indirect": it uses up all available RAM, then flushes it, and keeps doing this. I guess it is going to use any amount you give it, so if I go back to using only 16 GB of physical RAM, then maybe my machine won't lock up (the other RAM is all virtual, and it all gets eaten, causing HD churning to the point where the PC doesn't respond). The quest continues..
Lightmapping terrain worked in RC3. I used a non-directional lightmap (the simplest) and 1 texel per unit with very low resolution. I was able to bake a large terrain and objects in about 45 minutes.
I'm going to try upping the quality so I can see how high it can go with reasonable bake times. A large part of the trouble is that the new lighting workflow is so different from Beast that you have to really understand the settings (I still don't really understand what Global GI is; I just leave Realtime GI unchecked).
The lighting and PBS in Unity 5 is really beautiful.
Without virtual memory, and only using my 16 gigs, jobprocess.exe crashes once I get to 12/15 bake indirect. I get these errors:
Failed executing external process for 'Bake Indirect' job. Exit code: '-1073741571'.
'Bake Indirect' job failed with error code: 4294967295 ('Unknown error.').
Failed executing external process for 'Bake Indirect' job. Exit code: '-1073740940'.
Failed executing external process for 'Bake Indirect' job. Exit code: '-1073741819'.
I believe it is something to do with running out of memory. Any thoughts? I basically can't endure using the virtual memory, because I have other things to code in the background and it won't let me, so I'm back to waiting for RC4 and hoping, I think.
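For what it's worth, those negative exit codes are Windows NTSTATUS values printed as signed 32-bit integers (and 4294967295 is just 0xFFFFFFFF, i.e. -1, a generic failure). Decoding them supports the out-of-memory theory. A quick Python sketch; the code-to-name table below covers only the three values quoted above:

```python
# Unity prints the job's exit code as a signed 32-bit int; Windows
# NTSTATUS codes are unsigned, so reinterpret before looking them up.
KNOWN_NTSTATUS = {
    0xC0000005: "STATUS_ACCESS_VIOLATION",
    0xC00000FD: "STATUS_STACK_OVERFLOW",
    0xC0000374: "STATUS_HEAP_CORRUPTION",
}

def decode_exit_code(signed_code):
    unsigned = signed_code & 0xFFFFFFFF  # reinterpret as unsigned 32-bit
    name = KNOWN_NTSTATUS.get(unsigned, "unknown")
    return f"0x{unsigned:08X} ({name})"

for code in (-1073741571, -1073740940, -1073741819):
    print(code, "->", decode_exit_code(code))
```

All three decode to access violation, stack overflow, and heap corruption, which is exactly the pattern you'd expect from a process being starved of memory.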
Usually, large memory consumption is related to:
* huge single meshes covering a huge area that are not split into separate meshes
* bad UVs
* extremely high lightmap resolution on meshes that are not split into separate pieces
Try that, otherwise file a bug please and we'd be happy to look at it if you include a project with the scene setup where it runs out of memory for you.
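As a rough sanity check before baking, you can estimate how many lightmap texels a scene will generate from its surface area and the resolution setting. A back-of-the-envelope Python sketch (the 2000x2000-unit terrain is a hypothetical example, not a documented Unity figure):

```python
def lightmap_texels(surface_area_units, texels_per_unit):
    # Texel count grows with the *square* of the resolution setting.
    return surface_area_units * texels_per_unit ** 2

# Hypothetical 2000 x 2000 unit terrain at two resolution settings.
area = 2000 * 2000
for res in (1, 4):
    texels = lightmap_texels(area, res)
    print(f"{res} texel/unit -> {texels / 1e6:.0f}M texels")
```

The quadratic growth is the point: going from 1 to 4 texels per unit multiplies the texel count by 16, which is why modest-looking resolution bumps can blow up bake memory on large terrains.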
We just updated from b21 to RC3, and this is how some of our scenes look now:
It varies between blue, red and green when you press Play. We do appreciate the change in look, but does anyone know what the hell is going on here?
I get this as well, reported with bug #674959.
Awesome, hopefully this can be fixed soon, because this makes developing in those scenes insanely annoying (it also doesn't happen in every scene, only in a couple, which is super weird).
We also have a problem where a custom Editor Window seems to force focus on itself, making the game unplayable (because it doesn't have focus). The only way to fix this is closing the window, but we need it for debugging & development. This bug was introduced between b21 and RC3. Has anyone figured out why it is happening?
Thanks, I guess I will submit this package again (via FTP, as it didn't upload last time). It could well be some SketchUp model I have used somewhere. One question: is Unity 5 somehow slower to run games? My game runs at least 40 fps slower in U5 than in U4. I'm wondering what this is; even turning off the lighting stuff doesn't make it faster, nor does going to the lowest quality. Just wondering if this is a performance hit due to the new features? Even when it says it's running at, say, 50 fps, it is so jerky it feels more like 10...
I'm sorry, those problems mean the whole concept has been poorly programmed. Memory should be managed far better than that. Better than just crashing or breaking stuff, at the very least.
Yeah 3dsmax, autocad, zbrush, photoshop and so on shouldn't crash when out of memory. Unfortunately, they do, because suddenly the memory isn't there any more due to the os moonwalking and doing this little spin.
Turns out the 64 bit versions are much more stable - at least until close to ram limits
That's why Unity 5.0 is so important. Alas, we still have only the beta for now. I hope we will see the stable version some day.
I don't care how other software is chronically managed - failure rate shouldn't have a benchmark!
It's very bad to presume that people won't use too many polygons!!!! And not exactly future proof.
It should at least make a guesstimate on how big the memory usage will be before it grinds a machine to a halt with page swapping. Please optimise it people!!!
I know you think you know best, but this is actually normal for any game development at this capacity. It's not limited to Unity for instance. Also, Unity would have to consume more resources by sandboxing the entire application which causes other issues too.
Why not just get more ram and Unity 5? It's not going to get fixed in 4.6 or 4.7 if it comes to that. It's a limitation of 32 bit.
I know what you're saying, but if it's only a problem with high poly & large area objects, then I don't understand why a high poly scene can't be split up internally?
Future proofing isn't really a thing for software, especially software that is in a constant state of development/enhancement. That's for something more permanent such as your computer which you won't likely upgrade every time something newer comes out.
That is something that has always been a problem, and should be handled by the developers of the game, it is part of optimizing. Large objects are going to make more lag in your game.
Sorry, I thought it was in a Release Candidate for a product that people can buy and use for 2 years or so.
I apologize for being negative, it's just very frustrating at the moment.
Are you referring to mesh vertex/poly count? That's limited to 65535 and probably won't change for a long time. That's a GPU/engine decision so it runs faster. Are your meshes higher poly than that?
Ooh, gold and white. Nice choice of colors! (I'll take my coat now..)
Famous last words ...
Unreal and Cryengine can already deal with 32 bit indexes
I reported the bug the other day and, as always, nobody ever responded. It's a bug with the Lighting menu: change the Ambient light from Skybox to Color or Gradient and the color shifting will stop.
I got a response to my bug yesterday saying this is fixed in the next build, which I assume is RC4.
oh sweet! Can't wait!
I don't know why I never get any updates on my Bug Reports.
Every Bug Report I've filed is just left Open with no answer lol.
Because Unity can't understand what kind of splitting is optimal at all. You, on the other hand, do. You understand what possibilities exist in your application, which can't be understood by a computer.
Not on some hardware (albeit old stuff & crap mobiles), and it's slower and consumes more RAM. All because a developer is too lazy to chop stuff up? I'll take the performance, thanks, with 5 minutes of extra work.
Obviously, at some point there won't be any performance difference & Unity will upgrade this. But considering we can still have millions of polys in our games, I don't actually see the point.
Ideally Unity will just handle things under the hood (splitting etc) if it comes to that.
More RAM, yes. Obviously 32-bit indices use double the size of 16-bit ones. But slower? Modern graphics cards idle most of the time when dealing with the geometry. You can definitely save draw calls with it when you use it right. RAM is cheap. Draw calls are not.
And it has nothing to do with a programmer being lazy, but with specific use cases. Use cases where it saves those aforementioned draw calls, and/or makes some tasks easier, better, or possible at all. Obviously you have not had such use cases yet. But I have. And I would have a need for a 32-bit mesh component.
There are quite a few use cases where 32-bit indices would be useful:
- Trying to deal with millions of polys at once, like adding an edge loop across a megapoly terrain mesh. Happy chunking.
- With dozens of characters above the 65k limit, you could save some draw calls.
- With 16 bits, mesh chunks can go funky because of mip mapping.
- With 16 bits, mesh chunks can have gaps because of floating point errors.
- Shading problems where the mesh ends but should continue smoothly into the next chunk.
- Easier import and export of megapoly meshes.
I would be happy if I could handle it above the hood, and decide to use a 16-bit mesh component or a 32-bit mesh component where it makes sense. As mentioned in another thread, Unity should provide freedom, not limits. It's the artist's and developer's job to set the limits then.
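The memory cost being argued over here is easy to quantify: an index buffer stores 3 indices per triangle, at 2 bytes each for 16-bit and 4 bytes each for 32-bit. A Python sketch with a hypothetical 200k-triangle "megapoly" mesh (engine-side overheads ignored):

```python
def index_buffer_bytes(triangle_count, bytes_per_index):
    # 3 indices per triangle, times the width of each index.
    return triangle_count * 3 * bytes_per_index

tris = 200_000  # hypothetical megapoly mesh
b16 = index_buffer_bytes(tris, 2)  # 16-bit indices
b32 = index_buffer_bytes(tris, 4)  # 32-bit indices
print(f"16-bit: {b16 / 1e6:.1f} MB, 32-bit: {b32 / 1e6:.1f} MB")
```

The 32-bit buffer is exactly double, but even for a very dense mesh the absolute difference is on the order of a megabyte or two, tiny next to texture memory, which is the "RAM is cheap" side of the argument.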
Tomorrow, another beta to test? I'm finishing my big project, so I hope I see only happiness on this forum (including mine).
No more betas; the autoupdate channel states "Release-5.0". In time for Tuesday.
Tiles. It's faster. Less data gets transferred. It's cheaper to put, fetch and move around 16 bits. I'm tired of trying to explain to you. Plus you'll never make use of any meshes > 65535 anyway. AAA never uses meshes anywhere near this dense. Do I need to go on?
For arch vis or science you'd probably want 32. Do I want 32, knowing it's slower? Absolutely not. Will I ever have need of a single mesh being bigger than 16 bits? no.
And in fairness, even XNA supported 32-bit index buffers and vertex buffers, though not for everything (not for Zune or Silverlight, for example), but PC and 360 supported it. It was a matter of swapping profiles. That being said, I think the decision to stick with 16-bit is to maintain support for mobile and web on non-cutting-edge devices. In terms of performance on desktop and console, I don't really see a big difference, but that depends on the game. Games with lots of high-poly characters and meshes could benefit from 32-bit by reducing draw calls. Simple and 2D games would suffer, as it would be extra overhead, so even if it were supported, it should probably be a selectable profile, like XNA did with Reach and HiDef.
I think you missed Tiles' point: reasonable choices are better than being stuck on a limit for eternity. I completely understand that you need pure performance, and that's why there is the idea of creating two mesh component versions differing in the number of bits.
Last but not least: don't be egotistic. Even if YOU as a professional game dev don't need this feature, that doesn't mean nobody needs it, with valid reasons for their projects.
Ah, and one really last thing: never say "never", even if nowadays this is a very rare case. The general trend has been a slight increase in poly counts for better detail over the years, already up to XXk for main characters/vehicles. I wouldn't be surprised if, in the relatively distant future, main characters have slightly more polys than circa 66k.
Nope, each character is a draw call regardless: skinned meshes cannot batch. And a single character > 65535 is, uh, probably a tad more than GPUs are willing to handle if GPU skinning is involved. In general, you'd be saving what, 10 draw calls? 20? A modern mobile GPU does >300 without a problem. It's not a problem.
Sure, I get that. Never is a strong word. But generally for PS4, which means at least 5-10 years of work, I won't be needing it.
We're talking vertex count though, not poly count. So you're only talking 21k tris which certainly isn't unreasonable for a high end visual game.
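The vertex-versus-triangle distinction matters because "21k tris" is the worst case, where every triangle uses three unique vertices. With vertex sharing, as in a regular grid, 65535 vertices buys far more triangles. A Python illustration (the quad-grid formulas, `(n+1)^2` vertices and `2*n^2` triangles, are standard mesh arithmetic, not anything Unity-specific):

```python
import math

MAX_VERTS = 65535

# Worst case: no vertex sharing, 3 unique vertices per triangle.
worst_case_tris = MAX_VERTS // 3

# Shared-vertex case: an n x n quad grid has (n+1)^2 vertices
# and 2*n^2 triangles. Find the largest grid fitting the limit.
n = math.isqrt(MAX_VERTS) - 1
grid_verts = (n + 1) ** 2
grid_tris = 2 * n * n

print(f"worst case: {worst_case_tris} tris")
print(f"{n}x{n} grid: {grid_verts} verts, {grid_tris} tris")
```

So the same 16-bit vertex budget yields roughly 21.8k triangles with no sharing, but about 129k triangles on a well-shared grid, which is why quoting poly counts alone can be misleading here.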
Anybody noticed that in RC3 (and possibly earlier betas) that the alpha channel specular map is ignored when using the legacy "Specular" shader? It's of no consequence to me, since I'd be using the standard shader for most things. Just curious. A purchased model I pulled in had that shader set by default and it looked hilariously bad in Unity 5 until I set it up for PBR.
I left Unity running on the precompute GI calculation for the whole night, and this morning it was still at "0/11 calculate geometry | 1 jobs".
My computer was incredibly slow. I canceled the calculation, and 24 errors appeared in the console about Unity not being able to write some GI files to the GI cache folder. After this my PC was still slow, with 7.68/8 GB of RAM used. Then it froze. Any ideas? I have a 2 km square terrain from World Machine, imported as RAW 16, with no texture painted over it.
You've got it wrong. I'm trying to explain it to you, not vice versa.
It's not that simple. Drawing the scene is definitely not done once the vertex positions are calculated; that part happens in an eyeblink, and you might not even notice a difference. The part of the graphics card that deals with the geometry is idling most of the time anyway, while the rest of the frame gets calculated on the CPU. But a draw call is a draw call. And that's definitely slower.
Well, what is not a problem for you might be a problem for somebody else. 20 saved draw calls are 20 saved draw calls. And one draw call can already make the difference between running smoothly and stuttering, depending on your project. So I'm always interested in methods that save draw calls.
One 16-bit character mesh, one draw call, right. But characters above 65k vertices are already a reality, as Dustin Horne has pointed out. Then we already have a minimum of two draw calls for the character, and a possible error source for weighting and shading.
With a 32-bit mesh component I could not only save draw calls, but also the time and effort that I spend on workarounds to overcome the 16-bit limits. And saving manpower is also something I am very interested in. That's a win-win, so to say. And not everything above 65k vertices needs to be a character: I had the biggest trouble with level geometry, and abandoned a non-game project because working with chunks caused too much trouble.
The point is, the Unity user should be able to decide which way to go. If you think 16-bit only is the golden road for you, then stick with it. But as pointed out earlier, there are situations where a 32-bit mesh index simply makes more sense.
I don't want Unity to take away the 16-bit mesh component and replace it with a 32-bit mesh component; it's not an either-or thing. I want them to add something: an additional 32-bit mesh component that can be used when there is a need for it.
Just in case, there exists a vote for this idea in the Unity feedback. Maybe some of you want to support the idea of 32 bit index buffers too: http://feedback.unity3d.com/suggest...ffers-and-larger-that-64k-vertex-buffer-sizes
Isn't this thread supposed to be about the release candidate?!
Sorry for hijacking
Still no RC4?... Does everyone think something is going to happen tomorrow?
I'm thinking a release announcement, and a final 5.0 download.
In RC3 I noticed that when I use directional lightmapping, realtime shadows won't show up on the terrain. They return when I use a non-directional lightmap. (I'm getting the feeling that a non-directional lightmap is closest to what Beast used to do.)
Is this a current bug (i.e., something to wait for a fix and not stress about), or maybe is it a feature? (Maybe directional lightmaps and realtime shadows aren't supposed to work together for some reason.)
Realtime shadows look amazing (better than lightmapping up close, except for the lack of soft shadows), but lightmapping is good for distant objects and terrain.
The non-directional lightmap must be the same as the directional one, minus the second "direction" image, because when I rebaked a directional lightmap as non-directional, it took literally 5-10 seconds. (I guess it just removed the second lightmap.)
I think a release public beta is coming. Why else would they schedule a live YouTube event tomorrow?
It doesn't matter to me when it's finally released, just so long as when it's released it's a great product with almost no bugs except for stuff that's truly unexpected.
But I honestly think, at least as of RC3, that lightmapping needs to be quicker. I almost never use light baking, because it takes too long to do anything. I don't feel like waiting 20 minutes because I put a box in the level, just to place another box 20 minutes later and wait 20 more minutes.