Resolved What's the most performance-efficient way to detect distance between a lot of objects?

Discussion in '2D' started by wheelerjd, Mar 6, 2023.

  1. wheelerjd

    Joined:
    Aug 16, 2022
    Posts:
    3
    Hello, I've got a dynamic segmented rope made of hinge joints in my 2D game. When substantial force is applied to both ends, the rope starts breaking in the middle and tries to correct itself, resulting in a plethora of physics problems for every object involved. I've tried adding a break force to the segments of the rope, but that causes more problems elsewhere, so now I'd just like to call a function to destroy the rope when the distance between segments is larger than it should be. Since I'd have to check the distance between segments every frame, I'd like to avoid getting the Vector2.Distance of up to 30 segments, as that seems like it would be very expensive performance-wise. Is there a better way of going about this? Thanks.
     
  2. karliss_coldwild

    Joined:
    Oct 1, 2020
    Posts:
    530
    If you don't actually need the distance itself but just want to compare it against some threshold length, you can use sqrMagnitude and compare it against the square of the length you want to check.

    30 objects isn't a lot. Calculating the distance once for each of 30 segments is nothing. Either you're not describing the full picture, or you're heavily underestimating the capabilities of a modern CPU. The fact that the segments are being drawn and simulated by the physics system probably takes an order of magnitude more computing resources than calculating one extra length.
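    A minimal sketch of that comparison in a Unity component (the names RopeBreakCheck, segments, and maxSegmentGap are hypothetical, assuming segments is an ordered array of the rope segment Transforms):

    ```csharp
    using UnityEngine;

    // Hypothetical sketch: checks consecutive rope segments each physics step
    // and destroys the rope if any adjacent pair drifts too far apart.
    public class RopeBreakCheck : MonoBehaviour
    {
        public Transform[] segments;       // ordered rope segments, root to tip
        public float maxSegmentGap = 0.5f; // break threshold in world units

        void FixedUpdate()
        {
            // Compare squared magnitudes to skip the square root
            // that Vector2.Distance would compute.
            float maxGapSqr = maxSegmentGap * maxSegmentGap;

            for (int i = 0; i < segments.Length - 1; i++)
            {
                Vector2 delta = segments[i + 1].position - segments[i].position;
                if (delta.sqrMagnitude > maxGapSqr)
                {
                    Destroy(gameObject); // rope has torn; remove the whole thing
                    break;
                }
            }
        }
    }
    ```

    Note the threshold is squared once per FixedUpdate, outside the loop, so the per-segment work is just a subtraction, two multiplies, an add, and a compare.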
     
  3. Kurt-Dekker

    Joined:
    Mar 16, 2013
    Posts:
    36,962
    This sounds like you're just giving the physics system a thing it cannot solve, and thus you're seeing instability / breakage.

    Assuming it is based on IEEE single precision floats, the physics system only has about 4 or 5 digits of available dynamic range.

    In other words if you deform something and it takes 1 unit of force to fix it in stasis, the most it can really apply is about 10000 units of force.

    If you deform it to where it takes 1000000 units of force (or an infinite number of units of force) to fix it, the physics system simply cannot solve it.
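    The limited precision Kurt describes is easy to see directly. A C# float (IEEE 754 single precision) carries roughly 7 significant decimal digits, so a small correction added to a huge value can vanish entirely (a standalone console sketch, not Unity-specific):

    ```csharp
    // Single-precision floats carry ~7 significant decimal digits.
    // At a magnitude of 1,000,000 the spacing between representable
    // floats is 0.0625, so adding 0.01 changes nothing at all.
    float large = 1_000_000f;
    float small = 0.01f;
    float sum = large + small;

    System.Console.WriteLine(sum == large); // prints True
    ```

    Once the forces the solver needs span more digits than the format can represent, the small corrections round away and the simulation can never converge, which is exactly the instability described above.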

    Would it?

    I think you'd be surprised.

    Doing 30-factorial distance checks would definitely leave a mark. But 30? That wouldn't even break a sweat.

    DO NOT OPTIMIZE "JUST BECAUSE..." If you don't have a problem, DO NOT OPTIMIZE!

    If you DO have a problem, there is only ONE way to find out. Always start by using the profiler:

    Window -> Analysis -> Profiler

    Failure to use the profiler first means you're just guessing, making a mess of your code for no good reason.

    Not only that but performance on platform A will likely be completely different than platform B. Test on the platform(s) that you care about, and test to the extent that it is worth your effort, and no more.

    https://forum.unity.com/threads/is-...ng-square-roots-in-2021.1111063/#post-7148770

    Remember that optimized code is ALWAYS harder to work with and more brittle, making subsequent feature development difficult or impossible, or incurring massive technical debt on future development.

    Notes on optimizing UnityEngine.UI setups:

    https://forum.unity.com/threads/how...form-data-into-an-array.1134520/#post-7289413

    At a minimum you want to clearly understand what performance issues you are having:

    - running too slowly?
    - loading too slowly?
    - using too much runtime memory?
    - final bundle too large?
    - too much network traffic?
    - something else?

    If you are unable to engage the profiler, then your next option is making gross guess changes, such as "reimport all textures as 32x32 tiny textures" or "replace some complex 3D objects with cubes/capsules," to try to figure out what is bogging you down.

    Each experiment you do may give you intel about what is causing the performance issue you identified. More importantly, it lets you eliminate candidates for optimization. For instance, if you swap out your biggest textures with 32x32 stamps and you STILL have a problem, you can likely eliminate textures as the issue and move on to something else.

    This sort of speculative optimization assumes you're properly using source control, so it takes one click to revert your project to the way it was before if there is no improvement. Carefully make notes about what you have tried and, more importantly, what results each change had.
     
  4. wheelerjd

    Joined:
    Aug 16, 2022
    Posts:
    3
    Some good advice about optimization here, thanks. I'll keep it in mind in the future, and I'll be sure to check out those resources, too.
     
  5. wheelerjd

    Joined:
    Aug 16, 2022
    Posts:
    3
    I had no idea you could get the raw square magnitude of a vector, that seems helpful. Thanks for your input, I'll start paying more attention to how expensive different processes are in the future.