
Spinning ball Motion Blur

Discussion in 'Shaders' started by Raptosauru5, Apr 8, 2020.

  1. Raptosauru5
    Joined: Feb 8, 2019
    Posts: 24
    Hi,
    I am looking for a way to blur a spinning sphere that can spin in any direction.

    I am making a very advanced version of the "Roll-a-ball" game where you control a ball with a texture on it. My ball rolls down ramps, rides roller-coasters, etc., so it often gains a lot of speed, and in those cases the default visual effect of just spinning the mesh doesn't look good: the mesh isn't rotating smoothly but rather "teleporting" its rotation from frame to frame. In reality, we would see the texture blurred rather than perfectly sharp on every frame.

    I looked up many solutions for this, but everyone applies them to a car wheel. My situation is more complicated because I need the texture to blur in different directions depending on which way the ball is rolling.

    The visual effect I am looking for:
    [Image: gettyimages-1054343632-2048x2048.jpg]

    Here is a video I found of pretty much the expected behavior:


    I am trying to make the ball's rolling motion as satisfying to watch as possible.
    (Side note: I do not want to blur the entire scene or the background, only the ball.)

    Any suggestion, tool or trick that would solve this problem would be highly appreciated!

    Thanks in advance
     
  2. bgolus
    Joined: Dec 7, 2012
    Posts: 12,238
    Single object motion blur is done with one of a handful of methods.

    Certainly the most common today is some kind of full screen post process, but that's not what you're going for, so we'll ignore that setup.

    The next most common method would be to blur the texture, either in real time in the shader, or as a pre-process so there are some number of pre-blurred textures the shader blends between. The texture based method works great for things like wheels, where there's only one axis of rotation you need to care about. The shader based method also benefits from that constraint, but we'll come back to it.

    The last method, very common for early console games, was to render the object multiple times at slightly different positions, partially transparent. This was especially common for PS2 & original Xbox games, but is still used today in many fighting or 2D games, especially for some kind of stylized fast slide move, etc. This is a good general approach as it mimics how real motion blur works, but it has its limitations.

    So let’s go back to the shader based methods: post processing and on-object shader blurring. The toon motion blur video above is probably shader based. It works by sampling the texture multiple times along a direction and averaging the results. For post processing based effects you have a screen space velocity buffer and a copy of the screen as a texture that can be sampled along a straight line. For shader based wheel blurs, on the side of the wheel you just need to sample positions along a circle, and on the treads you sample along the direction of the treads, which is usually a straight line. The key here is knowing where to sample the texture to do the blur. For something like a sphere with an arbitrary UV layout, that’s basically impossible.
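    To make the sampling idea concrete, here is a minimal sketch of directional blur in plain Python (the toy texture, UV, and velocity below are made up for illustration; a real shader does this per pixel in HLSL):

```python
# Directional blur sketch: average a centre sample with pairs of samples
# stepped forwards and backwards along the motion direction.
def directional_blur(sample, uv, direction, num_steps):
    """sample(uv) -> value; uv and direction are (u, v) pairs."""
    total = sample(uv)
    for i in range(1, num_steps + 1):
        # offset grows linearly with each step along the motion vector
        ou = direction[0] * i / num_steps
        ov = direction[1] * i / num_steps
        total += sample((uv[0] + ou, uv[1] + ov))
        total += sample((uv[0] - ou, uv[1] - ov))
    return total / (1 + 2 * num_steps)

# toy 1-D "texture": brightness ramps with u
tex = lambda uv: uv[0]
print(directional_blur(tex, (0.5, 0.5), (0.1, 0.0), 4))  # symmetric blur, ~0.5
```

    Every variant discussed in this post does the same thing; what differs is only how the sample positions are computed.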

    So the solution is to map the ball in a way that lets you calculate the texture position for any point on the sphere from any other point on the sphere. Two options I can think of. One would be to calculate the UVs in the shader using an equirectangular projection, but that’s a little expensive to calculate multiple times. The cheaper, but more work, option is to use a cube map for your ball’s texture. This has the advantage that the texture UVs are just the object space position, and you can calculate the other texture positions with a lerp, or by passing a quaternion or rotation vector to the shader to rotate the vector with.
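    For reference, the equirectangular projection mentioned above maps a unit direction to a UV with a couple of trig calls, which is why doing it once per blur sample adds up. A generic sketch of the standard formula (not code from this thread):

```python
import math

def equirect_uv(d):
    """Map a unit direction (x, y, z) to an equirectangular UV in [0, 1]."""
    x, y, z = d
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)            # longitude
    v = 0.5 + math.asin(max(-1.0, min(1.0, y))) / math.pi   # latitude
    return (u, v)

print(equirect_uv((1.0, 0.0, 0.0)))  # → (0.5, 0.5): equator, mid-longitude
```

    Each extra blur sample pays for an atan2 and an asin, versus a single matrix multiply for the cube map approach.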
     
    Raptosauru5 likes this.
  3. Raptosauru5
    Thank you for writing all the possible methods! This is very helpful.

    The idea with the cube map is really smart, and I think I should go in that direction.

    I am guessing the cube map works pretty much like a "skybox" that the ball reflects, so the texture should move in any direction if I just rotate it, is that right?

    Now I just don't understand how exactly to make it blurry if it is moved every frame using a Quaternion, for example. Could you please elaborate on that?

    Thanks again!
     
  4. bgolus
    You’d still need to sample the texture multiple times. The idea would be to rotate the “uvw” (the local vertex position passed to the fragment for sampling the cubemap) multiple times. Or you can get an approximation by using the quaternion/matrix to do a larger single rotation once or twice and lerp between the results.

    Pseudocode would be something like this:
    Code (csharp):
    // normalize not needed for the plain sample, but needed for later math
    float3 uvw = normalize(i.uvw);
    fixed4 col = texCUBE(_CubeTex, uvw);

    // save off a copy we’ll iterate on
    float3 rotUVW = uvw;

    // define blur quality
    int numSteps = 4;
    for (int i = 0; i < numSteps; i++)
    {
      // rotate once per step
      rotUVW = mul((float3x3)_VelocityRotationMatrix, rotUVW);

      // reflect instead of counter rotate, which is slightly faster
      float3 reflectUVW = reflect(rotUVW, uvw);

      // sample in both rotations
      col += texCUBE(_CubeTex, rotUVW);
      col += texCUBE(_CubeTex, reflectUVW);
    }

    // normalize by the total sample count
    col /= (float)(1 + numSteps * 2);
    For that example you’d pass in a local space rotation matrix based on the current rotation velocity divided by the number of steps (or less, if you want to exaggerate the blur).
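    As a sketch of the math behind that per-step matrix (plain Python standing in for the C# side; Unity's Quaternion.AngleAxis / Matrix4x4.Rotate do this for you, and the numbers below are made up for illustration), each step applies an axis-angle rotation by the frame's sweep angle divided by the step count:

```python
import math

def rotate_axis_angle(v, axis, degrees):
    """Rodrigues rotation of vector v around a unit-length axis."""
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    ax, ay, az = axis
    dot = ax * v[0] + ay * v[1] + az * v[2]
    cross = (ay * v[2] - az * v[1],
             az * v[0] - ax * v[2],
             ax * v[1] - ay * v[0])
    return tuple(v[i] * c + cross[i] * s + axis[i] * dot * (1.0 - c)
                 for i in range(3))

# say the ball sweeps 90 degrees this frame and we blur over 4 steps:
# each step's matrix rotates by 90 / 4 = 22.5 degrees
num_steps, sweep = 4, 90.0
v = (1.0, 0.0, 0.0)
for _ in range(num_steps):
    v = rotate_axis_angle(v, (0.0, 1.0, 0.0), sweep / num_steps)
# after all steps the sample direction has swept the full 90 degrees
```

    Applying the same small matrix repeatedly in the loop composes into the full sweep, which is why a single per-step matrix is all the shader needs.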
     
    Raptosauru5 likes this.
  5. Raptosauru5
    Thank you very much for the script example. It made me realize I was completely missing any knowledge of shader scripting in HLSL, so I first had to figure out how it even works. Now it makes a lot of sense and I was about to implement your solution, but it seems like custom shader writing is not supported in the Universal/Lightweight Render Pipeline, or maybe I am missing something? Is there any good reason to use a Scriptable Render Pipeline other than having the ability to use Shader Graph? (I want to launch on both PC and mobile.)
     
  6. bgolus
    You can't use lit shaders written for the built-in pipeline with the URP because they completely changed how lighting is calculated, but there's nothing stopping you from writing custom shaders for it. The hard part is that, apart from Shader Graph, there's nothing like Surface Shaders for the URP/HDRP, where most of the lighting code is abstracted away and you can focus on the bit of shader code you care about. That said, it's not super hard to take the code generated by Shader Graph and modify it directly. The snippet of code above can also be 100% reproduced in Shader Graph, either as nodes (which would suck, since you'd have to have unique nodes for each iteration of the loop), or by using a Custom Function node to just shove the HLSL code into.
     
  7. Raptosauru5
    Since I am quite new to shaders, let me put it in my own words to make sure I got it right:
    you are saying that I can keep using any of the U/HD/LW Render Pipelines, and instead of creating a shader and editing its code I should create a PBR Graph that does what I want, with a custom node in it where I can code with all the HLSL freedom. Is that correct?

    Is it also possible to switch between Render Pipelines freely if I later decide to jump from, for example, URP to LWRP to support mobile?
     
  8. bgolus
    Roughly, yes. "All" the freedom might be a stretch though. ;) There are some limitations to using the Custom Function node, along with limitations in Shader Graph itself that can't be avoided without modifying the generated shader directly, but for most stuff people want to do, yes.



    Part of the supposed benefit of Shader Graph is that you can write shaders once and reuse them between different SRPs.

    Unfortunately this isn't actually true, since Shader Graph keeps changing so much that graphs are often incompatible between versions (new versions of the SRPs support older graphs, but not necessarily the other way around). And for switching between the URP and HDRP, you almost never want to use the PBR Master node with the HDRP, because it's missing a ton of features the HDRP supports; really, you want at least the Lit Master node for that... and the URP doesn't support the Lit Master node.

    Also, the LWRP & the URP are the same pipeline; the LWRP package was simply renamed to URP at package version 7.0. And you can't go back to an older version of the SRPs once you upgrade to a newer version and expect anything to work. (In my experience you can't even upgrade an old version and expect things to work, but that's a different conversation.)
     
  9. Raptosauru5
    Alright, so this is pretty complicated, I see. Then which RP is it good practice to start with when you are not sure which one to use yet? Or should you decide once and then never change it?
     
  10. bgolus
    If you want to aim only for high end PCs and consoles, HDRP. If you want to support anything else, URP. Assume that if you want to use “both”, they’re essentially separate projects you work on in tandem, or eventually port from one to the other with completely unique content.
     
  11. Raptosauru5
    Ok. My game is going to be for PC and mobile, but it is low-poly and not very graphically demanding, so I guess I should build everything on URP.
    Thanks for the valuable info; I would have spent an extremely long time figuring all this out on my own!

    I am going to try adding the Custom Function node as you suggested and will post an update about the result. :)
     
  12. Raptosauru5
    I am completely lost here... I am trying to just define my inputs as you suggested, in Shader Graph in a Custom Function node:
    [Image: SG.jpg]

    using the source script "SpinBlur01.hlsl", which looks like this so far:

    Code (CSharp):
    #ifndef MYHLSLINCLUDE_INCLUDED
    #define MYHLSLINCLUDE_INCLUDED

    void SpinBlur01_float (float3 uvw, samplerCUBE cubeTex, out float3 Out)
    {
        //uniform samplerCUBE cubeTex;
        uvw = normalize(uvw);
        float4 col = texCUBE(cubeTex, uvw);

        Out = float3(1, 1, 1);
    }

    #endif
    But I can't figure out how to make the "texCUBE(_CubeTex, uvw);" part work. To be honest, I am not sure what it does exactly. I just found out that the "_CubeTex" argument is of type samplerCUBE. However, I can't figure out how to add it using Shader Graph, nor how to perhaps convert it from a 2D texture...

    I think maybe I misunderstood you, or perhaps this is done differently in this new version of Shader Graph? (According to the Package Manager I am using Shader Graph 7.3.1; Unity version 2019.3.11f1.)
     
  13. bgolus
    Yep, I was waiting for you to ask about this. I was going to mention there’s some ... translation that needs to happen.

    Most Unity shaders in the built-in pipeline, at least the example shaders you’ll find online, are written using Direct3D9 style HLSL. Later, starting around Unity 5.0 and over the last few years, they started adding various bits of Direct3D11 style HLSL, and then macros that automatically use whichever version of HLSL should be used.

    Then they rewrote all of that from scratch for the SRPs and documented approximately 0% of it.

    But the basics are: instead of the sampler2D tex and tex2D(tex, uv) of Direct3D9, it’s using Texture2D tex, SamplerState sampler_tex, and tex.Sample(sampler_tex, uv). The macros in the API HLSL files swap between those as needed.
    https://github.com/Unity-Technologi...r-pipelines.core/ShaderLibrary/API/D3D11.hlsl

    So in your case, instead of texCUBE(cubeTex, uvw), use SAMPLE_TEXTURECUBE(cubeTex, samplerState, uvw). Now, where does the sampler state come from? That’s a bit of a bother. Unity still doesn’t give you a node to get the sampler state from a texture directly, even though Unity defines one for every texture property. So there are two options. One is to use a Sampler State node that you pipe in, which means it ignores the sampler state on the texture itself (not a huge deal for cube maps); the other is to hack around it by writing in the name of the texture’s sampler state manually, based on the property name. Like this:
    https://forum.unity.com/threads/sam...es-anisotropic-filtering.839374/#post-5551990

    The name of the sampler state is formed the same way for a cubemap as for any other texture type: just prefix sampler onto the texture property’s reference name.

    Also, I’m pretty lazy and don’t bother with using a separate file. Instead I just put the code right there in the text box.
     
  14. bgolus
    Also, to touch on this: a tex2D or texCUBE call takes a texture (sampler2D or samplerCUBE) and a position (or direction, for cube maps) to read from on that texture, and the function returns the color value of the texture at that location.
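    Under the hood, a cube map sample picks whichever face the direction points at most strongly and projects the direction onto that face. A rough plain-Python sketch of that face selection (exact u/v orientation conventions vary by graphics API; this is illustrative, not Unity's code):

```python
def cubemap_face_uv(d):
    """Pick the cube face for direction (x, y, z); return (face, u, v), uv in [0, 1].
    Faces are numbered in the common +X/-X/+Y/-Y/+Z/-Z order."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:       # X is the major axis
        face, major, u, v = (0 if x > 0 else 1), ax, (-z if x > 0 else z), -y
    elif ay >= az:                  # Y is the major axis
        face, major, u, v = (2 if y > 0 else 3), ay, x, (z if y > 0 else -z)
    else:                           # Z is the major axis
        face, major, u, v = (4 if z > 0 else 5), az, (x if z > 0 else -x), -y
    # project onto the face plane and remap from [-1, 1] to [0, 1]
    return face, 0.5 * (u / major + 1.0), 0.5 * (v / major + 1.0)

print(cubemap_face_uv((1.0, 0.0, 0.0)))  # → (0, 0.5, 0.5): centre of the +X face
```

    This is also why the direction passed to texCUBE doesn't need to be normalized for the sample itself: scaling the vector doesn't change which face it hits or where.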
     
  15. Raptosauru5
    I love how you're just supposed to "know" this. I spent about 5 hours trying to find info about this alone; if you hadn't helped me, I would have given up and never found out how it works. I like that they are trying to make it easier, but there needs to be proper documentation about how exactly to proceed, because this is really complicated stuff!

    So, a little success: I have some progress here. Thanks a lot! My custom function now outputs a single color using the cubemap (I am not sure if I should have used Texture2D though). It looks like this now:
    [Image: SG02.jpg]

    The code string looks like this (I just put the code inline because I want to get it working first, though having a separate file seems more comfortable in this situation):
    Code (CSharp):
    uvw = normalize(uvw);
    float4 col = SAMPLE_TEXTURECUBE(cubeTex, sampler_MyCubemap, uvw);

    // save off a copy we’ll iterate on
    float3 rotUVW = uvw;

    for (int i = 0; i < numSteps; i++)
    {
       // rotate once per step
       rotUVW = mul((float3x3)velRotMatrix, rotUVW);

       // reflect instead of counter rotate - slightly faster
       float3 reflectUVW = reflect(rotUVW, uvw);

       // sample in both rotations
       col += SAMPLE_TEXTURECUBE(cubeTex, sampler_MyCubemap, rotUVW);
       col += SAMPLE_TEXTURECUBE(cubeTex, sampler_MyCubemap, reflectUVW);
    }
    // normalize by the total sample count
    col /= (float)(1 + numSteps * 2);

    Out = col;
    //Out = float4(1,1,1,1);
    It seems to output just a single pixel color from the inserted cubemap, because if I move uvw, it shows colors seen on the texture.

    Side note: it seems like I can't use the built-in helpers (https://docs.unity3d.com/Manual/SL-BuiltinFunctions.html) like "fixed4". It seems like Unity can't recognize what fixed4 is, even if I put #include "UnityCG.cginc" in the code...
     
  16. bgolus
    You’ll want/need a decent rotation matrix, otherwise you’re going to have a hard time testing anything. Calculate a rotation matrix with a 15 degree rotation in a dummy C# script and copy the values for now.
    Code (csharp):
    Matrix4x4 rotMat = Matrix4x4.Rotate(Quaternion.AngleAxis(15f, Vector3.up));
    For actually setting the value on the shader, you’ll either need to define a float4x4 uniform in your external file (a shader value defined outside of a function, usually mirrored from the shader’s properties, but not required to exist in both to set from script), or have 3 Vector3 properties you use to construct the matrix with. Unity doesn’t have a Matrix3x3 in C#, you can’t use a float3x3 as a uniform in Unity shaders, and you can’t have a matrix property.


    That’s not a “built in function”, it’s a precision type. fixed, half, and float were the numerical precision types Unity defined to stand in for OpenGL ES’s lowp, mediump, and highp precision modifiers. On desktop they all just mapped to float, and almost nothing actually supports lowp anymore. For the SRPs they swapped to defining half, float, and real, with that last one mapping to either half or float depending on which is the lowest precision the target platform allows... which is weird, because that’s already what half does.

    TL;DR: just use float for everything.
     
  17. Raptosauru5
    Alright, I copied the values from the dummy script to the Matrix3x3 in Shader Graph, but it still doesn't feel right. It is always just a single color, whatever I input. Is that what we are after?

    I also can't find a way to convert from a Matrix4x4 to a 3x3, because I really suck with matrices and don't understand them that well. I just know they contain transform values like translation, scale, and rotation... maybe I can take out the rotation values using mat[a, b] and then construct a 3x3 matrix out of them?

    I guess when working with transforms Unity always uses Matrix4x4, but we are using a Matrix3x3 because "uvw" has 3 values (x, y, z) and we wouldn't be able to multiply it with a Matrix4x4?
     
    Last edited: Apr 27, 2020
  18. bgolus
    Nope, but it is what I would expect if you’re passing a constant vector into the UVW. That should be the object space position, which you can get from the Position node set to Object space.

    Yep. Just take the first 3 components of the first 3 rows... or columns, I can never remember. Pass those to the shader as 3 Vector3s and construct a matrix from them in the node graph. The Matrix Construction node also lets you choose whether it builds the matrix from rows or columns, so how you get the values from C# isn’t super important; it can be “fixed” in the graph. Really, you can pass in 3 Vector4 values and use GetRow or GetColumn to simplify things too.

    A position has 3 values (x, y, z) too. The difference between a 4x4 and a 3x3 is that the 3x3 matrix is the rotation and scale part of the 4x4 matrix. Basically we’re skipping the translation, since we only care about the rotation.
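    That relationship is easy to check outside Unity; a plain-Python sketch (the matrix values are a made-up example, a 90° Y rotation plus a translation):

```python
# A 4x4 TRS matrix stores rotation/scale in its upper-left 3x3 block and
# translation in the last column. Directions ignore translation, so
# multiplying by just the 3x3 block is enough for rotating the uvw.
def mat_vec3(m3, v):
    return tuple(sum(m3[r][c] * v[c] for c in range(3)) for r in range(3))

m4 = [[ 0.0, 0.0, 1.0, 5.0],   # 90° rotation around Y ...
      [ 0.0, 1.0, 0.0, 0.0],
      [-1.0, 0.0, 0.0, 0.0],   # ... plus a translation of (5, 0, 0)
      [ 0.0, 0.0, 0.0, 1.0]]

# "take the first 3 components from the first 3 rows"
m3 = [row[:3] for row in m4[:3]]
print(mat_vec3(m3, (1.0, 0.0, 0.0)))  # → (0.0, 0.0, -1.0): rotation only
```

    The translation in the last column never touches the result, which is exactly why a 3x3 is all the blur shader needs.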
     
  19. Raptosauru5
    Nice!
    Position node set to Object Space fixed it. Now it properly shows the whole texture.

    I did a lot of experimenting, and it seems like Matrix Construction can take both Vector3 and Vector4, and in C# Material.SetVector() can also handle a Vector3, even though it should only be compatible with Vector4. Interesting...

    Now I am having pretty good results here; it's almost working! When I rotate the ball, it starts "ghost-repeating" the texture, which looks pretty good! But it is not exactly "smudged", so I guess I still have to do the lerp part. I am getting the result you were talking about:

    But why are we using the reflect() function here?
    Code (CSharp):
    // reflect instead of counter rotate which is slightly faster
    float3 reflectUVW = reflect(rotUVW, uvw);
    It just mirrors the texture to the other side of the sphere, across the center. The result is that I can see an imprint of the back side on the front, upside down. Why are we doing this? (Removing reflectUVW gives a better result, but the resulting texture is darker.)
     
  20. bgolus
    Unity’s Vector3 struct can implicitly cast to a Vector4, so anything that takes a Vector4 will accept a Vector3 and “just work”, converting it to a Vector4 with a w of 0.0.

    How many iterations are you using, and are you scaling down the rotation matrix angle to account for that? You’ll probably want something like at least 4-5 iterations, which means 9 or 11 samples since each iteration samples the texture twice.

    It’s a fast way of doubling the number of sample locations. When you blur something along a direction, you want multiple samples both + and - from the starting sample position. As the comment mentions, it should be faster than rotating the vector twice as many times.
     
  21. bgolus
    So, I realized my reflect() there isn't going to do what I want, not without some extra work at least. Reflecting by the UVW will cause some warping, so you do need to rotate & counter rotate the uvw.
    Code (csharp):
    // rotate once per step
    rotUVW = mul((float3x3)velRotMatrix, rotUVW);
    counterRotUVW = mul(counterRotUVW, (float3x3)velRotMatrix);
    The good news is I was also wrong about the reflect being faster! The counter rotate ends up being exactly the same cost. Actually slightly faster, because we don't need the normalize. Yay!?
    :oops:
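    Why mul(counterRotUVW, matrix) counter rotates: multiplying a row vector by a matrix is the same as multiplying the column vector by the matrix's transpose, and for a pure rotation the transpose is the inverse. A quick plain-Python check of that identity (standing in for HLSL's mul; the 30° Y rotation is an arbitrary example):

```python
import math

def mat_mul_vec(m, v):
    """mul(M, v): v treated as a column vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def vec_mul_mat(v, m):
    """mul(v, M): v treated as a row vector, equivalent to transpose(M) * v."""
    return tuple(sum(v[r] * m[r][c] for r in range(3)) for c in range(3))

a = math.radians(30.0)
rot_y = [[ math.cos(a), 0.0, math.sin(a)],
         [ 0.0,         1.0, 0.0        ],
         [-math.sin(a), 0.0, math.cos(a)]]

v = (1.0, 0.0, 0.0)
forward = mat_mul_vec(rot_y, v)     # rotate 30° one way
back = vec_mul_mat(forward, rot_y)  # row-vector multiply rotates it back
# back ≈ v: the same matrix gives both the rotation and the counter rotation
```

    So the shader gets both blur directions from one matrix, with no second matrix upload and no extra normalize.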

    [Image: upload_2020-4-28_23-54-50.png]

    So here's the Shader Graph I made to test.
    [Image: upload_2020-4-28_23-58-12.png]
    Note: I'm using a Sampler State node because, at least in the version of URP I tested this on, the explicitly named sampler state hack seems to cause an error in one of the generated passes.

    Here's the Custom Function node's code:
    Code (csharp):
    Out = SAMPLE_TEXTURECUBE(
        cubeTex,
        samplerState,
        uvw);

    float3 rotUVW0 = uvw;
    float3 rotUVW1 = uvw;
    for (int i = 0; i < (int)numSteps; i++)
    {
      rotUVW0 = mul(rotMatrix, rotUVW0);
      rotUVW1 = mul(rotUVW1, rotMatrix);

      Out += SAMPLE_TEXTURECUBE(
        cubeTex,
        samplerState,
        rotUVW0);
      Out += SAMPLE_TEXTURECUBE(
        cubeTex,
        samplerState,
        rotUVW1);
    }
    Out /= (float)(1 + numSteps * 2);
    You might also note I'm setting numSteps as a material property. Here's the script I'm using to test it.
    Code (csharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    [ExecuteInEditMode]
    public class rotationDummy : MonoBehaviour
    {
        // material to modify values on
        public Material material;

        // how fast to rotate the object
        public float degreesPerSecond = 0f;

        // what world axis to rotate on
        public Vector3 axis = Vector3.up;

        // use a fixed framerate for the blur rather than the current delta
        public float blurFramerate = 30f;

        // scale the blur for different effects
        // for example real world film blur is only half the delta between frames
        public float blurScale = 0.5f;

        // optimization settings
        // target angle between samples
        public float anglePerStep = 1f;

        // max angle for full sweep
        [Range(15f, 360f)] public float maxAngleDelta = 360f;

        // max number of samples to do
        public int maxSteps = 30;

        // hack to show what kind of values we're setting on the material for easy reference
        [Header("Current Render Settings")]
        public float numSteps; // actual number of steps
        public float perStepAngle; // actual angle between samples
        public float angleDelta; // total sweep angle

        private Matrix4x4 rotationMatrix;

        void Update()
        {
            // make sure we have a valid axis to rotate around
            if (axis == Vector3.zero)
                axis = Vector3.up;
            else
                axis = axis.normalized;

            // rotate current transform
            // lazily assuming this is on the object the material is on
            transform.rotation = Quaternion.AngleAxis(degreesPerSecond * Time.deltaTime, axis) * transform.rotation;

            // calculate the total sweep angle
            angleDelta = degreesPerSecond / blurFramerate * blurScale;

            // get the clamped half angle of the sweep, since we're doubling this
            // in the shader by rotating the uvw in both directions
            float halfAngleDelta = Mathf.Min(angleDelta, maxAngleDelta) / 2f;

            // make sure the half angle is over some arbitrary min and calculate the number of steps to use
            if (halfAngleDelta > 0.1f)
                numSteps = Mathf.Clamp(Mathf.Round(Mathf.Abs(halfAngleDelta / anglePerStep)), 1f, (float)maxSteps);
            else
                numSteps = 0f;

            // if more than zero steps, calculate the appropriate matrix
            if (numSteps > 0f)
            {
                // get the per step angle
                perStepAngle = Mathf.Abs(halfAngleDelta / numSteps);

                // get the object space rotation matrix
                rotationMatrix = Matrix4x4.Rotate(Quaternion.AngleAxis(perStepAngle, transform.InverseTransformDirection(axis)));
            }

            // set values on material
            if (material != null)
            {
                material.SetMatrix("_VelRotMatrix", rotationMatrix);
                material.SetFloat("_NumSteps", numSteps);
            }
        }
    }
    With the default anglePerStep set to 1, the blur is a little chunky when zoomed in on the ball while paused.
    [Image: upload_2020-4-29_0-17-51.png]
    But really, this is spinning so fast that in motion you'll never see it. You could probably use much larger steps and never notice.
     
    Olmi, Raptosauru5, cLick1338 and 2 others like this.
  22. Raptosauru5
    Man, you are a legend! You saved me a lot of time. Thank you so much!

    It works great and the effect looks really sick! Also, thanks for the extra comments in the code; I learned so much while solving this problem, and it forced me to look into plenty of new areas.

    This effect is obviously pretty hard to achieve, and there is zero info on the web about how to even start.
    You have some impressive skills; aren't you perhaps making online tutorials or something? If so, I would be really interested in checking them out. :)

    Thanks again

    EDIT: I think I will implement it in my game in more places than originally intended.
    There is something really satisfying about this effect...I can't stop playing with it. :D
     
  23. bgolus
    One last thing. In my example code I'm manually spinning the ball in Update. If you're using a rigidbody, you probably want to do the update in FixedUpdate and get the rigidbody's angularVelocity to drive the effect. That's a Vector3 whose direction is the axis of rotation, with the radians per second encoded in the vector's magnitude.

    Also, it might not look totally perfect once you add in regular motion on top of it.
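    For reference, splitting an angular velocity vector like that into an axis and a speed is just a normalize plus a magnitude; a plain-Python sketch of what the C# side might do (the function name is made up, and the π rad/s input is an arbitrary example):

```python
import math

def axis_and_degrees_per_second(angular_velocity):
    """Split an angular velocity vector into (unit axis, degrees per second).
    The vector's magnitude is radians per second, so convert it for
    degree-based APIs like Quaternion.AngleAxis."""
    x, y, z = angular_velocity
    mag = math.sqrt(x * x + y * y + z * z)
    if mag < 1e-8:
        return (0.0, 1.0, 0.0), 0.0  # arbitrary axis when not spinning
    return (x / mag, y / mag, z / mag), math.degrees(mag)

axis, speed = axis_and_degrees_per_second((0.0, math.pi, 0.0))
print(axis, speed)  # axis (0, 1, 0), ~180 degrees/second: half a turn per second
```

    The axis then feeds the per-step rotation matrix and the degrees-per-second value replaces the script's hand-set degreesPerSecond.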
     
  24. bgolus
    I post a lot here on the forums, sometimes on Twitter, and I post an article on Medium once a year or so. Otherwise, no. There are plenty of tutorials out there already for the basics, and some decent ones for random common intermediate stuff.
     
    AcidArrow likes this.
  25. Raptosauru5
    Thanks for the note about the Update function.

    Yeah, for the difference between Update and FixedUpdate, I would actually recommend anyone watch some tutorials, because it is good to know when FixedUpdate runs and when Update runs (there are lots of misunderstandings about how they work). It saves you from plenty of framerate issues in the future.

    I am actually thinking about freezing the rotation and just making the texture move... if I can figure out how to properly compute the rotation (maybe it will be too difficult to get a satisfactory result, so let's see). Otherwise, using angularVelocity would be a good idea.

    By the way, you earned a new follower on Medium :)