Question: Adjust ARKit camera exposure to Texture

Discussion in 'AR' started by Sheeks, Apr 25, 2022.

  1. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
    Hello everyone,
    I have read quite a bit about the fact that it is not possible to disable ARKit's automatic camera exposure.
    My problem is that I take a screenshot and put it on a mesh:
    you can clearly see the edges of the previously captured picture, because of the exposure (?) changes of the camera.
    I have already run some tests applying the difference of the "Ambient Brightness" and the colour temperature to the image in my shader, but without any useful results.
    Does anyone have an idea how I can achieve a seamless blend with the live background image (by adjusting the brightness/temperature or other parameters of the still image), so that there is no visible difference between the still image and the background?

    I would appreciate any help! And thank you for any tips!
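    For reference, here is a rough, simplified sketch of the kind of thing I tried, assuming AR Foundation's ARCameraManager light estimation (Light Estimation has to be enabled on the AR Camera Manager; the shader property names "_StillBrightness" and "_StillTemperatureDelta" are just placeholders for my own shader parameters):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Pushes ARKit's ambient light estimation into the material of the
    // still-image mesh so the shader can compensate for exposure drift.
    public class StillImageExposureMatcher : MonoBehaviour
    {
        [SerializeField] ARCameraManager cameraManager;   // Light Estimation must be enabled here
        [SerializeField] Renderer stillImageRenderer;     // the mesh showing the captured photo

        // Values stored at the moment the screenshot was taken.
        float brightnessAtCapture = 1f;
        float temperatureAtCapture = 6500f;

        void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
        void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

        // Call this right after taking the screenshot, with the current estimates.
        public void StoreCaptureReference(float brightness, float temperature)
        {
            brightnessAtCapture = brightness;
            temperatureAtCapture = temperature;
        }

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            var light = args.lightEstimation;

            if (light.averageBrightness.HasValue)
            {
                // Ratio between the current brightness estimate and the one at capture time.
                float ratio = light.averageBrightness.Value / Mathf.Max(brightnessAtCapture, 0.0001f);
                stillImageRenderer.material.SetFloat("_StillBrightness", ratio);
            }

            if (light.averageColorTemperature.HasValue)
            {
                // Difference in Kelvin; the shader can turn this into a warm/cool tint.
                float deltaK = light.averageColorTemperature.Value - temperatureAtCapture;
                stillImageRenderer.material.SetFloat("_StillTemperatureDelta", deltaK);
            }
        }
    }

    The shader then scales and tints the still image by these values, but so far this does not give a seamless result.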
     
  2. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
    No one? Or is my question not clear enough?
     
  3. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
  4. mikeyrafier98

    mikeyrafier98

    Joined:
    May 19, 2022
    Posts:
    37
    I don't have a good idea at the moment, but can you share the goal you want to achieve?

    Even though we can't experiment with the camera exposure, since it can't be changed from ARKit,
    perhaps we can find an alternative solution that meets your goals.
     
  5. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
    As described above (see the image), I would like to adjust a static photo, which sits as a texture on the mesh in the middle, so that it blends seamlessly with the live stream of the camera background.
    The background is constantly adjusted by the camera exposure, which makes the difference in brightness and colour temperature between the static texture and the live background visible.
    I have already played around manually with the brightness and the colour temperature of the directional light, which also leads to an acceptable result, but not automatically, because I can't work out how I have to edit the texture depending on the camera exposure. Has this become clearer?
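    To make "automatically" concrete, here is a minimal sketch of driving the directional light from the light estimation values every frame instead of tweaking it by hand (similar, if I remember correctly, to the light estimation sample in arfoundation-samples):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Drives the directional light automatically from ARKit's light estimation,
    // instead of adjusting its values manually.
    [RequireComponent(typeof(Light))]
    public class AutoDirectionalLight : MonoBehaviour
    {
        [SerializeField] ARCameraManager cameraManager;

        Light directionalLight;

        void Awake()
        {
            directionalLight = GetComponent<Light>();
            // Needed so Light.colorTemperature is actually used by the renderer.
            directionalLight.useColorTemperature = true;
        }

        void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
        void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            var estimate = args.lightEstimation;

            if (estimate.averageBrightness.HasValue)
                directionalLight.intensity = estimate.averageBrightness.Value;

            if (estimate.averageColorTemperature.HasValue)
                directionalLight.colorTemperature = estimate.averageColorTemperature.Value;

            if (estimate.colorCorrection.HasValue)
                directionalLight.color = estimate.colorCorrection.Value;
        }
    }

    This adjusts the light, but the open question is how to map the same estimates onto the still texture itself.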
     
  6. mikeyrafier98

    mikeyrafier98

    Joined:
    May 19, 2022
    Posts:
    37
    Hmm, based on your case: you have already tried setting the "directional light" parameters, but you need this automated, while the system configures the exposure in real time.

    I think you have already read this one, but it means that currently we can't do much to experiment with ARKit and its connection to the device's camera exposure from Unity:
    https://github.com/Unity-Technologies/arfoundation-samples/issues/324

    If you know which "directional light" parameters to change and how to change them by script in real time (for example, using a Slider UI, see the sketch below), then the next question is how to adjust them for almost every environment and use case.
    Like trial and error, and adjusting again.
    I know this doesn't sound like a perfect or even good solution, but hopefully others can help us more.

    Good luck!
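    Just to illustrate the Slider idea, a minimal sketch (the slider ranges are only assumptions and would need to be tuned per use case):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;

    // Hooks two UI Sliders up to the directional light so brightness and
    // colour temperature can be tweaked by hand at runtime (trial and error).
    public class DirectionalLightSliders : MonoBehaviour
    {
        [SerializeField] Light directionalLight;
        [SerializeField] Slider intensitySlider;     // e.g. range 0..2
        [SerializeField] Slider temperatureSlider;   // e.g. range 2000..10000 (Kelvin)

        void Start()
        {
            directionalLight.useColorTemperature = true;
            intensitySlider.onValueChanged.AddListener(v => directionalLight.intensity = v);
            temperatureSlider.onValueChanged.AddListener(v => directionalLight.colorTemperature = v);
        }
    }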
     
  7. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
    Yep, I did.
    Thanks for your reply. But trial and error is a messy thing, because I need a sustainable solution that automatically adjusts the parameters. Maybe anyone else has a hint on this problem?
     
  8. marcin-walus

    marcin-walus

    Joined:
    Feb 9, 2016
    Posts:
    10
    Hi.
    I'm working on texturing the mesh provided by AR Foundation Meshing (ARKit) and I'm facing the same problem. During scanning I take photos, and later on, in a "post process", I try to project those photos onto the mesh and build a texture.

    Screenshot 2022-12-07 at 09.25.31.png

    An issue with this approach is that there seems to be no way to lock the camera exposure, so photos taken from different angles can be brighter or darker; and since fragments of different photos are applied to different parts of the mesh, the result is an inconsistent texture.

    Screenshot 2022-12-07 at 09.22.01.png

    So what I'm planning to do is to manually equalize those photos in terms of luminosity before building the texture (see the sketch below).
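    As a rough sketch of that idea (simple global gain matching against a reference photo, working on readable Texture2D copies; a real solution would probably need per-region or histogram matching):

    Code (CSharp):
    using UnityEngine;

    // Scales every photo so that its mean luminance matches a reference photo
    // before projecting it onto the mesh.
    public static class LuminosityEqualizer
    {
        public static float MeanLuminance(Texture2D photo)
        {
            Color[] pixels = photo.GetPixels();
            float sum = 0f;
            foreach (Color c in pixels)
                sum += 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;   // Rec. 709 luma weights
            return sum / pixels.Length;
        }

        public static void MatchLuminance(Texture2D photo, float targetMean)
        {
            float scale = targetMean / Mathf.Max(MeanLuminance(photo), 0.0001f);
            Color[] pixels = photo.GetPixels();
            for (int i = 0; i < pixels.Length; i++)
            {
                pixels[i].r *= scale;
                pixels[i].g *= scale;
                pixels[i].b *= scale;
            }
            photo.SetPixels(pixels);
            photo.Apply();
        }
    }

    Then, before projecting, something like LuminosityEqualizer.MatchLuminance(photo, LuminosityEqualizer.MeanLuminance(referencePhoto)); with one photo picked as the reference.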
     
  9. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    1,062
    @marcin-walus ARKit 6 introduced new APIs that allow you to manually control camera hardware properties such as exposure and white balance. We are currently working on integrating these APIs, and you can expect this feature to land in the AR Foundation 5.1 preview in the first half of next year.
     
    Sheeks and marcin-walus like this.
  10. marcin-walus

    marcin-walus

    Joined:
    Feb 9, 2016
    Posts:
    10
    Hello @andyb-unity
    Thanks for the info about the new features. I will keep an eye on the new release.
     
    andyb-unity likes this.
  11. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
    andyb-unity likes this.
  12. Sheeks

    Sheeks

    Joined:
    Jun 9, 2017
    Posts:
    12
    @andyb-unity thanks once more for the info about the AR Foundation update. I tried the ISO lock on the camera. It works great, but there is still a colour temperature correction; is it possible to lock this as well?
     
  13. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    1,062
    mwalus and Sheeks like this.