
Ultrasound VR Medical Simulation

Discussion in 'General Discussion' started by DanielSCG, Sep 5, 2023.

  1. DanielSCG

    DanielSCG

    Joined:
    Jun 6, 2023
    Posts:
    3
    I am working on developing a medical-grade ultrasound simulation within Unity. The simulation is intended for use on conventional 3D models rather than voxel data. The target platform is the upcoming Oculus Quest 3, used as a headset tethered to a PC.

    My initial plan is to incorporate InfinyTech3D's simulation solution (Feature - Imaging Ultrasound - EN - InfinyTech3D), although I am uncertain about the computational intensity this could entail. The current project scope requires the simulation to function on a 3D model of a prostate, a structure with numerous concave surfaces. Additionally, the simulation must support multiple ultrasound devices operating simultaneously on the same 3D model, each projecting its output onto a separate screen.

    Given the potential computational demands, I am considering offloading the ultrasound simulation to a separate application. This auxiliary application would receive input from the main virtual reality application, execute the ultrasound simulation, and then feed the resulting data back into the main Unity application. This could be executed on a separate PC or on the same machine using a dedicated graphics card.

    If opting for a single-machine setup, my aim would be to allocate one graphics card to the virtual reality application and another to the ultrasound simulation. Both applications would run concurrently and exchange data in real time. Is this even possible with modern cards (RTX 4090, etc.)?
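
    As a rough sketch of what I have in mind for the launch side (assuming the auxiliary simulation ends up as a Unity standalone player on Windows), the VR app could start the second build pinned to the other adapter. I believe Unity players accept a -force-device-index command-line argument under Direct3D/Vulkan; an OpenGL build might instead need the GPU selected in the driver control panel. The path and class name here are placeholders, not an existing API:

    Code (CSharp):
    // Sketch only: launch the auxiliary simulation build pinned to the second GPU.
    using System.Diagnostics;
    using UnityEngine;

    public class SimulationLauncher : MonoBehaviour
    {
        [SerializeField] private string simulationExePath = @"C:\Sim\UltrasoundSim.exe"; // hypothetical path
        [SerializeField] private int gpuIndex = 1; // 0 = first card, 1 = second card

        private Process simulationProcess;

        private void Start()
        {
            simulationProcess = Process.Start(new ProcessStartInfo
            {
                FileName = simulationExePath,
                Arguments = $"-force-device-index {gpuIndex}",
                UseShellExecute = false
            });
        }

        private void OnApplicationQuit()
        {
            // Don't leave the simulation running when the VR app closes.
            if (simulationProcess != null && !simulationProcess.HasExited)
                simulationProcess.Kill();
        }
    }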

    Is this a feasible approach, or would it be more prudent to focus on optimizing the simulation to run solely on a single graphics card?
     
  2. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    The more important question is whether you have an experienced developer for such a project, because it isn't anything small or quick.

    Do you have a team, or are you working alone?

    Regarding performance, you should run performance tests of a mockup scene, or something relevant, on the target hardware. Each project is so different that such a question may be difficult to answer.

    And even if someone says it is possible, a lack of expertise may make it difficult to achieve such results.

    Real time apps require lots of smoke and mirrors.
     
  3. DanielSCG

    DanielSCG

    Joined:
    Jun 6, 2023
    Posts:
    3
    It will mainly just be myself working on the application. I have built many VR apps in Unity in the past and have 5+ years of experience. However, I have zero experience with running multiple apps separately on the same PC, each allocated to a different GPU. The concept isn't even something I would have arrived at by myself; it was suggested to me by someone else.

    I am still waiting to get my hands on the InfinyTech3D software so I can see for myself if it would work within the performance constraints of VR.

    I really just wanted to see if anyone had any thoughts on separating out applications in this way as it's not something I have ever heard of before.

    Another thing to note is that the InfinyTech3D ultrasound simulation is OpenGL only. So even if performance wasn't an issue, it may be necessary to separate out the applications, as they would require different graphics APIs.
     
  4. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    571
    This sounds to me like a multiplayer game, but it's a bit hard to tell what your goal is. Frameworks for multiplayer games exist, if the goal is to have people work at the same time. If each player is supposed to have their own ultrasound device, I think that's going to push a single machine too far, and each person should have their own GPU.

    If it has to be a single computer, maybe adding a team member to create a DOTS implementation would make it performant enough.
     
  5. tsibiski

    tsibiski

    Joined:
    Jul 11, 2016
    Posts:
    569
    Why not just develop the simulation as one app, and if the demands are indeed high, then separate it out? What about the simulation would render a headset unable to compute it, out of curiosity?
     
  6. DanielSCG

    DanielSCG

    Joined:
    Jun 6, 2023
    Posts:
    3
    Hi. To clarify, this is not a multiplayer app; it will be a single-user app for VR biopsy training. The best ultrasound simulation technology we have found is InfinyTech3D's. The caveat is that it only runs on OpenGL, so we are considering breaking the app into two separate parts: the VR application, where the user will move around the VR scene and move a scanner, and the ultrasound simulation, where there will be a density-correct 3D model of the body part being scanned and the ultrasound output.

    I plan to stream the position and rotation of the ultrasound sensor out of the VR application via a socket or shared memory. The ultrasound simulation will then write its output to a texture that is streamed back into the VR application, so the user can see the ultrasound display.
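
    As a rough sketch of the bridge on the VR side (the ports, packet layout, and class name are all placeholders I made up; a single UDP datagram tops out around 64 KB, so the image stream itself would more likely go over TCP, shared memory, or a texture-sharing framework):

    Code (CSharp):
    // Sketch of the VR-side bridge: push the probe pose out every frame and
    // pull back encoded ultrasound frames to display on a screen in the scene.
    using System.Net;
    using System.Net.Sockets;
    using UnityEngine;

    public class UltrasoundBridge : MonoBehaviour
    {
        [SerializeField] private Transform probe;            // the tracked scanner in the VR scene
        [SerializeField] private Renderer ultrasoundScreen;  // quad that displays the scan
        [SerializeField] private string simulationHost = "127.0.0.1";

        private UdpClient poseSender;
        private UdpClient frameReceiver;
        private Texture2D frameTexture;

        private void Start()
        {
            poseSender = new UdpClient();
            frameReceiver = new UdpClient(9051);              // assumed port for incoming frames
            frameTexture = new Texture2D(2, 2);               // resized automatically by LoadImage
            ultrasoundScreen.material.mainTexture = frameTexture;
        }

        private void Update()
        {
            // 1. Send the probe pose as 7 floats: position xyz + rotation xyzw.
            float[] pose =
            {
                probe.position.x, probe.position.y, probe.position.z,
                probe.rotation.x, probe.rotation.y, probe.rotation.z, probe.rotation.w
            };
            byte[] poseBytes = new byte[pose.Length * sizeof(float)];
            System.Buffer.BlockCopy(pose, 0, poseBytes, 0, poseBytes.Length);
            poseSender.Send(poseBytes, poseBytes.Length, simulationHost, 9050); // assumed port

            // 2. Pull any ultrasound frames (PNG/JPG bytes) that arrived since last frame.
            while (frameReceiver.Available > 0)
            {
                IPEndPoint remote = null;
                byte[] encodedFrame = frameReceiver.Receive(ref remote);
                frameTexture.LoadImage(encodedFrame);         // decode into the displayed texture
            }
        }

        private void OnDestroy()
        {
            poseSender?.Dispose();
            frameReceiver?.Dispose();
        }
    }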
     
  7. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    I don't know the app, but to me it sounds like diving into a really deep rabbit hole of unknowns.
    Basically, connecting VR with some third-party application where FPS is critical for the user experience, even if it is just for a simulation.

    My suggestion would be to find a way to write it all in Unity, if you want VR and performance-critical stuff.
     
  8. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,321
    It is possible to have a comfortable experience in VR at low FPS.

    Basically, in this case VR should be handled the way it is done in Virtual Desktop. The content is presented to each eye on a giant sphere, like in a movie theater, and the device tracks orientation only, without trying to reproject anything. The important thing is that this "sphere" should be handled at the full framerate, but whatever is shown on it can run at as low as 10 fps without much issue.

    Making it work in Unity is another story, of course, as this stuff is usually handled at the driver level.
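
    As a very rough Unity-level analogue of the idea (not the driver-level reprojection, and assuming the built-in render pipeline), you can keep the headset rendering at the full framerate while the expensive camera only refreshes its RenderTexture a few times per second. Names and the 10 Hz figure are just illustrative:

    Code (CSharp):
    // Sketch: the VR camera runs at headset framerate; this camera re-renders
    // its RenderTexture (shown on an in-world screen or sphere) at a much lower rate.
    using UnityEngine;

    public class LowRateContentCamera : MonoBehaviour
    {
        [SerializeField] private Camera contentCamera;         // renders the expensive content
        [SerializeField] private RenderTexture contentTarget;  // displayed on the in-world screen
        [SerializeField] private float contentFps = 10f;

        private float nextRenderTime;

        private void Start()
        {
            contentCamera.enabled = false;                 // stop automatic per-frame rendering
            contentCamera.targetTexture = contentTarget;
        }

        private void LateUpdate()
        {
            if (Time.unscaledTime >= nextRenderTime)
            {
                contentCamera.Render();                    // refresh the texture at ~contentFps Hz
                nextRenderTime = Time.unscaledTime + 1f / contentFps;
            }
        }
    }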
     
  9. Antypodish

    Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,574
    Which confirms what I just mentioned. Projection is a separate story. Just like a virtual gallery, isn't it?

    Unless you meant something different.

    Unless the camera doesn't rotate at all.

    The question is whether the ultrasound images are supposed to be generated in real time, or just generated once per user request.