Unity 3D within a Windows Application Environment

Discussion in 'Windows' started by Tapion817, Mar 25, 2014.

  1. Tapion817

    Tapion817

    Joined:
    Apr 6, 2009
    Posts:
    3
    Good evening everyone,

I am a new developer and I am currently working on a Windows application. I have to integrate a window within the application that will show a 3D model and be controlled by the Windows application. I am a huge fan of the Unity engine and have worked with it as a hobby. I am just wondering if it is possible to have an instance of a compiled Unity executable within my Windows application environment, so that when I run the program, the Unity project is loaded within the software. Here is a diagram to show what I mean. I don't need to know how to integrate it, just whether such integration is possible. Thank you.

[Attached image: Windows Application Example.jpg]
     
  2. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
  3. Tan-Tan

    Tan-Tan

    Joined:
    Aug 8, 2013
    Posts:
    2
Thanks. I know in theory it's possible, but I was hoping it would work with Unity, simply because I've worked with it before and I'm quite fond of the engine.
     
  4. Tan-Tan

    Tan-Tan

    Joined:
    Aug 8, 2013
    Posts:
    2
When you say it's possible, do you mean there is currently a way, like a workaround? I don't need a deep explanation; just a general idea would be useful. Thanks.
     
  5. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    5,309
Could you embed the webplayer HTML instead (using the WebBrowser component, or whatever it's called these days : ),
and then maybe even interact with it using JavaScript?
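Roughly something like this (an untested sketch; the page path and the sendToUnity JavaScript function are made-up names, and the page itself would have to relay the call on to the player):

Code (CSharp):
using System.Windows.Forms;

public class WebPlayerHost : Form
{
    private WebBrowser browser = new WebBrowser { Dock = DockStyle.Fill };

    public WebPlayerHost()
    {
        Controls.Add(browser);
        // Load the webplayer's host page; the path is just an example.
        browser.Navigate("file:///C:/game/WebPlayer.html");
    }

    public void SendToUnity(string message)
    {
        // Call into the page's JavaScript, which relays the message to the player.
        browser.Document.InvokeScript("sendToUnity", new object[] { message });
    }
}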
     
  6. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
Sorry for not being clear... Sadly, there's no workaround; it would require some tweaking of internal engine code.
     
  7. Tapion817

    Tapion817

    Joined:
    Apr 6, 2009
    Posts:
    3
    Lame. Guess I'll have to use another engine then. :/
     
  8. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    5,309
  9. Tapion817

    Tapion817

    Joined:
    Apr 6, 2009
    Posts:
    3
    That seems pretty viable. I'll test some stuff out then, thanks.
     
  10. wakespirit

    wakespirit

    Joined:
    Mar 20, 2014
    Posts:
    5
Dear all,

We are new to Unity, and for a customer project we need to control a Unity object with touch from a WPF C# touch application.

The object in Unity is a small test 3D cube, given by our customer, that we should be able to rotate from a WPF container.

How can we control that cube from our application?

In WPF we have the possibility to access the UnityWeb container as an ActiveX control; do you think that if we run Unity in that web browser we will be able to directly manipulate it with touch?

Thanks for your prompt help, which will help us move forward, as we have been stuck here for days now.

I have a trial version of Unity for testing this and can create a small object with your help, in case something needs to be set up in the Unity object for manipulation.

Help really appreciated,
regards
serge
     
  11. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    788
It is possible to embed, or rather, reparent a Unity executable window into another window, just as with any other Windows application.
Just obtain the window handle and pass it to the SetParent WinAPI call
(it is a good idea in that case to launch the Unity executable with the -popupwindow parameter).

Note that I've done this with the whole parent window area, so I'm not completely sure whether it's possible to occupy only a portion of the window (by passing, say, the HWND of a groupbox / panel to SetParent) as is depicted in the topic.

We've done this, and although somewhat cumbersome, the Unity exe then runs completely embedded in the parent surface/window.
(Although now that I'm thinking about it, we should probably have launched the exe and implemented e.g. fullscreen/resolution logic in the exe itself...)

The Unity exe then has focus, so any changes have to be communicated, if required, to the WinForms/WPF process, and the only way of doing so is via sockets - everything else is safely buried deep inside this custom Mono.
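For reference, a minimal sketch of the reparenting approach (the class and method names are illustrative, not from our production code):

Code (CSharp):
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;
using System.Windows.Forms;

public class UnityHostForm : Form
{
    [DllImport("user32.dll")]
    static extern IntPtr SetParent(IntPtr hWndChild, IntPtr hWndNewParent);

    [DllImport("user32.dll")]
    static extern bool MoveWindow(IntPtr hWnd, int x, int y, int width, int height, bool repaint);

    public void EmbedUnity(string unityExePath)
    {
        // -popupwindow gives a borderless player window, which embeds cleanly.
        var process = Process.Start(unityExePath, "-popupwindow");
        process.WaitForInputIdle();

        // The main window can take a moment to appear; poll until it exists.
        while (process.MainWindowHandle == IntPtr.Zero)
        {
            Thread.Sleep(100);
            process.Refresh();
        }

        // Reparent the player window into this form and stretch it to fill us.
        SetParent(process.MainWindowHandle, this.Handle);
        MoveWindow(process.MainWindowHandle, 0, 0, ClientSize.Width, ClientSize.Height, true);
    }
}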
     
    indevious likes this.
  12. Aurimas-Cernius

    Aurimas-Cernius

    Unity Technologies

    Joined:
    Jul 31, 2013
    Posts:
    2,212
No, at least not with released versions.
We've tried that, but I think it needed some modifications inside Unity. I don't know what state that work is in.
     
  13. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    788
Erm, what exactly "needs" some modifications?
The Unity executable window has an HWND like any other Windows application, so it can be reparented / assigned to another window
(we run the Unity executable windowed; maybe fullscreen could pose a problem).
If it weren't possible, I wouldn't see our .NET/WinForms app running, calling the external Unity program and assigning it into a prepared WinForm - which certainly is not the case :)

Btw, I'm talking about 'classic' WinAPI, not Metro and such (which I don't know), if that's what you meant
-- or maybe I'm describing a slightly different use case.
     
    Last edited: Aug 11, 2014
  14. Aurimas-Cernius

    Aurimas-Cernius

    Unity Technologies

    Joined:
    Jul 31, 2013
    Posts:
    2,212
I meant that, for now, Unity assumes it's running in its own window. Reparenting it might work, but stuff like input, joysticks etc. might not function properly. You'll have to test it pretty thoroughly.
     
  15. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    788
The scene in our setup is driven by external network data, but keyboard input definitely works (we use it to adjust parameters at runtime), and IIRC mouse input does too.
But you may be right that this setup might not support all possible controller configurations.

Btw, for all things concerned, the window in which Unity runs is still the same - only its parent is changed.
It's not completely 'clean', however: the parent window can still resize / minimize / maximize the content (the Unity exe) - although we don't do that at runtime - but it doesn't have focus, for example.
The startup is a bit quirky too, but it has been sufficiently reliable so far.
(It takes some time for Windows to adjust all sizes/contexts etc., and the timing has to be right, with proper process Wait/Refresh calls.)


-- I'm not writing hypothetically - we use this 'in production', so to speak, as part of a bigger desktop application which runs at several clients now.
But the approach is not ideal - as I wrote earlier, we should probably have done all the resolution/fullscreen stuff in the client Unity exe all along and not be dependent on the WinForm.
Nevertheless, it works like this for now.
     
    Last edited: Aug 12, 2014
  16. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    5,309
    hey!

    Patch 4.5.5p1:
    "Windows Standalone: You can now embed windows standalone player into another application, simply pass -parentHWND and windows standalone application's window will be created with specified parent. See Command line arguments documentation for more information."
    http://unity3d.com/unity/qa/patch-releases
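A minimal sketch of how that could look from a WinForms host (untested; the names here are illustrative - check the Command line arguments documentation for the exact usage):

Code (CSharp):
using System.Diagnostics;
using System.Windows.Forms;

public class EmbedForm : Form
{
    private Panel unityPanel = new Panel { Dock = DockStyle.Fill };

    public EmbedForm(string playerExePath)
    {
        Controls.Add(unityPanel);
        // Pass the panel's HWND so the player creates its window as our child.
        Process.Start(playerExePath, "-parentHWND " + unityPanel.Handle.ToInt64());
    }
}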
     
  17. r618

    r618

    Joined:
    Jan 19, 2009
    Posts:
    788
So much for the "modifications inside Unity", I guess.
Haven't tested it yet, but good job :-)
     
  18. Grygus

    Grygus

    Joined:
    Aug 7, 2013
    Posts:
    18
Hi, how can I make the container app pass touch input to the embedded Unity player? Keyboard input works great.
     
  19. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
It would be pointless, because in 4.6 the Unity Windows Standalone player isn't capable of processing touch input. That will only be available in 5.0.
     
  20. Grygus

    Grygus

    Joined:
    Aug 7, 2013
    Posts:
    18
We are using TouchScript from the Asset Store to handle Windows input for us. So for now, is there no option to forward touch to this plugin or to the Unity app, or some workaround for this?
     
  21. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
In that case, it might work, but you need to know what kind of window messages they're processing, and forward those messages to the Unity application so TouchScript can catch them.

What about your container application - is it capable of accepting touch input?
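A rough sketch of the forwarding idea (hedged: WM_TOUCH's lParam carries a touch-input handle, and whether it remains usable in the receiving process is something you would have to verify):

Code (CSharp):
using System;
using System.Runtime.InteropServices;

public static class TouchForwarder
{
    const int WM_TOUCH = 0x0240;

    [DllImport("user32.dll")]
    static extern bool PostMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    // Call this from the container's window-procedure hook for every message it
    // sees; unityHWND is the handle of the embedded player window.
    public static void Forward(IntPtr unityHWND, int msg, IntPtr wParam, IntPtr lParam)
    {
        if (msg == WM_TOUCH)
            PostMessage(unityHWND, msg, wParam, lParam);
    }
}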
     
    Last edited: Feb 2, 2015
  22. WildMaN

    WildMaN

    Joined:
    Jan 24, 2013
    Posts:
    19
    Hey, I'm interested in embedding too. We're creating a larger Win8.1 touchscreen-oriented application where games are just one of many features. And utilizing embedded Unity sounds like the most efficient way.
     
  23. Grygus

    Grygus

    Joined:
    Aug 7, 2013
    Posts:
    18
Thanks Tomas, I will check this; it seems TouchScript is open source. But I wonder: where can I get information on which messages are processed directly by Unity? In other words, which messages can I forward to it? Can I somehow call a Unity method without using any other plugins? My container app is a simple WPF sample from MSDN with a touch handler.
     
  24. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
There's no such information, but in this case, because you care about touch messages, you should check the TouchScript plugin for that message list.

Regarding calling a Unity method, you need to implement interprocess communication or use sockets, because you basically have two processes here: the container and the Unity application.
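For the interprocess part, a minimal sketch of the socket approach (illustrative only; the port and the "rotate" command are made-up names). The Unity side listens on localhost and applies commands on the main thread:

Code (CSharp):
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

public class CommandListener : MonoBehaviour
{
    private readonly Queue<string> commands = new Queue<string>();
    private TcpListener listener;

    void Start()
    {
        listener = new TcpListener(IPAddress.Loopback, 13000);
        listener.Start();
        var thread = new Thread(() =>
        {
            // Blocks until the container connects, then reads one command per line.
            using (var client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    lock (commands) commands.Enqueue(line);
            }
        });
        thread.IsBackground = true;
        thread.Start();
    }

    void Update()
    {
        // Unity API calls must happen on the main thread, so drain the queue here.
        lock (commands)
        {
            while (commands.Count > 0)
                if (commands.Dequeue() == "rotate")
                    transform.Rotate(0f, 10f, 0f);
        }
    }

    void OnDestroy()
    {
        listener.Stop();
    }
}

The container side is then just a TcpClient connecting to 127.0.0.1:13000 and writing "rotate" followed by a newline.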
     
  25. jordiboni

    jordiboni

    Joined:
    Aug 14, 2013
    Posts:
    26
How can I manage keyboard input in an embedded Unity build? If I attach this script to a GameObject in your EmbeddedWindow project, it doesn't work. I am using a standalone build.

Code (CSharp):
using UnityEngine;
using UnityEngine.UI;

public class ShowKeyPressed : MonoBehaviour
{
    private Text text;

    // Use this for initialization
    void Start()
    {
        text = GetComponent<Text>();
    }

    // Update is called once per frame
    void Update()
    {
        if (Input.GetAxis("Vertical") != 0)
            text.text = "Vertical";

        if (Input.GetAxis("Horizontal") != 0)
            text.text = "Horizontal";
    }
}
     
  26. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
  27. jordiboni

    jordiboni

    Joined:
    Aug 14, 2013
    Posts:
    26
What is the difference between the source code in your post and the source code downloaded from the Command line documentation page? It doesn't work with Unity 4.6.
     
  28. Tomas1856

    Tomas1856

    Unity Technologies

    Joined:
    Sep 21, 2012
    Posts:
    1,875
Keyboard input wasn't working because the WM_ACTIVATE event was not being sent to the Unity application; it is sent like this: "SendMessage(unityHWND, WM_ACTIVATE, WA_ACTIVE, 0);"
     
  29. Giometric

    Giometric

    Joined:
    Dec 20, 2011
    Posts:
    163
I know this is an old topic, but I wanted to add a note about a specific issue I had and how I resolved it. I'm using WPF, and have the Unity game hosted inside a Panel contained in a WindowsFormsHost, which itself is within another Window (so that it only takes up a certain portion). I'm not sure whether it works better in pure Windows Forms, but here it seems like Unity doesn't respond to the WM_MOUSEACTIVATE (0x21) message. For my use, this creates an edge case: if you click off the window hosting the Unity game (to, say, click on the main app window, which is usually maximized), then back onto it, the Unity game will not regain focus. Mouse input still works (regardless of focus state, it seems), but keyboard input is not captured, so you can end up tabbing or using the arrow keys and focusing other controls on the window instead.

    To fix this, I set up a handler for WindowsFormsHost.MessageHook, which gives you any window messages that go unhandled. The handler just checks to see if the message was WM_MOUSEACTIVATE, and if it is, sends the Unity window a WM_ACTIVATE message (exactly like Tomas1856 posted above). As far as I can tell, everything works correctly after that.
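For reference, a sketch of that handler (names like windowsFormsHost and unityHWND are placeholders for your own fields):

Code (CSharp):
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms.Integration;

public partial class HostWindow : System.Windows.Window
{
    const int WM_MOUSEACTIVATE = 0x0021;
    const int WM_ACTIVATE = 0x0006;
    static readonly IntPtr WA_ACTIVE = new IntPtr(1);

    [DllImport("user32.dll")]
    static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    WindowsFormsHost windowsFormsHost; // the host wrapping the panel with Unity
    IntPtr unityHWND;                  // handle of the reparented player window

    void HookMessages()
    {
        // MessageHook receives messages the WindowsFormsHost left unhandled.
        windowsFormsHost.MessageHook += OnMessageHook;
    }

    IntPtr OnMessageHook(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
    {
        if (msg == WM_MOUSEACTIVATE)
        {
            // Re-activate Unity, exactly as Tomas1856 posted above.
            SendMessage(unityHWND, WM_ACTIVATE, WA_ACTIVE, IntPtr.Zero);
        }
        return IntPtr.Zero;
    }
}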
     
    Last edited: Oct 15, 2015
  30. jtsheedy

    jtsheedy

    Joined:
    Dec 7, 2017
    Posts:
    1
Does anyone have a simple example of a Unity window inside a WinForms app?
I am struggling to find one.
     
  31. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    6,338
  32. indevious

    indevious

    Joined:
    Nov 29, 2017
    Posts:
    16
    As an alternative to the window reparenting technique, just inject a DLL into the Unity process and subclass the window.
     
    velayudham likes this.