
VST synths/effects in Unity?

Discussion in 'Editor & General Support' started by PlainMartin, Apr 3, 2008.

  1. PlainMartin

    PlainMartin

    Joined:
    Apr 3, 2008
    Posts:
    6
    Is it possible to use software synthesizers and effects based on Steinberg's Virtual Studio Technology in Unity, i.e. a Unity-based game/presentation?

    (When you want to make a VST instrument or effect available outside of your typical Audio Workstation such as Cubase, what you need is a so-called "microhost" - a software that loads the synth/effect so the user can control it, e.g. with a MIDI or alphanumerical keyboard.)

    So - could a Unity-based game/presentation (by means of scripting) act as a VST microhost to a VST, allowing the end user to "play" (trigger) sounds by interacting with in-game objects?

    (I am not a developer, so I don't know how the actual integration would be achieved. Typically, VSTs are platform-specific, i.e. a VSTi for Windows is a DLL which is then loaded into the host.)

    Has anyone done this before?
     
  2. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    It should be possible to do that using plugins; it's nothing Unity supports out of the box. And I'm not sure it makes that much sense in most cases, either, because VST plugins have a tendency to consume a lot of processing power (at least some of them), which would then compete with the processing power required to run the game. When the game needs too much processing power, your VST plugins start making ugly noises; when the VST plugins consume too much processing power, the game might become choppy...

    It sure would be nice to have some "game optimized sound-effects" inside of Unity, though...

    As I've mentioned in the other post I just sent: in theory, with plugins, I guess pretty much anything is possible. But it sure requires some serious programming skills (unless there's some really simple-to-use MIDI API available that you could easily integrate, but even then you'll need enough skill to do the integration, or someone who does it for you); and it might turn out that it only runs "smoothly" on super-high-end machines ;-)

    Sunny regards,
    Jashan
     
  3. PlainMartin

    PlainMartin

    Joined:
    Apr 3, 2008
    Posts:
    6
    Yup, I am aware of that. That's why I am/was hoping for an "integrated" approach/framework where I wouldn't lose even more CPU cycles to the 3D engine and the VST talking to each other ...

    So this falls into the "really sophisticated"/"hasn't been done before" category? Great. :(

    OTOH, let's think positive ... Some bored whiz kid who'd want to give this a try? :) I'd be willing to pay for a working demo. I just don't have a big budget for this.
     
  4. dirkk0

    dirkk0

    Joined:
    Nov 29, 2007
    Posts:
    16
    The easiest way to achieve this would probably be to integrate MIDI and send MIDI to VSTHost, for example (excellent app, btw), because you wouldn't have the hassle of integrating the VST libs themselves.
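    Something like this rough sketch could get the MIDI out of Unity (this isn't built into Unity; it assumes Windows, the winmm API via P/Invoke, and a virtual MIDI port such as loopMIDI that VSTHost listens on - the device index is just a placeholder):

    Code (CSharp):
    // Minimal sketch: send MIDI note-on/note-off from Unity to an external host (e.g. VSTHost)
    // through a virtual MIDI port. Windows-only; winmm.dll ships with the OS.
    using System;
    using System.Runtime.InteropServices;
    using UnityEngine;

    public class MidiOut : MonoBehaviour
    {
        [DllImport("winmm.dll")] static extern int midiOutOpen(out IntPtr handle, int deviceId, IntPtr callback, IntPtr instance, int flags);
        [DllImport("winmm.dll")] static extern int midiOutShortMsg(IntPtr handle, int message);
        [DllImport("winmm.dll")] static extern int midiOutClose(IntPtr handle);

        public int deviceId = 0;   // index of the virtual MIDI port (assumption: port 0)
        IntPtr handle;

        void Awake()     { midiOutOpen(out handle, deviceId, IntPtr.Zero, IntPtr.Zero, 0); }
        void OnDestroy() { midiOutClose(handle); }

        // A short MIDI message is packed into one int: status | data1 << 8 | data2 << 16.
        void Send(int status, int data1, int data2)
        {
            midiOutShortMsg(handle, status | (data1 << 8) | (data2 << 16));
        }

        public void NoteOn(int channel, int note, int velocity) { Send(0x90 | channel, note, velocity); }
        public void NoteOff(int channel, int note)              { Send(0x80 | channel, note, 0); }
    }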
     
  5. PlainMartin

    PlainMartin

    Joined:
    Apr 3, 2008
    Posts:
    6
    I see. Thanks.

    Now if I use VSTHost - or another microhost that I could incorporate into a final product - how do I get Unity 3D to (reliably) generate MIDI notes and CCs? I just had a look at the documentation and couldn't find anything.

    Or to put this differently:

    Let's assume I have a multitimbral VSTi running in VSTHost. Could a Unity app running on the same machine stream a typical MIDI song file, with a few 3D objects on screen used as additional note triggers or controllers for filters etc.?

    (I am aware that such a setup will probably choke very quickly if there are too many 3D objects, hundreds of MIDI notes + controllers, etc. But if you kept everything within a reasonable scope: could you do an interactive version of a typical scene demo? Tight audio playback and a navigable 3D space, where changing the POV and clicking on objects controls audio/VSTi playback?)
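    For illustration, the "clicking on objects" part could be as small as this sketch (it leans on the hypothetical MidiOut helper sketched above, nothing built into Unity):

    Code (CSharp):
    // Sketch: a 3D object as a note trigger. Needs a Collider so OnMouseDown/OnMouseUp fire,
    // and a MidiOut component (the hypothetical helper sketched earlier) assigned in the Inspector.
    using UnityEngine;

    public class NoteTrigger : MonoBehaviour
    {
        public MidiOut midiOut;   // hypothetical helper from the earlier sketch
        public int channel = 0;
        public int note = 60;     // middle C

        void OnMouseDown() { midiOut.NoteOn(channel, note, 100); }
        void OnMouseUp()   { midiOut.NoteOff(channel, note); }
    }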
     
  6. dirkk0

    dirkk0

    Joined:
    Nov 29, 2007
    Posts:
    16
    The 'cost' of the MIDI stream is insignificant, that much is sure.

    One can also assume that the load from the 3D objects mainly hits the GPU, while the load from the VST host lies mainly on the CPU, more or less (unless you have some fancy DSP equipment). Memory consumption will also be an issue (it always is - the more, the better). So the actual hardware setup will be important.

    At what number of 3D objects and VST sources such a system slows down is something that can be found out, I guess.
     
  7. PlainMartin

    PlainMartin

    Joined:
    Apr 3, 2008
    Posts:
    6
    Yes - this is where things get interesting. :)

    It seems you pointed me in the right direction.

    So what I need to brew up a working solution is:

    - A capable Unity coder, who will code ...

    - A Unity app that streams looped and event-based MIDI data (notes + controllers) to ...

    - A VST microhost hosting ...

    - A "leightweight", multitimbral VSTi.

    So there are technical questions (how many objects/notes can be used/generated safely on a typical system; wrapping it all up into a neat application) and legal issues (finding a host and a VSTi that can be used/licensed for such a product) - but at least I have an idea where to start.

    Well. If anyone reading this is interested in participating, please PM me. But I will also do some more research and post this to the Collaboration forum.

    Thanks again!
     
  8. keithsoulasa

    keithsoulasa

    Joined:
    Feb 15, 2012
    Posts:
    2,126
    Has anyone figured this out yet? I have a game idea that would work perfectly with this!
     
  9. steego

    steego

    Joined:
    Jul 15, 2010
    Posts:
    969
    It should be possible with the new 3.5 audio features, but you would have to write a pure C plugin for it (which means pro only and standalone only). The GUI part would probably be hard to implement, if you need it. You would also need a license to distribute the VST plugins with your project.

    I was thinking about experimenting with this myself, but I concluded it wasn't really worth the effort. You might be better off just implementing soft-synths and effects directly in C#.
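    For example, a bare-bones voice done that way might look like this (OnAudioFilterRead is the real Unity callback for custom DSP; the rest is just a toy sine oscillator, not a full synth):

    Code (CSharp):
    // Sketch: a tiny soft-synth voice written directly in C#.
    // Attach to a GameObject with an AudioSource; OnAudioFilterRead runs on the audio thread.
    using UnityEngine;

    public class SineVoice : MonoBehaviour
    {
        public float frequency = 440f;   // A4
        public float gain = 0.1f;

        double phase;
        double sampleRate;

        void Awake()
        {
            sampleRate = AudioSettings.outputSampleRate;
        }

        void OnAudioFilterRead(float[] data, int channels)
        {
            double increment = frequency * 2.0 * System.Math.PI / sampleRate;
            for (int i = 0; i < data.Length; i += channels)
            {
                phase += increment;
                if (phase > 2.0 * System.Math.PI) phase -= 2.0 * System.Math.PI;

                float sample = gain * (float)System.Math.Sin(phase);
                for (int c = 0; c < channels; c++)
                    data[i + c] = sample;   // same sample on every channel
            }
        }
    }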
     
    erichuget likes this.
  10. reefwirrax

    reefwirrax

    Joined:
    Sep 20, 2013
    Posts:
    137
    Are there file size and count limits on Unity sounds? Many synths work with soundbanks; you can copy old 1980s soundbanks from the net and put them in a folder, with something like 120 sounds on 127 notes each. That's enough to test your project and should only take a day. It's very easy, if it works at all. Somewhere online there should be 1-2 GB soundbanks of various kinds.
     
  11. MHD-Salka

    MHD-Salka

    Joined:
    Mar 19, 2013
    Posts:
    72
    The only limit is how many sounds can be played at the same time (no matter how big the sample library is).

    Currently Unity only supports a maximum of 32 voices of polyphony, which is sad.
     
  12. twobob

    twobob

    Joined:
    Jun 28, 2014
    Posts:
    2,058
    Hmm.
    Seems like you want to generate some sort of synth sound.
    Based on generated MIDI events.
    Ugh - this. https://github.com/n-yoda/unity-midi

    The other bit: where you say "VST", what you might mean is "synth", at a guess.
    It's in that.
    As a bonus it plays pre-made MIDI files, to boot.
    And like the others said: yes, Unity could do that. And I even turned up here considering it myself.
    But thinking about it, a native GUI wrapping any sort of synth does that job.
    With a bit of verb and a spot of the on-board plugins, I think that covers
    "making a cool widget that people can, like, wiggle and do audio shizzle with" pretty readily.

    So, this. Add a GUI. Hope it helps.
    And no major codebases to port - just a couple of small ones, which someone already ported, to comprehend. Weeeeeee

    (Just checked, builds okay on 5.5.2f1 as of today.)
     
    YBtheS likes this.
  13. KurtAtWork

    KurtAtWork

    Joined:
    Aug 30, 2017
    Posts:
    1
    I'm interested in creating an audio platform in Unity, akin to Ableton Live, but within the medium of VR.
    I recently purchased an Oculus Rift, and have been pretty stoked about the idea of creating something like this...

    I'm envisioning being able to create MIDI clips and audio clips spatially, loop them, link MIDI parameters to other custom boards in the virtual environment, and record clips of MIDI automation as well...

    I'd also like to make it easy for people to add their own VSTs to a folder and have them be usable in the application, and also mappable to other controls in the program through MIDI. I'd rather leave it open to the user as to which VSTs they want to use, rather than shipping some with the package.

    I'm interested in collaborating, if anyone would like to help.

    I'm new to Unity, but I'm familiar with a lot of other languages, have experience working with 3D models, textures, materials, 2D sprites and other assets, and made some simple stuff for VR pretty quickly after I got Unity.

    I'm learning some code, and I'm going to learn the ropes on some simpler projects and scale up to doing this...
    I think to start off I'm going to work on creating different nodes in 3D space that have various attributes with various values... and being able to change the values of those attributes...

    I figure if I can get that going, it should be easy enough to associate an audio source with a node to record a clip, then modify that clip with other nodes for audio effects... and arrange the nodes in a way that is hierarchical and makes sense.

    The question I have is: if only 32 sounds can be played at once, does that limitation also apply to software-based audio sources? Couldn't I have a synth playing more notes than that, if my code runs an algorithm that accounts for all of them and then renders the result for output? I'd imagine one might be limited in the number of audio channels that can be played at once, but perhaps not as limited in MIDI channels?

    If someone could shed light on these specific limitations and whether they can be overcome, it would be much appreciated.
     
  14. steego

    steego

    Joined:
    Jul 15, 2010
    Posts:
    969
    In short, what you will need is the VST SDK, to load VST plugins and instruments. You can then feed the audio data you get from it to Unity with AudioClip.SetData.

    All the VST sounds will then only count as one Unity sound, but you will have to do all the audio mixing yourself.
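    Roughly, something like this (only the Unity calls are real API; NativeVstHost is a made-up stand-in for whatever C/C++ bridge you write around the VST SDK, and this uses the streaming AudioClip.Create overload - SetData works the same way for pre-rendered buffers):

    Code (CSharp):
    // Sketch: stream audio rendered by a (hypothetical) native VST host bridge into Unity
    // via a streaming AudioClip. The PCMReaderCallback is invoked whenever the clip needs samples.
    using UnityEngine;

    public class VstClipPlayer : MonoBehaviour
    {
        const int Channels = 2;
        const int SampleRate = 44100;

        void Start()
        {
            AudioSource source = gameObject.AddComponent<AudioSource>();

            // One second of buffer, streamed: OnAudioRead is called to refill it.
            AudioClip clip = AudioClip.Create("VstOutput", SampleRate, Channels, SampleRate,
                                              true, OnAudioRead);
            source.clip = clip;
            source.loop = true;
            source.Play();
        }

        void OnAudioRead(float[] data)
        {
            // Hypothetical native call: have the VST bridge fill 'data' with interleaved floats.
            // NativeVstHost.Render(data, data.Length / Channels);

            // Placeholder so the sketch runs on its own: silence.
            for (int i = 0; i < data.Length; i++)
                data[i] = 0f;
        }
    }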
     
  15. Cherubim79

    Cherubim79

    Joined:
    May 3, 2014
    Posts:
    56
    I was curious about this one. I looked into a few open source DAWs on Win 10 (LMMS and Tracktion T6) for simple music creation without having to go the FL Studio / Ableton route; the newest LMMS crashes on my system, and T6 was a pain when trying to use a virtual MIDI keyboard to create synth notes.

    So I thought, hey, maybe I'll just make a small step sequencer in Unity with VST hosts and mix it there, making an interface I could use across my platforms (Win/OSX/Linux - though it looks like I'd have to run Wine to get any VST support on OSX/Linux).

    Looks like the VST SDK is all CMake-based. I'm guessing they don't have a managed .NET wrapper for their libs: I just built it in VS2017 Community and it doesn't seem to include any managed .NET wrapper libs, unless I'm not looking in the right place. I also couldn't get their standalone VST host to find any plugins (I can find them in any DAW that actually works). I looked into VST.NET, but it looks like the author ran into a technical issue with VST3 and gave up. Not sure how much farther one could go on this without writing a .NET wrapper from scratch, provided you didn't run into the same technical issues VST.NET did. All those synth instrument presets that exist out there would be a big win; a VR DAW would be pretty rad if it could be done.
     
    Last edited: Nov 13, 2017
  16. mtytel

    mtytel

    Joined:
    Feb 28, 2014
    Posts:
    84
    Hey, VST synth developer here.
    Not sure if you're all aware but Unity came out with an SDK similar to the VST SDK for audio plugins.

    I recently ported my synthesizer to this Unity Native Audio SDK and released it on the store packaged with a sequencer and sampler.
    https://www.assetstore.unity3d.com/en/#!/content/86984

    The standalone/VST/AU plugin is free/PWYW and you can try it out here first: http://tytel.org/helm

    I'm hoping more audio plugins appear on the store as it's a bit lonely at the moment.
     
  17. aili_arakida

    aili_arakida

    Joined:
    Feb 10, 2019
    Posts:
    2
    Hey KurtAtWork, my team is making something like this – if you're interested, please get in touch!

     
    Marald likes this.
  18. Marald

    Marald

    Joined:
    Jan 16, 2015
    Posts:
    42
    Hi Aili, how do we contact you?
     
  19. GerardUnity

    GerardUnity

    Joined:
    Sep 24, 2018
    Posts:
    3
    Hey everybody, since 2015 have you made any progress in your work on Unity and VST? I would like to run an open-source VST instrument (like FluidSynth or better) in real time, because I need hours of music and I want to play .mid files (driven by C# code). But I can't find good documentation on the VST SDK for making a .dll bridge between Unity and a VST.
    Which method can I use?
    Thanks in advance