
Best practice for developing packages

Discussion in 'Package Manager' started by _karl, Feb 14, 2019.

  1. _karl

    _karl

    Joined:
    Feb 6, 2017
    Posts:
    13
    Hi.
    In an upcoming project we want to have a "plugin system": each plugin is delivered via the Package Manager and is hosted on a private npm registry.
    Some plugins may be big, but others are very small.

    To develop plugin code and edit prefabs, I created a Unity project with just the plugins inside the Assets folder. I also created a little script that adds a menu item to the context menu, with which I can automatically publish the package to my local npm server.
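
    A minimal sketch of what such a menu item could look like (assuming the plugin lives under Assets/MyPlugin, the registry runs at http://localhost:4873, the script sits in an Editor folder, and npm is on the PATH; all names here are placeholders):
    Code (CSharp):
    using System.Diagnostics;
    using UnityEditor;
    using UnityEngine;

    public static class PackagePublisher
    {
        // Placeholder values - adjust to your plugin folder and registry URL.
        const string PackageFolder = "Assets/MyPlugin";
        const string Registry = "http://localhost:4873";

        [MenuItem("Assets/Publish Package To Private Registry")]
        static void Publish()
        {
            // npm is a .cmd shim on Windows, a plain executable elsewhere.
            var npm = Application.platform == RuntimePlatform.WindowsEditor ? "npm.cmd" : "npm";

            var startInfo = new ProcessStartInfo(npm, $"publish --registry {Registry}")
            {
                WorkingDirectory = System.IO.Path.GetFullPath(PackageFolder),
                UseShellExecute = false,
                RedirectStandardOutput = true,
                RedirectStandardError = true
            };

            using (var process = Process.Start(startInfo))
            {
                // Read output before waiting so the redirected buffers don't fill up.
                var output = process.StandardOutput.ReadToEnd();
                var errors = process.StandardError.ReadToEnd();
                process.WaitForExit();

                UnityEngine.Debug.Log(output);
                if (process.ExitCode != 0)
                    UnityEngine.Debug.LogError(errors);
            }
        }
    }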

    But I came to a point where this setup didn't work out very well: the moment I added dependencies to my package.json, because Unity automatically detects the package.json and processes its content.
    When I add a dependency, a "node_modules" subfolder is automatically created and all dependencies are "npm install"ed into it. That means I had duplicate code in my project, and that breaks the build.

    So now to my question:
    Is there a best practice for developing packages for the Package Manager?
    I think I need a Unity project to test the code and create prefabs etc., but creating a Unity project for each package seems like overkill to me. Can I maybe stop Unity from processing my package.json inside the Assets folder? That would also solve my problem (at least for the moment).

    Edit: I use Unity 2019.1.0b2

    Karl
     
    Last edited: Feb 14, 2019
  2. Adrian

    Adrian

    Joined:
    Apr 5, 2008
    Posts:
    1,066
    Unity uses the npm protocol but not the npm client. Therefore, it will never create any "node_modules" directories. Something else in your toolchain must be executing "npm install".

    I have a project like you describe, containing various packages in subdirectories of the "Assets" folder, each with its own "package.json", and then I publish them to my private npm repository. I haven't encountered any issues with this setup.
     
    maximeb_unity likes this.
  3. _karl

    _karl

    Joined:
    Feb 6, 2017
    Posts:
    13
    When I delete the "node_modules" folder afterwards, everything works fine, and on a "normal recompile" the "node_modules" folder is not re-created.
    But when I double-click the package.json, change the version (or anything else) in Visual Studio and go back to Unity, then after compilation the "node_modules" folder is there and a package-lock.json has also been created.
    When Unity is not started and I compile the project only in VS, nothing suspicious happens. So it is definitely something Unity triggers with its compilation.

    The compiler errors of course ONLY happen when I define some "dependencies" in the "package.json" that are already in the same project.

    I have this behavior on Win10 with Unity 2019.1.0b2 and 2018.3.5f1.
     
  4. maximeb_unity

    maximeb_unity

    Unity Technologies

    Joined:
    Mar 20, 2018
    Posts:
    556
    @_karl, @Adrian is spot on. We do not use the npm client internally. Maybe your publishing process is at fault, i.e. when you use `npm pack` or `npm publish`? Clearly, something in your setup invokes `npm install` directly or indirectly, and that's actually incompatible with the Unity Package Manager.

    It's also possible that Visual Studio is invoking npm install for you. If you have an asmdef at the root of your package, next to your package.json, it will result in a .csproj in the generated Visual Studio solution, and VS may think it's an npm package and try to be helpful. The node_modules folder may be automatically ignored by VS as well, but it will be seen by Unity, since Unity has no idea what npm packages are and has no specific logic to ignore them. Take a look at this: https://stackoverflow.com/a/43494775/1304104

    Packages don't need to be under Assets to be editable; in fact, as long as they're under Assets, they are not considered packages at all. You should instead move them to the Packages/<your-package-name> directory, where the Package Manager will see them; as a bonus, it will also pick up their own dependencies and add them to the project if they're missing. You still need to publish to your private registry manually, though. You also need to make sure that there are no node_modules directories remaining, because they will be picked up by Unity, with the effects you have already observed; if you move the asmdef to a subdirectory inside your package, then VS will no longer see the package.json file and you can be certain that it won't be the cause of these spurious npm install calls.
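
    For illustration, a layout along those lines might look like this (package and assembly names are placeholders):
    Code (Text):
    Packages/
      com.mycompany.myplugin/            <- folder named after the package
        package.json                     <- name, version, dependencies, ...
        Runtime/
          MyCompany.MyPlugin.asmdef      <- asmdef in a subfolder, not next to package.json
          SomeBehaviour.cs
        Editor/
          MyCompany.MyPlugin.Editor.asmdef
          PublishMenu.cs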
     
    cxode and _karl like this.
  5. _karl

    _karl

    Joined:
    Feb 6, 2017
    Posts:
    13
    @maximeb_unity, thanks for your answer. That was it!! VS did something crazy when I let Unity compile the project; when Unity was not open and I compiled it via VS only, this problem did not exist.
    The link to Stack Overflow solved my problem: simply disable the "Package Restore" feature. :)

    I also moved all my packages to the Plugins folder. :) I had tried that before, but back then it did not work; maybe my mistake. Now it works like a charm.
    The only thing I needed was an asmdef in each plugin, otherwise I did not see the code in VS.


    One other thing I discovered: when I run the following code with one of my plugins selected, I do not get the actual folder name of my plugin, but the name specified in the package.json. Is this a bug, or intended behaviour?
    Code (CSharp):
    foreach (var obj in Selection.GetFiltered(typeof(UnityEngine.Object), SelectionMode.Assets))
    {
        UnityEngine.Debug.Log(AssetDatabase.GetAssetPath(obj));
    }
     
  6. maximeb_unity

    maximeb_unity

    Unity Technologies

    Joined:
    Mar 20, 2018
    Posts:
    556
    As far as I know, you should be getting a path that you can use for queries with System.IO APIs, or other AssetDatabase APIs involving relative paths. That's because the package manager creates a virtual map for package paths in order to abstract the actual location, which can be in a cache (e.g. when adding a dependency to a registry-based or Git-based package), a Unity installation directory (when using a built-in package) or an arbitrary path on your filesystem (when using local packages). These paths are consistent, so that wherever your package is, the name used in its "virtual" path will always be based on the package name. It also means you cannot have two packages with the same name in different directories.

    You meant the Packages folder, right?
     
  7. _karl

    _karl

    Joined:
    Feb 6, 2017
    Posts:
    13
    Thanks for pointing that out. Unity's System.IO handles the path as if it were really there, so no problem there, but I handed the path over to my external batch script, which naturally could not find it.
    I have simply renamed all the folders to the right package names; now I have no more problems. :)

    Sorry, of course I meant the Packages folder.. :confused:
     
    maximeb_unity likes this.
  8. maximeb_unity

    maximeb_unity

    Unity Technologies

    Joined:
    Mar 20, 2018
    Posts:
    556
    @_karl,

    Right, Unity knows about these virtual paths, but external tools don't. The easiest solution is to translate them to absolute paths using System.IO.Path.GetFullPath(string path) before passing them to those external tools.
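
    A minimal sketch of that translation (the external publish.bat script mentioned in the comment is hypothetical):
    Code (CSharp):
    using System.IO;
    using UnityEditor;
    using UnityEngine;

    public static class ExternalToolExample
    {
        [MenuItem("Assets/Log Full Paths Of Selection")]
        static void LogFullPaths()
        {
            foreach (var obj in Selection.GetFiltered(typeof(Object), SelectionMode.Assets))
            {
                // Virtual path, e.g. "Packages/com.mycompany.myplugin/Runtime/Foo.prefab"
                var assetPath = AssetDatabase.GetAssetPath(obj);

                // Absolute filesystem path that external tools can understand.
                var fullPath = Path.GetFullPath(assetPath);

                Debug.Log(assetPath + " -> " + fullPath);
                // e.g. System.Diagnostics.Process.Start("publish.bat", "\"" + fullPath + "\"");
            }
        }
    }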
     
    _karl likes this.