I have an unusual use case. On Windows 10, we can run multiple instances (on the order of a dozen) of a Unity standalone application we're building simultaneously on the same PC. The application renders a scene and records video (using AVPro Movie Capture), and that currently works. We want to generate these videos as fast as possible, so we'd like to take advantage of multiple video cards in the same machine.

By default, would Windows/Unity just use one of the installed and enabled video cards, or would it automatically take advantage of the other card(s)? (I understand that SLI makes multiple video cards act as one more powerful video card... would that actually help when running 12 instances of the application?)

I've read the docs for the "-adapter N" command-line argument, but using it does not seem to make a difference in my tests with two video cards in one machine. My hope is that, in launching the 12 instances, I could start 6 of them on one "adapter" and the other 6 on the other "adapter". Does that make sense? And is "N" zero-based, so that "0" means the first adapter found, "1" means the second, etc.?

Using Unity 5.0.1f1 currently. Any help would be greatly appreciated.
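For context, here's a rough sketch of how I was planning to launch the instances. The exe path is just a placeholder, and it assumes "-adapter" is zero-based, which is part of what I'm asking:

```python
# launch_instances.py - sketch of a launcher that splits 12 instances
# across two video cards via Unity's "-adapter N" argument.
# Assumptions: the exe path below is a placeholder, and adapter
# indices are zero-based (unconfirmed - see question above).
import subprocess

APP_EXE = r"C:\Builds\MyRenderApp\MyRenderApp.exe"  # placeholder path to our standalone build
NUM_INSTANCES = 12
NUM_ADAPTERS = 2

processes = []
for i in range(NUM_INSTANCES):
    adapter = i % NUM_ADAPTERS  # even instances -> adapter 0, odd instances -> adapter 1
    args = [APP_EXE, "-adapter", str(adapter)]
    print("Launching instance %d on adapter %d" % (i, adapter))
    processes.append(subprocess.Popen(args))

# Wait for every instance to finish its capture run.
for p in processes:
    p.wait()
```

In my two-card test so far, launching like this didn't visibly shift load to the second card, which is why I'm wondering whether "-adapter" actually does what I think it does.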