The "-adapter" command line argument that is still in the docs does not work any more. It used to work in older builds of my same project. It seemed to work when I was building with DirectX 9 and stopped working in DirectX 11 and DirectX 12. Since DirectX 9 support has been removed since Unity 2017.3, that isn't even an option anymore. Is this a bug? Is there some new way to select which GPU a Unity game will use on a desktop with multiple graphics cards and monitors? It seems crazy to me that such a major game engine would not provide some way to select which graphics card to use.
https://docs.unity3d.com/Manual/CommandLineArguments.html