Hello,
I'm testing the new session virtualization features of Windows Server 2012 R2 with RemoteFX, as I've read that R2 enables graphics acceleration in RDSH mode :)
My configuration is a physical Windows Server 2012 R2 host with an ATI S7000 graphics card and the latest driver. I've enabled the Desktop Experience feature and the RDS Session Virtualization services, and configured the GPOs to use the RemoteFX (hardware compression) functionality.
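For reference, the RemoteFX policies under Computer Configuration > Administrative Templates > Windows Components > Remote Desktop Services land as registry values under the Terminal Services policy key. A minimal sketch of the kind of value involved, assuming the name `fEnableVirtualizedGraphics` from my reading of the ADMX templates (verify against your own templates before relying on it):

```shell
:: Sketch: enable the "Configure RemoteFX" policy by hand, in an elevated prompt.
:: The value name fEnableVirtualizedGraphics is my assumption from the ADMX files.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" /v fEnableVirtualizedGraphics /t REG_DWORD /d 1 /f
```

Setting it through the Group Policy editor is of course equivalent; I only mention the registry side so it's clear which machine-level key the policy writes to.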
RemoteFX is operational, and I'm testing www.babylonjs.com under IE11 with the following results:
- If I log on to the server locally, the GPU is used (frame rate 30 fps, CPU 25%), so the graphics card works normally.
- If I open an RDS session on the server, the vGPU is used (frame rate 6 fps, CPU 100%), so the graphics card is NOT used. RemoteFX uses the vGPU to decode/re-encode the 3D content.
- If I open a local session first, launch IE, and leave it running, then connect from another computer via RDS to that already-open session, my IE11 still gets RDSH GPU acceleration, with 30 fps and CPU at 29%. However, if I launch a second IE instance in the same session, that new IE runs in vGPU mode…
It sounds like Windows 2012 R2 chooses between the GPU and the vGPU when a program is launched... but how can I force its choice?
I'm close to a perfect result, but I can't find what's wrong.
Any help would be appreciated.