
D3D Alpha Blending error running UDK in Windows 7 guest

Posted: 1. Jun 2012, 07:35
by munchluxe63
I'm running VirtualBox on a Linux host (openSUSE 12.1) with a Windows 7 Home Premium guest, and I've already installed the Guest Additions.
The machine has an NVIDIA Optimus setup (switchable Intel/NVIDIA graphics), which requires Bumblebee to switch between the graphics cards; I have that set up properly.

When trying to load up Unreal Development Kit November 2011, I get the error:
"Your video card does not support alpha blending with floating point render targets (D3DFMT_A16B16G16R16F), which is required to run this game. Exiting..."

Neither the 64-bit nor the 32-bit version will load.

The program works fine under the native Windows 7 installation (not a VM) on my hard drive.
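
For context: on D3D9, the usual way an engine tests for alpha-blendable floating point render targets is IDirect3D9::CheckDeviceFormat with the D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING flag. I don't know UDK's exact code, but the check that's failing is presumably something along these lines (a sketch; pD3D and the adapter format are my assumptions):

// pD3D is an IDirect3D9* from Direct3DCreate9(D3D_SDK_VERSION).
HRESULT hr = pD3D->CheckDeviceFormat(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    D3DFMT_X8R8G8B8,                           // current display mode format (assumed)
    D3DUSAGE_RENDERTARGET |
    D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,   // "can I alpha blend into this target?"
    D3DRTYPE_TEXTURE,
    D3DFMT_A16B16G16R16F);                     // FP16 format named in the error
// If this returns a failure code, the engine reports exactly this kind of
// "does not support alpha blending with floating point render targets" error.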

Re: D3D Alpha Blending error running UDK in Windows 7 guest

Posted: 4. Jun 2012, 01:36
by squall leonhart
Alpha blending with that format requires D3D10-class support.

VirtualBox's D3D is limited to D3D9 (implemented via OpenGL 3.x).
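
If anyone wants to verify what the virtual adapter actually reports inside the guest, here is a small standalone test (my own sketch, not taken from UDK or VirtualBox; build with "cl check_fp16_blend.cpp d3d9.lib"):

#include <d3d9.h>
#include <stdio.h>

int main()
{
    // Create the D3D9 object for the default (virtual) adapter.
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3DCreate9 failed\n");
        return 1;
    }

    // Ask whether A16B16G16R16F render targets support
    // post-pixel-shader (alpha) blending.
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET |
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);

    printf("Blendable FP16 render targets: %s\n",
           SUCCEEDED(hr) ? "supported" : "NOT supported");

    d3d->Release();
    return 0;
}

On the VirtualBox D3D9-over-OpenGL guest driver I'd expect this to print "NOT supported", which would match the UDK error.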

Re: D3D Alpha Blending error running UDK in Windows 7 guest

Posted: 4. Jun 2012, 06:38
by munchluxe63
That's too bad.