XNA – GPU programming on Xbox 360

Tags: gpu, matrix, xbox360

I'm looking for some insight into XNA on Xbox 360, mainly whether it's possible to run vector-based float mathematics on its GPU.

If there's a way, can you point me in the right direction?

Best Answer

I don't claim to be an expert on this, but hopefully this can point you in a helpful direction.

Is it possible? Yes. You probably already know that the GPU is good at such calculations (hence the question) and you can indeed control the GPU using XNA. Whether or not it will suit your needs is a different matter.

To make use of the GPU, you'll presumably want to write shaders in HLSL. There's a decent introduction to HLSL in the context of XNA in Riemer's XNA tutorials, which you might want to run through. That series focuses on making the GPU do graphics-related crunching, but what you write in the shaders is up to you. If your vector-based float math is for the purpose of rendering (and thus can stay in the GPU domain), you're in luck and can stop here.
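To make the idea concrete, here's a minimal sketch of the kind of effect (.fx) file you could feed to XNA's content pipeline. None of this is from the original answer; the names (InputData, ComputePS, Compute) and the particular math are illustrative. The trick is to pack your input floats into a texture and treat each texel as a data element:

```hlsl
// Input data packed into a texture: one float4 per "array element".
texture InputData;
sampler InputSampler = sampler_state
{
    Texture   = <InputData>;
    MinFilter = Point;   // point sampling: we want raw values,
    MagFilter = Point;   // not filtered/blended neighbours
    AddressU  = Clamp;
    AddressV  = Clamp;
};

// Pixel shader doing non-graphics float math: the "rendered" colour
// is really the computed result for this element.
float4 ComputePS(float2 texCoord : TEXCOORD0) : COLOR0
{
    float4 v = tex2D(InputSampler, texCoord);
    return v * v + float4(1, 1, 1, 1);   // element-wise v^2 + 1
}

technique Compute
{
    pass P0
    {
        PixelShader = compile ps_3_0 ComputePS();
    }
}
```

The design trick is that "rendering" a full-screen quad into a render target makes the pixel shader run once per output element, so a texture stands in for an input array and the framebuffer stands in for the output array.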

Rendering onscreen is likely not what you're after, though. You have a fair amount of flexibility in HLSL as far as the math itself goes; getting the results back to the CPU, however, is not what the system was designed for. This is where it gets fuzzy for me, but Shawn Hargreaves (an XNA dev) has stated on more than one occasion that getting output from the GPU (other than pixels rendered onscreen) is non-trivial and has performance implications: retrieving the data involves a call to GetData, which stalls the pipeline because the CPU has to sit and wait for the GPU to finish before the results can be copied back.
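As a rough illustration, the readback path might look like this against the XNA 3.1-era API. This is a sketch under assumptions, not a definitive implementation: DrawFullScreenQuad is a hypothetical helper, the method would live in your Game class, and Vector4 render targets come with restrictions on the 360 (no filtering, for one):

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Run a compute-style effect into a floating-point render target,
// then read the results back on the CPU.
Vector4[] RunOnGpu(GraphicsDevice device, Effect computeEffect,
                   int width, int height)
{
    RenderTarget2D target = new RenderTarget2D(
        device, width, height, 1, SurfaceFormat.Vector4);

    device.SetRenderTarget(0, target);
    // Draw a full-screen quad with the effect applied so the pixel
    // shader runs once per output element. (Hypothetical helper.)
    DrawFullScreenQuad(device, computeEffect);
    device.SetRenderTarget(0, null);      // resolve the render target

    Vector4[] results = new Vector4[width * height];
    // This is the expensive part: GetData forces a pipeline stall,
    // blocking the CPU until the GPU has finished.
    target.GetTexture().GetData(results);
    return results;
}
```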

So it can be done. The XNA framework will let you write shaders for the 360 (which supports Shader Model 3.0 plus a few extensions) and it is possible to get those results out, though it may not be efficient enough for your needs.
