Discussion:
Using Pixel Shaders to process non-video-related data
nak
2009-05-27 12:27:27 UTC
Permalink
Hi there,

I've been reading several articles about companies using the programmable pixel
shader architecture to process audio data, and am quite intrigued by the
apparent speed improvements.

With that said, I would like to have a play about with running a custom
algorithm on byte arrays.

My understanding (bear in mind I have never programmed any pixel shaders to
date):

1. Program pixel shader with custom algorithm
2. Convert byte array into video data (offscreen buffer)
3. Perform pixel shader on video data
4. Convert video data back into a byte array
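
The four steps above can be sketched on the CPU. This is purely an illustration of the data flow, not real shader code: the byte array is packed into a 2D "texture" grid, a per-texel function stands in for the pixel shader, and the grid is flattened back into bytes. The grid width and the add-1 texel function are made up for the example.

```python
# CPU sketch of the byte-array -> texture -> pixel shader -> byte-array
# round trip. The per-texel function (add 1, wrapping at 256) is a stand-in
# for whatever custom algorithm the real pixel shader would run.

def bytes_to_grid(data, width):
    """Step 2: pack a flat byte array into rows of `width` texels (0-padded)."""
    padded = data + [0] * (-len(data) % width)
    return [padded[i:i + width] for i in range(0, len(padded), width)]

def run_fake_pixel_shader(grid, texel_fn):
    """Step 3: apply texel_fn independently to every texel, as a shader would."""
    return [[texel_fn(t) for t in row] for row in grid]

def grid_to_bytes(grid, length):
    """Step 4: flatten the grid back to a byte array, dropping the padding."""
    flat = [t for row in grid for t in row]
    return flat[:length]

data = [10, 20, 30, 250]
grid = bytes_to_grid(data, width=2)
out = run_fake_pixel_shader(grid, lambda t: (t + 1) % 256)
result = grid_to_bytes(out, len(data))
print(result)  # [11, 21, 31, 251]
```

On the GPU, each texel is processed in parallel rather than in a loop, which is where the speed improvements come from.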

I was wondering if anyone had any tips/articles or helpful advice for having
a pop at this?

Cheers in advance.

Nick.
legalize+ (Richard [Microsoft Direct3D MVP])
2009-05-28 00:01:08 UTC
Permalink
[Please do not mail me a copy of your followup]
Post by nak
My understanding (bear in mind I have never programmed any pixel shaders to
date).
1. Program pixel shader with custom algorithm
2. Convert byte array into video data (offscreen buffer)
3. Perform pixel shader on video data
4. Convert video data back into a byte array
That's the basics of it.
Post by nak
I was wondering if anyone had any tips/articles or helpful advice for having
a pop at this?
Try googling for "GPGPU" (general-purpose GPU). There's quite a bit
of literature out there describing special-purpose APIs and techniques
for exploiting the GPU in areas other than graphics.

If you've got access to a Vista machine, then you can play with the
compute shaders in the D3D11 technology preview.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>

Legalize Adulthood! <http://blogs.xmission.com/legalize/>
nak
2009-05-28 08:12:14 UTC
Permalink
Hi Richard,

That's awesome thanks, you're a star!

Nick.
nak
2009-05-28 14:03:03 UTC
Permalink
Hi again Richard,

Although I haven't found exactly what I'm after as yet, maybe you could
tell me if my thinking below is correct before I try pursuing it...

1. Create an array of vectors
2. In the X coordinate of each vector, store a value; leave the
other components as 0
3. Fire off to a vertex shader loaded into the GPU
4. Check resulting vertex data

Let's say the vertex shader code simply adds 1 to the X coordinate of
each vector.
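
On the CPU, the idea above reads as a per-element map over an array of 4-component vectors. Here is a minimal Python sketch of it; the vector layout and the add-1 operation follow the steps above, and everything else is made up for illustration:

```python
# CPU sketch of the vertex-shader idea: an array of (x, y, z, w) vectors
# with the payload stored in x, and the "shader" adding 1 to each x.

values = [5.0, 7.0, 9.0]

# Steps 1-2: build vectors with the value in x and 0 in the other components.
vertices = [(v, 0.0, 0.0, 0.0) for v in values]

# Step 3: the per-vertex function a vertex shader would apply.
def vertex_shader(vertex):
    x, y, z, w = vertex
    return (x + 1.0, y, z, w)

# Step 4: run it over the whole array and read the x components back.
results = [vertex_shader(v) for v in vertices]
outputs = [r[0] for r in results]
print(outputs)  # [6.0, 8.0, 10.0]
```

The real vertex shader runs that function on many vertices in parallel; the question below is about where its output ends up.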

Is the above possible without outputting to an offscreen
buffer? I'm just thinking this technique would be easier than converting to
and from video data, although I'm guessing it would be done more often if it
were that easy.

Still trying to locate an extremely simple shader application that just
performs an arithmetic function and outputs the result as a list of values, as
opposed to particle physics. Google seems great if you like OpenGL, but I'd
rather do it in DirectX and .NET. What I'm trying to do is make an
extremely simple .NET library for creating fast arithmetic routines.

Thanks for your time in advance!

Nick.
legalize+ (Richard [Microsoft Direct3D MVP])
2009-05-28 19:26:40 UTC
Permalink
[Please do not mail me a copy of your followup]
Post by nak
Hi again Richard,
Although I haven't found exactly what I'm after as yet, maybe you could
tell me if my thinking below is correct before I try pursuing it...
1. Create an array of vectors
2. In the X coordinate of each vector, store a value; leave the
other components as 0
3. Fire off to a vertex shader loaded into the GPU
4. Check resulting vertex data
With a compute shader in D3D11 you could do this directly and it would
map better to your problem.

With a vertex shader in D3D10, you can stream out the results of the
vertex shader, so you can map your problem onto one.

With D3D9, you can't stream out the results of the vertex shader
directly, which is why people typically use a pixel shader for these
things: a pixel shader can write to a render target, which you can
access from the CPU.

Otherwise yes, you have the right idea.
Post by nak
Is the above possible without requiring outputting to an offscreen
buffer?
The data has to go somewhere, and that means a buffer of some sort.
With a compute shader, the buffer can be fairly arbitrary; with a
vertex shader, the output looks like a stream of vertices; and with a
pixel shader, the output looks like a render target (a regular 2D grid of
samples).
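For the pixel-shader route, the usual trick is to address a linear array as that 2D grid: element i of the array lives at texel (i % width, i // width). A small Python sketch of the index mapping (the width is arbitrary, chosen only for the example):

```python
# Mapping a linear array onto a 2D render target and back.
# Element i of the array lives at texel (x, y) = (i % width, i // width).

width = 4

def index_to_texel(i):
    """Linear array index -> (x, y) texel coordinates."""
    return (i % width, i // width)

def texel_to_index(x, y):
    """(x, y) texel coordinates -> linear array index."""
    return y * width + x

# Round trip: every linear index maps to a unique texel and back.
for i in range(12):
    x, y = index_to_texel(i)
    assert texel_to_index(x, y) == i

print(index_to_texel(9))  # (1, 2)
```

After the shader runs, reading the render target back on the CPU and walking it in the same order recovers the array.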
Post by nak
Still trying to locate an extremely simple shader application that just
performs arithmetic function and outputs the result as a list of values, as
opposed to particle physics. Google seems great if you like OpenGL, but I'd
rather do it in DirectX and .NET. What I'm trying to do is make an
extremely simple .NET library for creating fast arithmetic routines.
Using .NET adds an additional layer of complication to this because
there is no officially supported layer for .NET around the Direct3D
API. You probably want to look at SlimDX for a .NET layer for D3D.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>

Legalize Adulthood! <http://blogs.xmission.com/legalize/>
nak
2009-05-28 19:45:32 UTC
Permalink
Hi there,
Post by legalize+ (Richard [Microsoft Direct3D MVP])
The data has to go somewhere and that means a buffer of some sort.
With a compute shader, the buffer can be somewhat arbitrary, with a
vertex shader, the output looks like a stream of vertices and with a
pixel shader the output looks like a render target (regular 2D grid of
samples).
That makes great sense to me. I was kind of shying away from the idea of
using DX11, as I'm sure it can be done with the current technologies; at
least, that's the impression I get from articles I have read over the past
few years on this subject.

But this is cool, a stream of vertices would be perfect! :D Hopefully I
can get a small sample working like this.
Post by legalize+ (Richard [Microsoft Direct3D MVP])
Using .NET adds an additional layer of complication to this because
there is no officially supported layer for .NET around the Direct3D
API. You probably want to look at SlimDX for a .NET layer for D3D.
Now that's confused me; I thought Managed Direct3D had been around for a
while now?

http://msdn.microsoft.com/en-us/magazine/cc164112.aspx

I've even done some managed Direct3D stuff in the past. The only part
of DirectX I didn't think was managed was DirectShow?

Nick.
legalize+ (Richard [Microsoft Direct3D MVP])
2009-05-29 00:32:02 UTC
Permalink
[Please do not mail me a copy of your followup]
Post by nak
But this is cool, a stream of vertices would be perfect! :D Hopefully I
can get a small sample working like this.
That will only be feasible with D3D10, which is Vista only, and you'll
need a D3D10 compatible card.
Post by nak
Now that's confused me, I thought managed Direct3D had been around for a
while now?
Managed DirectX is deprecated/dead. Use SlimDX.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>

Legalize Adulthood! <http://blogs.xmission.com/legalize/>
nak
2009-05-29 10:29:07 UTC
Permalink
Post by legalize+ (Richard [Microsoft Direct3D MVP])
Managed DirectX is deprecated/dead. Use SlimDX.
Wow, that's sad :( IMO the entirety of DirectX should have a .NET
interface too. Still, that SlimDX sounds cool :)
nak
2009-05-29 10:31:30 UTC
Permalink
Post by legalize+ (Richard [Microsoft Direct3D MVP])
That will only be feasible with D3D10, which is Vista only, and you'll
need a D3D10 compatible card.
Oops, I forgot to mention: I should be okay, as I'm using Windows 7, which
has DX11 on it. It's just my dev system that's unfortunately running XP,
but I might just install VS onto Windows 7... if I feel lucky
enough!
