Architekt
2008-02-20 02:04:17 UTC
I've got some vertex shader code from ATI's rendermonkey program that
I'm trying to tweak for an experiment. If I compile it on the command
line via:
fxc /T vs_2_0 /Od /Cc /Zi /E main shader.vsh
The output is as expected. But if I make one change by inserting a dummy
operation, I get radically different code. The reason seems to be that
the compiler is optimizing out a bunch of instructions, since this
dummy operation effectively overrides the previous operations and turns
them into dead code. But I'm specifying "disable optimizations" (/Od) on
the command line, so I'd expect it not to optimize anything at all.
It's a long story why I need this operation, but basically it's for
debugging via PIX. Anyway, why does /Od not seem to have any effect? I
tried /O0 as well; same thing.
Furthermore, if I compile the shader from my C++ code via
D3DXCompileShaderFromFile with the flags D3DXSHADER_DEBUG |
D3DXSHADER_SKIPOPTIMIZATION, the same thing happens: the code still
gets optimized.
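For what it's worth, here's roughly what the call looks like. This is a
minimal sketch, not my exact code: the file name, entry point, and the
disassembly step at the end are placeholders matching the fxc command
above, just to show which flags I'm passing and how I'm inspecting the
result.

    #include <d3dx9.h>   // link against d3dx9.lib

    LPD3DXBUFFER        pCode      = NULL;
    LPD3DXBUFFER        pErrors    = NULL;
    LPD3DXCONSTANTTABLE pConstants = NULL;

    HRESULT hr = D3DXCompileShaderFromFile(
        TEXT("shader.vsh"),        // source file (placeholder name)
        NULL,                      // no #define macros
        NULL,                      // no custom #include handler
        "main",                    // entry point
        "vs_2_0",                  // target profile
        D3DXSHADER_DEBUG | D3DXSHADER_SKIPOPTIMIZATION,
        &pCode,
        &pErrors,
        &pConstants);

    if (SUCCEEDED(hr))
    {
        // Disassemble the compiled bytecode to see what was actually emitted.
        LPD3DXBUFFER pDisasm = NULL;
        if (SUCCEEDED(D3DXDisassembleShader(
                (const DWORD*)pCode->GetBufferPointer(),
                FALSE,             // no color coding
                NULL,              // no extra comments
                &pDisasm)))
        {
            OutputDebugStringA((const char*)pDisasm->GetBufferPointer());
            pDisasm->Release();
        }
    }

Even with those two flags set, the disassembly changes completely when I
add the dummy operation, which is what I can't explain.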
How the heck can I get my shader to compile verbatim, unoptimized???
Thanks for the help, this is driving me insane.
PS: I'm using the August 2007 DX SDK, but I tried the latest (November
2007) and the same thing happens.