HLSL

rifter1818

Hi there. Does anyone know of some good tutorials on HLSL? I've had trouble with it. Also some tutorials on implementing it, as that isn't working for me either. Thanks for all your help.
 
If you understand Italian, my web site has HLSL tutorials with full source code:
robydx.altervista.org, section directX9 - direct3D.
Even so, it is simple: an example is enough to understand, and for complex shader code it is very useful (for simple code it is a waste of time).
If you have difficulties, tell me.
 
Thanks

RobyDx said:
If you understand Italian, my web site has HLSL tutorials with full source code:
robydx.altervista.org, section directX9 - direct3D.
Even so, it is simple: an example is enough to understand, and for complex shader code it is very useful (for simple code it is a waste of time).
If you have difficulties, tell me.
Do you know of any service that can translate a whole site for me? (I have enough trouble with English, let alone Italian.) Thanks very much though; I think I can figure it out. The book I have loads them as effects, which is what was giving me so much trouble, so your way might be better for me. Although my graphics card only supports VS_1_1 and PS_1_3... but oh well.
 
Altervista has a translation service; another way is to install a translation program. However, your card will be enough. If you need a debugger, try my "shaderDK" in the project SDK section. It's a tool I wrote that lets you write, test, debug and visualize shaders in every version, and also in REF (so you can test 2.0 shaders too).
Bye
 
Thanks for your help

RobyDx said:
Altervista has a translation service; another way is to install a translation program. However, your card will be enough. If you need a debugger, try my "shaderDK" in the project SDK section. It's a tool I wrote that lets you write, test, debug and visualize shaders in every version, and also in REF (so you can test 2.0 shaders too).
Bye
Hopefully one day I'll be able to solve a problem for you, but that day seems quite distant right now...
 
Can you Use CustomVertex Structures?

My code is giving me an error when rendering the primitives (it worked before I added the vertex shader). The important details: the vertex buffer I'm rendering is 4 primitives drawn as a triangle list, using CustomVertex.PositionNormal. The vertex declaration:
Visual Basic:
' Position (float3) at offset 0 and normal (float3) at offset 12, all in stream 0.
Dim PN() As VertexElement = { _
    New VertexElement(0, 0, DeclarationType.Float3, DeclarationMethod.Default, DeclarationUsage.Position, 0), _
    New VertexElement(0, 12, DeclarationType.Float3, DeclarationMethod.Default, DeclarationUsage.Normal, 0), _
    VertexElement.VertexDeclarationEnd}
PositionNormal.VDec = New VertexDeclaration(D3d.Device, PN)
The vertex shader is loaded:
Visual Basic:
' Compile the HLSL file to a vs_1_1 vertex shader and register it.
GS = ShaderLoader.CompileShaderFromFile(Application.StartupPath & "\...\VS-GlowPN.VSH", _
    "VSGlowPN", Nothing, Nothing, "vs_1_1", ShaderFlags.None, Errors, Nothing)
Diagnostics.Debug.WriteLine(Errors)   ' dump any compiler errors or warnings
PositionNormal.AddVertexShader("Glow", New VertexShader(D3d.Device, GS))
And the vertex shader itself:
Code:
// Constants set by the application.
float    GlowPower   : register(c0);    // how far vertices are pushed out along the normal
float4x3 WorldView   : register(c1);    // combined world * view transform
float4x4 Projection  : register(c49);
float4   GlowAmbient : register(c113);
float4   GlowColor   : register(c117);

struct OUTPUT
{
    float4 Position : POSITION;
    float4 Diffuse  : COLOR0;
};

OUTPUT VSGlowPN
    (
    float4 Position : POSITION,
    float3 Normal   : NORMAL    // float3 to match the Float3 element in the declaration
    )
{
    OUTPUT Out;

    // Transform the normal into view space and push the vertex out along it.
    float3 N = normalize(mul(Normal, (float3x3)WorldView));
    float3 P = mul(Position, WorldView) + GlowPower * N;

    // Glow is strongest at silhouette edges, where the view-space normal is
    // perpendicular to the view direction A = (0, 0, 1): Power = ((N.A)^2 - 1)^2.
    float3 A = float3(0, 0, 1);
    float Power;
    Power  = dot(N, A);
    Power *= Power;
    Power -= 1;
    Power *= Power;

    Out.Position = mul(float4(P, 1), Projection);
    Out.Diffuse  = GlowColor * Power + GlowAmbient;
    return Out;
}
Any ideas what's wrong here?
The error message is:
Code:
RenderVertexBuffer: Error in the application.
-2146232832 (Unknown)
   at Microsoft.DirectX.Direct3D.Device.DrawPrimitives(PrimitiveType primitiveType, Int32 startVertex, Int32 primitiveCount)
   at W_Logic_RPG.DX9.Direct3d.RenderVertexBuffer(VertexBuffer& VertexBuffer, PrimitiveType PrimitiveType, Int32 StartVertex, Int16 PrimCount, Matrix Transform) in C:\Documents and Settings\Harry\Desktop\Programming\W-Logic RPG\clsDX901.vb:line 498
 
Did you set the vertex declaration correctly?
device.VertexDeclaration = myVertexDeclaration
Did you pass your array without an index to DrawPrimitives?
vertexArray(0) is wrong
vertexArray is correct
Did you also set the vertex format?
You must set all three (format, declaration and vertex shader) for it to work properly; see the sketch below.
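For what it's worth, here is a minimal sketch of that setup in Managed DirectX (device, vb, myVertexDeclaration and myVertexShader are placeholder names, not taken from the code above):
Visual Basic:
' Bind the geometry and all three pieces of state before drawing.
device.SetStreamSource(0, vb, 0)                          ' the vertex buffer
device.VertexFormat = CustomVertex.PositionNormal.Format  ' format matching the vertex data
device.VertexDeclaration = myVertexDeclaration            ' declaration matching the vertex data
device.VertexShader = myVertexShader                      ' the compiled vs_1_1 shader
device.DrawPrimitives(PrimitiveType.TriangleList, 0, 4)   ' 4 triangles from the triangle list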
 
...

I don't know whether the vertex declaration is right or not (note the first code sample in my previous post). The format, shader and declaration are all set on the device, and the vertex buffer is being rendered the same as always (it worked before the shader), as I said. It's just the shader code and vertex declaration that I'm questioning.
 
If the HLSL code were wrong, the error would be given at shader creation. Probably you are using a vertex format that includes more than just position and normal; for example, it may also have texture coordinates.
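To make that concrete (a hypothetical illustration, not code from this thread): if the vertex buffer was created for a textured vertex type, the position/normal declaration above does not describe it, and the draw call can fail even though the shader compiled fine.
Visual Basic:
' Hypothetical mismatch: the buffer holds position + normal + texture coordinates...
Dim vb As New VertexBuffer(GetType(CustomVertex.PositionNormalTextured), 12, _
    D3d.Device, Usage.WriteOnly, CustomVertex.PositionNormalTextured.Format, Pool.Managed)
' ...while the declaration and format set at draw time only describe position + normal.
D3d.Device.VertexFormat = CustomVertex.PositionNormal.Format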
 
*Bangs Head*

I can't believe this, but I'm just being an idiot (again). I'm really sorry, RobyDx, for all the time you've spent thinking about this (and I know you have, noting the number of posts and checking your profile to see when you were viewing this forum). What happened was that I was using the wrong vertex format: I was programming both at home and at school and forgot to update the creation code at home (where, thanks to your website RobyDx, I coded my vertex shader). The vertex shader still isn't showing up, but it's not erroring, so the shader just isn't quite right yet and I will fix that eventually. Thanks for your help, and thumbs up on the website.

Oh yeah, I need to know a bit more about what the normals should contain. Right now I've got them all as +/- 0.7071, but I'm not sure that's right. Let's say I have a triangle (0,1,0), (-1,0,0), (0,0,-1): what should the normal for each vertex be? I have (-0.7071, 0.7071, -0.7071) for all three.
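In case it helps: for flat shading, each vertex of the face usually gets the face normal, which is the normalized cross product of two edges of the triangle (the sign depends on your winding order; for smooth shading you would instead average the face normals of the faces sharing each vertex). A quick sketch for the triangle in the question:
Visual Basic:
' Face normal for the triangle A = (0,1,0), B = (-1,0,0), C = (0,0,-1).
Dim A As New Vector3(0, 1, 0)
Dim B As New Vector3(-1, 0, 0)
Dim C As New Vector3(0, 0, -1)
Dim N As Vector3 = Vector3.Cross(Vector3.Subtract(B, A), Vector3.Subtract(C, A)) ' = (1, -1, 1)
N.Normalize()   ' = roughly (0.577, -0.577, 0.577); flip the sign for the opposite winding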
 