Georgen said:
okee..
So knowing some GDI won't help much, is that because DirectX is a totally different graphics programming architecture? (...)
I learned GDI+ a month ago and found it EXTREMELY easy to use. (I started learning it on 25-04-2005; one week later I had the graphical engine of my program ready.) It truly helped me understand the concepts of displaying graphics on screen and how graphical frames work. That knowledge made my life much easier when I moved to DirectX (still in the middle of that move, btw).
I don't know what your goal is or what you're planning to code, but I can tell you this: GDI+ can get truly slow!
To give you an idea...
I'm building an RPG maker program that lets users create their own tiled RPG games. The project allows the user to add images from 32x32 up to 256x256 pixels across 3 graphical layers, with a base tile size of 32x32 pixels. On a 1024x768 screen the editor shows about 20x20 tiles, that is, 400 images. And since there can be 3 layers, the game's engine must be able to handle 1200 tiles, each holding a 32x32 up to 256x256 image (of course a 256x256 image covers 64 tiles at a time, but you can mix them with the other graphical layers using transparency to create non-repetitive graphical environments).
Using GDI+, the engine would noticeably start to slow down and drop to low FPS with only about fifteen 256x256 transparent/non-transparent images, especially when combining the 2 layers (my program only had 2 graphical layers back when GDI+ was processing the graphics). And with more than twenty 256x256 images I would get something like 3 frames per second when scrolling the map... (OUCH)
Using DirectX, well, I tried a bunch of 256x256 images (about 200), and, to my breathtaking relief, the map scrolled even faster than GDI+ with no images at all!
Cheers!