Xtreme .Net Talk

Roof Top Pew We

Members
  • Posts: 3
  • Joined

  • Last visited

Roof Top Pew We's Achievements

Newbie (1/14)

Reputation: 0

  1. When I load textures, I use the TextureLoader.FromFile method. The last argument I pass is a reference to an ImageInfo. It gets filled in, and you can then read its .Width and .Height values. Also, if you already have the texture loaded, you can do the following: SurfaceDescription sd = texture.GetLevelDescription(0); and then look at sd.Width and sd.Height. --Vic-- http://www.flatredball.com
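For reference, here is a minimal sketch of both approaches described above, assuming Managed DirectX 1.1 (where the info type is the `ImageInformation` struct); the exact `FromFile` overload and parameter order may differ between SDK versions, and `device` and `"tile.png"` are illustrative placeholders:

```csharp
using Microsoft.DirectX.Direct3D;

// Sketch only: assumes an initialized Device named "device".
// Approach 1: let TextureLoader.FromFile fill in an ImageInformation
// struct describing the source file.
ImageInformation info = new ImageInformation();
Texture texture = TextureLoader.FromFile(
    device, "tile.png",          // "tile.png" is a hypothetical file
    D3DX.Default, D3DX.Default,  // width, height: take from the file
    D3DX.Default,                // mip levels
    Usage.None, Format.Unknown, Pool.Managed,
    Filter.Triangle, Filter.Triangle,
    0,                           // no color key
    ref info);
int widthFromFile  = info.Width;
int heightFromFile = info.Height;

// Approach 2: query a texture that is already loaded.
SurfaceDescription sd = texture.GetLevelDescription(0);
int widthFromTexture  = sd.Width;
int heightFromTexture = sd.Height;
```

The second approach is handy when the texture was loaded elsewhere and you only have the `Texture` object in hand; level 0 is the top (full-size) mip level.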
  2. The best thing to do is run a search on your computer. Search for *DirectX.dll and it should bring up the location of the .dlls. Then you can add them manually by going to Browse in the Add Reference dialog. --Vic--
  3. I am trying to get my video to render to a texture. The SDK example seems to draw the scene only when the video "creates" a new texture. However, I'm going to be using this texture in scenes where other objects use regular textures, and everything will be running at higher framerates. When I tried that, my entire scene started flashing. I reduced the code to isolate the problem, and got down to two lines: Video video = Video.FromFile("test.avi"); video.RenderToTexture(D3DDevice); If I comment out the second line, everything works just fine. Adding it back causes the problem: like I said, the entire scene starts to flash. My guess is that the video (being on a different thread) calls some event during my render loop and causes some kind of problem... perhaps exiting out of the loop. Has anyone had this problem before, and if so, how was it solved? --Vic--
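For context, the SDK sample pattern the post mentions drives drawing from the video's texture event rather than from a free-running render loop. A minimal sketch of that pattern, assuming Managed DirectX's AudioVideoPlayback assembly (the `TextureReadyToRender` event and its `TextureRenderEventArgs` handler signature are recalled from the 1.1 API and may vary by SDK version; `device` and `videoTexture` are illustrative names):

```csharp
using Microsoft.DirectX.AudioVideoPlayback;
using Microsoft.DirectX.Direct3D;

// Sketch only: RenderToTexture raises TextureReadyToRender (on the
// video's own thread) each time a new frame texture is available.
Texture videoTexture = null;

Video video = Video.FromFile("test.avi");
video.TextureReadyToRender += delegate(object sender, TextureRenderEventArgs e)
{
    // Cache the latest frame texture here, and draw it from the main
    // render loop alongside the regular textures, instead of rendering
    // the whole scene inside this handler.
    videoTexture = e.Texture;
};
video.RenderToTexture(device);
video.Play();
```

Because the event fires on a different thread than the render loop, caching the texture and consuming it on the main thread (rather than issuing draw calls from the handler) is one way to avoid the two threads fighting over the device, which may be related to the flashing described above.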