rustyd Posted August 18, 2005

Hi, I've noticed that when I call object.Dispose and then mouse over the variable, it still tells me what it is. If I set it = Nothing, it then shows nothing. Which is better for memory/performance?

I ask because I have a huge query joining 3 tables, 2 of which have over 1.8 million records. The query reduces to 183,000 records. I open the SqlDataReader with the query and it opens in less than 10 seconds. It then runs through the logic in the read loop. In the loop, it picks out records that match certain criteria and inserts them into a table. Two issues:

1) Occasionally I get a timeout after the loop has read 75%-80% of the records. I increased the command timeout to 60 and the problem went away. I set the command timeout back to 30, removed some unneeded columns from the query, and that reduced how often it happens. Does this make sense? Before optimizing the query it timed out every time; after optimization it seems to happen only after the procedure has been run several times. Which brings me to my second issue.

2) I have 512 MB of RAM. My page file is managed by Windows XP Pro (1.5 GB). I start the procedure that runs the above query and watch the Task Manager Performance tab. Physical Memory total shows 523,000. As the procedure begins looping through the reader, available Physical Memory plummets and PF Usage steadily rises. After the procedure completes, Physical Memory slowly comes back, but PF Usage remains high and my machine is dog slow after running this procedure. Even after closing the application it stays within a few MB of its peak PF Usage. It behaves this way whether run from the IDE or as the EXE, in both Debug and Release builds.

Any ideas are appreciated.
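(For illustration only, here is a minimal VB.NET sketch of the kind of read loop described above, with the raised CommandTimeout. The connection string, column name, and the InsertMatchingRow helper are assumptions for the sake of the example, not rustyd's actual code.)

    Imports System.Data.SqlClient

    Module ReaderLoopSketch
        Sub RunQuery(ByVal selectSql As String)
            Dim conn As New SqlConnection("Server=(local);Database=Inventory;Integrated Security=SSPI")
            Dim cmd As New SqlCommand(selectSql, conn)
            cmd.CommandTimeout = 60   ' raised from the default of 30 to avoid the mid-loop timeout

            conn.Open()
            Dim dr As SqlDataReader = Nothing
            Try
                dr = cmd.ExecuteReader()
                While dr.Read()
                    ' pick out rows that match the criteria and insert them into the results table
                    If CInt(dr("Quantity")) > 0 Then
                        InsertMatchingRow(dr)   ' hypothetical helper that performs the INSERT
                    End If
                End While
            Finally
                If Not dr Is Nothing Then
                    dr.Dispose()   ' Dispose also closes the reader
                End If
                conn.Close()
            End Try
        End Sub

        Sub InsertMatchingRow(ByVal dr As SqlDataReader)
            ' placeholder for the INSERT into the destination table
        End Sub
    End Module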
Wile Posted August 19, 2005

Using Dispose ensures that critical resources are cleaned up as soon as you call it. Setting something to Nothing only has an effect the next time the garbage collector kicks into action, usually a bit later than you want ;), and even then it might not destroy the object. I don't know the exact details (it depends on the object), but it may be that setting it to Nothing doesn't clean up resources even after the garbage collector has destroyed the object. The best thing is to do both: first Dispose, then set it to Nothing. That way you are sure you don't hold any resources for longer than necessary.
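(A rough sketch of the "Dispose first, then drop the reference" pattern Wile describes; the variable and routine names are assumed for illustration.)

    Imports System.Data.SqlClient

    Module CleanupSketch
        ' Illustrative only: explicit cleanup of a reader obtained elsewhere.
        Sub CleanUp(ByRef dr As SqlDataReader)
            If Not dr Is Nothing Then
                dr.Dispose()   ' frees the underlying resources immediately
                dr = Nothing   ' drops the reference; the object itself is reclaimed by a later GC pass
            End If
        End Sub
    End Module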
rustyd (Author) Posted August 22, 2005

Page file not cleared after .Dispose and = Nothing

Thanks for the response, Wile. There are two places where I see it not releasing page file usage, and both are when it runs a multi-table inner join in which two of the tables have over 1.8 million records. The where clause reduces the records returned to 180,000. I am running this against an MSDE setup; I haven't run it over the network yet. After I Dispose and set the SqlDataReader = Nothing, I see the memory slowly freeing up. What doesn't come back is the page file usage: it remains huge and my entire computer runs very sluggishly.
PlausiblyDamp (Administrator) Posted August 22, 2005 (edited)

If you check in Task Manager (or better, Sysinternals' Process Explorer), is it your app or MSDE that is taking up all of the memory?

As Wile says, calling .Dispose() frees up the critical resources. You shouldn't normally need to set them to Nothing as well, as the GC is capable of detecting when a variable is no longer used (are you doing a release build or a debug build?).

To be honest, I have a feeling the problem isn't really a .NET one; it's more likely that you are running this on a PC with too little RAM for what you are attempting - that would explain the rapid drop in free memory and the large page file growth.

Also, MSDE is designed as a 'SQL lite' kind of database and is deliberately constrained in terms of performance - if you are using tables that large you really should be using a full SQL product. If you just run the same query directly from Query Analyzer (or even osql.exe), does the same problem occur? Is there any way you could optimize the query?

Edited August 22, 2005 by PlausiblyDamp
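(As a rough aid to PlausiblyDamp's first question, here is a hedged sketch that prints the working set of the current application and of sqlservr.exe, the process MSDE runs as, using System.Diagnostics. Process Explorer gives the same numbers with less effort; this is just for illustration.)

    Imports System.Diagnostics

    Module MemoryCheckSketch
        ' Compare this application's working set with that of the MSDE engine (sqlservr.exe).
        Sub DumpWorkingSets()
            Console.WriteLine("This app: {0} KB", Process.GetCurrentProcess().WorkingSet \ 1024)
            Dim p As Process
            For Each p In Process.GetProcessesByName("sqlservr")
                Console.WriteLine("sqlservr (PID {0}): {1} KB", p.Id, p.WorkingSet \ 1024)
            Next
        End Sub
    End Module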
rustyd (Author) Posted August 22, 2005

"If you check in Task Manager (or use Sysinternals' Process Explorer instead), is it your app or MSDE that is taking up all of the memory?"

I'm running in debug mode through the development environment, and its memory changes only slightly while running this job; it frees up memory when the job is complete. SqlServr.exe starts at 47,000 KB and, after the process is complete, remains at 157,000 KB. When I close the program, it comes back down to about 110,000 KB.

"As Wile says, calling .Dispose() frees up the critical resources. You shouldn't normally need to set them to Nothing as well, as the GC is capable of detecting when a variable is no longer used (are you doing a release build or a debug build?)"

Should I remove the line setting them to Nothing? This is what I currently have:

dr.Dispose()
dr = Nothing

"To be honest, I have a feeling the problem isn't really a .NET one; it's more likely that you are running this on a PC with too little RAM for what you are attempting - that would explain the rapid drop in free memory and the large page file growth."

I'm running Windows XP Pro SP2 on a PIII 2.8 GHz w/512 MB RAM.

"Also, MSDE is designed as a 'SQL lite' kind of database and is deliberately constrained in terms of performance ... Is there any way you could optimize the query?"

I have another machine with SQL Server 2000 w/ SP3 installed that I can run this process against, so I will try that. In Query Analyzer, the query starts returning records almost immediately but takes 1 min 46 sec to return all 187,703 records. During that time sqlservr.exe fluctuated between 35,000 KB and 70,000 KB of memory, and when the query was complete it went way down to around 18,000 KB. The page file went up to 734 MB and stayed there after the query completed. The query itself I optimized by eliminating columns I didn't need returned, which helped a lot. I can also improve the where clause to limit the records returned, which will help as well.

Thanks for your help.
PlausiblyDamp (Administrator) Posted August 22, 2005

Disposing of the DataReader is a good thing; setting it to Nothing will not have any impact, so you may as well remove that line.

It does look more like a SQL issue than a .NET one from the figures you are quoting. I would suggest you run SQL Profiler against the query and see what figures it gives for I/O, CPU, etc., just to get a better feel for what it is doing. From Query Analyzer you may also want to investigate the graphical execution plan option to see which parts in particular are causing problems. If you can reduce the rows involved with a 'better' where clause, then that is probably the first thing to try.
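(For illustration only: a hedged sketch of tightening the where clause with a parameterized query, along the lines of the InventoryQuantity restriction described in the next post. The table and column names here are assumptions, not the real schema.)

    Imports System.Data
    Imports System.Data.SqlClient

    Module FilteredQuerySketch
        ' Assumes conn is already open; only rows with a quantity above the threshold are returned,
        ' which is the kind of restriction that cut the result set down dramatically below.
        Function OpenFilteredReader(ByVal conn As SqlConnection) As SqlDataReader
            Dim sql As String = "SELECT i.ItemID, q.Quantity " & _
                                "FROM Inventory i INNER JOIN InventoryQuantity q ON q.ItemID = i.ItemID " & _
                                "WHERE q.Quantity > @minQty"
            Dim cmd As New SqlCommand(sql, conn)
            cmd.Parameters.Add("@minQty", SqlDbType.Int).Value = 0
            Return cmd.ExecuteReader()
        End Function
    End Module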
rustyd (Author) Posted August 22, 2005

"If you can reduce the rows involved with a 'better' where clause, then that is probably the first thing to try."

Restricting the InventoryQuantity records reduced the records returned from 187,000 to 704. SQL Profiler showed the InventoryQuantity record at 88% before the where clause addition and down to 50% after the change. The I/O cost went from 36.2 to 0.0129. The CPU cost went from 2.08 to 0.002341.