Memory

Algar

I have a small VB.NET application with a couple of small forms.

The main form has a ListView, a ListBox, a status bar, a main menu, and a 100k JPG.
The Load event populates the ListBox and ListView controls with two or three items each from an Access DB.

The following is the memory usage at various stages:

After load: 21 megs
Move form around screen: 22 megs
Minimize form: 600k
Maximize form: 4 megs
Move form around screen: 4.1 megs
Minimize form: 600k
Maximize form: 4 megs
etc...

The Virtual Memory remains a constant 10.5 megs at all times.

I dispose the connection object and close the DB reader... why does it hog so much memory at first and drop so much when minimized?
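For reference, here is a minimal sketch of the kind of Load routine I mean, with the reader closed and the connection disposed at the end (the file name, table, column, and control names are placeholders, not the actual ones; the sub sits inside the form class):

Imports System.Data.OleDb

' Sketch of the Load handler: fills the ListBox from an Access table,
' then closes the reader and disposes the connection.
Private Sub MainForm_Load(ByVal sender As Object, ByVal e As System.EventArgs) _
        Handles MyBase.Load
    Dim conn As New OleDbConnection( _
        "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=items.mdb")
    Dim cmd As New OleDbCommand("SELECT ItemName FROM Items", conn)
    Dim reader As OleDbDataReader = Nothing

    Try
        conn.Open()
        reader = cmd.ExecuteReader()
        While reader.Read()
            ListBox1.Items.Add(reader.GetString(0))
        End While
    Finally
        If Not reader Is Nothing Then reader.Close()
        conn.Close()
        conn.Dispose()
    End Try
End Sub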

I read a previous post from a while back about someone having the same problem as me, though it never really got solved, so I thought I'd try now.
 
There are several recent threads on this memory subject. The gist of them is that while the app looks to be using less memory when minimized, it is not; it has written the data out to disk instead of holding it in memory. When you maximize it again, it will quickly climb back to the original amount.

The whole memory management business is very scary, to say the least...
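One way to see that for yourself is to log the process's own figures rather than trusting Task Manager. A quick sketch using System.Diagnostics: the working set is the number that collapses when you minimize, while the virtual size stays roughly constant.

Imports System.Diagnostics

' Print the current process's memory figures.
Sub DumpMemoryFigures()
    Dim p As Process = Process.GetCurrentProcess()
    ' Working set = physical RAM currently held; this is what Task Manager
    ' shows and what drops when the form is minimized.
    Console.WriteLine("Working set:   " & p.WorkingSet.ToString())
    ' The virtual size stays roughly constant whether minimized or not.
    Console.WriteLine("Virtual size:  " & p.VirtualMemorySize.ToString())
    Console.WriteLine("Private bytes: " & p.PrivateMemorySize.ToString())
End Sub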
 
OK, well that explains the vast difference, but it still leaves the question of how one small form can take up 22 megs right after being loaded.
 
I presume a large percentage of this comes from loading the CLR, or else it is a business arrangement between Microsoft and the memory manufacturers.

Have you tried calling the GC directly to free any unused memory (GC.Collect)? It might help. As well as calling Dispose, try setting the variables to Nothing.
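Something like this at the end of the Load routine, for example (the variable names follow the sketch earlier in the thread; whether it actually changes the Task Manager figure is another question):

' After closing/disposing, drop the references and force a collection.
reader = Nothing
cmd = Nothing
conn = Nothing
GC.Collect()
GC.WaitForPendingFinalizers()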
 
Sorry, what's the CLR?

I am closing, disposing, setting all objects to Nothing, and forcing the collect at the end of the Load, and it's still exactly the same.
 
Yes, this is the basic theory: the more RAM in the system that isn't being consumed, the more RAM the application itself reserves. Weird, but it's a fact.

You would notice that if you ran that 22 meg app on a PC with 64 megs of RAM, it would probably use only 200k, primarily because there isn't enough RAM to spare.

There is a way to get around this, which is to use the SetProcessWorkingSetSize function, which can show the "real" memory usage. My app used around 34 megs, but after calling this function in the form's Activated and Deactivate events, it refreshes to the real memory size, which is 1.2 megs.

Basically, the Task Manager itself, IMO, is just deceiving.
 
Hi Winston,
Do you have an example of the call to SetProcessWorkingSetSize, and is there an extra component to be added to allow it to work? I see it does not work with Win 98.

Thanks
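For what it's worth, here is a rough sketch of how the call is usually declared and used from VB.NET. No extra component should be needed, since it is a plain API Declare against kernel32; as noted above, it is an NT/2000/XP call and will not work on Win 98:

Imports System.Diagnostics

Public Class WorkingSetTrimmer
    ' Win32 API in kernel32 (NT/2000/XP; not available on Win 9x/Me).
    Private Declare Function SetProcessWorkingSetSize Lib "kernel32" ( _
        ByVal hProcess As IntPtr, _
        ByVal dwMinimumWorkingSetSize As Integer, _
        ByVal dwMaximumWorkingSetSize As Integer) As Integer

    ' Passing -1 for both limits asks Windows to trim the working set,
    ' i.e. page out as much of the process's memory as it can.
    Public Shared Sub Trim()
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle, -1, -1)
    End Sub
End Class

You could then call WorkingSetTrimmer.Trim() from the form's Activated and Deactivate events, as described above.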
 