wyrd - Posted February 2, 2003

When dealing with DataSets, would it be wise to keep them as small as possible (i.e., maybe only 100 or so records at a time)? I mean, if you have a database with 1 million records, I'm pretty sure it would significantly impact the speed of your program if you had all of those in the DataSet (in memory). So, what would be the breaking point (if any) where things start becoming sluggish?

Gamer extraordinaire. Programmer wannabe.
Nerseus (*Experts*) - Posted February 3, 2003

I use a general rule of thumb of 250 or 500 records max, depending on the number of columns. Anything more than that and you'd be better off filtering or paging. You want to limit the records not only because they take a lot of client resources, but also because it takes a lot of server power, network traffic, etc. to send more records to the client. Some controls, notably the old VB6 FlexGrid, had trouble displaying too many records as well.

Hopefully you won't ever need to bring back more than 500 records at once. A good app will provide some kind of filter, either through a search or a paging mechanism, to limit the amount of data going across the line. If users *really* need more records, you may have to download entire tables to the client and do the paging there. It's not as fun to code, so I'd avoid it unless you have to.

Was this a "what if" question, or something that has actually come up?

-nerseus

"I want to stand as close to the edge as I can without going over. Out on the edge you see all the kinds of things you can't see from the center." - Kurt Vonnegut
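The paging idea above is language-agnostic, so here is a minimal sketch of it outside ADO.NET, using Python's sqlite3 with LIMIT/OFFSET against a hypothetical customers table (the table name, page size, and helper are illustrative, not from the thread). In ADO.NET itself, DbDataAdapter.Fill has an overload taking startRecord and maxRecords, but it still pulls the full result set from the server and discards the rest, so limiting rows in the query itself is the cheaper route.

```python
import sqlite3

PAGE_SIZE = 250  # in line with the ~250-500 record rule of thumb above

def fetch_page(conn, page):
    """Return one page of rows instead of the whole table."""
    offset = page * PAGE_SIZE
    cur = conn.execute(
        "SELECT id, name FROM customers ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    )
    return cur.fetchall()

# Demo with an in-memory table of 1,000 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO customers (name) VALUES (?)",
    [(f"customer {i}",) for i in range(1000)],
)

# The client only ever holds PAGE_SIZE rows in memory at once.
print(len(fetch_page(conn, 0)))  # -> 250, not 1000
```

The ORDER BY matters: without a stable sort order, pages can overlap or skip rows between queries.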
wyrd (Author) - Posted February 3, 2003

"Was this a "what if" question, or something that has actually come up?"

More or less a random thing I was thinking about while reading up on ADO.NET.