my_lou Posted January 10, 2005
OK, here is the question I have. I am working on a desktop Windows application in VB.NET; the backend is PostgreSQL 7.4.5. I've used strongly typed DataSets quite a bit in the application and they seem to be working fine so far. I've been discussing the use of strongly typed DataSets with my sysadmin, and he claims this approach is not scalable. When I asked him why, here is what he said: "Because you're caching large chunks of the database locally. The overhead is massive, and with a multi-user application, you're going to get flayed alive by write synchronization." My application will be used by approximately 20 users. My DataSets average about 15 tables each, and I don't think any of the tables will exceed 100,000 rows of data. I am aware that DataSets pull data from the database and store it on the local machine, but this is the first I've ever heard about the overhead being massive. Does he have a point?
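For context, here is a minimal sketch of how a strongly typed DataSet gets filled from PostgreSQL in this kind of setup, assuming the Npgsql ADO.NET provider and a hypothetical designer-generated `OrdersDataSet`. The point the sysadmin is raising only bites if you fill whole tables; a parameterized WHERE clause keeps the locally cached chunk small:

```vbnet
Imports System.Data
Imports Npgsql

Public Module DataSetLoader
    ' Fill only the rows a user actually needs instead of the whole table.
    ' OrdersDataSet is a hypothetical strongly typed DataSet generated by
    ' the VS designer; "Orders" is its table, mapped to the orders table.
    Public Function LoadOrdersForCustomer(ByVal connString As String, ByVal customerId As Integer) As OrdersDataSet
        Dim ds As New OrdersDataSet()
        Using conn As New NpgsqlConnection(connString)
            Dim adapter As New NpgsqlDataAdapter("SELECT * FROM orders WHERE customer_id = :cid", conn)
            adapter.SelectCommand.Parameters.Add(New NpgsqlParameter("cid", customerId))
            ' Fill caches the result set client-side; the adapter opens
            ' and closes the connection automatically.
            adapter.Fill(ds, "Orders")
        End Using
        Return ds
    End Function
End Module
```

The overhead of the typed-DataSet wrapper itself is per-row bookkeeping on the client; what actually matters is how many rows you choose to pull across the wire.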
Robby (Moderator) Posted January 10, 2005
The overhead with strongly typed DataSets is massive, but it shouldn't make any difference whether you have 20 or 200 users; each client will cache its own DataSet.
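On the write-synchronization side of the original question: since each client caches its own copy, ADO.NET reconciles concurrent writes optimistically rather than by locking. A hedged sketch of what that looks like, using standard `System.Data` types (the adapter is assumed to have its Insert/Update/Delete commands configured, e.g. by the designer or a command builder):

```vbnet
Imports System.Data
Imports Npgsql

Public Module OrderSaver
    ' Push locally cached changes back to PostgreSQL. With optimistic
    ' concurrency, a row changed by another user since it was fetched
    ' raises DBConcurrencyException instead of silently overwriting.
    Public Sub SaveOrders(ByVal adapter As NpgsqlDataAdapter, ByVal ds As DataSet)
        Try
            adapter.Update(ds, "Orders")
        Catch ex As DBConcurrencyException
            ' ex.Row is the client-side row that conflicted; discard the
            ' local edit here and let the user refresh, retry, or merge.
            ex.Row.RejectChanges()
        End Try
    End Sub
End Module
```

So with 20 users the cost of conflicts shows up as exceptions to handle, not as lock contention on the server.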
my_lou (Author) Posted January 10, 2005
What do you mean by "massive"? I am not talking about result sets containing millions of rows. My biggest strongly typed DataSet will have no more than 200,000 rows (and that's a very rough upper estimate; it will most likely be closer to 100,000). Even if I had a DataSet with 500,000 rows, I still don't think the overhead would be significant enough to affect the performance of the application.