Xtreme .Net Talk

Posted

OK, here is the question I have. I am working on a desktop Windows application in VB.NET with a PostgreSQL 7.4.5 backend. I've used strongly typed DataSets quite a bit in the application, and they seem to be working fine so far.
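For context, here is roughly how one of the tables gets filled. This is just a minimal sketch, assuming the Npgsql provider and a designer-generated typed DataSet called OrdersDataSet; the table, columns, and connection string are placeholders, not my actual schema:

Imports System.Data
Imports Npgsql

Module OrdersLoader
    ' Connection string, table and column names are placeholders.
    Private Const ConnString As String = "Server=dbserver;Database=appdb;User Id=appuser;Password=secret"

    Public Function LoadOrders() As OrdersDataSet
        Dim ds As New OrdersDataSet()
        Dim adapter As New NpgsqlDataAdapter( _
            "SELECT order_id, customer_id, order_date FROM orders", _
            New NpgsqlConnection(ConnString))
        ' Fill opens the connection, pulls the rows into the typed Orders
        ' table on the client, and closes the connection again.
        adapter.Fill(ds.Orders)
        Return ds
    End Function
End Module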

 

So I've been discussing the use of strongly typed DataSets with my sysadmin, and he claims that this approach is not scalable. When I asked him why, here is what he said: "Because you're caching large chunks of the database locally. The overhead is massive, and with a multi-user application, you're going to get flayed alive by write synchronization."
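For what it's worth, my understanding is that a DataSet doesn't hold anything open on the server; changes go back through a DataAdapter, which by default uses optimistic concurrency, so a conflicting edit surfaces as a DBConcurrencyException rather than a lock fight. A rough sketch of the save side, again with placeholder names and assuming the Npgsql provider:

Imports System.Data
Imports Npgsql

Module OrdersSaver
    ' Pushes pending changes in the typed Orders table back to PostgreSQL.
    ' OrdersDataSet and the SQL are the same placeholders as above.
    Public Sub SaveOrders(ByVal ds As OrdersDataSet, ByVal connString As String)
        Dim adapter As New NpgsqlDataAdapter( _
            "SELECT order_id, customer_id, order_date FROM orders", _
            New NpgsqlConnection(connString))
        ' The command builder generates UPDATE/INSERT/DELETE statements whose
        ' WHERE clauses include the originally fetched values, which is what
        ' gives you the optimistic concurrency check.
        Dim builder As New NpgsqlCommandBuilder(adapter)
        Try
            ' Only rows marked Added/Modified/Deleted are sent to the server.
            adapter.Update(ds.Orders)
        Catch ex As DBConcurrencyException
            ' Somebody else changed the same row since it was fetched;
            ' re-fetch, merge, or ask the user what to do.
        End Try
    End Sub
End Module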

 

My application will be used by approximately 20 users, and my DataSets average about 15 tables each. I don't think any of the tables will exceed 100,000 rows. I am aware that DataSets pull data from the database and store it on the local machine, but this is the first I've ever heard of the overhead being called massive.
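And if the local copy ever did get too big, I could restrict what each Fill pulls down with a parameterized SELECT, so a user only caches the slice they're actually working on. Another rough sketch, with the same placeholder names:

Imports System.Data
Imports Npgsql

Module FilteredOrdersLoader
    ' Fill only one customer's rows instead of the whole table
    ' (names and connection string are placeholders again).
    Public Function LoadOrdersForCustomer(ByVal customerId As Integer, ByVal connString As String) As OrdersDataSet
        Dim ds As New OrdersDataSet()
        Dim cmd As New NpgsqlCommand( _
            "SELECT order_id, customer_id, order_date FROM orders WHERE customer_id = @cust", _
            New NpgsqlConnection(connString))
        cmd.Parameters.AddWithValue("@cust", customerId)
        Dim adapter As New NpgsqlDataAdapter(cmd)
        adapter.Fill(ds.Orders)
        Return ds
    End Function
End Module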

 

Does he have a point?

Posted

What do you mean when you say "massive"?

 

I am not talking about result sets containing millions of rows. My biggest strongly typed DataSet will have no more than 200,000 rows (and that's a very rough maximum estimate; it will most likely be something like 100,000). Even if I had a DataSet with 500,000 rows, though, I still don't think the overhead would be significant enough to affect the performance of the application.
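If I really wanted a hard number instead of a guess, I could measure what a Fill actually costs on the client, something along these lines (names and connection string are placeholders):

Imports System.Data
Imports Npgsql

Module DataSetFootprint
    ' Rough measure of how much managed memory one Fill actually uses.
    Public Sub MeasureFill(ByVal connString As String)
        Dim before As Long = GC.GetTotalMemory(True)

        Dim ds As New DataSet()
        Dim adapter As New NpgsqlDataAdapter( _
            "SELECT order_id, customer_id, order_date FROM orders", _
            New NpgsqlConnection(connString))
        adapter.Fill(ds, "orders")

        Dim after As Long = GC.GetTotalMemory(True)
        Console.WriteLine("Rows: {0}, approx. extra bytes: {1}", _
            ds.Tables("orders").Rows.Count, after - before)
    End Sub
End Module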
