alanchinese Posted December 6, 2005

I have a table that contains 60M to 100M rows of data:

    column name    data type    length
    key            char         42
    desc           varchar      200
    ref            int          4

All three columns are indexed and key is the primary key. I am doing the following operation:

    if (!DataIsInTable(key)) InsertIntoTable(key);

It's fairly slow when the number of rows reaches 100M. I wonder if there is some more advanced method to improve its performance? What happens if I don't check whether the data is already in the table — would letting the insert fail be faster than selecting first?

Thanks....
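The post doesn't say which database is in use, so as a minimal sketch of the check-then-insert pattern described above, here it is against SQLite via Python's standard `sqlite3` module; the table `t` and the helper name `insert_if_absent` are made up for illustration, mirroring the columns in the post. Note that each call costs two round trips: a SELECT to check, then an INSERT if the key is missing.

```python
import sqlite3

# Hypothetical table mirroring the post's schema: key (primary), desc, ref.
# "key" and "desc" are quoted because they collide with SQL keywords.
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE t ("key" CHAR(42) PRIMARY KEY, "desc" VARCHAR(200), ref INT)')

def insert_if_absent(key):
    # Two statements per call: one SELECT to probe, one INSERT if absent.
    row = conn.execute('SELECT 1 FROM t WHERE "key" = ?', (key,)).fetchone()
    if row is None:
        conn.execute('INSERT INTO t ("key") VALUES (?)', (key,))

insert_if_absent("abc")
insert_if_absent("abc")  # second call finds the row and skips the INSERT
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 1
```

At 100M rows the SELECT probe itself is an index lookup, so the cost is dominated by doing two lookups (probe plus insert) where one could suffice.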
kejpa Posted December 7, 2005

What database? If you're using MySQL there's a REPLACE command that inserts if the key is not present and replaces the row (actually deletes it and inserts the new one) if it is. Generally I would say trying to insert and failing is faster than checking first and then inserting. If you insert and the key is already present, the insert fails; no harm done.

HTH
/Kejpa
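The insert-and-let-it-fail approach suggested here can be sketched in one round trip. Again the thread names no database, so this is an assumed SQLite/Python illustration (table `t` and helper `insert_or_fail` are hypothetical names): attempt the INSERT and let the primary-key constraint reject duplicates.

```python
import sqlite3

# Hypothetical table mirroring the post's schema; keyword column names quoted.
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE t ("key" CHAR(42) PRIMARY KEY, "desc" VARCHAR(200), ref INT)')

def insert_or_fail(key):
    # One statement: attempt the insert, let the unique index reject duplicates.
    try:
        conn.execute('INSERT INTO t ("key") VALUES (?)', (key,))
        return True
    except sqlite3.IntegrityError:  # duplicate primary key
        return False

print(insert_or_fail("abc"))  # → True
print(insert_or_fail("abc"))  # → False: key already present, no harm done
```

In MySQL the same idea can be written as a single statement with INSERT IGNORE, which silently skips duplicate keys; REPLACE is different, since it deletes the existing row and inserts the new one.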