Convert 8 byte number from Text (CSV) to Database.... How to?

Agent707

Newcomer
Joined
Jul 9, 2004
Messages
19
Location
Knoxville, TN
OK, I'm not sure which forum this question goes in, so I'm putting it here. Probably a database forum thing, though...

Here's my problem.

I have a customer file [CSV] that I'm importing into SQL Server. I'm using Jet to read it into a DataSet, with a schema.ini file to define the columns, because one column can hold up to 1000 characters, so I have to define it as a Memo field or it truncates at 255 characters.

ANYWAY, I have one other column that has VERY large numbers in it, e.g. 4671070409010930000.

This is, in effect, an 8-byte integer.

My problem is, I can't seem to get it into my DataSet without mangling the value. I can't define it as any numeric type in my schema.ini, because the text driver has no decimal type. If I define it as Long, I get an erroneous value. If I define it as Double, I get overflows.
If I define it as Text (which I thought was the logical answer), I get '4.67E+18'... which SQL Server can't convert to anything numeric. Even if it could, I'd lose a lot of the value due to precision.

Is my problem understood?

Any ideas? I need to get it into my DataSet as TEXT without it being changed to '4.67E+18'. SQL Server can convert '4671070409010930000' to a decimal(19,0); it can't convert '4.67E+18' to anything.
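A quick sanity check (just an illustration, using the values above) shows the number itself fits fine in .NET's Decimal once the text arrives intact; it's the scientific-notation form that has already thrown the precision away:
Code:
' The full 19-digit string parses cleanly into a Decimal.
Dim good As Decimal = Decimal.Parse("4671070409010930000")
' The scientific-notation form also parses, but the precision is already gone.
Dim bad As Decimal = Decimal.Parse("4.67E+18", Globalization.NumberStyles.Float)
Console.WriteLine(good) ' 4671070409010930000
Console.WriteLine(bad)  ' 4670000000000000000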

I don't think it's the end of the world if I can't get this column imported. They aren't asking for it to be displayed on the report (the reason for the import).

Here's my code that reads it into a ds.
Code:
    Private Function GetDataSetFromFile() As DataSet
        ' The connection points at the directory; the file name is the "table".
        ' Column types come from the schema.ini in the same directory.
        Dim ds As New DataSet

        Using oleDbCon As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0; Data" & _
                " Source=" & mWorkingImportDirectory & _
                ";Extended Properties=""Text;HDR=Yes;FMT=Delimited""")
            Using oleDbCmd As New OleDbCommand("SELECT * FROM " & mImportFile, oleDbCon)
                Using oleDbDA As New OleDbDataAdapter(oleDbCmd)
                    ' Fill opens and closes the connection itself.
                    oleDbDA.Fill(ds, "CSV")
                End Using
            End Using
        End Using

        Return ds
    End Function
Schema file:
Code:
[ImportFile.csv]
Format=CSVDelimited
CharacterSet=ANSI
ColNameHeader=True
Col1=OrderDate DateTime
Col2=SupplierName Text Width 100
Col3=ItemDescription Memo
Col4=ItemQty Short
Col5=ItemPrice Currency
Col6=Subtotal Currency
Col7=Freight Currency
Col8=Tax Currency
Col9=Total Currency
Col10=AccountCode Long
Col11=ProjectNumber Text Width 19   <----- This is the problem child; changing this width has no effect.
Col12=CostCenter Text Width 10
Col13=BusinessPurpose Text Width 30
Col14=ReferenceNumber Text Width 20
Col15=InvoiceNumber Long
Col16=ShipDate DateTime
Col17=ReceivedDate DateTime
 
If you are reading from a CSV file, could you not load the file into a DataSet directly and then use the SqlBulkCopy class to load the data into SQL Server, ignoring Jet completely?
 
I don't follow you: "load the file into a DataSet directly". What do you mean by that? Reading the file manually, line by line, with a StreamReader or something?
I mean, that's what I'm doing using Jet... loading it directly.

I'm only using Jet to load my DataSet from the file (part a). At that point (part b), I go through and validate row by row, adding each row to a new DataTable, which I insert in a batch via a SqlDataAdapter (stored procedure), reporting invalid rows so they can be fixed in the file. Part b works as intended.

However, I did change my method of inserting the data. Instead of using the SqlDataAdapter to batch insert, I used SqlBulkCopy. It works much faster, and no stored procedure is required.

This does nothing for my problem, though (part a). Please elaborate on what you meant by the quoted statement. And thanks for the SqlBulkCopy tip.
 
The idea was to create and populate the DataSet manually via a StreamReader rather than going through the Jet engine, since Jet seems to be the source of the problem (and, IIRC, limits you to a 32-bit platform, which may be another issue).

A better solution may be to use something like http://www.codeproject.com/KB/database/CsvReader.aspx to open a DataReader directly over the text file, then use SqlBulkCopy with that DataReader, bypassing Jet and the need for a DataTable entirely.
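For illustration, a minimal sketch of that approach with a plain StreamReader (the file path, connection string, and destination table are placeholders, and the naive Split won't cope with quoted fields containing commas; that is exactly what the linked CsvReader handles):
Code:
Imports System.IO
Imports System.Data
Imports System.Data.SqlClient

Module BulkLoadSketch
    Sub BulkLoadCsv()
        ' Keep every column as text and let SQL Server convert
        ' '4671070409010930000' to decimal(19,0) on its side.
        Dim table As New DataTable()
        table.Columns.Add("ProjectNumber", GetType(String))

        Using reader As New StreamReader("C:\Import\ImportFile.csv")
            reader.ReadLine() ' skip the header row
            Dim line As String = reader.ReadLine()
            While line IsNot Nothing
                Dim fields As String() = line.Split(","c) ' naive; no quoted fields
                table.Rows.Add(fields(10)) ' Col11 = ProjectNumber
                line = reader.ReadLine()
            End While
        End Using

        Using bulk As New SqlBulkCopy("Server=.;Database=Import;Integrated Security=SSPI")
            bulk.DestinationTableName = "dbo.ImportStaging"
            bulk.WriteToServer(table)
        End Using
    End Sub
End Module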
 
I have to confess the source of this problem came from Excel, not Jet.

I never opened the .CSV file in Notepad (shame on me) to verify the values were actually the long numbers. I just assumed, but not so. Apparently Excel is pretty dumb when it comes to saving files "as CSV": it outputs the scientific notation (what is displayed) instead of the actual value of the cell. How wonderful. Open Excel, type in a really long number, hit Enter (you'll see the cell value change), and save the file as CSV. You'll see what I'm talking about.
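In other words, using the value from above:
Code:
What's typed into the cell:    4671070409010930000
What "Save as CSV" writes:     4.67E+18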
So I had to format the column as numeric (no decimal digits) THEN export it to the CSV. I'll just have to make sure our vendor knows about this before they start sending the CSV files our way.

All this headache for nothing.

I do agree a custom CSV reader would be better than using Jet, but this application is something that should have taken about 4 hours total to implement... and I'm way over that margin right now, so I'm moving on to reporting.

Thanks for sharing your thoughts.
 