SQL Server Performance Forum – Threads Archive

Trying to export data from a .csv file to a DB

Hello, I'm trying to use an insert script to export data from a .csv file to a database table, but I receive errors in Query Analyzer saying that the specified column names of the .csv file aren't valid, even though the column names are correct. This is the script:
Use market
GO
INSERT INTO [SystemPerformance] (
[DateTimeCounter]
,[Total_ProcessorTime]
,[Total_processorQueueLength]
,[AvailableMbytes]
,[Pages/second]
,[Avg. Disk Queue Length]
,[Avg. Disk sec/Transfer]
,[Disk Time]
,[Network Bytes Received/sec]
,[Network Bytes Sent/sec]
,[Network Bytes Total/sec]
,[Server Bytes Received/sec]
,[Server Bytes Transmitted/sec]
,[Server Bytes Total/sec]
)
SELECT
[(PDH-CSV 4#0) (Pacific Standard Time)(480)]
,cast([Processor(_Total)% Processor Time] as float)
,cast([SystemProcessor Queue Length] as float)
,cast([MemoryAvailable Mbytes] as float)
,cast([MemoryPages/sec#] as float)
,cast([PhysicalDisk(_Total)Avg# Disk Queue Length] as float)
,cast([PhysicalDisk(_Total)Avg# Disk sec/Transfer] as float)
,cast([Physical Disk(_Total)% Disk Time] as float)
,cast([Network Interface(Broadcom NetXtreme Gigabit Ethernet Driver)Bytes Received/sec #] as float)
,cast([Network Interface(Broadcom NetXtreme Gigabit Ethernet Driver)Bytes Sent/sec#] as float)
,cast([Network Interface(Broadcom NetXtreme Gigabit Ethernet Driver)Bytes Total/sec#] as float)
,cast([ServerBytes Received/sec#] as float)
,cast([ServerBytes Transmitted/sec#] as float)
,cast([ServerBytes Total/sec#] as float)
FROM OpenDataSource( 'Microsoft.Jet.OLEDB.4.0',
-- csv file on a local drive
'Data Source=C:\perfLogs;Extended properties=Text')...System_000002#csv

What could be the problem? Thank you.
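A quick way to narrow this down is to ask Jet for everything in the file without naming any columns; the output headings then show exactly which column names the text driver is parsing. This is only a sketch built from the same provider, folder and file name as the script above (the # in System_000002#csv is how the dot in the file name is written for the Jet text driver):

-- Diagnostic sketch: SELECT * returns whatever columns Jet can see in the
-- file, so the result's column headings show the names the driver expects.
SELECT TOP 5 *
FROM OpenDataSource('Microsoft.Jet.OLEDB.4.0',
    'Data Source=C:\perfLogs;Extended properties=Text')...System_000002#csv

If the headings returned there differ from the names used in the INSERT script, that would explain the "column name is not valid" errors.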
Hello Cesar, what is the reason you're not using DTS?
Because every day I have to export a different .csv file name to a DB table, and having to go through the whole DTS wizard every time takes too long. Instead, using a script like this I only have to change the file name and execute it. Also, the data in the .csv file is stored automatically by the server, and the order of its columns is not the same as the order of the DB table columns. I saw the idea in this article: http://www.sql-server-performance.com/ss_performance_monitoring.asp , and the article says: “I use the OpenDataSource function in the import script because in my opinion it is a more flexible solution than the BCP utility, a DTS package, or a BULK INSERT statement.” Do you not agree with that opinion? The insert script in that article is the one that doesn't work for me. Any suggestions, looking at that script and mine?
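For comparison with the alternatives the article mentions, here is a minimal BULK INSERT sketch. The staging table name SystemPerformanceStaging is made up for the example; BULK INSERT maps columns by position rather than by name, so that table would have to declare its columns in exactly the same order as the .csv, which is precisely the inconvenience when, as here, the .csv column order differs from the destination table's.

-- Comparison sketch only (hypothetical staging table): BULK INSERT loads by
-- column position, so the staging table must match the .csv column order.
BULK INSERT SystemPerformanceStaging
FROM 'C:\perfLogs\System_000002.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2          -- skip the perfmon header row
)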

I see one interesting thing: when I execute my insert statement, it returns errors for all columns of the .csv file except the first one. So it seems that the script recognizes only the first column (whichever it is). This is my .csv file: "(PDH-CSV 4.0) (Pacific Standard Time)(480)","Processor(_Total)% Processor Time","SystemProcessor Queue Length",…..
So it can only read "(PDH-CSV 4.0) (Pacific Standard Time)(480)", but not the others.
If I change the order and put another column in the first position, for example like this: "Processor(_Total)% Processor Time","(PDH-CSV 4.0) (Pacific Standard Time)(480)","SystemProcessor Queue Length","MemoryAvailable Mbytes","MemoryPages/sec."….. then the column it can read is "Processor(_Total)% Processor Time", but not the others.
Why can my insert statement only read the first column of the .csv file, and why doesn't it recognize the others, saying that the column names are not valid?
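One more thing worth checking: the Jet text driver also honours an optional schema.ini file placed in the same folder as the .csv, and its settings decide whether the first row is read as column names. This is only a sketch using the driver's documented keys and the file name from the script; if a schema.ini with different settings already exists in C:\perfLogs, it could be one reason the header names are not picked up:

[System_000002.csv]
ColNameHeader=True
Format=CSVDelimited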