I do quite a lot of importing from Excel and have found the easiest approach is to import the file into a staging table and run all the updates/inserts from there, as it is materially quicker once the data is in a local table.
You could create the table on the fly, but I prefer to have the table structure set up in advance and use TransferText to import, since that lets you use a saved import spec.
Loosely, to set this up:
- Create your table with appropriate field names and data types
- Manually import the data from your text file once and save the import spec
- Use VBA to import future text files and trigger the update/insert queries
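The update/insert pair is the usual Access "upsert" pattern: an UPDATE query joined to the temp table to refresh rows that already exist, then an append query for rows that don't. A rough sketch of the two saved queries, assuming a destination table tbl_Data with a key field ID (all table and field names here are illustrative, adjust to your own structure):

```sql
-- qryUpdateQuery: refresh rows that already exist in the destination
UPDATE tbl_Data INNER JOIN tbl_IMPORT_Temp
    ON tbl_Data.ID = tbl_IMPORT_Temp.ID
SET tbl_Data.Amount = tbl_IMPORT_Temp.Amount,
    tbl_Data.Status = tbl_IMPORT_Temp.Status;

-- qryAppendQuery: add rows whose key isn't in the destination yet
INSERT INTO tbl_Data (ID, Amount, Status)
SELECT t.ID, t.Amount, t.Status
FROM tbl_IMPORT_Temp AS t
LEFT JOIN tbl_Data AS d ON t.ID = d.ID
WHERE d.ID Is Null;
```

Running the update before the append means a row is never touched by both queries in the same pass.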
Code could look something like this:
Dim strFilePath As String
Dim strFileName As String
Dim strFile As String

' Folder where files to be imported are saved
strFilePath = "C:\myFolder\"
' Look for a .txt file in that folder
strFileName = Dir(strFilePath & "*.txt")
' Loop through all .txt files until each is imported
Do Until strFileName = ""
    strFile = strFilePath & strFileName
    ' Import the file to a temporary table where "myImportSpec" is what you saved
    ' the import spec as when you did the manual import and "tbl_IMPORT_Temp"
    ' is the table you created to run your queries from
    ' NOTE: This is what I use for .csv files and haven't tested on .txt, but
    ' it should work the same way
    DoCmd.TransferText acImportDelim, "myImportSpec", "tbl_IMPORT_Temp", strFile
    DoCmd.OpenQuery "qryUpdateQuery", acViewNormal
    DoCmd.OpenQuery "qryAppendQuery", acViewNormal
    ' Copy the .txt file to a "Successful" subfolder (which must already exist)
    FileCopy strFile, strFilePath & "Successful\" & strFileName
    ' Delete the original file so it isn't imported again
    Kill strFile
    ' Clear your temp import table; CurrentDb.Execute avoids the
    ' confirmation prompt you would get from DoCmd.RunSQL
    CurrentDb.Execute "DELETE * FROM tbl_IMPORT_Temp", dbFailOnError
    ' Restart the Dir search; the file just deleted won't match again
    strFileName = Dir(strFilePath & "*.txt")
Loop
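If one malformed file shouldn't halt the whole batch, the loop can be wrapped in a Sub with an error handler that parks the problem file and moves on. A sketch under my own assumptions: the Sub name is mine, and it assumes "Failed" and "Successful" subfolders both already exist under the import folder:

```vb
Public Sub ImportAllTextFiles()
    Dim strFilePath As String, strFileName As String, strFile As String
    strFilePath = "C:\myFolder\"
    strFileName = Dir(strFilePath & "*.txt")
    Do Until strFileName = ""
        strFile = strFilePath & strFileName
        On Error GoTo BadFile            ' one bad file shouldn't stop the batch
        DoCmd.TransferText acImportDelim, "myImportSpec", "tbl_IMPORT_Temp", strFile
        DoCmd.OpenQuery "qryUpdateQuery", acViewNormal
        DoCmd.OpenQuery "qryAppendQuery", acViewNormal
        FileCopy strFile, strFilePath & "Successful\" & strFileName
        Kill strFile
NextFile:
        On Error GoTo 0
        ' Clear the temp table before the next file either way
        CurrentDb.Execute "DELETE * FROM tbl_IMPORT_Temp", dbFailOnError
        strFileName = Dir(strFilePath & "*.txt")
    Loop
    Exit Sub
BadFile:
    ' Park the problem file in "Failed" so the loop doesn't retry it forever
    FileCopy strFile, strFilePath & "Failed\" & strFileName
    Kill strFile
    Resume NextFile
End Sub
```

Resume (rather than GoTo) at the end of the handler is what resets VBA's error state so the next iteration can trap its own errors.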
Hope this helps. Simon