
I need to run a query in SQL Server. I have a number of values stored one per line in a text file, and I need to check whether a value in a column of a table matches any of the values stored in the text file.

How should I go about doing this?

I know how to formulate various types of queries in SQL Server; I'm just not sure how to run a query that depends on a file for its parameters.

EDIT:

Issue 1: I am not doing this via a program, because the query I need to run traverses over 7 million data points, which causes the program to time out before it can complete. The only alternative I have left is to run the query in SQL Server itself, where the timeout is not a concern.

Issue 2: I do not have admin rights to the database I am accessing, so there is no way I can create a table, dump the file into it, and then perform a query joining those tables.

Thanks.

  • If a program is going to be invoking this query, why not have the program take care of parsing the text file? That would simplify things. Commented Feb 4, 2016 at 19:41
  • @David: Check my edits to the question; I've explained the issues that I'm facing. Commented Feb 4, 2016 at 19:46
  • A query that hits 7 million datapoints???? Commented Feb 4, 2016 at 19:49
  • @SeanLange: Yes, a query that has to go through close to 7 million data points, compare them with the values in a file, and then display the result. Commented Feb 4, 2016 at 19:51
  • @Stark: Table variables, temp tables, etc. run in the context of your session (i.e. from SSMS, an Access pass-through query, an ODBC client, etc.). You do not need a permanent table. You will need permission to run any sort of query anyway, so a BULK INSERT into one of these should be fine (a minimal sketch follows below). Commented Feb 4, 2016 at 20:01
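In case it helps to see what that looks like, here is a minimal sketch, assuming the file holds one value per line and is reachable by the SQL Server service at the hypothetical path D:\import\values.txt; the temp table #FileValues and its column val are likewise made-up names:

-- A session-scoped temp table: no rights to create permanent tables are required.
CREATE TABLE #FileValues (val varchar(100));

-- One value per line, so the newline is the only terminator needed.
BULK INSERT #FileValues
    FROM 'D:\import\values.txt'
    WITH (ROWTERMINATOR = '\n');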

3 Answers


One option would be to use BULK INSERT and a temp table. Once the data is in the temp table, you can parse the values. This is likely not the exact answer you need, but based on your experience, I'm sure you can tweak it as needed.

Thanks...

SET NOCOUNT ON;

USE Your_DB;
GO

-- Adjust the columns to match your file layout; a temp table (#t) would
-- also work here and does not require rights to create permanent tables.
CREATE TABLE dbo.t (
    i int,
    n varchar(10),
    d decimal(18,4),
    dt datetime
    );
GO

-- The file path must be reachable by the SQL Server service itself.
BULK INSERT dbo.t
    FROM 'D:\import\data.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
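Once the data is loaded, the check itself is an ordinary query. A minimal sketch, assuming the values of interest ended up in column n of dbo.t above, and assuming your table and column are dbo.YourBigTable and SomeColumn (both hypothetical names):

-- Return the rows whose column value appears somewhere in the imported file data.
SELECT b.*
FROM dbo.YourBigTable AS b
WHERE EXISTS (SELECT 1
              FROM dbo.t AS t
              WHERE t.n = b.SomeColumn);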

3 Comments

@DanielG: For this BULK INSERT to work, does the file I am uploading need to be on the same server as the database? Or can I upload a file from my local machine as well?
The file needs to be "reachable" by SQL Server itself. That could be a UNC path (\\MACHINE\DRIVE\File.txt) or a path to a locally accessible drive. I'm not sure about your scenario, but you could FTP or copy the file over first. Maybe also consider using a TABLE TYPE and passing a structured data type as a parameter to a stored procedure; that would involve parsing the file locally into a DataTable first (a sketch of the SQL side follows this comment). If the process happens infrequently, maybe push the file to an FTP site and use a SQL job to pull it down locally. The size of the file/data may affect the decision.
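A rough sketch of the SQL side of that table-type idea, with dbo.ValueList, dbo.CheckValues, dbo.YourBigTable, and SomeColumn all being hypothetical names (note that creating the type and procedure still requires DDL rights; the client fills the parameter from a DataTable):

CREATE TYPE dbo.ValueList AS TABLE (val varchar(100));
GO

CREATE PROCEDURE dbo.CheckValues
    @vals dbo.ValueList READONLY   -- table-valued parameters must be READONLY
AS
BEGIN
    SET NOCOUNT ON;
    -- Return the rows whose column value appears in the passed-in list.
    SELECT b.*
    FROM dbo.YourBigTable AS b
    WHERE b.SomeColumn IN (SELECT v.val FROM @vals AS v);
END;
GO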

There are lots of approaches.

Mine would be to import the file to a table, do the comparison with a regular SQL query, and then delete the file-data table if you don't need it anymore.
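For the cleanup step, a one-line sketch, assuming the file data was loaded into a table named dbo.FileValues (hypothetical name):

-- Drop the imported file data once the comparison is done.
IF OBJECT_ID('dbo.FileValues', 'U') IS NOT NULL
    DROP TABLE dbo.FileValues;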


  1. Bulk import the data from the text file into a temporary table.
  2. Execute a query to do the comparison between your actual physical table and the temporary table (see the sketch below).
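A sketch of step 2, assuming step 1 loaded the file into a temp table #FileValues with a single column val, and assuming the physical table and column are dbo.YourBigTable and SomeColumn (all hypothetical names):

-- An inner join keeps only the rows whose value appears in the file;
-- use EXISTS or DISTINCT instead if the file may contain duplicate values.
SELECT b.*
FROM dbo.YourBigTable AS b
INNER JOIN #FileValues AS f
        ON f.val = b.SomeColumn;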

