
I have around 3000 CSV files, each with 200 entries, so 600,000 entries in total.

I want to import all these files into an SQLite DB (I am using SQLite Manager in Firefox).

What is the best approach to do this? Later on I will need to read the SQLite tables into R, so I should take that into account when importing the files into the DB. For instance, is it a good idea to import all the files into the same table? Or is it better to import each file into a separate table (and in that case, how do I automate the process)?

1 Answer


Probably easiest to put everything into a single table, but that depends on your needs down the road.

All in different tables using filenames (without extension) as table names:

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

# Loop over the CSV files; each file becomes its own table, named after the file
for (filename in list.files(pattern = "\\.csv$")) {
  tmp <- read.csv(filename)  # adjust read arguments to match your files
  dbWriteTable(con, name = strsplit(filename, "\\.")[[1]][1], value = tmp)
}
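
If you go the one-table-per-file route, you can enumerate the tables again later with dbListTables(); a quick sketch for reading them all back into R (reusing the con from above):

# List all tables, then read each one back into a named list of data frames
tables <- dbListTables(con)
dfs <- lapply(tables, function(t) dbReadTable(con, t))
names(dfs) <- tables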

All in one table (using RSQLite):

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

# Append every file to the same table
for (filename in list.files(pattern = "\\.csv$")) {
  tmp <- read.csv(filename)  # adjust read arguments to match your files
  dbWriteTable(con, name = "mytable", value = tmp, append = TRUE)
}
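
If you append everything into one table, it may also be worth tagging each row with its source file so you can still tell the entries apart later. A minimal variation of the loop above (the source_file column name is just an illustration):

for (filename in list.files(pattern = "\\.csv$")) {
  tmp <- read.csv(filename)
  tmp$source_file <- filename  # record which file each row came from
  dbWriteTable(con, name = "mytable", value = tmp, append = TRUE)
}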

All in one table (using sqldf):

library(sqldf)

# Note: "mytable" must already exist in mydb.sqlite with matching columns
for (filename in list.files(pattern = "\\.csv$")) {
  tmp <- read.csv(filename)  # adjust read arguments to match your files
  sqldf("INSERT INTO mytable SELECT * FROM tmp;", dbname = "mydb.sqlite", drv = "SQLite")
}
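
Since you mention reading the tables back into R later: with RSQLite that part is straightforward, e.g. for the single-table layout above:

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

# Pull a subset via SQL, or the whole table at once
alldata <- dbGetQuery(con, "SELECT * FROM mytable;")
# or: alldata <- dbReadTable(con, "mytable")

dbDisconnect(con)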

Side note: a possibly more convenient tool than the Firefox plugin is http://sqlitebrowser.org/ -- it's a matter of preference, but worth a shot, and it's multiplatform as well.
