As the question says: I found that I can use .import in the sqlite shell, but it does not seem to work from within the R environment. Any suggestions?
2 Answers
You can use read.csv.sql in the sqldf package. It takes only one line of code to do the read. Assuming you want to create a new database, testingdb, and then read a file into it, try this:
# create a test file
write.table(iris, "iris.csv", sep = ",", quote = FALSE, row.names = FALSE)
library(sqldf)
# create an empty database.
# can skip this step if the database already exists.
sqldf("attach testingdb as new")
# or: cat(file = "testingdb")
# read into a table called iris in the testingdb sqlite database
read.csv.sql("iris.csv", sql = "create table main.iris as select * from file",
dbname = "testingdb")
# look at first three lines
sqldf("select * from main.iris limit 3", dbname = "testingdb")
The above uses sqldf, which uses RSQLite under the hood. You can also use RSQLite directly; see ?dbWriteTable in RSQLite. Note that there can be problems with line endings if you do it directly with dbWriteTable, problems that sqldf usually handles automatically.
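If you go the RSQLite route, here is a minimal sketch (assuming the same iris.csv and testingdb file names as above, and reading the csv into R first so that dbWriteTable only ever sees a data frame, which sidesteps the line-ending issue):

library(DBI)
library(RSQLite)
# connect to (or create) the database file
con <- dbConnect(SQLite(), dbname = "testingdb")
# read the csv into R, then write it out as a table
dat <- read.csv("iris.csv")
dbWriteTable(con, "iris", dat, overwrite = TRUE)
# check the first three rows
dbGetQuery(con, "select * from iris limit 3")
dbDisconnect(con)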
If your intention is to read the file into R immediately after reading it into the database, and you don't really need the database after that, then see:
http://code.google.com/p/sqldf/#Example_13._read.csv.sql_and_read.csv2.sql
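For that case, a minimal sketch (again assuming the iris.csv file from above): with its defaults, read.csv.sql routes the file through a throwaway temporary sqlite database and simply returns a data frame:

library(sqldf)
# reads the csv via a temporary sqlite database and returns a data frame
iris_df <- read.csv.sql("iris.csv", sql = "select * from file")
head(iris_df, 3)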
5 Comments
The database here is created via the attach statement (not by the front end, sqldf), so sqldf will not delete it. Note that sqldf is intended for ad hoc manipulation of data frames; if you deal with persistent databases you might want to use RSQLite, RH2, or another database interface package directly.

library(sqldf) loads the sqldf package, which is needed to use the sqldf command.

I tend to do that with the sqldf package: Quickly reading very large tables as dataframes in R
Keep in mind that in the above example I read the csv into a temp sqlite db. You'll obviously need to change that bit.
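A minimal sketch of that pattern (bigdf.csv is a hypothetical file name; swap tempfile() for a real path such as "testingdb" if you want the database to persist):

library(sqldf)
# point sqldf at the csv via a file connection
f <- file("bigdf.csv")
# read it through a temporary sqlite database into a data frame;
# replace tempfile() with a path like "testingdb" to keep the database
bigdf <- sqldf("select * from f", dbname = tempfile(),
               file.format = list(header = TRUE, row.names = FALSE))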