
I am using Oracle ApEx and I was wondering on what is the best way to upload a single column csv file into an Oracle table, to be used within ApEx?


4 Answers


I had the same issue. You have probably already found a solution by now, but I followed the method in this link http://forums.oracle.com/forums/thread.jspa?threadID=545565 and it worked for me. It uses external tables. The maximum number of columns is 50.

Another option is http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/ However, I found that if your CSV contains special characters, it does not run properly.

Hope this helps somehow.
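For reference, here is a minimal external-table sketch for a single-column CSV, along the lines of the linked thread. The directory path, file name, and table names below are illustrative placeholders, not taken from the thread:

```sql
-- Sketch only: assumes you can create a directory object pointing at the
-- folder on the database server that holds the CSV file.
create or replace directory csv_dir as '/u01/app/csv';

create table csv_ext (
  value varchar2(400)
)
organization external (
  type oracle_loader
  default directory csv_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
  )
  location ('data.csv')
);

-- The external table can then be queried like any other table:
-- insert into my_table (col1) select value from csv_ext;
```

Note that external tables read files on the database server, so for an end-user upload from APEX you still need a way to get the file onto the server first.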




To load a CSV file using SQL*Loader you will need:

(a) the CSV file
(b) a SQL*Loader control file (also known as a CTL file)
(c) a SQL*Loader command that uses the CTL file in (b) to load the CSV file in (a)

You already have the CSV file. For this example, it is named temp.csv and contains two columns:

A,100
B,200
C,300
D,400

Now you need to create a control file that describes the CSV file, which table to load it into, and how to load it. For this example, the control file is named temp.ctl and the target table is TEMP_TABLE, which has two columns: COLUMN_1 VARCHAR2(2) and COLUMN_2 NUMBER. temp.ctl looks like this:

LOAD DATA
APPEND
INTO TEMP_TABLE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  COLUMN_1 CHAR,
  COLUMN_2 INTEGER EXTERNAL
)
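Finally, step (c): invoke SQL*Loader from the command line, pointing it at the control file. The connect string below is a placeholder; substitute your own credentials and database:

```
sqlldr userid=scott/tiger@mydb control=temp.ctl data=temp.csv log=temp.log
```

SQL*Loader writes a log file (temp.log here) that reports how many rows were loaded or rejected.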

3 Comments

Thanks Mohamed for your help, but I do not want to use SQL*Loader - please see my comment above.
@Mohamed I would definitely need your help with SQL*Loader to load a CSV file into an Oracle DB.

Big files:

  • SQL*Loader
  • external tables

Small files:

  • own parsing

1 Comment

Thanks, but I am not looking at using SQL*Loader, as this will be done by an end user. I would like to allow them to choose a file in Oracle ApEx, upload it from their PC, and then parse it and load it into a table. What is the best way to parse a single-column CSV into an Oracle table? Thanks.
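For the APEX-side flow asked about here, one approach is a sketch along these lines, assuming a recent APEX release where the APEX_DATA_PARSER package is available (19.1 or later), with a file-browse page item stored in APEX_APPLICATION_TEMP_FILES. The item name P1_FILE and target table MY_TABLE are hypothetical:

```sql
-- Sketch: parse an uploaded single-column CSV with APEX_DATA_PARSER.
-- P1_FILE and MY_TABLE are illustrative names, not from this thread.
insert into my_table (col1)
select p.col001
  from apex_application_temp_files f,
       table(apex_data_parser.parse(
               p_content   => f.blob_content,
               p_file_name => f.filename)) p
 where f.name = :P1_FILE;
```

This runs entirely inside APEX (for example in a page process), so the end user only has to pick the file and submit the page.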

You can do it with awk.

Here is an example of how to populate a table from .csv data.

SQL> create table MY_TABLE(a varchar2(100), b varchar2(100));

[oracle@myora ~]$ tail Some_Input_CSV_file
Some Data A 1,Some Data B 1
Some Data A 2,Some Data B 2
Some Data A 3,Some Data B 3
Some Data A 4,Some Data B 4
Some Data A 5,Some Data B 5
Some Data A 6,Some Data B 6
Some Data A 7,Some Data B 7
Some Data A 8,Some Data B 8
Some Data A 9,Some Data B 9
Some Data A 10,Some Data B 10
[oracle@myora ~]$

[oracle@myora ~]$ cat Some_Input_CSV_file | awk -F, '{ printf("insert into MY_TABLE values(trim(\x27%s\x27), trim(\x27%s\x27));\n", $1, $2); }' > RunMe.sql

[oracle@myora ~]$ tail RunMe.sql
insert into MY_TABLE values(trim('Some Data A 1'), trim('Some Data B 1'));
insert into MY_TABLE values(trim('Some Data A 2'), trim('Some Data B 2'));
insert into MY_TABLE values(trim('Some Data A 3'), trim('Some Data B 3'));
insert into MY_TABLE values(trim('Some Data A 4'), trim('Some Data B 4'));
insert into MY_TABLE values(trim('Some Data A 5'), trim('Some Data B 5'));
insert into MY_TABLE values(trim('Some Data A 6'), trim('Some Data B 6'));
insert into MY_TABLE values(trim('Some Data A 7'), trim('Some Data B 7'));
insert into MY_TABLE values(trim('Some Data A 8'), trim('Some Data B 8'));
insert into MY_TABLE values(trim('Some Data A 9'), trim('Some Data B 9'));
insert into MY_TABLE values(trim('Some Data A 10'), trim('Some Data B 10'));
[oracle@myora ~]$

[oracle@myora ~]$ sqlplus myuser/mypwd@myinst

SQL> @RunMe.sql

…

1 row created.

1 row created.

1 row created.

1 row created.

1 row created.

SQL> commit;

Commit complete.

SQL> exit

