Exporting data from Oracle databases is a common task that every DBA and developer faces. Whether you’re migrating data, creating backups, or generating reports, having the right export strategy can save you time and headaches. In this post, I’ll walk you through four powerful methods for exporting data from Oracle, from simple CSV outputs to handling massive datasets with millions of rows.
Setting Up Our Test Data
First, let’s create a sample table to work with throughout this guide:
create table foo(id int, somedata varchar2(100), indate timestamp default systimestamp);
insert into foo (id, somedata)
select level as id,
       dbms_random.string('a', trunc(dbms_random.value(5, 20))) as somedata
from   dual
connect by level <= 10000;
commit;
This creates the table and loads it with 10,000 rows of random data, which is plenty for demonstrating the export techniques below.
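To sanity-check the load before moving on, a quick count confirms the row total (it should report 10000):
select count(*) from foo;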
Method 1: SQLcl CSV Export with Spool
The simplest method uses SQLcl’s built-in CSV formatting with the traditional spool command. This approach is perfect for quick exports and ad-hoc reporting.
set sqlformat csv
spool /tmp/myoutput.csv
select * from foo where id < 10;
spool off
set sqlformat ansiconsole
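CSV isn't the only option here; SQLcl's sqlformat setting also supports output styles such as json, xml, html, and insert (run help set sqlformat in your SQLcl version for the full list). As a minimal variation of the snippet above, the same spool pattern produces JSON instead:
set sqlformat json
spool /tmp/myoutput.json
select * from foo where id < 10;
spool off
set sqlformat ansiconsole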
When to use this:
- Quick, one-off exports
- Small to medium datasets
- When you need immediate results without additional setup
- Perfect for sending query results to non-technical users
Pros: Simple, fast, no additional privileges required
Cons: Not ideal for very large datasets, limited formatting options
Method 2: SQLcl Unload Command
SQLcl’s unload command provides a streamlined way to export entire tables with a single command:
unload table foo dir /tmp/sqlclunload
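One practical note: unload writes its files to the directory you point it at, so if /tmp/sqlclunload doesn't already exist you may need to create it before running the command. SQLcl's host-command shortcut (the same ! syntax used in Method 3 below) handles both that and a quick check of the output afterwards; a minimal sketch:
!mkdir -p /tmp/sqlclunload
!ls -l /tmp/sqlclunload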
When to use this:
- Exporting entire tables quickly
- When you need consistent, reproducible exports
- Batch operations where you’re exporting multiple tables
Pros: Single command simplicity, efficient for whole-table exports
Cons: Less flexibility for filtered or transformed data
Method 3: DBMS_CLOUD for Precise Control
DBMS_CLOUD offers powerful export capabilities with fine-grained control over the output format. Here’s how to set it up:
-- Setup: Create directory for output
!mkdir -p /tmp/csvout
create directory csvout as '/tmp/csvout';
grant read,write on directory csvout to public;
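Keep in mind that the ! host command runs on the machine where SQLcl itself is running, while the directory object maps to a path on the database server, so this setup assumes SQLcl is running on (or shares a filesystem with) the database host. You can confirm the directory object is in place with a quick query against the data dictionary:
select directory_name, directory_path
from   all_directories
where  directory_name = 'CSVOUT';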
-- Simple CSV export
begin
  dbms_cloud.export_data(
    file_uri_list => 'csvout:foo.csv',
    format        => json_object('type' value 'csv'),
    query         => 'select * from foo where id < 10'
  );
end;
/
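The format parameter isn't limited to CSV; DBMS_CLOUD.EXPORT_DATA can also write other formats such as JSON. A minimal sketch of the same export as JSON (the foo.json file name is just an example):
begin
  dbms_cloud.export_data(
    file_uri_list => 'csvout:foo.json',
    format        => json_object('type' value 'json'),
    query         => 'select * from foo where id < 10'
  );
end;
/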
When to use this:
- When you need custom queries with filtering and transformations
- Exporting to cloud storage (when configured)
- When you need specific format control
Pros: Flexible query options, format control, cloud integration support
Cons: Requires directory setup and grants, more complex syntax
Note for on-premises databases: If you’re using DBMS_CLOUD outside of Autonomous Database, you’ll need to configure it appropriately for your environment.
Method 4: Handling Large Tables with Chunked Exports
When dealing with tables containing millions of rows, exporting everything at once can cause memory issues or create unwieldy files. The solution? Export in chunks using the OFFSET and FETCH NEXT clauses:
set serveroutput on;
declare
  v_chunk_size constant pls_integer := 1000000; -- 1M rows per file
  v_offset     number := 0;
  v_batch      number := 1;
  v_total_rows number;
begin
  -- Calculate total rows to determine loop iterations
  select count(*) into v_total_rows from foo;
  while v_offset < v_total_rows loop
    dbms_output.put_line('Exporting chunk ' || v_batch ||
                         ' starting at offset ' || v_offset);
    dbms_cloud.export_data(
      credential_name => null,
      file_uri_list   => 'csvout:foo_' || to_char(v_batch, 'fm000') || '.csv',
      format          => json_object('type' value 'csv'),
      -- order by a unique key so each row lands in exactly one chunk
      query           => 'select * from foo order by id offset ' || v_offset ||
                         ' rows fetch next ' || v_chunk_size || ' rows only'
    );
    v_offset := v_offset + v_chunk_size;
    v_batch  := v_batch + 1;
  end loop;
  dbms_output.put_line('Export completed successfully.');
end;
/
This approach creates multiple files (foo_001.csv, foo_002.csv, and so on), each containing up to 1 million rows. The ORDER BY on a unique key matters here: OFFSET/FETCH without an ORDER BY makes no guarantee about row order, so consecutive chunks could skip or duplicate rows. Adjust the chunk size to suit your data; with the 10,000-row sample table, a chunk size of 2,500 would produce four files.
Bonus Method 4.1: Adding Compression
For even better efficiency, especially when dealing with text-heavy data or transferring files over networks, add gzip compression:
declare
  v_chunk_size constant pls_integer := 1000000;
  v_offset     number := 0;
  v_batch      number := 1;
  v_total_rows number;
begin
  select count(*) into v_total_rows from foo;
  while v_offset < v_total_rows loop
    dbms_output.put_line('Exporting chunk ' || v_batch ||
                         ' starting at offset ' || v_offset);
    dbms_cloud.export_data(
      credential_name => null,
      file_uri_list   => 'csvout:foo_' || to_char(v_batch, 'fm000') || '.csv',
      format          => json_object('type' value 'csv',
                                     'compression' value 'gzip'),
      -- same ordered pagination as before, now with gzip-compressed output
      query           => 'select * from foo order by id offset ' || v_offset ||
                         ' rows fetch next ' || v_chunk_size || ' rows only'
    );
    v_offset := v_offset + v_chunk_size;
    v_batch  := v_batch + 1;
  end loop;
  dbms_output.put_line('Export completed successfully.');
end;
/
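To verify the compressed output, list the directory and peek at one file. Depending on your database version the generated files may or may not carry a .gz suffix, so check the actual names first; the foo_001.csv.gz name below is only an assumption:
!ls -l /tmp/csvout
-- assumes the exporter appended a .gz suffix; adjust to the name you actually see
!zcat /tmp/csvout/foo_001.csv.gz | head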
When to use chunked exports:
- Tables with millions of rows
- When file size limitations exist
- To enable parallel processing of exported data
- To minimize memory usage during export
Pros: Handles massive datasets, creates manageable file sizes, reduces memory pressure
Cons: More complex code, creates multiple files to manage
Choosing the Right Method
Here’s a quick decision tree to help you choose:
- Need a quick export of a small query result? → Use SQLcl CSV with spool (Method 1)
- Exporting entire tables regularly? → Use SQLcl unload (Method 2)
- Need filtered data with specific formatting? → Use DBMS_CLOUD (Method 3)
- Dealing with millions of rows? → Use chunked DBMS_CLOUD exports (Method 4)
- Need compression for network transfer? → Use DBMS_CLOUD with gzip (Method 4.1)
Conclusion
Oracle provides multiple pathways for data export, each suited to different scenarios. Start with simple methods like SQLcl’s CSV export for everyday tasks, but don’t hesitate to leverage DBMS_CLOUD’s power when you need more control or when dealing with large-scale exports. The chunked export approach is particularly valuable for production environments where you’re regularly exporting massive datasets.
Remember to always test your export strategy with a subset of data first, and consider factors like file permissions, disk space, and network bandwidth when planning large exports.
Happy exporting!