MySQL & Database Design

Import & Export Data

Importing and exporting data is essential for data migration, integration with external systems, bulk operations, and transferring data between environments. In this lesson, you'll learn efficient methods for importing CSV files, exporting query results, handling large datasets, and transforming data during import.

Why Import/Export Matters

Understanding common use cases helps you choose the right tool:

Common Import/Export Scenarios:

✓ Migrating data from another database system
✓ Bulk loading data from CSV/Excel files
✓ Exporting reports for analysis in Excel/Google Sheets
✓ Transferring data between development and production
✓ Integrating with external APIs or systems
✓ Creating data archives
✓ Loading test data for development
✓ Sharing datasets with partners or clients
✓ Backing up data in a portable format

LOAD DATA INFILE - Fast Bulk Import

LOAD DATA INFILE is MySQL's fastest method for importing large datasets from text files:

Basic LOAD DATA INFILE Syntax:

LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;  -- Skip header row

Example: Import users from CSV

-- CSV file: users.csv
-- id,name,email,age
-- 1,John Doe,john@example.com,30
-- 2,Jane Smith,jane@example.com,25

LOAD DATA INFILE '/var/lib/mysql-files/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id, name, email, age);
Security Note: By default, MySQL only allows loading files from the secure_file_priv directory (usually /var/lib/mysql-files/). Check with: SHOW VARIABLES LIKE 'secure_file_priv';
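The value tells you exactly what is allowed. A quick reference for interpreting the result (the directory shown is a common default; yours may differ):

SHOW VARIABLES LIKE 'secure_file_priv';
-- ''                       → no restriction: any path the server can read/write
-- NULL                     → file import/export is disabled entirely
-- '/var/lib/mysql-files/'  → files may only be read from / written to this directory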

Advanced LOAD DATA Options

Handle complex file formats and data transformations:

Specify Column Mapping:

LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(product_name, @price_str, @qty_str, category)
SET price = CAST(@price_str AS DECIMAL(10,2)),
    quantity = CAST(@qty_str AS UNSIGNED),  -- MySQL CAST uses SIGNED/UNSIGNED, not INT
    created_at = NOW();

Handle Different Delimiters:

-- Tab-separated values
LOAD DATA INFILE '/var/lib/mysql-files/data.tsv'
INTO TABLE data
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

-- Pipe-delimited file
LOAD DATA INFILE '/var/lib/mysql-files/data.txt'
INTO TABLE data
FIELDS TERMINATED BY '|'
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Handle Windows Line Endings:

LOAD DATA INFILE '/var/lib/mysql-files/windows_file.csv'
INTO TABLE data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'  -- Windows CRLF
IGNORE 1 ROWS;

LOAD DATA LOCAL INFILE - Import from Client

Load files from the client machine instead of the server:

Enable LOCAL INFILE (if disabled):

-- In my.cnf:
[mysqld]
local_infile = 1

-- Or enable at runtime (this is a server-wide setting, not per session):
SET GLOBAL local_infile = 1;

-- The client must also permit it, e.g. start mysql with --local-infile=1

Load from client computer:

LOAD DATA LOCAL INFILE '/home/user/Downloads/data.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Security Warning: LOAD DATA LOCAL INFILE can be a security risk as it allows reading any file the MySQL user can access. Enable only when necessary and disable after use.
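Since local_infile is server-wide, a sensible pattern is to wrap a one-off import in an enable/disable pair. A minimal sketch, assuming you have privileges to change global variables:

-- Enable just before the import
SET GLOBAL local_infile = 1;

-- ... run the LOAD DATA LOCAL INFILE statement here ...

-- Disable again immediately afterwards
SET GLOBAL local_infile = 0;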

SELECT INTO OUTFILE - Export Data

Export query results to a file on the server:

Basic Export:

SELECT * FROM users
INTO OUTFILE '/var/lib/mysql-files/users_export.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Export with Column Headers:

-- Prepend a header row with UNION ALL
SELECT 'id', 'name', 'email', 'created_at'
UNION ALL
SELECT id, name, email, created_at FROM users
INTO OUTFILE '/var/lib/mysql-files/users_with_headers.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Export Specific Columns:

SELECT
    user_id,
    username,
    email,
    DATE_FORMAT(created_at, '%Y-%m-%d') AS signup_date
FROM users
WHERE status = 'active'
INTO OUTFILE '/var/lib/mysql-files/active_users.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Tip: INTO OUTFILE will fail if the file already exists. Delete the old file first or use a timestamp in the filename to create unique exports.
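INTO OUTFILE takes a literal string, not an expression, so a timestamped filename has to be built dynamically. A minimal sketch using a prepared statement (table name and path are illustrative, and this assumes your MySQL version permits SELECT ... INTO OUTFILE inside a prepared statement):

SET @file = CONCAT('/var/lib/mysql-files/users_',
                   DATE_FORMAT(NOW(), '%Y%m%d_%H%i%s'), '.csv');
SET @sql = CONCAT('SELECT * FROM users INTO OUTFILE ', QUOTE(@file),
                  ' FIELDS TERMINATED BY '','' LINES TERMINATED BY ''\\n''');
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;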

Exporting with mysqldump for Data Migration

Use mysqldump for portable, database-independent exports:

Export Specific Tables:

mysqldump -u root -p database_name table1 table2 > tables_export.sql

Export Data Only (No CREATE TABLE):

mysqldump -u root -p --no-create-info database_name > data_only.sql

Export with WHERE Condition:

mysqldump -u root -p database_name users \
  --where="created_at >= '2024-01-01'" > recent_users.sql

Export as CSV (not SQL):

mysqldump -u root -p database_name users \
  --tab=/var/lib/mysql-files/ \
  --fields-terminated-by=',' \
  --fields-enclosed-by='"'
# Creates users.sql (structure) and users.txt (data)
# --tab must write into a directory the server allows (check secure_file_priv)

Export to Remote Server:

mysqldump -u root -p database_name | \
  ssh remote_user@remote_server "mysql -u root -p remote_database"
# Note: the remote -p prompt cannot read a password over the pipe;
# store credentials in an option file (e.g. ~/.my.cnf) on the remote host instead

Importing Large CSV Files Efficiently

Optimize performance when importing millions of rows:

Optimize Before Import:

-- Disable nonunique indexes to speed up the bulk insert
-- (MyISAM only; InnoDB ignores DISABLE KEYS)
ALTER TABLE large_table DISABLE KEYS;

-- Import data
LOAD DATA INFILE '/var/lib/mysql-files/large_data.csv'
INTO TABLE large_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

-- Re-enable keys and rebuild indexes
ALTER TABLE large_table ENABLE KEYS;

Split Large Files:

# Split a 10-million-row file into 1-million-row chunks
split -l 1000000 large_file.csv chunk_

# Import each chunk (the files must be placed in the secure_file_priv directory)
for file in chunk_*; do
  mysql -u root -p -e "LOAD DATA INFILE '/var/lib/mysql-files/$file' \
    INTO TABLE data FIELDS TERMINATED BY ',' \
    LINES TERMINATED BY '\n';"
done

Use Transactions for InnoDB:

START TRANSACTION;

LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
INTO TABLE innodb_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

COMMIT;
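For InnoDB tables, where DISABLE KEYS has no effect, a common complementary trick is to relax constraint checking for the duration of the load. A sketch that is only safe when the incoming data is already known to be valid:

SET autocommit = 0;
SET unique_checks = 0;       -- skip uniqueness verification during the load
SET foreign_key_checks = 0;  -- skip foreign key verification during the load

LOAD DATA INFILE '/var/lib/mysql-files/large_data.csv'
INTO TABLE innodb_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;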

Data Transformation During Import

Clean and transform data as you import:

Convert Data Types:

LOAD DATA INFILE '/var/lib/mysql-files/sales.csv'
INTO TABLE sales
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(@sale_date, @amount, product_name, @quantity)
SET sale_date = STR_TO_DATE(@sale_date, '%m/%d/%Y'),
    amount = CAST(REPLACE(@amount, '$', '') AS DECIMAL(10,2)),
    quantity = CAST(@quantity AS UNSIGNED),
    created_at = NOW();

Handle NULL Values:

LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(name, @description, @price)
SET description = NULLIF(@description, ''),
    price = IF(@price = '', NULL, @price);

Generate Calculated Fields:

LOAD DATA INFILE '/var/lib/mysql-files/employees.csv'
INTO TABLE employees
FIELDS TERMINATED BY ','
(first_name, last_name, @email_prefix, salary)
SET email = CONCAT(@email_prefix, '@company.com'),
    full_name = CONCAT(first_name, ' ', last_name),
    monthly_salary = salary / 12;

Exporting Data via Command Line

Export data directly from the shell without writing SQL files:

Export to CSV with mysql client:

mysql -u root -p --batch --skip-column-names database_name \
  -e "SELECT * FROM users" \
  | sed 's/\t/,/g' > users.csv
# Note: naive conversion; fields containing commas or tabs are not quoted

Export with Headers:

# Piped output uses tab-separated batch mode and keeps the header row
mysql -u root -p database_name -e "SELECT * FROM users" \
  | sed 's/\t/,/g' > users_with_headers.csv

Export Formatted Report:

mysql -u root -p database_name -e "
  SELECT
    CONCAT(first_name, ' ', last_name) AS 'Full Name',
    email AS 'Email Address',
    DATE_FORMAT(created_at, '%Y-%m-%d') AS 'Signup Date'
  FROM users
  WHERE status = 'active'
  ORDER BY created_at DESC
" > active_users_report.txt

Export JSON Format:

# Produces one JSON object per line (NDJSON)
mysql -u root -p --batch --skip-column-names database_name -e "
  SELECT JSON_OBJECT(
    'id', id,
    'name', name,
    'email', email
  ) AS json_data
  FROM users
" > users.json

Handling Import Errors

Deal with data quality issues during import:

Skip Invalid Rows:

LOAD DATA INFILE '/var/lib/mysql-files/dirty_data.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(name, @price, quantity)
SET price = IF(@price REGEXP '^[0-9.]+$', @price, NULL);

Log Import Warnings:

LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

-- Check for warnings
SHOW WARNINGS;

-- Count affected rows
SELECT ROW_COUNT();

Use REPLACE to Handle Duplicates:

LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
REPLACE INTO TABLE products  -- Replaces existing rows that share a primary/unique key
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

Use IGNORE to Skip Duplicates:

LOAD DATA INFILE '/var/lib/mysql-files/data.csv'
IGNORE INTO TABLE products  -- Skip rows with duplicate keys
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
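When a file is too messy to validate inline, a common pattern is to load everything into an all-text staging table first, then copy across only the rows that pass validation. A sketch (staging_products is a hypothetical throwaway table):

-- Staging table: every column is text, so nothing is rejected on load
CREATE TABLE staging_products (
    name     VARCHAR(255),
    price    VARCHAR(50),
    quantity VARCHAR(50)
);

LOAD DATA INFILE '/var/lib/mysql-files/dirty_data.csv'
INTO TABLE staging_products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

-- Copy only rows that validate; inspect whatever is left behind afterwards
INSERT INTO products (name, price, quantity)
SELECT name,
       CAST(price AS DECIMAL(10,2)),
       CAST(quantity AS UNSIGNED)
FROM staging_products
WHERE price REGEXP '^[0-9]+(\\.[0-9]+)?$'
  AND quantity REGEXP '^[0-9]+$';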

Cross-Database Data Transfer

Transfer data between different MySQL databases or servers:

Copy Table Within Same Server:

-- Copy structure and data
CREATE TABLE new_db.users LIKE old_db.users;
INSERT INTO new_db.users SELECT * FROM old_db.users;

-- Or in one step (copies data but not indexes or keys)
CREATE TABLE new_db.users AS SELECT * FROM old_db.users;

Transfer Between Servers:

# Export from source server, pipe straight into the destination
mysqldump -h source_server -u user -p source_db table_name | \
  mysql -h dest_server -u user -p dest_db

Selective Data Transfer:

# Transfer only recent orders
mysqldump -h source_server -u root -p shop_db orders \
  --where="order_date >= '2024-01-01'" | \
  mysql -h dest_server -u root -p archive_db

Incremental Data Sync:

INSERT INTO target_db.users
SELECT * FROM source_db.users
WHERE updated_at > (
    SELECT COALESCE(MAX(updated_at), '1970-01-01')
    FROM target_db.users
);
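After any transfer, it's worth confirming both sides agree before trusting the copy. A quick sanity check, assuming both schemas are reachable from a single connection:

SELECT
    (SELECT COUNT(*) FROM source_db.users) AS source_rows,
    (SELECT COUNT(*) FROM target_db.users) AS target_rows;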

Working with Excel Files

Import and export data for Excel users:

Export Excel-Compatible CSV:

SELECT * FROM products
INTO OUTFILE '/var/lib/mysql-files/products_excel.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n';  -- Windows line ending

Import from Excel CSV:

-- Save the Excel file as CSV first
LOAD DATA INFILE '/var/lib/mysql-files/excel_export.csv'
INTO TABLE products
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;

Handle Excel Encoding Issues:

LOAD DATA INFILE '/var/lib/mysql-files/excel_utf8.csv'
INTO TABLE products
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
Excel Tip: Always save Excel files as "CSV UTF-8" to preserve special characters. Standard CSV may lose accents and non-ASCII characters.
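One wrinkle with "CSV UTF-8": Excel prepends a byte-order mark (BOM), which arrives glued to the very first value in the file. IGNORE 1 ROWS normally discards it along with the header; if your file has no header row, a sketch for stripping it manually (the file name is illustrative):

LOAD DATA INFILE '/var/lib/mysql-files/excel_noheader.csv'
INTO TABLE products
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(@name, price)
SET name = REPLACE(@name, CONVERT(UNHEX('EFBBBF') USING utf8mb4), '');  -- strip UTF-8 BOM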

Automated Import/Export Script

Create a reusable script for regular import/export tasks:

automated_export.sh:

#!/bin/bash
DB_NAME="shop_db"
DB_USER="export_user"
DB_PASS="ExportPass2024"
EXPORT_DIR="/var/exports/daily"   # must be writable by the server (check secure_file_priv)
DATE=$(date +%Y%m%d)

# Create export directory
mkdir -p "$EXPORT_DIR"

# Export users (note the escaped \" so the quote survives the shell)
mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" -e "
  SELECT * FROM users
  INTO OUTFILE '$EXPORT_DIR/users_$DATE.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
  LINES TERMINATED BY '\n'
"

# Export orders with join
mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" -e "
  SELECT o.order_id, u.email, o.total_amount, o.order_date
  FROM orders o
  JOIN users u ON o.user_id = u.id
  WHERE o.order_date = CURDATE()
  INTO OUTFILE '$EXPORT_DIR/daily_orders_$DATE.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
  LINES TERMINATED BY '\n'
"

# Compress exports
gzip "$EXPORT_DIR"/*_"$DATE".csv

echo "Export completed: $(date)"

Performance Comparison

Understanding performance differences helps choose the right tool:

Speed Comparison (1 million rows):

LOAD DATA INFILE:    ~5 seconds   ★★★★★  Fastest; loads straight from file to table
INSERT ... SELECT:   ~30 seconds  ★★★★☆  Good for same-server transfers
mysqldump + mysql:   ~45 seconds  ★★★☆☆  Portable; good for migration
Individual INSERTs:  ~5 minutes   ★☆☆☆☆  Slowest; use only for small datasets

Recommendations:
• < 1,000 rows: INSERT statements are fine
• 1,000 - 100,000 rows: use LOAD DATA INFILE or mysqldump
• > 100,000 rows: always use LOAD DATA INFILE
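For the small-dataset end of that scale, a single multi-row INSERT usually covers it (the values here are illustrative):

INSERT INTO products (name, price) VALUES
    ('Widget',  9.99),
    ('Gadget', 19.99),
    ('Gizmo',   4.50);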

Practical Exercise:

Task: Import product data, transform it, and export a report:

  1. Create products table
  2. Import CSV with price conversion (remove $ sign)
  3. Calculate discounted prices
  4. Export products over $100 to new CSV

Solution:

-- 1. Create table
CREATE TABLE products (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100),
    original_price DECIMAL(10,2),
    discount_percent INT,
    final_price DECIMAL(10,2),
    imported_at DATETIME
);

-- 2. Import with transformation
-- CSV format: name,price,discount
-- Example: "Laptop","$1200",10
LOAD DATA INFILE '/var/lib/mysql-files/products.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(name, @price_str, discount_percent)
SET original_price = CAST(REPLACE(@price_str, '$', '') AS DECIMAL(10,2)),
    final_price = CAST(REPLACE(@price_str, '$', '') AS DECIMAL(10,2))
                  * (1 - discount_percent / 100),
    imported_at = NOW();

-- 3. Verify import
SELECT name, original_price, discount_percent, final_price
FROM products
LIMIT 10;

-- 4. Export premium products
SELECT name, original_price, discount_percent, final_price
FROM products
WHERE final_price > 100
ORDER BY final_price DESC
INTO OUTFILE '/var/lib/mysql-files/premium_products.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Summary

In this lesson, you learned:

  • LOAD DATA INFILE for fast bulk imports from CSV files
  • SELECT INTO OUTFILE for exporting query results
  • Advanced options for handling different file formats
  • Data transformation during import (type conversion, calculated fields)
  • Error handling with REPLACE and IGNORE
  • Performance optimization for large datasets
  • Cross-database and cross-server data transfer
  • Working with Excel-compatible formats
  • Automated import/export scripts
  • Performance comparison of different methods

Module Complete! You've completed Module 6: Security & User Management. You now know how to manage users and privileges, implement database security best practices, perform backups and recovery, and efficiently import/export data. Next, you'll explore real-world database design patterns in Module 7!