📤 Exporting Data from MySQL

Data Backup and Export - Formats and Methods in Detail

Overview

MySQL offers several ways to export a database, a table, or a query result to files in different formats. Exporting is a key part of backing up, migrating, analyzing, and sharing data.

🔧 mysqldump

The official command-line tool; exports complete database structure and data

📊 SELECT INTO OUTFILE

A SQL statement that writes query results directly to a file

🖥️ MySQL Workbench

A graphical tool supporting multiple export formats

📝 Third-Party Tools

Export features provided by phpMyAdmin, Navicat, and similar tools

📄 SQL

Standard SQL script format

📊 CSV

Comma-separated values

📋 TSV

Tab-separated values

📑 JSON

JavaScript Object Notation

📈 XML

Extensible Markup Language

📊 Excel

Microsoft Excel format

Preparing Test Data

```sql
-- Create the test database and tables
CREATE DATABASE export_demo;
USE export_demo;

-- Users table
CREATE TABLE users (
    id INT AUTO_INCREMENT PRIMARY KEY,
    username VARCHAR(50) NOT NULL UNIQUE,
    email VARCHAR(100) NOT NULL,
    age INT,
    city VARCHAR(50),
    salary DECIMAL(10,2),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

-- Orders table
CREATE TABLE orders (
    id INT AUTO_INCREMENT PRIMARY KEY,
    user_id INT NOT NULL,
    order_number VARCHAR(50) NOT NULL UNIQUE,
    total_amount DECIMAL(10,2) NOT NULL,
    status ENUM('pending', 'paid', 'shipped', 'delivered', 'cancelled') DEFAULT 'pending',
    order_date DATE NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);

-- Products table
CREATE TABLE products (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    description TEXT,
    price DECIMAL(10,2) NOT NULL,
    category VARCHAR(50),
    stock_quantity INT DEFAULT 0,
    is_active BOOLEAN DEFAULT TRUE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Insert test data
INSERT INTO users (username, email, age, city, salary) VALUES
('alice', 'alice@example.com', 28, 'Beijing', 8500.00),
('bob', 'bob@example.com', 32, 'Shanghai', 12000.00),
('charlie', 'charlie@example.com', 25, 'Guangzhou', 7500.00),
('diana', 'diana@example.com', 29, 'Shenzhen', 9500.00),
('eve', 'eve@example.com', 35, 'Hangzhou', 11000.00),
('frank', 'frank@example.com', 27, 'Chengdu', 8000.00),
('grace', 'grace@example.com', 31, 'Wuhan', 9000.00),
('henry', 'henry@example.com', 26, 'Xian', 7800.00),
('iris', 'iris@example.com', 33, 'Nanjing', 10500.00),
('jack', 'jack@example.com', 30, 'Tianjin', 9200.00);

INSERT INTO products (name, description, price, category, stock_quantity) VALUES
('iPhone 15', 'Latest Apple smartphone', 7999.00, 'Electronics', 50),
('MacBook Pro', 'Professional laptop', 15999.00, 'Electronics', 30),
('Nike Air Max', 'Running shoes', 899.00, 'Sports', 100),
('Adidas T-shirt', 'Cotton sports t-shirt', 199.00, 'Clothing', 200),
('Samsung TV', '55-inch 4K Smart TV', 4999.00, 'Electronics', 25),
('Coffee Maker', 'Automatic coffee machine', 1299.00, 'Home', 40),
('Yoga Mat', 'Premium yoga mat', 299.00, 'Sports', 80),
('Wireless Headphones', 'Bluetooth headphones', 599.00, 'Electronics', 60),
('Office Chair', 'Ergonomic office chair', 1899.00, 'Furniture', 35),
('Water Bottle', 'Stainless steel bottle', 89.00, 'Sports', 150);

INSERT INTO orders (user_id, order_number, total_amount, status, order_date) VALUES
(1, 'ORD-2024-001', 8898.00, 'delivered', '2024-01-15'),
(2, 'ORD-2024-002', 15999.00, 'delivered', '2024-01-20'),
(3, 'ORD-2024-003', 1198.00, 'shipped', '2024-02-01'),
(4, 'ORD-2024-004', 4999.00, 'paid', '2024-02-05'),
(5, 'ORD-2024-005', 1898.00, 'delivered', '2024-02-10'),
(1, 'ORD-2024-006', 688.00, 'pending', '2024-02-15'),
(6, 'ORD-2024-007', 299.00, 'delivered', '2024-02-20'),
(7, 'ORD-2024-008', 599.00, 'shipped', '2024-02-25'),
(8, 'ORD-2024-009', 1988.00, 'paid', '2024-03-01'),
(9, 'ORD-2024-010', 89.00, 'delivered', '2024-03-05');

SELECT 'Test data created successfully' AS message;
SELECT COUNT(*) AS user_count FROM users;
SELECT COUNT(*) AS product_count FROM products;
SELECT COUNT(*) AS order_count FROM orders;
```

Exporting with mysqldump

Basic Export Syntax

Command-line basics

```shell
# Basic syntax
mysqldump [options] database_name [table_name] > output_file.sql

# Export an entire database
mysqldump -u username -p export_demo > export_demo_backup.sql

# Export a single table
mysqldump -u username -p export_demo users > users_backup.sql

# Export several tables
mysqldump -u username -p export_demo users orders > users_orders_backup.sql

# Export all databases
mysqldump -u username -p --all-databases > all_databases_backup.sql

# Export several databases
mysqldump -u username -p --databases db1 db2 db3 > multiple_databases_backup.sql
```

Common Export Options

| Option | Description | Example |
| --- | --- | --- |
| --single-transaction | Export within a single transaction (InnoDB) | --single-transaction |
| --lock-tables | Lock tables during export (MyISAM) | --lock-tables |
| --no-data | Export table structure only, no data | --no-data |
| --no-create-info | Export data only, no table structure | --no-create-info |
| --where | Add a WHERE condition | --where="age > 25" |
| --complete-insert | Generate INSERT statements that include column names | --complete-insert |
| --extended-insert | Use multi-row INSERT statements | --extended-insert |
| --compress | Compress data between client and server | --compress |
| --hex-blob | Export binary data in hexadecimal | --hex-blob |
| --routines | Export stored procedures and functions | --routines |
| --triggers | Export triggers | --triggers |
| --events | Export events | --events |

Practical Export Examples

```shell
# 1. Full database export (recommended for production)
mysqldump -u root -p \
  --single-transaction \
  --routines \
  --triggers \
  --events \
  --hex-blob \
  --complete-insert \
  export_demo > export_demo_full_backup.sql

# 2. Structure only
mysqldump -u root -p \
  --no-data \
  --routines \
  --triggers \
  --events \
  export_demo > export_demo_structure.sql

# 3. Data only
mysqldump -u root -p \
  --no-create-info \
  --complete-insert \
  export_demo > export_demo_data.sql

# 4. Conditional export
mysqldump -u root -p \
  --single-transaction \
  --where="age >= 30" \
  export_demo users > users_age_30_plus.sql

# 5. Export to a compressed file
mysqldump -u root -p \
  --single-transaction \
  --routines \
  --triggers \
  export_demo | gzip > export_demo_backup.sql.gz

# 6. Export from a remote host
mysqldump -h remote_host -P 3306 -u username -p \
  --single-transaction \
  export_demo > remote_export_demo.sql

# 7. Export a single table's data only
#    (mysqldump cannot select individual columns; use SELECT INTO OUTFILE for that)
mysqldump -u root -p \
  --no-create-info \
  --complete-insert \
  export_demo users > users_only.sql

# 8. Batch-export several databases
for db in db1 db2 db3; do
  mysqldump -u root -p --single-transaction $db > ${db}_backup_$(date +%Y%m%d).sql
done
```
💡 mysqldump best practices:
  • InnoDB tables: use --single-transaction for a consistent snapshot
  • MyISAM tables: use --lock-tables to lock the tables
  • Large databases: consider --compress to reduce network transfer
  • Production: run exports during off-peak hours
  • Security: avoid typing the password directly on the command line
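For the last point, mysqldump can read credentials from an option file instead of the command line, so the password never appears in the process list or shell history. A minimal sketch; the file name ./backup.cnf and the credential values are placeholders:

```shell
# Create a credentials file readable only by its owner (placeholder values)
cat > ./backup.cnf <<'EOF'
[client]
user=backup_user
password=backup_password
host=localhost
EOF
chmod 600 ./backup.cnf

# mysqldump then picks up the credentials from the file:
# mysqldump --defaults-extra-file=./backup.cnf --single-transaction export_demo > backup.sql
```

Note that --defaults-extra-file must be the first option on the mysqldump command line.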

Exporting with SELECT INTO OUTFILE

Basic Syntax and Examples

```sql
-- Basic syntax
SELECT column1, column2, ...
INTO OUTFILE 'file_path'
FIELDS TERMINATED BY 'delimiter'
LINES TERMINATED BY 'line_ending'
FROM table_name
WHERE condition;

-- 1. Export as CSV
SELECT id, username, email, age, city, salary
INTO OUTFILE '/tmp/users_export.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM users;

-- 2. Export as TSV (tab-separated)
SELECT id, username, email, age, city, salary
INTO OUTFILE '/tmp/users_export.tsv'
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
FROM users;

-- 3. Conditional export
SELECT username, email, age, salary
INTO OUTFILE '/tmp/high_salary_users.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM users
WHERE salary > 9000;

-- 4. Export a join result
SELECT u.username, u.email, o.order_number, o.total_amount, o.status, o.order_date
INTO OUTFILE '/tmp/user_orders.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM users u
JOIN orders o ON u.id = o.user_id
ORDER BY o.order_date DESC;

-- 5. Export aggregated data
SELECT city, COUNT(*) AS user_count, AVG(age) AS avg_age, AVG(salary) AS avg_salary
INTO OUTFILE '/tmp/city_statistics.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM users
GROUP BY city
ORDER BY user_count DESC;

-- 6. CSV with a header row
-- First export the header
SELECT 'ID', 'Username', 'Email', 'Age', 'City', 'Salary'
INTO OUTFILE '/tmp/users_with_header.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
-- Then export the data (note: INTO OUTFILE cannot append, so merge the files manually)
SELECT id, username, email, age, city, salary
INTO OUTFILE '/tmp/users_data_only.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM users;

-- 7. Custom delimiter
SELECT id, username, email, age
INTO OUTFILE '/tmp/users_pipe_delimited.txt'
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
FROM users;

-- 8. Export as JSON (MySQL 5.7+)
SELECT JSON_OBJECT(
    'id', id, 'username', username, 'email', email,
    'age', age, 'city', city, 'salary', salary
) AS user_json
INTO OUTFILE '/tmp/users.json'
LINES TERMINATED BY '\n'
FROM users;
```
⚠️ SELECT INTO OUTFILE caveats:
  • File permissions: the MySQL server needs write access to the target directory
  • File path: must be an absolute path on the server
  • Overwriting: the statement fails if the file already exists
  • secure_file_priv: check MySQL's file-export restriction setting
  • Character encoding: mind the character set to avoid garbled output
```sql
-- Check the file-export setting
SHOW VARIABLES LIKE 'secure_file_priv';
-- If secure_file_priv is set to a path, you can only export into that directory
-- If it is NULL, file export is disabled entirely
-- If it is an empty string, you can export to any location

-- Inspect the exported file (on the server):
-- cat /tmp/users_export.csv

-- Verify the number of exported records
SELECT COUNT(*) AS exported_records FROM users;
```
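Because INTO OUTFILE cannot append to an existing file, the header and data files from example 6 have to be merged afterwards. A minimal sketch of that merge step, shown with small stand-in files so it runs anywhere; on the server you would substitute the real /tmp paths:

```python
def merge_csv(header_path, data_path, out_path):
    """Concatenate a one-line header file and a data file into out_path."""
    with open(out_path, "w", encoding="utf-8") as out:
        for path in (header_path, data_path):
            with open(path, "r", encoding="utf-8") as part:
                out.write(part.read())

# Self-contained demo: create two small stand-in files, then merge them.
with open("header.csv", "w", encoding="utf-8") as f:
    f.write('"ID","Username"\n')
with open("data.csv", "w", encoding="utf-8") as f:
    f.write('"1","alice"\n"2","bob"\n')

merge_csv("header.csv", "data.csv", "merged.csv")
```

The same two-file approach also works with `cat header.csv data.csv > merged.csv` on the shell.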

Exporting with MySQL Workbench

Export Steps in the GUI

  1. Connect to the database: open a connection to the target database in MySQL Workbench
  2. Choose an export method
    • Server → Data Export (export an entire database)
    • Right-click a table → Table Data Export Wizard (export a single table)
  3. Select the objects to export: pick the databases, tables, and options
  4. Configure the export target
    • Export to Dump Project Folder (one file per table in a folder)
    • Export to Self-Contained File (a single dump file)
  5. Advanced options
    • Include Create Schema (include the CREATE DATABASE statement)
    • Export Routines (stored procedures and functions)
    • Export Events
    • Export Triggers
  6. Run the export: click "Start Export"

📄 SQL Dump

Standard SQL script format

📊 CSV

Comma-separated values

📑 JSON

JSON export

📈 XML

XML export

Export Examples in Application Code

Python Export Example

```python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import mysql.connector
import csv
import json
import pandas as pd
from datetime import datetime

# Database connection configuration
config = {
    'host': 'localhost',
    'user': 'root',
    'password': 'your_password',
    'database': 'export_demo',
    'charset': 'utf8mb4'
}

def export_to_csv(query, filename):
    """Export query results to a CSV file."""
    conn = None
    try:
        conn = mysql.connector.connect(**config)
        cursor = conn.cursor()
        cursor.execute(query)
        results = cursor.fetchall()
        column_names = [desc[0] for desc in cursor.description]

        with open(filename, 'w', newline='', encoding='utf-8') as csvfile:
            writer = csv.writer(csvfile)
            writer.writerow(column_names)  # header row
            writer.writerows(results)      # data rows

        print(f"Data exported to {filename} successfully")
    except mysql.connector.Error as err:
        print(f"Error: {err}")
    finally:
        if conn is not None and conn.is_connected():
            cursor.close()
            conn.close()

def export_to_json(query, filename):
    """Export query results to a JSON file."""
    conn = None
    try:
        conn = mysql.connector.connect(**config)
        cursor = conn.cursor(dictionary=True)
        cursor.execute(query)
        results = cursor.fetchall()

        # Serialize datetime objects so json.dump accepts them
        for row in results:
            for key, value in row.items():
                if isinstance(value, datetime):
                    row[key] = value.isoformat()

        with open(filename, 'w', encoding='utf-8') as jsonfile:
            json.dump(results, jsonfile, ensure_ascii=False, indent=2)

        print(f"Data exported to {filename} successfully")
    except mysql.connector.Error as err:
        print(f"Error: {err}")
    finally:
        if conn is not None and conn.is_connected():
            cursor.close()
            conn.close()

def export_to_excel(query, filename):
    """Export query results to an Excel file."""
    conn = None
    try:
        conn = mysql.connector.connect(**config)
        # Read the data with pandas
        df = pd.read_sql(query, conn)
        # Write to Excel
        df.to_excel(filename, index=False, engine='openpyxl')
        print(f"Data exported to {filename} successfully")
    except Exception as err:
        print(f"Error: {err}")
    finally:
        if conn is not None and conn.is_connected():
            conn.close()

def export_multiple_tables():
    """Export several tables into separate worksheets of one workbook."""
    conn = None
    try:
        conn = mysql.connector.connect(**config)
        with pd.ExcelWriter('export_demo_all_tables.xlsx', engine='openpyxl') as writer:
            # Users
            users_df = pd.read_sql('SELECT * FROM users', conn)
            users_df.to_excel(writer, sheet_name='Users', index=False)
            # Orders
            orders_df = pd.read_sql('SELECT * FROM orders', conn)
            orders_df.to_excel(writer, sheet_name='Orders', index=False)
            # Products
            products_df = pd.read_sql('SELECT * FROM products', conn)
            products_df.to_excel(writer, sheet_name='Products', index=False)
            # Aggregated statistics
            stats_query = """
                SELECT city, COUNT(*) AS user_count,
                       AVG(age) AS avg_age, AVG(salary) AS avg_salary
                FROM users
                GROUP BY city
                ORDER BY user_count DESC
            """
            stats_df = pd.read_sql(stats_query, conn)
            stats_df.to_excel(writer, sheet_name='Statistics', index=False)

        print("All tables exported to export_demo_all_tables.xlsx successfully")
    except Exception as err:
        print(f"Error: {err}")
    finally:
        if conn is not None and conn.is_connected():
            conn.close()

# Usage examples
if __name__ == "__main__":
    # Users to CSV
    export_to_csv(
        "SELECT id, username, email, age, city, salary FROM users",
        "users_export.csv"
    )
    # Delivered orders to JSON
    export_to_json(
        "SELECT * FROM orders WHERE status = 'delivered'",
        "delivered_orders.json"
    )
    # User statistics to Excel
    export_to_excel(
        """
        SELECT city, COUNT(*) AS user_count,
               AVG(age) AS avg_age, AVG(salary) AS avg_salary
        FROM users
        GROUP BY city
        ORDER BY user_count DESC
        """,
        "user_statistics.xlsx"
    )
    # All tables into one workbook
    export_multiple_tables()
```

PHP Export Example

```php
<?php
// Database connection configuration
$host = 'localhost';
$dbname = 'export_demo';
$username = 'root';
$password = 'your_password';

try {
    $pdo = new PDO("mysql:host=$host;dbname=$dbname;charset=utf8mb4", $username, $password);
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
} catch (PDOException $e) {
    die("Connection failed: " . $e->getMessage());
}

/**
 * Export data to a CSV file
 */
function exportToCSV($pdo, $query, $filename) {
    $stmt = $pdo->prepare($query);
    $stmt->execute();

    $file = fopen($filename, 'w');
    // Write a BOM so spreadsheet apps detect UTF-8 (helps with CJK text)
    fwrite($file, "\xEF\xBB\xBF");

    // Column names
    $columnCount = $stmt->columnCount();
    $headers = [];
    for ($i = 0; $i < $columnCount; $i++) {
        $meta = $stmt->getColumnMeta($i);
        $headers[] = $meta['name'];
    }

    // Header row
    fputcsv($file, $headers);

    // Data rows
    while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
        fputcsv($file, $row);
    }

    fclose($file);
    echo "Data exported to $filename successfully\n";
}

/**
 * Export data to a JSON file
 */
function exportToJSON($pdo, $query, $filename) {
    $stmt = $pdo->prepare($query);
    $stmt->execute();
    $data = $stmt->fetchAll(PDO::FETCH_ASSOC);

    $json = json_encode($data, JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE);
    file_put_contents($filename, $json);
    echo "Data exported to $filename successfully\n";
}

/**
 * Export data to an XML file
 */
function exportToXML($pdo, $query, $filename, $rootElement = 'data', $rowElement = 'row') {
    $stmt = $pdo->prepare($query);
    $stmt->execute();

    $xml = new XMLWriter();
    $xml->openURI($filename);
    $xml->startDocument('1.0', 'UTF-8');
    $xml->setIndent(true);
    $xml->startElement($rootElement);

    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $xml->startElement($rowElement);
        foreach ($row as $key => $value) {
            $xml->writeElement($key, $value);
        }
        $xml->endElement();
    }

    $xml->endElement();
    $xml->endDocument();
    $xml->flush();
    echo "Data exported to $filename successfully\n";
}

/**
 * Export data to an HTML table
 */
function exportToHTML($pdo, $query, $filename, $title = 'Data Export') {
    $stmt = $pdo->prepare($query);
    $stmt->execute();

    $html  = "<!DOCTYPE html>\n<html>\n<head>\n";
    $html .= "<meta charset=\"UTF-8\">\n";
    $html .= "<title>$title</title>\n";
    $html .= "</head>\n<body>\n";
    $html .= "<h1>$title</h1>\n";
    $html .= "<table border=\"1\">\n<thead>\n<tr>\n";

    // Header row from column metadata
    $columnCount = $stmt->columnCount();
    for ($i = 0; $i < $columnCount; $i++) {
        $meta = $stmt->getColumnMeta($i);
        $html .= "<th>" . htmlspecialchars($meta['name']) . "</th>\n";
    }
    $html .= "</tr>\n</thead>\n<tbody>\n";

    // Data rows
    while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
        $html .= "<tr>\n";
        foreach ($row as $cell) {
            $html .= "<td>" . htmlspecialchars((string)$cell) . "</td>\n";
        }
        $html .= "</tr>\n";
    }

    $html .= "</tbody>\n</table>\n</body>\n</html>\n";
    file_put_contents($filename, $html);
    echo "Data exported to $filename successfully\n";
}

// Usage examples
exportToCSV($pdo, "SELECT * FROM users", "users_export.csv");
exportToJSON($pdo, "SELECT * FROM orders", "orders_export.json");
exportToXML($pdo, "SELECT * FROM products", "products_export.xml", "products", "product");

$statsQuery = "
    SELECT city, COUNT(*) AS user_count,
           AVG(age) AS avg_age, AVG(salary) AS avg_salary
    FROM users
    GROUP BY city
    ORDER BY user_count DESC
";
exportToHTML($pdo, $statsQuery, "user_statistics.html", "User Statistics by City");

echo "All exports completed!\n";
?>
```

Performance Optimization and Best Practices

🚀 Large Data Sets

Export in batches to avoid memory exhaustion and long table locks

⏰ Timing

Run exports during off-peak hours to minimize impact on production

🔒 Data Consistency

Use transactions or locks to keep exported data consistent

📦 Compression

Compress large files to save storage space and transfer time

🔐 Security

Mask sensitive data; restrict export permissions and file access

📊 Monitoring and Logging

Log export operations for auditing and troubleshooting
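For the security point above, masking can happen in the export script before anything is written to disk. A small sketch with illustrative rules; the function names and the masking policy here are examples, not a standard API:

```python
def mask_email(email: str) -> str:
    """Keep the first two characters of the local part, hide the rest."""
    local, _, domain = email.partition("@")
    return local[:2] + "***@" + domain

def mask_row(row: dict) -> dict:
    """Return a copy of a result row with sensitive fields masked."""
    masked = dict(row)
    if "email" in masked:
        masked["email"] = mask_email(masked["email"])
    if "salary" in masked:
        masked["salary"] = None  # drop salary values entirely from exports
    return masked

# Demo with a row shaped like the users table in this tutorial
row = {"username": "alice", "email": "alice@example.com", "salary": 8500.0}
print(mask_row(row))  # {'username': 'alice', 'email': 'al***@example.com', 'salary': None}
```

Applying `mask_row` to each dictionary returned by a `cursor(dictionary=True)` fetch slots directly into the export_to_json example above.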

Optimizing Exports of Large Data Sets

```sql
-- Exporting a large table in batches

-- Method 1: LIMIT and OFFSET
SET @batch_size = 10000;
SET @offset = 0;
-- Loop over batches (the loop must be driven by a script; note that user
-- variables in LIMIT require a prepared statement or a stored procedure)
SELECT * FROM large_table LIMIT @batch_size OFFSET @offset;

-- Method 2: primary-key ranges
SET @start_id = 1;
SET @end_id = 10000;
SELECT * FROM large_table WHERE id BETWEEN @start_id AND @end_id;

-- Method 3: a batching loop in a stored procedure
DELIMITER //
CREATE PROCEDURE ExportLargeTable()
BEGIN
    DECLARE done INT DEFAULT FALSE;
    DECLARE batch_count INT DEFAULT 0;
    DECLARE total_records INT;

    -- Total record count
    SELECT COUNT(*) INTO total_records FROM large_table;

    -- Process in batches
    WHILE batch_count * 10000 < total_records DO
        -- The export logic for one batch would go here
        SELECT CONCAT('Processing batch ', batch_count + 1) AS status;
        SET batch_count = batch_count + 1;
    END WHILE;

    SELECT CONCAT('Export completed. Total batches: ', batch_count) AS result;
END //
DELIMITER ;

-- Parallel export (several connections exporting different ranges at once)
-- Connection 1: ids 1-100000
SELECT * FROM large_table WHERE id BETWEEN 1 AND 100000;
-- Connection 2: ids 100001-200000
SELECT * FROM large_table WHERE id BETWEEN 100001 AND 200000;
-- Connection 3: ids 200001-300000
SELECT * FROM large_table WHERE id BETWEEN 200001 AND 300000;
```
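The LIMIT/OFFSET pattern in method 1 degrades as the offset grows, because each batch rescans every skipped row. Keyset pagination (WHERE id > last_id ORDER BY id LIMIT n) avoids that. A sketch of the loop, demonstrated with SQLite so it runs self-contained; against MySQL, the same SQL works through mysql.connector with %s placeholders instead of ?:

```python
import sqlite3

def export_in_batches(conn, batch_size=2):
    """Yield rows batch by batch using keyset pagination on the primary key."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, username FROM users WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # resume after the last id seen

# Self-contained demo with an in-memory table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(i, f"user{i}") for i in range(1, 6)])

batches = list(export_in_batches(conn, batch_size=2))
print([len(b) for b in batches])  # [2, 2, 1]
```

Each batch can then be written out with the CSV or JSON helpers shown earlier, keeping memory use bounded regardless of table size.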

Automating Exports with a Script

```shell
#!/bin/bash
# Automated MySQL data export script

# Configuration
DB_HOST="localhost"
DB_USER="backup_user"
DB_PASS="backup_password"
DB_NAME="export_demo"
BACKUP_DIR="/backup/mysql"
DATE=$(date +%Y%m%d_%H%M%S)
LOG_FILE="$BACKUP_DIR/export_$DATE.log"

# Create the backup directory
mkdir -p "$BACKUP_DIR"

# Logging helper
log_message() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" | tee -a "$LOG_FILE"
}

# Export a whole database
export_database() {
    local db_name=$1
    local output_file="$BACKUP_DIR/${db_name}_$DATE.sql"

    log_message "Starting export of database: $db_name"

    mysqldump -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" \
        --single-transaction \
        --routines \
        --triggers \
        --events \
        --hex-blob \
        --complete-insert \
        "$db_name" > "$output_file"

    if [ $? -eq 0 ]; then
        log_message "Database $db_name exported successfully to $output_file"

        # Compress the backup file
        gzip "$output_file"
        log_message "Backup file compressed: ${output_file}.gz"

        # Report the file size
        local file_size=$(du -h "${output_file}.gz" | cut -f1)
        log_message "Compressed file size: $file_size"
    else
        log_message "ERROR: Failed to export database $db_name"
        return 1
    fi
}

# Export a single table
export_table() {
    local db_name=$1
    local table_name=$2
    local output_file="$BACKUP_DIR/${db_name}_${table_name}_$DATE.sql"

    log_message "Starting export of table: $db_name.$table_name"

    mysqldump -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" \
        --single-transaction \
        --complete-insert \
        "$db_name" "$table_name" > "$output_file"

    if [ $? -eq 0 ]; then
        log_message "Table $table_name exported successfully"
        gzip "$output_file"
    else
        log_message "ERROR: Failed to export table $table_name"
        return 1
    fi
}

# Remove old backups
cleanup_old_backups() {
    local retention_days=7
    log_message "Cleaning up backups older than $retention_days days"
    find "$BACKUP_DIR" -name "*.sql.gz" -mtime +$retention_days -delete
    log_message "Old backups cleaned up"
}

# Send a notification
send_notification() {
    local status=$1
    local message=$2
    # Email notification
    echo "$message" | mail -s "MySQL Export $status" admin@example.com
    # Push to a monitoring system
    # curl -X POST "http://monitoring-system/alert" -d "status=$status&message=$message"
}

# Main flow
main() {
    log_message "Starting MySQL export process"

    # Export the whole database
    if export_database "$DB_NAME"; then
        log_message "Database export completed successfully"
        send_notification "SUCCESS" "Database $DB_NAME exported successfully"
    else
        log_message "Database export failed"
        send_notification "FAILED" "Database $DB_NAME export failed"
        exit 1
    fi

    # Export specific tables (optional)
    # export_table "$DB_NAME" "users"
    # export_table "$DB_NAME" "orders"

    # Remove old backups
    cleanup_old_backups

    log_message "Export process completed"
}

main

# To schedule the script, run crontab -e and add a line like the
# following to run the backup every day at 2:00 AM:
# 0 2 * * * /path/to/mysql_export.sh
```
🎯 Export best-practice summary:
  1. Pick the right method: choose the tool that fits the data volume and requirements
  2. Ensure consistency: use transactions or locks to guarantee complete, consistent data
  3. Optimize performance: export large tables in batches to avoid long locks
  4. Compress and store: compress export files to save storage space
  5. Security and permissions: restrict who may export and mask sensitive data
  6. Automate and monitor: build automated export pipelines with monitoring
  7. Test and verify: regularly confirm that exports can actually be restored
```sql
-- Clean up the example data
DROP DATABASE IF EXISTS export_demo;
SELECT 'MySQL data export tutorial completed. Example database cleaned up.' AS message;
```