
Importing and Exporting SQL Server Data

Use the following procedures to import data into and export data from an Amazon RDS SQL DB instance.

Importing Data into SQL Server on Amazon RDS

If you have an existing Microsoft SQL Server deployment that you want to move to Amazon RDS, the complexity of your task depends on the size of your database and the types of database objects that you are transferring. For example, a database that contains data sets on the order of gigabytes, along with stored procedures and triggers, is going to be more complicated than a simple database with only a few megabytes of test data and no triggers or stored procedures.

The Amazon RDS for SQL Server service does not currently support restoring a database from a backup file (RESTORE DATABASE ... FROM DISK), because the database and log file backups must be local to the SQL Server instance. Similarly, FILESTREAM is not supported at this time.

The BULK INSERT and OPENROWSET(BULK...) statements are not supported as import methods, because they depend on the ADMINISTER BULK OPERATIONS permission, which is not granted for SQL Server DB instances. Use the process outlined below to import data into a SQL Server DB instance.

The process that we recommend to import data into a SQL Server DB instance is as follows:

  1. Create a DB Instance.

  2. Before you load data into the destination DB Instance, you should do some preparation, such as disabling foreign key constraints and database triggers. You should also disable automated backups.

  3. Query the source SQL Server instance for any logins that you want to import to the destination DB Instance.

  4. In your existing SQL Server deployment, generate scripts that obtain data from the source SQL Server instance, and then apply the scripts to the destination DB Instance. If you have existing scripts, you can apply those scripts to the destination DB Instance. If you are importing a large dataset, your script can define only the database schema; otherwise, it can also include the data and all other database objects.

  5. After your data is imported, reverse any preparations that you made earlier, re-enable foreign key constraints and database triggers, switch the recovery model to its original state, and then re-enable automated backups.

Note

Amazon RDS for SQL Server does not currently support importing data into the msdb database, though we do support SQL Server Agent jobs. Some SQL Server features that use the msdb database, such as Database Mail and Replication, are not currently supported in Amazon RDS.

Preparing to Import Data into Your SQL Server DB Instance

Before you import data into your SQL Server DB Instance, we recommend the following best practices:

  • Stop applications from accessing the destination DB Instance.

  • Create a snapshot of the target database.

  • Disable automated backups on the target database.

  • Disable foreign key constraints, if applicable.

  • Drop indexes, if applicable. (A query that can help identify candidate indexes follows this list.)

  • Disable database triggers, if applicable.
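
If you decide to drop nonclustered indexes before the load, a query similar to the following against the sys.indexes catalog view can help you identify candidates. This query is only a starting point and is not part of the required import procedure; review the results, and script out any index definitions before you drop them so that you can re-create the indexes later.

--List nonclustered indexes that are candidates to script out and drop before the load
SELECT OBJECT_SCHEMA_NAME(i.object_id) AS schema_name,
       OBJECT_NAME(i.object_id) AS table_name,
       i.name AS index_name
FROM sys.indexes i
JOIN sys.tables t ON i.object_id = t.object_id
WHERE i.type_desc = 'NONCLUSTERED'
  AND i.is_primary_key = 0
  AND i.is_unique_constraint = 0;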

Stop Applications from Accessing the Target DB Instance

If you prevent access to your DB Instance while you are importing data, data transfer will be faster. Additionally, you won't need to worry about conflicts while data is being loaded if other applications cannot write to the DB Instance at the same time. If something goes wrong and you have to roll back to a prior database snapshot, the only changes that you will lose will be the imported data, which you can import again after you resolve the issue.

For information about controlling access to your DB Instance, see Working with DB Security Groups.

Create a Database Snapshot

If the target database is already populated with data, we recommend that you take a snapshot of the database before you import the data. If something goes wrong with the data import or you want to discard the changes, you can restore the database to its previous state by using the snapshot. For information about database snapshots, see Creating a DB Snapshot.

Note

When you take a database snapshot, I/O operations to the database are suspended for about 10 seconds while the backup is in progress.

Disable Automated Backups

Disabling automated backups on the target DB Instance will improve performance while you are importing your data because Amazon RDS doesn't log transactions when automatic backups are disabled. There are, however, some things to consider. Because automated backups are required to perform a point-in-time recovery, you won't be able to restore the database to a specific point in time while you are importing data. Additionally, any automated backups that were created on the DB Instance are erased. You can still use previous snapshots to recover the database, and any snapshots that you have taken will remain available. For information about automated backups, see Working With Automated Backups.

Disable Foreign Key Constraints

If you need to disable foreign key constraints, you can do so with the following script.

--Disable foreign keys on all tables
DECLARE @table_name SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE table_cursor CURSOR FOR SELECT name FROM sys.tables;

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table_name;

WHILE @@FETCH_STATUS = 0 BEGIN
  SELECT @cmd = 'ALTER TABLE '+QUOTENAME(@table_name)+' NOCHECK CONSTRAINT ALL';
  EXEC (@cmd);
  FETCH NEXT FROM table_cursor INTO @table_name;
END

CLOSE table_cursor;
DEALLOCATE table_cursor;

GO  
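
If you want to confirm which foreign key constraints are currently disabled before you start loading data, you can run a query such as the following against the sys.foreign_keys catalog view. This check is optional and is not part of the script above.

--List foreign key constraints and whether each one is currently disabled
SELECT OBJECT_SCHEMA_NAME(parent_object_id) AS schema_name,
       OBJECT_NAME(parent_object_id) AS table_name,
       name AS constraint_name,
       is_disabled
FROM sys.foreign_keys
ORDER BY schema_name, table_name, constraint_name;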

Disable Database Triggers

If you need to disable database triggers, you can do so with the following script.

--Disable triggers on all tables
DECLARE @enable BIT = 0;
DECLARE @trigger SYSNAME;
DECLARE @table SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE trigger_cursor CURSOR FOR SELECT trigger_object.name trigger_name,
 table_object.name table_name
FROM sysobjects trigger_object
JOIN sysobjects table_object ON trigger_object.parent_obj = table_object.id
WHERE trigger_object.type = 'TR';

OPEN trigger_cursor;
FETCH NEXT FROM trigger_cursor INTO @trigger, @table;

WHILE @@FETCH_STATUS = 0 BEGIN
  IF @enable = 1
     SET @cmd = 'ENABLE ';
  ELSE
     SET @cmd = 'DISABLE ';

  SET @cmd = @cmd + ' TRIGGER dbo.'+QUOTENAME(@trigger)+' ON dbo.'+QUOTENAME(@table)+' ';
  EXEC (@cmd);
  FETCH NEXT FROM trigger_cursor INTO @trigger, @table;
END

CLOSE trigger_cursor;
DEALLOCATE trigger_cursor;

GO  

Import Logins to Your SQL Server DB Instance

SQL Server stores logins and passwords in the master database. Because Amazon RDS does not grant access to the master database, you cannot directly import logins and passwords into your destination DB Instance. Instead, you must query the master database on the source SQL Server instance to generate a DDL file that includes all logins and passwords that you want to add to the destination DB Instance, as well as role memberships and permissions that you want to transfer.

For information about querying the master database, go to How to Transfer the Logins and the Passwords Between Instances of SQL Server 2005 and SQL Server 2008 on the Microsoft Knowledge Base.

The output of the script is another script that you can run on the destination DB Instance. Where the script in the Knowledge Base article has the following:

p.type IN 

Use the following instead:

p.type = 'S' 
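
In addition, if you want a quick view of which SQL Server authentication logins (the type 'S' logins that the modified script targets) exist on the source instance, a query such as the following against the sys.sql_logins catalog view can help. This query is provided for illustration only and is not part of the Knowledge Base procedure.

--On the source instance, list SQL Server authentication logins that you might want to re-create
SELECT name, default_database_name
FROM sys.sql_logins
WHERE name NOT LIKE '##%'   --exclude internal certificate-mapped logins
  AND name <> 'sa';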

Import the Data

Microsoft SQL Server Management Studio is a graphical SQL Server client that is included in all Microsoft SQL Server editions except the Express Edition. SQL Server Management Studio Express is available from Microsoft as a free download.

Note

SQL Server Management Studio is available only as a Windows-based application.

SQL Server Management Studio includes the following tools, which are useful in importing data to a SQL Server DB Instance:

  • Generate and Publish Scripts Wizard

  • Import and Export Wizard

  • Bulk copy feature

Generate and Publish Scripts Wizard

The Generate and Publish Scripts Wizard creates a script that contains the schema of a database, the data itself, or both. If you generate a script for a database in your local SQL Server deployment, you can then run the script to transfer the information that it contains to an Amazon RDS DB Instance.

Note

For databases of 1 GB or larger, it is more efficient to script only the database schema and then use the Import and Export Wizard or the bulk copy feature of SQL Server to transfer the data.

For detailed information about the Generate and Publish Scripts Wizard, see the Microsoft SQL Server documentation.

In the wizard, pay particular attention to the advanced options on the Set Scripting Options page to ensure that everything you want your script to include is selected. For example, by default, database triggers are not included in the script.

When the script is generated and saved, you can use SQL Server Management Studio to connect to your DB Instance and then run the script.

Import and Export Wizard

The Import and Export Wizard creates a special Integration Services package, which you can use to copy data from your local SQL Server database to the destination DB Instance. The wizard can filter which tables, and even which rows within a table, are copied to the destination DB Instance.

Note

The Import and Export Wizard works well for large datasets, but it may not be the fastest way to remotely export data from your local deployment. For an even faster way, you may want to consider the SQL Server bulk copy feature.

For detailed information about the Import and Export Wizard, go to the Microsoft SQL Server documentation.

In the wizard, on the Choose a Destination page, do the following:

  • In the Server Name box, enter the name of the endpoint for your DB Instance.

  • For the server authentication mode, click Use SQL Server Authentication.

  • Under User name and Password, enter the credentials for the master user that you created for the DB Instance.

Bulk Copy

The SQL Server bulk copy feature is an efficient means of copying data from a source database to your DB Instance. Bulk copy writes the data that you specify to a data file, such as an ASCII file. You can then run bulk copy again to write the contents of the file to the destination DB Instance.

This section uses the bcp utility, which is included with all editions of SQL Server. For detailed information about bulk import and export operations, go to the Microsoft SQL Server documentation.

Note

Before you use bulk copy, you must first import your database schema to the destination DB Instance. The Generate and Publish Scripts Wizard, described earlier in this topic, is an excellent tool for this purpose.

The following command connects to the local SQL Server instance and generates a data file, in SQL Server native format, for a specified table in the C:\ root directory of your existing SQL Server deployment. The table is specified by its fully qualified name, and the data file has the same name as the table that is being copied.

bcp dbname.schema_name.table_name out C:\table_name.txt -n -S localhost -U username -P password -b 10000 

Where:

  • -n specifies that the bulk copy will use the native data types of the data to be copied.

  • -S specifies the SQL Server instance that the bcp utility will connect to.

  • -U specifies the user name of the account that will log in to the SQL Server instance.

  • -P specifies the password for the user specified by -U.

  • -b specifies the number of rows per batch of imported data.

Note

Other parameters, such as the -E parameter (which specifies that identity values in the data file are kept for identity columns), might be important to your import situation. For a full description of the command line syntax for the bcp utility, go to the Microsoft SQL Server documentation.

For example, suppose that a database named store, which uses the default schema dbo, contains a table named customers. The following command, run as the user account admin with the password insecure, copies the customers table to a file named customers.txt in batches of 10,000 rows.

bcp store.dbo.customers out C:\customers.txt -n -S localhost -U admin -P insecure -b 10000 

After you generate the data file, if you have created the database and schema on the target DB Instance, you can upload the data to your DB Instance by using a similar command. In this case, you will use the in argument to specify an input file instead of out to specify an output file. Instead of using localhost to specify the local SQL Server instance, you will specify the endpoint of your DB Instance. If you use a port other than 1433, you will specify that, too. The user name and password will be those of the master user and password for your DB Instance. The syntax is as follows:

bcp dbname.schema_name.table_name in C:\table_name.txt -n -S endpoint,port -U master_user_name -P master_user_password -b 10000 

To continue the previous example, suppose the master user name is admin, and the password is insecure. The endpoint for the DB Instance is rds.ckz2kqd4qsn1.us-east-1.rds.amazonaws.com, and you will use port 4080. The command would be as follows:

bcp store.dbo.customers in C:\customers.txt -n -S rds.ckz2kqd4qsn1.us-east-1.rds.amazonaws.com,4080 -U admin -P insecure -b 10000 
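
After the import finishes, a simple row count on the destination DB Instance is an easy way to confirm that the data arrived. The following query uses the example table from this section; run it while connected to the destination DB Instance.

--Confirm that the imported rows are present on the destination
SELECT COUNT(*) AS imported_rows FROM store.dbo.customers;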

Cleaning Up

If you followed the best practices outlined earlier in this topic for preparing to import data to your DB Instance, you will need to perform the following tasks now:

  • Grant applications access to the target DB Instance.

  • Enable automated backups on the target DB Instance.

  • Enable foreign key constraints.

  • Enable database triggers.

Grant Applications Access to the Target DB Instance

When your data import is complete, you can grant access to the DB Instance to those applications that you blocked during the import. For information about controlling access to your DB Instance, see Working with DB Security Groups.

Enable Automated Backups on the Target DB Instance

For information about automated backups, see Working With Automated Backups.

Enable Foreign Key Constraints

If you disabled foreign key constraints earlier, you can now enable them with the following script:

--Enable foreign keys on all tables
DECLARE @table_name SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE table_cursor CURSOR FOR SELECT name FROM sys.tables;

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table_name;

WHILE @@FETCH_STATUS = 0 BEGIN
  SELECT @cmd = 'ALTER TABLE '+QUOTENAME(@table_name)+' CHECK CONSTRAINT ALL';
  EXEC (@cmd);
  FETCH NEXT FROM table_cursor INTO @table_name;
END

CLOSE table_cursor;
DEALLOCATE table_cursor;  
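
The CHECK CONSTRAINT ALL form re-enables the constraints without validating the rows that were imported while the constraints were disabled, so SQL Server marks the constraints as not trusted. If you also want the imported data validated and the constraints marked as trusted, one option is a variation of the preceding script that uses WITH CHECK CHECK CONSTRAINT ALL, as shown following. Validation can take significantly longer on large tables.

--Re-enable and validate foreign keys on all tables so that they are marked as trusted
DECLARE @table_name SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE table_cursor CURSOR FOR SELECT name FROM sys.tables;

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table_name;

WHILE @@FETCH_STATUS = 0 BEGIN
  SELECT @cmd = 'ALTER TABLE '+QUOTENAME(@table_name)+' WITH CHECK CHECK CONSTRAINT ALL';
  EXEC (@cmd);
  FETCH NEXT FROM table_cursor INTO @table_name;
END

CLOSE table_cursor;
DEALLOCATE table_cursor;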

Enable Database Triggers

If you disabled database triggers earlier, you can now enable them with the following script:

--Enable triggers on all tables
DECLARE @enable BIT = 1;
DECLARE @trigger SYSNAME;
DECLARE @table SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE trigger_cursor CURSOR FOR SELECT trigger_object.name trigger_name,
 table_object.name table_name
FROM sysobjects trigger_object
JOIN sysobjects table_object ON trigger_object.parent_obj = table_object.id
WHERE trigger_object.type = 'TR';

OPEN trigger_cursor;
FETCH NEXT FROM trigger_cursor INTO @trigger, @table;

WHILE @@FETCH_STATUS = 0 BEGIN
  IF @enable = 1
     SET @cmd = 'ENABLE ';
  ELSE
     SET @cmd = 'DISABLE ';

  SET @cmd = @cmd + ' TRIGGER dbo.'+QUOTENAME(@trigger)+' ON dbo.'+QUOTENAME(@table)+' ';
  EXEC (@cmd);
  FETCH NEXT FROM trigger_cursor INTO @trigger, @table;
END

CLOSE trigger_cursor;
DEALLOCATE trigger_cursor;  
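
To confirm that no DML triggers remain disabled after you run the script, you can use a query such as the following against the sys.triggers catalog view. This check is optional; the query returns no rows when all table-level triggers are enabled.

--List any DML triggers that are still disabled
SELECT OBJECT_NAME(parent_id) AS table_name,
       name AS trigger_name
FROM sys.triggers
WHERE parent_class_desc = 'OBJECT_OR_COLUMN'
  AND is_disabled = 1;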

Exporting Data from SQL Server on Amazon RDS

You can export data from an Amazon RDS SQL DB instance by using either of the following options. Which option you choose depends on the target database and on what you want to export.

  • SQL Server Import and Export Wizard – You can use this wizard to copy one or more tables, views, or queries from your Amazon RDS SQL DB instance to another data store. The wizard can both export the data from your SQL Server DB instance and import it into the target data store. This choice is best if you want to transfer small- to medium-size tables or query result sets to another SQL Server DB instance, or if the target data store is not SQL Server.

  • SQL Server Generate and Publish Scripts Wizard and bcp Utility – You can use this wizard to create scripts for an entire database or just selected objects. You can run these scripts on a target SQL Server DB instance to recreate the scripted objects. You can then use the bcp utility to bulk export the data for the selected objects to the target DB instance. This choice is best if you want to move a whole database (including objects other than tables) or large quantities of data between two SQL Server DB instances.

The preceding tools are available as part of Microsoft SQL Server Management Studio, a graphical SQL Server client that is included in all Microsoft SQL Server editions except the Express Edition. SQL Server Management Studio Express is also available from Microsoft as a free download.

Note

SQL Server Management Studio is available only as a Windows-based application.

SQL Server Import and Export Wizard

To use the SQL Server Import and Export Wizard to export data, follow these steps:

  1. In SQL Server Management Studio, connect to your Amazon RDS SQL DB instance. For details on how to do this, see Connecting to a DB Instance Running the SQL Server Database Engine.

  2. In Object Explorer, expand Databases, right-click the source database, select Tasks, and then click Export Data to open the wizard.

  3. On the Choose a Data Source page, do the following:

    1. Click SQL Server Native Client 11.0 in the Data source box.

    2. Verify that the Server name box shows the endpoint of your Amazon RDS SQL DB instance.

    3. Click Use SQL Server Authentication. Type the master user name and password of your Amazon RDS SQL DB instance in the User name and Password boxes.

    4. Verify that the Database box shows the database from which you want to export data.

    5. Click Next.

  4. On the Choose a Destination page, do the following:

    1. Click SQL Server Native Client 11.0 in the Destination box.

      Note

      Other target data sources are available, including .NET Framework data providers, OLE DB providers, SQL Server Native Client providers, ADO.NET providers, Microsoft Office Excel, Microsoft Office Access, and the Flat File source. If you choose to target one of these data sources, skip the remainder of step 4 and go to Choose a Destination in the SQL Server documentation for details on the connection information to provide.

    2. Type the server name of the target SQL Server DB instance in the Server name box.

    3. Click the appropriate authentication type. Type a user name and password if necessary.

    4. Click the database name of the target database in the Database box, or click New to create a new database to contain the exported data.

      If you click New, go to Create Database in the SQL Server documentation for details on the database information to provide.

    5. Click Next.

  5. On the Table Copy or Query page, click Copy data from one or more tables or views or Write a query to specify the data to transfer. Click Next.

  6. If you clicked Write a query to specify the data to transfer, you see the Provide a Source Query page. Type or paste in a SQL query, and then click Parse to verify it. Once the query validates, click Next.

  7. On the Select Source Tables and Views page, do the following:

    1. Select the tables and views that you want to export, or verify that the query you provided is selected.

    2. Click Edit Mappings and specify database and column mapping information. For further details, go to Column Mappings in the SQL Server documentation.

    3. (Optional) To see a preview of data to be exported, select the table, view, or query, and then click Preview.

    4. Click Next.

  8. On the Run Package page, verify that Run immediately is selected. Click Next.

  9. On the Complete the Wizard page, verify that the data export details are as you expect. Click Finish.

  10. On the The execution was successful page, click Close.

For more information, go to SQL Server Import and Export Wizard in the SQL Server documentation.

SQL Server Generate and Publish Scripts Wizard and bcp Utility

To use the SQL Server Generate and Publish Scripts Wizard and the bcp utility to export data, follow these steps:

  1. In SQL Server Management Studio, connect to your Amazon RDS SQL DB instance. For details on how to do this, see Connecting to a DB Instance Running the SQL Server Database Engine.

  2. In Object Explorer, expand the Databases node and select the database you want to script.

  3. Follow the instructions in Generate and Publish Scripts Wizard in the SQL Server documentation to create a script file.

  4. In SQL Server Management Studio, connect to your target SQL Server DB instance.

  5. With the target SQL Server DB instance selected in Object Explorer, point to Open on the File menu, click File, and then open the script file.

  6. If you have scripted the entire database, review the CREATE DATABASE statement in the script to make sure the database is being created in the location and with the parameters that you want. For further details, go to CREATE DATABASE in the SQL Server documentation.

  7. If you are creating database users in the script, check to see if server logins exist on the target DB instance for those users. If not, create logins for those users; the scripted commands to create the database users will fail otherwise. For more information, go to Create a Login in the SQL Server documentation.

  8. Click !Execute on the SQL Editor menu to execute the script file and create the database objects. When the script finishes, verify that all database objects exist as expected.

  9. Use the bcp utility to export data from the Amazon RDS SQL DB instance into files. Open a command prompt and type the command

    bcp database_name.schema_name.table_name out data_file -n -S aws_rds_sql_endpoint -U username -P password

    where:

    • table_name is the name of one of the tables that you’ve re-created in the target database and now want to populate with data.

    • data_file is the full path and name of the data file to be created.

    • -n specifies that the bulk copy will use the native data types of the data to be copied.

    • -S specifies the SQL Server DB instance to export from.

    • -U specifies the user name to use when connecting to the SQL Server DB instance.

    • -P specifies the password for the user specified by -U.

    The following shows an example command:

    bcp world.dbo.city out C:\Users\JohnDoe\city.dat -n -S sql-jdoe.1234abcd.us-west-2.rds.amazonaws.com,1433 -U JohnDoe -P ClearTextPassword

    For a full description of the bcp command line syntax, go to bcp Utility in the Microsoft SQL Server documentation.

    Repeat this step until you have data files for all of the tables you want to export.

  10. Prepare your target DB instance for bulk import of data by following the instructions at Basic Guidelines for Bulk Importing Data in the SQL Server documentation.

  11. Decide on a bulk import method to use after considering performance and other concerns discussed in About Bulk Import and Bulk Export Operations in the SQL Server documentation.

  12. Bulk import the data from the data files you created using the bcp utility, following the instructions at either Import and Export Bulk Data by Using the bcp Utility or Import Bulk Data by Using BULK INSERT or OPENROWSET(BULK...) in the SQL Server documentation, depending on what you decided in step 11.