Migrate DNS records in bulk to an Amazon Route 53 private hosted zone

Created by Ram Kandaswamy (AWS)

Environment: Production

Technologies: Networking; Cloud-native; DevOps; Infrastructure

AWS services: AWS Cloud9; Amazon Route 53; Amazon S3

Summary

Network engineers and cloud administrators need an efficient and simple way to add Domain Name System (DNS) records to private hosted zones in Amazon Route 53. Manually copying entries from a Microsoft Excel worksheet into the appropriate locations in the Route 53 console is tedious and error-prone. This pattern describes an automated approach that reduces the time and effort required to add multiple records. It also provides a repeatable set of steps for creating multiple hosted zones.

This pattern uses the AWS Cloud9 integrated development environment (IDE) for development and testing, and Amazon Simple Storage Service (Amazon S3) to store records. The pattern works with data in JSON format because JSON is simple and maps directly to the Python dictionary (dict) data type.

Note: If you can generate a zone file from your system, consider using the Route 53 import feature instead.

Prerequisites and limitations

Prerequisites 

  • An Excel worksheet that contains private hosted zone records

  • Familiarity with different types of DNS records such as A record, Name Authority Pointer (NAPTR) record, and SRV record (see Supported DNS record types)

  • Familiarity with the Python language and its libraries

Limitations

  • The pattern doesn’t provide extensive coverage for all use case scenarios. For example, the change_resource_record_sets call doesn’t use all the available properties of the API.

  • In the Excel worksheet, the value in each row is assumed to be unique. Multiple values for each fully qualified domain name (FQDN) are expected to appear in the same row. If that is not true, you should modify the code provided in this pattern to perform the necessary concatenation.

  • The pattern uses the AWS SDK for Python (Boto3) to call Route 53 directly. You can enhance the code to wrap the calls in AWS CloudFormation by using the create_stack and update_stack commands and using the JSON values to populate template resources, as sketched after this list.
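
For example, the following minimal sketch shows how a single record could be created through a CloudFormation stack instead of a direct Route 53 call. The stack name, template, and hosted zone ID are illustrative placeholders, not part of the original pattern.

import boto3

cfn_client = boto3.client('cloudformation')

# Illustrative template that creates one record set in an existing hosted zone.
template_body = '''
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  ExampleRecord:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneId: Z0000000EXAMPLE   # placeholder hosted zone ID
      Name: something.example.org
      Type: A
      TTL: '900'
      ResourceRecords:
        - 1.1.1.1
'''

response = cfn_client.create_stack(
    StackName='dns-records-stack',   # placeholder stack name
    TemplateBody=template_body
)
print(response['StackId'])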

Architecture

Technology stack

  • Route 53 private hosted zones for routing traffic

  • AWS Cloud9 IDE for development and testing

  • Amazon S3 for storing the output JSON file

(Diagram: workflow and architecture for migrating DNS records in bulk to an Amazon Route 53 private hosted zone)

The workflow consists of these steps, as illustrated in the previous diagram and discussed in the Epics section:

  1. Upload an Excel worksheet that has the record set information to an S3 bucket.

  2. Create and run a Python script that converts the Excel data to JSON format.

  3. Read the records from the S3 bucket and clean the data.

  4. Create record sets in your private hosted zone.

Tools

  • Route 53 – Amazon Route 53 is a highly available and scalable DNS web service that handles domain registration, DNS routing, and health checking.

  • AWS Cloud9 – AWS Cloud9 is an IDE that offers a rich code editing experience with support for several programming languages and runtime debuggers, and a built-in terminal. It contains a collection of tools that you use to code, build, run, test, and debug software, and helps you release software to the cloud.

  • Amazon S3 – Amazon Simple Storage Service (Amazon S3) is an object storage service. You can use Amazon S3 to store and retrieve any amount of data at any time, from anywhere on the web.

Epics

Epic 1: Convert the Excel worksheet data to JSON and store it in Amazon S3

Create an Excel file for your records.

Use the records you exported from your current system to create an Excel worksheet that has the required columns for a record: fully qualified domain name (FQDN), record type, time to live (TTL), and value. For NAPTR and SRV records, the value is a combination of multiple properties, so use Excel's CONCAT function to combine those properties, as shown in the example after the following table.

FqdnName              | RecordType | Value   | TTL
something.example.org | A          | 1.1.1.1 | 900
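
For example, an SRV record value combines priority, weight, port, and target into one string, such as 10 5 5060 sip.example.org. Assuming those properties are in columns B through E of the worksheet (an illustrative layout, not prescribed by this pattern), a formula such as =CONCAT(B2," ",C2," ",D2," ",E2) produces that combined value.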

Skills required: Data engineer, Excel skills

Verify the working environment.

In the AWS Cloud9 IDE, create a Python file to convert the Excel input worksheet to JSON format. (Instead of AWS Cloud9, you can also use an Amazon SageMaker notebook to work with Python code.)

Verify that the Python version you’re using is version 3.7 or later.

 python3 --version

Install the pandas package.

 pip3 install pandas --user
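
Depending on your pandas version, reading .xls files also requires the xlrd package, and reading .xlsx files requires openpyxl; install them if needed.

 pip3 install xlrd openpyxl --user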
Skills required: General AWS

Convert the Excel worksheet data to JSON.

Create a Python file that contains the following code to convert from Excel to JSON.

import pandas as pd

# Read the Excel workbook and write its rows to a JSON file as an array of records.
data = pd.read_excel('./Book1.xls')
data.to_json(path_or_buf='my.json', orient='records')

where Book1.xls is the name of the Excel workbook file and my.json is the name of the output JSON file.
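
For the example row shown earlier, the output file contains a JSON array with one object per worksheet row, resembling the following:

 [{"FqdnName":"something.example.org","RecordType":"A","Value":"1.1.1.1","TTL":900}]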

Skills required: Data engineer, Python skills

Upload the JSON file to an S3 bucket.

Upload the my.json file to an S3 bucket. For more information, see Creating a bucket in the Amazon S3 documentation.
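
You can upload the file in the Amazon S3 console, with the AWS CLI, or with a short Boto3 call such as the following sketch, where the bucket name is a placeholder.

import boto3

s3_client = boto3.client('s3')
# Upload the generated JSON file; replace the placeholder bucket name.
s3_client.upload_file('my.json', 'my-dns-records-bucket', 'my.json')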

Skills required: App developer

Epic 2: Create the private hosted zone and insert the records

Create a private hosted zone.

Use the create_hosted_zone API and the following Python sample code to create a private hosted zone. Replace the parameters hostedZoneName, vpcRegion, and vpcId with your own values.

import boto3
import random

hostedZoneName = "xxx"
vpcRegion = "us-east-1"
vpcId = "vpc-xxxx"

route53_client = boto3.client('route53')

# Create the private hosted zone and associate it with the VPC.
response = route53_client.create_hosted_zone(
    Name=hostedZoneName,
    VPC={
        'VPCRegion': vpcRegion,
        'VPCId': vpcId
    },
    CallerReference=str(random.random() * 100000),
    HostedZoneConfig={
        'Comment': "private hosted zone created by automation",
        'PrivateZone': True
    }
)
print(response)
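
The response includes the ID of the new hosted zone, which you need when you insert records later in this epic. The Id field is returned in the form /hostedzone/<ID>, so keep only the final segment:

hosted_zone_id = response['HostedZone']['Id'].split('/')[-1]
print(hosted_zone_id)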

You can also use an infrastructure as code (IaC) tool such as AWS CloudFormation to replace these steps with a template that creates a stack with the appropriate resources and properties.

Skills required: Cloud architect, Network administrator, Python skills

Retrieve details as a dictionary from Amazon S3.

Use the following code to read the JSON file from the S3 bucket and load its contents as a list of Python dictionaries, one for each record.

import json
import boto3

s3_client = boto3.client('s3')
bucket_name = "xxxx"  # replace with the name of the bucket that holds my.json

# Read the JSON file from the bucket and parse it into Python objects.
fileobj = s3_client.get_object(
    Bucket=bucket_name,
    Key='my.json'
)
filedata = fileobj['Body'].read()
contents = filedata.decode('utf-8')
json_content = json.loads(contents)
print(json_content)

where json_content contains the list of record dictionaries.

Skills required: App developer, Python skills

Clean data values for spaces and Unicode characters.

As a safety measure to ensure the correctness of data, use the following code to perform a strip operation on the values in json_content. This code removes the space characters at the front and end of each string. It also uses the replace method to remove hard (non-breaking) spaces (the \xa0 characters).

import unicodedata

for item in json_content:
    # Normalize Unicode and strip non-breaking spaces and surrounding whitespace.
    fqn_name = unicodedata.normalize(
        "NFKD",
        item["FqdnName"].replace("u'", "'").replace('\xa0', '').strip()
    )
    rec_type = item["RecordType"].replace('\xa0', '').strip()
    # ResourceRecords must be a list of {'Value': ...} dictionaries.
    res_rec = [
        {
            'Value': item["Value"].replace('\xa0', '').strip()
        }
    ]
Skills required: App developer, Python skills

Insert records.

Use the following code as part of the previous for loop.

    # Inside the for loop from the previous step: upsert one record set per row.
    change_response = route53_client.change_resource_record_sets(
        HostedZoneId="xxxxxxxx",
        ChangeBatch={
            'Comment': 'Created by automation',
            'Changes': [
                {
                    'Action': 'UPSERT',
                    'ResourceRecordSet': {
                        'Name': fqn_name,
                        'Type': rec_type,
                        'TTL': item["TTL"],
                        'ResourceRecords': res_rec
                    }
                }
            ]
        }
    )

where xxxxxxxx is the hosted zone ID from the first step of this epic.
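
Optionally, you can confirm that each change has propagated by polling the get_change API with the change ID from the response. A minimal sketch:

import time

# Poll until the change status moves from PENDING to INSYNC.
change_id = change_response['ChangeInfo']['Id']
while route53_client.get_change(Id=change_id)['ChangeInfo']['Status'] != 'INSYNC':
    time.sleep(5)
print('Change is in sync:', change_id)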

Skills required: App developer, Python skills
