End of support notice: On September 10, 2025, AWS will discontinue support for AWS RoboMaker. After September 10, 2025, you will no longer be able to access the AWS RoboMaker console or AWS RoboMaker resources. For more information on transitioning to AWS Batch to help run containerized simulations, visit this blog post.
Logging a simulation
To capture output files and other artifacts from your simulation job, you can configure custom uploads. You can configure custom uploads for your robot application and your simulation application. When you configure a custom upload, files you specify are uploaded from the simulation job to the Amazon S3 simulation output location you provide. This can be useful when you want to review or analyze application output generated during a simulation run or reuse artifacts.
Before you can configure custom uploads, you must provide an Amazon S3 output destination for your simulation job. AWS RoboMaker uploads matching files to a folder using a name you specify. Matching files can be uploaded when all of the simulation job tools shut down or uploaded as they are produced and then removed.
Default upload configurations are automatically added to your custom upload configurations unless you turn them off. The default upload configuration uploads ROS and Gazebo default logging output. This maintains compatibility with past simulation job output configurations, which uploaded ROS and Gazebo default logging output. You can turn off the default upload configuration when you configure a simulation job in the console. You can also turn it off by setting useDefaultUploadConfigurations to false in the CreateSimulationJob API.
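For example, the following is a minimal boto3 sketch of a CreateSimulationJob request that turns off the default upload configuration and adds a custom one. The IAM role, application ARN, bucket, package name, and launch file are placeholders, not values from this guide.

import boto3

robomaker = boto3.client("robomaker")

response = robomaker.create_simulation_job(
    iamRole="arn:aws:iam::111122223333:role/my-robomaker-role",  # placeholder
    maxJobDurationInSeconds=3600,
    outputLocation={"s3Bucket": "amzn-s3-demo-bucket"},  # simulation output location
    simulationApplications=[
        {
            # Placeholder simulation application ARN.
            "application": "arn:aws:robomaker:us-west-2:111122223333:simulation-application/my-sim-app/1111111111111",
            "launchConfig": {
                "packageName": "my_sim_package",        # placeholder
                "launchFile": "my_launch_file.launch",  # placeholder
            },
            # Turn off the default ROS and Gazebo logging upload configuration.
            "useDefaultUploadConfigurations": False,
            # Upload matching files to s3://amzn-s3-demo-bucket/<simid>/<runid>/robot-test.
            "uploadConfigurations": [
                {
                    "name": "robot-test",
                    "path": "/home/robomaker/test_output/**",
                    "uploadBehavior": "UPLOAD_ON_TERMINATE",
                }
            ],
        }
    ],
)
print(response["arn"])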
Your simulation applications are extracted onto a single 128 GB partition, and you have write access to the partition.
Adding a custom upload configuration
To create a custom upload configuration, you specify a name prefix that determines where the files are uploaded in Amazon S3, a Unix glob path that selects the files to upload, and an upload behavior that determines when the files are uploaded.
Name
A name is a prefix that specifies how files are uploaded in Amazon S3. It is appended to the simulation output location to determine the final path.
For example, if your simulation output location is s3://amzn-s3-demo-bucket and your upload configuration name is robot-test, your files are uploaded to s3://amzn-s3-demo-bucket/<simid>/<runid>/robot-test.
Path
The path specifies which files are uploaded. Standard Unix glob matching rules are accepted subject to the following:
- The path must begin with /home/robomaker/ or /var/log.
- The path must not contain a reverse path expression (/..).
- Symbolic links are not followed.
- You can use ** as a super asterisk in your path. For example, specifying /var/log/**.log causes all .log files in the /var/log directory tree to be collected. You can also use the standard asterisk as a standard wildcard. For example, /var/log/system.log* matches files such as system.log_1111, system.log_2222, and so on in /var/log. Both patterns are illustrated in the sketch after this list.
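As a rough illustration only, the sketch below approximates those two patterns with Python's standard glob module. AWS RoboMaker's matcher is not Python's glob, so treat this as an analogy for how the super asterisk and the standard wildcard differ, not as the service's implementation.

import glob

# Super asterisk analogy: collect every .log file anywhere under the
# /var/log directory tree (recursive).
tree_logs = glob.glob("/var/log/**/*.log", recursive=True)

# Standard wildcard analogy: match system.log_1111, system.log_2222, and
# so on, in /var/log only (not in subdirectories).
rolled_logs = glob.glob("/var/log/system.log*")

print(tree_logs)
print(rolled_logs)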
Upload behavior
You can select one of the following upload behaviors:
- Upload on terminate (UPLOAD_ON_TERMINATE) uploads all files matching the path once the simulation job enters the terminating state. AWS RoboMaker attempts to upload logs for a maximum of 60 minutes. AWS RoboMaker does not begin uploading files until all of your tools running in the simulation have stopped.
- Upload rolling with auto remove (UPLOAD_ROLLING_AUTO_REMOVE) uploads all files matching the path as they are generated. Paths are checked every 5 seconds. When the files are uploaded, the source files are deleted. Once a file is deleted, if a new file is generated with the same name, it replaces the previously uploaded file. AWS RoboMaker performs a final check for files once all of your applications running in the simulation have stopped. Upload rolling with auto remove is useful for uploading rolling logs. Write or stream output to an "active" file that is not covered by the path glob. Once you're done writing to the active file, roll the file into a location covered by the path glob to be uploaded and removed (see the sketch below). This setting can help you conserve space in your simulation job. It can also help you access files before your simulation job terminates.

The simulation job partition size is 128 GB. If your simulation job ends for any reason, AWS RoboMaker tries to upload all files specified in your custom upload configuration.
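To make the rolling pattern concrete, here is a minimal sketch that assumes a hypothetical upload configuration whose path is /home/robomaker/rolled_logs/** with the UPLOAD_ROLLING_AUTO_REMOVE behavior. The directory names and rotation size are illustrative only.

import os
import time

# Hypothetical locations: the active file is NOT covered by the upload glob;
# the rolled directory IS covered by /home/robomaker/rolled_logs/**.
ACTIVE_LOG = "/home/robomaker/active/output.log"
ROLLED_DIR = "/home/robomaker/rolled_logs"
MAX_BYTES = 5 * 1024 * 1024  # roll after roughly 5 MB (arbitrary choice)

os.makedirs(os.path.dirname(ACTIVE_LOG), exist_ok=True)
os.makedirs(ROLLED_DIR, exist_ok=True)

def write_line(line: str) -> None:
    # Stream output to the active file; it is not uploaded while it lives here.
    with open(ACTIVE_LOG, "a") as f:
        f.write(line + "\n")

    # Once the active file is large enough, roll it into the directory covered
    # by the path glob so AWS RoboMaker uploads it and then removes it.
    if os.path.getsize(ACTIVE_LOG) >= MAX_BYTES:
        rolled_name = f"output.log_{int(time.time())}"
        os.rename(ACTIVE_LOG, os.path.join(ROLLED_DIR, rolled_name))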
Environment variables created by AWS RoboMaker
AWS RoboMaker defines the following simulation job environment variables.
- AWS_ROBOMAKER_SIMULATION_JOB_ID
- AWS_ROBOMAKER_SIMULATION_JOB_ARN
- AWS_ROBOMAKER_SIMULATION_RUN_ID
You can get these variables from your application or from the command line. For
example, to get the current simulation job Amazon Resource Name (ARN) in Python, use
os.environ.get("AWS_ROBOMAKER_SIMULATION_JOB_ARN").
If you specified an Amazon Simple Storage Service output bucket for the simulation job, you can use the environment variables to find the output path. AWS RoboMaker writes output to s3://bucket-name/AWS_ROBOMAKER_SIMULATION_JOB_ID/AWS_ROBOMAKER_SIMULATION_RUN_ID. Use this to manage objects in Amazon S3 from code or the command line.
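As a minimal sketch, assuming boto3 is available in your application container and that your code already knows the bucket name (shown here as a hypothetical OUTPUT_BUCKET environment variable that you set yourself; RoboMaker does not provide it), you could list the objects under the job's output path like this:

import os
import boto3

# OUTPUT_BUCKET is an assumption: set it yourself, for example in your
# launch configuration's environment variables.
bucket = os.environ["OUTPUT_BUCKET"]
job_id = os.environ["AWS_ROBOMAKER_SIMULATION_JOB_ID"]
run_id = os.environ["AWS_ROBOMAKER_SIMULATION_RUN_ID"]

prefix = f"{job_id}/{run_id}/"
s3 = boto3.client("s3")

# List everything the simulation job has uploaded so far under its output path.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])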
AWS RoboMaker also handles specific environment variables set up in
CreateSimulationJobRequest
to allow robot and simulation application
containers to communicate with each other. For more information, see ROS container FAQs.