Automatic downloads
The Deadline CLI provides a command that downloads the output of all tasks in a queue that have completed since the last time the same command ran. You can configure this command as a cron job or scheduled task so that output is downloaded automatically on a continuous basis.
Before setting up automatic downloads, follow the steps in Storage profiles for job attachments to configure all paths of asset data for upload and download. If a job uses an output path that is not in its storage profile, then the automatic download skips downloading that output and prints warning messages to summarize the files it did not download. Similarly, if a job is submitted without a storage profile, the automatic download skips that job and prints a warning message. By default, Deadline Cloud submitters display warning messages for paths that are outside of storage profiles to help ensure correct configuration.
Configuring AWS credentials
If you want to run the output synchronization command manually, or to understand how it works before configuring it as a cron job, you can use the credentials from logging in to the Deadline Cloud monitor desktop application.
On-premises AWS credentials
Your on-premises workers use credentials to access Deadline Cloud job attachments output. For the most secure access, we recommend using IAM Roles Anywhere to authenticate your workers. For more information, see IAM Roles Anywhere.
For testing, you can use IAM user access keys for AWS credentials. We recommend that you set an expiration for the IAM user by including a restrictive inline policy.
Important

Heed the following warnings:

- Do NOT use your account's root credentials to access AWS resources. These credentials provide unrestricted account access and are difficult to revoke.
- Do NOT put literal access keys or credential information in your application files. If you do, you create a risk of accidentally exposing your credentials if, for example, you upload the project to a public repository.
- Do NOT include files that contain credentials in your project area.
- Secure your access keys. Do not provide your access keys to unauthorized parties, even to help find your account identifiers. By doing this, you might give someone permanent access to your account.
- Be aware that any credentials stored in the shared AWS credentials file are stored in plain text.

For more details, see Best practices for managing AWS access keys in the AWS General Reference.
Create an IAM user

1. Open the IAM console at https://console.aws.amazon.com/iam/.
2. In the navigation pane, select Users and then select Create user.
3. Name the user deadline-output-downloader. Clear the checkbox for Provide user access to the AWS Management Console, then choose Next.
4. Choose Attach policies directly.
5. Choose Create policy to create a custom policy with the minimum required permissions.
6. In the JSON editor, specify the following permissions:

   ```
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Sid": "DeadlineCloudOutputDownload",
         "Effect": "Allow",
         "Action": [
           "deadline:AssumeQueueRoleForUser",
           "deadline:ListQueueEnvironments",
           "deadline:ListSessions",
           "deadline:ListSessionActions",
           "deadline:SearchJobs",
           "deadline:GetJob",
           "deadline:GetQueue",
           "deadline:GetStorageProfileForQueue"
         ],
         "Resource": "*"
       }
     ]
   }
   ```

7. Name the policy DeadlineCloudOutputDownloadPolicy and choose Create policy.
8. Return to the user creation page, refresh the policy list, select the DeadlineCloudOutputDownloadPolicy policy you just created, and then choose Next.
9. Review the user details and then choose Create user.
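If you prefer to script user creation instead of using the console, the steps above roughly correspond to a few AWS CLI calls. The following is a sketch, not a definitive procedure: it writes the policy document to a local file and validates it, and the aws commands (shown as comments because they require administrative IAM credentials) use the same user and policy names as the console steps; ACCOUNT_ID is a placeholder for your AWS account ID.

```shell
# Write the download policy from the console steps above to a local file.
cat > /tmp/DeadlineCloudOutputDownloadPolicy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DeadlineCloudOutputDownload",
      "Effect": "Allow",
      "Action": [
        "deadline:AssumeQueueRoleForUser",
        "deadline:ListQueueEnvironments",
        "deadline:ListSessions",
        "deadline:ListSessionActions",
        "deadline:SearchJobs",
        "deadline:GetJob",
        "deadline:GetQueue",
        "deadline:GetStorageProfileForQueue"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Sanity-check that the policy document is valid JSON before using it.
python3 -m json.tool /tmp/DeadlineCloudOutputDownloadPolicy.json > /dev/null \
  && echo "policy JSON is valid"

# With administrative credentials configured, the console steps above
# roughly correspond to (ACCOUNT_ID is your AWS account ID):
#   aws iam create-user --user-name deadline-output-downloader
#   aws iam create-policy --policy-name DeadlineCloudOutputDownloadPolicy \
#       --policy-document file:///tmp/DeadlineCloudOutputDownloadPolicy.json
#   aws iam attach-user-policy --user-name deadline-output-downloader \
#       --policy-arn arn:aws:iam::ACCOUNT_ID:policy/DeadlineCloudOutputDownloadPolicy
```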
Restrict user access to a limited time window
Any IAM user access keys that you create are long-term credentials. To ensure that these credentials expire in case they are mishandled, you can make these credentials time-bound by creating an inline policy that specifies a date after which the keys will no longer be valid.
1. Open the IAM user that you just created. In the Permissions tab, choose Add permissions and then choose Create inline policy.
2. In the JSON editor, specify the following permissions. To use this policy, replace the aws:CurrentTime timestamp value in the example policy with your own time and date.

   ```
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Effect": "Deny",
         "Action": "*",
         "Resource": "*",
         "Condition": {
           "DateGreaterThan": {
             "aws:CurrentTime": "2024-01-01T00:00:00Z"
           }
         }
       }
     ]
   }
   ```
Create an access key

1. From the user details page, select the Security credentials tab. In the Access keys section, choose Create access key.
2. Indicate that you want to use the key for Other, choose Next, and then choose Create access key.
3. On the Retrieve access keys page, choose Show to reveal the value of your user's secret access key. You can copy the credentials or download a .csv file.
Store the user access keys

1. Store the user access keys in the AWS credentials file on your system:

   - On Linux, the file is located at ~/.aws/credentials
   - On Windows, the file is located at %USERPROFILE%\.aws\credentials

   Add the following profile, replacing the placeholder values with your access key ID, secret access key, and AWS Region:

   ```
   [deadline-downloader]
   aws_access_key_id=ACCESS_KEY_ID
   aws_secret_access_key=SECRET_ACCESS_KEY
   region=YOUR_AWS_REGION
   ```

2. To use these credentials at all times, set the AWS_PROFILE environment variable to deadline-downloader.
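For example, on Linux and macOS you can set the profile for the current shell session as follows (add the export line to your shell profile to make it persistent):

```shell
# Select the profile defined in the AWS credentials file for this session.
export AWS_PROFILE=deadline-downloader
echo "$AWS_PROFILE"

# On Windows (Command Prompt), the equivalent persistent setting is:
#   setx AWS_PROFILE deadline-downloader
```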
Important
When you no longer need this IAM user, we recommend that you remove it to align with AWS security best practices. We recommend that you require your human users to use temporary credentials through AWS IAM Identity Center when accessing AWS.
Prerequisites

Complete the following steps before creating a cron job or scheduled task for automatic downloads.

1. If you haven't already, install Python.
2. Install the Deadline CLI by running:

   ```
   python -m pip install deadline
   ```

3. Confirm that the version of the Deadline CLI is 0.52.1 or newer with the following command:

   ```
   $ deadline --version
   deadline, version 0.52.1
   ```
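The prerequisite check can also be scripted. A minimal sketch that only reports status, so it is safe to run on any machine:

```shell
# Report whether the deadline CLI is installed and, if so, its version.
if command -v deadline >/dev/null 2>&1; then
  deadline --version
else
  echo "deadline CLI not found on PATH; run: python -m pip install deadline"
fi
```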
Test the output download command

To verify that the command works in your environment:

1. Get the path to the deadline CLI executable.
2. Run the sync-output command to bootstrap:

   ```
   /path/to/deadline queue sync-output \
       --farm-id YOUR_FARM_ID \
       --queue-id YOUR_QUEUE_ID \
       --storage-profile-id YOUR_PROFILE_ID \
       --checkpoint-dir /path/to/checkpoint/directory
   ```

   If your downloading machine is the same as the submitting machine, replace --storage-profile-id YOUR_PROFILE_ID in the command above with --ignore-storage-profiles.

3. Submit a test job.

   1. Download the .zip file from GitHub:
      1. Open the deadline-cloud-samples GitHub repository.
      2. Choose Code and then, from the dropdown menu, select Download ZIP.
      3. Unzip the downloaded archive to a local directory.
   2. Run cd /path/to/unzipped/deadline-cloud-samples-mainline/job_bundles/job_attachments_devguide_output
   3. Run deadline bundle submit .

      If you don't have the default deadline configuration set up, you might need to supply --farm-id YOUR-FARM-ID and --queue-id YOUR-QUEUE-ID on the command line.
   4. Wait for the job to complete before going to the next step.

4. Run the sync-output command again:

   ```
   /path/to/deadline queue sync-output \
       --farm-id YOUR_FARM_ID \
       --queue-id YOUR_QUEUE_ID \
       --storage-profile-id YOUR_PROFILE_ID \
       --checkpoint-dir /path/to/checkpoint/directory
   ```

5. Verify the following:
   - Your test job's outputs appear in the destination directory.
   - A checkpoint file is created in your specified checkpoint directory.
Set up scheduled downloads

Select the tab for your operating system to learn how to configure automatic downloads to run every 5 minutes.
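On Linux, for example, the schedule might look like the following crontab entry. This is a sketch, not a definitive configuration: the deadline path, IDs, checkpoint directory, and log file location are placeholders you must replace with your own values.

```
# Run the Deadline output sync every 5 minutes and append output to a log file.
*/5 * * * * /path/to/deadline queue sync-output --farm-id YOUR_FARM_ID --queue-id YOUR_QUEUE_ID --storage-profile-id YOUR_PROFILE_ID --checkpoint-dir /path/to/checkpoint/directory >> /path/to/logs/deadline-sync.log 2>&1
```

Redirecting the output to a log file as shown gives you a place to look when troubleshooting failed downloads.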
Verify the setup

To verify that the automatic downloads setup was successful, complete the following steps.

1. Submit a new test job.
2. Wait for one scheduler interval to complete, which in this case is 5 minutes.
3. Verify that new outputs are downloaded automatically.

If the outputs do not download, check the process logs as described in the Troubleshooting section.
Troubleshooting automatic downloads

If you encounter issues with the automatic downloads, check the following:

Storage profile issues

- An error like [Errno 2] No such file or directory or [Errno 13] Permission denied in the log file could be related to missing or misconfigured storage profiles.
- See Storage profiles for information about how to set up your storage profiles when the downloading machine is different from the submitting machine.
- For same-machine downloads, try the --ignore-storage-profiles flag.

Directory permissions

- Ensure the scheduler service user has:
  - Read/write access to the checkpoint directory
  - Write access to the output destination directory
- For Linux and macOS, use ls -la to check permissions.
- For Windows, review the Security settings in the folder's Properties dialog.
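On Linux and macOS, a small helper like the following can report whether a directory has the needed access. This is a sketch: it uses a temporary directory as a stand-in so it runs anywhere; substitute your real checkpoint and output directories.

```shell
# Report whether a directory is readable and writable by the current user.
check_dir() {
  if [ -r "$1" ] && [ -w "$1" ]; then
    echo "$1: read/write OK"
  else
    echo "$1: missing read or write access"
  fi
}

# A temp directory stands in for your checkpoint and output directories.
DEMO_DIR=$(mktemp -d)
check_dir "$DEMO_DIR"
rmdir "$DEMO_DIR"
```

Run check_dir against both the checkpoint directory and the output destination directory as the same user that the scheduler runs under.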