Salesforce

The following are the requirements and connection instructions for using Salesforce with Amazon AppFlow.

Note

You can use Salesforce as a source or destination.

Requirements

  • Your Salesforce account must be enabled for API access. API access is enabled by default for the Enterprise, Unlimited, Developer, and Performance editions.

  • Your Salesforce account must allow you to install connected apps. If this functionality is disabled, contact your Salesforce administrator. After you create a Salesforce connection in Amazon AppFlow, verify that the connected app named Amazon AppFlow Embedded Login App is installed in your Salesforce account.

  • The refresh token policy for the Amazon AppFlow Embedded Login App must be set to Refresh token is valid until revoked. Otherwise, your flows will fail when your refresh token expires. For more information on how to check and edit the refresh token policy, see Manage OAuth Access Policies for a Connected App in the Salesforce documentation.

  • You must enable change data capture in Salesforce to use event-driven flow triggers. For more information on how to enable this, see Select Objects for Change Notifications in the User Interface in the Salesforce documentation.

  • If your Salesforce app enforces IP address restrictions, you must grant access to the addresses used by Amazon AppFlow. For more information, see AWS IP address ranges in the Amazon Web Services General Reference. (A sketch for looking up these ranges programmatically follows this list.)

  • To create private connections using AWS PrivateLink, you must enable both Manage Metadata and Manage External Connections user permissions in your Salesforce account. Private connections are currently available in the us-east-1, us-west-2, ap-northeast-1, ap-south-1, ap-southeast-2, ca-central-1, and eu-central-1 AWS Regions.
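
If you need to build an IP allow list in Salesforce, the following is a minimal sketch that downloads the published AWS IP ranges and filters them for Amazon AppFlow. It assumes that AppFlow ranges appear under the service name AMazon_APPFLOW spelled as "AMAZON_APPFLOW" in ip-ranges.json; verify the service name and your Region in the file before relying on it.

    # Sketch: list the IP ranges that Amazon AppFlow may use in a given Region,
    # so they can be added to a Salesforce IP allow list.
    # Assumption: AppFlow ranges are published under the service name "AMAZON_APPFLOW".
    import json
    import urllib.request

    IP_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

    def appflow_ip_ranges(region):
        with urllib.request.urlopen(IP_RANGES_URL) as response:
            data = json.load(response)
        return [
            prefix["ip_prefix"]
            for prefix in data["prefixes"]
            if prefix["service"] == "AMAZON_APPFLOW" and prefix["region"] == region
        ]

    if __name__ == "__main__":
        for cidr in appflow_ip_ranges("us-east-1"):
            print(cidr)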

Connection instructions

To connect to Salesforce while creating a flow
  1. Sign in to the AWS Management Console and open the Amazon AppFlow console at https://console.aws.amazon.com/appflow/.

  2. Choose Create flow.

  3. For Flow details, enter a name and description for the flow.

  4. (Optional) To use a customer managed CMK instead of the default AWS managed CMK, choose Data encryption, Customize encryption settings and then choose an existing CMK or create a new one.

  5. (Optional) To add a tag, choose Tags, Add tag and then enter the key name and value.

  6. Choose Next.

  7. Choose Salesforce from the Source name or Destination name dropdown list.

  8. Choose Create new connection to open the Connect to Salesforce window.

    1. Under Salesforce environment, choose Production to log into your developer account.

    2. Under PrivateLink, choose Enabled if you want to connect to your Salesforce account privately through an AWS PrivateLink connection. Otherwise, leave this option set to Disabled.

    3. Under Data encryption, enter your AWS KMS key.

    4. Under Connection name, specify a name for your connection.

    5. Choose Connect.

  9. You will be redirected to the Salesforce login page. When prompted, grant Amazon AppFlow permissions to access your Salesforce account.

  10. After you log in, you will see the objects that you enabled in your Salesforce account in the Choose Salesforce object dropdown list.

Now that you are connected to your Salesforce account, you can continue with the flow creation steps as described in Creating flows in Amazon AppFlow.
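
You can also confirm the new connection programmatically. The following is a minimal sketch that uses the AWS SDK for Python (Boto3) to call the DescribeConnectorProfiles API; the connection name is a hypothetical placeholder.

    # Sketch: verify that a Salesforce connector profile exists.
    # "MySalesforceConnection" is a hypothetical connection name.
    import boto3

    appflow = boto3.client("appflow")

    response = appflow.describe_connector_profiles(
        connectorType="Salesforce",
        connectorProfileNames=["MySalesforceConnection"],
    )

    for profile in response["connectorProfileDetails"]:
        print(profile["connectorProfileName"], profile["connectionMode"])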

Tip

If you aren’t connected successfully, ensure that you have followed the instructions in the Requirements section above.

AWS PrivateLink connections

If you enabled the option to connect to Salesforce through AWS PrivateLink, wait for Amazon AppFlow to set up the private connection before you finish creating your flow. To set up the connection, Amazon AppFlow provisions an interface VPC endpoint and attempts to connect to your VPC endpoint service. This can take several minutes. During that time, the Amazon AppFlow console shows the status message "Private connection is being provisioned." When the connection process completes, the message changes to "Private connection created successfully." Until the process completes, your flow can't transfer your Salesforce objects.

For more information about AWS PrivateLink, see the AWS PrivateLink Guide.
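
If you create connections with the Amazon AppFlow API instead of the console, you can watch the same provisioning progress programmatically. The following is a minimal Boto3 sketch, assuming a hypothetical profile name; it polls DescribeConnectorProfiles and reads the privateConnectionProvisioningState field until the private connection is created or fails.

    # Sketch: wait for an AWS PrivateLink connection to finish provisioning.
    # The profile name is a placeholder; adjust the timeout to your needs.
    import time
    import boto3

    appflow = boto3.client("appflow")

    def wait_for_private_connection(profile_name, timeout_seconds=600):
        deadline = time.time() + timeout_seconds
        while time.time() < deadline:
            response = appflow.describe_connector_profiles(
                connectorType="Salesforce",
                connectorProfileNames=[profile_name],
            )
            details = response["connectorProfileDetails"][0]
            state = details.get("privateConnectionProvisioningState", {})
            status = state.get("status", "PENDING")
            if status in ("CREATED", "FAILED"):
                return status
            time.sleep(30)  # provisioning can take several minutes
        raise TimeoutError("Private connection for %s is still pending" % profile_name)

    print(wait_for_private_connection("MySalesforcePrivateConnection"))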

Use a global connected app with Amazon AppFlow

  • You can use your own global connected app for Salesforce with Amazon AppFlow APIs. For instructions on how to create a connected app in Salesforce, see Create a global connected app in Salesforce.

  • To use your own global connected app, you need to pass the clientId, clientSecret, and Secrets Manager secret ARN to Amazon AppFlow.

    • The following example shows a sample Secrets Manager secret with application credentials for Salesforce:

      { "clientCredsARN": "arn:aws:secretsmanager:region:SecretID:secret:Secret_Key", "Name": "Salesforce", "VersionId": "db83aeb0-e995-480a-81f3-8805b0bf2b79", "SecretString": "{\"clientId\":\"sampleClientId\",\"clientSecret\":\"sampleClientSecret\"}" }
    • This example shows how you can call the ConnectorProfile API with an access token, refresh token, and credentials ARN:

      { "connectorProfileName": "testSalesforceProfileNew", "kmsArn": null, "connectorType": "Salesforce", "connectionMode": "Public", "connectorProfileConfig": { "connectorProfileProperties": { "salesforce": { "instanceUrl": "InstanceURL", "isSandboxEnvironment": false } } } }, "connectorProfileCredentials": { "salesforce": { "clientCredsARN": "arn:aws:secretsmanager:region:SecretID:secret:Secret_Key",** "accessToken": "testAccessToken", "refreshToken": "testRefreshToken", "oauthRequest": { "authCode": null, "redirectUri": null } } }
  • You must attach a resource policy to the Secrets Manager secret and to the KMS key that is used to encrypt the secret. These resource policies allow Amazon AppFlow to read the secret and use it. (A sketch that applies the secret policy and creates the connector profile follows this list.)

    • The following is the policy to be attached for the KMS key. Replace the placeholder with your own information.

      { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "appflow.amazonaws.com" }, "Action": [ "kms:Encrypt", "kms:GenerateDataKey", "kms:Decrypt" ], "Resource": "<KMS key ARN>" } ] }

      Additionally, Amazon AppFlow supports adding confused deputy protection to this KMS key policy. To learn about the confused deputy problem and mitigations, refer to the Amazon S3 documentation. The following example shows how you can use the aws:SourceArn and aws:SourceAccount global condition context keys in your AWS KMS key policy to prevent the confused deputy problem. Replace Account ID with your AWS account ID and Resource ARNs with a list of ARNs for any connector profiles created with the client credentials secret. You can also use wildcards (*) in the aws:SourceArn key. For example, you can replace Resource ARNs with arn:aws:appflow:region:accountId:* to give access to all Amazon AppFlow resources created on your behalf.

      { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "appflow.amazonaws.com" }, "Action": [ "kms:Encrypt", "kms:GenerateDataKey", "kms:Decrypt" ], "Resource": "<KMS key ARN>", "Condition": { "StringEquals": { "aws:SourceAccount":"<Account ID>" }, "ArnLike": { "aws:SourceArn":"<Resource ARNs>" } } } ] }
    • The following is the policy to be attached for the secret. Replace the placeholder with your own information.

      { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "appflow.amazonaws.com" }, "Action": "secretsmanager:GetSecretValue", "Resource": "<Secret ARN>" } ] }

Create a global connected app in Salesforce

Follow these instructions to create a connected app in Salesforce if you haven't done so already.

To create a global connected app in Salesforce
  1. Log in to Salesforce with an account that has administrator rights, and go to Setup.

  2. In the navigation pane under Platform Tools, expand Apps and choose App Manager.

  3. Choose New Connected App in the upper-right corner, and enter the following information for your connected app:

    • The name of your connected app, such as "Amazon AppFlow Embedded Login App".

    • The API name for your connected app. This is auto-generated and can be edited, if needed.

    • The contact email address for Salesforce to use if they need to contact you about your connected app.

    • The logo image URL and icon, if you have one. This is optional.

    • A brief description to specify what the connected app is for, such as "Application which handles interaction between Salesforce and Amazon AppFlow console".

  4. Select the Enable OAuth Settings check box.

  5. In the Callback URL text field, enter one or more redirect URLs for Amazon AppFlow. Enter these URLs on separate lines.

    Redirect URLs have the following format:

    https://region.console.aws.amazon.com/appflow/oauth

    In this URL, region is the code for the AWS Region where you use Amazon AppFlow to transfer data from Salesforce. For example, the code for the US East (N. Virginia) Region is us-east-1. For that Region, the URL is the following:

    https://us-east-1.console.aws.amazon.com/appflow/oauth

    For the AWS Regions that Amazon AppFlow supports, and their codes, see Amazon AppFlow endpoints and quotas in the AWS General Reference.

  6. Select the Require Secret for Web Server Flow check box.

  7. In the Available OAuth Scopes list, select the following items and then choose Add to move them to the Selected OAuth Scopes list. You can customize this list as needed.

    • Manage user data via APIs (api)

    • Access custom permissions (custom_permissions)

    • Access the identity URL service (id, profile, email, address, phone)

    • Access unique user identifiers (openid)

    • Perform requests at any time (refresh_token, offline_access)

  8. Choose Save.

To retrieve the client ID and client secret for use in your OAuth flow, you can view your connected app in Salesforce by choosing Apps and then App Manager, and then selecting the connected app that you created.
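
If you plan to use this connected app as a global connected app with the Amazon AppFlow API, you can store the client ID and client secret in AWS Secrets Manager in the format shown earlier. The following is a minimal Boto3 sketch; the secret name and credential values are placeholders.

    # Sketch: store the connected app's client ID and client secret in Secrets Manager
    # so that Amazon AppFlow can reference them through a client credentials ARN.
    import json
    import boto3

    secrets = boto3.client("secretsmanager")

    response = secrets.create_secret(
        Name="appflow/salesforce-connected-app",  # placeholder name
        SecretString=json.dumps(
            {
                "clientId": "sampleClientId",
                "clientSecret": "sampleClientSecret",
            }
        ),
    )
    print(response["ARN"])  # reference this ARN when you create the connector profile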

For more information on connected apps in Salesforce, see Connected Apps in the Salesforce documentation.

Additional flow settings for Salesforce

When you configure a flow that transfers data to or from Salesforce, the Amazon AppFlow console shows some unique settings that aren't available for flows that don't use Salesforce.

Salesforce API preference

When you use Salesforce as the source or destination, you can configure the Salesforce API preference setting. Use this setting to specify what Salesforce API Amazon AppFlow uses when your flow transfers data to or from Salesforce. Your choice optimizes your flow for small to medium-sized data transfers, large data transfers, or both.

The Amazon AppFlow console provides this setting on the Configure flow page under Source details or Destination details. To view it, expand the Additional settings section.


          The options for Salesforce API preference on the Configure flow page.

You can choose one of the following options:

  • Automatic (default) — For each flow run, Amazon AppFlow selects the API to use based on the number of records that the run transfers. The threshold of records that determines the API varies based on whether Salesforce is the source or the destination, as shown in the following table:

    Is Salesforce the source or destination?    Number of records transferred    API used to transfer records
    Source                                      Fewer than 1,000,000             Salesforce REST API
    Source                                      1,000,000 or more                Salesforce Bulk API 2.0
    Destination                                 Fewer than 1,000                 Salesforce REST API
    Destination                                 1,000 or more                    Salesforce Bulk API 2.0

    Notes
    • If you choose this option, be aware that each of the potential Salesforce APIs structures data differently. For recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and how null values are represented.

    • Flow runs that use Bulk API 2.0 can't transfer Salesforce compound fields.

    If you choose this option, you optimize flow performance for all data transfer sizes, but the tradeoff is inconsistent formatting in the output.

  • Standard — Amazon AppFlow uses only Salesforce REST API. This option optimizes your flow for small- to medium-sized data transfers. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0.

    Note

    If you choose this option and your flow attempts to transfer a very large set of data, it might fail with a timeout error.

  • Bulk — Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. If you choose this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.

    Note

    If you choose this option, your flow can't transfer Salesforce compound fields because Bulk API 2.0 doesn't support them.
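
If you define flows with the Amazon AppFlow API rather than the console, the same preference is expressed through the dataTransferApi field of the Salesforce source properties (AUTOMATIC, REST_SYNC, or BULKV2). The following fragment is a minimal sketch of the sourceFlowConfig argument for CreateFlow; the connection and object names are placeholders, and the rest of the CreateFlow request (destination, tasks, trigger) is omitted.

    # Sketch: sourceFlowConfig fragment that pins the Salesforce API preference.
    # Connection and object names are placeholders.
    source_flow_config = {
        "connectorType": "Salesforce",
        "connectorProfileName": "MySalesforceConnection",  # placeholder
        "sourceConnectorProperties": {
            "Salesforce": {
                "object": "Account",          # placeholder Salesforce object
                "dataTransferApi": "BULKV2",  # or "AUTOMATIC" (default) or "REST_SYNC"
            }
        },
    }
    # Pass source_flow_config as the sourceFlowConfig parameter of appflow.create_flow(...).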

Salesforce destination record preference

When you use Salesforce as a destination, the Amazon AppFlow console shows additional settings on the Map data fields page under Destination record preference.


        The options for Destination record preference on the Map data fields page.

You can choose one of the following options:

Insert new records

This is the default data transfer option. When you choose this setting, Amazon AppFlow inserts your source data into the chosen Salesforce object as a new record.

Update existing records

When you choose this setting, Amazon AppFlow uses your source data to update existing records in Salesforce. For every source record, Amazon AppFlow looks for a matching record in Salesforce based on your criteria. You can specify matching criteria on the Map data fields page. To do so, select a field in the source application and map it to a Salesforce record ID field with the dropdown list.

When a matching record is found, Amazon AppFlow updates the record in Salesforce. If no matching record is found, Amazon AppFlow ignores the record or fails the flow per your chosen error handling option. You can specify your error handling preferences on the Configure flow page.

Note that you must use the upsert operation to update existing records by using an external ID field. The standard update operation does not support external ID fields.

Upsert records

When you choose this setting, Amazon AppFlow performs an upsert operation in Salesforce. For every source record, Amazon AppFlow looks for a matching record in Salesforce based on your criteria. You can specify matching criteria on the Map data fields page. To do so, select a field in the source application and map it to a Salesforce external ID field using the dropdown list.

When Amazon AppFlow finds a matching record, it updates the record in Salesforce. If it finds no matching record, it inserts the data as a new record. Any errors in performing the operation are handled according to your chosen error handling option. You can specify your error handling preferences on the Configure flow page.
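
When you define the flow through the Amazon AppFlow API, these record preferences map to the writeOperationType and idFieldNames fields of the Salesforce destination properties. The following fragment is a minimal sketch of a destination configuration for an upsert; the connection name, object, external ID field, and error-handling bucket are placeholders.

    # Sketch: destinationFlowConfigList entry for an upsert into Salesforce.
    # Placeholders: connection name, object, external ID field, S3 bucket for errors.
    destination_flow_config = {
        "connectorType": "Salesforce",
        "connectorProfileName": "MySalesforceConnection",
        "destinationConnectorProperties": {
            "Salesforce": {
                "object": "Account",
                "writeOperationType": "UPSERT",         # or "INSERT", "UPDATE", "DELETE"
                "idFieldNames": ["My_External_Id__c"],  # matching field for upserts/updates
                "errorHandlingConfig": {
                    "failOnFirstDestinationError": False,
                    "bucketName": "my-appflow-error-bucket",
                },
            }
        },
    }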

Delete existing records

When you choose this setting, Amazon AppFlow deletes Salesforce records that you specify. To specify the records, create a file that contains the IDs that Salesforce assigned to them. Provide that file as the source data for your flow.

For example, the following CSV file lists the IDs of two Salesforce records to delete.

salesforce_id
A1B2C3D4E5F6G7H8I9
J1K2L3M4N5O6P7Q9R0

In this example, the IDs appear under the file's single source field, salesforce_id.

In your flow definition, you must specify the source field that contains the IDs of the objects to delete. You do this when you map data fields. At that point, you map the source field to the corresponding destination field in Salesforce. For example, if you assigned the Salesforce object Opportunity to your flow, then the destination field name is Opportunity ID.

You can provide a source data file that has other fields besides the one with the IDs, but Amazon AppFlow ignores them.

Each flow can delete only one type of object, which is the Salesforce object that you choose when you configure the destination details.

After your flow runs, you can view the records that it deleted in your Salesforce recycle bin. You can recover the deleted records from the recycle bin if needed. However, you must do so before the recycle bin's retention period elapses or before the records are manually purged.

If any errors occur when you run the flow, Amazon AppFlow handles them according to the error handling option that you choose when you configure the flow.
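
As a sketch of how a delete flow can be expressed through the API, the destination uses writeOperationType DELETE, and a mapping task maps the source field that holds the record IDs to the Salesforce Id field. The field, object, and connection names below are placeholders based on the CSV example above, and the other parts of the CreateFlow request are omitted.

    # Sketch: pieces of a create_flow request that delete the Salesforce records
    # listed in a CSV file stored in Amazon S3. Names are placeholders.
    delete_task = {
        "taskType": "Map",
        "sourceFields": ["salesforce_id"],   # column in the CSV shown above
        "destinationField": "Id",            # Salesforce record ID field
        "connectorOperator": {"S3": "NO_OP"},
    }

    destination_flow_config = {
        "connectorType": "Salesforce",
        "connectorProfileName": "MySalesforceConnection",
        "destinationConnectorProperties": {
            "Salesforce": {
                "object": "Opportunity",     # the one object type this flow deletes
                "writeOperationType": "DELETE",
            }
        },
    }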

Notes

  • If you are transferring more than 1 million Salesforce records, you cannot choose any Salesforce compound field. Amazon AppFlow uses Salesforce bulk APIs for the transfer, which does not allow the transfer of compound fields.

  • Amazon AppFlow supports the automatic import of newly created Salesforce fields into Amazon S3 only, without requiring you to update your flow configuration.

  • When you use Salesforce as a source, you can import 15 GB of data as part of a single flow run. To transfer over 15 GB of data, you can split your workload into multiple flows by applying the appropriate filters to each flow. Salesforce records are typically 2 KB in size, but can be up to 4 KB. Therefore, 15 GB would be approximately 7.5 million Salesforce records.

  • When you use Salesforce as a source, you can run schedule-triggered flows at a maximum frequency of one flow run per minute.

  • Amazon AppFlow added support for Salesforce API version 55.0 on August 30th, 2022. Flows associated with all Salesforce connections created after this date will use Salesforce API version 55.0. Flows created before this date but after January 19th, 2021, will use Salesforce API version 50.0, while any flows created before January 19th, 2021, will use Salesforce API version 47.0.

  • Amazon AppFlow supports Change Data Capture events and Platform Events from Salesforce.
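
For event-driven flows that consume these Salesforce events, the trigger is expressed with a triggerType of Event when you create the flow through the API. The following fragment is a minimal sketch; it shows only the triggerConfig portion of a CreateFlow request.

    # Sketch: triggerConfig fragment for a flow that runs on Salesforce change events
    # (requires change data capture to be enabled in Salesforce, as noted above).
    trigger_config = {
        "triggerType": "Event",
    }
    # Pass trigger_config as the triggerConfig parameter of appflow.create_flow(...).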

Related resources