AWS Tools for Windows PowerShell
Command Reference

AWS services or capabilities described in AWS documentation can vary by Region. See Getting Started with Amazon AWS for the differences that apply to the China (Beijing) Region.

Synopsis

Calls the Amazon SageMaker Service UpdateInferenceComponent API operation.

Syntax

Update-SMInferenceComponent
-InferenceComponentName <String>
-AutoRollbackConfiguration_Alarm <Alarm[]>
-Container_ArtifactUrl <String>
-Specification_BaseInferenceComponentName <String>
-StartupParameters_ContainerStartupHealthCheckTimeoutInSecond <Int32>
-RuntimeConfig_CopyCount <Int32>
-DataCacheConfig_EnableCaching <Boolean>
-Container_Environment <Hashtable>
-Container_Image <String>
-RollingUpdatePolicy_MaximumExecutionTimeoutInSecond <Int32>
-ComputeResourceRequirements_MaxMemoryRequiredInMb <Int32>
-ComputeResourceRequirements_MinMemoryRequiredInMb <Int32>
-StartupParameters_ModelDataDownloadTimeoutInSecond <Int32>
-Specification_ModelName <String>
-ComputeResourceRequirements_NumberOfAcceleratorDevicesRequired <Single>
-ComputeResourceRequirements_NumberOfCpuCoresRequired <Single>
-MaximumBatchSize_Type <InferenceComponentCapacitySizeType>
-RollbackMaximumBatchSize_Type <InferenceComponentCapacitySizeType>
-MaximumBatchSize_Value <Int32>
-RollbackMaximumBatchSize_Value <Int32>
-RollingUpdatePolicy_WaitIntervalInSecond <Int32>
-Select <String>
-PassThru <SwitchParameter>
-Force <SwitchParameter>
-ClientConfig <AmazonSageMakerConfig>

Description

Updates an inference component.
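For example, a minimal call that scales an existing inference component to three runtime copies might look like the following sketch (the component name is hypothetical):

```powershell
# Hypothetical component name; substitute the name of your own inference component.
# Scales the component to three runtime copies of its model container.
Update-SMInferenceComponent `
    -InferenceComponentName "my-inference-component" `
    -RuntimeConfig_CopyCount 3
```

By default the cmdlet returns the inference component's ARN as a string (see the -Select parameter).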

Parameters

-AutoRollbackConfiguration_Alarm <Alarm[]>
List of CloudWatch alarms in your account that are configured to monitor metrics on an endpoint. If any alarms are tripped during a deployment, SageMaker rolls back the deployment.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_AutoRollbackConfiguration_Alarms
-ClientConfig <AmazonSageMakerConfig>
A service client configuration object (Amazon.PowerShell.Cmdlets.SM.AmazonSageMakerClientCmdlet.ClientConfig) that the cmdlet uses when calling the service. Use it to override default client settings for this call.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-ComputeResourceRequirements_MaxMemoryRequiredInMb <Int32>
The maximum MB of memory to allocate to run a model that you assign to an inference component.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_ComputeResourceRequirements_MaxMemoryRequiredInMb
-ComputeResourceRequirements_MinMemoryRequiredInMb <Int32>
The minimum MB of memory to allocate to run a model that you assign to an inference component.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_ComputeResourceRequirements_MinMemoryRequiredInMb
-ComputeResourceRequirements_NumberOfAcceleratorDevicesRequired <Single>
The number of accelerators to allocate to run a model that you assign to an inference component. Accelerators include GPUs and Amazon Web Services Inferentia.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_ComputeResourceRequirements_NumberOfAcceleratorDevicesRequired
-ComputeResourceRequirements_NumberOfCpuCoresRequired <Single>
The number of CPU cores to allocate to run a model that you assign to an inference component.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_ComputeResourceRequirements_NumberOfCpuCoresRequired
-Container_ArtifactUrl <String>
The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix).
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_Container_ArtifactUrl
-Container_Environment <Hashtable>
The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. The map supports up to 16 entries.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_Container_Environment
-Container_Image <String>
The Amazon Elastic Container Registry (Amazon ECR) path where the Docker image for the model is stored.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_Container_Image
-DataCacheConfig_EnableCaching <Boolean>
Sets whether the endpoint that hosts the inference component caches the model artifacts and container image. With caching enabled, the endpoint caches this data in each instance that it provisions for the inference component, so the inference component deploys faster during auto scaling. If caching isn't enabled, the inference component takes longer to deploy because of the time it spends downloading the data.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_DataCacheConfig_EnableCaching
-Force <SwitchParameter>
This parameter overrides confirmation prompts to force the cmdlet to continue its operation. Use it with caution.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-InferenceComponentName <String>
The name of the inference component.
Required? True
Position? 1
Accept pipeline input? True (ByValue, ByPropertyName)
-MaximumBatchSize_Type <InferenceComponentCapacitySizeType>
Specifies the endpoint capacity type.
COPY_COUNT: The endpoint activates based on the number of inference component copies.
CAPACITY_PERCENT: The endpoint activates based on the specified percentage of capacity.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_RollingUpdatePolicy_MaximumBatchSize_Type
-MaximumBatchSize_Value <Int32>
Defines the capacity size, either as a number of inference component copies or a capacity percentage.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_RollingUpdatePolicy_MaximumBatchSize_Value
-PassThru <SwitchParameter>
Changes the cmdlet behavior to return the value passed to the InferenceComponentName parameter. The -PassThru parameter is deprecated; use -Select '^InferenceComponentName' instead. This parameter will be removed in a future version.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-RollbackMaximumBatchSize_Type <InferenceComponentCapacitySizeType>
Specifies the endpoint capacity type.
COPY_COUNT: The endpoint activates based on the number of inference component copies.
CAPACITY_PERCENT: The endpoint activates based on the specified percentage of capacity.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_RollingUpdatePolicy_RollbackMaximumBatchSize_Type
-RollbackMaximumBatchSize_Value <Int32>
Defines the capacity size, either as a number of inference component copies or a capacity percentage.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_RollingUpdatePolicy_RollbackMaximumBatchSize_Value
-RollingUpdatePolicy_MaximumExecutionTimeoutInSecond <Int32>
The time limit for the total deployment. Exceeding this limit causes a timeout.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_RollingUpdatePolicy_MaximumExecutionTimeoutInSeconds
-RollingUpdatePolicy_WaitIntervalInSecond <Int32>
The length of the baking period, during which SageMaker AI monitors alarms for each batch on the new fleet.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases DeploymentConfig_RollingUpdatePolicy_WaitIntervalInSeconds
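Taken together, the rolling update parameters above can be combined as in the following sketch (the component name, image URI, and alarm name are hypothetical), which rolls out a new container image one copy at a time, waits ten minutes between batches, and rolls back if the alarm trips:

```powershell
# Hypothetical alarm name; the Alarm type comes from the AWS SDK for .NET.
$alarm = New-Object Amazon.SageMaker.Model.Alarm
$alarm.AlarmName = "my-endpoint-5xx-alarm"

# Hypothetical component name and ECR image URI.
Update-SMInferenceComponent `
    -InferenceComponentName "my-inference-component" `
    -Container_Image "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-model:v2" `
    -MaximumBatchSize_Type "COPY_COUNT" `
    -MaximumBatchSize_Value 1 `
    -RollingUpdatePolicy_WaitIntervalInSecond 600 `
    -AutoRollbackConfiguration_Alarm $alarm
```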
-RuntimeConfig_CopyCount <Int32>
The number of runtime copies of the model container to deploy with the inference component. Each copy can serve inference requests.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-Select <String>
Use the -Select parameter to control the cmdlet output. The default value is 'InferenceComponentArn'. Specifying -Select '*' will result in the cmdlet returning the whole service response (Amazon.SageMaker.Model.UpdateInferenceComponentResponse). Specifying the name of a property of type Amazon.SageMaker.Model.UpdateInferenceComponentResponse will result in that property being returned. Specifying -Select '^ParameterName' will result in the cmdlet returning the selected cmdlet parameter value.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
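As a sketch (with a hypothetical component name), -Select can return the whole service response or echo a parameter value back:

```powershell
# Return the full Amazon.SageMaker.Model.UpdateInferenceComponentResponse object.
Update-SMInferenceComponent -InferenceComponentName "my-inference-component" `
    -RuntimeConfig_CopyCount 2 -Select '*'

# Return the value that was passed to -InferenceComponentName.
Update-SMInferenceComponent -InferenceComponentName "my-inference-component" `
    -RuntimeConfig_CopyCount 2 -Select '^InferenceComponentName'
```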
-Specification_BaseInferenceComponentName <String>
The name of an existing inference component that is to contain the inference component that you're creating with your request. Specify this parameter only if your request is meant to create an adapter inference component. An adapter inference component contains the path to an adapter model; the purpose of the adapter model is to tailor the inference output of a base foundation model, which is hosted by the base inference component. The adapter inference component uses the compute resources that you assigned to the base inference component. When you create an adapter inference component, use the Container parameter to specify the location of the adapter artifacts. In the parameter value, use the ArtifactUrl parameter of the InferenceComponentContainerSpecification data type. Before you can create an adapter inference component, you must have an existing inference component that contains the foundation model that you want to adapt.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
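A sketch of updating an adapter inference component, with hypothetical component names and S3 path:

```powershell
# Hypothetical names and artifact location. The base component hosts the
# foundation model; this call points the adapter component at new adapter
# artifacts, which must be a single gzip-compressed tar archive.
Update-SMInferenceComponent `
    -InferenceComponentName "my-adapter-component" `
    -Specification_BaseInferenceComponentName "my-base-component" `
    -Container_ArtifactUrl "s3://amzn-s3-demo-bucket/adapters/adapter-v2.tar.gz"
```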
-Specification_ModelName <String>
The name of an existing SageMaker AI model object in your account that you want to deploy with the inference component.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-StartupParameters_ContainerStartupHealthCheckTimeoutInSecond <Int32>
The timeout value, in seconds, for your inference container to pass the health check performed by SageMaker Hosting. For more information about health checks, see How Your Container Should Respond to Health Check (Ping) Requests.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_StartupParameters_ContainerStartupHealthCheckTimeoutInSeconds
-StartupParameters_ModelDataDownloadTimeoutInSecond <Int32>
The timeout value, in seconds, to download and extract the model that you want to host from Amazon S3 to the individual inference instance associated with this inference component.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases Specification_StartupParameters_ModelDataDownloadTimeoutInSeconds

Common Credential and Region Parameters

-AccessKey <String>
The AWS access key for the user account. This can be a temporary access key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases AK
-Credential <AWSCredentials>
An AWSCredentials object instance containing access and secret key information, and optionally a token for session-based credentials.
Required? False
Position? Named
Accept pipeline input? True (ByValue, ByPropertyName)
-EndpointUrl <String>
The endpoint to make the call against. Note: This parameter is primarily for internal AWS use and is not required and should not be specified for normal use. The cmdlets normally determine which endpoint to call based on the region specified with the -Region parameter or set as the default in the shell (via Set-DefaultAWSRegion). Specify this parameter only if you must direct the call to a specific custom endpoint.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-NetworkCredential <PSCredential>
Used with SAML-based authentication when ProfileName references a SAML role profile. Contains the network credentials to be supplied during authentication with the configured identity provider's endpoint. This parameter is not required if the user's default network identity can or should be used during authentication.
Required? False
Position? Named
Accept pipeline input? True (ByValue, ByPropertyName)
-ProfileLocation <String>
Used to specify the name and location of the ini-format credential file (shared with the AWS CLI and other AWS SDKs). If this optional parameter is omitted, this cmdlet first searches the encrypted credential file used by the AWS SDK for .NET and AWS Toolkit for Visual Studio. If the profile is not found there, the cmdlet searches the ini-format credential file at the default location: (user's home directory)\.aws\credentials. If this parameter is specified, the cmdlet searches only the ini-format credential file at the given location. Because the current folder can vary in a shell or during script execution, specify a fully qualified path rather than a relative path.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases AWSProfilesLocation, ProfilesLocation
-ProfileName <String>
The user-defined name of an AWS credentials or SAML-based role profile containing credential information. The profile is expected to be found in the secure credential file shared with the AWS SDK for .NET and AWS Toolkit for Visual Studio. You can also specify the name of a profile stored in the .ini-format credential file used with the AWS CLI and other AWS SDKs.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases StoredCredentials, AWSProfileName
-Region <Object>
The system name of an AWS region or an AWSRegion instance. This governs the endpoint that will be used when calling service operations. Note that the AWS resources referenced in a call are usually region-specific.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases RegionToCall
-SecretKey <String>
The AWS secret key for the user account. This can be a temporary secret key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases SK, SecretAccessKey
-SessionToken <String>
The session token if the access and secret keys are temporary session-based credentials.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases ST
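The common parameters compose with the cmdlet's own parameters. For example, the following sketch (the profile and component names are hypothetical) routes the call through a named credential profile in a specific region:

```powershell
# Hypothetical profile and component names; uses stored credentials from the
# named profile and calls the service in us-west-2.
Update-SMInferenceComponent `
    -InferenceComponentName "my-inference-component" `
    -RuntimeConfig_CopyCount 2 `
    -ProfileName "my-sagemaker-profile" `
    -Region us-west-2
```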

Outputs

This cmdlet returns a System.String object. The whole service call response (type Amazon.SageMaker.Model.UpdateInferenceComponentResponse) can be returned by specifying -Select '*'.

Supported Version

AWS Tools for PowerShell: 2.x.y.z