@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class InferenceComponentSpecification extends Object implements Serializable, Cloneable, StructuredPojo
Details about the resources to deploy with this inference component, including the model, container, and compute resources.
| Constructor and Description |
|---|
| `InferenceComponentSpecification()` |
| Modifier and Type | Method and Description |
|---|---|
| `InferenceComponentSpecification` | `clone()` |
| `boolean` | `equals(Object obj)` |
| `InferenceComponentComputeResourceRequirements` | `getComputeResourceRequirements()`<br>The compute resources allocated to run the model assigned to the inference component. |
| `InferenceComponentContainerSpecification` | `getContainer()`<br>Defines a container that provides the runtime environment for a model that you deploy with an inference component. |
| `String` | `getModelName()`<br>The name of an existing SageMaker model object in your account that you want to deploy with the inference component. |
| `InferenceComponentStartupParameters` | `getStartupParameters()`<br>Settings that take effect while the model container starts up. |
| `int` | `hashCode()` |
| `void` | `marshall(ProtocolMarshaller protocolMarshaller)`<br>Marshalls this structured data using the given `ProtocolMarshaller`. |
| `void` | `setComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)`<br>The compute resources allocated to run the model assigned to the inference component. |
| `void` | `setContainer(InferenceComponentContainerSpecification container)`<br>Defines a container that provides the runtime environment for a model that you deploy with an inference component. |
| `void` | `setModelName(String modelName)`<br>The name of an existing SageMaker model object in your account that you want to deploy with the inference component. |
| `void` | `setStartupParameters(InferenceComponentStartupParameters startupParameters)`<br>Settings that take effect while the model container starts up. |
| `String` | `toString()`<br>Returns a string representation of this object. |
| `InferenceComponentSpecification` | `withComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)`<br>The compute resources allocated to run the model assigned to the inference component. |
| `InferenceComponentSpecification` | `withContainer(InferenceComponentContainerSpecification container)`<br>Defines a container that provides the runtime environment for a model that you deploy with an inference component. |
| `InferenceComponentSpecification` | `withModelName(String modelName)`<br>The name of an existing SageMaker model object in your account that you want to deploy with the inference component. |
| `InferenceComponentSpecification` | `withStartupParameters(InferenceComponentStartupParameters startupParameters)`<br>Settings that take effect while the model container starts up. |
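Because each with* method returns the specification itself, the methods above can be chained to build a specification fluently. A minimal sketch of that usage, assuming the standard SDK package `com.amazonaws.services.sagemaker.model` and a hypothetical model name `"my-model"`:

```java
import com.amazonaws.services.sagemaker.model.InferenceComponentComputeResourceRequirements;
import com.amazonaws.services.sagemaker.model.InferenceComponentSpecification;
import com.amazonaws.services.sagemaker.model.InferenceComponentStartupParameters;

public class SpecificationExample {
    public static void main(String[] args) {
        // Each with* call returns this InferenceComponentSpecification,
        // so the setter calls chain. "my-model" is a hypothetical name of
        // an existing SageMaker model object in the account.
        InferenceComponentSpecification spec = new InferenceComponentSpecification()
                .withModelName("my-model")
                .withComputeResourceRequirements(new InferenceComponentComputeResourceRequirements())
                .withStartupParameters(new InferenceComponentStartupParameters());

        System.out.println(spec.getModelName()); // prints my-model
    }
}
```

The equivalent set* methods mutate the object without returning it, so they cannot be chained; the with* variants are the usual choice when building a request inline.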
public void setModelName(String modelName)

The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

Parameters:
modelName - The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

public String getModelName()

The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

public InferenceComponentSpecification withModelName(String modelName)

The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

Parameters:
modelName - The name of an existing SageMaker model object in your account that you want to deploy with the inference component.

public void setContainer(InferenceComponentContainerSpecification container)

Defines a container that provides the runtime environment for a model that you deploy with an inference component.

Parameters:
container - Defines a container that provides the runtime environment for a model that you deploy with an inference component.

public InferenceComponentContainerSpecification getContainer()

Defines a container that provides the runtime environment for a model that you deploy with an inference component.

public InferenceComponentSpecification withContainer(InferenceComponentContainerSpecification container)

Defines a container that provides the runtime environment for a model that you deploy with an inference component.

Parameters:
container - Defines a container that provides the runtime environment for a model that you deploy with an inference component.

public void setStartupParameters(InferenceComponentStartupParameters startupParameters)

Settings that take effect while the model container starts up.

Parameters:
startupParameters - Settings that take effect while the model container starts up.

public InferenceComponentStartupParameters getStartupParameters()

Settings that take effect while the model container starts up.

public InferenceComponentSpecification withStartupParameters(InferenceComponentStartupParameters startupParameters)

Settings that take effect while the model container starts up.

Parameters:
startupParameters - Settings that take effect while the model container starts up.

public void setComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)

The compute resources allocated to run the model assigned to the inference component.

Parameters:
computeResourceRequirements - The compute resources allocated to run the model assigned to the inference component.

public InferenceComponentComputeResourceRequirements getComputeResourceRequirements()

The compute resources allocated to run the model assigned to the inference component.

public InferenceComponentSpecification withComputeResourceRequirements(InferenceComponentComputeResourceRequirements computeResourceRequirements)

The compute resources allocated to run the model assigned to the inference component.

Parameters:
computeResourceRequirements - The compute resources allocated to run the model assigned to the inference component.

public String toString()

Returns a string representation of this object.

Overrides:
toString in class Object

See Also:
Object.toString()
public InferenceComponentSpecification clone()
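A small sketch of how clone(), equals(Object), and hashCode() interact, assuming the generated equals/hashCode compare field values (the usual behavior of the code-generated SDK model classes); the model name is hypothetical:

```java
import com.amazonaws.services.sagemaker.model.InferenceComponentSpecification;

public class CloneExample {
    public static void main(String[] args) {
        InferenceComponentSpecification original = new InferenceComponentSpecification()
                .withModelName("my-model"); // hypothetical model name

        // clone() returns a copy typed as InferenceComponentSpecification,
        // so no cast is needed.
        InferenceComponentSpecification copy = original.clone();

        // With field-based equals/hashCode, a copy carrying the same field
        // values compares equal to the original.
        System.out.println(original.equals(copy));                  // true
        System.out.println(original.hashCode() == copy.hashCode()); // true
    }
}
```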
public void marshall(ProtocolMarshaller protocolMarshaller)

Marshalls this structured data using the given ProtocolMarshaller.

Specified by:
marshall in interface StructuredPojo

Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.