Resource Type definition for AWS::SageMaker::InferenceExperiment
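A minimal configuration sketch for this resource is shown below. The experiment, endpoint, role ARN, model, and variant names are placeholders, and the instance settings are illustrative only; for a ShadowMode experiment you would typically also set the shadow_mode_config described later on this page.

```terraform
resource "awscc_sagemaker_inference_experiment" "example" {
  name          = "example-shadow-experiment"                    # placeholder experiment name
  type          = "ShadowMode"
  endpoint_name = "example-endpoint"                             # placeholder endpoint name
  role_arn      = "arn:aws:iam::123456789012:role/SageMakerRole" # placeholder IAM role ARN

  # One infrastructure configuration per variant participating in the experiment.
  model_variants = [
    {
      model_name   = "example-model"                             # placeholder SageMaker Model name
      variant_name = "production-variant"
      infrastructure_config = {
        infrastructure_type = "RealTimeInference"
        real_time_inference_config = {
          instance_count = 1
          instance_type  = "ml.m5.xlarge"
        }
      }
    }
  ]
}
```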
Required:

endpoint_name
(String) The name of the endpoint used to run the inference experiment.
model_variants
(Attributes List) An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant. (see below for nested schema)
name
(String) The name for the inference experiment.
role_arn
(String) The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
type
(String) The type of the inference experiment that you want to run.

Optional:

data_storage_config
(Attributes) The Amazon S3 location and configuration for storing inference request and response data. (see below for nested schema)
description
(String) The description of the inference experiment.
desired_state
(String) The desired state of the experiment after the start or stop operation.
kms_key
(String) The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
schedule
(Attributes) The duration for which you want the inference experiment to run. (see below for nested schema)
shadow_mode_config
(Attributes) The configuration of the ShadowMode inference experiment type. Use this field to specify a production variant that takes all the inference requests, and a shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant, also specify the percentage of requests that Amazon SageMaker replicates. (see below for nested schema)
status_reason
(String) The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
tags
(Attributes List) An array of key-value pairs to apply to this resource. (see below for nested schema)

Read-Only:

arn
(String) The Amazon Resource Name (ARN) of the inference experiment.
creation_time
(String) The timestamp at which you created the inference experiment.
endpoint_metadata
(Attributes) The metadata of the endpoint on which the inference experiment ran. (see below for nested schema)
id
(String) Uniquely identifies the resource.
last_modified_time
(String) The timestamp at which you last modified the inference experiment.
status
(String) The status of the inference experiment.
model_variants

Required:

infrastructure_config
(Attributes) The configuration for the infrastructure that the model will be deployed to. (see below for nested schema)
model_name
(String) The name of the Amazon SageMaker Model entity.
variant_name
(String) The name of the variant.
model_variants.infrastructure_config

Required:

infrastructure_type
(String) The type of infrastructure on which to deploy the model, for example RealTimeInference.
real_time_inference_config
(Attributes) The infrastructure configuration for deploying the model to a real-time inference endpoint. (see below for nested schema)
model_variants.infrastructure_config.real_time_inference_config

Required:

instance_count
(Number) The number of instances of the type specified by InstanceType.
instance_type
(String) The instance type the model is deployed to.
data_storage_config

Required:

destination
(String) The Amazon S3 bucket where the inference request and response data is stored.

Optional:

content_type
(Attributes) Configuration specifying how to treat different headers. If no headers are specified, SageMaker will base64 encode the captured data by default. (see below for nested schema)
kms_key
(String) The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
data_storage_config.content_type

Optional:

csv_content_types
(List of String) The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
json_content_types
(List of String) The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
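As an illustration, a data_storage_config that captures request and response data to S3 and treats common CSV and JSON headers explicitly might look like the following; the S3 location and KMS key are placeholders.

```terraform
data_storage_config = {
  destination = "s3://example-capture-bucket/inference-experiment/"  # placeholder S3 location
  kms_key     = "arn:aws:kms:us-east-1:123456789012:key/EXAMPLE"     # placeholder KMS key
  content_type = {
    csv_content_types  = ["text/csv"]
    json_content_types = ["application/json"]
  }
}
```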
schedule

Optional:

end_time
(String) The timestamp at which the inference experiment ended or will end.
start_time
(String) The timestamp at which the inference experiment started or will start.
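For example, a schedule that bounds the experiment to a fixed one-week window might be written as below. The timestamps are placeholders in ISO 8601 form; check the provider documentation for the exact timestamp format expected.

```terraform
schedule = {
  start_time = "2024-06-01T00:00:00Z"  # placeholder start of the experiment window
  end_time   = "2024-06-08T00:00:00Z"  # placeholder end of the experiment window
}
```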
shadow_mode_config

Required:

shadow_model_variants
(Attributes List) List of shadow variant configurations. (see below for nested schema)
source_model_variant_name
(String) The name of the production variant, which takes all the inference requests.
shadow_mode_config.shadow_model_variants

Required:

sampling_percentage
(Number) The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
shadow_model_variant_name
(String) The name of the shadow variant.
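A sketch of a shadow_mode_config that mirrors half of the production traffic to a single shadow variant is shown below; the variant names are placeholders and must match entries declared in model_variants.

```terraform
shadow_mode_config = {
  source_model_variant_name = "production-variant"  # placeholder, must match a model_variants entry
  shadow_model_variants = [
    {
      shadow_model_variant_name = "shadow-variant"  # placeholder, must match a model_variants entry
      sampling_percentage       = 50                # replicate 50% of requests to the shadow variant
    }
  ]
}
```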
tags

Required:

key
(String) The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.
value
(String) The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.
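Because tags is an attributes list of key/value objects, a couple of tags might be attached like this; the keys and values are examples only.

```terraform
tags = [
  {
    key   = "Environment"  # example tag key
    value = "staging"      # example tag value
  },
  {
    key   = "Team"
    value = "ml-platform"
  }
]
```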
endpoint_metadata

Read-Only:

endpoint_config_name
(String) The name of the endpoint configuration.
endpoint_name
(String) The name of the endpoint used to run the inference experiment.
endpoint_status
(String) The status of the endpoint.

Import is supported using the following syntax:
$ terraform import awscc_sagemaker_inference_experiment.example <resource ID>