Usage

import * as mod from "https://aws-api.deno.dev/v0.3/services/lookoutequipment.ts?docs=full";
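
A minimal sketch of how this module is typically wired up, assuming the usual aws_api client factory pattern (the ApiFactory import path, the makeNew call, and the camelCase listDatasets method mirroring ListDatasetsRequest/ListDatasetsResponse are assumptions based on aws_api conventions, not confirmed by this page):

import { ApiFactory } from "https://deno.land/x/aws_api/client/mod.ts";
import * as mod from "https://aws-api.deno.dev/v0.3/services/lookoutequipment.ts?docs=full";

// Assumes ApiFactory picks up AWS credentials and region from the environment.
const lookout = new ApiFactory().makeNew(mod.LookoutEquipment);

// Assumed method shape: request/response types mirror the interfaces listed below,
// e.g. ListDatasetsRequest -> ListDatasetsResponse.
const { DatasetSummaries } = await lookout.listDatasets({ MaxResults: 10 });
console.log(DatasetSummaries);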

§Classes

LookoutEquipment

§Interfaces

CreateDatasetRequest
CreateDatasetResponse
CreateInferenceSchedulerRequest
CreateInferenceSchedulerResponse
CreateModelRequest
CreateModelResponse
DataIngestionJobSummary

Provides information about a specified data ingestion job, including dataset information, data ingestion configuration, and status.

DataPreProcessingConfiguration

The configuration is the TargetSamplingRate, which is the sampling rate of the data after post-processing by Amazon Lookout for Equipment. For example, if you provide data collected at 1-second intervals and you want the system to resample the data at a 1-minute rate before training, the TargetSamplingRate is 1 minute (see the sketch after this list).

DatasetSchema

Provides information about the data schema used with the given dataset.

DatasetSummary

Contains information about the specific dataset, including name, ARN, and status.

DeleteDatasetRequest
DeleteInferenceSchedulerRequest
DeleteModelRequest
DescribeDataIngestionJobRequest
DescribeDataIngestionJobResponse
DescribeDatasetRequest
DescribeDatasetResponse
DescribeInferenceSchedulerRequest
DescribeInferenceSchedulerResponse
DescribeModelRequest
DescribeModelResponse
InferenceExecutionSummary

Contains information about the specific inference execution, including input and output data configuration, inference scheduling information, status, and so on.

InferenceInputConfiguration

Specifies configuration information for the input data for the inference, including the S3 location of the input data.

InferenceInputNameConfiguration

Specifies configuration information for the input data for the inference, including timestamp format and delimiter.

InferenceOutputConfiguration

Specifies configuration information for the output results from the inference, including KMS key ID and output S3 location.

InferenceS3InputConfiguration

Specifies configuration information for the input data for the inference, including input data S3 location.

InferenceS3OutputConfiguration

Specifies configuration information for the output results from the inference, including output S3 location.

InferenceSchedulerSummary

Contains information about the specific inference scheduler, including data delay offset, model name and ARN, status, and so on.

IngestionInputConfiguration

Specifies configuration information for the input data for the data ingestion job, including input data S3 location.

IngestionS3InputConfiguration

Specifies S3 configuration information for the input data for the data ingestion job.

LabelsInputConfiguration

Contains the configuration information for the S3 location being used to hold label data.

LabelsS3InputConfiguration

The location information (bucket name and prefix) for the S3 location used for label data.

ListDataIngestionJobsRequest
ListDataIngestionJobsResponse
ListDatasetsRequest
ListDatasetsResponse
ListInferenceExecutionsRequest
ListInferenceExecutionsResponse
ListInferenceSchedulersRequest
ListInferenceSchedulersResponse
ListModelsRequest
ListModelsResponse
ListTagsForResourceRequest
ListTagsForResourceResponse
ModelSummary

Provides information about the specified ML model, including dataset and model names and ARNs, as well as status.

S3Object

Contains information about an S3 bucket and the object (key) within it.

StartDataIngestionJobRequest
StartDataIngestionJobResponse
StartInferenceSchedulerRequest
StartInferenceSchedulerResponse
StopInferenceSchedulerRequest
StopInferenceSchedulerResponse
Tag

A tag is a key-value pair that can be added to a resource as metadata.

TagResourceRequest
UntagResourceRequest
UpdateInferenceSchedulerRequest
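
To make the configuration interfaces above concrete, here is a hedged sketch of a CreateModelRequest that resamples 1-second sensor data to a 1-minute rate before training. Field names follow the AWS Lookout for Equipment API shapes; the exact required/optional split in this generated module, and all bucket, role, and resource names, are assumptions for illustration only:

import type * as mod from "https://aws-api.deno.dev/v0.3/services/lookoutequipment.ts?docs=full";

// Hypothetical values; bucket, role ARN, and names are placeholders.
const createModelParams: mod.CreateModelRequest = {
  ModelName: "pump-anomaly-model",
  DatasetName: "pump-sensor-dataset",
  RoleArn: "arn:aws:iam::123456789012:role/LookoutEquipmentAccess",
  // Idempotency token; included explicitly in case the generated type requires it.
  ClientToken: crypto.randomUUID(),
  // DataPreProcessingConfiguration: resample 1-second data to a 1-minute rate.
  // "PT1M" is assumed to be a member of the TargetSamplingRate type alias below.
  DataPreProcessingConfiguration: { TargetSamplingRate: "PT1M" },
  // LabelsInputConfiguration / LabelsS3InputConfiguration: the S3 bucket and prefix
  // holding label data, per the descriptions above.
  LabelsInputConfiguration: {
    S3InputConfiguration: { Bucket: "my-labels-bucket", Prefix: "labels/" },
  },
};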

§Type Aliases

DatasetStatus
DataUploadFrequency
InferenceExecutionStatus
InferenceSchedulerStatus
IngestionJobStatus
ModelStatus
TargetSamplingRate