Create Storage Connector for Feature Store Data

Update Storage Connector: {{newStorageConnectorCtrl.storageConnectorName}}

{{newStorageConnectorCtrl.accordion1.title}} {{newStorageConnectorCtrl.accordion1.value}}

The storage connector name must not be empty, may contain only alphanumeric characters and underscores, and must be at most {{newStorageConnectorCtrl.storageConnectorNameMaxLength}} characters long.
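The naming rule above can be sketched as a small client-side check. This is a hedged illustration, not the actual Hopsworks validator: the function name `isValidConnectorName` and the example limit of 100 are assumptions; the real limit comes from `newStorageConnectorCtrl.storageConnectorNameMaxLength`.

```javascript
// Hypothetical sketch of the connector-name rule described above:
// non-empty, alphanumeric characters and underscores only, capped length.
function isValidConnectorName(name, maxLength) {
  return (
    typeof name === "string" &&
    name.length > 0 &&
    name.length <= maxLength &&
    /^[A-Za-z0-9_]+$/.test(name)
  );
}

// "sales_db_2024" satisfies the rule; "sales db!" does not.
console.log(isValidConnectorName("sales_db_2024", 100)); // true
console.log(isValidConnectorName("sales db!", 100));     // false
```
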

The provided storage connector name already exists in this feature store.

{{newStorageConnectorCtrl.accordion2.title}} {{newStorageConnectorCtrl.accordion2.value}}
{{newStorageConnectorCtrl.accordion3.title}} {{newStorageConnectorCtrl.accordion3.value}}

The JDBC connection string should not be empty and should be less than {{newStorageConnectorCtrl.jdbcStorageConnectorConnectionStringMaxLength}} characters.

The JDBC argument name should not be empty.

JDBC connector argument names should be unique.
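The uniqueness rule above can be expressed as a short check. This is a hedged sketch, not the Hopsworks implementation: the function name `argumentNamesAreUnique` is an assumption, and the arguments are assumed to be objects with a `name` field.

```javascript
// Hypothetical check that no two JDBC connector arguments share a name.
function argumentNamesAreUnique(args) {
  const seen = new Set();
  for (const arg of args) {
    if (seen.has(arg.name)) {
      return false; // duplicate name found
    }
    seen.add(arg.name);
  }
  return true;
}

console.log(argumentNamesAreUnique([{ name: "user" }, { name: "password" }])); // true
console.log(argumentNamesAreUnique([{ name: "user" }, { name: "user" }]));     // false
```
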

The S3 bucket should not be empty and should be less than {{newStorageConnectorCtrl.s3StorageConnectorBucketMaxLength}} characters.
{{m.algorithm}}

Please select an encryption algorithm.

The S3 server encryption key should not be empty and should not be longer than {{newStorageConnectorCtrl.s3ServerEncryptionKeyMaxLength}} characters.

The access key should not be empty and should not be longer than {{newStorageConnectorCtrl.s3StorageConnectorAccesskeyMaxLength}} characters.

The secret key should not be empty and should not be longer than {{newStorageConnectorCtrl.s3StorageConnectorSecretkeyMaxLength}} characters.

You need to select a role to use for temporary credentials.

{{$select.selected.name}}

You must select a HopsFS dataset.

Redshift connector property names should be unique.

{{newStorageConnectorCtrl.accordion4.title}} {{newStorageConnectorCtrl.accordion4.value}}