amazonka-dms-2.0: Amazon Database Migration Service SDK.
Copyright    (c) 2013-2023 Brendan Hay
License      Mozilla Public License, v. 2.0.
Maintainer   Brendan Hay
Stability    auto-generated
Portability  non-portable (GHC extensions)
Safe Haskell Safe-Inferred
Language     Haskell2010

Amazonka.DMS.Types.KafkaSettings

Description

 
Synopsis

Documentation

data KafkaSettings Source #

Provides information that describes an Apache Kafka endpoint. This information includes the output format of records applied to the endpoint and details of transaction and control table data information.

See: newKafkaSettings smart constructor.

Constructors

KafkaSettings' 

Fields

  • broker :: Maybe Text

    A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.

  • includeControlDetails :: Maybe Bool

    Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.

  • includeNullAndEmpty :: Maybe Bool

    Include NULL and empty columns for records migrated to the endpoint. The default is false.

  • includePartitionValue :: Maybe Bool

    Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.

  • includeTableAlterOperations :: Maybe Bool

    Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.

  • includeTransactionDetails :: Maybe Bool

    Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.

  • messageFormat :: Maybe MessageFormatValue

    The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).

  • messageMaxBytes :: Maybe Int

    The maximum size in bytes for records created on the endpoint. The default is 1,000,000.

  • noHexPrefix :: Maybe Bool

    Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.

  • partitionIncludeSchemaTable :: Maybe Bool

    Prefixes schema and table names to partition values, when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.

  • saslPassword :: Maybe (Sensitive Text)

    The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

  • saslUsername :: Maybe Text

    The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

  • securityProtocol :: Maybe KafkaSecurityProtocol

    Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.

  • sslCaCertificateArn :: Maybe Text

    The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.

  • sslClientCertificateArn :: Maybe Text

    The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.

  • sslClientKeyArn :: Maybe Text

    The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.

  • sslClientKeyPassword :: Maybe (Sensitive Text)

    The password for the client private key used to securely connect to a Kafka target endpoint.

  • topic :: Maybe Text

    The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.
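
As a sketch of how the fields above fit together (the broker address and topic name here are hypothetical), a value is conventionally built from the newKafkaSettings smart constructor, with only the needed fields overridden via record update syntax:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka.DMS.Types.KafkaSettings

-- Hypothetical endpoint: start from the all-defaults smart constructor
-- and set only the fields this endpoint needs; everything else stays Nothing.
exampleSettings :: KafkaSettings
exampleSettings =
  newKafkaSettings
    { broker = Just "ec2-12-345-678-901.compute-1.amazonaws.com:2345"
    , topic = Just "dms-migration-topic"
    , includeNullAndEmpty = Just True
    }
```

Fields left as Nothing are omitted from the serialized request, so the service applies its own defaults (false for the Boolean flags, as documented above).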

Instances

Instances details
FromJSON KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

ToJSON KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Generic KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Associated Types

type Rep KafkaSettings :: Type -> Type #

Show KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

NFData KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Methods

rnf :: KafkaSettings -> () #

Eq KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Hashable KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

type Rep KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

type Rep KafkaSettings = D1 ('MetaData "KafkaSettings" "Amazonka.DMS.Types.KafkaSettings" "amazonka-dms-2.0-LVCLJv4CY4nJuf0WXCDs3O" 'False) (C1 ('MetaCons "KafkaSettings'" 'PrefixI 'True) ((((S1 ('MetaSel ('Just "broker") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: S1 ('MetaSel ('Just "includeControlDetails") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool))) :*: (S1 ('MetaSel ('Just "includeNullAndEmpty") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)) :*: S1 ('MetaSel ('Just "includePartitionValue") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)))) :*: ((S1 ('MetaSel ('Just "includeTableAlterOperations") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)) :*: S1 ('MetaSel ('Just "includeTransactionDetails") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool))) :*: (S1 ('MetaSel ('Just "messageFormat") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe MessageFormatValue)) :*: (S1 ('MetaSel ('Just "messageMaxBytes") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Int)) :*: S1 ('MetaSel ('Just "noHexPrefix") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)))))) :*: (((S1 ('MetaSel ('Just "partitionIncludeSchemaTable") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)) :*: S1 ('MetaSel ('Just "saslPassword") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe (Sensitive Text)))) :*: (S1 ('MetaSel ('Just "saslUsername") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: S1 ('MetaSel ('Just "securityProtocol") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe KafkaSecurityProtocol)))) :*: ((S1 ('MetaSel ('Just "sslCaCertificateArn") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: S1 ('MetaSel 
('Just "sslClientCertificateArn") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text))) :*: (S1 ('MetaSel ('Just "sslClientKeyArn") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: (S1 ('MetaSel ('Just "sslClientKeyPassword") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe (Sensitive Text))) :*: S1 ('MetaSel ('Just "topic") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text))))))))

newKafkaSettings :: KafkaSettings Source #

Create a value of KafkaSettings with all optional fields omitted.

Use generic-lens or optics to modify other optional fields.

The following record fields are available, with the corresponding lenses provided for backwards compatibility:

$sel:broker:KafkaSettings', kafkaSettings_broker - A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.

$sel:includeControlDetails:KafkaSettings', kafkaSettings_includeControlDetails - Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.

$sel:includeNullAndEmpty:KafkaSettings', kafkaSettings_includeNullAndEmpty - Include NULL and empty columns for records migrated to the endpoint. The default is false.

$sel:includePartitionValue:KafkaSettings', kafkaSettings_includePartitionValue - Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.

$sel:includeTableAlterOperations:KafkaSettings', kafkaSettings_includeTableAlterOperations - Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.

$sel:includeTransactionDetails:KafkaSettings', kafkaSettings_includeTransactionDetails - Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.

$sel:messageFormat:KafkaSettings', kafkaSettings_messageFormat - The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).

$sel:messageMaxBytes:KafkaSettings', kafkaSettings_messageMaxBytes - The maximum size in bytes for records created on the endpoint. The default is 1,000,000.

$sel:noHexPrefix:KafkaSettings', kafkaSettings_noHexPrefix - Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.

$sel:partitionIncludeSchemaTable:KafkaSettings', kafkaSettings_partitionIncludeSchemaTable - Prefixes schema and table names to partition values, when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.

$sel:saslPassword:KafkaSettings', kafkaSettings_saslPassword - The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

$sel:saslUsername:KafkaSettings', kafkaSettings_saslUsername - The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

$sel:securityProtocol:KafkaSettings', kafkaSettings_securityProtocol - Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.

$sel:sslCaCertificateArn:KafkaSettings', kafkaSettings_sslCaCertificateArn - The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.

$sel:sslClientCertificateArn:KafkaSettings', kafkaSettings_sslClientCertificateArn - The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.

$sel:sslClientKeyArn:KafkaSettings', kafkaSettings_sslClientKeyArn - The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.

$sel:sslClientKeyPassword:KafkaSettings', kafkaSettings_sslClientKeyPassword - The password for the client private key used to securely connect to a Kafka target endpoint.

$sel:topic:KafkaSettings', kafkaSettings_topic - The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.
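
Equivalently, the generated lenses can set the same fields. This sketch assumes the lens (or microlens) package for the & and ?~ operators; the broker address is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka.DMS.Types.KafkaSettings
import Control.Lens ((&), (?~))

-- The same defaults-plus-overrides pattern, written with the generated lenses.
-- (?~) wraps the right-hand side in Just before setting the Maybe field.
settingsViaLenses :: KafkaSettings
settingsViaLenses =
  newKafkaSettings
    & kafkaSettings_broker ?~ "broker-1.example.com:9092" -- hypothetical broker
    & kafkaSettings_messageMaxBytes ?~ 1000000
```

The lens style composes well when threading a base configuration through several override steps, which is why amazonka documents both spellings.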

kafkaSettings_broker :: Lens' KafkaSettings (Maybe Text) Source #

A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.

kafkaSettings_includeControlDetails :: Lens' KafkaSettings (Maybe Bool) Source #

Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.

kafkaSettings_includeNullAndEmpty :: Lens' KafkaSettings (Maybe Bool) Source #

Include NULL and empty columns for records migrated to the endpoint. The default is false.

kafkaSettings_includePartitionValue :: Lens' KafkaSettings (Maybe Bool) Source #

Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.

kafkaSettings_includeTableAlterOperations :: Lens' KafkaSettings (Maybe Bool) Source #

Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.

kafkaSettings_includeTransactionDetails :: Lens' KafkaSettings (Maybe Bool) Source #

Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.

kafkaSettings_messageFormat :: Lens' KafkaSettings (Maybe MessageFormatValue) Source #

The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).

kafkaSettings_messageMaxBytes :: Lens' KafkaSettings (Maybe Int) Source #

The maximum size in bytes for records created on the endpoint. The default is 1,000,000.

kafkaSettings_noHexPrefix :: Lens' KafkaSettings (Maybe Bool) Source #

Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.

kafkaSettings_partitionIncludeSchemaTable :: Lens' KafkaSettings (Maybe Bool) Source #

Prefixes schema and table names to partition values, when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.

kafkaSettings_saslPassword :: Lens' KafkaSettings (Maybe Text) Source #

The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

kafkaSettings_saslUsername :: Lens' KafkaSettings (Maybe Text) Source #

The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

kafkaSettings_securityProtocol :: Lens' KafkaSettings (Maybe KafkaSecurityProtocol) Source #

Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
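
For instance, a sasl-ssl endpoint needs the username and password set alongside the protocol choice, as noted above. In this sketch the KafkaSecurityProtocol_Sasl_ssl constructor name is assumed from amazonka's usual pattern-synonym naming and should be checked against Amazonka.DMS.Types; the credentials are placeholders:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka.DMS.Types
import Control.Lens ((&), (?~))

-- sasl-ssl requires SaslUsername and SaslPassword alongside the protocol.
-- Credentials here are placeholders; in practice, load them from a secret store.
saslSettings :: KafkaSettings
saslSettings =
  newKafkaSettings
    & kafkaSettings_securityProtocol ?~ KafkaSecurityProtocol_Sasl_ssl
    & kafkaSettings_saslUsername ?~ "dms-user"
    & kafkaSettings_saslPassword ?~ "example-password"
```

Note that although the saslPassword field is stored as Sensitive Text, the generated lens exposes plain Text, so a string literal works here; the Sensitive wrapper only affects how the value is shown and logged.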

kafkaSettings_sslCaCertificateArn :: Lens' KafkaSettings (Maybe Text) Source #

The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.

kafkaSettings_sslClientCertificateArn :: Lens' KafkaSettings (Maybe Text) Source #

The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.

kafkaSettings_sslClientKeyArn :: Lens' KafkaSettings (Maybe Text) Source #

The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.

kafkaSettings_sslClientKeyPassword :: Lens' KafkaSettings (Maybe Text) Source #

The password for the client private key used to securely connect to a Kafka target endpoint.

kafkaSettings_topic :: Lens' KafkaSettings (Maybe Text) Source #

The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.