# feat: additional configurations for Kafka Trigger and Output #306
**Open** — `aloiva` wants to merge 1 commit into `Azure:dev` from `aloiva:dev`.
```diff
@@ -30,7 +30,8 @@
 from azure.functions.decorators.http import HttpTrigger, HttpOutput, \
     HttpMethod
 from azure.functions.decorators.kafka import KafkaTrigger, KafkaOutput, \
-    BrokerAuthenticationMode, BrokerProtocol, OAuthBearerMethod
+    BrokerAuthenticationMode, BrokerProtocol, OAuthBearerMethod, \
+    KafkaMessageKeyType
 from azure.functions.decorators.queue import QueueTrigger, QueueOutput
 from azure.functions.decorators.servicebus import ServiceBusQueueTrigger, \
     ServiceBusQueueOutput, ServiceBusTopicTrigger, \
```
```diff
@@ -1244,12 +1245,18 @@ def kafka_trigger(self,
                   event_hub_connection_string: Optional[str] = None,
                   consumer_group: Optional[str] = None,
                   avro_schema: Optional[str] = None,
+                  key_avro_schema: Optional[str] = None,
+                  key_data_type: Optional[Union[KafkaMessageKeyType, str]] = None,
                   username: Optional[str] = None,
                   password: Optional[str] = None,
                   ssl_key_location: Optional[str] = None,
                   ssl_ca_location: Optional[str] = None,
                   ssl_certificate_location: Optional[str] = None,
                   ssl_key_password: Optional[str] = None,
+                  ssl_certificate_pem: Optional[str] = None,
+                  ssl_key_pem: Optional[str] = None,
+                  ssl_ca_pem: Optional[str] = None,
+                  ssl_certificate_and_key_pem: Optional[str] = None,
                   schema_registry_url: Optional[str] = None,
                   schema_registry_username: Optional[str] = None,
                   schema_registry_password: Optional[str] = None,
```
```diff
@@ -1287,6 +1294,10 @@ def kafka_trigger(self,
         Azure Event Hubs).
         :param consumer_group: Kafka consumer group used by the trigger.
         :param avro_schema: Used only if a generic Avro record should be generated.
+        :param key_avro_schema: Avro schema for the message key. Used only if a
+        generic Avro record should be generated for the key.
+        :param key_data_type: Data type of the message key. Valid values: Int, Long,
+        String, Binary. Default is String. Ignored if key_avro_schema is set.
         :param username: SASL username for use with the PLAIN or SASL-SCRAM mechanisms.
         Equivalent to 'sasl.username' in librdkafka. Default is empty string.
         :param password: SASL password for use with the PLAIN or SASL-SCRAM mechanisms.
```

> **Contributor comment:** Could you please also add pydocs for the new …
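The docstring above names the valid key types (Int, Long, String, Binary) with String as the default, ignored when `key_avro_schema` is set. A minimal sketch of that behavior, assuming an enum shaped like the documented values — the real `KafkaMessageKeyType` lives in `azure.functions.decorators.kafka`, and the helper below is illustrative, not the extension's actual logic:

```python
from enum import Enum
from typing import Optional


class KafkaMessageKeyType(Enum):
    """Illustrative stand-in for the enum added by this PR; member names
    are inferred from the docstring's valid values, not copied from it."""
    INT = "Int"
    LONG = "Long"
    STRING = "String"
    BINARY = "Binary"


def effective_key_handling(
        key_avro_schema: Optional[str],
        key_data_type: KafkaMessageKeyType = KafkaMessageKeyType.STRING) -> str:
    """Return which mechanism decides the key's shape (sketch only).

    The docstring states key_data_type is ignored when key_avro_schema
    is set, so an Avro schema takes precedence here.
    """
    if key_avro_schema is not None:
        return "avro"
    return key_data_type.value


print(effective_key_handling(None))                   # String (the default)
print(effective_key_handling('{"type": "record"}'))   # avro
```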
```diff
@@ -1338,12 +1349,19 @@ def decorator():
                 event_hub_connection_string=event_hub_connection_string,  # noqa: E501
                 consumer_group=consumer_group,
                 avro_schema=avro_schema,
+                key_avro_schema=key_avro_schema,
+                key_data_type=parse_singular_param_to_enum(
+                    key_data_type, KafkaMessageKeyType),
                 username=username,
                 password=password,
                 ssl_key_location=ssl_key_location,
                 ssl_ca_location=ssl_ca_location,
                 ssl_certificate_location=ssl_certificate_location,
                 ssl_key_password=ssl_key_password,
+                ssl_certificate_pem=ssl_certificate_pem,
+                ssl_key_pem=ssl_key_pem,
+                ssl_ca_pem=ssl_ca_pem,
+                ssl_certificate_and_key_pem=ssl_certificate_and_key_pem,
                 schema_registry_url=schema_registry_url,
                 schema_registry_username=schema_registry_username,
                 schema_registry_password=schema_registry_password,
```
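The call site coerces `key_data_type` through `parse_singular_param_to_enum` so callers can pass either an enum member or its string name. A hedged reimplementation of the coercion this relies on — the real helper lives in the library's decorator utilities, and the case-insensitive name matching below is an assumption about its behavior:

```python
from enum import Enum
from typing import Optional, Type, TypeVar, Union

T = TypeVar("T", bound=Enum)


class KafkaMessageKeyType(Enum):
    # Illustrative stand-in for the enum added by this PR.
    INT = "Int"
    LONG = "Long"
    STRING = "String"
    BINARY = "Binary"


def parse_singular_param_to_enum_sketch(
        param: Optional[Union[T, str]],
        enum_cls: Type[T]) -> Optional[T]:
    """Sketch of the coercion the call site depends on: pass enum members
    through, map strings to members by name (case-insensitively), and
    leave None as None."""
    if param is None:
        return None
    if isinstance(param, enum_cls):
        return param
    try:
        return enum_cls[param.upper()]
    except KeyError:
        raise ValueError(
            f"Cannot parse {param!r} into a {enum_cls.__name__}")


print(parse_singular_param_to_enum_sketch("String", KafkaMessageKeyType))
# KafkaMessageKeyType.STRING
```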
```diff
@@ -2588,12 +2606,18 @@ def kafka_output(self,
                  topic: str,
                  broker_list: str,
                  avro_schema: Optional[str] = None,
+                 key_avro_schema: Optional[str] = None,
+                 key_data_type: Optional[Union[KafkaMessageKeyType, str]] = None,
                  username: Optional[str] = None,
                  password: Optional[str] = None,
                  ssl_key_location: Optional[str] = None,
                  ssl_ca_location: Optional[str] = None,
                  ssl_certificate_location: Optional[str] = None,
                  ssl_key_password: Optional[str] = None,
+                 ssl_certificate_pem: Optional[str] = None,
+                 ssl_key_pem: Optional[str] = None,
+                 ssl_ca_pem: Optional[str] = None,
+                 ssl_certificate_and_key_pem: Optional[str] = None,
                  schema_registry_url: Optional[str] = None,
                  schema_registry_username: Optional[str] = None,
                  schema_registry_password: Optional[str] = None,
```

> **Contributor comment** (on `key_data_type`): Same as trigger — please set the default value here to match …
```diff
@@ -2630,6 +2654,10 @@ def kafka_output(self,
         :param topic: The Kafka topic to which messages are published.
         :param broker_list: The list of Kafka brokers to which the producer connects.
         :param avro_schema: Optional. Avro schema to generate a generic record.
+        :param key_avro_schema: Avro schema for the message key. Used only if a
+        generic Avro record should be generated for the key.
+        :param key_data_type: Data type of the message key. Valid values: Int, Long,
+        String, Binary. Default is String. Ignored if key_avro_schema is set.
         :param username: SASL username for use with the PLAIN and SASL-SCRAM
         mechanisms. Equivalent to `'sasl.username'` in librdkafka.
         :param password: SASL password for use with the PLAIN and SASL-SCRAM
```
```diff
@@ -2642,6 +2670,14 @@ def kafka_output(self,
         Equivalent to `'ssl.certificate.location'` in librdkafka.
         :param ssl_key_password: Password for the client's SSL key.
         Equivalent to `'ssl.key.password'` in librdkafka.
+        :param ssl_certificate_pem: Client certificate in PEM format.
+        Equivalent to 'ssl.certificate.pem' in librdkafka.
+        :param ssl_key_pem: Client private key in PEM format.
+        Equivalent to 'ssl.key.pem' in librdkafka.
+        :param ssl_ca_pem: CA certificate for verifying the broker's certificate
+        in PEM format. Equivalent to 'ssl.ca.pem' in librdkafka.
+        :param ssl_certificate_and_key_pem: Client certificate and key in PEM format.
+        Additional configuration for KeyVault support (certificate with private key).
         :param schema_registry_url: URL of the Avro Schema Registry.
         :param schema_registry_username: Username for accessing the Schema Registry.
         :param schema_registry_password: Password for accessing the Schema Registry.
```
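The four `*_pem` parameters mirror the existing `*_location` file-path parameters, letting a function carry certificates inline (for example, secrets pulled from Key Vault) instead of pointing at files on disk. A sketch of how the two styles map onto the librdkafka keys named in the docstring — the key names come from the diff above, but the selection logic is illustrative, not the extension's actual behavior:

```python
from typing import Dict, Optional


def build_ssl_config(
        ssl_certificate_location: Optional[str] = None,
        ssl_key_location: Optional[str] = None,
        ssl_ca_location: Optional[str] = None,
        ssl_key_password: Optional[str] = None,
        ssl_certificate_pem: Optional[str] = None,
        ssl_key_pem: Optional[str] = None,
        ssl_ca_pem: Optional[str] = None) -> Dict[str, str]:
    """Map the decorator's SSL parameters onto librdkafka config keys.

    Illustrative only: each *_pem value is paired with the librdkafka key
    named in the docstring, preferring inline PEM (the Key Vault scenario)
    over the file-path *_location variant.
    """
    config: Dict[str, str] = {}
    pairs = [
        ("ssl.certificate.pem", ssl_certificate_pem,
         "ssl.certificate.location", ssl_certificate_location),
        ("ssl.key.pem", ssl_key_pem,
         "ssl.key.location", ssl_key_location),
        ("ssl.ca.pem", ssl_ca_pem,
         "ssl.ca.location", ssl_ca_location),
    ]
    for pem_key, pem_val, loc_key, loc_val in pairs:
        if pem_val is not None:        # inline PEM wins when both are given
            config[pem_key] = pem_val
        elif loc_val is not None:
            config[loc_key] = loc_val
    if ssl_key_password is not None:
        config["ssl.key.password"] = ssl_key_password
    return config


print(build_ssl_config(ssl_ca_pem="-----BEGIN CERTIFICATE-----..."))
# {'ssl.ca.pem': '-----BEGIN CERTIFICATE-----...'}
```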
```diff
@@ -2695,12 +2731,19 @@ def decorator():
                 topic=topic,
                 broker_list=broker_list,
                 avro_schema=avro_schema,
+                key_avro_schema=key_avro_schema,
+                key_data_type=parse_singular_param_to_enum(
+                    key_data_type, KafkaMessageKeyType),
                 username=username,
                 password=password,
                 ssl_key_location=ssl_key_location,
                 ssl_ca_location=ssl_ca_location,
                 ssl_certificate_location=ssl_certificate_location,
                 ssl_key_password=ssl_key_password,
+                ssl_certificate_pem=ssl_certificate_pem,
+                ssl_key_pem=ssl_key_pem,
+                ssl_ca_pem=ssl_ca_pem,
+                ssl_certificate_and_key_pem=ssl_certificate_and_key_pem,
                 schema_registry_url=schema_registry_url,
                 schema_registry_username=schema_registry_username,
                 schema_registry_password=schema_registry_password,
```
> **Contributor comment:** Could you set the default value for `key_data_type` to `KafkaMessageKeyType.STRING` here, since it's given a default value in `kafka.py`?
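The suggestion amounts to defaulting the parameter to the enum member directly rather than `None`. A minimal sketch of a signature following that suggestion — the function and enum here are illustrative stand-ins, not the library's code:

```python
from enum import Enum
from typing import Union


class KafkaMessageKeyType(Enum):
    # Illustrative stand-in for the enum defined in kafka.py.
    INT = "Int"
    LONG = "Long"
    STRING = "String"
    BINARY = "Binary"


def kafka_trigger_sketch(
        key_data_type: Union[KafkaMessageKeyType, str] = KafkaMessageKeyType.STRING):
    """Sketch of the reviewer's suggestion: default key_data_type to
    STRING in the decorator signature so it matches the default the PR
    gives it in kafka.py, instead of defaulting to None."""
    return key_data_type


print(kafka_trigger_sketch())  # KafkaMessageKeyType.STRING
```

One trade-off worth noting: with `None` as the default, the decorator can distinguish "caller omitted the argument" from "caller explicitly chose String", so either convention is defensible as long as signature and `kafka.py` agree.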