Viewing docs for Databricks v1.90.0
published on Thursday, Mar 19, 2026 by Pulumi

    Private Preview

    Create FeatureEngineeringKafkaConfig Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new FeatureEngineeringKafkaConfig(name: string, args: FeatureEngineeringKafkaConfigArgs, opts?: CustomResourceOptions);
    @overload
    def FeatureEngineeringKafkaConfig(resource_name: str,
                                      args: FeatureEngineeringKafkaConfigArgs,
                                      opts: Optional[ResourceOptions] = None)
    
    @overload
    def FeatureEngineeringKafkaConfig(resource_name: str,
                                      opts: Optional[ResourceOptions] = None,
                                      auth_config: Optional[FeatureEngineeringKafkaConfigAuthConfigArgs] = None,
                                      bootstrap_servers: Optional[str] = None,
                                      subscription_mode: Optional[FeatureEngineeringKafkaConfigSubscriptionModeArgs] = None,
                                      backfill_source: Optional[FeatureEngineeringKafkaConfigBackfillSourceArgs] = None,
                                      extra_options: Optional[Mapping[str, str]] = None,
                                      key_schema: Optional[FeatureEngineeringKafkaConfigKeySchemaArgs] = None,
                                      provider_config: Optional[FeatureEngineeringKafkaConfigProviderConfigArgs] = None,
                                      value_schema: Optional[FeatureEngineeringKafkaConfigValueSchemaArgs] = None)
    func NewFeatureEngineeringKafkaConfig(ctx *Context, name string, args FeatureEngineeringKafkaConfigArgs, opts ...ResourceOption) (*FeatureEngineeringKafkaConfig, error)
    public FeatureEngineeringKafkaConfig(string name, FeatureEngineeringKafkaConfigArgs args, CustomResourceOptions? opts = null)
    public FeatureEngineeringKafkaConfig(String name, FeatureEngineeringKafkaConfigArgs args)
    public FeatureEngineeringKafkaConfig(String name, FeatureEngineeringKafkaConfigArgs args, CustomResourceOptions options)
    
    type: databricks:FeatureEngineeringKafkaConfig
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args FeatureEngineeringKafkaConfigArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args FeatureEngineeringKafkaConfigArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args FeatureEngineeringKafkaConfigArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args FeatureEngineeringKafkaConfigArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args FeatureEngineeringKafkaConfigArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    Constructor example

    The following reference example uses placeholder values for all input properties.

    var featureEngineeringKafkaConfigResource = new Databricks.FeatureEngineeringKafkaConfig("featureEngineeringKafkaConfigResource", new()
    {
        AuthConfig = new Databricks.Inputs.FeatureEngineeringKafkaConfigAuthConfigArgs
        {
            UcServiceCredentialName = "string",
        },
        BootstrapServers = "string",
        SubscriptionMode = new Databricks.Inputs.FeatureEngineeringKafkaConfigSubscriptionModeArgs
        {
            Assign = "string",
            Subscribe = "string",
            SubscribePattern = "string",
        },
        BackfillSource = new Databricks.Inputs.FeatureEngineeringKafkaConfigBackfillSourceArgs
        {
            DeltaTableSource = new Databricks.Inputs.FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSourceArgs
            {
                FullName = "string",
                DataframeSchema = "string",
                EntityColumns = new[]
                {
                    "string",
                },
                FilterCondition = "string",
                TimeseriesColumn = "string",
                TransformationSql = "string",
            },
        },
        ExtraOptions = 
        {
            { "string", "string" },
        },
        KeySchema = new Databricks.Inputs.FeatureEngineeringKafkaConfigKeySchemaArgs
        {
            JsonSchema = "string",
        },
        ProviderConfig = new Databricks.Inputs.FeatureEngineeringKafkaConfigProviderConfigArgs
        {
            WorkspaceId = "string",
        },
        ValueSchema = new Databricks.Inputs.FeatureEngineeringKafkaConfigValueSchemaArgs
        {
            JsonSchema = "string",
        },
    });
    
    example, err := databricks.NewFeatureEngineeringKafkaConfig(ctx, "featureEngineeringKafkaConfigResource", &databricks.FeatureEngineeringKafkaConfigArgs{
    	AuthConfig: &databricks.FeatureEngineeringKafkaConfigAuthConfigArgs{
    		UcServiceCredentialName: pulumi.String("string"),
    	},
    	BootstrapServers: pulumi.String("string"),
    	SubscriptionMode: &databricks.FeatureEngineeringKafkaConfigSubscriptionModeArgs{
    		Assign:           pulumi.String("string"),
    		Subscribe:        pulumi.String("string"),
    		SubscribePattern: pulumi.String("string"),
    	},
    	BackfillSource: &databricks.FeatureEngineeringKafkaConfigBackfillSourceArgs{
    		DeltaTableSource: &databricks.FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSourceArgs{
    			FullName:        pulumi.String("string"),
    			DataframeSchema: pulumi.String("string"),
    			EntityColumns: pulumi.StringArray{
    				pulumi.String("string"),
    			},
    			FilterCondition:   pulumi.String("string"),
    			TimeseriesColumn:  pulumi.String("string"),
    			TransformationSql: pulumi.String("string"),
    		},
    	},
    	ExtraOptions: pulumi.StringMap{
    		"string": pulumi.String("string"),
    	},
    	KeySchema: &databricks.FeatureEngineeringKafkaConfigKeySchemaArgs{
    		JsonSchema: pulumi.String("string"),
    	},
    	ProviderConfig: &databricks.FeatureEngineeringKafkaConfigProviderConfigArgs{
    		WorkspaceId: pulumi.String("string"),
    	},
    	ValueSchema: &databricks.FeatureEngineeringKafkaConfigValueSchemaArgs{
    		JsonSchema: pulumi.String("string"),
    	},
    })
    
    var featureEngineeringKafkaConfigResource = new FeatureEngineeringKafkaConfig("featureEngineeringKafkaConfigResource", FeatureEngineeringKafkaConfigArgs.builder()
        .authConfig(FeatureEngineeringKafkaConfigAuthConfigArgs.builder()
            .ucServiceCredentialName("string")
            .build())
        .bootstrapServers("string")
        .subscriptionMode(FeatureEngineeringKafkaConfigSubscriptionModeArgs.builder()
            .assign("string")
            .subscribe("string")
            .subscribePattern("string")
            .build())
        .backfillSource(FeatureEngineeringKafkaConfigBackfillSourceArgs.builder()
            .deltaTableSource(FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSourceArgs.builder()
                .fullName("string")
                .dataframeSchema("string")
                .entityColumns("string")
                .filterCondition("string")
                .timeseriesColumn("string")
                .transformationSql("string")
                .build())
            .build())
        .extraOptions(Map.of("string", "string"))
        .keySchema(FeatureEngineeringKafkaConfigKeySchemaArgs.builder()
            .jsonSchema("string")
            .build())
        .providerConfig(FeatureEngineeringKafkaConfigProviderConfigArgs.builder()
            .workspaceId("string")
            .build())
        .valueSchema(FeatureEngineeringKafkaConfigValueSchemaArgs.builder()
            .jsonSchema("string")
            .build())
        .build());
    
    feature_engineering_kafka_config_resource = databricks.FeatureEngineeringKafkaConfig("featureEngineeringKafkaConfigResource",
        auth_config={
            "uc_service_credential_name": "string",
        },
        bootstrap_servers="string",
        subscription_mode={
            "assign": "string",
            "subscribe": "string",
            "subscribe_pattern": "string",
        },
        backfill_source={
            "delta_table_source": {
                "full_name": "string",
                "dataframe_schema": "string",
                "entity_columns": ["string"],
                "filter_condition": "string",
                "timeseries_column": "string",
                "transformation_sql": "string",
            },
        },
        extra_options={
            "string": "string",
        },
        key_schema={
            "json_schema": "string",
        },
        provider_config={
            "workspace_id": "string",
        },
        value_schema={
            "json_schema": "string",
        })
    
    const featureEngineeringKafkaConfigResource = new databricks.FeatureEngineeringKafkaConfig("featureEngineeringKafkaConfigResource", {
        authConfig: {
            ucServiceCredentialName: "string",
        },
        bootstrapServers: "string",
        subscriptionMode: {
            assign: "string",
            subscribe: "string",
            subscribePattern: "string",
        },
        backfillSource: {
            deltaTableSource: {
                fullName: "string",
                dataframeSchema: "string",
                entityColumns: ["string"],
                filterCondition: "string",
                timeseriesColumn: "string",
                transformationSql: "string",
            },
        },
        extraOptions: {
            string: "string",
        },
        keySchema: {
            jsonSchema: "string",
        },
        providerConfig: {
            workspaceId: "string",
        },
        valueSchema: {
            jsonSchema: "string",
        },
    });
    
    type: databricks:FeatureEngineeringKafkaConfig
    properties:
        authConfig:
            ucServiceCredentialName: string
        backfillSource:
            deltaTableSource:
                dataframeSchema: string
                entityColumns:
                    - string
                filterCondition: string
                fullName: string
                timeseriesColumn: string
                transformationSql: string
        bootstrapServers: string
        extraOptions:
            string: string
        keySchema:
            jsonSchema: string
        providerConfig:
            workspaceId: string
        subscriptionMode:
            assign: string
            subscribe: string
            subscribePattern: string
        valueSchema:
            jsonSchema: string
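For contrast with the all-placeholder reference above, a hypothetical configuration in the same YAML form might look like the following. Every name, broker address, topic, and option value here is invented for illustration, and the struct-JSON shape of `jsonSchema` is an assumption, not a documented contract:

```yaml
# Hypothetical values for illustration only; not a documented configuration.
resources:
  ordersKafkaConfig:
    type: databricks:FeatureEngineeringKafkaConfig
    properties:
      bootstrapServers: broker-1.example.com:9092,broker-2.example.com:9092
      authConfig:
        ucServiceCredentialName: my-kafka-credential
      subscriptionMode:
        subscribe: orders
      valueSchema:
        jsonSchema: '{"type":"struct","fields":[{"name":"order_id","type":"string"},{"name":"amount","type":"double"}]}'
      extraOptions:
        # kafka.* keys are passed through as Kafka consumer options;
        # other keys are treated as source options.
        kafka.security.protocol: SASL_SSL
        startingOffsets: latest
```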
    

    FeatureEngineeringKafkaConfig Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The FeatureEngineeringKafkaConfig resource accepts the following input properties:

    AuthConfig FeatureEngineeringKafkaConfigAuthConfig
    Authentication configuration for connection to topics
    BootstrapServers string
    A comma-separated list of host/port pairs pointing to Kafka cluster
    SubscriptionMode FeatureEngineeringKafkaConfigSubscriptionMode
    Options to configure which Kafka topics to pull data from
    BackfillSource FeatureEngineeringKafkaConfigBackfillSource
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    ExtraOptions Dictionary<string, string>
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    KeySchema FeatureEngineeringKafkaConfigKeySchema
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    ProviderConfig FeatureEngineeringKafkaConfigProviderConfig
    Configure the provider for management through account provider.
    ValueSchema FeatureEngineeringKafkaConfigValueSchema
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    AuthConfig FeatureEngineeringKafkaConfigAuthConfigArgs
    Authentication configuration for connection to topics
    BootstrapServers string
    A comma-separated list of host/port pairs pointing to Kafka cluster
    SubscriptionMode FeatureEngineeringKafkaConfigSubscriptionModeArgs
    Options to configure which Kafka topics to pull data from
    BackfillSource FeatureEngineeringKafkaConfigBackfillSourceArgs
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    ExtraOptions map[string]string
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    KeySchema FeatureEngineeringKafkaConfigKeySchemaArgs
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    ProviderConfig FeatureEngineeringKafkaConfigProviderConfigArgs
    Configure the provider for management through account provider.
    ValueSchema FeatureEngineeringKafkaConfigValueSchemaArgs
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    authConfig FeatureEngineeringKafkaConfigAuthConfig
    Authentication configuration for connection to topics
    bootstrapServers String
    A comma-separated list of host/port pairs pointing to Kafka cluster
    subscriptionMode FeatureEngineeringKafkaConfigSubscriptionMode
    Options to configure which Kafka topics to pull data from
    backfillSource FeatureEngineeringKafkaConfigBackfillSource
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    extraOptions Map<String,String>
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    keySchema FeatureEngineeringKafkaConfigKeySchema
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    providerConfig FeatureEngineeringKafkaConfigProviderConfig
    Configure the provider for management through account provider.
    valueSchema FeatureEngineeringKafkaConfigValueSchema
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    authConfig FeatureEngineeringKafkaConfigAuthConfig
    Authentication configuration for connection to topics
    bootstrapServers string
    A comma-separated list of host/port pairs pointing to Kafka cluster
    subscriptionMode FeatureEngineeringKafkaConfigSubscriptionMode
    Options to configure which Kafka topics to pull data from
    backfillSource FeatureEngineeringKafkaConfigBackfillSource
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    extraOptions {[key: string]: string}
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    keySchema FeatureEngineeringKafkaConfigKeySchema
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    providerConfig FeatureEngineeringKafkaConfigProviderConfig
    Configure the provider for management through account provider.
    valueSchema FeatureEngineeringKafkaConfigValueSchema
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    auth_config FeatureEngineeringKafkaConfigAuthConfigArgs
    Authentication configuration for connection to topics
    bootstrap_servers str
    A comma-separated list of host/port pairs pointing to Kafka cluster
    subscription_mode FeatureEngineeringKafkaConfigSubscriptionModeArgs
    Options to configure which Kafka topics to pull data from
    backfill_source FeatureEngineeringKafkaConfigBackfillSourceArgs
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    extra_options Mapping[str, str]
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    key_schema FeatureEngineeringKafkaConfigKeySchemaArgs
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    provider_config FeatureEngineeringKafkaConfigProviderConfigArgs
    Configure the provider for management through account provider.
    value_schema FeatureEngineeringKafkaConfigValueSchemaArgs
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    authConfig Property Map
    Authentication configuration for connection to topics
    bootstrapServers String
    A comma-separated list of host/port pairs pointing to Kafka cluster
    subscriptionMode Property Map
    Options to configure which Kafka topics to pull data from
    backfillSource Property Map
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    extraOptions Map<String>
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    keySchema Property Map
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    providerConfig Property Map
    Configure the provider for management through account provider.
    valueSchema Property Map
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided

    Outputs

    All input properties are implicitly available as output properties. Additionally, the FeatureEngineeringKafkaConfig resource produces the following output properties:

    Id string
    The provider-assigned unique ID for this managed resource.
    Name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    Id string
    The provider-assigned unique ID for this managed resource.
    Name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    id String
    The provider-assigned unique ID for this managed resource.
    name String
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    id string
    The provider-assigned unique ID for this managed resource.
    name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    id str
    The provider-assigned unique ID for this managed resource.
    name str
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    id String
    The provider-assigned unique ID for this managed resource.
    name String
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name

    Look up Existing FeatureEngineeringKafkaConfig Resource

    Get an existing FeatureEngineeringKafkaConfig resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

    public static get(name: string, id: Input<ID>, state?: FeatureEngineeringKafkaConfigState, opts?: CustomResourceOptions): FeatureEngineeringKafkaConfig
    @staticmethod
    def get(resource_name: str,
            id: str,
            opts: Optional[ResourceOptions] = None,
            auth_config: Optional[FeatureEngineeringKafkaConfigAuthConfigArgs] = None,
            backfill_source: Optional[FeatureEngineeringKafkaConfigBackfillSourceArgs] = None,
            bootstrap_servers: Optional[str] = None,
            extra_options: Optional[Mapping[str, str]] = None,
            key_schema: Optional[FeatureEngineeringKafkaConfigKeySchemaArgs] = None,
            name: Optional[str] = None,
            provider_config: Optional[FeatureEngineeringKafkaConfigProviderConfigArgs] = None,
            subscription_mode: Optional[FeatureEngineeringKafkaConfigSubscriptionModeArgs] = None,
            value_schema: Optional[FeatureEngineeringKafkaConfigValueSchemaArgs] = None) -> FeatureEngineeringKafkaConfig
    func GetFeatureEngineeringKafkaConfig(ctx *Context, name string, id IDInput, state *FeatureEngineeringKafkaConfigState, opts ...ResourceOption) (*FeatureEngineeringKafkaConfig, error)
    public static FeatureEngineeringKafkaConfig Get(string name, Input<string> id, FeatureEngineeringKafkaConfigState? state, CustomResourceOptions? opts = null)
    public static FeatureEngineeringKafkaConfig get(String name, Output<String> id, FeatureEngineeringKafkaConfigState state, CustomResourceOptions options)
resources:
  _:
    type: databricks:FeatureEngineeringKafkaConfig
    get:
      id: ${id}
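With a concrete ID filled in, the YAML lookup above might read as follows. Both the resource name and the ID value are hypothetical; real IDs come from the provider:

```yaml
# Hypothetical ID for illustration; substitute the real provider-assigned ID.
resources:
  existingKafkaConfig:
    type: databricks:FeatureEngineeringKafkaConfig
    get:
      id: orders-kafka-config
```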
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    resource_name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    The following state arguments are supported:
    AuthConfig FeatureEngineeringKafkaConfigAuthConfig
    Authentication configuration for connection to topics
    BackfillSource FeatureEngineeringKafkaConfigBackfillSource
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    BootstrapServers string
    A comma-separated list of host/port pairs pointing to Kafka cluster
    ExtraOptions Dictionary<string, string>
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    KeySchema FeatureEngineeringKafkaConfigKeySchema
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    Name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    ProviderConfig FeatureEngineeringKafkaConfigProviderConfig
    Configure the provider for management through account provider.
    SubscriptionMode FeatureEngineeringKafkaConfigSubscriptionMode
    Options to configure which Kafka topics to pull data from
    ValueSchema FeatureEngineeringKafkaConfigValueSchema
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    AuthConfig FeatureEngineeringKafkaConfigAuthConfigArgs
    Authentication configuration for connection to topics
    BackfillSource FeatureEngineeringKafkaConfigBackfillSourceArgs
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    BootstrapServers string
    A comma-separated list of host/port pairs pointing to Kafka cluster
    ExtraOptions map[string]string
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    KeySchema FeatureEngineeringKafkaConfigKeySchemaArgs
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    Name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    ProviderConfig FeatureEngineeringKafkaConfigProviderConfigArgs
    Configure the provider for management through account provider.
    SubscriptionMode FeatureEngineeringKafkaConfigSubscriptionModeArgs
    Options to configure which Kafka topics to pull data from
    ValueSchema FeatureEngineeringKafkaConfigValueSchemaArgs
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    authConfig FeatureEngineeringKafkaConfigAuthConfig
    Authentication configuration for connection to topics
    backfillSource FeatureEngineeringKafkaConfigBackfillSource
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    bootstrapServers String
    A comma-separated list of host/port pairs pointing to Kafka cluster
    extraOptions Map<String,String>
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    keySchema FeatureEngineeringKafkaConfigKeySchema
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    name String
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    providerConfig FeatureEngineeringKafkaConfigProviderConfig
    Configure the provider for management through account provider.
    subscriptionMode FeatureEngineeringKafkaConfigSubscriptionMode
    Options to configure which Kafka topics to pull data from
    valueSchema FeatureEngineeringKafkaConfigValueSchema
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    authConfig FeatureEngineeringKafkaConfigAuthConfig
    Authentication configuration for connection to topics
    backfillSource FeatureEngineeringKafkaConfigBackfillSource
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    bootstrapServers string
    A comma-separated list of host/port pairs pointing to Kafka cluster
    extraOptions {[key: string]: string}
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    keySchema FeatureEngineeringKafkaConfigKeySchema
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    providerConfig FeatureEngineeringKafkaConfigProviderConfig
    Configure the provider for management through account provider.
    subscriptionMode FeatureEngineeringKafkaConfigSubscriptionMode
    Options to configure which Kafka topics to pull data from
    valueSchema FeatureEngineeringKafkaConfigValueSchema
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    auth_config FeatureEngineeringKafkaConfigAuthConfigArgs
    Authentication configuration for connection to topics
    backfill_source FeatureEngineeringKafkaConfigBackfillSourceArgs
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    bootstrap_servers str
    A comma-separated list of host/port pairs pointing to Kafka cluster
    extra_options Mapping[str, str]
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    key_schema FeatureEngineeringKafkaConfigKeySchemaArgs
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    name str
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    provider_config FeatureEngineeringKafkaConfigProviderConfigArgs
    Configure the provider for management through account provider.
    subscription_mode FeatureEngineeringKafkaConfigSubscriptionModeArgs
    Options to configure which Kafka topics to pull data from
    value_schema FeatureEngineeringKafkaConfigValueSchemaArgs
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
    authConfig Property Map
    Authentication configuration for connection to topics
    backfillSource Property Map
    A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config
    bootstrapServers String
    A comma-separated list of host/port pairs pointing to Kafka cluster
    extraOptions Map<String>
    Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*)
    keySchema Property Map
    Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided
    name String
    (string) - Name that uniquely identifies this Kafka config within the metastore. This will be the identifier used from the Feature object to reference these configs for a feature. Can be distinct from topic name
    providerConfig Property Map
    Configure the provider for management through account provider.
    subscriptionMode Property Map
    Options to configure which Kafka topics to pull data from
    valueSchema Property Map
    Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided
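The string-valued arguments above have specific expected shapes. A minimal sketch in plain Python of how those values might be assembled (host and topic names are made up; this does not call `pulumi_databricks` itself):

```python
# Comma-separated host/port pairs pointing to the Kafka cluster
# (hypothetical broker hostnames).
bootstrap_servers = ",".join([
    "broker-1.example.com:9092",
    "broker-2.example.com:9092",
])

# extra_options keys are either source options or Kafka consumer
# options prefixed with "kafka." (per the description above).
extra_options = {
    "kafka.security.protocol": "SASL_SSL",
    "startingOffsets": "earliest",
}

# A comma-separated topic list, as used by subscription_mode's
# `subscribe` field.
subscribe = ",".join(["topicA", "topicB", "topicC"])

print(bootstrap_servers)
print(subscribe)
```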

    Supporting Types

    FeatureEngineeringKafkaConfigAuthConfig, FeatureEngineeringKafkaConfigAuthConfigArgs

    UcServiceCredentialName string
    Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential
    UcServiceCredentialName string
    Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential
    ucServiceCredentialName String
    Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential
    ucServiceCredentialName string
    Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential
    uc_service_credential_name str
    Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential
    ucServiceCredentialName String
    Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential

    FeatureEngineeringKafkaConfigBackfillSource, FeatureEngineeringKafkaConfigBackfillSourceArgs

    DeltaTableSource FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSource
    The Delta table source containing the historic data to backfill. Only the delta table name is used for backfill, the entity columns and timeseries column are ignored as they are defined by the associated KafkaSource
    DeltaTableSource FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSource
    The Delta table source containing the historic data to backfill. Only the delta table name is used for backfill, the entity columns and timeseries column are ignored as they are defined by the associated KafkaSource
    deltaTableSource FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSource
    The Delta table source containing the historic data to backfill. Only the delta table name is used for backfill, the entity columns and timeseries column are ignored as they are defined by the associated KafkaSource
    deltaTableSource FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSource
    The Delta table source containing the historic data to backfill. Only the delta table name is used for backfill, the entity columns and timeseries column are ignored as they are defined by the associated KafkaSource
    delta_table_source FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSource
    The Delta table source containing the historic data to backfill. Only the delta table name is used for backfill, the entity columns and timeseries column are ignored as they are defined by the associated KafkaSource
    deltaTableSource Property Map
    The Delta table source containing the historic data to backfill. Only the delta table name is used for backfill, the entity columns and timeseries column are ignored as they are defined by the associated KafkaSource

    FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSource, FeatureEngineeringKafkaConfigBackfillSourceDeltaTableSourceArgs

    FullName string
    The full three-part (catalog, schema, table) name of the Delta table
    DataframeSchema string
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    EntityColumns List<string>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    FilterCondition string
    Single WHERE clause to filter delta table before applying transformations. Will be row-wise evaluated, so should only include conditionals and projections
    TimeseriesColumn string
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column of the Delta table
    TransformationSql string
    A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
    FullName string
    The full three-part (catalog, schema, table) name of the Delta table
    DataframeSchema string
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    EntityColumns []string
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    FilterCondition string
    Single WHERE clause to filter delta table before applying transformations. Will be row-wise evaluated, so should only include conditionals and projections
    TimeseriesColumn string
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column of the Delta table
    TransformationSql string
    A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
    fullName String
    The full three-part (catalog, schema, table) name of the Delta table
    dataframeSchema String
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entityColumns List<String>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filterCondition String
    Single WHERE clause to filter delta table before applying transformations. Will be row-wise evaluated, so should only include conditionals and projections
    timeseriesColumn String
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column of the Delta table
    transformationSql String
    A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
    fullName string
    The full three-part (catalog, schema, table) name of the Delta table
    dataframeSchema string
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entityColumns string[]
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filterCondition string
    Single WHERE clause to filter delta table before applying transformations. Will be row-wise evaluated, so should only include conditionals and projections
    timeseriesColumn string
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column of the Delta table
    transformationSql string
    A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
    full_name str
    The full three-part (catalog, schema, table) name of the Delta table
    dataframe_schema str
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entity_columns Sequence[str]
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filter_condition str
    Single WHERE clause to filter delta table before applying transformations. Will be row-wise evaluated, so should only include conditionals and projections
    timeseries_column str
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column of the Delta table
    transformation_sql str
    A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
    fullName String
    The full three-part (catalog, schema, table) name of the Delta table
    dataframeSchema String
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entityColumns List<String>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filterCondition String
    Single WHERE clause to filter delta table before applying transformations. Will be row-wise evaluated, so should only include conditionals and projections
    timeseriesColumn String
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column of the Delta table
    transformationSql String
    A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
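The dataframe_schema value is ordinary Spark StructType JSON (the format produced by `df.schema.json()`), so it can be built with `json.dumps` rather than typed by hand. A sketch using the column names from the example above:

```python
import json

# Spark StructType JSON, matching the format of df.schema.json().
dataframe_schema = json.dumps({
    "type": "struct",
    "fields": [
        {"name": "col_a", "type": "integer", "nullable": True, "metadata": {}},
        {"name": "col_c", "type": "integer", "nullable": True, "metadata": {}},
    ],
})

# transformation_sql is only the SELECT expression list,
# not a full query.
transformation_sql = "*, col_a + col_b AS col_c"

print(dataframe_schema)
```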

    FeatureEngineeringKafkaConfigKeySchema, FeatureEngineeringKafkaConfigKeySchemaArgs

    JsonSchema string
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    JsonSchema string
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    jsonSchema String
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    jsonSchema string
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    json_schema str
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    jsonSchema String
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
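The json_schema string is a standard JSON Schema document. A minimal sketch for a message key with a single required string field (the field name `user_id` is illustrative, not prescribed by the docs):

```python
import json

# A minimal JSON Schema (https://json-schema.org/) for a message key.
key_schema = json.dumps({
    "type": "object",
    "properties": {
        "user_id": {"type": "string"},
    },
    "required": ["user_id"],
})

print(key_schema)
```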

    FeatureEngineeringKafkaConfigProviderConfig, FeatureEngineeringKafkaConfigProviderConfigArgs

    WorkspaceId string
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    WorkspaceId string
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspaceId String
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspaceId string
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspace_id str
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspaceId String
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.

    FeatureEngineeringKafkaConfigSubscriptionMode, FeatureEngineeringKafkaConfigSubscriptionModeArgs

    Assign string
    A JSON string that contains the specific topic-partitions to consume from. For example, for '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0'th and 1st partitions will be consumed from
    Subscribe string
    A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'
    SubscribePattern string
    A regular expression matching topics to subscribe to. For example, 'topic.*' will subscribe to all topics starting with 'topic'
    Assign string
    A JSON string that contains the specific topic-partitions to consume from. For example, for '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0'th and 1st partitions will be consumed from
    Subscribe string
    A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'
    SubscribePattern string
    A regular expression matching topics to subscribe to. For example, 'topic.*' will subscribe to all topics starting with 'topic'
    assign String
    A JSON string that contains the specific topic-partitions to consume from. For example, for '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0'th and 1st partitions will be consumed from
    subscribe String
    A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'
    subscribePattern String
    A regular expression matching topics to subscribe to. For example, 'topic.*' will subscribe to all topics starting with 'topic'
    assign string
    A JSON string that contains the specific topic-partitions to consume from. For example, for '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0'th and 1st partitions will be consumed from
    subscribe string
    A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'
    subscribePattern string
    A regular expression matching topics to subscribe to. For example, 'topic.*' will subscribe to all topics starting with 'topic'
    assign str
    A JSON string that contains the specific topic-partitions to consume from. For example, for '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0'th and 1st partitions will be consumed from
    subscribe str
    A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'
    subscribe_pattern str
    A regular expression matching topics to subscribe to. For example, 'topic.*' will subscribe to all topics starting with 'topic'
    assign String
    A JSON string that contains the specific topic-partitions to consume from. For example, for '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0'th and 1st partitions will be consumed from
    subscribe String
    A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'
    subscribePattern String
    A regular expression matching topics to subscribe to. For example, 'topic.*' will subscribe to all topics starting with 'topic'
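These three fields mirror the Spark Structured Streaming Kafka source options, where exactly one of assign, subscribe, or subscribePattern is set per query. Since the assign value is itself a JSON string, it is safest to build it with `json.dumps` rather than concatenating by hand:

```python
import json

# Consume partitions 0 and 1 of topicA, and 2 and 4 of topicB.
assign = json.dumps({"topicA": [0, 1], "topicB": [2, 4]})
print(assign)  # {"topicA": [0, 1], "topicB": [2, 4]}
```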

    FeatureEngineeringKafkaConfigValueSchema, FeatureEngineeringKafkaConfigValueSchemaArgs

    JsonSchema string
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    JsonSchema string
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    jsonSchema String
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    jsonSchema string
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    json_schema str
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)
    jsonSchema String
    Schema of the JSON object in standard IETF JSON schema format (https://json-schema.org/)

    Package Details

    Repository
    databricks pulumi/pulumi-databricks
    License
    Apache-2.0
    Notes
    This Pulumi package is based on the databricks Terraform Provider.