Viewing docs for Databricks v1.90.0
published on Thursday, Mar 19, 2026 by Pulumi

    Private Preview

    Create FeatureEngineeringFeature Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    TypeScript

    new FeatureEngineeringFeature(name: string, args: FeatureEngineeringFeatureArgs, opts?: CustomResourceOptions);
    Python

    @overload
    def FeatureEngineeringFeature(resource_name: str,
                                  args: FeatureEngineeringFeatureArgs,
                                  opts: Optional[ResourceOptions] = None)
    
    @overload
    def FeatureEngineeringFeature(resource_name: str,
                                  opts: Optional[ResourceOptions] = None,
                                  full_name: Optional[str] = None,
                                  function: Optional[FeatureEngineeringFeatureFunctionArgs] = None,
                                  source: Optional[FeatureEngineeringFeatureSourceArgs] = None,
                                  description: Optional[str] = None,
                                  entities: Optional[Sequence[FeatureEngineeringFeatureEntityArgs]] = None,
                                  filter_condition: Optional[str] = None,
                                  inputs: Optional[Sequence[str]] = None,
                                  lineage_context: Optional[FeatureEngineeringFeatureLineageContextArgs] = None,
                                  provider_config: Optional[FeatureEngineeringFeatureProviderConfigArgs] = None,
                                  time_window: Optional[FeatureEngineeringFeatureTimeWindowArgs] = None,
                                  timeseries_column: Optional[FeatureEngineeringFeatureTimeseriesColumnArgs] = None)
    Go

    func NewFeatureEngineeringFeature(ctx *Context, name string, args FeatureEngineeringFeatureArgs, opts ...ResourceOption) (*FeatureEngineeringFeature, error)
    C#

    public FeatureEngineeringFeature(string name, FeatureEngineeringFeatureArgs args, CustomResourceOptions? opts = null)
    Java

    public FeatureEngineeringFeature(String name, FeatureEngineeringFeatureArgs args)
    public FeatureEngineeringFeature(String name, FeatureEngineeringFeatureArgs args, CustomResourceOptions options)
    
    YAML

    type: databricks:FeatureEngineeringFeature
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    TypeScript

    name string
    The unique name of the resource.
    args FeatureEngineeringFeatureArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.

    Python

    resource_name str
    The unique name of the resource.
    args FeatureEngineeringFeatureArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.

    Go

    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args FeatureEngineeringFeatureArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.

    C#

    name string
    The unique name of the resource.
    args FeatureEngineeringFeatureArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.

    Java

    name String
    The unique name of the resource.
    args FeatureEngineeringFeatureArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    Constructor example

    The following reference example uses placeholder values for all input properties.

    C#

    var featureEngineeringFeatureResource = new Databricks.FeatureEngineeringFeature("featureEngineeringFeatureResource", new()
    {
        FullName = "string",
        Function = new Databricks.Inputs.FeatureEngineeringFeatureFunctionArgs
        {
            AggregationFunction = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionArgs
            {
                ApproxCountDistinct = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs
                {
                    Input = "string",
                    RelativeSd = 0,
                },
                ApproxPercentile = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs
                {
                    Input = "string",
                    Percentile = 0,
                    Accuracy = 0,
                },
                Avg = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs
                {
                    Input = "string",
                },
                CountFunction = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs
                {
                    Input = "string",
                },
                First = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs
                {
                    Input = "string",
                },
                Last = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs
                {
                    Input = "string",
                },
                Max = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs
                {
                    Input = "string",
                },
                Min = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs
                {
                    Input = "string",
                },
                StddevPop = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs
                {
                    Input = "string",
                },
                StddevSamp = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs
                {
                    Input = "string",
                },
                Sum = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs
                {
                    Input = "string",
                },
                TimeWindow = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs
                {
                    Continuous = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs
                    {
                        WindowDuration = "string",
                        Offset = "string",
                    },
                    Sliding = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs
                    {
                        SlideDuration = "string",
                        WindowDuration = "string",
                    },
                    Tumbling = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs
                    {
                        WindowDuration = "string",
                    },
                },
                VarPop = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs
                {
                    Input = "string",
                },
                VarSamp = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs
                {
                    Input = "string",
                },
            },
            ExtraParameters = new[]
            {
                new Databricks.Inputs.FeatureEngineeringFeatureFunctionExtraParameterArgs
                {
                    Key = "string",
                    Value = "string",
                },
            },
            FunctionType = "string",
        },
        Source = new Databricks.Inputs.FeatureEngineeringFeatureSourceArgs
        {
            DeltaTableSource = new Databricks.Inputs.FeatureEngineeringFeatureSourceDeltaTableSourceArgs
            {
                FullName = "string",
                DataframeSchema = "string",
                EntityColumns = new[]
                {
                    "string",
                },
                FilterCondition = "string",
                TimeseriesColumn = "string",
                TransformationSql = "string",
            },
            KafkaSource = new Databricks.Inputs.FeatureEngineeringFeatureSourceKafkaSourceArgs
            {
                Name = "string",
                EntityColumnIdentifiers = new[]
                {
                    new Databricks.Inputs.FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs
                    {
                        VariantExprPath = "string",
                    },
                },
                FilterCondition = "string",
                TimeseriesColumnIdentifier = new Databricks.Inputs.FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs
                {
                    VariantExprPath = "string",
                },
            },
        },
        Description = "string",
        Entities = new[]
        {
            new Databricks.Inputs.FeatureEngineeringFeatureEntityArgs
            {
                Name = "string",
            },
        },
        FilterCondition = "string",
        Inputs = new[]
        {
            "string",
        },
        LineageContext = new Databricks.Inputs.FeatureEngineeringFeatureLineageContextArgs
        {
            JobContext = new Databricks.Inputs.FeatureEngineeringFeatureLineageContextJobContextArgs
            {
                JobId = 0,
                JobRunId = 0,
            },
            NotebookId = 0,
        },
        ProviderConfig = new Databricks.Inputs.FeatureEngineeringFeatureProviderConfigArgs
        {
            WorkspaceId = "string",
        },
        TimeWindow = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowArgs
        {
            Continuous = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowContinuousArgs
            {
                WindowDuration = "string",
                Offset = "string",
            },
            Sliding = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowSlidingArgs
            {
                SlideDuration = "string",
                WindowDuration = "string",
            },
            Tumbling = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowTumblingArgs
            {
                WindowDuration = "string",
            },
        },
        TimeseriesColumn = new Databricks.Inputs.FeatureEngineeringFeatureTimeseriesColumnArgs
        {
            Name = "string",
        },
    });
    
    Go

    example, err := databricks.NewFeatureEngineeringFeature(ctx, "featureEngineeringFeatureResource", &databricks.FeatureEngineeringFeatureArgs{
    	FullName: pulumi.String("string"),
    	Function: &databricks.FeatureEngineeringFeatureFunctionArgs{
    		AggregationFunction: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionArgs{
    			ApproxCountDistinct: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs{
    				Input:      pulumi.String("string"),
    				RelativeSd: pulumi.Float64(0),
    			},
    			ApproxPercentile: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs{
    				Input:      pulumi.String("string"),
    				Percentile: pulumi.Float64(0),
    				Accuracy:   pulumi.Int(0),
    			},
    			Avg: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs{
    				Input: pulumi.String("string"),
    			},
    			CountFunction: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs{
    				Input: pulumi.String("string"),
    			},
    			First: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs{
    				Input: pulumi.String("string"),
    			},
    			Last: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs{
    				Input: pulumi.String("string"),
    			},
    			Max: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs{
    				Input: pulumi.String("string"),
    			},
    			Min: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs{
    				Input: pulumi.String("string"),
    			},
    			StddevPop: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs{
    				Input: pulumi.String("string"),
    			},
    			StddevSamp: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs{
    				Input: pulumi.String("string"),
    			},
    			Sum: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs{
    				Input: pulumi.String("string"),
    			},
    			TimeWindow: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs{
    				Continuous: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs{
    					WindowDuration: pulumi.String("string"),
    					Offset:         pulumi.String("string"),
    				},
    				Sliding: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs{
    					SlideDuration:  pulumi.String("string"),
    					WindowDuration: pulumi.String("string"),
    				},
    				Tumbling: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs{
    					WindowDuration: pulumi.String("string"),
    				},
    			},
    			VarPop: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs{
    				Input: pulumi.String("string"),
    			},
    			VarSamp: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs{
    				Input: pulumi.String("string"),
    			},
    		},
    		ExtraParameters: databricks.FeatureEngineeringFeatureFunctionExtraParameterArray{
    			&databricks.FeatureEngineeringFeatureFunctionExtraParameterArgs{
    				Key:   pulumi.String("string"),
    				Value: pulumi.String("string"),
    			},
    		},
    		FunctionType: pulumi.String("string"),
    	},
    	Source: &databricks.FeatureEngineeringFeatureSourceArgs{
    		DeltaTableSource: &databricks.FeatureEngineeringFeatureSourceDeltaTableSourceArgs{
    			FullName:        pulumi.String("string"),
    			DataframeSchema: pulumi.String("string"),
    			EntityColumns: pulumi.StringArray{
    				pulumi.String("string"),
    			},
    			FilterCondition:   pulumi.String("string"),
    			TimeseriesColumn:  pulumi.String("string"),
    			TransformationSql: pulumi.String("string"),
    		},
    		KafkaSource: &databricks.FeatureEngineeringFeatureSourceKafkaSourceArgs{
    			Name: pulumi.String("string"),
    			EntityColumnIdentifiers: databricks.FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArray{
    				&databricks.FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs{
    					VariantExprPath: pulumi.String("string"),
    				},
    			},
    			FilterCondition: pulumi.String("string"),
    			TimeseriesColumnIdentifier: &databricks.FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs{
    				VariantExprPath: pulumi.String("string"),
    			},
    		},
    	},
    	Description: pulumi.String("string"),
    	Entities: databricks.FeatureEngineeringFeatureEntityArray{
    		&databricks.FeatureEngineeringFeatureEntityArgs{
    			Name: pulumi.String("string"),
    		},
    	},
    	FilterCondition: pulumi.String("string"),
    	Inputs: pulumi.StringArray{
    		pulumi.String("string"),
    	},
    	LineageContext: &databricks.FeatureEngineeringFeatureLineageContextArgs{
    		JobContext: &databricks.FeatureEngineeringFeatureLineageContextJobContextArgs{
    			JobId:    pulumi.Int(0),
    			JobRunId: pulumi.Int(0),
    		},
    		NotebookId: pulumi.Int(0),
    	},
    	ProviderConfig: &databricks.FeatureEngineeringFeatureProviderConfigArgs{
    		WorkspaceId: pulumi.String("string"),
    	},
    	TimeWindow: &databricks.FeatureEngineeringFeatureTimeWindowArgs{
    		Continuous: &databricks.FeatureEngineeringFeatureTimeWindowContinuousArgs{
    			WindowDuration: pulumi.String("string"),
    			Offset:         pulumi.String("string"),
    		},
    		Sliding: &databricks.FeatureEngineeringFeatureTimeWindowSlidingArgs{
    			SlideDuration:  pulumi.String("string"),
    			WindowDuration: pulumi.String("string"),
    		},
    		Tumbling: &databricks.FeatureEngineeringFeatureTimeWindowTumblingArgs{
    			WindowDuration: pulumi.String("string"),
    		},
    	},
    	TimeseriesColumn: &databricks.FeatureEngineeringFeatureTimeseriesColumnArgs{
    		Name: pulumi.String("string"),
    	},
    })
    
    Java

    var featureEngineeringFeatureResource = new FeatureEngineeringFeature("featureEngineeringFeatureResource", FeatureEngineeringFeatureArgs.builder()
        .fullName("string")
        .function(FeatureEngineeringFeatureFunctionArgs.builder()
            .aggregationFunction(FeatureEngineeringFeatureFunctionAggregationFunctionArgs.builder()
                .approxCountDistinct(FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs.builder()
                    .input("string")
                    .relativeSd(0.0)
                    .build())
                .approxPercentile(FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs.builder()
                    .input("string")
                    .percentile(0.0)
                    .accuracy(0)
                    .build())
                .avg(FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs.builder()
                    .input("string")
                    .build())
                .countFunction(FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs.builder()
                    .input("string")
                    .build())
                .first(FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs.builder()
                    .input("string")
                    .build())
                .last(FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs.builder()
                    .input("string")
                    .build())
                .max(FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs.builder()
                    .input("string")
                    .build())
                .min(FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs.builder()
                    .input("string")
                    .build())
                .stddevPop(FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs.builder()
                    .input("string")
                    .build())
                .stddevSamp(FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs.builder()
                    .input("string")
                    .build())
                .sum(FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs.builder()
                    .input("string")
                    .build())
                .timeWindow(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs.builder()
                    .continuous(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs.builder()
                        .windowDuration("string")
                        .offset("string")
                        .build())
                    .sliding(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs.builder()
                        .slideDuration("string")
                        .windowDuration("string")
                        .build())
                    .tumbling(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs.builder()
                        .windowDuration("string")
                        .build())
                    .build())
                .varPop(FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs.builder()
                    .input("string")
                    .build())
                .varSamp(FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs.builder()
                    .input("string")
                    .build())
                .build())
            .extraParameters(FeatureEngineeringFeatureFunctionExtraParameterArgs.builder()
                .key("string")
                .value("string")
                .build())
            .functionType("string")
            .build())
        .source(FeatureEngineeringFeatureSourceArgs.builder()
            .deltaTableSource(FeatureEngineeringFeatureSourceDeltaTableSourceArgs.builder()
                .fullName("string")
                .dataframeSchema("string")
                .entityColumns("string")
                .filterCondition("string")
                .timeseriesColumn("string")
                .transformationSql("string")
                .build())
            .kafkaSource(FeatureEngineeringFeatureSourceKafkaSourceArgs.builder()
                .name("string")
                .entityColumnIdentifiers(FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs.builder()
                    .variantExprPath("string")
                    .build())
                .filterCondition("string")
                .timeseriesColumnIdentifier(FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs.builder()
                    .variantExprPath("string")
                    .build())
                .build())
            .build())
        .description("string")
        .entities(FeatureEngineeringFeatureEntityArgs.builder()
            .name("string")
            .build())
        .filterCondition("string")
        .inputs("string")
        .lineageContext(FeatureEngineeringFeatureLineageContextArgs.builder()
            .jobContext(FeatureEngineeringFeatureLineageContextJobContextArgs.builder()
                .jobId(0)
                .jobRunId(0)
                .build())
            .notebookId(0)
            .build())
        .providerConfig(FeatureEngineeringFeatureProviderConfigArgs.builder()
            .workspaceId("string")
            .build())
        .timeWindow(FeatureEngineeringFeatureTimeWindowArgs.builder()
            .continuous(FeatureEngineeringFeatureTimeWindowContinuousArgs.builder()
                .windowDuration("string")
                .offset("string")
                .build())
            .sliding(FeatureEngineeringFeatureTimeWindowSlidingArgs.builder()
                .slideDuration("string")
                .windowDuration("string")
                .build())
            .tumbling(FeatureEngineeringFeatureTimeWindowTumblingArgs.builder()
                .windowDuration("string")
                .build())
            .build())
        .timeseriesColumn(FeatureEngineeringFeatureTimeseriesColumnArgs.builder()
            .name("string")
            .build())
        .build());
    
    Python

    feature_engineering_feature_resource = databricks.FeatureEngineeringFeature("featureEngineeringFeatureResource",
        full_name="string",
        function={
            "aggregation_function": {
                "approx_count_distinct": {
                    "input": "string",
                    "relative_sd": 0,
                },
                "approx_percentile": {
                    "input": "string",
                    "percentile": 0,
                    "accuracy": 0,
                },
                "avg": {
                    "input": "string",
                },
                "count_function": {
                    "input": "string",
                },
                "first": {
                    "input": "string",
                },
                "last": {
                    "input": "string",
                },
                "max": {
                    "input": "string",
                },
                "min": {
                    "input": "string",
                },
                "stddev_pop": {
                    "input": "string",
                },
                "stddev_samp": {
                    "input": "string",
                },
                "sum": {
                    "input": "string",
                },
                "time_window": {
                    "continuous": {
                        "window_duration": "string",
                        "offset": "string",
                    },
                    "sliding": {
                        "slide_duration": "string",
                        "window_duration": "string",
                    },
                    "tumbling": {
                        "window_duration": "string",
                    },
                },
                "var_pop": {
                    "input": "string",
                },
                "var_samp": {
                    "input": "string",
                },
            },
            "extra_parameters": [{
                "key": "string",
                "value": "string",
            }],
            "function_type": "string",
        },
        source={
            "delta_table_source": {
                "full_name": "string",
                "dataframe_schema": "string",
                "entity_columns": ["string"],
                "filter_condition": "string",
                "timeseries_column": "string",
                "transformation_sql": "string",
            },
            "kafka_source": {
                "name": "string",
                "entity_column_identifiers": [{
                    "variant_expr_path": "string",
                }],
                "filter_condition": "string",
                "timeseries_column_identifier": {
                    "variant_expr_path": "string",
                },
            },
        },
        description="string",
        entities=[{
            "name": "string",
        }],
        filter_condition="string",
        inputs=["string"],
        lineage_context={
            "job_context": {
                "job_id": 0,
                "job_run_id": 0,
            },
            "notebook_id": 0,
        },
        provider_config={
            "workspace_id": "string",
        },
        time_window={
            "continuous": {
                "window_duration": "string",
                "offset": "string",
            },
            "sliding": {
                "slide_duration": "string",
                "window_duration": "string",
            },
            "tumbling": {
                "window_duration": "string",
            },
        },
        timeseries_column={
            "name": "string",
        })
    
    TypeScript

    const featureEngineeringFeatureResource = new databricks.FeatureEngineeringFeature("featureEngineeringFeatureResource", {
        fullName: "string",
        "function": {
            aggregationFunction: {
                approxCountDistinct: {
                    input: "string",
                    relativeSd: 0,
                },
                approxPercentile: {
                    input: "string",
                    percentile: 0,
                    accuracy: 0,
                },
                avg: {
                    input: "string",
                },
                countFunction: {
                    input: "string",
                },
                first: {
                    input: "string",
                },
                last: {
                    input: "string",
                },
                max: {
                    input: "string",
                },
                min: {
                    input: "string",
                },
                stddevPop: {
                    input: "string",
                },
                stddevSamp: {
                    input: "string",
                },
                sum: {
                    input: "string",
                },
                timeWindow: {
                    continuous: {
                        windowDuration: "string",
                        offset: "string",
                    },
                    sliding: {
                        slideDuration: "string",
                        windowDuration: "string",
                    },
                    tumbling: {
                        windowDuration: "string",
                    },
                },
                varPop: {
                    input: "string",
                },
                varSamp: {
                    input: "string",
                },
            },
            extraParameters: [{
                key: "string",
                value: "string",
            }],
            functionType: "string",
        },
        source: {
            deltaTableSource: {
                fullName: "string",
                dataframeSchema: "string",
                entityColumns: ["string"],
                filterCondition: "string",
                timeseriesColumn: "string",
                transformationSql: "string",
            },
            kafkaSource: {
                name: "string",
                entityColumnIdentifiers: [{
                    variantExprPath: "string",
                }],
                filterCondition: "string",
                timeseriesColumnIdentifier: {
                    variantExprPath: "string",
                },
            },
        },
        description: "string",
        entities: [{
            name: "string",
        }],
        filterCondition: "string",
        inputs: ["string"],
        lineageContext: {
            jobContext: {
                jobId: 0,
                jobRunId: 0,
            },
            notebookId: 0,
        },
        providerConfig: {
            workspaceId: "string",
        },
        timeWindow: {
            continuous: {
                windowDuration: "string",
                offset: "string",
            },
            sliding: {
                slideDuration: "string",
                windowDuration: "string",
            },
            tumbling: {
                windowDuration: "string",
            },
        },
        timeseriesColumn: {
            name: "string",
        },
    });
    
    YAML

    type: databricks:FeatureEngineeringFeature
    properties:
        description: string
        entities:
            - name: string
        filterCondition: string
        fullName: string
        function:
            aggregationFunction:
                approxCountDistinct:
                    input: string
                    relativeSd: 0
                approxPercentile:
                    accuracy: 0
                    input: string
                    percentile: 0
                avg:
                    input: string
                countFunction:
                    input: string
                first:
                    input: string
                last:
                    input: string
                max:
                    input: string
                min:
                    input: string
                stddevPop:
                    input: string
                stddevSamp:
                    input: string
                sum:
                    input: string
                timeWindow:
                    continuous:
                        offset: string
                        windowDuration: string
                    sliding:
                        slideDuration: string
                        windowDuration: string
                    tumbling:
                        windowDuration: string
                varPop:
                    input: string
                varSamp:
                    input: string
            extraParameters:
                - key: string
                  value: string
            functionType: string
        inputs:
            - string
        lineageContext:
            jobContext:
                jobId: 0
                jobRunId: 0
            notebookId: 0
        providerConfig:
            workspaceId: string
        source:
            deltaTableSource:
                dataframeSchema: string
                entityColumns:
                    - string
                filterCondition: string
                fullName: string
                timeseriesColumn: string
                transformationSql: string
            kafkaSource:
                entityColumnIdentifiers:
                    - variantExprPath: string
                filterCondition: string
                name: string
                timeseriesColumnIdentifier:
                    variantExprPath: string
        timeWindow:
            continuous:
                offset: string
                windowDuration: string
            sliding:
                slideDuration: string
                windowDuration: string
            tumbling:
                windowDuration: string
        timeseriesColumn:
            name: string
    

    FeatureEngineeringFeature Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The FeatureEngineeringFeature resource accepts the following input properties:

    FullName string
    The full three-part name (catalog, schema, name) of the feature
    Function FeatureEngineeringFeatureFunction
    The function by which the feature is computed
    Source FeatureEngineeringFeatureSource
    The data source of the feature
    Description string
    The description of the feature
    Entities List<FeatureEngineeringFeatureEntity>
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    Inputs List<string>
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    LineageContext FeatureEngineeringFeatureLineageContext
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    ProviderConfig FeatureEngineeringFeatureProviderConfig
    Configure the provider for management through the account-level provider.
    TimeWindow FeatureEngineeringFeatureTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
    Column recording time, used for point-in-time joins, backfills, and aggregations
    FullName string
    The full three-part name (catalog, schema, name) of the feature
    Function FeatureEngineeringFeatureFunctionArgs
    The function by which the feature is computed
    Source FeatureEngineeringFeatureSourceArgs
    The data source of the feature
    Description string
    The description of the feature
    Entities []FeatureEngineeringFeatureEntityArgs
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    Inputs []string
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    LineageContext FeatureEngineeringFeatureLineageContextArgs
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    ProviderConfig FeatureEngineeringFeatureProviderConfigArgs
    Configure the provider for management through the account-level provider.
    TimeWindow FeatureEngineeringFeatureTimeWindowArgs
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumnArgs
    Column recording time, used for point-in-time joins, backfills, and aggregations
    fullName String
    The full three-part name (catalog, schema, name) of the feature
    function FeatureEngineeringFeatureFunction
    The function by which the feature is computed
    source FeatureEngineeringFeatureSource
    The data source of the feature
    description String
    The description of the feature
    entities List<FeatureEngineeringFeatureEntity>
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    inputs List<String>
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineageContext FeatureEngineeringFeatureLineageContext
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    providerConfig FeatureEngineeringFeatureProviderConfig
    Configure the provider for management through the account-level provider.
    timeWindow FeatureEngineeringFeatureTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
    Column recording time, used for point-in-time joins, backfills, and aggregations
    fullName string
    The full three-part name (catalog, schema, name) of the feature
    function FeatureEngineeringFeatureFunction
    The function by which the feature is computed
    source FeatureEngineeringFeatureSource
    The data source of the feature
    description string
    The description of the feature
    entities FeatureEngineeringFeatureEntity[]
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    inputs string[]
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineageContext FeatureEngineeringFeatureLineageContext
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    providerConfig FeatureEngineeringFeatureProviderConfig
    Configure the provider for management through the account-level provider.
    timeWindow FeatureEngineeringFeatureTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
    Column recording time, used for point-in-time joins, backfills, and aggregations
    full_name str
    The full three-part name (catalog, schema, name) of the feature
    function FeatureEngineeringFeatureFunctionArgs
    The function by which the feature is computed
    source FeatureEngineeringFeatureSourceArgs
    The data source of the feature
    description str
    The description of the feature
    entities Sequence[FeatureEngineeringFeatureEntityArgs]
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filter_condition str
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    inputs Sequence[str]
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineage_context FeatureEngineeringFeatureLineageContextArgs
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    provider_config FeatureEngineeringFeatureProviderConfigArgs
    Configure the provider for management through the account-level provider.
    time_window FeatureEngineeringFeatureTimeWindowArgs
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseries_column FeatureEngineeringFeatureTimeseriesColumnArgs
    Column recording time, used for point-in-time joins, backfills, and aggregations
    fullName String
    The full three-part name (catalog, schema, name) of the feature
    function Property Map
    The function by which the feature is computed
    source Property Map
    The data source of the feature
    description String
    The description of the feature
    entities List<Property Map>
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    inputs List<String>
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineageContext Property Map
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    providerConfig Property Map
    Configure the provider for management through the account-level provider.
    timeWindow Property Map
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseriesColumn Property Map
    Column recording time, used for point-in-time joins, backfills, and aggregations
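Taken together, the inputs above can be sketched as a Pulumi YAML program. This is a minimal illustration, not a confirmed configuration: the resource name (avgOrderValue), the table and column names (main.features.avg_order_value, main.raw.orders, customer_id, order_ts, order_value), and the "7d" duration string are all hypothetical placeholders whose exact accepted formats are not specified in this reference.

```yaml
resources:
  avgOrderValue:
    type: databricks:FeatureEngineeringFeature
    properties:
      # Full three-part name: catalog.schema.name (hypothetical values)
      fullName: main.features.avg_order_value
      source:
        deltaTableSource:
          fullName: main.raw.orders
          entityColumns:
            - customer_id
          timeseriesColumn: order_ts
      function:
        aggregationFunction:
          avg:
            input: order_value
          timeWindow:
            tumbling:
              windowDuration: 7d   # duration format is an assumption
      entities:
        - name: customer_id
      timeseriesColumn:
        name: order_ts
```

Note that the deprecated top-level filterCondition, inputs, and timeWindow properties are omitted here in favor of their nested replacements, as the deprecation notes above recommend.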

    Outputs

    All input properties are implicitly available as output properties. Additionally, the FeatureEngineeringFeature resource produces the following output properties:

    Id string
    The provider-assigned unique ID for this managed resource.
    Id string
    The provider-assigned unique ID for this managed resource.
    id String
    The provider-assigned unique ID for this managed resource.
    id string
    The provider-assigned unique ID for this managed resource.
    id str
    The provider-assigned unique ID for this managed resource.
    id String
    The provider-assigned unique ID for this managed resource.

    Look up Existing FeatureEngineeringFeature Resource

    Get an existing FeatureEngineeringFeature resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

    public static get(name: string, id: Input<ID>, state?: FeatureEngineeringFeatureState, opts?: CustomResourceOptions): FeatureEngineeringFeature
    @staticmethod
    def get(resource_name: str,
            id: str,
            opts: Optional[ResourceOptions] = None,
            description: Optional[str] = None,
            entities: Optional[Sequence[FeatureEngineeringFeatureEntityArgs]] = None,
            filter_condition: Optional[str] = None,
            full_name: Optional[str] = None,
            function: Optional[FeatureEngineeringFeatureFunctionArgs] = None,
            inputs: Optional[Sequence[str]] = None,
            lineage_context: Optional[FeatureEngineeringFeatureLineageContextArgs] = None,
            provider_config: Optional[FeatureEngineeringFeatureProviderConfigArgs] = None,
            source: Optional[FeatureEngineeringFeatureSourceArgs] = None,
            time_window: Optional[FeatureEngineeringFeatureTimeWindowArgs] = None,
            timeseries_column: Optional[FeatureEngineeringFeatureTimeseriesColumnArgs] = None) -> FeatureEngineeringFeature
    func GetFeatureEngineeringFeature(ctx *Context, name string, id IDInput, state *FeatureEngineeringFeatureState, opts ...ResourceOption) (*FeatureEngineeringFeature, error)
    public static FeatureEngineeringFeature Get(string name, Input<string> id, FeatureEngineeringFeatureState? state, CustomResourceOptions? opts = null)
    public static FeatureEngineeringFeature get(String name, Output<String> id, FeatureEngineeringFeatureState state, CustomResourceOptions options)
    resources:
      _:
        type: databricks:FeatureEngineeringFeature
        get:
          id: ${id}
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    resource_name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    The following state arguments are supported:
    Description string
    The description of the feature
    Entities List<FeatureEngineeringFeatureEntity>
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    FullName string
    The full three-part name (catalog, schema, name) of the feature
    Function FeatureEngineeringFeatureFunction
    The function by which the feature is computed
    Inputs List<string>
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    LineageContext FeatureEngineeringFeatureLineageContext
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    ProviderConfig FeatureEngineeringFeatureProviderConfig
    Configure the provider for management through the account-level provider.
    Source FeatureEngineeringFeatureSource
    The data source of the feature
    TimeWindow FeatureEngineeringFeatureTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
    Column recording time, used for point-in-time joins, backfills, and aggregations
    Description string
    The description of the feature
    Entities []FeatureEngineeringFeatureEntityArgs
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    FullName string
    The full three-part name (catalog, schema, name) of the feature
    Function FeatureEngineeringFeatureFunctionArgs
    The function by which the feature is computed
    Inputs []string
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    LineageContext FeatureEngineeringFeatureLineageContextArgs
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    ProviderConfig FeatureEngineeringFeatureProviderConfigArgs
    Configure the provider for management through the account-level provider.
    Source FeatureEngineeringFeatureSourceArgs
    The data source of the feature
    TimeWindow FeatureEngineeringFeatureTimeWindowArgs
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumnArgs
    Column recording time, used for point-in-time joins, backfills, and aggregations
    description String
    The description of the feature
    entities List<FeatureEngineeringFeatureEntity>
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    fullName String
    The full three-part name (catalog, schema, name) of the feature
    function FeatureEngineeringFeatureFunction
    The function by which the feature is computed
    inputs List<String>
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineageContext FeatureEngineeringFeatureLineageContext
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    providerConfig FeatureEngineeringFeatureProviderConfig
    Configure the provider for management through the account-level provider.
    source FeatureEngineeringFeatureSource
    The data source of the feature
    timeWindow FeatureEngineeringFeatureTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
    Column recording time, used for point-in-time joins, backfills, and aggregations
    description string
    The description of the feature
    entities FeatureEngineeringFeatureEntity[]
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    fullName string
    The full three-part name (catalog, schema, name) of the feature
    function FeatureEngineeringFeatureFunction
    The function by which the feature is computed
    inputs string[]
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineageContext FeatureEngineeringFeatureLineageContext
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    providerConfig FeatureEngineeringFeatureProviderConfig
    Configure the provider for management through the account-level provider.
    source FeatureEngineeringFeatureSource
    The data source of the feature
    timeWindow FeatureEngineeringFeatureTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
    Column recording time, used for point-in-time joins, backfills, and aggregations
    description str
    The description of the feature
    entities Sequence[FeatureEngineeringFeatureEntityArgs]
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filter_condition str
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    full_name str
    The full three-part name (catalog, schema, name) of the feature
    function FeatureEngineeringFeatureFunctionArgs
    The function by which the feature is computed
    inputs Sequence[str]
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineage_context FeatureEngineeringFeatureLineageContextArgs
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    provider_config FeatureEngineeringFeatureProviderConfigArgs
    Configure the provider for management through the account-level provider.
    source FeatureEngineeringFeatureSourceArgs
    The data source of the feature
    time_window FeatureEngineeringFeatureTimeWindowArgs
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseries_column FeatureEngineeringFeatureTimeseriesColumnArgs
    Column recording time, used for point-in-time joins, backfills, and aggregations
    description String
    The description of the feature
    entities List<Property Map>
    The entity columns for the feature, used as aggregation keys and for query-time lookup
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    fullName String
    The full three-part name (catalog, schema, name) of the feature
    function Property Map
    The function by which the feature is computed
    inputs List<String>
    Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
    lineageContext Property Map
    Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Do not set this field manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users.
    providerConfig Property Map
    Configure the provider for management through the account-level provider.
    source Property Map
    The data source of the feature
    timeWindow Property Map
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    timeseriesColumn Property Map
    Column recording time, used for point-in-time joins, backfills, and aggregations
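The deprecation notes above describe a mapping from three legacy top-level fields to nested replacements: filter_condition moves into the source (DeltaTableSource.filter_condition or KafkaSource.filter_condition), inputs moves into AggregationFunction, and time_window moves under Function.aggregation_function. A minimal sketch of that mapping using plain dictionaries; the field names mirror the schema, but the concrete values and the migrate helper itself are hypothetical, not part of the provider.

```python
# Hypothetical legacy feature spec using the three deprecated top-level fields.
legacy = {
    "full_name": "main.features.avg_order_value",
    "filter_condition": "status = 'COMPLETE'",                # deprecated
    "inputs": ["order_value"],                                # deprecated
    "time_window": {"tumbling": {"window_duration": "7d"}},   # deprecated
}

def migrate(feature: dict) -> dict:
    """Move the deprecated top-level fields into the locations the
    deprecation notes recommend (sketch only)."""
    out = {"full_name": feature["full_name"]}
    # filter_condition -> source.delta_table_source.filter_condition
    out["source"] = {"delta_table_source": {
        "filter_condition": feature["filter_condition"],
    }}
    # inputs and time_window -> function.aggregation_function
    out["function"] = {"aggregation_function": {
        "inputs": feature["inputs"],
        "time_window": feature["time_window"],
    }}
    return out

migrated = migrate(legacy)
```

This is only an illustration of where each deprecated field's replacement lives; an actual migration would also pick the aggregation function and source variant appropriate to the feature.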

    Supporting Types

    FeatureEngineeringFeatureEntity, FeatureEngineeringFeatureEntityArgs

    Name string
    Name string
    name String
    name string
    name str
    name String

    FeatureEngineeringFeatureFunction, FeatureEngineeringFeatureFunctionArgs

    AggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
    An aggregation function applied over a time window
    ExtraParameters List<FeatureEngineeringFeatureFunctionExtraParameter>
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
    FunctionType string
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. The type of the function. Possible values are: APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
    AggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
    An aggregation function applied over a time window
    ExtraParameters []FeatureEngineeringFeatureFunctionExtraParameter
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
    FunctionType string
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. The type of the function. Possible values are: APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
    aggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
    An aggregation function applied over a time window
    extraParameters List<FeatureEngineeringFeatureFunctionExtraParameter>
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
    functionType String
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. The type of the function. Possible values are: APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
    aggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
    An aggregation function applied over a time window
    extraParameters FeatureEngineeringFeatureFunctionExtraParameter[]
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
    functionType string
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. The type of the function. Possible values are: APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
    aggregation_function FeatureEngineeringFeatureFunctionAggregationFunction
    An aggregation function applied over a time window
    extra_parameters Sequence[FeatureEngineeringFeatureFunctionExtraParameter]
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
    function_type str
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. The type of the function. Possible values are: APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
    aggregationFunction Property Map
    An aggregation function applied over a time window
    extraParameters List<Property Map>
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
    functionType String
    Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. The type of the function. Possible values are: APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
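The deprecated functionType field accepts only the values enumerated above. A small validation helper, shown as a hypothetical sketch (the provider itself performs its own validation; this just encodes the documented value set):

```python
# The documented set of legal functionType values.
FUNCTION_TYPES = frozenset({
    "APPROX_COUNT_DISTINCT", "APPROX_PERCENTILE", "AVG", "COUNT",
    "FIRST", "LAST", "MAX", "MIN", "STDDEV_POP", "STDDEV_SAMP",
    "SUM", "VAR_POP", "VAR_SAMP",
})

def validate_function_type(value: str) -> str:
    """Return value unchanged if it is a documented functionType,
    otherwise raise ValueError."""
    if value not in FUNCTION_TYPES:
        raise ValueError(f"unsupported functionType: {value!r}")
    return value
```

Note that each value corresponds to one of the typed variants of FeatureEngineeringFeatureFunctionAggregationFunction below (e.g. AVG to Avg, COUNT to CountFunction), which is why the typed oneof form is preferred over this string field.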

    FeatureEngineeringFeatureFunctionAggregationFunction, FeatureEngineeringFeatureFunctionAggregationFunctionArgs

    ApproxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
    ApproxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
    Avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
    CountFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
    First FeatureEngineeringFeatureFunctionAggregationFunctionFirst
    Last FeatureEngineeringFeatureFunctionAggregationFunctionLast
    Max FeatureEngineeringFeatureFunctionAggregationFunctionMax
    Min FeatureEngineeringFeatureFunctionAggregationFunctionMin
    StddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
    StddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
    Sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
    TimeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    VarPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
    VarSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
    ApproxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
    ApproxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
    Avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
    CountFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
    First FeatureEngineeringFeatureFunctionAggregationFunctionFirst
    Last FeatureEngineeringFeatureFunctionAggregationFunctionLast
    Max FeatureEngineeringFeatureFunctionAggregationFunctionMax
    Min FeatureEngineeringFeatureFunctionAggregationFunctionMin
    StddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
    StddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
    Sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
    TimeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    VarPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
    VarSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
    approxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
    approxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
    avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
    countFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
    first FeatureEngineeringFeatureFunctionAggregationFunctionFirst
    last FeatureEngineeringFeatureFunctionAggregationFunctionLast
    max FeatureEngineeringFeatureFunctionAggregationFunctionMax
    min FeatureEngineeringFeatureFunctionAggregationFunctionMin
    stddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
    stddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
    sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
    timeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    varPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
    varSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
    approxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
    approxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
    avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
    countFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
    first FeatureEngineeringFeatureFunctionAggregationFunctionFirst
    last FeatureEngineeringFeatureFunctionAggregationFunctionLast
    max FeatureEngineeringFeatureFunctionAggregationFunctionMax
    min FeatureEngineeringFeatureFunctionAggregationFunctionMin
    stddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
    stddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
    sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
    timeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    varPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
    varSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
    approx_count_distinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
    approx_percentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
    avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
    count_function FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
    first FeatureEngineeringFeatureFunctionAggregationFunctionFirst
    last FeatureEngineeringFeatureFunctionAggregationFunctionLast
    max FeatureEngineeringFeatureFunctionAggregationFunctionMax
    min FeatureEngineeringFeatureFunctionAggregationFunctionMin
    stddev_pop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
    stddev_samp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
    sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
    time_window FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
    Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
    var_pop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
    var_samp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
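
The aggregation-function listing above is a "oneof": exactly one aggregation key is set, and the time window now lives under the aggregation function itself rather than in the deprecated top-level fields. A minimal sketch of that shape using plain dicts (not a definitive SDK call; the `fare_amount` column is hypothetical):

```python
# Sketch of the aggregation-function "oneof" shape described above: one
# aggregation key (here "avg") plus an optional "time_window" sibling.
function_args = {
    "aggregation_function": {
        "avg": {"input": "fare_amount"},  # hypothetical input column
        "time_window": {
            "tumbling": {},  # one of the TimeWindow shapes listed below
        },
    },
}

# Exactly one aggregation key besides time_window should be set.
agg = function_args["aggregation_function"]
chosen = [key for key in agg if key != "time_window"]
assert chosen == ["avg"]
```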

    FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct, FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs

    Input string
    RelativeSd double
    The maximum relative standard deviation allowed (default defined by Spark)
    Input string
    RelativeSd float64
    The maximum relative standard deviation allowed (default defined by Spark)
    input String
    relativeSd Double
    The maximum relative standard deviation allowed (default defined by Spark)
    input string
    relativeSd number
    The maximum relative standard deviation allowed (default defined by Spark)
    input str
    relative_sd float
    The maximum relative standard deviation allowed (default defined by Spark)
    input String
    relativeSd Number
    The maximum relative standard deviation allowed (default defined by Spark)

    FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile, FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs

    Input string
    Percentile double
    The percentile value to compute (between 0 and 1)
    Accuracy int
    The accuracy parameter (higher is more accurate but slower)
    Input string
    Percentile float64
    The percentile value to compute (between 0 and 1)
    Accuracy int
    The accuracy parameter (higher is more accurate but slower)
    input String
    percentile Double
    The percentile value to compute (between 0 and 1)
    accuracy Integer
    The accuracy parameter (higher is more accurate but slower)
    input string
    percentile number
    The percentile value to compute (between 0 and 1)
    accuracy number
    The accuracy parameter (higher is more accurate but slower)
    input str
    percentile float
    The percentile value to compute (between 0 and 1)
    accuracy int
    The accuracy parameter (higher is more accurate but slower)
    input String
    percentile Number
    The percentile value to compute (between 0 and 1)
    accuracy Number
    The accuracy parameter (higher is more accurate but slower)
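
Since `percentile` must fall between 0 and 1 and `accuracy` trades speed for precision, a small pre-flight check can catch bad values before a deploy. A hedged helper (not part of the provider) enforcing the documented constraints:

```python
def validate_approx_percentile(args: dict) -> dict:
    # Checks the documented ApproxPercentile constraints: percentile in
    # [0, 1], and accuracy (if given) a positive integer.
    if not 0 <= args["percentile"] <= 1:
        raise ValueError("percentile must be between 0 and 1")
    if "accuracy" in args and args["accuracy"] <= 0:
        raise ValueError("accuracy must be a positive integer")
    return args

# A p95 over a hypothetical latency column, with a higher-accuracy sketch.
p95 = validate_approx_percentile(
    {"input": "latency_ms", "percentile": 0.95, "accuracy": 10000}
)
```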

    FeatureEngineeringFeatureFunctionAggregationFunctionAvg, FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction, FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionFirst, FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionLast, FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionMax, FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionMin, FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop, FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp, FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionSum, FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs

    FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuous, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs

    WindowDuration string
    Offset string
    The offset of the continuous window (must be non-positive)
    WindowDuration string
    Offset string
    The offset of the continuous window (must be non-positive)
    windowDuration String
    offset String
    The offset of the continuous window (must be non-positive)
    windowDuration string
    offset string
    The offset of the continuous window (must be non-positive)
    window_duration str
    offset str
    The offset of the continuous window (must be non-positive)
    windowDuration String
    offset String
    The offset of the continuous window (must be non-positive)

    FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSliding, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs

    SlideDuration string
    The slide duration (interval by which windows advance, must be positive and less than duration)
    WindowDuration string
    SlideDuration string
    The slide duration (interval by which windows advance, must be positive and less than duration)
    WindowDuration string
    slideDuration String
    The slide duration (interval by which windows advance, must be positive and less than duration)
    windowDuration String
    slideDuration string
    The slide duration (interval by which windows advance, must be positive and less than duration)
    windowDuration string
    slide_duration str
    The slide duration (interval by which windows advance, must be positive and less than duration)
    window_duration str
    slideDuration String
    The slide duration (interval by which windows advance, must be positive and less than duration)
    windowDuration String
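
The sliding-window constraint (slide positive and strictly less than the window) can be sketched with a simple duration parser. This assumes a `"<number><unit>"` form such as `"7d"` or `"30m"`; the provider's accepted duration syntax may be richer:

```python
from datetime import timedelta

_UNITS = {"d": "days", "h": "hours", "m": "minutes", "s": "seconds"}

def parse_duration(text: str) -> timedelta:
    # Assumed "<number><unit>" form, e.g. "7d"; not the provider's grammar.
    value, unit = int(text[:-1]), text[-1]
    return timedelta(**{_UNITS[unit]: value})

def check_sliding(window_duration: str, slide_duration: str) -> None:
    # Enforces: slide_duration must be positive and less than window_duration.
    window = parse_duration(window_duration)
    slide = parse_duration(slide_duration)
    if not timedelta(0) < slide < window:
        raise ValueError(
            "slide_duration must be positive and less than window_duration"
        )

check_sliding("7d", "1d")  # valid: windows advance daily over a 7-day span
```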

    FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumbling, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs

    FeatureEngineeringFeatureFunctionAggregationFunctionVarPop, FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp, FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs

    Input string
    Input string
    input String
    input string
    input str
    input String

    FeatureEngineeringFeatureFunctionExtraParameter, FeatureEngineeringFeatureFunctionExtraParameterArgs

    Key string
    The name of the parameter
    Value string
    The value of the parameter
    Key string
    The name of the parameter
    Value string
    The value of the parameter
    key String
    The name of the parameter
    value String
    The value of the parameter
    key string
    The name of the parameter
    value string
    The value of the parameter
    key str
    The name of the parameter
    value str
    The value of the parameter
    key String
    The name of the parameter
    value String
    The value of the parameter

    FeatureEngineeringFeatureLineageContext, FeatureEngineeringFeatureLineageContextArgs

    JobContext FeatureEngineeringFeatureLineageContextJobContext
    Job context information including job ID and run ID
    NotebookId int
    The notebook ID where this API was invoked
    JobContext FeatureEngineeringFeatureLineageContextJobContext
    Job context information including job ID and run ID
    NotebookId int
    The notebook ID where this API was invoked
    jobContext FeatureEngineeringFeatureLineageContextJobContext
    Job context information including job ID and run ID
    notebookId Integer
    The notebook ID where this API was invoked
    jobContext FeatureEngineeringFeatureLineageContextJobContext
    Job context information including job ID and run ID
    notebookId number
    The notebook ID where this API was invoked
    job_context FeatureEngineeringFeatureLineageContextJobContext
    Job context information including job ID and run ID
    notebook_id int
    The notebook ID where this API was invoked
    jobContext Property Map
    Job context information including job ID and run ID
    notebookId Number
    The notebook ID where this API was invoked

    FeatureEngineeringFeatureLineageContextJobContext, FeatureEngineeringFeatureLineageContextJobContextArgs

    JobId int
    The job ID where this API was invoked
    JobRunId int
    The job run ID where this API was invoked
    JobId int
    The job ID where this API was invoked
    JobRunId int
    The job run ID where this API was invoked
    jobId Integer
    The job ID where this API was invoked
    jobRunId Integer
    The job run ID where this API was invoked
    jobId number
    The job ID where this API was invoked
    jobRunId number
    The job run ID where this API was invoked
    job_id int
    The job ID where this API was invoked
    job_run_id int
    The job run ID where this API was invoked
    jobId Number
    The job ID where this API was invoked
    jobRunId Number
    The job run ID where this API was invoked

    FeatureEngineeringFeatureProviderConfig, FeatureEngineeringFeatureProviderConfigArgs

    WorkspaceId string
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    WorkspaceId string
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspaceId String
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspaceId string
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspace_id str
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.
    workspaceId String
    Workspace ID which the resource belongs to. This workspace must be part of the account which the provider is configured with.

    FeatureEngineeringFeatureSource, FeatureEngineeringFeatureSourceArgs

    FeatureEngineeringFeatureSourceDeltaTableSource, FeatureEngineeringFeatureSourceDeltaTableSourceArgs

    FullName string
    The full three-part name (catalog, schema, name) of the feature
    DataframeSchema string
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    EntityColumns List<string>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    TimeseriesColumn string
    Column recording time, used for point-in-time joins, backfills, and aggregations
    TransformationSql string
    A single SQL SELECT expression applied after filter_condition. Must contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the Delta table are present in the DataSource dataframe
    FullName string
    The full three-part name (catalog, schema, name) of the feature
    DataframeSchema string
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    EntityColumns []string
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    TimeseriesColumn string
    Column recording time, used for point-in-time joins, backfills, and aggregations
    TransformationSql string
    A single SQL SELECT expression applied after filter_condition. Must contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the Delta table are present in the DataSource dataframe
    fullName String
    The full three-part name (catalog, schema, name) of the feature
    dataframeSchema String
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entityColumns List<String>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseriesColumn String
    Column recording time, used for point-in-time joins, backfills, and aggregations
    transformationSql String
    A single SQL SELECT expression applied after filter_condition. Must contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the Delta table are present in the DataSource dataframe
    fullName string
    The full three-part name (catalog, schema, name) of the feature
    dataframeSchema string
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entityColumns string[]
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseriesColumn string
    Column recording time, used for point-in-time joins, backfills, and aggregations
    transformationSql string
    A single SQL SELECT expression applied after filter_condition. Must contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the Delta table are present in the DataSource dataframe
    full_name str
    The full three-part name (catalog, schema, name) of the feature
    dataframe_schema str
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entity_columns Sequence[str]
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filter_condition str
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseries_column str
    Column recording time, used for point-in-time joins, backfills, and aggregations
    transformation_sql str
    A single SQL SELECT expression applied after filter_condition. Must contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the Delta table are present in the DataSource dataframe
    fullName String
    The full three-part name (catalog, schema, name) of the feature
    dataframeSchema String
    Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
    entityColumns List<String>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseriesColumn String
    Column recording time, used for point-in-time joins, backfills, and aggregations
    transformationSql String
    A single SQL SELECT expression applied after filter_condition. Must contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the Delta table are present in the DataSource dataframe
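
When transformation_sql is set, dataframeSchema must describe the resulting columns in Spark StructType JSON form. A sketch building the documented example with the standard json module (the column names are illustrative):

```python
import json

def struct_field(name: str, dtype: str) -> dict:
    # One field in Spark StructType JSON form, matching df.schema.json().
    return {"name": name, "type": dtype, "nullable": True, "metadata": {}}

dataframe_schema = json.dumps(
    {
        "type": "struct",
        "fields": [
            struct_field("col_a", "integer"),
            struct_field("col_c", "integer"),
        ],
    },
    separators=(",", ":"),  # compact form, as in the example above
)
```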

    FeatureEngineeringFeatureSourceKafkaSource, FeatureEngineeringFeatureSourceKafkaSourceArgs

    Name string
    EntityColumnIdentifiers List<FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    TimeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
    Name string
    EntityColumnIdentifiers []FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
    FilterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    TimeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
    name String
    entityColumnIdentifiers List<FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
    name string
    entityColumnIdentifiers FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier[]
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
    filterCondition string
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
    name str
    entity_column_identifiers Sequence[FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier]
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
    filter_condition str
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseries_column_identifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
    name String
    entityColumnIdentifiers List<Property Map>
    Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
    filterCondition String
    Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
    timeseriesColumnIdentifier Property Map
    Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source

    FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier, FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs

    VariantExprPath string
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    VariantExprPath string
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variantExprPath String
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variantExprPath string
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variant_expr_path str
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variantExprPath String
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip

    FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier, FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs

    VariantExprPath string
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    VariantExprPath string
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variantExprPath String
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variantExprPath string
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variant_expr_path str
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
    variantExprPath String
    String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
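The descriptions above note that for nested fields, the leaf value of the variant expression path is what appears in materialized tables and is matched at query time. A minimal sketch of that leaf-extraction rule (the helper name is hypothetical, not part of the provider API):

```python
def leaf_node(variant_expr_path: str) -> str:
    """Return the leaf value of a variant expression path.

    For nested fields, the leaf is what will be present in
    materialized tables, e.g. the leaf node of
    "value:trip_details.location_details.pickup_zip" is "pickup_zip".
    """
    # Strip the top-level column prefix ("value:") if present,
    # then take the last dotted segment of the remaining path.
    _, _, rest = variant_expr_path.partition(":")
    path = rest or variant_expr_path
    return path.split(".")[-1]

print(leaf_node("value:trip_details.location_details.pickup_zip"))  # pickup_zip
print(leaf_node("pickup_zip"))                                      # pickup_zip
```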

    FeatureEngineeringFeatureTimeWindow, FeatureEngineeringFeatureTimeWindowArgs

    FeatureEngineeringFeatureTimeWindowContinuous, FeatureEngineeringFeatureTimeWindowContinuousArgs

    WindowDuration string
    Offset string
    The offset of the continuous window (must be non-positive)
    WindowDuration string
    Offset string
    The offset of the continuous window (must be non-positive)
    windowDuration String
    offset String
    The offset of the continuous window (must be non-positive)
    windowDuration string
    offset string
    The offset of the continuous window (must be non-positive)
    window_duration str
    offset str
    The offset of the continuous window (must be non-positive)
    windowDuration String
    offset String
    The offset of the continuous window (must be non-positive)

    FeatureEngineeringFeatureTimeWindowSliding, FeatureEngineeringFeatureTimeWindowSlidingArgs

    SlideDuration string
    The slide duration (interval by which windows advance, must be positive and less than duration)
    WindowDuration string
    SlideDuration string
    The slide duration (interval by which windows advance, must be positive and less than duration)
    WindowDuration string
    slideDuration String
    The slide duration (interval by which windows advance, must be positive and less than duration)
    windowDuration String
    slideDuration string
    The slide duration (interval by which windows advance, must be positive and less than duration)
    windowDuration string
    slide_duration str
    The slide duration (interval by which windows advance, must be positive and less than duration)
    window_duration str
    slideDuration String
    The slide duration (interval by which windows advance, must be positive and less than duration)
    windowDuration String
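The sliding-window constraint above says the slide duration must be positive and strictly less than the window duration. A sketch of that check follows; the `"<number><unit>"` duration format (e.g. `"30s"`, `"5m"`, `"1h"`) and the helper names are assumptions for illustration, not the provider's actual validation:

```python
import re

# Assumed duration format: an integer followed by s/m/h/d.
_UNIT_SECONDS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

def to_seconds(duration: str) -> int:
    match = re.fullmatch(r"(\d+)([smhd])", duration)
    if not match:
        raise ValueError(f"unrecognized duration: {duration!r}")
    value, unit = match.groups()
    return int(value) * _UNIT_SECONDS[unit]

def valid_sliding_window(window_duration: str, slide_duration: str) -> bool:
    # Slide must be positive and strictly less than the window.
    slide = to_seconds(slide_duration)
    return 0 < slide < to_seconds(window_duration)

print(valid_sliding_window("1h", "5m"))   # True
print(valid_sliding_window("5m", "1h"))   # False: slide exceeds window
print(valid_sliding_window("1h", "60m"))  # False: slide must be strictly less
```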

    FeatureEngineeringFeatureTimeWindowTumbling, FeatureEngineeringFeatureTimeWindowTumblingArgs

    FeatureEngineeringFeatureTimeseriesColumn, FeatureEngineeringFeatureTimeseriesColumnArgs

    Name string
    Name string
    name String
    name string
    name str
    name String
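A minimal sketch passing the `timeseries_column` argument with the `FeatureEngineeringFeatureTimeseriesColumnArgs` type listed above. The resource name, `full_name`, and column name are hypothetical; required arguments such as `function` and `source` are omitted for brevity (see the constructor signature above).

```python
import pulumi_databricks as databricks

feature = databricks.FeatureEngineeringFeature(
    "example-feature",
    full_name="main.features.example",  # hypothetical catalog.schema.name
    timeseries_column=databricks.FeatureEngineeringFeatureTimeseriesColumnArgs(
        name="event_ts",  # assumed name of the event-timestamp column
    ),
    # function=..., source=...  (required; omitted in this sketch)
)
```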

    Package Details

    Repository: databricks pulumi/pulumi-databricks
    License: Apache-2.0
    Notes: This Pulumi package is based on the databricks Terraform Provider.