package input
Type Members
-
case class
Autogenous(dataOsAddress: String, options: Map[String, String]) extends Product with Serializable
Represents an autogenous data source with a DataOs address and options.
- dataOsAddress
The DataOs address for the autogenous data source.
- options
Additional options for the autogenous data source.
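A minimal construction sketch, assuming the types of this input package are in scope; the address and option values are illustrative placeholders.
```scala
// Illustrative only: the DataOS address and the options map are placeholders.
val autogenous = Autogenous(
  dataOsAddress = "dataos://catalog:schema/table",
  options       = Map("key" -> "value")
)
```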
-
case class
BigqueryDatasourceInput(project: String, dataset: String, table: String, gcsKeyJsonFilePath: Option[String], options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a BigQuery data source input.
- project
The BigQuery project ID.
- dataset
The BigQuery dataset name.
- table
The BigQuery table name.
- gcsKeyJsonFilePath
Optional file path to the Google Cloud Storage (GCS) key JSON file.
- options
Optional additional options for the BigQuery data source.
- incremental
Optional incremental settings for the data source.
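A minimal construction sketch; the project, dataset, table, and key-file path are illustrative placeholders.
```scala
// Reads one BigQuery table; optional options and incremental settings are left unset.
val bq = BigqueryDatasourceInput(
  project            = "my-gcp-project",
  dataset            = "analytics",
  table              = "events",
  gcsKeyJsonFilePath = Some("/etc/secrets/gcs-key.json"),
  options            = None,
  incremental        = None
)
```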
-
case class
DatasetInput(name: String, dataset: String, isStream: Option[Boolean], format: Option[String], schema: Option[Schema], queryParams: Option[String], pathParams: Map[String, String] = Map.empty, options: Option[Map[String, String]], incremental: Option[Incremental]) extends InputConfig with Product with Serializable
Represents a dataset input configuration.
- name
The name of the input.
- dataset
The dataset path or address.
- isStream
Optional flag indicating if the input is a stream.
- format
Optional format of the input data.
- schema
Optional Schema describing the input data.
- queryParams
Optional query parameters to append to the resolved dataos URL, e.g., "a=b&c".
- pathParams
Path parameters to substitute in the resolved dataos URL, as placeholder-to-value pairs.
- options
Optional additional options for the input.
- incremental
Optional incremental settings for the input.
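A minimal batch-mode sketch; the dataset address and format are illustrative placeholders.
```scala
// Batch read of a dataset address; schema, queryParams and incremental are left unset.
val datasetInput = DatasetInput(
  name        = "sales_input",
  dataset     = "dataos://catalog:schema/sales",  // placeholder address
  isStream    = Some(false),
  format      = Some("parquet"),                  // illustrative format
  schema      = None,
  queryParams = None,
  pathParams  = Map.empty,
  options     = Some(Map.empty),
  incremental = None
)
```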
-
case class
DatasetInputOptions(schema: Option[Schema], queryParams: Option[String], pathParams: Map[String, String] = Map.empty, incremental: Option[Incremental]) extends Product with Serializable
Represents DatasetInputOptions used for dataset input configuration.
- schema
An optional Schema representing the dataset schema.
- queryParams
An optional String representing query parameters to append to the resolved dataos URL. e.g., "a=b&c".
- pathParams
A Map containing path parameters to replace in the resolved dataos URL. The Map should be in the format of key-value pairs, where the key represents the parameter placeholder and the value represents the replacement value.
- incremental
An optional Incremental representing the incremental configuration for the dataset input.
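A direct-construction sketch with placeholder parameter values; a DatasetInputOptionsBuilder also exists for building these options.
```scala
// queryParams is appended to the resolved dataos URL; pathParams replaces placeholders in it.
val datasetOpts = DatasetInputOptions(
  schema      = None,
  queryParams = Some("a=b&c"),
  pathParams  = Map("version" -> "v1"),  // placeholder path parameter
  incremental = None
)
```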
-
class
DatasetInputOptionsBuilder extends AnyRef
DatasetInputOptionsBuilder for constructing DatasetInputOptions.
-
class
DatasourceInput extends InputConfig
Datasource input configuration.
-
case class
ElasticsearchDatasourceInput(nodes: String, index: String, username: Option[String], password: Option[String], options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a configuration for an Elasticsearch datasource input.
- nodes
Comma-separated list of Elasticsearch nodes.
- index
The Elasticsearch index to read from.
- username
Optional username for authentication.
- password
Optional password for authentication.
- options
Optional additional options for the input.
- incremental
Optional incremental settings for the input.
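A minimal sketch with placeholder node URLs and credentials read from the environment.
```scala
// Comma-separated node list; username/password are optional.
val es = ElasticsearchDatasourceInput(
  nodes       = "http://es-node1:9200,http://es-node2:9200",
  index       = "logs",
  username    = Some("elastic"),
  password    = Some(sys.env.getOrElse("ES_PASSWORD", "")),
  options     = None,
  incremental = None
)
```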
-
case class
EventHubDatasourceInput(endpoint: String, eventhubName: String, sasKeyName: String, sasKey: String, isBatch: Option[Boolean], incremental: Option[Incremental], options: Option[Map[String, String]]) extends DatasourceInput with Product with Serializable
EventHub datasource input configuration.
- endpoint
The EventHub endpoint.
- eventhubName
The EventHub name.
- sasKeyName
The SAS key name.
- sasKey
The SAS key.
- isBatch
Optional flag indicating if the input is in batch mode.
- incremental
Optional incremental configuration.
- options
Optional additional options for the datasource.
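A minimal sketch; the endpoint, EventHub name, and SAS key name are illustrative placeholders, and the key is read from the environment.
```scala
// isBatch = Some(false) keeps the input in streaming mode.
val eventHub = EventHubDatasourceInput(
  endpoint     = "sb://my-namespace.servicebus.windows.net",
  eventhubName = "clickstream",
  sasKeyName   = "RootManageSharedAccessKey",
  sasKey       = sys.env.getOrElse("EVENTHUB_SAS_KEY", ""),
  isBatch      = Some(false),
  incremental  = None,
  options      = None
)
```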
-
case class
FileDatasourceInput(path: Option[String], paths: Option[Seq[String]], warehousePath: Option[String], catalogName: Option[String], tableName: Option[String], metastoreUri: Option[String], options: Option[Map[String, String]] = Some(Map.empty), format: Option[String] = Some("parquet"), isStream: Option[Boolean] = Some(false), incremental: Option[Incremental], icebergCatalogType: Option[String], schema: Option[Schema], schemaName: Option[String]) extends DatasourceInput with Product with Serializable
File datasource input configuration.
- path
Optional path to a single file or directory.
- paths
Optional paths to multiple files or directories.
- warehousePath
Optional path to the warehouse directory.
- catalogName
Optional name of the catalog.
- tableName
Optional name of the table.
- metastoreUri
Optional URI of the metastore.
- options
Optional additional options for the datasource.
- format
Optional format of the files (e.g., parquet, csv).
- isStream
Optional flag indicating if the input is in stream mode.
- incremental
Optional incremental configuration.
- icebergCatalogType
Optional type of the Iceberg catalog.
- schema
An optional Schema representing the data schema for the input data.
- schemaName
Optional name of the schema.
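A minimal sketch that reads a single directory with a placeholder path; options, format, and isStream fall back to their declared defaults (empty map, parquet, batch).
```scala
// All catalog/metastore/Iceberg fields are left unset for a plain file read.
val files = FileDatasourceInput(
  path               = Some("/data/landing/orders"),  // placeholder path
  paths              = None,
  warehousePath      = None,
  catalogName        = None,
  tableName          = None,
  metastoreUri       = None,
  incremental        = None,
  icebergCatalogType = None,
  schema             = None,
  schemaName         = None
)
```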
-
case class
Incremental(context: String, sql: String, keys: List[IncrementalKeys], state: Option[List[IncrementalState]]) extends Product with Serializable
Incremental configuration for data processing.
- context
The context of the incremental processing.
- sql
The SQL query for incremental processing.
- keys
The list of incremental keys.
- state
Optional list of incremental state.
-
case class
IncrementalKeys(name: String, default: Option[String], sql: Option[String]) extends Product with Serializable
Incremental key configuration.
- name
The name of the incremental key.
- default
Optional default value for the incremental key.
- sql
Optional SQL query for the incremental key.
-
case class
IncrementalState(key: Option[String], value: Option[String]) extends Product with Serializable
Incremental state configuration.
- key
Optional key for the incremental state.
- value
Optional value for the incremental state.
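A combined sketch for Incremental, IncrementalKeys, and IncrementalState above; the context name, SQL text, and key values are illustrative placeholders.
```scala
// One incremental key with a default starting value, plus an explicit saved state entry.
val incremental = Incremental(
  context = "sales_incremental",
  sql     = "SELECT * FROM sales",  // illustrative; real queries typically reference the keys
  keys    = List(
    IncrementalKeys(
      name    = "start_time",
      default = Some("2020-01-01 00:00:00"),
      sql     = Some("SELECT MAX(updated_at) FROM sales")
    )
  ),
  state = Some(List(
    IncrementalState(key = Some("start_time"), value = Some("2024-01-01 00:00:00"))
  ))
)
```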
-
trait
InputConfig extends AnyRef
Represents a configuration for input data.
-
case class
JDBCDatasourceInput(username: String, password: Option[String], url: String, table: String, options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a JDBC data source input configuration.
- username
The username for JDBC authentication.
- password
The password for JDBC authentication.
- url
The JDBC connection URL.
- table
The name of the table in the database.
- options
Additional options for the JDBC data source.
- incremental
The incremental configuration for the input.
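A minimal sketch with placeholder connection details; the password is read from the environment.
```scala
// Reads one table over JDBC; options and incremental are left unset.
val jdbc = JDBCDatasourceInput(
  username    = "app_user",
  password    = Some(sys.env.getOrElse("JDBC_PASSWORD", "")),
  url         = "jdbc:postgresql://db-host:5432/sales",
  table       = "public.orders",
  options     = None,
  incremental = None
)
```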
-
case class
KafkaDatasourceInput(brokers: String, topic: Option[String], schemaRegistryUrl: Option[String], isBatch: Option[Boolean], incremental: Option[Incremental], format: Option[String], options: Option[Map[String, String]]) extends DatasourceInput with Product with Serializable
Represents a Kafka data source input configuration.
- brokers
The list of Kafka brokers.
- topic
The Kafka topic.
- schemaRegistryUrl
The URL of the schema registry.
- isBatch
Indicates whether the input is batch or streaming.
- incremental
The incremental configuration for the input.
- format
The data format of the input.
- options
Additional options for the Kafka data source.
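A streaming-mode sketch; the broker list, topic, registry URL, and format are illustrative placeholders.
```scala
// Streaming read from a single topic; isBatch = Some(false).
val kafka = KafkaDatasourceInput(
  brokers           = "broker-1:9092,broker-2:9092",
  topic             = Some("orders"),
  schemaRegistryUrl = Some("http://schema-registry:8081"),
  isBatch           = Some(false),
  incremental       = None,
  format            = Some("avro"),  // illustrative format
  options           = None
)
```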
-
case class
MinervaInput(name: String, options: Option[Map[String, String]], query: String) extends InputConfig with Product with Serializable
Represents a Minerva input configuration.
- name
The name of the input.
- options
Additional options for the Minerva input.
- query
The Minerva query.
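A minimal sketch; the input name and query text are illustrative placeholders.
```scala
// A named Minerva query input with no extra options.
val minerva = MinervaInput(
  name    = "top_customers",
  options = None,
  query   = "SELECT customer_id, SUM(amount) AS total FROM sales GROUP BY customer_id"
)
```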
-
case class
MongoDbDatasourceInput(nodes: List[String], subprotocol: String, database: String, table: String, username: String, password: String, queryParams: Option[String], options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a MongoDB datasource input configuration.
- nodes
The list of MongoDB nodes.
- subprotocol
The subprotocol for the MongoDB connection.
- database
The MongoDB database.
- table
The MongoDB table.
- username
The username for authentication.
- password
The password for authentication.
- queryParams
Optional query parameters for the MongoDB connection URL.
- options
Additional options for the MongoDB input.
- incremental
The incremental configuration.
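A minimal sketch with placeholder cluster details; whether queryParams is appended to the connection URL is an assumption inferred from the analogous field elsewhere in this package.
```scala
// Two placeholder nodes; the password is read from the environment.
val mongo = MongoDbDatasourceInput(
  nodes       = List("mongo-1:27017", "mongo-2:27017"),
  subprotocol = "mongodb",
  database    = "retail",
  table       = "orders",
  username    = "app_user",
  password    = sys.env.getOrElse("MONGO_PASSWORD", ""),
  queryParams = Some("replicaSet=rs0&authSource=admin"),  // placeholder query parameters
  options     = None,
  incremental = None
)
```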
-
case class
OpensearchDatasourceInput(nodes: String, index: String, username: Option[String], password: Option[String], options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a configuration for an Opensearch datasource input.
- nodes
Comma-separated list of Opensearch nodes.
- index
The Opensearch index to read from.
- username
Optional username for authentication.
- password
Optional password for authentication.
- options
Optional additional options for the input.
- incremental
Optional incremental settings for the input.
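A minimal sketch mirroring the Elasticsearch example above, with a placeholder OpenSearch endpoint.
```scala
// Unauthenticated read from a single index; all optional fields left unset.
val os = OpensearchDatasourceInput(
  nodes       = "https://opensearch-node1:9200",
  index       = "logs",
  username    = None,
  password    = None,
  options     = None,
  incremental = None
)
```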
-
case class
PulsarDatasourceInput(serviceUrl: String, adminUrl: String, tenant: String, namespace: String, topic: String, topicPattern: Option[String], options: Option[Map[String, String]], isBatch: Option[Boolean], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a Pulsar datasource input configuration.
- serviceUrl
The Pulsar service URL.
- adminUrl
The Pulsar admin URL.
- tenant
The Pulsar tenant.
- namespace
The Pulsar namespace.
- topic
The Pulsar topic.
- topicPattern
The optional pattern for matching multiple topics. (Note: Avoid using this, as it can make it difficult to determine the input dataset.)
- options
Additional options for the Pulsar input.
- isBatch
Indicates if the Pulsar input is in batch mode.
- incremental
The incremental configuration.
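A streaming-mode sketch with placeholder URLs; topicPattern is left unset, as the parameter note above advises.
```scala
// Reads a single topic from the public/default tenant and namespace.
val pulsar = PulsarDatasourceInput(
  serviceUrl   = "pulsar://pulsar-broker:6650",
  adminUrl     = "http://pulsar-broker:8080",
  tenant       = "public",
  namespace    = "default",
  topic        = "orders",
  topicPattern = None,
  options      = None,
  isBatch      = Some(false),
  incremental  = None
)
```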
-
case class
RedisDatasourceInput(host: String, port: Int, db: Int, table: String, password: Option[String], isBatch: Option[Boolean], schemaPath: Option[String], options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a Redis datasource input configuration.
- host
The Redis server host.
- port
The Redis server port.
- db
The Redis database index.
- table
The Redis table or key.
- password
The optional password for authenticating with Redis.
- isBatch
Indicates if the Redis input is in batch mode.
- schemaPath
The optional path to the Redis schema definition.
- options
Additional options for the Redis input.
- incremental
The incremental configuration.
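A batch-mode sketch; the host, database index, and table name are illustrative placeholders.
```scala
// Batch read of a single Redis table/key with no authentication.
val redis = RedisDatasourceInput(
  host        = "redis-host",
  port        = 6379,
  db          = 0,
  table       = "sessions",
  password    = None,
  isBatch     = Some(true),
  schemaPath  = None,
  options     = None,
  incremental = None
)
```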
-
case class
RedshiftDatasourceInput(jdbcUrl: String, username: String, password: Option[String], dbTable: String, tempDir: String, options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a Redshift datasource input configuration.
- jdbcUrl
The JDBC URL for connecting to Redshift.
- username
The username for authenticating with Redshift.
- password
The optional password for authenticating with Redshift.
- dbTable
The Redshift database table.
- tempDir
The temporary directory for storing intermediate data.
- options
Additional options for the Redshift input.
- incremental
The incremental configuration.
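A minimal sketch with placeholder cluster, table, and staging-directory values; the password is read from the environment.
```scala
// tempDir points at an S3 location used for intermediate data.
val redshift = RedshiftDatasourceInput(
  jdbcUrl     = "jdbc:redshift://my-cluster.region.redshift.amazonaws.com:5439/dev",
  username    = "app_user",
  password    = Some(sys.env.getOrElse("REDSHIFT_PASSWORD", "")),
  dbTable     = "public.orders",
  tempDir     = "s3a://my-bucket/redshift-temp/",
  options     = None,
  incremental = None
)
```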
-
case class
Schema(path: Option[String], type: String, content: Option[String], registryUrl: Option[String], subject: Option[String], id: Option[Int]) extends Product with Serializable
Represents a Schema used in DatasetInputOptions.
- path
An optional String representing the path.
- type
A String representing the schema type.
- content
An optional String representing the content.
- registryUrl
An optional String representing the registry URL.
- subject
An optional String representing the subject.
- id
An optional Int representing the ID.
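A minimal sketch; the schema type and registry coordinates are illustrative placeholders. Because type is a Scala keyword, it is written with backticks when passed by name.
```scala
// A registry-backed schema reference; path, content and id are left unset.
val schema = Schema(
  path        = None,
  `type`      = "avro",  // illustrative type value
  content     = None,
  registryUrl = Some("http://schema-registry:8081"),
  subject     = Some("orders-value"),
  id          = None
)
```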
-
case class
SnowflakeDatasourceInput(user: String, password: Option[String], token: Option[String], pemPrivateKey: Option[String], url: String, database: String, table: String, schema: String, snowflakeWarehouse: Option[String], options: Option[Map[String, String]], incremental: Option[Incremental]) extends DatasourceInput with Product with Serializable
Represents a Snowflake datasource input configuration.
- user
The Snowflake username.
- password
The optional Snowflake password.
- token
The optional Snowflake token.
- pemPrivateKey
The optional PEM private key for Snowflake authentication.
- url
The Snowflake URL.
- database
The Snowflake database.
- table
The Snowflake table.
- schema
The Snowflake schema (collection).
- snowflakeWarehouse
The optional Snowflake warehouse.
- options
Additional options for the Snowflake input.
- incremental
The incremental configuration.
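A password-based sketch with placeholder account, database, and warehouse values; token and pemPrivateKey are left unset here.
```scala
// The password is read from the environment; options and incremental are left unset.
val snowflake = SnowflakeDatasourceInput(
  user               = "APP_USER",
  password           = Some(sys.env.getOrElse("SNOWFLAKE_PASSWORD", "")),
  token              = None,
  pemPrivateKey      = None,
  url                = "myaccount.snowflakecomputing.com",
  database           = "ANALYTICS",
  table              = "ORDERS",
  schema             = "PUBLIC",
  snowflakeWarehouse = Some("COMPUTE_WH"),
  options            = None,
  incremental        = None
)
```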
Value Members
- object DatasetInputOptions extends Serializable