Packages

package utils

Linear Supertypes
AnyRef, Any

Type Members

  1. class AddressResolver extends AnyRef

Utility class for resolving and handling different types of data addresses.

  2. case class SchemaType(typeName: String, nullable: Boolean) extends Product with Serializable

Schema converter for reading a schema in JSON format into a Spark StructType.

    The given schema for Spark has almost no validity checks, so it makes sense to combine this with a schema validator. When loading data with a schema, data is converted to the type given in the schema; if this is not possible, the whole row will be null (!). A field can be null if its type is a 2-element array, one element of which is "null". The converted schema does not check for 'enum' fields, i.e. fields limited to a given set of values. It also does not check for required fields or whether additional properties are set to true or false. If a field is specified in the schema, then you can select it, and it will be null if missing. If a field is not in the schema, it cannot be selected even if it is present in the dataset.
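The nullability rule above can be sketched in plain Scala. This is a hypothetical illustration, not the actual SchemaConverter code: it assumes a field's JSON "type" is either a single type name or a list of type names.

```scala
// Hypothetical sketch of the nullability rule: a field is nullable when its
// "type" is a 2-element array, one element of which is "null".
def isNullableType(fieldType: Either[String, Seq[String]]): Boolean =
  fieldType match {
    case Right(types) => types.length == 2 && types.contains("null")
    case Left(_)      => false
  }

// "string" alone is not nullable; ["string", "null"] is.
println(isNullableType(Left("string")))               // false
println(isNullableType(Right(Seq("string", "null")))) // true
```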

Value Members

  1. def encodeUrl(value: String): String

Encodes a given string using the UTF-8 encoding.

    value

    The string to encode.

    returns

    The encoded string.
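A minimal sketch of what encodeUrl may look like, using only the standard library; the actual implementation in utils may differ in detail.

```scala
import java.net.URLEncoder
import java.nio.charset.StandardCharsets

// Hypothetical re-implementation: percent-encode a string as UTF-8.
// Note that URLEncoder encodes spaces as '+', following form encoding.
def encodeUrl(value: String): String =
  URLEncoder.encode(value, StandardCharsets.UTF_8.name())

println(encodeUrl("a b&c")) // a+b%26c
```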

  2. def getDataframeSchema(df: DataFrame): Schema

Converts the schema of a DataFrame to an Avro schema using the avro.AvroSchemaConverter.

    df

    The DataFrame whose schema to convert.

    returns

    The converted Avro schema.

  3. def isDataOSAddress(address: String): Boolean

Checks if an address is a DataOS address.

    address

    The address to check.

    returns

    true if the address is a DataOS address, false otherwise.
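A sketch of the check, under the assumption that DataOS addresses are identified by a "dataos://" scheme prefix (the address shown is made up for illustration; the real check may be more involved):

```scala
// Hypothetical: treat any address with the "dataos://" scheme as a DataOS address.
def isDataOSAddress(address: String): Boolean =
  address.startsWith("dataos://")

println(isDataOSAddress("dataos://somedepot/collection/dataset")) // true
println(isDataOSAddress("s3://bucket/key"))                       // false
```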

  4. def joinPaths(filePath: String, path: String): String

Joins two paths together, ensuring correct formatting by removing any trailing or leading slashes as needed.

    filePath

    The base path.

    path

    The path to join.

    returns

    The joined path.
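The slash normalization described above can be sketched as follows; this is an assumption about the implementation, not the actual code:

```scala
// Hypothetical sketch: strip the trailing slash from the base and the
// leading slash from the suffix, then join with exactly one slash.
def joinPaths(filePath: String, path: String): String =
  filePath.stripSuffix("/") + "/" + path.stripPrefix("/")

println(joinPaths("/data/out/", "/part-0")) // /data/out/part-0
println(joinPaths("/data/out", "part-0"))   // /data/out/part-0
```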

  5. def randomViewName: String

Generates a random view name by combining an alphanumeric string with the current system time.

    returns

    The generated random view name.
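One plausible shape for such a generator (the prefix length and separator here are assumptions; the real name format may differ):

```scala
import scala.util.Random

// Hypothetical sketch: an 8-character alphanumeric prefix combined with
// the current system time in milliseconds.
def randomViewName: String = {
  val prefix = Random.alphanumeric.take(8).mkString
  s"${prefix}_${System.currentTimeMillis()}"
}
```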

  6. def stringOrNone(secretMap: Map[String, String], key: String): Option[String]

Retrieves a value from a java.util.Map based on the specified key. If the value is null, it returns None; otherwise, it wraps the value in an Option.

    secretMap

    The map to retrieve the value from.

    key

    The key of the value to retrieve.

    returns

    An Option containing the retrieved value, or None if the value is not found.
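The null-to-None conversion described above is exactly what Scala's `Option(...)` factory does. A minimal sketch (shown against a java.util.Map, per the description; the declared signature uses Map[String, String]):

```scala
// Hypothetical sketch: Option(...) turns a null lookup result into None.
def stringOrNone(secretMap: java.util.Map[String, String], key: String): Option[String] =
  Option(secretMap.get(key))

val m = new java.util.HashMap[String, String]()
m.put("user", "flare")
println(stringOrNone(m, "user"))    // Some(flare)
println(stringOrNone(m, "missing")) // None
```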

  7. def stringOrThrow(secretMap: Map[String, String], key: String): String

Retrieves a value from a java.util.Map based on the specified key. If the value is null, it throws a FlareInvalidConfigException.

    secretMap

    The map to retrieve the value from.

    key

    The key of the value to retrieve.

    returns

    The retrieved value.

    Exceptions thrown

    io.dataos.spark.exceptions.FlareInvalidConfigException if the value is not found.
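A sketch of the throwing variant. IllegalArgumentException stands in here for io.dataos.spark.exceptions.FlareInvalidConfigException, which is not available outside the project; the message text is also an assumption.

```scala
// Hypothetical sketch: like stringOrNone, but a missing key raises an error.
// IllegalArgumentException is a stand-in for FlareInvalidConfigException.
def stringOrThrow(secretMap: java.util.Map[String, String], key: String): String =
  Option(secretMap.get(key)).getOrElse(
    throw new IllegalArgumentException(s"required config key not found: $key"))

val secrets = new java.util.HashMap[String, String]()
secrets.put("user", "flare")
println(stringOrThrow(secrets, "user")) // flare
```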

  8. def tryCastingToMap(value: Option[Any]): Option[Map[String, String]]

Attempts to cast an Option[Any] to Option[Map[String, String]]. If the cast is successful, it returns Some with the casted value; otherwise, it returns None.

    value

    The value to cast.

    returns

    An Option containing the casted value, or None if the cast is not possible.
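A sketch of how such a cast is typically done in Scala. Note the caveat: generic type arguments are erased at runtime, so only the Map shape itself can actually be checked; the element types are an unchecked assumption.

```scala
// Hypothetical sketch: pattern-match on the runtime type. Due to type
// erasure, the String/String element types cannot be verified at runtime.
def tryCastingToMap(value: Option[Any]): Option[Map[String, String]] =
  value match {
    case Some(m: Map[_, _]) => Some(m.asInstanceOf[Map[String, String]])
    case _                  => None
  }

println(tryCastingToMap(Some(Map("a" -> "b")))) // Some(Map(a -> b))
println(tryCastingToMap(Some(42)))              // None
```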

  9. object AddressResolver
  10. object Constants
  11. object DatasetFormat extends Enumeration

    Enumeration representing different formats that a dataset can have.

  12. object DepotType extends Enumeration
  13. object EnvUtils

    EnvUtils helps you deal with all the required environment variables along with their default values.

  14. object FileUtils

    Utility object for file-related operations.

  15. object SchemaConverter

    The SchemaConverter object provides utility methods for converting JSON schemas into Spark SQL StructType.

  16. object SparkConfigUtils

SparkConfigUtils - Contains helper methods for dealing with Spark configuration, e.g.:

      1. Load Spark configuration from Heimdall
      2. Load Spark configurations from depot config mounted in files named with depot.(key-value-properties secrets)
      3. Load Spark configurations from key-value secrets
