package utils
Type Members
-
class
AddressResolver extends AnyRef
Utility for resolving and handling different types of data addresses.
-
case class
SchemaType(typeName: String, nullable: Boolean) extends Product with Serializable
Schema converter for loading a schema given in JSON format into a Spark structure.
The given schema for Spark has almost no validity checks, so it makes sense to combine this with the schema validator. When loading data with a schema, the data is converted to the type given in the schema. If this is not possible, the whole row will be null (!). A field can be null if its type is a 2-element array, one of which is "null". The converted schema doesn't check for 'enum' fields, i.e. fields which are limited to a given set of values. It also doesn't check for required fields or whether additional properties are set to true or false. If a field is specified in the schema, then you can select it, and it will be null if missing. If a field is not in the schema, it cannot be selected even if it is present in the dataset.
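A minimal sketch of the nullability convention described above, assuming a SchemaConverter entry point that accepts the JSON schema as a string (the method name below is hypothetical; the real converter method is not listed on this page):

```scala
import org.apache.spark.sql.types.StructType

// A JSON schema in which "age" may be null because its type is the
// 2-element array ["integer", "null"], as described above.
val jsonSchema: String =
  """{
    |  "type": "object",
    |  "properties": {
    |    "name": { "type": "string" },
    |    "age":  { "type": ["integer", "null"] }
    |  }
    |}""".stripMargin

// Hypothetical call, assuming utils.SchemaConverter is in scope.
val sparkSchema: StructType = SchemaConverter.convert(jsonSchema)

// Rows whose values cannot be cast to the declared types come back entirely null (!),
// so pairing this with a schema validator is recommended.
```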
Value Members
-
def
encodeUrl(value: String): String
Encodes a given string using the UTF-8 encoding.
- value
The string to encode.
- returns
The encoded string.
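A minimal sketch of what encodeUrl likely does, assuming it delegates to java.net.URLEncoder (the actual implementation is not shown here):

```scala
import java.net.URLEncoder
import java.nio.charset.StandardCharsets

def encodeUrl(value: String): String =
  URLEncoder.encode(value, StandardCharsets.UTF_8.name())

// Spaces and reserved characters are escaped, e.g.
encodeUrl("depot name/with space")  // "depot+name%2Fwith+space"
```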
-
def
getDataframeSchema(df: DataFrame): Schema
Converts the schema of a DataFrame to an Avro schema using the avro.AvroSchemaConverter.
- df
The DataFrame whose schema to convert.
- returns
The converted Avro schema.
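For comparison, a rough equivalent using the spark-avro module's built-in SchemaConverters; the utility itself delegates to avro.AvroSchemaConverter, whose implementation is not shown here:

```scala
import org.apache.avro.Schema
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.avro.SchemaConverters

// Converts the Catalyst schema of a DataFrame to an Avro record schema.
def dataframeSchemaToAvro(df: DataFrame): Schema =
  SchemaConverters.toAvroType(df.schema, nullable = false, recordName = "record")
```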
-
def
isDataOSAddress(address: String): Boolean
Checks if an address is a DataOS address.
- address
The address to check.
- returns
true if the address is a DataOS address, false otherwise.
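A minimal sketch, assuming a DataOS address is identified by its dataos:// scheme prefix (the real check may be stricter):

```scala
def isDataOSAddress(address: String): Boolean =
  address.trim.toLowerCase.startsWith("dataos://")

isDataOSAddress("dataos://icebase:retail/city")  // true
isDataOSAddress("s3a://bucket/path")             // false
```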
-
def
joinPaths(filePath: String, path: String): String
Joins two paths together, ensuring correct formatting by removing any trailing or leading slashes as needed.
- filePath
The base path.
- path
The path to join.
- returns
The joined path.
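A minimal sketch of the slash handling described above (the actual implementation is not shown here):

```scala
def joinPaths(filePath: String, path: String): String =
  filePath.stripSuffix("/") + "/" + path.stripPrefix("/")

joinPaths("s3a://bucket/base/", "/sub/dir")  // "s3a://bucket/base/sub/dir"
joinPaths("s3a://bucket/base", "sub/dir")    // "s3a://bucket/base/sub/dir"
```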
-
def
randomViewName: String
Generates a random view name by combining an alphanumeric string with the current system time.
- returns
The generated random view name.
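A minimal sketch of the pattern described above; the prefix length and separator are assumptions:

```scala
import scala.util.Random

def randomViewName: String =
  Random.alphanumeric.filter(_.isLetter).take(8).mkString + "_" + System.currentTimeMillis()

// e.g. "qWxZrTba_1717171717171" -- usable as a temporary Spark view name
// because it starts with a letter and contains no special characters.
```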
-
def
stringOrNone(secretMap: Map[String, String], key: String): Option[String]
Retrieves a value from a java.util.Map based on the specified key. If the value is null, it returns None; otherwise, it wraps the value in an Option.
- secretMap
The map to retrieve the value from.
- key
The key of the value to retrieve.
- returns
An Option containing the retrieved value, or None if the value is not found.
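A minimal sketch, assuming the underlying map behaves like a java.util.Map whose get returns null for missing keys, as the description states:

```scala
import java.util.{HashMap => JHashMap, Map => JMap}

def stringOrNone(secretMap: JMap[String, String], key: String): Option[String] =
  Option(secretMap.get(key))  // Option(null) == None

val secrets = new JHashMap[String, String]()
secrets.put("username", "flare")
stringOrNone(secrets, "username")  // Some("flare")
stringOrNone(secrets, "password")  // None
```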
-
def
stringOrThrow(secretMap: Map[String, String], key: String): String
Retrieves a value from a java.util.Map based on the specified key. If the value is null, it throws a FlareInvalidConfigException.
- secretMap
The map to retrieve the value from.
- key
The key of the value to retrieve.
- returns
The retrieved value.
- Exceptions thrown
io.dataos.spark.exceptions.FlareInvalidConfigException
if the value is not found.
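A minimal sketch of the same lookup with a throwing fallback; a generic exception stands in for FlareInvalidConfigException, whose constructor is not shown here:

```scala
def stringOrThrow(secretMap: java.util.Map[String, String], key: String): String =
  Option(secretMap.get(key)).getOrElse(
    // stand-in for io.dataos.spark.exceptions.FlareInvalidConfigException
    throw new IllegalArgumentException(s"Required configuration key '$key' is missing"))
```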
-
def
tryCastingToMap(value: Option[Any]): Option[Map[String, String]]
Attempts to cast an Option[Any] to Option[Map[String, String]]. If the cast is successful, it returns Some with the casted value; otherwise, it returns None.
- value
The value to cast.
- returns
An Option containing the casted value, or None if the cast is not possible.
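A minimal sketch: a pattern match performs the safe cast described above (note that the element types inside the map are unchecked at runtime due to erasure):

```scala
def tryCastingToMap(value: Option[Any]): Option[Map[String, String]] =
  value match {
    case Some(m: Map[_, _]) => Some(m.asInstanceOf[Map[String, String]])
    case _                  => None
  }

tryCastingToMap(Some(Map("a" -> "1")))  // Some(Map(a -> 1))
tryCastingToMap(Some(42))               // None
```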
- object AddressResolver
- object Constants
-
object
DatasetFormat extends Enumeration
Enumeration representing different formats that a dataset can have.
- object DepotType extends Enumeration
-
object
EnvUtils
EnvUtils helps you deal with all the required environment variables along with their default values.
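A minimal sketch of the pattern EnvUtils covers; the variable name and default below are illustrative only:

```scala
def envOrDefault(name: String, default: String): String =
  sys.env.getOrElse(name, default)

envOrDefault("SPARK_CONF_DIR", "/etc/spark/conf")
```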
-
object
FileUtils
Utility object for file-related operations.
-
object
SchemaConverter
The SchemaConverter object provides utility methods for converting JSON schemas into Spark SQL StructType.
-
object
SparkConfigUtils
SparkConfigUtils - Contains helper methods to deal with Spark configuration, e.g.:
1. Load Spark configuration from Heimdall
2. Load Spark configurations from depot config mounted in files named with depot.(key-value-properties secrets)
3. Load Spark configurations from key-value secrets
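A minimal sketch of option 2 above: load key-value properties from a mounted depot file and apply them to a SparkConf (the file path and helper signature are illustrative, not the library's actual API):

```scala
import java.util.Properties
import scala.collection.JavaConverters._
import scala.io.Source
import org.apache.spark.SparkConf

def loadDepotProperties(conf: SparkConf, propertiesFile: String): SparkConf = {
  val props  = new Properties()
  val source = Source.fromFile(propertiesFile)
  try props.load(source.bufferedReader()) finally source.close()
  // Copy every key-value pair from the mounted file into the Spark configuration.
  props.stringPropertyNames().asScala.foreach { key =>
    conf.set(key, props.getProperty(key))
  }
  conf
}

// loadDepotProperties(new SparkConf(), "/etc/dataos/secret/mydepot.properties")
```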