Coral is a library for analyzing, processing, and rewriting views defined in the Hive Metastore, and sharing them across multiple execution engines. It performs SQL translations to enable views expressed in HiveQL (and potentially other languages) to be accessible in engines such as Trino (formerly PrestoSQL), Apache Spark, and Apache Pig.

Coral not only translates view definitions between different SQL/non-SQL dialects, but also rewrites expressions to produce semantically equivalent ones, taking into account the semantics of the target language or engine. For example, it automatically composes new built-in expressions that are equivalent to each built-in expression in the source view definition. Additionally, it integrates with Transport UDFs to enable translating and executing user-defined functions (UDFs) across Hive, Trino, Spark, and Pig.

Coral is under active development. Currently, we are looking into expanding the set of input view language APIs beyond HiveQL, and implementing query rewrite algorithms for data governance and query optimization.
- Join the discussion with the community on Slack!
Coral consists of the following modules:
- Coral-Hive: Converts definitions of Hive views with UDFs to an equivalent view logical plan.
- Coral-Trino: Converts the view logical plan to Trino (formerly PrestoSQL) SQL, and vice versa.
- Coral-Spark: Converts the view logical plan to Spark SQL.
- Coral-Pig: Converts the view logical plan to Pig Latin.
- Coral-Incremental [WIP]: Derives an incremental query from input SQL for incremental view maintenance.
- Coral-Schema: Derives the Avro schema of a view using the view logical plan and the input Avro schemas of its base tables.
- Coral-Spark-Plan [WIP]: Converts Spark plan strings to equivalent logical plan.
- Coral-Visualization [WIP]: Visualizes Coral SqlNode and RelNode trees and renders them to an output file.
- Coral-Service: Service that exposes REST APIs that allow users to interact with Coral (see Coral-as-a-Service for more details).
Clone the repository:
git clone https://github.com/linkedin/coral.git
Build:
./gradlew clean build
The project is under active development, and we welcome contributions in many forms. Please see the Contribution Agreement.
- Coral: A SQL translation, analysis, and rewrite engine for modern data lakehouses, LinkedIn Engineering Blog, 12/10/2020.
- Coral & Transport UDFs: Building Blocks of a Postmodern Data Warehouse, Tech-talk, Facebook HQ, 2/28/2020.
- Transport: Towards Logical Independence Using Translatable Portable UDFs, LinkedIn Engineering Blog, 11/14/2018.
- Dali Views: Functions as a Service for Big Data, LinkedIn Engineering Blog, 11/9/2017.
Coral-as-a-Service or simply, Coral Service is a service that exposes REST APIs that allow users to interact with Coral without necessarily coming from a compute engine. Currently, the service supports an API for query translation between different dialects and another for interacting with a local Hive Metastore to create example databases, tables, and views so they can be referenced in the translation API. The service can be used in two modes: remote Hive Metastore mode, and local Hive Metastore mode. The remote mode uses an existing (already deployed) Hive Metastore to resolve tables and views, while the local one creates an empty embedded Hive Metastore so users can add their own table and view definitions.
A POST API that takes a JSON request body containing the following parameters and returns the translated query:
- fromLanguage: Input dialect (e.g., spark, trino, hive -- see below for supported inputs)
- toLanguage: Output dialect (e.g., spark, trino, hive -- see below for supported outputs)
- query: SQL query to translate between the two dialects
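The translation API can also be called programmatically. Below is a minimal Python sketch using only the standard library; the helper names (`build_translate_request`, `translate`) are illustrative, not part of Coral Service, and the default host assumes the service is running locally on port 8080 as in the examples later in this document:

```python
import json
import urllib.request


def build_translate_request(from_language, to_language, query):
    """Build the JSON body expected by /api/translations/translate."""
    return json.dumps({
        "fromLanguage": from_language,
        "toLanguage": to_language,
        "query": query,
    }).encode("utf-8")


def translate(from_language, to_language, query, host="http://localhost:8080"):
    """POST a translation request and return the response body as text."""
    req = urllib.request.Request(
        host + "/api/translations/translate",
        data=build_translate_request(from_language, to_language, query),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")


# With the service running locally:
#   translate("hive", "trino", "SELECT * FROM db1.airport")
```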
A POST API which takes a SQL statement to create a database/table/view in the local metastore (note: this endpoint is only available with Coral Service in local metastore mode).
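This endpoint can likewise be called from Python. A minimal sketch, assuming the service is running locally in local metastore mode (the function names are illustrative only); note that, unlike the translation API, the request body here is the raw SQL statement itself:

```python
import urllib.request


def build_execute_request(sql, host="http://localhost:8080"):
    """Build a POST request whose body is the raw SQL DDL statement."""
    return urllib.request.Request(
        host + "/api/catalog-ops/execute",
        data=sql.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def execute_catalog_statement(sql, host="http://localhost:8080"):
    """Create a database/table/view in the local metastore."""
    with urllib.request.urlopen(build_execute_request(sql, host)) as resp:
        return resp.read().decode("utf-8")


# With the service running in local metastore mode:
#   execute_catalog_statement("CREATE DATABASE IF NOT EXISTS db1")
```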
- Clone the Coral repo
git clone https://github.com/linkedin/coral.git
- From the root directory of Coral, access the coral-service module
cd coral-service
- Build
../gradlew clean build
- Run
../gradlew bootRun --args='--spring.profiles.active=localMetastore'
- Add your Kerberos client keytab file to `coral-service/src/main/resources`
- Replace all instances of `SET_ME` in `coral-service/src/main/resources/hive.properties` appropriately
- Run
../gradlew bootRun
You can also specify a custom location for the `hive.properties` file through `--hivePropsLocation` as follows:
./gradlew bootRun --args='--hivePropsLocation=/tmp/hive.properties'
Then you can interact with the service using your browser or the CLI.
After running `../gradlew bootRun --args='--spring.profiles.active=localMetastore'` (for local metastore mode) or `../gradlew bootRun` (for remote metastore mode) from the coral-service module, the UI can be accessed from a local browser at http://localhost:8080.
The UI provides two features:
This feature is only available with Coral Service in local metastore mode; it calls the `/api/catalog-ops/execute` API above.
You can enter a SQL statement to create a database/table/view in the local metastore:
This feature is available with Coral Service in both local and remote metastore modes; it calls the `/api/translations/translate` API above.
You can enter a SQL query and specify the source and target languages to use the Coral translation service:
Apart from the UI above, you can also interact with the service using the CLI.
Example workflow for local metastore mode:
- Create a database called `db1` in the local metastore using the `/api/catalog-ops/execute` endpoint:
curl --header "Content-Type: application/json" \
--request POST \
--data "CREATE DATABASE IF NOT EXISTS db1" \
http://localhost:8080/api/catalog-ops/execute
Creation successful
- Create a table called `airport` within `db1` in the local metastore using the `/api/catalog-ops/execute` endpoint:
curl --header "Content-Type: application/json" \
--request POST \
--data "CREATE TABLE IF NOT EXISTS db1.airport(name string, country string, area_code int, code string, datepartition string)" \
http://localhost:8080/api/catalog-ops/execute
Creation successful
- Translate a query on `db1.airport` in the local metastore using the `/api/translations/translate` endpoint:
curl --header "Content-Type: application/json" \
--request POST \
--data '{
"fromLanguage":"hive",
"toLanguage":"trino",
"query":"SELECT * FROM db1.airport"
}' \
http://localhost:8080/api/translations/translate
The translation result is:
Original query in HiveQL:
SELECT * FROM db1.airport
Translated to Trino SQL:
SELECT "name", "country", "area_code", "code", "datepartition"
FROM "db1"."airport"
- Hive to Trino
- Hive to Spark
- Trino to Spark
Note: During Trino-to-Spark translation, any view referenced in a query is assumed to be defined in HiveQL, so views defined in Trino cannot be referenced when translating from Trino; currently, Trino queries may reference base tables only. This translation path is a proof of concept and may need further improvements.