Init branch-21.08 #2543

Merged: 1 commit, May 31, 2021
6 changes: 3 additions & 3 deletions api_validation/pom.xml
@@ -22,10 +22,10 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
</parent>
<artifactId>rapids-4-spark-api-validation</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<profiles>
<profile>
@@ -78,7 +78,7 @@
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-aggregator_${scala.binary.version}</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
</dependencies>
4 changes: 2 additions & 2 deletions dist/pom.xml
@@ -22,12 +22,12 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
</parent>
<artifactId>rapids-4-spark_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Distribution</name>
<description>Creates the distribution package of the RAPIDS plugin for Apache Spark</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<dependencies>
<dependency>
2 changes: 1 addition & 1 deletion docs/configs.md
@@ -10,7 +10,7 @@ The following is the list of options that `rapids-plugin-4-spark` supports.
On startup use: `--conf [conf key]=[conf value]`. For example:

```
-${SPARK_HOME}/bin/spark --jars 'rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar,cudf-21.06.0-SNAPSHOT-cuda11.jar' \
+${SPARK_HOME}/bin/spark --jars 'rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar,cudf-21.08.0-SNAPSHOT-cuda11.jar' \
--conf spark.plugins=com.nvidia.spark.SQLPlugin \
--conf spark.rapids.sql.incompatibleOps.enabled=true
```
4 changes: 2 additions & 2 deletions docs/get-started/Dockerfile.cuda
@@ -50,8 +50,8 @@ COPY spark-3.0.2-bin-hadoop3.2/examples /opt/spark/examples
COPY spark-3.0.2-bin-hadoop3.2/kubernetes/tests /opt/spark/tests
COPY spark-3.0.2-bin-hadoop3.2/data /opt/spark/data

-COPY cudf-21.06.0-SNAPSHOT-cuda11.jar /opt/sparkRapidsPlugin
-COPY rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar /opt/sparkRapidsPlugin
+COPY cudf-21.08.0-SNAPSHOT-cuda11.jar /opt/sparkRapidsPlugin
+COPY rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar /opt/sparkRapidsPlugin
COPY getGpusResources.sh /opt/sparkRapidsPlugin

RUN mkdir /opt/spark/python
8 changes: 4 additions & 4 deletions docs/get-started/getting-started-on-prem.md
@@ -53,17 +53,17 @@ CUDA and will not run on other versions. The jars use a maven classifier to keep
- CUDA 11.0/11.1/11.2 => classifier cuda11

For example, here is a sample version of the jars and cudf with CUDA 11.0 support:
-- cudf-21.06.0-SNAPSHOT-cuda11.jar
-- rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar
+- cudf-21.08.0-SNAPSHOT-cuda11.jar
+- rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar
jar that your version of the accelerator depends on.


For simplicity export the location to these jars. This example assumes the sample jars above have
been placed in the `/opt/sparkRapidsPlugin` directory:
```shell
export SPARK_RAPIDS_DIR=/opt/sparkRapidsPlugin
-export SPARK_CUDF_JAR=${SPARK_RAPIDS_DIR}/cudf-21.06.0-SNAPSHOT-cuda11.jar
-export SPARK_RAPIDS_PLUGIN_JAR=${SPARK_RAPIDS_DIR}/rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar
+export SPARK_CUDF_JAR=${SPARK_RAPIDS_DIR}/cudf-21.08.0-SNAPSHOT-cuda11.jar
+export SPARK_RAPIDS_PLUGIN_JAR=${SPARK_RAPIDS_DIR}/rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar
```
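For context, these exported paths are what later get handed to Spark. A minimal sketch (not part of this diff) of how they are typically used, assuming the plugin is enabled via `spark.plugins` as in the `docs/configs.md` example above:

```shell
# Illustrative only -- not part of this change. Assumes SPARK_HOME is set and
# the two variables were exported as shown above.
${SPARK_HOME}/bin/spark-shell \
  --jars ${SPARK_CUDF_JAR},${SPARK_RAPIDS_PLUGIN_JAR} \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin
```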

## Install the GPU Discovery Script
6 changes: 3 additions & 3 deletions integration_tests/README.md
@@ -134,7 +134,7 @@ individually, so you don't risk running unit tests along with the integration tests
http://www.scalatest.org/user_guide/using_the_scalatest_shell

```shell
-spark-shell --jars rapids-4-spark-tests_2.12-21.06.0-SNAPSHOT-tests.jar,rapids-4-spark-udf-examples_2.12-21.06.0-SNAPSHOT,rapids-4-spark-integration-tests_2.12-21.06.0-SNAPSHOT-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
+spark-shell --jars rapids-4-spark-tests_2.12-21.08.0-SNAPSHOT-tests.jar,rapids-4-spark-udf-examples_2.12-21.08.0-SNAPSHOT,rapids-4-spark-integration-tests_2.12-21.08.0-SNAPSHOT-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
```

First you import the `scalatest_shell` and tell the tests where they can find the test files you
@@ -158,7 +158,7 @@ If you just want to verify the SQL replacement is working you will need to add t
example assumes CUDA 11.0 is being used.

```
-$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar,rapids-4-spark-udf-examples_2.12-21.06.0-SNAPSHOT.jar,cudf-21.06.0-SNAPSHOT-cuda11.jar" ./runtests.py
+$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar,rapids-4-spark-udf-examples_2.12-21.08.0-SNAPSHOT.jar,cudf-21.08.0-SNAPSHOT-cuda11.jar" ./runtests.py
```

You don't have to enable the plugin for this to work, the test framework will do that for you.
@@ -232,7 +232,7 @@ To run cudf_udf tests, need following configuration changes:
As an example, here is the `spark-submit` command with the cudf_udf parameter on CUDA 11.0:

```
-$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar,rapids-4-spark-udf-examples_2.12-21.06.0-SNAPSHOT.jar,cudf-21.06.0-SNAPSHOT-cuda11.jar,rapids-4-spark-tests_2.12-21.06.0-SNAPSHOT.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-21.06.0-SNAPSHOT.jar" ./runtests.py --cudf_udf
+$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar,rapids-4-spark-udf-examples_2.12-21.08.0-SNAPSHOT.jar,cudf-21.08.0-SNAPSHOT-cuda11.jar,rapids-4-spark-tests_2.12-21.08.0-SNAPSHOT.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-21.08.0-SNAPSHOT.jar" ./runtests.py --cudf_udf
```

## Writing tests
4 changes: 2 additions & 2 deletions integration_tests/pom.xml
@@ -22,10 +22,10 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
</parent>
<artifactId>rapids-4-spark-integration-tests_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<dependencies>
<dependency>
2 changes: 1 addition & 1 deletion jenkins/Dockerfile-blossom.integration.centos
@@ -19,7 +19,7 @@
# Arguments:
# CUDA_VER=11.0, 11.1 or 11.2.x
# CENTOS_VER=7 or 8
-# CUDF_VER=21.06
+# CUDF_VER=21.08
# URM_URL=<maven repo url>
###

2 changes: 1 addition & 1 deletion jenkins/Dockerfile-blossom.integration.ubuntu
@@ -21,7 +21,7 @@
# Arguments:
# CUDA_VER=11.0, 11.1 or 11.2.x
# UBUNTU_VER=18.04 or 20.04
-# CUDF_VER=21.06
+# CUDF_VER=21.08
###

ARG CUDA_VER=11.0
2 changes: 1 addition & 1 deletion jenkins/databricks/create.py
@@ -27,7 +27,7 @@ def main():
workspace = 'https://dbc-9ff9942e-a9c4.cloud.databricks.com'
token = ''
sshkey = ''
-cluster_name = 'CI-GPU-databricks-21.06.0-SNAPSHOT'
+cluster_name = 'CI-GPU-databricks-21.08.0-SNAPSHOT'
idletime = 240
runtime = '7.0.x-gpu-ml-scala2.12'
num_workers = 1
2 changes: 1 addition & 1 deletion jenkins/databricks/init_cudf_udf.sh
@@ -18,7 +18,7 @@
# The initscript to set up environment for the cudf_udf tests on Databricks
# Will be automatically pushed into the dbfs:/databricks/init_scripts once it is updated.

-CUDF_VER=${CUDF_VER:-21.06}
+CUDF_VER=${CUDF_VER:-21.08}

# Use mamba to install cudf-udf packages to speed up conda resolve time
base=$(conda info --base)
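The lines shown stop at locating the conda base; the actual install step sits outside this hunk. A hedged sketch of what a mamba-based install of cudf at `${CUDF_VER}` could look like — the channels, Python version, and package list here are assumptions, not taken from the script:

```shell
# Hypothetical sketch, not part of this diff: install cudf for the cudf_udf
# tests with mamba, pinning the version set above. Channel names and the
# Python version are illustrative assumptions.
${base}/bin/mamba install -y \
    -c rapidsai-nightly -c nvidia -c conda-forge \
    cudf=${CUDF_VER} python=3.8
```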
2 changes: 1 addition & 1 deletion jenkins/printJarVersion.sh
@@ -24,7 +24,7 @@ function print_ver(){
SERVER_ID=$5

# Collect snapshot dependency info only in Jenkins build
-# In dev build, print 'SNAPSHOT' tag without time stamp, e.g.: cudf-21.06.0-SNAPSHOT.jar
+# In dev build, print 'SNAPSHOT' tag without time stamp, e.g.: cudf-21.08.0-SNAPSHOT.jar
if [[ "$VERSION" == *"-SNAPSHOT" && -n "$JENKINS_URL" ]]; then
PREFIX=${VERSION%-SNAPSHOT}
# List the latest SNAPSHOT jar file in the maven repo
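For the dev-build case the comment describes (print the plain `SNAPSHOT` name without a timestamp), a hedged sketch of what that branch might reduce to — the helper below is illustrative only and is not part of `print_ver()` as shipped:

```shell
# Hypothetical helper, for illustration only: a dev build (no JENKINS_URL set)
# would print the artifact name with the bare SNAPSHOT version.
print_dev_ver() {
    local artifact_id=$1 version=$2
    echo "${artifact_id}-${version}.jar"
}

print_dev_ver cudf 21.08.0-SNAPSHOT    # prints: cudf-21.08.0-SNAPSHOT.jar
```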
6 changes: 3 additions & 3 deletions jenkins/version-def.sh
@@ -26,10 +26,10 @@ for VAR in $OVERWRITE_PARAMS;do
done
IFS=$PRE_IFS

-CUDF_VER=${CUDF_VER:-"21.06.0-SNAPSHOT"}
+CUDF_VER=${CUDF_VER:-"21.08.0-SNAPSHOT"}
CUDA_CLASSIFIER=${CUDA_CLASSIFIER:-"cuda11"}
-PROJECT_VER=${PROJECT_VER:-"21.06.0-SNAPSHOT"}
-PROJECT_TEST_VER=${PROJECT_TEST_VER:-"21.06.0-SNAPSHOT"}
+PROJECT_VER=${PROJECT_VER:-"21.08.0-SNAPSHOT"}
+PROJECT_TEST_VER=${PROJECT_TEST_VER:-"21.08.0-SNAPSHOT"}
SPARK_VER=${SPARK_VER:-"3.0.1"}
SCALA_BINARY_VER=${SCALA_BINARY_VER:-"2.12"}
SERVER_ID=${SERVER_ID:-"snapshots"}
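Because every default above uses the `${VAR:-default}` form, the new 21.08 values apply only when the caller has not already set the variable. A hedged usage sketch follows; the exact CI invocation is not shown in this diff, and running the script directly (rather than sourcing it) is an assumption:

```shell
# Illustrative only: override the defaults shown above by exporting the
# variables first; anything left unset falls back to the 21.08.0-SNAPSHOT defaults.
export CUDF_VER=21.08.0-SNAPSHOT
export PROJECT_VER=21.08.0-SNAPSHOT
bash jenkins/version-def.sh
```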
4 changes: 2 additions & 2 deletions pom.xml
@@ -23,7 +23,7 @@
<artifactId>rapids-4-spark-parent</artifactId>
<name>RAPIDS Accelerator for Apache Spark Root Project</name>
<description>The root project of the RAPIDS Accelerator for Apache Spark</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<packaging>pom</packaging>

<url>https://github.com/NVIDIA</url>
@@ -219,7 +219,7 @@
<spark.version>${spark301.version}</spark.version>
<spark.test.version>${spark301.version}</spark.test.version>
<cuda.version>cuda11</cuda.version>
-<cudf.version>21.06.0-SNAPSHOT</cudf.version>
+<cudf.version>21.08.0-SNAPSHOT</cudf.version>
<scala.binary.version>2.12</scala.binary.version>
<scala.version>2.12.8</scala.version>
<orc.version>1.5.8</orc.version>
4 changes: 2 additions & 2 deletions rapids-4-spark-tools/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
</parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-tools</artifactId>
<name>RAPIDS Accelerator for Apache Spark tools</name>
<description>RAPIDS Accelerator for Apache Spark tools</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<packaging>jar</packaging>

<properties>
@@ -137,10 +137,10 @@ object QualificationInfoUtils extends Logging {
/*
* Example command:
* $SPARK_HOME/bin/spark-submit --master local[1] --driver-memory 30g \
- * --jars ./rapids-4-spark-tools/target/rapids-4-spark-tools-21.06.0-SNAPSHOT-tests.jar,\
- * ./rapids-4-spark-tools/target/rapids-4-spark-tools-21.06.0-SNAPSHOT.jar \
+ * --jars ./rapids-4-spark-tools/target/rapids-4-spark-tools-21.08.0-SNAPSHOT-tests.jar,\
+ * ./rapids-4-spark-tools/target/rapids-4-spark-tools-21.08.0-SNAPSHOT.jar \
* --class com.nvidia.spark.rapids.tool.profiling.QualificationInfoUtils \
- * ./rapids-4-spark-tools/target/rapids-4-spark-tools-21.06.0-SNAPSHOT-tests.jar udffunc \
+ * ./rapids-4-spark-tools/target/rapids-4-spark-tools-21.08.0-SNAPSHOT-tests.jar udffunc \
* /tmp/testeventlogDir 100001
*/
def main(args: Array[String]): Unit = {
4 changes: 2 additions & 2 deletions shims/aggregator/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-aggregator_2.12</artifactId>
<packaging>jar</packaging>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Shim Aggregator</name>
<description>The RAPIDS SQL plugin for Apache Spark Shim Aggregator</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<profiles>
<profile>
4 changes: 2 additions & 2 deletions shims/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
<packaging>pom</packaging>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Shims</name>
<description>The RAPIDS SQL plugin for Apache Spark Shims</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<profiles>
<profile>
4 changes: 2 additions & 2 deletions shims/spark301/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark301_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.1 Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.0.1 Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark301db/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark301-databricks_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.1 Databricks Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.0.1 Databricks Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark301emr/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark301emr_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.1 EMR Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.0.1 EMR Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark302/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark302_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.2 Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.0.2 Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark303/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark303_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0.3 Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.0.3 Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark311/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark311_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.1.1 Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.1.1 Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark311db/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark311-databricks_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.1.1 Databricks Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.1.1 Databricks Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->
4 changes: 2 additions & 2 deletions shims/spark312/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-shims-spark312_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.1.2 Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.1.2 Shim</description>
-<version>21.06.0-SNAPSHOT</version>
+<version>21.08.0-SNAPSHOT</version>

<!-- Set 'spark.version' for the shims layer -->
<!-- Create a separate file 'SPARK_VER.properties' in the jar to save cudf & spark version info -->