From 0f53d197e037e5c510865bb201293387e2a33311 Mon Sep 17 00:00:00 2001
From: Bulat
Date: Wed, 21 Dec 2022 16:56:43 +0600
Subject: [PATCH] [Website] delete 404 links #24745

---
 .../www/site/content/en/documentation/io/built-in/cdap.md          | 2 +-
 .../site/content/en/documentation/io/built-in/sparkreceiver.md     | 2 +-
 .../www/site/content/en/documentation/io/connectors.md             | 8 ++++----
 3 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/website/www/site/content/en/documentation/io/built-in/cdap.md b/website/www/site/content/en/documentation/io/built-in/cdap.md
index 3ed4fd06ed0e..76ead911ebe5 100644
--- a/website/www/site/content/en/documentation/io/built-in/cdap.md
+++ b/website/www/site/content/en/documentation/io/built-in/cdap.md
@@ -38,7 +38,7 @@ Also, any other CDAP Batch plugin based on Hadoop's `InputFormat` or `OutputForm
 Requirements for CDAP Streaming plugins:
 - CDAP Streaming plugin should be based on `Spark Receiver` (Spark 2.4).
 - CDAP Streaming plugin should support work with offsets.
-- Corresponding Spark Receiver should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
+- Corresponding Spark Receiver should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/2/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
 - Records should have the numeric field that represents record offset.

 ## Batch reading using CdapIO
diff --git a/website/www/site/content/en/documentation/io/built-in/sparkreceiver.md b/website/www/site/content/en/documentation/io/built-in/sparkreceiver.md
index 31c33988f31f..c6476c8f25f1 100644
--- a/website/www/site/content/en/documentation/io/built-in/sparkreceiver.md
+++ b/website/www/site/content/en/documentation/io/built-in/sparkreceiver.md
@@ -26,7 +26,7 @@ SparkReceiverIO is a transform for reading data from an Apache Spark Receiver as
 Requirements for `Spark Receiver`:
 - Version of Spark should be 2.4.*.
 - `Spark Receiver` should support work with offsets.
-- `Spark Receiver` should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
+- `Spark Receiver` should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/2/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
 - Records should have the numeric field that represents record offset.

 For more details please see [SparkReceiverIO readme](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/2/README.md).
diff --git a/website/www/site/content/en/documentation/io/connectors.md b/website/www/site/content/en/documentation/io/connectors.md
index 852e019a9446..7ad4057580e4 100644
--- a/website/www/site/content/en/documentation/io/connectors.md
+++ b/website/www/site/content/en/documentation/io/connectors.md
@@ -798,14 +798,14 @@ This table provides a consolidated, at-a-glance overview of the available built-
         SparkReceiverIO (guide)
         ✔
-        ✘
+        ✘
         ✔
-        native
+        native
         Not available
         Not available
-        ✘
+        ✘
         ✔
@@ -814,7 +814,7 @@ This table provides a consolidated, at-a-glance overview of the available built-
         ✔
         ✔
-        native
+        native
         Not available
         Not available
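The docs touched by this patch require that a `Spark Receiver` implement `HasOffset` and that records carry a numeric offset field. The following is a minimal, self-contained sketch of that contract, not the real Beam or Spark API: the actual interface lives at the `HasOffset.java` path linked above, a real receiver extends Spark's `Receiver`, and the method names `setStartOffset`/`getEndOffset`, the `ReceiverRecord` type, and the `ToyReceiver` class here are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for Beam's HasOffset interface (method names assumed):
// the reader tells the receiver where to resume, and asks how far it has read.
interface HasOffset {
    void setStartOffset(Long startOffset);
    Long getEndOffset();
}

// A record carrying the numeric offset field the requirements call for.
class ReceiverRecord {
    final long offset;
    final String payload;

    ReceiverRecord(long offset, String payload) {
        this.offset = offset;
        this.payload = payload;
    }
}

// Toy in-memory "receiver" (hypothetical, not a real Spark Receiver) showing
// how honoring the start offset lets a restarted reader resume without
// re-emitting already-read records.
class ToyReceiver implements HasOffset {
    private final List<ReceiverRecord> source;
    private long nextOffset = 0;

    ToyReceiver(List<ReceiverRecord> source) {
        this.source = source;
    }

    @Override
    public void setStartOffset(Long startOffset) {
        if (startOffset != null) {
            nextOffset = startOffset;
        }
    }

    @Override
    public Long getEndOffset() {
        return nextOffset;
    }

    // Emit up to maxRecords records at or beyond the current offset, advancing
    // the offset past each emitted record.
    List<ReceiverRecord> poll(int maxRecords) {
        List<ReceiverRecord> out = new ArrayList<>();
        for (ReceiverRecord r : source) {
            if (r.offset >= nextOffset && out.size() < maxRecords) {
                out.add(r);
                nextOffset = r.offset + 1;
            }
        }
        return out;
    }
}
```

The point of the offset contract: after a restart, the reader can call `setStartOffset` with the last committed offset and the receiver skips everything before it, which is what makes offset-based resumption possible for SparkReceiverIO and CDAP streaming plugins.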