
[Website] delete 404 links #24744

Merged 1 commit on Dec 22, 2022
@@ -38,7 +38,7 @@ Also, any other CDAP Batch plugin based on Hadoop's `InputFormat` or `OutputFormat`
Requirements for CDAP Streaming plugins:
- The CDAP Streaming plugin should be based on `Spark Receiver` (Spark 2.4).
- The CDAP Streaming plugin should support working with offsets.
-- Corresponding Spark Receiver should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
+- Corresponding Spark Receiver should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/2/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
- Records should have a numeric field that represents the record offset.
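As a rough illustration of the offset contract described above, here is a minimal sketch in plain Java. `HasOffsetLike` is a locally defined stand-in that only mirrors the shape of Beam's `HasOffset`, and `ToyReceiver` is entirely hypothetical; neither is the actual Beam or Spark API.

```java
// Sketch only: a local stand-in mirroring the shape of the HasOffset
// contract (method names here are assumptions, not the real Beam API).
interface HasOffsetLike {
    void setStartOffset(Long startOffset);
    Long getEndOffset();
}

// Hypothetical receiver serving records from an in-memory buffer; every
// record carries a numeric offset, as the requirements above demand.
class ToyReceiver implements HasOffsetLike {
    private final String[] records = {"a", "b", "c", "d"};
    private long startOffset = 0L;

    @Override
    public void setStartOffset(Long startOffset) {
        // Resume reading from the given offset after a restart or split.
        this.startOffset = (startOffset == null) ? 0L : startOffset;
    }

    @Override
    public Long getEndOffset() {
        // Exclusive upper bound of offsets this receiver can currently serve.
        return (long) records.length;
    }

    // Each emitted record is tagged with its numeric offset.
    String recordAt(long offset) {
        return offset + ":" + records[(int) offset];
    }

    long currentStartOffset() {
        return startOffset;
    }
}
```

The point of the contract is that a restarted receiver can be told exactly where to resume via `setStartOffset`, rather than replaying from the beginning.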

## Batch reading using CdapIO
@@ -26,7 +26,7 @@ SparkReceiverIO is a transform for reading data from an Apache Spark Receiver as
Requirements for `Spark Receiver`:
- The Spark version should be 2.4.*.
- `Spark Receiver` should support working with offsets.
-- `Spark Receiver` should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
+- `Spark Receiver` should implement [HasOffset](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/2/src/main/java/org/apache/beam/sdk/io/sparkreceiver/HasOffset.java) interface.
- Records should have a numeric field that represents the record offset.
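To motivate the offset requirement, the following toy resumption loop (plain Java, no Beam or Spark dependencies; `OffsetSource` and `ResumableReader` are hypothetical names) shows how a numeric per-record offset lets a restarted reader pick up exactly where the previous one stopped, without re-emitting records:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical source whose records carry monotonically increasing offsets.
class OffsetSource {
    private final List<String> data = List.of("r0", "r1", "r2", "r3", "r4");

    long endOffset() {
        return data.size(); // exclusive upper bound
    }

    String read(long offset) {
        return data.get((int) offset);
    }
}

class ResumableReader {
    private final OffsetSource source;
    private long committed; // next offset to read

    ResumableReader(OffsetSource source, long committedOffset) {
        this.source = source;
        this.committed = committedOffset;
    }

    // Drain everything currently available and advance the committed offset,
    // so a reader restarted from that offset skips already-read records.
    List<String> drain() {
        List<String> out = new ArrayList<>();
        long end = source.endOffset();
        for (long off = committed; off < end; off++) {
            out.add(source.read(off));
        }
        committed = end;
        return out;
    }

    long committedOffset() {
        return committed;
    }
}
```

Without a numeric offset there is no compact position to persist between runs, which is why the receiver contract insists on one.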

For more details, see the [SparkReceiverIO readme](https://github.com/apache/beam/blob/master/sdks/java/io/sparkreceiver/2/README.md).
8 changes: 4 additions & 4 deletions website/www/site/content/en/documentation/io/connectors.md
@@ -798,14 +798,14 @@ This table provides a consolidated, at-a-glance overview of the available built-
<tr>
<td>SparkReceiverIO (<a href="/documentation/io/built-in/sparkreceiver/">guide</a>)</td>
<td class="present">✔</td>
-    <td class="present">✘</td>
+    <td class="absent">✘</td>
<td class="present">
-      <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/sparkreceiver/2/SparkReceiverIO.html">native</a>
+      native
</td>
<td>Not available</td>
<td>Not available</td>
-    <td class="present">✘</td>
+    <td class="absent">✘</td>
<td class="present">✔</td>
</tr>
<tr>
@@ -814,7 +814,7 @@
<td class="present">✔</td>
<td class="present">
-      <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/cdap/CdapIO.html">native</a>
+      native
</td>
<td>Not available</td>
<td>Not available</td>