docs: revise URL structure (libraries) #3269

Merged: 1 commit, Oct 1, 2024
2 changes: 1 addition & 1 deletion .github/workflows/link-validator.yml
@@ -37,4 +37,4 @@ jobs:
run: sbt docs/makeSite

- name: Run Link Validator
-run: cs launch net.runne::site-link-validator:0.2.4 -- scripts/link-validator.conf
+run: cs launch net.runne::site-link-validator:0.2.5 -- scripts/link-validator.conf
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -56,7 +56,7 @@ Please have a look at our [contributor advice](contributor-advice.md).
## Binary compatibility (MiMa)

Binary compatibility rules and guarantees are described in depth in the [Binary Compatibility Rules
-](https://doc.akka.io/docs/akka/snapshot/common/binary-compatibility-rules.html) section of the Akka documentation.
+](https://doc.akka.io/libraries/akka-core/snapshot/common/binary-compatibility-rules.html) section of the Akka documentation.

Akka projects use [MiMa](https://github.com/lightbend/mima) to validate binary compatibility of incoming pull requests. If your PR fails due to binary compatibility issues, you may see an error like this:

8 changes: 4 additions & 4 deletions README.md
@@ -3,16 +3,16 @@ Alpakka

Systems don't come alone. In the modern world of microservices and cloud deployment, new components must interact with legacy systems, making integration an important key to success. Reactive Streams give us a technology-independent tool to let these heterogeneous systems communicate without overwhelming each other.

-The Alpakka project implements stream-aware & reactive integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](http://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/docs/akka/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.
+The Alpakka project implements stream-aware & reactive integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/libraries/akka-core/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](http://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/libraries/akka-core/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.

The Akka family of projects is managed by teams at [Lightbend](https://lightbend.com) with help from the community.

Documentation
-------------

-- [Alpakka reference](https://doc.akka.io/docs/alpakka/current/) documentation
+- [Alpakka reference](https://doc.akka.io/libraries/alpakka/current/) documentation

-- [Alpakka Kafka connector reference](https://doc.akka.io/docs/akka-stream-kafka/current/) documentation
+- [Alpakka Kafka connector reference](https://doc.akka.io/libraries/akka-stream-kafka/current/) documentation

To keep up with the latest Alpakka releases check out [Alpakka releases](https://github.com/akka/alpakka/releases) and [Alpakka Kafka connector releases](https://github.com/akka/alpakka-kafka/releases).

@@ -58,4 +58,4 @@ License
-------
Alpakka is licensed under the [Business Source License (BSL) 1.1](https://github.com/akka/alpakka/blob/main/LICENSE), please see the [Akka License FAQ](https://www.lightbend.com/akka/license-faq).

-Tests and documentation are under a separate license, see the LICENSE file in each documentation and test root directory for details.
+Tests and documentation are under a separate license, see the LICENSE file in each documentation and test root directory for details.
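The README paragraph above notes that Akka Streams is compliant with the JDK 9+ `java.util.concurrent.Flow` interfaces. As a plain-JDK illustration of the backpressure idea (demand signalled via `Subscription.request`), here is a minimal sketch that uses only `java.util.concurrent`; it demonstrates the Flow API the paragraph links to, not Akka Streams itself:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowDemo {
    // Minimal Flow.Subscriber that requests one element at a time:
    // the explicit demand signal (request(1)) is the backpressure
    // mechanism the README paragraph refers to.
    static class OneAtATime implements Flow.Subscriber<Integer> {
        final List<Integer> received = new ArrayList<>();
        final CountDownLatch done = new CountDownLatch(1);
        private Flow.Subscription subscription;

        @Override public void onSubscribe(Flow.Subscription s) { subscription = s; s.request(1); }
        @Override public void onNext(Integer item) { received.add(item); subscription.request(1); }
        @Override public void onError(Throwable t) { done.countDown(); }
        @Override public void onComplete() { done.countDown(); }
    }

    // Publishes 1..n and returns what the subscriber saw, in order.
    static List<Integer> run(int n) {
        SubmissionPublisher<Integer> publisher = new SubmissionPublisher<>();
        OneAtATime subscriber = new OneAtATime();
        publisher.subscribe(subscriber);
        for (int i = 1; i <= n; i++) publisher.submit(i);
        publisher.close();
        try {
            subscriber.done.await();
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }
        return subscriber.received;
    }

    public static void main(String[] args) {
        System.out.println(run(5)); // [1, 2, 3, 4, 5]
    }
}
```

`SubmissionPublisher` buffers while the subscriber has no outstanding demand, so elements arrive strictly one at a time without any being dropped.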
4 changes: 2 additions & 2 deletions awslambda/src/test/java/docs/javadsl/Examples.java
@@ -38,15 +38,15 @@ public void initClient() {
// #init-client

// Don't encode credentials in your source code!
-// see https://doc.akka.io/docs/alpakka/current/aws-shared-configuration.html
+// see https://doc.akka.io/libraries/alpakka/current/aws-shared-configuration.html
StaticCredentialsProvider credentialsProvider =
StaticCredentialsProvider.create(AwsBasicCredentials.create("x", "x"));
LambdaAsyncClient awsLambdaClient =
LambdaAsyncClient.builder()
.credentialsProvider(credentialsProvider)
.httpClient(AkkaHttpClient.builder().withActorSystem(system).build())
// Possibility to configure the retry policy
-// see https://doc.akka.io/docs/alpakka/current/aws-shared-configuration.html
+// see https://doc.akka.io/libraries/alpakka/current/aws-shared-configuration.html
// .overrideConfiguration(...)
.build();

4 changes: 2 additions & 2 deletions awslambda/src/test/scala/docs/scaladsl/Examples.scala
@@ -23,14 +23,14 @@ object Examples {
import software.amazon.awssdk.services.lambda.LambdaAsyncClient

// Don't encode credentials in your source code!
-// see https://doc.akka.io/docs/alpakka/current/aws-shared-configuration.html
+// see https://doc.akka.io/libraries/alpakka/current/aws-shared-configuration.html
val credentialsProvider = StaticCredentialsProvider.create(AwsBasicCredentials.create("x", "x"))
implicit val lambdaClient: LambdaAsyncClient = LambdaAsyncClient
.builder()
.credentialsProvider(credentialsProvider)
.httpClient(AkkaHttpClient.builder().withActorSystem(system).build())
// Possibility to configure the retry policy
-// see https://doc.akka.io/docs/alpakka/current/aws-shared-configuration.html
+// see https://doc.akka.io/libraries/alpakka/current/aws-shared-configuration.html
// .overrideConfiguration(...)
.build()

8 changes: 4 additions & 4 deletions build.sbt
@@ -387,7 +387,7 @@ lazy val docs = project
("http://www\\.scala-lang\\.org/".r, _ => "https://www\\.scala-lang\\.org/"),
("https://javadoc\\.io/page/".r, _ => "https://javadoc\\.io/static/")
),
-Paradox / siteSubdirName := s"docs/alpakka/${projectInfoVersion.value}",
+Paradox / siteSubdirName := s"libraries/alpakka/${projectInfoVersion.value}",
// make use of https://github.com/scala/scala/pull/8663
Compile / doc / scalacOptions ++= Seq(
"-jdk-api-doc-base",
@@ -399,16 +399,16 @@
"hadoop.version" -> Dependencies.HadoopVersion,
"extref.github.base_url" -> s"https://github.com/akka/alpakka/tree/${if (isSnapshot.value) "main"
else "v" + version.value}/%s",
-"extref.akka.base_url" -> s"https://doc.akka.io/docs/akka/${Dependencies.AkkaBinaryVersion}/%s",
+"extref.akka.base_url" -> s"https://doc.akka.io/libraries/akka-core/${Dependencies.AkkaBinaryVersion}/%s",
"scaladoc.akka.base_url" -> s"https://doc.akka.io/api/akka/${Dependencies.AkkaBinaryVersion}",
"javadoc.akka.base_url" -> s"https://doc.akka.io/japi/akka/${Dependencies.AkkaBinaryVersion}/",
"javadoc.akka.link_style" -> "direct",
-"extref.akka-http.base_url" -> s"https://doc.akka.io/docs/akka-http/${Dependencies.AkkaHttpBinaryVersion}/%s",
+"extref.akka-http.base_url" -> s"https://doc.akka.io/libraries/akka-http/${Dependencies.AkkaHttpBinaryVersion}/%s",
"scaladoc.akka.http.base_url" -> s"https://doc.akka.io/api/akka-http/${Dependencies.AkkaHttpBinaryVersion}/",
"javadoc.akka.http.base_url" -> s"https://doc.akka.io/japi/akka-http/${Dependencies.AkkaHttpBinaryVersion}/",
// Akka gRPC
"akka-grpc.version" -> Dependencies.AkkaGrpcBinaryVersion,
-"extref.akka-grpc.base_url" -> s"https://doc.akka.io/docs/akka-grpc/${Dependencies.AkkaGrpcBinaryVersion}/%s",
+"extref.akka-grpc.base_url" -> s"https://doc.akka.io/libraries/akka-grpc/${Dependencies.AkkaGrpcBinaryVersion}/%s",
// Couchbase
"couchbase.version" -> Dependencies.CouchbaseVersion,
"extref.couchbase.base_url" -> s"https://docs.couchbase.com/java-sdk/${Dependencies.CouchbaseVersionForDocs}/%s",
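For context on the `extref.*.base_url` settings changed in this hunk: Paradox substitutes the link target into the `%s` placeholder of the base URL. A minimal sketch of that substitution (the version string stands in for `Dependencies.AkkaBinaryVersion`, and the exact Paradox mechanics are an assumption here):

```java
public class ExtrefDemo {
    // Sketch of how a Paradox "extref" base URL containing a %s
    // placeholder is expanded: the link target from the document is
    // formatted into the placeholder.
    static String resolve(String baseUrl, String target) {
        return String.format(baseUrl, target);
    }

    public static void main(String[] args) {
        String akkaBinaryVersion = "2.9"; // stand-in for Dependencies.AkkaBinaryVersion
        String base = "https://doc.akka.io/libraries/akka-core/" + akkaBinaryVersion + "/%s";
        System.out.println(resolve(base, "stream/index.html"));
        // https://doc.akka.io/libraries/akka-core/2.9/stream/index.html
    }
}
```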
@@ -17,7 +17,7 @@ import scala.jdk.DurationConverters._
import scala.jdk.FutureConverters._

/**
-* [[https://doc.akka.io/docs/akka/current/discovery/index.html Akka Discovery]]
+* [[https://doc.akka.io/libraries/akka-core/current/discovery/index.html Akka Discovery]]
* is enabled by setting the `service-discovery.name` in the given `CassandraSession` config.
*
* Akka Discovery overwrites the `basic.contact-points` from the configuration with addresses
2 changes: 1 addition & 1 deletion contributor-advice.md
@@ -105,7 +105,7 @@ All Akka APIs aim to evolve in a binary compatible way within minor versions.

1. To generate a case class replacement, consider using [Kaze Class](https://github.com/ktoso/kaze-class)

-See [Binary Compatibility Rules](https://doc.akka.io/docs/akka/current/common/binary-compatibility-rules.html) in the Akka documentation.
+See [Binary Compatibility Rules](https://doc.akka.io/libraries/akka-core/current/common/binary-compatibility-rules.html) in the Akka documentation.

See [Binary Compatibility for library authors](https://docs.scala-lang.org/overviews/core/binary-compatibility-for-library-authors.html)

@@ -13,7 +13,7 @@

/**
* Utility to delegate Couchbase node address lookup to
-* [[https://doc.akka.io/docs/akka/current/discovery/index.html Akka Discovery]].
+* [[https://doc.akka.io/libraries/akka-core/current/discovery/index.html Akka Discovery]].
*/
public final class DiscoverySupport {

@@ -20,7 +20,7 @@ import scala.concurrent.Future
import scala.concurrent.duration.FiniteDuration

/**
-* Utility to delegate Couchbase node address lookup to [[https://doc.akka.io/docs/akka/current/discovery/index.html Akka Discovery]].
+* Utility to delegate Couchbase node address lookup to [[https://doc.akka.io/libraries/akka-core/current/discovery/index.html Akka Discovery]].
*/
sealed class DiscoverySupport private {

@@ -96,7 +96,7 @@ sealed class DiscoverySupport private {
}

/**
-* Utility to delegate Couchbase node address lookup to [[https://doc.akka.io/docs/akka/current/discovery/index.html Akka Discovery]].
+* Utility to delegate Couchbase node address lookup to [[https://doc.akka.io/libraries/akka-core/current/discovery/index.html Akka Discovery]].
*/
object DiscoverySupport extends DiscoverySupport {

4 changes: 2 additions & 2 deletions docs/release-train-issue-template.md
@@ -30,7 +30,7 @@ Key links:
### Check availability

- [ ] Check [API](https://doc.akka.io/api/alpakka/$VERSION$/) documentation
-- [ ] Check [reference](https://doc.akka.io/docs/alpakka/$VERSION$/) documentation. Check that the reference docs were deployed and show a version warning (see section below on how to fix the version warning).
+- [ ] Check [reference](https://doc.akka.io/libraries/alpakka/$VERSION$/) documentation. Check that the reference docs were deployed and show a version warning (see section below on how to fix the version warning).
- [ ] Check the release on https://repo.akka.io/maven/com/lightbend/akka/akka-stream-alpakka-xml_2.13/$VERSION$/akka-stream-alpakka-xml_2.13-$VERSION$.pom

### When everything is on https://repo.akka.io/maven
@@ -40,7 +40,7 @@ Key links:
```
cd ~/www
git status
-git add docs/alpakka/current docs/alpakka/$VERSION$
+git add libraries/alpakka/current libraries/alpakka/$VERSION$
git add api/alpakka/current api/alpakka/$VERSION$
git commit -m "Alpakka $VERSION$"
```
10 changes: 5 additions & 5 deletions docs/src/main/paradox/.htaccess
@@ -1,9 +1,9 @@
# This file can be used for redirects like:
-RedirectMatch 301 ^/docs/alpakka/([^/]+)/data-transformations/recordio.html https://doc.akka.io/docs/alpakka/$1/data-transformations/simple-codecs.html
+RedirectMatch 301 ^/docs/alpakka/([^/]+)/data-transformations/recordio.html https://doc.akka.io/libraries/alpakka/$1/data-transformations/simple-codecs.html
# Move CSV example to alpakka-samples
RedirectMatch 301 ^/docs/alpakka/([^/]+)/examples/csv-samples.html https://akka.io/alpakka-samples/
# Drop pre-1.0 release notes
-RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-RC1.html https://doc.akka.io/docs/alpakka/$1/release-notes/1.0.x.html
-RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-M3.html https://doc.akka.io/docs/alpakka/$1/release-notes/1.0.x.html
-RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-M2.html https://doc.akka.io/docs/alpakka/$1/release-notes/1.0.x.html
-RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-M1.html https://doc.akka.io/docs/alpakka/$1/release-notes/1.0.x.html
+RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-RC1.html https://doc.akka.io/libraries/alpakka/$1/release-notes/1.0.x.html
+RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-M3.html https://doc.akka.io/libraries/alpakka/$1/release-notes/1.0.x.html
+RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-M2.html https://doc.akka.io/libraries/alpakka/$1/release-notes/1.0.x.html
+RedirectMatch 301 ^/docs/alpakka/([^/]+)/release-notes/1.0-M1.html https://doc.akka.io/libraries/alpakka/$1/release-notes/1.0.x.html
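The `RedirectMatch` rules above rely on a capture group to carry the version segment (e.g. `current`) into the new URL. A small sketch of the same rewrite in `java.util.regex`; this is an approximation, since Apache httpd's regex dialect differs in some details, but the capture group and `$1` back-reference behave the same way here:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RedirectDemo {
    // Mirrors the first release-notes RedirectMatch rule from the
    // .htaccess hunk above.
    static final Pattern RULE =
        Pattern.compile("^/docs/alpakka/([^/]+)/release-notes/1.0-RC1.html");
    static final String TARGET =
        "https://doc.akka.io/libraries/alpakka/$1/release-notes/1.0.x.html";

    static String redirect(String path) {
        Matcher m = RULE.matcher(path);
        // No match: pass the path through unchanged.
        return m.matches() ? m.replaceFirst(TARGET) : path;
    }

    public static void main(String[] args) {
        System.out.println(redirect("/docs/alpakka/current/release-notes/1.0-RC1.html"));
        // https://doc.akka.io/libraries/alpakka/current/release-notes/1.0.x.html
    }
}
```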
4 changes: 2 additions & 2 deletions docs/src/main/paradox/external/grpc.md
@@ -1,8 +1,8 @@
# gRPC

-[gRPC](https://grpc.io/) is a transport mechanism for request/response and (non-persistent) streaming use cases. See [Why gRPC?](https://doc.akka.io/docs/akka-grpc/current/whygrpc.html) for more information about when to use gRPC as your transport.
+[gRPC](https://grpc.io/) is a transport mechanism for request/response and (non-persistent) streaming use cases. See [Why gRPC?](https://doc.akka.io/libraries/akka-grpc/current/whygrpc.html) for more information about when to use gRPC as your transport.

## Akka gRPC

Akka gRPC provides support for building Reactive Streams-compliant gRPC servers and clients on top of Akka Streams.
-Learn more about it in the [Akka gRPC documentation](https://doc.akka.io/docs/akka-grpc/current/).
+Learn more about it in the [Akka gRPC documentation](https://doc.akka.io/libraries/akka-grpc/current/).
2 changes: 1 addition & 1 deletion docs/src/main/paradox/external/http.md
@@ -5,4 +5,4 @@ Akka comes with its Reactive Streams-compliant **HTTP server and client**.
## Akka HTTP

This library is not maintained in the Alpakka repository.
-Learn more about it in the [Akka HTTP documentation](https://doc.akka.io/docs/akka-http/current/).
+Learn more about it in the [Akka HTTP documentation](https://doc.akka.io/libraries/akka-http/current/).
2 changes: 1 addition & 1 deletion docs/src/main/paradox/external/tcp.md
@@ -6,4 +6,4 @@ Akka Streams provides a way of handling **TCP connections** with streams. While
## Akka TCP

Akka comes with its Reactive Streams-compliant TCP server and client.
-Learn more about it in the [Akka Streaming TCP documentation](https://doc.akka.io/docs/akka/current/stream/stream-io.html#streaming-tcp).
+Learn more about it in the [Akka Streaming TCP documentation](https://doc.akka.io/libraries/akka-core/current/stream/stream-io.html#streaming-tcp).
2 changes: 1 addition & 1 deletion docs/src/main/paradox/file.md
@@ -38,7 +38,7 @@ The table below shows direct dependencies of this module and the second tab show

Use the `FileIO` class to create streams reading from or writing to files. It is part of Akka Streams.

-[Akka Streaming File IO documentation](https://doc.akka.io/docs/akka/current/stream/stream-io.html#streaming-file-io)
+[Akka Streaming File IO documentation](https://doc.akka.io/libraries/akka-core/current/stream/stream-io.html#streaming-file-io)


## Tailing a file into a stream
2 changes: 1 addition & 1 deletion docs/src/main/paradox/google-cloud-bigquery.md
@@ -130,7 +130,7 @@ The BigQuery connector enables loading data into tables via real-time streaming
For an overview of these strategies see the [BigQuery documentation](https://cloud.google.com/bigquery/docs/loading-data).

The @scala[@apidoc[BigQuery.insertAll[In]](BigQuery$)] @java[@apidoc[BigQuery.<In>insertAll](BigQuery$)] method creates a sink that accepts batches of @scala[`Seq[In]`] @java[`List<In>`]
-(for example created via the [`batch`](https://doc.akka.io/docs/akka/current/stream/operators/Source-or-Flow/batch.html) operator) and streams them directly into a table.
+(for example created via the [`batch`](https://doc.akka.io/libraries/akka-core/current/stream/operators/Source-or-Flow/batch.html) operator) and streams them directly into a table.
To enable/disable BigQuery’s best-effort deduplication feature use the appropriate @apidoc[InsertAllRetryPolicy$].

Scala
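For intuition on the `batch` operator mentioned in this hunk: it groups incoming elements into bounded batches for a bulk sink such as `BigQuery.insertAll`. A pure-Java sketch of the shape of data such a stage emits; the real Akka Streams operator is demand-driven and dynamically sized, so the fixed-size grouping here is only a simplification:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    // Group a list of elements into batches of at most maxBatch items,
    // preserving order, as a stand-in for what a batching stage hands
    // to a bulk insert sink.
    static <T> List<List<T>> batches(List<T> in, int maxBatch) {
        List<List<T>> out = new ArrayList<>();
        for (int i = 0; i < in.size(); i += maxBatch)
            out.add(new ArrayList<>(in.subList(i, Math.min(i + maxBatch, in.size()))));
        return out;
    }

    public static void main(String[] args) {
        System.out.println(batches(List.of(1, 2, 3, 4, 5), 2));
        // [[1, 2], [3, 4], [5]]
    }
}
```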
Original file line number Diff line number Diff line change
Expand Up @@ -71,7 +71,7 @@ Java
With this type of send you can get responses from the server.
These responses can be @scaladoc[FcmSuccessResponse](akka.stream.alpakka.google.firebase.fcm.FcmSuccessResponse) or @scaladoc[FcmErrorResponse](akka.stream.alpakka.google.firebase.fcm.FcmErrorResponse).
You can choose what you want to do with this information, but keep in mind
-if you try to resend the failed messages you will need to use exponential backoff! (see [Akka docs `RestartFlow.onFailuresWithBackoff`](https://doc.akka.io/docs/akka/current/stream/operators/RestartFlow/onFailuresWithBackoff.html))
+if you try to resend the failed messages you will need to use exponential backoff! (see [Akka docs `RestartFlow.onFailuresWithBackoff`](https://doc.akka.io/libraries/akka-core/current/stream/operators/RestartFlow/onFailuresWithBackoff.html))

If you don't care if the notification was sent successfully, you may use `fireAndForget`.
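A sketch of the exponential-backoff schedule that `RestartFlow.onFailuresWithBackoff`-style operators follow: the minimum delay is doubled per restart and capped at a maximum. Real implementations also add random jitter, which is omitted here for determinism, and the parameter values are illustrative:

```java
import java.time.Duration;

public class BackoffDemo {
    // Delay before the given restart attempt: min * 2^restartCount,
    // capped at max. The shift count is clamped to avoid long overflow.
    static Duration backoff(Duration min, Duration max, int restartCount) {
        Duration d = min.multipliedBy(1L << Math.min(restartCount, 30));
        return d.compareTo(max) > 0 ? max : d;
    }

    public static void main(String[] args) {
        Duration min = Duration.ofSeconds(1), max = Duration.ofSeconds(30);
        for (int i = 0; i < 7; i++)
            System.out.println("restart " + i + " -> " + backoff(min, max, i));
        // delays grow 1s, 2s, 4s, 8s, 16s, then stay capped at 30s
    }
}
```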

2 changes: 1 addition & 1 deletion docs/src/main/paradox/index.md
@@ -1,6 +1,6 @@
# Alpakka Documentation

-The [Alpakka project](https://doc.akka.io/docs/alpakka/current/) is an initiative to implement stream-aware and reactive integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/docs/akka/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/docs/akka/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.
+The [Alpakka project](https://doc.akka.io/libraries/alpakka/current/) is an initiative to implement stream-aware and reactive integration pipelines for Java and Scala. It is built on top of [Akka Streams](https://doc.akka.io/libraries/akka-core/current/stream/index.html), and has been designed from the ground up to understand streaming natively and provide a DSL for reactive and stream-oriented programming, with built-in support for backpressure. Akka Streams is a [Reactive Streams](https://www.reactive-streams.org/) and JDK 9+ [java.util.concurrent.Flow](https://docs.oracle.com/javase/10/docs/api/java/util/concurrent/Flow.html)-compliant implementation and therefore [fully interoperable](https://doc.akka.io/libraries/akka-core/current/general/stream/stream-design.html#interoperation-with-other-reactive-streams-implementations) with other implementations.

@@toc { depth=2 }

2 changes: 1 addition & 1 deletion docs/src/main/paradox/kafka.md
@@ -4,4 +4,4 @@ The **Alpakka Kafka connector** (originally known as **Reactive Kafka** or even

## Separate repository

-Please read more about it in the [Alpakka Kafka documentation](https://doc.akka.io/docs/alpakka-kafka/current/).
+Please read more about it in the [Alpakka Kafka documentation](https://doc.akka.io/libraries/alpakka-kafka/current/).