
Streamline docs for new users #3803

Merged
merged 18 commits into from
Jul 17, 2024
Changes from 10 commits
Contributor

Same for this file

Member Author

Would like to leave it.

Contributor

I disagree about it as this somehow is our "official" blog and a community post is not official in any way. The cookbook was the right place for it, but it was never used… Maybe we can just keep it in gh discussions ("show and tell" category)?

Member

I think this needs further discussion, but for now it is good to go on my side.

I like the iterative approach :)

@@ -1,22 +1,23 @@
---
title: Continuous Deployment
title: "[Community] Continuous Deployment"
description: Deploy your artifacts to an app server
slug: continuous-deployment
authors:
- name: lonix1
url: https://github.com/lonix1
image_url: https://github.com/lonix1.png
hide_table_of_contents: false
tags: [community,cd,deployment]
---

<!--truncate-->

A typical CI pipeline contains steps such as: _clone_, _build_, _test_, _package_ and _push_. The final build product may be artifacts pushed to a git repository or a docker container pushed to a container registry.

When these should be deployed on an app server, the pipeline should include a _deploy_ step, which represents the "CD" in CI/CD - the automatic deployment of a pipeline's final product.

There are various ways to accomplish CD with Woodpecker, depending on your project's specific needs.

<!--truncate-->

## Invoking deploy script via SSH

The final step in your pipeline could SSH into the app server and run a deployment script.
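
For illustration, such a step could look roughly like the sketch below, using the community `appleboy/drone-ssh` plugin; the plugin name and its settings are assumptions for this example and not part of this post, so check the plugin's documentation before use.

```yaml
steps:
  - name: deploy
    # assumed community SSH plugin; verify the image name and settings in its docs
    image: appleboy/drone-ssh
    settings:
      host: app.example.com        # hypothetical app server
      username: deploy             # hypothetical deploy user
      key:
        from_secret: ssh_key       # private key stored as a Woodpecker secret
      script:
        - /opt/myapp/deploy.sh     # hypothetical deployment script on the server
    when:
      - event: push
        branch: main
```
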
13 changes: 0 additions & 13 deletions docs/cookbook/2023-12-23-hello-cookbook/index.md

This file was deleted.

89 changes: 0 additions & 89 deletions docs/docs/10-intro.md

This file was deleted.

26 changes: 26 additions & 0 deletions docs/docs/10-intro/index.md
@@ -0,0 +1,26 @@
# Welcome to Woodpecker

Woodpecker is a CI/CD tool. It is designed to be lightweight, simple to use and fast. Before we dive into the details, let's have a look at some of the basics.

## Have you ever heard of CI/CD or pipelines?

Don't worry if you haven't. We'll guide you through the basics. CI/CD stands for Continuous Integration and Continuous Deployment. It's basically like a conveyor belt that moves your code from development to production, doing all kinds of checks, tests and routines along the way. A typical pipeline might include the following steps (sketched below):

1. Running tests
2. Building your application
3. Deploying your application
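
As a very rough sketch, such a pipeline could be written in Woodpecker's YAML syntax like this (the image names and commands are placeholders, not part of this page):

```yaml
steps:
  - name: test
    image: golang:1.22      # placeholder image
    commands:
      - go test ./...
  - name: build
    image: golang:1.22
    commands:
      - go build -o app .
  - name: deploy
    image: alpine
    commands:
      - ./deploy.sh         # placeholder deployment script
```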

[Have a deeper look into the idea of CI/CD](https://www.redhat.com/en/topics/devops/what-is-ci-cd)

## Do you know containers?

If you are already using containers in your daily workflow, you'll for sure love Woodpecker. If not yet, you'll be amazed how easy it is to get started with [containers](https://opencontainers.org/).

## Already have access to a Woodpecker instance?

Then you might want to jump directly into it and [start creating your first pipelines](../20-usage/10-intro.md).

## Want to start from scratch and deploy your own Woodpecker instance?

Woodpecker is [pretty lightweight](../30-administration/00-getting-started.md#hardware-requirements) and will even run on your Raspberry Pi. You can follow the [deployment guide](../30-administration/00-getting-started.md) to set up your own Woodpecker instance.
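
For a first orientation, a deployment could look roughly like the docker-compose sketch below; the environment variables shown are assumptions based on the deployment guide (forge credentials are omitted), so check the guide for the full configuration:

```yaml
# docker-compose.yaml — minimal sketch, not a complete or production-ready setup
services:
  woodpecker-server:
    image: woodpeckerci/woodpecker-server:latest
    ports:
      - "8000:8000"
    environment:
      - WOODPECKER_HOST=https://ci.example.com # assumed public URL of your instance
      - WOODPECKER_AGENT_SECRET=change-me      # shared secret between server and agents
    volumes:
      - woodpecker-server-data:/var/lib/woodpecker

  woodpecker-agent:
    image: woodpeckerci/woodpecker-agent:latest
    environment:
      - WOODPECKER_SERVER=woodpecker-server:9000 # gRPC address of the server
      - WOODPECKER_AGENT_SECRET=change-me
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock # agent needs access to a container runtime

volumes:
  woodpecker-server-data:
```
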
133 changes: 84 additions & 49 deletions docs/docs/20-usage/10-intro.md
@@ -1,73 +1,108 @@
# Getting started
# Your first pipeline

## Repository Activation
Let's get started and create your first pipeline.

To activate your project navigate to your account settings. You will see a list of repositories which can be activated with a simple toggle. When you activate your repository, Woodpecker automatically adds webhooks to your forge (e.g. GitHub, Gitea, ...).
## 1. Repository Activation

Webhooks are used to trigger pipeline executions. When you push code to your repository, open a pull request, or create a tag, your forge will automatically send a webhook to Woodpecker which will in turn trigger the pipeline execution.
To activate your repository in Woodpecker, navigate to the repository list and click `New repository`. You will see a list of repositories from your forge (GitHub, GitLab, ...) which can be activated with a simple click.

![repository list](repo-list.png)
![new repository list](repo-new.png)

## Required Permissions
To enable a repository in Woodpecker you must have `Admin` rights on that repository, so that Woodpecker can add a so-called webhook (Woodpecker needs it to get notified about actions like pushes, pull requests and tags).

The user who enables a repo in Woodpecker must have `Admin` rights on that repo, so that Woodpecker can add the webhook.
## 2. Define your first workflow

:::note
Note that manually creating webhooks yourself is not possible.
This is because webhooks are signed using a per-repository secret key which is not exposed to end users.
:::
After enabling a repository Woodpecker will listen for changes in your repository. When a change is detected, Woodpecker will check for a pipeline configuration. So let's create a file at `.woodpecker/my-first-workflow.yaml` inside your repository:

## Configuration
```yaml title=".woodpecker/my-first-workflow.yaml"
when:
- event: push
branch: main

To configure your pipeline you must create a `.woodpecker.yaml` file in the root of your repository. The `.woodpecker.yaml` file is used to define your pipeline steps.

:::note
We support most of YAML 1.2, but preserve some behavior from 1.1 for backward compatibility.
Read more at: [https://github.com/go-yaml/yaml](https://github.com/go-yaml/yaml/tree/v3)
:::

Example pipeline configuration:

```yaml
steps:
- name: build
image: golang
image: debian
commands:
- go get
- go build
- go test

services:
- name: postgres
image: postgres:9.4.5
environment:
- POSTGRES_USER=myapp
- echo "This is the build step"
- echo "binary-data-123" > executable
- name: a-test-step
image: golang:1.16
commands:
- echo "Testing ..."
- ./executable
```

Example pipeline configuration with multiple, serial steps:
__So what did we do here?__

```yaml
steps:
- name: backend
image: golang
commands:
- go get
- go build
- go test
1. We defined your first workflow file `my-first-workflow.yaml`.
2. This workflow will be executed when a push event happens on the `main` branch,
because we added a filter using the `when` section:
```diff
+ when:
+ - event: push
+ branch: main

- name: frontend
image: node:6
commands:
- npm install
- npm test
...
```
1. We defined two steps: `build` and `a-test-step`

The steps are executed in the order they are defined, so `build` will be executed first and then `a-test-step`.

In the `build` step we use the `debian` image and build a "binary file" called `executable`.

In the `a-test-step` we use the `golang:1.16` image and run the `executable` file to test it.

You can use any image you have access to from registries like [Docker Hub](https://hub.docker.com/search?type=image):

```diff
steps:
- name: build
- image: debian
+ image: mycompany/image-with-awscli
commands:
- aws help
```

- name: notify
## 3. Push the file and trigger your first pipeline

If you push this file to your repository now, Woodpecker will already execute your first pipeline.
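
For example, from a local clone of your repository (assuming `main` is your default branch):

```bash
git add .woodpecker/my-first-workflow.yaml
git commit -m "Add my first Woodpecker workflow"
git push origin main
```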

You can check the pipeline execution in the Woodpecker UI by navigating to the `Pipelines` section of your repository.

![pipeline view](./pipeline.png)

As you probably noticed, there is another step called `clone` which is executed before your steps. This step clones your repository into a folder called `workspace` which is available throughout all steps.

This, for example, allows the first step to build your application from your source code, and since the second step receives the same workspace, it can use the previously built binary and test it.

## 4. Use a plugin for reusable tasks

Sometimes there are tasks you need to do in every project, for example deploying to Kubernetes or sending a Slack message. For such recurring tasks you can use one of the [official and community plugins](/plugins) or simply [create your own](./51-plugins/20-creating-plugins.md).

If you want to get a Slack notification after your pipeline has finished, you can add a Slack plugin to your pipeline:

```yaml
...

- name: notify me on Slack
image: plugins/slack
settings:
channel: developers
username: woodpecker
password:
from_secret: slack_token
when:
status: [ success, failure ] # This will execute the step on success and failure
```

## Execution
To configure a plugin you can use the `settings` section.

Sometimes you need to provide secrets to the plugin. You can do this by using the `from_secret` key. The secret must be defined in the Woodpecker UI. You can find more information about secrets [here](./40-secrets.md).

Similar to the `when` section at the top of the file which is for the complete workflow, you can use the `when` section for each step to define when a step should be executed.
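
For example, a sketch of a step that only runs for tag events (the step name, image and command are illustrative):

```yaml
steps:
  - name: release
    image: debian
    commands:
      - echo "runs only when a tag is pushed"
    when:
      - event: tag
```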

Learn more about [plugins](./51-plugins/51-overview.md).

To trigger your first pipeline execution you can push code to your repository, open a pull request, or push a tag. Any of these events triggers a webhook from your forge and execute your pipeline.
As you now have a basic understanding of how to create a pipeline, you can dive deeper into the [workflow syntax](./20-workflow-syntax.md) and [plugins](./51-plugins/51-overview.md).
15 changes: 8 additions & 7 deletions docs/docs/20-usage/15-terminology/index.md
@@ -1,12 +1,5 @@
# Terminology

## Woodpecker architecture

![Woodpecker architecture](architecture.svg)

## Pipeline, workflow & step

![Relation between pipelines, workflows and steps](pipeline-workflow-step.svg)

## Glossary

@@ -33,6 +26,14 @@
- **Status**: Status refers to the outcome of a step or [workflow][Workflow] after it has been executed, determined by the internal command exit code. At the end of a [workflow][Workflow], its status is sent to the [forge][Forge].
- **Service extension**: Some parts of Woodpecker internal services like secrets storage or config fetcher can be replaced through service extensions.

## Woodpecker architecture

![Woodpecker architecture](architecture.svg)

## Pipeline, workflow & step

![Relation between pipelines, workflows and steps](pipeline-workflow-step.svg)

## Pipeline events

- `push`: A push event is triggered when a commit is pushed to a branch.
5 changes: 5 additions & 0 deletions docs/docs/20-usage/20-workflow-syntax.md
@@ -6,6 +6,11 @@ The Workflow section defines a list of steps to build, test and deploy your code
An exception to this rule are steps with a [`status: [failure]`](#status) condition, which ensures that they are executed in the case of a failed run.
:::

:::note
We support most of YAML 1.2, but preserve some behavior from 1.1 for backward compatibility.
Read more at: [https://github.com/go-yaml/yaml](https://github.com/go-yaml/yaml/tree/v3)
:::

Example steps:

19 changes: 19 additions & 0 deletions docs/docs/20-usage/51-plugins/51-overview.md
@@ -4,6 +4,25 @@ Plugins are pipeline steps that perform pre-defined tasks and are configured as

They are automatically pulled from the default container registry the agents have configured.

```dockerfile title="Dockerfile"
FROM laszlocloud/kubectl
COPY deploy /usr/local/deploy
ENTRYPOINT ["/usr/local/deploy"]
```

```bash title="deploy"
kubectl apply -f $PLUGIN_TEMPLATE
```

```yaml title=".woodpecker.yaml"
steps:
- name: deploy-to-k8s
image: laszlocloud/my-k8s-plugin
settings:
template: config/k8s/service.yaml
```


Example pipeline using the Docker and Slack plugins:

Binary file added docs/docs/20-usage/pipeline.png
Binary file removed docs/docs/20-usage/repo-list.png
Binary file not shown.
Binary file added docs/docs/20-usage/repo-new.png