# Scripts to rule them all

This directory follows the Scripts to Rule Them All pattern:

### script/bootstrap

Installs/updates all dependencies necessary for the docs environment. Equivalent of npm install.


### script/server

Starts the local development server. Equivalent of npm start.

To keep things snappy, only English and Japanese are enabled by default. To run the server with all languages enabled, use script/server-all-languages.


### script/test

Runs tests. Equivalent of npm test.


## Additional scripts

Run this script during the Enterprise deprecation process to download static copies of all pages for the oldest supported Enterprise version. See the Enterprise deprecation issue template for instructions.


This script copies any English files that are missing from the translations directory into the translations directory. We only need to run this if problems occur with Crowdin's automatic sync.


This script checks which modules you have used in your code and then makes sure they are listed as dependencies in your package.json, or vice versa.

See https://github.com/dependency-check-team/dependency-check

The ignore array is for client-side or build-time stuff that doesn't get require()d in the normal way.
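As a sketch of why the ignore array matters, here is a hypothetical miniature of such a check, with made-up module names (this is not the real script's code):

```javascript
// Modules that are only used client-side or at build time never get
// require()d in the usual way, so without an ignore list the check
// would flag them as either missing or extraneous.
const declared = ['lodash', 'js-yaml'];       // from package.json "dependencies"
const used = ['lodash', 'js-yaml', 'jquery']; // found by scanning require() calls
const ignore = ['jquery'];                    // hypothetical: loaded via a <script> tag

// Modules used in code but neither declared nor ignored are missing deps.
const missing = used.filter((m) => !declared.includes(m) && !ignore.includes(m));

// Declared modules that are never used (and not ignored) are extraneous.
const extraneous = declared.filter((m) => !used.includes(m) && !ignore.includes(m));

console.log({ missing, extraneous }); // { missing: [], extraneous: [] }
```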


The script is run once per day via a scheduled GitHub Action to check all links in the site. It automatically opens an issue if it finds broken links. To exclude a URL from the link check, add it to lib/excluded-links.js.

For checking internal links, see script/check-internal-links.


This script is run automatically when you run the server locally. It checks whether Node.js is installed.


This script wraps tests/links-and-images.js and provides an option to output results to a file.

For more information, see tests/README.md#broken-link-test.


Run this script in your branch to check whether any images referenced in Enterprise content are missing from the expected S3 bucket. You will need to authenticate to S3 via awssume to use this script; see the one-time setup instructions.

This script turns a Google Sheets CSV spreadsheet into a YAML file.
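A minimal sketch of the general CSV-to-YAML transformation, not the actual script (which presumably handles quoting and real spreadsheet columns):

```javascript
// Turn simple CSV text (header row + data rows, no quoted commas)
// into a YAML list of mappings.
function csvToYaml(csv) {
  const [header, ...rows] = csv.trim().split('\n').map((line) => line.split(','));
  return rows
    .map((row) =>
      row.map((value, i) => `${i === 0 ? '- ' : '  '}${header[i]}: ${value}`).join('\n')
    )
    .join('\n');
}

const csv = 'name,id\nGitHub Actions,actions\nGitHub Packages,packages';
console.log(csvToYaml(csv));
// - name: GitHub Actions
//   id: actions
// - name: GitHub Packages
//   id: packages
```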


This script finds and lists all the Heroku staging apps and deletes any leftover apps whose pull requests have been closed.


This script parses options for script/check-external-links.


Pass this script any old dotcom path (e.g., articles/foo or foo.md) and it will output the new path in the content/github directory.
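A hedged sketch of the shape of that transform, assuming the output simply lands under content/github (the real script may also map articles into category subdirectories):

```javascript
// Map an old dotcom path like "articles/foo" or "foo.md" to its
// new home under the content/github directory.
function toNewPath(oldPath) {
  const name = oldPath.replace(/^articles\//, '').replace(/\.md$/, '');
  return `content/github/${name}`;
}

console.log(toNewPath('articles/foo')); // content/github/foo
console.log(toNewPath('foo.md'));       // content/github/foo
```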


Helper script that returns a "new" versioned path given an "old" versioned path.

Examples:

- Given: /github/getting-started-with-github/using-github
  Returns: /free-pro-team@latest/github/getting-started-with-github/using-github
- Given: /enterprise/admin/installation/upgrading-github-enterprise
  Returns: /enterprise-server@2.22/admin/installation/upgrading-github-enterprise
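The mapping in those examples can be sketched as follows; the hard-coded 2.22 is an assumption standing in for the oldest supported Enterprise release, and the real helper likely handles more version schemes:

```javascript
// Convert an "old" versioned path to a "new" versioned path:
// dotcom paths gain a free-pro-team@latest prefix, and enterprise
// paths become enterprise-server@<release> paths.
function toVersionedPath(oldPath) {
  if (oldPath.startsWith('/enterprise/')) {
    return oldPath.replace('/enterprise/', '/enterprise-server@2.22/');
  }
  return `/free-pro-team@latest${oldPath}`;
}

console.log(toVersionedPath('/github/getting-started-with-github/using-github'));
// /free-pro-team@latest/github/getting-started-with-github/using-github
console.log(toVersionedPath('/enterprise/admin/installation/upgrading-github-enterprise'));
// /enterprise-server@2.22/admin/installation/upgrading-github-enterprise
```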

This script lists all local image files, sorted by their dimensions.


Pass this script three arguments:

1. the current category path (e.g., github/automating-your-workflows-with-github-actions)
2. the new product ID (e.g., actions)
3. the new product name in quotes (e.g., "GitHub Actions")

It then does everything that needs to be done to make the category into a new product.


This script moves reusables out of YAML files into individual Markdown files.



All the new versioning!

Usage: script/new-versioning/main

This is a temporary script to visualize which pages have Liquid (including conditionals) in their title frontmatter.


This script finds all Heroku staging apps and pings them to make sure they're always "warmed" and responsive to requests.


This script is intended to be used as a git "prepush" hook. If the current branch is main, it will exit unsuccessfully and prevent the push.
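The core logic of such a hook can be sketched like this; a stand-in string replaces the live branch lookup (the real hook would read it via git rev-parse --abbrev-ref HEAD):

```javascript
// Decide whether a push from the given branch should be blocked.
function shouldBlockPush(branch) {
  return branch === 'main';
}

const currentBranch = 'update-readme'; // stand-in for the live git lookup
if (shouldBlockPush(currentBranch)) {
  console.error('Refusing to push to main; use a feature branch instead.');
  process.exitCode = 1; // a non-zero exit makes git abort the push
}
```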


This script is run as a git precommit hook (installed by husky after npm install). It detects changes to files in the translations folder and prevents the commit if any changes exist.


This script stitches and unstitches the github/github OpenAPI description via rest-api-operations to produce a local preview in docs-internal.

github, rest-api-operations, and docs-internal must share a parent directory locally.

You must bootstrap github for this script to work. To check if you need to bootstrap, check if the bin directory in github exists locally. If it does not exist, run ./script/bootstrap from the github directory.

To stitch the repos together and do an npm build, pass the stitch argument.

To unstitch the repos and revert them to their pre-stitched state, pass the unstitch argument.


Run this script to manually purge the Fastly cache. Note that this script requires a FASTLY_SERVICE_ID and FASTLY_TOKEN in your .env file.


Run this script to manually purge the Fastly cache for all language variants of a single URL or for a batch of URLs in a file. This script does not require authentication.


An automated test checks for discrepancies between category directory names and the IDs produced by slugifying category titles.

If the test fails, a human needs to run this script to update the directory names and add appropriate redirects.

This script is not currently supported on Windows.
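A simplified sketch of the comparison that test performs; this slugify is an approximation (lowercase, non-alphanumerics to hyphens) of whatever slugger the site actually uses:

```javascript
// Reduce a category title to a URL-friendly slug.
function slugify(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-+|-+$/g, '');
}

// The test passes when the directory name equals the slugified title.
function directoryMatchesTitle(dirName, title) {
  return dirName === slugify(title);
}

console.log(directoryMatchesTitle('managing-your-work', 'Managing your work')); // true
console.log(directoryMatchesTitle('managing-work', 'Managing your work'));      // false
```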


An automated test checks for discrepancies between filenames and autogenerated heading IDs. If the test fails, a human needs to run this script to update the filenames.

This script is not currently supported on Windows.


Run this script after an Enterprise deprecation to remove Liquid statements and frontmatter that contain the deprecated Enterprise version. See the Enterprise deprecation issue template for instructions.


An automated test checks for files in the translations/ directory that do not have an equivalent English file in the content/ directory, and fails if it finds extraneous files. When the test fails, a human needs to run this script to remove the files.


Run this script to remove reusables and image files that exist in the repo but are not used in content files. It also displays a list of unused variables. Pass the --dry-run flag to print results without deleting any files. For images you don't want to delete, add them to the ignoreList in lib/find-unused-assets.js.


This is a convenience script for replacing the contents of translated files with the English content from their corresponding source file.

It's intended to be a workaround to temporarily bypass Crowdin parser bugs while we wait for Crowdin to fix them.

Usage: script/reset-translated-file.js <file-path> [<language-code>]

script/reset-translated-file.js content/desktop/foo.md -> resets all translations of foo.md

script/reset-translated-file.js content/desktop/foo.md de -> resets the German translation of foo.md



Starts the local development server with all of the available languages enabled.


Run this script to standardize frontmatter fields in all content files, per the order decided in https://github.com/github/docs-internal/issues/9658#issuecomment-485536265.


This script is run automatically via GitHub Actions on every push to master to generate searchable data and upload it to our Algolia account. It can also be run manually. For more info, see contributing/search.md.


List all the TODOs in our JavaScript files and stylesheets.


Run this script during Enterprise releases and deprecations. It uses the GitHub API to get dates from enterprise-releases and updates lib/enterprise-dates.json. The help site uses this JSON to display dates at the top of some Enterprise versions.

This script requires that you have a GitHub personal access token in a .env file. If you don't have a token, you can create one in your GitHub developer settings. If you don't have a .env file in your docs checkout, run this command in Terminal:

cp .env.example .env

Open the .env file in a text editor, and find the GITHUB_TOKEN= placeholder. Add your token after the equals sign.

Do not commit the .env file; just leave it in your checkout.
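After editing, the line in .env should look like the following, where the bracketed value is a placeholder for your actual token:

```
GITHUB_TOKEN=<your-personal-access-token>
```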


This script crawls the script directory, reads the special comment marker in each script, and adds the comment to script/README.md.


This script is used by other scripts to update temporary AWS credentials and authenticate to S3. See docs at Setting up awssume and S3cmd.



Run this script either to upload individual files to S3, or to upload a batch of files to S3 for a new Enterprise release. Run upload-enterprise-images-to-s3.js --help for usage details.