style: prettier (#2624)

Gregor Martynus 2022-11-23 16:02:51 -08:00 committed by GitHub
parent 8a0d8be51f
commit d13ea9280e
54 changed files with 3129 additions and 15324 deletions


@@ -8,19 +8,19 @@ In the interest of fostering an open and welcoming environment, we as contributo

Examples of behavior that contributes to creating a positive environment include:

- Using welcoming and inclusive language
- Being respectful of differing viewpoints and experiences
- Gracefully accepting constructive criticism
- Focusing on what is best for the community
- Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

- The use of sexualized language or imagery and unwelcome sexual attention or advances
- Trolling, insulting/derogatory comments, and personal or political attacks
- Public or private harassment
- Publishing others' private information, such as a physical or electronic address, without explicit permission
- Other conduct which could reasonably be considered inappropriate in a professional setting

## Our Responsibilities


@@ -3,6 +3,7 @@

✨ Thanks for contributing to **semantic-release**! ✨

As a contributor, here are the guidelines we would like you to follow:

- [Code of conduct](#code-of-conduct)
- [How can I contribute?](#how-can-i-contribute)
- [Using the issue tracker](#using-the-issue-tracker)

@@ -74,24 +75,31 @@ Here is a summary of the steps to follow:

1. [Set up the workspace](#set-up-the-workspace)
2. If you cloned a while ago, get the latest changes from upstream and update dependencies:

   ```bash
   $ git checkout master
   $ git pull upstream master
   $ rm -rf node_modules
   $ npm install
   ```

3. Create a new topic branch (off the main project development branch) to contain your feature, change, or fix:

   ```bash
   $ git checkout -b <topic-branch-name>
   ```

4. Make your code changes, following the [Coding rules](#coding-rules)
5. Push your topic branch up to your fork:

   ```bash
   $ git push origin <topic-branch-name>
   ```

6. [Open a Pull Request](https://help.github.com/articles/creating-a-pull-request/#creating-the-pull-request) with a clear title and description.

**Tips**:

- For ambitious tasks, open a Pull Request as soon as possible with the `[WIP]` prefix in the title, in order to get feedback and help from the community.
- [Allow semantic-release maintainers to make changes to your Pull Request branch](https://help.github.com/articles/allowing-changes-to-a-pull-request-branch-created-from-a-fork).
  This way, we can rebase it and make some minor changes if necessary.
@@ -102,6 +110,7 @@ $ git push origin <topic-branch-name>

### Source code

To ensure consistency and quality throughout the source code, all code modifications must have:

- No [linting](#lint) errors
- A [test](#tests) for every possible case introduced by your code change
- **100%** test coverage

@@ -112,6 +121,7 @@ To ensure consistency and quality throughout the source code, all code modificat

### Documentation

To ensure consistency and quality, all documentation modifications must:

- Refer to brand in [bold](https://help.github.com/articles/basic-writing-and-formatting-syntax/#styling-text) with proper capitalization, i.e. **GitHub**, **semantic-release**, **npm**
- Prefer [tables](https://help.github.com/articles/organizing-information-with-tables) over [lists](https://help.github.com/articles/basic-writing-and-formatting-syntax/#lists) when listing key values, i.e. List of options with their description
- Use [links](https://help.github.com/articles/basic-writing-and-formatting-syntax/#links) when you are referring to:

@@ -133,6 +143,7 @@ To ensure consistency and quality, all documentation modifications must:

#### Atomic commits

If possible, make [atomic commits](https://en.wikipedia.org/wiki/Atomic_commit), which means:

- a commit should contain exactly one self-contained functional change
- a functional change should be contained in exactly one commit
- a commit should not create an inconsistent state (such as test errors, linting errors, partial fix, feature with documentation etc...)
@@ -166,7 +177,7 @@ In the body it should say: `This reverts commit <hash>.`, where the hash is the

The type must be one of the following:

| Type | Description |
| --- | --- |
| **build** | Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm) |
| **ci** | Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs) |
| **docs** | Documentation only changes |

@@ -186,10 +197,12 @@ The subject contains succinct description of the change:

- no dot (.) at the end

#### Body

Just as in the **subject**, use the imperative, present tense: "change" not "changed" nor "changes".
The body should include the motivation for the change and contrast this with previous behavior.

#### Footer

The footer should contain any information about **Breaking Changes** and is also the place to reference GitHub issues that this commit **Closes**.

**Breaking Changes** should start with the word `BREAKING CHANGE:` with a space or two newlines.
@@ -240,6 +253,7 @@ Prettier formatting will be automatically verified and fixed by XO.

Before pushing your code changes make sure there are no linting errors with `npm run lint`.

**Tips**:

- Most linting errors can be automatically fixed with `npm run lint -- --fix`.
- Install the [XO plugin](https://github.com/sindresorhus/xo#editor-plugins) for your editor to see linting errors directly in your editor and automatically fix them on save.

@@ -256,6 +270,7 @@ $ npm run test

```

**Tips:** During development you can:

- run only a subset of test files with `ava <glob>`, for example `ava test/mytestfile.test.js`
- run in watch mode with `ava -w` to automatically run a test file when you modify it
- run only the test you are working on by adding [`.only` to the test definition](https://github.com/avajs/ava#running-specific-tests)
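
The `.only` modifier mentioned above looks roughly like this in practice (a hypothetical test file and assertion, not taken from this repository's test suite):

```js
import test from "ava";

// While iterating on a single test, `.only` makes AVA skip every other test in the file.
// Remove `.only` before committing, otherwise the rest of the suite will not run.
test.only("parses the repository URL", (t) => {
  const url = "https://github.com/me/my-package.git";
  t.true(url.endsWith(".git"));
});
```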


@@ -57,7 +57,7 @@ Tools such as [commitizen](https://github.com/commitizen/cz-cli) or [commitlint]

The table below shows which commit message gets you which release type when `semantic-release` runs (using the default configuration):

| Commit message | Release type |
| --- | --- |
| `fix(pencil): stop graphite breaking when too much pressure applied` | ~~Patch~~ Fix Release |
| `feat(pencil): add 'graphiteWidth' option` | ~~Minor~~ Feature Release |
| `perf(pencil): remove graphiteWidth option`<br><br>`BREAKING CHANGE: The graphiteWidth option has been removed.`<br>`The default graphite width of 10mm is always used for performance reasons.` | ~~Major~~ Breaking Release <br /> (Note that the `BREAKING CHANGE: ` token must be in the footer of the commit) |

@@ -145,7 +145,6 @@ Let people know that your package is published using **semantic-release** and wh

```md
[![semantic-release: angular](https://img.shields.io/badge/semantic--release-angular-e10079?logo=semantic-release)](https://github.com/semantic-release/semantic-release)
```

## Team


@@ -1,6 +1,7 @@

# Summary

## Usage

- [Getting started](docs/usage/getting-started.md#getting-started)
- [Installation](docs/usage/installation.md#installation)
- [CI Configuration](docs/usage/ci-configuration.md#ci-configuration)

@@ -10,10 +11,12 @@

- [Shareable configurations](docs/usage/shareable-configurations.md)

## Extending

- [Plugins](docs/extending/plugins-list.md)
- [Shareable configuration](docs/extending/shareable-configurations-list.md)

## Recipes

- [CI configurations](docs/recipes/ci-configurations/README.md)
- [CircleCI 2.0](docs/recipes/ci-configurations/circleci-workflows.md)
- [Travis CI](docs/recipes/ci-configurations/travis.md)

@@ -28,11 +31,13 @@

- [Publishing pre-releases](docs/recipes/release-workflow/pre-releases.md)

## Developer guide

- [JavaScript API](docs/developer-guide/js-api.md)
- [Plugin development](docs/developer-guide/plugin.md)
- [Shareable configuration development](docs/developer-guide/shareable-configuration.md)

## Support

- [Resources](docs/support/resources.md)
- [Frequently Asked Questions](docs/support/FAQ.md)
- [Troubleshooting](docs/support/troubleshooting.md)


@@ -2,17 +2,17 @@

/* eslint-disable no-var */

import semver from "semver";
import { execa } from "execa";
import findVersions from "find-versions";
import cli from "../cli.js";
import { createRequire } from "node:module";

const require = createRequire(import.meta.url);
const { engines } = require("../package.json");
const { satisfies, lt } = semver;

const MIN_GIT_VERSION = "2.7.1";

if (!satisfies(process.version, engines.node)) {
  console.error(

@@ -23,8 +23,8 @@ See https://github.com/semantic-release/semantic-release/blob/master/docs/suppor

  process.exit(1);
}

execa("git", ["--version"])
  .then(({ stdout }) => {
    const gitVersion = findVersions(stdout)[0];
    if (lt(gitVersion, MIN_GIT_VERSION)) {
      console.error(`[semantic-release]: Git version ${MIN_GIT_VERSION} is required. Found ${gitVersion}.`);
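
The hunk above checks the local Git version by running `git --version` through `execa` and picking the number out of the output with `find-versions`. The same pattern works for any CLI prerequisite; here is a small standalone sketch (an illustration, not part of this commit):

```js
import { execa } from "execa";
import findVersions from "find-versions";
import semver from "semver";

// Run `<command> --version`, extract the first version-looking string from the output,
// and fail when it is older than the required minimum.
const assertMinVersion = async (command, minVersion) => {
  const { stdout } = await execa(command, ["--version"]);
  const [found] = findVersions(stdout);

  if (semver.lt(found, minVersion)) {
    throw new Error(`${command} version ${minVersion} is required. Found ${found}.`);
  }
};

await assertMinVersion("git", "2.7.1");
```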

cli.js

@@ -1,47 +1,47 @@

import util from "node:util";
import yargs from "yargs";
import { hideBin } from "yargs/helpers";
import hideSensitive from "./lib/hide-sensitive.js";

const stringList = {
  type: "string",
  array: true,
  coerce: (values) =>
    values.length === 1 && values[0].trim() === "false"
      ? []
      : values.reduce((values, value) => values.concat(value.split(",").map((value) => value.trim())), []),
};

export default async () => {
  const cli = yargs(hideBin(process.argv))
    .command("$0", "Run automated package publishing", (yargs) => {
      yargs.demandCommand(0, 0).usage(`Run automated package publishing

Usage:
  semantic-release [options] [plugins]`);
    })
    .option("b", { alias: "branches", describe: "Git branches to release from", ...stringList, group: "Options" })
    .option("r", { alias: "repository-url", describe: "Git repository URL", type: "string", group: "Options" })
    .option("t", { alias: "tag-format", describe: "Git tag format", type: "string", group: "Options" })
    .option("p", { alias: "plugins", describe: "Plugins", ...stringList, group: "Options" })
    .option("e", { alias: "extends", describe: "Shareable configurations", ...stringList, group: "Options" })
    .option("ci", { describe: "Toggle CI verifications", type: "boolean", group: "Options" })
    .option("verify-conditions", { ...stringList, group: "Plugins" })
    .option("analyze-commits", { type: "string", group: "Plugins" })
    .option("verify-release", { ...stringList, group: "Plugins" })
    .option("generate-notes", { ...stringList, group: "Plugins" })
    .option("prepare", { ...stringList, group: "Plugins" })
    .option("publish", { ...stringList, group: "Plugins" })
    .option("success", { ...stringList, group: "Plugins" })
    .option("fail", { ...stringList, group: "Plugins" })
    .option("debug", { describe: "Output debugging information", type: "boolean", group: "Options" })
    .option("d", { alias: "dry-run", describe: "Skip publishing", type: "boolean", group: "Options" })
    .option("h", { alias: "help", group: "Options" })
    .strict(false)
    .exitProcess(false);

  try {
    const { help, version, ...options } = cli.parse(process.argv.slice(2));

    if (Boolean(help) || Boolean(version)) {
      return 0;

@@ -49,16 +49,16 @@ Usage:

    if (options.debug) {
      // Debug must be enabled before other requires in order to work
      (await import("debug")).default.enable("semantic-release:*");
    }

    await (await import("./index.js")).default(options);
    return 0;
  } catch (error) {
    if (error.name !== "YError") {
      process.stderr.write(hideSensitive(process.env)(util.inspect(error, { colors: true })));
    }

    return 1;
  }
};
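
For readers unfamiliar with the `coerce` function in `stringList` above, this standalone sketch (not part of the commit) shows what the coercion does with typical CLI input:

```js
// Same logic as `stringList.coerce` above: a single "false" clears the list,
// otherwise comma-separated values are split and trimmed into a flat array.
const coerce = (values) =>
  values.length === 1 && values[0].trim() === "false"
    ? []
    : values.reduce((acc, value) => acc.concat(value.split(",").map((item) => item.trim())), []);

console.log(coerce(["master, next", "beta"])); // ["master", "next", "beta"]
console.log(coerce(["false"])); // []
```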


@@ -3,43 +3,48 @@

## Usage

```js
const semanticRelease = require("semantic-release");
const { WritableStreamBuffer } = require("stream-buffers");

const stdoutBuffer = WritableStreamBuffer();
const stderrBuffer = WritableStreamBuffer();

try {
  const result = await semanticRelease(
    {
      // Core options
      branches: [
        "+([0-9])?(.{+([0-9]),x}).x",
        "master",
        "next",
        "next-major",
        { name: "beta", prerelease: true },
        { name: "alpha", prerelease: true },
      ],
      repositoryUrl: "https://github.com/me/my-package.git",
      // Shareable config
      extends: "my-shareable-config",
      // Plugin options
      githubUrl: "https://my-ghe.com",
      githubApiPathPrefix: "/api-prefix",
    },
    {
      // Run semantic-release from `/path/to/git/repo/root` without having to change local process `cwd` with `process.chdir()`
      cwd: "/path/to/git/repo/root",
      // Pass the variable `MY_ENV_VAR` to semantic-release without having to modify the local `process.env`
      env: { ...process.env, MY_ENV_VAR: "MY_ENV_VAR_VALUE" },
      // Store stdout and stderr to use later instead of writing to `process.stdout` and `process.stderr`
      stdout: stdoutBuffer,
      stderr: stderrBuffer,
    }
  );

  if (result) {
    const { lastRelease, commits, nextRelease, releases } = result;

    console.log(
      `Published ${nextRelease.type} release version ${nextRelease.version} containing ${commits.length} commits.`
    );

    if (lastRelease.version) {
      console.log(`The last release was "${lastRelease.version}".`);

@@ -49,14 +54,14 @@ try {

      console.log(`The release was published with plugin "${release.pluginName}".`);
    }
  } else {
    console.log("No release published.");
  }

  // Get stdout and stderr content
  const logs = stdoutBuffer.getContentsAsString("utf8");
  const errors = stderrBuffer.getContentsAsString("utf8");
} catch (err) {
  console.error("The automated release failed with %O", err);
}
```
@@ -131,7 +136,7 @@ Type: `Object`

Information related to the last release found:

| Name | Type | Description |
| --- | --- | --- |
| version | `String` | The version of the last release. |
| gitHead | `String` | The sha of the last commit being part of the last release. |
| gitTag | `String` | The [Git tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging) associated with the last release. |

@@ -140,6 +145,7 @@ Information related to the last release found:

**Notes**: If no previous release is found, `lastRelease` will be an empty `Object`.

Example:

```js
{
  gitHead: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',

@@ -157,7 +163,7 @@ The list of commit included in the new release.<br>

Each commit object has the following properties:

| Name | Type | Description |
| --- | --- | --- |
| commit | `Object` | The commit abbreviated and full hash. |
| commit.long | `String` | The commit hash. |
| commit.short | `String` | The commit abbreviated hash. |

@@ -179,6 +185,7 @@ Each commit object has the following properties:

| committerDate | `String` | The committer date. |

Example:

```js
[
  {

@@ -216,7 +223,7 @@ Type: `Object`

Information related to the newly published release:

| Name | Type | Description |
| --- | --- | --- |
| type | `String` | The [semver](https://semver.org) type of the release (`patch`, `minor` or `major`). |
| version | `String` | The version of the new release. |
| gitHead | `String` | The sha of the last commit being part of the new release. |

@@ -225,6 +232,7 @@ Information related to the newly published release:

| channel | `String` | The distribution channel on which the next release will be made available (`undefined` for the default distribution channel). |

Example:

```js
{
  type: 'minor',

@@ -244,7 +252,7 @@ The list of releases published or made available to a distribution channel.<br>

Each release object has the following properties:

| Name | Type | Description |
| --- | --- | --- |
| name | `String` | **Optional.** The release name, only if set by the corresponding `publish` plugin. |
| url | `String` | **Optional.** The release URL, only if set by the corresponding `publish` plugin. |
| type | `String` | The [semver](https://semver.org) type of the release (`patch`, `minor` or `major`). |

@@ -256,6 +264,7 @@ Each release object has the following properties:

| channel | `String` | The distribution channel on which the release is available (`undefined` for the default distribution channel). |

Example:

```js
[
  {


@@ -34,7 +34,7 @@ We recommend you setup a linting system to ensure good javascript practices are

In your `index.js` file, you can start by writing the following code

```javascript
const verify = require("./src/verify");

let verified;

@@ -54,7 +54,7 @@ module.exports = { verifyConditions };

Then, in your `src` folder, create a file called `verify.js` and add the following

```javascript
const AggregateError = require("aggregate-error");

/**
 * A method to verify that the user has given us a slack webhook url to post to
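
The diff above only shows the first lines of `verify.js`. For orientation, a complete version of such a verification module might look roughly like the following sketch (the `slackWebhookUrl` option name and the error message are illustrative assumptions, not taken from this commit):

```javascript
const AggregateError = require("aggregate-error");

/**
 * A method to verify that the user has given us a slack webhook url to post to
 */
module.exports = async (pluginConfig, context) => {
  const errors = [];

  // `slackWebhookUrl` is a hypothetical plugin option used here for illustration.
  if (!pluginConfig.slackWebhookUrl) {
    errors.push(new Error("A Slack webhook URL must be configured."));
  }

  // Throwing an AggregateError fails the verifyConditions step and reports every problem at once.
  if (errors.length > 0) {
    throw new AggregateError(errors);
  }
};
```
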
@@ -81,8 +81,8 @@ Let's say we want to verify that an `option` is passed. An `option` is a configu

```js
{
  prepare: {
    path: "@semantic-release/my-special-plugin";
    message: "My cool release message";
  }
}
```
@@ -101,95 +101,96 @@ if (message.length) {

### Common context keys

- `stdout`
- `stderr`
- `logger`

### Context object keys by lifecycle

#### verifyConditions

Initially the context object contains the following keys (`verifyConditions` lifecycle):

- `cwd`
  - Current working directory
- `env`
  - Environment variables
- `envCi`
  - Information about CI environment
  - Contains (at least) the following keys:
    - `isCi`
      - Boolean, true if the environment is a CI environment
    - `commit`
      - Commit hash
    - `branch`
      - Current branch
- `options`
  - Options passed to `semantic-release` via CLI, configuration files etc.
- `branch`
  - Information on the current branch
  - Object keys:
    - `channel`
    - `tags`
    - `type`
    - `name`
    - `range`
    - `accept`
    - `main`
- `branches`
  - Information on branches
  - List of branch objects (see above)
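
A minimal sketch of a `verifyConditions` implementation that reads the context keys listed above (the `MY_TOKEN` variable and the error message are illustrative assumptions, not part of this commit):

```js
module.exports = async (pluginConfig, context) => {
  const { cwd, env, envCi, branch, logger } = context;

  logger.log("Running from %s on branch %s", cwd, branch.name);

  if (envCi.isCi) {
    logger.log("CI build for commit %s", envCi.commit);
  }

  // `MY_TOKEN` is a hypothetical environment variable used only for illustration.
  if (!env.MY_TOKEN) {
    throw new Error("MY_TOKEN must be set to run this plugin.");
  }
};
```
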
#### analyzeCommits

Compared to `verifyConditions`, the `analyzeCommits` lifecycle context additionally has the following keys:

- `commits` (List)
  - List of commits taken into account when determining the new version.
  - Keys:
    - `commit` (Object)
      - Keys:
        - `long` (String, Commit hash)
        - `short` (String, Commit hash)
    - `tree` (Object)
      - Keys:
        - `long` (String, Commit hash)
        - `short` (String, Commit hash)
    - `author` (Object)
      - Keys:
        - `name` (String)
        - `email` (String)
        - `date` (String, ISO 8601 timestamp)
    - `committer` (Object)
      - Keys:
        - `name` (String)
        - `email` (String)
        - `date` (String, ISO 8601 timestamp)
    - `subject` (String, Commit message subject)
    - `body` (String, Commit message body)
    - `hash` (String, Commit hash)
    - `committerDate` (String, ISO 8601 timestamp)
    - `message` (String)
    - `gitTags` (String, List of git tags)
- `releases` (List)
- `lastRelease` (Object)
  - Keys:
    - `version` (String)
    - `gitTag` (String)
    - `channels` (List)
    - `gitHead` (String, Commit hash)
    - `name` (String)

#### verifyRelease

Additional keys:

- `nextRelease` (Object)
  - `type` (String)
  - `channel` (String)
  - `gitHead` (String, Git hash)
  - `version` (String, version without `v`)
  - `gitTag` (String, version with `v`)
  - `name` (String)
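
As an illustration of how these keys might be consumed, here is a hedged sketch of a `verifyRelease` step (the maintenance-branch rule is a made-up policy, not something this commit defines):

```js
module.exports = async (pluginConfig, { nextRelease, branch, logger }) => {
  logger.log("Next release: %s (%s) on channel %s", nextRelease.version, nextRelease.type, nextRelease.channel);

  // Example policy for illustration only: refuse to cut a major release from a maintenance branch.
  if (branch.type === "maintenance" && nextRelease.type === "major") {
    throw new Error("Major releases are not allowed from maintenance branches.");
  }
};
```
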
#### generateNotes

@@ -197,7 +198,7 @@ No new content in the context.

#### addChannel

_This is run only if there are releases that have been merged from a higher branch but not added on the channel of the current branch._

Context content is similar to lifecycle `verifyRelease`.

@@ -215,8 +216,8 @@ Lifecycles `success` and `fail` are mutually exclusive, only one of them will be

Additional keys:

- `releases`
  - Populated by `publish` lifecycle

#### fail

@@ -224,7 +225,7 @@ Lifecycles `success` and `fail` are mutually exclusive, only one of them will be

Additional keys:

- `errors`

### Supporting Environment Variables

@@ -237,7 +238,9 @@ if (env.GITHUB_TOKEN) {

  //...
}
```
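
The truncated snippet above reads tokens straight from `context.env`. A small sketch of the usual pattern (the variable and option names here are illustrative, not from this commit):

```js
// Prefer an explicit plugin option, then a plugin-specific variable, then a generic one.
const resolveToken = ({ env }, pluginConfig) =>
  pluginConfig.token || env.MY_PLUGIN_TOKEN || env.GITHUB_TOKEN;
```
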
## Logger

Use `context.logger` to provide debug logging in the plugin.

```js

@@ -269,12 +272,13 @@ Knowledge that might be useful for plugin developers.

While it may be trivial that multiple analyzeCommits (or any lifecycle plugins) can be defined, it is not that self-evident that the plugins executed AFTER the first one (for example, the default one: `commit-analyzer`) can change the result. This way it is possible to create more advanced rules or situations, e.g. if none of the commits would result in new release, then a default can be defined.

The commit must be a known release type, for example the commit-analyzer has the following default types:

- major
- premajor
- minor
- preminor
- patch
- prepatch
- prerelease

If the analyzeCommits-lifecycle plugin does not return anything, then the earlier result is used, but if it returns a supported string value, then that overrides the previous result.
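
A hedged sketch of what such an `analyzeCommits` plugin could look like (the "hotfix" rule is purely illustrative):

```js
module.exports = async (pluginConfig, { commits, logger }) => {
  // Force at least a patch release when any commit subject mentions "hotfix".
  const hasHotfix = commits.some((commit) => commit.subject.includes("hotfix"));

  if (hasHotfix) {
    logger.log("Found a hotfix commit, forcing a patch release.");
    return "patch";
  }

  // Returning undefined keeps the release type determined by previously executed plugins.
  return undefined;
};
```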


@@ -1,6 +1,7 @@

# Plugins list

## Official plugins

- [@semantic-release/commit-analyzer](https://github.com/semantic-release/commit-analyzer)
  - **Note**: this is already part of semantic-release and does not have to be installed separately
  - `analyzeCommits`: Determine the type of release by analyzing commits with [conventional-changelog](https://github.com/conventional-changelog/conventional-changelog)

@@ -124,21 +125,21 @@

  - `verifyConditions`: Verify the presence of a license file
  - `prepare`: Update the license file based on its type
- [semantic-release-pypi](https://github.com/abichinger/semantic-release-pypi)
  - `verifyConditions`: Verify the environment variable `PYPI_TOKEN` and installation of build tools
  - `prepare`: Update the version in `setup.cfg` and create the distribution packages
  - `publish`: Publish the python package to a repository (default: pypi)
- [semantic-release-helm](https://github.com/m1pl/semantic-release-helm)
  - `verifyConditions`: Validate configuration and (if present) credentials
  - `prepare`: Update version and appVersion in `Chart.yaml`
  - `publish`: Publish the chart to a registry (if configured)
- [semantic-release-codeartifact](https://github.com/ryansonshine/semantic-release-codeartifact)
  - `verifyConditions`: Validate configuration, get AWS CodeArtifact authentication and repository, validate `publishConfig` or `.npmrc` (if they exist), then pass the configuration to the associated plugins.
- [semantic-release-telegram](https://github.com/pustovitDmytro/semantic-release-telegram)
  - `verifyConditions`: Validate configuration and verify `TELEGRAM_BOT_ID` and `TELEGRAM_BOT_TOKEN`
  - `success`: Publish a message about the successful release to a telegram chat
  - `fail`: publish a message about failure to a telegram chat
- [semantic-release-heroku](https://github.com/pustovitDmytro/semantic-release-heroku)
  - `verifyConditions`: Validate configuration and verify `HEROKU_API_KEY`
  - `prepare`: Update the package.json version and create release tarball
  - `publish`: Publish version to heroku
- [semantic-release-mattermost](https://github.com/ttrobisch/semantic-release-mattermost)


@@ -1,10 +1,12 @@

# Shareable configurations list

## Official configurations

- [@semantic-release/apm-config](https://github.com/semantic-release/apm-config) - semantic-release shareable configuration for releasing atom packages
- [@semantic-release/gitlab-config](https://github.com/semantic-release/gitlab-config) - semantic-release shareable configuration for GitLab

## Community configurations

- [@jedmao/semantic-release-npm-github-config](https://github.com/jedmao/semantic-release-npm-github-config)
  - Provides an informative [Git](https://github.com/semantic-release/git) commit message for the release commit that does not trigger continuous integration and conforms to the [conventional commits specification](https://www.conventionalcommits.org/) (e.g., `chore(release): 1.2.3 [skip ci]\n\nnotes`).
  - Creates a tarball that gets uploaded with each [GitHub release](https://github.com/semantic-release/github).


@@ -1,4 +1,5 @@

# CI configurations

- [CircleCI 2.0 workflows](circleci-workflows.md)
- [Travis CI](travis.md)
- [GitLab CI](gitlab-ci.md)


@@ -35,7 +35,7 @@ jobs:

      - name: Setup Node.js
        uses: actions/setup-node@v2
        with:
          node-version: "lts/*"
      - name: Install dependencies
        run: npm ci
      - name: Release

@@ -64,9 +64,11 @@ If the risk is acceptable, some extra configuration is needed. The [actions/chec

## Trigger semantic-release on demand

### Using GUI:

You can use [Manual Triggers](https://github.blog/changelog/2020-07-06-github-actions-manual-triggers-with-workflow_dispatch/) for GitHub Actions.

### Using HTTP:

Use the [`repository_dispatch`](https://docs.github.com/en/actions/reference/events-that-trigger-workflows#repository_dispatch) event to control when a release is generated by making an HTTP request, e.g.:

```yaml

@@ -85,7 +87,8 @@ $ curl -v -H "Accept: application/vnd.github.everest-preview+json" -H "Authoriza

```
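
The truncated `curl` command above posts a `repository_dispatch` event to the GitHub API. The same request can be sent from Node; a hedged sketch (the repository slug and event type are placeholders, and global `fetch` assumes Node 18+):

```js
const triggerRelease = async () => {
  const response = await fetch("https://api.github.com/repos/me/my-package/dispatches", {
    method: "POST",
    headers: {
      Accept: "application/vnd.github.everest-preview+json",
      Authorization: `token ${process.env.GITHUB_TOKEN}`,
      "Content-Type": "application/json",
    },
    // The event type must match the one your workflow listens for.
    body: JSON.stringify({ event_type: "semantic-release" }),
  });

  if (!response.ok) {
    throw new Error(`Triggering the release workflow failed with status ${response.status}`);
  }
};
```
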
### Using 3rd party apps:

If you'd like to use a GitHub app to manage this instead of creating a personal access token, you could consider using a project like:

- [Actions Panel](https://www.actionspanel.app/) - A declaratively configured way for triggering GitHub Actions
- [Action Button](https://github-action-button.web.app/#details) - A simple badge based mechanism for triggering GitHub Actions


@@ -52,7 +52,6 @@ This example is a minimal configuration for **semantic-release** with a build ru

**Note**: The `semantic-release` execution command varies depending on whether you are using a [local](../../usage/installation.md#local-installation) or [global](../../usage/installation.md#global-installation) **semantic-release** installation.

```yaml
# The release pipeline will run only on the master branch when a commit is triggered
stages:


@@ -1,2 +1,3 @@

# Git hosted services

- [Git authentication with SSH keys](git-auth-ssh-keys.md)


@@ -21,6 +21,7 @@ This will generate a public key in `git_deploy_key.pub` and a private key in `gi

## Adding the SSH public key to the Git hosted account

Step by step instructions are provided for the following Git hosted services:

- [GitHub](#adding-the-ssh-public-key-to-github)

### Adding the SSH public key to GitHub

@@ -44,6 +45,7 @@ See [Adding a new SSH key to your GitHub account](https://help.github.com/articl

In order to be available on the CI environment, the SSH private key must be encrypted, committed to the Git repository and decrypted by the CI service.

Step by step instructions are provided for the following environments:

- [Travis CI](#adding-the-ssh-private-key-to-travis-ci)
- [Circle CI](#adding-the-ssh-private-key-to-circle-ci)

@@ -109,7 +111,7 @@ $ git push

### Adding the SSH private key to Circle CI

First we encrypt the `git_deploy_key` (private key) using a symmetric encryption (AES-256). Run the following `openssl` command and _make sure to note the output which we'll need later_:

```bash
$ openssl aes-256-cbc -e -p -in git_deploy_key -out git_deploy_key.enc -K `openssl rand -hex 32` -iv `openssl rand -hex 16`

@@ -119,6 +121,7 @@ iv =VVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVV

```

Add the following [environment variables](https://circleci.com/docs/2.0/env-vars/#adding-environment-variables-in-the-app) to Circle CI:

- `SSL_PASSPHRASE` - the value set during the [SSH keys generation](#generating-the-ssh-keys) step.
- `REPO_ENC_KEY` - the `key` (KKK) value from the `openssl` step above.
- `REPO_ENC_IV` - the `iv` (VVV) value from the `openssl` step above.


@@ -1,4 +1,5 @@

# Release workflow

- [Publishing on distribution channels](distribution-channels.md)
- [Publishing maintenance releases](maintenance-releases.md)
- [Publishing pre-releases](pre-releases.md)


@@ -3,6 +3,7 @@

This recipe will walk you through a simple example that uses distribution channels to make releases available only to a subset of users, in order to collect feedback before distributing the release to all users.

This example uses the **semantic-release** default configuration:

- [branches](../../usage/configuration.md#branches): `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name: 'beta', prerelease: true}, {name: 'alpha', prerelease: true}]`
- [plugins](../../usage/configuration.md#plugins): `['@semantic-release/commit-analyzer', '@semantic-release/release-notes-generator', '@semantic-release/npm', '@semantic-release/github']`


@@ -3,6 +3,7 @@

This recipe will walk you through a simple example that uses Git branches and distribution channels to publish fixes and features for old versions of a package.

This example uses the **semantic-release** default configuration:

- [branches](../../usage/configuration.md#branches): `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name: 'beta', prerelease: true}, {name: 'alpha', prerelease: true}]`
- [plugins](../../usage/configuration.md#plugins): `['@semantic-release/commit-analyzer', '@semantic-release/release-notes-generator', '@semantic-release/npm', '@semantic-release/github']`


@@ -3,6 +3,7 @@

This recipe will walk you through a simple example that uses pre-releases to publish beta versions while working on a future major release and then make only one release on the default distribution.

This example uses the **semantic-release** default configuration:

- [branches](../../usage/configuration.md#branches): `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name: 'beta', prerelease: true}, {name: 'alpha', prerelease: true}]`
- [plugins](../../usage/configuration.md#plugins): `['@semantic-release/commit-analyzer', '@semantic-release/release-notes-generator', '@semantic-release/npm', '@semantic-release/github']`


@ -4,7 +4,7 @@
[`@semantic-release/npm`](https://github.com/semantic-release/npm) takes care of updating the `package.json`s version before publishing to [npm](https://www.npmjs.com). [`@semantic-release/npm`](https://github.com/semantic-release/npm) takes care of updating the `package.json`s version before publishing to [npm](https://www.npmjs.com).
By default, only the published package will contain the version, which is the only place where it is *really* required, but the updated `package.json` will not be pushed to the Git repository By default, only the published package will contain the version, which is the only place where it is _really_ required, but the updated `package.json` will not be pushed to the Git repository
However, the [`@semantic-release/git`](https://github.com/semantic-release/git) plugin can be used to push the updated `package.json` as well as other files to the Git repository. However, the [`@semantic-release/git`](https://github.com/semantic-release/git) plugin can be used to push the updated `package.json` as well as other files to the Git repository.
@ -17,19 +17,24 @@ The `package.json`s version will be updated by the `semantic-release` command
As the [`@semantic-release/npm`](https://github.com/semantic-release/npm) plugin uses the [npm CLI](https://docs.npmjs.com/cli/npm) to update the `package.json` version and publish the package, all [npm hook scripts](https://docs.npmjs.com/misc/scripts#description) will be executed. As the [`@semantic-release/npm`](https://github.com/semantic-release/npm) plugin uses the [npm CLI](https://docs.npmjs.com/cli/npm) to update the `package.json` version and publish the package, all [npm hook scripts](https://docs.npmjs.com/misc/scripts#description) will be executed.
You can run your build script in: You can run your build script in:
- the `prepublishOnly` or `prepack` hook so it will be executed during the `publish` step of `@semantic-release/npm` - the `prepublishOnly` or `prepack` hook so it will be executed during the `publish` step of `@semantic-release/npm`
- the `postversion` hook so it will be executed during the `prepare` step of `@semantic-release/npm`, which allows, for example, updating files before committing them with the [`@semantic-release/git`](https://github.com/semantic-release/git) plugin
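For example, the first option above could be wired up directly in `package.json`. This is a minimal sketch, assuming a hypothetical `build` script that produces the publishable files:

```json
{
  "scripts": {
    "build": "node ./scripts/build.js",
    "prepublishOnly": "npm run build"
  }
}
```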
If using npm hook scripts is not possible, an alternative solution is to use the [`@semantic-release/exec`](https://github.com/semantic-release/exec) plugin to run your script in the `prepare` step:
```json ```json
{ {
"plugins": [ "plugins": [
"@semantic-release/commit-analyzer", "@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator", "@semantic-release/release-notes-generator",
"@semantic-release/npm", "@semantic-release/npm",
["@semantic-release/exec", { [
"prepareCmd": "./my-build-script.sh ${nextRelease.version}", "@semantic-release/exec",
}], {
"prepareCmd": "./my-build-script.sh ${nextRelease.version}"
}
]
] ]
} }
``` ```
@ -43,6 +48,7 @@ Yes with the [dry-run options](../usage/configuration.md#dryrun) which prints to
Yes, **semantic-release** is a Node CLI application, but it can be used to publish any type of packages. Yes, **semantic-release** is a Node CLI application, but it can be used to publish any type of packages.
To publish a non-Node package (without a `package.json`) you would need to: To publish a non-Node package (without a `package.json`) you would need to:
- Use a [global](../usage/installation.md#global-installation) **semantic-release** installation - Use a [global](../usage/installation.md#global-installation) **semantic-release** installation
- Set **semantic-release** [options](../usage/configuration.md#options) via [CLI arguments or `.rc` file](../usage/configuration.md#configuration) - Set **semantic-release** [options](../usage/configuration.md#options) via [CLI arguments or `.rc` file](../usage/configuration.md#configuration)
- Make sure your CI job executing the `semantic-release` command has access to a version of Node that [meets our version requirement](./node-version.md) to execute the `semantic-release` command - Make sure your CI job executing the `semantic-release` command has access to a version of Node that [meets our version requirement](./node-version.md) to execute the `semantic-release` command
@ -61,10 +67,13 @@ Here is a basic example to create [GitHub releases](https://help.github.com/arti
"@semantic-release/commit-analyzer", "@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator", "@semantic-release/release-notes-generator",
"@semantic-release/github", "@semantic-release/github",
["@semantic-release/exec", { [
"prepareCmd" : "set-version ${nextRelease.version}", "@semantic-release/exec",
"publishCmd" : "publish-package" {
}] "prepareCmd": "set-version ${nextRelease.version}",
"publishCmd": "publish-package"
}
]
] ]
} }
``` ```
@ -76,6 +85,7 @@ See the [package managers and languages recipes](../recipes/release-workflow/REA
## Can I use semantic-release with any CI service? ## Can I use semantic-release with any CI service?
Yes, **semantic-release** can be used with any CI service, as long as it provides: Yes, **semantic-release** can be used with any CI service, as long as it provides:
- A way to set [authentication](../usage/ci-configuration.md#authentication) via environment variables - A way to set [authentication](../usage/ci-configuration.md#authentication) via environment variables
- A way to guarantee that the `semantic-release` command is [executed only after all the tests of all the jobs in the CI build pass](../usage/ci-configuration.md#run-semantic-release-only-after-all-tests-succeeded) - A way to guarantee that the `semantic-release` command is [executed only after all the tests of all the jobs in the CI build pass](../usage/ci-configuration.md#run-semantic-release-only-after-all-tests-succeeded)
@ -112,6 +122,7 @@ See the [`@semantic-release/npm`](https://github.com/semantic-release/npm#semant
## How can I revert a release? ## How can I revert a release?
If you have introduced a breaking bug in a release you have 2 options: If you have introduced a breaking bug in a release you have 2 options:
- If you have a fix immediately ready, commit and push it (or merge it via a pull request) to the release branch - If you have a fix immediately ready, commit and push it (or merge it via a pull request) to the release branch
- Otherwise, [revert the commit](https://git-scm.com/docs/git-revert) that introduced the bug and push the revert commit (or merge it via a pull request) to the release branch - Otherwise, [revert the commit](https://git-scm.com/docs/git-revert) that introduced the bug and push the revert commit (or merge it via a pull request) to the release branch
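A sketch of the second option, assuming `master` is the release branch and using a placeholder commit SHA (with the default rules, revert commits are typically released as a patch):

```bash
# Revert the commit that introduced the bug (the SHA is a placeholder)
$ git revert 1a2b3c4
# Push the revert commit; the next CI run lets semantic-release publish the fix
$ git push origin master
```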
@ -157,7 +168,7 @@ See [Artifactory - npm Registry](https://www.jfrog.com/confluence/display/RTF/Np
## Can I manually trigger the release of a specific version? ## Can I manually trigger the release of a specific version?
You can trigger a release by pushing to your Git repository. You deliberately cannot trigger a *specific* version release, because this is the whole point of semantic-release. You can trigger a release by pushing to your Git repository. You deliberately cannot trigger a _specific_ version release, because this is the whole point of semantic-release.
## Can I exclude commits from the analysis? ## Can I exclude commits from the analysis?
@ -168,7 +179,7 @@ Yes, every commits that contains `[skip release]` or `[release skip]` in their m
By default **semantic-release** uses the [Angular Commit Message Conventions](https://github.com/angular/angular.js/blob/master/DEVELOPERS.md#-git-commit-guidelines) and triggers releases based on the following rules: By default **semantic-release** uses the [Angular Commit Message Conventions](https://github.com/angular/angular.js/blob/master/DEVELOPERS.md#-git-commit-guidelines) and triggers releases based on the following rules:
| Commit | Release type | | Commit | Release type |
|-----------------------------|----------------------------| | --------------------------- | -------------------------- |
| Commit with breaking change | ~~Major~~ Breaking release | | Commit with breaking change | ~~Major~~ Breaking release |
| Commit with type `feat` | ~~Minor~~ Feature release | | Commit with type `feat` | ~~Minor~~ Feature release |
| Commit with type `fix` | Patch release | | Commit with type `fix` | Patch release |
@ -178,9 +189,9 @@ See the [`@semantic-release/npm`](https://github.com/semantic-release/npm#npm-co
This is fully customizable with the [`@semantic-release/commit-analyzer`](https://github.com/semantic-release/commit-analyzer) plugin's [`release-rules` option](https://github.com/semantic-release/commit-analyzer#release-rules). This is fully customizable with the [`@semantic-release/commit-analyzer`](https://github.com/semantic-release/commit-analyzer) plugin's [`release-rules` option](https://github.com/semantic-release/commit-analyzer#release-rules).
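For example, a `.releaserc` along these lines would add a custom rule on top of the defaults. This is a sketch only, and the `docs`/`README` rule is purely illustrative (see the plugin's documentation for the exact option semantics):

```json
{
  "plugins": [
    [
      "@semantic-release/commit-analyzer",
      {
        "preset": "angular",
        "releaseRules": [{ "type": "docs", "scope": "README", "release": "patch" }]
      }
    ],
    "@semantic-release/release-notes-generator",
    "@semantic-release/npm",
    "@semantic-release/github"
  ]
}
```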
## Is it *really* a good idea to release on every push? ## Is it _really_ a good idea to release on every push?
It is indeed a great idea because it *forces* you to follow best practices. If you don't feel comfortable releasing every feature or fix on your `master`, you might not treat your `master` branch as intended. It is indeed a great idea because it _forces_ you to follow best practices. If you don't feel comfortable releasing every feature or fix on your `master`, you might not treat your `master` branch as intended.
From [Understanding the GitHub Flow](https://guides.github.com/introduction/flow/index.html): From [Understanding the GitHub Flow](https://guides.github.com/introduction/flow/index.html):


@ -15,6 +15,7 @@ This is most likely related to a misconfiguration of the [npm registry authentic
It might also happen if the package name you are trying to publish already exists (in the case of npm, you may be trying to publish a new version of a package that is not yours, hence the permission error). It might also happen if the package name you are trying to publish already exists (in the case of npm, you may be trying to publish a new version of a package that is not yours, hence the permission error).
To verify if your package name is available you can use [npm-name-cli](https://github.com/sindresorhus/npm-name-cli): To verify if your package name is available you can use [npm-name-cli](https://github.com/sindresorhus/npm-name-cli):
```bash ```bash
$ npm install --global npm-name-cli $ npm install --global npm-name-cli
$ npm-name <package-name> $ npm-name <package-name>


@ -4,6 +4,7 @@
The `semantic-release` command must be executed only after all the tests in the CI build pass. If the build runs multiple jobs (for example to test on multiple Operating Systems or Node versions) the CI has to be configured to guarantee that the `semantic-release` command is executed only after all jobs are successful. The `semantic-release` command must be executed only after all the tests in the CI build pass. If the build runs multiple jobs (for example to test on multiple Operating Systems or Node versions) the CI has to be configured to guarantee that the `semantic-release` command is executed only after all jobs are successful.
Here are a few examples of the CI services that can be used to achieve this: Here are a few examples of the CI services that can be used to achieve this:
- [Travis Build Stages](https://docs.travis-ci.com/user/build-stages) - [Travis Build Stages](https://docs.travis-ci.com/user/build-stages)
- [CircleCI Workflows](https://circleci.com/docs/2.0/workflows) - [CircleCI Workflows](https://circleci.com/docs/2.0/workflows)
- [GitHub Actions](https://github.com/features/actions) - [GitHub Actions](https://github.com/features/actions)
@ -22,7 +23,7 @@ See [CI configuration recipes](../recipes/ci-configurations#ci-configurations) f
**semantic-release** requires push access to the project Git repository in order to create [Git tags](https://git-scm.com/book/en/v2/Git-Basics-Tagging). The Git authentication can be set with one of the following environment variables: **semantic-release** requires push access to the project Git repository in order to create [Git tags](https://git-scm.com/book/en/v2/Git-Basics-Tagging). The Git authentication can be set with one of the following environment variables:
| Variable | Description | | Variable | Description |
|-------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | ----------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `GH_TOKEN` or `GITHUB_TOKEN` | A GitHub [personal access token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line). | | `GH_TOKEN` or `GITHUB_TOKEN` | A GitHub [personal access token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line). |
| `GL_TOKEN` or `GITLAB_TOKEN` | A GitLab [personal access token](https://docs.gitlab.com/ce/user/profile/personal_access_tokens.html). | | `GL_TOKEN` or `GITLAB_TOKEN` | A GitLab [personal access token](https://docs.gitlab.com/ce/user/profile/personal_access_tokens.html). |
| `BB_TOKEN` or `BITBUCKET_TOKEN` | A Bitbucket [personal access token](https://confluence.atlassian.com/bitbucketserver/personal-access-tokens-939515499.html). | | `BB_TOKEN` or `BITBUCKET_TOKEN` | A Bitbucket [personal access token](https://confluence.atlassian.com/bitbucketserver/personal-access-tokens-939515499.html). |
@ -36,7 +37,7 @@ Alternatively the Git authentication can be set up via [SSH keys](../recipes/git
Most **semantic-release** [plugins](plugins.md) require setting up authentication in order to publish to a package manager registry. The default [@semantic-release/npm](https://github.com/semantic-release/npm#environment-variables) and [@semantic-release/github](https://github.com/semantic-release/github#environment-variables) plugins require the following environment variables: Most **semantic-release** [plugins](plugins.md) require setting up authentication in order to publish to a package manager registry. The default [@semantic-release/npm](https://github.com/semantic-release/npm#environment-variables) and [@semantic-release/github](https://github.com/semantic-release/github#environment-variables) plugins require the following environment variables:
| Variable | Description | | Variable | Description |
|-------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | ----------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `NPM_TOKEN` | npm token created via [npm token create](https://docs.npmjs.com/getting-started/working_with_tokens#how-to-create-new-tokens).<br/>**Note**: Only the `auth-only` [level of npm two-factor authentication](https://docs.npmjs.com/getting-started/using-two-factor-authentication#levels-of-authentication) is supported. | | `NPM_TOKEN` | npm token created via [npm token create](https://docs.npmjs.com/getting-started/working_with_tokens#how-to-create-new-tokens).<br/>**Note**: Only the `auth-only` [level of npm two-factor authentication](https://docs.npmjs.com/getting-started/using-two-factor-authentication#levels-of-authentication) is supported. |
| `GH_TOKEN` | GitHub authentication token.<br/>**Note**: Only the [personal token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line) authentication is supported. | | `GH_TOKEN` | GitHub authentication token.<br/>**Note**: Only the [personal token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line) authentication is supported. |
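As a sketch only (the token values are placeholders and would normally be stored in your CI provider's secret settings rather than exported by hand), the variables above can be provided to the `semantic-release` command like this:

```bash
$ export GH_TOKEN=<github-token>
$ export NPM_TOKEN=<npm-token>
$ npx semantic-release --dry-run
```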


@ -1,6 +1,7 @@
# Configuration # Configuration
**semantic-release** configuration consists of: **semantic-release** configuration consists of:
- Git repository ([URL](#repositoryurl) and options [release branches](#branches) and [tag format](#tagformat)) - Git repository ([URL](#repositoryurl) and options [release branches](#branches) and [tag format](#tagformat))
- Plugins [declaration](#plugins) and options - Plugins [declaration](#plugins) and options
- Run mode ([debug](#debug), [dry run](#dryrun) and [local (no CI)](#ci)) - Run mode ([debug](#debug), [dry run](#dryrun) and [local (no CI)](#ci))
@ -12,6 +13,7 @@ Additionally, metadata of Git tags generated by **semantic-release** can be cust
## Configuration file ## Configuration file
**semantic-release**'s [options](#options), mode and [plugins](plugins.md) can be set via either:
- A `.releaserc` file, written in YAML or JSON, with optional extensions: `.yaml`/`.yml`/`.json`/`.js`/`.cjs` - A `.releaserc` file, written in YAML or JSON, with optional extensions: `.yaml`/`.yml`/`.json`/`.js`/`.cjs`
- A `release.config.(js|cjs)` file that exports an object - A `release.config.(js|cjs)` file that exports an object
- A `release` key in the project's `package.json` file - A `release` key in the project's `package.json` file
@ -21,6 +23,7 @@ Alternatively, some options can be set via CLI arguments.
The following three examples are the same. The following three examples are the same.
- Via `release` key in the project's `package.json` file: - Via `release` key in the project's `package.json` file:
```json ```json
{ {
"release": { "release": {
@ -30,6 +33,7 @@ The following three examples are the same.
``` ```
- Via `.releaserc` file: - Via `.releaserc` file:
```json ```json
{ {
"branches": ["master", "next"] "branches": ["master", "next"]
@ -37,6 +41,7 @@ The following three examples are the same.
``` ```
- Via CLI argument: - Via CLI argument:
```bash ```bash
$ semantic-release --branches next $ semantic-release --branches next
``` ```
@ -65,6 +70,7 @@ Default: `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name:
CLI arguments: `--branches` CLI arguments: `--branches`
The branches on which releases should happen. By default **semantic-release** will release: The branches on which releases should happen. By default **semantic-release** will release:
- regular releases to the default distribution channel from the branch `master` - regular releases to the default distribution channel from the branch `master`
- regular releases to a distribution channel matching the branch name from any existing branch with a name matching a maintenance release range (`N.N.x` or `N.x.x` or `N.x` with `N` being a number) - regular releases to a distribution channel matching the branch name from any existing branch with a name matching a maintenance release range (`N.N.x` or `N.x.x` or `N.x` with `N` being a number)
- regular releases to the `next` distribution channel from the branch `next` if it exists - regular releases to the `next` distribution channel from the branch `next` if it exists
@ -143,7 +149,7 @@ Output debugging information. This can also be enabled by setting the `DEBUG` en
## Git environment variables ## Git environment variables
| Variable | Description | Default | | Variable | Description | Default |
|-----------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------| | --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------------------------ |
| `GIT_AUTHOR_NAME` | The author name associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot. | | `GIT_AUTHOR_NAME` | The author name associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot. |
| `GIT_AUTHOR_EMAIL` | The author email associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot email address. | | `GIT_AUTHOR_EMAIL` | The author email associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot email address. |
| `GIT_COMMITTER_NAME` | The committer name associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot. | | `GIT_COMMITTER_NAME` | The committer name associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot. |
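For example, a CI job could override the default `@semantic-release-bot` identity by exporting these variables before running **semantic-release** (a sketch; the name and email are placeholders):

```bash
$ export GIT_AUTHOR_NAME="My Release Bot"
$ export GIT_AUTHOR_EMAIL="release-bot@example.com"
$ export GIT_COMMITTER_NAME="My Release Bot"
$ export GIT_COMMITTER_EMAIL="release-bot@example.com"
$ npx semantic-release
```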


@ -5,7 +5,7 @@ Each [release step](../../README.md#release-steps) is implemented by configurabl
A plugin is an npm module that can implement one or more of the following steps:
| Step | Required | Description | | Step | Required | Description |
|--------------------|----------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | ------------------ | -------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `verifyConditions` | No | Responsible for verifying conditions necessary to proceed with the release: configuration is correct, authentication token are valid, etc... | | `verifyConditions` | No | Responsible for verifying conditions necessary to proceed with the release: configuration is correct, authentication token are valid, etc... |
| `analyzeCommits` | Yes | Responsible for determining the type of the next release (`major`, `minor` or `patch`). If multiple plugins with a `analyzeCommits` step are defined, the release type will be the highest one among plugins output. | | `analyzeCommits` | Yes | Responsible for determining the type of the next release (`major`, `minor` or `patch`). If multiple plugins with a `analyzeCommits` step are defined, the release type will be the highest one among plugins output. |
| `verifyRelease` | No | Responsible for verifying the parameters (version, type, dist-tag etc...) of the release that is about to be published. | | `verifyRelease` | No | Responsible for verifying the parameters (version, type, dist-tag etc...) of the release that is about to be published. |
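To illustrate the shape of a plugin, here is a minimal sketch (not an official template): a plugin module exports a function named after the step it implements, receiving the plugin configuration and the release context. The `MY_TOKEN` variable is a made-up example:

```js
// my-plugin.js: a minimal sketch of a plugin implementing the `verifyConditions` step
export async function verifyConditions(pluginConfig, context) {
  const { env, logger } = context;

  // Abort the release early if the (hypothetical) credential this plugin needs is missing
  if (!env.MY_TOKEN) {
    throw new Error("The MY_TOKEN environment variable is required.");
  }

  logger.log("my-plugin: conditions verified");
}
```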
@ -25,6 +25,7 @@ Release steps will run in that order. At each step, **semantic-release** will ru
### Default plugins ### Default plugins
These four plugins are already part of **semantic-release** and are listed in order of execution. They do not have to be installed separately: These four plugins are already part of **semantic-release** and are listed in order of execution. They do not have to be installed separately:
``` ```
"@semantic-release/commit-analyzer" "@semantic-release/commit-analyzer"
"@semantic-release/release-notes-generator" "@semantic-release/release-notes-generator"
@ -66,6 +67,7 @@ For each [release step](../../README.md#release-steps) the plugins that implemen
``` ```
With this configuration **semantic-release** will: With this configuration **semantic-release** will:
- execute the `verifyConditions` implementation of `@semantic-release/npm` then `@semantic-release/git` - execute the `verifyConditions` implementation of `@semantic-release/npm` then `@semantic-release/git`
- execute the `analyzeCommits` implementation of `@semantic-release/commit-analyzer` - execute the `analyzeCommits` implementation of `@semantic-release/commit-analyzer`
- execute the `generateNotes` implementation of `@semantic-release/release-notes-generator` - execute the `generateNotes` implementation of `@semantic-release/release-notes-generator`
@ -85,9 +87,12 @@ Global plugin configuration can be defined at the root of the **semantic-release
"plugins": [ "plugins": [
"@semantic-release/commit-analyzer", "@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator", "@semantic-release/release-notes-generator",
["@semantic-release/github", { [
"@semantic-release/github",
{
"assets": ["dist/**"] "assets": ["dist/**"]
}], }
],
"@semantic-release/git" "@semantic-release/git"
], ],
"preset": "angular" "preset": "angular"
@ -95,5 +100,6 @@ Global plugin configuration can be defined at the root of the **semantic-release
``` ```
With this configuration: With this configuration:
- All plugins will receive the `preset` option, which will be used by both `@semantic-release/commit-analyzer` and `@semantic-release/release-notes-generator` (and ignored by `@semantic-release/github` and `@semantic-release/git`) - All plugins will receive the `preset` option, which will be used by both `@semantic-release/commit-analyzer` and `@semantic-release/release-notes-generator` (and ignored by `@semantic-release/github` and `@semantic-release/git`)
- The `@semantic-release/github` plugin will receive the `assets` option (`@semantic-release/git` will not receive it and will therefore use its default value for that option)


@ -1,6 +1,7 @@
# Workflow configuration # Workflow configuration
**semantic-release** allows you to manage and automate complex release workflows based on multiple Git branches and distribution channels. This allows you to:
- Distribute certain releases to a particular group of users via distribution channels - Distribute certain releases to a particular group of users via distribution channels
- Manage the availability of releases on distribution channels via branches merge - Manage the availability of releases on distribution channels via branches merge
- Maintain multiple lines of releases in parallel - Maintain multiple lines of releases in parallel
@ -12,6 +13,7 @@ The release workflow is configured via the [branches option](./configuration.md#
Each branch can be defined either as a string, a [glob](https://github.com/micromatch/micromatch#matching-features) or an object. For string and glob definitions, each [property](#branches-properties) will be defaulted.
A branch can be defined as one of three types:
- [release](#release-branches): to make releases on top of the last version released - [release](#release-branches): to make releases on top of the last version released
- [maintenance](#maintenance-branches): to make releases on top of an old release - [maintenance](#maintenance-branches): to make releases on top of an old release
- [pre-release](#pre-release-branches): to make pre-releases - [pre-release](#pre-release-branches): to make pre-releases
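The three types can be combined in a single `branches` configuration. The following sketch (essentially a subset of the default configuration) declares maintenance, release and pre-release branches:

```json
{
  "branches": [
    "+([0-9])?(.{+([0-9]),x}).x",
    "master",
    "next",
    { "name": "beta", "prerelease": true }
  ]
}
```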
@ -21,7 +23,7 @@ The type of the branch is automatically determined based on naming convention an
## Branches properties ## Branches properties
| Property | Branch type | Description | Default | | Property | Branch type | Description | Default |
|--------------|-------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------| | ------------ | ----------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
| `name` | All | **Required.** The Git branch holding the commits to analyze and the code to release. See [name](#name). | - The value itself if defined as a `String` or the matching branches name if defined as a glob. | | `name` | All | **Required.** The Git branch holding the commits to analyze and the code to release. See [name](#name). | - The value itself if defined as a `String` or the matching branches name if defined as a glob. |
| `channel` | All | The distribution channel on which to publish releases from this branch. Set to `false` to force the default distribution channel instead of using the default. See [channel](#channel). | `undefined` for the first release branch, the value of `name` for subsequent ones. | | `channel` | All | The distribution channel on which to publish releases from this branch. Set to `false` to force the default distribution channel instead of using the default. See [channel](#channel). | `undefined` for the first release branch, the value of `name` for subsequent ones. |
| `range` | [maintenance](#maintenance-branches) only | **Required unless `name` is formatted like `N.N.x` or `N.x` (`N` is a number).** The range of [semantic versions](https://semver.org) to support on this branch. See [range](#range). | The value of `name`. | | `range` | [maintenance](#maintenance-branches) only | **Required unless `name` is formatted like `N.N.x` or `N.x` (`N` is a number).** The range of [semantic versions](https://semver.org) to support on this branch. See [range](#range). | The value of `name`. |
@ -35,14 +37,15 @@ It can be defined as a [glob](https://github.com/micromatch/micromatch#matching-
If `name` doesn't match any branch existing in the repository, the definition will be ignored. For example, the default configuration includes the definitions `next` and `next-major`, which become active only when the branches `next` and/or `next-major` are created in the repository. This allows you to define your workflow once, with all the potential branches you might use, and have the effective configuration evolve as you create new branches.
For example the configuration `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next']` will be expanded as: For example the configuration `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next']` will be expanded as:
```js ```js
{ {
branches: [ branches: [
{name: '1.x', range: '1.x', channel: '1.x'}, // Only after the `1.x` is created in the repo { name: "1.x", range: "1.x", channel: "1.x" }, // Only after the `1.x` is created in the repo
{name: '2.x', range: '2.x', channel: '2.x'}, // Only after the `2.x` is created in the repo { name: "2.x", range: "2.x", channel: "2.x" }, // Only after the `2.x` is created in the repo
{name: 'master'}, { name: "master" },
{name: 'next', channel: 'next'}, // Only after the `next` is created in the repo { name: "next", channel: "next" }, // Only after the `next` is created in the repo
] ];
} }
``` ```
@ -54,12 +57,13 @@ If the `channel` property is set to `false` the default channel will be used.
The value of `channel`, if defined as a string, is generated with [Lodash template](https://lodash.com/docs#template) with the variable `name` available. The value of `channel`, if defined as a string, is generated with [Lodash template](https://lodash.com/docs#template) with the variable `name` available.
For example the configuration `['master', {name: 'next', channel: 'channel-${name}'}]` will be expanded as: For example the configuration `['master', {name: 'next', channel: 'channel-${name}'}]` will be expanded as:
```js ```js
{ {
branches: [ branches: [
{name: 'master'}, // `channel` is undefined so the default distribution channel will be used { name: "master" }, // `channel` is undefined so the default distribution channel will be used
{name: 'next', channel: 'channel-next'}, // `channel` is built with the template `channel-${name}` { name: "next", channel: "channel-next" }, // `channel` is built with the template `channel-${name}`
] ];
} }
``` ```
@ -68,13 +72,14 @@ For example the configuration `['master', {name: 'next', channel: 'channel-${nam
A `range` only applies to maintenance branches; it is required and must be formatted like `N.N.x` or `N.x` (`N` is a number). In case the `name` is formatted as a range (for example `1.x` or `1.5.x`), the branch will be considered a maintenance branch and the `name` value will be used for the `range`.
For example the configuration `['1.1.x', '1.2.x', 'master']` will be expanded as: For example the configuration `['1.1.x', '1.2.x', 'master']` will be expanded as:
```js ```js
{ {
branches: [ branches: [
{name: '1.1.x', range: '1.1.x', channel: '1.1.x'}, { name: "1.1.x", range: "1.1.x", channel: "1.1.x" },
{name: '1.2.x', range: '1.2.x', channel: '1.2.x'}, { name: "1.2.x", range: "1.2.x", channel: "1.2.x" },
{name: 'master'}, { name: "master" },
] ];
} }
``` ```
@ -86,13 +91,14 @@ If the `prerelease` property is set to `true` the `name` value will be used.
The value of `prerelease`, if defined as a string, is generated with [Lodash template](https://lodash.com/docs#template) with the variable `name` available. The value of `prerelease`, if defined as a string, is generated with [Lodash template](https://lodash.com/docs#template) with the variable `name` available.
For example the configuration `['master', {name: 'pre/rc', prerelease: '${name.replace(/^pre\\//g, "")}'}, {name: 'beta', prerelease: true}]` will be expanded as: For example the configuration `['master', {name: 'pre/rc', prerelease: '${name.replace(/^pre\\//g, "")}'}, {name: 'beta', prerelease: true}]` will be expanded as:
```js ```js
{ {
branches: [ branches: [
{name: 'master'}, { name: "master" },
{name: 'pre/rc', channel: 'pre/rc', prerelease: 'rc'}, // `prerelease` is built with the template `${name.replace(/^pre\\//g, "")}` { name: "pre/rc", channel: "pre/rc", prerelease: "rc" }, // `prerelease` is built with the template `${name.replace(/^pre\\//g, "")}`
{name: 'beta', channel: 'beta', prerelease: true}, // `prerelease` is set to `beta` as it is the value of `name` { name: "beta", channel: "beta", prerelease: true }, // `prerelease` is set to `beta` as it is the value of `name`
] ];
} }
``` ```
@ -113,10 +119,12 @@ See [publishing on distribution channels recipe](../recipes/release-workflow/dis
#### Pushing to a release branch #### Pushing to a release branch
With the configuration `"branches": ["master", "next"]`, if the last release published from `master` is `1.0.0` and the last one from `next` is `2.0.0` then: With the configuration `"branches": ["master", "next"]`, if the last release published from `master` is `1.0.0` and the last one from `next` is `2.0.0` then:
- Only versions in range `1.x.x` can be published from `master`, so only `fix` and `feat` commits can be pushed to `master` - Only versions in range `1.x.x` can be published from `master`, so only `fix` and `feat` commits can be pushed to `master`
- Once `next` gets merged into `master`, the release `2.0.0` will be made available on the channel associated with `master`, and both `master` and `next` will accept any commit type
This verification prevents scenarios such as:
1. Create a `feat` commit on `next` which triggers the release of version `1.0.0` on the `next` channel 1. Create a `feat` commit on `next` which triggers the release of version `1.0.0` on the `next` channel
2. Merge `next` into `master` which adds `1.0.0` on the default channel 2. Merge `next` into `master` which adds `1.0.0` on the default channel
3. Create a `feat` commit on `next` which triggers the release of version `1.1.0` on the `next` channel 3. Create a `feat` commit on `next` which triggers the release of version `1.1.0` on the `next` channel
@ -147,6 +155,7 @@ See [publishing maintenance releases recipe](../recipes/release-workflow/mainten
#### Pushing to a maintenance branch #### Pushing to a maintenance branch
With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last release published from `master` is `1.5.0` then: With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last release published from `master` is `1.5.0` then:
- Only versions in range `>=1.0.0 <1.1.0` can be published from `1.0.x`, so only `fix` commits can be pushed to `1.0.x` - Only versions in range `>=1.0.0 <1.1.0` can be published from `1.0.x`, so only `fix` commits can be pushed to `1.0.x`
- Only versions in range `>=1.1.0 <1.5.0` can be published from `1.x`, so only `fix` and `feat` commits can be pushed to `1.x` as long as the resulting release is lower than `1.5.0`
- Once `2.0.0` is released from `master`, versions in range `>=1.1.0 <2.0.0` can be published from `1.x`, so any number of `fix` and `feat` commits can be pushed to `1.x` - Once `2.0.0` is released from `master`, versions in range `>=1.1.0 <2.0.0` can be published from `1.x`, so any number of `fix` and `feat` commits can be pushed to `1.x`
@ -154,6 +163,7 @@ With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last rel
#### Merging into a maintenance branch #### Merging into a maintenance branch
With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last release published from `master` is `1.0.0` then: With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last release published from `master` is `1.0.0` then:
- Creating the branch `1.0.x` from `master` will make the `1.0.0` release available on the `1.0.x` distribution channel - Creating the branch `1.0.x` from `master` will make the `1.0.0` release available on the `1.0.x` distribution channel
- Pushing a `fix` commit on the `1.0.x` branch will release the version `1.0.1` on the `1.0.x` distribution channel - Pushing a `fix` commit on the `1.0.x` branch will release the version `1.0.1` on the `1.0.x` distribution channel
- Creating the branch `1.x` from `master` will make the `1.0.0` release available on the `1.x` distribution channel - Creating the branch `1.x` from `master` will make the `1.0.0` release available on the `1.x` distribution channel
@ -176,11 +186,13 @@ See [publishing pre-releases recipe](../recipes/release-workflow/pre-releases.md
#### Pushing to a pre-release branch #### Pushing to a pre-release branch
With the configuration `"branches": ["master", {"name": "beta", "prerelease": true}]`, if the last release published from `master` is `1.0.0` then: With the configuration `"branches": ["master", {"name": "beta", "prerelease": true}]`, if the last release published from `master` is `1.0.0` then:
- Pushing a `BREAKING CHANGE` commit on the `beta` branch will release the version `2.0.0-beta.1` on the `beta` distribution channel - Pushing a `BREAKING CHANGE` commit on the `beta` branch will release the version `2.0.0-beta.1` on the `beta` distribution channel
- Pushing either a `fix`, `feat` or a `BREAKING CHANGE` commit on the `beta` branch will release the version `2.0.0-beta.2` (then `2.0.0-beta.3`, `2.0.0-beta.4`, etc...) on the `beta` distribution channel - Pushing either a `fix`, `feat` or a `BREAKING CHANGE` commit on the `beta` branch will release the version `2.0.0-beta.2` (then `2.0.0-beta.3`, `2.0.0-beta.4`, etc...) on the `beta` distribution channel
#### Merging into a pre-release branch #### Merging into a pre-release branch
With the configuration `"branches": ["master", {"name": "beta", "prerelease": true}]`, if the last release published from `master` is `1.0.0` and the last one published from `beta` is `2.0.0-beta.1` then: With the configuration `"branches": ["master", {"name": "beta", "prerelease": true}]`, if the last release published from `master` is `1.0.0` and the last one published from `beta` is `2.0.0-beta.1` then:
- Pushing a `fix` commit on the `master` branch will release the version `1.0.1` on the default distribution channel - Pushing a `fix` commit on the `master` branch will release the version `1.0.1` on the default distribution channel
- Merging the branch `master` into `beta` will release the version `2.0.0-beta.2` on the `beta` distribution channel - Merging the branch `master` into `beta` will release the version `2.0.0-beta.2` on the `beta` distribution channel
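For reference, the scenarios above assume a configuration along these lines (a sketch using the `branches` value quoted in the examples):

```json
{
  "branches": ["master", { "name": "beta", "prerelease": true }]
}
```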

index.js

@ -1,33 +1,33 @@
import {createRequire} from 'node:module'; import { createRequire } from "node:module";
import {pick} from 'lodash-es'; import { pick } from "lodash-es";
import * as marked from 'marked'; import * as marked from "marked";
import envCi from 'env-ci'; import envCi from "env-ci";
import {hookStd} from 'hook-std'; import { hookStd } from "hook-std";
import semver from 'semver'; import semver from "semver";
import AggregateError from 'aggregate-error'; import AggregateError from "aggregate-error";
import hideSensitive from './lib/hide-sensitive.js'; import hideSensitive from "./lib/hide-sensitive.js";
import getConfig from './lib/get-config.js'; import getConfig from "./lib/get-config.js";
import verify from './lib/verify.js'; import verify from "./lib/verify.js";
import getNextVersion from './lib/get-next-version.js'; import getNextVersion from "./lib/get-next-version.js";
import getCommits from './lib/get-commits.js'; import getCommits from "./lib/get-commits.js";
import getLastRelease from './lib/get-last-release.js'; import getLastRelease from "./lib/get-last-release.js";
import getReleaseToAdd from './lib/get-release-to-add.js'; import getReleaseToAdd from "./lib/get-release-to-add.js";
import {extractErrors, makeTag} from './lib/utils.js'; import { extractErrors, makeTag } from "./lib/utils.js";
import getGitAuthUrl from './lib/get-git-auth-url.js'; import getGitAuthUrl from "./lib/get-git-auth-url.js";
import getBranches from './lib/branches/index.js'; import getBranches from "./lib/branches/index.js";
import getLogger from './lib/get-logger.js'; import getLogger from "./lib/get-logger.js";
import {addNote, getGitHead, getTagHead, isBranchUpToDate, push, pushNotes, tag, verifyAuth} from './lib/git.js'; import { addNote, getGitHead, getTagHead, isBranchUpToDate, push, pushNotes, tag, verifyAuth } from "./lib/git.js";
import getError from './lib/get-error.js'; import getError from "./lib/get-error.js";
import {COMMIT_EMAIL, COMMIT_NAME} from './lib/definitions/constants.js'; import { COMMIT_EMAIL, COMMIT_NAME } from "./lib/definitions/constants.js";
const require = createRequire(import.meta.url); const require = createRequire(import.meta.url);
const pkg = require('./package.json'); const pkg = require("./package.json");
let markedOptionsSet = false; let markedOptionsSet = false;
async function terminalOutput(text) { async function terminalOutput(text) {
if (!markedOptionsSet) { if (!markedOptionsSet) {
const {default: TerminalRenderer} = await import('marked-terminal'); // eslint-disable-line node/no-unsupported-features/es-syntax const { default: TerminalRenderer } = await import("marked-terminal"); // eslint-disable-line node/no-unsupported-features/es-syntax
marked.setOptions({renderer: new TerminalRenderer()}); marked.setOptions({ renderer: new TerminalRenderer() });
markedOptionsSet = true; markedOptionsSet = true;
} }
@ -36,12 +36,12 @@ async function terminalOutput(text) {
/* eslint complexity: off */ /* eslint complexity: off */
async function run(context, plugins) { async function run(context, plugins) {
const {cwd, env, options, logger, envCi} = context; const { cwd, env, options, logger, envCi } = context;
const {isCi, branch, prBranch, isPr} = envCi; const { isCi, branch, prBranch, isPr } = envCi;
const ciBranch = isPr ? prBranch : branch; const ciBranch = isPr ? prBranch : branch;
if (!isCi && !options.dryRun && !options.noCi) { if (!isCi && !options.dryRun && !options.noCi) {
logger.warn('This run was not triggered in a known CI environment, running in dry-run mode.'); logger.warn("This run was not triggered in a known CI environment, running in dry-run mode.");
options.dryRun = true; options.dryRun = true;
} else { } else {
// When running on CI, set the commits author and committer info and prevent the `git` CLI to prompt for username/password. See #703. // When running on CI, set the commits author and committer info and prevent the `git` CLI to prompt for username/password. See #703.
@ -51,7 +51,7 @@ async function run(context, plugins) {
GIT_COMMITTER_NAME: COMMIT_NAME, GIT_COMMITTER_NAME: COMMIT_NAME,
GIT_COMMITTER_EMAIL: COMMIT_EMAIL, GIT_COMMITTER_EMAIL: COMMIT_EMAIL,
...env, ...env,
GIT_ASKPASS: 'echo', GIT_ASKPASS: "echo",
GIT_TERMINAL_PROMPT: 0, GIT_TERMINAL_PROMPT: 0,
}); });
} }
@ -64,30 +64,30 @@ async function run(context, plugins) {
// Verify config // Verify config
await verify(context); await verify(context);
options.repositoryUrl = await getGitAuthUrl({...context, branch: {name: ciBranch}}); options.repositoryUrl = await getGitAuthUrl({ ...context, branch: { name: ciBranch } });
context.branches = await getBranches(options.repositoryUrl, ciBranch, context); context.branches = await getBranches(options.repositoryUrl, ciBranch, context);
context.branch = context.branches.find(({name}) => name === ciBranch); context.branch = context.branches.find(({ name }) => name === ciBranch);
if (!context.branch) { if (!context.branch) {
logger.log( logger.log(
`This test run was triggered on the branch ${ciBranch}, while semantic-release is configured to only publish from ${context.branches `This test run was triggered on the branch ${ciBranch}, while semantic-release is configured to only publish from ${context.branches
.map(({name}) => name) .map(({ name }) => name)
.join(', ')}, therefore a new version wont be published.` .join(", ")}, therefore a new version wont be published.`
); );
return false; return false;
} }
logger[options.dryRun ? 'warn' : 'success']( logger[options.dryRun ? "warn" : "success"](
`Run automated release from branch ${ciBranch} on repository ${options.originalRepositoryURL}${ `Run automated release from branch ${ciBranch} on repository ${options.originalRepositoryURL}${
options.dryRun ? ' in dry-run mode' : '' options.dryRun ? " in dry-run mode" : ""
}` }`
); );
try { try {
try { try {
await verifyAuth(options.repositoryUrl, context.branch.name, {cwd, env}); await verifyAuth(options.repositoryUrl, context.branch.name, { cwd, env });
} catch (error) { } catch (error) {
if (!(await isBranchUpToDate(options.repositoryUrl, context.branch.name, {cwd, env}))) { if (!(await isBranchUpToDate(options.repositoryUrl, context.branch.name, { cwd, env }))) {
logger.log( logger.log(
`The local branch ${context.branch.name} is behind the remote one, therefore a new version won't be published.` `The local branch ${context.branch.name} is behind the remote one, therefore a new version won't be published.`
); );
@ -98,7 +98,7 @@ async function run(context, plugins) {
} }
} catch (error) { } catch (error) {
logger.error(`The command "${error.command}" failed with the error message ${error.stderr}.`); logger.error(`The command "${error.command}" failed with the error message ${error.stderr}.`);
throw getError('EGITNOPERMISSION', context); throw getError("EGITNOPERMISSION", context);
} }
logger.success(`Allowed to push to the Git repository`); logger.success(`Allowed to push to the Git repository`);
@ -110,24 +110,27 @@ async function run(context, plugins) {
const releaseToAdd = getReleaseToAdd(context); const releaseToAdd = getReleaseToAdd(context);
if (releaseToAdd) { if (releaseToAdd) {
const {lastRelease, currentRelease, nextRelease} = releaseToAdd; const { lastRelease, currentRelease, nextRelease } = releaseToAdd;
nextRelease.gitHead = await getTagHead(nextRelease.gitHead, {cwd, env}); nextRelease.gitHead = await getTagHead(nextRelease.gitHead, { cwd, env });
currentRelease.gitHead = await getTagHead(currentRelease.gitHead, {cwd, env}); currentRelease.gitHead = await getTagHead(currentRelease.gitHead, { cwd, env });
if (context.branch.mergeRange && !semver.satisfies(nextRelease.version, context.branch.mergeRange)) { if (context.branch.mergeRange && !semver.satisfies(nextRelease.version, context.branch.mergeRange)) {
errors.push(getError('EINVALIDMAINTENANCEMERGE', {...context, nextRelease})); errors.push(getError("EINVALIDMAINTENANCEMERGE", { ...context, nextRelease }));
} else { } else {
const commits = await getCommits({...context, lastRelease, nextRelease}); const commits = await getCommits({ ...context, lastRelease, nextRelease });
nextRelease.notes = await plugins.generateNotes({...context, commits, lastRelease, nextRelease}); nextRelease.notes = await plugins.generateNotes({ ...context, commits, lastRelease, nextRelease });
if (options.dryRun) { if (options.dryRun) {
logger.warn(`Skip ${nextRelease.gitTag} tag creation in dry-run mode`); logger.warn(`Skip ${nextRelease.gitTag} tag creation in dry-run mode`);
} else { } else {
await addNote({channels: [...currentRelease.channels, nextRelease.channel]}, nextRelease.gitHead, {cwd, env}); await addNote({ channels: [...currentRelease.channels, nextRelease.channel] }, nextRelease.gitHead, {
await push(options.repositoryUrl, {cwd, env}); cwd,
await pushNotes(options.repositoryUrl, {cwd, env}); env,
});
await push(options.repositoryUrl, { cwd, env });
await pushNotes(options.repositoryUrl, { cwd, env });
logger.success( logger.success(
`Add ${nextRelease.channel ? `channel ${nextRelease.channel}` : 'default channel'} to tag ${ `Add ${nextRelease.channel ? `channel ${nextRelease.channel}` : "default channel"} to tag ${
nextRelease.gitTag nextRelease.gitTag
}` }`
); );
@ -140,9 +143,9 @@ async function run(context, plugins) {
gitHead: nextRelease.gitHead, gitHead: nextRelease.gitHead,
}); });
const releases = await plugins.addChannel({...context, commits, lastRelease, currentRelease, nextRelease}); const releases = await plugins.addChannel({ ...context, commits, lastRelease, currentRelease, nextRelease });
context.releases.push(...releases); context.releases.push(...releases);
await plugins.success({...context, lastRelease, commits, nextRelease, releases}); await plugins.success({ ...context, lastRelease, commits, nextRelease, releases });
} }
} }
@ -152,7 +155,7 @@ async function run(context, plugins) {
context.lastRelease = getLastRelease(context); context.lastRelease = getLastRelease(context);
if (context.lastRelease.gitHead) { if (context.lastRelease.gitHead) {
context.lastRelease.gitHead = await getTagHead(context.lastRelease.gitHead, {cwd, env}); context.lastRelease.gitHead = await getTagHead(context.lastRelease.gitHead, { cwd, env });
} }
if (context.lastRelease.gitTag) { if (context.lastRelease.gitTag) {
@ -168,11 +171,11 @@ async function run(context, plugins) {
const nextRelease = { const nextRelease = {
type: await plugins.analyzeCommits(context), type: await plugins.analyzeCommits(context),
channel: context.branch.channel || null, channel: context.branch.channel || null,
gitHead: await getGitHead({cwd, env}), gitHead: await getGitHead({ cwd, env }),
}; };
if (!nextRelease.type) { if (!nextRelease.type) {
logger.log('There are no relevant changes, so no new version is released.'); logger.log("There are no relevant changes, so no new version is released.");
return context.releases.length > 0 ? {releases: context.releases} : false; return context.releases.length > 0 ? { releases: context.releases } : false;
} }
context.nextRelease = nextRelease; context.nextRelease = nextRelease;
@ -180,11 +183,11 @@ async function run(context, plugins) {
nextRelease.gitTag = makeTag(options.tagFormat, nextRelease.version); nextRelease.gitTag = makeTag(options.tagFormat, nextRelease.version);
nextRelease.name = nextRelease.gitTag; nextRelease.name = nextRelease.gitTag;
if (context.branch.type !== 'prerelease' && !semver.satisfies(nextRelease.version, context.branch.range)) { if (context.branch.type !== "prerelease" && !semver.satisfies(nextRelease.version, context.branch.range)) {
throw getError('EINVALIDNEXTVERSION', { throw getError("EINVALIDNEXTVERSION", {
...context, ...context,
validBranches: context.branches.filter( validBranches: context.branches.filter(
({type, accept}) => type !== 'prerelease' && accept.includes(nextRelease.type) ({ type, accept }) => type !== "prerelease" && accept.includes(nextRelease.type)
), ),
}); });
} }
@ -199,20 +202,20 @@ async function run(context, plugins) {
logger.warn(`Skip ${nextRelease.gitTag} tag creation in dry-run mode`); logger.warn(`Skip ${nextRelease.gitTag} tag creation in dry-run mode`);
} else { } else {
// Create the tag before calling the publish plugins as some require the tag to exists // Create the tag before calling the publish plugins as some require the tag to exists
await tag(nextRelease.gitTag, nextRelease.gitHead, {cwd, env}); await tag(nextRelease.gitTag, nextRelease.gitHead, { cwd, env });
await addNote({channels: [nextRelease.channel]}, nextRelease.gitHead, {cwd, env}); await addNote({ channels: [nextRelease.channel] }, nextRelease.gitHead, { cwd, env });
await push(options.repositoryUrl, {cwd, env}); await push(options.repositoryUrl, { cwd, env });
await pushNotes(options.repositoryUrl, {cwd, env}); await pushNotes(options.repositoryUrl, { cwd, env });
logger.success(`Created tag ${nextRelease.gitTag}`); logger.success(`Created tag ${nextRelease.gitTag}`);
} }
const releases = await plugins.publish(context); const releases = await plugins.publish(context);
context.releases.push(...releases); context.releases.push(...releases);
await plugins.success({...context, releases}); await plugins.success({ ...context, releases });
logger.success( logger.success(
`Published release ${nextRelease.version} on ${nextRelease.channel ? nextRelease.channel : 'default'} channel` `Published release ${nextRelease.version} on ${nextRelease.channel ? nextRelease.channel : "default"} channel`
); );
if (options.dryRun) { if (options.dryRun) {
@ -222,10 +225,10 @@ async function run(context, plugins) {
} }
} }
return pick(context, ['lastRelease', 'commits', 'nextRelease', 'releases']); return pick(context, ["lastRelease", "commits", "nextRelease", "releases"]);
} }
async function logErrors({logger, stderr}, err) { async function logErrors({ logger, stderr }, err) {
const errors = extractErrors(err).sort((error) => (error.semanticRelease ? -1 : 0)); const errors = extractErrors(err).sort((error) => (error.semanticRelease ? -1 : 0));
for (const error of errors) { for (const error of errors) {
if (error.semanticRelease) { if (error.semanticRelease) {
@ -234,7 +237,7 @@ async function logErrors({logger, stderr}, err) {
stderr.write(await terminalOutput(error.details)); // eslint-disable-line no-await-in-loop stderr.write(await terminalOutput(error.details)); // eslint-disable-line no-await-in-loop
} }
} else { } else {
logger.error('An error occurred while running semantic-release: %O', error); logger.error("An error occurred while running semantic-release: %O", error);
} }
} }
} }
@ -243,16 +246,16 @@ async function callFail(context, plugins, err) {
const errors = extractErrors(err).filter((err) => err.semanticRelease); const errors = extractErrors(err).filter((err) => err.semanticRelease);
if (errors.length > 0) { if (errors.length > 0) {
try { try {
await plugins.fail({...context, errors}); await plugins.fail({ ...context, errors });
} catch (error) { } catch (error) {
await logErrors(context, error); await logErrors(context, error);
} }
} }
} }
export default async (cliOptions = {}, {cwd = process.cwd(), env = process.env, stdout, stderr} = {}) => { export default async (cliOptions = {}, { cwd = process.cwd(), env = process.env, stdout, stderr } = {}) => {
const {unhook} = hookStd( const { unhook } = hookStd(
{silent: false, streams: [process.stdout, process.stderr, stdout, stderr].filter(Boolean)}, { silent: false, streams: [process.stdout, process.stderr, stdout, stderr].filter(Boolean) },
hideSensitive(env) hideSensitive(env)
); );
const context = { const context = {
@ -260,12 +263,12 @@ export default async (cliOptions = {}, {cwd = process.cwd(), env = process.env,
env, env,
stdout: stdout || process.stdout, stdout: stdout || process.stdout,
stderr: stderr || process.stderr, stderr: stderr || process.stderr,
envCi: envCi({env, cwd}), envCi: envCi({ env, cwd }),
}; };
context.logger = getLogger(context); context.logger = getLogger(context);
context.logger.log(`Running ${pkg.name} version ${pkg.version}`); context.logger.log(`Running ${pkg.name} version ${pkg.version}`);
try { try {
const {plugins, options} = await getConfig(context, cliOptions); const { plugins, options } = await getConfig(context, cliOptions);
options.originalRepositoryURL = options.repositoryUrl; options.originalRepositoryURL = options.repositoryUrl;
context.options = options; context.options = options;
try { try {
@ -281,4 +284,4 @@ export default async (cliOptions = {}, {cwd = process.cwd(), env = process.env,
unhook(); unhook();
throw error; throw error;
} }
} };
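(Editorial aside, not part of the diff.) The reformatted default export above is semantic-release's programmatic entry point. A minimal, hedged sketch of calling it; the option values are illustrative:

```js
// Sketch only; the option values are made up and not taken from this diff.
import semanticRelease from "semantic-release";

const result = await semanticRelease(
  { branches: ["master"], dryRun: true }, // cliOptions, merged with file configuration by get-config.js
  { cwd: process.cwd(), env: process.env } // matches the { cwd, env, stdout, stderr } signature above
);

// `result` is either `false` (nothing released) or an object with some of the keys
// picked at the end of run(): lastRelease, commits, nextRelease, releases.
if (result && result.nextRelease) {
  console.log(`Next release: ${result.nextRelease.version}`);
}
```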


@ -1,7 +1,7 @@
import debugCommits from 'debug'; import debugCommits from "debug";
import {getCommits} from './git.js'; import { getCommits } from "./git.js";
const debug = debugCommits('semantic-release:get-commits'); const debug = debugCommits("semantic-release:get-commits");
/** /**
* Retrieve the list of commits on the current branch since the commit sha associated with the last release, or all the commits of the current branch if there is no last released version. * Retrieve the list of commits on the current branch since the commit sha associated with the last release, or all the commits of the current branch if there is no last released version.
@ -10,16 +10,22 @@ const debug = debugCommits('semantic-release:get-commits');
* *
* @return {Promise<Array<Object>>} The list of commits on the branch `branch` since the last release. * @return {Promise<Array<Object>>} The list of commits on the branch `branch` since the last release.
*/ */
export default async ({cwd, env, lastRelease: {gitHead: from}, nextRelease: {gitHead: to = 'HEAD'} = {}, logger}) => { export default async ({
cwd,
env,
lastRelease: { gitHead: from },
nextRelease: { gitHead: to = "HEAD" } = {},
logger,
}) => {
if (from) { if (from) {
debug('Use from: %s', from); debug("Use from: %s", from);
} else { } else {
logger.log('No previous release found, retrieving all commits'); logger.log("No previous release found, retrieving all commits");
} }
const commits = await getCommits(from, to, {cwd, env}); const commits = await getCommits(from, to, { cwd, env });
logger.log(`Found ${commits.length} commits since last release`); logger.log(`Found ${commits.length} commits since last release`);
debug('Parsed commits: %o', commits); debug("Parsed commits: %o", commits);
return commits; return commits;
} };
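A hedged usage sketch for the module above (not in the diff); the sha is made up, and in practice the context object is assembled by the main entry point:

```js
import getCommits from "./lib/get-commits.js";

const commits = await getCommits({
  cwd: process.cwd(),
  env: process.env,
  lastRelease: { gitHead: "abc1234" }, // illustrative sha; only commits after it are returned
  nextRelease: { gitHead: "HEAD" }, // optional, defaults to "HEAD" as in the signature above
  logger: console, // anything exposing log() is enough for this sketch
});
// Each entry comes from lib/git.js#getCommits with trimmed `message` and `gitTags` fields.
```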


@ -1,39 +1,39 @@
import {dirname, resolve} from 'node:path'; import { dirname, resolve } from "node:path";
import {fileURLToPath} from 'node:url'; import { fileURLToPath } from "node:url";
import {createRequire} from 'node:module'; import { createRequire } from "node:module";
import {castArray, isNil, isPlainObject, isString, pickBy} from 'lodash-es'; import { castArray, isNil, isPlainObject, isString, pickBy } from "lodash-es";
import {readPackageUp} from 'read-pkg-up'; import { readPackageUp } from "read-pkg-up";
import {cosmiconfig} from 'cosmiconfig'; import { cosmiconfig } from "cosmiconfig";
import resolveFrom from 'resolve-from'; import resolveFrom from "resolve-from";
import debugConfig from 'debug'; import debugConfig from "debug";
import {repoUrl} from './git.js'; import { repoUrl } from "./git.js";
import PLUGINS_DEFINITIONS from './definitions/plugins.js'; import PLUGINS_DEFINITIONS from "./definitions/plugins.js";
import plugins from './plugins/index.js'; import plugins from "./plugins/index.js";
import {parseConfig, validatePlugin} from './plugins/utils.js'; import { parseConfig, validatePlugin } from "./plugins/utils.js";
const debug = debugConfig('semantic-release:config'); const debug = debugConfig("semantic-release:config");
const __dirname = dirname(fileURLToPath(import.meta.url)); const __dirname = dirname(fileURLToPath(import.meta.url));
const require = createRequire(import.meta.url); const require = createRequire(import.meta.url);
const CONFIG_NAME = 'release'; const CONFIG_NAME = "release";
export default async (context, cliOptions) => { export default async (context, cliOptions) => {
const {cwd, env} = context; const { cwd, env } = context;
const {config, filepath} = (await cosmiconfig(CONFIG_NAME).search(cwd)) || {}; const { config, filepath } = (await cosmiconfig(CONFIG_NAME).search(cwd)) || {};
debug('load config from: %s', filepath); debug("load config from: %s", filepath);
// Merge config file options and CLI/API options // Merge config file options and CLI/API options
let options = {...config, ...cliOptions}; let options = { ...config, ...cliOptions };
const pluginsPath = {}; const pluginsPath = {};
let extendPaths; let extendPaths;
({extends: extendPaths, ...options} = options); ({ extends: extendPaths, ...options } = options);
if (extendPaths) { if (extendPaths) {
// If `extends` is defined, load and merge each shareable config with `options` // If `extends` is defined, load and merge each shareable config with `options`
options = { options = {
...await (castArray(extendPaths).reduce(async(eventualResult, extendPath) => { ...(await castArray(extendPaths).reduce(async (eventualResult, extendPath) => {
const result = await eventualResult; const result = await eventualResult;
const extendsOptions = require(resolveFrom.silent(__dirname, extendPath) || resolveFrom(cwd, extendPath)); const extendsOptions = require(resolveFrom.silent(__dirname, extendPath) || resolveFrom(cwd, extendPath));
@ -43,7 +43,7 @@ export default async (context, cliOptions) => {
.filter(([, value]) => Boolean(value)) .filter(([, value]) => Boolean(value))
.reduce((pluginsPath, [option, value]) => { .reduce((pluginsPath, [option, value]) => {
castArray(value).forEach((plugin) => { castArray(value).forEach((plugin) => {
if (option === 'plugins' && validatePlugin(plugin)) { if (option === "plugins" && validatePlugin(plugin)) {
pluginsPath[parseConfig(plugin)[0]] = extendPath; pluginsPath[parseConfig(plugin)[0]] = extendPath;
} else if ( } else if (
PLUGINS_DEFINITIONS[option] && PLUGINS_DEFINITIONS[option] &&
@ -55,7 +55,7 @@ export default async (context, cliOptions) => {
return pluginsPath; return pluginsPath;
}, pluginsPath); }, pluginsPath);
return {...result, ...extendsOptions}; return { ...result, ...extendsOptions };
}, {})), }, {})),
...options, ...options,
}; };
@ -64,36 +64,36 @@ export default async (context, cliOptions) => {
// Set default options values if not defined yet // Set default options values if not defined yet
options = { options = {
branches: [ branches: [
'+([0-9])?(.{+([0-9]),x}).x', "+([0-9])?(.{+([0-9]),x}).x",
'master', "master",
'next', "next",
'next-major', "next-major",
{name: 'beta', prerelease: true}, { name: "beta", prerelease: true },
{name: 'alpha', prerelease: true}, { name: "alpha", prerelease: true },
], ],
repositoryUrl: (await pkgRepoUrl({normalize: false, cwd})) || (await repoUrl({cwd, env})), repositoryUrl: (await pkgRepoUrl({ normalize: false, cwd })) || (await repoUrl({ cwd, env })),
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: [ plugins: [
'@semantic-release/commit-analyzer', "@semantic-release/commit-analyzer",
'@semantic-release/release-notes-generator', "@semantic-release/release-notes-generator",
'@semantic-release/npm', "@semantic-release/npm",
'@semantic-release/github', "@semantic-release/github",
], ],
// Remove `null` and `undefined` options, so they can be replaced with default ones // Remove `null` and `undefined` options, so they can be replaced with default ones
...pickBy(options, (option) => !isNil(option)), ...pickBy(options, (option) => !isNil(option)),
...(options.branches ? {branches: castArray(options.branches)} : {}), ...(options.branches ? { branches: castArray(options.branches) } : {}),
}; };
if (options.ci === false) { if (options.ci === false) {
options.noCi = true; options.noCi = true;
} }
debug('options values: %O', options); debug("options values: %O", options);
return {options, plugins: await plugins({...context, options}, pluginsPath)}; return { options, plugins: await plugins({ ...context, options }, pluginsPath) };
} };
async function pkgRepoUrl(options) { async function pkgRepoUrl(options) {
const {packageJson} = (await readPackageUp(options)) || {}; const { packageJson } = (await readPackageUp(options)) || {};
return packageJson && (isPlainObject(packageJson.repository) ? packageJson.repository.url : packageJson.repository); return packageJson && (isPlainObject(packageJson.repository) ? packageJson.repository.url : packageJson.repository);
} }
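For reference (editorial, not part of the diff): when nothing is configured, the defaults assembled above resolve to roughly the options object below; any key can be overridden by a configuration file found by cosmiconfig under the "release" name, by a shareable config listed in `extends`, or by CLI/API options.

```js
// Rough shape of the resolved defaults shown above. repositoryUrl is omitted because it is read
// from the package.json `repository` field or from `git config --get remote.origin.url`.
const defaultOptions = {
  branches: [
    "+([0-9])?(.{+([0-9]),x}).x",
    "master",
    "next",
    "next-major",
    { name: "beta", prerelease: true },
    { name: "alpha", prerelease: true },
  ],
  tagFormat: "v${version}", // plain string here; the source uses a template literal with an escaped ${version}
  plugins: [
    "@semantic-release/commit-analyzer",
    "@semantic-release/release-notes-generator",
    "@semantic-release/npm",
    "@semantic-release/github",
  ],
};
```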


@ -1,7 +1,7 @@
import SemanticReleaseError from '@semantic-release/error'; import SemanticReleaseError from "@semantic-release/error";
import * as ERROR_DEFINITIONS from './definitions/errors.js'; import * as ERROR_DEFINITIONS from "./definitions/errors.js";
export default (code, ctx = {}) => { export default (code, ctx = {}) => {
const {message, details} = ERROR_DEFINITIONS[code](ctx); const { message, details } = ERROR_DEFINITIONS[code](ctx);
return new SemanticReleaseError(message, code, details); return new SemanticReleaseError(message, code, details);
} };
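A short sketch (not in the diff) of how this factory is used elsewhere in the codebase, for example with the ENOGITREPO code that appears in the verification module later in this commit:

```js
import getError from "./lib/get-error.js";

// Looks up the message and details for the code in definitions/errors.js and wraps them
// in a SemanticReleaseError carrying that code.
throw getError("ENOGITREPO", { cwd: process.cwd() });
```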


@ -1,10 +1,10 @@
import {format, parse} from 'node:url'; import { format, parse } from "node:url";
import {isNil} from 'lodash-es'; import { isNil } from "lodash-es";
import hostedGitInfo from 'hosted-git-info'; import hostedGitInfo from "hosted-git-info";
import debugAuthUrl from 'debug'; import debugAuthUrl from "debug";
import {verifyAuth} from './git.js'; import { verifyAuth } from "./git.js";
const debug = debugAuthUrl('semantic-release:get-git-auth-url'); const debug = debugAuthUrl("semantic-release:get-git-auth-url");
/** /**
* Machinery to format a repository URL with the given credentials * Machinery to format a repository URL with the given credentials
@ -18,15 +18,15 @@ const debug = debugAuthUrl('semantic-release:get-git-auth-url');
function formatAuthUrl(protocol, repositoryUrl, gitCredentials) { function formatAuthUrl(protocol, repositoryUrl, gitCredentials) {
const [match, auth, host, basePort, path] = const [match, auth, host, basePort, path] =
/^(?!.+:\/\/)(?:(?<auth>.*)@)?(?<host>.*?):(?<port>\d+)?:?\/?(?<path>.*)$/.exec(repositoryUrl) || []; /^(?!.+:\/\/)(?:(?<auth>.*)@)?(?<host>.*?):(?<port>\d+)?:?\/?(?<path>.*)$/.exec(repositoryUrl) || [];
const {port, hostname, ...parsed} = parse( const { port, hostname, ...parsed } = parse(
match ? `ssh://${auth ? `${auth}@` : ''}${host}${basePort ? `:${basePort}` : ''}/${path}` : repositoryUrl match ? `ssh://${auth ? `${auth}@` : ""}${host}${basePort ? `:${basePort}` : ""}/${path}` : repositoryUrl
); );
return format({ return format({
...parsed, ...parsed,
auth: gitCredentials, auth: gitCredentials,
host: `${hostname}${protocol === 'ssh:' ? '' : port ? `:${port}` : ''}`, host: `${hostname}${protocol === "ssh:" ? "" : port ? `:${port}` : ""}`,
protocol: protocol && /http[^s]/.test(protocol) ? 'http' : 'https', protocol: protocol && /http[^s]/.test(protocol) ? "http" : "https",
}); });
} }
@ -38,9 +38,9 @@ function formatAuthUrl(protocol, repositoryUrl, gitCredentials) {
* *
 * @return {String} The authUrl as is if the connection was successful, null otherwise * @return {String} The authUrl as is if the connection was successful, null otherwise
*/ */
async function ensureValidAuthUrl({cwd, env, branch}, authUrl) { async function ensureValidAuthUrl({ cwd, env, branch }, authUrl) {
try { try {
await verifyAuth(authUrl, branch.name, {cwd, env}); await verifyAuth(authUrl, branch.name, { cwd, env });
return authUrl; return authUrl;
} catch (error) { } catch (error) {
debug(error); debug(error);
@ -60,44 +60,44 @@ async function ensureValidAuthUrl({cwd, env, branch}, authUrl) {
* @return {String} The formatted Git repository URL. * @return {String} The formatted Git repository URL.
*/ */
export default async (context) => { export default async (context) => {
const {cwd, env, branch} = context; const { cwd, env, branch } = context;
const GIT_TOKENS = { const GIT_TOKENS = {
GIT_CREDENTIALS: undefined, GIT_CREDENTIALS: undefined,
GH_TOKEN: undefined, GH_TOKEN: undefined,
// GitHub Actions require the "x-access-token:" prefix for git access // GitHub Actions require the "x-access-token:" prefix for git access
// https://developer.github.com/apps/building-github-apps/authenticating-with-github-apps/#http-based-git-access-by-an-installation // https://developer.github.com/apps/building-github-apps/authenticating-with-github-apps/#http-based-git-access-by-an-installation
GITHUB_TOKEN: isNil(env.GITHUB_ACTION) ? undefined : 'x-access-token:', GITHUB_TOKEN: isNil(env.GITHUB_ACTION) ? undefined : "x-access-token:",
GL_TOKEN: 'gitlab-ci-token:', GL_TOKEN: "gitlab-ci-token:",
GITLAB_TOKEN: 'gitlab-ci-token:', GITLAB_TOKEN: "gitlab-ci-token:",
BB_TOKEN: 'x-token-auth:', BB_TOKEN: "x-token-auth:",
BITBUCKET_TOKEN: 'x-token-auth:', BITBUCKET_TOKEN: "x-token-auth:",
BB_TOKEN_BASIC_AUTH: '', BB_TOKEN_BASIC_AUTH: "",
BITBUCKET_TOKEN_BASIC_AUTH: '', BITBUCKET_TOKEN_BASIC_AUTH: "",
}; };
let {repositoryUrl} = context.options; let { repositoryUrl } = context.options;
const info = hostedGitInfo.fromUrl(repositoryUrl, {noGitPlus: true}); const info = hostedGitInfo.fromUrl(repositoryUrl, { noGitPlus: true });
const {protocol, ...parsed} = parse(repositoryUrl); const { protocol, ...parsed } = parse(repositoryUrl);
if (info && info.getDefaultRepresentation() === 'shortcut') { if (info && info.getDefaultRepresentation() === "shortcut") {
// Expand shorthand URLs (such as `owner/repo` or `gitlab:owner/repo`) // Expand shorthand URLs (such as `owner/repo` or `gitlab:owner/repo`)
repositoryUrl = info.https(); repositoryUrl = info.https();
} else if (protocol && protocol.includes('http')) { } else if (protocol && protocol.includes("http")) {
// Replace `git+https` and `git+http` with `https` or `http` // Replace `git+https` and `git+http` with `https` or `http`
repositoryUrl = format({...parsed, protocol: protocol.includes('https') ? 'https' : 'http', href: null}); repositoryUrl = format({ ...parsed, protocol: protocol.includes("https") ? "https" : "http", href: null });
} }
// Test if push is allowed without transforming the URL (e.g. if ssh keys are set up) // Test if push is allowed without transforming the URL (e.g. if ssh keys are set up)
try { try {
debug('Verifying ssh auth by attempting to push to %s', repositoryUrl); debug("Verifying ssh auth by attempting to push to %s", repositoryUrl);
await verifyAuth(repositoryUrl, branch.name, {cwd, env}); await verifyAuth(repositoryUrl, branch.name, { cwd, env });
} catch { } catch {
debug('SSH key auth failed, falling back to https.'); debug("SSH key auth failed, falling back to https.");
const envVars = Object.keys(GIT_TOKENS).filter((envVar) => !isNil(env[envVar])); const envVars = Object.keys(GIT_TOKENS).filter((envVar) => !isNil(env[envVar]));
// Skip verification if there is no ambiguity on which env var to use for authentication // Skip verification if there is no ambiguity on which env var to use for authentication
if (envVars.length === 1) { if (envVars.length === 1) {
const gitCredentials = `${GIT_TOKENS[envVars[0]] || ''}${env[envVars[0]]}`; const gitCredentials = `${GIT_TOKENS[envVars[0]] || ""}${env[envVars[0]]}`;
return formatAuthUrl(protocol, repositoryUrl, gitCredentials); return formatAuthUrl(protocol, repositoryUrl, gitCredentials);
} }
@ -106,7 +106,7 @@ export default async (context) => {
const candidateRepositoryUrls = []; const candidateRepositoryUrls = [];
for (const envVar of envVars) { for (const envVar of envVars) {
const gitCredentials = `${GIT_TOKENS[envVar] || ''}${env[envVar]}`; const gitCredentials = `${GIT_TOKENS[envVar] || ""}${env[envVar]}`;
const authUrl = formatAuthUrl(protocol, repositoryUrl, gitCredentials); const authUrl = formatAuthUrl(protocol, repositoryUrl, gitCredentials);
candidateRepositoryUrls.push(ensureValidAuthUrl(context, authUrl)); candidateRepositoryUrls.push(ensureValidAuthUrl(context, authUrl));
} }
@ -121,4 +121,4 @@ export default async (context) => {
} }
return repositoryUrl; return repositoryUrl;
} };
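To make the fallback above concrete, an illustration with made-up values (editorial, not part of the diff):

```js
// Assume options.repositoryUrl is "git@github.com:owner/repo.git" and only env.GH_TOKEN is set.
// The ssh push check (`git push --dry-run --no-verify`) fails in a token-only CI environment, so the
// module falls back to an HTTPS URL carrying the credentials. GH_TOKEN has no prefix in GIT_TOKENS, so:
//
//   https://<GH_TOKEN value>@github.com/owner/repo.git
//
// With GL_TOKEN or GITLAB_TOKEN the "gitlab-ci-token:" prefix would be prepended to the token instead.
```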


@ -1,6 +1,6 @@
import {isUndefined} from 'lodash-es'; import { isUndefined } from "lodash-es";
import semver from 'semver'; import semver from "semver";
import {isSameChannel, makeTag} from './utils.js'; import { isSameChannel, makeTag } from "./utils.js";
/** /**
* Last release. * Last release.
@ -26,19 +26,19 @@ import {isSameChannel, makeTag} from './utils.js';
* *
* @return {LastRelease} The last tagged release or empty object if none is found. * @return {LastRelease} The last tagged release or empty object if none is found.
*/ */
export default ({branch, options: {tagFormat}}, {before} = {}) => { export default ({ branch, options: { tagFormat } }, { before } = {}) => {
const [{version, gitTag, channels} = {}] = branch.tags const [{ version, gitTag, channels } = {}] = branch.tags
.filter( .filter(
(tag) => (tag) =>
((branch.type === 'prerelease' && tag.channels.some((channel) => isSameChannel(branch.channel, channel))) || ((branch.type === "prerelease" && tag.channels.some((channel) => isSameChannel(branch.channel, channel))) ||
!semver.prerelease(tag.version)) && !semver.prerelease(tag.version)) &&
(isUndefined(before) || semver.lt(tag.version, before)) (isUndefined(before) || semver.lt(tag.version, before))
) )
.sort((a, b) => semver.rcompare(a.version, b.version)); .sort((a, b) => semver.rcompare(a.version, b.version));
if (gitTag) { if (gitTag) {
return {version, gitTag, channels, gitHead: gitTag, name: makeTag(tagFormat, version)}; return { version, gitTag, channels, gitHead: gitTag, name: makeTag(tagFormat, version) };
} }
return {}; return {};
} };


@ -1,18 +1,18 @@
import signale from 'signale'; import signale from "signale";
import figures from 'figures'; import figures from "figures";
const {Signale} = signale; const { Signale } = signale;
export default ({stdout, stderr}) => export default ({ stdout, stderr }) =>
new Signale({ new Signale({
config: {displayTimestamp: true, underlineMessage: false, displayLabel: false}, config: { displayTimestamp: true, underlineMessage: false, displayLabel: false },
disabled: false, disabled: false,
interactive: false, interactive: false,
scope: 'semantic-release', scope: "semantic-release",
stream: [stdout], stream: [stdout],
types: { types: {
error: {badge: figures.cross, color: 'red', label: '', stream: [stderr]}, error: { badge: figures.cross, color: "red", label: "", stream: [stderr] },
log: {badge: figures.info, color: 'magenta', label: '', stream: [stdout]}, log: { badge: figures.info, color: "magenta", label: "", stream: [stdout] },
success: {badge: figures.tick, color: 'green', label: '', stream: [stdout]}, success: { badge: figures.tick, color: "green", label: "", stream: [stdout] },
}, },
}) });


@ -1,20 +1,20 @@
import semver from 'semver'; import semver from "semver";
import {FIRST_RELEASE, FIRSTPRERELEASE} from './definitions/constants.js'; import { FIRST_RELEASE, FIRSTPRERELEASE } from "./definitions/constants.js";
import {getLatestVersion, highest, isSameChannel, tagsToVersions} from './utils.js'; import { getLatestVersion, highest, isSameChannel, tagsToVersions } from "./utils.js";
export default ({branch, nextRelease: {type, channel}, lastRelease, logger}) => { export default ({ branch, nextRelease: { type, channel }, lastRelease, logger }) => {
let version; let version;
if (lastRelease.version) { if (lastRelease.version) {
const {major, minor, patch} = semver.parse(lastRelease.version); const { major, minor, patch } = semver.parse(lastRelease.version);
if (branch.type === 'prerelease') { if (branch.type === "prerelease") {
if ( if (
semver.prerelease(lastRelease.version) && semver.prerelease(lastRelease.version) &&
lastRelease.channels.some((lastReleaseChannel) => isSameChannel(lastReleaseChannel, channel)) lastRelease.channels.some((lastReleaseChannel) => isSameChannel(lastReleaseChannel, channel))
) { ) {
version = highest( version = highest(
semver.inc(lastRelease.version, 'prerelease'), semver.inc(lastRelease.version, "prerelease"),
`${semver.inc(getLatestVersion(tagsToVersions(branch.tags), {withPrerelease: true}), type)}-${ `${semver.inc(getLatestVersion(tagsToVersions(branch.tags), { withPrerelease: true }), type)}-${
branch.prerelease branch.prerelease
}.${FIRSTPRERELEASE}` }.${FIRSTPRERELEASE}`
); );
@ -25,11 +25,11 @@ export default ({branch, nextRelease: {type, channel}, lastRelease, logger}) =>
version = semver.inc(lastRelease.version, type); version = semver.inc(lastRelease.version, type);
} }
logger.log('The next release version is %s', version); logger.log("The next release version is %s", version);
} else { } else {
version = branch.type === 'prerelease' ? `${FIRST_RELEASE}-${branch.prerelease}.${FIRSTPRERELEASE}` : FIRST_RELEASE; version = branch.type === "prerelease" ? `${FIRST_RELEASE}-${branch.prerelease}.${FIRSTPRERELEASE}` : FIRST_RELEASE;
logger.log(`There is no previous release, the next release version is ${version}`); logger.log(`There is no previous release, the next release version is ${version}`);
} }
return version; return version;
} };
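Two worked examples for the branching above (editorial, not in the diff); the versions are illustrative, and FIRST_RELEASE / FIRSTPRERELEASE come from definitions/constants.js, which is not part of this commit:

```js
import semver from "semver";

// Regular branch with an existing release: the next version is a plain semver increment.
semver.inc("1.2.3", "minor"); // -> "1.3.0"
semver.inc("1.2.3", "major"); // -> "2.0.0"

// First release on a prerelease branch named "beta":
// version = `${FIRST_RELEASE}-beta.${FIRSTPRERELEASE}`
```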


@ -1,8 +1,8 @@
import {intersection, uniqBy} from 'lodash-es'; import { intersection, uniqBy } from "lodash-es";
import semver from 'semver'; import semver from "semver";
import semverDiff from 'semver-diff'; import semverDiff from "semver-diff";
import getLastRelease from './get-last-release.js'; import getLastRelease from "./get-last-release.js";
import {getLowerBound, makeTag} from './utils.js'; import { getLowerBound, makeTag } from "./utils.js";
/** /**
 * Find releases that have been merged from a higher branch but not added on the channel of the current branch. * Find releases that have been merged from a higher branch but not added on the channel of the current branch.
@ -15,38 +15,38 @@ export default (context) => {
const { const {
branch, branch,
branches, branches,
options: {tagFormat}, options: { tagFormat },
} = context; } = context;
const higherChannels = branches const higherChannels = branches
// Consider only releases of higher branches // Consider only releases of higher branches
.slice(branches.findIndex(({name}) => name === branch.name) + 1) .slice(branches.findIndex(({ name }) => name === branch.name) + 1)
// Exclude prerelease branches // Exclude prerelease branches
.filter(({type}) => type !== 'prerelease') .filter(({ type }) => type !== "prerelease")
.map(({channel}) => channel || null); .map(({ channel }) => channel || null);
const versiontoAdd = uniqBy( const versiontoAdd = uniqBy(
branch.tags.filter( branch.tags.filter(
({channels, version}) => ({ channels, version }) =>
!channels.includes(branch.channel || null) && !channels.includes(branch.channel || null) &&
intersection(channels, higherChannels).length > 0 && intersection(channels, higherChannels).length > 0 &&
(branch.type !== 'maintenance' || semver.gte(version, getLowerBound(branch.mergeRange))) (branch.type !== "maintenance" || semver.gte(version, getLowerBound(branch.mergeRange)))
), ),
'version' "version"
).sort((a, b) => semver.compare(b.version, a.version))[0]; ).sort((a, b) => semver.compare(b.version, a.version))[0];
if (versiontoAdd) { if (versiontoAdd) {
const {version, gitTag, channels} = versiontoAdd; const { version, gitTag, channels } = versiontoAdd;
const lastRelease = getLastRelease(context, {before: version}); const lastRelease = getLastRelease(context, { before: version });
if (semver.gt(getLastRelease(context).version, version)) { if (semver.gt(getLastRelease(context).version, version)) {
return; return;
} }
const type = lastRelease.version ? semverDiff(lastRelease.version, version) : 'major'; const type = lastRelease.version ? semverDiff(lastRelease.version, version) : "major";
const name = makeTag(tagFormat, version); const name = makeTag(tagFormat, version);
return { return {
lastRelease, lastRelease,
currentRelease: {type, version, channels, gitTag, name, gitHead: gitTag}, currentRelease: { type, version, channels, gitTag, name, gitHead: gitTag },
nextRelease: { nextRelease: {
type, type,
version, version,
@ -57,4 +57,4 @@ export default (context) => {
}, },
}; };
} }
} };


@ -1,12 +1,12 @@
import gitLogParser from 'git-log-parser'; import gitLogParser from "git-log-parser";
import getStream from 'get-stream'; import getStream from "get-stream";
import {execa} from 'execa'; import { execa } from "execa";
import debugGit from 'debug'; import debugGit from "debug";
import {GIT_NOTE_REF} from './definitions/constants.js'; import { GIT_NOTE_REF } from "./definitions/constants.js";
const debug = debugGit('semantic-release:git'); const debug = debugGit("semantic-release:git");
Object.assign(gitLogParser.fields, {hash: 'H', message: 'B', gitTags: 'd', committerDate: {key: 'ci', type: Date}}); Object.assign(gitLogParser.fields, { hash: "H", message: "B", gitTags: "d", committerDate: { key: "ci", type: Date } });
/** /**
* Get the commit sha for a given tag. * Get the commit sha for a given tag.
@ -17,7 +17,7 @@ Object.assign(gitLogParser.fields, {hash: 'H', message: 'B', gitTags: 'd', commi
* @return {String} The commit sha of the tag in parameter or `null`. * @return {String} The commit sha of the tag in parameter or `null`.
*/ */
export async function getTagHead(tagName, execaOptions) { export async function getTagHead(tagName, execaOptions) {
return (await execa('git', ['rev-list', '-1', tagName], execaOptions)).stdout; return (await execa("git", ["rev-list", "-1", tagName], execaOptions)).stdout;
} }
/** /**
@ -30,8 +30,8 @@ export async function getTagHead(tagName, execaOptions) {
* @throws {Error} If the `git` command fails. * @throws {Error} If the `git` command fails.
*/ */
export async function getTags(branch, execaOptions) { export async function getTags(branch, execaOptions) {
return (await execa('git', ['tag', '--merged', branch], execaOptions)).stdout return (await execa("git", ["tag", "--merged", branch], execaOptions)).stdout
.split('\n') .split("\n")
.map((tag) => tag.trim()) .map((tag) => tag.trim())
.filter(Boolean); .filter(Boolean);
} }
@ -48,11 +48,11 @@ export async function getCommits(from, to, execaOptions) {
return ( return (
await getStream.array( await getStream.array(
gitLogParser.parse( gitLogParser.parse(
{_: `${from ? from + '..' : ''}${to}`}, { _: `${from ? from + ".." : ""}${to}` },
{cwd: execaOptions.cwd, env: {...process.env, ...execaOptions.env}} { cwd: execaOptions.cwd, env: { ...process.env, ...execaOptions.env } }
) )
) )
).map(({message, gitTags, ...commit}) => ({...commit, message: message.trim(), gitTags: gitTags.trim()})); ).map(({ message, gitTags, ...commit }) => ({ ...commit, message: message.trim(), gitTags: gitTags.trim() }));
} }
/** /**
@ -65,8 +65,8 @@ export async function getCommits(from, to, execaOptions) {
* @throws {Error} If the `git` command fails. * @throws {Error} If the `git` command fails.
*/ */
export async function getBranches(repositoryUrl, execaOptions) { export async function getBranches(repositoryUrl, execaOptions) {
return (await execa('git', ['ls-remote', '--heads', repositoryUrl], execaOptions)).stdout return (await execa("git", ["ls-remote", "--heads", repositoryUrl], execaOptions)).stdout
.split('\n') .split("\n")
.filter(Boolean) .filter(Boolean)
.map((branch) => branch.match(/^.+refs\/heads\/(?<branch>.+)$/)[1]); .map((branch) => branch.match(/^.+refs\/heads\/(?<branch>.+)$/)[1]);
} }
@ -81,7 +81,7 @@ export async function getBranches(repositoryUrl, execaOptions) {
*/ */
export async function isRefExists(ref, execaOptions) { export async function isRefExists(ref, execaOptions) {
try { try {
return (await execa('git', ['rev-parse', '--verify', ref], execaOptions)).exitCode === 0; return (await execa("git", ["rev-parse", "--verify", ref], execaOptions)).exitCode === 0;
} catch (error) { } catch (error) {
debug(error); debug(error);
} }
@ -103,30 +103,30 @@ export async function isRefExists(ref, execaOptions) {
*/ */
export async function fetch(repositoryUrl, branch, ciBranch, execaOptions) { export async function fetch(repositoryUrl, branch, ciBranch, execaOptions) {
const isDetachedHead = const isDetachedHead =
(await execa('git', ['rev-parse', '--abbrev-ref', 'HEAD'], {...execaOptions, reject: false})).stdout === 'HEAD'; (await execa("git", ["rev-parse", "--abbrev-ref", "HEAD"], { ...execaOptions, reject: false })).stdout === "HEAD";
try { try {
await execa( await execa(
'git', "git",
[ [
'fetch', "fetch",
'--unshallow', "--unshallow",
'--tags', "--tags",
...(branch === ciBranch && !isDetachedHead ...(branch === ciBranch && !isDetachedHead
? [repositoryUrl] ? [repositoryUrl]
: ['--update-head-ok', repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]), : ["--update-head-ok", repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]),
], ],
execaOptions execaOptions
); );
} catch { } catch {
await execa( await execa(
'git', "git",
[ [
'fetch', "fetch",
'--tags', "--tags",
...(branch === ciBranch && !isDetachedHead ...(branch === ciBranch && !isDetachedHead
? [repositoryUrl] ? [repositoryUrl]
: ['--update-head-ok', repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]), : ["--update-head-ok", repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]),
], ],
execaOptions execaOptions
); );
@ -142,12 +142,12 @@ export async function fetch(repositoryUrl, branch, ciBranch, execaOptions) {
export async function fetchNotes(repositoryUrl, execaOptions) { export async function fetchNotes(repositoryUrl, execaOptions) {
try { try {
await execa( await execa(
'git', "git",
['fetch', '--unshallow', repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`], ["fetch", "--unshallow", repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`],
execaOptions execaOptions
); );
} catch { } catch {
await execa('git', ['fetch', repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`], { await execa("git", ["fetch", repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`], {
...execaOptions, ...execaOptions,
reject: false, reject: false,
}); });
@ -162,7 +162,7 @@ export async function fetchNotes(repositoryUrl, execaOptions) {
* @return {String} the sha of the HEAD commit. * @return {String} the sha of the HEAD commit.
*/ */
export async function getGitHead(execaOptions) { export async function getGitHead(execaOptions) {
return (await execa('git', ['rev-parse', 'HEAD'], execaOptions)).stdout; return (await execa("git", ["rev-parse", "HEAD"], execaOptions)).stdout;
} }
/** /**
@ -174,7 +174,7 @@ export async function getGitHead(execaOptions) {
*/ */
export async function repoUrl(execaOptions) { export async function repoUrl(execaOptions) {
try { try {
return (await execa('git', ['config', '--get', 'remote.origin.url'], execaOptions)).stdout; return (await execa("git", ["config", "--get", "remote.origin.url"], execaOptions)).stdout;
} catch (error) { } catch (error) {
debug(error); debug(error);
} }
@ -189,7 +189,7 @@ export async function repoUrl(execaOptions) {
*/ */
export async function isGitRepo(execaOptions) { export async function isGitRepo(execaOptions) {
try { try {
return (await execa('git', ['rev-parse', '--git-dir'], execaOptions)).exitCode === 0; return (await execa("git", ["rev-parse", "--git-dir"], execaOptions)).exitCode === 0;
} catch (error) { } catch (error) {
debug(error); debug(error);
} }
@ -206,7 +206,7 @@ export async function isGitRepo(execaOptions) {
*/ */
export async function verifyAuth(repositoryUrl, branch, execaOptions) { export async function verifyAuth(repositoryUrl, branch, execaOptions) {
try { try {
await execa('git', ['push', '--dry-run', '--no-verify', repositoryUrl, `HEAD:${branch}`], execaOptions); await execa("git", ["push", "--dry-run", "--no-verify", repositoryUrl, `HEAD:${branch}`], execaOptions);
} catch (error) { } catch (error) {
debug(error); debug(error);
throw error; throw error;
@ -223,7 +223,7 @@ export async function verifyAuth(repositoryUrl, branch, execaOptions) {
* @throws {Error} if the tag creation failed. * @throws {Error} if the tag creation failed.
*/ */
export async function tag(tagName, ref, execaOptions) { export async function tag(tagName, ref, execaOptions) {
await execa('git', ['tag', tagName, ref], execaOptions); await execa("git", ["tag", tagName, ref], execaOptions);
} }
/** /**
@ -235,7 +235,7 @@ export async function tag(tagName, ref, execaOptions) {
* @throws {Error} if the push failed. * @throws {Error} if the push failed.
*/ */
export async function push(repositoryUrl, execaOptions) { export async function push(repositoryUrl, execaOptions) {
await execa('git', ['push', '--tags', repositoryUrl], execaOptions); await execa("git", ["push", "--tags", repositoryUrl], execaOptions);
} }
/** /**
@ -247,7 +247,7 @@ export async function push(repositoryUrl, execaOptions) {
* @throws {Error} if the push failed. * @throws {Error} if the push failed.
*/ */
export async function pushNotes(repositoryUrl, execaOptions) { export async function pushNotes(repositoryUrl, execaOptions) {
await execa('git', ['push', repositoryUrl, `refs/notes/${GIT_NOTE_REF}`], execaOptions); await execa("git", ["push", repositoryUrl, `refs/notes/${GIT_NOTE_REF}`], execaOptions);
} }
/** /**
@ -260,7 +260,7 @@ export async function pushNotes(repositoryUrl, execaOptions) {
*/ */
export async function verifyTagName(tagName, execaOptions) { export async function verifyTagName(tagName, execaOptions) {
try { try {
return (await execa('git', ['check-ref-format', `refs/tags/${tagName}`], execaOptions)).exitCode === 0; return (await execa("git", ["check-ref-format", `refs/tags/${tagName}`], execaOptions)).exitCode === 0;
} catch (error) { } catch (error) {
debug(error); debug(error);
} }
@ -276,7 +276,7 @@ export async function verifyTagName(tagName, execaOptions) {
*/ */
export async function verifyBranchName(branch, execaOptions) { export async function verifyBranchName(branch, execaOptions) {
try { try {
return (await execa('git', ['check-ref-format', `refs/heads/${branch}`], execaOptions)).exitCode === 0; return (await execa("git", ["check-ref-format", `refs/heads/${branch}`], execaOptions)).exitCode === 0;
} catch (error) { } catch (error) {
debug(error); debug(error);
} }
@ -294,7 +294,7 @@ export async function verifyBranchName(branch, execaOptions) {
export async function isBranchUpToDate(repositoryUrl, branch, execaOptions) { export async function isBranchUpToDate(repositoryUrl, branch, execaOptions) {
return ( return (
(await getGitHead(execaOptions)) === (await getGitHead(execaOptions)) ===
(await execa('git', ['ls-remote', '--heads', repositoryUrl, branch], execaOptions)).stdout.match(/^(?<ref>\w+)?/)[1] (await execa("git", ["ls-remote", "--heads", repositoryUrl, branch], execaOptions)).stdout.match(/^(?<ref>\w+)?/)[1]
); );
} }
@ -308,7 +308,7 @@ export async function isBranchUpToDate(repositoryUrl, branch, execaOptions) {
*/ */
export async function getNote(ref, execaOptions) { export async function getNote(ref, execaOptions) {
try { try {
return JSON.parse((await execa('git', ['notes', '--ref', GIT_NOTE_REF, 'show', ref], execaOptions)).stdout); return JSON.parse((await execa("git", ["notes", "--ref", GIT_NOTE_REF, "show", ref], execaOptions)).stdout);
} catch (error) { } catch (error) {
if (error.exitCode === 1) { if (error.exitCode === 1) {
return {}; return {};
@ -327,5 +327,5 @@ export async function getNote(ref, execaOptions) {
* @param {Object} [execaOpts] Options to pass to `execa`. * @param {Object} [execaOpts] Options to pass to `execa`.
*/ */
export async function addNote(note, ref, execaOptions) { export async function addNote(note, ref, execaOptions) {
await execa('git', ['notes', '--ref', GIT_NOTE_REF, 'add', '-f', '-m', JSON.stringify(note), ref], execaOptions); await execa("git", ["notes", "--ref", GIT_NOTE_REF, "add", "-f", "-m", JSON.stringify(note), ref], execaOptions);
} }
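A small usage sketch for two of the helpers above (not part of the diff); the branch name and the example output are illustrative:

```js
import { getTags, getTagHead } from "./lib/git.js";

const execaOptions = { cwd: process.cwd(), env: process.env };

const tags = await getTags("master", execaOptions); // e.g. ["v1.0.0", "v1.1.0"]
const head = await getTagHead(tags[0], execaOptions); // sha returned by `git rev-list -1 <tag>`
```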


@ -1,10 +1,10 @@
import {escapeRegExp, isString, size} from 'lodash-es'; import { escapeRegExp, isString, size } from "lodash-es";
import {SECRET_MIN_SIZE, SECRET_REPLACEMENT} from './definitions/constants.js'; import { SECRET_MIN_SIZE, SECRET_REPLACEMENT } from "./definitions/constants.js";
export default (env) => { export default (env) => {
const toReplace = Object.keys(env).filter((envVar) => { const toReplace = Object.keys(env).filter((envVar) => {
// https://github.com/semantic-release/semantic-release/issues/1558 // https://github.com/semantic-release/semantic-release/issues/1558
if (envVar === 'GOPRIVATE') { if (envVar === "GOPRIVATE") {
return false; return false;
} }
@ -12,9 +12,9 @@ export default (env) => {
}); });
const regexp = new RegExp( const regexp = new RegExp(
toReplace.map((envVar) => `${escapeRegExp(env[envVar])}|${escapeRegExp(encodeURI(env[envVar]))}`).join('|'), toReplace.map((envVar) => `${escapeRegExp(env[envVar])}|${escapeRegExp(encodeURI(env[envVar]))}`).join("|"),
'g' "g"
); );
return (output) => return (output) =>
output && isString(output) && toReplace.length > 0 ? output.toString().replace(regexp, SECRET_REPLACEMENT) : output; output && isString(output) && toReplace.length > 0 ? output.toString().replace(regexp, SECRET_REPLACEMENT) : output;
} };
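A sketch of the factory above in use (editorial, not in the diff); the values are made up, and the replacement text is SECRET_REPLACEMENT from definitions/constants.js, which is not shown in this commit:

```js
import hideSensitive from "./lib/hide-sensitive.js";

const mask = hideSensitive({ MY_TOKEN: "super-secret-value" });

mask("pushing to https://super-secret-value@github.com/owner/repo.git");
// -> the token (and its URI-encoded form) is replaced with SECRET_REPLACEMENT
```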


@ -1,6 +1,6 @@
import {isFunction, template, union} from 'lodash-es'; import { isFunction, template, union } from "lodash-es";
import semver from 'semver'; import semver from "semver";
import hideSensitive from './hide-sensitive.js'; import hideSensitive from "./hide-sensitive.js";
export function extractErrors(err) { export function extractErrors(err) {
return err && err.errors ? [...err.errors] : [err]; return err && err.errors ? [...err.errors] : [err];
@ -19,7 +19,7 @@ export function hideSensitiveValues(env, objs) {
} }
export function tagsToVersions(tags) { export function tagsToVersions(tags) {
return tags.map(({version}) => version); return tags.map(({ version }) => version);
} }
export function isMajorRange(range) { export function isMajorRange(range) {
@ -33,16 +33,16 @@ export function isMaintenanceRange(range) {
export function getUpperBound(range) { export function getUpperBound(range) {
const result = semver.valid(range) const result = semver.valid(range)
? range ? range
: ((semver.validRange(range) || '').match(/<(?<upperBound>\d+\.\d+\.\d+(-\d+)?)$/) || [])[1]; : ((semver.validRange(range) || "").match(/<(?<upperBound>\d+\.\d+\.\d+(-\d+)?)$/) || [])[1];
return result return result
? // https://github.com/npm/node-semver/issues/322 ? // https://github.com/npm/node-semver/issues/322
result.replace(/-\d+$/, '') result.replace(/-\d+$/, "")
: result; : result;
} }
export function getLowerBound(range) { export function getLowerBound(range) {
return ((semver.validRange(range) || '').match(/(?<lowerBound>\d+\.\d+\.\d+)/) || [])[1]; return ((semver.validRange(range) || "").match(/(?<lowerBound>\d+\.\d+\.\d+)/) || [])[1];
} }
export function highest(version1, version2) { export function highest(version1, version2) {
@ -53,16 +53,16 @@ export function lowest(version1, version2) {
return version1 && version2 ? (semver.lt(version1, version2) ? version1 : version2) : version1 || version2; return version1 && version2 ? (semver.lt(version1, version2) ? version1 : version2) : version1 || version2;
} }
export function getLatestVersion(versions, {withPrerelease} = {}) { export function getLatestVersion(versions, { withPrerelease } = {}) {
return versions.filter((version) => withPrerelease || !semver.prerelease(version)).sort(semver.rcompare)[0]; return versions.filter((version) => withPrerelease || !semver.prerelease(version)).sort(semver.rcompare)[0];
} }
export function getEarliestVersion(versions, {withPrerelease} = {}) { export function getEarliestVersion(versions, { withPrerelease } = {}) {
return versions.filter((version) => withPrerelease || !semver.prerelease(version)).sort(semver.compare)[0]; return versions.filter((version) => withPrerelease || !semver.prerelease(version)).sort(semver.compare)[0];
} }
export function getFirstVersion(versions, lowerBranches) { export function getFirstVersion(versions, lowerBranches) {
const lowerVersion = union(...lowerBranches.map(({tags}) => tagsToVersions(tags))).sort(semver.rcompare); const lowerVersion = union(...lowerBranches.map(({ tags }) => tagsToVersions(tags))).sort(semver.rcompare);
if (lowerVersion[0]) { if (lowerVersion[0]) {
return versions.sort(semver.compare).find((version) => semver.gt(version, lowerVersion[0])); return versions.sort(semver.compare).find((version) => semver.gt(version, lowerVersion[0]));
} }
@ -71,11 +71,11 @@ export function getFirstVersion(versions, lowerBranches) {
} }
export function getRange(min, max) { export function getRange(min, max) {
return `>=${min}${max ? ` <${max}` : ''}`; return `>=${min}${max ? ` <${max}` : ""}`;
} }
export function makeTag(tagFormat, version) { export function makeTag(tagFormat, version) {
return template(tagFormat)({version}); return template(tagFormat)({ version });
} }
export function isSameChannel(channel, otherChannel) { export function isSameChannel(channel, otherChannel) {


@ -1,43 +1,43 @@
import {isPlainObject, isString, template} from 'lodash-es'; import { isPlainObject, isString, template } from "lodash-es";
import AggregateError from 'aggregate-error'; import AggregateError from "aggregate-error";
import {isGitRepo, verifyTagName} from './git.js'; import { isGitRepo, verifyTagName } from "./git.js";
import getError from './get-error.js'; import getError from "./get-error.js";
export default async (context) => { export default async (context) => {
const { const {
cwd, cwd,
env, env,
options: {repositoryUrl, tagFormat, branches}, options: { repositoryUrl, tagFormat, branches },
} = context; } = context;
const errors = []; const errors = [];
if (!(await isGitRepo({cwd, env}))) { if (!(await isGitRepo({ cwd, env }))) {
errors.push(getError('ENOGITREPO', {cwd})); errors.push(getError("ENOGITREPO", { cwd }));
} else if (!repositoryUrl) { } else if (!repositoryUrl) {
errors.push(getError('ENOREPOURL')); errors.push(getError("ENOREPOURL"));
} }
// Verify that compiling the `tagFormat` produces a valid Git tag // Verify that compiling the `tagFormat` produces a valid Git tag
if (!(await verifyTagName(template(tagFormat)({version: '0.0.0'})))) { if (!(await verifyTagName(template(tagFormat)({ version: "0.0.0" })))) {
errors.push(getError('EINVALIDTAGFORMAT', context)); errors.push(getError("EINVALIDTAGFORMAT", context));
} }
// Verify the `tagFormat` contains the variable `version` by compiling the `tagFormat` template // Verify the `tagFormat` contains the variable `version` by compiling the `tagFormat` template
// with a space as the `version` value and verify the result contains the space. // with a space as the `version` value and verify the result contains the space.
// The space is used as it's an invalid tag character, so it's guaranteed to not be present in the `tagFormat`. // The space is used as it's an invalid tag character, so it's guaranteed to not be present in the `tagFormat`.
if ((template(tagFormat)({version: ' '}).match(/ /g) || []).length !== 1) { if ((template(tagFormat)({ version: " " }).match(/ /g) || []).length !== 1) {
errors.push(getError('ETAGNOVERSION', context)); errors.push(getError("ETAGNOVERSION", context));
} }
branches.forEach((branch) => { branches.forEach((branch) => {
if ( if (
!((isString(branch) && branch.trim()) || (isPlainObject(branch) && isString(branch.name) && branch.name.trim())) !((isString(branch) && branch.trim()) || (isPlainObject(branch) && isString(branch.name) && branch.name.trim()))
) { ) {
errors.push(getError('EINVALIDBRANCH', {branch})); errors.push(getError("EINVALIDBRANCH", { branch }));
} }
}); });
if (errors.length > 0) { if (errors.length > 0) {
throw new AggregateError(errors); throw new AggregateError(errors);
} }
} };

package-lock.json (generated, 12386 changes): file diff suppressed because it is too large.


@ -69,11 +69,11 @@
"mockserver-client": "5.14.0", "mockserver-client": "5.14.0",
"nock": "13.2.9", "nock": "13.2.9",
"p-retry": "^5.1.1", "p-retry": "^5.1.1",
"prettier": "^2.7.1",
"sinon": "14.0.0", "sinon": "14.0.0",
"stream-buffers": "3.0.2", "stream-buffers": "3.0.2",
"tempy": "^3.0.0", "tempy": "^3.0.0",
"testdouble": "3.16.6", "testdouble": "3.16.6"
"xo": "0.32.1"
}, },
"overrides": { "overrides": {
"semantic-release": "20.0.0-beta.1" "semantic-release": "20.0.0-beta.1"
@ -128,20 +128,13 @@
}, },
"scripts": { "scripts": {
"codecov": "codecov -f coverage/coverage-final.json", "codecov": "codecov -f coverage/coverage-final.json",
"lint": "xo", "lint": "prettier --check \"*.{js,json,md}\" \".github/**/*.{md,yml}\" \"docs/**/*.md\" \"{bin,lib,test}/*.js\"",
"lint:fix": "prettier --write \"*.{js,json,md}\" \".github/**/*.{md,yml}\" \"docs/**/*.md\" \"{bin,lib,test}/*.js\"",
"pretest": "npm run lint", "pretest": "npm run lint",
"semantic-release": "./bin/semantic-release.js", "semantic-release": "./bin/semantic-release.js",
"test": "c8 ava --verbose", "test": "c8 ava --verbose",
"test:ci": "c8 ava --verbose" "test:ci": "c8 ava --verbose"
}, },
"xo": {
"prettier": true,
"space": true,
"rules": {
"unicorn/no-reduce": "off",
"unicorn/string-content": "off"
}
},
"renovate": { "renovate": {
"extends": [ "extends": [
"github>semantic-release/.github" "github>semantic-release/.github"


@ -1,19 +1,19 @@
import test from 'ava'; import test from "ava";
import {escapeRegExp} from 'lodash-es'; import { escapeRegExp } from "lodash-es";
import * as td from 'testdouble'; import * as td from "testdouble";
import {stub} from 'sinon'; import { stub } from "sinon";
import {SECRET_REPLACEMENT} from '../lib/definitions/constants.js'; import { SECRET_REPLACEMENT } from "../lib/definitions/constants.js";
let previousArgv; let previousArgv;
let previousEnv; let previousEnv;
test.beforeEach((t) => { test.beforeEach((t) => {
t.context.logs = ''; t.context.logs = "";
t.context.errors = ''; t.context.errors = "";
t.context.stdout = stub(process.stdout, 'write').callsFake((value) => { t.context.stdout = stub(process.stdout, "write").callsFake((value) => {
t.context.logs += value.toString(); t.context.logs += value.toString();
}); });
t.context.stderr = stub(process.stderr, 'write').callsFake((value) => { t.context.stderr = stub(process.stderr, "write").callsFake((value) => {
t.context.errors += value.toString(); t.context.errors += value.toString();
}); });
@ -31,196 +31,202 @@ test.afterEach.always((t) => {
td.reset(); td.reset();
}); });
test.serial('Pass options to semantic-release API', async (t) => { test.serial("Pass options to semantic-release API", async (t) => {
const argv = [ const argv = [
'', "",
'', "",
'-b', "-b",
'master', "master",
'next', "next",
'-r', "-r",
'https://github/com/owner/repo.git', "https://github/com/owner/repo.git",
'-t', "-t",
`v\${version}`, `v\${version}`,
'-p', "-p",
'plugin1', "plugin1",
'plugin2', "plugin2",
'-e', "-e",
'config1', "config1",
'config2', "config2",
'--verify-conditions', "--verify-conditions",
'condition1', "condition1",
'condition2', "condition2",
'--analyze-commits', "--analyze-commits",
'analyze', "analyze",
'--verify-release', "--verify-release",
'verify1', "verify1",
'verify2', "verify2",
'--generate-notes', "--generate-notes",
'notes', "notes",
'--prepare', "--prepare",
'prepare1', "prepare1",
'prepare2', "prepare2",
'--publish', "--publish",
'publish1', "publish1",
'publish2', "publish2",
'--success', "--success",
'success1', "success1",
'success2', "success2",
'--fail', "--fail",
'fail1', "fail1",
'fail2', "fail2",
'--debug', "--debug",
'-d', "-d",
]; ];
const index = await td.replaceEsm('../index.js'); const index = await td.replaceEsm("../index.js");
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
td.verify(index.default({ td.verify(
branches: ['master', 'next'], index.default({
b: ['master', 'next'], branches: ["master", "next"],
'repository-url': 'https://github/com/owner/repo.git', b: ["master", "next"],
repositoryUrl: 'https://github/com/owner/repo.git', "repository-url": "https://github/com/owner/repo.git",
r: 'https://github/com/owner/repo.git', repositoryUrl: "https://github/com/owner/repo.git",
'tag-format': `v\${version}`, r: "https://github/com/owner/repo.git",
"tag-format": `v\${version}`,
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
t: `v\${version}`, t: `v\${version}`,
plugins: ['plugin1', 'plugin2'], plugins: ["plugin1", "plugin2"],
p: ['plugin1', 'plugin2'], p: ["plugin1", "plugin2"],
extends: ['config1', 'config2'], extends: ["config1", "config2"],
e: ['config1', 'config2'], e: ["config1", "config2"],
'dry-run': true, "dry-run": true,
dryRun: true, dryRun: true,
d: true, d: true,
verifyConditions: ['condition1', 'condition2'], verifyConditions: ["condition1", "condition2"],
'verify-conditions': ['condition1', 'condition2'], "verify-conditions": ["condition1", "condition2"],
analyzeCommits: 'analyze', analyzeCommits: "analyze",
'analyze-commits': 'analyze', "analyze-commits": "analyze",
verifyRelease: ['verify1', 'verify2'], verifyRelease: ["verify1", "verify2"],
'verify-release': ['verify1', 'verify2'], "verify-release": ["verify1", "verify2"],
generateNotes: ['notes'], generateNotes: ["notes"],
'generate-notes': ['notes'], "generate-notes": ["notes"],
prepare: ['prepare1', 'prepare2'], prepare: ["prepare1", "prepare2"],
publish: ['publish1', 'publish2'], publish: ["publish1", "publish2"],
success: ['success1', 'success2'], success: ["success1", "success2"],
fail: ['fail1', 'fail2'], fail: ["fail1", "fail2"],
debug: true, debug: true,
_: [], _: [],
'$0': '' $0: "",
})); })
);
t.is(exitCode, 0); t.is(exitCode, 0);
}); });
test.serial('Pass options to semantic-release API with alias arguments', async (t) => { test.serial("Pass options to semantic-release API with alias arguments", async (t) => {
const argv = [ const argv = [
'', "",
'', "",
'--branches', "--branches",
'master', "master",
'--repository-url', "--repository-url",
'https://github/com/owner/repo.git', "https://github/com/owner/repo.git",
'--tag-format', "--tag-format",
`v\${version}`, `v\${version}`,
'--plugins', "--plugins",
'plugin1', "plugin1",
'plugin2', "plugin2",
'--extends', "--extends",
'config1', "config1",
'config2', "config2",
'--dry-run', "--dry-run",
]; ];
const index = await td.replaceEsm('../index.js'); const index = await td.replaceEsm("../index.js");
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
td.verify(index.default({ td.verify(
branches: ['master'], index.default({
b: ['master'], branches: ["master"],
'repository-url': 'https://github/com/owner/repo.git', b: ["master"],
repositoryUrl: 'https://github/com/owner/repo.git', "repository-url": "https://github/com/owner/repo.git",
r: 'https://github/com/owner/repo.git', repositoryUrl: "https://github/com/owner/repo.git",
'tag-format': `v\${version}`, r: "https://github/com/owner/repo.git",
"tag-format": `v\${version}`,
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
t: `v\${version}`, t: `v\${version}`,
plugins: ['plugin1', 'plugin2'], plugins: ["plugin1", "plugin2"],
p: ['plugin1', 'plugin2'], p: ["plugin1", "plugin2"],
extends: ['config1', 'config2'], extends: ["config1", "config2"],
e: ['config1', 'config2'], e: ["config1", "config2"],
'dry-run': true, "dry-run": true,
dryRun: true, dryRun: true,
d: true, d: true,
_: [], _: [],
'$0': '' $0: "",
})); })
);
t.is(exitCode, 0); t.is(exitCode, 0);
}); });
test.serial('Pass unknown options to semantic-release API', async (t) => { test.serial("Pass unknown options to semantic-release API", async (t) => {
const argv = ['', '', '--bool', '--first-option', 'value1', '--second-option', 'value2', '--second-option', 'value3']; const argv = ["", "", "--bool", "--first-option", "value1", "--second-option", "value2", "--second-option", "value3"];
const index = await td.replaceEsm('../index.js'); const index = await td.replaceEsm("../index.js");
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
td.verify(index.default({ td.verify(
index.default({
bool: true, bool: true,
firstOption: 'value1', firstOption: "value1",
'first-option': 'value1', "first-option": "value1",
secondOption: ['value2', 'value3'], secondOption: ["value2", "value3"],
'second-option': ['value2', 'value3'], "second-option": ["value2", "value3"],
_: [], _: [],
'$0': '' $0: "",
})); })
);
t.is(exitCode, 0); t.is(exitCode, 0);
}); });
test.serial('Pass empty Array to semantic-release API for list option set to "false"', async (t) => { test.serial('Pass empty Array to semantic-release API for list option set to "false"', async (t) => {
const argv = ['', '', '--publish', 'false']; const argv = ["", "", "--publish", "false"];
const index = await td.replaceEsm('../index.js'); const index = await td.replaceEsm("../index.js");
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
td.verify(index.default({publish: [], _: [], '$0': ''})); td.verify(index.default({ publish: [], _: [], $0: "" }));
t.is(exitCode, 0); t.is(exitCode, 0);
}); });
test.serial('Do not set properties in option for which arg is not in command line', async (t) => { test.serial("Do not set properties in option for which arg is not in command line", async (t) => {
const run = stub().resolves(true); const run = stub().resolves(true);
const argv = ['', '', '-b', 'master']; const argv = ["", "", "-b", "master"];
await td.replaceEsm('../index.js', null, run); await td.replaceEsm("../index.js", null, run);
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
await cli(); await cli();
t.false('ci' in run.args[0][0]); t.false("ci" in run.args[0][0]);
t.false('d' in run.args[0][0]); t.false("d" in run.args[0][0]);
t.false('dry-run' in run.args[0][0]); t.false("dry-run" in run.args[0][0]);
t.false('debug' in run.args[0][0]); t.false("debug" in run.args[0][0]);
t.false('r' in run.args[0][0]); t.false("r" in run.args[0][0]);
t.false('t' in run.args[0][0]); t.false("t" in run.args[0][0]);
t.false('p' in run.args[0][0]); t.false("p" in run.args[0][0]);
t.false('e' in run.args[0][0]); t.false("e" in run.args[0][0]);
}); });
test.serial('Display help', async (t) => { test.serial("Display help", async (t) => {
const run = stub().resolves(true); const run = stub().resolves(true);
const argv = ['', '', '--help']; const argv = ["", "", "--help"];
await td.replaceEsm('../index.js', null, run); await td.replaceEsm("../index.js", null, run);
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
@ -228,12 +234,12 @@ test.serial('Display help', async (t) => {
t.is(exitCode, 0); t.is(exitCode, 0);
}); });
test.serial('Return error exitCode and prints help if called with a command', async (t) => { test.serial("Return error exitCode and prints help if called with a command", async (t) => {
const run = stub().resolves(true); const run = stub().resolves(true);
const argv = ['', '', 'pre']; const argv = ["", "", "pre"];
await td.replaceEsm('../index.js', null, run); await td.replaceEsm("../index.js", null, run);
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
@@ -242,12 +248,12 @@ test.serial('Return error exitCode and prints help if called with a command', as
t.is(exitCode, 1); t.is(exitCode, 1);
}); });
test.serial('Return error exitCode if multiple plugins are set for a single plugin', async (t) => { test.serial("Return error exitCode if multiple plugins are set for a single plugin", async (t) => {
const run = stub().resolves(true); const run = stub().resolves(true);
const argv = ['', '', '--analyze-commits', 'analyze1', 'analyze2']; const argv = ["", "", "--analyze-commits", "analyze1", "analyze2"];
await td.replaceEsm('../index.js', null, run); await td.replaceEsm("../index.js", null, run);
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
@@ -256,12 +262,12 @@ test.serial('Return error exitCode if multiple plugins are set for a single plugin'
t.is(exitCode, 1); t.is(exitCode, 1);
}); });
test.serial('Return error exitCode if semantic-release throws an error', async (t) => { test.serial("Return error exitCode if semantic-release throws an error", async (t) => {
const argv = ['', '']; const argv = ["", ""];
const index = await td.replaceEsm('../index.js'); const index = await td.replaceEsm("../index.js");
td.when(index.default({_: [], '$0': ''})).thenReject(new Error('semantic-release error')); td.when(index.default({ _: [], $0: "" })).thenReject(new Error("semantic-release error"));
process.argv = argv; process.argv = argv;
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();
@@ -269,14 +275,14 @@ test.serial('Return error exitCode if semantic-release throws an error', async (t) =
t.is(exitCode, 1); t.is(exitCode, 1);
}); });
test.serial('Hide sensitive environment variable values from the logs', async (t) => { test.serial("Hide sensitive environment variable values from the logs", async (t) => {
const env = {MY_TOKEN: 'secret token'}; const env = { MY_TOKEN: "secret token" };
const argv = ['', '']; const argv = ["", ""];
const index = await td.replaceEsm('../index.js'); const index = await td.replaceEsm("../index.js");
td.when(index.default({_: [], '$0': ''})).thenReject(new Error(`Throw error: Exposing token ${env.MY_TOKEN}`)); td.when(index.default({ _: [], $0: "" })).thenReject(new Error(`Throw error: Exposing token ${env.MY_TOKEN}`));
process.argv = argv; process.argv = argv;
process.env = {...process.env, ...env}; process.env = { ...process.env, ...env };
const cli = (await import('../cli.js')).default; const cli = (await import("../cli.js")).default;
const exitCode = await cli(); const exitCode = await cli();


@@ -1,39 +1,39 @@
import test from 'ava'; import test from "ava";
import {stub} from 'sinon'; import { stub } from "sinon";
import getCommits from '../lib/get-commits.js'; import getCommits from "../lib/get-commits.js";
import {gitCommits, gitDetachedHead, gitRepo} from './helpers/git-utils.js'; import { gitCommits, gitDetachedHead, gitRepo } from "./helpers/git-utils.js";
test.beforeEach((t) => { test.beforeEach((t) => {
// Stub the logger functions // Stub the logger functions
t.context.log = stub(); t.context.log = stub();
t.context.error = stub(); t.context.error = stub();
t.context.logger = {log: t.context.log, error: t.context.error}; t.context.logger = { log: t.context.log, error: t.context.error };
}); });
test('Get all commits when there is no last release', async (t) => { test("Get all commits when there is no last release", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First', 'Second'], {cwd}); const commits = await gitCommits(["First", "Second"], { cwd });
// Retrieve the commits with the commits module // Retrieve the commits with the commits module
const result = await getCommits({cwd, lastRelease: {}, logger: t.context.logger}); const result = await getCommits({ cwd, lastRelease: {}, logger: t.context.logger });
// Verify the commits created and retrieved by the module are identical // Verify the commits created and retrieved by the module are identical
t.is(result.length, 2); t.is(result.length, 2);
t.deepEqual(result, commits); t.deepEqual(result, commits);
}); });
test('Get all commits since gitHead (from lastRelease)', async (t) => { test("Get all commits since gitHead (from lastRelease)", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First', 'Second', 'Third'], {cwd}); const commits = await gitCommits(["First", "Second", "Third"], { cwd });
// Retrieve the commits with the commits module, since commit 'First' // Retrieve the commits with the commits module, since commit 'First'
const result = await getCommits({ const result = await getCommits({
cwd, cwd,
lastRelease: {gitHead: commits[commits.length - 1].hash}, lastRelease: { gitHead: commits[commits.length - 1].hash },
logger: t.context.logger, logger: t.context.logger,
}); });
@@ -42,18 +42,18 @@ test('Get all commits since gitHead (from lastRelease)', async (t) => {
t.deepEqual(result, commits.slice(0, 2)); t.deepEqual(result, commits.slice(0, 2));
}); });
test('Get all commits since gitHead (from lastRelease) on a detached head repo', async (t) => { test("Get all commits since gitHead (from lastRelease) on a detached head repo", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First', 'Second', 'Third'], {cwd}); const commits = await gitCommits(["First", "Second", "Third"], { cwd });
// Create a detached head repo at commit 'Second' // Create a detached head repo at commit 'Second'
cwd = await gitDetachedHead(repositoryUrl, commits[1].hash); cwd = await gitDetachedHead(repositoryUrl, commits[1].hash);
// Retrieve the commits with the commits module, since commit 'First' // Retrieve the commits with the commits module, since commit 'First'
const result = await getCommits({ const result = await getCommits({
cwd, cwd,
lastRelease: {gitHead: commits[commits.length - 1].hash}, lastRelease: { gitHead: commits[commits.length - 1].hash },
logger: t.context.logger, logger: t.context.logger,
}); });
@@ -66,17 +66,17 @@ test('Get all commits since gitHead (from lastRelease) on a detached head repo',
t.truthy(result[0].committer.name); t.truthy(result[0].committer.name);
}); });
test('Get all commits between lastRelease.gitHead and a sha', async (t) => { test("Get all commits between lastRelease.gitHead and a sha", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First', 'Second', 'Third'], {cwd}); const commits = await gitCommits(["First", "Second", "Third"], { cwd });
// Retrieve the commits with the commits module, between commit 'First' and 'Third' // Retrieve the commits with the commits module, between commit 'First' and 'Third'
const result = await getCommits({ const result = await getCommits({
cwd, cwd,
lastRelease: {gitHead: commits[commits.length - 1].hash}, lastRelease: { gitHead: commits[commits.length - 1].hash },
nextRelease: {gitHead: commits[1].hash}, nextRelease: { gitHead: commits[1].hash },
logger: t.context.logger, logger: t.context.logger,
}); });
@@ -85,16 +85,16 @@ test('Get all commits between lastRelease.gitHead and a sha', async (t) => {
t.deepEqual(result, commits.slice(1, -1)); t.deepEqual(result, commits.slice(1, -1));
}); });
test('Return empty array if lastRelease.gitHead is the last commit', async (t) => { test("Return empty array if lastRelease.gitHead is the last commit", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First', 'Second'], {cwd}); const commits = await gitCommits(["First", "Second"], { cwd });
// Retrieve the commits with the commits module, since commit 'Second' (therefore none) // Retrieve the commits with the commits module, since commit 'Second' (therefore none)
const result = await getCommits({ const result = await getCommits({
cwd, cwd,
lastRelease: {gitHead: commits[0].hash}, lastRelease: { gitHead: commits[0].hash },
logger: t.context.logger, logger: t.context.logger,
}); });
@@ -102,12 +102,12 @@ test('Return empty array if lastRelease.gitHead is the last commit', async (t) =
t.deepEqual(result, []); t.deepEqual(result, []);
}); });
test('Return empty array if there are no commits', async (t) => { test("Return empty array if there are no commits", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Retrieve the commits with the commits module // Retrieve the commits with the commits module
const result = await getCommits({cwd, lastRelease: {}, logger: t.context.logger}); const result = await getCommits({ cwd, lastRelease: {}, logger: t.context.logger });
// Verify no commit is retrieved // Verify no commit is retrieved
t.deepEqual(result, []); t.deepEqual(result, []);


@@ -1,589 +1,597 @@
import path from 'node:path'; import path from "node:path";
import {format} from 'node:util'; import { format } from "node:util";
import test from 'ava'; import test from "ava";
import fsExtra from 'fs-extra'; import fsExtra from "fs-extra";
import {omit} from 'lodash-es'; import { omit } from "lodash-es";
import * as td from 'testdouble'; import * as td from "testdouble";
import yaml from 'js-yaml'; import yaml from "js-yaml";
import {gitAddConfig, gitCommits, gitRepo, gitShallowClone, gitTagVersion} from './helpers/git-utils.js'; import { gitAddConfig, gitCommits, gitRepo, gitShallowClone, gitTagVersion } from "./helpers/git-utils.js";
const {outputJson, writeFile} = fsExtra; const { outputJson, writeFile } = fsExtra;
const pluginsConfig = {foo: 'bar', baz: 'qux'}; const pluginsConfig = { foo: "bar", baz: "qux" };
let plugins; let plugins;
const DEFAULT_PLUGINS = [ const DEFAULT_PLUGINS = [
'@semantic-release/commit-analyzer', "@semantic-release/commit-analyzer",
'@semantic-release/release-notes-generator', "@semantic-release/release-notes-generator",
'@semantic-release/npm', "@semantic-release/npm",
'@semantic-release/github', "@semantic-release/github",
]; ];
test.beforeEach(async (t) => { test.beforeEach(async (t) => {
plugins = (await td.replaceEsm('../lib/plugins/index.js')).default; plugins = (await td.replaceEsm("../lib/plugins/index.js")).default;
t.context.getConfig = (await import('../lib/get-config.js')).default; t.context.getConfig = (await import("../lib/get-config.js")).default;
}); });
test.afterEach.always((t) => { test.afterEach.always((t) => {
td.reset(); td.reset();
}); });
test('Default values, reading repositoryUrl from package.json', async (t) => { test("Default values, reading repositoryUrl from package.json", async (t) => {
const pkg = {repository: 'https://host.null/owner/package.git'}; const pkg = { repository: "https://host.null/owner/package.git" };
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(true); const { cwd } = await gitRepo(true);
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitTagVersion('v1.0.0', undefined, {cwd}); await gitTagVersion("v1.0.0", undefined, { cwd });
await gitTagVersion('v1.1.0', undefined, {cwd}); await gitTagVersion("v1.1.0", undefined, { cwd });
// Add remote.origin.url config // Add remote.origin.url config
await gitAddConfig('remote.origin.url', 'git@host.null:owner/repo.git', {cwd}); await gitAddConfig("remote.origin.url", "git@host.null:owner/repo.git", { cwd });
// Create package.json in repository root // Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg); await outputJson(path.resolve(cwd, "package.json"), pkg);
const {options: result} = await t.context.getConfig({cwd}); const { options: result } = await t.context.getConfig({ cwd });
// Verify the default options are set // Verify the default options are set
t.deepEqual(result.branches, [ t.deepEqual(result.branches, [
'+([0-9])?(.{+([0-9]),x}).x', "+([0-9])?(.{+([0-9]),x}).x",
'master', "master",
'next', "next",
'next-major', "next-major",
{name: 'beta', prerelease: true}, { name: "beta", prerelease: true },
{name: 'alpha', prerelease: true}, { name: "alpha", prerelease: true },
]); ]);
t.is(result.repositoryUrl, 'https://host.null/owner/package.git'); t.is(result.repositoryUrl, "https://host.null/owner/package.git");
t.is(result.tagFormat, `v\${version}`); t.is(result.tagFormat, `v\${version}`);
}); });
test('Default values, reading repositoryUrl from repo if not set in package.json', async (t) => { test("Default values, reading repositoryUrl from repo if not set in package.json", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(true); const { cwd } = await gitRepo(true);
// Add remote.origin.url config // Add remote.origin.url config
await gitAddConfig('remote.origin.url', 'https://host.null/owner/module.git', {cwd}); await gitAddConfig("remote.origin.url", "https://host.null/owner/module.git", { cwd });
const {options: result} = await t.context.getConfig({cwd}); const { options: result } = await t.context.getConfig({ cwd });
// Verify the default options are set // Verify the default options are set
t.deepEqual(result.branches, [ t.deepEqual(result.branches, [
'+([0-9])?(.{+([0-9]),x}).x', "+([0-9])?(.{+([0-9]),x}).x",
'master', "master",
'next', "next",
'next-major', "next-major",
{name: 'beta', prerelease: true}, { name: "beta", prerelease: true },
{name: 'alpha', prerelease: true}, { name: "alpha", prerelease: true },
]); ]);
t.is(result.repositoryUrl, 'https://host.null/owner/module.git'); t.is(result.repositoryUrl, "https://host.null/owner/module.git");
t.is(result.tagFormat, `v\${version}`); t.is(result.tagFormat, `v\${version}`);
}); });
test('Default values, reading repositoryUrl (http url) from package.json if not set in repo', async (t) => { test("Default values, reading repositoryUrl (http url) from package.json if not set in repo", async (t) => {
const pkg = {repository: 'https://host.null/owner/module.git'}; const pkg = { repository: "https://host.null/owner/module.git" };
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Create package.json in repository root // Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg); await outputJson(path.resolve(cwd, "package.json"), pkg);
const {options: result} = await t.context.getConfig({cwd}); const { options: result } = await t.context.getConfig({ cwd });
// Verify the default options are set // Verify the default options are set
t.deepEqual(result.branches, [ t.deepEqual(result.branches, [
'+([0-9])?(.{+([0-9]),x}).x', "+([0-9])?(.{+([0-9]),x}).x",
'master', "master",
'next', "next",
'next-major', "next-major",
{name: 'beta', prerelease: true}, { name: "beta", prerelease: true },
{name: 'alpha', prerelease: true}, { name: "alpha", prerelease: true },
]); ]);
t.is(result.repositoryUrl, 'https://host.null/owner/module.git'); t.is(result.repositoryUrl, "https://host.null/owner/module.git");
t.is(result.tagFormat, `v\${version}`); t.is(result.tagFormat, `v\${version}`);
}); });
test('Convert "ci" option to "noCi"', async (t) => { test('Convert "ci" option to "noCi"', async (t) => {
const pkg = {repository: 'https://host.null/owner/module.git', release: {ci: false}}; const pkg = { repository: "https://host.null/owner/module.git", release: { ci: false } };
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Create package.json in repository root // Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg); await outputJson(path.resolve(cwd, "package.json"), pkg);
const {options: result} = await t.context.getConfig({cwd}); const { options: result } = await t.context.getConfig({ cwd });
t.is(result.noCi, true); t.is(result.noCi, true);
}); });
test.serial('Read options from package.json', async (t) => { test.serial("Read options from package.json", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
generateNotes: 'generateNotes', generateNotes: "generateNotes",
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Verify the plugins module is called with the plugin options from package.json // Verify the plugins module is called with the plugin options from package.json
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
// Create package.json in repository root // Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: options}); await outputJson(path.resolve(cwd, "package.json"), { release: options });
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json // Verify the options contains the plugin config from package.json
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read options from .releaserc.yml', async (t) => { test.serial("Read options from .releaserc.yml", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create .releaserc.yml in repository root // Create .releaserc.yml in repository root
await writeFile(path.resolve(cwd, '.releaserc.yml'), yaml.dump(options)); await writeFile(path.resolve(cwd, ".releaserc.yml"), yaml.dump(options));
// Verify the plugins module is called with the plugin options from .releaserc.yml // Verify the plugins module is called with the plugin options from .releaserc.yml
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from .releaserc.yml // Verify the options contains the plugin config from .releaserc.yml
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read options from .releaserc.json', async (t) => { test.serial("Read options from .releaserc.json", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create .releaserc.json in repository root // Create .releaserc.json in repository root
await outputJson(path.resolve(cwd, '.releaserc.json'), options); await outputJson(path.resolve(cwd, ".releaserc.json"), options);
// Verify the plugins module is called with the plugin options from .releaserc.json // Verify the plugins module is called with the plugin options from .releaserc.json
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from .releaserc.json // Verify the options contains the plugin config from .releaserc.json
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read options from .releaserc.js', async (t) => { test.serial("Read options from .releaserc.js", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create .releaserc.js in repository root // Create .releaserc.js in repository root
await writeFile(path.resolve(cwd, '.releaserc.js'), `module.exports = ${JSON.stringify(options)}`); await writeFile(path.resolve(cwd, ".releaserc.js"), `module.exports = ${JSON.stringify(options)}`);
// Verify the plugins module is called with the plugin options from .releaserc.js // Verify the plugins module is called with the plugin options from .releaserc.js
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from .releaserc.js // Verify the options contains the plugin config from .releaserc.js
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read options from .releaserc.cjs', async (t) => { test.serial("Read options from .releaserc.cjs", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create .releaserc.cjs in repository root // Create .releaserc.cjs in repository root
await writeFile(path.resolve(cwd, '.releaserc.cjs'), `module.exports = ${JSON.stringify(options)}`); await writeFile(path.resolve(cwd, ".releaserc.cjs"), `module.exports = ${JSON.stringify(options)}`);
// Verify the plugins module is called with the plugin options from .releaserc.cjs // Verify the plugins module is called with the plugin options from .releaserc.cjs
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from .releaserc.cjs // Verify the options contains the plugin config from .releaserc.cjs
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read options from release.config.js', async (t) => { test.serial("Read options from release.config.js", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create release.config.js in repository root // Create release.config.js in repository root
await writeFile(path.resolve(cwd, 'release.config.js'), `module.exports = ${JSON.stringify(options)}`); await writeFile(path.resolve(cwd, "release.config.js"), `module.exports = ${JSON.stringify(options)}`);
// Verify the plugins module is called with the plugin options from release.config.js // Verify the plugins module is called with the plugin options from release.config.js
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from release.config.js // Verify the options contains the plugin config from release.config.js
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read options from release.config.cjs', async (t) => { test.serial("Read options from release.config.cjs", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Verify the plugins module is called with the plugin options from release.config.cjs // Verify the plugins module is called with the plugin options from release.config.cjs
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
// Create release.config.cjs in repository root // Create release.config.cjs in repository root
await writeFile(path.resolve(cwd, 'release.config.cjs'), `module.exports = ${JSON.stringify(options)}`); await writeFile(path.resolve(cwd, "release.config.cjs"), `module.exports = ${JSON.stringify(options)}`);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from release.config.cjs // Verify the options contains the plugin config from release.config.cjs
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Prioritise CLI/API parameters over file configuration and git repo', async (t) => { test.serial("Prioritise CLI/API parameters over file configuration and git repo", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
// Create a clone // Create a clone
cwd = await gitShallowClone(repositoryUrl); cwd = await gitShallowClone(repositoryUrl);
const pkgOptions = { const pkgOptions = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_pkg'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_pkg" },
branches: ['branch_pkg'], branches: ["branch_pkg"],
}; };
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_cli'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_cli" },
branches: ['branch_cli'], branches: ["branch_cli"],
repositoryUrl: 'http://cli-url.com/owner/package', repositoryUrl: "http://cli-url.com/owner/package",
tagFormat: `cli\${version}`, tagFormat: `cli\${version}`,
plugins: false, plugins: false,
}; };
// Verify the plugins module is called with the plugin options from CLI/API // Verify the plugins module is called with the plugin options from CLI/API
td.when(plugins({cwd, options}, {})).thenResolve(pluginsConfig); td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const pkg = {release: pkgOptions, repository: 'git@host.null:owner/module.git'}; const pkg = { release: pkgOptions, repository: "git@host.null:owner/module.git" };
// Create package.json in repository root // Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg); await outputJson(path.resolve(cwd, "package.json"), pkg);
const result = await t.context.getConfig({cwd}, options); const result = await t.context.getConfig({ cwd }, options);
// Verify the options contains the plugin config from CLI/API // Verify the options contains the plugin config from CLI/API
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read configuration from file path in "extends"', async (t) => { test.serial('Read configuration from file path in "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = {extends: './shareable.json'}; const pkgOptions = { extends: "./shareable.json" };
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
generateNotes: 'generateNotes', generateNotes: "generateNotes",
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: ['plugin-1', ['plugin-2', {plugin2Opt: 'value'}]], plugins: ["plugin-1", ["plugin-2", { plugin2Opt: "value" }]],
}; };
// Create package.json and shareable.json in repository root // Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'shareable.json'), options); await outputJson(path.resolve(cwd, "shareable.json"), options);
// Verify the plugins module is called with the plugin options from shareable.json // Verify the plugins module is called with the plugin options from shareable.json
td.when(plugins( td.when(
{cwd, options}, plugins(
{ cwd, options },
{ {
analyzeCommits: './shareable.json', analyzeCommits: "./shareable.json",
generateNotes: './shareable.json', generateNotes: "./shareable.json",
'plugin-1': './shareable.json', "plugin-1": "./shareable.json",
'plugin-2': './shareable.json', "plugin-2": "./shareable.json",
} }
)).thenResolve(pluginsConfig); )
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from shareable.json // Verify the options contains the plugin config from shareable.json
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read configuration from module path in "extends"', async (t) => { test.serial('Read configuration from module path in "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = {extends: 'shareable'}; const pkgOptions = { extends: "shareable" };
const options = { const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
generateNotes: 'generateNotes', generateNotes: "generateNotes",
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create package.json and node_modules/shareable/index.json in repository root // Create package.json and node_modules/shareable/index.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'node_modules/shareable/index.json'), options); await outputJson(path.resolve(cwd, "node_modules/shareable/index.json"), options);
// Verify the plugins module is called with the plugin options from shareable.json // Verify the plugins module is called with the plugin options from shareable.json
td.when(plugins( td.when(plugins({ cwd, options }, { analyzeCommits: "shareable", generateNotes: "shareable" })).thenResolve(
{cwd, options}, pluginsConfig
{analyzeCommits: 'shareable', generateNotes: 'shareable'} );
)).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from shareable.json // Verify the options contains the plugin config from shareable.json
t.deepEqual(result, {options, plugins: pluginsConfig}); t.deepEqual(result, { options, plugins: pluginsConfig });
}); });
test.serial('Read configuration from an array of paths in "extends"', async (t) => { test.serial('Read configuration from an array of paths in "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = {extends: ['./shareable1.json', './shareable2.json']}; const pkgOptions = { extends: ["./shareable1.json", "./shareable2.json"] };
const options1 = { const options1 = {
verifyRelease: 'verifyRelease1', verifyRelease: "verifyRelease1",
analyzeCommits: {path: 'analyzeCommits1', param: 'analyzeCommits_param1'}, analyzeCommits: { path: "analyzeCommits1", param: "analyzeCommits_param1" },
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
}; };
const options2 = { const options2 = {
verifyRelease: 'verifyRelease2', verifyRelease: "verifyRelease2",
generateNotes: 'generateNotes2', generateNotes: "generateNotes2",
analyzeCommits: {path: 'analyzeCommits2', param: 'analyzeCommits_param2'}, analyzeCommits: { path: "analyzeCommits2", param: "analyzeCommits_param2" },
branches: ['test_branch'], branches: ["test_branch"],
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create package.json, shareable1.json and shareable2.json in repository root // Create package.json, shareable1.json and shareable2.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'shareable1.json'), options1); await outputJson(path.resolve(cwd, "shareable1.json"), options1);
await outputJson(path.resolve(cwd, 'shareable2.json'), options2); await outputJson(path.resolve(cwd, "shareable2.json"), options2);
const expectedOptions = {...options1, ...options2, branches: ['test_branch']}; const expectedOptions = { ...options1, ...options2, branches: ["test_branch"] };
// Verify the plugins module is called with the plugin options from shareable1.json and shareable2.json // Verify the plugins module is called with the plugin options from shareable1.json and shareable2.json
td.when(plugins( td.when(
{options: expectedOptions, cwd}, plugins(
{ options: expectedOptions, cwd },
{ {
verifyRelease1: './shareable1.json', verifyRelease1: "./shareable1.json",
verifyRelease2: './shareable2.json', verifyRelease2: "./shareable2.json",
generateNotes2: './shareable2.json', generateNotes2: "./shareable2.json",
analyzeCommits1: './shareable1.json', analyzeCommits1: "./shareable1.json",
analyzeCommits2: './shareable2.json', analyzeCommits2: "./shareable2.json",
} }
)).thenResolve(pluginsConfig); )
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from shareable1.json and shareable2.json // Verify the options contains the plugin config from shareable1.json and shareable2.json
t.deepEqual(result, {options: expectedOptions, plugins: pluginsConfig}); t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
}); });
test.serial('Prioritize configuration from config file over "extends"', async (t) => { test.serial('Prioritize configuration from config file over "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = { const pkgOptions = {
extends: './shareable.json', extends: "./shareable.json",
branches: ['test_pkg'], branches: ["test_pkg"],
generateNotes: 'generateNotes', generateNotes: "generateNotes",
publish: [{path: 'publishPkg', param: 'publishPkg_param'}], publish: [{ path: "publishPkg", param: "publishPkg_param" }],
}; };
const options1 = { const options1 = {
analyzeCommits: 'analyzeCommits', analyzeCommits: "analyzeCommits",
generateNotes: 'generateNotesShareable', generateNotes: "generateNotesShareable",
publish: [{path: 'publishShareable', param: 'publishShareable_param'}], publish: [{ path: "publishShareable", param: "publishShareable_param" }],
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create package.json and shareable.json in repository root // Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'shareable.json'), options1); await outputJson(path.resolve(cwd, "shareable.json"), options1);
const expectedOptions = omit({...options1, ...pkgOptions, branches: ['test_pkg']}, 'extends'); const expectedOptions = omit({ ...options1, ...pkgOptions, branches: ["test_pkg"] }, "extends");
// Verify the plugins module is called with the plugin options from package.json and shareable.json // Verify the plugins module is called with the plugin options from package.json and shareable.json
td.when(plugins( td.when(
{cwd, options: expectedOptions}, plugins(
{ cwd, options: expectedOptions },
{ {
analyzeCommits: './shareable.json', analyzeCommits: "./shareable.json",
generateNotesShareable: './shareable.json', generateNotesShareable: "./shareable.json",
publishShareable: './shareable.json', publishShareable: "./shareable.json",
} }
)).thenResolve(pluginsConfig); )
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json and shareable.json // Verify the options contains the plugin config from package.json and shareable.json
t.deepEqual(result, {options: expectedOptions, plugins: pluginsConfig}); t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
}); });
test.serial('Prioritize configuration from cli/API options over "extends"', async (t) => { test.serial('Prioritize configuration from cli/API options over "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const cliOptions = { const cliOptions = {
extends: './shareable2.json', extends: "./shareable2.json",
branches: ['branch_opts'], branches: ["branch_opts"],
publish: [{path: 'publishOpts', param: 'publishOpts_param'}], publish: [{ path: "publishOpts", param: "publishOpts_param" }],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
}; };
const pkgOptions = { const pkgOptions = {
extends: './shareable1.json', extends: "./shareable1.json",
branches: ['branch_pkg'], branches: ["branch_pkg"],
generateNotes: 'generateNotes', generateNotes: "generateNotes",
publish: [{path: 'publishPkg', param: 'publishPkg_param'}], publish: [{ path: "publishPkg", param: "publishPkg_param" }],
}; };
const options1 = { const options1 = {
analyzeCommits: 'analyzeCommits1', analyzeCommits: "analyzeCommits1",
generateNotes: 'generateNotesShareable1', generateNotes: "generateNotesShareable1",
publish: [{path: 'publishShareable', param: 'publishShareable_param1'}], publish: [{ path: "publishShareable", param: "publishShareable_param1" }],
branches: ['test_branch1'], branches: ["test_branch1"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
}; };
const options2 = { const options2 = {
analyzeCommits: 'analyzeCommits2', analyzeCommits: "analyzeCommits2",
publish: [{path: 'publishShareable', param: 'publishShareable_param2'}], publish: [{ path: "publishShareable", param: "publishShareable_param2" }],
branches: ['test_branch2'], branches: ["test_branch2"],
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create package.json, shareable1.json and shareable2.json in repository root // Create package.json, shareable1.json and shareable2.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'shareable1.json'), options1); await outputJson(path.resolve(cwd, "shareable1.json"), options1);
await outputJson(path.resolve(cwd, 'shareable2.json'), options2); await outputJson(path.resolve(cwd, "shareable2.json"), options2);
const expectedOptions = omit({...options2, ...pkgOptions, ...cliOptions, branches: ['branch_opts']}, 'extends'); const expectedOptions = omit({ ...options2, ...pkgOptions, ...cliOptions, branches: ["branch_opts"] }, "extends");
// Verify the plugins module is called with the plugin options from package.json and shareable2.json // Verify the plugins module is called with the plugin options from package.json and shareable2.json
td.when(plugins( td.when(
{cwd, options: expectedOptions}, plugins(
{analyzeCommits2: './shareable2.json', publishShareable: './shareable2.json'} { cwd, options: expectedOptions },
)).thenResolve(pluginsConfig); { analyzeCommits2: "./shareable2.json", publishShareable: "./shareable2.json" }
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}, cliOptions); const result = await t.context.getConfig({ cwd }, cliOptions);
// Verify the options contains the plugin config from package.json and shareable2.json // Verify the options contains the plugin config from package.json and shareable2.json
t.deepEqual(result, {options: expectedOptions, plugins: pluginsConfig}); t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
}); });
test.serial('Allow to unset properties defined in shareable config with "null"', async (t) => { test.serial('Allow to unset properties defined in shareable config with "null"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = { const pkgOptions = {
extends: './shareable.json', extends: "./shareable.json",
analyzeCommits: null, analyzeCommits: null,
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
plugins: null, plugins: null,
}; };
const options1 = { const options1 = {
generateNotes: 'generateNotes', generateNotes: "generateNotes",
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: ['test-plugin'], plugins: ["test-plugin"],
}; };
// Create package.json and shareable.json in repository root // Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'shareable.json'), options1); await outputJson(path.resolve(cwd, "shareable.json"), options1);
// Verify the plugins module is called with the plugin options from shareable.json and the default `plugins` // Verify the plugins module is called with the plugin options from shareable.json and the default `plugins`
td.when(plugins( td.when(
plugins(
{ {
options: { options: {
...omit(options1, 'analyzeCommits'), ...omit(options1, "analyzeCommits"),
...omit(pkgOptions, ['extends', 'analyzeCommits']), ...omit(pkgOptions, ["extends", "analyzeCommits"]),
plugins: DEFAULT_PLUGINS, plugins: DEFAULT_PLUGINS,
}, },
cwd, cwd,
}, },
{ {
generateNotes: './shareable.json', generateNotes: "./shareable.json",
analyzeCommits: './shareable.json', analyzeCommits: "./shareable.json",
'test-plugin': './shareable.json', "test-plugin": "./shareable.json",
} }
)).thenResolve(pluginsConfig); )
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from shareable.json and the default `plugins` // Verify the options contains the plugin config from shareable.json and the default `plugins`
t.deepEqual( t.deepEqual(result, {
result,
{
options: { options: {
...omit(options1, ['analyzeCommits']), ...omit(options1, ["analyzeCommits"]),
...omit(pkgOptions, ['extends', 'analyzeCommits']), ...omit(pkgOptions, ["extends", "analyzeCommits"]),
plugins: DEFAULT_PLUGINS, plugins: DEFAULT_PLUGINS,
}, },
plugins: pluginsConfig plugins: pluginsConfig,
} });
);
}); });
test.serial('Allow to unset properties defined in shareable config with "undefined"', async (t) => { test.serial('Allow to unset properties defined in shareable config with "undefined"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = { const pkgOptions = {
extends: './shareable.json', extends: "./shareable.json",
analyzeCommits: undefined, analyzeCommits: undefined,
branches: ['test_branch'], branches: ["test_branch"],
repositoryUrl: 'https://host.null/owner/module.git', repositoryUrl: "https://host.null/owner/module.git",
}; };
const options1 = { const options1 = {
generateNotes: 'generateNotes', generateNotes: "generateNotes",
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'}, analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
plugins: false, plugins: false,
}; };
// Create release.config.js and shareable.json in repository root // Create release.config.js and shareable.json in repository root
await writeFile(path.resolve(cwd, 'release.config.js'), `module.exports = ${format(pkgOptions)}`); await writeFile(path.resolve(cwd, "release.config.js"), `module.exports = ${format(pkgOptions)}`);
await outputJson(path.resolve(cwd, 'shareable.json'), options1); await outputJson(path.resolve(cwd, "shareable.json"), options1);
const expectedOptions = { const expectedOptions = {
...omit(options1, 'analyzeCommits'), ...omit(options1, "analyzeCommits"),
...omit(pkgOptions, ['extends', 'analyzeCommits']), ...omit(pkgOptions, ["extends", "analyzeCommits"]),
branches: ['test_branch'], branches: ["test_branch"],
}; };
// Verify the plugins module is called with the plugin options from shareable.json // Verify the plugins module is called with the plugin options from shareable.json
td.when(plugins( td.when(
{options: expectedOptions, cwd}, plugins(
{generateNotes: './shareable.json', analyzeCommits: './shareable.json'} { options: expectedOptions, cwd },
)).thenResolve(pluginsConfig); { generateNotes: "./shareable.json", analyzeCommits: "./shareable.json" }
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({cwd}); const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from shareable.json // Verify the options contains the plugin config from shareable.json
t.deepEqual(result, {options: expectedOptions, plugins: pluginsConfig}); t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
}); });
test('Throw an Error if one of the shareable configs cannot be found', async (t) => { test("Throw an Error if one of the shareable configs cannot be found", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = {extends: ['./shareable1.json', 'non-existing-path']}; const pkgOptions = { extends: ["./shareable1.json", "non-existing-path"] };
const options1 = {analyzeCommits: 'analyzeCommits'}; const options1 = { analyzeCommits: "analyzeCommits" };
// Create package.json and shareable1.json in repository root // Create package.json and shareable1.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'shareable1.json'), options1); await outputJson(path.resolve(cwd, "shareable1.json"), options1);
await t.throwsAsync(t.context.getConfig({cwd}), { await t.throwsAsync(t.context.getConfig({ cwd }), {
message: /Cannot find module 'non-existing-path'/, message: /Cannot find module 'non-existing-path'/,
code: 'MODULE_NOT_FOUND', code: "MODULE_NOT_FOUND",
}); });
}); });
test('Convert "ci" option to "noCi" when set from extended config', async (t) => { test('Convert "ci" option to "noCi" when set from extended config', async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const pkgOptions = {extends: './no-ci.json'}; const pkgOptions = { extends: "./no-ci.json" };
const options = { const options = {
ci: false, ci: false,
}; };
// Create package.json and no-ci.json in repository root // Create package.json and no-ci.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions}); await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, 'no-ci.json'), options); await outputJson(path.resolve(cwd, "no-ci.json"), options);
const {options: result} = await t.context.getConfig({cwd}); const { options: result } = await t.context.getConfig({ cwd });
t.is(result.ci, false); t.is(result.ci, false);
t.is(result.noCi, true); t.is(result.noCi, true);


@@ -1,408 +1,413 @@
import test from 'ava'; import test from "ava";
import getAuthUrl from '../lib/get-git-auth-url.js'; import getAuthUrl from "../lib/get-git-auth-url.js";
import {gitRepo} from './helpers/git-utils.js'; import { gitRepo } from "./helpers/git-utils.js";
const env = {GIT_ASKPASS: 'echo', GIT_TERMINAL_PROMPT: 0}; const env = { GIT_ASKPASS: "echo", GIT_TERMINAL_PROMPT: 0 };
test('Return the same "git" formatted URL if "gitCredentials" is not defined', async (t) => { test('Return the same "git" formatted URL if "gitCredentials" is not defined', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({cwd, env, branch: {name: 'master'}, options: {repositoryUrl: 'git@host.null:owner/repo.git'}}), await getAuthUrl({
'git@host.null:owner/repo.git' cwd,
env,
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
"git@host.null:owner/repo.git"
); );
}); });
test('Return the same "https" formatted URL if "gitCredentials" is not defined', async (t) => { test('Return the same "https" formatted URL if "gitCredentials" is not defined', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env, env,
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'https://host.null/owner/repo.git'}, options: { repositoryUrl: "https://host.null/owner/repo.git" },
}), }),
'https://host.null/owner/repo.git' "https://host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is not defined and repositoryUrl is a "git+https" URL', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is not defined and repositoryUrl is a "git+https" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env, env,
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git+https://host.null/owner/repo.git'}, options: { repositoryUrl: "git+https://host.null/owner/repo.git" },
}), }),
'https://host.null/owner/repo.git' "https://host.null/owner/repo.git"
); );
}); });
test('Do not add trailing ".git" if not present in the original URL', async (t) => { test('Do not add trailing ".git" if not present in the original URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
    await getAuthUrl({cwd, env, branch: {name: 'master'}, options: {repositoryUrl: 'git@host.null:owner/repo'}}), await getAuthUrl({ cwd, env, branch: { name: "master" }, options: { repositoryUrl: "git@host.null:owner/repo" } }),
'git@host.null:owner/repo' "git@host.null:owner/repo"
); );
}); });
test('Handle "https" URL with group and subgroup', async (t) => { test('Handle "https" URL with group and subgroup', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env, env,
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'https://host.null/group/subgroup/owner/repo.git'}, options: { repositoryUrl: "https://host.null/group/subgroup/owner/repo.git" },
}), }),
'https://host.null/group/subgroup/owner/repo.git' "https://host.null/group/subgroup/owner/repo.git"
); );
}); });
test('Handle "git" URL with group and subgroup', async (t) => { test('Handle "git" URL with group and subgroup', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env, env,
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:group/subgroup/owner/repo.git'}, options: { repositoryUrl: "git@host.null:group/subgroup/owner/repo.git" },
}), }),
'git@host.null:group/subgroup/owner/repo.git' "git@host.null:group/subgroup/owner/repo.git"
); );
}); });
test('Convert shorthand URL', async (t) => { test("Convert shorthand URL", async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env, env,
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'semantic-release/semantic-release'}, options: { repositoryUrl: "semantic-release/semantic-release" },
}), }),
'https://github.com/semantic-release/semantic-release.git' "https://github.com/semantic-release/semantic-release.git"
); );
}); });
test('Convert GitLab shorthand URL', async (t) => { test("Convert GitLab shorthand URL", async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env, env,
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'gitlab:semantic-release/semantic-release'}, options: { repositoryUrl: "gitlab:semantic-release/semantic-release" },
}), }),
'https://gitlab.com/semantic-release/semantic-release.git' "https://gitlab.com/semantic-release/semantic-release.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://user:pass@host.null/owner/repo.git' "https://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: {branch: 'master', repositoryUrl: 'host.null:owner/repo.git'}, options: { branch: "master", repositoryUrl: "host.null:owner/repo.git" },
}), }),
'https://user:pass@host.null/owner/repo.git' "https://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user and with a custom port', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user and with a custom port', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: {branch: 'master', repositoryUrl: 'host.null:6666:owner/repo.git'}, options: { branch: "master", repositoryUrl: "host.null:6666:owner/repo.git" },
}), }),
'https://user:pass@host.null:6666/owner/repo.git' "https://user:pass@host.null:6666/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user and with a custom port followed by a slash', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user and with a custom port followed by a slash', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: {branch: 'master', repositoryUrl: 'host.null:6666:/owner/repo.git'}, options: { branch: "master", repositoryUrl: "host.null:6666:/owner/repo.git" },
}), }),
'https://user:pass@host.null:6666/owner/repo.git' "https://user:pass@host.null:6666/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "https" URL', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "https" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'https://host.null/owner/repo.git'}, options: { repositoryUrl: "https://host.null/owner/repo.git" },
}), }),
'https://user:pass@host.null/owner/repo.git' "https://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "http" URL', async (t) => { test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "http" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'http://host.null/owner/repo.git'}, options: { repositoryUrl: "http://host.null/owner/repo.git" },
}), }),
'http://user:pass@host.null/owner/repo.git' "http://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "http" URL with custom port', async (t) => { test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "http" URL with custom port', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: {branch: 'master', repositoryUrl: 'http://host.null:8080/owner/repo.git'}, options: { branch: "master", repositoryUrl: "http://host.null:8080/owner/repo.git" },
}), }),
'http://user:pass@host.null:8080/owner/repo.git' "http://user:pass@host.null:8080/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git+https" URL', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git+https" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git+https://host.null/owner/repo.git'}, options: { repositoryUrl: "git+https://host.null/owner/repo.git" },
}), }),
'https://user:pass@host.null/owner/repo.git' "https://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git+http" URL', async (t) => { test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git+http" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git+http://host.null/owner/repo.git'}, options: { repositoryUrl: "git+http://host.null/owner/repo.git" },
}), }),
'http://user:pass@host.null/owner/repo.git' "http://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "ssh" URL', async (t) => { test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "ssh" URL', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: {branch: 'master', repositoryUrl: 'ssh://git@host.null:2222/owner/repo.git'}, options: { branch: "master", repositoryUrl: "ssh://git@host.null:2222/owner/repo.git" },
}), }),
'https://user:pass@host.null/owner/repo.git' "https://user:pass@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "GH_TOKEN"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "GH_TOKEN"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GH_TOKEN: 'token'}, env: { ...env, GH_TOKEN: "token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://token@host.null/owner/repo.git' "https://token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "GITHUB_TOKEN"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "GITHUB_TOKEN"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GITHUB_TOKEN: 'token'}, env: { ...env, GITHUB_TOKEN: "token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://token@host.null/owner/repo.git' "https://token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "GL_TOKEN"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "GL_TOKEN"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GL_TOKEN: 'token'}, env: { ...env, GL_TOKEN: "token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://gitlab-ci-token:token@host.null/owner/repo.git' "https://gitlab-ci-token:token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "GITLAB_TOKEN"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "GITLAB_TOKEN"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GITLAB_TOKEN: 'token'}, env: { ...env, GITLAB_TOKEN: "token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://gitlab-ci-token:token@host.null/owner/repo.git' "https://gitlab-ci-token:token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "BB_TOKEN"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "BB_TOKEN"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, BB_TOKEN: 'token'}, env: { ...env, BB_TOKEN: "token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://x-token-auth:token@host.null/owner/repo.git' "https://x-token-auth:token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "BITBUCKET_TOKEN"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "BITBUCKET_TOKEN"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, BITBUCKET_TOKEN: 'token'}, env: { ...env, BITBUCKET_TOKEN: "token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://x-token-auth:token@host.null/owner/repo.git' "https://x-token-auth:token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "BB_TOKEN_BASIC_AUTH"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "BB_TOKEN_BASIC_AUTH"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, BB_TOKEN_BASIC_AUTH: 'username:token'}, env: { ...env, BB_TOKEN_BASIC_AUTH: "username:token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://username:token@host.null/owner/repo.git' "https://username:token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "gitCredentials" is defined with "BITBUCKET_TOKEN_BASIC_AUTH"', async (t) => { test('Return the "https" formatted URL if "gitCredentials" is defined with "BITBUCKET_TOKEN_BASIC_AUTH"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, BITBUCKET_TOKEN_BASIC_AUTH: 'username:token'}, env: { ...env, BITBUCKET_TOKEN_BASIC_AUTH: "username:token" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:owner/repo.git'}, options: { repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://username:token@host.null/owner/repo.git' "https://username:token@host.null/owner/repo.git"
); );
}); });
test('Return the "https" formatted URL if "GITHUB_ACTION" is set', async (t) => { test('Return the "https" formatted URL if "GITHUB_ACTION" is set', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GITHUB_ACTION: 'foo', GITHUB_TOKEN: 'token'}, env: { ...env, GITHUB_ACTION: "foo", GITHUB_TOKEN: "token" },
options: {branch: 'master', repositoryUrl: 'git@host.null:owner/repo.git'}, options: { branch: "master", repositoryUrl: "git@host.null:owner/repo.git" },
}), }),
'https://x-access-token:token@host.null/owner/repo.git' "https://x-access-token:token@host.null/owner/repo.git"
); );
}); });
test('Handle "https" URL with group and subgroup, with "GIT_CREDENTIALS"', async (t) => { test('Handle "https" URL with group and subgroup, with "GIT_CREDENTIALS"', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'https://host.null/group/subgroup/owner/repo.git'}, options: { repositoryUrl: "https://host.null/group/subgroup/owner/repo.git" },
}), }),
'https://user:pass@host.null/group/subgroup/owner/repo.git' "https://user:pass@host.null/group/subgroup/owner/repo.git"
); );
}); });
test('Handle "git" URL with group and subgroup, with "GIT_CREDENTIALS', async (t) => { test('Handle "git" URL with group and subgroup, with "GIT_CREDENTIALS', async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl: 'git@host.null:group/subgroup/owner/repo.git'}, options: { repositoryUrl: "git@host.null:group/subgroup/owner/repo.git" },
}), }),
'https://user:pass@host.null/group/subgroup/owner/repo.git' "https://user:pass@host.null/group/subgroup/owner/repo.git"
); );
}); });
test('Do not add git credential to repositoryUrl if push is allowed', async (t) => { test("Do not add git credential to repositoryUrl if push is allowed", async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
t.is( t.is(
await getAuthUrl({ await getAuthUrl({
cwd, cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'}, env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: {name: 'master'}, branch: { name: "master" },
options: {repositoryUrl}, options: { repositoryUrl },
}), }),
repositoryUrl repositoryUrl
); );

@@ -1,80 +1,80 @@
import test from 'ava'; import test from "ava";
import getLastRelease from '../lib/get-last-release.js'; import getLastRelease from "../lib/get-last-release.js";
test('Get the highest non-prerelease valid tag', (t) => { test("Get the highest non-prerelease valid tag", (t) => {
const result = getLastRelease({ const result = getLastRelease({
branch: { branch: {
name: 'master', name: "master",
tags: [ tags: [
{version: '2.0.0', gitTag: 'v2.0.0', gitHead: 'v2.0.0'}, { version: "2.0.0", gitTag: "v2.0.0", gitHead: "v2.0.0" },
{version: '1.0.0', gitTag: 'v1.0.0', gitHead: 'v1.0.0'}, { version: "1.0.0", gitTag: "v1.0.0", gitHead: "v1.0.0" },
{version: '3.0.0-beta.1', gitTag: 'v3.0.0-beta.1', gitHead: 'v3.0.0-beta.1'}, { version: "3.0.0-beta.1", gitTag: "v3.0.0-beta.1", gitHead: "v3.0.0-beta.1" },
], ],
type: 'release', type: "release",
}, },
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, {version: '2.0.0', gitTag: 'v2.0.0', name: 'v2.0.0', gitHead: 'v2.0.0', channels: undefined}); t.deepEqual(result, { version: "2.0.0", gitTag: "v2.0.0", name: "v2.0.0", gitHead: "v2.0.0", channels: undefined });
}); });
test('Get the highest prerelease valid tag, ignoring other tags from other prerelease channels', (t) => { test("Get the highest prerelease valid tag, ignoring other tags from other prerelease channels", (t) => {
const result = getLastRelease({ const result = getLastRelease({
branch: { branch: {
name: 'beta', name: "beta",
prerelease: 'beta', prerelease: "beta",
channel: 'beta', channel: "beta",
tags: [ tags: [
{version: '1.0.0-beta.1', gitTag: 'v1.0.0-beta.1', gitHead: 'v1.0.0-beta.1', channels: ['beta']}, { version: "1.0.0-beta.1", gitTag: "v1.0.0-beta.1", gitHead: "v1.0.0-beta.1", channels: ["beta"] },
{version: '1.0.0-beta.2', gitTag: 'v1.0.0-beta.2', gitHead: 'v1.0.0-beta.2', channels: ['beta']}, { version: "1.0.0-beta.2", gitTag: "v1.0.0-beta.2", gitHead: "v1.0.0-beta.2", channels: ["beta"] },
{version: '1.0.0-alpha.1', gitTag: 'v1.0.0-alpha.1', gitHead: 'v1.0.0-alpha.1', channels: ['alpha']}, { version: "1.0.0-alpha.1", gitTag: "v1.0.0-alpha.1", gitHead: "v1.0.0-alpha.1", channels: ["alpha"] },
], ],
type: 'prerelease', type: "prerelease",
}, },
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, { t.deepEqual(result, {
version: '1.0.0-beta.2', version: "1.0.0-beta.2",
gitTag: 'v1.0.0-beta.2', gitTag: "v1.0.0-beta.2",
name: 'v1.0.0-beta.2', name: "v1.0.0-beta.2",
gitHead: 'v1.0.0-beta.2', gitHead: "v1.0.0-beta.2",
channels: ['beta'], channels: ["beta"],
}); });
}); });
test('Return empty object if no valid tag is found', (t) => { test("Return empty object if no valid tag is found", (t) => {
const result = getLastRelease({ const result = getLastRelease({
branch: { branch: {
name: 'master', name: "master",
tags: [{version: '3.0.0-beta.1', gitTag: 'v3.0.0-beta.1', gitHead: 'v3.0.0-beta.1'}], tags: [{ version: "3.0.0-beta.1", gitTag: "v3.0.0-beta.1", gitHead: "v3.0.0-beta.1" }],
type: 'release', type: "release",
}, },
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, {}); t.deepEqual(result, {});
}); });
test('Get the highest non-prerelease valid tag before a certain version', (t) => { test("Get the highest non-prerelease valid tag before a certain version", (t) => {
const result = getLastRelease( const result = getLastRelease(
{ {
branch: { branch: {
name: 'master', name: "master",
channel: undefined, channel: undefined,
tags: [ tags: [
{version: '2.0.0', gitTag: 'v2.0.0', gitHead: 'v2.0.0'}, { version: "2.0.0", gitTag: "v2.0.0", gitHead: "v2.0.0" },
{version: '1.0.0', gitTag: 'v1.0.0', gitHead: 'v1.0.0'}, { version: "1.0.0", gitTag: "v1.0.0", gitHead: "v1.0.0" },
{version: '2.0.0-beta.1', gitTag: 'v2.0.0-beta.1', gitHead: 'v2.0.0-beta.1'}, { version: "2.0.0-beta.1", gitTag: "v2.0.0-beta.1", gitHead: "v2.0.0-beta.1" },
{version: '2.1.0', gitTag: 'v2.1.0', gitHead: 'v2.1.0'}, { version: "2.1.0", gitTag: "v2.1.0", gitHead: "v2.1.0" },
{version: '2.1.1', gitTag: 'v2.1.1', gitHead: 'v2.1.1'}, { version: "2.1.1", gitTag: "v2.1.1", gitHead: "v2.1.1" },
], ],
type: 'release', type: "release",
}, },
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}, },
{before: '2.1.0'} { before: "2.1.0" }
); );
t.deepEqual(result, {version: '2.0.0', gitTag: 'v2.0.0', name: 'v2.0.0', gitHead: 'v2.0.0', channels: undefined}); t.deepEqual(result, { version: "2.0.0", gitTag: "v2.0.0", name: "v2.0.0", gitHead: "v2.0.0", channels: undefined });
}); });

@@ -1,15 +1,15 @@
import test from 'ava'; import test from "ava";
import {spy} from 'sinon'; import { spy } from "sinon";
import getLogger from '../lib/get-logger.js'; import getLogger from "../lib/get-logger.js";
test('Expose "error", "success" and "log" functions', (t) => { test('Expose "error", "success" and "log" functions', (t) => {
const stdout = spy(); const stdout = spy();
const stderr = spy(); const stderr = spy();
const logger = getLogger({stdout: {write: stdout}, stderr: {write: stderr}}); const logger = getLogger({ stdout: { write: stdout }, stderr: { write: stderr } });
logger.log('test log'); logger.log("test log");
logger.success('test success'); logger.success("test success");
logger.error('test error'); logger.error("test error");
t.regex(stdout.args[0][0], /.*test log/); t.regex(stdout.args[0][0], /.*test log/);
t.regex(stdout.args[1][0], /.*test success/); t.regex(stdout.args[1][0], /.*test success/);

@@ -1,277 +1,277 @@
import test from 'ava'; import test from "ava";
import {stub} from 'sinon'; import { stub } from "sinon";
import getNextVersion from '../lib/get-next-version.js'; import getNextVersion from "../lib/get-next-version.js";
test.beforeEach((t) => { test.beforeEach((t) => {
// Stub the logger functions // Stub the logger functions
t.context.log = stub(); t.context.log = stub();
t.context.logger = {log: t.context.log}; t.context.logger = { log: t.context.log };
}); });
test('Increase version for patch release', (t) => { test("Increase version for patch release", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: {name: 'master', type: 'release', tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}]}, branch: { name: "master", type: "release", tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }] },
nextRelease: {type: 'patch'}, nextRelease: { type: "patch" },
lastRelease: {version: '1.0.0', channels: [null]}, lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.1' "1.0.1"
); );
}); });
test('Increase version for minor release', (t) => { test("Increase version for minor release", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: {name: 'master', type: 'release', tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}]}, branch: { name: "master", type: "release", tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }] },
nextRelease: {type: 'minor'}, nextRelease: { type: "minor" },
lastRelease: {version: '1.0.0', channels: [null]}, lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.1.0' "1.1.0"
); );
}); });
test('Increase version for major release', (t) => { test("Increase version for major release", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: {name: 'master', type: 'release', tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}]}, branch: { name: "master", type: "release", tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }] },
nextRelease: {type: 'major'}, nextRelease: { type: "major" },
lastRelease: {version: '1.0.0', channels: [null]}, lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'2.0.0' "2.0.0"
); );
}); });
test('Return 1.0.0 if there is no previous release', (t) => { test("Return 1.0.0 if there is no previous release", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: {name: 'master', type: 'release', tags: []}, branch: { name: "master", type: "release", tags: [] },
nextRelease: {type: 'minor'}, nextRelease: { type: "minor" },
lastRelease: {}, lastRelease: {},
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.0' "1.0.0"
); );
}); });
test('Increase version for patch release on prerelease branch', (t) => { test("Increase version for patch release on prerelease branch", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}], tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }],
}, },
nextRelease: {type: 'patch', channel: 'beta'}, nextRelease: { type: "patch", channel: "beta" },
lastRelease: {version: '1.0.0', channels: [null]}, lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.1-beta.1' "1.0.1-beta.1"
); );
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.0.1-beta.1', version: '1.0.1-beta.1', channels: ['beta']}, { gitTag: "v1.0.1-beta.1", version: "1.0.1-beta.1", channels: ["beta"] },
], ],
}, },
nextRelease: {type: 'patch', channel: 'beta'}, nextRelease: { type: "patch", channel: "beta" },
lastRelease: {version: '1.0.1-beta.1', channels: ['beta']}, lastRelease: { version: "1.0.1-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.1-beta.2' "1.0.1-beta.2"
); );
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'alpha', name: "alpha",
type: 'prerelease', type: "prerelease",
prerelease: 'alpha', prerelease: "alpha",
tags: [{gitTag: 'v1.0.1-beta.1', version: '1.0.1-beta.1', channels: ['beta']}], tags: [{ gitTag: "v1.0.1-beta.1", version: "1.0.1-beta.1", channels: ["beta"] }],
}, },
nextRelease: {type: 'patch', channel: 'alpha'}, nextRelease: { type: "patch", channel: "alpha" },
lastRelease: {version: '1.0.1-beta.1', channels: ['beta']}, lastRelease: { version: "1.0.1-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.2-alpha.1' "1.0.2-alpha.1"
); );
}); });
test('Increase version for minor release on prerelease branch', (t) => { test("Increase version for minor release on prerelease branch", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}], tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }],
}, },
nextRelease: {type: 'minor', channel: 'beta'}, nextRelease: { type: "minor", channel: "beta" },
lastRelease: {version: '1.0.0', channels: [null]}, lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.1.0-beta.1' "1.1.0-beta.1"
); );
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: ['beta']}, { gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: ["beta"] },
], ],
}, },
nextRelease: {type: 'minor', channel: 'beta'}, nextRelease: { type: "minor", channel: "beta" },
lastRelease: {version: '1.1.0-beta.1', channels: ['beta']}, lastRelease: { version: "1.1.0-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.1.0-beta.2' "1.1.0-beta.2"
); );
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'alpha', name: "alpha",
type: 'prerelease', type: "prerelease",
prerelease: 'alpha', prerelease: "alpha",
tags: [{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: ['beta']}], tags: [{ gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: ["beta"] }],
}, },
nextRelease: {type: 'minor', channel: 'alpha'}, nextRelease: { type: "minor", channel: "alpha" },
lastRelease: {version: '1.1.0-beta.1', channels: ['beta']}, lastRelease: { version: "1.1.0-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.2.0-alpha.1' "1.2.0-alpha.1"
); );
}); });
test('Increase version for major release on prerelease branch', (t) => { test("Increase version for major release on prerelease branch", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}], tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }],
}, },
nextRelease: {type: 'major', channel: 'beta'}, nextRelease: { type: "major", channel: "beta" },
lastRelease: {version: '1.0.0', channels: [null]}, lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'2.0.0-beta.1' "2.0.0-beta.1"
); );
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v2.0.0-beta.1', version: '2.0.0-beta.1', channels: ['beta']}, { gitTag: "v2.0.0-beta.1", version: "2.0.0-beta.1", channels: ["beta"] },
], ],
}, },
nextRelease: {type: 'major', channel: 'beta'}, nextRelease: { type: "major", channel: "beta" },
lastRelease: {version: '2.0.0-beta.1', channels: ['beta']}, lastRelease: { version: "2.0.0-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'2.0.0-beta.2' "2.0.0-beta.2"
); );
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'alpha', name: "alpha",
type: 'prerelease', type: "prerelease",
prerelease: 'alpha', prerelease: "alpha",
tags: [{gitTag: 'v2.0.0-beta.1', version: '2.0.0-beta.1', channels: ['beta']}], tags: [{ gitTag: "v2.0.0-beta.1", version: "2.0.0-beta.1", channels: ["beta"] }],
}, },
nextRelease: {type: 'major', channel: 'alpha'}, nextRelease: { type: "major", channel: "alpha" },
lastRelease: {version: '2.0.0-beta.1', channels: ['beta']}, lastRelease: { version: "2.0.0-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'3.0.0-alpha.1' "3.0.0-alpha.1"
); );
}); });
test('Return 1.0.0 if there is no previous release on prerelease branch', (t) => { test("Return 1.0.0 if there is no previous release on prerelease branch", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: {name: 'beta', type: 'prerelease', prerelease: 'beta', tags: []}, branch: { name: "beta", type: "prerelease", prerelease: "beta", tags: [] },
nextRelease: {type: 'minor'}, nextRelease: { type: "minor" },
lastRelease: {}, lastRelease: {},
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.0-beta.1' "1.0.0-beta.1"
); );
}); });
test('Increase version for release on prerelease branch after previous commits were merged to release branch', (t) => { test("Increase version for release on prerelease branch after previous commits were merged to release branch", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]}, // Version v1.1.0 released on default branch after beta was merged into master { gitTag: "v1.1.0", version: "1.1.0", channels: [null] }, // Version v1.1.0 released on default branch after beta was merged into master
{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: [null, 'beta']}, { gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: [null, "beta"] },
], ],
}, },
nextRelease: {type: 'minor'}, nextRelease: { type: "minor" },
lastRelease: {version: '1.1.0', channels: [null]}, lastRelease: { version: "1.1.0", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.2.0-beta.1' "1.2.0-beta.1"
); );
}); });
test('Increase version for release on prerelease branch based on highest commit type since last regular release', (t) => { test("Increase version for release on prerelease branch based on highest commit type since last regular release", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: [null, 'beta']}, { gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: [null, "beta"] },
], ],
}, },
nextRelease: {type: 'major'}, nextRelease: { type: "major" },
lastRelease: {version: 'v1.1.0-beta.1', channels: [null]}, lastRelease: { version: "v1.1.0-beta.1", channels: [null] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'2.0.0-beta.1' "2.0.0-beta.1"
); );
}); });
test('Increase version for release on prerelease branch when there is no regular releases on other branches', (t) => { test("Increase version for release on prerelease branch when there is no regular releases on other branches", (t) => {
t.is( t.is(
getNextVersion({ getNextVersion({
branch: { branch: {
name: 'beta', name: "beta",
type: 'prerelease', type: "prerelease",
prerelease: 'beta', prerelease: "beta",
tags: [{gitTag: 'v1.0.0-beta.1', version: '1.0.0-beta.1', channels: ['beta']}], tags: [{ gitTag: "v1.0.0-beta.1", version: "1.0.0-beta.1", channels: ["beta"] }],
}, },
nextRelease: {type: 'minor', channel: 'beta'}, nextRelease: { type: "minor", channel: "beta" },
lastRelease: {version: 'v1.0.0-beta.1', channels: ['beta']}, lastRelease: { version: "v1.0.0-beta.1", channels: ["beta"] },
logger: t.context.logger, logger: t.context.logger,
}), }),
'1.0.0-beta.2' "1.0.0-beta.2"
); );
}); });

@@ -1,189 +1,189 @@
import test from 'ava'; import test from "ava";
import getReleaseToAdd from '../lib/get-release-to-add.js'; import getReleaseToAdd from "../lib/get-release-to-add.js";
test('Return versions merged from release to maintenance branch, excluding lower than branch start range', (t) => { test("Return versions merged from release to maintenance branch, excluding lower than branch start range", (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: '2.x', name: "2.x",
channel: '2.x', channel: "2.x",
type: 'maintenance', type: "maintenance",
mergeRange: '>=2.0.0 <3.0.0', mergeRange: ">=2.0.0 <3.0.0",
tags: [ tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['2.x']}, { gitTag: "v2.0.0", version: "2.0.0", channels: ["2.x"] },
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]}, { gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{gitTag: 'v2.1.0', version: '2.1.0', channels: [null]}, { gitTag: "v2.1.0", version: "2.1.0", channels: [null] },
{gitTag: 'v2.1.1', version: '2.1.1', channels: [null]}, { gitTag: "v2.1.1", version: "2.1.1", channels: [null] },
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]}, { gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
], ],
}, },
branches: [{name: '2.x', channel: '2.x'}, {name: 'master'}], branches: [{ name: "2.x", channel: "2.x" }, { name: "master" }],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, { t.deepEqual(result, {
lastRelease: {version: '2.1.0', channels: [null], gitTag: 'v2.1.0', name: 'v2.1.0', gitHead: 'v2.1.0'}, lastRelease: { version: "2.1.0", channels: [null], gitTag: "v2.1.0", name: "v2.1.0", gitHead: "v2.1.0" },
currentRelease: { currentRelease: {
type: 'patch', type: "patch",
version: '2.1.1', version: "2.1.1",
channels: [null], channels: [null],
gitTag: 'v2.1.1', gitTag: "v2.1.1",
name: 'v2.1.1', name: "v2.1.1",
gitHead: 'v2.1.1', gitHead: "v2.1.1",
}, },
nextRelease: { nextRelease: {
type: 'patch', type: "patch",
version: '2.1.1', version: "2.1.1",
channel: '2.x', channel: "2.x",
gitTag: 'v2.1.1', gitTag: "v2.1.1",
name: 'v2.1.1', name: "v2.1.1",
gitHead: 'v2.1.1', gitHead: "v2.1.1",
}, },
}); });
}); });
test('Return versions merged between release branches', (t) => { test("Return versions merged between release branches", (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: 'master', name: "master",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null, 'next']}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null, "next"] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: ['next']}, { gitTag: "v1.1.0", version: "1.1.0", channels: ["next"] },
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['next-major']}, { gitTag: "v2.0.0", version: "2.0.0", channels: ["next-major"] },
], ],
}, },
branches: [{name: 'master'}, {name: 'next', channel: 'next'}, {name: 'next-major', channel: 'next-major'}], branches: [{ name: "master" }, { name: "next", channel: "next" }, { name: "next-major", channel: "next-major" }],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, { t.deepEqual(result, {
lastRelease: { lastRelease: {
version: '1.1.0', version: "1.1.0",
gitTag: 'v1.1.0', gitTag: "v1.1.0",
name: 'v1.1.0', name: "v1.1.0",
gitHead: 'v1.1.0', gitHead: "v1.1.0",
channels: ['next'], channels: ["next"],
}, },
currentRelease: { currentRelease: {
type: 'major', type: "major",
version: '2.0.0', version: "2.0.0",
channels: ['next-major'], channels: ["next-major"],
gitTag: 'v2.0.0', gitTag: "v2.0.0",
name: 'v2.0.0', name: "v2.0.0",
gitHead: 'v2.0.0', gitHead: "v2.0.0",
}, },
nextRelease: { nextRelease: {
type: 'major', type: "major",
version: '2.0.0', version: "2.0.0",
channel: null, channel: null,
gitTag: 'v2.0.0', gitTag: "v2.0.0",
name: 'v2.0.0', name: "v2.0.0",
gitHead: 'v2.0.0', gitHead: "v2.0.0",
}, },
}); });
}); });
test('Return releases sorted by ascending order', (t) => { test("Return releases sorted by ascending order", (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: 'master', name: "master",
tags: [ tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['next-major']}, { gitTag: "v2.0.0", version: "2.0.0", channels: ["next-major"] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: ['next']}, { gitTag: "v1.1.0", version: "1.1.0", channels: ["next"] },
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null, 'next']}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null, "next"] },
], ],
}, },
branches: [{name: 'master'}, {name: 'next', channel: 'next'}, {name: 'next-major', channel: 'next-major'}], branches: [{ name: "master" }, { name: "next", channel: "next" }, { name: "next-major", channel: "next-major" }],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, { t.deepEqual(result, {
lastRelease: {version: '1.1.0', gitTag: 'v1.1.0', name: 'v1.1.0', gitHead: 'v1.1.0', channels: ['next']}, lastRelease: { version: "1.1.0", gitTag: "v1.1.0", name: "v1.1.0", gitHead: "v1.1.0", channels: ["next"] },
currentRelease: { currentRelease: {
type: 'major', type: "major",
version: '2.0.0', version: "2.0.0",
channels: ['next-major'], channels: ["next-major"],
gitTag: 'v2.0.0', gitTag: "v2.0.0",
name: 'v2.0.0', name: "v2.0.0",
gitHead: 'v2.0.0', gitHead: "v2.0.0",
}, },
nextRelease: { nextRelease: {
type: 'major', type: "major",
version: '2.0.0', version: "2.0.0",
channel: null, channel: null,
gitTag: 'v2.0.0', gitTag: "v2.0.0",
name: 'v2.0.0', name: "v2.0.0",
gitHead: 'v2.0.0', gitHead: "v2.0.0",
}, },
}); });
}); });
test('No lastRelease', (t) => { test("No lastRelease", (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: 'master', name: "master",
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: ['next']}], tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: ["next"] }],
}, },
branches: [{name: 'master'}, {name: 'next', channel: 'next'}], branches: [{ name: "master" }, { name: "next", channel: "next" }],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, { t.deepEqual(result, {
lastRelease: {}, lastRelease: {},
currentRelease: { currentRelease: {
type: 'major', type: "major",
version: '1.0.0', version: "1.0.0",
channels: ['next'], channels: ["next"],
gitTag: 'v1.0.0', gitTag: "v1.0.0",
name: 'v1.0.0', name: "v1.0.0",
gitHead: 'v1.0.0', gitHead: "v1.0.0",
}, },
nextRelease: { nextRelease: {
type: 'major', type: "major",
version: '1.0.0', version: "1.0.0",
channel: null, channel: null,
gitTag: 'v1.0.0', gitTag: "v1.0.0",
name: 'v1.0.0', name: "v1.0.0",
gitHead: 'v1.0.0', gitHead: "v1.0.0",
}, },
}); });
}); });
test('Ignore pre-release versions', (t) => { test("Ignore pre-release versions", (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: 'master', name: "master",
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null, 'next']}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null, "next"] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: ['next']}, { gitTag: "v1.1.0", version: "1.1.0", channels: ["next"] },
{gitTag: 'v2.0.0-alpha.1', version: '2.0.0-alpha.1', channels: ['alpha']}, { gitTag: "v2.0.0-alpha.1", version: "2.0.0-alpha.1", channels: ["alpha"] },
], ],
}, },
branches: [ branches: [
{name: 'master'}, { name: "master" },
{name: 'next', channel: 'next'}, { name: "next", channel: "next" },
{name: 'alpha', type: 'prerelease', channel: 'alpha'}, { name: "alpha", type: "prerelease", channel: "alpha" },
], ],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.deepEqual(result, { t.deepEqual(result, {
lastRelease: {version: '1.0.0', channels: [null, 'next'], gitTag: 'v1.0.0', name: 'v1.0.0', gitHead: 'v1.0.0'}, lastRelease: { version: "1.0.0", channels: [null, "next"], gitTag: "v1.0.0", name: "v1.0.0", gitHead: "v1.0.0" },
currentRelease: { currentRelease: {
type: 'minor', type: "minor",
version: '1.1.0', version: "1.1.0",
channels: ['next'], channels: ["next"],
gitTag: 'v1.1.0', gitTag: "v1.1.0",
name: 'v1.1.0', name: "v1.1.0",
gitHead: 'v1.1.0', gitHead: "v1.1.0",
}, },
nextRelease: { nextRelease: {
type: 'minor', type: "minor",
version: '1.1.0', version: "1.1.0",
channel: null, channel: null,
gitTag: 'v1.1.0', gitTag: "v1.1.0",
name: 'v1.1.0', name: "v1.1.0",
gitHead: 'v1.1.0', gitHead: "v1.1.0",
}, },
}); });
}); });
@@ -191,24 +191,24 @@ test('Ignore pre-release versions', (t) => { test("Ignore pre-release versions", (t) => {
test('Exclude versions merged from release to maintenance branch if they have the same "channel"', (t) => { test('Exclude versions merged from release to maintenance branch if they have the same "channel"', (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: '2.x', name: "2.x",
channel: 'latest', channel: "latest",
type: 'maintenance', type: "maintenance",
mergeRange: '>=2.0.0 <3.0.0', mergeRange: ">=2.0.0 <3.0.0",
tags: [ tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]}, { gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]}, { gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{gitTag: 'v2.1.0', version: '2.1.0', channels: [null]}, { gitTag: "v2.1.0", version: "2.1.0", channels: [null] },
{gitTag: 'v2.1.1', version: '2.1.1', channels: [null]}, { gitTag: "v2.1.1", version: "2.1.1", channels: [null] },
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]}, { gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
], ],
}, },
branches: [ branches: [
{name: '2.x', channel: 'latest'}, { name: "2.x", channel: "latest" },
{name: 'master', channel: 'latest'}, { name: "master", channel: "latest" },
], ],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.is(result, undefined); t.is(result, undefined);
@@ -217,20 +217,20 @@ test('Exclude versions merged from release to maintenance branch if they have th
test('Exclude versions merged between release branches if they have the same "channel"', (t) => { test('Exclude versions merged between release branches if they have the same "channel"', (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: 'master', name: "master",
channel: 'latest', channel: "latest",
tags: [ tags: [
{gitTag: 'v1.0.0', channels: ['latest'], version: '1.0.0'}, { gitTag: "v1.0.0", channels: ["latest"], version: "1.0.0" },
{gitTag: 'v1.1.0', channels: ['latest'], version: '1.1.0'}, { gitTag: "v1.1.0", channels: ["latest"], version: "1.1.0" },
{gitTag: 'v2.0.0', channels: ['latest'], version: '2.0.0'}, { gitTag: "v2.0.0", channels: ["latest"], version: "2.0.0" },
], ],
}, },
branches: [ branches: [
{name: 'master', channel: 'latest'}, { name: "master", channel: "latest" },
{name: 'next', channel: 'latest'}, { name: "next", channel: "latest" },
{name: 'next-major', channel: 'latest'}, { name: "next-major", channel: "latest" },
], ],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.is(result, undefined); t.is(result, undefined);
@@ -239,43 +239,43 @@ test('Exclude versions merged between release branches if they have the same "ch
test('Exclude versions merged between release branches if they all have "channel" set to "false"', (t) => { test('Exclude versions merged between release branches if they all have "channel" set to "false"', (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: 'master', name: "master",
channel: false, channel: false,
tags: [ tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]}, { gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]}, { gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
], ],
}, },
branches: [ branches: [
{name: 'master', channel: false}, { name: "master", channel: false },
{name: 'next', channel: false}, { name: "next", channel: false },
{name: 'next-major', channel: false}, { name: "next-major", channel: false },
], ],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.is(result, undefined); t.is(result, undefined);
}); });
test('Exclude versions number less than the latest version already released on that branch', (t) => { test("Exclude versions number less than the latest version already released on that branch", (t) => {
const result = getReleaseToAdd({ const result = getReleaseToAdd({
branch: { branch: {
name: '2.x', name: "2.x",
channel: '2.x', channel: "2.x",
type: 'maintenance', type: "maintenance",
mergeRange: '>=2.0.0 <3.0.0', mergeRange: ">=2.0.0 <3.0.0",
tags: [ tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['2.x']}, { gitTag: "v2.0.0", version: "2.0.0", channels: ["2.x"] },
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]}, { gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{gitTag: 'v2.1.0', version: '2.1.0', channels: [null]}, { gitTag: "v2.1.0", version: "2.1.0", channels: [null] },
{gitTag: 'v2.1.1', version: '2.1.1', channels: [null, '2.x']}, { gitTag: "v2.1.1", version: "2.1.1", channels: [null, "2.x"] },
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}, { gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]}, { gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
], ],
}, },
branches: [{name: '2.x', channel: '2.x'}, {name: 'master'}], branches: [{ name: "2.x", channel: "2.x" }, { name: "master" }],
options: {tagFormat: `v\${version}`}, options: { tagFormat: `v\${version}` },
}); });
t.is(result, undefined); t.is(result, undefined);

@@ -1,5 +1,5 @@
import test from 'ava'; import test from "ava";
import {temporaryDirectory} from 'tempy'; import { temporaryDirectory } from "tempy";
import { import {
addNote, addNote,
fetch, fetch,
@@ -15,8 +15,8 @@ import { import {
push, push,
repoUrl, repoUrl,
tag, tag,
verifyTagName verifyTagName,
} from '../lib/git.js'; } from "../lib/git.js";
import { import {
gitAddConfig, gitAddConfig,
gitAddNote, gitAddNote,
@@ -33,385 +33,385 @@
gitRepo, gitRepo,
gitShallowClone, gitShallowClone,
gitTagVersion, gitTagVersion,
initGit initGit,
} from './helpers/git-utils.js'; } from "./helpers/git-utils.js";
test('Get the last commit sha', async (t) => { test("Get the last commit sha", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
const result = await getGitHead({cwd}); const result = await getGitHead({ cwd });
t.is(result, commits[0].hash); t.is(result, commits[0].hash);
}); });
test('Throw error if the last commit sha cannot be found', async (t) => { test("Throw error if the last commit sha cannot be found", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
await t.throwsAsync(getGitHead({cwd})); await t.throwsAsync(getGitHead({ cwd }));
}); });
test('Unshallow and fetch repository', async (t) => { test("Unshallow and fetch repository", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
await gitCommits(['First', 'Second'], {cwd}); await gitCommits(["First", "Second"], { cwd });
// Create a shallow clone with only 1 commit // Create a shallow clone with only 1 commit
cwd = await gitShallowClone(repositoryUrl); cwd = await gitShallowClone(repositoryUrl);
// Verify the shallow clone contains only one commit // Verify the shallow clone contains only one commit
t.is((await gitGetCommits(undefined, {cwd})).length, 1); t.is((await gitGetCommits(undefined, { cwd })).length, 1);
await fetch(repositoryUrl, 'master', 'master', {cwd}); await fetch(repositoryUrl, "master", "master", { cwd });
// Verify the shallow clone contains all the commits // Verify the shallow clone contains all the commits
t.is((await gitGetCommits(undefined, {cwd})).length, 2); t.is((await gitGetCommits(undefined, { cwd })).length, 2);
}); });
test('Do not throw error when unshallow a complete repository', async (t) => { test("Do not throw error when unshallow a complete repository", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
await gitCheckout('second-branch', true, {cwd}); await gitCheckout("second-branch", true, { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, 'second-branch', {cwd}); await gitPush(repositoryUrl, "second-branch", { cwd });
await t.notThrowsAsync(fetch(repositoryUrl, 'master', 'master', {cwd})); await t.notThrowsAsync(fetch(repositoryUrl, "master", "master", { cwd }));
await t.notThrowsAsync(fetch(repositoryUrl, 'second-branch', 'master', {cwd})); await t.notThrowsAsync(fetch(repositoryUrl, "second-branch", "master", { cwd }));
}); });
test('Fetch all tags on a detached head repository', async (t) => { test("Fetch all tags on a detached head repository", async (t) => {
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitTagVersion('v1.0.0', undefined, {cwd}); await gitTagVersion("v1.0.0", undefined, { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitTagVersion('v1.0.1', undefined, {cwd}); await gitTagVersion("v1.0.1", undefined, { cwd });
const [commit] = await gitCommits(['Third'], {cwd}); const [commit] = await gitCommits(["Third"], { cwd });
await gitTagVersion('v1.1.0', undefined, {cwd}); await gitTagVersion("v1.1.0", undefined, { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
cwd = await gitDetachedHead(repositoryUrl, commit.hash); cwd = await gitDetachedHead(repositoryUrl, commit.hash);
await fetch(repositoryUrl, 'master', 'master', {cwd}); await fetch(repositoryUrl, "master", "master", { cwd });
t.deepEqual((await getTags('master', {cwd})).sort(), ['v1.0.0', 'v1.0.1', 'v1.1.0'].sort()); t.deepEqual((await getTags("master", { cwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0"].sort());
}); });
test('Fetch all tags on a repository with a detached head from branch (CircleCI)', async (t) => { test("Fetch all tags on a repository with a detached head from branch (CircleCI)", async (t) => {
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitTagVersion('v1.0.0', undefined, {cwd}); await gitTagVersion("v1.0.0", undefined, { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitTagVersion('v1.0.1', undefined, {cwd}); await gitTagVersion("v1.0.1", undefined, { cwd });
const [commit] = await gitCommits(['Third'], {cwd}); const [commit] = await gitCommits(["Third"], { cwd });
await gitTagVersion('v1.1.0', undefined, {cwd}); await gitTagVersion("v1.1.0", undefined, { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
await gitCheckout('other-branch', true, {cwd}); await gitCheckout("other-branch", true, { cwd });
await gitPush(repositoryUrl, 'other-branch', {cwd}); await gitPush(repositoryUrl, "other-branch", { cwd });
await gitCheckout('master', false, {cwd}); await gitCheckout("master", false, { cwd });
await gitCommits(['Fourth'], {cwd}); await gitCommits(["Fourth"], { cwd });
await gitTagVersion('v2.0.0', undefined, {cwd}); await gitTagVersion("v2.0.0", undefined, { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
cwd = await gitDetachedHeadFromBranch(repositoryUrl, 'other-branch', commit.hash); cwd = await gitDetachedHeadFromBranch(repositoryUrl, "other-branch", commit.hash);
await fetch(repositoryUrl, 'master', 'other-branch', {cwd}); await fetch(repositoryUrl, "master", "other-branch", { cwd });
await fetch(repositoryUrl, 'other-branch', 'other-branch', {cwd}); await fetch(repositoryUrl, "other-branch", "other-branch", { cwd });
t.deepEqual((await getTags('other-branch', {cwd})).sort(), ['v1.0.0', 'v1.0.1', 'v1.1.0'].sort()); t.deepEqual((await getTags("other-branch", { cwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0"].sort());
t.deepEqual((await getTags('master', {cwd})).sort(), ['v1.0.0', 'v1.0.1', 'v1.1.0', 'v2.0.0'].sort()); t.deepEqual((await getTags("master", { cwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0", "v2.0.0"].sort());
}); });
test('Fetch all tags on a detached head repository with outdated cached repo (GitLab CI)', async (t) => { test("Fetch all tags on a detached head repository with outdated cached repo (GitLab CI)", async (t) => {
const {cwd, repositoryUrl} = await gitRepo(); const { cwd, repositoryUrl } = await gitRepo();
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitTagVersion('v1.0.0', undefined, {cwd}); await gitTagVersion("v1.0.0", undefined, { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitTagVersion('v1.0.1', undefined, {cwd}); await gitTagVersion("v1.0.1", undefined, { cwd });
let [commit] = await gitCommits(['Third'], {cwd}); let [commit] = await gitCommits(["Third"], { cwd });
await gitTagVersion('v1.1.0', undefined, {cwd}); await gitTagVersion("v1.1.0", undefined, { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
// Create a clone (as first CI run would) // Create a clone (as first CI run would)
const cloneCwd = await gitShallowClone(repositoryUrl); const cloneCwd = await gitShallowClone(repositoryUrl);
await gitFetch(repositoryUrl, {cwd: cloneCwd}); await gitFetch(repositoryUrl, { cwd: cloneCwd });
await gitCheckout(commit.hash, false, {cwd: cloneCwd}); await gitCheckout(commit.hash, false, { cwd: cloneCwd });
// Push tag to remote // Push tag to remote
[commit] = await gitCommits(['Fourth'], {cwd}); [commit] = await gitCommits(["Fourth"], { cwd });
await gitTagVersion('v1.2.0', undefined, {cwd}); await gitTagVersion("v1.2.0", undefined, { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
// Fetch on the cached repo and make detached head, leaving master outdated // Fetch on the cached repo and make detached head, leaving master outdated
await fetch(repositoryUrl, 'master', 'master', {cwd: cloneCwd}); await fetch(repositoryUrl, "master", "master", { cwd: cloneCwd });
await gitCheckout(commit.hash, false, {cwd: cloneCwd}); await gitCheckout(commit.hash, false, { cwd: cloneCwd });
t.deepEqual((await getTags('master', {cwd: cloneCwd})).sort(), ['v1.0.0', 'v1.0.1', 'v1.1.0', 'v1.2.0'].sort()); t.deepEqual((await getTags("master", { cwd: cloneCwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0", "v1.2.0"].sort());
}); });
test('Verify if a branch exists', async (t) => { test("Verify if a branch exists", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
// Create the new branch 'other-branch' from master // Create the new branch 'other-branch' from master
await gitCheckout('other-branch', true, {cwd}); await gitCheckout("other-branch", true, { cwd });
// Add commits to the 'other-branch' branch // Add commits to the 'other-branch' branch
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
t.true(await isRefExists('master', {cwd})); t.true(await isRefExists("master", { cwd }));
t.true(await isRefExists('other-branch', {cwd})); t.true(await isRefExists("other-branch", { cwd }));
t.falsy(await isRefExists('next', {cwd})); t.falsy(await isRefExists("next", { cwd }));
}); });
test('Get all branches', async (t) => { test("Get all branches", async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
await gitCheckout('second-branch', true, {cwd}); await gitCheckout("second-branch", true, { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, 'second-branch', {cwd}); await gitPush(repositoryUrl, "second-branch", { cwd });
await gitCheckout('third-branch', true, {cwd}); await gitCheckout("third-branch", true, { cwd });
await gitCommits(['Third'], {cwd}); await gitCommits(["Third"], { cwd });
await gitPush(repositoryUrl, 'third-branch', {cwd}); await gitPush(repositoryUrl, "third-branch", { cwd });
t.deepEqual((await getBranches(repositoryUrl, {cwd})).sort(), ['master', 'second-branch', 'third-branch'].sort()); t.deepEqual((await getBranches(repositoryUrl, { cwd })).sort(), ["master", "second-branch", "third-branch"].sort());
}); });
test('Return empty array if there are no branches', async (t) => { test("Return empty array if there are no branches", async (t) => {
const {cwd, repositoryUrl} = await initGit(true); const { cwd, repositoryUrl } = await initGit(true);
t.deepEqual(await getBranches(repositoryUrl, {cwd}), []); t.deepEqual(await getBranches(repositoryUrl, { cwd }), []);
}); });
test('Get the commit sha for a given tag', async (t) => { test("Get the commit sha for a given tag", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
// Create the tag corresponding to version 1.0.0 // Create the tag corresponding to version 1.0.0
await gitTagVersion('v1.0.0', undefined, {cwd}); await gitTagVersion("v1.0.0", undefined, { cwd });
t.is(await getTagHead('v1.0.0', {cwd}), commits[0].hash); t.is(await getTagHead("v1.0.0", { cwd }), commits[0].hash);
}); });
test('Return git remote repository url from config', async (t) => { test("Return git remote repository url from config", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add remote.origin.url config // Add remote.origin.url config
await gitAddConfig('remote.origin.url', 'git@hostname.com:owner/package.git', {cwd}); await gitAddConfig("remote.origin.url", "git@hostname.com:owner/package.git", { cwd });
t.is(await repoUrl({cwd}), 'git@hostname.com:owner/package.git'); t.is(await repoUrl({ cwd }), "git@hostname.com:owner/package.git");
}); });
test('Return git remote repository url set while cloning', async (t) => { test("Return git remote repository url set while cloning", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
// Create a clone // Create a clone
cwd = await gitShallowClone(repositoryUrl); cwd = await gitShallowClone(repositoryUrl);
t.is(await repoUrl({cwd}), repositoryUrl); t.is(await repoUrl({ cwd }), repositoryUrl);
}); });
test('Return falsy if git repository url is not set', async (t) => { test("Return falsy if git repository url is not set", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
t.falsy(await repoUrl({cwd})); t.falsy(await repoUrl({ cwd }));
}); });
test('Add tag on head commit', async (t) => { test("Add tag on head commit", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const commits = await gitCommits(['Test commit'], {cwd}); const commits = await gitCommits(["Test commit"], { cwd });
await tag('tag_name', 'HEAD', {cwd}); await tag("tag_name", "HEAD", { cwd });
await t.is(await gitCommitTag(commits[0].hash, {cwd}), 'tag_name'); await t.is(await gitCommitTag(commits[0].hash, { cwd }), "tag_name");
}); });
test('Push tag to remote repository', async (t) => { test("Push tag to remote repository", async (t) => {
// Create a git repository with a remote, set the current working directory at the root of the repo // Create a git repository with a remote, set the current working directory at the root of the repo
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const commits = await gitCommits(['Test commit'], {cwd}); const commits = await gitCommits(["Test commit"], { cwd });
await tag('tag_name', 'HEAD', {cwd}); await tag("tag_name", "HEAD", { cwd });
await push(repositoryUrl, {cwd}); await push(repositoryUrl, { cwd });
t.is(await gitRemoteTagHead(repositoryUrl, 'tag_name', {cwd}), commits[0].hash); t.is(await gitRemoteTagHead(repositoryUrl, "tag_name", { cwd }), commits[0].hash);
}); });
test('Push tag to remote repository with remote branch ahead', async (t) => { test("Push tag to remote repository with remote branch ahead", async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
const temporaryRepo = await gitShallowClone(repositoryUrl); const temporaryRepo = await gitShallowClone(repositoryUrl);
await gitCommits(['Second'], {cwd: temporaryRepo}); await gitCommits(["Second"], { cwd: temporaryRepo });
await gitPush('origin', 'master', {cwd: temporaryRepo}); await gitPush("origin", "master", { cwd: temporaryRepo });
await tag('tag_name', 'HEAD', {cwd}); await tag("tag_name", "HEAD", { cwd });
await push(repositoryUrl, {cwd}); await push(repositoryUrl, { cwd });
t.is(await gitRemoteTagHead(repositoryUrl, 'tag_name', {cwd}), commits[0].hash); t.is(await gitRemoteTagHead(repositoryUrl, "tag_name", { cwd }), commits[0].hash);
}); });
test('Return "true" if in a Git repository', async (t) => { test('Return "true" if in a Git repository', async (t) => {
// Create a git repository with a remote, set the current working directory at the root of the repo // Create a git repository with a remote, set the current working directory at the root of the repo
const {cwd} = await gitRepo(true); const { cwd } = await gitRepo(true);
t.true(await isGitRepo({cwd})); t.true(await isGitRepo({ cwd }));
}); });
test('Return falsy if not in a Git repository', async (t) => { test("Return falsy if not in a Git repository", async (t) => {
const cwd = temporaryDirectory(); const cwd = temporaryDirectory();
t.falsy(await isGitRepo({cwd})); t.falsy(await isGitRepo({ cwd }));
}); });
test('Return "true" for valid tag names', async (t) => { test('Return "true" for valid tag names', async (t) => {
t.true(await verifyTagName('1.0.0')); t.true(await verifyTagName("1.0.0"));
t.true(await verifyTagName('v1.0.0')); t.true(await verifyTagName("v1.0.0"));
t.true(await verifyTagName('tag_name')); t.true(await verifyTagName("tag_name"));
t.true(await verifyTagName('tag/name')); t.true(await verifyTagName("tag/name"));
}); });
test('Return falsy for invalid tag names', async (t) => { test("Return falsy for invalid tag names", async (t) => {
t.falsy(await verifyTagName('?1.0.0')); t.falsy(await verifyTagName("?1.0.0"));
t.falsy(await verifyTagName('*1.0.0')); t.falsy(await verifyTagName("*1.0.0"));
t.falsy(await verifyTagName('[1.0.0]')); t.falsy(await verifyTagName("[1.0.0]"));
t.falsy(await verifyTagName('1.0.0..')); t.falsy(await verifyTagName("1.0.0.."));
}); });
test('Throws error if obtaining the tags fails', async (t) => { test("Throws error if obtaining the tags fails", async (t) => {
const cwd = temporaryDirectory(); const cwd = temporaryDirectory();
await t.throwsAsync(getTags('master', {cwd})); await t.throwsAsync(getTags("master", { cwd }));
}); });
test('Return "true" if repository is up to date', async (t) => { test('Return "true" if repository is up to date', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
t.true(await isBranchUpToDate(repositoryUrl, 'master', {cwd})); t.true(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
}); });
test('Return falsy if repository is not up to date', async (t) => { test("Return falsy if repository is not up to date", async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
t.true(await isBranchUpToDate(repositoryUrl, 'master', {cwd})); t.true(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
const temporaryRepo = await gitShallowClone(repositoryUrl); const temporaryRepo = await gitShallowClone(repositoryUrl);
await gitCommits(['Third'], {cwd: temporaryRepo}); await gitCommits(["Third"], { cwd: temporaryRepo });
await gitPush('origin', 'master', {cwd: temporaryRepo}); await gitPush("origin", "master", { cwd: temporaryRepo });
t.falsy(await isBranchUpToDate(repositoryUrl, 'master', {cwd})); t.falsy(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
}); });
test('Return falsy if detached head repository is not up to date', async (t) => { test("Return falsy if detached head repository is not up to date", async (t) => {
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
const [commit] = await gitCommits(['First'], {cwd}); const [commit] = await gitCommits(["First"], { cwd });
await gitCommits(['Second'], {cwd}); await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
cwd = await gitDetachedHead(repositoryUrl, commit.hash); cwd = await gitDetachedHead(repositoryUrl, commit.hash);
await fetch(repositoryUrl, 'master', 'master', {cwd}); await fetch(repositoryUrl, "master", "master", { cwd });
t.falsy(await isBranchUpToDate(repositoryUrl, 'master', {cwd})); t.falsy(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
}); });
test('Get a commit note', async (t) => { test("Get a commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
await gitAddNote(JSON.stringify({note: 'note'}), commits[0].hash, {cwd}); await gitAddNote(JSON.stringify({ note: "note" }), commits[0].hash, { cwd });
t.deepEqual(await getNote(commits[0].hash, {cwd}), {note: 'note'}); t.deepEqual(await getNote(commits[0].hash, { cwd }), { note: "note" });
}); });
test('Return empty object if there is no commit note', async (t) => { test("Return empty object if there is no commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
t.deepEqual(await getNote(commits[0].hash, {cwd}), {}); t.deepEqual(await getNote(commits[0].hash, { cwd }), {});
}); });
test('Throw error if a commit note is invalid', async (t) => { test("Throw error if a commit note is invalid", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
await gitAddNote('non-json note', commits[0].hash, {cwd}); await gitAddNote("non-json note", commits[0].hash, { cwd });
await t.throwsAsync(getNote(commits[0].hash, {cwd})); await t.throwsAsync(getNote(commits[0].hash, { cwd }));
}); });
test('Add a commit note', async (t) => { test("Add a commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
await addNote({note: 'note'}, commits[0].hash, {cwd}); await addNote({ note: "note" }, commits[0].hash, { cwd });
t.is(await gitGetNote(commits[0].hash, {cwd}), '{"note":"note"}'); t.is(await gitGetNote(commits[0].hash, { cwd }), '{"note":"note"}');
}); });
test('Overwrite a commit note', async (t) => { test("Overwrite a commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First'], {cwd}); const commits = await gitCommits(["First"], { cwd });
await addNote({note: 'note'}, commits[0].hash, {cwd}); await addNote({ note: "note" }, commits[0].hash, { cwd });
await addNote({note: 'note2'}, commits[0].hash, {cwd}); await addNote({ note: "note2" }, commits[0].hash, { cwd });
t.is(await gitGetNote(commits[0].hash, {cwd}), '{"note":"note2"}'); t.is(await gitGetNote(commits[0].hash, { cwd }), '{"note":"note2"}');
}); });
test('Unshallow and fetch repository with notes', async (t) => { test("Unshallow and fetch repository with notes", async (t) => {
// Create a git repository, set the current working directory at the root of the repo // Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
// Add commits to the master branch // Add commits to the master branch
const commits = await gitCommits(['First', 'Second'], {cwd}); const commits = await gitCommits(["First", "Second"], { cwd });
await gitAddNote(JSON.stringify({note: 'note'}), commits[0].hash, {cwd}); await gitAddNote(JSON.stringify({ note: "note" }), commits[0].hash, { cwd });
// Create a shallow clone with only 1 commit // Create a shallow clone with only 1 commit
cwd = await gitShallowClone(repositoryUrl); cwd = await gitShallowClone(repositoryUrl);
// Verify the shallow clone doesn't contain the note // Verify the shallow clone doesn't contain the note
await t.throwsAsync(gitGetNote(commits[0].hash, {cwd})); await t.throwsAsync(gitGetNote(commits[0].hash, { cwd }));
await fetch(repositoryUrl, 'master', 'master', {cwd}); await fetch(repositoryUrl, "master", "master", { cwd });
await fetchNotes(repositoryUrl, {cwd}); await fetchNotes(repositoryUrl, { cwd });
// Verify the shallow clone contains the note // Verify the shallow clone contains the note
t.is(await gitGetNote(commits[0].hash, {cwd}), '{"note":"note"}'); t.is(await gitGetNote(commits[0].hash, { cwd }), '{"note":"note"}');
}); });
test('Fetch all notes on a detached head repository', async (t) => { test("Fetch all notes on a detached head repository", async (t) => {
let {cwd, repositoryUrl} = await gitRepo(); let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(['First'], {cwd}); await gitCommits(["First"], { cwd });
const [commit] = await gitCommits(['Second'], {cwd}); const [commit] = await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, 'master', {cwd}); await gitPush(repositoryUrl, "master", { cwd });
await gitAddNote(JSON.stringify({note: 'note'}), commit.hash, {cwd}); await gitAddNote(JSON.stringify({ note: "note" }), commit.hash, { cwd });
cwd = await gitDetachedHead(repositoryUrl, commit.hash); cwd = await gitDetachedHead(repositoryUrl, commit.hash);
await fetch(repositoryUrl, 'master', 'master', {cwd}); await fetch(repositoryUrl, "master", "master", { cwd });
await fetchNotes(repositoryUrl, {cwd}); await fetchNotes(repositoryUrl, { cwd });
t.is(await gitGetNote(commit.hash, {cwd}), '{"note":"note"}'); t.is(await gitGetNote(commit.hash, { cwd }), '{"note":"note"}');
}); });
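
These git tests double as a reference for the helper signatures in the library module they cover. The following is a minimal, hedged sketch of how a few of those helpers compose, using only the call shapes exercised above; the `../lib/git.js` import path, the remote URL, and the note payload are illustrative assumptions, not the project's actual release flow.

```js
// Hedged sketch only: chains getGitHead, tag, push and addNote exactly as the
// tests above call them. Import path, remote URL and note payload are assumptions.
import { getGitHead, tag, push, addNote } from "../lib/git.js";

const cwd = process.cwd();
const repositoryUrl = "git@github.com:owner/package.git"; // hypothetical remote

const hash = await getGitHead({ cwd }); // sha of the current HEAD commit
await tag("v1.0.0", hash, { cwd }); // create the tag locally on that commit
await push(repositoryUrl, { cwd }); // push the branch and its tags to the remote
await addNote({ note: "example" }, hash, { cwd }); // attach a JSON note to the commit
```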


@@ -1,18 +1,18 @@
import test from 'ava'; import test from "ava";
import {repeat} from 'lodash-es'; import { repeat } from "lodash-es";
import hideSensitive from '../lib/hide-sensitive.js'; import hideSensitive from "../lib/hide-sensitive.js";
import {SECRET_MIN_SIZE, SECRET_REPLACEMENT} from '../lib/definitions/constants.js'; import { SECRET_MIN_SIZE, SECRET_REPLACEMENT } from "../lib/definitions/constants.js";
test('Replace multiple sensitive environment variable values', (t) => { test("Replace multiple sensitive environment variable values", (t) => {
const env = {SOME_PASSWORD: 'password', SOME_TOKEN: 'secret'}; const env = { SOME_PASSWORD: "password", SOME_TOKEN: "secret" };
t.is( t.is(
hideSensitive(env)(`https://user:${env.SOME_PASSWORD}@host.com?token=${env.SOME_TOKEN}`), hideSensitive(env)(`https://user:${env.SOME_PASSWORD}@host.com?token=${env.SOME_TOKEN}`),
`https://user:${SECRET_REPLACEMENT}@host.com?token=${SECRET_REPLACEMENT}` `https://user:${SECRET_REPLACEMENT}@host.com?token=${SECRET_REPLACEMENT}`
); );
}); });
test('Replace multiple occurrences of sensitive environment variable values', (t) => { test("Replace multiple occurrences of sensitive environment variable values", (t) => {
const env = {secretKey: 'secret'}; const env = { secretKey: "secret" };
t.is( t.is(
hideSensitive(env)(`https://user:${env.secretKey}@host.com?token=${env.secretKey}`), hideSensitive(env)(`https://user:${env.secretKey}@host.com?token=${env.secretKey}`),
`https://user:${SECRET_REPLACEMENT}@host.com?token=${SECRET_REPLACEMENT}` `https://user:${SECRET_REPLACEMENT}@host.com?token=${SECRET_REPLACEMENT}`
@@ -20,28 +20,28 @@ test('Replace multiple occurrences of sensitive environment variable values', (t)
}); });
test('Replace sensitive environment variable matching specific regex for "private"', (t) => { test('Replace sensitive environment variable matching specific regex for "private"', (t) => {
const env = {privateKey: 'secret', GOPRIVATE: 'host.com'}; const env = { privateKey: "secret", GOPRIVATE: "host.com" };
t.is(hideSensitive(env)(`https://host.com?token=${env.privateKey}`), `https://host.com?token=${SECRET_REPLACEMENT}`); t.is(hideSensitive(env)(`https://host.com?token=${env.privateKey}`), `https://host.com?token=${SECRET_REPLACEMENT}`);
}); });
test('Replace url-encoded environment variable', (t) => { test("Replace url-encoded environment variable", (t) => {
const env = {privateKey: 'secret '}; const env = { privateKey: "secret " };
t.is( t.is(
hideSensitive(env)(`https://host.com?token=${encodeURI(env.privateKey)}`), hideSensitive(env)(`https://host.com?token=${encodeURI(env.privateKey)}`),
`https://host.com?token=${SECRET_REPLACEMENT}` `https://host.com?token=${SECRET_REPLACEMENT}`
); );
}); });
test('Escape regexp special characters', (t) => { test("Escape regexp special characters", (t) => {
const env = {SOME_CREDENTIALS: 'p$^{.+}\\w[a-z]o.*rd'}; const env = { SOME_CREDENTIALS: "p$^{.+}\\w[a-z]o.*rd" };
t.is( t.is(
hideSensitive(env)(`https://user:${env.SOME_CREDENTIALS}@host.com`), hideSensitive(env)(`https://user:${env.SOME_CREDENTIALS}@host.com`),
`https://user:${SECRET_REPLACEMENT}@host.com` `https://user:${SECRET_REPLACEMENT}@host.com`
); );
}); });
test('Escape regexp special characters in url-encoded environment variable', (t) => { test("Escape regexp special characters in url-encoded environment variable", (t) => {
const env = {SOME_PASSWORD: 'secret password p$^{.+}\\w[a-z]o.*rd)('}; const env = { SOME_PASSWORD: "secret password p$^{.+}\\w[a-z]o.*rd)(" };
t.is( t.is(
hideSensitive(env)(`https://user:${encodeURI(env.SOME_PASSWORD)}@host.com`), hideSensitive(env)(`https://user:${encodeURI(env.SOME_PASSWORD)}@host.com`),
`https://user:${SECRET_REPLACEMENT}@host.com` `https://user:${SECRET_REPLACEMENT}@host.com`
@@ -52,31 +52,31 @@ test('Accept "undefined" input', (t) => {
t.is(hideSensitive({})(), undefined); t.is(hideSensitive({})(), undefined);
}); });
test('Return same string if no environment variable has to be replaced', (t) => { test("Return same string if no environment variable has to be replaced", (t) => {
t.is(hideSensitive({})('test'), 'test'); t.is(hideSensitive({})("test"), "test");
}); });
test('Exclude empty environment variables from the regexp', (t) => { test("Exclude empty environment variables from the regexp", (t) => {
const env = {SOME_PASSWORD: 'password', SOME_TOKEN: ''}; const env = { SOME_PASSWORD: "password", SOME_TOKEN: "" };
t.is( t.is(
hideSensitive(env)(`https://user:${env.SOME_PASSWORD}@host.com?token=`), hideSensitive(env)(`https://user:${env.SOME_PASSWORD}@host.com?token=`),
`https://user:${SECRET_REPLACEMENT}@host.com?token=` `https://user:${SECRET_REPLACEMENT}@host.com?token=`
); );
}); });
test('Exclude empty environment variables from the regexp if there are only empty ones', (t) => { test("Exclude empty environment variables from the regexp if there are only empty ones", (t) => {
t.is(hideSensitive({SOME_PASSWORD: '', SOME_TOKEN: ' \n '})(`https://host.com?token=`), 'https://host.com?token='); t.is(hideSensitive({ SOME_PASSWORD: "", SOME_TOKEN: " \n " })(`https://host.com?token=`), "https://host.com?token=");
}); });
test('Exclude nonsensitive GOPRIVATE environment variable for Golang projects from the regexp', (t) => { test("Exclude nonsensitive GOPRIVATE environment variable for Golang projects from the regexp", (t) => {
const env = {GOPRIVATE: 'host.com'}; const env = { GOPRIVATE: "host.com" };
t.is(hideSensitive(env)(`https://host.com?token=`), 'https://host.com?token='); t.is(hideSensitive(env)(`https://host.com?token=`), "https://host.com?token=");
}); });
test('Exclude environment variables with value shorter than SECRET_MIN_SIZE from the regexp', (t) => { test("Exclude environment variables with value shorter than SECRET_MIN_SIZE from the regexp", (t) => {
const SHORT_TOKEN = repeat('a', SECRET_MIN_SIZE - 1); const SHORT_TOKEN = repeat("a", SECRET_MIN_SIZE - 1);
const LONG_TOKEN = repeat('b', SECRET_MIN_SIZE); const LONG_TOKEN = repeat("b", SECRET_MIN_SIZE);
const env = {SHORT_TOKEN, LONG_TOKEN}; const env = { SHORT_TOKEN, LONG_TOKEN };
t.is( t.is(
hideSensitive(env)(`https://user:${SHORT_TOKEN}@host.com?token=${LONG_TOKEN}`), hideSensitive(env)(`https://user:${SHORT_TOKEN}@host.com?token=${LONG_TOKEN}`),
`https://user:${SHORT_TOKEN}@host.com?token=${SECRET_REPLACEMENT}` `https://user:${SHORT_TOKEN}@host.com?token=${SECRET_REPLACEMENT}`

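The hide-sensitive assertions above describe a small factory: `hideSensitive(env)` returns a function that masks any sufficiently long environment-variable value (including its URL-encoded form) with `SECRET_REPLACEMENT`. A hedged usage sketch, reusing the import path shown in the tests:

```js
// Hedged sketch based only on the behaviour asserted above: the returned
// function replaces sensitive env values with the SECRET_REPLACEMENT constant.
import hideSensitive from "../lib/hide-sensitive.js";

const env = { NPM_TOKEN: "an-example-secret-token-value" };
const mask = hideSensitive(env);

const output = mask(`registry call with token ${env.NPM_TOKEN}`);
console.log(output.includes(env.NPM_TOKEN)); // false: the token has been masked
```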
File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,5 +1,5 @@
import test from 'ava'; import test from "ava";
import AggregateError from 'aggregate-error'; import AggregateError from "aggregate-error";
import { import {
extractErrors, extractErrors,
getEarliestVersion, getEarliestVersion,
@@ -14,173 +14,176 @@
isSameChannel, isSameChannel,
lowest, lowest,
makeTag, makeTag,
tagsToVersions tagsToVersions,
} from '../lib/utils.js'; } from "../lib/utils.js";
test('extractErrors', (t) => { test("extractErrors", (t) => {
const errors = [new Error('Error 1'), new Error('Error 2')]; const errors = [new Error("Error 1"), new Error("Error 2")];
t.deepEqual(extractErrors(new AggregateError(errors)), errors); t.deepEqual(extractErrors(new AggregateError(errors)), errors);
t.deepEqual(extractErrors(errors[0]), [errors[0]]); t.deepEqual(extractErrors(errors[0]), [errors[0]]);
}); });
test('tagsToVersions', (t) => { test("tagsToVersions", (t) => {
t.deepEqual(tagsToVersions([{version: '1.0.0'}, {version: '1.1.0'}, {version: '1.2.0'}]), [ t.deepEqual(tagsToVersions([{ version: "1.0.0" }, { version: "1.1.0" }, { version: "1.2.0" }]), [
'1.0.0', "1.0.0",
'1.1.0', "1.1.0",
'1.2.0', "1.2.0",
]); ]);
}); });
test('isMajorRange', (t) => { test("isMajorRange", (t) => {
t.false(isMajorRange('1.1.x')); t.false(isMajorRange("1.1.x"));
t.false(isMajorRange('1.11.x')); t.false(isMajorRange("1.11.x"));
t.false(isMajorRange('11.1.x')); t.false(isMajorRange("11.1.x"));
t.false(isMajorRange('11.11.x')); t.false(isMajorRange("11.11.x"));
t.false(isMajorRange('1.1.X')); t.false(isMajorRange("1.1.X"));
t.false(isMajorRange('1.1.0')); t.false(isMajorRange("1.1.0"));
t.true(isMajorRange('1.x.x')); t.true(isMajorRange("1.x.x"));
t.true(isMajorRange('11.x.x')); t.true(isMajorRange("11.x.x"));
t.true(isMajorRange('1.X.X')); t.true(isMajorRange("1.X.X"));
t.true(isMajorRange('1.x')); t.true(isMajorRange("1.x"));
t.true(isMajorRange('11.x')); t.true(isMajorRange("11.x"));
t.true(isMajorRange('1.X')); t.true(isMajorRange("1.X"));
}); });
test('isMaintenanceRange', (t) => { test("isMaintenanceRange", (t) => {
t.true(isMaintenanceRange('1.1.x')); t.true(isMaintenanceRange("1.1.x"));
t.true(isMaintenanceRange('11.1.x')); t.true(isMaintenanceRange("11.1.x"));
t.true(isMaintenanceRange('11.11.x')); t.true(isMaintenanceRange("11.11.x"));
t.true(isMaintenanceRange('1.11.x')); t.true(isMaintenanceRange("1.11.x"));
t.true(isMaintenanceRange('1.x.x')); t.true(isMaintenanceRange("1.x.x"));
t.true(isMaintenanceRange('11.x.x')); t.true(isMaintenanceRange("11.x.x"));
t.true(isMaintenanceRange('1.x')); t.true(isMaintenanceRange("1.x"));
t.true(isMaintenanceRange('11.x')); t.true(isMaintenanceRange("11.x"));
t.true(isMaintenanceRange('1.1.X')); t.true(isMaintenanceRange("1.1.X"));
t.true(isMaintenanceRange('1.X.X')); t.true(isMaintenanceRange("1.X.X"));
t.true(isMaintenanceRange('1.X')); t.true(isMaintenanceRange("1.X"));
t.false(isMaintenanceRange('1.1.0')); t.false(isMaintenanceRange("1.1.0"));
t.false(isMaintenanceRange('11.1.0')); t.false(isMaintenanceRange("11.1.0"));
t.false(isMaintenanceRange('1.11.0')); t.false(isMaintenanceRange("1.11.0"));
t.false(isMaintenanceRange('11.11.0')); t.false(isMaintenanceRange("11.11.0"));
t.false(isMaintenanceRange('~1.0.0')); t.false(isMaintenanceRange("~1.0.0"));
t.false(isMaintenanceRange('^1.0.0')); t.false(isMaintenanceRange("^1.0.0"));
}); });
test('getUpperBound', (t) => { test("getUpperBound", (t) => {
t.is(getUpperBound('1.x.x'), '2.0.0'); t.is(getUpperBound("1.x.x"), "2.0.0");
t.is(getUpperBound('1.X.X'), '2.0.0'); t.is(getUpperBound("1.X.X"), "2.0.0");
t.is(getUpperBound('10.x.x'), '11.0.0'); t.is(getUpperBound("10.x.x"), "11.0.0");
t.is(getUpperBound('1.x'), '2.0.0'); t.is(getUpperBound("1.x"), "2.0.0");
t.is(getUpperBound('10.x'), '11.0.0'); t.is(getUpperBound("10.x"), "11.0.0");
t.is(getUpperBound('1.0.x'), '1.1.0'); t.is(getUpperBound("1.0.x"), "1.1.0");
t.is(getUpperBound('10.0.x'), '10.1.0'); t.is(getUpperBound("10.0.x"), "10.1.0");
t.is(getUpperBound('10.10.x'), '10.11.0'); t.is(getUpperBound("10.10.x"), "10.11.0");
t.is(getUpperBound('1.0.0'), '1.0.0'); t.is(getUpperBound("1.0.0"), "1.0.0");
t.is(getUpperBound('10.0.0'), '10.0.0'); t.is(getUpperBound("10.0.0"), "10.0.0");
t.is(getUpperBound('foo'), undefined); t.is(getUpperBound("foo"), undefined);
}); });
test('getLowerBound', (t) => { test("getLowerBound", (t) => {
t.is(getLowerBound('1.x.x'), '1.0.0'); t.is(getLowerBound("1.x.x"), "1.0.0");
t.is(getLowerBound('1.X.X'), '1.0.0'); t.is(getLowerBound("1.X.X"), "1.0.0");
t.is(getLowerBound('10.x.x'), '10.0.0'); t.is(getLowerBound("10.x.x"), "10.0.0");
t.is(getLowerBound('1.x'), '1.0.0'); t.is(getLowerBound("1.x"), "1.0.0");
t.is(getLowerBound('10.x'), '10.0.0'); t.is(getLowerBound("10.x"), "10.0.0");
t.is(getLowerBound('1.0.x'), '1.0.0'); t.is(getLowerBound("1.0.x"), "1.0.0");
t.is(getLowerBound('10.0.x'), '10.0.0'); t.is(getLowerBound("10.0.x"), "10.0.0");
t.is(getLowerBound('1.10.x'), '1.10.0'); t.is(getLowerBound("1.10.x"), "1.10.0");
t.is(getLowerBound('1.0.0'), '1.0.0'); t.is(getLowerBound("1.0.0"), "1.0.0");
t.is(getLowerBound('10.0.0'), '10.0.0'); t.is(getLowerBound("10.0.0"), "10.0.0");
t.is(getLowerBound('foo'), undefined); t.is(getLowerBound("foo"), undefined);
}); });
test('highest', (t) => { test("highest", (t) => {
t.is(highest('1.0.0', '2.0.0'), '2.0.0'); t.is(highest("1.0.0", "2.0.0"), "2.0.0");
t.is(highest('1.1.1', '1.1.0'), '1.1.1'); t.is(highest("1.1.1", "1.1.0"), "1.1.1");
t.is(highest(null, '1.0.0'), '1.0.0'); t.is(highest(null, "1.0.0"), "1.0.0");
t.is(highest('1.0.0'), '1.0.0'); t.is(highest("1.0.0"), "1.0.0");
t.is(highest(), undefined); t.is(highest(), undefined);
}); });
test('lowest', (t) => { test("lowest", (t) => {
t.is(lowest('1.0.0', '2.0.0'), '1.0.0'); t.is(lowest("1.0.0", "2.0.0"), "1.0.0");
t.is(lowest('1.1.1', '1.1.0'), '1.1.0'); t.is(lowest("1.1.1", "1.1.0"), "1.1.0");
t.is(lowest(null, '1.0.0'), '1.0.0'); t.is(lowest(null, "1.0.0"), "1.0.0");
t.is(lowest(), undefined); t.is(lowest(), undefined);
}); });
test.serial('getLatestVersion', (t) => { test.serial("getLatestVersion", (t) => {
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1']), '1.2.0'); t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"]), "1.2.0");
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined); t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1']), '1.2.0'); t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"]), "1.2.0");
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined); t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1'], {withPrerelease: true}), '1.2.3-alpha.3'); t.is(
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2'], {withPrerelease: true}), '1.2.3-alpha.3'); getLatestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"], { withPrerelease: true }),
"1.2.3-alpha.3"
);
t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"], { withPrerelease: true }), "1.2.3-alpha.3");
t.is(getLatestVersion([]), undefined); t.is(getLatestVersion([]), undefined);
}); });
test.serial('getEarliestVersion', (t) => { test.serial("getEarliestVersion", (t) => {
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.0', '1.0.1-alpha.1']), '1.0.0'); t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.0", "1.0.1-alpha.1"]), "1.0.0");
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined); t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.0', '1.0.1-alpha.1']), '1.0.0'); t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.0", "1.0.1-alpha.1"]), "1.0.0");
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined); t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is( t.is(
getEarliestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1'], {withPrerelease: true}), getEarliestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"], { withPrerelease: true }),
'1.0.0-alpha.1' "1.0.0-alpha.1"
); );
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2'], {withPrerelease: true}), '1.2.3-alpha.2'); t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"], { withPrerelease: true }), "1.2.3-alpha.2");
t.is(getEarliestVersion([]), undefined); t.is(getEarliestVersion([]), undefined);
}); });
test('getFirstVersion', (t) => { test("getFirstVersion", (t) => {
t.is(getFirstVersion(['1.2.0', '1.0.0', '1.3.0', '1.1.0', '1.4.0'], []), '1.0.0'); t.is(getFirstVersion(["1.2.0", "1.0.0", "1.3.0", "1.1.0", "1.4.0"], []), "1.0.0");
t.is( t.is(
getFirstVersion( getFirstVersion(
['1.2.0', '1.0.0', '1.3.0', '1.1.0', '1.4.0'], ["1.2.0", "1.0.0", "1.3.0", "1.1.0", "1.4.0"],
[ [
{name: 'master', tags: [{version: '1.0.0'}, {version: '1.1.0'}]}, { name: "master", tags: [{ version: "1.0.0" }, { version: "1.1.0" }] },
{name: 'next', tags: [{version: '1.0.0'}, {version: '1.1.0'}, {version: '1.2.0'}]}, { name: "next", tags: [{ version: "1.0.0" }, { version: "1.1.0" }, { version: "1.2.0" }] },
] ]
), ),
'1.3.0' "1.3.0"
); );
t.is( t.is(
getFirstVersion( getFirstVersion(
['1.2.0', '1.0.0', '1.1.0'], ["1.2.0", "1.0.0", "1.1.0"],
[ [
{name: 'master', tags: [{version: '1.0.0'}, {version: '1.1.0'}]}, { name: "master", tags: [{ version: "1.0.0" }, { version: "1.1.0" }] },
{name: 'next', tags: [{version: '1.0.0'}, {version: '1.1.0'}, {version: '1.2.0'}]}, { name: "next", tags: [{ version: "1.0.0" }, { version: "1.1.0" }, { version: "1.2.0" }] },
] ]
), ),
undefined undefined
); );
}); });
test('getRange', (t) => { test("getRange", (t) => {
t.is(getRange('1.0.0', '1.1.0'), '>=1.0.0 <1.1.0'); t.is(getRange("1.0.0", "1.1.0"), ">=1.0.0 <1.1.0");
t.is(getRange('1.0.0'), '>=1.0.0'); t.is(getRange("1.0.0"), ">=1.0.0");
}); });
test('makeTag', (t) => { test("makeTag", (t) => {
t.is(makeTag(`v\${version}`, '1.0.0'), 'v1.0.0'); t.is(makeTag(`v\${version}`, "1.0.0"), "v1.0.0");
}); });
test('isSameChannel', (t) => { test("isSameChannel", (t) => {
t.true(isSameChannel('next', 'next')); t.true(isSameChannel("next", "next"));
t.true(isSameChannel(null, undefined)); t.true(isSameChannel(null, undefined));
t.true(isSameChannel(false, undefined)); t.true(isSameChannel(false, undefined));
t.true(isSameChannel('', false)); t.true(isSameChannel("", false));
t.false(isSameChannel('next', false)); t.false(isSameChannel("next", false));
}); });
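
The utils assertions above also serve as a compact usage reference for `lib/utils.js`. A short sketch mirroring a few of the asserted results:

```js
// Minimal sketch: each expected value mirrors an assertion from the tests above.
import { tagsToVersions, highest, getRange, makeTag } from "../lib/utils.js";

const versions = tagsToVersions([{ version: "1.0.0" }, { version: "1.1.0" }]); // ["1.0.0", "1.1.0"]
console.log(highest(...versions)); // "1.1.0"
console.log(getRange("1.0.0", "1.1.0")); // ">=1.0.0 <1.1.0"
console.log(makeTag("v${version}", "1.1.0")); // "v1.1.0"
```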


@@ -1,117 +1,117 @@
import test from 'ava'; import test from "ava";
import {temporaryDirectory} from 'tempy'; import { temporaryDirectory } from "tempy";
import verify from '../lib/verify.js'; import verify from "../lib/verify.js";
import {gitRepo} from './helpers/git-utils.js'; import { gitRepo } from "./helpers/git-utils.js";
test('Throw an AggregateError', async (t) => { test("Throw an AggregateError", async (t) => {
const {cwd} = await gitRepo(); const { cwd } = await gitRepo();
const options = {branches: [{name: 'master'}, {name: ''}]}; const options = { branches: [{ name: "master" }, { name: "" }] };
const errors = [...(await t.throwsAsync(verify({cwd, options}))).errors]; const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError'); t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, 'ENOREPOURL'); t.is(errors[0].code, "ENOREPOURL");
t.truthy(errors[0].message); t.truthy(errors[0].message);
t.truthy(errors[0].details); t.truthy(errors[0].details);
t.is(errors[1].name, 'SemanticReleaseError'); t.is(errors[1].name, "SemanticReleaseError");
t.is(errors[1].code, 'EINVALIDTAGFORMAT'); t.is(errors[1].code, "EINVALIDTAGFORMAT");
t.truthy(errors[1].message); t.truthy(errors[1].message);
t.truthy(errors[1].details); t.truthy(errors[1].details);
t.is(errors[2].name, 'SemanticReleaseError'); t.is(errors[2].name, "SemanticReleaseError");
t.is(errors[2].code, 'ETAGNOVERSION'); t.is(errors[2].code, "ETAGNOVERSION");
t.truthy(errors[2].message); t.truthy(errors[2].message);
t.truthy(errors[2].details); t.truthy(errors[2].details);
t.is(errors[3].name, 'SemanticReleaseError'); t.is(errors[3].name, "SemanticReleaseError");
t.is(errors[3].code, 'EINVALIDBRANCH'); t.is(errors[3].code, "EINVALIDBRANCH");
t.truthy(errors[3].message); t.truthy(errors[3].message);
t.truthy(errors[3].details); t.truthy(errors[3].details);
}); });
test('Throw a SemanticReleaseError if it does not run on a git repository', async (t) => { test("Throw a SemanticReleaseError if it does not run on a git repository", async (t) => {
const cwd = temporaryDirectory(); const cwd = temporaryDirectory();
const options = {branches: []}; const options = { branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options}))).errors]; const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError'); t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, 'ENOGITREPO'); t.is(errors[0].code, "ENOGITREPO");
t.truthy(errors[0].message); t.truthy(errors[0].message);
t.truthy(errors[0].details); t.truthy(errors[0].details);
}); });
test('Throw a SemanticReleaseError if the "tagFormat" is not valid', async (t) => { test('Throw a SemanticReleaseError if the "tagFormat" is not valid', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const options = {repositoryUrl, tagFormat: `?\${version}`, branches: []}; const options = { repositoryUrl, tagFormat: `?\${version}`, branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options}))).errors]; const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError'); t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, 'EINVALIDTAGFORMAT'); t.is(errors[0].code, "EINVALIDTAGFORMAT");
t.truthy(errors[0].message); t.truthy(errors[0].message);
t.truthy(errors[0].details); t.truthy(errors[0].details);
}); });
test('Throw a SemanticReleaseError if the "tagFormat" does not contain the "version" variable', async (t) => { test('Throw a SemanticReleaseError if the "tagFormat" does not contain the "version" variable', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const options = {repositoryUrl, tagFormat: 'test', branches: []}; const options = { repositoryUrl, tagFormat: "test", branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options}))).errors]; const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError'); t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, 'ETAGNOVERSION'); t.is(errors[0].code, "ETAGNOVERSION");
t.truthy(errors[0].message); t.truthy(errors[0].message);
t.truthy(errors[0].details); t.truthy(errors[0].details);
}); });
test('Throw a SemanticReleaseError if the "tagFormat" contains multiple "version" variables', async (t) => { test('Throw a SemanticReleaseError if the "tagFormat" contains multiple "version" variables', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const options = {repositoryUrl, tagFormat: `\${version}v\${version}`, branches: []}; const options = { repositoryUrl, tagFormat: `\${version}v\${version}`, branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options}))).errors]; const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError'); t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, 'ETAGNOVERSION'); t.is(errors[0].code, "ETAGNOVERSION");
t.truthy(errors[0].message); t.truthy(errors[0].message);
t.truthy(errors[0].details); t.truthy(errors[0].details);
}); });
test('Throw a SemanticReleaseError for each invalid branch', async (t) => { test("Throw a SemanticReleaseError for each invalid branch", async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const options = { const options = {
repositoryUrl, repositoryUrl,
tagFormat: `v\${version}`, tagFormat: `v\${version}`,
branches: [{name: ''}, {name: ' '}, {name: 1}, {}, {name: ''}, 1, 'master'], branches: [{ name: "" }, { name: " " }, { name: 1 }, {}, { name: "" }, 1, "master"],
}; };
const errors = [...(await t.throwsAsync(verify({cwd, options}))).errors]; const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError'); t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, 'EINVALIDBRANCH'); t.is(errors[0].code, "EINVALIDBRANCH");
t.truthy(errors[0].message); t.truthy(errors[0].message);
t.truthy(errors[0].details); t.truthy(errors[0].details);
t.is(errors[1].name, 'SemanticReleaseError'); t.is(errors[1].name, "SemanticReleaseError");
t.is(errors[1].code, 'EINVALIDBRANCH'); t.is(errors[1].code, "EINVALIDBRANCH");
t.truthy(errors[1].message); t.truthy(errors[1].message);
t.truthy(errors[1].details); t.truthy(errors[1].details);
t.is(errors[2].name, 'SemanticReleaseError'); t.is(errors[2].name, "SemanticReleaseError");
t.is(errors[2].code, 'EINVALIDBRANCH'); t.is(errors[2].code, "EINVALIDBRANCH");
t.truthy(errors[2].message); t.truthy(errors[2].message);
t.truthy(errors[2].details); t.truthy(errors[2].details);
t.is(errors[3].name, 'SemanticReleaseError'); t.is(errors[3].name, "SemanticReleaseError");
t.is(errors[3].code, 'EINVALIDBRANCH'); t.is(errors[3].code, "EINVALIDBRANCH");
t.truthy(errors[3].message); t.truthy(errors[3].message);
t.truthy(errors[3].details); t.truthy(errors[3].details);
t.is(errors[4].code, 'EINVALIDBRANCH'); t.is(errors[4].code, "EINVALIDBRANCH");
t.truthy(errors[4].message); t.truthy(errors[4].message);
t.truthy(errors[4].details); t.truthy(errors[4].details);
t.is(errors[5].code, 'EINVALIDBRANCH'); t.is(errors[5].code, "EINVALIDBRANCH");
t.truthy(errors[5].message); t.truthy(errors[5].message);
t.truthy(errors[5].details); t.truthy(errors[5].details);
}); });
test('Return "true" if all verification pass', async (t) => { test('Return "true" if all verification pass', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true); const { cwd, repositoryUrl } = await gitRepo(true);
const options = {repositoryUrl, tagFormat: `v\${version}`, branches: [{name: 'master'}]}; const options = { repositoryUrl, tagFormat: `v\${version}`, branches: [{ name: "master" }] };
await t.notThrowsAsync(verify({cwd, options})); await t.notThrowsAsync(verify({ cwd, options }));
}); });
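
Every reformatted (right-hand) line in this commit shows the same surface changes: double quotes for strings, spaces inside object braces, trailing commas at the end of multi-line import lists, and long assertions kept on a single line. The sketch below is a hypothetical Prettier configuration consistent with that output; it is inferred from the diff, not taken from the repository, and the project's actual settings may differ.

```js
// prettier.config.cjs (hypothetical): settings inferred from the reformatted
// output in this commit, not necessarily the project's real configuration.
module.exports = {
  printWidth: 120, // the long t.deepEqual(...) assertions above stay on one line
  singleQuote: false, // '...' becomes "..." except where that would force escaping
  bracketSpacing: true, // {cwd} becomes { cwd }
  trailingComma: "es5", // a guess, based on the trailing commas after the last import specifiers
  semi: true, // statements keep their semicolons
};
```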