Merge pull request #2607 from semantic-release/beta

Matt Travi 2023-01-06 13:54:06 -06:00 committed by GitHub
commit b9b5c7689f
GPG Key ID: 4AEE18F83AFDEB23
93 changed files with 5542 additions and 16468 deletions


@ -17,9 +17,8 @@ jobs:
strategy:
matrix:
node-version:
-        - 14.17
-        - 16.0.0
-        - 17
+        - 18.0.0
+        - 19
runs-on: ubuntu-latest
@ -32,7 +31,9 @@ jobs:
with:
node-version: ${{ matrix.node-version }}
cache: npm
-      - run: npm ci
+      - run: npm clean-install
+      - name: Ensure dependencies are compatible with the version of node
+        run: npx ls-engines
- run: npm run test:ci
# separate job to set as required in branch protection,
@ -44,9 +45,7 @@ jobs:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
-          node-version: 16
+          node-version: lts/*
          cache: npm
-      - run: npm ci
-      - name: Ensure dependencies are compatible with the version of node
-        run: npx ls-engines@0.4
+      - run: npm clean-install
- run: npm run lint


@ -8,19 +8,19 @@ In the interest of fostering an open and welcoming environment, we as contributo
Examples of behavior that contributes to creating a positive environment include:
- Using welcoming and inclusive language
- Being respectful of differing viewpoints and experiences
- Gracefully accepting constructive criticism
- Focusing on what is best for the community
- Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
- The use of sexualized language or imagery and unwelcome sexual attention or advances
- Trolling, insulting/derogatory comments, and personal or political attacks
- Public or private harassment
- Publishing others' private information, such as a physical or electronic address, without explicit permission
- Other conduct which could reasonably be considered inappropriate in a professional setting
## Our Responsibilities


@ -3,6 +3,7 @@
✨ Thanks for contributing to **semantic-release**! ✨
As a contributor, here are the guidelines we would like you to follow:
- [Code of conduct](#code-of-conduct)
- [How can I contribute?](#how-can-i-contribute)
- [Using the issue tracker](#using-the-issue-tracker)
@ -74,24 +75,31 @@ Here is a summary of the steps to follow:
1. [Set up the workspace](#set-up-the-workspace)
2. If you cloned a while ago, get the latest changes from upstream and update dependencies:
```bash
$ git checkout master
$ git pull upstream master
$ rm -rf node_modules
$ npm install
```
3. Create a new topic branch (off the main project development branch) to contain your feature, change, or fix:
```bash
$ git checkout -b <topic-branch-name>
```
4. Make your code changes, following the [Coding rules](#coding-rules)
5. Push your topic branch up to your fork:
```bash
$ git push origin <topic-branch-name>
```
6. [Open a Pull Request](https://help.github.com/articles/creating-a-pull-request/#creating-the-pull-request) with a clear title and description.
**Tips**:
- For ambitious tasks, open a Pull Request as soon as possible with the `[WIP]` prefix in the title, in order to get feedback and help from the community.
- [Allow semantic-release maintainers to make changes to your Pull Request branch](https://help.github.com/articles/allowing-changes-to-a-pull-request-branch-created-from-a-fork).
This way, we can rebase it and make some minor changes if necessary.
@ -102,6 +110,7 @@ $ git push origin <topic-branch-name>
### Source code
To ensure consistency and quality throughout the source code, all code modifications must have:
- No [linting](#lint) errors
- A [test](#tests) for every possible case introduced by your code change
- **100%** test coverage
@ -112,6 +121,7 @@ To ensure consistency and quality throughout the source code, all code modificat
### Documentation
To ensure consistency and quality, all documentation modifications must:
- Refer to brand in [bold](https://help.github.com/articles/basic-writing-and-formatting-syntax/#styling-text) with proper capitalization, i.e. **GitHub**, **semantic-release**, **npm**
- Prefer [tables](https://help.github.com/articles/organizing-information-with-tables) over [lists](https://help.github.com/articles/basic-writing-and-formatting-syntax/#lists) when listing key values, i.e. List of options with their description
- Use [links](https://help.github.com/articles/basic-writing-and-formatting-syntax/#links) when you are referring to:
@ -133,6 +143,7 @@ To ensure consistency and quality, all documentation modifications must:
#### Atomic commits
If possible, make [atomic commits](https://en.wikipedia.org/wiki/Atomic_commit), which means:
- a commit should contain exactly one self-contained functional change
- a functional change should be contained in exactly one commit
- a commit should not create an inconsistent state (such as test errors, linting errors, partial fix, feature without documentation, etc.)
@ -166,7 +177,7 @@ In the body it should say: `This reverts commit <hash>.`, where the hash is the
The type must be one of the following:
| Type | Description |
| ------------ | ----------------------------------------------------------------------------------------------------------- |
| **build** | Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm) |
| **ci** | Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs) |
| **docs** | Documentation only changes |
@ -186,10 +197,12 @@ The subject contains a succinct description of the change:
- no dot (.) at the end
#### Body
Just as in the **subject**, use the imperative, present tense: "change" not "changed" nor "changes".
The body should include the motivation for the change and contrast this with previous behavior.
#### Footer
The footer should contain any information about **Breaking Changes** and is also the place to reference GitHub issues that this commit **Closes**.
**Breaking Changes** should start with the words `BREAKING CHANGE:`, followed by a space or two newlines. The rest of the commit message is then used for this.
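For instance (re-using the pencil example from the release-type table in the README), a commit carrying a breaking change could look like:

```
perf(pencil): remove graphiteWidth option

BREAKING CHANGE: The graphiteWidth option has been removed.
The default graphite width of 10mm is always used for performance reasons.
```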
@ -240,6 +253,7 @@ Prettier formatting will be automatically verified and fixed by XO.
Before pushing your code changes, make sure there are no linting errors with `npm run lint`.
**Tips**:
- Most linting errors can be automatically fixed with `npm run lint -- --fix`.
- Install the [XO plugin](https://github.com/sindresorhus/xo#editor-plugins) for your editor to see linting errors directly in your editor and automatically fix them on save.
@ -256,6 +270,7 @@ $ npm run test
```
**Tips:** During development you can:
- run only a subset of test files with `ava <glob>`, for example `ava test/mytestfile.test.js`
- run in watch mode with `ava -w` to automatically run a test file when you modify it
- run only the test you are working on by adding [`.only` to the test definition](https://github.com/avajs/ava#running-specific-tests)


@ -57,7 +57,7 @@ Tools such as [commitizen](https://github.com/commitizen/cz-cli) or [commitlint]
The table below shows which commit message gets you which release type when `semantic-release` runs (using the default configuration):
| Commit message | Release type |
| ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | --------------------------------------------------------------------------------------------------------------- |
| `fix(pencil): stop graphite breaking when too much pressure applied` | ~~Patch~~ Fix Release |
| `feat(pencil): add 'graphiteWidth' option` | ~~Minor~~ Feature Release |
| `perf(pencil): remove graphiteWidth option`<br><br>`BREAKING CHANGE: The graphiteWidth option has been removed.`<br>`The default graphite width of 10mm is always used for performance reasons.` | ~~Major~~ Breaking Release <br /> (Note that the `BREAKING CHANGE: ` token must be in the footer of the commit) |
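As an illustrative sketch only, the default mapping in the table can be expressed as a small function; in reality semantic-release delegates this analysis to [@semantic-release/commit-analyzer](https://github.com/semantic-release/commit-analyzer):

```javascript
// Hypothetical sketch of the default commit-message → release-type mapping
// shown in the table above; not semantic-release's actual implementation.
function releaseType(message) {
  // A "BREAKING CHANGE:" token always triggers a major (breaking) release
  if (message.includes("BREAKING CHANGE:")) return "major";
  // Otherwise the type before the optional "(scope)" decides
  const type = message.split("(")[0].split(":")[0].trim();
  if (type === "fix") return "patch";
  if (type === "feat") return "minor";
  return null; // other types produce no release by default
}

console.log(releaseType("fix(pencil): stop graphite breaking when too much pressure applied")); // "patch"
console.log(releaseType("feat(pencil): add 'graphiteWidth' option")); // "minor"
```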
@ -145,7 +145,6 @@ Let people know that your package is published using **semantic-release** and wh
```md
[![semantic-release: angular](https://img.shields.io/badge/semantic--release-angular-e10079?logo=semantic-release)](https://github.com/semantic-release/semantic-release)
```
## Team


@ -1,6 +1,7 @@
# Summary
## Usage
- [Getting started](docs/usage/getting-started.md#getting-started)
- [Installation](docs/usage/installation.md#installation)
- [CI Configuration](docs/usage/ci-configuration.md#ci-configuration)
@ -10,10 +11,12 @@
- [Shareable configurations](docs/usage/shareable-configurations.md)
## Extending
- [Plugins](docs/extending/plugins-list.md)
- [Shareable configuration](docs/extending/shareable-configurations-list.md)
## Recipes
- [CI configurations](docs/recipes/ci-configurations/README.md)
- [CircleCI 2.0](docs/recipes/ci-configurations/circleci-workflows.md)
- [Travis CI](docs/recipes/ci-configurations/travis.md)
@ -28,11 +31,13 @@
- [Publishing pre-releases](docs/recipes/release-workflow/pre-releases.md)
## Developer guide
- [JavaScript API](docs/developer-guide/js-api.md)
- [Plugin development](docs/developer-guide/plugin.md)
- [Shareable configuration development](docs/developer-guide/shareable-configuration.md)
## Support
- [Resources](docs/support/resources.md)
- [Frequently Asked Questions](docs/support/FAQ.md)
- [Troubleshooting](docs/support/troubleshooting.md)


@ -1,30 +1,32 @@
#!/usr/bin/env node
-// Bad news: We have to write plain ES5 in this file
-// Good news: It's the only file of the entire project
-/* eslint-disable no-var */
-var semver = require('semver');
-var execa = require('execa');
-var findVersions = require('find-versions');
-var pkg = require('../package.json');
+import semver from "semver";
+import { execa } from "execa";
+import findVersions from "find-versions";
+import cli from "../cli.js";
+import { createRequire } from "node:module";

-var MIN_GIT_VERSION = '2.7.1';
+const require = createRequire(import.meta.url);
+const { engines } = require("../package.json");
+const { satisfies, lt } = semver;
+
+const MIN_GIT_VERSION = "2.7.1";

-if (!semver.satisfies(process.version, pkg.engines.node)) {
+if (!satisfies(process.version, engines.node)) {
  console.error(
-    `[semantic-release]: node version ${pkg.engines.node} is required. Found ${process.version}.
+    `[semantic-release]: node version ${engines.node} is required. Found ${process.version}.
See https://github.com/semantic-release/semantic-release/blob/master/docs/support/node-version.md for more details and solutions.`
  );
  process.exit(1);
}
-execa('git', ['--version'])
-  .then(({stdout}) => {
-    var gitVersion = findVersions(stdout)[0];
-    if (semver.lt(gitVersion, MIN_GIT_VERSION)) {
+execa("git", ["--version"])
+  .then(({ stdout }) => {
+    const gitVersion = findVersions(stdout)[0];
+    if (lt(gitVersion, MIN_GIT_VERSION)) {
      console.error(`[semantic-release]: Git version ${MIN_GIT_VERSION} is required. Found ${gitVersion}.`);
      process.exit(1);
    }
@ -35,8 +37,7 @@ execa('git', ['--version'])
process.exit(1);
});
-// Node 10+ from this point on
-require('../cli')()
+cli()
.then((exitCode) => {
process.exitCode = exitCode;
})

cli.js

@ -1,47 +1,47 @@
-const {argv, env, stderr} = require('process'); // eslint-disable-line node/prefer-global/process
-const util = require('util');
-const hideSensitive = require('./lib/hide-sensitive');
+import util from "node:util";
+import yargs from "yargs";
+import { hideBin } from "yargs/helpers";
+import hideSensitive from "./lib/hide-sensitive.js";
const stringList = {
  type: "string",
  array: true,
  coerce: (values) =>
    values.length === 1 && values[0].trim() === "false"
      ? []
      : values.reduce((values, value) => values.concat(value.split(",").map((value) => value.trim())), []),
};
-module.exports = async () => {
-  const cli = require('yargs')
-    .command('$0', 'Run automated package publishing', (yargs) => {
+export default async () => {
+  const cli = yargs(hideBin(process.argv))
+    .command("$0", "Run automated package publishing", (yargs) => {
yargs.demandCommand(0, 0).usage(`Run automated package publishing
Usage:
semantic-release [options] [plugins]`);
})
    .option("b", { alias: "branches", describe: "Git branches to release from", ...stringList, group: "Options" })
    .option("r", { alias: "repository-url", describe: "Git repository URL", type: "string", group: "Options" })
    .option("t", { alias: "tag-format", describe: "Git tag format", type: "string", group: "Options" })
    .option("p", { alias: "plugins", describe: "Plugins", ...stringList, group: "Options" })
    .option("e", { alias: "extends", describe: "Shareable configurations", ...stringList, group: "Options" })
    .option("ci", { describe: "Toggle CI verifications", type: "boolean", group: "Options" })
    .option("verify-conditions", { ...stringList, group: "Plugins" })
    .option("analyze-commits", { type: "string", group: "Plugins" })
    .option("verify-release", { ...stringList, group: "Plugins" })
    .option("generate-notes", { ...stringList, group: "Plugins" })
    .option("prepare", { ...stringList, group: "Plugins" })
    .option("publish", { ...stringList, group: "Plugins" })
    .option("success", { ...stringList, group: "Plugins" })
    .option("fail", { ...stringList, group: "Plugins" })
    .option("debug", { describe: "Output debugging information", type: "boolean", group: "Options" })
    .option("d", { alias: "dry-run", describe: "Skip publishing", type: "boolean", group: "Options" })
    .option("h", { alias: "help", group: "Options" })
    .option("v", { alias: "version", group: "Options" })
.strict(false)
.exitProcess(false);
try {
-    const {help, version, ...options} = cli.parse(argv.slice(2));
+    const { help, version, ...options } = cli.parse(process.argv.slice(2));
if (Boolean(help) || Boolean(version)) {
return 0;
@ -49,14 +49,14 @@ Usage:
if (options.debug) {
// Debug must be enabled before other requires in order to work
-      require('debug').enable('semantic-release:*');
+      (await import("debug")).default.enable("semantic-release:*");
}
-    await require('.')(options);
+    await (await import("./index.js")).default(options);
return 0;
} catch (error) {
-    if (error.name !== 'YError') {
-      stderr.write(hideSensitive(env)(util.inspect(error, {colors: true})));
+    if (error.name !== "YError") {
+      process.stderr.write(hideSensitive(process.env)(util.inspect(error, { colors: true })));
}
return 1;


@ -3,43 +3,48 @@
## Usage
```js
const semanticRelease = require("semantic-release");
const { WritableStreamBuffer } = require("stream-buffers");

const stdoutBuffer = new WritableStreamBuffer();
const stderrBuffer = new WritableStreamBuffer();
try {
const result = await semanticRelease(
{
// Core options
branches: [
"+([0-9])?(.{+([0-9]),x}).x",
"master",
"next",
"next-major",
{ name: "beta", prerelease: true },
{ name: "alpha", prerelease: true },
],
      repositoryUrl: "https://github.com/me/my-package.git",
      // Shareable config
      extends: "my-shareable-config",
      // Plugin options
      githubUrl: "https://my-ghe.com",
      githubApiPathPrefix: "/api-prefix",
    },
    {
// Run semantic-release from `/path/to/git/repo/root` without having to change local process `cwd` with `process.chdir()`
      cwd: "/path/to/git/repo/root",
      // Pass the variable `MY_ENV_VAR` to semantic-release without having to modify the local `process.env`
      env: { ...process.env, MY_ENV_VAR: "MY_ENV_VAR_VALUE" },
      // Store stdout and stderr to use later instead of writing to `process.stdout` and `process.stderr`
      stdout: stdoutBuffer,
      stderr: stderrBuffer,
    }
  );

  if (result) {
    const { lastRelease, commits, nextRelease, releases } = result;

    console.log(
      `Published ${nextRelease.type} release version ${nextRelease.version} containing ${commits.length} commits.`
    );
if (lastRelease.version) {
console.log(`The last release was "${lastRelease.version}".`);
@ -49,14 +54,14 @@ try {
console.log(`The release was published with plugin "${release.pluginName}".`);
}
  } else {
    console.log("No release published.");
  }

  // Get stdout and stderr content
  const logs = stdoutBuffer.getContentsAsString("utf8");
  const errors = stderrBuffer.getContentsAsString("utf8");
} catch (err) {
  console.error("The automated release failed with %O", err);
}
```
@ -131,7 +136,7 @@ Type: `Object`
Information related to the last release found:
| Name | Type | Description |
| ------- | -------- | ----------------------------------------------------------------------------------------------------------------------------------- |
| version | `String` | The version of the last release. |
| gitHead | `String` | The sha of the last commit being part of the last release. |
| gitTag | `String` | The [Git tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging) associated with the last release. |
@ -140,6 +145,7 @@ Information related to the last release found:
**Notes**: If no previous release is found, `lastRelease` will be an empty `Object`.
Example:
```js
{
gitHead: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
@ -157,7 +163,7 @@ The list of commits included in the new release.<br>
Each commit object has the following properties:
| Name | Type | Description |
| --------------- | -------- | ----------------------------------------------- |
| commit | `Object` | The commit abbreviated and full hash. |
| commit.long | `String` | The commit hash. |
| commit.short | `String` | The commit abbreviated hash. |
@ -179,6 +185,7 @@ Each commit object has the following properties:
| committerDate | `String` | The committer date. |
Example:
```js
[
{
@ -216,7 +223,7 @@ Type: `Object`
Information related to the newly published release:
| Name | Type | Description |
| ------- | -------- | ----------------------------------------------------------------------------------------------------------------------------- |
| type | `String` | The [semver](https://semver.org) type of the release (`patch`, `minor` or `major`). |
| version | `String` | The version of the new release. |
| gitHead | `String` | The sha of the last commit being part of the new release. |
@ -225,6 +232,7 @@ Information related to the newly published release:
| channel | `String` | The distribution channel on which the next release will be made available (`undefined` for the default distribution channel). |
Example:
```js
{
type: 'minor',
@ -244,7 +252,7 @@ The list of releases published or made available to a distribution channel.<br>
Each release object has the following properties:
| Name | Type | Description |
| ---------- | -------- | -------------------------------------------------------------------------------------------------------------- |
| name | `String` | **Optional.** The release name, only if set by the corresponding `publish` plugin. |
| url | `String` | **Optional.** The release URL, only if set by the corresponding `publish` plugin. |
| type | `String` | The [semver](https://semver.org) type of the release (`patch`, `minor` or `major`). |
@ -256,6 +264,7 @@ Each release object has the following properties:
| channel | `String` | The distribution channel on which the release is available (`undefined` for the default distribution channel). |
Example:
```js
[
{


@ -34,7 +34,7 @@ We recommend you setup a linting system to ensure good javascript practices are
In your `index.js` file, you can start by writing the following code
```javascript
const verify = require("./src/verify");
let verified;
@ -54,7 +54,7 @@ module.exports = { verifyConditions };
Then, in your `src` folder, create a file called `verify.js` and add the following
```javascript
const AggregateError = require("aggregate-error");
/**
* A method to verify that the user has given us a slack webhook url to post to
@ -81,8 +81,8 @@ Let's say we want to verify that an `option` is passed. An `option` is a configu
```js
{
  prepare: {
    path: "@semantic-release/my-special-plugin",
    message: "My cool release message"
  }
}
```
@ -101,95 +101,96 @@ if (message.length) {
### Common context keys
- `stdout`
- `stderr`
- `logger`
### Context object keys by lifecycle
#### verifyConditions
Initially the context object contains the following keys (`verifyConditions` lifecycle):
- `cwd`
- Current working directory
- `env`
- Environment variables
- `envCi`
- Information about CI environment
- Contains (at least) the following keys:
- `isCi`
- Boolean, true if the environment is a CI environment
- `commit`
- Commit hash
- `branch`
- Current branch
- `options`
- Options passed to `semantic-release` via CLI, configuration files etc.
- `branch`
- Information on the current branch
- Object keys:
- `channel`
- `tags`
- `type`
- `name`
- `range`
- `accept`
- `main`
- `branches`
- Information on branches
- List of branch objects (see above)
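As a rough sketch of how a plugin consumes these keys (the option name `MY_WEBHOOK_URL` and the checks are illustrative assumptions, not semantic-release's actual implementation), a `verifyConditions` plugin might look like:

```javascript
// Illustrative verifyConditions plugin reading the context keys listed above.
// MY_WEBHOOK_URL is a hypothetical environment variable for this sketch.
function verifyConditions(pluginConfig, context) {
  const { env, logger, branch, options } = context;

  if (!env.MY_WEBHOOK_URL) {
    throw new Error("A MY_WEBHOOK_URL environment variable is required.");
  }

  logger.log(`Verified conditions for branch "${branch.name}" (dry run: ${Boolean(options.dryRun)})`);
}

// Demo with a mocked context object
const logs = [];
verifyConditions(
  {},
  {
    env: { MY_WEBHOOK_URL: "https://example.com/hook" },
    logger: { log: (message) => logs.push(message) },
    branch: { name: "master" },
    options: { dryRun: false },
  }
);
console.log(logs[0]);
```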
#### analyzeCommits
Compared to `verifyConditions`, the `analyzeCommits` lifecycle context has these additional keys:
- `commits` (List)
- List of commits taken into account when determining the new version.
- Keys:
- `commit` (Object)
- Keys:
- `long` (String, Commit hash)
- `short` (String, Commit hash)
- `tree` (Object)
- Keys:
- `long` (String, Commit hash)
- `short` (String, Commit hash)
- `author` (Object)
- Keys:
- `name` (String)
- `email` (String)
- `date` (String, ISO 8601 timestamp)
- `committer` (Object)
- Keys:
- `name` (String)
- `email` (String)
- `date` (String, ISO 8601 timestamp)
- `subject` (String, Commit message subject)
- `body` (String, Commit message body)
- `hash` (String, Commit hash)
- `committerDate` (String, ISO 8601 timestamp)
- `message` (String)
- `gitTags` (String, List of git tags)
- `releases` (List)
- `lastRelease` (Object)
- Keys
- `version` (String)
- `gitTag` (String)
- `channels` (List)
- `gitHead` (String, Commit hash)
- `name` (String)
#### verifyRelease
Additional keys:
- `nextRelease` (Object)
- `type` (String)
- `channel` (String)
- `gitHead` (String, Git hash)
- `version` (String, version without `v`)
- `gitTag` (String, version with `v`)
- `name` (String)
#### generateNotes
@ -197,7 +198,7 @@ No new content in the context.
#### addChannel
_This is run only if there are releases that have been merged from a higher branch but not added on the channel of the current branch._
Context content is similar to lifecycle `verifyRelease`.
@ -215,8 +216,8 @@ Lifecycles `success` and `fail` are mutually exclusive, only one of them will be
Additional keys:
- `releases`
- Populated by `publish` lifecycle
#### fail
@ -224,7 +225,7 @@ Lifecycles `success` and `fail` are mutually exclusive, only one of them will be
Additional keys:
- `errors`
### Supporting Environment Variables
@ -237,7 +238,9 @@ if (env.GITHUB_TOKEN) {
//...
}
```
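Building on the `env.GITHUB_TOKEN` check above, a plugin might resolve a token with a fallback chain (the helper name and the `GH_TOKEN` fallback are illustrative assumptions for this sketch):

```javascript
// Illustrative helper: pull env from the plugin context and fall back
// between variable names. resolveToken is a hypothetical name.
function resolveToken({ env }) {
  return env.GITHUB_TOKEN || env.GH_TOKEN || null;
}

console.log(resolveToken({ env: { GH_TOKEN: "abc" } })); // "abc"
console.log(resolveToken({ env: {} })); // null
```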
## Logger
Use `context.logger` to provide debug logging in the plugin.
```js
@ -269,12 +272,13 @@ Knowledge that might be useful for plugin developers.
While it may be obvious that multiple `analyzeCommits` plugins (or any lifecycle plugins) can be defined, it is less evident that plugins executed AFTER the first one (for example, the default `commit-analyzer`) can change the result. This makes it possible to create more advanced rules, e.g. a default release type when none of the commits would otherwise trigger a release.
The commit must be a known release type, for example the commit-analyzer has the following default types:
- major
- premajor
- minor
- preminor
- patch
- prepatch
- prerelease
If an `analyzeCommits` plugin returns nothing, the earlier result is kept; if it returns a supported string value, that value overrides the previous result.
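A minimal sketch of that override behavior (illustrative only, not the actual plugin pipeline):

```javascript
// Each analyzeCommits plugin may return a release type or undefined;
// a defined result from a later plugin overrides the previous one.
function resolveReleaseType(results) {
  return results.reduce((previous, current) => (current === undefined ? previous : current), undefined);
}

console.log(resolveReleaseType([undefined, "patch", undefined])); // "patch"
console.log(resolveReleaseType(["patch", "minor"])); // "minor"
```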


@ -1,6 +1,7 @@
# Plugins list
## Official plugins
- [@semantic-release/commit-analyzer](https://github.com/semantic-release/commit-analyzer)
- **Note**: this is already part of semantic-release and does not have to be installed separately
- `analyzeCommits`: Determine the type of release by analyzing commits with [conventional-changelog](https://github.com/conventional-changelog/conventional-changelog)
@ -124,21 +125,21 @@
- `verifyConditions`: Verify the presence of a license file
- `prepare`: Update the license file based on its type
- [semantic-release-pypi](https://github.com/abichinger/semantic-release-pypi)
- `verifyConditions`: Verify the environment variable `PYPI_TOKEN` and installation of build tools
- `prepare`: Update the version in `setup.cfg` and create the distribution packages
- `publish`: Publish the python package to a repository (default: pypi)
- [semantic-release-helm](https://github.com/m1pl/semantic-release-helm)
- `verifyConditions`: Validate configuration and (if present) credentials
- `prepare`: Update version and appVersion in `Chart.yaml`
- `publish`: Publish the chart to a registry (if configured)
- [semantic-release-codeartifact](https://github.com/ryansonshine/semantic-release-codeartifact)
- `verifyConditions`: Validate configuration, get AWS CodeArtifact authentication and repository, validate `publishConfig` or `.npmrc` (if they exist), then pass the configuration to the associated plugins.
- [semantic-release-telegram](https://github.com/pustovitDmytro/semantic-release-telegram)
- `verifyConditions`: Validate configuration and verify `TELEGRAM_BOT_ID` and `TELEGRAM_BOT_TOKEN`
- `success`: Publish a message about the successful release to a telegram chat
- `fail`: publish a message about failure to a telegram chat
- [semantic-release-heroku](https://github.com/pustovitDmytro/semantic-release-heroku)
- `verifyConditions`: Validate configuration and verify `HEROKU_API_KEY`
- `prepare`: Update the package.json version and create release tarball
- `publish`: Publish version to heroku
- [semantic-release-mattermost](https://github.com/ttrobisch/semantic-release-mattermost)


@ -1,10 +1,12 @@
# Shareable configurations list
## Official configurations
- [@semantic-release/apm-config](https://github.com/semantic-release/apm-config) - semantic-release shareable configuration for releasing atom packages
- [@semantic-release/gitlab-config](https://github.com/semantic-release/gitlab-config) - semantic-release shareable configuration for GitLab
## Community configurations
- [@jedmao/semantic-release-npm-github-config](https://github.com/jedmao/semantic-release-npm-github-config)
- Provides an informative [Git](https://github.com/semantic-release/git) commit message for the release commit that does not trigger continuous integration and conforms to the [conventional commits specification](https://www.conventionalcommits.org/) (e.g., `chore(release): 1.2.3 [skip ci]\n\nnotes`).
- Creates a tarball that gets uploaded with each [GitHub release](https://github.com/semantic-release/github).
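A shareable configuration such as the ones above is consumed through **semantic-release**'s `extends` option; a minimal sketch of a consuming project's `.releaserc`:

```json
{
  "extends": "@jedmao/semantic-release-npm-github-config"
}
```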


@ -1,4 +1,5 @@
# CI configurations
- [CircleCI 2.0 workflows](circleci-workflows.md)
- [Travis CI](travis.md)
- [GitLab CI](gitlab-ci.md)


@ -35,7 +35,7 @@ jobs:
- name: Setup Node.js
uses: actions/setup-node@v2
with:
node-version: "lts/*"
- name: Install dependencies
run: npm ci
- name: Release
@ -64,9 +64,11 @@ If the risk is acceptable, some extra configuration is needed. The [actions/chec
## Trigger semantic-release on demand
### Using GUI:
You can use [Manual Triggers](https://github.blog/changelog/2020-07-06-github-actions-manual-triggers-with-workflow_dispatch/) for GitHub Actions.
### Using HTTP:
Use the [`repository_dispatch`](https://docs.github.com/en/actions/reference/events-that-trigger-workflows#repository_dispatch) event to control when a release is generated by making an HTTP request, e.g.:
```yaml
@ -85,7 +87,8 @@ $ curl -v -H "Accept: application/vnd.github.everest-preview+json" -H "Authoriza
```
### Using 3rd party apps:
If you'd like to use a GitHub app to manage this instead of creating a personal access token, you could consider using a project like:
- [Actions Panel](https://www.actionspanel.app/) - A declaratively configured way for triggering GitHub Actions
- [Action Button](https://github-action-button.web.app/#details) - A simple badge based mechanism for triggering GitHub Actions


@ -52,7 +52,6 @@ This example is a minimal configuration for **semantic-release** with a build ru
**Note**: The `semantic-release` execution command varies depending on whether you are using a [local](../../usage/installation.md#local-installation) or [global](../../usage/installation.md#global-installation) **semantic-release** installation.
```yaml
# The release pipeline will run only when a commit is pushed to the master branch
stages:


@ -1,2 +1,3 @@
# Git hosted services
- [Git authentication with SSH keys](git-auth-ssh-keys.md)


@ -21,6 +21,7 @@ This will generate a public key in `git_deploy_key.pub` and a private key in `gi
## Adding the SSH public key to the Git hosted account
Step by step instructions are provided for the following Git hosted services:
- [GitHub](#adding-the-ssh-public-key-to-github)
### Adding the SSH public key to GitHub
@ -44,6 +45,7 @@ See [Adding a new SSH key to your GitHub account](https://help.github.com/articl
In order to be available on the CI environment, the SSH private key must be encrypted, committed to the Git repository and decrypted by the CI service.
Step by step instructions are provided for the following environments:
- [Travis CI](#adding-the-ssh-private-key-to-travis-ci)
- [Circle CI](#adding-the-ssh-private-key-to-circle-ci)
@ -109,7 +111,7 @@ $ git push
### Adding the SSH private key to Circle CI
First we encrypt the `git_deploy_key` (private key) using a symmetric encryption (AES-256). Run the following `openssl` command and _make sure to note the output which we'll need later_:
```bash
$ openssl aes-256-cbc -e -p -in git_deploy_key -out git_deploy_key.enc -K `openssl rand -hex 32` -iv `openssl rand -hex 16`
@ -119,6 +121,7 @@ iv =VVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVV
```
Add the following [environment variables](https://circleci.com/docs/2.0/env-vars/#adding-environment-variables-in-the-app) to Circle CI:
- `SSL_PASSPHRASE` - the value set during the [SSH keys generation](#generating-the-ssh-keys) step.
- `REPO_ENC_KEY` - the `key` (KKK) value from the `openssl` step above.
- `REPO_ENC_IV` - the `iv` (VVV) value from the `openssl` step above.
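The decryption step on CI is the inverse of the `openssl` command above. The sketch below round-trips a dummy key, with throwaway values standing in for the stored `REPO_ENC_KEY` and `REPO_ENC_IV`:

```shell
# Throwaway key material for illustration; on CI, use the stored
# REPO_ENC_KEY and REPO_ENC_IV values instead.
echo "dummy-private-key" > git_deploy_key
KEY=$(openssl rand -hex 32)
IV=$(openssl rand -hex 16)
# Encrypt (as in the step above), then decrypt as the CI job would:
openssl aes-256-cbc -e -in git_deploy_key -out git_deploy_key.enc -K "$KEY" -iv "$IV"
openssl aes-256-cbc -d -in git_deploy_key.enc -out git_deploy_key.dec -K "$KEY" -iv "$IV"
```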


@ -1,4 +1,5 @@
# Release workflow
- [Publishing on distribution channels](distribution-channels.md)
- [Publishing maintenance releases](maintenance-releases.md)
- [Publishing pre-releases](pre-releases.md)


@ -3,6 +3,7 @@
This recipe will walk you through a simple example that uses distribution channels to make releases available only to a subset of users, in order to collect feedback before distributing the release to all users.
This example uses the **semantic-release** default configuration:
- [branches](../../usage/configuration.md#branches): `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name: 'beta', prerelease: true}, {name: 'alpha', prerelease: true}]`
- [plugins](../../usage/configuration.md#plugins): `['@semantic-release/commit-analyzer', '@semantic-release/release-notes-generator', '@semantic-release/npm', '@semantic-release/github']`


@ -3,6 +3,7 @@
This recipe will walk you through a simple example that uses Git branches and distribution channels to publish fixes and features for old versions of a package.
This example uses the **semantic-release** default configuration:
- [branches](../../usage/configuration.md#branches): `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name: 'beta', prerelease: true}, {name: 'alpha', prerelease: true}]`
- [plugins](../../usage/configuration.md#plugins): `['@semantic-release/commit-analyzer', '@semantic-release/release-notes-generator', '@semantic-release/npm', '@semantic-release/github']`


@ -3,6 +3,7 @@
This recipe will walk you through a simple example that uses pre-releases to publish beta versions while working on a future major release and then make only one release on the default distribution.
This example uses the **semantic-release** default configuration:
- [branches](../../usage/configuration.md#branches): `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name: 'beta', prerelease: true}, {name: 'alpha', prerelease: true}]`
- [plugins](../../usage/configuration.md#plugins): `['@semantic-release/commit-analyzer', '@semantic-release/release-notes-generator', '@semantic-release/npm', '@semantic-release/github']`


@ -4,7 +4,7 @@
[`@semantic-release/npm`](https://github.com/semantic-release/npm) takes care of updating the `package.json`s version before publishing to [npm](https://www.npmjs.com).
By default, only the published package will contain the version, which is the only place where it is _really_ required, but the updated `package.json` will not be pushed to the Git repository.
However, the [`@semantic-release/git`](https://github.com/semantic-release/git) plugin can be used to push the updated `package.json` as well as other files to the Git repository.
@ -17,19 +17,24 @@ The `package.json`s version will be updated by the `semantic-release` command
As the [`@semantic-release/npm`](https://github.com/semantic-release/npm) plugin uses the [npm CLI](https://docs.npmjs.com/cli/npm) to update the `package.json` version and publish the package, all [npm hook scripts](https://docs.npmjs.com/misc/scripts#description) will be executed.
You can run your build script in:
- the `prepublishOnly` or `prepack` hook so it will be executed during the `publish` step of `@semantic-release/npm`
- the `postversion` hook so it will be executed during the `prepare` step of `@semantic-release/npm`, which allows, for example, updating files before committing them with the [`@semantic-release/git`](https://github.com/semantic-release/git) plugin
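For instance, assuming the project exposes its build as a `build` script (the script names below are hypothetical), the `prepublishOnly` placement might look like:

```json
{
  "scripts": {
    "build": "node build.js",
    "prepublishOnly": "npm run build"
  }
}
```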
If using npm hook scripts is not possible, an alternative solution is to use the [`@semantic-release/exec`](https://github.com/semantic-release/exec) plugin to run your script in the `prepare` step:
```json
{
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/npm",
["@semantic-release/exec", {
"prepareCmd": "./my-build-script.sh ${nextRelease.version}",
}],
[
"@semantic-release/exec",
{
"prepareCmd": "./my-build-script.sh ${nextRelease.version}"
}
]
]
}
```
@ -43,6 +48,7 @@ Yes with the [dry-run options](../usage/configuration.md#dryrun) which prints to
Yes, **semantic-release** is a Node CLI application, but it can be used to publish any type of packages.
To publish a non-Node package (without a `package.json`) you would need to:
- Use a [global](../usage/installation.md#global-installation) **semantic-release** installation
- Set **semantic-release** [options](../usage/configuration.md#options) via [CLI arguments or `.rc` file](../usage/configuration.md#configuration)
- Make sure your CI job executing the `semantic-release` command has access to a version of Node that [meets our version requirement](./node-version.md) to execute the `semantic-release` command
@ -61,10 +67,13 @@ Here is a basic example to create [GitHub releases](https://help.github.com/arti
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/github",
["@semantic-release/exec", {
"prepareCmd" : "set-version ${nextRelease.version}",
"publishCmd" : "publish-package"
}]
[
"@semantic-release/exec",
{
"prepareCmd": "set-version ${nextRelease.version}",
"publishCmd": "publish-package"
}
]
]
}
```
@ -76,6 +85,7 @@ See the [package managers and languages recipes](../recipes/release-workflow/REA
## Can I use semantic-release with any CI service?
Yes, **semantic-release** can be used with any CI service, as long as it provides:
- A way to set [authentication](../usage/ci-configuration.md#authentication) via environment variables
- A way to guarantee that the `semantic-release` command is [executed only after all the tests of all the jobs in the CI build pass](../usage/ci-configuration.md#run-semantic-release-only-after-all-tests-succeeded)
@ -112,6 +122,7 @@ See the [`@semantic-release/npm`](https://github.com/semantic-release/npm#semant
## How can I revert a release?
If you have introduced a breaking bug in a release you have 2 options:
- If you have a fix immediately ready, commit and push it (or merge it via a pull request) to the release branch
- Otherwise, [revert the commit](https://git-scm.com/docs/git-revert) that introduced the bug and push the revert commit (or merge it via a pull request) to the release branch
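The second option can be sketched in a throwaway repository (hypothetical commits; on a real project you would revert the offending commit's SHA and push to the release branch):

```shell
git init -q demo && cd demo
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "feat: initial release"
echo "regression" > broken-file && git add broken-file
git -c user.name=demo -c user.email=demo@example.com commit -q -m "feat: commit that introduced the bug"
# Reverting creates a new commit that undoes the change, which
# semantic-release then analyzes like any other commit on the branch.
git -c user.name=demo -c user.email=demo@example.com revert --no-edit HEAD
```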
@ -157,7 +168,7 @@ See [Artifactory - npm Registry](https://www.jfrog.com/confluence/display/RTF/Np
## Can I manually trigger the release of a specific version?
You can trigger a release by pushing to your Git repository. You deliberately cannot trigger a _specific_ version release, because this is the whole point of semantic-release.
## Can I exclude commits from the analysis?
@ -168,7 +179,7 @@ Yes, every commits that contains `[skip release]` or `[release skip]` in their m
By default **semantic-release** uses the [Angular Commit Message Conventions](https://github.com/angular/angular.js/blob/master/DEVELOPERS.md#-git-commit-guidelines) and triggers releases based on the following rules:
| Commit | Release type |
| --------------------------- | -------------------------- |
| Commit with breaking change | ~~Major~~ Breaking release |
| Commit with type `feat` | ~~Minor~~ Feature release |
| Commit with type `fix` | Patch release |
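For illustration, commit messages following that convention map to release types like:

```
fix(pencil): stop graphite breaking when too much pressure applied
→ Patch release

feat(pencil): add 'graphiteWidth' option
→ Feature (minor) release

perf(pencil): remove graphiteWidth option

BREAKING CHANGE: The graphiteWidth option has been removed.
→ Breaking (major) release
```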
@ -178,9 +189,9 @@ See the [`@semantic-release/npm`](https://github.com/semantic-release/npm#npm-co
This is fully customizable with the [`@semantic-release/commit-analyzer`](https://github.com/semantic-release/commit-analyzer) plugin's [`release-rules` option](https://github.com/semantic-release/commit-analyzer#release-rules).
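A sketch of such a customization, adding patch releases for hypothetical `docs` and `refactor` rules:

```json
{
  "plugins": [
    [
      "@semantic-release/commit-analyzer",
      {
        "releaseRules": [
          { "type": "docs", "scope": "README", "release": "patch" },
          { "type": "refactor", "release": "patch" }
        ]
      }
    ]
  ]
}
```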
## Is it _really_ a good idea to release on every push?
It is indeed a great idea because it _forces_ you to follow best practices. If you don't feel comfortable releasing every feature or fix on your `master`, you might not treat your `master` branch as intended.
From [Understanding the GitHub Flow](https://guides.github.com/introduction/flow/index.html):


@ -1,6 +1,6 @@
# Node version requirement
**semantic-release** is written using the latest [ECMAScript 2017](https://www.ecma-international.org/publications/standards/Ecma-262.htm) features without transpilation, which **requires Node version 18.0.0 or higher**.
**semantic-release** is meant to be used in a CI environment as a development support tool, not as a production dependency.
Therefore, the only constraint is to run the `semantic-release` command in a CI environment that provides a version of Node meeting our version requirement.


@ -15,6 +15,7 @@ This is most likely related to a misconfiguration of the [npm registry authentic
It might also happen if the package name you are trying to publish already exists (in the case of npm, you may be trying to publish a new version of a package that is not yours, hence the permission error).
To verify if your package name is available you can use [npm-name-cli](https://github.com/sindresorhus/npm-name-cli):
```bash
$ npm install --global npm-name-cli
$ npm-name <package-name>


@ -4,6 +4,7 @@
The `semantic-release` command must be executed only after all the tests in the CI build pass. If the build runs multiple jobs (for example to test on multiple Operating Systems or Node versions) the CI has to be configured to guarantee that the `semantic-release` command is executed only after all jobs are successful.
Here are a few examples of the CI services that can be used to achieve this:
- [Travis Build Stages](https://docs.travis-ci.com/user/build-stages)
- [CircleCI Workflows](https://circleci.com/docs/2.0/workflows)
- [GitHub Actions](https://github.com/features/actions)
@ -22,7 +23,7 @@ See [CI configuration recipes](../recipes/ci-configurations#ci-configurations) f
**semantic-release** requires push access to the project Git repository in order to create [Git tags](https://git-scm.com/book/en/v2/Git-Basics-Tagging). The Git authentication can be set with one of the following environment variables:
| Variable | Description |
| ----------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `GH_TOKEN` or `GITHUB_TOKEN` | A GitHub [personal access token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line). |
| `GL_TOKEN` or `GITLAB_TOKEN` | A GitLab [personal access token](https://docs.gitlab.com/ce/user/profile/personal_access_tokens.html). |
| `BB_TOKEN` or `BITBUCKET_TOKEN` | A Bitbucket [personal access token](https://confluence.atlassian.com/bitbucketserver/personal-access-tokens-939515499.html). |
@ -36,7 +37,7 @@ Alternatively the Git authentication can be set up via [SSH keys](../recipes/git
Most **semantic-release** [plugins](plugins.md) require setting up authentication in order to publish to a package manager registry. The default [@semantic-release/npm](https://github.com/semantic-release/npm#environment-variables) and [@semantic-release/github](https://github.com/semantic-release/github#environment-variables) plugins require the following environment variables:
| Variable | Description |
| ----------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `NPM_TOKEN` | npm token created via [npm token create](https://docs.npmjs.com/getting-started/working_with_tokens#how-to-create-new-tokens).<br/>**Note**: Only the `auth-only` [level of npm two-factor authentication](https://docs.npmjs.com/getting-started/using-two-factor-authentication#levels-of-authentication) is supported. |
| `GH_TOKEN` | GitHub authentication token.<br/>**Note**: Only the [personal token](https://help.github.com/articles/creating-a-personal-access-token-for-the-command-line) authentication is supported. |
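In practice these variables are injected by the CI provider's secret store; a sketch with placeholder values:

```shell
# Placeholder values for illustration only; never commit real tokens.
export NPM_TOKEN="npm_xxxxxxxxxxxx"
export GH_TOKEN="ghp_xxxxxxxxxxxx"
```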


@ -1,6 +1,7 @@
# Configuration
**semantic-release** configuration consists of:
- Git repository ([URL](#repositoryurl) and options [release branches](#branches) and [tag format](#tagformat))
- Plugins [declaration](#plugins) and options
- Run mode ([debug](#debug), [dry run](#dryrun) and [local (no CI)](#ci))
@ -12,6 +13,7 @@ Additionally, metadata of Git tags generated by **semantic-release** can be cust
## Configuration file
**semantic-release**s [options](#options), mode and [plugins](plugins.md) can be set via either:
- A `.releaserc` file, written in YAML or JSON, with optional extensions: `.yaml`/`.yml`/`.json`/`.js`/`.cjs`
- A `release.config.(js|cjs)` file that exports an object
- A `release` key in the project's `package.json` file
@ -21,6 +23,7 @@ Alternatively, some options can be set via CLI arguments.
The following three examples are the same.
- Via `release` key in the project's `package.json` file:
```json
{
"release": {
@ -30,6 +33,7 @@ The following three examples are the same.
```
- Via `.releaserc` file:
```json
{
"branches": ["master", "next"]
@ -37,6 +41,7 @@ The following three examples are the same.
```
- Via CLI argument:
```bash
$ semantic-release --branches next
```
@ -65,6 +70,7 @@ Default: `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next', 'next-major', {name:
CLI arguments: `--branches`
The branches on which releases should happen. By default **semantic-release** will release:
- regular releases to the default distribution channel from the branch `master`
- regular releases to a distribution channel matching the branch name from any existing branch with a name matching a maintenance release range (`N.N.x` or `N.x.x` or `N.x` with `N` being a number)
- regular releases to the `next` distribution channel from the branch `next` if it exists
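The bullets above correspond roughly to the following explicit (abridged) `branches` value:

```json
{
  "branches": [
    "+([0-9])?(.{+([0-9]),x}).x",
    "master",
    "next",
    { "name": "beta", "prerelease": true }
  ]
}
```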
@ -143,7 +149,7 @@ Output debugging information. This can also be enabled by setting the `DEBUG` en
## Git environment variables
| Variable | Description | Default |
| --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------------------------------ |
| `GIT_AUTHOR_NAME` | The author name associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot. |
| `GIT_AUTHOR_EMAIL` | The author email associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot email address. |
| `GIT_COMMITTER_NAME` | The committer name associated with the [Git release tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging). See [Git environment variables](https://git-scm.com/book/en/v2/Git-Internals-Environment-Variables#_committing). | @semantic-release-bot. |


@ -5,7 +5,7 @@ Each [release step](../../README.md#release-steps) is implemented by configurabl
A plugin is a npm module that can implement one or more of the following steps:
| Step | Required | Description |
| ------------------ | -------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `verifyConditions` | No       | Responsible for verifying conditions necessary to proceed with the release: configuration is correct, authentication tokens are valid, etc...                                                                          |
| `analyzeCommits`   | Yes      | Responsible for determining the type of the next release (`major`, `minor` or `patch`). If multiple plugins with an `analyzeCommits` step are defined, the release type will be the highest one among the plugins' outputs. |
| `verifyRelease` | No | Responsible for verifying the parameters (version, type, dist-tag etc...) of the release that is about to be published. |
@ -25,6 +25,7 @@ Release steps will run in that order. At each step, **semantic-release** will ru
### Default plugins
These four plugins are already part of **semantic-release** and are listed in order of execution. They do not have to be installed separately:
```
"@semantic-release/commit-analyzer"
"@semantic-release/release-notes-generator"
@ -66,6 +67,7 @@ For each [release step](../../README.md#release-steps) the plugins that implemen
```
With this configuration **semantic-release** will:
- execute the `verifyConditions` implementation of `@semantic-release/npm` then `@semantic-release/git`
- execute the `analyzeCommits` implementation of `@semantic-release/commit-analyzer`
- execute the `generateNotes` implementation of `@semantic-release/release-notes-generator`
@ -85,9 +87,12 @@ Global plugin configuration can be defined at the root of the **semantic-release
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
["@semantic-release/github", {
[
"@semantic-release/github",
{
"assets": ["dist/**"]
}
],
"@semantic-release/git"
],
"preset": "angular"
@ -95,5 +100,6 @@ Global plugin configuration can be defined at the root of the **semantic-release
```
With this configuration:
- All plugins will receive the `preset` option, which will be used by both `@semantic-release/commit-analyzer` and `@semantic-release/release-notes-generator` (and ignored by `@semantic-release/github` and `@semantic-release/git`)
- The `@semantic-release/github` plugin will receive the `assets` options (`@semantic-release/git` will not receive it and therefore will use its default value for that option)


@ -1,6 +1,7 @@
# Workflow configuration
**semantic-release** allows managing and automating complex release workflows, based on multiple Git branches and distribution channels. This allows you to:
- Distribute certain releases to a particular group of users via distribution channels
- Manage the availability of releases on distribution channels via branches merge
- Maintain multiple lines of releases in parallel
@ -12,6 +13,7 @@ The release workflow is configured via the [branches option](./configuration.md#
Each branch can be defined either as a string, a [glob](https://github.com/micromatch/micromatch#matching-features) or an object. For string and glob definitions each [property](#branches-properties) will be defaulted.
A branch can be defined as one of three types:
- [release](#release-branches): to make releases on top of the last version released
- [maintenance](#maintenance-branches): to make releases on top of an old release
- [pre-release](#pre-release-branches): to make pre-releases
@ -21,7 +23,7 @@ The type of the branch is automatically determined based on naming convention an
## Branches properties
| Property | Branch type | Description | Default |
| ------------ | ----------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
| `name` | All | **Required.** The Git branch holding the commits to analyze and the code to release. See [name](#name). | - The value itself if defined as a `String` or the matching branches name if defined as a glob. |
| `channel` | All | The distribution channel on which to publish releases from this branch. Set to `false` to force the default distribution channel instead of using the default. See [channel](#channel). | `undefined` for the first release branch, the value of `name` for subsequent ones. |
| `range` | [maintenance](#maintenance-branches) only | **Required unless `name` is formatted like `N.N.x` or `N.x` (`N` is a number).** The range of [semantic versions](https://semver.org) to support on this branch. See [range](#range). | The value of `name`. |
@ -35,14 +37,15 @@ It can be defined as a [glob](https://github.com/micromatch/micromatch#matching-
If `name` doesn't match any branch existing in the repository, the definition will be ignored. For example, the default configuration includes the definitions `next` and `next-major`, which become active only when the branches `next` and/or `next-major` are created in the repository. This allows you to define your workflow once, with all potential branches you might use, and have the effective configuration evolve as you create new branches.
For example the configuration `['+([0-9])?(.{+([0-9]),x}).x', 'master', 'next']` will be expanded as:
```js
{
branches: [
{ name: "1.x", range: "1.x", channel: "1.x" }, // Only after the `1.x` is created in the repo
{ name: "2.x", range: "2.x", channel: "2.x" }, // Only after the `2.x` is created in the repo
{ name: "master" },
{ name: "next", channel: "next" }, // Only after the `next` is created in the repo
];
}
```
@ -54,12 +57,13 @@ If the `channel` property is set to `false` the default channel will be used.
The value of `channel`, if defined as a string, is generated with [Lodash template](https://lodash.com/docs#template) with the variable `name` available.
For example the configuration `['master', {name: 'next', channel: 'channel-${name}'}]` will be expanded as:
```js
{
branches: [
{ name: "master" }, // `channel` is undefined so the default distribution channel will be used
{ name: "next", channel: "channel-next" }, // `channel` is built with the template `channel-${name}`
];
}
```
@ -68,13 +72,14 @@ For example the configuration `['master', {name: 'next', channel: 'channel-${nam
A `range` only applies to maintenance branches, is required and must be formatted like `N.N.x` or `N.x` (`N` is a number). In case the `name` is formatted as a range (for example `1.x` or `1.5.x`) the branch will be considered a maintenance branch and the `name` value will be used for the `range`.
For example the configuration `['1.1.x', '1.2.x', 'master']` will be expanded as:
```js
{
branches: [
{ name: "1.1.x", range: "1.1.x", channel: "1.1.x" },
{ name: "1.2.x", range: "1.2.x", channel: "1.2.x" },
{ name: "master" },
];
}
```
@ -86,13 +91,14 @@ If the `prerelease` property is set to `true` the `name` value will be used.
The value of `prerelease`, if defined as a string, is generated with [Lodash template](https://lodash.com/docs#template) with the variable `name` available.
For example the configuration `['master', {name: 'pre/rc', prerelease: '${name.replace(/^pre\\//g, "")}'}, {name: 'beta', prerelease: true}]` will be expanded as:
```js
{
branches: [
{ name: "master" },
{ name: "pre/rc", channel: "pre/rc", prerelease: "rc" }, // `prerelease` is built with the template `${name.replace(/^pre\\//g, "")}`
{ name: "beta", channel: "beta", prerelease: true }, // `prerelease` is set to `beta` as it is the value of `name`
];
}
```
@ -113,10 +119,12 @@ See [publishing on distribution channels recipe](../recipes/release-workflow/dis
#### Pushing to a release branch
With the configuration `"branches": ["master", "next"]`, if the last release published from `master` is `1.0.0` and the last one from `next` is `2.0.0` then:
- Only versions in range `1.x.x` can be published from `master`, so only `fix` and `feat` commits can be pushed to `master`
- Once `next` gets merged into `master`, the release `2.0.0` will be made available on the channel associated with `master` and both `master` and `next` will accept any commit type
This verification prevents scenarios such as:
1. Create a `feat` commit on `next` which triggers the release of version `1.0.0` on the `next` channel
2. Merge `next` into `master` which adds `1.0.0` on the default channel
3. Create a `feat` commit on `next` which triggers the release of version `1.1.0` on the `next` channel
@ -147,6 +155,7 @@ See [publishing maintenance releases recipe](../recipes/release-workflow/mainten
#### Pushing to a maintenance branch
With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last release published from `master` is `1.5.0` then:
- Only versions in range `>=1.0.0 <1.1.0` can be published from `1.0.x`, so only `fix` commits can be pushed to `1.0.x`
- Only versions in range `>=1.1.0 <1.5.0` can be published from `1.x`, so only `fix` and `feat` commits can be pushed to `1.x` as long as the resulting release is lower than `1.5.0`
- Once `2.0.0` is released from `master`, versions in range `>=1.1.0 <2.0.0` can be published from `1.x`, so any number of `fix` and `feat` commits can be pushed to `1.x`
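The range implied by a maintenance branch name can be sketched with a hypothetical helper. This is an illustration only; semantic-release's real logic also caps the range at versions already released on other branches (for example, `1.x` starts at `>=1.1.0` while `1.0.x` exists):

```javascript
// Hypothetical helper deriving the publishable range implied by a
// maintenance branch name such as "1.0.x" or "1.x" (simplified).
function maintenanceRange(branchName) {
  const [major, minor] = branchName.split(".");
  return minor === "x"
    ? `>=${major}.0.0 <${Number(major) + 1}.0.0` // "1.x" covers the whole major
    : `>=${major}.${minor}.0 <${major}.${Number(minor) + 1}.0`; // "1.0.x" covers one minor
}

console.log(maintenanceRange("1.0.x")); // ">=1.0.0 <1.1.0"
console.log(maintenanceRange("1.x")); // ">=1.0.0 <2.0.0"
```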
@@ -154,6 +163,7 @@ With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last rel
#### Merging into a maintenance branch
With the configuration `"branches": ["1.0.x", "1.x", "master"]`, if the last release published from `master` is `1.0.0` then:
- Creating the branch `1.0.x` from `master` will make the `1.0.0` release available on the `1.0.x` distribution channel
- Pushing a `fix` commit on the `1.0.x` branch will release the version `1.0.1` on the `1.0.x` distribution channel
- Creating the branch `1.x` from `master` will make the `1.0.0` release available on the `1.x` distribution channel
@@ -176,11 +186,13 @@ See [publishing pre-releases recipe](../recipes/release-workflow/pre-releases.md
#### Pushing to a pre-release branch
With the configuration `"branches": ["master", {"name": "beta", "prerelease": true}]`, if the last release published from `master` is `1.0.0` then:
- Pushing a `BREAKING CHANGE` commit on the `beta` branch will release the version `2.0.0-beta.1` on the `beta` distribution channel
- Pushing either a `fix`, `feat` or a `BREAKING CHANGE` commit on the `beta` branch will release the version `2.0.0-beta.2` (then `2.0.0-beta.3`, `2.0.0-beta.4`, etc.) on the `beta` distribution channel
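The pre-release increment can be sketched as below. This is a simplified, assumed illustration; semantic-release actually delegates this to the `semver` package:

```javascript
// Sketch of how pre-release versions increment on a channel (assumed,
// simplified). Given the last pre-release on the channel and its identifier,
// bump the trailing counter: 2.0.0-beta.1 -> 2.0.0-beta.2.
function bumpPrerelease(lastVersion, preid) {
  const match = lastVersion.match(/^(\d+\.\d+\.\d+)-(\w+)\.(\d+)$/);
  return match && match[2] === preid
    ? `${match[1]}-${preid}.${Number(match[3]) + 1}`
    : null;
}

console.log(bumpPrerelease("2.0.0-beta.1", "beta")); // "2.0.0-beta.2"
console.log(bumpPrerelease("2.0.0-beta.2", "beta")); // "2.0.0-beta.3"
```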
#### Merging into a pre-release branch
With the configuration `"branches": ["master", {"name": "beta", "prerelease": true}]`, if the last release published from `master` is `1.0.0` and the last one published from `beta` is `2.0.0-beta.1` then:
- Pushing a `fix` commit on the `master` branch will release the version `1.0.1` on the default distribution channel
- Merging the branch `master` into `beta` will release the version `2.0.0-beta.2` on the `beta` distribution channel

148
index.js
View File

@@ -1,30 +1,33 @@
const {pick} = require('lodash');
const marked = require('marked');
const envCi = require('env-ci');
const hookStd = require('hook-std');
const semver = require('semver');
const AggregateError = require('aggregate-error');
const pkg = require('./package.json');
const hideSensitive = require('./lib/hide-sensitive');
const getConfig = require('./lib/get-config');
const verify = require('./lib/verify');
const getNextVersion = require('./lib/get-next-version');
const getCommits = require('./lib/get-commits');
const getLastRelease = require('./lib/get-last-release');
const getReleaseToAdd = require('./lib/get-release-to-add');
const {extractErrors, makeTag} = require('./lib/utils');
const getGitAuthUrl = require('./lib/get-git-auth-url');
const getBranches = require('./lib/branches');
const getLogger = require('./lib/get-logger');
const {verifyAuth, isBranchUpToDate, getGitHead, tag, push, pushNotes, getTagHead, addNote} = require('./lib/git');
const getError = require('./lib/get-error');
const {COMMIT_NAME, COMMIT_EMAIL} = require('./lib/definitions/constants');
import { createRequire } from "node:module";
import { pick } from "lodash-es";
import * as marked from "marked";
import envCi from "env-ci";
import { hookStd } from "hook-std";
import semver from "semver";
import AggregateError from "aggregate-error";
import hideSensitive from "./lib/hide-sensitive.js";
import getConfig from "./lib/get-config.js";
import verify from "./lib/verify.js";
import getNextVersion from "./lib/get-next-version.js";
import getCommits from "./lib/get-commits.js";
import getLastRelease from "./lib/get-last-release.js";
import getReleaseToAdd from "./lib/get-release-to-add.js";
import { extractErrors, makeTag } from "./lib/utils.js";
import getGitAuthUrl from "./lib/get-git-auth-url.js";
import getBranches from "./lib/branches/index.js";
import getLogger from "./lib/get-logger.js";
import { addNote, getGitHead, getTagHead, isBranchUpToDate, push, pushNotes, tag, verifyAuth } from "./lib/git.js";
import getError from "./lib/get-error.js";
import { COMMIT_EMAIL, COMMIT_NAME } from "./lib/definitions/constants.js";
const require = createRequire(import.meta.url);
const pkg = require("./package.json");
let markedOptionsSet = false;
async function terminalOutput(text) {
if (!markedOptionsSet) {
const {default: TerminalRenderer} = await import('marked-terminal'); // eslint-disable-line node/no-unsupported-features/es-syntax
marked.setOptions({renderer: new TerminalRenderer()});
const { default: TerminalRenderer } = await import("marked-terminal"); // eslint-disable-line node/no-unsupported-features/es-syntax
marked.setOptions({ renderer: new TerminalRenderer() });
markedOptionsSet = true;
}
@@ -33,22 +36,22 @@ async function terminalOutput(text) {
/* eslint complexity: off */
async function run(context, plugins) {
const {cwd, env, options, logger} = context;
const {isCi, branch, prBranch, isPr} = context.envCi;
const { cwd, env, options, logger, envCi } = context;
const { isCi, branch, prBranch, isPr } = envCi;
const ciBranch = isPr ? prBranch : branch;
if (!isCi && !options.dryRun && !options.noCi) {
logger.warn('This run was not triggered in a known CI environment, running in dry-run mode.');
logger.warn("This run was not triggered in a known CI environment, running in dry-run mode.");
options.dryRun = true;
} else {
// When running on CI, set the commits author and commiter info and prevent the `git` CLI to prompt for username/password. See #703.
// When running on CI, set the commits author and committer info and prevent the `git` CLI from prompting for username/password. See #703.
Object.assign(env, {
GIT_AUTHOR_NAME: COMMIT_NAME,
GIT_AUTHOR_EMAIL: COMMIT_EMAIL,
GIT_COMMITTER_NAME: COMMIT_NAME,
GIT_COMMITTER_EMAIL: COMMIT_EMAIL,
...env,
GIT_ASKPASS: 'echo',
GIT_ASKPASS: "echo",
GIT_TERMINAL_PROMPT: 0,
});
}
@@ -61,30 +64,30 @@ async function run(context, plugins) {
// Verify config
await verify(context);
options.repositoryUrl = await getGitAuthUrl({...context, branch: {name: ciBranch}});
options.repositoryUrl = await getGitAuthUrl({ ...context, branch: { name: ciBranch } });
context.branches = await getBranches(options.repositoryUrl, ciBranch, context);
context.branch = context.branches.find(({name}) => name === ciBranch);
context.branch = context.branches.find(({ name }) => name === ciBranch);
if (!context.branch) {
logger.log(
`This test run was triggered on the branch ${ciBranch}, while semantic-release is configured to only publish from ${context.branches
.map(({name}) => name)
.join(', ')}, therefore a new version won't be published.`
.map(({ name }) => name)
.join(", ")}, therefore a new version won't be published.`
);
return false;
}
logger[options.dryRun ? 'warn' : 'success'](
logger[options.dryRun ? "warn" : "success"](
`Run automated release from branch ${ciBranch} on repository ${options.originalRepositoryURL}${
options.dryRun ? ' in dry-run mode' : ''
options.dryRun ? " in dry-run mode" : ""
}`
);
try {
try {
await verifyAuth(options.repositoryUrl, context.branch.name, {cwd, env});
await verifyAuth(options.repositoryUrl, context.branch.name, { cwd, env });
} catch (error) {
if (!(await isBranchUpToDate(options.repositoryUrl, context.branch.name, {cwd, env}))) {
if (!(await isBranchUpToDate(options.repositoryUrl, context.branch.name, { cwd, env }))) {
logger.log(
`The local branch ${context.branch.name} is behind the remote one, therefore a new version won't be published.`
);
@@ -95,7 +98,7 @@ async function run(context, plugins) {
}
} catch (error) {
logger.error(`The command "${error.command}" failed with the error message ${error.stderr}.`);
throw getError('EGITNOPERMISSION', context);
throw getError("EGITNOPERMISSION", context);
}
logger.success(`Allowed to push to the Git repository`);
@@ -107,24 +110,27 @@ async function run(context, plugins) {
const releaseToAdd = getReleaseToAdd(context);
if (releaseToAdd) {
const {lastRelease, currentRelease, nextRelease} = releaseToAdd;
const { lastRelease, currentRelease, nextRelease } = releaseToAdd;
nextRelease.gitHead = await getTagHead(nextRelease.gitHead, {cwd, env});
currentRelease.gitHead = await getTagHead(currentRelease.gitHead, {cwd, env});
nextRelease.gitHead = await getTagHead(nextRelease.gitHead, { cwd, env });
currentRelease.gitHead = await getTagHead(currentRelease.gitHead, { cwd, env });
if (context.branch.mergeRange && !semver.satisfies(nextRelease.version, context.branch.mergeRange)) {
errors.push(getError('EINVALIDMAINTENANCEMERGE', {...context, nextRelease}));
errors.push(getError("EINVALIDMAINTENANCEMERGE", { ...context, nextRelease }));
} else {
const commits = await getCommits({...context, lastRelease, nextRelease});
nextRelease.notes = await plugins.generateNotes({...context, commits, lastRelease, nextRelease});
const commits = await getCommits({ ...context, lastRelease, nextRelease });
nextRelease.notes = await plugins.generateNotes({ ...context, commits, lastRelease, nextRelease });
if (options.dryRun) {
logger.warn(`Skip ${nextRelease.gitTag} tag creation in dry-run mode`);
} else {
await addNote({channels: [...currentRelease.channels, nextRelease.channel]}, nextRelease.gitHead, {cwd, env});
await push(options.repositoryUrl, {cwd, env});
await pushNotes(options.repositoryUrl, {cwd, env});
await addNote({ channels: [...currentRelease.channels, nextRelease.channel] }, nextRelease.gitHead, {
cwd,
env,
});
await push(options.repositoryUrl, { cwd, env });
await pushNotes(options.repositoryUrl, { cwd, env });
logger.success(
`Add ${nextRelease.channel ? `channel ${nextRelease.channel}` : 'default channel'} to tag ${
`Add ${nextRelease.channel ? `channel ${nextRelease.channel}` : "default channel"} to tag ${
nextRelease.gitTag
}`
);
@@ -137,9 +143,9 @@ async function run(context, plugins) {
gitHead: nextRelease.gitHead,
});
const releases = await plugins.addChannel({...context, commits, lastRelease, currentRelease, nextRelease});
const releases = await plugins.addChannel({ ...context, commits, lastRelease, currentRelease, nextRelease });
context.releases.push(...releases);
await plugins.success({...context, lastRelease, commits, nextRelease, releases});
await plugins.success({ ...context, lastRelease, commits, nextRelease, releases });
}
}
@@ -149,7 +155,7 @@ async function run(context, plugins) {
context.lastRelease = getLastRelease(context);
if (context.lastRelease.gitHead) {
context.lastRelease.gitHead = await getTagHead(context.lastRelease.gitHead, {cwd, env});
context.lastRelease.gitHead = await getTagHead(context.lastRelease.gitHead, { cwd, env });
}
if (context.lastRelease.gitTag) {
@@ -165,11 +171,11 @@ async function run(context, plugins) {
const nextRelease = {
type: await plugins.analyzeCommits(context),
channel: context.branch.channel || null,
gitHead: await getGitHead({cwd, env}),
gitHead: await getGitHead({ cwd, env }),
};
if (!nextRelease.type) {
logger.log('There are no relevant changes, so no new version is released.');
return context.releases.length > 0 ? {releases: context.releases} : false;
logger.log("There are no relevant changes, so no new version is released.");
return context.releases.length > 0 ? { releases: context.releases } : false;
}
context.nextRelease = nextRelease;
@@ -177,11 +183,11 @@ async function run(context, plugins) {
nextRelease.gitTag = makeTag(options.tagFormat, nextRelease.version);
nextRelease.name = nextRelease.gitTag;
if (context.branch.type !== 'prerelease' && !semver.satisfies(nextRelease.version, context.branch.range)) {
throw getError('EINVALIDNEXTVERSION', {
if (context.branch.type !== "prerelease" && !semver.satisfies(nextRelease.version, context.branch.range)) {
throw getError("EINVALIDNEXTVERSION", {
...context,
validBranches: context.branches.filter(
({type, accept}) => type !== 'prerelease' && accept.includes(nextRelease.type)
({ type, accept }) => type !== "prerelease" && accept.includes(nextRelease.type)
),
});
}
@@ -196,20 +202,20 @@ async function run(context, plugins) {
logger.warn(`Skip ${nextRelease.gitTag} tag creation in dry-run mode`);
} else {
// Create the tag before calling the publish plugins as some require the tag to exist
await tag(nextRelease.gitTag, nextRelease.gitHead, {cwd, env});
await addNote({channels: [nextRelease.channel]}, nextRelease.gitHead, {cwd, env});
await push(options.repositoryUrl, {cwd, env});
await pushNotes(options.repositoryUrl, {cwd, env});
await tag(nextRelease.gitTag, nextRelease.gitHead, { cwd, env });
await addNote({ channels: [nextRelease.channel] }, nextRelease.gitHead, { cwd, env });
await push(options.repositoryUrl, { cwd, env });
await pushNotes(options.repositoryUrl, { cwd, env });
logger.success(`Created tag ${nextRelease.gitTag}`);
}
const releases = await plugins.publish(context);
context.releases.push(...releases);
await plugins.success({...context, releases});
await plugins.success({ ...context, releases });
logger.success(
`Published release ${nextRelease.version} on ${nextRelease.channel ? nextRelease.channel : 'default'} channel`
`Published release ${nextRelease.version} on ${nextRelease.channel ? nextRelease.channel : "default"} channel`
);
if (options.dryRun) {
@@ -219,10 +225,10 @@ async function run(context, plugins) {
}
}
return pick(context, ['lastRelease', 'commits', 'nextRelease', 'releases']);
return pick(context, ["lastRelease", "commits", "nextRelease", "releases"]);
}
async function logErrors({logger, stderr}, err) {
async function logErrors({ logger, stderr }, err) {
const errors = extractErrors(err).sort((error) => (error.semanticRelease ? -1 : 0));
for (const error of errors) {
if (error.semanticRelease) {
@@ -231,7 +237,7 @@ async function logErrors({logger, stderr}, err) {
stderr.write(await terminalOutput(error.details)); // eslint-disable-line no-await-in-loop
}
} else {
logger.error('An error occurred while running semantic-release: %O', error);
logger.error("An error occurred while running semantic-release: %O", error);
}
}
}
@@ -240,16 +246,16 @@ async function callFail(context, plugins, err) {
const errors = extractErrors(err).filter((err) => err.semanticRelease);
if (errors.length > 0) {
try {
await plugins.fail({...context, errors});
await plugins.fail({ ...context, errors });
} catch (error) {
await logErrors(context, error);
}
}
}
module.exports = async (cliOptions = {}, {cwd = process.cwd(), env = process.env, stdout, stderr} = {}) => {
const {unhook} = hookStd(
{silent: false, streams: [process.stdout, process.stderr, stdout, stderr].filter(Boolean)},
export default async (cliOptions = {}, { cwd = process.cwd(), env = process.env, stdout, stderr } = {}) => {
const { unhook } = hookStd(
{ silent: false, streams: [process.stdout, process.stderr, stdout, stderr].filter(Boolean) },
hideSensitive(env)
);
const context = {
@@ -257,12 +263,12 @@ module.exports = async (cliOptions = {}, {cwd = process.cwd(), env = process.env
env,
stdout: stdout || process.stdout,
stderr: stderr || process.stderr,
envCi: envCi({env, cwd}),
envCi: envCi({ env, cwd }),
};
context.logger = getLogger(context);
context.logger.log(`Running ${pkg.name} version ${pkg.version}`);
try {
const {plugins, options} = await getConfig(context, cliOptions);
const { plugins, options } = await getConfig(context, cliOptions);
options.originalRepositoryURL = options.repositoryUrl;
context.options = options;
try {

View File

@@ -1,8 +1,8 @@
const {isString, remove, omit, mapValues, template} = require('lodash');
const micromatch = require('micromatch');
const {getBranches} = require('../git');
import {isString, mapValues, omit, remove, template} from 'lodash-es';
import micromatch from 'micromatch';
import {getBranches} from '../git.js';
module.exports = async (repositoryUrl, {cwd}, branches) => {
export default async (repositoryUrl, {cwd}, branches) => {
const gitBranches = await getBranches(repositoryUrl, {cwd});
return branches.reduce(
@@ -15,4 +15,4 @@ module.exports = async (repositoryUrl, {cwd}, branches) => {
],
[]
);
};
}

View File

@@ -1,10 +1,13 @@
const {template, escapeRegExp} = require('lodash');
const semver = require('semver');
const pReduce = require('p-reduce');
const debug = require('debug')('semantic-release:get-tags');
const {getTags, getNote} = require('../../lib/git');
import {escapeRegExp, template} from 'lodash-es';
import semver from 'semver';
import pReduce from 'p-reduce';
import debugTags from 'debug';
import {getNote, getTags} from '../../lib/git.js';
module.exports = async ({cwd, env, options: {tagFormat}}, branches) => {
const debug = debugTags('semantic-release:get-tags');
export default async ({cwd, env, options: {tagFormat}}, branches) => {
// Generate a regex to parse tags formatted with `tagFormat`
// by replacing the `version` variable in the template by `(.+)`.
// The `tagFormat` is compiled with space as the `version` as it's an invalid tag character,
@@ -30,4 +33,4 @@ module.exports = async ({cwd, env, options: {tagFormat}}, branches) => {
},
[]
);
};
}

View File

@@ -1,14 +1,14 @@
const {isString, isRegExp} = require('lodash');
const AggregateError = require('aggregate-error');
const pEachSeries = require('p-each-series');
const DEFINITIONS = require('../definitions/branches');
const getError = require('../get-error');
const {fetch, fetchNotes, verifyBranchName} = require('../git');
const expand = require('./expand');
const getTags = require('./get-tags');
const normalize = require('./normalize');
import {isRegExp, isString} from 'lodash-es';
import AggregateError from 'aggregate-error';
import pEachSeries from 'p-each-series';
import * as DEFINITIONS from '../definitions/branches.js';
import getError from '../get-error.js';
import {fetch, fetchNotes, verifyBranchName} from '../git.js';
import expand from './expand.js';
import getTags from './get-tags.js';
import * as normalize from './normalize.js';
module.exports = async (repositoryUrl, ciBranch, context) => {
export default async (repositoryUrl, ciBranch, context) => {
const {cwd, env} = context;
const remoteBranches = await expand(
@@ -68,4 +68,4 @@ module.exports = async (repositoryUrl, ciBranch, context) => {
}
return [...result.maintenance, ...result.release, ...result.prerelease];
};
}

View File

@@ -1,19 +1,18 @@
const {sortBy, isNil} = require('lodash');
const semverDiff = require('semver-diff');
const {FIRST_RELEASE, RELEASE_TYPE} = require('../definitions/constants');
const {
tagsToVersions,
isMajorRange,
getUpperBound,
getLowerBound,
highest,
lowest,
getLatestVersion,
import {isNil, sortBy} from 'lodash-es';
import semverDiff from 'semver-diff';
import {FIRST_RELEASE, RELEASE_TYPE} from '../definitions/constants.js';
import {
getFirstVersion,
getRange,
} = require('../utils');
getLatestVersion,
getLowerBound, getRange,
getUpperBound,
highest,
isMajorRange,
lowest,
tagsToVersions
} from '../utils.js';
function maintenance({maintenance, release}) {
export function maintenance({maintenance, release}) {
return sortBy(
maintenance.map(({name, range, channel, ...rest}) => ({
...rest,
@@ -55,7 +54,7 @@ function maintenance({maintenance, release}) {
});
}
function release({release}) {
export function release({release}) {
if (release.length === 0) {
return release;
}
@@ -89,7 +88,7 @@ function release({release}) {
});
}
function prerelease({prerelease}) {
export function prerelease({prerelease}) {
return prerelease.map(({name, prerelease, channel, tags, ...rest}) => {
const preid = prerelease === true ? name : prerelease;
return {
@@ -102,5 +101,3 @@ function prerelease({prerelease}) {
};
});
}
module.exports = {maintenance, release, prerelease};

View File

@@ -1,24 +1,22 @@
const {isNil, uniqBy} = require('lodash');
const semver = require('semver');
const {isMaintenanceRange} = require('../utils');
import {isNil, uniqBy} from 'lodash-es';
import semver from 'semver';
import {isMaintenanceRange} from '../utils.js';
const maintenance = {
export const maintenance = {
filter: ({name, range}) => (!isNil(range) && range !== false) || isMaintenanceRange(name),
branchValidator: ({range}) => (isNil(range) ? true : isMaintenanceRange(range)),
branchesValidator: (branches) => uniqBy(branches, ({range}) => semver.validRange(range)).length === branches.length,
};
const prerelease = {
export const prerelease = {
filter: ({prerelease}) => !isNil(prerelease) && prerelease !== false,
branchValidator: ({name, prerelease}) =>
Boolean(prerelease) && Boolean(semver.valid(`1.0.0-${prerelease === true ? name : prerelease}.1`)),
branchesValidator: (branches) => uniqBy(branches, 'prerelease').length === branches.length,
};
const release = {
export const release = {
// eslint-disable-next-line unicorn/no-fn-reference-in-iterator
filter: (branch) => !maintenance.filter(branch) && !prerelease.filter(branch),
branchesValidator: (branches) => branches.length <= 3 && branches.length > 0,
};
module.exports = {maintenance, prerelease, release};

View File

@@ -1,29 +1,17 @@
const RELEASE_TYPE = ['patch', 'minor', 'major'];
export const RELEASE_TYPE = ['patch', 'minor', 'major'];
const FIRST_RELEASE = '1.0.0';
export const FIRST_RELEASE = '1.0.0';
const FIRSTPRERELEASE = '1';
export const FIRSTPRERELEASE = '1';
const COMMIT_NAME = 'semantic-release-bot';
export const COMMIT_NAME = 'semantic-release-bot';
const COMMIT_EMAIL = 'semantic-release-bot@martynus.net';
export const COMMIT_EMAIL = 'semantic-release-bot@martynus.net';
const RELEASE_NOTES_SEPARATOR = '\n\n';
export const RELEASE_NOTES_SEPARATOR = '\n\n';
const SECRET_REPLACEMENT = '[secure]';
export const SECRET_REPLACEMENT = '[secure]';
const SECRET_MIN_SIZE = 5;
export const SECRET_MIN_SIZE = 5;
const GIT_NOTE_REF = 'semantic-release';
module.exports = {
RELEASE_TYPE,
FIRST_RELEASE,
FIRSTPRERELEASE,
COMMIT_NAME,
COMMIT_EMAIL,
RELEASE_NOTES_SEPARATOR,
SECRET_REPLACEMENT,
SECRET_MIN_SIZE,
GIT_NOTE_REF,
};
export const GIT_NOTE_REF = 'semantic-release';

View File

@@ -1,7 +1,10 @@
const {inspect} = require('util');
const {toLower, isString, trim} = require('lodash');
import {inspect} from 'node:util';
import {createRequire} from 'node:module';
import {isString, toLower, trim} from 'lodash-es';
import {RELEASE_TYPE} from './constants.js';
const require = createRequire(import.meta.url);
const pkg = require('../../package.json');
const {RELEASE_TYPE} = require('./constants');
const [homepage] = pkg.homepage.split('#');
const stringify = (object) =>
@@ -10,16 +13,19 @@ const linkify = (file) => `${homepage}/blob/master/${file}`;
const wordsList = (words) =>
`${words.slice(0, -1).join(', ')}${words.length > 1 ? ` or ${words[words.length - 1]}` : trim(words[0])}`;
module.exports = {
ENOGITREPO: ({cwd}) => ({
export function ENOGITREPO({cwd}) {
return {
message: 'Not running from a git repository.',
details: `The \`semantic-release\` command must be executed from a Git repository.
The current working directory is \`${cwd}\`.
Please verify your CI configuration to make sure the \`semantic-release\` command is executed from the root of the cloned repository.`,
}),
ENOREPOURL: () => ({
};
}
export function ENOREPOURL() {
return {
message: 'The `repositoryUrl` option is required.',
details: `The [repositoryUrl option](${linkify(
'docs/usage/configuration.md#repositoryurl'
@@ -28,8 +34,11 @@ Please verify your CI configuration to make sure the \`semantic-release\` comman
Please make sure to add the \`repositoryUrl\` to the [semantic-release configuration] (${linkify(
'docs/usage/configuration.md'
)}).`,
}),
EGITNOPERMISSION: ({options: {repositoryUrl}, branch: {name}}) => ({
};
}
export function EGITNOPERMISSION({options: {repositoryUrl}, branch: {name}}) {
return {
message: 'Cannot push to the Git repository.',
details: `**semantic-release** cannot push the version tag to the branch \`${name}\` on the remote Git repository with URL \`${repositoryUrl}\`.
@@ -39,40 +48,55 @@ This can be caused by:
- or missing push permission for the user configured via the [Git credentials on your CI environment](${linkify(
'docs/usage/ci-configuration.md#authentication'
)})`,
}),
EINVALIDTAGFORMAT: ({options: {tagFormat}}) => ({
};
}
export function EINVALIDTAGFORMAT({options: {tagFormat}}) {
return {
message: 'Invalid `tagFormat` option.',
details: `The [tagFormat](${linkify(
'docs/usage/configuration.md#tagformat'
)}) must compile to a [valid Git reference](https://git-scm.com/docs/git-check-ref-format#_description).
Your configuration for the \`tagFormat\` option is \`${stringify(tagFormat)}\`.`,
}),
ETAGNOVERSION: ({options: {tagFormat}}) => ({
};
}
export function ETAGNOVERSION({options: {tagFormat}}) {
return {
message: 'Invalid `tagFormat` option.',
details: `The [tagFormat](${linkify(
'docs/usage/configuration.md#tagformat'
)}) option must contain the variable \`version\` exactly once.
Your configuration for the \`tagFormat\` option is \`${stringify(tagFormat)}\`.`,
}),
EPLUGINCONF: ({type, required, pluginConf}) => ({
};
}
export function EPLUGINCONF({type, required, pluginConf}) {
return {
message: `The \`${type}\` plugin configuration is invalid.`,
details: `The [${type} plugin configuration](${linkify(`docs/usage/plugins.md#${toLower(type)}-plugin`)}) ${
required ? 'is required and ' : ''
} must be a single or an array of plugins definition. A plugin definition is an npm module name, optionally wrapped in an array with an object.
Your configuration for the \`${type}\` plugin is \`${stringify(pluginConf)}\`.`,
}),
EPLUGINSCONF: ({plugin}) => ({
};
}
export function EPLUGINSCONF({plugin}) {
return {
message: 'The `plugins` configuration is invalid.',
details: `The [plugins](${linkify(
'docs/usage/configuration.md#plugins'
)}) option must be an array of plugin definitions. A plugin definition is an npm module name, optionally wrapped in an array with an object.
The invalid configuration is \`${stringify(plugin)}\`.`,
}),
EPLUGIN: ({pluginName, type}) => ({
};
}
export function EPLUGIN({pluginName, type}) {
return {
message: `A plugin configured in the step ${type} is not a valid semantic-release plugin.`,
details: `A valid \`${type}\` **semantic-release** plugin must be a function or an object with a function in the property \`${type}\`.
@@ -81,8 +105,11 @@ The plugin \`${pluginName}\` doesn't have the property \`${type}\` and cannot be
Please refer to the \`${pluginName}\` and [semantic-release plugins configuration](${linkify(
'docs/usage/plugins.md'
)}) documentation for more details.`,
}),
EANALYZECOMMITSOUTPUT: ({result, pluginName}) => ({
};
}
export function EANALYZECOMMITSOUTPUT({result, pluginName}) {
return {
message: 'The `analyzeCommits` plugin returned an invalid value. It must return a valid semver release type.',
details: `The \`analyzeCommits\` plugin must return a valid [semver](https://semver.org) release type. The valid values are: ${RELEASE_TYPE.map(
(type) => `\`${type}\``
@@ -97,8 +124,11 @@ We recommend to report the issue to the \`${pluginName}\` authors, providing the
- A link to the **semantic-release** plugin developer guide: [${linkify('docs/developer-guide/plugin.md')}](${linkify(
'docs/developer-guide/plugin.md'
)})`,
}),
EGENERATENOTESOUTPUT: ({result, pluginName}) => ({
};
}
export function EGENERATENOTESOUTPUT({result, pluginName}) {
return {
message: 'The `generateNotes` plugin returned an invalid value. It must return a `String`.',
details: `The \`generateNotes\` plugin must return a \`String\`.
@@ -111,8 +141,11 @@ We recommend to report the issue to the \`${pluginName}\` authors, providing the
- A link to the **semantic-release** plugin developer guide: [${linkify('docs/developer-guide/plugin.md')}](${linkify(
'docs/developer-guide/plugin.md'
)})`,
}),
EPUBLISHOUTPUT: ({result, pluginName}) => ({
};
}
export function EPUBLISHOUTPUT({result, pluginName}) {
return {
message: 'A `publish` plugin returned an invalid value. It must return an `Object`.',
details: `The \`publish\` plugins must return an \`Object\`.
@@ -125,8 +158,11 @@ We recommend to report the issue to the \`${pluginName}\` authors, providing the
- A link to the **semantic-release** plugin developer guide: [${linkify('docs/developer-guide/plugin.md')}](${linkify(
'docs/developer-guide/plugin.md'
)})`,
}),
EADDCHANNELOUTPUT: ({result, pluginName}) => ({
};
}
export function EADDCHANNELOUTPUT({result, pluginName}) {
return {
message: 'A `addChannel` plugin returned an invalid value. It must return an `Object`.',
details: `The \`addChannel\` plugins must return an \`Object\`.
@@ -139,48 +175,66 @@ We recommend to report the issue to the \`${pluginName}\` authors, providing the
- A link to the **semantic-release** plugin developer guide: [${linkify('docs/developer-guide/plugin.md')}](${linkify(
'docs/developer-guide/plugin.md'
)})`,
}),
EINVALIDBRANCH: ({branch}) => ({
};
}
export function EINVALIDBRANCH({branch}) {
return {
message: 'A branch is invalid in the `branches` configuration.',
details: `Each branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must be either a string, a regexp or an object with a \`name\` property.
Your configuration for the problematic branch is \`${stringify(branch)}\`.`,
}),
EINVALIDBRANCHNAME: ({branch}) => ({
};
}
export function EINVALIDBRANCHNAME({branch}) {
return {
message: 'A branch name is invalid in the `branches` configuration.',
details: `Each branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must be a [valid Git reference](https://git-scm.com/docs/git-check-ref-format#_description).
Your configuration for the problematic branch is \`${stringify(branch)}\`.`,
}),
EDUPLICATEBRANCHES: ({duplicates}) => ({
};
}
export function EDUPLICATEBRANCHES({duplicates}) {
return {
message: 'The `branches` configuration has duplicate branches.',
details: `Each branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must have a unique name.
Your configuration contains duplicates for the following branch names: \`${stringify(duplicates)}\`.`,
}),
EMAINTENANCEBRANCH: ({branch}) => ({
};
}
export function EMAINTENANCEBRANCH({branch}) {
return {
message: 'A maintenance branch is invalid in the `branches` configuration.',
details: `Each maintenance branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must have a \`range\` property formatted like \`N.x\`, \`N.x.x\` or \`N.N.x\` (\`N\` is a number).
Your configuration for the problematic branch is \`${stringify(branch)}\`.`,
}),
EMAINTENANCEBRANCHES: ({branches}) => ({
};
}
export function EMAINTENANCEBRANCHES({branches}) {
return {
message: 'The maintenance branches are invalid in the `branches` configuration.',
details: `Each maintenance branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must have a unique \`range\` property.
Your configuration for the problematic branches is \`${stringify(branches)}\`.`,
}),
ERELEASEBRANCHES: ({branches}) => ({
};
}
export function ERELEASEBRANCHES({branches}) {
return {
message: 'The release branches are invalid in the `branches` configuration.',
details: `A minimum of 1 and a maximum of 3 release branches are required in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
@@ -189,24 +243,33 @@ Your configuration for the problematic branches is \`${stringify(branches)}\`.`,
This may occur if your repository does not have a release branch, such as \`master\`.
Your configuration for the problematic branches is \`${stringify(branches)}\`.`,
}),
EPRERELEASEBRANCH: ({branch}) => ({
};
}
export function EPRERELEASEBRANCH({branch}) {
return {
message: 'A pre-release branch configuration is invalid in the `branches` configuration.',
details: `Each pre-release branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must have a \`prerelease\` property valid per the [Semantic Versioning Specification](https://semver.org/#spec-item-9). If the \`prerelease\` property is set to \`true\`, then the \`name\` property is used instead.
Your configuration for the problematic branch is \`${stringify(branch)}\`.`,
}),
EPRERELEASEBRANCHES: ({branches}) => ({
};
}
export function EPRERELEASEBRANCHES({branches}) {
return {
message: 'The pre-release branches are invalid in the `branches` configuration.',
details: `Each pre-release branch in the [branches configuration](${linkify(
'docs/usage/configuration.md#branches'
)}) must have a unique \`prerelease\` property. If the \`prerelease\` property is set to \`true\`, then the \`name\` property is used instead.
Your configuration for the problematic branches is \`${stringify(branches)}\`.`,
}),
EINVALIDNEXTVERSION: ({nextRelease: {version}, branch: {name, range}, commits, validBranches}) => ({
};
}
export function EINVALIDNEXTVERSION({nextRelease: {version}, branch: {name, range}, commits, validBranches}) {
return {
message: `The release \`${version}\` on branch \`${name}\` cannot be published as it is out of range.`,
details: `Based on the releases published on other branches, only versions within the range \`${range}\` can be published from branch \`${name}\`.
@@ -215,18 +278,21 @@ ${commits.map(({commit: {short}, subject}) => `- ${subject} (${short})`).join('\
${
commits.length > 1 ? 'Those commits' : 'This commit'
} should be moved to a valid branch with [git merge](https://git-scm.com/docs/git-merge) or [git cherry-pick](https://git-scm.com/docs/git-cherry-pick) and removed from branch \`${name}\` with [git revert](https://git-scm.com/docs/git-revert) or [git reset](https://git-scm.com/docs/git-reset).
} should be moved to a valid branch with [git merge](https://git-scm.com/docs/git-merge) or [git cherry-pick](https://git-scm.com/docs/git-cherry-pick) and removed from branch \`${name}\` with [git revert](https://git-scm.com/docs/git-revert) or [git reset](https://git-scm.com/docs/git-reset).
A valid branch could be ${wordsList(validBranches.map(({name}) => `\`${name}\``))}.
See the [workflow configuration documentation](${linkify('docs/usage/workflow-configuration.md')}) for more details.`,
}),
EINVALIDMAINTENANCEMERGE: ({nextRelease: {channel, gitTag, version}, branch: {mergeRange, name}}) => ({
};
}
export function EINVALIDMAINTENANCEMERGE({nextRelease: {channel, gitTag, version}, branch: {mergeRange, name}}) {
return {
message: `The release \`${version}\` on branch \`${name}\` cannot be published as it is out of range.`,
details: `Only releases within the range \`${mergeRange}\` can be merged into the maintenance branch \`${name}\` and published to the \`${channel}\` distribution channel.
The branch \`${name}\` head should be [reset](https://git-scm.com/docs/git-reset) to a previous commit so the commit with tag \`${gitTag}\` is removed from the branch history.
See the [workflow configuration documentation](${linkify('docs/usage/workflow-configuration.md')}) for more details.`,
}),
};
};
}
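The converted definitions above all share one shape: each error code becomes a named ESM export returning `{message, details}`, looked up by a small factory. A minimal, dependency-free sketch of that pattern (messages simplified; the real factory wraps `@semantic-release/error`):

```javascript
// Sketch of the ESM error-definition pattern: one named export per
// error code, each returning { message, details } for a factory.
const definitions = {
  EDUPLICATEBRANCHES({ duplicates }) {
    return {
      message: "The `branches` configuration has duplicate branches.",
      details: `Duplicate branch names: ${JSON.stringify(duplicates)}.`,
    };
  },
};

function getError(code, ctx = {}) {
  const { message, details } = definitions[code](ctx);
  return Object.assign(new Error(message), { code, details });
}

const error = getError("EDUPLICATEBRANCHES", { duplicates: ["beta"] });
console.log(error.code, error.details);
```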


@@ -1,12 +1,12 @@
/* eslint require-atomic-updates: off */
const {isString, isPlainObject} = require('lodash');
const {getGitHead} = require('../git');
const hideSensitive = require('../hide-sensitive');
const {hideSensitiveValues} = require('../utils');
const {RELEASE_TYPE, RELEASE_NOTES_SEPARATOR} = require('./constants');
import {isPlainObject, isString} from 'lodash-es';
import {getGitHead} from '../git.js';
import hideSensitive from '../hide-sensitive.js';
import {hideSensitiveValues} from '../utils.js';
import {RELEASE_NOTES_SEPARATOR, RELEASE_TYPE} from './constants.js';
module.exports = {
export default {
verifyConditions: {
required: false,
dryRun: true,


@@ -1,5 +1,7 @@
const debug = require('debug')('semantic-release:get-commits');
const {getCommits} = require('./git');
import debugCommits from "debug";
import { getCommits } from "./git.js";
const debug = debugCommits("semantic-release:get-commits");
/**
* Retrieve the list of commits on the current branch since the commit sha associated with the last release, or all the commits of the current branch if there is no last released version.
@@ -8,16 +10,22 @@ const {getCommits} = require('./git');
*
* @return {Promise<Array<Object>>} The list of commits on the branch `branch` since the last release.
*/
module.exports = async ({cwd, env, lastRelease: {gitHead: from}, nextRelease: {gitHead: to = 'HEAD'} = {}, logger}) => {
export default async ({
cwd,
env,
lastRelease: { gitHead: from },
nextRelease: { gitHead: to = "HEAD" } = {},
logger,
}) => {
if (from) {
debug('Use from: %s', from);
debug("Use from: %s", from);
} else {
logger.log('No previous release found, retrieving all commits');
logger.log("No previous release found, retrieving all commits");
}
const commits = await getCommits(from, to, {cwd, env});
const commits = await getCommits(from, to, { cwd, env });
logger.log(`Found ${commits.length} commits since last release`);
debug('Parsed commits: %o', commits);
debug("Parsed commits: %o", commits);
return commits;
};
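The range passed to the git log by `get-commits` above follows a simple rule: with a previous release, log `lastGitHead..HEAD`; with none, log everything reachable from `to`. A small sketch of just that construction:

```javascript
// Sketch of the revision range built in get-commits above.
function commitRange(from, to = "HEAD") {
  return `${from ? `${from}..` : ""}${to}`;
}

console.log(commitRange("abc123")); // "abc123..HEAD"
console.log(commitRange());         // "HEAD"
```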


@@ -1,40 +1,49 @@
const {castArray, pickBy, isNil, isString, isPlainObject} = require('lodash');
const readPkgUp = require('read-pkg-up');
const {cosmiconfig} = require('cosmiconfig');
const resolveFrom = require('resolve-from');
const debug = require('debug')('semantic-release:config');
const {repoUrl} = require('./git');
const PLUGINS_DEFINITIONS = require('./definitions/plugins');
const plugins = require('./plugins');
const {validatePlugin, parseConfig} = require('./plugins/utils');
import { dirname, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { createRequire } from "node:module";
const CONFIG_NAME = 'release';
import { castArray, isNil, isPlainObject, isString, pickBy } from "lodash-es";
import { readPackageUp } from "read-pkg-up";
import { cosmiconfig } from "cosmiconfig";
import resolveFrom from "resolve-from";
import debugConfig from "debug";
import { repoUrl } from "./git.js";
import PLUGINS_DEFINITIONS from "./definitions/plugins.js";
import plugins from "./plugins/index.js";
import { parseConfig, validatePlugin } from "./plugins/utils.js";
module.exports = async (context, cliOptions) => {
const {cwd, env} = context;
const {config, filepath} = (await cosmiconfig(CONFIG_NAME).search(cwd)) || {};
const debug = debugConfig("semantic-release:config");
const __dirname = dirname(fileURLToPath(import.meta.url));
const require = createRequire(import.meta.url);
debug('load config from: %s', filepath);
const CONFIG_NAME = "release";
export default async (context, cliOptions) => {
const { cwd, env } = context;
const { config, filepath } = (await cosmiconfig(CONFIG_NAME).search(cwd)) || {};
debug("load config from: %s", filepath);
// Merge config file options and CLI/API options
let options = {...config, ...cliOptions};
let options = { ...config, ...cliOptions };
const pluginsPath = {};
let extendPaths;
({extends: extendPaths, ...options} = options);
({ extends: extendPaths, ...options } = options);
if (extendPaths) {
// If `extends` is defined, load and merge each shareable config with `options`
options = {
...castArray(extendPaths).reduce((result, extendPath) => {
...(await castArray(extendPaths).reduce(async (eventualResult, extendPath) => {
const result = await eventualResult;
const extendsOptions = require(resolveFrom.silent(__dirname, extendPath) || resolveFrom(cwd, extendPath));
// For each plugin defined in a shareable config, save in `pluginsPath` the extendable config path,
// so those plugin will be loaded relatively to the config file
// so those plugin will be loaded relative to the config file
Object.entries(extendsOptions)
.filter(([, value]) => Boolean(value))
.reduce((pluginsPath, [option, value]) => {
castArray(value).forEach((plugin) => {
if (option === 'plugins' && validatePlugin(plugin)) {
if (option === "plugins" && validatePlugin(plugin)) {
pluginsPath[parseConfig(plugin)[0]] = extendPath;
} else if (
PLUGINS_DEFINITIONS[option] &&
@@ -46,8 +55,8 @@ module.exports = async (context, cliOptions) => {
return pluginsPath;
}, pluginsPath);
return {...result, ...extendsOptions};
}, {}),
return { ...result, ...extendsOptions };
}, {})),
...options,
};
}
@@ -55,36 +64,36 @@ module.exports = async (context, cliOptions) => {
// Set default options values if not defined yet
options = {
branches: [
'+([0-9])?(.{+([0-9]),x}).x',
'master',
'next',
'next-major',
{name: 'beta', prerelease: true},
{name: 'alpha', prerelease: true},
"+([0-9])?(.{+([0-9]),x}).x",
"master",
"next",
"next-major",
{ name: "beta", prerelease: true },
{ name: "alpha", prerelease: true },
],
repositoryUrl: (await pkgRepoUrl({normalize: false, cwd})) || (await repoUrl({cwd, env})),
repositoryUrl: (await pkgRepoUrl({ normalize: false, cwd })) || (await repoUrl({ cwd, env })),
tagFormat: `v\${version}`,
plugins: [
'@semantic-release/commit-analyzer',
'@semantic-release/release-notes-generator',
'@semantic-release/npm',
'@semantic-release/github',
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/npm",
"@semantic-release/github",
],
// Remove `null` and `undefined` options so they can be replaced with default ones
// Remove `null` and `undefined` options, so they can be replaced with default ones
...pickBy(options, (option) => !isNil(option)),
...(options.branches ? {branches: castArray(options.branches)} : {}),
...(options.branches ? { branches: castArray(options.branches) } : {}),
};
if (options.ci === false) {
options.noCi = true;
}
debug('options values: %O', options);
debug("options values: %O", options);
return {options, plugins: await plugins({...context, options}, pluginsPath)};
return { options, plugins: await plugins({ ...context, options }, pluginsPath) };
};
async function pkgRepoUrl(options) {
const {packageJson} = (await readPkgUp(options)) || {};
const { packageJson } = (await readPackageUp(options)) || {};
return packageJson && (isPlainObject(packageJson.repository) ? packageJson.repository.url : packageJson.repository);
}
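The config module above needs `__dirname` and a synchronous `require` to resolve shareable configs, neither of which exists in ES modules; both are rebuilt from `import.meta.url`. A standalone sketch of those shims:

```javascript
// Sketch of the ESM shims used by the config loader above:
// __dirname and require are recreated from import.meta.url.
import { dirname } from "node:path";
import { fileURLToPath } from "node:url";
import { createRequire } from "node:module";

const __dirname = dirname(fileURLToPath(import.meta.url));
const require = createRequire(import.meta.url);

// require() can now load CommonJS shareable configs synchronously,
// and __dirname anchors resolve-from lookups as it did under CJS.
console.log(typeof require === "function", __dirname.length > 0);
```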


@@ -1,7 +1,7 @@
const SemanticReleaseError = require('@semantic-release/error');
const ERROR_DEFINITIONS = require('./definitions/errors');
import SemanticReleaseError from "@semantic-release/error";
import * as ERROR_DEFINITIONS from "./definitions/errors.js";
module.exports = (code, ctx = {}) => {
const {message, details} = ERROR_DEFINITIONS[code](ctx);
export default (code, ctx = {}) => {
const { message, details } = ERROR_DEFINITIONS[code](ctx);
return new SemanticReleaseError(message, code, details);
};


@@ -1,8 +1,10 @@
const {parse, format} = require('url'); // eslint-disable-line node/no-deprecated-api
const {isNil} = require('lodash');
const hostedGitInfo = require('hosted-git-info');
const {verifyAuth} = require('./git');
const debug = require('debug')('semantic-release:get-git-auth-url');
import { format, parse } from "node:url";
import { isNil } from "lodash-es";
import hostedGitInfo from "hosted-git-info";
import debugAuthUrl from "debug";
import { verifyAuth } from "./git.js";
const debug = debugAuthUrl("semantic-release:get-git-auth-url");
/**
* Machinery to format a repository URL with the given credentials
@@ -16,15 +18,15 @@ const debug = require('debug')('semantic-release:get-git-auth-url');
function formatAuthUrl(protocol, repositoryUrl, gitCredentials) {
const [match, auth, host, basePort, path] =
/^(?!.+:\/\/)(?:(?<auth>.*)@)?(?<host>.*?):(?<port>\d+)?:?\/?(?<path>.*)$/.exec(repositoryUrl) || [];
const {port, hostname, ...parsed} = parse(
match ? `ssh://${auth ? `${auth}@` : ''}${host}${basePort ? `:${basePort}` : ''}/${path}` : repositoryUrl
const { port, hostname, ...parsed } = parse(
match ? `ssh://${auth ? `${auth}@` : ""}${host}${basePort ? `:${basePort}` : ""}/${path}` : repositoryUrl
);
return format({
...parsed,
auth: gitCredentials,
host: `${hostname}${protocol === 'ssh:' ? '' : port ? `:${port}` : ''}`,
protocol: protocol && /http[^s]/.test(protocol) ? 'http' : 'https',
host: `${hostname}${protocol === "ssh:" ? "" : port ? `:${port}` : ""}`,
protocol: protocol && /http[^s]/.test(protocol) ? "http" : "https",
});
}
@@ -36,9 +38,9 @@ function formatAuthUrl(protocol, repositoryUrl, gitCredentials) {
*
* @return {String} The authUrl as is if the connection was successful, null otherwise
*/
async function ensureValidAuthUrl({cwd, env, branch}, authUrl) {
async function ensureValidAuthUrl({ cwd, env, branch }, authUrl) {
try {
await verifyAuth(authUrl, branch.name, {cwd, env});
await verifyAuth(authUrl, branch.name, { cwd, env });
return authUrl;
} catch (error) {
debug(error);
@@ -57,45 +59,45 @@ async function ensureValidAuthUrl({cwd, env, branch}, authUrl) {
*
* @return {String} The formatted Git repository URL.
*/
module.exports = async (context) => {
const {cwd, env, branch} = context;
export default async (context) => {
const { cwd, env, branch } = context;
const GIT_TOKENS = {
GIT_CREDENTIALS: undefined,
GH_TOKEN: undefined,
// GitHub Actions require the "x-access-token:" prefix for git access
// https://developer.github.com/apps/building-github-apps/authenticating-with-github-apps/#http-based-git-access-by-an-installation
GITHUB_TOKEN: isNil(env.GITHUB_ACTION) ? undefined : 'x-access-token:',
GL_TOKEN: 'gitlab-ci-token:',
GITLAB_TOKEN: 'gitlab-ci-token:',
BB_TOKEN: 'x-token-auth:',
BITBUCKET_TOKEN: 'x-token-auth:',
BB_TOKEN_BASIC_AUTH: '',
BITBUCKET_TOKEN_BASIC_AUTH: '',
GITHUB_TOKEN: isNil(env.GITHUB_ACTION) ? undefined : "x-access-token:",
GL_TOKEN: "gitlab-ci-token:",
GITLAB_TOKEN: "gitlab-ci-token:",
BB_TOKEN: "x-token-auth:",
BITBUCKET_TOKEN: "x-token-auth:",
BB_TOKEN_BASIC_AUTH: "",
BITBUCKET_TOKEN_BASIC_AUTH: "",
};
let {repositoryUrl} = context.options;
const info = hostedGitInfo.fromUrl(repositoryUrl, {noGitPlus: true});
const {protocol, ...parsed} = parse(repositoryUrl);
let { repositoryUrl } = context.options;
const info = hostedGitInfo.fromUrl(repositoryUrl, { noGitPlus: true });
const { protocol, ...parsed } = parse(repositoryUrl);
if (info && info.getDefaultRepresentation() === 'shortcut') {
if (info && info.getDefaultRepresentation() === "shortcut") {
// Expand shorthand URLs (such as `owner/repo` or `gitlab:owner/repo`)
repositoryUrl = info.https();
} else if (protocol && protocol.includes('http')) {
} else if (protocol && protocol.includes("http")) {
// Replace `git+https` and `git+http` with `https` or `http`
repositoryUrl = format({...parsed, protocol: protocol.includes('https') ? 'https' : 'http', href: null});
repositoryUrl = format({ ...parsed, protocol: protocol.includes("https") ? "https" : "http", href: null });
}
// Test if push is allowed without transforming the URL (e.g. if ssh keys are set up)
try {
debug('Verifying ssh auth by attempting to push to %s', repositoryUrl);
await verifyAuth(repositoryUrl, branch.name, {cwd, env});
debug("Verifying ssh auth by attempting to push to %s", repositoryUrl);
await verifyAuth(repositoryUrl, branch.name, { cwd, env });
} catch {
debug('SSH key auth failed, falling back to https.');
debug("SSH key auth failed, falling back to https.");
const envVars = Object.keys(GIT_TOKENS).filter((envVar) => !isNil(env[envVar]));
// Skip verification if there is no ambiguity on which env var to use for authentication
if (envVars.length === 1) {
const gitCredentials = `${GIT_TOKENS[envVars[0]] || ''}${env[envVars[0]]}`;
const gitCredentials = `${GIT_TOKENS[envVars[0]] || ""}${env[envVars[0]]}`;
return formatAuthUrl(protocol, repositoryUrl, gitCredentials);
}
@@ -104,7 +106,7 @@ module.exports = async (context) => {
const candidateRepositoryUrls = [];
for (const envVar of envVars) {
const gitCredentials = `${GIT_TOKENS[envVar] || ''}${env[envVar]}`;
const gitCredentials = `${GIT_TOKENS[envVar] || ""}${env[envVar]}`;
const authUrl = formatAuthUrl(protocol, repositoryUrl, gitCredentials);
candidateRepositoryUrls.push(ensureValidAuthUrl(context, authUrl));
}
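The `formatAuthUrl` helper above first rewrites scp-style URLs (`git@host:owner/repo.git`), which have no scheme, into parseable `ssh://` URLs. A self-contained sketch of that normalization step, using the same regular expression as the diff:

```javascript
// Sketch of the scp-style URL normalization in formatAuthUrl above.
const SCP_LIKE = /^(?!.+:\/\/)(?:(?<auth>.*)@)?(?<host>.*?):(?<port>\d+)?:?\/?(?<path>.*)$/;

function toSshUrl(repositoryUrl) {
  const match = SCP_LIKE.exec(repositoryUrl);
  if (!match) return repositoryUrl; // already has a scheme, leave as is
  const { auth, host, port, path } = match.groups;
  return `ssh://${auth ? `${auth}@` : ""}${host}${port ? `:${port}` : ""}/${path}`;
}

console.log(toSshUrl("git@github.com:owner/repo.git"));
// "ssh://git@github.com/owner/repo.git"
```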


@@ -1,6 +1,6 @@
const {isUndefined} = require('lodash');
const semver = require('semver');
const {makeTag, isSameChannel} = require('./utils');
import { isUndefined } from "lodash-es";
import semver from "semver";
import { isSameChannel, makeTag } from "./utils.js";
/**
* Last release.
@@ -18,7 +18,7 @@ const {makeTag, isSameChannel} = require('./utils');
*
* - Filter out the branch tags that are not valid semantic version
* - Sort the versions
* - Retrive the highest version
* - Retrieve the highest version
*
* @param {Object} context semantic-release context.
* @param {Object} params Function parameters.
@@ -26,18 +26,18 @@ const {makeTag, isSameChannel} = require('./utils');
*
* @return {LastRelease} The last tagged release or empty object if none is found.
*/
module.exports = ({branch, options: {tagFormat}}, {before} = {}) => {
const [{version, gitTag, channels} = {}] = branch.tags
export default ({ branch, options: { tagFormat } }, { before } = {}) => {
const [{ version, gitTag, channels } = {}] = branch.tags
.filter(
(tag) =>
((branch.type === 'prerelease' && tag.channels.some((channel) => isSameChannel(branch.channel, channel))) ||
((branch.type === "prerelease" && tag.channels.some((channel) => isSameChannel(branch.channel, channel))) ||
!semver.prerelease(tag.version)) &&
(isUndefined(before) || semver.lt(tag.version, before))
)
.sort((a, b) => semver.rcompare(a.version, b.version));
if (gitTag) {
return {version, gitTag, channels, gitHead: gitTag, name: makeTag(tagFormat, version)};
return { version, gitTag, channels, gitHead: gitTag, name: makeTag(tagFormat, version) };
}
return {};
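`get-last-release` above boils down to: filter the branch tags, sort them in descending version order, and take the first. A stdlib-only sketch of that selection, where a numeric `localeCompare` stands in for `semver.rcompare` (and skips the prerelease/channel filtering of the real module):

```javascript
// Sketch of the last-release selection above; a numeric string sort
// stands in for semver.rcompare (no prerelease semantics).
function lastRelease(tags) {
  const byVersionDesc = (a, b) =>
    b.version.localeCompare(a.version, undefined, { numeric: true });
  const [latest = {}] = [...tags].sort(byVersionDesc);
  return latest;
}

console.log(lastRelease([{ version: "1.2.0" }, { version: "1.10.0" }]));
// numeric sort ranks 1.10.0 above 1.2.0
console.log(lastRelease([])); // {} when no release exists
```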


@@ -1,16 +1,18 @@
const {Signale} = require('signale');
const figures = require('figures');
import signale from "signale";
import figures from "figures";
module.exports = ({stdout, stderr}) =>
const { Signale } = signale;
export default ({ stdout, stderr }) =>
new Signale({
config: {displayTimestamp: true, underlineMessage: false, displayLabel: false},
config: { displayTimestamp: true, underlineMessage: false, displayLabel: false },
disabled: false,
interactive: false,
scope: 'semantic-release',
scope: "semantic-release",
stream: [stdout],
types: {
error: {badge: figures.cross, color: 'red', label: '', stream: [stderr]},
log: {badge: figures.info, color: 'magenta', label: '', stream: [stdout]},
success: {badge: figures.tick, color: 'green', label: '', stream: [stdout]},
error: { badge: figures.cross, color: "red", label: "", stream: [stderr] },
log: { badge: figures.info, color: "magenta", label: "", stream: [stdout] },
success: { badge: figures.tick, color: "green", label: "", stream: [stdout] },
},
});


@@ -1,20 +1,20 @@
const semver = require('semver');
const {FIRST_RELEASE, FIRSTPRERELEASE} = require('./definitions/constants');
const {isSameChannel, getLatestVersion, tagsToVersions, highest} = require('./utils');
import semver from "semver";
import { FIRST_RELEASE, FIRSTPRERELEASE } from "./definitions/constants.js";
import { getLatestVersion, highest, isSameChannel, tagsToVersions } from "./utils.js";
module.exports = ({branch, nextRelease: {type, channel}, lastRelease, logger}) => {
export default ({ branch, nextRelease: { type, channel }, lastRelease, logger }) => {
let version;
if (lastRelease.version) {
const {major, minor, patch} = semver.parse(lastRelease.version);
const { major, minor, patch } = semver.parse(lastRelease.version);
if (branch.type === 'prerelease') {
if (branch.type === "prerelease") {
if (
semver.prerelease(lastRelease.version) &&
lastRelease.channels.some((lastReleaseChannel) => isSameChannel(lastReleaseChannel, channel))
) {
version = highest(
semver.inc(lastRelease.version, 'prerelease'),
`${semver.inc(getLatestVersion(tagsToVersions(branch.tags), {withPrerelease: true}), type)}-${
semver.inc(lastRelease.version, "prerelease"),
`${semver.inc(getLatestVersion(tagsToVersions(branch.tags), { withPrerelease: true }), type)}-${
branch.prerelease
}.${FIRSTPRERELEASE}`
);
@@ -25,9 +25,9 @@ module.exports = ({branch, nextRelease: {type, channel}, lastRelease, logger}) =
version = semver.inc(lastRelease.version, type);
}
logger.log('The next release version is %s', version);
logger.log("The next release version is %s", version);
} else {
version = branch.type === 'prerelease' ? `${FIRST_RELEASE}-${branch.prerelease}.${FIRSTPRERELEASE}` : FIRST_RELEASE;
version = branch.type === "prerelease" ? `${FIRST_RELEASE}-${branch.prerelease}.${FIRSTPRERELEASE}` : FIRST_RELEASE;
logger.log(`There is no previous release, the next release version is ${version}`);
}
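For a branch with no previous release, the version chosen above comes straight from two constants (values assumed from `definitions/constants.js`: `FIRST_RELEASE = "1.0.0"`, `FIRSTPRERELEASE = "1"`). A sketch of that first-release branch in isolation:

```javascript
// Sketch of the no-previous-release branch above; constant values are
// assumed from definitions/constants.js.
const FIRST_RELEASE = "1.0.0";
const FIRSTPRERELEASE = "1";

function firstVersion(branch) {
  return branch.type === "prerelease"
    ? `${FIRST_RELEASE}-${branch.prerelease}.${FIRSTPRERELEASE}`
    : FIRST_RELEASE;
}

console.log(firstVersion({ type: "prerelease", prerelease: "beta" }));
// "1.0.0-beta.1"
console.log(firstVersion({ type: "release" })); // "1.0.0"
```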


@@ -1,8 +1,8 @@
const {uniqBy, intersection} = require('lodash');
const semver = require('semver');
const semverDiff = require('semver-diff');
const getLastRelease = require('./get-last-release');
const {makeTag, getLowerBound} = require('./utils');
import { intersection, uniqBy } from "lodash-es";
import semver from "semver";
import semverDiff from "semver-diff";
import getLastRelease from "./get-last-release.js";
import { getLowerBound, makeTag } from "./utils.js";
/**
* Find releases that have been merged from a higher branch but not added on the channel of the current branch.
@@ -11,42 +11,42 @@ const {makeTag, getLowerBound} = require('./utils');
*
* @return {Array<Object>} Last release and next release to be added on the channel of the current branch.
*/
module.exports = (context) => {
export default (context) => {
const {
branch,
branches,
options: {tagFormat},
options: { tagFormat },
} = context;
const higherChannels = branches
// Consider only releases of higher branches
.slice(branches.findIndex(({name}) => name === branch.name) + 1)
.slice(branches.findIndex(({ name }) => name === branch.name) + 1)
// Exclude prerelease branches
.filter(({type}) => type !== 'prerelease')
.map(({channel}) => channel || null);
.filter(({ type }) => type !== "prerelease")
.map(({ channel }) => channel || null);
const versiontoAdd = uniqBy(
branch.tags.filter(
({channels, version}) =>
({ channels, version }) =>
!channels.includes(branch.channel || null) &&
intersection(channels, higherChannels).length > 0 &&
(branch.type !== 'maintenance' || semver.gte(version, getLowerBound(branch.mergeRange)))
(branch.type !== "maintenance" || semver.gte(version, getLowerBound(branch.mergeRange)))
),
'version'
"version"
).sort((a, b) => semver.compare(b.version, a.version))[0];
if (versiontoAdd) {
const {version, gitTag, channels} = versiontoAdd;
const lastRelease = getLastRelease(context, {before: version});
const { version, gitTag, channels } = versiontoAdd;
const lastRelease = getLastRelease(context, { before: version });
if (semver.gt(getLastRelease(context).version, version)) {
return;
}
const type = lastRelease.version ? semverDiff(lastRelease.version, version) : 'major';
const type = lastRelease.version ? semverDiff(lastRelease.version, version) : "major";
const name = makeTag(tagFormat, version);
return {
lastRelease,
currentRelease: {type, version, channels, gitTag, name, gitHead: gitTag},
currentRelease: { type, version, channels, gitTag, name, gitHead: gitTag },
nextRelease: {
type,
version,


@@ -1,10 +1,12 @@
const gitLogParser = require('git-log-parser');
const getStream = require('get-stream');
const execa = require('execa');
const debug = require('debug')('semantic-release:git');
const {GIT_NOTE_REF} = require('./definitions/constants');
import gitLogParser from "git-log-parser";
import getStream from "get-stream";
import { execa } from "execa";
import debugGit from "debug";
import { GIT_NOTE_REF } from "./definitions/constants.js";
Object.assign(gitLogParser.fields, {hash: 'H', message: 'B', gitTags: 'd', committerDate: {key: 'ci', type: Date}});
const debug = debugGit("semantic-release:git");
Object.assign(gitLogParser.fields, { hash: "H", message: "B", gitTags: "d", committerDate: { key: "ci", type: Date } });
/**
* Get the commit sha for a given tag.
@@ -14,8 +16,8 @@ Object.assign(gitLogParser.fields, {hash: 'H', message: 'B', gitTags: 'd', commi
*
* @return {String} The commit sha of the tag in parameter or `null`.
*/
async function getTagHead(tagName, execaOptions) {
return (await execa('git', ['rev-list', '-1', tagName], execaOptions)).stdout;
export async function getTagHead(tagName, execaOptions) {
return (await execa("git", ["rev-list", "-1", tagName], execaOptions)).stdout;
}
/**
@@ -27,9 +29,9 @@ async function getTagHead(tagName, execaOptions) {
* @return {Array<String>} List of git tags.
* @throws {Error} If the `git` command fails.
*/
async function getTags(branch, execaOptions) {
return (await execa('git', ['tag', '--merged', branch], execaOptions)).stdout
.split('\n')
export async function getTags(branch, execaOptions) {
return (await execa("git", ["tag", "--merged", branch], execaOptions)).stdout
.split("\n")
.map((tag) => tag.trim())
.filter(Boolean);
}
@@ -42,15 +44,15 @@ async function getTags(branch, execaOptions) {
* @param {Object} [execaOpts] Options to pass to `execa`.
* @return {Promise<Array<Object>>} The list of commits between `from` and `to`.
*/
async function getCommits(from, to, execaOptions) {
export async function getCommits(from, to, execaOptions) {
return (
await getStream.array(
gitLogParser.parse(
{_: `${from ? from + '..' : ''}${to}`},
{cwd: execaOptions.cwd, env: {...process.env, ...execaOptions.env}}
{ _: `${from ? from + ".." : ""}${to}` },
{ cwd: execaOptions.cwd, env: { ...process.env, ...execaOptions.env } }
)
)
).map(({message, gitTags, ...commit}) => ({...commit, message: message.trim(), gitTags: gitTags.trim()}));
).map(({ message, gitTags, ...commit }) => ({ ...commit, message: message.trim(), gitTags: gitTags.trim() }));
}
/**
@@ -62,9 +64,9 @@ async function getCommits(from, to, execaOptions) {
* @return {Array<String>} List of git branches.
* @throws {Error} If the `git` command fails.
*/
async function getBranches(repositoryUrl, execaOptions) {
return (await execa('git', ['ls-remote', '--heads', repositoryUrl], execaOptions)).stdout
.split('\n')
export async function getBranches(repositoryUrl, execaOptions) {
return (await execa("git", ["ls-remote", "--heads", repositoryUrl], execaOptions)).stdout
.split("\n")
.filter(Boolean)
.map((branch) => branch.match(/^.+refs\/heads\/(?<branch>.+)$/)[1]);
}
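`getBranches` above turns the raw `git ls-remote --heads` output into branch names by stripping everything up to `refs/heads/`. The parsing step on its own, with the same regular expression:

```javascript
// Sketch of the ls-remote output parsing in getBranches above.
function parseHeads(stdout) {
  return stdout
    .split("\n")
    .filter(Boolean)
    .map((line) => line.match(/^.+refs\/heads\/(?<branch>.+)$/)[1]);
}

console.log(parseHeads("4aa1b3d\trefs/heads/master\n9f2c0de\trefs/heads/beta\n"));
// ["master", "beta"]
```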
@@ -77,9 +79,9 @@ async function getBranches(repositoryUrl, execaOptions) {
*
* @return {Boolean} `true` if the reference exists, falsy otherwise.
*/
async function isRefExists(ref, execaOptions) {
export async function isRefExists(ref, execaOptions) {
try {
return (await execa('git', ['rev-parse', '--verify', ref], execaOptions)).exitCode === 0;
return (await execa("git", ["rev-parse", "--verify", ref], execaOptions)).exitCode === 0;
} catch (error) {
debug(error);
}
@@ -99,32 +101,32 @@ async function isRefExists(ref, execaOptions) {
* @param {String} branch The repository branch to fetch.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function fetch(repositoryUrl, branch, ciBranch, execaOptions) {
export async function fetch(repositoryUrl, branch, ciBranch, execaOptions) {
const isDetachedHead =
(await execa('git', ['rev-parse', '--abbrev-ref', 'HEAD'], {...execaOptions, reject: false})).stdout === 'HEAD';
(await execa("git", ["rev-parse", "--abbrev-ref", "HEAD"], { ...execaOptions, reject: false })).stdout === "HEAD";
try {
await execa(
'git',
"git",
[
'fetch',
'--unshallow',
'--tags',
"fetch",
"--unshallow",
"--tags",
...(branch === ciBranch && !isDetachedHead
? [repositoryUrl]
: ['--update-head-ok', repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]),
: ["--update-head-ok", repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]),
],
execaOptions
);
} catch {
await execa(
'git',
"git",
[
'fetch',
'--tags',
"fetch",
"--tags",
...(branch === ciBranch && !isDetachedHead
? [repositoryUrl]
: ['--update-head-ok', repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]),
: ["--update-head-ok", repositoryUrl, `+refs/heads/${branch}:refs/heads/${branch}`]),
],
execaOptions
);
@@ -137,15 +139,15 @@ async function fetch(repositoryUrl, branch, ciBranch, execaOptions) {
* @param {String} repositoryUrl The remote repository URL.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function fetchNotes(repositoryUrl, execaOptions) {
export async function fetchNotes(repositoryUrl, execaOptions) {
try {
await execa(
'git',
['fetch', '--unshallow', repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`],
"git",
["fetch", "--unshallow", repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`],
execaOptions
);
} catch {
await execa('git', ['fetch', repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`], {
await execa("git", ["fetch", repositoryUrl, `+refs/notes/${GIT_NOTE_REF}:refs/notes/${GIT_NOTE_REF}`], {
...execaOptions,
reject: false,
});
@@ -159,8 +161,8 @@ async function fetchNotes(repositoryUrl, execaOptions) {
*
* @return {String} the sha of the HEAD commit.
*/
async function getGitHead(execaOptions) {
return (await execa('git', ['rev-parse', 'HEAD'], execaOptions)).stdout;
export async function getGitHead(execaOptions) {
return (await execa("git", ["rev-parse", "HEAD"], execaOptions)).stdout;
}
/**
@@ -170,9 +172,9 @@ async function getGitHead(execaOptions) {
*
* @return {string} The value of the remote git URL.
*/
async function repoUrl(execaOptions) {
export async function repoUrl(execaOptions) {
try {
return (await execa('git', ['config', '--get', 'remote.origin.url'], execaOptions)).stdout;
return (await execa("git", ["config", "--get", "remote.origin.url"], execaOptions)).stdout;
} catch (error) {
debug(error);
}
@@ -185,9 +187,9 @@ async function repoUrl(execaOptions) {
*
* @return {Boolean} `true` if the current working directory is in a git repository, falsy otherwise.
*/
async function isGitRepo(execaOptions) {
export async function isGitRepo(execaOptions) {
try {
return (await execa('git', ['rev-parse', '--git-dir'], execaOptions)).exitCode === 0;
return (await execa("git", ["rev-parse", "--git-dir"], execaOptions)).exitCode === 0;
} catch (error) {
debug(error);
}
@@ -202,9 +204,9 @@ async function isGitRepo(execaOptions) {
*
* @throws {Error} if not authorized to push.
*/
async function verifyAuth(repositoryUrl, branch, execaOptions) {
export async function verifyAuth(repositoryUrl, branch, execaOptions) {
try {
await execa('git', ['push', '--dry-run', '--no-verify', repositoryUrl, `HEAD:${branch}`], execaOptions);
await execa("git", ["push", "--dry-run", "--no-verify", repositoryUrl, `HEAD:${branch}`], execaOptions);
} catch (error) {
debug(error);
throw error;
@@ -220,8 +222,8 @@ async function verifyAuth(repositoryUrl, branch, execaOptions) {
*
* @throws {Error} if the tag creation failed.
*/
async function tag(tagName, ref, execaOptions) {
await execa('git', ['tag', tagName, ref], execaOptions);
export async function tag(tagName, ref, execaOptions) {
await execa("git", ["tag", tagName, ref], execaOptions);
}
/**
@@ -232,8 +234,8 @@ async function tag(tagName, ref, execaOptions) {
*
* @throws {Error} if the push failed.
*/
async function push(repositoryUrl, execaOptions) {
await execa('git', ['push', '--tags', repositoryUrl], execaOptions);
export async function push(repositoryUrl, execaOptions) {
await execa("git", ["push", "--tags", repositoryUrl], execaOptions);
}
/**
@@ -244,8 +246,8 @@ async function push(repositoryUrl, execaOptions) {
*
* @throws {Error} if the push failed.
*/
async function pushNotes(repositoryUrl, execaOptions) {
await execa('git', ['push', repositoryUrl, `refs/notes/${GIT_NOTE_REF}`], execaOptions);
export async function pushNotes(repositoryUrl, execaOptions) {
await execa("git", ["push", repositoryUrl, `refs/notes/${GIT_NOTE_REF}`], execaOptions);
}
/**
@@ -256,9 +258,9 @@ async function pushNotes(repositoryUrl, execaOptions) {
*
* @return {Boolean} `true` if valid, falsy otherwise.
*/
async function verifyTagName(tagName, execaOptions) {
export async function verifyTagName(tagName, execaOptions) {
try {
return (await execa('git', ['check-ref-format', `refs/tags/${tagName}`], execaOptions)).exitCode === 0;
return (await execa("git", ["check-ref-format", `refs/tags/${tagName}`], execaOptions)).exitCode === 0;
} catch (error) {
debug(error);
}
@@ -272,9 +274,9 @@ async function verifyTagName(tagName, execaOptions) {
*
* @return {Boolean} `true` if valid, falsy otherwise.
*/
async function verifyBranchName(branch, execaOptions) {
export async function verifyBranchName(branch, execaOptions) {
try {
return (await execa('git', ['check-ref-format', `refs/heads/${branch}`], execaOptions)).exitCode === 0;
return (await execa("git", ["check-ref-format", `refs/heads/${branch}`], execaOptions)).exitCode === 0;
} catch (error) {
debug(error);
}
@@ -289,10 +291,10 @@ async function verifyBranchName(branch, execaOptions) {
*
 * @return {Boolean} `true` if the HEAD of the current local branch is the same as the HEAD of the remote branch, falsy otherwise.
*/
async function isBranchUpToDate(repositoryUrl, branch, execaOptions) {
export async function isBranchUpToDate(repositoryUrl, branch, execaOptions) {
return (
(await getGitHead(execaOptions)) ===
(await execa('git', ['ls-remote', '--heads', repositoryUrl, branch], execaOptions)).stdout.match(/^(?<ref>\w+)?/)[1]
(await execa("git", ["ls-remote", "--heads", repositoryUrl, branch], execaOptions)).stdout.match(/^(?<ref>\w+)?/)[1]
);
}
@@ -304,9 +306,9 @@ async function isBranchUpToDate(repositoryUrl, branch, execaOptions) {
*
* @return {Object} the parsed JSON note if there is one, an empty object otherwise.
*/
async function getNote(ref, execaOptions) {
export async function getNote(ref, execaOptions) {
try {
return JSON.parse((await execa('git', ['notes', '--ref', GIT_NOTE_REF, 'show', ref], execaOptions)).stdout);
return JSON.parse((await execa("git", ["notes", "--ref", GIT_NOTE_REF, "show", ref], execaOptions)).stdout);
} catch (error) {
if (error.exitCode === 1) {
return {};
@@ -324,28 +326,6 @@ async function getNote(ref, execaOptions) {
* @param {String} ref The Git reference to add the note to.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function addNote(note, ref, execaOptions) {
await execa('git', ['notes', '--ref', GIT_NOTE_REF, 'add', '-f', '-m', JSON.stringify(note), ref], execaOptions);
export async function addNote(note, ref, execaOptions) {
await execa("git", ["notes", "--ref", GIT_NOTE_REF, "add", "-f", "-m", JSON.stringify(note), ref], execaOptions);
}
module.exports = {
getTagHead,
getTags,
getCommits,
getBranches,
isRefExists,
fetch,
fetchNotes,
getGitHead,
repoUrl,
isGitRepo,
verifyAuth,
tag,
push,
pushNotes,
verifyTagName,
isBranchUpToDate,
verifyBranchName,
getNote,
addNote,
};
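This hunk captures the core of the CommonJS-to-ESM move: each helper gains an `export` keyword at its declaration site, which makes the `module.exports` aggregate at the bottom of the file redundant, so it is deleted. A minimal sketch of the same conversion, using a hypothetical helper rather than the real execa-backed `tag()`:

```javascript
// Before (CommonJS): helpers stayed private until listed in the aggregate.
//   async function tag(tagName, ref, execaOptions) { ... }
//   module.exports = { tag, push, ... };
//
// After (ESM): the export happens at the declaration, so no aggregate is
// needed. `formatTagArgs` is a hypothetical stand-in for the execa call.
export function formatTagArgs(tagName, ref) {
  return ["tag", tagName, ref];
}
```

Consumers correspondingly switch from destructuring `require('./git')` to `import { tag } from './git.js'`, with the `.js` extension now mandatory in relative ESM specifiers.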


@@ -1,10 +1,10 @@
const {escapeRegExp, size, isString} = require('lodash');
const {SECRET_REPLACEMENT, SECRET_MIN_SIZE} = require('./definitions/constants');
import { escapeRegExp, isString, size } from "lodash-es";
import { SECRET_MIN_SIZE, SECRET_REPLACEMENT } from "./definitions/constants.js";
module.exports = (env) => {
export default (env) => {
const toReplace = Object.keys(env).filter((envVar) => {
// https://github.com/semantic-release/semantic-release/issues/1558
if (envVar === 'GOPRIVATE') {
if (envVar === "GOPRIVATE") {
return false;
}
@@ -12,8 +12,8 @@ module.exports = (env) => {
});
const regexp = new RegExp(
toReplace.map((envVar) => `${escapeRegExp(env[envVar])}|${escapeRegExp(encodeURI(env[envVar]))}`).join('|'),
'g'
toReplace.map((envVar) => `${escapeRegExp(env[envVar])}|${escapeRegExp(encodeURI(env[envVar]))}`).join("|"),
"g"
);
return (output) =>
output && isString(output) && toReplace.length > 0 ? output.toString().replace(regexp, SECRET_REPLACEMENT) : output;
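The replaced module builds a single global regexp out of every secret-bearing environment variable, matching both the raw value and its `encodeURI` form so that secrets embedded in URLs are caught too. A simplified, self-contained sketch (the env-var name filter and the `[secure]` replacement string are stand-ins for the real `SECRET_MIN_SIZE` and `SECRET_REPLACEMENT` logic from `definitions/constants.js`):

```javascript
const SECRET_REPLACEMENT = "[secure]"; // stand-in for the real constant

// minimal stand-in for lodash-es `escapeRegExp`
const escapeRegExp = (s) => s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");

function hideSensitive(env) {
  // simplified filter; the real module also enforces a minimum secret size
  // and skips GOPRIVATE
  const toReplace = Object.keys(env).filter((name) => /TOKEN|PASSWORD|SECRET/.test(name));
  // one alternation per secret, covering the raw and URI-encoded forms
  const regexp = new RegExp(
    toReplace.map((name) => `${escapeRegExp(env[name])}|${escapeRegExp(encodeURI(env[name]))}`).join("|"),
    "g"
  );
  return (output) => (output && toReplace.length > 0 ? output.replace(regexp, SECRET_REPLACEMENT) : output);
}
```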


@@ -1,12 +1,12 @@
const {identity, isPlainObject, omit, castArray, isNil, isString} = require('lodash');
const AggregateError = require('aggregate-error');
const getError = require('../get-error');
const PLUGINS_DEFINITIONS = require('../definitions/plugins');
const {validatePlugin, validateStep, loadPlugin, parseConfig} = require('./utils');
const pipeline = require('./pipeline');
const normalize = require('./normalize');
import {castArray, identity, isNil, isPlainObject, isString, omit} from 'lodash-es';
import AggregateError from 'aggregate-error';
import getError from '../get-error.js';
import PLUGINS_DEFINITIONS from '../definitions/plugins.js';
import {loadPlugin, parseConfig, validatePlugin, validateStep} from './utils.js';
import pipeline from './pipeline.js';
import normalize from './normalize.js';
module.exports = async (context, pluginsPath) => {
export default async (context, pluginsPath) => {
let {options, logger} = context;
const errors = [];
@@ -100,4 +100,4 @@ module.exports = async (context, pluginsPath) => {
}
return pluginsConfig;
};
}


@@ -1,11 +1,13 @@
const {isPlainObject, isFunction, noop, cloneDeep, omit} = require('lodash');
const debug = require('debug')('semantic-release:plugins');
const getError = require('../get-error');
const {extractErrors} = require('../utils');
const PLUGINS_DEFINITIONS = require('../definitions/plugins');
const {loadPlugin, parseConfig} = require('./utils');
import {cloneDeep, isFunction, isPlainObject, noop, omit} from 'lodash-es';
import debugPlugins from 'debug';
import getError from '../get-error.js';
import {extractErrors} from '../utils.js';
import PLUGINS_DEFINITIONS from '../definitions/plugins.js';
import {loadPlugin, parseConfig} from './utils.js';
module.exports = async (context, type, pluginOpt, pluginsPath) => {
const debug = debugPlugins('semantic-release:plugins');
export default async (context, type, pluginOpt, pluginsPath) => {
const {stdout, stderr, options, logger} = context;
if (!pluginOpt) {
return noop;
@@ -64,4 +66,4 @@ module.exports = async (context, type, pluginOpt, pluginsPath) => {
}
return validator;
};
}


@@ -1,7 +1,7 @@
const {identity} = require('lodash');
const pReduce = require('p-reduce');
const AggregateError = require('aggregate-error');
const {extractErrors} = require('../utils');
import {identity} from 'lodash-es';
import pReduce from 'p-reduce';
import AggregateError from 'aggregate-error';
import {extractErrors} from '../utils.js';
/**
 * A Function that executes a list of functions sequentially. If at least one Function in the pipeline throws an Error or rejects, the pipeline function rejects as well.
@@ -23,9 +23,9 @@ const {extractErrors} = require('../utils');
 * @param {Function} [options.getNextInput=identity] Function called after each step is executed, with the last step input and the current step result; the returned value will be used as the input of the next step.
* @param {Function} [options.transform=identity] Function called after each step is executed, with the current step result, the step function and the last step input; the returned value will be saved in the pipeline results.
*
* @return {Pipeline} A Function that execute the `steps` sequencially
* @return {Pipeline} A Function that execute the `steps` sequentially
*/
module.exports = (steps, {settleAll = false, getNextInput = identity, transform = identity} = {}) => async (input) => {
export default (steps, {settleAll = false, getNextInput = identity, transform = identity} = {}) => async (input) => {
const results = [];
const errors = [];
await pReduce(
@@ -55,4 +55,4 @@ module.exports = (steps, {settleAll = false, getNextInput = identity, transform
}
return results;
};
}
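`pipeline` runs its steps strictly one after another: `getNextInput` decides what the next step receives, and `transform` decides what gets recorded in the results. A reduced sketch of that control flow, with a plain loop in place of `p-reduce` and error aggregation and `settleAll` omitted:

```javascript
const identity = (x) => x;

// Simplified pipeline: await each step in order, record the transformed
// result, and compute the next step's input from the last input and the
// current result.
function pipeline(steps, { getNextInput = identity, transform = identity } = {}) {
  return async (input) => {
    const results = [];
    let currentInput = input;
    for (const step of steps) {
      const result = await step(currentInput);
      results.push(transform(result, step, currentInput));
      currentInput = getNextInput(currentInput, result);
    }
    return results;
  };
}
```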


@@ -1,6 +1,9 @@
const {dirname} = require('path');
const {isString, isFunction, castArray, isArray, isPlainObject, isNil} = require('lodash');
const resolveFrom = require('resolve-from');
import {dirname} from 'node:path';
import {fileURLToPath} from 'node:url';
import {castArray, isArray, isFunction, isNil, isPlainObject, isString} from 'lodash-es';
import resolveFrom from 'resolve-from';
const __dirname = dirname(fileURLToPath(import.meta.url));
const validateSteps = (conf) => {
return conf.every((conf) => {
@@ -24,7 +27,7 @@ const validateSteps = (conf) => {
});
};
function validatePlugin(conf) {
export function validatePlugin(conf) {
return (
isString(conf) ||
(isArray(conf) &&
@@ -35,7 +38,7 @@ function validatePlugin(conf) {
);
}
function validateStep({required}, conf) {
export function validateStep({required}, conf) {
conf = castArray(conf).filter(Boolean);
if (required) {
return conf.length >= 1 && validateSteps(conf);
@@ -44,7 +47,7 @@ function validateStep({required}, conf) {
return conf.length === 0 || validateSteps(conf);
}
async function loadPlugin({cwd}, name, pluginsPath) {
export async function loadPlugin({cwd}, name, pluginsPath) {
const basePath = pluginsPath[name]
? dirname(resolveFrom.silent(__dirname, pluginsPath[name]) || resolveFrom(cwd, pluginsPath[name]))
: __dirname;
@@ -54,7 +57,7 @@ async function loadPlugin({cwd}, name, pluginsPath) {
return isFunction(name) ? name : (await import(resolveFrom.silent(basePath, name) || resolveFrom(cwd, name))).default;
}
function parseConfig(plugin) {
export function parseConfig(plugin) {
let path;
let config;
if (isArray(plugin)) {
@@ -67,5 +70,3 @@ function parseConfig(plugin) {
return [path, config || {}];
}
module.exports = {validatePlugin, validateStep, loadPlugin, parseConfig};
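Besides the export rewrite, this file needs the classic ESM shim added at the top of the hunk: `__dirname` does not exist in ES modules, so it is rebuilt from `import.meta.url`. The pattern in isolation:

```javascript
import { dirname } from "node:path";
import { fileURLToPath } from "node:url";

// `import.meta.url` is a `file://` URL pointing at the current module;
// converting it to a filesystem path and taking its directory recreates
// the CommonJS `__dirname`.
const __dirname = dirname(fileURLToPath(import.meta.url));
```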


@@ -1,12 +1,12 @@
const {isFunction, union, template} = require('lodash');
const semver = require('semver');
const hideSensitive = require('./hide-sensitive');
import { isFunction, template, union } from "lodash-es";
import semver from "semver";
import hideSensitive from "./hide-sensitive.js";
function extractErrors(err) {
return err && isFunction(err[Symbol.iterator]) ? [...err] : [err];
export function extractErrors(err) {
return err && err.errors ? [...err.errors] : [err];
}
function hideSensitiveValues(env, objs) {
export function hideSensitiveValues(env, objs) {
const hideFunction = hideSensitive(env);
return objs.map((object) => {
Object.getOwnPropertyNames(object).forEach((prop) => {
@@ -18,51 +18,51 @@ function hideSensitiveValues(env, objs) {
});
}
function tagsToVersions(tags) {
return tags.map(({version}) => version);
export function tagsToVersions(tags) {
return tags.map(({ version }) => version);
}
function isMajorRange(range) {
export function isMajorRange(range) {
return /^\d+\.x(?:\.x)?$/i.test(range);
}
function isMaintenanceRange(range) {
export function isMaintenanceRange(range) {
return /^\d+\.(?:\d+|x)(?:\.x)?$/i.test(range);
}
function getUpperBound(range) {
export function getUpperBound(range) {
const result = semver.valid(range)
? range
: ((semver.validRange(range) || '').match(/<(?<upperBound>\d+\.\d+\.\d+(-\d+)?)$/) || [])[1];
: ((semver.validRange(range) || "").match(/<(?<upperBound>\d+\.\d+\.\d+(-\d+)?)$/) || [])[1];
return result
? // https://github.com/npm/node-semver/issues/322
result.replace(/-\d+$/, '')
result.replace(/-\d+$/, "")
: result;
}
function getLowerBound(range) {
return ((semver.validRange(range) || '').match(/(?<lowerBound>\d+\.\d+\.\d+)/) || [])[1];
export function getLowerBound(range) {
return ((semver.validRange(range) || "").match(/(?<lowerBound>\d+\.\d+\.\d+)/) || [])[1];
}
function highest(version1, version2) {
export function highest(version1, version2) {
return version1 && version2 ? (semver.gt(version1, version2) ? version1 : version2) : version1 || version2;
}
function lowest(version1, version2) {
export function lowest(version1, version2) {
return version1 && version2 ? (semver.lt(version1, version2) ? version1 : version2) : version1 || version2;
}
function getLatestVersion(versions, {withPrerelease} = {}) {
export function getLatestVersion(versions, { withPrerelease } = {}) {
return versions.filter((version) => withPrerelease || !semver.prerelease(version)).sort(semver.rcompare)[0];
}
function getEarliestVersion(versions, {withPrerelease} = {}) {
export function getEarliestVersion(versions, { withPrerelease } = {}) {
return versions.filter((version) => withPrerelease || !semver.prerelease(version)).sort(semver.compare)[0];
}
function getFirstVersion(versions, lowerBranches) {
const lowerVersion = union(...lowerBranches.map(({tags}) => tagsToVersions(tags))).sort(semver.rcompare);
export function getFirstVersion(versions, lowerBranches) {
const lowerVersion = union(...lowerBranches.map(({ tags }) => tagsToVersions(tags))).sort(semver.rcompare);
if (lowerVersion[0]) {
return versions.sort(semver.compare).find((version) => semver.gt(version, lowerVersion[0]));
}
@@ -70,32 +70,14 @@ function getFirstVersion(versions, lowerBranches) {
return getEarliestVersion(versions);
}
function getRange(min, max) {
return `>=${min}${max ? ` <${max}` : ''}`;
export function getRange(min, max) {
return `>=${min}${max ? ` <${max}` : ""}`;
}
function makeTag(tagFormat, version) {
return template(tagFormat)({version});
export function makeTag(tagFormat, version) {
return template(tagFormat)({ version });
}
function isSameChannel(channel, otherChannel) {
export function isSameChannel(channel, otherChannel) {
return channel === otherChannel || (!channel && !otherChannel);
}
module.exports = {
extractErrors,
hideSensitiveValues,
tagsToVersions,
isMajorRange,
isMaintenanceRange,
getUpperBound,
getLowerBound,
highest,
lowest,
getLatestVersion,
getEarliestVersion,
getFirstVersion,
getRange,
makeTag,
isSameChannel,
};
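The `extractErrors` change near the top of this file is behavioral, not just syntactic: wrapped errors are no longer spread via the error's iterator; they are read from the `errors` property instead, which `aggregate-error@4` and the native `AggregateError` both expose. The new shape, runnable on its own:

```javascript
// If the error carries an `errors` array (native AggregateError and
// aggregate-error@4 both do), unwrap it; otherwise wrap the single error.
export function extractErrors(err) {
  return err && err.errors ? [...err.errors] : [err];
}
```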


@@ -1,39 +1,39 @@
const {template, isString, isPlainObject} = require('lodash');
const AggregateError = require('aggregate-error');
const {isGitRepo, verifyTagName} = require('./git');
const getError = require('./get-error');
import { isPlainObject, isString, template } from "lodash-es";
import AggregateError from "aggregate-error";
import { isGitRepo, verifyTagName } from "./git.js";
import getError from "./get-error.js";
module.exports = async (context) => {
export default async (context) => {
const {
cwd,
env,
options: {repositoryUrl, tagFormat, branches},
options: { repositoryUrl, tagFormat, branches },
} = context;
const errors = [];
if (!(await isGitRepo({cwd, env}))) {
errors.push(getError('ENOGITREPO', {cwd}));
if (!(await isGitRepo({ cwd, env }))) {
errors.push(getError("ENOGITREPO", { cwd }));
} else if (!repositoryUrl) {
errors.push(getError('ENOREPOURL'));
errors.push(getError("ENOREPOURL"));
}
  // Verify that compiling the `tagFormat` produces a valid Git tag
if (!(await verifyTagName(template(tagFormat)({version: '0.0.0'})))) {
errors.push(getError('EINVALIDTAGFORMAT', context));
if (!(await verifyTagName(template(tagFormat)({ version: "0.0.0" })))) {
errors.push(getError("EINVALIDTAGFORMAT", context));
}
// Verify the `tagFormat` contains the variable `version` by compiling the `tagFormat` template
// with a space as the `version` value and verify the result contains the space.
  // The space is used as it's an invalid tag character, so it's guaranteed to not be present in the `tagFormat`.
if ((template(tagFormat)({version: ' '}).match(/ /g) || []).length !== 1) {
errors.push(getError('ETAGNOVERSION', context));
if ((template(tagFormat)({ version: " " }).match(/ /g) || []).length !== 1) {
errors.push(getError("ETAGNOVERSION", context));
}
branches.forEach((branch) => {
if (
!((isString(branch) && branch.trim()) || (isPlainObject(branch) && isString(branch.name) && branch.name.trim()))
) {
errors.push(getError('EINVALIDBRANCH', {branch}));
errors.push(getError("EINVALIDBRANCH", { branch }));
}
});
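The space-count check above relies on a neat trick: since a space is an illegal character in a Git tag, compiling the `tagFormat` template with `' '` as the version and finding exactly one space proves the template interpolates `version` exactly once. A self-contained sketch (the inline `template` is a minimal stand-in for the lodash-es one the real code uses):

```javascript
// Minimal stand-in for lodash `template`, supporting only `${version}`.
const template = (format) => (vars) => format.replace(/\$\{version\}/g, vars.version);

function tagFormatReferencesVersionOnce(tagFormat) {
  // A space cannot legally appear in a tag name, so any space in the
  // compiled result must have come from substituting `version`.
  return (template(tagFormat)({ version: " " }).match(/ /g) || []).length === 1;
}
```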

package-lock.json (generated): 14646 lines changed; diff suppressed because it is too large.


@@ -2,11 +2,16 @@
"name": "semantic-release",
"description": "Automated semver compliant package publishing",
"version": "0.0.0-development",
"type": "module",
"author": "Stephan Bönnemann <stephan@boennemann.me> (http://boennemann.me)",
"ava": {
"files": [
"test/**/*.test.js"
],
"nodeArguments": [
"--loader=testdouble",
"--no-warnings"
],
"timeout": "2m"
},
"bin": {
@@ -17,7 +22,8 @@
},
"contributors": [
"Gregor Martynus (https://twitter.com/gr2m)",
"Pierre Vanduynslager (https://twitter.com/@pvdlg_)"
"Pierre Vanduynslager (https://twitter.com/@pvdlg_)",
"Matt Travi <npm@travi.org> (https://matt.travi.org/)"
],
"dependencies": {
"@semantic-release/commit-analyzer": "^9.0.2",
@@ -25,29 +31,29 @@
"@semantic-release/github": "^8.0.0",
"@semantic-release/npm": "^9.0.0",
"@semantic-release/release-notes-generator": "^10.0.0",
"aggregate-error": "^3.0.0",
"aggregate-error": "^4.0.1",
"cosmiconfig": "^7.0.0",
"debug": "^4.0.0",
"env-ci": "^5.0.0",
"execa": "^5.0.0",
"figures": "^3.0.0",
"find-versions": "^4.0.0",
"env-ci": "^8.0.0",
"execa": "^6.1.0",
"figures": "^5.0.0",
"find-versions": "^5.1.0",
"get-stream": "^6.0.0",
"git-log-parser": "^1.2.0",
"hook-std": "^2.0.0",
"hosted-git-info": "^4.0.0",
"lodash": "^4.17.21",
"marked": "^4.0.10",
"marked-terminal": "^5.0.0",
"hook-std": "^3.0.0",
"hosted-git-info": "^5.1.0",
"lodash-es": "^4.17.21",
"marked": "^4.1.0",
"marked-terminal": "^5.1.1",
"micromatch": "^4.0.2",
"p-each-series": "^2.1.0",
"p-reduce": "^2.0.0",
"read-pkg-up": "^7.0.0",
"p-each-series": "^3.0.0",
"p-reduce": "^3.0.0",
"read-pkg-up": "^9.1.0",
"resolve-from": "^5.0.0",
"semver": "^7.3.2",
"semver-diff": "^3.1.1",
"signale": "^1.2.1",
"yargs": "^16.2.0"
"yargs": "^17.5.1"
},
"devDependencies": {
"ava": "4.3.3",
@@ -56,21 +62,21 @@
"codecov": "3.8.3",
"delay": "5.0.0",
"dockerode": "3.3.4",
"file-url": "3.0.0",
"fs-extra": "9.1.0",
"got": "11.8.5",
"file-url": "^4.0.0",
"fs-extra": "^10.1.0",
"got": "^12.5.0",
"js-yaml": "4.1.0",
"mockserver-client": "5.14.0",
"nock": "13.2.9",
"p-retry": "4.6.2",
"p-retry": "^5.1.1",
"prettier": "^2.7.1",
"sinon": "14.0.0",
"stream-buffers": "3.0.2",
"tempy": "1.0.1",
"testdouble": "3.16.6",
"xo": "0.32.1"
"tempy": "^3.0.0",
"testdouble": "3.16.6"
},
"engines": {
"node": ">=16 || ^14.17"
"node": ">=18"
},
"files": [
"bin",
@@ -119,19 +125,12 @@
},
"scripts": {
"codecov": "codecov -f coverage/coverage-final.json",
"lint": "xo",
"lint": "prettier --check \"*.{js,json,md}\" \".github/**/*.{md,yml}\" \"docs/**/*.md\" \"{bin,lib,test}/*.js\"",
"lint:fix": "prettier --write \"*.{js,json,md}\" \".github/**/*.{md,yml}\" \"docs/**/*.md\" \"{bin,lib,test}/*.js\"",
"pretest": "npm run lint",
"semantic-release": "./bin/semantic-release.js",
"test": "c8 ava -v",
"test:ci": "c8 ava -v"
},
"xo": {
"prettier": true,
"space": true,
"rules": {
"unicorn/no-reduce": "off",
"unicorn/string-content": "off"
}
"test": "c8 ava --verbose",
"test:ci": "c8 ava --verbose"
},
"renovate": {
"extends": [


@@ -1,7 +1,7 @@
const test = require('ava');
const {union} = require('lodash');
const semver = require('semver');
const td = require('testdouble');
import test from 'ava';
import {union} from 'lodash-es';
import semver from 'semver';
import * as td from 'testdouble';
const getBranch = (branches, branch) => branches.find(({name}) => name === branch);
const release = (branches, name, version) => getBranch(branches, name).tags.push({version});
@@ -11,8 +11,21 @@ const merge = (branches, source, target, tag) => {
getBranch(branches, target).tags
);
};
const remoteBranches = [];
const repositoryUrl = 'repositoryUrl';
let expand, getTags, getBranches;
test('Enforce ranges with branching release workflow', async (t) => {
test.beforeEach(async (t) => {
getTags = (await td.replaceEsm('../../lib/branches/get-tags.js')).default;
expand = (await td.replaceEsm('../../lib/branches/expand.js')).default;
getBranches = (await import('../../lib/branches/index.js')).default;
})
test.afterEach.always((t) => {
td.reset();
});
test.serial('Enforce ranges with branching release workflow', async (t) => {
const branches = [
{name: '1.x', tags: []},
{name: '1.0.x', tags: []},
@@ -22,14 +35,11 @@ test('Enforce ranges with branching release workflow', async (t) => {
{name: 'beta', prerelease: true, tags: []},
{name: 'alpha', prerelease: true, tags: []},
];
td.replace('../../lib/branches/get-tags', () => branches);
td.replace('../../lib/branches/expand', () => []);
const getBranches = require('../../lib/branches');
const context = {options: {branches}};
td.when(expand(repositoryUrl, context, branches)).thenResolve(remoteBranches);
td.when(getTags(context, remoteBranches)).thenResolve(branches);
let result = (await getBranches('repositoryUrl', 'master', {options: {branches}})).map(({name, range}) => ({
name,
range,
}));
let result = (await getBranches(repositoryUrl, 'master', context)).map(({name, range}) => ({name, range,}));
t.is(getBranch(result, '1.0.x').range, '>=1.0.0 <1.0.0', 'Cannot release on 1.0.x before a releasing on master');
t.is(getBranch(result, '1.x').range, '>=1.1.0 <1.0.0', 'Cannot release on 1.x before a releasing on master');
t.is(getBranch(result, 'master').range, '>=1.0.0');
@@ -37,10 +47,7 @@ test('Enforce ranges with branching release workflow', async (t) => {
t.is(getBranch(result, 'next-major').range, '>=1.0.0');
release(branches, 'master', '1.0.0');
result = (await getBranches('repositoryUrl', 'master', {options: {branches}})).map(({name, range}) => ({
name,
range,
}));
result = (await getBranches('repositoryUrl', 'master', context)).map(({name, range}) => ({name, range}));
t.is(getBranch(result, '1.0.x').range, '>=1.0.0 <1.0.0', 'Cannot release on 1.0.x before a releasing on master');
t.is(getBranch(result, '1.x').range, '>=1.1.0 <1.0.0', 'Cannot release on 1.x before a releasing on master');
t.is(getBranch(result, 'master').range, '>=1.0.0');
@@ -191,7 +198,7 @@ test('Enforce ranges with branching release workflow', async (t) => {
t.is(getBranch(result, '1.x').range, '>=1.2.0 <2.0.0', 'Can release on 1.x only within range');
});
test('Throw SemanticReleaseError for invalid configurations', async (t) => {
test.serial('Throw SemanticReleaseError for invalid configurations', async (t) => {
const branches = [
{name: '123', range: '123', tags: []},
{name: '1.x', tags: []},
@@ -201,10 +208,12 @@ test('Throw SemanticReleaseError for invalid configurations', async (t) => {
{name: 'alpha', prerelease: 'alpha', tags: []},
{name: 'preview', prerelease: 'alpha', tags: []},
];
td.replace('../../lib/branches/get-tags', () => branches);
td.replace('../../lib/branches/expand', () => []);
const getBranches = require('../../lib/branches');
const errors = [...(await t.throwsAsync(getBranches('repositoryUrl', 'master', {options: {branches}})))];
const context = {options: {branches}};
td.when(expand(repositoryUrl, context, branches)).thenResolve(remoteBranches);
td.when(getTags(context, remoteBranches)).thenResolve(branches);
const error = await t.throwsAsync(getBranches(repositoryUrl, 'master', context));
const errors = [...error.errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'EMAINTENANCEBRANCH');
@@ -228,16 +237,16 @@ test('Throw SemanticReleaseError for invalid configurations', async (t) => {
t.truthy(errors[4].details);
});
test('Throw a SemanticReleaseError if there is duplicate branches', async (t) => {
test.serial('Throw a SemanticReleaseError if there is duplicate branches', async (t) => {
const branches = [
{name: 'master', tags: []},
{name: 'master', tags: []},
];
td.replace('../../lib/branches/get-tags', () => branches);
td.replace('../../lib/branches/expand', () => []);
const getBranches = require('../../lib/branches');
const context = {options: {branches}};
td.when(expand(repositoryUrl, context, branches)).thenResolve(remoteBranches);
td.when(getTags(context, remoteBranches)).thenResolve(branches);
const errors = [...(await t.throwsAsync(getBranches('repositoryUrl', 'master', {options: {branches}})))];
const errors = [...(await t.throwsAsync(getBranches(repositoryUrl, 'master', context))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'EDUPLICATEBRANCHES');
@ -245,16 +254,17 @@ test('Throw a SemanticReleaseError if there is duplicate branches', async (t) =>
t.truthy(errors[0].details);
});
test('Throw a SemanticReleaseError for each invalid branch name', async (t) => {
test.serial('Throw a SemanticReleaseError for each invalid branch name', async (t) => {
const branches = [
{name: '~master', tags: []},
{name: '^master', tags: []},
];
td.replace('../../lib/branches/get-tags', () => branches);
td.replace('../../lib/branches/expand', () => []);
const getBranches = require('../../lib/branches');
const context = {options: {branches}};
const remoteBranches = [];
td.when(expand(repositoryUrl, context, branches)).thenResolve(remoteBranches);
td.when(getTags(context, remoteBranches)).thenResolve(branches);
const errors = [...(await t.throwsAsync(getBranches('repositoryUrl', 'master', {options: {branches}})))];
const errors = [...(await t.throwsAsync(getBranches(repositoryUrl, 'master', context))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'EINVALIDBRANCHNAME');


@@ -1,6 +1,6 @@
const test = require('ava');
const expand = require('../../lib/branches/expand');
const {gitRepo, gitCommits, gitCheckout, gitPush} = require('../helpers/git-utils');
import test from 'ava';
import expand from '../../lib/branches/expand.js';
import {gitCheckout, gitCommits, gitPush, gitRepo} from '../helpers/git-utils.js';
test('Expand branches defined with globs', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);


@@ -1,6 +1,6 @@
const test = require('ava');
const getTags = require('../../lib/branches/get-tags');
const {gitRepo, gitCommits, gitTagVersion, gitCheckout, gitAddNote} = require('../helpers/git-utils');
import test from 'ava';
import getTags from '../../lib/branches/get-tags.js';
import {gitAddNote, gitCheckout, gitCommits, gitRepo, gitTagVersion} from '../helpers/git-utils.js';
test('Get the valid tags', async (t) => {
const {cwd} = await gitRepo();


@@ -1,5 +1,5 @@
const test = require('ava');
const normalize = require('../../lib/branches/normalize');
import test from 'ava';
import * as normalize from '../../lib/branches/normalize.js';
const toTags = (versions) => versions.map((version) => ({version}));


@@ -1,19 +1,19 @@
const test = require('ava');
const {escapeRegExp} = require('lodash');
const td = require('testdouble');
const {stub} = require('sinon');
const {SECRET_REPLACEMENT} = require('../lib/definitions/constants');
import test from "ava";
import { escapeRegExp } from "lodash-es";
import * as td from "testdouble";
import { stub } from "sinon";
import { SECRET_REPLACEMENT } from "../lib/definitions/constants.js";
let previousArgv;
let previousEnv;
test.beforeEach((t) => {
t.context.logs = '';
t.context.errors = '';
t.context.stdout = stub(process.stdout, 'write').callsFake((value) => {
t.context.logs = "";
t.context.errors = "";
t.context.stdout = stub(process.stdout, "write").callsFake((value) => {
t.context.logs += value.toString();
});
t.context.stderr = stub(process.stderr, 'write').callsFake((value) => {
t.context.stderr = stub(process.stderr, "write").callsFake((value) => {
t.context.errors += value.toString();
});
@@ -27,166 +27,206 @@ test.afterEach.always((t) => {
process.argv = previousArgv;
process.env = previousEnv;
td.reset();
});
test.serial('Pass options to semantic-release API', async (t) => {
const run = stub().resolves(true);
test.serial("Pass options to semantic-release API", async (t) => {
const argv = [
'',
'',
'-b',
'master',
'next',
'-r',
'https://github/com/owner/repo.git',
'-t',
"",
"",
"-b",
"master",
"next",
"-r",
"https://github/com/owner/repo.git",
"-t",
`v\${version}`,
'-p',
'plugin1',
'plugin2',
'-e',
'config1',
'config2',
'--verify-conditions',
'condition1',
'condition2',
'--analyze-commits',
'analyze',
'--verify-release',
'verify1',
'verify2',
'--generate-notes',
'notes',
'--prepare',
'prepare1',
'prepare2',
'--publish',
'publish1',
'publish2',
'--success',
'success1',
'success2',
'--fail',
'fail1',
'fail2',
'--debug',
'-d',
"-p",
"plugin1",
"plugin2",
"-e",
"config1",
"config2",
"--verify-conditions",
"condition1",
"condition2",
"--analyze-commits",
"analyze",
"--verify-release",
"verify1",
"verify2",
"--generate-notes",
"notes",
"--prepare",
"prepare1",
"prepare2",
"--publish",
"publish1",
"publish2",
"--success",
"success1",
"success2",
"--fail",
"fail1",
"fail2",
"--debug",
"-d",
];
td.replace('..', run);
const index = await td.replaceEsm("../index.js");
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
t.deepEqual(run.args[0][0].branches, ['master', 'next']);
t.is(run.args[0][0].repositoryUrl, 'https://github/com/owner/repo.git');
t.is(run.args[0][0].tagFormat, `v\${version}`);
t.deepEqual(run.args[0][0].plugins, ['plugin1', 'plugin2']);
t.deepEqual(run.args[0][0].extends, ['config1', 'config2']);
t.deepEqual(run.args[0][0].verifyConditions, ['condition1', 'condition2']);
t.is(run.args[0][0].analyzeCommits, 'analyze');
t.deepEqual(run.args[0][0].verifyRelease, ['verify1', 'verify2']);
t.deepEqual(run.args[0][0].generateNotes, ['notes']);
t.deepEqual(run.args[0][0].prepare, ['prepare1', 'prepare2']);
t.deepEqual(run.args[0][0].publish, ['publish1', 'publish2']);
t.deepEqual(run.args[0][0].success, ['success1', 'success2']);
t.deepEqual(run.args[0][0].fail, ['fail1', 'fail2']);
t.is(run.args[0][0].debug, true);
t.is(run.args[0][0].dryRun, true);
td.verify(
index.default({
branches: ["master", "next"],
b: ["master", "next"],
"repository-url": "https://github/com/owner/repo.git",
repositoryUrl: "https://github/com/owner/repo.git",
r: "https://github/com/owner/repo.git",
"tag-format": `v\${version}`,
tagFormat: `v\${version}`,
t: `v\${version}`,
plugins: ["plugin1", "plugin2"],
p: ["plugin1", "plugin2"],
extends: ["config1", "config2"],
e: ["config1", "config2"],
"dry-run": true,
dryRun: true,
d: true,
verifyConditions: ["condition1", "condition2"],
"verify-conditions": ["condition1", "condition2"],
analyzeCommits: "analyze",
"analyze-commits": "analyze",
verifyRelease: ["verify1", "verify2"],
"verify-release": ["verify1", "verify2"],
generateNotes: ["notes"],
"generate-notes": ["notes"],
prepare: ["prepare1", "prepare2"],
publish: ["publish1", "publish2"],
success: ["success1", "success2"],
fail: ["fail1", "fail2"],
debug: true,
_: [],
$0: "",
})
);
t.is(exitCode, 0);
});
test.serial('Pass options to semantic-release API with alias arguments', async (t) => {
const run = stub().resolves(true);
test.serial("Pass options to semantic-release API with alias arguments", async (t) => {
const argv = [
'',
'',
'--branches',
'master',
'--repository-url',
'https://github/com/owner/repo.git',
'--tag-format',
"",
"",
"--branches",
"master",
"--repository-url",
"https://github/com/owner/repo.git",
"--tag-format",
`v\${version}`,
'--plugins',
'plugin1',
'plugin2',
'--extends',
'config1',
'config2',
'--dry-run',
"--plugins",
"plugin1",
"plugin2",
"--extends",
"config1",
"config2",
"--dry-run",
];
td.replace('..', run);
const index = await td.replaceEsm("../index.js");
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
t.deepEqual(run.args[0][0].branches, ['master']);
t.is(run.args[0][0].repositoryUrl, 'https://github/com/owner/repo.git');
t.is(run.args[0][0].tagFormat, `v\${version}`);
t.deepEqual(run.args[0][0].plugins, ['plugin1', 'plugin2']);
t.deepEqual(run.args[0][0].extends, ['config1', 'config2']);
t.is(run.args[0][0].dryRun, true);
td.verify(
index.default({
branches: ["master"],
b: ["master"],
"repository-url": "https://github/com/owner/repo.git",
repositoryUrl: "https://github/com/owner/repo.git",
r: "https://github/com/owner/repo.git",
"tag-format": `v\${version}`,
tagFormat: `v\${version}`,
t: `v\${version}`,
plugins: ["plugin1", "plugin2"],
p: ["plugin1", "plugin2"],
extends: ["config1", "config2"],
e: ["config1", "config2"],
"dry-run": true,
dryRun: true,
d: true,
_: [],
$0: "",
})
);
t.is(exitCode, 0);
});
test.serial('Pass unknown options to semantic-release API', async (t) => {
const run = stub().resolves(true);
const argv = ['', '', '--bool', '--first-option', 'value1', '--second-option', 'value2', '--second-option', 'value3'];
td.replace('..', run);
test.serial("Pass unknown options to semantic-release API", async (t) => {
const argv = ["", "", "--bool", "--first-option", "value1", "--second-option", "value2", "--second-option", "value3"];
const index = await td.replaceEsm("../index.js");
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
t.is(run.args[0][0].bool, true);
t.is(run.args[0][0].firstOption, 'value1');
t.deepEqual(run.args[0][0].secondOption, ['value2', 'value3']);
td.verify(
index.default({
bool: true,
firstOption: "value1",
"first-option": "value1",
secondOption: ["value2", "value3"],
"second-option": ["value2", "value3"],
_: [],
$0: "",
})
);
t.is(exitCode, 0);
});
test.serial('Pass empty Array to semantic-release API for list option set to "false"', async (t) => {
const run = stub().resolves(true);
const argv = ['', '', '--publish', 'false'];
td.replace('..', run);
const argv = ["", "", "--publish", "false"];
const index = await td.replaceEsm("../index.js");
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
t.deepEqual(run.args[0][0].publish, []);
td.verify(index.default({ publish: [], _: [], $0: "" }));
t.is(exitCode, 0);
});
test.serial('Do not set properties in option for which arg is not in command line', async (t) => {
test.serial("Do not set properties in option for which arg is not in command line", async (t) => {
const run = stub().resolves(true);
const argv = ['', '', '-b', 'master'];
td.replace('..', run);
const argv = ["", "", "-b", "master"];
await td.replaceEsm("../index.js", null, run);
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
await cli();
t.false('ci' in run.args[0][0]);
t.false('d' in run.args[0][0]);
t.false('dry-run' in run.args[0][0]);
t.false('debug' in run.args[0][0]);
t.false('r' in run.args[0][0]);
t.false('t' in run.args[0][0]);
t.false('p' in run.args[0][0]);
t.false('e' in run.args[0][0]);
t.false("ci" in run.args[0][0]);
t.false("d" in run.args[0][0]);
t.false("dry-run" in run.args[0][0]);
t.false("debug" in run.args[0][0]);
t.false("r" in run.args[0][0]);
t.false("t" in run.args[0][0]);
t.false("p" in run.args[0][0]);
t.false("e" in run.args[0][0]);
});
test.serial('Display help', async (t) => {
test.serial("Display help", async (t) => {
const run = stub().resolves(true);
const argv = ['', '', '--help'];
td.replace('..', run);
const argv = ["", "", "--help"];
await td.replaceEsm("../index.js", null, run);
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
@@ -194,12 +234,12 @@ test.serial('Display help', async (t) => {
t.is(exitCode, 0);
});
test.serial('Return error exitCode and prints help if called with a command', async (t) => {
test.serial("Return error exitCode and prints help if called with a command", async (t) => {
const run = stub().resolves(true);
const argv = ['', '', 'pre'];
td.replace('..', run);
const argv = ["", "", "pre"];
await td.replaceEsm("../index.js", null, run);
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
@@ -208,12 +248,12 @@ test.serial('Return error exitCode and prints help if called with a command', as
t.is(exitCode, 1);
});
test.serial('Return error exitCode if multiple plugin are set for single plugin', async (t) => {
test.serial("Return error exitCode if multiple plugin are set for single plugin", async (t) => {
const run = stub().resolves(true);
const argv = ['', '', '--analyze-commits', 'analyze1', 'analyze2'];
td.replace('..', run);
const argv = ["", "", "--analyze-commits", "analyze1", "analyze2"];
await td.replaceEsm("../index.js", null, run);
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
@@ -222,12 +262,12 @@ test.serial('Return error exitCode if multiple plugin are set for single plugin'
t.is(exitCode, 1);
});
test.serial('Return error exitCode if semantic-release throw error', async (t) => {
const run = stub().rejects(new Error('semantic-release error'));
const argv = ['', ''];
td.replace('..', run);
test.serial("Return error exitCode if semantic-release throw error", async (t) => {
const argv = ["", ""];
const index = await td.replaceEsm("../index.js");
td.when(index.default({ _: [], $0: "" })).thenReject(new Error("semantic-release error"));
process.argv = argv;
const cli = require('../cli');
const cli = (await import("../cli.js")).default;
const exitCode = await cli();
@@ -235,14 +275,14 @@ test.serial('Return error exitCode if semantic-release throw error', async (t) =
t.is(exitCode, 1);
});
test.serial('Hide sensitive environment variable values from the logs', async (t) => {
const env = {MY_TOKEN: 'secret token'};
const run = stub().rejects(new Error(`Throw error: Exposing token ${env.MY_TOKEN}`));
const argv = ['', ''];
td.replace('..', run);
test.serial("Hide sensitive environment variable values from the logs", async (t) => {
const env = { MY_TOKEN: "secret token" };
const argv = ["", ""];
const index = await td.replaceEsm("../index.js");
td.when(index.default({ _: [], $0: "" })).thenReject(new Error(`Throw error: Exposing token ${env.MY_TOKEN}`));
process.argv = argv;
process.env = {...process.env, ...env};
const cli = require('../cli');
process.env = { ...process.env, ...env };
const cli = (await import("../cli.js")).default;
const exitCode = await cli();

View File

@@ -1,5 +1,5 @@
const test = require('ava');
const {maintenance, prerelease, release} = require('../../lib/definitions/branches');
import test from 'ava';
import {maintenance, prerelease, release} from '../../lib/definitions/branches.js';
test('A "maintenance" branch is identified by having a "range" property or a "name" formatted like "N.x", "N.x.x" or "N.N.x"', (t) => {
/* eslint-disable unicorn/no-fn-reference-in-iterator */

View File

@@ -1,6 +1,6 @@
const test = require('ava');
const plugins = require('../../lib/definitions/plugins');
const {RELEASE_NOTES_SEPARATOR, SECRET_REPLACEMENT} = require('../../lib/definitions/constants');
import test from 'ava';
import plugins from '../../lib/definitions/plugins.js';
import {RELEASE_NOTES_SEPARATOR, SECRET_REPLACEMENT} from '../../lib/definitions/constants.js';
test('The "analyzeCommits" plugin output must be either undefined or a valid semver release type', (t) => {
t.false(plugins.analyzeCommits.outputValidator('invalid'));

View File

@@ -1 +1 @@
module.exports = () => {};
export default () => {}

View File

@@ -1,4 +1,4 @@
const SemanticReleaseError = require('@semantic-release/error');
import SemanticReleaseError from '@semantic-release/error';
class InheritedError extends SemanticReleaseError {
constructor(message, code) {
@@ -9,6 +9,6 @@ class InheritedError extends SemanticReleaseError {
}
}
module.exports = () => {
export default () => {
throw new InheritedError('Inherited error', 'EINHERITED');
};
}

View File

@@ -1,5 +1,5 @@
module.exports = () => {
export default () => {
const error = new Error('a');
error.errorProperty = 'errorProperty';
throw error;
};
}

View File

@@ -1,5 +1,5 @@
const AggregateError = require('aggregate-error');
import AggregateError from 'aggregate-error';
module.exports = () => {
export default () => {
throw new AggregateError([new Error('a'), new Error('b')]);
};
}

View File

@@ -1 +1 @@
module.exports = (pluginConfig, context) => context;
export default (pluginConfig, context) => context

View File

@@ -1,6 +1,6 @@
module.exports = (pluginConfig, {env, logger}) => {
export default (pluginConfig, {env, logger}) => {
console.log(`Console: Exposing token ${env.MY_TOKEN}`);
logger.log(`Log: Exposing token ${env.MY_TOKEN}`);
logger.error(`Error: Console token ${env.MY_TOKEN}`);
throw new Error(`Throw error: Exposing ${env.MY_TOKEN}`);
};
}

test/fixtures/plugin-noop.cjs vendored Normal file
View File

@@ -0,0 +1,3 @@
module.exports = function noop() {
};

View File

@@ -1 +0,0 @@
module.exports = () => {};

View File

@@ -1 +1 @@
module.exports = (pluginConfig, context) => ({pluginConfig, context});
export default (pluginConfig, context) => ({pluginConfig, context})
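Most fixtures above moved to `export default`, while `plugin-noop.cjs` deliberately stays CommonJS. Node's ESM/CJS interop lets callers load both the same way, because `module.exports` surfaces as the `default` export under dynamic `import()`. A sketch with throwaway temp files (file names are illustrative):

```javascript
import { mkdtemp, writeFile } from "node:fs/promises";
import { tmpdir } from "node:os";
import path from "node:path";
import { pathToFileURL } from "node:url";

// One tiny fixture per module system.
const dir = await mkdtemp(path.join(tmpdir(), "interop-"));
await writeFile(path.join(dir, "plugin.mjs"), "export default function esmPlugin() {}\n");
await writeFile(path.join(dir, "plugin.cjs"), "module.exports = function cjsPlugin() {};\n");

// Both module systems yield their plugin on `.default` when dynamically imported.
const esmPlugin = (await import(pathToFileURL(path.join(dir, "plugin.mjs")).href)).default;
const cjsPlugin = (await import(pathToFileURL(path.join(dir, "plugin.cjs")).href)).default;
console.log(typeof esmPlugin, typeof cjsPlugin); // function function
```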

View File

@@ -1,39 +1,39 @@
const test = require('ava');
const {stub} = require('sinon');
const getCommits = require('../lib/get-commits');
const {gitRepo, gitCommits, gitDetachedHead} = require('./helpers/git-utils');
import test from "ava";
import { stub } from "sinon";
import getCommits from "../lib/get-commits.js";
import { gitCommits, gitDetachedHead, gitRepo } from "./helpers/git-utils.js";
test.beforeEach((t) => {
// Stub the logger functions
t.context.log = stub();
t.context.error = stub();
t.context.logger = {log: t.context.log, error: t.context.error};
t.context.logger = { log: t.context.log, error: t.context.error };
});
test('Get all commits when there is no last release', async (t) => {
test("Get all commits when there is no last release", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(['First', 'Second'], {cwd});
const commits = await gitCommits(["First", "Second"], { cwd });
// Retrieve the commits with the commits module
const result = await getCommits({cwd, lastRelease: {}, logger: t.context.logger});
const result = await getCommits({ cwd, lastRelease: {}, logger: t.context.logger });
// Verify the commits created and retrieved by the module are identical
t.is(result.length, 2);
t.deepEqual(result, commits);
});
test('Get all commits since gitHead (from lastRelease)', async (t) => {
test("Get all commits since gitHead (from lastRelease)", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(['First', 'Second', 'Third'], {cwd});
const commits = await gitCommits(["First", "Second", "Third"], { cwd });
// Retrieve the commits with the commits module, since commit 'First'
const result = await getCommits({
cwd,
lastRelease: {gitHead: commits[commits.length - 1].hash},
lastRelease: { gitHead: commits[commits.length - 1].hash },
logger: t.context.logger,
});
@@ -42,18 +42,18 @@ test('Get all commits since gitHead (from lastRelease)', async (t) => {
t.deepEqual(result, commits.slice(0, 2));
});
test('Get all commits since gitHead (from lastRelease) on a detached head repo', async (t) => {
test("Get all commits since gitHead (from lastRelease) on a detached head repo", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo();
let { cwd, repositoryUrl } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(['First', 'Second', 'Third'], {cwd});
const commits = await gitCommits(["First", "Second", "Third"], { cwd });
// Create a detached head repo at commit 'Second'
cwd = await gitDetachedHead(repositoryUrl, commits[1].hash);
// Retrieve the commits with the commits module, since commit 'First'
const result = await getCommits({
cwd,
lastRelease: {gitHead: commits[commits.length - 1].hash},
lastRelease: { gitHead: commits[commits.length - 1].hash },
logger: t.context.logger,
});
@@ -66,17 +66,17 @@ test('Get all commits since gitHead (from lastRelease) on a detached head repo',
t.truthy(result[0].committer.name);
});
test('Get all commits between lastRelease.gitHead and a shas', async (t) => {
test("Get all commits between lastRelease.gitHead and a shas", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(['First', 'Second', 'Third'], {cwd});
const commits = await gitCommits(["First", "Second", "Third"], { cwd });
// Retrieve the commits with the commits module, between commit 'First' and 'Third'
const result = await getCommits({
cwd,
lastRelease: {gitHead: commits[commits.length - 1].hash},
nextRelease: {gitHead: commits[1].hash},
lastRelease: { gitHead: commits[commits.length - 1].hash },
nextRelease: { gitHead: commits[1].hash },
logger: t.context.logger,
});
@@ -85,16 +85,16 @@ test('Get all commits between lastRelease.gitHead and a shas', async (t) => {
t.deepEqual(result, commits.slice(1, -1));
});
test('Return empty array if lastRelease.gitHead is the last commit', async (t) => {
test("Return empty array if lastRelease.gitHead is the last commit", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(['First', 'Second'], {cwd});
const commits = await gitCommits(["First", "Second"], { cwd });
// Retrieve the commits with the commits module, since commit 'Second' (therefore none)
const result = await getCommits({
cwd,
lastRelease: {gitHead: commits[0].hash},
lastRelease: { gitHead: commits[0].hash },
logger: t.context.logger,
});
@@ -102,12 +102,12 @@ test('Return empty array if lastRelease.gitHead is the last commit', async (t) =
t.deepEqual(result, []);
});
test('Return empty array if there is no commits', async (t) => {
test("Return empty array if there is no commits", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Retrieve the commits with the commits module
const result = await getCommits({cwd, lastRelease: {}, logger: t.context.logger});
const result = await getCommits({ cwd, lastRelease: {}, logger: t.context.logger });
// Verify no commit is retrieved
t.deepEqual(result, []);
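The real `lib/get-commits.js` shells out to git, but the contract these tests assert can be sketched in plain JavaScript: return the commits strictly newer than `lastRelease.gitHead`, newest first, or every commit when no `gitHead` is known. The commit data below is illustrative, not the module's implementation:

```javascript
// Commits as git log returns them: newest first.
const commits = [
  { hash: "c3", message: "Third" },
  { hash: "c2", message: "Second" },
  { hash: "c1", message: "First" },
];

// Commits strictly after gitHead; all of them when gitHead is unset or unknown.
function commitsSince(commits, gitHead) {
  if (!gitHead) return commits;
  const index = commits.findIndex(({ hash }) => hash === gitHead);
  return index === -1 ? commits : commits.slice(0, index);
}

console.log(commitsSince(commits, "c1").map(({ message }) => message)); // [ 'Third', 'Second' ]
console.log(commitsSince(commits, "c3")); // [] -- gitHead is already the last commit
```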

View File

@@ -1,579 +1,597 @@
const path = require('path');
const {format} = require('util');
const test = require('ava');
const {writeFile, outputJson} = require('fs-extra');
const {omit} = require('lodash');
const td = require('testdouble');
const {stub} = require('sinon');
const yaml = require('js-yaml');
const {gitRepo, gitTagVersion, gitCommits, gitShallowClone, gitAddConfig} = require('./helpers/git-utils');
import path from "node:path";
import { format } from "node:util";
import test from "ava";
import fsExtra from "fs-extra";
import { omit } from "lodash-es";
import * as td from "testdouble";
import yaml from "js-yaml";
import { gitAddConfig, gitCommits, gitRepo, gitShallowClone, gitTagVersion } from "./helpers/git-utils.js";
const { outputJson, writeFile } = fsExtra;
const pluginsConfig = { foo: "bar", baz: "qux" };
let plugins;
const DEFAULT_PLUGINS = [
'@semantic-release/commit-analyzer',
'@semantic-release/release-notes-generator',
'@semantic-release/npm',
'@semantic-release/github',
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/npm",
"@semantic-release/github",
];
test.beforeEach((t) => {
t.context.plugins = stub().returns({});
td.replace('../lib/plugins', t.context.plugins);
t.context.getConfig = require('../lib/get-config');
test.beforeEach(async (t) => {
plugins = (await td.replaceEsm("../lib/plugins/index.js")).default;
t.context.getConfig = (await import("../lib/get-config.js")).default;
});
test('Default values, reading repositoryUrl from package.json', async (t) => {
const pkg = {repository: 'https://host.null/owner/package.git'};
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(true);
await gitCommits(['First'], {cwd});
await gitTagVersion('v1.0.0', undefined, {cwd});
await gitTagVersion('v1.1.0', undefined, {cwd});
// Add remote.origin.url config
await gitAddConfig('remote.origin.url', 'git@host.null:owner/repo.git', {cwd});
// Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg);
test.afterEach.always((t) => {
td.reset();
});
const {options: result} = await t.context.getConfig({cwd});
test("Default values, reading repositoryUrl from package.json", async (t) => {
const pkg = { repository: "https://host.null/owner/package.git" };
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo(true);
await gitCommits(["First"], { cwd });
await gitTagVersion("v1.0.0", undefined, { cwd });
await gitTagVersion("v1.1.0", undefined, { cwd });
// Add remote.origin.url config
await gitAddConfig("remote.origin.url", "git@host.null:owner/repo.git", { cwd });
// Create package.json in repository root
await outputJson(path.resolve(cwd, "package.json"), pkg);
const { options: result } = await t.context.getConfig({ cwd });
// Verify the default options are set
t.deepEqual(result.branches, [
'+([0-9])?(.{+([0-9]),x}).x',
'master',
'next',
'next-major',
{name: 'beta', prerelease: true},
{name: 'alpha', prerelease: true},
"+([0-9])?(.{+([0-9]),x}).x",
"master",
"next",
"next-major",
{ name: "beta", prerelease: true },
{ name: "alpha", prerelease: true },
]);
t.is(result.repositoryUrl, 'https://host.null/owner/package.git');
t.is(result.repositoryUrl, "https://host.null/owner/package.git");
t.is(result.tagFormat, `v\${version}`);
});
test('Default values, reading repositoryUrl from repo if not set in package.json', async (t) => {
test("Default values, reading repositoryUrl from repo if not set in package.json", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo(true);
const { cwd } = await gitRepo(true);
// Add remote.origin.url config
await gitAddConfig('remote.origin.url', 'https://host.null/owner/module.git', {cwd});
await gitAddConfig("remote.origin.url", "https://host.null/owner/module.git", { cwd });
const {options: result} = await t.context.getConfig({cwd});
const { options: result } = await t.context.getConfig({ cwd });
// Verify the default options are set
t.deepEqual(result.branches, [
'+([0-9])?(.{+([0-9]),x}).x',
'master',
'next',
'next-major',
{name: 'beta', prerelease: true},
{name: 'alpha', prerelease: true},
"+([0-9])?(.{+([0-9]),x}).x",
"master",
"next",
"next-major",
{ name: "beta", prerelease: true },
{ name: "alpha", prerelease: true },
]);
t.is(result.repositoryUrl, 'https://host.null/owner/module.git');
t.is(result.repositoryUrl, "https://host.null/owner/module.git");
t.is(result.tagFormat, `v\${version}`);
});
test('Default values, reading repositoryUrl (http url) from package.json if not set in repo', async (t) => {
const pkg = {repository: 'https://host.null/owner/module.git'};
test("Default values, reading repositoryUrl (http url) from package.json if not set in repo", async (t) => {
const pkg = { repository: "https://host.null/owner/module.git" };
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg);
await outputJson(path.resolve(cwd, "package.json"), pkg);
const {options: result} = await t.context.getConfig({cwd});
const { options: result } = await t.context.getConfig({ cwd });
// Verify the default options are set
t.deepEqual(result.branches, [
'+([0-9])?(.{+([0-9]),x}).x',
'master',
'next',
'next-major',
{name: 'beta', prerelease: true},
{name: 'alpha', prerelease: true},
"+([0-9])?(.{+([0-9]),x}).x",
"master",
"next",
"next-major",
{ name: "beta", prerelease: true },
{ name: "alpha", prerelease: true },
]);
t.is(result.repositoryUrl, 'https://host.null/owner/module.git');
t.is(result.repositoryUrl, "https://host.null/owner/module.git");
t.is(result.tagFormat, `v\${version}`);
});
test('Convert "ci" option to "noCi"', async (t) => {
const pkg = {repository: 'https://host.null/owner/module.git', release: {ci: false}};
const pkg = { repository: "https://host.null/owner/module.git", release: { ci: false } };
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
// Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg);
await outputJson(path.resolve(cwd, "package.json"), pkg);
const {options: result} = await t.context.getConfig({cwd});
const { options: result } = await t.context.getConfig({ cwd });
t.is(result.noCi, true);
});
test('Read options from package.json', async (t) => {
test.serial("Read options from package.json", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
generateNotes: 'generateNotes',
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
generateNotes: "generateNotes",
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Verify the plugins module is called with the plugin options from package.json
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
// Create package.json in repository root
await outputJson(path.resolve(cwd, "package.json"), { release: options });
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test.serial("Read options from .releaserc.yml", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
const options = {
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: options});
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from package.json
t.deepEqual(result, expected);
await writeFile(path.resolve(cwd, ".releaserc.yml"), yaml.dump(options));
// Verify the plugins module is called with the plugin options from package.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read options from .releaserc.yml', async (t) => {
test.serial("Read options from .releaserc.json", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json in repository root
await writeFile(path.resolve(cwd, '.releaserc.yml'), yaml.dump(options));
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from package.json
t.deepEqual(result, expected);
await outputJson(path.resolve(cwd, ".releaserc.json"), options);
// Verify the plugins module is called with the plugin options from package.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read options from .releaserc.json', async (t) => {
test.serial("Read options from .releaserc.js", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json in repository root
await outputJson(path.resolve(cwd, '.releaserc.json'), options);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from package.json
t.deepEqual(result, expected);
await writeFile(path.resolve(cwd, ".releaserc.js"), `module.exports = ${JSON.stringify(options)}`);
// Verify the plugins module is called with the plugin options from package.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read options from .releaserc.js', async (t) => {
test.serial("Read options from .releaserc.cjs", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json in repository root
await writeFile(path.resolve(cwd, '.releaserc.js'), `module.exports = ${JSON.stringify(options)}`);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from package.json
t.deepEqual(result, expected);
// Verify the plugins module is called with the plugin options from package.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
});
test('Read options from .releaserc.cjs', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create .releaserc.cjs in repository root
await writeFile(path.resolve(cwd, '.releaserc.cjs'), `module.exports = ${JSON.stringify(options)}`);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from .releaserc.cjs
t.deepEqual(result, expected);
await writeFile(path.resolve(cwd, ".releaserc.cjs"), `module.exports = ${JSON.stringify(options)}`);
// Verify the plugins module is called with the plugin options from .releaserc.cjs
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from .releaserc.cjs
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read options from release.config.js', async (t) => {
test.serial("Read options from release.config.js", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json in repository root
await writeFile(path.resolve(cwd, 'release.config.js'), `module.exports = ${JSON.stringify(options)}`);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from package.json
t.deepEqual(result, expected);
await writeFile(path.resolve(cwd, "release.config.js"), `module.exports = ${JSON.stringify(options)}`);
// Verify the plugins module is called with the plugin options from package.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from package.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read options from release.config.cjs', async (t) => {
test.serial("Read options from release.config.cjs", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create release.config.cjs in repository root
await writeFile(path.resolve(cwd, 'release.config.cjs'), `module.exports = ${JSON.stringify(options)}`);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from release.config.cjs
t.deepEqual(result, expected);
// Verify the plugins module is called with the plugin options from release.config.cjs
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
// Create release.config.cjs in repository root
await writeFile(path.resolve(cwd, "release.config.cjs"), `module.exports = ${JSON.stringify(options)}`);
const result = await t.context.getConfig({ cwd });
// Verify the options contains the plugin config from release.config.cjs
t.deepEqual(result, { options, plugins: pluginsConfig });
});
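Taken together, the tests above walk through the configuration files semantic-release can read. The real resolution is delegated to cosmiconfig; this hand-rolled first-match search over a temp directory is only illustrative, and the exact ordering listed here is an assumption:

```javascript
import { existsSync, mkdtempSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import path from "node:path";

// Config locations exercised by the tests above (ordering is illustrative).
const SEARCH_PLACES = [
  "package.json",
  ".releaserc.yml",
  ".releaserc.json",
  ".releaserc.js",
  ".releaserc.cjs",
  "release.config.js",
  "release.config.cjs",
];

// Return the first search place that exists in cwd, or null when none does.
const findConfig = (cwd) => SEARCH_PLACES.find((name) => existsSync(path.join(cwd, name))) ?? null;

const cwd = mkdtempSync(path.join(tmpdir(), "cfg-"));
writeFileSync(path.join(cwd, ".releaserc.json"), JSON.stringify({ branches: ["test_branch"] }));
console.log(findConfig(cwd)); // .releaserc.json
```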
test('Prioritise CLI/API parameters over file configuration and git repo', async (t) => {
test.serial("Prioritise CLI/API parameters over file configuration and git repo", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
let {cwd, repositoryUrl} = await gitRepo();
await gitCommits(['First'], {cwd});
let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(["First"], { cwd });
// Create a clone
cwd = await gitShallowClone(repositoryUrl);
const pkgOptions = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_pkg'},
branches: ['branch_pkg'],
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_pkg" },
branches: ["branch_pkg"],
};
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_cli'},
branches: ['branch_cli'],
repositoryUrl: 'http://cli-url.com/owner/package',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_cli" },
branches: ["branch_cli"],
repositoryUrl: "http://cli-url.com/owner/package",
tagFormat: `cli\${version}`,
plugins: false,
};
const pkg = {release: pkgOptions, repository: 'git@host.null:owner/module.git'};
// Create package.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), pkg);
const result = await t.context.getConfig({cwd}, options);
const expected = {...options, branches: ['branch_cli']};
// Verify the options contains the plugin config from CLI/API
t.deepEqual(result.options, expected);
// Verify the plugins module is called with the plugin options from CLI/API
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(plugins({ cwd, options }, {})).thenResolve(pluginsConfig);
const pkg = { release: pkgOptions, repository: "git@host.null:owner/module.git" };
// Create package.json in repository root
await outputJson(path.resolve(cwd, "package.json"), pkg);
const result = await t.context.getConfig({ cwd }, options);
// Verify the options contain the plugin config from CLI/API
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read configuration from file path in "extends"', async (t) => {
test.serial('Read configuration from file path in "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const pkgOptions = {extends: './shareable.json'};
const { cwd } = await gitRepo();
const pkgOptions = { extends: "./shareable.json" };
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
generateNotes: 'generateNotes',
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
generateNotes: "generateNotes",
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: ['plugin-1', ['plugin-2', {plugin2Opt: 'value'}]],
plugins: ["plugin-1", ["plugin-2", { plugin2Opt: "value" }]],
};
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'shareable.json'), options);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from shareable.json
t.deepEqual(result, expected);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "shareable.json"), options);
// Verify the plugins module is called with the plugin options from shareable.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
t.deepEqual(t.context.plugins.args[0][1], {
analyzeCommits: './shareable.json',
generateNotes: './shareable.json',
'plugin-1': './shareable.json',
'plugin-2': './shareable.json',
});
td.when(
plugins(
{ cwd, options },
{
analyzeCommits: "./shareable.json",
generateNotes: "./shareable.json",
"plugin-1": "./shareable.json",
"plugin-2": "./shareable.json",
}
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contain the plugin config from shareable.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read configuration from module path in "extends"', async (t) => {
test.serial('Read configuration from module path in "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const pkgOptions = {extends: 'shareable'};
const { cwd } = await gitRepo();
const pkgOptions = { extends: "shareable" };
const options = {
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
generateNotes: 'generateNotes',
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
generateNotes: "generateNotes",
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'node_modules/shareable/index.json'), options);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options, branches: ['test_branch']};
// Verify the options contains the plugin config from shareable.json
t.deepEqual(result, expected);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "node_modules/shareable/index.json"), options);
// Verify the plugins module is called with the plugin options from shareable.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
t.deepEqual(t.context.plugins.args[0][1], {
analyzeCommits: 'shareable',
generateNotes: 'shareable',
});
td.when(plugins({ cwd, options }, { analyzeCommits: "shareable", generateNotes: "shareable" })).thenResolve(
pluginsConfig
);
const result = await t.context.getConfig({ cwd });
// Verify the options contain the plugin config from shareable.json
t.deepEqual(result, { options, plugins: pluginsConfig });
});
test('Read configuration from an array of paths in "extends"', async (t) => {
test.serial('Read configuration from an array of paths in "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const pkgOptions = {extends: ['./shareable1.json', './shareable2.json']};
const { cwd } = await gitRepo();
const pkgOptions = { extends: ["./shareable1.json", "./shareable2.json"] };
const options1 = {
verifyRelease: 'verifyRelease1',
analyzeCommits: {path: 'analyzeCommits1', param: 'analyzeCommits_param1'},
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
verifyRelease: "verifyRelease1",
analyzeCommits: { path: "analyzeCommits1", param: "analyzeCommits_param1" },
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
};
const options2 = {
verifyRelease: 'verifyRelease2',
generateNotes: 'generateNotes2',
analyzeCommits: {path: 'analyzeCommits2', param: 'analyzeCommits_param2'},
branches: ['test_branch'],
verifyRelease: "verifyRelease2",
generateNotes: "generateNotes2",
analyzeCommits: { path: "analyzeCommits2", param: "analyzeCommits_param2" },
branches: ["test_branch"],
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'shareable1.json'), options1);
await outputJson(path.resolve(cwd, 'shareable2.json'), options2);
const {options: result} = await t.context.getConfig({cwd});
const expected = {...options1, ...options2, branches: ['test_branch']};
// Verify the options contains the plugin config from shareable1.json and shareable2.json
t.deepEqual(result, expected);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "shareable1.json"), options1);
await outputJson(path.resolve(cwd, "shareable2.json"), options2);
const expectedOptions = { ...options1, ...options2, branches: ["test_branch"] };
// Verify the plugins module is called with the plugin options from shareable1.json and shareable2.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
t.deepEqual(t.context.plugins.args[0][1], {
verifyRelease1: './shareable1.json',
verifyRelease2: './shareable2.json',
generateNotes2: './shareable2.json',
analyzeCommits1: './shareable1.json',
analyzeCommits2: './shareable2.json',
});
td.when(
plugins(
{ options: expectedOptions, cwd },
{
verifyRelease1: "./shareable1.json",
verifyRelease2: "./shareable2.json",
generateNotes2: "./shareable2.json",
analyzeCommits1: "./shareable1.json",
analyzeCommits2: "./shareable2.json",
}
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contain the plugin config from shareable1.json and shareable2.json
t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
});
test('Prioritize configuration from config file over "extends"', async (t) => {
test.serial('Prioritize configuration from config file over "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const pkgOptions = {
extends: './shareable.json',
branches: ['test_pkg'],
generateNotes: 'generateNotes',
publish: [{path: 'publishPkg', param: 'publishPkg_param'}],
extends: "./shareable.json",
branches: ["test_pkg"],
generateNotes: "generateNotes",
publish: [{ path: "publishPkg", param: "publishPkg_param" }],
};
const options1 = {
analyzeCommits: 'analyzeCommits',
generateNotes: 'generateNotesShareable',
publish: [{path: 'publishShareable', param: 'publishShareable_param'}],
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: "analyzeCommits",
generateNotes: "generateNotesShareable",
publish: [{ path: "publishShareable", param: "publishShareable_param" }],
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'shareable.json'), options1);
const {options: result} = await t.context.getConfig({cwd});
const expected = omit({...options1, ...pkgOptions, branches: ['test_pkg']}, 'extends');
// Verify the options contains the plugin config from package.json and shareable.json
t.deepEqual(result, expected);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "shareable.json"), options1);
const expectedOptions = omit({ ...options1, ...pkgOptions, branches: ["test_pkg"] }, "extends");
// Verify the plugins module is called with the plugin options from package.json and shareable.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
t.deepEqual(t.context.plugins.args[0][1], {
analyzeCommits: './shareable.json',
generateNotesShareable: './shareable.json',
publishShareable: './shareable.json',
});
td.when(
plugins(
{ cwd, options: expectedOptions },
{
analyzeCommits: "./shareable.json",
generateNotesShareable: "./shareable.json",
publishShareable: "./shareable.json",
}
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contain the plugin config from package.json and shareable.json
t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
});
test('Prioritize configuration from cli/API options over "extends"', async (t) => {
test.serial('Prioritize configuration from cli/API options over "extends"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const cliOptions = {
extends: './shareable2.json',
branches: ['branch_opts'],
publish: [{path: 'publishOpts', param: 'publishOpts_param'}],
repositoryUrl: 'https://host.null/owner/module.git',
extends: "./shareable2.json",
branches: ["branch_opts"],
publish: [{ path: "publishOpts", param: "publishOpts_param" }],
repositoryUrl: "https://host.null/owner/module.git",
};
const pkgOptions = {
extends: './shareable1.json',
branches: ['branch_pkg'],
generateNotes: 'generateNotes',
publish: [{path: 'publishPkg', param: 'publishPkg_param'}],
extends: "./shareable1.json",
branches: ["branch_pkg"],
generateNotes: "generateNotes",
publish: [{ path: "publishPkg", param: "publishPkg_param" }],
};
const options1 = {
analyzeCommits: 'analyzeCommits1',
generateNotes: 'generateNotesShareable1',
publish: [{path: 'publishShareable', param: 'publishShareable_param1'}],
branches: ['test_branch1'],
repositoryUrl: 'https://host.null/owner/module.git',
analyzeCommits: "analyzeCommits1",
generateNotes: "generateNotesShareable1",
publish: [{ path: "publishShareable", param: "publishShareable_param1" }],
branches: ["test_branch1"],
repositoryUrl: "https://host.null/owner/module.git",
};
const options2 = {
analyzeCommits: 'analyzeCommits2',
publish: [{path: 'publishShareable', param: 'publishShareable_param2'}],
branches: ['test_branch2'],
analyzeCommits: "analyzeCommits2",
publish: [{ path: "publishShareable", param: "publishShareable_param2" }],
branches: ["test_branch2"],
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json, shareable1.json and shareable2.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'shareable1.json'), options1);
await outputJson(path.resolve(cwd, 'shareable2.json'), options2);
const {options: result} = await t.context.getConfig({cwd}, cliOptions);
const expected = omit({...options2, ...pkgOptions, ...cliOptions, branches: ['branch_opts']}, 'extends');
// Verify the options contains the plugin config from package.json and shareable2.json
t.deepEqual(result, expected);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "shareable1.json"), options1);
await outputJson(path.resolve(cwd, "shareable2.json"), options2);
const expectedOptions = omit({ ...options2, ...pkgOptions, ...cliOptions, branches: ["branch_opts"] }, "extends");
// Verify the plugins module is called with the plugin options from package.json and shareable2.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
td.when(
plugins(
{ cwd, options: expectedOptions },
{ analyzeCommits2: "./shareable2.json", publishShareable: "./shareable2.json" }
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd }, cliOptions);
// Verify the options contain the plugin config from package.json and shareable2.json
t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
});
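The precedence tests above all assert the same layering: shareable ("extends") config is the base, the local config file overrides it, and CLI/API options override both, with "extends" itself consumed during resolution. A minimal sketch of that layering (names are illustrative; the real resolver in semantic-release also tracks plugin origins and handles `null`/`undefined` overrides):

```javascript
// Sketch of the option precedence asserted by the tests above:
// CLI/API options > config file > shareable ("extends") config.
function mergeOptions(shareable, fileConfig, cliOptions) {
  // Later spreads win, which encodes the precedence order directly.
  const merged = { ...shareable, ...fileConfig, ...cliOptions };
  // "extends" is consumed during resolution and never kept in the result.
  delete merged.extends;
  return merged;
}

const result = mergeOptions(
  { branches: ["test_branch2"], tagFormat: "v${version}", analyzeCommits: "analyzeCommits2" },
  { extends: "./shareable2.json", generateNotes: "generateNotes", branches: ["branch_pkg"] },
  { branches: ["branch_opts"] }
);
// result.branches comes from the CLI layer; result.extends is gone.
```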
test('Allow to unset properties defined in shareable config with "null"', async (t) => {
test.serial('Allow to unset properties defined in shareable config with "null"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const pkgOptions = {
extends: './shareable.json',
extends: "./shareable.json",
analyzeCommits: null,
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
plugins: null,
};
const options1 = {
generateNotes: 'generateNotes',
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
generateNotes: "generateNotes",
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
tagFormat: `v\${version}`,
plugins: ['test-plugin'],
plugins: ["test-plugin"],
};
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'shareable.json'), options1);
const {options} = await t.context.getConfig({cwd});
// Verify the options contains the plugin config from shareable.json and the default `plugins`
t.deepEqual(options, {
...omit(options1, ['analyzeCommits']),
...omit(pkgOptions, ['extends', 'analyzeCommits']),
plugins: DEFAULT_PLUGINS,
});
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "shareable.json"), options1);
// Verify the plugins module is called with the plugin options from shareable.json and the default `plugins`
t.deepEqual(t.context.plugins.args[0][0], {
td.when(
plugins(
{
options: {
...omit(options1, 'analyzeCommits'),
...omit(pkgOptions, ['extends', 'analyzeCommits']),
...omit(options1, "analyzeCommits"),
...omit(pkgOptions, ["extends", "analyzeCommits"]),
plugins: DEFAULT_PLUGINS,
},
cwd,
});
},
{
generateNotes: "./shareable.json",
analyzeCommits: "./shareable.json",
"test-plugin": "./shareable.json",
}
)
).thenResolve(pluginsConfig);
t.deepEqual(t.context.plugins.args[0][1], {
generateNotes: './shareable.json',
analyzeCommits: './shareable.json',
'test-plugin': './shareable.json',
const result = await t.context.getConfig({ cwd });
// Verify the options contain the plugin config from shareable.json and the default `plugins`
t.deepEqual(result, {
options: {
...omit(options1, ["analyzeCommits"]),
...omit(pkgOptions, ["extends", "analyzeCommits"]),
plugins: DEFAULT_PLUGINS,
},
plugins: pluginsConfig,
});
});
test('Allow to unset properties defined in shareable config with "undefined"', async (t) => {
test.serial('Allow to unset properties defined in shareable config with "undefined"', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
const pkgOptions = {
extends: './shareable.json',
extends: "./shareable.json",
analyzeCommits: undefined,
branches: ['test_branch'],
repositoryUrl: 'https://host.null/owner/module.git',
branches: ["test_branch"],
repositoryUrl: "https://host.null/owner/module.git",
};
const options1 = {
generateNotes: 'generateNotes',
analyzeCommits: {path: 'analyzeCommits', param: 'analyzeCommits_param'},
generateNotes: "generateNotes",
analyzeCommits: { path: "analyzeCommits", param: "analyzeCommits_param" },
tagFormat: `v\${version}`,
plugins: false,
};
// Create package.json and release.config.js in repository root
await writeFile(path.resolve(cwd, 'release.config.js'), `module.exports = ${format(pkgOptions)}`);
await outputJson(path.resolve(cwd, 'shareable.json'), options1);
const {options: result} = await t.context.getConfig({cwd});
const expected = {
...omit(options1, 'analyzeCommits'),
...omit(pkgOptions, ['extends', 'analyzeCommits']),
branches: ['test_branch'],
// Create release.config.js and shareable.json in repository root
await writeFile(path.resolve(cwd, "release.config.js"), `module.exports = ${format(pkgOptions)}`);
await outputJson(path.resolve(cwd, "shareable.json"), options1);
const expectedOptions = {
...omit(options1, "analyzeCommits"),
...omit(pkgOptions, ["extends", "analyzeCommits"]),
branches: ["test_branch"],
};
// Verify the options contains the plugin config from shareable.json
t.deepEqual(result, expected);
// Verify the plugins module is called with the plugin options from shareable.json
t.deepEqual(t.context.plugins.args[0][0], {options: expected, cwd});
t.deepEqual(t.context.plugins.args[0][1], {
generateNotes: './shareable.json',
analyzeCommits: './shareable.json',
});
td.when(
plugins(
{ options: expectedOptions, cwd },
{ generateNotes: "./shareable.json", analyzeCommits: "./shareable.json" }
)
).thenResolve(pluginsConfig);
const result = await t.context.getConfig({ cwd });
// Verify the options contain the plugin config from shareable.json
t.deepEqual(result, { options: expectedOptions, plugins: pluginsConfig });
});
test('Throw an Error if one of the shareable config cannot be found', async (t) => {
test("Throw an Error if one of the shareable config cannot be found", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const pkgOptions = {extends: ['./shareable1.json', 'non-existing-path']};
const options1 = {analyzeCommits: 'analyzeCommits'};
const { cwd } = await gitRepo();
const pkgOptions = { extends: ["./shareable1.json", "non-existing-path"] };
const options1 = { analyzeCommits: "analyzeCommits" };
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'shareable1.json'), options1);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "shareable1.json"), options1);
await t.throwsAsync(t.context.getConfig({cwd}), {
await t.throwsAsync(t.context.getConfig({ cwd }), {
message: /Cannot find module 'non-existing-path'/,
code: 'MODULE_NOT_FOUND',
code: "MODULE_NOT_FOUND",
});
});
test('Convert "ci" option to "noCi" when set from extended config', async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const {cwd} = await gitRepo();
const pkgOptions = {extends: './no-ci.json'};
const { cwd } = await gitRepo();
const pkgOptions = { extends: "./no-ci.json" };
const options = {
ci: false,
};
// Create package.json and shareable.json in repository root
await outputJson(path.resolve(cwd, 'package.json'), {release: pkgOptions});
await outputJson(path.resolve(cwd, 'no-ci.json'), options);
await outputJson(path.resolve(cwd, "package.json"), { release: pkgOptions });
await outputJson(path.resolve(cwd, "no-ci.json"), options);
const {options: result} = await t.context.getConfig({cwd});
const { options: result } = await t.context.getConfig({ cwd });
t.is(result.ci, false);
t.is(result.noCi, true);


@@ -1,408 +1,413 @@
const test = require('ava');
const getAuthUrl = require('../lib/get-git-auth-url');
const {gitRepo} = require('./helpers/git-utils');
import test from "ava";
import getAuthUrl from "../lib/get-git-auth-url.js";
import { gitRepo } from "./helpers/git-utils.js";
const env = {GIT_ASKPASS: 'echo', GIT_TERMINAL_PROMPT: 0};
const env = { GIT_ASKPASS: "echo", GIT_TERMINAL_PROMPT: 0 };
test('Return the same "git" formatted URL if "gitCredentials" is not defined', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({cwd, env, branch: {name: 'master'}, options: {repositoryUrl: 'git@host.null:owner/repo.git'}}),
'git@host.null:owner/repo.git'
await getAuthUrl({
cwd,
env,
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
"git@host.null:owner/repo.git"
);
});
test('Return the same "https" formatted URL if "gitCredentials" is not defined', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env,
branch: {name: 'master'},
options: {repositoryUrl: 'https://host.null/owner/repo.git'},
branch: { name: "master" },
options: { repositoryUrl: "https://host.null/owner/repo.git" },
}),
'https://host.null/owner/repo.git'
"https://host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is not defined and repositoryUrl is a "git+https" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env,
branch: {name: 'master'},
options: {repositoryUrl: 'git+https://host.null/owner/repo.git'},
branch: { name: "master" },
options: { repositoryUrl: "git+https://host.null/owner/repo.git" },
}),
'https://host.null/owner/repo.git'
"https://host.null/owner/repo.git"
);
});
test('Do not add trailing ".git" if not present in the original URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({cwd, env, branch: {name: 'master'}, options: {repositoryUrl: 'git@host.null:owner/repo'}}),
'git@host.null:owner/repo'
await getAuthUrl({ cwd, env, branch: { name: "master" }, options: { repositoryUrl: "git@host.null:owner/repo" } }),
"git@host.null:owner/repo"
);
});
test('Handle "https" URL with group and subgroup', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env,
branch: {name: 'master'},
options: {repositoryUrl: 'https://host.null/group/subgroup/owner/repo.git'},
branch: { name: "master" },
options: { repositoryUrl: "https://host.null/group/subgroup/owner/repo.git" },
}),
'https://host.null/group/subgroup/owner/repo.git'
"https://host.null/group/subgroup/owner/repo.git"
);
});
test('Handle "git" URL with group and subgroup', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env,
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:group/subgroup/owner/repo.git'},
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:group/subgroup/owner/repo.git" },
}),
'git@host.null:group/subgroup/owner/repo.git'
"git@host.null:group/subgroup/owner/repo.git"
);
});
test('Convert shorthand URL', async (t) => {
const {cwd} = await gitRepo();
test("Convert shorthand URL", async (t) => {
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env,
branch: {name: 'master'},
options: {repositoryUrl: 'semantic-release/semantic-release'},
branch: { name: "master" },
options: { repositoryUrl: "semantic-release/semantic-release" },
}),
'https://github.com/semantic-release/semantic-release.git'
"https://github.com/semantic-release/semantic-release.git"
);
});
test('Convert GitLab shorthand URL', async (t) => {
const {cwd} = await gitRepo();
test("Convert GitLab shorthand URL", async (t) => {
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env,
branch: {name: 'master'},
options: {repositoryUrl: 'gitlab:semantic-release/semantic-release'},
branch: { name: "master" },
options: { repositoryUrl: "gitlab:semantic-release/semantic-release" },
}),
'https://gitlab.com/semantic-release/semantic-release.git'
"https://gitlab.com/semantic-release/semantic-release.git"
);
});
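The two shorthand tests above expect `owner/repo` and `gitlab:owner/repo` to expand into full HTTPS clone URLs. A simplified sketch of that expansion (the real `get-git-auth-url.js` delegates this to proper URL parsing; the host table and regexes below are assumptions covering only the cases the tests exercise):

```javascript
// Expand repository shorthands like "owner/repo" (GitHub by default)
// and "gitlab:owner/repo" into full HTTPS clone URLs, leaving full
// URLs and scp-like git URLs untouched.
const SHORTHAND_HOSTS = {
  github: "https://github.com",
  gitlab: "https://gitlab.com",
  bitbucket: "https://bitbucket.org",
};

function expandShorthand(repositoryUrl) {
  // Already a full URL (https://, git+https://, ...) or scp-like git@ URL.
  if (/^(?:[a-z+]+:\/\/|git@)/.test(repositoryUrl)) return repositoryUrl;
  // Optional "host:" prefix followed by exactly "owner/repo".
  const match = /^(?:([a-z]+):)?([^/]+\/[^/]+)$/.exec(repositoryUrl);
  if (!match) return repositoryUrl;
  const host = SHORTHAND_HOSTS[match[1] ?? "github"];
  return host ? `${host}/${match[2]}.git` : repositoryUrl;
}
```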
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://user:pass@host.null/owner/repo.git'
"https://user:pass@host.null/owner/repo.git"
);
});
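The `GIT_CREDENTIALS` tests that follow all assert the same rewrite: an scp-like `git@host:path` URL plus credentials becomes an `https://` URL with the credentials embedded, while `http(s)` URLs keep their protocol. A minimal sketch of that rewrite (the real `get-git-auth-url.js` also handles platform tokens, custom ports, and `ssh://` URLs; this covers only the simple cases):

```javascript
// Rewrite "git@host.null:owner/repo.git" + "user:pass" into
// "https://user:pass@host.null/owner/repo.git", as the tests expect.
function authUrl(repositoryUrl, credentials) {
  if (!credentials) return repositoryUrl;
  // scp-like syntax: git@<host>:<path>
  const scpLike = /^git@([^:]+):(.+)$/.exec(repositoryUrl);
  if (scpLike) return `https://${credentials}@${scpLike[1]}/${scpLike[2]}`;
  // http(s) URLs keep their protocol; credentials slot in after "//".
  return repositoryUrl.replace(/^(https?:\/\/)/, `$1${credentials}@`);
}
```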
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
options: {branch: 'master', repositoryUrl: 'host.null:owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: { branch: "master", repositoryUrl: "host.null:owner/repo.git" },
}),
'https://user:pass@host.null/owner/repo.git'
"https://user:pass@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user and with a custom port', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
options: {branch: 'master', repositoryUrl: 'host.null:6666:owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: { branch: "master", repositoryUrl: "host.null:6666:owner/repo.git" },
}),
'https://user:pass@host.null:6666/owner/repo.git'
"https://user:pass@host.null:6666/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git" URL without user and with a custom port followed by a slash', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
options: {branch: 'master', repositoryUrl: 'host.null:6666:/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: { branch: "master", repositoryUrl: "host.null:6666:/owner/repo.git" },
}),
'https://user:pass@host.null:6666/owner/repo.git'
"https://user:pass@host.null:6666/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "https" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'https://host.null/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "https://host.null/owner/repo.git" },
}),
'https://user:pass@host.null/owner/repo.git'
"https://user:pass@host.null/owner/repo.git"
);
});
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "http" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'http://host.null/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "http://host.null/owner/repo.git" },
}),
'http://user:pass@host.null/owner/repo.git'
"http://user:pass@host.null/owner/repo.git"
);
});
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "http" URL with custom port', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
options: {branch: 'master', repositoryUrl: 'http://host.null:8080/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: { branch: "master", repositoryUrl: "http://host.null:8080/owner/repo.git" },
}),
'http://user:pass@host.null:8080/owner/repo.git'
"http://user:pass@host.null:8080/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git+https" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'git+https://host.null/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "git+https://host.null/owner/repo.git" },
}),
'https://user:pass@host.null/owner/repo.git'
"https://user:pass@host.null/owner/repo.git"
);
});
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "git+http" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'git+http://host.null/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "git+http://host.null/owner/repo.git" },
}),
'http://user:pass@host.null/owner/repo.git'
"http://user:pass@host.null/owner/repo.git"
);
});
test('Return the "http" formatted URL if "gitCredentials" is defined and repositoryUrl is a "ssh" URL', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
options: {branch: 'master', repositoryUrl: 'ssh://git@host.null:2222/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
options: { branch: "master", repositoryUrl: "ssh://git@host.null:2222/owner/repo.git" },
}),
'https://user:pass@host.null/owner/repo.git'
"https://user:pass@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "GH_TOKEN"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GH_TOKEN: 'token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, GH_TOKEN: "token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://token@host.null/owner/repo.git'
"https://token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "GITHUB_TOKEN"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GITHUB_TOKEN: 'token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, GITHUB_TOKEN: "token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://token@host.null/owner/repo.git'
"https://token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "GL_TOKEN"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GL_TOKEN: 'token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, GL_TOKEN: "token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://gitlab-ci-token:token@host.null/owner/repo.git'
"https://gitlab-ci-token:token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "GITLAB_TOKEN"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GITLAB_TOKEN: 'token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, GITLAB_TOKEN: "token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://gitlab-ci-token:token@host.null/owner/repo.git'
"https://gitlab-ci-token:token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "BB_TOKEN"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, BB_TOKEN: 'token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, BB_TOKEN: "token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://x-token-auth:token@host.null/owner/repo.git'
"https://x-token-auth:token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "BITBUCKET_TOKEN"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, BITBUCKET_TOKEN: 'token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, BITBUCKET_TOKEN: "token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://x-token-auth:token@host.null/owner/repo.git'
"https://x-token-auth:token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "BB_TOKEN_BASIC_AUTH"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, BB_TOKEN_BASIC_AUTH: 'username:token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, BB_TOKEN_BASIC_AUTH: "username:token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://username:token@host.null/owner/repo.git'
"https://username:token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "gitCredentials" is defined with "BITBUCKET_TOKEN_BASIC_AUTH"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, BITBUCKET_TOKEN_BASIC_AUTH: 'username:token'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, BITBUCKET_TOKEN_BASIC_AUTH: "username:token" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://username:token@host.null/owner/repo.git'
"https://username:token@host.null/owner/repo.git"
);
});
test('Return the "https" formatted URL if "GITHUB_ACTION" is set', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GITHUB_ACTION: 'foo', GITHUB_TOKEN: 'token'},
options: {branch: 'master', repositoryUrl: 'git@host.null:owner/repo.git'},
env: { ...env, GITHUB_ACTION: "foo", GITHUB_TOKEN: "token" },
options: { branch: "master", repositoryUrl: "git@host.null:owner/repo.git" },
}),
'https://x-access-token:token@host.null/owner/repo.git'
"https://x-access-token:token@host.null/owner/repo.git"
);
});
test('Handle "https" URL with group and subgroup, with "GIT_CREDENTIALS"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'https://host.null/group/subgroup/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "https://host.null/group/subgroup/owner/repo.git" },
}),
'https://user:pass@host.null/group/subgroup/owner/repo.git'
"https://user:pass@host.null/group/subgroup/owner/repo.git"
);
});
test('Handle "git" URL with group and subgroup, with "GIT_CREDENTIALS"', async (t) => {
const {cwd} = await gitRepo();
const { cwd } = await gitRepo();
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl: 'git@host.null:group/subgroup/owner/repo.git'},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl: "git@host.null:group/subgroup/owner/repo.git" },
}),
'https://user:pass@host.null/group/subgroup/owner/repo.git'
"https://user:pass@host.null/group/subgroup/owner/repo.git"
);
});
test('Do not add git credential to repositoryUrl if push is allowed', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);
test("Do not add git credential to repositoryUrl if push is allowed", async (t) => {
const { cwd, repositoryUrl } = await gitRepo(true);
t.is(
await getAuthUrl({
cwd,
env: {...env, GIT_CREDENTIALS: 'user:pass'},
branch: {name: 'master'},
options: {repositoryUrl},
env: { ...env, GIT_CREDENTIALS: "user:pass" },
branch: { name: "master" },
options: { repositoryUrl },
}),
repositoryUrl
);


@@ -1,80 +1,80 @@
const test = require('ava');
const getLastRelease = require('../lib/get-last-release');
import test from "ava";
import getLastRelease from "../lib/get-last-release.js";
test('Get the highest non-prerelease valid tag', (t) => {
test("Get the highest non-prerelease valid tag", (t) => {
const result = getLastRelease({
branch: {
name: 'master',
name: "master",
tags: [
{version: '2.0.0', gitTag: 'v2.0.0', gitHead: 'v2.0.0'},
{version: '1.0.0', gitTag: 'v1.0.0', gitHead: 'v1.0.0'},
{version: '3.0.0-beta.1', gitTag: 'v3.0.0-beta.1', gitHead: 'v3.0.0-beta.1'},
{ version: "2.0.0", gitTag: "v2.0.0", gitHead: "v2.0.0" },
{ version: "1.0.0", gitTag: "v1.0.0", gitHead: "v1.0.0" },
{ version: "3.0.0-beta.1", gitTag: "v3.0.0-beta.1", gitHead: "v3.0.0-beta.1" },
],
type: 'release',
type: "release",
},
options: {tagFormat: `v\${version}`},
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {version: '2.0.0', gitTag: 'v2.0.0', name: 'v2.0.0', gitHead: 'v2.0.0', channels: undefined});
t.deepEqual(result, { version: "2.0.0", gitTag: "v2.0.0", name: "v2.0.0", gitHead: "v2.0.0", channels: undefined });
});
test('Get the highest prerelease valid tag, ignoring other tags from other prerelease channels', (t) => {
test("Get the highest prerelease valid tag, ignoring other tags from other prerelease channels", (t) => {
const result = getLastRelease({
branch: {
name: 'beta',
prerelease: 'beta',
channel: 'beta',
name: "beta",
prerelease: "beta",
channel: "beta",
tags: [
{version: '1.0.0-beta.1', gitTag: 'v1.0.0-beta.1', gitHead: 'v1.0.0-beta.1', channels: ['beta']},
{version: '1.0.0-beta.2', gitTag: 'v1.0.0-beta.2', gitHead: 'v1.0.0-beta.2', channels: ['beta']},
{version: '1.0.0-alpha.1', gitTag: 'v1.0.0-alpha.1', gitHead: 'v1.0.0-alpha.1', channels: ['alpha']},
{ version: "1.0.0-beta.1", gitTag: "v1.0.0-beta.1", gitHead: "v1.0.0-beta.1", channels: ["beta"] },
{ version: "1.0.0-beta.2", gitTag: "v1.0.0-beta.2", gitHead: "v1.0.0-beta.2", channels: ["beta"] },
{ version: "1.0.0-alpha.1", gitTag: "v1.0.0-alpha.1", gitHead: "v1.0.0-alpha.1", channels: ["alpha"] },
],
type: 'prerelease',
type: "prerelease",
},
options: {tagFormat: `v\${version}`},
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {
version: '1.0.0-beta.2',
gitTag: 'v1.0.0-beta.2',
name: 'v1.0.0-beta.2',
gitHead: 'v1.0.0-beta.2',
channels: ['beta'],
version: "1.0.0-beta.2",
gitTag: "v1.0.0-beta.2",
name: "v1.0.0-beta.2",
gitHead: "v1.0.0-beta.2",
channels: ["beta"],
});
});
test('Return empty object if no valid tag is found', (t) => {
test("Return empty object if no valid tag is found", (t) => {
const result = getLastRelease({
branch: {
name: 'master',
tags: [{version: '3.0.0-beta.1', gitTag: 'v3.0.0-beta.1', gitHead: 'v3.0.0-beta.1'}],
type: 'release',
name: "master",
tags: [{ version: "3.0.0-beta.1", gitTag: "v3.0.0-beta.1", gitHead: "v3.0.0-beta.1" }],
type: "release",
},
options: {tagFormat: `v\${version}`},
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {});
});
test('Get the highest non-prerelease valid tag before a certain version', (t) => {
test("Get the highest non-prerelease valid tag before a certain version", (t) => {
const result = getLastRelease(
{
branch: {
name: 'master',
name: "master",
channel: undefined,
tags: [
{version: '2.0.0', gitTag: 'v2.0.0', gitHead: 'v2.0.0'},
{version: '1.0.0', gitTag: 'v1.0.0', gitHead: 'v1.0.0'},
{version: '2.0.0-beta.1', gitTag: 'v2.0.0-beta.1', gitHead: 'v2.0.0-beta.1'},
{version: '2.1.0', gitTag: 'v2.1.0', gitHead: 'v2.1.0'},
{version: '2.1.1', gitTag: 'v2.1.1', gitHead: 'v2.1.1'},
{ version: "2.0.0", gitTag: "v2.0.0", gitHead: "v2.0.0" },
{ version: "1.0.0", gitTag: "v1.0.0", gitHead: "v1.0.0" },
{ version: "2.0.0-beta.1", gitTag: "v2.0.0-beta.1", gitHead: "v2.0.0-beta.1" },
{ version: "2.1.0", gitTag: "v2.1.0", gitHead: "v2.1.0" },
{ version: "2.1.1", gitTag: "v2.1.1", gitHead: "v2.1.1" },
],
type: 'release',
type: "release",
},
options: {tagFormat: `v\${version}`},
options: { tagFormat: `v\${version}` },
},
{before: '2.1.0'}
{ before: "2.1.0" }
);
t.deepEqual(result, {version: '2.0.0', gitTag: 'v2.0.0', name: 'v2.0.0', gitHead: 'v2.0.0', channels: undefined});
t.deepEqual(result, { version: "2.0.0", gitTag: "v2.0.0", name: "v2.0.0", gitHead: "v2.0.0", channels: undefined });
});


@@ -1,15 +1,15 @@
const test = require('ava');
const {spy} = require('sinon');
const getLogger = require('../lib/get-logger');
import test from "ava";
import { spy } from "sinon";
import getLogger from "../lib/get-logger.js";
test('Expose "error", "success" and "log" functions', (t) => {
const stdout = spy();
const stderr = spy();
const logger = getLogger({stdout: {write: stdout}, stderr: {write: stderr}});
const logger = getLogger({ stdout: { write: stdout }, stderr: { write: stderr } });
logger.log('test log');
logger.success('test success');
logger.error('test error');
logger.log("test log");
logger.success("test success");
logger.error("test error");
t.regex(stdout.args[0][0], /.*test log/);
t.regex(stdout.args[1][0], /.*test success/);


@@ -1,277 +1,277 @@
const test = require('ava');
const {stub} = require('sinon');
const getNextVersion = require('../lib/get-next-version');
import test from "ava";
import { stub } from "sinon";
import getNextVersion from "../lib/get-next-version.js";
test.beforeEach((t) => {
// Stub the logger functions
t.context.log = stub();
t.context.logger = {log: t.context.log};
t.context.logger = { log: t.context.log };
});
test('Increase version for patch release', (t) => {
test("Increase version for patch release", (t) => {
t.is(
getNextVersion({
branch: {name: 'master', type: 'release', tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}]},
nextRelease: {type: 'patch'},
lastRelease: {version: '1.0.0', channels: [null]},
branch: { name: "master", type: "release", tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }] },
nextRelease: { type: "patch" },
lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger,
}),
'1.0.1'
"1.0.1"
);
});
test('Increase version for minor release', (t) => {
test("Increase version for minor release", (t) => {
t.is(
getNextVersion({
branch: {name: 'master', type: 'release', tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}]},
nextRelease: {type: 'minor'},
lastRelease: {version: '1.0.0', channels: [null]},
branch: { name: "master", type: "release", tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }] },
nextRelease: { type: "minor" },
lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger,
}),
'1.1.0'
"1.1.0"
);
});
test('Increase version for major release', (t) => {
test("Increase version for major release", (t) => {
t.is(
getNextVersion({
branch: {name: 'master', type: 'release', tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}]},
nextRelease: {type: 'major'},
lastRelease: {version: '1.0.0', channels: [null]},
branch: { name: "master", type: "release", tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }] },
nextRelease: { type: "major" },
lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger,
}),
'2.0.0'
"2.0.0"
);
});
test('Return 1.0.0 if there is no previous release', (t) => {
test("Return 1.0.0 if there is no previous release", (t) => {
t.is(
getNextVersion({
branch: {name: 'master', type: 'release', tags: []},
nextRelease: {type: 'minor'},
branch: { name: "master", type: "release", tags: [] },
nextRelease: { type: "minor" },
lastRelease: {},
logger: t.context.logger,
}),
'1.0.0'
"1.0.0"
);
});
test('Increase version for patch release on prerelease branch', (t) => {
test("Increase version for patch release on prerelease branch", (t) => {
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}],
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }],
},
nextRelease: {type: 'patch', channel: 'beta'},
lastRelease: {version: '1.0.0', channels: [null]},
nextRelease: { type: "patch", channel: "beta" },
lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger,
}),
'1.0.1-beta.1'
"1.0.1-beta.1"
);
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v1.0.1-beta.1', version: '1.0.1-beta.1', channels: ['beta']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.0.1-beta.1", version: "1.0.1-beta.1", channels: ["beta"] },
],
},
nextRelease: {type: 'patch', channel: 'beta'},
lastRelease: {version: '1.0.1-beta.1', channels: ['beta']},
nextRelease: { type: "patch", channel: "beta" },
lastRelease: { version: "1.0.1-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'1.0.1-beta.2'
"1.0.1-beta.2"
);
t.is(
getNextVersion({
branch: {
name: 'alpha',
type: 'prerelease',
prerelease: 'alpha',
tags: [{gitTag: 'v1.0.1-beta.1', version: '1.0.1-beta.1', channels: ['beta']}],
name: "alpha",
type: "prerelease",
prerelease: "alpha",
tags: [{ gitTag: "v1.0.1-beta.1", version: "1.0.1-beta.1", channels: ["beta"] }],
},
nextRelease: {type: 'patch', channel: 'alpha'},
lastRelease: {version: '1.0.1-beta.1', channels: ['beta']},
nextRelease: { type: "patch", channel: "alpha" },
lastRelease: { version: "1.0.1-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'1.0.2-alpha.1'
"1.0.2-alpha.1"
);
});
test('Increase version for minor release on prerelease branch', (t) => {
test("Increase version for minor release on prerelease branch", (t) => {
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}],
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }],
},
nextRelease: {type: 'minor', channel: 'beta'},
lastRelease: {version: '1.0.0', channels: [null]},
nextRelease: { type: "minor", channel: "beta" },
lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger,
}),
'1.1.0-beta.1'
"1.1.0-beta.1"
);
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: ['beta']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: ["beta"] },
],
},
nextRelease: {type: 'minor', channel: 'beta'},
lastRelease: {version: '1.1.0-beta.1', channels: ['beta']},
nextRelease: { type: "minor", channel: "beta" },
lastRelease: { version: "1.1.0-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'1.1.0-beta.2'
"1.1.0-beta.2"
);
t.is(
getNextVersion({
branch: {
name: 'alpha',
type: 'prerelease',
prerelease: 'alpha',
tags: [{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: ['beta']}],
name: "alpha",
type: "prerelease",
prerelease: "alpha",
tags: [{ gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: ["beta"] }],
},
nextRelease: {type: 'minor', channel: 'alpha'},
lastRelease: {version: '1.1.0-beta.1', channels: ['beta']},
nextRelease: { type: "minor", channel: "alpha" },
lastRelease: { version: "1.1.0-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'1.2.0-alpha.1'
"1.2.0-alpha.1"
);
});
test('Increase version for major release on prerelease branch', (t) => {
test("Increase version for major release on prerelease branch", (t) => {
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]}],
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] }],
},
nextRelease: {type: 'major', channel: 'beta'},
lastRelease: {version: '1.0.0', channels: [null]},
nextRelease: { type: "major", channel: "beta" },
lastRelease: { version: "1.0.0", channels: [null] },
logger: t.context.logger,
}),
'2.0.0-beta.1'
"2.0.0-beta.1"
);
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v2.0.0-beta.1', version: '2.0.0-beta.1', channels: ['beta']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v2.0.0-beta.1", version: "2.0.0-beta.1", channels: ["beta"] },
],
},
nextRelease: {type: 'major', channel: 'beta'},
lastRelease: {version: '2.0.0-beta.1', channels: ['beta']},
nextRelease: { type: "major", channel: "beta" },
lastRelease: { version: "2.0.0-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'2.0.0-beta.2'
"2.0.0-beta.2"
);
t.is(
getNextVersion({
branch: {
name: 'alpha',
type: 'prerelease',
prerelease: 'alpha',
tags: [{gitTag: 'v2.0.0-beta.1', version: '2.0.0-beta.1', channels: ['beta']}],
name: "alpha",
type: "prerelease",
prerelease: "alpha",
tags: [{ gitTag: "v2.0.0-beta.1", version: "2.0.0-beta.1", channels: ["beta"] }],
},
nextRelease: {type: 'major', channel: 'alpha'},
lastRelease: {version: '2.0.0-beta.1', channels: ['beta']},
nextRelease: { type: "major", channel: "alpha" },
lastRelease: { version: "2.0.0-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'3.0.0-alpha.1'
"3.0.0-alpha.1"
);
});
test('Return 1.0.0 if there is no previous release on prerelease branch', (t) => {
test("Return 1.0.0 if there is no previous release on prerelease branch", (t) => {
t.is(
getNextVersion({
branch: {name: 'beta', type: 'prerelease', prerelease: 'beta', tags: []},
nextRelease: {type: 'minor'},
branch: { name: "beta", type: "prerelease", prerelease: "beta", tags: [] },
nextRelease: { type: "minor" },
lastRelease: {},
logger: t.context.logger,
}),
'1.0.0-beta.1'
"1.0.0-beta.1"
);
});
test('Increase version for release on prerelease branch after previous commits were merged to release branch', (t) => {
test("Increase version for release on prerelease branch after previous commits were merged to release branch", (t) => {
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]}, // Version v1.1.0 released on default branch after beta was merged into master
{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: [null, 'beta']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: [null] }, // Version v1.1.0 released on default branch after beta was merged into master
{ gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: [null, "beta"] },
],
},
nextRelease: {type: 'minor'},
lastRelease: {version: '1.1.0', channels: [null]},
nextRelease: { type: "minor" },
lastRelease: { version: "1.1.0", channels: [null] },
logger: t.context.logger,
}),
'1.2.0-beta.1'
"1.2.0-beta.1"
);
});
test('Increase version for release on prerelease branch based on highest commit type since last regular release', (t) => {
test("Increase version for release on prerelease branch based on highest commit type since last regular release", (t) => {
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v1.1.0-beta.1', version: '1.1.0-beta.1', channels: [null, 'beta']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0-beta.1", version: "1.1.0-beta.1", channels: [null, "beta"] },
],
},
nextRelease: {type: 'major'},
lastRelease: {version: 'v1.1.0-beta.1', channels: [null]},
nextRelease: { type: "major" },
lastRelease: { version: "v1.1.0-beta.1", channels: [null] },
logger: t.context.logger,
}),
'2.0.0-beta.1'
"2.0.0-beta.1"
);
});
test('Increase version for release on prerelease branch when there is no regular releases on other branches', (t) => {
test("Increase version for release on prerelease branch when there is no regular releases on other branches", (t) => {
t.is(
getNextVersion({
branch: {
name: 'beta',
type: 'prerelease',
prerelease: 'beta',
tags: [{gitTag: 'v1.0.0-beta.1', version: '1.0.0-beta.1', channels: ['beta']}],
name: "beta",
type: "prerelease",
prerelease: "beta",
tags: [{ gitTag: "v1.0.0-beta.1", version: "1.0.0-beta.1", channels: ["beta"] }],
},
nextRelease: {type: 'minor', channel: 'beta'},
lastRelease: {version: 'v1.0.0-beta.1', channels: ['beta']},
nextRelease: { type: "minor", channel: "beta" },
lastRelease: { version: "v1.0.0-beta.1", channels: ["beta"] },
logger: t.context.logger,
}),
'1.0.0-beta.2'
"1.0.0-beta.2"
);
});


@@ -1,189 +1,189 @@
const test = require('ava');
const getReleaseToAdd = require('../lib/get-release-to-add');
import test from "ava";
import getReleaseToAdd from "../lib/get-release-to-add.js";
test('Return versions merged from release to maintenance branch, excluding lower than branch start range', (t) => {
test("Return versions merged from release to maintenance branch, excluding lower than branch start range", (t) => {
const result = getReleaseToAdd({
branch: {
name: '2.x',
channel: '2.x',
type: 'maintenance',
mergeRange: '>=2.0.0 <3.0.0',
name: "2.x",
channel: "2.x",
type: "maintenance",
mergeRange: ">=2.0.0 <3.0.0",
tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['2.x']},
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]},
{gitTag: 'v2.1.0', version: '2.1.0', channels: [null]},
{gitTag: 'v2.1.1', version: '2.1.1', channels: [null]},
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]},
{ gitTag: "v2.0.0", version: "2.0.0", channels: ["2.x"] },
{ gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{ gitTag: "v2.1.0", version: "2.1.0", channels: [null] },
{ gitTag: "v2.1.1", version: "2.1.1", channels: [null] },
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
],
},
branches: [{name: '2.x', channel: '2.x'}, {name: 'master'}],
options: {tagFormat: `v\${version}`},
branches: [{ name: "2.x", channel: "2.x" }, { name: "master" }],
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {
lastRelease: {version: '2.1.0', channels: [null], gitTag: 'v2.1.0', name: 'v2.1.0', gitHead: 'v2.1.0'},
lastRelease: { version: "2.1.0", channels: [null], gitTag: "v2.1.0", name: "v2.1.0", gitHead: "v2.1.0" },
currentRelease: {
type: 'patch',
version: '2.1.1',
type: "patch",
version: "2.1.1",
channels: [null],
gitTag: 'v2.1.1',
name: 'v2.1.1',
gitHead: 'v2.1.1',
gitTag: "v2.1.1",
name: "v2.1.1",
gitHead: "v2.1.1",
},
nextRelease: {
type: 'patch',
version: '2.1.1',
channel: '2.x',
gitTag: 'v2.1.1',
name: 'v2.1.1',
gitHead: 'v2.1.1',
type: "patch",
version: "2.1.1",
channel: "2.x",
gitTag: "v2.1.1",
name: "v2.1.1",
gitHead: "v2.1.1",
},
});
});
test('Return versions merged between release branches', (t) => {
test("Return versions merged between release branches", (t) => {
const result = getReleaseToAdd({
branch: {
name: 'master',
name: "master",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null, 'next']},
{gitTag: 'v1.1.0', version: '1.1.0', channels: ['next']},
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['next-major']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null, "next"] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: ["next"] },
{ gitTag: "v2.0.0", version: "2.0.0", channels: ["next-major"] },
],
},
branches: [{name: 'master'}, {name: 'next', channel: 'next'}, {name: 'next-major', channel: 'next-major'}],
options: {tagFormat: `v\${version}`},
branches: [{ name: "master" }, { name: "next", channel: "next" }, { name: "next-major", channel: "next-major" }],
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {
lastRelease: {
version: '1.1.0',
gitTag: 'v1.1.0',
name: 'v1.1.0',
gitHead: 'v1.1.0',
channels: ['next'],
version: "1.1.0",
gitTag: "v1.1.0",
name: "v1.1.0",
gitHead: "v1.1.0",
channels: ["next"],
},
currentRelease: {
type: 'major',
version: '2.0.0',
channels: ['next-major'],
gitTag: 'v2.0.0',
name: 'v2.0.0',
gitHead: 'v2.0.0',
type: "major",
version: "2.0.0",
channels: ["next-major"],
gitTag: "v2.0.0",
name: "v2.0.0",
gitHead: "v2.0.0",
},
nextRelease: {
type: 'major',
version: '2.0.0',
type: "major",
version: "2.0.0",
channel: null,
gitTag: 'v2.0.0',
name: 'v2.0.0',
gitHead: 'v2.0.0',
gitTag: "v2.0.0",
name: "v2.0.0",
gitHead: "v2.0.0",
},
});
});
test('Return releases sorted by ascending order', (t) => {
test("Return releases sorted by ascending order", (t) => {
const result = getReleaseToAdd({
branch: {
name: 'master',
name: "master",
tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: ['next-major']},
{gitTag: 'v1.1.0', version: '1.1.0', channels: ['next']},
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null, 'next']},
{ gitTag: "v2.0.0", version: "2.0.0", channels: ["next-major"] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: ["next"] },
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null, "next"] },
],
},
branches: [{name: 'master'}, {name: 'next', channel: 'next'}, {name: 'next-major', channel: 'next-major'}],
options: {tagFormat: `v\${version}`},
branches: [{ name: "master" }, { name: "next", channel: "next" }, { name: "next-major", channel: "next-major" }],
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {
lastRelease: {version: '1.1.0', gitTag: 'v1.1.0', name: 'v1.1.0', gitHead: 'v1.1.0', channels: ['next']},
lastRelease: { version: "1.1.0", gitTag: "v1.1.0", name: "v1.1.0", gitHead: "v1.1.0", channels: ["next"] },
currentRelease: {
type: 'major',
version: '2.0.0',
channels: ['next-major'],
gitTag: 'v2.0.0',
name: 'v2.0.0',
gitHead: 'v2.0.0',
type: "major",
version: "2.0.0",
channels: ["next-major"],
gitTag: "v2.0.0",
name: "v2.0.0",
gitHead: "v2.0.0",
},
nextRelease: {
type: 'major',
version: '2.0.0',
type: "major",
version: "2.0.0",
channel: null,
gitTag: 'v2.0.0',
name: 'v2.0.0',
gitHead: 'v2.0.0',
gitTag: "v2.0.0",
name: "v2.0.0",
gitHead: "v2.0.0",
},
});
});
test('No lastRelease', (t) => {
test("No lastRelease", (t) => {
const result = getReleaseToAdd({
branch: {
name: 'master',
tags: [{gitTag: 'v1.0.0', version: '1.0.0', channels: ['next']}],
name: "master",
tags: [{ gitTag: "v1.0.0", version: "1.0.0", channels: ["next"] }],
},
branches: [{name: 'master'}, {name: 'next', channel: 'next'}],
options: {tagFormat: `v\${version}`},
branches: [{ name: "master" }, { name: "next", channel: "next" }],
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {
lastRelease: {},
currentRelease: {
type: 'major',
version: '1.0.0',
channels: ['next'],
gitTag: 'v1.0.0',
name: 'v1.0.0',
gitHead: 'v1.0.0',
type: "major",
version: "1.0.0",
channels: ["next"],
gitTag: "v1.0.0",
name: "v1.0.0",
gitHead: "v1.0.0",
},
nextRelease: {
type: 'major',
version: '1.0.0',
type: "major",
version: "1.0.0",
channel: null,
gitTag: 'v1.0.0',
name: 'v1.0.0',
gitHead: 'v1.0.0',
gitTag: "v1.0.0",
name: "v1.0.0",
gitHead: "v1.0.0",
},
});
});
test('Ignore pre-release versions', (t) => {
test("Ignore pre-release versions", (t) => {
const result = getReleaseToAdd({
branch: {
name: 'master',
name: "master",
tags: [
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null, 'next']},
{gitTag: 'v1.1.0', version: '1.1.0', channels: ['next']},
{gitTag: 'v2.0.0-alpha.1', version: '2.0.0-alpha.1', channels: ['alpha']},
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null, "next"] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: ["next"] },
{ gitTag: "v2.0.0-alpha.1", version: "2.0.0-alpha.1", channels: ["alpha"] },
],
},
branches: [
{name: 'master'},
{name: 'next', channel: 'next'},
{name: 'alpha', type: 'prerelease', channel: 'alpha'},
{ name: "master" },
{ name: "next", channel: "next" },
{ name: "alpha", type: "prerelease", channel: "alpha" },
],
options: {tagFormat: `v\${version}`},
options: { tagFormat: `v\${version}` },
});
t.deepEqual(result, {
lastRelease: {version: '1.0.0', channels: [null, 'next'], gitTag: 'v1.0.0', name: 'v1.0.0', gitHead: 'v1.0.0'},
lastRelease: { version: "1.0.0", channels: [null, "next"], gitTag: "v1.0.0", name: "v1.0.0", gitHead: "v1.0.0" },
currentRelease: {
type: 'minor',
version: '1.1.0',
channels: ['next'],
gitTag: 'v1.1.0',
name: 'v1.1.0',
gitHead: 'v1.1.0',
type: "minor",
version: "1.1.0",
channels: ["next"],
gitTag: "v1.1.0",
name: "v1.1.0",
gitHead: "v1.1.0",
},
nextRelease: {
type: 'minor',
version: '1.1.0',
type: "minor",
version: "1.1.0",
channel: null,
gitTag: 'v1.1.0',
name: 'v1.1.0',
gitHead: 'v1.1.0',
gitTag: "v1.1.0",
name: "v1.1.0",
gitHead: "v1.1.0",
},
});
});
@@ -191,24 +191,24 @@ test('Ignore pre-release versions', (t) => {
test('Exclude versions merged from release to maintenance branch if they have the same "channel"', (t) => {
const result = getReleaseToAdd({
branch: {
name: '2.x',
channel: 'latest',
type: 'maintenance',
mergeRange: '>=2.0.0 <3.0.0',
name: "2.x",
channel: "latest",
type: "maintenance",
mergeRange: ">=2.0.0 <3.0.0",
tags: [
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]},
{gitTag: 'v2.0.0', version: '2.0.0', channels: [null]},
{gitTag: 'v2.1.0', version: '2.1.0', channels: [null]},
{gitTag: 'v2.1.1', version: '2.1.1', channels: [null]},
{gitTag: 'v1.0.0', version: '1.0.0', channels: [null]},
{gitTag: 'v1.1.0', version: '1.1.0', channels: [null]},
{ gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{ gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{ gitTag: "v2.1.0", version: "2.1.0", channels: [null] },
{ gitTag: "v2.1.1", version: "2.1.1", channels: [null] },
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
],
},
branches: [
{ name: "2.x", channel: "latest" },
{ name: "master", channel: "latest" },
],
options: { tagFormat: `v\${version}` },
});
t.is(result, undefined);
@@ -217,20 +217,20 @@ test('Exclude versions merged from release to maintenance branch if they have th
test('Exclude versions merged between release branches if they have the same "channel"', (t) => {
const result = getReleaseToAdd({
branch: {
name: "master",
channel: "latest",
tags: [
{ gitTag: "v1.0.0", channels: ["latest"], version: "1.0.0" },
{ gitTag: "v1.1.0", channels: ["latest"], version: "1.1.0" },
{ gitTag: "v2.0.0", channels: ["latest"], version: "2.0.0" },
],
},
branches: [
{ name: "master", channel: "latest" },
{ name: "next", channel: "latest" },
{ name: "next-major", channel: "latest" },
],
options: { tagFormat: `v\${version}` },
});
t.is(result, undefined);
@@ -239,43 +239,43 @@ test('Exclude versions merged between release branches if they have the same "ch
test('Exclude versions merged between release branches if they all have "channel" set to "false"', (t) => {
const result = getReleaseToAdd({
branch: {
name: "master",
channel: false,
tags: [
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
{ gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
],
},
branches: [
{ name: "master", channel: false },
{ name: "next", channel: false },
{ name: "next-major", channel: false },
],
options: { tagFormat: `v\${version}` },
});
t.is(result, undefined);
});
test("Exclude version numbers lower than the latest version already released on that branch", (t) => {
const result = getReleaseToAdd({
branch: {
name: "2.x",
channel: "2.x",
type: "maintenance",
mergeRange: ">=2.0.0 <3.0.0",
tags: [
{ gitTag: "v2.0.0", version: "2.0.0", channels: ["2.x"] },
{ gitTag: "v2.0.0", version: "2.0.0", channels: [null] },
{ gitTag: "v2.1.0", version: "2.1.0", channels: [null] },
{ gitTag: "v2.1.1", version: "2.1.1", channels: [null, "2.x"] },
{ gitTag: "v1.0.0", version: "1.0.0", channels: [null] },
{ gitTag: "v1.1.0", version: "1.1.0", channels: [null] },
],
},
branches: [{ name: "2.x", channel: "2.x" }, { name: "master" }],
options: { tagFormat: `v\${version}` },
});
t.is(result, undefined);


@@ -1,417 +1,417 @@
import test from "ava";
import { temporaryDirectory } from "tempy";
import {
addNote,
fetch,
fetchNotes,
getBranches,
getGitHead,
getNote,
getTagHead,
getTags,
isBranchUpToDate,
isGitRepo,
isRefExists,
push,
repoUrl,
tag,
push,
getTags,
getBranches,
isGitRepo,
verifyTagName,
isBranchUpToDate,
getNote,
addNote,
fetchNotes,
} from "../lib/git.js";
import {
gitAddConfig,
gitAddNote,
gitCheckout,
gitCommits,
gitCommitTag,
gitDetachedHead,
gitDetachedHeadFromBranch,
gitFetch,
gitGetCommits,
gitGetNote,
gitPush,
gitRemoteTagHead,
gitRepo,
gitShallowClone,
gitTagVersion,
initGit,
} from "./helpers/git-utils.js";
test("Get the last commit sha", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
const result = await getGitHead({ cwd });
t.is(result, commits[0].hash);
});
test("Throw error if the last commit sha cannot be found", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
await t.throwsAsync(getGitHead({ cwd }));
});
test("Unshallow and fetch repository", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
let { cwd, repositoryUrl } = await gitRepo();
// Add commits to the master branch
await gitCommits(["First", "Second"], { cwd });
// Create a shallow clone with only 1 commit
cwd = await gitShallowClone(repositoryUrl);
// Verify the shallow clone contains only one commit
t.is((await gitGetCommits(undefined, { cwd })).length, 1);
await fetch(repositoryUrl, "master", "master", { cwd });
// Verify the shallow clone contains all the commits
t.is((await gitGetCommits(undefined, { cwd })).length, 2);
});
test("Do not throw error when unshallow a complete repository", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
await gitCheckout("second-branch", true, { cwd });
await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, "second-branch", { cwd });
await t.notThrowsAsync(fetch(repositoryUrl, "master", "master", { cwd }));
await t.notThrowsAsync(fetch(repositoryUrl, "second-branch", "master", { cwd }));
});
test("Fetch all tags on a detached head repository", async (t) => {
let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(["First"], { cwd });
await gitTagVersion("v1.0.0", undefined, { cwd });
await gitCommits(["Second"], { cwd });
await gitTagVersion("v1.0.1", undefined, { cwd });
const [commit] = await gitCommits(["Third"], { cwd });
await gitTagVersion("v1.1.0", undefined, { cwd });
await gitPush(repositoryUrl, "master", { cwd });
cwd = await gitDetachedHead(repositoryUrl, commit.hash);
await fetch(repositoryUrl, "master", "master", { cwd });
t.deepEqual((await getTags("master", { cwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0"].sort());
});
test("Fetch all tags on a repository with a detached head from branch (CircleCI)", async (t) => {
let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(["First"], { cwd });
await gitTagVersion("v1.0.0", undefined, { cwd });
await gitCommits(["Second"], { cwd });
await gitTagVersion("v1.0.1", undefined, { cwd });
const [commit] = await gitCommits(["Third"], { cwd });
await gitTagVersion("v1.1.0", undefined, { cwd });
await gitPush(repositoryUrl, "master", { cwd });
await gitCheckout("other-branch", true, { cwd });
await gitPush(repositoryUrl, "other-branch", { cwd });
await gitCheckout("master", false, { cwd });
await gitCommits(["Fourth"], { cwd });
await gitTagVersion("v2.0.0", undefined, { cwd });
await gitPush(repositoryUrl, "master", { cwd });
cwd = await gitDetachedHeadFromBranch(repositoryUrl, "other-branch", commit.hash);
await fetch(repositoryUrl, "master", "other-branch", { cwd });
await fetch(repositoryUrl, "other-branch", "other-branch", { cwd });
t.deepEqual((await getTags("other-branch", { cwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0"].sort());
t.deepEqual((await getTags("master", { cwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0", "v2.0.0"].sort());
});
test("Fetch all tags on a detached head repository with outdated cached repo (GitLab CI)", async (t) => {
const { cwd, repositoryUrl } = await gitRepo();
await gitCommits(["First"], { cwd });
await gitTagVersion("v1.0.0", undefined, { cwd });
await gitCommits(["Second"], { cwd });
await gitTagVersion("v1.0.1", undefined, { cwd });
let [commit] = await gitCommits(["Third"], { cwd });
await gitTagVersion("v1.1.0", undefined, { cwd });
await gitPush(repositoryUrl, "master", { cwd });
// Create a clone (as first CI run would)
const cloneCwd = await gitShallowClone(repositoryUrl);
await gitFetch(repositoryUrl, { cwd: cloneCwd });
await gitCheckout(commit.hash, false, { cwd: cloneCwd });
// Push tag to remote
[commit] = await gitCommits(["Fourth"], { cwd });
await gitTagVersion("v1.2.0", undefined, { cwd });
await gitPush(repositoryUrl, "master", { cwd });
// Fetch on the cached repo and make detached head, leaving master outdated
await fetch(repositoryUrl, "master", "master", { cwd: cloneCwd });
await gitCheckout(commit.hash, false, { cwd: cloneCwd });
t.deepEqual((await getTags("master", { cwd: cloneCwd })).sort(), ["v1.0.0", "v1.0.1", "v1.1.0", "v1.2.0"].sort());
});
test("Verify if a branch exists", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
await gitCommits(["First"], { cwd });
// Create the new branch 'other-branch' from master
await gitCheckout("other-branch", true, { cwd });
// Add commits to the 'other-branch' branch
await gitCommits(["Second"], { cwd });
t.true(await isRefExists("master", { cwd }));
t.true(await isRefExists("other-branch", { cwd }));
t.falsy(await isRefExists("next", { cwd }));
});
test("Get all branches", async (t) => {
const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
await gitCheckout("second-branch", true, { cwd });
await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, "second-branch", { cwd });
await gitCheckout("third-branch", true, { cwd });
await gitCommits(["Third"], { cwd });
await gitPush(repositoryUrl, "third-branch", { cwd });
t.deepEqual((await getBranches(repositoryUrl, { cwd })).sort(), ["master", "second-branch", "third-branch"].sort());
});
test("Return empty array if there are no branches", async (t) => {
const { cwd, repositoryUrl } = await initGit(true);
t.deepEqual(await getBranches(repositoryUrl, { cwd }), []);
});
test("Get the commit sha for a given tag", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
// Create the tag corresponding to version 1.0.0
await gitTagVersion("v1.0.0", undefined, { cwd });
t.is(await getTagHead("v1.0.0", { cwd }), commits[0].hash);
});
test("Return git remote repository url from config", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add remote.origin.url config
await gitAddConfig("remote.origin.url", "git@hostname.com:owner/package.git", { cwd });
t.is(await repoUrl({ cwd }), "git@hostname.com:owner/package.git");
});
test("Return git remote repository url set while cloning", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(["First"], { cwd });
// Create a clone
cwd = await gitShallowClone(repositoryUrl);
t.is(await repoUrl({ cwd }), repositoryUrl);
});
test("Return falsy if git repository url is not set", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
t.falsy(await repoUrl({ cwd }));
});
test("Add tag on head commit", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
const commits = await gitCommits(["Test commit"], { cwd });
await tag("tag_name", "HEAD", { cwd });
await t.is(await gitCommitTag(commits[0].hash, { cwd }), "tag_name");
});
test("Push tag to remote repository", async (t) => {
// Create a git repository with a remote, set the current working directory at the root of the repo
const { cwd, repositoryUrl } = await gitRepo(true);
const commits = await gitCommits(["Test commit"], { cwd });
await tag("tag_name", "HEAD", { cwd });
await push(repositoryUrl, { cwd });
t.is(await gitRemoteTagHead(repositoryUrl, "tag_name", { cwd }), commits[0].hash);
});
test("Push tag to remote repository with remote branch ahead", async (t) => {
const { cwd, repositoryUrl } = await gitRepo(true);
const commits = await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
const temporaryRepo = await gitShallowClone(repositoryUrl);
await gitCommits(["Second"], { cwd: temporaryRepo });
await gitPush("origin", "master", { cwd: temporaryRepo });
await tag("tag_name", "HEAD", { cwd });
await push(repositoryUrl, { cwd });
t.is(await gitRemoteTagHead(repositoryUrl, "tag_name", { cwd }), commits[0].hash);
});
test('Return "true" if in a Git repository', async (t) => {
// Create a git repository with a remote, set the current working directory at the root of the repo
const { cwd } = await gitRepo(true);
t.true(await isGitRepo({ cwd }));
});
test("Return falsy if not in a Git repository", async (t) => {
const cwd = temporaryDirectory();
t.falsy(await isGitRepo({ cwd }));
});
test('Return "true" for valid tag names', async (t) => {
t.true(await verifyTagName("1.0.0"));
t.true(await verifyTagName("v1.0.0"));
t.true(await verifyTagName("tag_name"));
t.true(await verifyTagName("tag/name"));
});
test("Return falsy for invalid tag names", async (t) => {
t.falsy(await verifyTagName("?1.0.0"));
t.falsy(await verifyTagName("*1.0.0"));
t.falsy(await verifyTagName("[1.0.0]"));
t.falsy(await verifyTagName("1.0.0.."));
});
test("Throws error if obtaining the tags fails", async (t) => {
const cwd = temporaryDirectory();
await t.throwsAsync(getTags("master", { cwd }));
});
test('Return "true" if repository is up to date', async (t) => {
const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(["First"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
t.true(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
});
test("Return falsy if repository is not up to date", async (t) => {
const { cwd, repositoryUrl } = await gitRepo(true);
await gitCommits(["First"], { cwd });
await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
t.true(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
const temporaryRepo = await gitShallowClone(repositoryUrl);
await gitCommits(["Third"], { cwd: temporaryRepo });
await gitPush("origin", "master", { cwd: temporaryRepo });
t.falsy(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
});
test("Return falsy if detached head repository is not up to date", async (t) => {
let { cwd, repositoryUrl } = await gitRepo();
const [commit] = await gitCommits(["First"], { cwd });
await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
cwd = await gitDetachedHead(repositoryUrl, commit.hash);
await fetch(repositoryUrl, "master", "master", { cwd });
t.falsy(await isBranchUpToDate(repositoryUrl, "master", { cwd }));
});
test("Get a commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
await gitAddNote(JSON.stringify({ note: "note" }), commits[0].hash, { cwd });
t.deepEqual(await getNote(commits[0].hash, { cwd }), { note: "note" });
});
test("Return empty object if there is no commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
t.deepEqual(await getNote(commits[0].hash, { cwd }), {});
});
test("Throw error if a commit note is invalid", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
await gitAddNote("non-json note", commits[0].hash, { cwd });
await t.throwsAsync(getNote(commits[0].hash, { cwd }));
});
test("Add a commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
await addNote({ note: "note" }, commits[0].hash, { cwd });
t.is(await gitGetNote(commits[0].hash, { cwd }), '{"note":"note"}');
});
test("Overwrite a commit note", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
const { cwd } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First"], { cwd });
await addNote({ note: "note" }, commits[0].hash, { cwd });
await addNote({ note: "note2" }, commits[0].hash, { cwd });
t.is(await gitGetNote(commits[0].hash, { cwd }), '{"note":"note2"}');
});
test("Unshallow and fetch repository with notes", async (t) => {
// Create a git repository, set the current working directory at the root of the repo
let { cwd, repositoryUrl } = await gitRepo();
// Add commits to the master branch
const commits = await gitCommits(["First", "Second"], { cwd });
await gitAddNote(JSON.stringify({ note: "note" }), commits[0].hash, { cwd });
// Create a shallow clone with only 1 commit
cwd = await gitShallowClone(repositoryUrl);
// Verify the shallow clone doesn't contain the note
await t.throwsAsync(gitGetNote(commits[0].hash, { cwd }));
await fetch(repositoryUrl, "master", "master", { cwd });
await fetchNotes(repositoryUrl, { cwd });
// Verify the shallow clone contains the note
t.is(await gitGetNote(commits[0].hash, { cwd }), '{"note":"note"}');
});
test("Fetch all notes on a detached head repository", async (t) => {
let { cwd, repositoryUrl } = await gitRepo();
await gitCommits(["First"], { cwd });
const [commit] = await gitCommits(["Second"], { cwd });
await gitPush(repositoryUrl, "master", { cwd });
await gitAddNote(JSON.stringify({ note: "note" }), commit.hash, { cwd });
cwd = await gitDetachedHead(repositoryUrl, commit.hash);
await fetch(repositoryUrl, "master", "master", { cwd });
await fetchNotes(repositoryUrl, { cwd });
t.is(await gitGetNote(commit.hash, { cwd }), '{"note":"note"}');
});


@@ -1,10 +1,10 @@
import {temporaryDirectory} from 'tempy';
import {execa} from 'execa';
import fileUrl from 'file-url';
import pEachSeries from 'p-each-series';
import gitLogParser from 'git-log-parser';
import getStream from 'get-stream';
import {GIT_NOTE_REF} from '../../lib/definitions/constants.js';
/**
* Commit message information.
@@ -23,8 +23,8 @@ const {GIT_NOTE_REF} = require('../../lib/definitions/constants');
* @param {Boolean} withRemote `true` to create a shallow clone of a bare repository.
* @return {String} The path of the repository
*/
export async function initGit(withRemote) {
const cwd = temporaryDirectory();
const args = withRemote ? ['--bare', '--initial-branch=master'] : ['--initial-branch=master'];
await execa('git', ['init', ...args], {cwd}).catch(() => {
@@ -45,7 +45,7 @@ async function initGit(withRemote) {
* @param {String} [branch='master'] The branch to initialize.
* @return {String} The path of the clone if `withRemote` is `true`, the path of the repository otherwise.
*/
export async function gitRepo(withRemote, branch = 'master') {
let {cwd, repositoryUrl} = await initGit(withRemote);
if (withRemote) {
await initBareRepo(repositoryUrl, branch);
@@ -70,8 +70,8 @@ async function gitRepo(withRemote, branch = 'master') {
* @param {String} repositoryUrl The URL of the bare repository.
* @param {String} [branch='master'] the branch to initialize.
*/
async function initBareRepo(repositoryUrl, branch = 'master') {
const cwd = tempy.directory();
export async function initBareRepo(repositoryUrl, branch = 'master') {
const cwd = temporaryDirectory();
await execa('git', ['clone', '--no-hardlinks', repositoryUrl, cwd], {cwd});
await gitCheckout(branch, true, {cwd});
await gitCommits(['Initial commit'], {cwd});
@ -86,7 +86,7 @@ async function initBareRepo(repositoryUrl, branch = 'master') {
*
* @returns {Array<Commit>} The created commits, in reverse order (to match `git log` order).
*/
async function gitCommits(messages, execaOptions) {
export async function gitCommits(messages, execaOptions) {
await pEachSeries(
messages,
async (message) =>
@ -103,7 +103,7 @@ async function gitCommits(messages, execaOptions) {
*
* @return {Array<Object>} The list of parsed commits.
*/
async function gitGetCommits(from, execaOptions) {
export async function gitGetCommits(from, execaOptions) {
Object.assign(gitLogParser.fields, {hash: 'H', message: 'B', gitTags: 'd', committerDate: {key: 'ci', type: Date}});
return (
await getStream.array(
@ -126,7 +126,7 @@ async function gitGetCommits(from, execaOptions) {
* @param {Boolean} create `true` to create the branch, `false` to checkout an existing branch.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function gitCheckout(branch, create, execaOptions) {
export async function gitCheckout(branch, create, execaOptions) {
await execa('git', create ? ['checkout', '-b', branch] : ['checkout', branch], execaOptions);
}
@ -136,7 +136,7 @@ async function gitCheckout(branch, create, execaOptions) {
* @param {String} repositoryUrl The repository remote URL.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function gitFetch(repositoryUrl, execaOptions) {
export async function gitFetch(repositoryUrl, execaOptions) {
await execa('git', ['fetch', repositoryUrl], execaOptions);
}
@ -147,7 +147,7 @@ async function gitFetch(repositoryUrl, execaOptions) {
*
* @return {String} The sha of the head commit in the current git repository.
*/
async function gitHead(execaOptions) {
export async function gitHead(execaOptions) {
return (await execa('git', ['rev-parse', 'HEAD'], execaOptions)).stdout;
}
@ -158,7 +158,7 @@ async function gitHead(execaOptions) {
* @param {String} [sha] The commit on which to create the tag. If undefined the tag is created on the last commit.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function gitTagVersion(tagName, sha, execaOptions) {
export async function gitTagVersion(tagName, sha, execaOptions) {
await execa('git', sha ? ['tag', '-f', tagName, sha] : ['tag', tagName], execaOptions);
}
@ -171,8 +171,8 @@ async function gitTagVersion(tagName, sha, execaOptions) {
* @param {Number} [depth=1] The number of commit to clone.
* @return {String} The path of the cloned repository.
*/
async function gitShallowClone(repositoryUrl, branch = 'master', depth = 1) {
const cwd = tempy.directory();
export async function gitShallowClone(repositoryUrl, branch = 'master', depth = 1) {
const cwd = temporaryDirectory();
await execa('git', ['clone', '--no-hardlinks', '--no-tags', '-b', branch, '--depth', depth, repositoryUrl, cwd], {
cwd,
@ -187,8 +187,8 @@ async function gitShallowClone(repositoryUrl, branch = 'master', depth = 1) {
* @param {Number} head A commit sha of the remote repo that will become the detached head of the new one.
* @return {String} The path of the new repository.
*/
async function gitDetachedHead(repositoryUrl, head) {
const cwd = tempy.directory();
export async function gitDetachedHead(repositoryUrl, head) {
const cwd = temporaryDirectory();
await execa('git', ['init'], {cwd});
await execa('git', ['remote', 'add', 'origin', repositoryUrl], {cwd});
@ -197,8 +197,8 @@ async function gitDetachedHead(repositoryUrl, head) {
return cwd;
}
async function gitDetachedHeadFromBranch(repositoryUrl, branch, head) {
const cwd = tempy.directory();
export async function gitDetachedHeadFromBranch(repositoryUrl, branch, head) {
const cwd = temporaryDirectory();
await execa('git', ['init'], {cwd});
await execa('git', ['remote', 'add', 'origin', repositoryUrl], {cwd});
@ -215,7 +215,7 @@ async function gitDetachedHeadFromBranch(repositoryUrl, branch, head) {
* @param {String} value Config value.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function gitAddConfig(name, value, execaOptions) {
export async function gitAddConfig(name, value, execaOptions) {
await execa('git', ['config', '--add', name, value], execaOptions);
}
@ -227,7 +227,7 @@ async function gitAddConfig(name, value, execaOptions) {
*
* @return {String} The sha of the commit associated with `tagName` on the local repository.
*/
async function gitTagHead(tagName, execaOptions) {
export async function gitTagHead(tagName, execaOptions) {
return (await execa('git', ['rev-list', '-1', tagName], execaOptions)).stdout;
}
@ -240,7 +240,7 @@ async function gitTagHead(tagName, execaOptions) {
*
* @return {String} The sha of the commit associated with `tagName` on the remote repository.
*/
async function gitRemoteTagHead(repositoryUrl, tagName, execaOptions) {
export async function gitRemoteTagHead(repositoryUrl, tagName, execaOptions) {
return (await execa('git', ['ls-remote', '--tags', repositoryUrl, tagName], execaOptions)).stdout
.split('\n')
.filter((tag) => Boolean(tag))
@ -255,7 +255,7 @@ async function gitRemoteTagHead(repositoryUrl, tagName, execaOptions) {
*
* @return {String} The tag associated with the sha in parameter or `null`.
*/
async function gitCommitTag(gitHead, execaOptions) {
export async function gitCommitTag(gitHead, execaOptions) {
return (await execa('git', ['describe', '--tags', '--exact-match', gitHead], execaOptions)).stdout;
}
@ -268,7 +268,7 @@ async function gitCommitTag(gitHead, execaOptions) {
*
* @throws {Error} if the push failed.
*/
async function gitPush(repositoryUrl, branch, execaOptions) {
export async function gitPush(repositoryUrl, branch, execaOptions) {
await execa('git', ['push', '--tags', repositoryUrl, `HEAD:${branch}`], execaOptions);
}
@ -278,7 +278,7 @@ async function gitPush(repositoryUrl, branch, execaOptions) {
* @param {String} ref The ref to merge.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function merge(ref, execaOptions) {
export async function merge(ref, execaOptions) {
await execa('git', ['merge', '--no-ff', ref], execaOptions);
}
@ -288,7 +288,7 @@ async function merge(ref, execaOptions) {
* @param {String} ref The ref to merge.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function mergeFf(ref, execaOptions) {
export async function mergeFf(ref, execaOptions) {
await execa('git', ['merge', '--ff', ref], execaOptions);
}
@ -298,7 +298,7 @@ async function mergeFf(ref, execaOptions) {
* @param {String} ref The ref to merge.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function rebase(ref, execaOptions) {
export async function rebase(ref, execaOptions) {
await execa('git', ['rebase', ref], execaOptions);
}
@ -309,7 +309,7 @@ async function rebase(ref, execaOptions) {
* @param {String} ref The ref to add the note to.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function gitAddNote(note, ref, execaOptions) {
export async function gitAddNote(note, ref, execaOptions) {
await execa('git', ['notes', '--ref', GIT_NOTE_REF, 'add', '-m', note, ref], execaOptions);
}
@ -319,31 +319,6 @@ async function gitAddNote(note, ref, execaOptions) {
* @param {String} ref The ref to get the note from.
* @param {Object} [execaOpts] Options to pass to `execa`.
*/
async function gitGetNote(ref, execaOptions) {
export async function gitGetNote(ref, execaOptions) {
return (await execa('git', ['notes', '--ref', GIT_NOTE_REF, 'show', ref], execaOptions)).stdout;
}
module.exports = {
initGit,
gitRepo,
initBareRepo,
gitCommits,
gitGetCommits,
gitCheckout,
gitFetch,
gitHead,
gitTagVersion,
gitShallowClone,
gitDetachedHead,
gitDetachedHeadFromBranch,
gitAddConfig,
gitTagHead,
gitRemoteTagHead,
gitCommitTag,
gitPush,
merge,
mergeFf,
rebase,
gitAddNote,
gitGetNote,
};

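The `gitRemoteTagHead` helper above splits the stdout of `git ls-remote --tags` into non-empty lines; the diff is cut off right after the `.filter`, so the final column extraction below is an assumption. Canned output keeps the sketch runnable without git:

```javascript
// Sketch of the gitRemoteTagHead parsing step with canned `git ls-remote
// --tags` output. The trailing `.map`/destructuring is an assumption, since
// the diff above stops after the filter.
const stdout = [
  "91c1f2b8a8e0d1a5b7f3c4d5e6f708192a3b4c5d\trefs/tags/v1.0.0",
  "", // ls-remote output ends with a newline, producing one empty entry
].join("\n");

const [sha] = stdout
  .split("\n")
  .filter((tag) => Boolean(tag)) // drop empty lines, as in the diff
  .map((line) => line.split("\t")[0]); // keep the sha column

console.log(sha); // 91c1f2b8a8e0d1a5b7f3c4d5e6f708192a3b4c5d
```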
View File

@ -1,7 +1,7 @@
const Docker = require('dockerode');
const getStream = require('get-stream');
const pRetry = require('p-retry');
const {initBareRepo, gitShallowClone} = require('./git-utils');
import Docker from 'dockerode';
import getStream from 'get-stream';
import pRetry from 'p-retry';
import {gitShallowClone, initBareRepo} from './git-utils.js';
const IMAGE = 'semanticrelease/docker-gitbox:latest';
const SERVER_PORT = 80;
@ -12,12 +12,12 @@ const GIT_PASSWORD = 'suchsecure';
const docker = new Docker();
let container;
const gitCredential = `${GIT_USERNAME}:${GIT_PASSWORD}`;
export const gitCredential = `${GIT_USERNAME}:${GIT_PASSWORD}`;
/**
* Download the `gitbox` Docker image, create a new container and start it.
*/
async function start() {
export async function start() {
await getStream(await docker.pull(IMAGE));
container = await docker.createContainer({
@ -38,7 +38,7 @@ async function start() {
/**
* Stop and remove the `gitbox` Docker container.
*/
async function stop() {
export async function stop() {
await container.stop();
await container.remove();
}
@ -51,7 +51,7 @@ async function stop() {
* @param {String} [description=`Repository ${name}`] The repository description.
* @return {Object} The `repositoryUrl` (URL without auth) and `authUrl` (URL with auth).
*/
async function createRepo(name, branch = 'master', description = `Repository ${name}`) {
export async function createRepo(name, branch = 'master', description = `Repository ${name}`) {
const exec = await container.exec({
Cmd: ['repo-admin', '-n', name, '-d', description],
AttachStdout: true,
@ -68,5 +68,3 @@ async function createRepo(name, branch = 'master', description = `Repository ${n
return {cwd, repositoryUrl, authUrl};
}
module.exports = {start, stop, gitCredential, createRepo};

View File

@ -1,8 +1,8 @@
const Docker = require('dockerode');
const getStream = require('get-stream');
const got = require('got');
const pRetry = require('p-retry');
const {mockServerClient} = require('mockserver-client');
import Docker from 'dockerode';
import getStream from 'get-stream';
import got from 'got';
import pRetry from 'p-retry';
import {mockServerClient} from 'mockserver-client';
const IMAGE = 'mockserver/mockserver:latest';
const MOCK_SERVER_PORT = 1080;
@ -13,7 +13,7 @@ let container;
/**
* Download the `mockserver` Docker image, create a new container and start it.
*/
async function start() {
export async function start() {
await getStream(await docker.pull(IMAGE));
container = await docker.createContainer({
@ -38,7 +38,7 @@ async function start() {
/**
* Stop and remove the `mockserver` Docker container.
*/
async function stop() {
export async function stop() {
await container.stop();
await container.remove();
}
@ -50,7 +50,7 @@ const client = mockServerClient(MOCK_SERVER_HOST, MOCK_SERVER_PORT);
/**
* @type {string} the url of the `mockserver` instance
*/
const url = `http://${MOCK_SERVER_HOST}:${MOCK_SERVER_PORT}`;
export const url = `http://${MOCK_SERVER_HOST}:${MOCK_SERVER_PORT}`;
/**
* Set up the `mockserver` instance response for a specific request.
@ -65,7 +65,7 @@ const url = `http://${MOCK_SERVER_HOST}:${MOCK_SERVER_PORT}`;
* @param {Object} response.body The JSON object to respond in the response body.
* @return {Object} An object representation the expectation. Pass to the `verify` function to validate the `mockserver` has been called with a `request` matching the expectations.
*/
async function mock(
export async function mock(
path,
{body: requestBody, headers: requestHeaders},
{method = 'POST', statusCode = 200, body: responseBody}
@ -96,8 +96,6 @@ async function mock(
* @param {Object} expectation The expectation created with `mock` function.
* @return {Promise} A Promise that resolves if the expectation is met or reject otherwise.
*/
function verify(expectation) {
export function verify(expectation) {
return client.verify(expectation);
}
module.exports = {start, stop, mock, verify, url};

View File

@ -1,23 +1,25 @@
const Docker = require('dockerode');
const getStream = require('get-stream');
const got = require('got');
const path = require('path');
const delay = require('delay');
const pRetry = require('p-retry');
import path, {dirname} from 'node:path';
import {fileURLToPath} from 'node:url';
import Docker from 'dockerode';
import getStream from 'get-stream';
import got from 'got';
import delay from 'delay';
import pRetry from 'p-retry';
const IMAGE = 'verdaccio/verdaccio:4';
const IMAGE = 'verdaccio/verdaccio:5';
const REGISTRY_PORT = 4873;
const REGISTRY_HOST = 'localhost';
const NPM_USERNAME = 'integration';
const NPM_PASSWORD = 'suchsecure';
const NPM_EMAIL = 'integration@test.com';
const docker = new Docker();
let container;
const __dirname = dirname(fileURLToPath(import.meta.url));
let container, npmToken;
/**
* Download the `npm-registry-docker` Docker image, create a new container and start it.
*/
async function start() {
export async function start() {
await getStream(await docker.pull(IMAGE));
container = await docker.createContainer({
@ -53,23 +55,28 @@ async function start() {
email: NPM_EMAIL,
},
});
// Create token for user
({token: npmToken} = await got(`http://${REGISTRY_HOST}:${REGISTRY_PORT}/-/npm/v1/tokens`, {
username: NPM_USERNAME,
password: NPM_PASSWORD,
method: 'POST',
headers: {'content-type': 'application/json'},
json: {password: NPM_PASSWORD, readonly: false, cidr_whitelist: []}
}).json());
}
const url = `http://${REGISTRY_HOST}:${REGISTRY_PORT}/`;
export const url = `http://${REGISTRY_HOST}:${REGISTRY_PORT}/`;
const authEnv = {
export const authEnv = () => ({
npm_config_registry: url, // eslint-disable-line camelcase
NPM_USERNAME,
NPM_PASSWORD,
NPM_EMAIL,
};
NPM_TOKEN: npmToken,
});
/**
* Stop and remove the `npm-registry-docker` Docker container.
*/
async function stop() {
export async function stop() {
await container.stop();
await container.remove();
}
module.exports = {start, stop, authEnv, url};

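The registry helper above turns `authEnv` from a static object into a factory. A plausible reason, sketched here with plain variables: `npmToken` is only assigned inside `start()`, so an object literal evaluated at module load would capture `undefined`, while a function reads whatever value is current at call time.

```javascript
// Sketch of the design choice: a token assigned later must be read lazily.
let npmToken; // assigned asynchronously, e.g. inside start()

const eagerEnv = { NPM_TOKEN: npmToken };        // captures undefined at load
const lazyEnv = () => ({ NPM_TOKEN: npmToken }); // reads the latest value

npmToken = "token-from-registry"; // stand-in for the token start() creates

console.log(eagerEnv.NPM_TOKEN);  // undefined
console.log(lazyEnv().NPM_TOKEN); // token-from-registry
```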
View File

@ -1,7 +1,5 @@
const execa = require('execa');
import {execa} from 'execa';
async function npmView(packageName, env) {
export async function npmView(packageName, env) {
return JSON.parse((await execa('npm', ['view', packageName, '--json'], {env})).stdout);
}
module.exports = {npmView};

View File

@ -1,18 +1,18 @@
const test = require('ava');
const {repeat} = require('lodash');
const hideSensitive = require('../lib/hide-sensitive');
const {SECRET_REPLACEMENT, SECRET_MIN_SIZE} = require('../lib/definitions/constants');
import test from "ava";
import { repeat } from "lodash-es";
import hideSensitive from "../lib/hide-sensitive.js";
import { SECRET_MIN_SIZE, SECRET_REPLACEMENT } from "../lib/definitions/constants.js";
test('Replace multiple sensitive environment variable values', (t) => {
const env = {SOME_PASSWORD: 'password', SOME_TOKEN: 'secret'};
test("Replace multiple sensitive environment variable values", (t) => {
const env = { SOME_PASSWORD: "password", SOME_TOKEN: "secret" };
t.is(
hideSensitive(env)(`https://user:${env.SOME_PASSWORD}@host.com?token=${env.SOME_TOKEN}`),
`https://user:${SECRET_REPLACEMENT}@host.com?token=${SECRET_REPLACEMENT}`
);
});
test('Replace multiple occurrences of sensitive environment variable values', (t) => {
const env = {secretKey: 'secret'};
test("Replace multiple occurrences of sensitive environment variable values", (t) => {
const env = { secretKey: "secret" };
t.is(
hideSensitive(env)(`https://user:${env.secretKey}@host.com?token=${env.secretKey}`),
`https://user:${SECRET_REPLACEMENT}@host.com?token=${SECRET_REPLACEMENT}`
@ -20,28 +20,28 @@ test('Replace multiple occurences of sensitive environment variable values', (t)
});
test('Replace sensitive environment variable matching specific regex for "private"', (t) => {
const env = {privateKey: 'secret', GOPRIVATE: 'host.com'};
const env = { privateKey: "secret", GOPRIVATE: "host.com" };
t.is(hideSensitive(env)(`https://host.com?token=${env.privateKey}`), `https://host.com?token=${SECRET_REPLACEMENT}`);
});
test('Replace url-encoded environment variable', (t) => {
const env = {privateKey: 'secret '};
test("Replace url-encoded environment variable", (t) => {
const env = { privateKey: "secret " };
t.is(
hideSensitive(env)(`https://host.com?token=${encodeURI(env.privateKey)}`),
`https://host.com?token=${SECRET_REPLACEMENT}`
);
});
test('Escape regexp special characters', (t) => {
const env = {SOME_CREDENTIALS: 'p$^{.+}\\w[a-z]o.*rd'};
test("Escape regexp special characters", (t) => {
const env = { SOME_CREDENTIALS: "p$^{.+}\\w[a-z]o.*rd" };
t.is(
hideSensitive(env)(`https://user:${env.SOME_CREDENTIALS}@host.com`),
`https://user:${SECRET_REPLACEMENT}@host.com`
);
});
test('Escape regexp special characters in url-encoded environment variable', (t) => {
const env = {SOME_PASSWORD: 'secret password p$^{.+}\\w[a-z]o.*rd)('};
test("Escape regexp special characters in url-encoded environment variable", (t) => {
const env = { SOME_PASSWORD: "secret password p$^{.+}\\w[a-z]o.*rd)(" };
t.is(
hideSensitive(env)(`https://user:${encodeURI(env.SOME_PASSWORD)}@host.com`),
`https://user:${SECRET_REPLACEMENT}@host.com`
@ -52,31 +52,31 @@ test('Accept "undefined" input', (t) => {
t.is(hideSensitive({})(), undefined);
});
test('Return same string if no environment variable has to be replaced', (t) => {
t.is(hideSensitive({})('test'), 'test');
test("Return same string if no environment variable has to be replaced", (t) => {
t.is(hideSensitive({})("test"), "test");
});
test('Exclude empty environment variables from the regexp', (t) => {
const env = {SOME_PASSWORD: 'password', SOME_TOKEN: ''};
test("Exclude empty environment variables from the regexp", (t) => {
const env = { SOME_PASSWORD: "password", SOME_TOKEN: "" };
t.is(
hideSensitive(env)(`https://user:${env.SOME_PASSWORD}@host.com?token=`),
`https://user:${SECRET_REPLACEMENT}@host.com?token=`
);
});
test('Exclude empty environment variables from the regexp if there are only empty ones', (t) => {
t.is(hideSensitive({SOME_PASSWORD: '', SOME_TOKEN: ' \n '})(`https://host.com?token=`), 'https://host.com?token=');
test("Exclude empty environment variables from the regexp if there are only empty ones", (t) => {
t.is(hideSensitive({ SOME_PASSWORD: "", SOME_TOKEN: " \n " })(`https://host.com?token=`), "https://host.com?token=");
});
test('Exclude nonsensitive GOPRIVATE environment variable for Golang projects from the regexp', (t) => {
const env = {GOPRIVATE: 'host.com'};
t.is(hideSensitive(env)(`https://host.com?token=`), 'https://host.com?token=');
test("Exclude nonsensitive GOPRIVATE environment variable for Golang projects from the regexp", (t) => {
const env = { GOPRIVATE: "host.com" };
t.is(hideSensitive(env)(`https://host.com?token=`), "https://host.com?token=");
});
test('Exclude environment variables with value shorter than SECRET_MIN_SIZE from the regexp', (t) => {
const SHORT_TOKEN = repeat('a', SECRET_MIN_SIZE - 1);
const LONG_TOKEN = repeat('b', SECRET_MIN_SIZE);
const env = {SHORT_TOKEN, LONG_TOKEN};
test("Exclude environment variables with value shorter than SECRET_MIN_SIZE from the regexp", (t) => {
const SHORT_TOKEN = repeat("a", SECRET_MIN_SIZE - 1);
const LONG_TOKEN = repeat("b", SECRET_MIN_SIZE);
const env = { SHORT_TOKEN, LONG_TOKEN };
t.is(
hideSensitive(env)(`https://user:${SHORT_TOKEN}@host.com?token=${LONG_TOKEN}`),
`https://user:${SHORT_TOKEN}@host.com?token=${SECRET_REPLACEMENT}`

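The escaping tests above imply that the secret values are joined into one alternation regexp, with regexp metacharacters escaped first. A simplified sketch of that idea (not semantic-release's actual implementation, which also filters by variable name and a minimum secret length); `[secure]` stands in for the real `SECRET_REPLACEMENT` constant:

```javascript
// Simplified sketch, assuming secrets are masked via one escaped regexp.
const SECRET_REPLACEMENT = "[secure]"; // assumption: stand-in for the constant

function escapeRegExp(value) {
  // Escape metacharacters so secrets like "p$^{.+}\w[a-z]o.*rd" match literally.
  return value.replace(/[|\\{}()[\]^$+*?.]/g, "\\$&");
}

function hideSensitive(env) {
  const secrets = Object.values(env).filter(Boolean); // skip empty values
  if (secrets.length === 0) return (output) => output;
  const regexp = new RegExp(secrets.map(escapeRegExp).join("|"), "g");
  return (output) => output.replace(regexp, SECRET_REPLACEMENT);
}

const env = { SOME_CREDENTIALS: "p$^{.+}w[a-z]o.*rd" };
console.log(hideSensitive(env)(`https://user:${env.SOME_CREDENTIALS}@host.com`));
// → https://user:[secure]@host.com
```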
File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -1,7 +1,7 @@
const test = require('ava');
const {noop} = require('lodash');
const {stub} = require('sinon');
const normalize = require('../../lib/plugins/normalize');
import test from 'ava';
import {noop} from 'lodash-es';
import {stub} from 'sinon';
import normalize from '../../lib/plugins/normalize.js';
const cwd = process.cwd();
@ -23,37 +23,37 @@ test('Normalize and load plugin from string', async (t) => {
const plugin = await normalize(
{cwd, options: {}, logger: t.context.logger},
'verifyConditions',
'./test/fixtures/plugin-noop',
'./test/fixtures/plugin-noop.cjs',
{}
);
t.is(plugin.pluginName, './test/fixtures/plugin-noop');
t.is(plugin.pluginName, './test/fixtures/plugin-noop.cjs');
t.is(typeof plugin, 'function');
t.deepEqual(t.context.success.args[0], ['Loaded plugin "verifyConditions" from "./test/fixtures/plugin-noop"']);
t.deepEqual(t.context.success.args[0], ['Loaded plugin "verifyConditions" from "./test/fixtures/plugin-noop.cjs"']);
});
test('Normalize and load plugin from object', async (t) => {
const plugin = await normalize(
{cwd, options: {}, logger: t.context.logger},
'publish',
{path: './test/fixtures/plugin-noop'},
{path: './test/fixtures/plugin-noop.cjs'},
{}
);
t.is(plugin.pluginName, './test/fixtures/plugin-noop');
t.is(plugin.pluginName, './test/fixtures/plugin-noop.cjs');
t.is(typeof plugin, 'function');
t.deepEqual(t.context.success.args[0], ['Loaded plugin "publish" from "./test/fixtures/plugin-noop"']);
t.deepEqual(t.context.success.args[0], ['Loaded plugin "publish" from "./test/fixtures/plugin-noop.cjs"']);
});
test('Normalize and load plugin from a base file path', async (t) => {
const plugin = await normalize({cwd, options: {}, logger: t.context.logger}, 'verifyConditions', './plugin-noop', {
'./plugin-noop': './test/fixtures',
const plugin = await normalize({cwd, options: {}, logger: t.context.logger}, 'verifyConditions', './plugin-noop.cjs', {
'./plugin-noop.cjs': './test/fixtures',
});
t.is(plugin.pluginName, './plugin-noop');
t.is(plugin.pluginName, './plugin-noop.cjs');
t.is(typeof plugin, 'function');
t.deepEqual(t.context.success.args[0], [
'Loaded plugin "verifyConditions" from "./plugin-noop" in shareable config "./test/fixtures"',
'Loaded plugin "verifyConditions" from "./plugin-noop.cjs" in shareable config "./test/fixtures"',
]);
});
@ -72,7 +72,7 @@ test('Wrap plugin in a function that add the "pluginName" to multiple errors"',
'./plugin-errors': './test/fixtures',
});
const errors = [...(await t.throwsAsync(plugin({options: {}})))];
const errors = [...(await t.throwsAsync(plugin({options: {}}))).errors];
for (const error of errors) {
t.is(error.pluginName, './plugin-errors');
}
@ -90,12 +90,12 @@ test('Normalize and load plugin that returns multiple functions', async (t) => {
const plugin = await normalize(
{cwd, options: {}, logger: t.context.logger},
'verifyConditions',
'./test/fixtures/multi-plugin',
'./test/fixtures/multi-plugin.cjs',
{}
);
t.is(typeof plugin, 'function');
t.deepEqual(t.context.success.args[0], ['Loaded plugin "verifyConditions" from "./test/fixtures/multi-plugin"']);
t.deepEqual(t.context.success.args[0], ['Loaded plugin "verifyConditions" from "./test/fixtures/multi-plugin.cjs"']);
});
test('Wrap "analyzeCommits" plugin in a function that validate the output of the plugin', async (t) => {
@ -258,7 +258,7 @@ test('Always pass a defined "pluginConfig" for plugin defined with path', async
test('Throws an error if the plugin return an object without the expected plugin function', async (t) => {
const error = await t.throwsAsync(() =>
normalize({cwd, options: {}, logger: t.context.logger}, 'inexistantPlugin', './test/fixtures/multi-plugin', {})
normalize({cwd, options: {}, logger: t.context.logger}, 'nonExistentPlugin', './test/fixtures/multi-plugin.cjs', {})
);
t.is(error.code, 'EPLUGIN');
@ -269,7 +269,7 @@ test('Throws an error if the plugin return an object without the expected plugin
test('Throws an error if the plugin is not found', async (t) => {
await t.throwsAsync(
() => normalize({cwd, options: {}, logger: t.context.logger}, 'inexistantPlugin', 'non-existing-path', {}),
() => normalize({cwd, options: {}, logger: t.context.logger}, 'nonExistentPlugin', 'non-existing-path', {}),
{
message: /Cannot find module 'non-existing-path'/,
code: 'MODULE_NOT_FOUND',

View File

@ -1,7 +1,7 @@
const test = require('ava');
const {stub} = require('sinon');
const AggregateError = require('aggregate-error');
const pipeline = require('../../lib/plugins/pipeline');
import test from 'ava';
import {stub} from 'sinon';
import AggregateError from 'aggregate-error';
import pipeline from '../../lib/plugins/pipeline.js';
test('Execute each function in series passing the same input', async (t) => {
const step1 = stub().resolves(1);
@ -116,9 +116,9 @@ test('Throw all errors from the first step throwing an AggregateError', async (t
const step2 = stub().rejects(new AggregateError([error1, error2]));
const step3 = stub().resolves(3);
const errors = await t.throwsAsync(pipeline([step1, step2, step3])(0));
const error = await t.throwsAsync(pipeline([step1, step2, step3])(0));
t.deepEqual([...errors], [error1, error2]);
t.deepEqual([...error.errors], [error1, error2]);
t.true(step1.calledWith(0));
t.true(step2.calledWith(0));
t.true(step3.notCalled);
@ -131,9 +131,9 @@ test('Execute all even if a Promise rejects', async (t) => {
const step2 = stub().rejects(error1);
const step3 = stub().rejects(error2);
const errors = await t.throwsAsync(pipeline([step1, step2, step3], {settleAll: true})(0));
const error = await t.throwsAsync(pipeline([step1, step2, step3], {settleAll: true})(0));
t.deepEqual([...errors], [error1, error2]);
t.deepEqual([...error.errors], [error1, error2]);
t.true(step1.calledWith(0));
t.true(step2.calledWith(0));
t.true(step3.calledWith(0));
@ -147,9 +147,9 @@ test('Throw all errors from all steps throwing an AggregateError', async (t) =>
const step1 = stub().rejects(new AggregateError([error1, error2]));
const step2 = stub().rejects(new AggregateError([error3, error4]));
const errors = await t.throwsAsync(pipeline([step1, step2], {settleAll: true})(0));
const error = await t.throwsAsync(pipeline([step1, step2], {settleAll: true})(0));
t.deepEqual([...errors], [error1, error2, error3, error4]);
t.deepEqual([...error.errors], [error1, error2, error3, error4]);
t.true(step1.calledWith(0));
t.true(step2.calledWith(0));
});
@ -163,9 +163,9 @@ test('Execute each function in series passing a transformed input even if a step
const step4 = stub().resolves(4);
const getNextInput = (previousResult, result) => previousResult + result;
const errors = await t.throwsAsync(pipeline([step1, step2, step3, step4], {settleAll: true, getNextInput})(0));
const error = await t.throwsAsync(pipeline([step1, step2, step3, step4], {settleAll: true, getNextInput})(0));
t.deepEqual([...errors], [error2, error3]);
t.deepEqual([...error.errors], [error2, error3]);
t.true(step1.calledWith(0));
t.true(step2.calledWith(0 + 1));
t.true(step3.calledWith(0 + 1 + error2));

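The pipeline test changes above replace spreading the thrown value (`[...errors]`) with spreading its `errors` property (`[...error.errors]`). A sketch of the behavior this migration assumes: modern `AggregateError` (native since Node 15, and matched by `aggregate-error` v4+) is not itself iterable and exposes the wrapped errors on `.errors`.

```javascript
// Sketch (assumption): newer AggregateError instances are not iterable;
// the wrapped errors live on the `errors` property instead. This is why
// the assertions switch from `[...errors]` to `[...error.errors]`.
const error1 = new Error("Error 1");
const error2 = new Error("Error 2");
const aggregate = new AggregateError([error1, error2], "two failures");

console.log(typeof aggregate[Symbol.iterator]); // undefined — spreading the error itself would throw
console.log([...aggregate.errors].length); // 2
```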
View File

@ -1,11 +1,11 @@
const path = require('path');
const test = require('ava');
const {copy, outputFile} = require('fs-extra');
const {stub} = require('sinon');
const tempy = require('tempy');
const getPlugins = require('../../lib/plugins');
import path from 'path';
import test from 'ava';
import {copy, outputFile} from 'fs-extra';
import {stub} from 'sinon';
import {temporaryDirectory} from 'tempy';
import getPlugins from '../../lib/plugins/index.js';
// Save the current working diretory
// Save the current working directory
const cwd = process.cwd();
test.beforeEach((t) => {
@ -35,9 +35,9 @@ test('Export plugins based on steps config', async (t) => {
cwd,
logger: t.context.logger,
options: {
verifyConditions: ['./test/fixtures/plugin-noop', {path: './test/fixtures/plugin-noop'}],
generateNotes: './test/fixtures/plugin-noop',
analyzeCommits: {path: './test/fixtures/plugin-noop'},
verifyConditions: ['./test/fixtures/plugin-noop.cjs', {path: './test/fixtures/plugin-noop.cjs'}],
generateNotes: './test/fixtures/plugin-noop.cjs',
analyzeCommits: {path: './test/fixtures/plugin-noop.cjs'},
verifyRelease: () => {},
},
},
@ -137,9 +137,9 @@ test('Unknown steps of plugins configured in "plugins" are ignored', async (t) =
});
test('Export plugins loaded from the dependency of a shareable config module', async (t) => {
const cwd = tempy.directory();
const cwd = temporaryDirectory();
await copy(
'./test/fixtures/plugin-noop.js',
'./test/fixtures/plugin-noop.cjs',
path.resolve(cwd, 'node_modules/shareable-config/node_modules/custom-plugin/index.js')
);
await outputFile(path.resolve(cwd, 'node_modules/shareable-config/index.js'), '');
@ -170,8 +170,8 @@ test('Export plugins loaded from the dependency of a shareable config module', a
});
test('Export plugins loaded from the dependency of a shareable config file', async (t) => {
const cwd = tempy.directory();
await copy('./test/fixtures/plugin-noop.js', path.resolve(cwd, 'plugin/plugin-noop.js'));
const cwd = temporaryDirectory();
await copy('./test/fixtures/plugin-noop.cjs', path.resolve(cwd, 'plugin/plugin-noop.cjs'));
await outputFile(path.resolve(cwd, 'shareable-config.js'), '');
const plugins = await getPlugins(
@ -179,9 +179,9 @@ test('Export plugins loaded from the dependency of a shareable config file', asy
cwd,
logger: t.context.logger,
options: {
verifyConditions: ['./plugin/plugin-noop', {path: './plugin/plugin-noop'}],
generateNotes: './plugin/plugin-noop',
analyzeCommits: {path: './plugin/plugin-noop'},
verifyConditions: ['./plugin/plugin-noop.cjs', {path: './plugin/plugin-noop.cjs'}],
generateNotes: './plugin/plugin-noop.cjs',
analyzeCommits: {path: './plugin/plugin-noop.cjs'},
verifyRelease: () => {},
},
},
@ -269,7 +269,7 @@ test('Throw an error for each invalid plugin configuration', async (t) => {
},
{}
)
)),
)).errors,
];
t.is(errors[0].name, 'SemanticReleaseError');
@ -289,11 +289,11 @@ test('Throw EPLUGINSCONF error if the "plugins" option contains an old plugin de
{
cwd,
logger: t.context.logger,
options: {plugins: ['./test/fixtures/multi-plugin', './test/fixtures/plugin-noop', () => {}]},
options: {plugins: ['./test/fixtures/multi-plugin.cjs', './test/fixtures/plugin-noop.cjs', () => {}]},
},
{}
)
)),
)).errors,
];
t.is(errors[0].name, 'SemanticReleaseError');
@ -306,7 +306,7 @@ test('Throw EPLUGINSCONF error for each invalid definition if the "plugins" opti
const errors = [
...(await t.throwsAsync(() =>
getPlugins({cwd, logger: t.context.logger, options: {plugins: [1, {path: 1}, [() => {}, {}, {}]]}}, {})
)),
)).errors,
];
t.is(errors[0].name, 'SemanticReleaseError');

View File

@ -1,5 +1,5 @@
const test = require('ava');
const {validatePlugin, validateStep, loadPlugin, parseConfig} = require('../../lib/plugins/utils');
import test from 'ava';
import {loadPlugin, parseConfig, validatePlugin, validateStep} from '../../lib/plugins/utils.js';
test('validatePlugin', (t) => {
const path = 'plugin-module';
@ -193,10 +193,10 @@ test('loadPlugin', async (t) => {
const cwd = process.cwd();
const func = () => {};
t.is(require('../fixtures/plugin-noop'), await loadPlugin({cwd: './test/fixtures'}, './plugin-noop', {}), 'From cwd');
t.is((await import('../fixtures/plugin-noop.cjs')).default, await loadPlugin({cwd: './test/fixtures'}, './plugin-noop.cjs', {}), 'From cwd');
t.is(
require('../fixtures/plugin-noop'),
await loadPlugin({cwd}, './plugin-noop', {'./plugin-noop': './test/fixtures'}),
(await import('../fixtures/plugin-noop.cjs')).default,
await loadPlugin({cwd}, './plugin-noop.cjs', {'./plugin-noop.cjs': './test/fixtures'}),
'From a shareable config context'
);
t.is(func, await loadPlugin({cwd}, func, {}), 'Defined as a function');

View File

@ -1,186 +1,189 @@
const test = require('ava');
const AggregateError = require('aggregate-error');
const {
import test from "ava";
import AggregateError from "aggregate-error";
import {
extractErrors,
tagsToVersions,
isMajorRange,
isMaintenanceRange,
getUpperBound,
getLowerBound,
highest,
lowest,
getLatestVersion,
getEarliestVersion,
getFirstVersion,
getLatestVersion,
getLowerBound,
getRange,
makeTag,
getUpperBound,
highest,
isMaintenanceRange,
isMajorRange,
isSameChannel,
} = require('../lib/utils');
lowest,
makeTag,
tagsToVersions,
} from "../lib/utils.js";
test('extractErrors', (t) => {
const errors = [new Error('Error 1'), new Error('Error 2')];
test("extractErrors", (t) => {
const errors = [new Error("Error 1"), new Error("Error 2")];
t.deepEqual(extractErrors(new AggregateError(errors)), errors);
t.deepEqual(extractErrors(errors[0]), [errors[0]]);
});
test('tagsToVersions', (t) => {
t.deepEqual(tagsToVersions([{version: '1.0.0'}, {version: '1.1.0'}, {version: '1.2.0'}]), [
'1.0.0',
'1.1.0',
'1.2.0',
test("tagsToVersions", (t) => {
t.deepEqual(tagsToVersions([{ version: "1.0.0" }, { version: "1.1.0" }, { version: "1.2.0" }]), [
"1.0.0",
"1.1.0",
"1.2.0",
]);
});
test('isMajorRange', (t) => {
t.false(isMajorRange('1.1.x'));
t.false(isMajorRange('1.11.x'));
t.false(isMajorRange('11.1.x'));
t.false(isMajorRange('11.11.x'));
t.false(isMajorRange('1.1.X'));
t.false(isMajorRange('1.1.0'));
test("isMajorRange", (t) => {
t.false(isMajorRange("1.1.x"));
t.false(isMajorRange("1.11.x"));
t.false(isMajorRange("11.1.x"));
t.false(isMajorRange("11.11.x"));
t.false(isMajorRange("1.1.X"));
t.false(isMajorRange("1.1.0"));
t.true(isMajorRange('1.x.x'));
t.true(isMajorRange('11.x.x'));
t.true(isMajorRange('1.X.X'));
t.true(isMajorRange('1.x'));
t.true(isMajorRange('11.x'));
t.true(isMajorRange('1.X'));
t.true(isMajorRange("1.x.x"));
t.true(isMajorRange("11.x.x"));
t.true(isMajorRange("1.X.X"));
t.true(isMajorRange("1.x"));
t.true(isMajorRange("11.x"));
t.true(isMajorRange("1.X"));
});
test('isMaintenanceRange', (t) => {
t.true(isMaintenanceRange('1.1.x'));
t.true(isMaintenanceRange('11.1.x'));
t.true(isMaintenanceRange('11.11.x'));
t.true(isMaintenanceRange('1.11.x'));
t.true(isMaintenanceRange('1.x.x'));
t.true(isMaintenanceRange('11.x.x'));
t.true(isMaintenanceRange('1.x'));
t.true(isMaintenanceRange('11.x'));
t.true(isMaintenanceRange('1.1.X'));
t.true(isMaintenanceRange('1.X.X'));
t.true(isMaintenanceRange('1.X'));
test("isMaintenanceRange", (t) => {
t.true(isMaintenanceRange("1.1.x"));
t.true(isMaintenanceRange("11.1.x"));
t.true(isMaintenanceRange("11.11.x"));
t.true(isMaintenanceRange("1.11.x"));
t.true(isMaintenanceRange("1.x.x"));
t.true(isMaintenanceRange("11.x.x"));
t.true(isMaintenanceRange("1.x"));
t.true(isMaintenanceRange("11.x"));
t.true(isMaintenanceRange("1.1.X"));
t.true(isMaintenanceRange("1.X.X"));
t.true(isMaintenanceRange("1.X"));
t.false(isMaintenanceRange('1.1.0'));
t.false(isMaintenanceRange('11.1.0'));
t.false(isMaintenanceRange('1.11.0'));
t.false(isMaintenanceRange('11.11.0'));
t.false(isMaintenanceRange('~1.0.0'));
t.false(isMaintenanceRange('^1.0.0'));
t.false(isMaintenanceRange("1.1.0"));
t.false(isMaintenanceRange("11.1.0"));
t.false(isMaintenanceRange("1.11.0"));
t.false(isMaintenanceRange("11.11.0"));
t.false(isMaintenanceRange("~1.0.0"));
t.false(isMaintenanceRange("^1.0.0"));
});
test('getUpperBound', (t) => {
t.is(getUpperBound('1.x.x'), '2.0.0');
t.is(getUpperBound('1.X.X'), '2.0.0');
t.is(getUpperBound('10.x.x'), '11.0.0');
t.is(getUpperBound('1.x'), '2.0.0');
t.is(getUpperBound('10.x'), '11.0.0');
t.is(getUpperBound('1.0.x'), '1.1.0');
t.is(getUpperBound('10.0.x'), '10.1.0');
t.is(getUpperBound('10.10.x'), '10.11.0');
t.is(getUpperBound('1.0.0'), '1.0.0');
t.is(getUpperBound('10.0.0'), '10.0.0');
test("getUpperBound", (t) => {
t.is(getUpperBound("1.x.x"), "2.0.0");
t.is(getUpperBound("1.X.X"), "2.0.0");
t.is(getUpperBound("10.x.x"), "11.0.0");
t.is(getUpperBound("1.x"), "2.0.0");
t.is(getUpperBound("10.x"), "11.0.0");
t.is(getUpperBound("1.0.x"), "1.1.0");
t.is(getUpperBound("10.0.x"), "10.1.0");
t.is(getUpperBound("10.10.x"), "10.11.0");
t.is(getUpperBound("1.0.0"), "1.0.0");
t.is(getUpperBound("10.0.0"), "10.0.0");
t.is(getUpperBound('foo'), undefined);
t.is(getUpperBound("foo"), undefined);
});
test('getLowerBound', (t) => {
t.is(getLowerBound('1.x.x'), '1.0.0');
t.is(getLowerBound('1.X.X'), '1.0.0');
t.is(getLowerBound('10.x.x'), '10.0.0');
t.is(getLowerBound('1.x'), '1.0.0');
t.is(getLowerBound('10.x'), '10.0.0');
t.is(getLowerBound('1.0.x'), '1.0.0');
t.is(getLowerBound('10.0.x'), '10.0.0');
t.is(getLowerBound('1.10.x'), '1.10.0');
t.is(getLowerBound('1.0.0'), '1.0.0');
t.is(getLowerBound('10.0.0'), '10.0.0');
test("getLowerBound", (t) => {
t.is(getLowerBound("1.x.x"), "1.0.0");
t.is(getLowerBound("1.X.X"), "1.0.0");
t.is(getLowerBound("10.x.x"), "10.0.0");
t.is(getLowerBound("1.x"), "1.0.0");
t.is(getLowerBound("10.x"), "10.0.0");
t.is(getLowerBound("1.0.x"), "1.0.0");
t.is(getLowerBound("10.0.x"), "10.0.0");
t.is(getLowerBound("1.10.x"), "1.10.0");
t.is(getLowerBound("1.0.0"), "1.0.0");
t.is(getLowerBound("10.0.0"), "10.0.0");
t.is(getLowerBound('foo'), undefined);
t.is(getLowerBound("foo"), undefined);
});
test('highest', (t) => {
t.is(highest('1.0.0', '2.0.0'), '2.0.0');
t.is(highest('1.1.1', '1.1.0'), '1.1.1');
t.is(highest(null, '1.0.0'), '1.0.0');
t.is(highest('1.0.0'), '1.0.0');
test("highest", (t) => {
t.is(highest("1.0.0", "2.0.0"), "2.0.0");
t.is(highest("1.1.1", "1.1.0"), "1.1.1");
t.is(highest(null, "1.0.0"), "1.0.0");
t.is(highest("1.0.0"), "1.0.0");
t.is(highest(), undefined);
});
test('lowest', (t) => {
t.is(lowest('1.0.0', '2.0.0'), '1.0.0');
t.is(lowest('1.1.1', '1.1.0'), '1.1.0');
t.is(lowest(null, '1.0.0'), '1.0.0');
test("lowest", (t) => {
t.is(lowest("1.0.0", "2.0.0"), "1.0.0");
t.is(lowest("1.1.1", "1.1.0"), "1.1.0");
t.is(lowest(null, "1.0.0"), "1.0.0");
t.is(lowest(), undefined);
});
test.serial('getLatestVersion', (t) => {
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1']), '1.2.0');
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined);
test.serial("getLatestVersion", (t) => {
t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"]), "1.2.0");
t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1']), '1.2.0');
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined);
t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"]), "1.2.0");
t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1'], {withPrerelease: true}), '1.2.3-alpha.3');
t.is(getLatestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2'], {withPrerelease: true}), '1.2.3-alpha.3');
t.is(
getLatestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"], { withPrerelease: true }),
"1.2.3-alpha.3"
);
t.is(getLatestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"], { withPrerelease: true }), "1.2.3-alpha.3");
t.is(getLatestVersion([]), undefined);
});
test.serial('getEarliestVersion', (t) => {
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.0', '1.0.1-alpha.1']), '1.0.0');
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined);
test.serial("getEarliestVersion", (t) => {
t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.0", "1.0.1-alpha.1"]), "1.0.0");
t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.0', '1.0.1-alpha.1']), '1.0.0');
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2']), undefined);
t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.0", "1.0.1-alpha.1"]), "1.0.0");
t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"]), undefined);
t.is(
getEarliestVersion(['1.2.3-alpha.3', '1.2.0', '1.0.1', '1.0.0-alpha.1'], {withPrerelease: true}),
'1.0.0-alpha.1'
getEarliestVersion(["1.2.3-alpha.3", "1.2.0", "1.0.1", "1.0.0-alpha.1"], { withPrerelease: true }),
"1.0.0-alpha.1"
);
t.is(getEarliestVersion(['1.2.3-alpha.3', '1.2.3-alpha.2'], {withPrerelease: true}), '1.2.3-alpha.2');
t.is(getEarliestVersion(["1.2.3-alpha.3", "1.2.3-alpha.2"], { withPrerelease: true }), "1.2.3-alpha.2");
t.is(getEarliestVersion([]), undefined);
});
test('getFirstVersion', (t) => {
t.is(getFirstVersion(['1.2.0', '1.0.0', '1.3.0', '1.1.0', '1.4.0'], []), '1.0.0');
test("getFirstVersion", (t) => {
t.is(getFirstVersion(["1.2.0", "1.0.0", "1.3.0", "1.1.0", "1.4.0"], []), "1.0.0");
t.is(
getFirstVersion(
['1.2.0', '1.0.0', '1.3.0', '1.1.0', '1.4.0'],
["1.2.0", "1.0.0", "1.3.0", "1.1.0", "1.4.0"],
[
{name: 'master', tags: [{version: '1.0.0'}, {version: '1.1.0'}]},
{name: 'next', tags: [{version: '1.0.0'}, {version: '1.1.0'}, {version: '1.2.0'}]},
{ name: "master", tags: [{ version: "1.0.0" }, { version: "1.1.0" }] },
{ name: "next", tags: [{ version: "1.0.0" }, { version: "1.1.0" }, { version: "1.2.0" }] },
]
),
'1.3.0'
"1.3.0"
);
t.is(
getFirstVersion(
['1.2.0', '1.0.0', '1.1.0'],
["1.2.0", "1.0.0", "1.1.0"],
[
{name: 'master', tags: [{version: '1.0.0'}, {version: '1.1.0'}]},
{name: 'next', tags: [{version: '1.0.0'}, {version: '1.1.0'}, {version: '1.2.0'}]},
{ name: "master", tags: [{ version: "1.0.0" }, { version: "1.1.0" }] },
{ name: "next", tags: [{ version: "1.0.0" }, { version: "1.1.0" }, { version: "1.2.0" }] },
]
),
undefined
);
});
test('getRange', (t) => {
t.is(getRange('1.0.0', '1.1.0'), '>=1.0.0 <1.1.0');
t.is(getRange('1.0.0'), '>=1.0.0');
test("getRange", (t) => {
t.is(getRange("1.0.0", "1.1.0"), ">=1.0.0 <1.1.0");
t.is(getRange("1.0.0"), ">=1.0.0");
});
test('makeTag', (t) => {
t.is(makeTag(`v\${version}`, '1.0.0'), 'v1.0.0');
test("makeTag", (t) => {
t.is(makeTag(`v\${version}`, "1.0.0"), "v1.0.0");
});
test('isSameChannel', (t) => {
t.true(isSameChannel('next', 'next'));
test("isSameChannel", (t) => {
t.true(isSameChannel("next", "next"));
t.true(isSameChannel(null, undefined));
t.true(isSameChannel(false, undefined));
t.true(isSameChannel('', false));
t.true(isSameChannel("", false));
t.false(isSameChannel('next', false));
t.false(isSameChannel("next", false));
});
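The assertions in this file pin down small, self-contained contracts for the utilities under test. Purely as an illustration of what those contracts require, here are minimal sketches that satisfy the assertions above; these are assumptions for readability, not semantic-release's actual `lib/utils.js` implementations:

```javascript
// Illustrative sketches only -- consistent with the test assertions above,
// but not the library's real code.
const getRange = (min, max) => `>=${min}${max ? ` <${max}` : ""}`;
// Substitute the literal "${version}" placeholder in a tag format string.
const makeTag = (format, version) => format.replace("${version}", version);
// Channels match when equal, or when both are falsy (null/undefined/""/false).
const isSameChannel = (a, b) => a === b || (!a && !b);

console.log(getRange("1.0.0", "1.1.0")); // ">=1.0.0 <1.1.0"
console.log(getRange("1.0.0")); // ">=1.0.0"
console.log(makeTag("v${version}", "1.0.0")); // "v1.0.0"
console.log(isSameChannel("", false)); // true
console.log(isSameChannel("next", false)); // false
```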

View File

@@ -1,117 +1,117 @@
const test = require('ava');
const tempy = require('tempy');
const verify = require('../lib/verify');
const {gitRepo} = require('./helpers/git-utils');
import test from "ava";
import { temporaryDirectory } from "tempy";
import verify from "../lib/verify.js";
import { gitRepo } from "./helpers/git-utils.js";
test('Throw a AggregateError', async (t) => {
const {cwd} = await gitRepo();
const options = {branches: [{name: 'master'}, {name: ''}]};
test("Throw a AggregateError", async (t) => {
const { cwd } = await gitRepo();
const options = { branches: [{ name: "master" }, { name: "" }] };
const errors = [...(await t.throwsAsync(verify({cwd, options})))];
const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'ENOREPOURL');
t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, "ENOREPOURL");
t.truthy(errors[0].message);
t.truthy(errors[0].details);
t.is(errors[1].name, 'SemanticReleaseError');
t.is(errors[1].code, 'EINVALIDTAGFORMAT');
t.is(errors[1].name, "SemanticReleaseError");
t.is(errors[1].code, "EINVALIDTAGFORMAT");
t.truthy(errors[1].message);
t.truthy(errors[1].details);
t.is(errors[2].name, 'SemanticReleaseError');
t.is(errors[2].code, 'ETAGNOVERSION');
t.is(errors[2].name, "SemanticReleaseError");
t.is(errors[2].code, "ETAGNOVERSION");
t.truthy(errors[2].message);
t.truthy(errors[2].details);
t.is(errors[3].name, 'SemanticReleaseError');
t.is(errors[3].code, 'EINVALIDBRANCH');
t.is(errors[3].name, "SemanticReleaseError");
t.is(errors[3].code, "EINVALIDBRANCH");
t.truthy(errors[3].message);
t.truthy(errors[3].details);
});
test('Throw a SemanticReleaseError if does not run on a git repository', async (t) => {
const cwd = tempy.directory();
const options = {branches: []};
test("Throw a SemanticReleaseError if does not run on a git repository", async (t) => {
const cwd = temporaryDirectory();
const options = { branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options})))];
const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'ENOGITREPO');
t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, "ENOGITREPO");
t.truthy(errors[0].message);
t.truthy(errors[0].details);
});
test('Throw a SemanticReleaseError if the "tagFormat" is not valid', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);
const options = {repositoryUrl, tagFormat: `?\${version}`, branches: []};
const { cwd, repositoryUrl } = await gitRepo(true);
const options = { repositoryUrl, tagFormat: `?\${version}`, branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options})))];
const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'EINVALIDTAGFORMAT');
t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, "EINVALIDTAGFORMAT");
t.truthy(errors[0].message);
t.truthy(errors[0].details);
});
test('Throw a SemanticReleaseError if the "tagFormat" does not contains the "version" variable', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);
const options = {repositoryUrl, tagFormat: 'test', branches: []};
const { cwd, repositoryUrl } = await gitRepo(true);
const options = { repositoryUrl, tagFormat: "test", branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options})))];
const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'ETAGNOVERSION');
t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, "ETAGNOVERSION");
t.truthy(errors[0].message);
t.truthy(errors[0].details);
});
test('Throw a SemanticReleaseError if the "tagFormat" contains multiple "version" variables', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);
const options = {repositoryUrl, tagFormat: `\${version}v\${version}`, branches: []};
const { cwd, repositoryUrl } = await gitRepo(true);
const options = { repositoryUrl, tagFormat: `\${version}v\${version}`, branches: [] };
const errors = [...(await t.throwsAsync(verify({cwd, options})))];
const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'ETAGNOVERSION');
t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, "ETAGNOVERSION");
t.truthy(errors[0].message);
t.truthy(errors[0].details);
});
test('Throw a SemanticReleaseError for each invalid branch', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);
test("Throw a SemanticReleaseError for each invalid branch", async (t) => {
const { cwd, repositoryUrl } = await gitRepo(true);
const options = {
repositoryUrl,
tagFormat: `v\${version}`,
branches: [{name: ''}, {name: ' '}, {name: 1}, {}, {name: ''}, 1, 'master'],
branches: [{ name: "" }, { name: " " }, { name: 1 }, {}, { name: "" }, 1, "master"],
};
const errors = [...(await t.throwsAsync(verify({cwd, options})))];
const errors = [...(await t.throwsAsync(verify({ cwd, options }))).errors];
t.is(errors[0].name, 'SemanticReleaseError');
t.is(errors[0].code, 'EINVALIDBRANCH');
t.is(errors[0].name, "SemanticReleaseError");
t.is(errors[0].code, "EINVALIDBRANCH");
t.truthy(errors[0].message);
t.truthy(errors[0].details);
t.is(errors[1].name, 'SemanticReleaseError');
t.is(errors[1].code, 'EINVALIDBRANCH');
t.is(errors[1].name, "SemanticReleaseError");
t.is(errors[1].code, "EINVALIDBRANCH");
t.truthy(errors[1].message);
t.truthy(errors[1].details);
t.is(errors[2].name, 'SemanticReleaseError');
t.is(errors[2].code, 'EINVALIDBRANCH');
t.is(errors[2].name, "SemanticReleaseError");
t.is(errors[2].code, "EINVALIDBRANCH");
t.truthy(errors[2].message);
t.truthy(errors[2].details);
t.is(errors[3].name, 'SemanticReleaseError');
t.is(errors[3].code, 'EINVALIDBRANCH');
t.is(errors[3].name, "SemanticReleaseError");
t.is(errors[3].code, "EINVALIDBRANCH");
t.truthy(errors[3].message);
t.truthy(errors[3].details);
t.is(errors[4].code, 'EINVALIDBRANCH');
t.is(errors[4].code, "EINVALIDBRANCH");
t.truthy(errors[4].message);
t.truthy(errors[4].details);
t.is(errors[5].code, 'EINVALIDBRANCH');
t.is(errors[5].code, "EINVALIDBRANCH");
t.truthy(errors[5].message);
t.truthy(errors[5].details);
});
test('Return "true" if all verification pass', async (t) => {
const {cwd, repositoryUrl} = await gitRepo(true);
const options = {repositoryUrl, tagFormat: `v\${version}`, branches: [{name: 'master'}]};
const { cwd, repositoryUrl } = await gitRepo(true);
const options = { repositoryUrl, tagFormat: `v\${version}`, branches: [{ name: "master" }] };
await t.notThrowsAsync(verify({cwd, options}));
await t.notThrowsAsync(verify({ cwd, options }));
});