Mirror of https://github.com/microsoft/lage.git
Docs update & jest-worker (#456)

* adds a weight for tests
* add jest worker change
* making everything faster in cookbook!
* fastest everything!
* docs
* unique fixture
* make unique IDs for tests
* try to do e2e tests one by one for now
* forgot to add the same dependsOn
* lage e2e also

This commit is contained in:
Parent: 0a7af89c82
Commit: 224e7db14f
@ -0,0 +1,63 @@
---
title: Make Jest Fast
tags:
  - version 2
---

import { Badge } from "@site/src/components/Badge";

<Badge>version 2</Badge>

`jest` is arguably the fastest test runner in the JavaScript ecosystem. Other monorepo task runners tiptoe around `jest` because it has its own worker pool and multi-threaded capability. Monorepo task runners like `nx` and `turbo` all have to work around the fact that we only have a certain finite number of CPU cores! The usual advice is to switch to `--runInBand` when running `jest` as part of those tools, so that the task runner, rather than `jest`, schedules the work across CPU cores. The problem arises when a single package contains a substantially large number of tests - that package is punished by running all of its tests serially on a single core. Another consequence of running in band shows up in local development - if you are modifying a package with lots of tests, you hit the serialized test run for every change!

Thankfully, `lage` has learned to play well with `jest` via a concept called "weighted targets"!

## Weighted targets and workers

First off, we will configure our `lage.config.js` like this:

```js
// @filename: lage.config.js
const path = require("path");
const glob = require("glob");

module.exports = {
  pipeline: {
    test: {
      type: "worker",
      weight: (target) => {
        // glob the target.cwd and return the number of test files
        return glob.sync("**/*.test.js", { cwd: target.cwd }).length;
      },
      maxWorkers: 8,
      options: {
        worker: path.join(__dirname, "scripts/worker/jest-worker.js"),
      },
    },
  },
};
```

Notice above that we have added a "weight" key to the target configuration. It can be a constant number (`weight: 4`) or an expression like `weight: os.cpus().length - 1`. Its real value is in helping the scheduler know how many workers are actually needed given a package's count of test files - this makes sure packages without many tests do not arbitrarily take up more cores than needed.
The `jest-worker.js` implementation will use the calculated `weight` in its call into the `jest` APIs:

```js
// @filename: jest-worker.js

const { runCLI } = require("jest");

module.exports = async function jest(data) {
  const { target, weight } = data;
  console.log(`Running ${target.id} with a maxWorkers setting of ${weight}`);

  const { results } = await runCLI({ maxWorkers: weight, rootDir: target.cwd, passWithNoTests: true, verbose: true }, [target.cwd]);

  if (results.success) {
    console.log("Tests passed");
  } else {
    throw new Error("FAILED");
  }
};
```

And just like that, `lage` and `jest` work in harmony to provide the best developer experience: remote cache, scoped test skipping by package dependencies, and a cooperating worker pool via weighted targets.
@ -1,5 +1,5 @@
---
-title: Make ESLint Fast with Dedicated Workers
+title: Make ESLint Fast
tags:
  - version 2
---

@ -27,14 +27,11 @@ First, let's change our `lint` task configuration in the pipeline to a "worker"
```js twoslash
// @filename: lage.config.js

const os = require("os");

module.exports = {
  pipeline: {
    lint: {
      type: "worker",
      options: {
        maxWorker: os.cpus().length - 1,
        worker: "scripts/eslint-worker.js",
      },
    },

@ -50,9 +47,7 @@ Then, we implement an `eslint-worker.js` such as this:
const path = require("path");

const { ESLint } = require("eslint");
-const { registerWorker } = require("@lage-run/lage");
const { readFile } = require("fs/promises");
const { threadId } = require("worker_threads");

/** this is the workspace root - find it however you want! */
const PROJECT_ROOT = path.resolve(__dirname, "..");

@ -81,14 +76,6 @@ function getEslintInstance() {
/** Workers should have a run function that gets called per package task */
async function run(data) {
  const { target } = data;
-  const packageJson = JSON.parse(await readFile(path.join(target.cwd, "package.json"), "utf8"));
-
-  // If you had a package that doesn't use ESLint, you can skip with this condition
-  // Here, we use the "lint" definition inside package.json's scripts key
-  if (!packageJson.scripts?.[target.task]) {
-    process.stdout.write(`No script found for ${target.task} in ${target.cwd}\n`);
-    return;
-  }

  const eslint = getEslintInstance();

@ -111,6 +98,6 @@ async function run(data) {
  }
}

-// Be sure to call this to register the worker with `lage`'s worker pool
-registerWorker(run);
+// the module exports is picked up by `lage` to run inside a worker - the module state is preserved from target run to target run
+module.exports = run;
```
@ -0,0 +1,243 @@
---
title: Make TypeScript Fast
tags:
  - version 2
---

import { Badge } from '@site/src/components/Badge'

<Badge>version 2</Badge>

TypeScript is a superset of JavaScript that adds type safety checks and a downlevel conversion of modern ECMAScript-like syntax with types to a target ECMAScript level. In a monorepo, we are interested in how to accelerate two aspects of TypeScript: transpilation and type checking. Let's go!

## Fastest transpilation

TypeScript's npm package `typescript` comes with a CLI program called `tsc`. This program performs type checking as well as transpilation. We can configure it to transpile code only, which speeds up compilation by doing work on a file-by-file basis (not quite true, but close enough):

```
tsc -p some-package/tsconfig.json --isolatedModules
```

This is something you can place inside your package's `package.json` like so:

```
// @filename: package.json

{
  "scripts": {
    "transpile": "tsc -p some-package/tsconfig.json --isolatedModules"
  }
}
```

Recently, many pure transpilers have been built to make transpilation even faster. You can use any of these packages: `swc` (Rust based), `esbuild` (Go based), `sucrase` (Node.js based). In this document, we will show one of the packages listed - `swc`:

```
# if you use npm
$ npm i -D @swc/cli @swc/core

# if you use yarn
$ yarn add -D @swc/cli @swc/core
```

```
// @filename: package.json

{
  "scripts": {
    "transpile": "swc ./src -d lib"
  }
}
```

You could also skip the `@swc/cli` package and write your own custom worker script (configure this inside the `lage.config.js` pipeline as a "worker" type):

```js
const path = require("path");
const fs = require("fs/promises");
const swc = require("@swc/core");

module.exports = async function transpile(data) {
  const { target } = data;

  const queue = [target.cwd];

  // recursively transpile everything in sight
  while (queue.length > 0) {
    const dir = queue.shift();

    let entries = await fs.readdir(dir, { withFileTypes: true });

    for (let entry of entries) {
      const fullPath = path.join(dir, entry.name);

      // some basic "excluded directory" list: node_modules, lib, tests, dist
      if (entry.isDirectory() && entry.name !== "node_modules" && entry.name !== "lib" && entry.name !== "tests" && entry.name !== "dist") {
        queue.push(fullPath);
      }
      // if the file extension is .ts - you may want to include .tsx here as well for repos that have TSX files
      else if (entry.isFile() && entry.name.endsWith(".ts")) {
        const swcOutput = await swc.transformFile(fullPath);
        const dest = fullPath.replace(/([/\\])src/, "$1lib").replace(".ts", ".js");
        await fs.mkdir(path.dirname(dest), { recursive: true });
        await fs.writeFile(dest, swcOutput.code);
      }
    }
  }
};
```
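The destination mapping in the worker above is just a string rewrite from the `src` segment to `lib` and the `.ts` extension to `.js`. A minimal sketch of that mapping in isolation (the `toOutputPath` name is hypothetical; note it naively replaces only the first `.ts` occurrence, same as the worker):

```javascript
// Sketch of the worker's src -> lib destination mapping: swap the first
// "src" path segment for "lib" and the ".ts" extension for ".js".
function toOutputPath(fullPath) {
  return fullPath.replace(/([/\\])src/, "$1lib").replace(".ts", ".js");
}

console.log(toOutputPath("packages/foo/src/index.ts")); // packages/foo/lib/index.js
```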
## Fastest Type Checking

The industry is abuzz with talk of replacing TypeScript with a faster transpiler, but there is still no open-source TypeScript type checker that retains the full fidelity of the work done by `tsc`. The TypeScript compiler is a single-threaded program, so previously the fastest way to type check without caching (i.e. the fastest first run) was to flatten everything into a single TS program containing all the TypeScript source files found inside the monorepo. This is indeed currently the fastest way to type check; however, we can do better. In particular, we would like to have the best of these features of `lage`:

1. remote cache
2. scope skipping
3. pipeline across workers (multi-core)

A naive approach to achieving a faster build would be to subdivide the TypeScript project by packages, each with its own `tsc -p tsconfig.json` script inside a `build` script of a `package.json` file. This allows the project to be subdivided into smaller pieces that can be cached and executed in parallel topologically. This is subtly different from the project references feature of TypeScript.

> :note: Project references are not preferable because they incur an overhead of module resolution as well as a tool-specific cache that is not hooked up to a remote cache

This solution scales to a certain degree. The speedup is highly dependent on the shape of the package dependency graph, because (1) remote caching, (2) scope skipping, and even distributed execution (not present in lage, yet?) all depend on the shape of that graph. To truly build an optimal type checker that can compete with the single flattened TS project strategy, we have to see why that strategy is faster. The answer is that, given a complex repo, TS spends a large amount of time re-processing source files. You can see this in an individual trace of a single package - much of the time is in "ts.findSourceFile()" processing the `d.ts` files from the package dependencies. Even with `skipLibCheck`, we still have to load type information from these module dependencies into memory each time. A single compilation for all packages is able to re-use this from memory.

The `lage` worker is here to rescue us from this single-threaded, no-remote-cache bleak state! `lage` has been applied inside various repositories of 10+ million lines of code and has been shown to cut type checking time at least in half (build agents are slower than local development machines):

```js
// @filename: tsc-worker.js

const ts = require("typescript");
const path = require("path");
const { existsSync } = require("fs");

// Save the previously run ts.Program to be fed into the next call
let oldProgram;

let compilerHost;

/** this is the patch to ts.CompilerHost that retains sourceFiles in a Map **/
function createCompilerHost(compilerOptions) {
  const host = ts.createCompilerHost(compilerOptions, true);
  const sourceFiles = new Map();
  const originalGetSourceFile = host.getSourceFile;

  // monkey patch host to cache source files
  host.getSourceFile = (fileName, languageVersion, onError, shouldCreateNewSourceFile) => {
    if (sourceFiles.has(fileName)) {
      return sourceFiles.get(fileName);
    }

    const sourceFile = originalGetSourceFile(fileName, languageVersion, onError, shouldCreateNewSourceFile);

    sourceFiles.set(fileName, sourceFile);

    return sourceFile;
  };

  return host;
}

async function tsc(data) {
  const { target } = data; // lage target data

  const tsconfigFile = "tsconfig.lage.json";
  const tsconfigJsonFile = path.join(target.cwd, tsconfigFile);

  if (!existsSync(tsconfigJsonFile)) {
    // this package has no tsconfig.lage.json, skipping work!
    return;
  }

  // Parse tsconfig
  const configParserHost = parseConfigHostFromCompilerHostLike(compilerHost ?? ts.sys);
  const parsedCommandLine = ts.getParsedCommandLineOfConfigFile(tsconfigJsonFile, {}, configParserHost);
  if (!parsedCommandLine) {
    throw new Error("Could not parse tsconfig.json");
  }
  const compilerOptions = parsedCommandLine.options;

  // Create the compilation host and program
  compilerHost = compilerHost ?? createCompilerHost(compilerOptions);

  // The re-use of oldProgram is a trick we all learned from gulp-typescript, credit to ivogabe
  // @see https://github.com/ivogabe/gulp-typescript
  const program = ts.createProgram(parsedCommandLine.fileNames, compilerOptions, compilerHost, oldProgram);

  oldProgram = program;

  const errors = {
    semantics: program.getSemanticDiagnostics(),
    declaration: program.getDeclarationDiagnostics(),
    syntactic: program.getSyntacticDiagnostics(),
    global: program.getGlobalDiagnostics(),
  };

  const allErrors = [];

  try {
    program.emit();
  } catch (e) {
    console.log(e.messageText);
    throw new Error("Encountered errors while emitting");
  }

  let hasErrors = false;

  for (const kind of Object.keys(errors)) {
    for (const diagnostics of errors[kind]) {
      hasErrors = true;
      allErrors.push(diagnostics);
    }
  }

  if (hasErrors) {
    console.log(ts.formatDiagnosticsWithColorAndContext(allErrors, compilerHost));
    throw new Error("Failed to compile");
  } else {
    console.log("Compiled successfully\n");
    return;
  }
}

function parseConfigHostFromCompilerHostLike(host) {
  return {
    fileExists: (f) => host.fileExists(f),
    readDirectory(root, extensions, excludes, includes, depth) {
      return host.readDirectory(root, extensions, excludes, includes, depth);
    },
    readFile: (f) => host.readFile(f),
    useCaseSensitiveFileNames: host.useCaseSensitiveFileNames,
    getCurrentDirectory: host.getCurrentDirectory,
    onUnRecoverableConfigFileDiagnostic: (d) => {
      throw new Error(ts.flattenDiagnosticMessageText(d.messageText, "\n"));
    },
    trace: host.trace,
  };
}

module.exports = tsc;
```
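Wired into the pipeline, this worker is registered the same way as the jest and ESLint workers earlier. A minimal sketch, assuming a `types` task name and a `scripts/worker/tsc-worker.js` path (both are illustrative, not from the commit):

```javascript
// @filename: lage.config.js - sketch; task name and worker path are assumptions
const path = require("path");

module.exports = {
  pipeline: {
    // run the cached, multi-core type check as a dedicated worker task
    types: {
      type: "worker",
      options: {
        worker: path.join(__dirname, "scripts/worker/tsc-worker.js"),
      },
    },
  },
};
```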
@ -1,12 +0,0 @@
import React from "react";
import useDocusaurusContext from "@docusaurus/useDocusaurusContext";
import "../css/tailwind.css";

export default function Test() {
  const { siteConfig } = useDocusaurusContext();
  return (
    <div className="theme-color" id="tailwind">
      <div className="md:bg-navbar">test</div>
    </div>
  );
}
@ -1,11 +1,21 @@
// @ts-check
const path = require("path");
+const fastGlob = require("fast-glob");

/** @type {import("@lage-run/cli").ConfigOptions} */
module.exports = {
  pipeline: {
    build: ["^build"],
-    test: ["build"],
+    test: {
+      type: "worker",
+      weight: (target) => {
+        return fastGlob.sync("tests/**/*.test.ts", { cwd: target.cwd }).length;
+      },
+      options: {
+        worker: path.join(__dirname, "scripts/worker/jest.js"),
+      },
+      dependsOn: ["build"],
+    },
    lint: {
      type: "worker",
      options: {

@ -13,6 +23,20 @@ module.exports = {
      },
    },
    start: [],
+    "@lage-run/e2e-tests#test": {
+      type: "npmScript",
+      dependsOn: ["build"],
+    },
+    "lage#test": {
+      type: "npmScript",
+      dependsOn: ["build"],
+    },
+    "@lage-run/docs#test": {
+      type: "npmScript",
+    },
  },
  npmClient: "yarn",
  cacheOptions: {
    environmentGlob: ["*.js", "*.json", ".github/**"],
  },
};
@ -42,7 +42,8 @@
    "prettier": "2.7.1",
    "ts-jest": "29.0.3",
    "typescript": "4.6.4",
-    "patch-package": "6.4.7"
+    "patch-package": "6.4.7",
+    "fast-glob": "3.2.12"
  },
  "lint-staged": {
    "*.gitignore": "prettier --config .prettierrc --write --ignore-path .gitignore",
@ -102,7 +102,7 @@ describe("basic failure case where a dependent target has failed", () => {

  it("when a failure happens in `--safe-exit`, be sure to have exit code of !== 0", () => {
    expect.hasAssertions();
-    const repo = new Monorepo("basics");
+    const repo = new Monorepo("basics-safe-exit");

    repo.init();
    repo.install();

@ -59,7 +59,7 @@ describe("RemoteFallbackCacheProvider", () => {
  });

  it("should operate with local provider ONLY by default", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-local-only");

    repo.init();
    repo.setLageConfig(

@ -105,7 +105,7 @@ describe("RemoteFallbackCacheProvider", () => {
  });

  it("should allow read-only mode when given a remote (or custom) cache config", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-read-only");

    repo.init();
    repo.setLageConfig(

@ -160,7 +160,7 @@ describe("RemoteFallbackCacheProvider", () => {
  });

  it("should allow read-write mode when given a special environment variable", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-read-write-env-var");

    repo.init();
    repo.setLageConfig(

@ -3,7 +3,7 @@ import { parseNdJson } from "./parseNdJson";

describe("basic failure case where a dependent target has failed", () => {
  it("when a failure happens, halts all other targets", () => {
-    const repo = new Monorepo("basics");
+    const repo = new Monorepo("basics-fail-1");

    repo.init();
    repo.install();

@ -38,7 +38,7 @@ describe("basic failure case where a dependent target has failed", () => {
  });

  it("when a failure happens in `--continue` mode, halts all other dependent targets but continue to build as much as possible", () => {
-    const repo = new Monorepo("basics");
+    const repo = new Monorepo("basics-fail-continue-1");

    repo.init();
    repo.install();

@ -74,7 +74,7 @@ describe("basic failure case where a dependent target has failed", () => {

  it("when a failure happens be sure to have exit code of !== 0", () => {
    expect.hasAssertions();
-    const repo = new Monorepo("basics");
+    const repo = new Monorepo("basics-fail-exit-1");

    repo.init();
    repo.install();

@ -100,7 +100,7 @@ describe("basic failure case where a dependent target has failed", () => {

  it("when a failure happens in `--safe-exit`, be sure to have exit code of !== 0", () => {
    expect.hasAssertions();
-    const repo = new Monorepo("basics");
+    const repo = new Monorepo("basic-safe-exit-1");

    repo.init();
    repo.install();

@ -9,7 +9,7 @@ const cacheLocation = ".cache/backfill";

describe("Cache clear", () => {
  it("should clear cache when internalCacheFolder is passed", () => {
-    const repo = new Monorepo("cache");
+    const repo = new Monorepo("cache-1");

    repo.init();
    repo.setLageConfig(

@ -62,7 +62,7 @@ describe("Cache clear", () => {
  });

  it("should clear cache with the default cache location", () => {
-    const repo = new Monorepo("cache-default");
+    const repo = new Monorepo("cache-default-1");

    repo.init();
    repo.setLageConfig(

@ -3,7 +3,7 @@ import { parseNdJson } from "./parseNdJson";

describe("RemoteFallbackCacheProvider", () => {
  it("should skip local cache population if --skip-local-cache is enabled", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-1");

    repo.init();
    repo.setLageConfig(

@ -58,7 +58,7 @@ describe("RemoteFallbackCacheProvider", () => {
  });

  it("should operate with local provider ONLY by default", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-read-only-1");

    repo.init();
    repo.setLageConfig(

@ -103,7 +103,7 @@ describe("RemoteFallbackCacheProvider", () => {
  });

  it("should allow read-only mode when given a remote (or custom) cache config", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-read-only-custom-config-1");

    repo.init();
    repo.setLageConfig(

@ -157,7 +157,7 @@ describe("RemoteFallbackCacheProvider", () => {
  });

  it("should allow read-write mode when given a special environment variable", () => {
-    const repo = new Monorepo("fallback");
+    const repo = new Monorepo("fallback-read-write-env-var-1");

    repo.init();
    repo.setLageConfig(

@ -6,7 +6,7 @@ describe("transitive task deps test", () => {
  // This test follows the model as documented here:
  // https://microsoft.github.io/lage/guide/levels.html
  it("produces a build graph even when some scripts are missing in package.json", () => {
-    const repo = new Monorepo("transitiveDeps");
+    const repo = new Monorepo("transitiveDeps-1");

    repo.init();
    repo.setLageConfig(`module.exports = {

@ -51,7 +51,7 @@ describe("transitive task deps test", () => {
  });

  it("only runs package local dependencies for no-prefix dependencies", () => {
-    const repo = new Monorepo("transitiveDeps-no-prefix");
+    const repo = new Monorepo("transitiveDeps-no-prefix-1");

    repo.init();
    repo.setLageConfig(`module.exports = {

@ -100,7 +100,7 @@ describe("transitive task deps test", () => {
  });

  it("only runs direct dependencies for ^ prefix dependencies -- ", () => {
-    const repo = new Monorepo("transitiveDeps-carat-prefix");
+    const repo = new Monorepo("transitiveDeps-carat-prefix-1");

    repo.init();
    repo.setLageConfig(`module.exports = {

@ -151,7 +151,7 @@ describe("transitive task deps test", () => {
  });

  it("Runs transitive dependencies for ^^ prefix dependencies", () => {
-    const repo = new Monorepo("transitiveDeps-indirect");
+    const repo = new Monorepo("transitiveDeps-indirect-1");

    repo.init();
    repo.setLageConfig(`module.exports = {
@ -0,0 +1,15 @@
const { runCLI } = require("jest");

module.exports = async function jest(data) {
  const { target, weight } = data;

  console.log(`Running ${target.id}, maxWorkers: ${weight}`);

  const { results } = await runCLI({ maxWorkers: weight, rootDir: target.cwd, passWithNoTests: true, verbose: true }, [target.cwd]);

  if (results.success) {
    console.log("Tests passed");
  } else {
    throw new Error("Test failed");
  }
};
@ -6038,7 +6038,7 @@ fast-deep-equal@^3.1.1, fast-deep-equal@^3.1.3:
  resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz#3a7d56b559d6cbc3eb512325244e619a65c6c525"
  integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==

-fast-glob@^3.2.11, fast-glob@^3.2.12, fast-glob@^3.2.2, fast-glob@^3.2.9:
+fast-glob@3.2.12, fast-glob@^3.2.11, fast-glob@^3.2.12, fast-glob@^3.2.2, fast-glob@^3.2.9:
  version "3.2.12"
  resolved "https://registry.yarnpkg.com/fast-glob/-/fast-glob-3.2.12.tgz#7f39ec99c2e6ab030337142da9e0c18f37afae80"
  integrity sha512-DVj4CQIYYow0BlaelwK1pHl5n5cRSJfM60UA0zK891sVInoPri2Ekj7+e1CT3/3qxXenpI+nBBmQAcJPJgaj4w==