Cloud Runner v0 - Reliable and trimmed down cloud runner (#353)

* Update cloud-runner-aws-pipeline.yml

* Update cloud-runner-k8s-pipeline.yml

* yarn build

* yarn build

* correct branch ref

* correct branch ref passed to target repo

* Create k8s-tests.yml

* Delete k8s-tests.yml

* correct branch ref passed to target repo

* correct branch ref passed to target repo

* Always describe AWS tasks for now, because unstable error handling

* Remove unused tree commands

* Use lfs guid sum

* Simple override cache push

* Simple override cache push and pull override to allow pure cloud storage driven caching

* Removal of early branch (breaks lfs caching)

* Remove unused tree commands

* Update action.yml

* Update action.yml

* Support cache and input override commands as input + full support custom hooks

* Increase k8s timeout

* replace filename being appended for unclear reason

* cache key should not contain whitespace

* Always try and deploy rook for k8s

* Apply k8s files for rook

* Update action.yml

* Apply k8s files for rook

* Apply k8s files for rook

* cache test and action description for k8s storage class

* Correct test and implement dependency health check and start

* GCP-secret run, cache key

* lfs smudge set explicit and undo explicit

* Run using external secret provider to speed up input

* Update cloud-runner-aws-pipeline.yml

* Add nodejs as build step dependency

* Add nodejs as build step dependency

* Cloud Runner Tests must be specified to capture logs from cloud runner for tests

* Cloud Runner Tests must be specified to capture logs from cloud runner for tests

* Refactor and cleanup - no async input, combined setup/build, removed github logs for cli runs

* better defaults for new inputs

* better defaults

* merge latest

* force build update

* use npm n to update node in unity builder

* correct new line

* quiet zipping

* quiet zipping

* default secrets for unity username and password

* default secrets for unity username and password

* ls active directory before lfs install

* Get cloud runner secrets from

* Get cloud runner secrets from

* Cleanup setup of default secrets

* Various fixes

* Cleanup setup of default secrets

* Various fixes

* AWS secrets manager support

* less caching logs

* default k8s storage class to pd-standard

* more readable build commands

* Capture aws exit code 1 reliably

* Always replace /head from branch

* k8s default storage class to standard-rwo

* cleanup

* further cleanup input

* folder sizes to inspect caching

* dir command for local cloud runner test

* k8s wait for pending because pvc will not create earlier

* prefer k8s standard storage

* handle empty string as cloud runner cluster input

* local-system is now used for cloud runner test implementation AND correctly unset test CLI input

* local-system is now used for cloud runner test implementation AND correctly unset test CLI input

* fix unterminated quote

* fix unterminated quote

* do not share build parameters in tests - in cloud runner this will cause conflicts with resources of the same name

* remove head and heads from branch prefix

* fix reversed caching direction of cache-push

* fixes

* cachePull cli

* fixes

* order cache test to be first

* order cache test to be first

* fixes

* populate cache key instead of using branch

* cleanup cli

* garbage-collect-aws cli can iterate over aws resources and cli scans all ts files

* import cli methods

* import cli files explicitly

* log parameters in cloud runner parameter test

* Cloud runner param test before caching because we have a fast local cache test now

* Using custom build path relative to repo root rather than project root

* aws-garbage-collect at end of pipeline

* aws-garbage-collect do not actually delete anything for now - just list

* remove some legacy du commands

* Update cloud-runner-aws-pipeline.yml

* log contents after cache pull and fix some scenarios with duplicate secrets

* PR comments

* Replace guid with uuid package

* use fileExists lambda instead of stat to check file exists in caching

* build failure results in core error message

* Delete sample.txt
Authored by Frostebite on 2022-04-11 00:00:37 +01:00, committed by GitHub
parent 2b399b2641
commit a61c02481f
65 changed files with 10262 additions and 2259 deletions


@@ -1,23 +0,0 @@
const targets = new Array();
export function CliFunction(key: string, description: string) {
return function (target: any, propertyKey: string, descriptor: PropertyDescriptor) {
targets.push({
target,
propertyKey,
descriptor,
key,
description,
});
};
}
export function GetCliFunctions(key) {
return targets.find((x) => x.key === key);
}
export function GetAllCliModes() {
return targets.map((x) => {
return {
key: x.key,
description: x.description,
};
});
}


@@ -0,0 +1,44 @@
export class CliFunctionsRepository {
private static targets: any[] = [];
public static PushCliFunction(
target: any,
propertyKey: string,
descriptor: PropertyDescriptor,
key: string,
description: string,
) {
CliFunctionsRepository.targets.push({
target,
propertyKey,
descriptor,
key,
description,
});
}
public static GetCliFunctions(key) {
const results = CliFunctionsRepository.targets.find((x) => x.key === key);
if (results === undefined || results.length === 0) {
throw new Error(`no CLI mode found for ${key}`);
}
return results;
}
public static GetAllCliModes() {
return CliFunctionsRepository.targets.map((x) => {
return {
key: x.key,
description: x.description,
};
});
}
// eslint-disable-next-line no-unused-vars
public static PushCliFunctionSource(cliFunction: any) {}
}
export function CliFunction(key: string, description: string) {
return (target: any, propertyKey: string, descriptor: PropertyDescriptor) => {
CliFunctionsRepository.PushCliFunction(target, propertyKey, descriptor, key, description);
};
}
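
For orientation, a minimal sketch of how this repository is meant to be consumed (the SampleCommands class and the `sample` key are hypothetical, used only for illustration; the decorator, PushCliFunctionSource and GetCliFunctions come from the file above). A provider class decorates a static method with @CliFunction, the CLI pushes the class as a source so the module is evaluated, and the handler is later resolved by its mode key:

import { CliFunction, CliFunctionsRepository } from './cli-functions-repository';

// Hypothetical provider class illustrating the registration flow.
class SampleCommands {
  @CliFunction(`sample`, `prints a sample message`)
  static async sample() {
    console.log(`sample CLI mode invoked`);
  }
}

// The decorator already registered the method at class evaluation time;
// pushing the source just guarantees the module is imported (see Cli.InitCliMode below).
CliFunctionsRepository.PushCliFunctionSource(SampleCommands);

// Resolve the handler for a given --mode value and invoke it.
const entry = CliFunctionsRepository.GetCliFunctions(`sample`);
await entry.target[entry.propertyKey]();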


@@ -3,55 +3,85 @@ import { BuildParameters, CloudRunner, ImageTag, Input } from '..';
import * as core from '@actions/core';
import { ActionYamlReader } from '../input-readers/action-yaml';
import CloudRunnerLogger from '../cloud-runner/services/cloud-runner-logger';
import { CliFunction, GetAllCliModes, GetCliFunctions } from './cli-decorator';
import { RemoteClientLogger } from './remote-client/remote-client-services/remote-client-logger';
import { CloudRunnerState } from '../cloud-runner/state/cloud-runner-state';
import { SetupCloudRunnerRepository } from './remote-client/setup-cloud-runner-repository';
import * as SDK from 'aws-sdk';
import CloudRunnerQueryOverride from '../cloud-runner/services/cloud-runner-query-override';
import { CliFunction, CliFunctionsRepository } from './cli-functions-repository';
import { AwsCliCommands } from '../cloud-runner/providers/aws/commands/aws-cli-commands';
import { Caching } from '../cloud-runner/remote-client/caching';
import { LfsHashing } from '../cloud-runner/services/lfs-hashing';
import { RemoteClient } from '../cloud-runner/remote-client';
export class CLI {
static async RunCli(options: any): Promise<void> {
Input.githubInputEnabled = false;
const results = GetCliFunctions(options.mode);
if (results === undefined || results.length === 0) {
throw new Error('no CLI mode found');
export class Cli {
public static options;
static get isCliMode() {
return Cli.options !== undefined && Cli.options.mode !== undefined && Cli.options.mode !== '';
}
public static query(key, alternativeKey) {
if (Cli.options && Cli.options[key] !== undefined) {
return Cli.options[key];
}
CloudRunnerLogger.log(`Entrypoint: ${results.key}`);
options.versioning = 'None';
Input.cliOptions = options;
return await results.target[results.propertyKey]();
}
static isCliMode(options: any) {
return options.mode !== undefined && options.mode !== '';
if (Cli.options && alternativeKey && Cli.options[alternativeKey] !== undefined) {
return Cli.options[alternativeKey];
}
return;
}
public static SetupCli() {
public static InitCliMode() {
CliFunctionsRepository.PushCliFunctionSource(AwsCliCommands);
CliFunctionsRepository.PushCliFunctionSource(Caching);
CliFunctionsRepository.PushCliFunctionSource(LfsHashing);
CliFunctionsRepository.PushCliFunctionSource(RemoteClient);
const program = new Command();
program.version('0.0.1');
const properties = Object.getOwnPropertyNames(Input);
core.info(`\n`);
core.info(`INPUT:`);
const actionYamlReader: ActionYamlReader = new ActionYamlReader();
for (const element of properties) {
program.option(`--${element} <${element}>`, actionYamlReader.GetActionYamlValue(element));
if (Input[element] !== undefined && Input[element] !== '' && typeof Input[element] !== `function`) {
}
program.option(
'-m, --mode <mode>',
CliFunctionsRepository.GetAllCliModes()
.map((x) => `${x.key} (${x.description})`)
.join(` | `),
);
program.option('--populateOverride <populateOverride>', 'should use override query to pull input false by default');
program.option('--cachePushFrom <cachePushFrom>', 'cache push from source folder');
program.option('--cachePushTo <cachePushTo>', 'cache push to caching folder');
program.option('--artifactName <artifactName>', 'caching artifact name');
program.parse(process.argv);
Cli.options = program.opts();
return Cli.isCliMode;
}
static async RunCli(): Promise<void> {
Input.githubInputEnabled = false;
if (Cli.options['populateOverride'] === `true`) {
await CloudRunnerQueryOverride.PopulateQueryOverrideInput();
}
Cli.logInput();
const results = CliFunctionsRepository.GetCliFunctions(Cli.options.mode);
CloudRunnerLogger.log(`Entrypoint: ${results.key}`);
Cli.options.versioning = 'None';
return await results.target[results.propertyKey]();
}
@CliFunction(`print-input`, `prints all input`)
private static logInput() {
core.info(`\n`);
core.info(`INPUT:`);
const properties = Object.getOwnPropertyNames(Input);
for (const element of properties) {
if (
Input[element] !== undefined &&
Input[element] !== '' &&
typeof Input[element] !== `function` &&
element !== 'length' &&
element !== 'cliOptions' &&
element !== 'prototype'
) {
core.info(`${element} ${Input[element]}`);
}
}
core.info(`\n`);
program.option(
'-m, --mode <mode>',
GetAllCliModes()
.map((x) => `${x.key} (${x.description})`)
.join(` | `),
);
program.parse(process.argv);
return program.opts();
}
@CliFunction(`cli`, `runs a cloud runner build`)
@@ -60,29 +90,4 @@ export class CLI {
const baseImage = new ImageTag(buildParameter);
return await CloudRunner.run(buildParameter, baseImage.toString());
}
@CliFunction(`remote-cli`, `sets up a repository, usually before a game-ci build`)
static async runRemoteClientJob() {
const buildParameter = JSON.parse(process.env.BUILD_PARAMETERS || '{}');
RemoteClientLogger.log(`Build Params:
${JSON.stringify(buildParameter, undefined, 4)}
`);
CloudRunnerState.setup(buildParameter);
await SetupCloudRunnerRepository.run();
}
@CliFunction(`cach-push`, `push to cache`)
static async cachePush() {}
@CliFunction(`cach-pull`, `pull from cache`)
static async cachePull() {}
@CliFunction(`garbage-collect-aws`, `garbage collect aws`)
static async garbageCollectAws() {
process.env.AWS_REGION = Input.region;
const CF = new SDK.CloudFormation();
const stacks = await CF.listStacks().promise();
CloudRunnerLogger.log(JSON.stringify(stacks, undefined, 4));
}
}
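
A rough sketch of the calling side implied by these changes (the run wrapper and import path are assumptions for illustration; InitCliMode, isCliMode and RunCli are taken from the diff above). The entry point parses CLI options once and only falls through to the normal GitHub Action flow when no --mode was supplied:

import { Cli } from './model/cli/cli'; // import path assumed

async function run() {
  Cli.InitCliMode(); // registers CLI function sources and parses process.argv via commander
  if (Cli.isCliMode) {
    await Cli.RunCli(); // dispatches to the @CliFunction matching the --mode option
    return;
  }
  // ...otherwise continue with the regular GitHub Action build
}

run();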


@@ -1,117 +0,0 @@
import { assert } from 'console';
import fs from 'fs';
import path from 'path';
import { Input } from '../../..';
import CloudRunnerLogger from '../../../cloud-runner/services/cloud-runner-logger';
import { CloudRunnerState } from '../../../cloud-runner/state/cloud-runner-state';
import { CloudRunnerSystem } from './cloud-runner-system';
import { LFSHashing } from './lfs-hashing';
import { RemoteClientLogger } from './remote-client-logger';
export class Caching {
public static async PushToCache(cacheFolder: string, sourceFolder: string, cacheKey: string) {
const startPath = process.cwd();
try {
if (!fs.existsSync(cacheFolder)) {
await CloudRunnerSystem.Run(`mkdir -p ${cacheFolder}`);
}
process.chdir(path.resolve(sourceFolder, '..'));
if (Input.cloudRunnerTests) {
CloudRunnerLogger.log(
`Hashed cache folder ${await LFSHashing.hashAllFiles(sourceFolder)} ${sourceFolder} ${path.basename(
sourceFolder,
)}`,
);
}
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`ls ${path.basename(sourceFolder)}`);
}
await CloudRunnerSystem.Run(`zip ${cacheKey}.zip ${path.basename(sourceFolder)}`);
assert(fs.existsSync(`${cacheKey}.zip`), 'cache zip exists');
assert(fs.existsSync(path.basename(sourceFolder)), 'source folder exists');
await CloudRunnerSystem.Run(`mv ${cacheKey}.zip ${cacheFolder}`);
RemoteClientLogger.log(`moved ${cacheKey}.zip to ${cacheFolder}`);
assert(fs.existsSync(`${path.join(cacheFolder, cacheKey)}.zip`), 'cache zip exists inside cache folder');
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`ls ${cacheFolder}`);
}
} catch (error) {
process.chdir(`${startPath}`);
throw error;
}
process.chdir(`${startPath}`);
}
public static async PullFromCache(cacheFolder: string, destinationFolder: string, cacheKey: string = ``) {
const startPath = process.cwd();
RemoteClientLogger.log(`Caching for ${path.basename(destinationFolder)}`);
try {
if (!fs.existsSync(cacheFolder)) {
fs.mkdirSync(cacheFolder);
}
if (!fs.existsSync(destinationFolder)) {
fs.mkdirSync(destinationFolder);
}
const latestInBranch = await (await CloudRunnerSystem.Run(`ls -t "${cacheFolder}" | grep .zip$ | head -1`))
.replace(/\n/g, ``)
.replace('.zip', '');
process.chdir(cacheFolder);
const cacheSelection = cacheKey !== `` && fs.existsSync(`${cacheKey}.zip`) ? cacheKey : latestInBranch;
await CloudRunnerLogger.log(`cache key ${cacheKey} selection ${cacheSelection}`);
if (fs.existsSync(`${cacheSelection}.zip`)) {
const resultsFolder = `results${CloudRunnerState.buildParams.buildGuid}`;
await CloudRunnerSystem.Run(`mkdir -p ${resultsFolder}`);
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`tree ${destinationFolder}`);
}
RemoteClientLogger.log(`cache item exists ${cacheFolder}/${cacheSelection}.zip`);
assert(`${fs.existsSync(destinationFolder)}`);
assert(`${fs.existsSync(`${cacheSelection}.zip`)}`);
const fullResultsFolder = path.join(cacheFolder, resultsFolder);
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`tree ${cacheFolder}`);
}
await CloudRunnerSystem.Run(`unzip ${cacheSelection}.zip -d ${path.basename(resultsFolder)}`);
RemoteClientLogger.log(`cache item extracted to ${fullResultsFolder}`);
assert(`${fs.existsSync(fullResultsFolder)}`);
const destinationParentFolder = path.resolve(destinationFolder, '..');
if (fs.existsSync(destinationFolder)) {
fs.rmSync(destinationFolder, { recursive: true, force: true });
}
await CloudRunnerSystem.Run(
`mv "${fullResultsFolder}/${path.basename(destinationFolder)}" "${destinationParentFolder}"`,
);
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`tree ${destinationParentFolder}`);
}
} else {
RemoteClientLogger.logWarning(`cache item ${cacheKey} doesn't exist ${destinationFolder}`);
if (cacheSelection !== ``) {
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`tree ${cacheFolder}`);
}
RemoteClientLogger.logWarning(`cache item ${cacheKey}.zip doesn't exist ${destinationFolder}`);
throw new Error(`Failed to get cache item, but cache hit was found: ${cacheSelection}`);
}
}
} catch (error) {
process.chdir(`${startPath}`);
throw error;
}
process.chdir(`${startPath}`);
}
public static handleCachePurging() {
if (process.env.PURGE_REMOTE_BUILDER_CACHE !== undefined) {
RemoteClientLogger.log(`purging ${CloudRunnerState.purgeRemoteCaching}`);
fs.rmdirSync(CloudRunnerState.cacheFolder, { recursive: true });
}
}
}


@@ -1,37 +0,0 @@
import { exec } from 'child_process';
import { RemoteClientLogger } from './remote-client-logger';
export class CloudRunnerSystem {
public static async Run(command: string, suppressError = false) {
for (const element of command.split(`\n`)) {
RemoteClientLogger.log(element);
}
return await new Promise<string>((promise) => {
let output = '';
const child = exec(command, (error, stdout, stderr) => {
if (error && !suppressError) {
throw error;
}
if (stderr) {
const diagnosticOutput = `${stderr.toString()}`;
RemoteClientLogger.logCliDiagnostic(diagnosticOutput);
output += diagnosticOutput;
return;
}
const outputChunk = `${stdout}`;
output += outputChunk;
});
child.on('close', function (code) {
RemoteClientLogger.log(`[Exit code ${code}]`);
if (code !== 0 && !suppressError) {
throw new Error(output);
}
const outputLines = output.split(`\n`);
for (const element of outputLines) {
RemoteClientLogger.log(element);
}
promise(output);
});
});
}
}
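
For reference, the Run helper removed in this hunk wraps child_process.exec in a promise: it logs each command line, forwards stderr as diagnostic output, and throws on a non-zero exit code unless suppressError is set. A minimal usage sketch of the old call shape (the commands are arbitrary examples):

import { CloudRunnerSystem } from './cloud-runner-system';

// Resolves with the accumulated stdout of the command.
const output = await CloudRunnerSystem.Run(`echo "hello from cloud runner"`);

// With suppressError set, a failing command resolves instead of throwing.
await CloudRunnerSystem.Run(`exit 1`, true);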


@@ -1,42 +0,0 @@
import path from 'path';
import { CloudRunnerState } from '../../../cloud-runner/state/cloud-runner-state';
import { CloudRunnerSystem } from './cloud-runner-system';
import fs from 'fs';
import { assert } from 'console';
import { Input } from '../../..';
import { RemoteClientLogger } from './remote-client-logger';
export class LFSHashing {
public static async createLFSHashFiles() {
try {
await CloudRunnerSystem.Run(`git lfs ls-files -l | cut -d ' ' -f1 | sort > .lfs-assets-guid`);
await CloudRunnerSystem.Run(`md5sum .lfs-assets-guid > .lfs-assets-guid-sum`);
assert(fs.existsSync(`.lfs-assets-guid-sum`));
assert(fs.existsSync(`.lfs-assets-guid`));
const lfsHashes = {
lfsGuid: fs
.readFileSync(`${path.join(CloudRunnerState.repoPathFull, `.lfs-assets-guid`)}`, 'utf8')
.replace(/\n/g, ``),
lfsGuidSum: fs
.readFileSync(`${path.join(CloudRunnerState.repoPathFull, `.lfs-assets-guid-sum`)}`, 'utf8')
.replace(/\n/g, ``),
};
if (Input.cloudRunnerTests) {
RemoteClientLogger.log(lfsHashes.lfsGuid);
RemoteClientLogger.log(lfsHashes.lfsGuidSum);
}
return lfsHashes;
} catch (error) {
throw error;
}
}
public static async hashAllFiles(folder: string) {
const startPath = process.cwd();
process.chdir(folder);
const result = await (await CloudRunnerSystem.Run(`find -type f -exec md5sum "{}" + | sort | md5sum`))
.replace(/\n/g, '')
.split(` `)[0];
process.chdir(startPath);
return result;
}
}


@@ -1,19 +0,0 @@
import CloudRunnerLogger from '../../../cloud-runner/services/cloud-runner-logger';
export class RemoteClientLogger {
public static log(message: string) {
CloudRunnerLogger.log(`[Client] ${message}`);
}
public static logCliError(message: string) {
CloudRunnerLogger.log(`[Client][Error] ${message}`);
}
public static logCliDiagnostic(message: string) {
CloudRunnerLogger.log(`[Client][Diagnostic] ${message}`);
}
public static logWarning(message) {
CloudRunnerLogger.logWarning(message);
}
}


@@ -1,81 +0,0 @@
import fs from 'fs';
import { CloudRunnerState } from '../../cloud-runner/state/cloud-runner-state';
import { Caching } from './remote-client-services/caching';
import { LFSHashing } from './remote-client-services/lfs-hashing';
import { CloudRunnerSystem } from './remote-client-services/cloud-runner-system';
import { Input } from '../..';
import { RemoteClientLogger } from './remote-client-services/remote-client-logger';
import path from 'path';
import { assert } from 'console';
export class SetupCloudRunnerRepository {
public static async run() {
try {
await CloudRunnerSystem.Run(`mkdir -p ${CloudRunnerState.buildPathFull}`);
await CloudRunnerSystem.Run(`mkdir -p ${CloudRunnerState.repoPathFull}`);
await CloudRunnerSystem.Run(`mkdir -p ${CloudRunnerState.cacheFolderFull}`);
process.chdir(CloudRunnerState.repoPathFull);
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`ls -lh`);
await CloudRunnerSystem.Run(`tree`);
}
await SetupCloudRunnerRepository.cloneRepoWithoutLFSFiles();
if (Input.cloudRunnerTests) {
await CloudRunnerSystem.Run(`ls -lh`);
await CloudRunnerSystem.Run(`tree`);
}
const lfsHashes = await LFSHashing.createLFSHashFiles();
if (fs.existsSync(CloudRunnerState.libraryFolderFull)) {
RemoteClientLogger.logWarning(`!Warning!: The Unity library was included in the git repository`);
}
await Caching.PullFromCache(
CloudRunnerState.lfsCacheFolderFull,
CloudRunnerState.lfsDirectoryFull,
`${lfsHashes.lfsGuid}`,
);
await SetupCloudRunnerRepository.pullLatestLFS();
await Caching.PushToCache(
CloudRunnerState.lfsCacheFolderFull,
CloudRunnerState.lfsDirectoryFull,
`${lfsHashes.lfsGuid}`,
);
await Caching.PullFromCache(CloudRunnerState.libraryCacheFolderFull, CloudRunnerState.libraryFolderFull);
Caching.handleCachePurging();
} catch (error) {
throw error;
}
}
private static async cloneRepoWithoutLFSFiles() {
try {
process.chdir(`${CloudRunnerState.repoPathFull}`);
RemoteClientLogger.log(`Initializing source repository for cloning with caching of LFS files`);
await CloudRunnerSystem.Run(`git config --global advice.detachedHead false`);
RemoteClientLogger.log(`Cloning the repository being built:`);
await CloudRunnerSystem.Run(`git lfs install --skip-smudge`);
await CloudRunnerSystem.Run(
`git clone -b ${CloudRunnerState.branchName} ${CloudRunnerState.targetBuildRepoUrl} ${path.resolve(
`..`,
path.basename(CloudRunnerState.repoPathFull),
)}`,
);
assert(fs.existsSync(`.git`));
RemoteClientLogger.log(`${CloudRunnerState.buildParams.branch}`);
await CloudRunnerSystem.Run(`git checkout ${CloudRunnerState.buildParams.branch}`);
assert(fs.existsSync(path.join(`.git`, `lfs`)), 'LFS folder should not exist before caching');
RemoteClientLogger.log(`Checked out ${process.env.GITHUB_SHA}`);
} catch (error) {
throw error;
}
}
private static async pullLatestLFS() {
await CloudRunnerSystem.Run(`ls -lh ${CloudRunnerState.lfsDirectoryFull}/..`);
process.chdir(CloudRunnerState.repoPathFull);
await CloudRunnerSystem.Run(`git lfs pull`);
RemoteClientLogger.log(`pulled latest LFS files`);
assert(fs.existsSync(CloudRunnerState.lfsDirectoryFull));
await CloudRunnerSystem.Run(`ls -lh ${CloudRunnerState.lfsDirectoryFull}/..`);
}
}