File Storage 2.0 (#16825)

* Refactor storage logic to testable setup

* Add tests to get storage config

* Add tests for register drivers

* Tests for index

* Organize imports

* Add missing getStorage

* Setup boilerplate for cloudinary drive

* Add build script

* Add cloudinary configuration

* WIP tweaks for R&D

* Start storage abstraction v2

* Make storage manager single file

* Add test coverage

* Organize imports

* Setup local driver boilerplate

* [WIP] Start on local driver 2.0

* Add more methods

* Lunchtime

* Add put method

* Add list method

* [WIP] Try using storage in api

* Use node16 module-resolution

* Let's not mess with apply-query rn

* I love CJS. Death to CJS.

* Use dynamic imports

* Make things work

* Add path normalizer

* Add azure storage driver

* Update lock

* Start on tests

* Add getBuffer test

* Add getStat tests

* Add tests for exists

* Add tests for move

* Add tests for copy

* Add tests for put

* Add tests for delete

* Add test coverage for list

* Add removeLeading option to normalizePath

* Use removeLeading option

* Start on gcs

* Add fullpath test

* Add getStream

* Add getBuffer

* Add getStat

* Add exists

* Add move

* Add copy

* Add put

* Add delete

* Finish DriverGCS

* Cleanup tests a bit

* Start s3

* Add getStream

* Add getBuffer

* Please Wryn

* Add #exists

* Use randomized data

* No more hardcoded values 🙅‍♀️

* Add tests for copy

* Add tests for put

* Add put/copy/delete

* Add tests for delete

* WIP list

* Remove duplicate fullPath check

* Finish AWS tests

* Listen to wryn

* Mission critical tweak

* Add randomization, cleanup tests

* Check src vs dest full path

* Start on Cloudinary

* Add parameter signature helper

* Fix ESM building of shared

* Fix ESM building of shared

* Improve tests

* Update pnpm

* Remove old build commands

* Generated d.ts files

* Fix export naming

* Move ESM only utils to new @directus/utils

* Update lockfile

* Fix type exports

* Implement getStream

* Cleanup tests

* Simplify api

* Simplify API

* Simplify api

* Simplify API

* Add read/stat

* Cleanup / add exists

* Add move

* Add write

* Move uploadChunk to separate method

* Add test for #uploadChunk

* Add tests for write

* Add copy

* Add delete

* Add list

* Add list error handling

* Remove old drive packages

* Start updating API usage of storage

* Use Readable instead of NodeJS.ReadableStream

* Use readable instead of buffer

* Restore shared to main

* Update lockfile

* Use a streaming pipeline for sharp metadata

* Add basic e2e test for local upload and delete

* Fix integer filesize in SQLite

* fixed environment mocking in unit tests

* applied the same env mocking to other tests

* Update api/src/storage/register-drivers.ts

Co-authored-by: Pascal Jufer <pascal-jufer@bluewin.ch>

* Use sha256 by default

* Add base read test for /assets

* Replace exifr with exif-reader

* Fix tests for storage/index

* Install faking lib for tests

* Add test for register-drivers

* Add tests for register-locations

* Finish tests

* Organize imports

* Resolve Azri's comments

* Fix outdated tests

Certainly not the cleanest, but I need to do a bigger pass on all these tests to get them up to date with the latest team requirements. Gonna do that in a separate PR.

* Test for sha256

* Attempt 1 at fixing toString error

I'm not seeing this issue locally, so we'll spam a couple commits here to get things going

* Use node 18 in tests?!

* Fix localhost resolution with 127.0.0.1

* Mock getEnv()

* Use @directus/tsconfig instead of duplicated conf

* Does this fix it?

* OK fun detour

* Recreate lockfile

* Update config files

* Use multipart uploads in S3

* Cleanup imports

* File Storage 2.0: Make metadata extraction backward-compatible (#16868)

* Reinstall packages using pnpm instead of manually removing them (#16871)

* Added extra environment setting for sharp processing of invalid images (#16811)

* Added extra environment setting for sharp processing of invalid images

* renamed environment var to `ASSETS_INVALID_IMAGE_SENSITIVITY_LEVEL`

* Remove unused excludes from tsconfig

* Remove copy/paste leftover

* Update packages/utils/readme.md

Co-authored-by: Pascal Jufer <pascal-jufer@bluewin.ch>

* Update packages/utils/package.json

Co-authored-by: Pascal Jufer <pascal-jufer@bluewin.ch>

Co-authored-by: ian <licitdev@gmail.com>
Co-authored-by: Brainslug <tim@brainslug.nl>
Co-authored-by: Pascal Jufer <pascal-jufer@bluewin.ch>
Co-authored-by: Brainslug <br41nslug@users.noreply.github.com>
This commit is contained in:
Rijk van Zanten
2022-12-21 10:04:03 -05:00
committed by GitHub
parent 4bb8a463f6
commit 00865fbd84
146 changed files with 8978 additions and 5389 deletions


@@ -12,7 +12,7 @@ runs:
- name: Install Node.js
uses: actions/setup-node@v3
with:
node-version: 16
node-version: 18
- uses: pnpm/action-setup@v2.2.4
name: Install pnpm


@@ -42,14 +42,11 @@ jobs:
- name: Prepare
uses: ./.github/actions/prepare
- name: Prune Dependencies
run: find . -type d -name node_modules -prune -exec rm -rf {} \;
- name: Install Production Dependencies
- name: Reinstall production dependencies only
run: pnpm install --prod
- name: Install Root Dev Dependencies
run: pnpm install -w --dev
- name: Reinstall dev dependencies for workspace root
run: pnpm install --workspace-root --dev
- name: Install Oracle client
if: matrix.vendor == 'oracle'
@@ -61,14 +58,14 @@ jobs:
env:
ORACLE_DL: oracle-instantclient-basic-21.4.0.0.0-1.el8.x86_64.rpm
- name: Start Services (SQLite)
- name: Start services (SQLite)
if: matrix.vendor == 'sqlite3'
run: docker compose -f tests-blackbox/docker-compose.yml up auth-saml -d --quiet-pull --wait
- name: Start Services (Other vendors)
- name: Start services (other vendors)
if: matrix.vendor != 'sqlite3'
run:
docker compose -f tests-blackbox/docker-compose.yml up ${{ matrix.vendor }} auth-saml -d --quiet-pull --wait
- name: Run Tests
- name: Run tests
run: TEST_DB=${{ matrix.vendor }} pnpm run -w test:blackbox


@@ -30,17 +30,14 @@ jobs:
- name: Prepare
uses: ./.github/actions/prepare
- name: Prune Dependencies
run: find . -type d -name node_modules -prune -exec rm -rf {} \;
- name: Install Production Dependencies
- name: Reinstall production dependencies only
run: pnpm install --prod
- name: Install Root Dev Dependencies
run: pnpm install -w --dev
- name: Reinstall dev dependencies in workspace root
run: pnpm install --workspace-root --dev
- name: Start Services
- name: Start services
run: docker compose -f tests-blackbox/docker-compose.yml up auth-saml -d --quiet-pull --wait
- name: Run Tests
- name: Run tests
run: TEST_DB=sqlite3 pnpm run -w test:blackbox

.gitignore

@@ -23,3 +23,4 @@ schema.yaml
schema.json
.*.swp
debug
debug.ts


@@ -76,15 +76,17 @@
"@authenio/samlify-node-xmllint": "2.0.0",
"@aws-sdk/client-ses": "3.211.0",
"@directus/app": "workspace:*",
"@directus/drive": "workspace:*",
"@directus/drive-azure": "workspace:*",
"@directus/drive-gcs": "workspace:*",
"@directus/drive-s3": "workspace:*",
"@directus/extensions-sdk": "workspace:*",
"@directus/format-title": "9.15.0",
"@directus/schema": "workspace:*",
"@directus/shared": "workspace:*",
"@directus/specs": "workspace:*",
"@directus/storage": "workspace:*",
"@directus/storage-driver-azure": "workspace:*",
"@directus/storage-driver-cloudinary": "workspace:*",
"@directus/storage-driver-gcs": "workspace:*",
"@directus/storage-driver-local": "workspace:*",
"@directus/storage-driver-s3": "workspace:*",
"@godaddy/terminus": "4.11.2",
"@rollup/plugin-alias": "4.0.2",
"@rollup/plugin-virtual": "3.0.1",
@@ -108,7 +110,7 @@
"encodeurl": "1.0.2",
"eventemitter2": "6.4.9",
"execa": "5.1.1",
"exifr": "7.1.3",
"exif-reader": "1.0.3",
"express": "4.18.2",
"fast-redact": "3.1.2",
"flat": "5.0.2",
@@ -117,6 +119,7 @@
"graphql": "16.6.0",
"graphql-compose": "9.0.10",
"helmet": "6.0.0",
"icc": "2.0.0",
"inquirer": "8.2.4",
"ioredis": "5.2.4",
"joi": "17.7.0",
@@ -145,8 +148,8 @@
"otplib": "12.0.1",
"pino": "8.7.0",
"pino-http": "8.2.1",
"pino-pretty": "9.1.1",
"pino-http-print": "3.1.0",
"pino-pretty": "9.1.1",
"qs": "6.11.0",
"rate-limiter-flexible": "2.4.1",
"rollup": "3.3.0",
@@ -164,6 +167,7 @@
"wellknown": "0.5.0"
},
"devDependencies": {
"@ngneat/falso": "6.3.0",
"@types/async": "3.2.15",
"@types/busboy": "1.5.0",
"@types/bytes": "3.1.1",
@@ -172,6 +176,7 @@
"@types/deep-diff": "1.0.1",
"@types/destroy": "1.0.0",
"@types/encodeurl": "1.0.0",
"@types/exif-reader": "1.0.0",
"@types/express": "4.17.14",
"@types/express-serve-static-core": "4.17.31",
"@types/fast-redact": "3.0.2",


@@ -16,17 +16,19 @@ vi.mock('./database', () => ({
vi.mock('./env', async () => {
const actual = (await vi.importActual('./env')) as { default: Record<string, any> };
const MOCK_ENV = {
...actual.default,
KEY: 'xxxxxxx-xxxxxx-xxxxxxxx-xxxxxxxxxx',
SECRET: 'abcdef',
SERVE_APP: true,
PUBLIC_URL: 'http://localhost:8055/directus',
TELEMETRY: false,
LOG_STYLE: 'raw',
};
return {
default: {
...actual.default,
KEY: 'xxxxxxx-xxxxxx-xxxxxxxx-xxxxxxxxxx',
SECRET: 'abcdef',
SERVE_APP: true,
PUBLIC_URL: 'http://localhost:8055/directus',
TELEMETRY: false,
LOG_STYLE: 'raw',
},
default: MOCK_ENV,
getEnv: () => MOCK_ENV,
};
});


@@ -6,19 +6,20 @@ import { test, describe, expect, vi, beforeEach } from 'vitest';
vi.mock('../../src/env', async () => {
const actual = (await vi.importActual('../../src/env')) as { default: Record<string, any> };
const MOCK_ENV = {
...actual.default,
EXTENSIONS_PATH: '',
SERVE_APP: false,
DB_CLIENT: 'pg',
DB_HOST: 'localhost',
DB_PORT: 5432,
DB_DATABASE: 'directus',
DB_USER: 'postgres',
DB_PASSWORD: 'psql1234',
};
return {
default: {
...actual.default,
EXTENSIONS_PATH: '',
SERVE_APP: false,
DB_CLIENT: 'pg',
DB_HOST: 'localhost',
DB_PORT: 5432,
DB_DATABASE: 'directus',
DB_USER: 'postgres',
DB_PASSWORD: 'psql1234',
},
default: MOCK_ENV,
getEnv: () => MOCK_ENV,
};
});


@@ -1,4 +1,6 @@
import { Range } from '@directus/drive';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { Range } from '@directus/storage';
import { parseJSON } from '@directus/shared/utils';
import { Router } from 'express';
import { merge, pick } from 'lodash';


@@ -1,5 +1,7 @@
import { BaseException } from '@directus/shared/exceptions';
import { Range } from '@directus/drive';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { Range } from '@directus/storage';
export class RangeNotSatisfiableException extends BaseException {
constructor(range?: Range) {


@@ -10,11 +10,15 @@ import { Request, Response } from 'express';
import { Knex } from 'knex';
vi.mock('../../src/database');
vi.mock('../../src/env', () => ({
default: {
vi.mock('../../src/env', () => {
const MOCK_ENV = {
SECRET: 'test',
},
}));
};
return {
default: MOCK_ENV,
getEnv: () => MOCK_ENV,
};
});
afterEach(() => {
vi.resetAllMocks();


@@ -1,6 +1,9 @@
import { Range, StatResponse } from '@directus/drive';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { Range, Stat } from '@directus/storage';
import { Accountability } from '@directus/shared/types';
import { Semaphore } from 'async-mutex';
import type { Readable } from 'node:stream';
import { Knex } from 'knex';
import { contentType } from 'mime-types';
import hash from 'object-hash';
@@ -11,7 +14,7 @@ import getDatabase from '../database';
import env from '../env';
import { ForbiddenException, IllegalAssetTransformation, RangeNotSatisfiableException } from '../exceptions';
import logger from '../logger';
import storage from '../storage';
import { getStorage } from '../storage';
import { AbstractServiceOptions, File, Transformation, TransformationParams, TransformationPreset } from '../types';
import * as TransformationUtils from '../utils/transformations';
import { AuthorizationService } from './authorization';
@@ -37,7 +40,9 @@ export class AssetsService {
id: string,
transformation: TransformationParams | TransformationPreset,
range?: Range
): Promise<{ stream: NodeJS.ReadableStream; file: any; stat: StatResponse }> {
): Promise<{ stream: Readable; file: any; stat: Stat }> {
const storage = await getStorage();
const publicSettings = await this.knex
.select('project_logo', 'public_background', 'public_foreground')
.from('directus_settings')
@@ -62,7 +67,7 @@ export class AssetsService {
if (!file) throw new ForbiddenException();
const { exists } = await storage.disk(file.storage).exists(file.filename_disk);
const exists = await storage.location(file.storage).exists(file.filename_disk);
if (!exists) throw new ForbiddenException();
@@ -116,7 +121,7 @@ export class AssetsService {
getAssetSuffix(transforms) +
(maybeNewFormat ? `.${maybeNewFormat}` : path.extname(file.filename_disk));
const { exists } = await storage.disk(file.storage).exists(assetFilename);
const exists = await storage.location(file.storage).exists(assetFilename);
if (maybeNewFormat) {
file.type = contentType(assetFilename) || null;
@@ -124,9 +129,9 @@ export class AssetsService {
if (exists) {
return {
stream: storage.disk(file.storage).getStream(assetFilename, range),
stream: await storage.location(file.storage).read(assetFilename, range),
file,
stat: await storage.disk(file.storage).getStat(assetFilename),
stat: await storage.location(file.storage).stat(assetFilename),
};
}
@@ -147,7 +152,7 @@ export class AssetsService {
}
return await semaphore.runExclusive(async () => {
const readStream = storage.disk(file.storage).getStream(file.filename_disk, range);
const readStream = await storage.location(file.storage).read(file.filename_disk, range);
const transformer = sharp({
limitInputPixels: Math.pow(env.ASSETS_TRANSFORM_IMAGE_MAX_DIMENSION, 2),
sequentialRead: true,
@@ -158,22 +163,22 @@ export class AssetsService {
transforms.forEach(([method, ...args]) => (transformer[method] as any).apply(transformer, args));
readStream.on('error', (e) => {
readStream.on('error', (e: Error) => {
logger.error(e, `Couldn't transform file ${file.id}`);
readStream.unpipe(transformer);
});
await storage.disk(file.storage).put(assetFilename, readStream.pipe(transformer), type);
await storage.location(file.storage).write(assetFilename, readStream.pipe(transformer), type);
return {
stream: storage.disk(file.storage).getStream(assetFilename, range),
stat: await storage.disk(file.storage).getStat(assetFilename),
stream: await storage.location(file.storage).read(assetFilename, range),
stat: await storage.location(file.storage).stat(assetFilename),
file,
};
});
} else {
const readStream = storage.disk(file.storage).getStream(file.filename_disk, range);
const stat = await storage.disk(file.storage).getStat(file.filename_disk);
const readStream = await storage.location(file.storage).read(file.filename_disk, range);
const stat = await storage.location(file.storage).stat(file.filename_disk);
return { stream: readStream, file, stat };
}
}


@@ -1,11 +1,8 @@
import exifr from 'exifr';
import knex, { Knex } from 'knex';
import { MockClient, Tracker, getTracker } from 'knex-mock-client';
import { getTracker, MockClient, Tracker } from 'knex-mock-client';
import { afterEach, beforeAll, beforeEach, describe, expect, it, MockedFunction, SpyInstance, vi } from 'vitest';
import { FilesService, ItemsService } from '.';
import { InvalidPayloadException } from '../exceptions';
import { describe, beforeAll, afterEach, expect, it, vi, beforeEach, MockedFunction, SpyInstance } from 'vitest';
vi.mock('exifr');
describe('Integration Tests', () => {
let db: MockedFunction<Knex>;
@@ -60,37 +57,5 @@ describe('Integration Tests', () => {
expect(superCreateOne).toHaveBeenCalled();
});
});
describe('getMetadata', () => {
let service: FilesService;
let exifrParseSpy: SpyInstance<any>;
const sampleMetadata = {
CustomTagA: 'value a',
CustomTagB: 'value b',
CustomTagC: 'value c',
};
beforeEach(() => {
exifrParseSpy = vi.spyOn(exifr, 'parse');
service = new FilesService({
knex: db,
schema: { collections: {}, relations: [] },
});
});
it('accepts allowlist metadata tags', async () => {
exifrParseSpy.mockReturnValue(Promise.resolve({ ...sampleMetadata }));
const bufferContent = 'file buffer content';
const allowList = ['CustomTagB', 'CustomTagA'];
const metadata = await service.getMetadata(bufferContent, allowList);
expect(exifrParseSpy).toHaveBeenCalled();
expect(metadata.metadata.CustomTagA).toStrictEqual(sampleMetadata.CustomTagA);
expect(metadata.metadata.CustomTagB).toStrictEqual(sampleMetadata.CustomTagB);
expect(metadata.metadata.CustomTagC).toBeUndefined();
});
});
});
});


@@ -1,22 +1,26 @@
import exifr from 'exifr';
import { toArray } from '@directus/shared/utils';
import { lookup } from 'dns';
import encodeURL from 'encodeurl';
import exif from 'exif-reader';
import { parse as parseIcc } from 'icc';
import { clone, pick } from 'lodash';
import { extension } from 'mime-types';
import net from 'net';
import type { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';
import os from 'os';
import path from 'path';
import sharp from 'sharp';
import url, { URL } from 'url';
import { promisify } from 'util';
import { lookup } from 'dns';
import emitter from '../emitter';
import env from '../env';
import { ForbiddenException, InvalidPayloadException, ServiceUnavailableException } from '../exceptions';
import logger from '../logger';
import storage from '../storage';
import { AbstractServiceOptions, File, PrimaryKey, MutationOptions, Metadata } from '../types';
import { toArray } from '@directus/shared/utils';
import { getStorage } from '../storage';
import { AbstractServiceOptions, File, Metadata, MutationOptions, PrimaryKey } from '../types';
import { parseIptc, parseXmp } from '../utils/parse-image-metadata';
import { ItemsService } from './items';
import net from 'net';
import os from 'os';
import encodeURL from 'encodeurl';
// @ts-ignore
import formatTitle from '@directus/format-title';
@@ -32,11 +36,13 @@ export class FilesService extends ItemsService {
* Upload a single new file to the configured storage adapter
*/
async uploadOne(
stream: NodeJS.ReadableStream,
stream: Readable,
data: Partial<File> & { filename_download: string; storage: string },
primaryKey?: PrimaryKey,
opts?: MutationOptions
): Promise<PrimaryKey> {
const storage = await getStorage();
const payload = clone(data);
if ('folder' in payload === false) {
@@ -52,10 +58,10 @@ export class FilesService extends ItemsService {
// If the file you're uploading already exists, we'll consider this upload a replace. In that case, we'll
// delete the previously saved file and thumbnails to ensure they're generated fresh
const disk = storage.disk(payload.storage);
const disk = storage.location(payload.storage);
for await (const file of disk.flatList(String(primaryKey))) {
await disk.delete(file.path);
for await (const filepath of disk.list(String(primaryKey))) {
await disk.delete(filepath);
}
} else {
primaryKey = await this.createOne(payload, { emitEvents: false });
@@ -71,19 +77,19 @@ export class FilesService extends ItemsService {
}
try {
await storage.disk(data.storage).put(payload.filename_disk, stream, payload.type);
await storage.location(data.storage).write(payload.filename_disk, stream, payload.type);
} catch (err: any) {
logger.warn(`Couldn't save file ${payload.filename_disk}`);
logger.warn(err);
throw new ServiceUnavailableException(`Couldn't save file ${payload.filename_disk}`, { service: 'files' });
}
const { size } = await storage.disk(data.storage).getStat(payload.filename_disk);
const { size } = await storage.location(data.storage).stat(payload.filename_disk);
payload.filesize = size;
if (['image/jpeg', 'image/png', 'image/webp', 'image/gif', 'image/tiff'].includes(payload.type)) {
const buffer = await storage.disk(data.storage).getBuffer(payload.filename_disk);
const { height, width, description, title, tags, metadata } = await this.getMetadata(buffer.content);
const stream = await storage.location(data.storage).read(payload.filename_disk);
const { height, width, description, title, tags, metadata } = await this.getMetadata(stream);
payload.height ??= height;
payload.width ??= width;
@@ -128,58 +134,87 @@ export class FilesService extends ItemsService {
/**
* Extract metadata from a buffer's content
*/
async getMetadata(bufferContent: any, allowList = env.FILE_METADATA_ALLOW_LIST): Promise<Metadata> {
const metadata: Metadata = {};
async getMetadata(stream: Readable, allowList = env.FILE_METADATA_ALLOW_LIST): Promise<Metadata> {
return new Promise((resolve, reject) => {
pipeline(
stream,
sharp().metadata(async (err, sharpMetadata) => {
if (err) reject(err);
try {
const sharpMetadata = await sharp(bufferContent, {}).metadata();
const metadata: Metadata = {};
if (sharpMetadata.orientation && sharpMetadata.orientation >= 5) {
metadata.height = sharpMetadata.width;
metadata.width = sharpMetadata.height;
} else {
metadata.width = sharpMetadata.width;
metadata.height = sharpMetadata.height;
}
} catch (err: any) {
logger.warn(`Couldn't extract sharp metadata from file`);
logger.warn(err);
}
if (sharpMetadata.orientation && sharpMetadata.orientation >= 5) {
metadata.height = sharpMetadata.width;
metadata.width = sharpMetadata.height;
} else {
metadata.width = sharpMetadata.width;
metadata.height = sharpMetadata.height;
}
try {
const exifrMetadata = await exifr.parse(bufferContent, {
icc: false,
iptc: true,
ifd1: true,
interop: true,
translateValues: true,
reviveValues: true,
mergeOutput: false,
});
// Backward-compatible layout as it used to be with 'exifr'
const fullMetadata: {
ifd0?: Record<string, unknown>;
ifd1?: Record<string, unknown>;
exif?: Record<string, unknown>;
gps?: Record<string, unknown>;
interop?: Record<string, unknown>;
icc?: Record<string, unknown>;
iptc?: Record<string, unknown>;
xmp?: Record<string, unknown>;
} = {};
if (sharpMetadata.exif) {
const { image, thumbnail, interoperability, ...rest } = exif(sharpMetadata.exif);
if (image) {
fullMetadata.ifd0 = image;
}
if (thumbnail) {
fullMetadata.ifd1 = thumbnail;
}
if (interoperability) {
fullMetadata.interop = interoperability;
}
Object.assign(fullMetadata, rest);
}
if (sharpMetadata.icc) {
fullMetadata.icc = parseIcc(sharpMetadata.icc);
}
if (sharpMetadata.iptc) {
fullMetadata.iptc = parseIptc(sharpMetadata.iptc);
}
if (sharpMetadata.xmp) {
fullMetadata.xmp = parseXmp(sharpMetadata.xmp);
}
if (allowList === '*' || allowList?.[0] === '*') {
metadata.metadata = exifrMetadata;
} else {
metadata.metadata = pick(exifrMetadata, allowList);
}
if (fullMetadata?.iptc?.Caption && typeof fullMetadata.iptc.Caption === 'string') {
metadata.description = fullMetadata.iptc?.Caption;
}
if (fullMetadata?.iptc?.Headline && typeof fullMetadata.iptc.Headline === 'string') {
metadata.title = fullMetadata.iptc.Headline;
}
if (fullMetadata?.iptc?.Keywords) {
metadata.tags = fullMetadata.iptc.Keywords;
}
if (!metadata.description && exifrMetadata?.Caption) {
metadata.description = exifrMetadata.Caption;
}
if (allowList === '*' || allowList?.[0] === '*') {
metadata.metadata = fullMetadata;
} else {
metadata.metadata = pick(fullMetadata, allowList);
}
if (exifrMetadata?.Headline) {
metadata.title = exifrMetadata.Headline;
}
// Fix (incorrectly parsed?) values starting / ending with spaces,
// limited to one level and string values only
for (const section of Object.keys(metadata.metadata)) {
for (const [key, value] of Object.entries(metadata.metadata[section])) {
if (typeof value === 'string') {
metadata.metadata[section][key] = value.trim();
}
}
}
if (exifrMetadata?.Keywords) {
metadata.tags = exifrMetadata.Keywords;
}
} catch (err: any) {
logger.warn(`Couldn't extract EXIF metadata from file`);
logger.warn(err);
}
return metadata;
resolve(metadata);
})
);
});
}
/**
@@ -247,7 +282,7 @@ export class FilesService extends ItemsService {
let fileResponse;
try {
fileResponse = await axios.get<NodeJS.ReadableStream>(encodeURL(importURL), {
fileResponse = await axios.get<Readable>(encodeURL(importURL), {
responseType: 'stream',
});
} catch (err: any) {
@@ -296,6 +331,7 @@ export class FilesService extends ItemsService {
* Delete multiple files
*/
async deleteMany(keys: PrimaryKey[], opts?: MutationOptions): Promise<PrimaryKey[]> {
const storage = await getStorage();
const files = await super.readMany(keys, { fields: ['id', 'storage'], limit: -1 });
if (!files) {
@@ -305,11 +341,11 @@ export class FilesService extends ItemsService {
await super.deleteMany(keys);
for (const file of files) {
const disk = storage.disk(file.storage);
const disk = storage.location(file.storage);
// Delete file + thumbnails
for await (const { path } of disk.flatList(file.id)) {
await disk.delete(path);
for await (const filepath of disk.list(file.id)) {
await disk.delete(filepath);
}
}
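The `getMetadata` rewrite above swaps width and height when the EXIF orientation is 5 or higher (rotated images). A minimal self-contained sketch of just that rule; the names here are illustrative stand-ins, not part of the diff:

```typescript
// EXIF orientations 5-8 encode a 90°/270° rotation, so the stored
// width/height are transposed relative to the displayed image.
interface SharpLikeMetadata {
  width?: number;
  height?: number;
  orientation?: number;
}

function displayDimensions(meta: SharpLikeMetadata): { width?: number; height?: number } {
  if (meta.orientation && meta.orientation >= 5) {
    // Rotated: swap the reported dimensions, as the diff above does.
    return { width: meta.height, height: meta.width };
  }
  return { width: meta.width, height: meta.height };
}
```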


@@ -25,6 +25,7 @@ import { getDateFormatted } from '../utils/get-date-formatted';
import { FilesService } from './files';
import { ItemsService } from './items';
import { NotificationsService } from './notifications';
import type { Readable } from 'node:stream';
export class ImportService {
knex: Knex;
@@ -37,7 +38,7 @@ export class ImportService {
this.schema = options.schema;
}
async import(collection: string, mimetype: string, stream: NodeJS.ReadableStream): Promise<void> {
async import(collection: string, mimetype: string, stream: Readable): Promise<void> {
if (this.accountability?.admin !== true && collection.startsWith('directus_')) throw new ForbiddenException();
const createPermissions = this.accountability?.permissions?.find(
@@ -63,7 +64,7 @@ export class ImportService {
}
}
importJSON(collection: string, stream: NodeJS.ReadableStream): Promise<void> {
importJSON(collection: string, stream: Readable): Promise<void> {
const extractJSON = StreamArray.withParser();
return this.knex.transaction((trx) => {
@@ -104,7 +105,7 @@ export class ImportService {
});
}
importCSV(collection: string, stream: NodeJS.ReadableStream): Promise<void> {
importCSV(collection: string, stream: Readable): Promise<void> {
return this.knex.transaction((trx) => {
const service = new ItemsService(collection, {
knex: trx,


@@ -11,12 +11,13 @@ import { getDatabaseClient } from '../../src/database/index';
vi.mock('../env', async () => {
const actual = (await vi.importActual('../env')) as { default: Record<string, any> };
const MOCK_ENV = {
...actual.default,
CACHE_AUTO_PURGE: true,
};
return {
default: {
...actual.default,
CACHE_AUTO_PURGE: true,
},
default: MOCK_ENV,
getEnv: () => MOCK_ENV,
};
});


@@ -9,13 +9,14 @@ import getDatabase, { hasDatabaseConnection } from '../database';
import env from '../env';
import logger from '../logger';
import { rateLimiter } from '../middleware/rate-limiter';
import storage from '../storage';
import { getStorage } from '../storage';
import { AbstractServiceOptions } from '../types';
import { Accountability, SchemaOverview } from '@directus/shared/types';
import { toArray } from '@directus/shared/utils';
import getMailer from '../mailer';
import { SettingsService } from './settings';
import { getOSInfo } from '../utils/get-os-info';
import { Readable } from 'node:stream';
export class ServerService {
knex: Knex;
@@ -286,10 +287,12 @@ export class ServerService {
}
async function testStorage(): Promise<Record<string, HealthCheck[]>> {
const storage = await getStorage();
const checks: Record<string, HealthCheck[]> = {};
for (const location of toArray(env.STORAGE_LOCATIONS)) {
const disk = storage.disk(location);
const disk = storage.location(location);
const envThresholdKey = `STORAGE_${location}_HEALTHCHECK_THRESHOLD`.toUpperCase();
checks[`storage:${location}:responseTime`] = [
{
@@ -304,8 +307,8 @@ export class ServerService {
const startTime = performance.now();
try {
await disk.put(`health-${checkID}`, 'check');
await disk.get(`health-${checkID}`);
await disk.write(`health-${checkID}`, Readable.from(['check']));
await disk.read(`health-${checkID}`);
await disk.delete(`health-${checkID}`);
} catch (err: any) {
checks[`storage:${location}:responseTime`][0].status = 'error';
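The health check above now writes a `Readable` built with `Readable.from(['check'])` instead of a raw string. A small sketch, using only `node:stream`, of creating such a stream and draining it back — the round trip the check assumes a driver's `write`/`read` pair performs:

```typescript
import { Readable } from 'node:stream';

// Build a one-chunk stream the way the health check does.
const body = Readable.from(['check']);

// Drain a Readable back into a string, as a consumer of a storage
// driver's read() would.
async function drain(stream: Readable): Promise<string> {
  const chunks: Buffer[] = [];
  for await (const chunk of stream) chunks.push(Buffer.from(chunk));
  return Buffer.concat(chunks).toString('utf8');
}
```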


@@ -1,70 +0,0 @@
import { LocalFileSystemStorage, Storage, StorageManager, StorageManagerConfig } from '@directus/drive';
import { AzureBlobWebServicesStorage } from '@directus/drive-azure';
import { GoogleCloudStorage } from '@directus/drive-gcs';
import { AmazonWebServicesS3Storage } from '@directus/drive-s3';
import env from './env';
import { getConfigFromEnv } from './utils/get-config-from-env';
import { toArray } from '@directus/shared/utils';
import { validateEnv } from './utils/validate-env';
validateEnv(['STORAGE_LOCATIONS']);
const storage = new StorageManager(getStorageConfig());
registerDrivers(storage);
export default storage;
function getStorageConfig(): StorageManagerConfig {
const config: StorageManagerConfig = {
disks: {},
};
const locations = toArray(env.STORAGE_LOCATIONS);
locations.forEach((location: string) => {
location = location.trim();
const diskConfig = {
driver: env[`STORAGE_${location.toUpperCase()}_DRIVER`],
config: getConfigFromEnv(`STORAGE_${location.toUpperCase()}_`),
};
delete diskConfig.config.publicUrl;
delete diskConfig.config.driver;
config.disks![location] = diskConfig;
});
return config;
}
function registerDrivers(storage: StorageManager) {
const usedDrivers: string[] = [];
for (const [key, value] of Object.entries(env)) {
if ((key.startsWith('STORAGE') && key.endsWith('DRIVER')) === false) continue;
if (value && usedDrivers.includes(value) === false) usedDrivers.push(value);
}
usedDrivers.forEach((driver) => {
const storageDriver = getStorageDriver(driver);
if (storageDriver) {
storage.registerDriver<Storage>(driver, storageDriver);
}
});
}
function getStorageDriver(driver: string) {
switch (driver) {
case 'local':
return LocalFileSystemStorage;
case 's3':
return AmazonWebServicesS3Storage;
case 'gcs':
return GoogleCloudStorage;
case 'azure':
return AzureBlobWebServicesStorage;
}
}


@@ -0,0 +1,21 @@
import { randWord } from '@ngneat/falso';
import { expect, test } from 'vitest';
import { getStorageDriver, _aliasMap } from './get-storage-driver.js';
test('Returns imported installed driver for each supported driver', async () => {
for (const driverKey of Object.keys(_aliasMap)) {
const driver = await getStorageDriver(driverKey);
expect(driver).not.toBeUndefined();
}
});
test('Throws error for key that is not supported', async () => {
const driverKey = `fake-${randWord()}`;
try {
await getStorageDriver(driverKey);
} catch (err: any) {
expect(err).toBeInstanceOf(Error);
expect(err.message).toBe(`Driver "${driverKey}" doesn't exist.`);
}
});


@@ -0,0 +1,20 @@
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { Driver } from '@directus/storage';
export const _aliasMap: Record<string, string> = {
local: '@directus/storage-driver-local',
s3: '@directus/storage-driver-s3',
gcs: '@directus/storage-driver-gcs',
azure: '@directus/storage-driver-azure',
cloudinary: '@directus/storage-driver-cloudinary',
};
export const getStorageDriver = async (driverName: string): Promise<typeof Driver> => {
if (driverName in _aliasMap) {
driverName = _aliasMap[driverName];
} else {
throw new Error(`Driver "${driverName}" doesn't exist.`);
}
return (await import(driverName)).default;
};
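The new `getStorageDriver` above maps short driver keys to package names and rejects unknown keys before attempting an import. A self-contained sketch of that lookup, with stub loaders standing in for the real dynamic `import()` of the `@directus/storage-driver-*` packages:

```typescript
// Stub loaders in place of dynamic imports, for illustration only.
const aliasMap: Record<string, () => Promise<string>> = {
  local: async () => 'DriverLocal',
  s3: async () => 'DriverS3',
};

async function resolveDriver(name: string): Promise<string> {
  const loader = aliasMap[name];
  // Unknown keys fail fast, mirroring the error message in the diff above.
  if (!loader) throw new Error(`Driver "${name}" doesn't exist.`);
  return await loader();
}
```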


@@ -0,0 +1,58 @@
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import { StorageManager } from '@directus/storage';
import { afterEach, beforeEach, expect, test, vi } from 'vitest';
import { validateEnv } from '../utils/validate-env.js';
import { getStorage, _cache } from './index.js';
import { registerDrivers } from './register-drivers.js';
import { registerLocations } from './register-locations.js';
vi.mock('@directus/storage');
vi.mock('./register-drivers.js');
vi.mock('./register-locations.js');
vi.mock('../utils/validate-env.js');
let mockStorage: StorageManager;
beforeEach(() => {
mockStorage = {} as StorageManager;
_cache.storage = null;
vi.mocked(StorageManager).mockReturnValue(mockStorage);
});
afterEach(() => {
vi.resetAllMocks();
});
test('Returns storage from cache immediately if cache has been filled', async () => {
_cache.storage = mockStorage;
expect(await getStorage()).toBe(mockStorage);
});
test('Validates STORAGE_LOCATIONS to exist in env', async () => {
await getStorage();
expect(validateEnv).toHaveBeenCalledWith(['STORAGE_LOCATIONS']);
});
test('Creates new StorageManager instance in cache', async () => {
await getStorage();
expect(StorageManager).toHaveBeenCalledOnce();
expect(StorageManager).toHaveBeenCalledWith();
expect(_cache.storage).toBe(mockStorage);
});
test('Registers drivers against cached storage manager', async () => {
await getStorage();
expect(registerDrivers).toHaveBeenCalledWith(_cache.storage);
});
test('Registers locations against cached storage manager', async () => {
await getStorage();
expect(registerLocations).toHaveBeenCalledWith(_cache.storage);
});
test('Returns cached storage manager', async () => {
const storage = await getStorage();
expect(storage).toBe(_cache.storage);
expect(storage).toBe(mockStorage);
});

api/src/storage/index.ts

@@ -0,0 +1,25 @@
import { validateEnv } from '../utils/validate-env';
import { registerDrivers } from './register-drivers';
import { registerLocations } from './register-locations';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { StorageManager } from '@directus/storage';
export const _cache: { storage: StorageManager | null } = {
storage: null,
};
export const getStorage = async (): Promise<StorageManager> => {
if (_cache.storage) return _cache.storage;
const { StorageManager } = await import('@directus/storage');
validateEnv(['STORAGE_LOCATIONS']);
_cache.storage = new StorageManager();
await registerDrivers(_cache.storage);
await registerLocations(_cache.storage);
return _cache.storage;
};
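`getStorage` is a memoized singleton: the first call constructs, validates, and registers; every later call returns the cached manager. The shape reduces to the sketch below (names are illustrative, and the real function is async because it dynamically imports `@directus/storage`; that is dropped here for brevity):

```typescript
// Minimal sketch of the module-level cache pattern: construction runs once,
// subsequent calls short-circuit to the cached instance.
type Manager = { locations: string[] };

const cache: { manager: Manager | null } = { manager: null };
let constructions = 0;

function getManager(): Manager {
	if (cache.manager) return cache.manager;
	constructions++; // driver/location registration would happen alongside this
	cache.manager = { locations: [] };
	return cache.manager;
}
```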


@@ -0,0 +1,85 @@
import { randWord } from '@ngneat/falso';
import { afterEach, beforeEach, expect, test, vi } from 'vitest';
import { getEnv } from '../env.js';
import { getStorageDriver } from './get-storage-driver.js';
import { registerDrivers } from './register-drivers.js';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { Driver, StorageManager } from '@directus/storage';
vi.mock('./get-storage-driver.js');
vi.mock('../env.js');
let mockStorage: StorageManager;
let mockDriver: typeof Driver;
let sample: {
name: string;
};
beforeEach(() => {
mockStorage = {
registerDriver: vi.fn(),
} as unknown as StorageManager;
mockDriver = {} as unknown as typeof Driver;
vi.mocked(getStorageDriver).mockResolvedValue(mockDriver);
sample = {
name: randWord(),
};
});
afterEach(() => {
vi.resetAllMocks();
});
test('Does nothing if no storage drivers are configured in Env', async () => {
vi.mocked(getEnv).mockReturnValue({});
await registerDrivers(mockStorage);
expect(mockStorage.registerDriver).toHaveBeenCalledTimes(0);
});
test('Ignores environment variables that do not start with STORAGE_ and end with _DRIVER', async () => {
vi.mocked(getEnv).mockReturnValue({
[`NOSTORAGE_${randWord().toUpperCase()}_DRIVER`]: randWord(),
[`STORAGE_${randWord().toUpperCase()}_NODRIVER`]: randWord(),
});
await registerDrivers(mockStorage);
expect(mockStorage.registerDriver).toHaveBeenCalledTimes(0);
});
test('Only registers driver once per library', async () => {
vi.mocked(getEnv).mockReturnValue({
[`STORAGE_${randWord().toUpperCase()}_DRIVER`]: sample.name,
[`STORAGE_${randWord().toUpperCase()}_DRIVER`]: sample.name,
});
await registerDrivers(mockStorage);
expect(mockStorage.registerDriver).toHaveBeenCalledOnce();
});
test('Gets storage driver for name', async () => {
vi.mocked(getEnv).mockReturnValue({
[`STORAGE_${randWord().toUpperCase()}_DRIVER`]: sample.name,
});
await registerDrivers(mockStorage);
expect(getStorageDriver).toHaveBeenCalledWith(sample.name);
});
test('Registers storage driver to manager', async () => {
vi.mocked(getEnv).mockReturnValue({
[`STORAGE_${randWord().toUpperCase()}_DRIVER`]: sample.name,
});
await registerDrivers(mockStorage);
expect(mockStorage.registerDriver).toHaveBeenCalledWith(sample.name, mockDriver);
});


@@ -0,0 +1,24 @@
import { getEnv } from '../env';
import { getStorageDriver } from './get-storage-driver';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { StorageManager } from '@directus/storage';
export const registerDrivers = async (storage: StorageManager) => {
const env = getEnv();
const usedDrivers: string[] = [];
for (const [key, value] of Object.entries(env)) {
if ((key.startsWith('STORAGE_') && key.endsWith('_DRIVER')) === false) continue;
if (value && usedDrivers.includes(value) === false) usedDrivers.push(value);
}
for (const driverName of usedDrivers) {
const storageDriver = await getStorageDriver(driverName);
if (storageDriver) {
storage.registerDriver(driverName, storageDriver);
}
}
};
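The scan above can be exercised in isolation: it keeps every distinct value from env keys matching `STORAGE_<NAME>_DRIVER` and ignores the rest, so two locations backed by the same driver register it once. A self-contained sketch of that filter (hypothetical helper name, same predicate):

```typescript
// Collect each distinct driver named by a STORAGE_<LOCATION>_DRIVER
// variable, ignoring every other environment key.
function collectUsedDrivers(env: Record<string, unknown>): string[] {
	const used: string[] = [];
	for (const [key, value] of Object.entries(env)) {
		if ((key.startsWith('STORAGE_') && key.endsWith('_DRIVER')) === false) continue;
		if (typeof value === 'string' && value && used.includes(value) === false) used.push(value);
	}
	return used;
}
```

For example, `{ STORAGE_LOCAL_DRIVER: 'local', STORAGE_CDN_DRIVER: 's3', STORAGE_BACKUP_DRIVER: 's3', PORT: 8055 }` yields `['local', 's3']`.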


@@ -0,0 +1,82 @@
import { toArray } from '@directus/shared/utils';
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { StorageManager } from '@directus/storage';
import { randNumber, randWord } from '@ngneat/falso';
import { afterEach, beforeEach, expect, test, vi } from 'vitest';
import { getEnv } from '../env.js';
import { getConfigFromEnv } from '../utils/get-config-from-env.js';
import { registerLocations } from './register-locations.js';
vi.mock('../env.js');
vi.mock('@directus/shared/utils');
vi.mock('../utils/get-config-from-env.js');
let sample: {
options: {
[location: string]: {
[key: string]: string;
};
};
locations: string[];
};
let mockStorage: StorageManager;
beforeEach(() => {
sample = {
options: {},
locations: randWord({ length: randNumber({ min: 1, max: 10 }) }),
};
sample.locations.forEach((location) => {
const keys = randWord({ length: randNumber({ min: 1, max: 10 }) });
const values = randWord({ length: keys.length });
sample.options[`STORAGE_${location.toUpperCase()}_`] = {
driver: randWord(),
};
keys.forEach((key, index) => (sample.options[`STORAGE_${location.toUpperCase()}_`][key] = values[index]));
});
mockStorage = {
registerLocation: vi.fn(),
} as unknown as StorageManager;
vi.mocked(getConfigFromEnv).mockImplementation((name) => sample.options[name]);
vi.mocked(getEnv).mockReturnValue({
STORAGE_LOCATIONS: sample.locations.join(', '),
});
vi.mocked(toArray).mockReturnValue(sample.locations);
});
afterEach(() => {
vi.resetAllMocks();
});
test('Converts storage locations env var to array', async () => {
await registerLocations(mockStorage);
expect(toArray).toHaveBeenCalledWith(sample.locations.join(', '));
});
test('Gets config for each location', async () => {
await registerLocations(mockStorage);
expect(getConfigFromEnv).toHaveBeenCalledTimes(sample.locations.length);
sample.locations.forEach((location) =>
expect(getConfigFromEnv).toHaveBeenCalledWith(`STORAGE_${location.toUpperCase()}_`)
);
});
test('Registers location with driver options for each location', async () => {
await registerLocations(mockStorage);
expect(mockStorage.registerLocation).toHaveBeenCalledTimes(sample.locations.length);
sample.locations.forEach((location) => {
const { driver, ...options } = sample.options[`STORAGE_${location.toUpperCase()}_`];
expect(mockStorage.registerLocation).toHaveBeenCalledWith(location, {
driver,
options,
});
});
});


@@ -0,0 +1,19 @@
// @ts-expect-error https://github.com/microsoft/TypeScript/issues/49721
import type { StorageManager } from '@directus/storage';
import { toArray } from '@directus/shared/utils';
import { getEnv } from '../env';
import { getConfigFromEnv } from '../utils/get-config-from-env';
export const registerLocations = async (storage: StorageManager) => {
const env = getEnv();
const locations = toArray(env.STORAGE_LOCATIONS);
locations.forEach((location: string) => {
location = location.trim();
const driverConfig = getConfigFromEnv(`STORAGE_${location.toUpperCase()}_`);
const { driver, ...options } = driverConfig;
storage.registerLocation(location, { driver, options });
});
};
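Each location's config comes from its `STORAGE_<NAME>_` env prefix: the `DRIVER` key selects the driver, and the remaining keys become driver options. A sketch of that split (the real `getConfigFromEnv` also camel-cases keys; here the option names are just the lower-cased env suffixes, an assumption for brevity, and `locationConfig` is a hypothetical helper):

```typescript
// Turn one location's STORAGE_<NAME>_* env block into the
// { driver, options } shape passed to registerLocation.
function locationConfig(env: Record<string, string>, location: string) {
	const prefix = `STORAGE_${location.trim().toUpperCase()}_`;
	const raw = Object.fromEntries(
		Object.entries(env)
			.filter(([key]) => key.startsWith(prefix))
			.map(([key, value]) => [key.slice(prefix.length).toLowerCase(), value])
	);
	const { driver, ...options } = raw;
	return { driver, options };
}
```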


@@ -11,3 +11,8 @@ declare module 'pino-http' {
req: (req: any) => Record<string, any>;
};
}
declare module 'icc' {
const parse: (buf: Buffer) => Record<string, string>;
export { parse };
}


@@ -1,14 +1,18 @@
import { getConfigFromEnv } from '../../src/utils/get-config-from-env';
import { describe, test, expect, vi } from 'vitest';
vi.mock('../../src/env', () => ({
default: {
vi.mock('../../src/env', () => {
const MOCK_ENV = {
OBJECT_BRAND__COLOR: 'purple',
OBJECT_BRAND__HEX: '#6644FF',
CAMELCASE_OBJECT__FIRST_KEY: 'firstValue',
CAMELCASE_OBJECT__SECOND_KEY: 'secondValue',
},
}));
};
return {
default: MOCK_ENV,
getEnv: () => MOCK_ENV,
};
});
describe('get config from env', () => {
test('Keys with double underscore should be an object', () => {


@@ -1,12 +1,14 @@
import camelcase from 'camelcase';
import { set } from 'lodash';
import env from '../env';
import { getEnv } from '../env';
export function getConfigFromEnv(
prefix: string,
omitPrefix?: string | string[],
type: 'camelcase' | 'underscore' = 'camelcase'
): Record<string, any> {
const env = getEnv();
const config: any = {};
for (const [key, value] of Object.entries(env)) {


@@ -0,0 +1,83 @@
const IPTC_ENTRY_TYPES = new Map([
[0x78, 'caption'],
[0x6e, 'credit'],
[0x19, 'keywords'],
[0x37, 'dateCreated'],
[0x50, 'byline'],
[0x55, 'bylineTitle'],
[0x7a, 'captionWriter'],
[0x69, 'headline'],
[0x74, 'copyright'],
[0x0f, 'category'],
]);
const IPTC_ENTRY_MARKER = Buffer.from([0x1c, 0x02]);
export function parseIptc(buffer: Buffer): Record<string, unknown> {
if (!Buffer.isBuffer(buffer)) return {};
const iptc: Record<string, any> = {};
let lastIptcEntryPos = buffer.indexOf(IPTC_ENTRY_MARKER);
while (lastIptcEntryPos !== -1) {
const iptcBlockTypePos = lastIptcEntryPos + IPTC_ENTRY_MARKER.byteLength;
const iptcBlockSizePos = iptcBlockTypePos + 1;
const iptcBlockDataPos = iptcBlockSizePos + 2;
// Bail out if the marker sits too close to the end for the type/size fields to exist
if (iptcBlockDataPos > buffer.byteLength) break;
const iptcBlockType = buffer.readUInt8(iptcBlockTypePos);
const iptcBlockSize = buffer.readUInt16BE(iptcBlockSizePos);
// Advance to the next marker before any `continue` so the loop always makes progress
lastIptcEntryPos = buffer.indexOf(IPTC_ENTRY_MARKER, iptcBlockDataPos + iptcBlockSize);
if (!IPTC_ENTRY_TYPES.has(iptcBlockType)) {
continue;
}
const iptcBlockTypeId = IPTC_ENTRY_TYPES.get(iptcBlockType);
const iptcData = buffer.subarray(iptcBlockDataPos, iptcBlockDataPos + iptcBlockSize).toString();
if (iptcBlockTypeId) {
if (iptc[iptcBlockTypeId] == null) {
iptc[iptcBlockTypeId] = iptcData;
} else if (Array.isArray(iptc[iptcBlockTypeId])) {
iptc[iptcBlockTypeId].push(iptcData);
} else {
iptc[iptcBlockTypeId] = [iptc[iptcBlockTypeId], iptcData];
}
}
}
return iptc;
}
export function parseXmp(buffer: Buffer): Record<string, unknown> {
const xmp: Record<string, unknown> = {};
['title', 'description', 'rights', 'creator', 'subject'].forEach((x) => {
const tagRegex = new RegExp(`<dc:${x}>(.*?)</dc:${x}>`, 'smig');
const tagMatches = tagRegex.exec(buffer.toString());
if (!tagMatches || tagMatches.length === 0) {
return;
}
const value = tagMatches[1].trim();
if (value.toLowerCase().indexOf('<rdf:bag>') === 0) {
const r = new RegExp('<rdf:li>(.*?)</rdf:li>', 'smig');
let match = r.exec(value);
const result = [];
while (match) {
result.push(match[1]);
match = r.exec(value);
}
xmp[x] = result;
} else {
xmp[x] = value.replace(/<[^>]*>?/gm, '').trim();
}
});
return xmp;
}
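The layout `parseIptc` walks is easy to reproduce by hand: each entry is the `0x1c 0x02` marker, one type byte, a big-endian u16 length, then the payload (`0x78` is the `caption` type in the table above). A sketch that assembles a two-entry block and reads the first entry back out (the `iptcEntry` builder is illustrative, not part of the source):

```typescript
// Build one IPTC entry: marker, type byte, big-endian u16 length, payload.
function iptcEntry(type: number, text: string): Buffer {
	const data = Buffer.from(text, 'utf8');
	const header = Buffer.from([0x1c, 0x02, type, (data.length >> 8) & 0xff, data.length & 0xff]);
	return Buffer.concat([header, data]);
}

const block = Buffer.concat([iptcEntry(0x78, 'A caption'), iptcEntry(0x19, 'storage')]);

// Manual walk of the first entry: offset 2 is the type byte, offsets 3-4
// hold the payload length, the payload starts at offset 5.
const size = block.readUInt16BE(3);
const caption = block.subarray(5, 5 + size).toString();
```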


@@ -1,7 +1,9 @@
import env from '../env';
import { getEnv } from '../env';
import logger from '../logger';
export function validateEnv(requiredKeys: string[]): void {
const env = getEnv();
for (const requiredKey of requiredKeys) {
if (requiredKey in env === false) {
logger.error(`"${requiredKey}" Environment Variable is missing.`);


@@ -2,10 +2,11 @@ module.exports = {
preset: 'ts-jest',
testEnvironment: 'node',
moduleNameMapper: {
'@directus/drive-s3(.*)$': `${__dirname}/packages/drive-s3/src/$1`,
'@directus/drive-gcs(.*)$': `${__dirname}/packages/drive-gcs/src/$1`,
'@directus/drive-azure(.*)$': `${__dirname}/packages/drive-azure/src/$1`,
'@directus/drive(.*)$': `${__dirname}/packages/drive/src/$1`,
'@directus/storage-driver-s3(.*)$': `${__dirname}/packages/storage-driver-s3/src/$1`,
'@directus/storage-driver-gcs(.*)$': `${__dirname}/packages/storage-driver-gcs/src/$1`,
'@directus/storage-driver-azure(.*)$': `${__dirname}/packages/storage-driver-azure/src/$1`,
'@directus/storage-driver-cloudinary(.*)$': `${__dirname}/packages/storage-driver-cloudinary/src/$1`,
'@directus/storage(.*)$': `${__dirname}/packages/storage/src/$1`,
'@directus/extension-sdk(.*)$': `${__dirname}/packages/extension-sdk/src/$1`,
'@directus/schema(.*)$': `${__dirname}/packages/schema/src/$1`,
'@directus/shared(.*)$': `${__dirname}/packages/shared/src/$1`,


@@ -1,12 +0,0 @@
root=true
[*]
end_of_line = lf
insert_final_newline = true
charset = utf-8
indent_style = tab
trim_trailing_whitespace = true
[{package.json,*.yml,*.yaml}]
indent_style = space
indent_size = 2


@@ -1,10 +0,0 @@
node_modules
/_*
/coverage
/dist*
/src/**/*.js
/*.d.ts
/*.d.ts.map
*.tgz


@@ -1,9 +0,0 @@
require('dotenv').config();
module.exports = {
preset: 'ts-jest',
verbose: true,
setupFiles: ['dotenv/config'],
testURL: process.env.TEST_URL || 'http://localhost',
collectCoverageFrom: ['src/**/*.ts'],
};


@@ -1,52 +0,0 @@
{
"name": "@directus/drive-azure",
"version": "9.21.2",
"description": "Azure Blob driver for @directus/drive",
"homepage": "https://directus.io",
"bugs": {
"url": "https://github.com/directus/directus/issues"
},
"repository": {
"type": "git",
"url": "https://github.com/directus/directus.git",
"directory": "packages/drive-azure"
},
"funding": "https://github.com/directus/directus?sponsor=1",
"license": "MIT",
"author": "Robin Grundvåg <robgru52@gmail.com>",
"contributors": [
"Rijk van Zanten <rijkvanzanten@me.com>"
],
"exports": {
".": "./dist/index.js",
"./package.json": "./package.json"
},
"main": "dist/index.js",
"files": [
"dist",
"!**/*.d.ts?(.map)"
],
"scripts": {
"build": "tsc --project ./tsconfig.json",
"test:watch": "jest --coverage --watchAll",
"test": "jest --coverage",
"dev": "pnpm build -w --preserveWatchOutput --incremental"
},
"dependencies": {
"@azure/storage-blob": "12.12.0",
"@directus/drive": "workspace:*",
"normalize-path": "3.0.0"
},
"devDependencies": {
"@types/jest": "29.2.3",
"@types/node": "18.11.9",
"@types/normalize-path": "3.0.0",
"dotenv": "16.0.3",
"jest": "29.3.1",
"ts-jest": "29.0.3",
"typescript": "4.9.3"
},
"publishConfig": {
"access": "public"
}
}


@@ -1,3 +0,0 @@
# @directus/drive-azure
Azure storage layer for `@directus/drive`


@@ -1,266 +0,0 @@
import {
Storage,
UnknownException,
FileNotFound,
SignedUrlOptions,
Response,
ExistsResponse,
ContentResponse,
SignedUrlResponse,
StatResponse,
FileListResponse,
DeleteResponse,
isReadableStream,
Range,
} from '@directus/drive';
import {
BlobServiceClient,
ContainerClient,
StorageSharedKeyCredential,
generateBlobSASQueryParameters,
ContainerSASPermissions,
} from '@azure/storage-blob';
import path from 'path';
import { PassThrough, Readable } from 'stream';
import normalize from 'normalize-path';
function handleError(err: Error, path: string): Error {
return new UnknownException(err, err.name, path);
}
export class AzureBlobWebServicesStorage extends Storage {
protected $client: BlobServiceClient;
protected $containerClient: ContainerClient;
protected $signedCredentials: StorageSharedKeyCredential;
protected $root: string;
constructor(config: AzureBlobWebServicesStorageConfig) {
super();
this.$signedCredentials = new StorageSharedKeyCredential(config.accountName, config.accountKey);
this.$client = new BlobServiceClient(
config.endpoint ?? `https://${config.accountName}.blob.core.windows.net`,
this.$signedCredentials
);
this.$containerClient = this.$client.getContainerClient(config.containerName);
this.$root = config.root ? normalize(config.root).replace(/^\//, '') : '';
}
/**
* Prefixes the given filePath with the storage root location
*/
protected _fullPath(filePath: string): string {
return normalize(path.join(this.$root, filePath));
}
public async copy(src: string, dest: string): Promise<Response> {
src = this._fullPath(src);
dest = this._fullPath(dest);
try {
const source = this.$containerClient.getBlockBlobClient(src);
const target = this.$containerClient.getBlockBlobClient(dest);
const poller = await target.beginCopyFromURL(source.url);
const result = await poller.pollUntilDone();
return { raw: result };
} catch (e: any) {
throw handleError(e, src);
}
}
public async delete(location: string): Promise<DeleteResponse> {
location = this._fullPath(location);
try {
const result = await this.$containerClient.getBlockBlobClient(location).deleteIfExists();
return { raw: result, wasDeleted: result.succeeded };
} catch (e: any) {
throw handleError(e, location);
}
}
public driver(): BlobServiceClient {
return this.$client;
}
public async exists(location: string): Promise<ExistsResponse> {
location = this._fullPath(location);
try {
const result = await this.$containerClient.getBlockBlobClient(location).exists();
return { exists: result, raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
public async get(location: string, encoding: BufferEncoding = 'utf-8'): Promise<ContentResponse<string>> {
try {
const bufferResult = await this.getBuffer(location);
return {
content: bufferResult.content.toString(encoding),
raw: bufferResult.raw,
};
} catch (e: any) {
throw new FileNotFound(e, location);
}
}
public async getBuffer(location: string): Promise<ContentResponse<Buffer>> {
location = this._fullPath(location);
try {
const client = this.$containerClient.getBlobClient(location);
return { content: await client.downloadToBuffer(), raw: client };
} catch (e: any) {
throw handleError(e, location);
}
}
public async getSignedUrl(location: string, options: SignedUrlOptions = {}): Promise<SignedUrlResponse> {
location = this._fullPath(location);
const { expiry = 900 } = options;
try {
const client = this.$containerClient.getBlobClient(location);
const blobSAS = generateBlobSASQueryParameters(
{
containerName: this.$containerClient.containerName,
blobName: location,
permissions: ContainerSASPermissions.parse('racwdl'),
startsOn: new Date(),
expiresOn: new Date(new Date().valueOf() + expiry),
},
this.$signedCredentials
).toString();
const sasUrl = client.url + '?' + blobSAS;
return { signedUrl: sasUrl, raw: client };
} catch (e: any) {
throw handleError(e, location);
}
}
public async getStat(location: string): Promise<StatResponse> {
location = this._fullPath(location);
try {
const props = await this.$containerClient.getBlobClient(location).getProperties();
return {
size: props.contentLength as number,
modified: props.lastModified as Date,
raw: props,
};
} catch (e: any) {
throw handleError(e, location);
}
}
public getStream(location: string, range?: Range): NodeJS.ReadableStream {
location = this._fullPath(location);
const intermediateStream = new PassThrough({ highWaterMark: 1 });
const stream = this.$containerClient
.getBlobClient(location)
.download(range?.start, range?.end ? range.end - (range.start || 0) : undefined);
try {
stream
.then((result) => result.readableStreamBody)
.then((stream) => {
if (!stream) {
throw handleError(new Error('Blobclient stream was not available'), location);
}
stream.pipe(intermediateStream);
})
.catch((error) => {
intermediateStream.emit('error', error);
});
} catch (error: any) {
intermediateStream.emit('error', error);
}
return intermediateStream;
}
public getUrl(location: string): string {
location = this._fullPath(location);
return this.$containerClient.getBlobClient(location).url;
}
public async move(src: string, dest: string): Promise<Response> {
src = this._fullPath(src);
dest = this._fullPath(dest);
const source = this.$containerClient.getBlockBlobClient(src);
const target = this.$containerClient.getBlockBlobClient(dest);
const poller = await target.beginCopyFromURL(source.url);
const result = await poller.pollUntilDone();
await source.deleteIfExists();
return { raw: result };
}
public async put(
location: string,
content: Buffer | NodeJS.ReadableStream | string,
type?: string
): Promise<Response> {
location = this._fullPath(location);
const blockBlobClient = this.$containerClient.getBlockBlobClient(location);
try {
if (isReadableStream(content)) {
const result = await blockBlobClient.uploadStream(content as Readable, undefined, undefined, {
blobHTTPHeaders: { blobContentType: type ?? 'application/octet-stream' },
});
return { raw: result };
}
const result = await blockBlobClient.upload(content, content.length);
return { raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
public async *flatList(prefix = ''): AsyncIterable<FileListResponse> {
prefix = this._fullPath(prefix);
try {
const blobs = this.$containerClient.listBlobsFlat({
prefix,
});
for await (const blob of blobs) {
yield {
raw: blob,
path: (blob.name as string).substring(this.$root.length),
};
}
} catch (e: any) {
throw handleError(e, prefix);
}
}
}
export interface AzureBlobWebServicesStorageConfig {
containerName: string;
accountName: string;
accountKey: string;
endpoint?: string;
root?: string;
}


@@ -1 +0,0 @@
export * from './AzureBlobWebServices';


@@ -1 +0,0 @@
storage


@@ -1,27 +0,0 @@
import { StorageManager } from '@directus/drive';
import { AzureBlobWebServicesStorage } from '../src';
describe('drive', function () {
it('Instantiate', function () {
const storage = new StorageManager({
default: 'azure',
disks: {
remote: {
driver: 'azure',
config: {
containerName: 'containerName',
accountName: 'accountName',
accountKey: 'accountKey',
endpoint: 'http://localhost/accountName',
root: '/',
},
},
},
});
storage.registerDriver('azure', AzureBlobWebServicesStorage);
const disk = storage.disk('remote');
expect(disk).toBeInstanceOf(AzureBlobWebServicesStorage);
});
});


@@ -1,7 +0,0 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"rootDir": ".."
},
"include": [".", "../src"]
}


@@ -1,32 +0,0 @@
{
"compilerOptions": {
"target": "ES2018",
"lib": ["ES2018"],
"module": "CommonJS",
"moduleResolution": "node",
"declaration": true,
"declarationMap": true,
"strict": true,
"noFallthroughCasesInSwitch": true,
"esModuleInterop": true,
"noImplicitAny": true,
"noImplicitThis": true,
"noImplicitReturns": true,
"noUnusedLocals": true,
"noUncheckedIndexedAccess": true,
"noUnusedParameters": true,
"alwaysStrict": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"strictBindCallApply": true,
"strictPropertyInitialization": true,
"resolveJsonModule": false,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"allowSyntheticDefaultImports": true,
"isolatedModules": true,
"rootDir": "./src",
"outDir": "./dist"
},
"include": ["./src/**/*.ts"]
}


@@ -1,12 +0,0 @@
root=true
[*]
end_of_line = lf
insert_final_newline = true
charset = utf-8
indent_style = tab
trim_trailing_whitespace = true
[{package.json,*.yml,*.yaml}]
indent_style = space
indent_size = 2


@@ -1,10 +0,0 @@
node_modules
/_*
/coverage
/dist*
/src/**/*.js
/*.d.ts
/*.d.ts.map
*.tgz


@@ -1,14 +0,0 @@
Copyright 2020 - Slynova Romain Lanz
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit
persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -1,9 +0,0 @@
require('dotenv').config();
module.exports = {
preset: 'ts-jest',
verbose: true,
setupFiles: ['dotenv/config'],
testURL: process.env.TEST_URL || 'http://localhost',
collectCoverageFrom: ['src/**/*.ts'],
};


@@ -1,54 +0,0 @@
{
"name": "@directus/drive-gcs",
"version": "9.21.2",
"description": "Google Cloud Storage driver for @directus/drive",
"homepage": "https://directus.io",
"bugs": {
"url": "https://github.com/directus/directus/issues"
},
"repository": {
"type": "git",
"url": "https://github.com/directus/directus.git",
"directory": "packages/drive-gcs"
},
"funding": "https://github.com/directus/directus?sponsor=1",
"license": "MIT",
"author": "Robin Grundvåg <robgru52@gmail.com>",
"contributors": [
"Rijk van Zanten <rijkvanzanten@me.com>"
],
"exports": {
".": "./dist/index.js",
"./package.json": "./package.json"
},
"main": "dist/index.js",
"files": [
"dist",
"!**/*.d.ts?(.map)"
],
"scripts": {
"build": "tsc --project ./tsconfig.json",
"test:watch": "jest --coverage --watchAll",
"test": "jest --coverage",
"dev": "pnpm build -w --preserveWatchOutput --incremental"
},
"dependencies": {
"@directus/drive": "workspace:*",
"@google-cloud/storage": "6.7.0",
"lodash": "4.17.21",
"normalize-path": "3.0.0"
},
"devDependencies": {
"@types/jest": "29.2.3",
"@types/lodash": "4.14.189",
"@types/node": "18.11.9",
"@types/normalize-path": "3.0.0",
"dotenv": "16.0.3",
"jest": "29.3.1",
"ts-jest": "29.0.3",
"typescript": "4.9.3"
},
"publishConfig": {
"access": "public"
}
}


@@ -1,3 +0,0 @@
# @directus/drive-gcs
Google Cloud Storage storage layer for `@directus/drive`


@@ -1,274 +0,0 @@
import {
Storage as GCSDriver,
StorageOptions,
Bucket,
File,
GetFilesOptions,
GetFilesResponse,
} from '@google-cloud/storage';
import {
Storage,
isReadableStream,
pipeline,
Response,
ExistsResponse,
ContentResponse,
SignedUrlResponse,
SignedUrlOptions,
StatResponse,
FileListResponse,
DeleteResponse,
FileNotFound,
PermissionMissing,
UnknownException,
AuthorizationRequired,
WrongKeyPath,
Range,
} from '@directus/drive';
import path from 'path';
import normalize from 'normalize-path';
import { mapKeys, snakeCase } from 'lodash';
function handleError(err: Error & { code?: number | string }, path: string): Error {
switch (err.code) {
case 401:
return new AuthorizationRequired(err, path);
case 403:
return new PermissionMissing(err, path);
case 404:
return new FileNotFound(err, path);
case 'ENOENT':
return new WrongKeyPath(err, path);
default:
return new UnknownException(err, String(err.code), path);
}
}
export class GoogleCloudStorage extends Storage {
protected $config: GoogleCloudStorageConfig;
protected $driver: GCSDriver;
protected $bucket: Bucket;
protected $root: string;
public constructor(config: GoogleCloudStorageConfig) {
super();
// This is necessary as only credentials are in snake_case, not camelCase. Ref #8601
if (config.credentials) {
config.credentials = mapKeys(config.credentials, (_value, key) => snakeCase(key));
}
this.$config = config;
const GCSStorage = require('@google-cloud/storage').Storage;
this.$driver = new GCSStorage(config);
this.$bucket = this.$driver.bucket(config.bucket);
this.$root = config.root ? normalize(config.root).replace(/^\//, '') : '';
}
/**
* Prefixes the given filePath with the storage root location
*/
protected _fullPath(filePath: string): string {
return normalize(path.join(this.$root, filePath));
}
private _file(filePath: string): File {
return this.$bucket.file(this._fullPath(filePath));
}
/**
* Copy a file to a location.
*/
public async copy(src: string, dest: string): Promise<Response> {
const srcFile = this._file(src);
const destFile = this._file(dest);
try {
const result = await srcFile.copy(destFile);
return { raw: result };
} catch (e: any) {
throw handleError(e, src);
}
}
/**
* Delete existing file.
*/
public async delete(location: string): Promise<DeleteResponse> {
try {
const result = await this._file(location).delete();
return { raw: result, wasDeleted: true };
} catch (e: any) {
const error = handleError(e, location);
if (error instanceof FileNotFound) {
return { raw: undefined, wasDeleted: false };
}
throw error;
}
}
/**
* Returns the driver.
*/
public driver(): GCSDriver {
return this.$driver;
}
/**
* Determines if a file or folder already exists.
*/
public async exists(location: string): Promise<ExistsResponse> {
try {
const result = await this._file(location).exists();
return { exists: result[0], raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns the file contents.
*/
public async get(location: string, encoding: BufferEncoding = 'utf-8'): Promise<ContentResponse<string>> {
try {
const result = await this._file(location).download();
return { content: result[0].toString(encoding), raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns the file contents as Buffer.
*/
public async getBuffer(location: string): Promise<ContentResponse<Buffer>> {
try {
const result = await this._file(location).download();
return { content: result[0], raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns signed url for an existing file.
*/
public async getSignedUrl(location: string, options: SignedUrlOptions = {}): Promise<SignedUrlResponse> {
const { expiry = 900 } = options;
try {
const result = await this._file(location).getSignedUrl({
action: 'read',
expires: Date.now() + expiry * 1000,
});
return { signedUrl: result[0], raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns file's size and modification date.
*/
public async getStat(location: string): Promise<StatResponse> {
try {
const result = await this._file(location).getMetadata();
return {
size: Number(result[0].size),
modified: new Date(result[0].updated),
raw: result,
};
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns the stream for the given file.
*/
public getStream(location: string, range?: Range): NodeJS.ReadableStream {
return this._file(location).createReadStream({ start: range?.start, end: range?.end });
}
/**
* Returns URL for a given location. Note this method doesn't
* validates the existence of file or it's visibility
* status.
*/
public getUrl(location: string): string {
return `https://storage.googleapis.com/${this.$bucket.name}/${this._fullPath(location)}`;
}
/**
* Move file to a new location.
*/
public async move(src: string, dest: string): Promise<Response> {
const srcFile = this._file(src);
const destFile = this._file(dest);
try {
const result = await srcFile.move(destFile);
return { raw: result };
} catch (e: any) {
throw handleError(e, src);
}
}
/**
* Creates a new file.
* This method will create missing directories on the fly.
*/
public async put(location: string, content: Buffer | NodeJS.ReadableStream | string): Promise<Response> {
const file = this._file(location);
try {
if (isReadableStream(content)) {
const destStream = file.createWriteStream({ resumable: false });
await pipeline(content, destStream);
return { raw: undefined };
}
const result = await file.save(content, { resumable: false });
return { raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Iterate over all files in the bucket.
*/
public async *flatList(prefix = ''): AsyncIterable<FileListResponse> {
prefix = this._fullPath(prefix);
let nextQuery: GetFilesOptions | undefined = {
prefix,
autoPaginate: false,
maxResults: 1000,
};
do {
try {
const result = (await this.$bucket.getFiles(nextQuery)) as GetFilesResponse;
nextQuery = result[1];
for (const file of result[0]) {
yield {
raw: file.metadata,
path: file.name.substring(this.$root.length),
};
}
} catch (e: any) {
throw handleError(e, prefix);
}
} while (nextQuery);
}
}
export interface GoogleCloudStorageConfig extends StorageOptions {
bucket: string;
root?: string;
}


@@ -1 +0,0 @@
export * from './GoogleCloudStorage';


@@ -1 +0,0 @@
storage


@@ -1,23 +0,0 @@
import { StorageManager } from '@directus/drive';
import { GoogleCloudStorage, GoogleCloudStorageConfig } from '../src';
describe('drive', function () {
it('Instantiate', function () {
const storage = new StorageManager({
default: 'gcs',
disks: {
remote: {
driver: 'gcs',
config: {
bucket: 'bucket',
} as GoogleCloudStorageConfig,
},
},
});
storage.registerDriver('gcs', GoogleCloudStorage);
const disk = storage.disk('remote');
expect(disk).toBeInstanceOf(GoogleCloudStorage);
});
});


@@ -1,7 +0,0 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"rootDir": ".."
},
"include": [".", "../src"]
}


@@ -1,32 +0,0 @@
{
"compilerOptions": {
"target": "ES2018",
"lib": ["ES2018"],
"module": "CommonJS",
"moduleResolution": "node",
"declaration": true,
"declarationMap": true,
"strict": true,
"noFallthroughCasesInSwitch": true,
"esModuleInterop": true,
"noImplicitAny": true,
"noImplicitThis": true,
"noImplicitReturns": true,
"noUnusedLocals": true,
"noUncheckedIndexedAccess": true,
"noUnusedParameters": true,
"alwaysStrict": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"strictBindCallApply": true,
"strictPropertyInitialization": true,
"resolveJsonModule": false,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"allowSyntheticDefaultImports": true,
"isolatedModules": true,
"rootDir": "./src",
"outDir": "./dist"
},
"include": ["./src/**/*.ts"]
}


@@ -1,12 +0,0 @@
root=true
[*]
end_of_line = lf
insert_final_newline = true
charset = utf-8
indent_style = tab
trim_trailing_whitespace = true
[{package.json,*.yml,*.yaml}]
indent_style = space
indent_size = 2


@@ -1,10 +0,0 @@
node_modules
/_*
/coverage
/dist*
/src/**/*.js
/*.d.ts
/*.d.ts.map
*.tgz


@@ -1,14 +0,0 @@
Copyright 2020 - Slynova Romain Lanz
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit
persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -1,9 +0,0 @@
require('dotenv').config();
module.exports = {
preset: 'ts-jest',
verbose: true,
setupFiles: ['dotenv/config'],
testURL: process.env.TEST_URL || 'http://localhost',
collectCoverageFrom: ['src/**/*.ts'],
};


@@ -1,52 +0,0 @@
{
"name": "@directus/drive-s3",
"version": "9.21.2",
"description": "AWS S3 driver for @directus/drive",
"homepage": "https://directus.io",
"bugs": {
"url": "https://github.com/directus/directus/issues"
},
"repository": {
"type": "git",
"url": "https://github.com/directus/directus.git",
"directory": "packages/drive-s3"
},
"funding": "https://github.com/directus/directus?sponsor=1",
"license": "MIT",
"author": "Robin Grundvåg <robgru52@gmail.com>",
"contributors": [
"Rijk van Zanten <rijkvanzanten@me.com>"
],
"exports": {
".": "./dist/index.js",
"./package.json": "./package.json"
},
"main": "dist/index.js",
"files": [
"dist",
"!**/*.d.ts?(.map)"
],
"scripts": {
"build": "tsc --project ./tsconfig.json",
"test:watch": "jest --coverage --watchAll",
"test": "jest --coverage",
"dev": "pnpm build -w --preserveWatchOutput --incremental"
},
"dependencies": {
"@directus/drive": "workspace:*",
"aws-sdk": "2.1256.0",
"normalize-path": "3.0.0"
},
"devDependencies": {
"@types/jest": "29.2.3",
"@types/node": "18.11.9",
"@types/normalize-path": "3.0.0",
"dotenv": "16.0.3",
"jest": "29.3.1",
"ts-jest": "29.0.3",
"typescript": "4.9.3"
},
"publishConfig": {
"access": "public"
}
}


@@ -1,3 +0,0 @@
# @directus/drive-s3
S3 storage layer for `@directus/drive`


@@ -1,320 +0,0 @@
import S3, { ClientConfiguration, ObjectList } from 'aws-sdk/clients/s3';
import {
Storage,
UnknownException,
NoSuchBucket,
FileNotFound,
PermissionMissing,
SignedUrlOptions,
Response,
ExistsResponse,
ContentResponse,
SignedUrlResponse,
StatResponse,
FileListResponse,
DeleteResponse,
Range,
} from '@directus/drive';
import path from 'path';
import normalize from 'normalize-path';
function handleError(err: Error, path: string, bucket: string): Error {
switch (err.name) {
case 'NoSuchBucket':
return new NoSuchBucket(err, bucket);
case 'NoSuchKey':
return new FileNotFound(err, path);
case 'AllAccessDisabled':
return new PermissionMissing(err, path);
default:
return new UnknownException(err, err.name, path);
}
}
export class AmazonWebServicesS3Storage extends Storage {
protected $driver: S3;
protected $bucket: string;
protected $root: string;
protected $acl?: string;
protected $serverSideEncryption?: string;
constructor(config: AmazonWebServicesS3StorageConfig) {
super();
// S3 is already imported at the top of the module; avoid a redundant, shadowing require.
this.$driver = new S3({
accessKeyId: config.key,
secretAccessKey: config.secret,
...config,
});
this.$bucket = config.bucket;
this.$root = config.root ? normalize(config.root).replace(/^\//, '') : '';
this.$acl = config.acl;
this.$serverSideEncryption = config.serverSideEncryption;
}
/**
* Prefixes the given filePath with the storage root location
*/
protected _fullPath(filePath: string): string {
return normalize(path.join(this.$root, filePath));
}
/**
* Copy a file to a location.
*/
public async copy(src: string, dest: string): Promise<Response> {
src = this._fullPath(src);
dest = this._fullPath(dest);
const params = {
Key: dest,
Bucket: this.$bucket,
CopySource: `/${this.$bucket}/${src}`,
ACL: this.$acl,
ServerSideEncryption: this.$serverSideEncryption,
};
try {
const result = await this.$driver.copyObject(params).promise();
return { raw: result };
} catch (e: any) {
throw handleError(e, src, this.$bucket);
}
}
/**
* Delete existing file.
*/
public async delete(location: string): Promise<DeleteResponse> {
location = this._fullPath(location);
const params = { Key: location, Bucket: this.$bucket };
try {
const result = await this.$driver.deleteObject(params).promise();
// Amazon does not inform the client if anything was deleted.
return { raw: result, wasDeleted: null };
} catch (e: any) {
throw handleError(e, location, this.$bucket);
}
}
/**
* Returns the driver.
*/
public driver(): S3 {
return this.$driver;
}
/**
* Determines if a file or folder already exists.
*/
public async exists(location: string): Promise<ExistsResponse> {
location = this._fullPath(location);
const params = { Key: location, Bucket: this.$bucket };
try {
const result = await this.$driver.headObject(params).promise();
return { exists: true, raw: result };
} catch (e: any) {
if (e.statusCode === 404) {
return { exists: false, raw: e };
} else {
throw handleError(e, location, this.$bucket);
}
}
}
/**
* Returns the file contents.
*/
public async get(location: string, encoding: BufferEncoding = 'utf-8'): Promise<ContentResponse<string>> {
const bufferResult = await this.getBuffer(location);
return {
content: bufferResult.content.toString(encoding),
raw: bufferResult.raw,
};
}
/**
* Returns the file contents as Buffer.
*/
public async getBuffer(location: string): Promise<ContentResponse<Buffer>> {
location = this._fullPath(location);
const params = { Key: location, Bucket: this.$bucket };
try {
const result = await this.$driver.getObject(params).promise();
// S3.getObject returns a Buffer in Node.js
const body = result.Body as Buffer;
return { content: body, raw: result };
} catch (e: any) {
throw handleError(e, location, this.$bucket);
}
}
/**
* Returns signed url for an existing file
*/
public async getSignedUrl(location: string, options: SignedUrlOptions = {}): Promise<SignedUrlResponse> {
location = this._fullPath(location);
const { expiry = 900 } = options;
try {
const params = {
Key: location,
Bucket: this.$bucket,
Expires: expiry,
};
const result = await this.$driver.getSignedUrlPromise('getObject', params);
return { signedUrl: result, raw: result };
} catch (e: any) {
throw handleError(e, location, this.$bucket);
}
}
/**
* Returns file's size and modification date.
*/
public async getStat(location: string): Promise<StatResponse> {
location = this._fullPath(location);
const params = { Key: location, Bucket: this.$bucket };
try {
const result = await this.$driver.headObject(params).promise();
return {
size: result.ContentLength as number,
modified: result.LastModified as Date,
raw: result,
};
} catch (e: any) {
throw handleError(e, location, this.$bucket);
}
}
/**
* Returns the stream for the given file.
*/
public getStream(location: string, range?: Range): NodeJS.ReadableStream {
location = this._fullPath(location);
const params: S3.GetObjectRequest = {
Key: location,
Bucket: this.$bucket,
Range: range ? `bytes=${range.start}-${range.end || ''}` : undefined,
};
return this.$driver.getObject(params).createReadStream();
}
/**
* Returns url for a given key.
*/
public getUrl(location: string): string {
location = this._fullPath(location);
const { href } = this.$driver.endpoint;
if (href.startsWith('https://s3.amazonaws')) {
return `https://${this.$bucket}.s3.amazonaws.com/${location}`;
}
return `${href}${this.$bucket}/${location}`;
}
/**
* Moves file from one location to another. This
* method will call `copy` and `delete` under
* the hood.
*/
public async move(src: string, dest: string): Promise<Response> {
src = this._fullPath(src);
dest = this._fullPath(dest);
await this.copy(src, dest);
await this.delete(src);
return { raw: undefined };
}
/**
* Creates a new file.
* This method will create missing directories on the fly.
*/
public async put(
location: string,
content: Buffer | NodeJS.ReadableStream | string,
type?: string
): Promise<Response> {
location = this._fullPath(location);
const params = {
Key: location,
Body: content,
Bucket: this.$bucket,
ACL: this.$acl,
ContentType: type ? type : '',
ServerSideEncryption: this.$serverSideEncryption,
};
try {
const result = await this.$driver.upload(params).promise();
return { raw: result };
} catch (e: any) {
throw handleError(e, location, this.$bucket);
}
}
/**
* Iterate over all files in the bucket.
*/
public async *flatList(prefix = ''): AsyncIterable<FileListResponse> {
prefix = this._fullPath(prefix);
let continuationToken: string | undefined;
do {
try {
const response = await this.$driver
.listObjectsV2({
Bucket: this.$bucket,
Prefix: prefix,
ContinuationToken: continuationToken,
MaxKeys: 1000,
})
.promise();
continuationToken = response.NextContinuationToken;
for (const file of response.Contents as ObjectList) {
// Avoid shadowing the imported `path` module.
const filePath = file.Key as string;
yield {
raw: file,
path: filePath.substring(this.$root.length),
};
}
} catch (e: any) {
throw handleError(e, prefix, this.$bucket);
}
} while (continuationToken);
}
}
export interface AmazonWebServicesS3StorageConfig extends ClientConfiguration {
key: string;
secret: string;
bucket: string;
root?: string;
acl?: string;
serverSideEncryption?: string;
}
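The `flatList` generator above keeps requesting pages from `listObjectsV2` until S3 stops returning a continuation token. A minimal self-contained sketch of that paging loop, with a hypothetical in-memory `listPage` standing in for the AWS call:

```typescript
// Hypothetical in-memory pages standing in for S3 listObjectsV2 responses.
type Page = { Contents: { Key: string }[]; NextContinuationToken?: string };

const pages: Record<string, Page> = {
	start: { Contents: [{ Key: 'a.txt' }, { Key: 'b.txt' }], NextContinuationToken: 'p2' },
	p2: { Contents: [{ Key: 'c.txt' }] },
};

// Stand-in for $driver.listObjectsV2(...).promise().
async function listPage(token: string): Promise<Page> {
	return pages[token];
}

// Same shape as flatList: request pages until no continuation token remains.
async function collectKeys(): Promise<string[]> {
	const keys: string[] = [];
	let token: string | undefined = 'start';
	do {
		const page: Page = await listPage(token);
		token = page.NextContinuationToken;
		for (const file of page.Contents) keys.push(file.Key);
	} while (token);
	return keys;
}

collectKeys().then((keys) => console.log(keys.join(',')));
```

The `do…while` mirrors the driver: the first request always fires, and the loop ends only once a response omits `NextContinuationToken`.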


@@ -1 +0,0 @@
export * from './AmazonWebServicesS3Storage';


@@ -1 +0,0 @@
storage


@@ -1,25 +0,0 @@
import { StorageManager } from '@directus/drive';
import { AmazonWebServicesS3Storage, AmazonWebServicesS3StorageConfig } from '../src';
describe('drive', function () {
it('Instantiate', function () {
const storage = new StorageManager({
default: 's3',
disks: {
remote: {
driver: 's3',
config: {
bucket: 'bucket',
key: 'key',
secret: 'secret',
} as AmazonWebServicesS3StorageConfig,
},
},
});
storage.registerDriver('s3', AmazonWebServicesS3Storage);
const disk = storage.disk('remote');
expect(disk).toBeInstanceOf(AmazonWebServicesS3Storage);
});
});


@@ -1,7 +0,0 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"rootDir": ".."
},
"include": [".", "../src"]
}


@@ -1,32 +0,0 @@
{
"compilerOptions": {
"target": "ES2018",
"lib": ["ES2018"],
"module": "CommonJS",
"moduleResolution": "node",
"declaration": true,
"declarationMap": true,
"strict": true,
"noFallthroughCasesInSwitch": true,
"esModuleInterop": true,
"noImplicitAny": true,
"noImplicitThis": true,
"noImplicitReturns": true,
"noUnusedLocals": true,
"noUncheckedIndexedAccess": true,
"noUnusedParameters": true,
"alwaysStrict": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"strictBindCallApply": true,
"strictPropertyInitialization": true,
"resolveJsonModule": false,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"allowSyntheticDefaultImports": true,
"isolatedModules": true,
"rootDir": "./src",
"outDir": "./dist"
},
"include": ["./src/**/*.ts"]
}


@@ -1,12 +0,0 @@
root=true
[*]
end_of_line = lf
insert_final_newline = true
charset = utf-8
indent_style = tab
trim_trailing_whitespace = true
[{package.json,*.yml,*.yaml}]
indent_style = space
indent_size = 2


@@ -1,10 +0,0 @@
node_modules
/_*
/coverage
/dist*
/src/**/*.js
/*.d.ts
/*.d.ts.map
*.tgz


@@ -1,14 +0,0 @@
Copyright 2020 - Slynova Romain Lanz
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit
persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the
Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@@ -1,9 +0,0 @@
require('dotenv').config();
module.exports = {
preset: 'ts-jest',
verbose: true,
setupFiles: ['dotenv/config'],
testURL: process.env.TEST_URL || 'http://localhost',
collectCoverageFrom: ['src/**/*.ts'],
};


@@ -1,50 +0,0 @@
{
"name": "@directus/drive",
"version": "9.21.2",
"description": "Object storage abstraction layer for Directus",
"homepage": "https://directus.io",
"bugs": {
"url": "https://github.com/directus/directus/issues"
},
"repository": {
"type": "git",
"url": "https://github.com/directus/directus.git",
"directory": "packages/drive"
},
"funding": "https://github.com/directus/directus?sponsor=1",
"license": "MIT",
"contributors": [
"Rijk van Zanten <rijkvanzanten@me.com>"
],
"exports": {
".": "./dist/index.js",
"./package.json": "./package.json"
},
"main": "dist/index.js",
"files": [
"dist",
"!**/*.d.ts?(.map)"
],
"scripts": {
"build": "tsc --project ./tsconfig.json",
"test:watch": "jest --coverage --watchAll",
"test": "jest --coverage",
"dev": "pnpm build -w --preserveWatchOutput --incremental"
},
"dependencies": {
"fs-extra": "10.1.0",
"node-exceptions": "4.0.1"
},
"devDependencies": {
"@types/fs-extra": "9.0.13",
"@types/jest": "29.2.0",
"@types/node": "18.11.2",
"dotenv": "16.0.3",
"jest": "29.3.1",
"ts-jest": "29.0.3",
"typescript": "4.9.3"
},
"publishConfig": {
"access": "public"
}
}


@@ -1,8 +0,0 @@
# @directus/drive
Object storage abstraction layer.
## Credits
This package started life as a fork of [`@slynova/flydrive`](https://github.com/slynova-org/flydrive) by Robin Grundvåg
<robgru52@gmail.com>.


@@ -1,243 +0,0 @@
import * as fse from 'fs-extra';
import { dirname, join, resolve, relative, sep } from 'path';
import Storage from './Storage';
import { isReadableStream, pipeline } from './utils';
import { FileNotFound, UnknownException, PermissionMissing } from './exceptions';
import {
Response,
ExistsResponse,
ContentResponse,
StatResponse,
FileListResponse,
DeleteResponse,
Range,
} from './types';
function handleError(err: Error & { code: string; path?: string }, location: string): Error {
switch (err.code) {
case 'ENOENT':
return new FileNotFound(err, location);
case 'EPERM':
return new PermissionMissing(err, location);
default:
return new UnknownException(err, err.code, location);
}
}
export class LocalFileSystemStorage extends Storage {
private $root: string;
constructor(config: LocalFileSystemStorageConfig) {
super();
this.$root = resolve(config.root);
}
/**
* Returns full path relative to the storage's root directory.
*/
private _fullPath(relativePath: string): string {
return join(this.$root, join(sep, relativePath));
}
/**
* Appends content to a file.
*/
public async append(location: string, content: Buffer | string): Promise<Response> {
try {
const result = await fse.appendFile(this._fullPath(location), content);
return { raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Copy a file to a location.
*/
public async copy(src: string, dest: string): Promise<Response> {
try {
const result = await fse.copy(this._fullPath(src), this._fullPath(dest));
return { raw: result };
} catch (e: any) {
throw handleError(e, `${src} -> ${dest}`);
}
}
/**
* Delete existing file.
*/
public async delete(location: string): Promise<DeleteResponse> {
try {
const result = await fse.unlink(this._fullPath(location));
return { raw: result, wasDeleted: true };
} catch (e: any) {
const error = handleError(e, location);
if (error instanceof FileNotFound) {
return { raw: undefined, wasDeleted: false };
}
throw error;
}
}
/**
* Returns the driver.
*/
public driver(): typeof fse {
return fse;
}
/**
* Determines if a file or folder already exists.
*/
public async exists(location: string): Promise<ExistsResponse> {
try {
const result = await fse.pathExists(this._fullPath(location));
return { exists: result, raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns the file contents as string.
*/
public async get(location: string, encoding: BufferEncoding = 'utf-8'): Promise<ContentResponse<string>> {
try {
const result = await fse.readFile(this._fullPath(location), encoding);
return { content: result, raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns the file contents as Buffer.
*/
public async getBuffer(location: string): Promise<ContentResponse<Buffer>> {
try {
const result = await fse.readFile(this._fullPath(location));
return { content: result, raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns file size in bytes.
*/
public async getStat(location: string): Promise<StatResponse> {
try {
const stat = await fse.stat(this._fullPath(location));
return {
size: stat.size,
modified: stat.mtime,
raw: stat,
};
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* Returns a read stream for a file location.
*/
public getStream(location: string, range?: Range): NodeJS.ReadableStream {
return fse.createReadStream(this._fullPath(location), {
start: range?.start,
end: range?.end,
});
}
/**
* Move file to a new location.
*/
public async move(src: string, dest: string): Promise<Response> {
try {
const result = await fse.move(this._fullPath(src), this._fullPath(dest));
return { raw: result };
} catch (e: any) {
throw handleError(e, `${src} -> ${dest}`);
}
}
/**
* Prepends content to a file.
*/
public async prepend(location: string, content: Buffer | string): Promise<Response> {
try {
const { content: actualContent } = await this.get(location, 'utf-8');
return this.put(location, `${content}${actualContent}`);
} catch (e: any) {
if (e instanceof FileNotFound) {
return this.put(location, content);
}
throw e;
}
}
/**
* Creates a new file.
* This method will create missing directories on the fly.
*/
public async put(location: string, content: Buffer | NodeJS.ReadableStream | string): Promise<Response> {
const fullPath = this._fullPath(location);
try {
if (isReadableStream(content)) {
const dir = dirname(fullPath);
await fse.ensureDir(dir);
const ws = fse.createWriteStream(fullPath);
await pipeline(content, ws);
return { raw: undefined };
}
const result = await fse.outputFile(fullPath, content);
return { raw: result };
} catch (e: any) {
throw handleError(e, location);
}
}
/**
* List files with a given prefix.
*/
public flatList(prefix = ''): AsyncIterable<FileListResponse> {
const fullPrefix = this._fullPath(prefix);
return this._flatDirIterator(fullPrefix, prefix);
}
private async *_flatDirIterator(prefix: string, originalPrefix: string): AsyncIterable<FileListResponse> {
const prefixDirectory = prefix[prefix.length - 1] === sep ? prefix : dirname(prefix);
try {
const dir = await fse.opendir(prefixDirectory);
for await (const file of dir) {
const fileName = join(prefixDirectory, file.name);
if (fileName.toLowerCase().startsWith(prefix.toLowerCase())) {
if (file.isDirectory()) {
yield* this._flatDirIterator(join(fileName, sep), originalPrefix);
} else if (file.isFile()) {
const path = relative(this.$root, fileName);
yield {
raw: null,
path,
};
}
}
}
} catch (e: any) {
if (e.code !== 'ENOENT') {
throw handleError(e, originalPrefix);
}
}
}
}
export type LocalFileSystemStorageConfig = {
root: string;
};
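`_fullPath` in the local driver joins the path onto `sep` before joining onto the root, which collapses any leading `../` segments before they can escape the configured directory. A small self-contained sketch of why that works (the `/srv/storage` root is made up for illustration):

```typescript
import { join, resolve, sep } from 'path';

// Hypothetical root directory for illustration.
const root = resolve(join(sep, 'srv', 'storage'));

// Same shape as the driver's _fullPath: join(sep, ...) first, so "../"
// runs are normalized against the filesystem root, not against $root.
const fullPath = (relativePath: string) => join(root, join(sep, relativePath));

console.log(fullPath('a/b.txt'));          // stays under root
console.log(fullPath('../../etc/passwd')); // also stays under root
```

Because `join(sep, '../../etc/passwd')` normalizes to `/etc/passwd` before the root is prepended, the traversal attempt resolves to a path inside the root instead of outside it.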


@@ -1,155 +0,0 @@
import { MethodNotSupported } from './exceptions';
import {
Response,
SignedUrlResponse,
ContentResponse,
ExistsResponse,
SignedUrlOptions,
StatResponse,
FileListResponse,
DeleteResponse,
Range,
} from './types';
export default abstract class Storage {
/**
* Appends content to a file.
*
* Supported drivers: "local"
*/
append(_location: string, _content: Buffer | string): Promise<Response> {
throw new MethodNotSupported('append', this.constructor.name);
}
/**
* Copy a file to a location.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
copy(_src: string, _dest: string): Promise<Response> {
throw new MethodNotSupported('copy', this.constructor.name);
}
/**
* Delete existing file.
* The value returned by this method will have a `wasDeleted` property that
* can be either a boolean (`true` if a file was deleted, `false` if there was
* no file to delete) or `null` (if no information about the file is available).
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
delete(_location: string): Promise<DeleteResponse> {
throw new MethodNotSupported('delete', this.constructor.name);
}
/**
* Returns the driver.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
public driver(): unknown {
throw new MethodNotSupported('driver', this.constructor.name);
}
/**
* Determines if a file or folder already exists.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
exists(_location: string): Promise<ExistsResponse> {
throw new MethodNotSupported('exists', this.constructor.name);
}
/**
* Returns the file contents as a string.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
get(_location: string, _encoding?: string): Promise<ContentResponse<string>> {
throw new MethodNotSupported('get', this.constructor.name);
}
/**
* Returns the file contents as a Buffer.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
getBuffer(_location: string): Promise<ContentResponse<Buffer>> {
throw new MethodNotSupported('getBuffer', this.constructor.name);
}
/**
* Returns signed url for an existing file.
*
* Supported drivers: "s3", "gcs", "azure"
*/
getSignedUrl(_location: string, _options?: SignedUrlOptions): Promise<SignedUrlResponse> {
throw new MethodNotSupported('getSignedUrl', this.constructor.name);
}
/**
* Returns file's size and modification date.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
getStat(_location: string): Promise<StatResponse> {
throw new MethodNotSupported('getStat', this.constructor.name);
}
/**
* Returns the stream for the given file.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
getStream(_location: string, _range?: Range): NodeJS.ReadableStream {
throw new MethodNotSupported('getStream', this.constructor.name);
}
/**
* Returns the URL for a given key. Note this method doesn't
* validate the existence of the file or its visibility status.
*
* Supported drivers: "s3", "gcs", "azure"
*/
getUrl(_location: string): string {
throw new MethodNotSupported('getUrl', this.constructor.name);
}
/**
* Move file to a new location.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
move(_src: string, _dest: string): Promise<Response> {
throw new MethodNotSupported('move', this.constructor.name);
}
/**
* Creates a new file.
* This method will create missing directories on the fly.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
put(_location: string, _content: Buffer | NodeJS.ReadableStream | string, _type?: string): Promise<Response> {
throw new MethodNotSupported('put', this.constructor.name);
}
/**
* Prepends content to a file.
*
* Supported drivers: "local"
*/
prepend(_location: string, _content: Buffer | string): Promise<Response> {
throw new MethodNotSupported('prepend', this.constructor.name);
}
/**
* List files with a given prefix.
*
* Supported drivers: "local", "s3", "gcs", "azure"
*/
flatList(_prefix?: string): AsyncIterable<FileListResponse> {
throw new MethodNotSupported('flatList', this.constructor.name);
}
}
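The base class above gives every method a `MethodNotSupported` fallback, so a driver only overrides what its backend supports and unimplemented calls fail with a message naming the concrete driver. A self-contained sketch of that pattern (class names here are stand-ins, not the real exports):

```typescript
// Minimal stand-in for the MethodNotSupported exception.
class MethodNotSupported extends Error {
	constructor(name: string, driver: string) {
		super(`Method ${name} is not supported for the driver ${driver}`);
	}
}

abstract class BaseStorage {
	// Default implementation: this.constructor.name reports the subclass.
	getUrl(_location: string): string {
		throw new MethodNotSupported('getUrl', this.constructor.name);
	}
}

// A driver that never overrides getUrl.
class MemoryStorage extends BaseStorage {}

let message = '';
try {
	new MemoryStorage().getUrl('file.txt');
} catch (e) {
	message = (e as Error).message;
}
console.log(message);
```

Since the fallback throws from the base class but reads `this.constructor.name`, the error message names `MemoryStorage`, not `BaseStorage`.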


@@ -1,108 +0,0 @@
import { LocalFileSystemStorage } from './LocalFileSystemStorage';
import Storage from './Storage';
import { InvalidConfig, DriverNotSupported } from './exceptions';
import { StorageManagerConfig, StorageManagerDiskConfig, StorageManagerSingleDiskConfig } from './types';
interface StorageConstructor<T extends Storage = Storage> {
new (...args: any[]): T;
}
export default class StorageManager {
/**
* Default disk.
*/
private defaultDisk: string | undefined;
/**
* Configured disks.
*/
private disksConfig: StorageManagerDiskConfig;
/**
* Instantiated disks.
*/
private _disks: Map<string, Storage> = new Map();
/**
* List of available drivers.
*/
private _drivers: Map<string, StorageConstructor<Storage>> = new Map();
constructor(config: StorageManagerConfig) {
this.defaultDisk = config.default;
this.disksConfig = config.disks || {};
this.registerDriver('local', LocalFileSystemStorage);
}
/**
* Get the instantiated disks
*/
getDisks(): Map<string, Storage> {
return this._disks;
}
/**
* Get the registered drivers
*/
getDrivers(): Map<string, StorageConstructor<Storage>> {
return this._drivers;
}
/**
* Get a disk instance.
*/
disk<T extends Storage = Storage>(name?: string): T {
name = name || this.defaultDisk;
/**
* No name was given and no default disk is configured.
*/
if (!name) {
throw InvalidConfig.missingDiskName();
}
if (this._disks.has(name)) {
return this._disks.get(name) as T;
}
const diskConfig = this.disksConfig[name];
/**
* Configuration for the defined disk is missing
*/
if (!diskConfig) {
throw InvalidConfig.missingDiskConfig(name);
}
/**
* There is no driver defined on disk configuration
*/
if (!diskConfig.driver) {
throw InvalidConfig.missingDiskDriver(name);
}
const Driver = this._drivers.get(diskConfig.driver);
if (!Driver) {
throw DriverNotSupported.driver(diskConfig.driver);
}
const disk = new Driver(diskConfig.config);
this._disks.set(name, disk);
return disk as T;
}
addDisk(name: string, config: StorageManagerSingleDiskConfig): void {
if (this.disksConfig[name]) {
throw InvalidConfig.duplicateDiskName(name);
}
this.disksConfig[name] = config;
}
/**
* Register a custom driver.
*/
registerDriver<T extends Storage>(name: string, driver: StorageConstructor<T>): void {
this._drivers.set(name, driver);
}
}
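`disk()` lazily instantiates a driver from its config the first time a disk name is requested, then serves the cached instance on later calls. A minimal self-contained sketch of that resolution flow (the `FakeStorage` driver and config shape are made up for illustration):

```typescript
// Hypothetical driver class standing in for a Storage implementation.
class FakeStorage {
	constructor(public config: { root: string }) {}
}

type DriverCtor = new (config: any) => FakeStorage;

class MiniStorageManager {
	private disks = new Map<string, FakeStorage>();
	private drivers = new Map<string, DriverCtor>();

	constructor(private disksConfig: Record<string, { driver: string; config: any }>) {}

	registerDriver(name: string, driver: DriverCtor): void {
		this.drivers.set(name, driver);
	}

	// Same flow as StorageManager#disk: cache hit, else config -> driver -> new instance.
	disk(name: string): FakeStorage {
		const cached = this.disks.get(name);
		if (cached) return cached;
		const diskConfig = this.disksConfig[name];
		if (!diskConfig) throw new Error(`Missing config for disk ${name}`);
		const Driver = this.drivers.get(diskConfig.driver);
		if (!Driver) throw new Error(`Driver ${diskConfig.driver} is not supported`);
		const instance = new Driver(diskConfig.config);
		this.disks.set(name, instance);
		return instance;
	}
}

const manager = new MiniStorageManager({ local: { driver: 'fake', config: { root: '/tmp/files' } } });
manager.registerDriver('fake', FakeStorage);
const first = manager.disk('local');
const second = manager.disk('local');
console.log(first === second); // cached: same instance both times
```

The cache means each disk's driver is constructed at most once per manager, which is why the real class seeds `registerDriver('local', …)` in its constructor before any disk is resolved.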


@@ -1,9 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class AuthorizationRequired extends RuntimeException {
raw: Error;
constructor(err: Error, path: string) {
super(`Unauthorized to access file ${path}\n${err.message}`, 500, 'E_AUTHORIZATION_REQUIRED');
this.raw = err;
}
}


@@ -1,17 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class DriverNotSupported extends RuntimeException {
public driver!: string;
private constructor(message: string, code?: number) {
super(message, code);
}
public static driver(name: string): DriverNotSupported {
const exception = new this(`Driver ${name} is not supported`, 400);
exception.driver = name;
return exception;
}
}


@@ -1,9 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class FileNotFound extends RuntimeException {
raw: Error;
constructor(err: Error, path: string) {
super(`The file ${path} doesn't exist\n${err.message}`, 500, 'E_FILE_NOT_FOUND');
this.raw = err;
}
}


@@ -1,23 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class InvalidConfig extends RuntimeException {
private constructor(message: string, status?: number, code?: string) {
super(message, status, code);
}
public static missingDiskName(): InvalidConfig {
return new this('Make sure to define a default disk name inside config file', 500, 'E_INVALID_CONFIG');
}
public static missingDiskConfig(name: string): InvalidConfig {
return new this(`Make sure to define config for ${name} disk`, 500, 'E_INVALID_CONFIG');
}
public static missingDiskDriver(name: string): InvalidConfig {
return new this(`Make sure to define driver for ${name} disk`, 500, 'E_INVALID_CONFIG');
}
public static duplicateDiskName(name: string): InvalidConfig {
return new this(`A disk named ${name} is already defined`, 500, 'E_INVALID_CONFIG');
}
}


@@ -1,7 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class MethodNotSupported extends RuntimeException {
constructor(name: string, driver: string) {
super(`Method ${name} is not supported for the driver ${driver}`, 500, 'E_METHOD_NOT_SUPPORTED');
}
}


@@ -1,9 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class NoSuchBucket extends RuntimeException {
raw: Error;
constructor(err: Error, bucket: string) {
super(`The bucket ${bucket} doesn't exist\n${err.message}`, 500, 'E_NO_SUCH_BUCKET');
this.raw = err;
}
}


@@ -1,9 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class PermissionMissing extends RuntimeException {
raw: Error;
constructor(err: Error, path: string) {
super(`Missing permission for file ${path}\n${err.message}`, 500, 'E_PERMISSION_MISSING');
this.raw = err;
}
}


@@ -1,17 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class UnknownException extends RuntimeException {
raw: Error;
constructor(err: Error, errorCode: string, path: string) {
super(
`An unknown error happened with the file ${path}.
Error code: ${errorCode}
Original stack:
${err.stack}`,
500,
'E_UNKNOWN'
);
this.raw = err;
}
}


@@ -1,9 +0,0 @@
import { RuntimeException } from 'node-exceptions';
export class WrongKeyPath extends RuntimeException {
raw: Error;
constructor(err: Error, path: string) {
super(`The key path does not exist: ${path}\n${err.message}`, 500, 'E_WRONG_KEY_PATH');
this.raw = err;
}
}


@@ -1,9 +0,0 @@
export * from './AuthorizationRequired';
export * from './DriverNotSupported';
export * from './FileNotFound';
export * from './InvalidConfig';
export * from './MethodNotSupported';
export * from './NoSuchBucket';
export * from './PermissionMissing';
export * from './UnknownException';
export * from './WrongKeyPath';


@@ -1,7 +0,0 @@
export { default as Storage } from './Storage';
export { default as StorageManager } from './StorageManager';
export { LocalFileSystemStorage } from './LocalFileSystemStorage';
export * from './exceptions';
export * from './utils';
export * from './types';

@@ -1,68 +0,0 @@
import { LocalFileSystemStorageConfig } from './LocalFileSystemStorage';
export type { LocalFileSystemStorageConfig };
export type StorageManagerSingleDiskConfig =
| {
driver: 'local';
config: LocalFileSystemStorageConfig;
}
| {
driver: string;
config: unknown;
};
export interface StorageManagerDiskConfig {
[key: string]: StorageManagerSingleDiskConfig;
}
export interface StorageManagerConfig {
/**
* The default disk returned by `disk()`.
*/
default?: string;
disks?: StorageManagerDiskConfig;
}
export interface Response {
raw: unknown;
}
export interface ExistsResponse extends Response {
exists: boolean;
}
export interface ContentResponse<ContentType> extends Response {
content: ContentType;
}
export interface SignedUrlOptions {
/**
* Expiration time of the URL.
* It should be a number of seconds from now.
* @default `900` (15 minutes)
*/
expiry?: number;
}
export interface SignedUrlResponse extends Response {
signedUrl: string;
}
export interface StatResponse extends Response {
size: number;
modified: Date;
}
export interface FileListResponse extends Response {
path: string;
}
export interface DeleteResponse extends Response {
wasDeleted: boolean | null;
}
export interface Range {
start: number | undefined;
end: number | undefined;
}
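For reference, a config object matching these types might look like the following. The interfaces are inlined (and the disk-config union simplified) so the snippet is self-contained; the disk name and root path are illustrative, not taken from the PR:

```typescript
// Simplified inline copies of the types above.
interface StorageManagerSingleDiskConfig {
  driver: string;
  config: unknown;
}

interface StorageManagerConfig {
  default?: string;
  disks?: Record<string, StorageManagerSingleDiskConfig>;
}

// One local disk, selected as the default.
const config: StorageManagerConfig = {
  default: 'local',
  disks: {
    local: { driver: 'local', config: { root: './uploads' } },
  },
};

console.log(config.default); // "local"
```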

@@ -1,38 +0,0 @@
import { promisify } from 'util';
import { pipeline as nodePipeline } from 'stream';
import Storage from './Storage';
/**
* Returns a boolean indicating whether the given
* value is a readable stream.
*/
// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
export function isReadableStream(stream: any): stream is NodeJS.ReadableStream {
return (
stream !== null &&
typeof stream === 'object' &&
typeof stream.pipe === 'function' &&
typeof stream._read === 'function' &&
typeof stream._readableState === 'object' &&
stream.readable !== false
);
}
export const pipeline = promisify(nodePipeline);
export async function streamToString(stream: NodeJS.ReadableStream): Promise<string> {
const chunks: string[] = [];
stream.setEncoding('utf-8');
for await (const chunk of stream) {
chunks.push(chunk as string);
}
return chunks.join('');
}
export async function getFlatList(storage: Storage, prefix?: string): Promise<string[]> {
const result: string[] = [];
for await (const file of storage.flatList(prefix)) {
result.push(file.path);
}
return result;
}
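`streamToString` can be exercised with any readable stream. A quick sketch, inlining the helper exactly as defined above so it runs standalone:

```typescript
import { PassThrough } from 'node:stream';

// Inlined copy of the removed helper: drain a stream into a UTF-8 string.
async function streamToString(stream: NodeJS.ReadableStream): Promise<string> {
  const chunks: string[] = [];
  stream.setEncoding('utf-8');
  for await (const chunk of stream) {
    chunks.push(chunk as string);
  }
  return chunks.join('');
}

const stream = new PassThrough();
stream.end('hello world');

const content = await streamToString(stream);
console.log(content); // "hello world"
```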

@@ -1 +0,0 @@
storage

@@ -1,463 +0,0 @@
/**
* @slynova/flydrive
*
* @license MIT
* @copyright Slynova - Romain Lanz <romain.lanz@slynova.ch>
*/
import path from 'path';
jest.mock('fs-extra', () => {
const fse = jest.requireActual('fs-extra');
const errors: {
map: Record<string, Error>;
} = {
map: {},
};
return {
...Object.entries(fse)
.map(([key, func]: [string, any]) => {
if (typeof fse[key] !== 'function') {
return {
[key]: func,
};
}
return {
[key]: (...args: any[]) => {
if (key in errors.map) {
throw (errors.map as any)[key as any];
}
return func(...args);
},
};
})
.reduce((previous, current) => Object.assign(previous, current)),
throwErrors(handler: () => Promise<any>, map: Record<string, Error>): () => Promise<void> {
return async () => {
errors.map = map;
try {
await handler();
} finally {
errors.map = {};
}
};
},
};
});
import fse from 'fs-extra';
const fsem: typeof fse & {
throwErrors(handler: () => Promise<any>, map: Record<string, Error>): () => Promise<void>;
} = fse as any;
import * as CE from '../src/exceptions';
import { LocalFileSystemStorage } from '../src/LocalFileSystemStorage';
import { streamToString, getFlatList } from '../src/utils';
import { RuntimeException } from 'node-exceptions';
const root = path.join(__dirname, 'storage/local');
const storage = new LocalFileSystemStorage({ root });
function isWindowsDefenderError(error: { code: string }): boolean {
return error.code === 'EPERM';
}
function realFsPath(relativePath: string): string {
return path.join(root, relativePath);
}
const testString = 'test-data';
beforeAll(async () => {
await fse.ensureDir(root);
});
afterEach(async () => {
await fse.emptyDir(root);
});
describe('Local Driver', () => {
it('get underlying driver', async () => {
expect(storage.driver()).toBeDefined();
});
it('find if a file exists', async () => {
await fse.outputFile(realFsPath('i_exist'), testString);
const { exists } = await storage.exists('i_exist');
expect(exists).toBe(true);
});
it(
'exists handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.exists('file')).rejects.toThrow('Mocked permission error');
},
{
pathExists: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it(`find if a file doesn't exist`, async () => {
const { exists } = await storage.exists('i_dont_exist');
expect(exists).toBe(false);
});
it('find if a folder exists', async () => {
await fse.ensureDir(realFsPath('test_dir'));
const { exists } = await storage.exists('test_dir');
expect(exists).toBe(true);
});
it('create a file', async () => {
await storage.put('im_new', testString);
const { content } = await storage.get('im_new');
expect(content).toStrictEqual(testString);
});
it('create a file in a deep directory', async () => {
await storage.put('deep/directory/im_new', testString);
const { content } = await storage.get('deep/directory/im_new');
expect(content).toStrictEqual(testString);
});
it('delete a file', async () => {
await fse.outputFile(realFsPath('i_will_be_deleted'), '');
try {
const { wasDeleted } = await storage.delete('i_will_be_deleted');
expect(wasDeleted).toBe(true);
const { exists } = await storage.exists('i_will_be_deleted');
expect(exists).toBe(false);
} catch (error: any) {
if (!isWindowsDefenderError(error)) {
throw error;
}
}
});
it(
'delete rethrows',
fsem.throwErrors(
async () => {
expect(async () => await storage.delete('file')).rejects.toThrow('Mocked permission error');
},
{
unlink: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it('delete a file that does not exist', async () => {
const { wasDeleted } = await storage.delete('i_dont_exist');
expect(wasDeleted).toBe(false);
});
it('move a file', async () => {
await fse.outputFile(realFsPath('i_will_be_renamed'), '');
await storage.move('i_will_be_renamed', 'im_renamed');
const { exists: newExists } = await storage.exists('im_renamed');
expect(newExists).toBe(true);
const { exists: oldExists } = await storage.exists('i_will_be_renamed');
expect(oldExists).toBe(false);
});
it('copy a file', async () => {
await fse.outputFile(realFsPath('i_will_be_copied'), '');
await storage.copy('i_will_be_copied', 'im_copied');
const { exists: newExists } = await storage.exists('im_copied');
expect(newExists).toBe(true);
const { exists: oldExists } = await storage.exists('i_will_be_copied');
expect(oldExists).toBe(true);
});
it(
'copy handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.copy('src', 'dst')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file src'
);
},
{
copy: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it('prepend to a file', async () => {
await fse.outputFile(realFsPath('i_have_content'), 'world');
await storage.prepend('i_have_content', 'hello ');
const { content } = await storage.get('i_have_content');
expect(content).toStrictEqual('hello world');
});
it(
'prepend handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.prepend('prependFails', 'test')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file prependFails'
);
},
{
readFile: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it('append to a file', async () => {
await fse.outputFile(realFsPath('i_have_content'), 'hello');
await storage.append('i_have_content', ' universe');
const { content } = await storage.get('i_have_content');
expect(content).toStrictEqual('hello universe');
});
it(
'append handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.append('appendFails', 'test')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file appendFails'
);
},
{
appendFile: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it('prepend to new file', async () => {
await storage.prepend('i_have_content', testString);
const { content } = await storage.get('i_have_content', 'utf-8');
expect(content).toStrictEqual(testString);
});
it('throw file not found exception when unable to find file', async () => {
expect.assertions(1);
try {
await storage.get('non_existing', 'utf-8');
} catch (error: any) {
expect(error).toBeInstanceOf(CE.FileNotFound);
}
});
it('do not get out of root path when path is absolute', async () => {
const dummyFile = '/dummy_file';
await storage.put(dummyFile, testString);
const content = fse.readFileSync(realFsPath(dummyFile), 'utf-8');
expect(content).toStrictEqual(testString);
});
it('ignore extraneous double dots ..', async () => {
await storage.put('../../../dummy_file', testString);
const content = fse.readFileSync(realFsPath('dummy_file'), 'utf-8');
expect(content).toStrictEqual(testString);
});
it('do not ignore valid double dots ..', async () => {
await storage.put('fake_dir/../dummy_file', testString);
const content = fse.readFileSync(realFsPath('dummy_file'), 'utf-8');
expect(content).toStrictEqual(testString);
});
it('create file from stream', async () => {
await storage.put('foo', testString);
const readStream = fse.createReadStream(realFsPath('foo'));
await storage.put('bar', readStream);
const { content } = await storage.get('bar');
expect(content).toStrictEqual(testString);
});
it('get file as a buffer', async () => {
await fse.outputFile(realFsPath('eita'), testString);
const { content } = await storage.getBuffer('eita');
expect(content).toBeInstanceOf(Buffer);
});
it(
'getBuffer handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.getBuffer('eita')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file eita'
);
},
{
readFile: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it(
'getStat handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.getStat('eita')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file eita'
);
},
{
stat: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it(
'move handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.move('src', 'dst')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file src -> dst'
);
},
{
move: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it(
'put handles errors',
fsem.throwErrors(
async () => {
expect(async () => await storage.put('eita', 'content')).rejects.toThrow(
'E_PERMISSION_MISSING: Missing permission for file eita'
);
},
{
outputFile: new RuntimeException('Mocked permission error', undefined, 'EPERM'),
}
)
);
it(
'flatList handles errors',
fsem.throwErrors(
async () => {
expect(async () => await getFlatList(storage)).rejects.toThrow('E_UNKNOWN');
},
{
opendir: new RuntimeException('Unknown', undefined, 'Unknown'),
}
)
);
it('throw exception when unable to find file', async () => {
expect.assertions(1);
const readStream = storage.getStream('foo');
try {
await streamToString(readStream);
} catch (err: any) {
expect(err.code).toStrictEqual('ENOENT');
}
});
it('get stream of a given file', async () => {
await storage.put('foo', testString);
const readStream = storage.getStream('foo');
const content = await streamToString(readStream);
expect(content).toStrictEqual(testString);
});
it('get the stat of a given file', async () => {
await storage.put('foo', testString);
const { size, modified } = await storage.getStat('foo');
expect(size).toEqual(testString.length);
// It seems that the Date constructor used in fs-extra is not the global one.
expect(modified.constructor.name).toStrictEqual('Date');
});
it('list files with no prefix and empty directory', async () => {
const result = await getFlatList(storage);
expect(result).toStrictEqual([]);
});
it('list files with prefix that does not exist', async () => {
const result = await getFlatList(storage, '/dummy/path');
expect(result).toStrictEqual([]);
});
it('list files with no prefix', async () => {
await Promise.all([
storage.put('foo.txt', 'bar'),
storage.put('foo/bar', 'baz'),
storage.put('other/dir/file.txt', 'hello'),
]);
const result = await getFlatList(storage);
expect(result.sort()).toStrictEqual(['foo.txt', path.normalize('foo/bar'), path.normalize('other/dir/file.txt')]);
});
it('list files with folder prefix', async () => {
await Promise.all([
storage.put('foo.txt', 'bar'),
storage.put('foo/bar', 'baz'),
storage.put('other/dir/file.txt', 'hello'),
]);
const result = await getFlatList(storage, 'other');
expect(result).toStrictEqual([path.normalize('other/dir/file.txt')]);
});
it('list files with subfolder prefix', async () => {
await Promise.all([
storage.put('foo.txt', 'bar'),
storage.put('foo/bar', 'baz'),
storage.put('other/dir/file.txt', 'hello'),
]);
const result = await getFlatList(storage, `other/dir/`);
expect(result).toStrictEqual([path.normalize('other/dir/file.txt')]);
});
it('list files with filename prefix', async () => {
await Promise.all([
storage.put('foo.txt', 'bar'),
storage.put('foo/bar', 'baz'),
storage.put('other/dir/file.txt', 'hello'),
]);
const result = await getFlatList(storage, 'other/dir/fil');
expect(result).toStrictEqual([path.normalize('other/dir/file.txt')]);
});
it('list files with double dots in prefix', async () => {
await Promise.all([
storage.put('foo.txt', 'bar'),
storage.put('foo/bar', 'baz'),
storage.put('other/dir/file.txt', 'hello'),
]);
const result = await getFlatList(storage, 'other/../');
expect(result.sort()).toStrictEqual(['foo.txt', path.normalize('foo/bar'), path.normalize('other/dir/file.txt')]);
});
});
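The traversal tests above ('do not get out of root path…', 'ignore extraneous double dots…', 'do not ignore valid double dots…') pin down one property: every location resolves inside `root`. That containment can be sketched like this (an illustration of the property, assuming POSIX paths, not the driver's actual implementation):

```typescript
import path from 'node:path';

// Anchor the location at "/" first so leading separators and extraneous
// ".." segments are normalized away, then re-join under root.
function resolveWithinRoot(root: string, location: string): string {
  const contained = path.join('/', location);
  return path.join(root, contained);
}

console.log(resolveWithinRoot('/data', '/dummy_file')); // /data/dummy_file
console.log(resolveWithinRoot('/data', '../../../dummy_file')); // /data/dummy_file
console.log(resolveWithinRoot('/data', 'fake_dir/../dummy_file')); // /data/dummy_file
```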

@@ -1,55 +0,0 @@
import Storage from '../src/Storage';
describe('Storage Class', () => {
it('throws on all methods', async () => {
class DumbStorage extends Storage {}
const driver = new DumbStorage();
expect(() => driver.append('location', 'content')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method append is not supported for the driver DumbStorage'
);
expect(() => driver.copy('src', 'dest')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method copy is not supported for the driver DumbStorage'
);
expect(() => driver.delete('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method delete is not supported for the driver DumbStorage'
);
expect(() => driver.driver()).toThrow(
'E_METHOD_NOT_SUPPORTED: Method driver is not supported for the driver DumbStorage'
);
expect(() => driver.exists('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method exists is not supported for the driver DumbStorage'
);
expect(() => driver.get('location', 'encoding')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method get is not supported for the driver DumbStorage'
);
expect(() => driver.getBuffer('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method getBuffer is not supported for the driver DumbStorage'
);
expect(() => driver.getSignedUrl('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method getSignedUrl is not supported for the driver DumbStorage'
);
expect(() => driver.getStat('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method getStat is not supported for the driver DumbStorage'
);
expect(() => driver.getStream('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method getStream is not supported for the driver DumbStorage'
);
expect(() => driver.getUrl('location')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method getUrl is not supported for the driver DumbStorage'
);
expect(() => driver.move('src', 'dst')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method move is not supported for the driver DumbStorage'
);
expect(() => driver.put('location', 'content')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method put is not supported for the driver DumbStorage'
);
expect(() => driver.prepend('location', 'content')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method prepend is not supported for the driver DumbStorage'
);
expect(() => driver.flatList('prefix')).toThrow(
'E_METHOD_NOT_SUPPORTED: Method flatList is not supported for the driver DumbStorage'
);
});
});

@@ -1,218 +0,0 @@
/**
* @slynova/flydrive
*
* @license MIT
* @copyright Slynova - Romain Lanz <romain.lanz@slynova.ch>
*/
import Storage from '../src/Storage';
import StorageManager from '../src/StorageManager';
import { LocalFileSystemStorage } from '../src/LocalFileSystemStorage';
describe('Storage Manager', () => {
it('throw exception when no disk name is defined', () => {
const storageManager = new StorageManager({});
const fn = (): Storage => storageManager.disk();
expect(fn).toThrow('E_INVALID_CONFIG: Make sure to define a default disk name inside config file');
});
it('throw exception when disk config is missing', () => {
const storageManager = new StorageManager({
default: 'local',
});
const fn = (): Storage => storageManager.disk();
expect(fn).toThrow('E_INVALID_CONFIG: Make sure to define config for local disk');
});
it('throw exception when disk config does not have driver', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
// @ts-expect-error No driver
local: {},
},
});
const fn = (): Storage => storageManager.disk();
expect(fn).toThrow('E_INVALID_CONFIG: Make sure to define driver for local disk');
});
it('throw exception when driver is invalid', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'foo',
config: {
root: '',
},
},
},
});
const fn = (): Storage => storageManager.disk();
expect(fn).toThrow('Driver foo is not supported');
});
it('return storage instance for a given driver', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'local',
config: {
root: '',
},
},
},
});
const localDriver = storageManager.disk('local');
const localDriver2 = storageManager.disk('local');
expect(localDriver).toBe(localDriver2);
expect(localDriver).toBeInstanceOf(LocalFileSystemStorage);
});
it('extend and add new drivers', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'foo',
config: {},
},
},
});
class FooDriver extends Storage {}
storageManager.registerDriver('foo', FooDriver);
expect(storageManager.disk('local')).toBeInstanceOf(FooDriver);
});
it('add new disks', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'local',
config: {
root: '',
},
},
},
});
storageManager.addDisk('home', {
driver: 'local',
config: {
root: '~',
},
});
expect(storageManager.disk('home')).toBeInstanceOf(LocalFileSystemStorage);
});
it("invalid disks can't be added", () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'local',
config: {
root: '',
},
},
},
});
const fn = () =>
storageManager.addDisk('local', {
driver: 'local',
config: {
root: '',
},
});
expect(fn).toThrow('E_INVALID_CONFIG: A disk named local is already defined');
});
it('gets all instantiated disks', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'local',
config: {
root: '',
},
},
home: {
driver: 'local',
config: {
root: '~',
},
},
},
});
let disks = storageManager.getDisks().keys();
expect([...disks]).toStrictEqual([]);
storageManager.disk('local');
disks = storageManager.getDisks().keys();
expect([...disks].sort()).toStrictEqual(['local']);
storageManager.disk('home');
disks = storageManager.getDisks().keys();
expect([...disks].sort()).toStrictEqual(['home', 'local']);
});
it('gets all available drivers', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'local',
config: {
root: '',
},
},
home: {
driver: 'local',
config: {
root: '~',
},
},
},
});
class FooDriver extends Storage {}
storageManager.registerDriver('foo', FooDriver);
class BarDriver extends Storage {}
storageManager.registerDriver('bar', BarDriver);
const disks = storageManager.getDrivers().keys();
expect([...disks].sort()).toStrictEqual(['bar', 'foo', 'local']);
});
it('get disk with custom config', () => {
const storageManager = new StorageManager({
default: 'local',
disks: {
local: {
driver: 'local',
config: {
root: '',
},
},
},
});
const localWithDefaultConfig = storageManager.disk('local');
expect(localWithDefaultConfig).toBeInstanceOf(LocalFileSystemStorage);
});
});
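The 'return storage instance for a given driver' test relies on `disk()` memoizing instances per disk name. That caching behavior can be sketched with a small stand-in (not the StorageManager source):

```typescript
// Minimal stand-in: one instance per disk name, created lazily on first use.
class DiskCache<T> {
  private instances = new Map<string, T>();

  constructor(private factory: (name: string) => T) {}

  disk(name: string): T {
    let instance = this.instances.get(name);
    if (!instance) {
      instance = this.factory(name);
      this.instances.set(name, instance);
    }
    return instance;
  }
}

const cache = new DiskCache((name) => ({ name }));
console.log(cache.disk('local') === cache.disk('local')); // true
```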

@@ -1,132 +0,0 @@
import {
AuthorizationRequired,
DriverNotSupported,
InvalidConfig,
MethodNotSupported,
NoSuchBucket,
PermissionMissing,
UnknownException,
WrongKeyPath,
} from '../src/exceptions';
describe('AuthorizationRequired', function () {
it('sets the raw exception object', function () {
try {
throw new AuthorizationRequired(new Error('test'), '/some/path');
} catch (err: any) {
expect(err.raw).toBeDefined();
expect(err.raw.message).toBe('test');
}
});
});
describe('DriverNotSupported', function () {
it('driver should be set', function () {
try {
throw DriverNotSupported.driver('alibaba');
} catch (err: any) {
expect(err.driver).toBeDefined();
expect(err.driver).toBe('alibaba');
}
});
});
describe('InvalidConfig', function () {
it('missingDiskName', function () {
const err = InvalidConfig.missingDiskName();
expect(err.status).toBe(500);
expect(err.code).toBe('E_INVALID_CONFIG');
});
it('missingDiskConfig', function () {
const err = InvalidConfig.missingDiskConfig('disk_name');
expect(err.status).toBe(500);
expect(err.code).toBe('E_INVALID_CONFIG');
});
it('missingDiskDriver', function () {
const err = InvalidConfig.missingDiskDriver('disk_name');
expect(err.status).toBe(500);
expect(err.code).toBe('E_INVALID_CONFIG');
});
it('duplicateDiskName', function () {
const err = InvalidConfig.duplicateDiskName('disk_name');
expect(err.status).toBe(500);
expect(err.code).toBe('E_INVALID_CONFIG');
});
});
describe('MethodNotSupported', function () {
it('constructor', function () {
const err = new MethodNotSupported('method', 'driver');
expect(err.status).toBe(500);
expect(err.code).toBe('E_METHOD_NOT_SUPPORTED');
});
});
describe('NoSuchBucket', function () {
it('constructor', function () {
try {
throw new NoSuchBucket(new Error('test'), 'bucket');
} catch (err: any) {
expect(err.raw).toBeDefined();
expect(err.raw.message).toBe('test');
expect(err.status).toBe(500);
expect(err.code).toBe('E_NO_SUCH_BUCKET');
}
});
});
describe('PermissionMissing', function () {
it('constructor', function () {
try {
throw new PermissionMissing(new Error('test'), 'bucket');
} catch (err: any) {
expect(err.raw).toBeDefined();
expect(err.raw.message).toBe('test');
expect(err.status).toBe(500);
expect(err.code).toBe('E_PERMISSION_MISSING');
}
});
});
describe('UnknownException', function () {
it('constructor', function () {
try {
throw new UnknownException(new Error('test'), 'ERR_CODE', __filename);
} catch (err: any) {
expect(err.raw).toBeDefined();
expect(err.raw.message).toBe('test');
expect(err.message).toContain(__filename);
expect(err.message).toContain('ERR_CODE');
expect(err.status).toBe(500);
expect(err.code).toBe('E_UNKNOWN');
}
});
});
describe('WrongKeyPath', function () {
it('constructor', function () {
try {
throw new WrongKeyPath(new Error('test'), 'some/path');
} catch (err: any) {
expect(err.raw).toBeDefined();
expect(err.raw.message).toBe('test');
expect(err.message).toContain('some/path');
expect(err.status).toBe(500);
expect(err.code).toBe('E_WRONG_KEY_PATH');
}
});
});

@@ -1,9 +0,0 @@
import * as drive from '../src';
describe('drive', function () {
it('Objects should be exported', function () {
expect(drive.Storage).toBeDefined();
expect(drive.StorageManager).toBeDefined();
expect(drive.LocalFileSystemStorage).toBeDefined();
});
});

@@ -1,7 +0,0 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"rootDir": ".."
},
"include": [".", "../src"]
}

@@ -1,32 +0,0 @@
{
"compilerOptions": {
"target": "ES2018",
"lib": ["ES2018"],
"module": "CommonJS",
"moduleResolution": "node",
"declaration": true,
"declarationMap": true,
"strict": true,
"noFallthroughCasesInSwitch": true,
"esModuleInterop": true,
"noImplicitAny": true,
"noImplicitThis": true,
"noImplicitReturns": true,
"noUnusedLocals": true,
"noUncheckedIndexedAccess": true,
"noUnusedParameters": true,
"alwaysStrict": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"strictBindCallApply": true,
"strictPropertyInitialization": true,
"resolveJsonModule": false,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"allowSyntheticDefaultImports": true,
"isolatedModules": true,
"rootDir": "./src",
"outDir": "./dist"
},
"include": ["./src/**/*.ts"]
}

@@ -0,0 +1,45 @@
{
"name": "@directus/storage-driver-azure",
"version": "9.21.2",
"type": "module",
"scripts": {
"build": "tsc --build",
"dev": "tsc --watch",
"test": "vitest run",
"test:watch": "vitest",
"test:coverage": "vitest run --coverage"
},
"description": "Azure file storage abstraction for `@directus/storage`",
"repository": {
"type": "git",
"url": "https://github.com/directus/directus.git",
"directory": "packages/storage-driver-azure"
},
"funding": "https://github.com/directus/directus?sponsor=1",
"license": "GPL-3.0",
"author": "Rijk van Zanten <rijkvanzanten@me.com>",
"exports": {
".": "./dist/index.js",
"./package.json": "./package.json"
},
"main": "dist/index.js",
"files": [
"dist",
"!**/*.d.ts?(.map)"
],
"publishConfig": {
"access": "public"
},
"dependencies": {
"@azure/storage-blob": "12.12.0",
"@directus/storage": "workspace:*",
"@directus/utils": "workspace:*"
},
"devDependencies": {
"@directus/tsconfig": "0.0.6",
"@ngneat/falso": "6.3.0",
"@vitest/coverage-c8": "0.25.3",
"typescript": "4.9.3",
"vitest": "0.25.3"
}
}

@@ -0,0 +1,3 @@
# `@directus/storage-driver-azure`
Azure file storage driver for `@directus/storage`

@@ -0,0 +1,523 @@
import type { ContainerClient } from '@azure/storage-blob';
import type { Mock } from 'vitest';
import { BlobServiceClient, StorageSharedKeyCredential } from '@azure/storage-blob';
import { normalizePath } from '@directus/utils';
import { isReadableStream } from '@directus/utils/node';
import { join } from 'node:path';
import { PassThrough } from 'node:stream';
import { afterEach, describe, expect, test, vi, beforeEach } from 'vitest';
import { DriverAzure } from './index.js';
import type { DriverAzureConfig } from './index.js';
import {
randAlphaNumeric,
randDirectoryPath,
randDomainName,
randFilePath,
randGitBranch as randContainer,
randNumber,
randPastDate,
randWord,
randText,
randFileType,
randUrl,
} from '@ngneat/falso';
vi.mock('@directus/utils/node');
vi.mock('@directus/utils');
vi.mock('@azure/storage-blob');
vi.mock('node:path');
let sample: {
config: Required<DriverAzureConfig>;
path: {
input: string;
inputFull: string;
src: string;
srcFull: string;
dest: string;
destFull: string;
};
range: {
start: number;
end: number;
};
stream: PassThrough;
text: string;
file: {
type: string;
size: number;
modified: Date;
};
};
let driver: DriverAzure;
beforeEach(() => {
sample = {
config: {
containerName: randContainer(),
accountName: randWord(),
accountKey: randAlphaNumeric({ length: 40 }).join(''),
root: randDirectoryPath(),
endpoint: `https://${randDomainName()}`,
},
path: {
input: randFilePath(),
inputFull: randFilePath(),
src: randFilePath(),
srcFull: randFilePath(),
dest: randFilePath(),
destFull: randFilePath(),
},
range: {
start: randNumber(),
end: randNumber(),
},
stream: new PassThrough(),
text: randText(),
file: {
type: randFileType(),
size: randNumber(),
modified: randPastDate(),
},
};
driver = new DriverAzure({
containerName: sample.config.containerName,
accountKey: sample.config.accountKey,
accountName: sample.config.accountName,
});
driver['fullPath'] = vi.fn().mockImplementation((input) => {
if (input === sample.path.src) return sample.path.srcFull;
if (input === sample.path.dest) return sample.path.destFull;
if (input === sample.path.input) return sample.path.inputFull;
return '';
});
});
afterEach(() => {
vi.resetAllMocks();
});
describe('#constructor', () => {
test('Creates signed credentials', () => {
expect(StorageSharedKeyCredential).toHaveBeenCalledWith(sample.config.accountName, sample.config.accountKey);
expect(driver['signedCredentials']).toBeInstanceOf(StorageSharedKeyCredential);
});
test('Creates blob service client and sets containerClient', () => {
const mockSignedCredentials = {} as StorageSharedKeyCredential;
vi.mocked(StorageSharedKeyCredential).mockReturnValueOnce(mockSignedCredentials);
const mockContainerClient = {} as ContainerClient;
const mockBlobServiceClient = {
getContainerClient: vi.fn().mockReturnValue(mockContainerClient),
} as unknown as BlobServiceClient;
vi.mocked(BlobServiceClient).mockReturnValue(mockBlobServiceClient);
const driver = new DriverAzure({
containerName: sample.config.containerName,
accountName: sample.config.accountName,
accountKey: sample.config.accountKey,
});
expect(BlobServiceClient).toHaveBeenCalledWith(
`https://${sample.config.accountName}.blob.core.windows.net`,
mockSignedCredentials
);
expect(mockBlobServiceClient.getContainerClient).toHaveBeenCalledWith(sample.config.containerName);
expect(driver['containerClient']).toBe(mockContainerClient);
});
test('Allows overriding endpoint with optional setting', () => {
const mockSignedCredentials = {} as StorageSharedKeyCredential;
vi.mocked(StorageSharedKeyCredential).mockReturnValueOnce(mockSignedCredentials);
const mockContainerClient = {} as ContainerClient;
const mockBlobServiceClient = {
getContainerClient: vi.fn().mockReturnValue(mockContainerClient),
} as unknown as BlobServiceClient;
vi.mocked(BlobServiceClient).mockReturnValue(mockBlobServiceClient);
const driver = new DriverAzure({
containerName: sample.config.containerName,
accountName: sample.config.accountName,
accountKey: sample.config.accountKey,
endpoint: sample.config.endpoint,
});
expect(BlobServiceClient).toHaveBeenCalledWith(sample.config.endpoint, mockSignedCredentials);
expect(mockBlobServiceClient.getContainerClient).toHaveBeenCalledWith(sample.config.containerName);
expect(driver['containerClient']).toBe(mockContainerClient);
});
test('Defaults root path to empty string', () => {
expect(driver['root']).toBe('');
});
test('Normalizes config path when root is given', () => {
vi.mocked(normalizePath).mockReturnValue(sample.path.inputFull);
new DriverAzure({
containerName: sample.config.containerName,
accountName: sample.config.accountName,
accountKey: sample.config.accountKey,
root: sample.path.input,
});
expect(normalizePath).toHaveBeenCalledWith(sample.path.input, { removeLeading: true });
});
});
describe('#fullPath', () => {
test('Returns normalized joined path', () => {
vi.mocked(join).mockReturnValue(sample.path.inputFull);
vi.mocked(normalizePath).mockReturnValue(sample.path.inputFull);
const driver = new DriverAzure({
containerName: sample.config.containerName,
accountName: sample.config.accountName,
accountKey: sample.config.accountKey,
});
driver['root'] = sample.config.root;
const result = driver['fullPath'](sample.path.input);
expect(join).toHaveBeenCalledWith(sample.config.root, sample.path.input);
expect(normalizePath).toHaveBeenCalledWith(sample.path.inputFull);
expect(result).toBe(sample.path.inputFull);
});
});
describe('#read', () => {
let mockDownload: Mock;
beforeEach(async () => {
mockDownload = vi.fn().mockResolvedValue({ readableStreamBody: sample.stream });
const mockBlobClient = vi.fn().mockReturnValue({
download: mockDownload,
});
driver['containerClient'] = {
getBlobClient: mockBlobClient,
} as unknown as ContainerClient;
});
test('Uses blobClient at full path', async () => {
await driver.read(sample.path.input);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.input);
expect(driver['containerClient'].getBlobClient).toHaveBeenCalledWith(sample.path.inputFull);
});
test('Calls download with undefined offset and count when no range is passed', async () => {
await driver.read(sample.path.input);
expect(mockDownload).toHaveBeenCalledWith(undefined, undefined);
});
test('Calls download with offset if start range is provided', async () => {
await driver.read(sample.path.input, { start: sample.range.start });
expect(mockDownload).toHaveBeenCalledWith(sample.range.start, undefined);
});
test('Calls download with count if end range is provided', async () => {
await driver.read(sample.path.input, { end: sample.range.end });
expect(mockDownload).toHaveBeenCalledWith(undefined, sample.range.end);
});
test('Calls download with offset and count if start and end ranges are provided', async () => {
await driver.read(sample.path.input, sample.range);
expect(mockDownload).toHaveBeenCalledWith(sample.range.start, sample.range.end - sample.range.start);
});
test('Throws error when no readable stream is returned', async () => {
mockDownload.mockResolvedValue({ readableStreamBody: undefined });
await expect(driver.read(sample.path.input)).rejects.toThrowError(
`No stream returned for file "${sample.path.input}"`
);
});
describe('#write', () => {
let mockUploadStream: Mock;
let mockBlockBlobClient: Mock;
beforeEach(() => {
mockUploadStream = vi.fn();
mockBlockBlobClient = vi.fn().mockReturnValue({
uploadStream: mockUploadStream,
});
driver['containerClient'] = {
getBlockBlobClient: mockBlockBlobClient,
} as unknown as ContainerClient;
vi.mocked(isReadableStream).mockReturnValue(true);
});
test('Gets BlockBlobClient for file path', async () => {
await driver.write(sample.path.input, sample.stream);
expect(mockBlockBlobClient).toHaveBeenCalledWith(sample.path.inputFull);
});
test('Uploads stream through uploadStream', async () => {
await driver.write(sample.path.input, sample.stream);
expect(mockUploadStream).toHaveBeenCalledWith(sample.stream, undefined, undefined, {
blobHTTPHeaders: { blobContentType: 'application/octet-stream' },
});
});
test('Allows optional mime type to be set', async () => {
await driver.write(sample.path.input, sample.stream, sample.file.type);
expect(mockUploadStream).toHaveBeenCalledWith(sample.stream, undefined, undefined, {
blobHTTPHeaders: { blobContentType: sample.file.type },
});
});
});
describe('#delete', () => {
let mockDeleteIfExists: Mock;
beforeEach(() => {
mockDeleteIfExists = vi.fn().mockResolvedValue(true);
const mockBlockBlobClient = vi.fn().mockReturnValue({
deleteIfExists: mockDeleteIfExists,
});
driver['containerClient'] = {
getBlockBlobClient: mockBlockBlobClient,
} as unknown as ContainerClient;
});
test('Uses blobClient at full path', async () => {
await driver.delete(sample.path.input);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.input);
expect(driver['containerClient'].getBlockBlobClient).toHaveBeenCalledWith(sample.path.inputFull);
});
test('Calls deleteIfExists on the blob', async () => {
await driver.delete(sample.path.input);
expect(mockDeleteIfExists).toHaveBeenCalled();
});
});
describe('#stat', () => {
beforeEach(() => {
const mockGetProperties = vi.fn().mockResolvedValue({
contentLength: sample.file.size,
lastModified: sample.file.modified,
});
const mockBlobClient = vi.fn().mockReturnValue({
getProperties: mockGetProperties,
});
driver['containerClient'] = {
getBlobClient: mockBlobClient,
} as unknown as ContainerClient;
});
test('Uses blobClient at full path', async () => {
await driver.stat(sample.path.input);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.input);
expect(driver['containerClient'].getBlobClient).toHaveBeenCalledWith(sample.path.inputFull);
});
test('Returns contentLength/lastModified as size/modified from getProperties', async () => {
const result = await driver.stat(sample.path.input);
expect(result).toStrictEqual({
size: sample.file.size,
modified: sample.file.modified,
});
});
});
describe('#exists', () => {
let mockExists: Mock;
beforeEach(() => {
mockExists = vi.fn().mockResolvedValue(true);
const mockBlockBlobClient = vi.fn().mockReturnValue({
exists: mockExists,
});
driver['containerClient'] = {
getBlockBlobClient: mockBlockBlobClient,
} as unknown as ContainerClient;
});
test('Uses blobClient at full path', async () => {
await driver.exists(sample.path.input);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.input);
expect(driver['containerClient'].getBlockBlobClient).toHaveBeenCalledWith(sample.path.inputFull);
});
test('Returns exists result', async () => {
const result = await driver.exists(sample.path.input);
expect(mockExists).toHaveBeenCalled();
expect(result).toBe(true);
});
});
describe('#move', () => {
let mockDeleteIfExists: Mock;
let mockBlockBlobClient: Mock;
beforeEach(() => {
mockDeleteIfExists = vi.fn();
mockBlockBlobClient = vi.fn().mockReturnValue({
deleteIfExists: mockDeleteIfExists,
});
driver['containerClient'] = {
getBlockBlobClient: mockBlockBlobClient,
} as unknown as ContainerClient;
driver.copy = vi.fn();
});
test('Calls #copy with src and dest', async () => {
await driver.move(sample.path.src, sample.path.dest);
expect(driver.copy).toHaveBeenCalledWith(sample.path.src, sample.path.dest);
});
test('Deletes src file after copy is completed', async () => {
await driver.move(sample.path.src, sample.path.dest);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.src);
expect(mockBlockBlobClient).toHaveBeenCalledWith(sample.path.srcFull);
expect(mockDeleteIfExists).toHaveBeenCalledOnce();
});
});
describe('#copy', () => {
let mockPollUntilDone: Mock;
let mockBeginCopyFromUrl: Mock;
let mockBlockBlobClient: Mock;
let mockUrl: string;
beforeEach(() => {
mockPollUntilDone = vi.fn();
mockBeginCopyFromUrl = vi.fn().mockResolvedValue({
pollUntilDone: mockPollUntilDone,
});
mockUrl = randUrl();
mockBlockBlobClient = vi
.fn()
.mockReturnValueOnce({
url: mockUrl,
})
.mockReturnValueOnce({
beginCopyFromURL: mockBeginCopyFromUrl,
});
driver['containerClient'] = {
getBlockBlobClient: mockBlockBlobClient,
} as unknown as ContainerClient;
});
test('Gets BlockBlobClient for src and dest', async () => {
await driver.copy(sample.path.src, sample.path.dest);
expect(driver['fullPath']).toHaveBeenCalledTimes(2);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.src);
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.dest);
expect(mockBlockBlobClient).toHaveBeenCalledTimes(2);
expect(mockBlockBlobClient).toHaveBeenCalledWith(sample.path.srcFull);
expect(mockBlockBlobClient).toHaveBeenCalledWith(sample.path.destFull);
});
test('Calls beginCopyFromURL with source url', async () => {
await driver.copy(sample.path.src, sample.path.dest);
expect(mockBeginCopyFromUrl).toHaveBeenCalledOnce();
expect(mockBeginCopyFromUrl).toHaveBeenCalledWith(mockUrl);
});
test('Waits for the polling to be done', async () => {
await driver.copy(sample.path.src, sample.path.dest);
expect(mockPollUntilDone).toHaveBeenCalledOnce();
});
});
describe('#list', () => {
let mockListBlobsFlat: Mock;
beforeEach(() => {
mockListBlobsFlat = vi.fn().mockReturnValue([]);
driver['containerClient'] = {
listBlobsFlat: mockListBlobsFlat,
} as unknown as ContainerClient;
});
test('Uses listBlobsFlat with an empty prefix by default', async () => {
await driver.list().next();
expect(driver['fullPath']).toHaveBeenCalledWith('');
expect(mockListBlobsFlat).toHaveBeenCalledWith({
prefix: '',
});
});
test('Allows for optional prefix', async () => {
await driver.list(sample.path.input).next();
expect(driver['fullPath']).toHaveBeenCalledWith(sample.path.input);
expect(mockListBlobsFlat).toHaveBeenCalledWith({
prefix: sample.path.inputFull,
});
});
test('Returns blob.name for each returned blob', async () => {
const mockFile = randFilePath();
mockListBlobsFlat.mockReturnValue([{ name: mockFile }]);
const output = [];
for await (const filepath of driver.list()) {
output.push(filepath);
}
expect(output).toStrictEqual([mockFile]);
});
});
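The range handling that the `#read` tests above assert can be sketched in isolation. `toDownloadArgs` below is a hypothetical stand-alone helper (not part of the driver) that mirrors how the driver translates a `Range` with absolute `start`/`end` byte positions into the `(offset, count)` pair that Azure's `download()` expects:

```typescript
// Hypothetical helper mirroring DriverAzure#read's range handling:
// Azure's download(offset, count) takes a byte offset and a byte count,
// while the driver's Range carries absolute start/end positions.
type Range = { start?: number; end?: number };

function toDownloadArgs(range?: Range): [number | undefined, number | undefined] {
  const offset = range?.start;
  // count = end - start; undefined when no end is given, matching the driver.
  const count = range?.end ? range.end - (range.start || 0) : undefined;
  return [offset, count];
}

console.log(toDownloadArgs()); // [ undefined, undefined ]
console.log(toDownloadArgs({ start: 500 })); // [ 500, undefined ]
console.log(toDownloadArgs({ start: 500, end: 1500 })); // [ 500, 1000 ]
```

This is why the combined start/end test asserts `sample.range.end - sample.range.start` rather than the raw `end` value: the second argument to `download` is a length, not an absolute position.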


@@ -0,0 +1,97 @@
import { BlobServiceClient, ContainerClient, StorageSharedKeyCredential } from '@azure/storage-blob';
import type { Driver, Range } from '@directus/storage';
import { normalizePath } from '@directus/utils';
import { join } from 'node:path';
import type { Readable } from 'node:stream';
export type DriverAzureConfig = {
containerName: string;
accountName: string;
accountKey: string;
root?: string;
endpoint?: string;
};
export class DriverAzure implements Driver {
private containerClient: ContainerClient;
private signedCredentials: StorageSharedKeyCredential;
private root: string;
constructor(config: DriverAzureConfig) {
this.signedCredentials = new StorageSharedKeyCredential(config.accountName, config.accountKey);
const client = new BlobServiceClient(
config.endpoint ?? `https://${config.accountName}.blob.core.windows.net`,
this.signedCredentials
);
this.containerClient = client.getContainerClient(config.containerName);
this.root = config.root ? normalizePath(config.root, { removeLeading: true }) : '';
}
private fullPath(filepath: string) {
return normalizePath(join(this.root, filepath));
}
async read(filepath: string, range?: Range) {
const { readableStreamBody } = await this.containerClient
.getBlobClient(this.fullPath(filepath))
.download(range?.start, range?.end ? range.end - (range.start || 0) : undefined);
if (!readableStreamBody) {
throw new Error(`No stream returned for file "${filepath}"`);
}
return readableStreamBody as Readable;
}
async write(filepath: string, content: Readable, type = 'application/octet-stream') {
const blockBlobClient = this.containerClient.getBlockBlobClient(this.fullPath(filepath));
await blockBlobClient.uploadStream(content as Readable, undefined, undefined, {
blobHTTPHeaders: { blobContentType: type },
});
}
async delete(filepath: string) {
await this.containerClient.getBlockBlobClient(this.fullPath(filepath)).deleteIfExists();
}
async stat(filepath: string) {
const props = await this.containerClient.getBlobClient(this.fullPath(filepath)).getProperties();
return {
size: props.contentLength as number,
modified: props.lastModified as Date,
};
}
async exists(filepath: string) {
return await this.containerClient.getBlockBlobClient(this.fullPath(filepath)).exists();
}
async move(src: string, dest: string) {
await this.copy(src, dest);
await this.containerClient.getBlockBlobClient(this.fullPath(src)).deleteIfExists();
}
async copy(src: string, dest: string) {
const source = this.containerClient.getBlockBlobClient(this.fullPath(src));
const target = this.containerClient.getBlockBlobClient(this.fullPath(dest));
const poller = await target.beginCopyFromURL(source.url);
await poller.pollUntilDone();
}
async *list(prefix = '') {
const blobs = this.containerClient.listBlobsFlat({
prefix: this.fullPath(prefix),
});
for await (const blob of blobs) {
yield (blob.name as string).substring(this.root.length);
}
}
}
export default DriverAzure;
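The root/path handling above (normalizePath with `removeLeading` for the root in the constructor, then `join` plus `normalizePath` in `fullPath`) can be sketched with a minimal stand-in for `normalizePath` from `@directus/utils` — the body below is illustrative, not the actual utility:

```typescript
import { join } from 'node:path';

// Minimal stand-in for @directus/utils' normalizePath (illustrative only):
// convert backslashes to forward slashes, collapse duplicate slashes,
// and optionally strip a leading slash.
function normalizePath(path: string, opts: { removeLeading?: boolean } = {}): string {
  let normalized = path.replace(/\\/g, '/').replace(/\/+/g, '/');
  if (opts.removeLeading && normalized.startsWith('/')) normalized = normalized.slice(1);
  return normalized;
}

// Mirrors the constructor: the root is normalized once, leading slash removed.
const root = normalizePath('/uploads', { removeLeading: true }); // 'uploads'

// Mirrors #fullPath: join root and filepath, then normalize the result.
const fullPath = (filepath: string) => normalizePath(join(root, filepath));

console.log(fullPath('nested/file.txt')); // 'uploads/nested/file.txt'
```

This is why the tests assert `fullPath` is called before every `containerClient` lookup: each blob operation addresses names under the configured root prefix, and `list` strips that same prefix back off via `substring(this.root.length)`.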
