This commit is contained in:
Xiaoning Liu 2018-11-22 16:08:48 +08:00 committed by Vincent Jiang (LEI)
Parent 8858145b79
Commit 57476aece2
5 changed files with 305 additions and 130 deletions

View File

@@ -40,7 +40,7 @@ For example, you can create the following CORS settings for debugging. But please cu
### Building
This project is based on TypeScript. For Node.js, generate commonJS module formats and browser bundles, build with:
This project is based on TypeScript. Build with:
```bash
npm install

View File

@@ -1,6 +1,7 @@
# Azure Storage SDK V10 for JavaScript
* @azure/storage-blob [![npm version](https://badge.fury.io/js/%40azure%2Fstorage-blob.svg)](https://badge.fury.io/js/%40azure%2Fstorage-blob)
* @azure/storage-file [![npm version](https://badge.fury.io/js/%40azure%2Fstorage-file.svg)](https://badge.fury.io/js/%40azure%2Fstorage-file)
* [API Reference documentation](https://docs.microsoft.com/en-us/javascript/api/overview/azure/storage/client?view=azure-node-preview)
## Introduction
@@ -17,6 +18,11 @@ Please note that this version of the SDK is a complete overhaul of the current [A
* Create/Read/List/Update/Delete Block Blobs
* Create/Read/List/Update/Delete Page Blobs
* Create/Read/List/Update/Delete Append Blobs
* File Storage
* Get/Set File Service Properties
* Create/List/Delete File Shares
* Create/List/Delete File Directories
* Create/Read/List/Update/Delete Files
* New features
* Asynchronous I/O for all operations using the async methods
* HttpPipeline which enables a high degree of per-request configurability
@@ -49,21 +55,28 @@ There are differences between the Node.js and browser runtimes. When getting started w
* Shared Access Signature(SAS) generation
* `generateAccountSASQueryParameters()`
* `generateBlobSASQueryParameters()`
* `generateFileSASQueryParameters()`
* Parallel uploading and downloading
* `uploadFileToBlockBlob()`
* `uploadStreamToBlockBlob()`
* `downloadBlobToBuffer()`
* `uploadFileToAzureFile()`
* `uploadStreamToAzureFile()`
* `downloadAzureFileToBuffer()`
##### The following features, interfaces, classes, or functions are only available in browsers
* Parallel uploading and downloading
* `uploadBrowserDataToBlockBlob()`
* `uploadBrowserDataToAzureFile()`
## Getting Started
### NPM
The preferred way to install the Azure Storage SDK for JavaScript is to use the npm package manager. Simply type the following into a terminal window:
The preferred way to install the Azure Storage SDK for JavaScript is to use the npm package manager. Take "@azure/storage-blob" as an example.
Simply type the following into a terminal window:
```bash
npm install @azure/storage-blob
@@ -87,17 +100,20 @@ To use the SDK with the JS bundle in browsers, simply add a script tag to your H
```html
<script src="https://mydomain/azure-storage.blob.min.js"></script>
<script src="https://mydomain/azure-storage.file.min.js"></script>
```
The bundled JS file is compatible with the [UMD](https://github.com/umdjs/umd) standard. If no module system is found, the following global variable(s) will be exported:
* `azblob`
* `azfile`
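A UMD bundle decides where to export based on the module system it detects at load time; a minimal sketch of the pattern (illustrative only, not the actual bundle code — the `azfile` global and placeholder export are stand-ins):

```javascript
// Minimal UMD wrapper sketch: prefer CommonJS, then AMD, then fall back
// to a global variable, mirroring how the bundle exports `azfile`.
(function (root, factory) {
  if (typeof module === "object" && module.exports) {
    module.exports = factory(); // CommonJS (Node.js)
  } else if (typeof define === "function" && define.amd) {
    define([], factory); // AMD loader
  } else {
    root.azfile = factory(); // No module system: export a browser global
  }
})(typeof self !== "undefined" ? self : globalThis, function () {
  // Placeholder export; the real bundle returns the SDK's public API.
  return { name: "azfile-sketch" };
});
```

This is why the same `.min.js` file works both from a `<script>` tag and from a bundler.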
#### Download
Download the latest released JS bundles from the links on the [GitHub release page](https://github.com/Azure/azure-storage-js/releases), or directly from the following links:
* Blob [https://aka.ms/downloadazurestoragejsblob](https://aka.ms/downloadazurestoragejsblob)
* File [https://aka.ms/downloadazurestoragejsfile](https://aka.ms/downloadazurestoragejsfile)
### CORS
@@ -254,6 +270,8 @@ main()
* [Blob Storage Examples](https://github.com/azure/azure-storage-js/tree/master/blob/samples)
* [Blob Storage Examples - Test Cases](https://github.com/azure/azure-storage-js/tree/master/blob/test/)
* [File Storage Examples](https://github.com/azure/azure-storage-js/tree/master/file/samples)
* [File Storage Examples - Test Cases](https://github.com/azure/azure-storage-js/tree/master/file/test/)
## License

View File

@@ -1,7 +1,7 @@
# Azure Storage SDK V10 for JavaScript - File
- [![npm version](https://badge.fury.io/js/%40azure%2Fstorage-file.svg)](https://badge.fury.io/js/%40azure%2Fstorage-file)
- [API Reference documentation](https://docs.microsoft.com/en-us/javascript/api/%40azure/storage-file/index?view=azure-node-preview)
* [![npm version](https://badge.fury.io/js/%40azure%2Fstorage-file.svg)](https://badge.fury.io/js/%40azure%2Fstorage-file)
* [API Reference documentation](https://docs.microsoft.com/en-us/javascript/api/%40azure/storage-file/index?view=azure-node-preview)
## Introduction
@@ -11,15 +11,15 @@ Please note that this version of the SDK is a complete overhaul of the current [A
### Features
- File Storage
- Get/Set File Service Properties
- Create/List/Delete File Shares
- Create/List/Delete File Directories
- Create/Read/List/Update/Delete Files
- Features new
- Asynchronous I/O for all operations using the async methods
- HttpPipeline which enables a high degree of per-request configurability
- 1-to-1 correlation with the Storage REST API for clarity and simplicity
* File Storage
* Get/Set File Service Properties
* Create/List/Delete File Shares
* Create/List/Delete File Directories
* Create/Read/List/Update/Delete Files
* New features
* Asynchronous I/O for all operations using the async methods
* HttpPipeline which enables a high degree of per-request configurability
* 1-to-1 correlation with the Storage REST API for clarity and simplicity
### Compatibility
@@ -31,11 +31,11 @@ You need polyfills to make this library work with IE11. The easiest way is to us
Or you can load separate polyfills for the missing ES feature(s).
This library depends on the following ES6 features, which need external polyfills loaded.
- `Promise`
- `String.prototype.startsWith`
- `String.prototype.endsWith`
- `String.prototype.repeat`
- `String.prototype.includes`
* `Promise`
* `String.prototype.startsWith`
* `String.prototype.endsWith`
* `String.prototype.repeat`
* `String.prototype.includes`
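Whether those polyfills are actually needed can be feature-detected before loading the library; a minimal sketch (the checks mirror the list above, and `@babel/polyfill` is just one example of a polyfill source):

```javascript
// Check for the ES6 features this library relies on; if any is missing,
// load polyfills (for example @babel/polyfill) before the SDK bundle.
const missing = [
  ["Promise", typeof Promise === "function"],
  ["String.prototype.startsWith", typeof String.prototype.startsWith === "function"],
  ["String.prototype.endsWith", typeof String.prototype.endsWith === "function"],
  ["String.prototype.repeat", typeof String.prototype.repeat === "function"],
  ["String.prototype.includes", typeof String.prototype.includes === "function"]
]
  .filter(([, supported]) => !supported)
  .map(([name]) => name);

if (missing.length > 0) {
  console.log(`Load polyfills for: ${missing.join(", ")}`);
} else {
  console.log("All required ES6 features are available");
}
```

On any modern runtime the list comes back empty; only legacy browsers such as IE11 will report missing features.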
#### Differences between Node.js and browsers
@@ -43,20 +43,20 @@ There are differences between the Node.js and browser runtimes. When getting started w
##### The following features, interfaces, classes, or functions are only available in Node.js
- Shared Key Authorization based on account name and account key
- `SharedKeyCredential`
- Shared Access Signature(SAS) generation
- `generateAccountSASQueryParameters()`
- `generateFileSASQueryParameters()`
- Parallel uploading and downloading
- `uploadFileToBlockFile()`
- `uploadStreamToBlockFile()`
- `downloadFileToBuffer()`
* Shared Key Authorization based on account name and account key
* `SharedKeyCredential`
* Shared Access Signature(SAS) generation
* `generateAccountSASQueryParameters()`
* `generateFileSASQueryParameters()`
* Parallel uploading and downloading
* `uploadFileToAzureFile()`
* `uploadStreamToAzureFile()`
* `downloadAzureFileToBuffer()`
##### The following features, interfaces, classes, or functions are only available in browsers
- Parallel uploading and downloading
- `uploadBrowserDataToFile()`
* Parallel uploading and downloading
* `uploadBrowserDataToAzureFile()`
## Getting Started
@@ -121,13 +121,143 @@ The Azure Storage SDK for JavaScript provides low-level and high-level APIs.
## Code Samples
```javascript
// TODO:
const {
Aborter,
StorageURL,
ServiceURL,
ShareURL,
DirectoryURL,
FileURL,
SharedKeyCredential,
AnonymousCredential,
TokenCredential
} = require("@azure/storage-file");
async function main() {
// Enter your storage account name and shared key
const account = "";
const accountKey = "";
// Use SharedKeyCredential with storage account and account key
// SharedKeyCredential is only available in Node.js runtime, not in browsers
const sharedKeyCredential = new SharedKeyCredential(account, accountKey);
// Use TokenCredential with OAuth token
const tokenCredential = new TokenCredential("token");
tokenCredential.token = "renewedToken"; // Renew the token by updating token field of token credential object
// Use AnonymousCredential when url already includes a SAS signature
const anonymousCredential = new AnonymousCredential();
// Use sharedKeyCredential, tokenCredential or anonymousCredential to create a pipeline
const pipeline = StorageURL.newPipeline(sharedKeyCredential);
// List shares
const serviceURL = new ServiceURL(
// When using AnonymousCredential, following url should include a valid SAS
`https://${account}.file.core.windows.net`,
pipeline
);
console.log(`List shares`);
let marker;
do {
const listSharesResponse = await serviceURL.listSharesSegment(
Aborter.none,
marker
);
marker = listSharesResponse.nextMarker;
for (const share of listSharesResponse.shareItems) {
console.log(`\tShare: ${share.name}`);
}
} while (marker);
// Create a share
const shareName = `newshare${new Date().getTime()}`;
const shareURL = ShareURL.fromServiceURL(serviceURL, shareName);
await shareURL.create(Aborter.none);
console.log(`Created share ${shareName} successfully`);
// Create a directory
const directoryName = `newdirectory${new Date().getTime()}`;
const directoryURL = DirectoryURL.fromShareURL(shareURL, directoryName);
await directoryURL.create(Aborter.none);
console.log(`Created directory ${directoryName} successfully`);
// Create a file
const content = "Hello World!";
const fileName = "newfile" + new Date().getTime();
const fileURL = FileURL.fromDirectoryURL(directoryURL, fileName);
await fileURL.create(Aborter.none, content.length);
console.log(`Created file ${fileName} successfully`);
// Upload file range
await fileURL.uploadRange(Aborter.none, content, 0, content.length);
console.log(`Uploaded file range "${content}" to ${fileName} successfully`);
// List directories and files
console.log(`List directories and files under directory ${directoryName}`);
marker = undefined;
do {
const listFilesAndDirectoriesResponse = await directoryURL.listFilesAndDirectoriesSegment(
Aborter.none,
marker
);
marker = listFilesAndDirectoriesResponse.nextMarker;
for (const file of listFilesAndDirectoriesResponse.segment.fileItems) {
console.log(`\tFile: ${file.name}`);
}
for (const directory of listFilesAndDirectoriesResponse.segment
.directoryItems) {
console.log(`\tDirectory: ${directory.name}`);
}
} while (marker);
// Get file content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadFileResponse.readableStreamBody
// In browsers, get downloaded data by accessing downloadFileResponse.blobBody
const downloadFileResponse = await fileURL.download(Aborter.none, 0);
console.log(
`Downloaded file content: ${await streamToString(
downloadFileResponse.readableStreamBody
)}`
);
// Delete share
await shareURL.delete(Aborter.none);
console.log(`Deleted share ${shareName}`);
}
// A helper method used to read a Node.js readable stream into a string
async function streamToString(readableStream) {
return new Promise((resolve, reject) => {
const chunks = [];
readableStream.on("data", data => {
chunks.push(data.toString());
});
readableStream.on("end", () => {
resolve(chunks.join(""));
});
readableStream.on("error", reject);
});
}
// An async method returns a Promise object, which is compatible with the then().catch() coding style.
main()
.then(() => {
console.log("Successfully executed sample.");
})
.catch(err => {
console.log(err.message);
});
```
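Both listing loops in the sample above follow the same marker-continuation pattern; it can be factored into a small helper. This is a sketch under assumptions: `listFn` and its `{ items, nextMarker }` result shape are hypothetical, and you would adapt them to wrap `listSharesSegment` or `listFilesAndDirectoriesSegment`:

```javascript
// Generic marker-based pagination: keep calling listFn until the service
// stops returning a nextMarker, accumulating items from every page.
// listFn(marker) must resolve to { items, nextMarker } (assumed shape).
async function listAll(listFn) {
  const all = [];
  let marker;
  do {
    const response = await listFn(marker);
    all.push(...response.items);
    marker = response.nextMarker;
  } while (marker);
  return all;
}

// Usage sketch with a fake two-page listing standing in for the service:
const fakePages = {
  undefined: { items: ["share1"], nextMarker: "m1" },
  m1: { items: ["share2"], nextMarker: undefined }
};
listAll(async marker => fakePages[marker]).then(names =>
  console.log(names.join(","))
);
```

The helper keeps the continuation logic in one place, so each caller only supplies the segment call it cares about.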
## More Code Samples
- [File Storage Examples](https://github.com/azure/azure-storage-js/tree/master/file/samples)
- [File Storage Examples - Test Cases](https://github.com/azure/azure-storage-js/tree/master/file/test/)
* [File Storage Examples](https://github.com/azure/azure-storage-js/tree/master/file/samples)
* [File Storage Examples - Test Cases](https://github.com/azure/azure-storage-js/tree/master/file/test/)
## License

View File

@@ -4,105 +4,125 @@
const {
Aborter,
BlobURL,
BlockBlobURL,
ContainerURL,
ServiceURL,
StorageURL,
ServiceURL,
ShareURL,
DirectoryURL,
FileURL,
SharedKeyCredential,
AnonymousCredential,
TokenCredential
} = require(".."); // Change to "@azure/storage-blob" in your package
} = require(".."); // Change to "@azure/storage-file" in your package
async function main() {
// Enter your storage account name and shared key
const account = "account";
const accountKey = "accountkey";
const account = "";
const accountKey = "";
// Use SharedKeyCredential with storage account and account key
// SharedKeyCredential is only available in Node.js runtime, not in browsers
const sharedKeyCredential = new SharedKeyCredential(account, accountKey);
// Use TokenCredential with OAuth token
const tokenCredential = new TokenCredential("token");
tokenCredential.token = "renewedToken"; // Renew the token by updating token filed of token credential
tokenCredential.token = "renewedToken"; // Renew the token by updating token field of token credential object
// Use AnonymousCredential when url already includes a SAS signature
const anonymousCredential = new AnonymousCredential();
// Use sharedKeyCredential, tokenCredential or AnonymousCredential to create a pipeline
// Use sharedKeyCredential, tokenCredential or anonymousCredential to create a pipeline
const pipeline = StorageURL.newPipeline(sharedKeyCredential);
// List containers
// List shares
const serviceURL = new ServiceURL(
// When using AnonymousCredential, following url should include a valid SAS or support public access
`https://${account}.blob.core.windows.net`,
// When using AnonymousCredential, following url should include a valid SAS
`https://${account}.file.core.windows.net`,
pipeline
);
console.log(`List shares`);
let marker;
do {
const listContainersResponse = await serviceURL.listContainersSegment(
const listSharesResponse = await serviceURL.listSharesSegment(
Aborter.none,
marker
);
marker = listContainersResponse.marker;
for (const container of listContainersResponse.containerItems) {
console.log(`Container: ${container.name}`);
marker = listSharesResponse.nextMarker;
for (const share of listSharesResponse.shareItems) {
console.log(`\tShare: ${share.name}`);
}
} while (marker);
// Create a container
const containerName = `newcontainer${new Date().getTime()}`;
const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
// Create a share
const shareName = `newshare${new Date().getTime()}`;
const shareURL = ShareURL.fromServiceURL(serviceURL, shareName);
await shareURL.create(Aborter.none);
console.log(`Created share ${shareName} successfully`);
const createContainerResponse = await containerURL.create(Aborter.none);
console.log(
`Create container ${containerName} successfully`,
createContainerResponse.requestId
);
// Create a directory
const directoryName = `newdirectory${new Date().getTime()}`;
const directoryURL = DirectoryURL.fromShareURL(shareURL, directoryName);
await directoryURL.create(Aborter.none);
console.log(`Created directory ${directoryName} successfully`);
// Create a blob
const content = "hello";
const blobName = "newblob" + new Date().getTime();
const blobURL = BlobURL.fromContainerURL(containerURL, blobName);
const blockBlobURL = BlockBlobURL.fromBlobURL(blobURL);
const uploadBlobResponse = await blockBlobURL.upload(
Aborter.none,
content,
content.length
);
console.log(
`Upload block blob ${blobName} successfully`,
uploadBlobResponse.requestId
);
// Create a file
const content = "Hello World!";
const fileName = "newfile" + new Date().getTime();
const fileURL = FileURL.fromDirectoryURL(directoryURL, fileName);
await fileURL.create(Aborter.none, content.length);
console.log(`Created file ${fileName} successfully`);
// List blobs
// Upload file range
await fileURL.uploadRange(Aborter.none, content, 0, content.length);
console.log(`Uploaded file range "${content}" to ${fileName} successfully`);
// List directories and files
console.log(`List directories and files under directory ${directoryName}`);
marker = undefined;
do {
const listBlobsResponse = await containerURL.listBlobFlatSegment(
const listFilesAndDirectoriesResponse = await directoryURL.listFilesAndDirectoriesSegment(
Aborter.none,
marker
);
marker = listBlobsResponse.marker;
for (const blob of listBlobsResponse.segment.blobItems) {
console.log(`Blob: ${blob.name}`);
marker = listFilesAndDirectoriesResponse.nextMarker;
for (const file of listFilesAndDirectoriesResponse.segment.fileItems) {
console.log(`\tFile: ${file.name}`);
}
for (const directory of listFilesAndDirectoriesResponse.segment
.directoryItems) {
console.log(`\tDirectory: ${directory.name}`);
}
} while (marker);
// Get blob content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
// In browsers, get downloaded data by accessing downloadBlockBlobResponse.blobBody
const downloadBlockBlobResponse = await blobURL.download(Aborter.none, 0);
// Get file content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadFileResponse.readableStreamBody
// In browsers, get downloaded data by accessing downloadFileResponse.blobBody
const downloadFileResponse = await fileURL.download(Aborter.none, 0);
console.log(
"Downloaded blob content",
downloadBlockBlobResponse.readableStreamBody.read(content.length).toString()
`Downloaded file content: ${await streamToString(
downloadFileResponse.readableStreamBody
)}`
);
// Delete container
await containerURL.delete(Aborter.none);
// Delete share
await shareURL.delete(Aborter.none);
console.log(`Deleted share ${shareName}`);
}
console.log("deleted container");
// A helper method used to read a Node.js readable stream into a string
async function streamToString(readableStream) {
return new Promise((resolve, reject) => {
const chunks = [];
readableStream.on("data", data => {
chunks.push(data.toString());
});
readableStream.on("end", () => {
resolve(chunks.join(""));
});
readableStream.on("error", reject);
});
}
// An async method returns a Promise object, which is compatible with the then().catch() coding style.
@@ -112,4 +132,4 @@ main()
})
.catch(err => {
console.log(err.message);
});
});

View File

@@ -5,101 +5,108 @@
const fs = require("fs");
const {
AnonymousCredential,
uploadBrowserDataToBlockBlob,
downloadBlobToBuffer,
uploadFileToBlockBlob,
uploadStreamToBlockBlob,
uploadBrowserDataToAzureFile,
downloadAzureFileToBuffer,
uploadFileToAzureFile,
uploadStreamToAzureFile,
Aborter,
BlobURL,
BlockBlobURL,
ContainerURL,
FileURL,
DirectoryURL,
ShareURL,
ServiceURL,
StorageURL
} = require(".."); // Change to "@azure/storage-blob" in your package
async function main() {
// Fill in following settings before running this sample
const account = "account";
const accountSas = "accountSas";
const localFilePath = "localFilePath";
const account = "";
const accountSas = "";
const localFilePath = "";
const pipeline = StorageURL.newPipeline(new AnonymousCredential(), {
// httpClient: MyHTTPClient, // A customized HTTP client implementing IHTTPClient interface
// logger: MyLogger, // A customized logger implementing IHTTPPipelineLogger interface
// httpClient: MyHTTPClient, // A customized HTTP client implementing IHttpClient interface
// logger: MyLogger, // A customized logger implementing IHttpPipelineLogger interface
retryOptions: { maxTries: 4 }, // Retry options
telemetry: { value: "HighLevelSample V1.0.0" } // Customized telemetry string
});
const serviceURL = new ServiceURL(
`https://${account}.blob.core.windows.net${accountSas}`,
`https://${account}.file.core.windows.net${accountSas}`,
pipeline
);
// Create a container
const containerName = `newcontainer${new Date().getTime()}`;
const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
await containerURL.create(Aborter.none);
// Create a share
const shareName = `newshare${new Date().getTime()}`;
const shareURL = ShareURL.fromServiceURL(serviceURL, shareName);
await shareURL.create(Aborter.none);
console.log(`Created share ${shareName} successfully`);
// Create a blob
const blobName = "newblob" + new Date().getTime();
const blobURL = BlobURL.fromContainerURL(containerURL, blobName);
const blockBlobURL = BlockBlobURL.fromBlobURL(blobURL);
// Create a directory
const directoryName = `newdirectory${new Date().getTime()}`;
const directoryURL = DirectoryURL.fromShareURL(shareURL, directoryName);
await directoryURL.create(Aborter.none);
console.log(`Created directory ${directoryName} successfully`);
// Parallel uploading with uploadFileToBlockBlob in Node.js runtime
// uploadFileToBlockBlob is only available in Node.js
await uploadFileToBlockBlob(Aborter.none, localFilePath, blockBlobURL, {
blockSize: 4 * 1024 * 1024, // 4MB block size
// Upload a local file to an Azure file in parallel
const fileName = "newfile" + new Date().getTime();
const fileURL = FileURL.fromDirectoryURL(directoryURL, fileName);
const fileSize = fs.statSync(localFilePath).size;
// Parallel uploading with uploadFileToAzureFile in Node.js runtime
// uploadFileToAzureFile is only available in Node.js
await uploadFileToAzureFile(Aborter.none, localFilePath, fileURL, {
rangeSize: 4 * 1024 * 1024, // 4MB range size
parallelism: 20, // 20 concurrency
progress: ev => console.log(ev)
});
console.log("uploadFileToBlockBlob success");
console.log("uploadFileToAzureFile success");
// Parallel uploading a Readable stream with uploadStreamToBlockBlob in Node.js runtime
// uploadStreamToBlockBlob is only available in Node.js
await uploadStreamToBlockBlob(
// Parallel uploading a Readable stream with uploadStreamToAzureFile in Node.js runtime
// uploadStreamToAzureFile is only available in Node.js
await uploadStreamToAzureFile(
Aborter.timeout(30 * 60 * 1000), // Abort uploading after a 30-minute timeout
fs.createReadStream(localFilePath),
blockBlobURL,
fileSize,
fileURL,
4 * 1024 * 1024,
20,
{
progress: ev => console.log(ev)
}
);
console.log("uploadStreamToBlockBlob success");
console.log("uploadStreamToAzureFile success");
// Parallel uploading a browser File/Blob/ArrayBuffer in browsers with uploadBrowserDataToBlockBlob
// Uncomment following code in browsers because uploadBrowserDataToBlockBlob is only available in browsers
// Parallel uploading a browser File/Blob/ArrayBuffer in browsers with uploadBrowserDataToAzureFile
// Uncomment the following code in browsers, because uploadBrowserDataToAzureFile is only available in browsers
/*
const browserFile = document.getElementById("fileinput").files[0];
await uploadBrowserDataToBlockBlob(Aborter.none, browserFile, blockBlobURL, {
blockSize: 4 * 1024 * 1024, // 4MB block size
await uploadBrowserDataToAzureFile(Aborter.none, browserFile, fileURL, {
rangeSize: 4 * 1024 * 1024, // 4MB range size
parallelism: 20, // 20 concurrency
progress: ev => console.log(ev)
});
*/
// Parallel downloading a block blob into Node.js buffer
// downloadBlobToBuffer is only available in Node.js
const fileSize = fs.statSync(localFilePath).size;
// Parallel downloading an Azure file into Node.js buffer
// downloadAzureFileToBuffer is only available in Node.js
const buffer = Buffer.alloc(fileSize);
await downloadBlobToBuffer(
await downloadAzureFileToBuffer(
Aborter.timeout(30 * 60 * 1000), // Abort downloading after a 30-minute timeout
buffer,
blockBlobURL,
fileURL,
0,
undefined,
{
blockSize: 4 * 1024 * 1024, // 4MB block size
rangeSize: 4 * 1024 * 1024, // 4MB range size
parallelism: 20, // 20 concurrency
progress: ev => console.log(ev)
}
);
console.log("downloadBlobToBuffer success");
console.log("downloadAzureFileToBuffer success");
// Delete container
await containerURL.delete(Aborter.none);
console.log("deleted container");
// Delete share
await shareURL.delete(Aborter.none);
console.log("deleted share");
}
// An async method returns a Promise object, which is compatible with then().catch() coding style.