* order docs

* rewrap all the things

* move headers out a level

* add all the tables of contents

* fill in PQE_METADATA_URL with prod url

* fill in PQE_DATA_URL with prod url

* fill in docs URLs

* correct some statements in deployment doc

* re-initialize changelog in chan-compatible format

* correct branch name again

* Remove general planetary computer docs
This commit is contained in:
James Santucci 2021-10-27 12:26:34 -04:00 committed by GitHub
Parent 57b22b89d0
Commit 17e04ae08d
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
9 changed files: 30 additions and 571 deletions

.github/workflows/cicd.yml

@@ -2,10 +2,10 @@ name: Planetary Computer APIs CI/CD
 on:
   push:
-    branches: [ $default-branch ]
+    branches: [ main ]
     tags: [ v* ]
   pull_request:
-    branches: [ $default-branch, cicd-target ]
+    branches: [ main, cicd-target ]
 jobs:


@@ -1,8 +1,9 @@
-# Change Log
+# Changelog
 All notable changes to this project will be documented in this file.
-The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
+and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 ## [Unreleased]
-- Initial open source release
+### Changed
+- Organized and updated documentation markdown docs [#11](https://github.com/microsoft/planetary-computer-apis/pull/11)


@@ -1,5 +1,11 @@
 # Deployment
+- [Deployment](#deployment)
+  - [Configure Azure resources](#configure-azure-resources)
+  - [Build and publish containers](#build-and-publish-containers)
+  - [Create Helm apps](#create-helm-apps)
+  - [Summary](#summary)
 The publicly available Planetary Query Engine is a collection of related services
 that run as containers in Azure Kubernetes Service (AKS). Getting those
 containers deployed requires a few steps, which we'll go into more below:
@@ -19,8 +25,9 @@ container registry. If you're not familiar with Kubernetes, the cluster is an
 abstraction representing a connection between a _scheduler_ and a _node pool_.
 This is a gross oversimplification, but it's sufficient for describing the
 Planetary Query Engine's Kubernetes deployment. The node pool is responsible for
-providing undifferentiated compute capacity. The scheduler is responsible for
-keeping track of what tasks need to run and assigning them to the node pool.
+providing undifferentiated compute capacity, e.g., "two cores and 8gb of RAM."
+The scheduler is responsible for keeping track of what tasks need to run and
+assigning them to the node pool.
 We provision this cluster using terraform, and you can find the cluster
 configuration in [aks.tf](../deployment/terraform/resources/aks.tf).
@@ -33,13 +40,11 @@ Those commands build the containers necessary for running the services, running
 migrations, and more.
 After those containers are built, they need to be pushed to a _container
-registry_. That happens in the [`cipublish`](../scripts/cipublish) script. For
-each of the metadata query engine, data query engine, and SAS services, that
-script ships the containers to the configured container registry. You'll need to
-ensure that the relevant variables referred to in that file are set in your
-environment, that the Azure Container Registry referred to exists, and that
-you've logged in to Azure and that container registry with `az login` and `az
-acr login`, but having completed those steps, you can ship the built containers.
+registry_. That happens in [continuous
+integration](../.github/workflows/cicd.yml) in the `publish` job. For each of
+the metadata query engine and data query engine, that job ships the
+containers to an appropriate container registry associated with this
+repository. These published container images are publicly available.
 ## Create Helm apps
@@ -52,4 +57,12 @@ short, you'll need to be able to provide variables to fill in [the deploy values
 template](../deployment/helm/deploy-values.template.yaml). Most of them are
 available from Terraform output, if you completed the Terraform step; however
 some also depend on environment variables that you already needed. You can
 find the latter category of values prefixed with `env.` in the template.
+# Summary
+The entire workflow for testing, building, publishing, and deploying the data
+and metadata query engines is publicly viewable in this repository. You can view
+workflow runs in the [`Actions`
+tab](https://github.com/microsoft/planetary-computer-apis/actions/workflows/cicd.yml)
+for this repo at any time.


@@ -1,86 +0,0 @@
# How to crop and scale a TIF asset from the Planetary Computer
Tiled assets are cut according to a predefined pyramid, with fewer, lower-resolution tiles at the top of the pyramid and more, higher-resolution tiles at the bottom.
Sometimes, greater control of image boundaries is required.
In this how-to you will construct an API call to crop and scale a tif asset according to two flavors of user specification: a bounding box `GET` request and a polygon `POST` request.
### Prerequisites
- [How-to generate SAS token/sign requests](./how-to-generate-sas-token-sign-requests.md)
- (optional) Get an account (TODO)
- [How to read a STAC Item in the Planetary Computer STAC catalog](./how-to-preview-stac-entry.md)
### Cropping an asset by bounding box
Because we're cropping a single asset rather than a (potentially global) mosaic of assets, the spatial extent represented by the imagery is likely limited to a small region of the globe.
The first thing we should do to guarantee good results is verify the external spatial bounds of the asset in question.
The URL we'll be constructing requires a minimum x, minimum y, maximum x, and maximum y - sometimes referred to as a bounding box.
Referring to the spatial bounding box reported on the [info endpoint](PQE_DATA_URL/collections/naip/map/tiles?item=md_m_3807619_se_18_060_20181025_20190211), where the collection is 'naip' and the item is md_m_3807619_se_18_060_20181025_20190211, we see that the minimum x is -76.6919..., the minimum y is 38.6213..., the maximum x is -76.6200..., and the maximum y is 38.6916...:
```json
{
"spatial":{
"bbox":[
[
-76.69198556156623,
38.621369461223104,
-76.6200684427915,
38.69162323586947
]
]
}
}
```
> Note: the STAC item for this entry lists a `proj:bbox` field, but this is *not* what we are using to establish a tif's bounds and to construct the bounding box used in cropping.
> The difficulty with this field is that it is the bounding box as defined in the native projection of the tif whereas `crop` endpoints anticipate a bounding box defined in terms of latitude/longitude coordinates.
> The info endpoint (PQE_DATA_URL/collections/{collection_id}/map/tiles?item={item_id}), on the other hand, will *always* provide an image's extent in terms of latitude/longitude and is thus well suited for the construction of crop boundaries.
Looking to the [API reference](DQE_API_REFERENCE_URL) under the `OGC Tiles` heading, you should see that the "Bbox crop" endpoint uses template PQE_DATA_URL/collections/{collection_id}/crop/{minx},{miny},{maxx},{maxy}.{format}?item={item_id}&assets=image.
In constructing the {minx},{miny},{maxx},{maxy} portion of the crop template it is generally desirable, though not strictly necessary, to keep the minx/miny values *higher* than the minimum x/y values and the maxx/maxy values *lower* than the maximum x/y values advertised on the tif's lat/lng bbox above.
Following this rule ensures that the output imagery is fully within the region actually represented by a tif.
Any crop-region in excess of the tif's advertised coverage will be treated as `NoData` and appear as a black or transparent region in crop endpoint results.
A suitable bbox subselection with data throughout the output imagery might be minx=-76.68, miny=38.63, maxx=-76.63, maxy=38.68.
Filling out the rest of the "Bbox crop" template with the same item used in prior examples, we get PQE_DATA_URL/collections/naip/crop/-76.68,38.63,-76.63,38.68.{format}?item=md_m_3807619_se_18_060_20181025_20190211&assets=image.
For output format, we'll use `tif` so that spatial information is bundled up with the output imagery.
This format selection is ideal for analysis in desktop GIS software and will enable quick verification of output correctness.
Available formats are the same as those listed above - feel free to try out some of the alternatives, such as `png` or `npy`.
The URL, after all templating has been attended to, should be: PQE_DATA_URL/collections/naip/crop/-76.68,38.63,-76.63,38.68.tif?item=md_m_3807619_se_18_060_20181025_20190211&assets=image
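Assembled programmatically, the same URL construction might look like the sketch below. The service root here is an assumption standing in for this doc's unresolved `PQE_DATA_URL` placeholder:

```python
# Sketch: assemble the "Bbox crop" URL from its parts.
# PQE_DATA_URL stands in for this doc's unresolved template variable.
PQE_DATA_URL = "https://example.com/data/v1"  # assumption, not the real root

def bbox_crop_url(collection, item, bbox, fmt="tif", assets="image"):
    minx, miny, maxx, maxy = bbox
    return (
        f"{PQE_DATA_URL}/collections/{collection}/crop/"
        f"{minx},{miny},{maxx},{maxy}.{fmt}?item={item}&assets={assets}"
    )

url = bbox_crop_url(
    "naip",
    "md_m_3807619_se_18_060_20181025_20190211",
    (-76.68, 38.63, -76.63, 38.68),
)
```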
### Cropping an asset by polygon
Cropping by polygon is just like cropping by bounding box except that, instead of providing a bounding box in the API path, a polygon is supplied in a `POST` body.
Sticking to the familiar template above, all that we need to do to use the equivalent polygon crop endpoint is delete the "{minx},{miny},{maxx},{maxy}" portion of the URL, convert the `GET` to a `POST` and insert the following JSON into the `POST`'s body:
```json
{
"type":"Polygon",
"coordinates":[
[
[
-76.68,
38.63
],
[
-76.63,
38.63
],
[
-76.63,
38.68
],
[
-76.68,
38.68
],
[
-76.68,
38.63
]
]
]
}
```
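A ring like the one above can also be generated from the same four bbox values rather than written by hand; a minimal sketch:

```python
# Sketch: derive the polygon POST body from a bounding box. A bbox polygon is
# its four corners traversed in order and closed back on the starting point.
def bbox_to_polygon(minx, miny, maxx, maxy):
    ring = [
        [minx, miny],
        [maxx, miny],
        [maxx, maxy],
        [minx, maxy],
        [minx, miny],  # GeoJSON rings must end where they begin
    ]
    return {"type": "Polygon", "coordinates": [ring]}

body = bbox_to_polygon(-76.68, 38.63, -76.63, 38.68)
```

The resulting dict serializes (e.g. via `json.dumps`) to the same JSON shown above and can be supplied directly as the `POST` body.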


@@ -1,101 +0,0 @@
# How-to generate SAS token/sign requests
The Planetary Computer service supports several use cases for interacting with data, one of which is interacting directly with the underlying imagery. All underlying imagery is stored on Azure Blob Storage, and may only be accessed directly via the use of shared access signatures (SAS tokens).
In this how-to article, you will learn how to request SAS tokens and supply them on requests for imagery blobs.
## When a SAS token is needed
A SAS token is needed whenever an Azure Blob URL is returned in a request and downloading the data is desired. For example, in [How to read a STAC Item in the Planetary Computer STAC catalog](./how-to-preview-stac-entry.md), the assets retrieved are Azure Blob URLs. A SAS token will need to be appended to the blob URL as query parameters.
For example, an Azure Blob URL may look like: `https://naipeuwest.blob.core.windows.net/naip/01.tif`.
And an example SAS token may look like: `se=2021-04-08T18%3A49%3A29Z&sp=rl&sip=20.73.55.19&sv=2020-02-10&sr=c&skoid=cccccccc-dddd-4444-aaaa-eeeeeeeeeeee&sktid=***&skt=2021-04-08T17%3A47%3A29Z&ske=2021-04-09T17%3A49%3A29Z&sks=b&skv=2020-02-10&sig=bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb%3D`.
Combining the URL with the SAS token (remembering to place a `?` in between) results in: `https://naipeuwest.blob.core.windows.net/naip/01.tif?se=2021-04-08T18%3A49%3A29Z&sp=rl&sip=20.70.50.10&sv=2020-02-10&sr=c&skoid=cccccccc-dddd-4444-aaaa-eeeeeeeeeeee&sktid=***&skt=2021-04-08T17%3A47%3A29Z&ske=2021-04-09T17%3A49%3A29Z&sks=b&skv=2020-02-10&sig=bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb%3D`. The resulting URL may then be downloaded.
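In code, the join is a one-liner; this sketch also guards for URLs that already carry query parameters (the token string below is truncated for illustration):

```python
# Sketch: append a SAS token (already a URL-encoded query string) to a blob URL.
def sign_blob_url(url: str, token: str) -> str:
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{token}"

signed = sign_blob_url(
    "https://naipeuwest.blob.core.windows.net/naip/01.tif",
    "se=2021-04-08T18%3A49%3A29Z&sp=rl",  # truncated token for illustration
)
```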
## Endpoints for requesting a SAS token
There are two endpoints that may be used to obtain a SAS token:
1) [token](https://planetarycomputer.microsoft.com/data/v1/token/{DATASET})
2) [sign](https://planetarycomputer.microsoft.com/data/v1/sign)
The `token` endpoint allows for the generation of a SAS token for a given dataset, which can then be used for all requests for that same dataset. For example, to obtain a SAS token for the `naip` dataset, a request may be made to: `https://planetarycomputer.microsoft.com/data/v1/token/naip`. An example response may look like:
```json
{
"msft:expiry":"2021-04-08T18:49:29Z",
"token":"se=2021-04-08T18%3A49%3A29Z&sp=rl&sip=20.73.55.19&sv=2020-02-10&sr=c&skoid=cccccccc-dddd-4444-aaaa-eeeeeeeeeeee&sktid=***&skt=2021-04-08T17%3A47%3A29Z&ske=2021-04-09T17%3A49%3A29Z&sks=b&skv=2020-02-10&sig=bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb%3D"
}
```
The `token` field is the SAS token. The `msft:expiry` field specifies (in UTC) when this token expires. For reference documentation on expiration times and how to increase them, see: [Rate limits and access restrictions](./reference-rate-limits-and-access-restrictions.md).
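Since tokens expire, callers typically cache a token and refresh it shortly before `msft:expiry`. A minimal sketch of that check (the five-minute margin is an arbitrary choice, not part of the API):

```python
# Sketch: decide whether a cached SAS token should be refreshed, based on the
# UTC `msft:expiry` value returned by the token endpoint.
from datetime import datetime, timedelta

def needs_refresh(expiry: str, margin_minutes: int = 5) -> bool:
    expires_at = datetime.strptime(expiry, "%Y-%m-%dT%H:%M:%SZ")
    return datetime.utcnow() >= expires_at - timedelta(minutes=margin_minutes)

needs_refresh("2021-04-08T18:49:29Z")  # a long-past expiry: always True by now
```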
The `sign` endpoint makes it easy to convert an unsigned blob URL to a signed URL by passing the URL directly into the endpoint with the `href` parameter. For example: `https://planetarycomputer.microsoft.com/data/v1/sign?href=https://naipeuwest.blob.core.windows.net/naip/01.tif` returns JSON such as:
```json
{
"msft:expiry":"2021-04-08T18:49:29Z",
"href":"https://naipeuwest.blob.core.windows.net/naip/01.tif?se=2021-04-08T18%3A49%3A29Z&sp=rl&sip=20.73.55.19&sv=2020-02-10&sr=c&skoid=cccccccc-dddd-4444-aaaa-eeeeeeeeeeee&sktid=***&skt=2021-04-08T17%3A47%3A29Z&ske=2021-04-09T17%3A49%3A29Z&sks=b&skv=2020-02-10&sig=bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb%3D"
}
```
The `href` field here contains the full, signed URL which may be used directly.
## Supplying an API subscription key
An API subscription key may be supplied to increase [rate limits](./reference-rate-limits-and-access-restrictions.md) in one of two ways:
Supply it in an `Ocp-Apim-Subscription-Key` request header, for example:
```bash
curl -H "Ocp-Apim-Subscription-Key: 123456789" https://planetarycomputer.microsoft.com/data/v1/token/naip
```
Or supply it in a `subscription-key` query parameter, for example:
```bash
curl https://planetarycomputer.microsoft.com/data/v1/token/naip?subscription-key=123456789
```
## Planetary Computer SDK for Python
The [Planetary Computer SDK for Python](https://github.com/microsoft/planetary-computer-sdk-for-python) makes the above process more straightforward by providing a library that calls these endpoints to sign URLs, and even sign all assets within a [PySTAC](https://github.com/stac-utils/pystac) item. A cache is also kept, which tracks expiration values, to ensure new SAS tokens are only requested when needed.
Here's an example of using the library to sign a single URL:
```python
import planetary_computer as pc
import pystac
import rasterio

item: pystac.Item = ...  # a Landsat item
b4_href = pc.sign(item.assets['SR_B4'].href)

with rasterio.open(b4_href) as ds:
    ...
```
And here's an example of using the library to sign all assets in a [PySTAC](https://github.com/stac-utils/pystac) item:
```python
import pystac
import planetary_computer as pc
raw_item: pystac.Item = ...
item: pystac.Item = pc.sign_assets(raw_item)
# Now use the item however you want. All appropriate assets are signed for read access.
```
This library may be installed via:
```bash
pip install planetary-computer
```
Once installed, the CLI may be used to configure an API subscription key, which increases [rate limits](./reference-rate-limits-and-access-restrictions.md):
```bash
planetarycomputer configure
```


@@ -1,117 +0,0 @@
# How to read a STAC Item in the Planetary Computer STAC catalog
The Planetary Computer Asset Catalog is an index and store of a variety of geospatial assets available for users to incorporate in computational pipelines and applications.
If accessed from within the appropriate Azure infrastructure, there are almost no limits on the amounts or rates of access to this data.
None of the Planetary Computer's assets are exclusive to Azure infrastructure, however, and external users can access everything it has to offer with some minor rate limits.
In this how-to article, you will learn how to read and interpret individual STAC items indexed in the Planetary Computer STAC catalog.
We will look at the metadata which is important for understanding what an item represents and for retrieving asset data.
### Locate a tif asset through the Metadata Query Engine
The Metadata Query Engine is a [STAC API](https://stacspec.org/STAC-api.html) which allows users to explore and search through the entirety of the Planetary Computer's offerings.
Assets can be found by reference to their spatial extent or their associated metadata.
To keep things simple for the purposes of this how-to, we'll be looking at a single asset on a single item in the Planetary Computer's [NAIP](https://www.fsa.usda.gov/programs-and-services/aerial-photography/imagery-programs/naip-imagery/) collection.
As an implementation of the STAC API, MQE entries are intended to be simple, human-readable JSON.
The item we're looking at can be found here: [PQE_METADATA_URL/collections/naip/items/md_m_3807619_se_18_060_20181025_20190211](PQE_METADATA_URL/collections/naip/items/md_m_3807619_se_18_060_20181025_20190211).
##### Previewing the contents of a Planetary Computer STAC entry
There are a few fields which are very useful for previewing an item's contents.
Satellite imagery happens within a particular spatial and temporal extent; most analyses which depend on satellite imagery will need to pay close attention to the where and the when of any imagery they intend to use.
Spatial information is exposed through the `geometry` field.
```json
{
"coordinates":[
[
[
-76.625,
38.625
],
[
-76.625,
38.6875
],
[
-76.6875,
38.6875
],
[
-76.6875,
38.625
],
[
-76.625,
38.625
]
]
],
"type":"Polygon"
}
```
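A quick way to turn that ring into the `[minx, miny, maxx, maxy]` bounding-box form used elsewhere in these docs, sketched in plain Python over the geometry shown above:

```python
# Sketch: derive a bounding box from the item's GeoJSON geometry above.
geometry = {
    "type": "Polygon",
    "coordinates": [[
        [-76.625, 38.625],
        [-76.625, 38.6875],
        [-76.6875, 38.6875],
        [-76.6875, 38.625],
        [-76.625, 38.625],
    ]],
}

xs = [x for ring in geometry["coordinates"] for x, _ in ring]
ys = [y for ring in geometry["coordinates"] for _, y in ring]
bbox = [min(xs), min(ys), max(xs), max(ys)]
# bbox == [-76.6875, 38.625, -76.625, 38.6875]
```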
With this geometry and a little help from [geojson.io](http://geojson.io/), we can see that our asset lies just to the southeast of Washington D.C.
![Image of asset location on geojson.io](images/asset-location.png)
Temporal information can be found on the `datetime` field under `properties` (in fact, the `properties` field contains a lot of useful information).
Here is the data under the `properties` field of this item:
```json
{
"created":"2021-02-22T05:29:43Z",
"updated":"2021-02-22T05:29:43Z",
"providers":[
{
"name":"USDA Farm Service Agency",
"roles":[
"producer",
"licensor"
],
"url":"https://www.fsa.usda.gov/programs-and-services/aerial-photography/imagery-programs/naip-imagery/"
}
],
"gsd":0.6,
"datetime":"2018-10-25T00:00:00Z",
"naip:year":"2018",
"naip:state":"md",
"proj:epsg":26918,
"proj:shape":[
12814,
10202
],
"proj:bbox":[
352847.4,
4276115.999999999,
358968.60000000003,
4283804.399999999
],
"proj:transform":[
0.6,
0,
352847.4,
0,
-0.6,
4283804.399999999,
0,
0,
1
]
}
```
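Pulling the analysis-relevant fields out of `properties` is straightforward; a small sketch using only a subset of the fields shown above:

```python
# Sketch: extract acquisition time, resolution, and projection from `properties`.
from datetime import datetime, timezone

properties = {
    "datetime": "2018-10-25T00:00:00Z",
    "gsd": 0.6,          # ground sample distance, meters per pixel
    "proj:epsg": 26918,  # NAD83 / UTM zone 18N
}

acquired = datetime.strptime(properties["datetime"], "%Y-%m-%dT%H:%M:%SZ")
acquired = acquired.replace(tzinfo=timezone.utc)
```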
It looks like our asset was captured on October 25, 2018, just to the southeast of Washington D.C.
We can get a very high level preview of what this sensor captured by checking the `thumbnail` entry under `assets`.
```json
{
"title":"Thumbnail",
"href":"https://naipeuwest.blob.core.windows.net/naip/v002/md/2018/md_060cm_2018/38076/m_3807619_se_18_060_20181025.200.jpg",
"type":"image/jpeg",
"roles":[
"thumbnail"
]
}
```
Visiting the contained `href` (see [How-to generate SAS token/sign requests](./how-to-generate-sas-token-sign-requests.md) for how to supply a SAS token on the URL), we get a coarse representation of this entry.
![Thumbnail preview of this item's contents](images/m_3807619_se_18_060_20181025.200.jpg)


@@ -1,169 +0,0 @@
# How to tile a TIF asset from the Planetary Computer
The Planetary Computer Asset Catalog is an index and store of a variety of geospatial assets available for users to incorporate in computational pipelines and applications.
If accessed from within the appropriate Azure infrastructure, there are almost no limits on the amounts or rates of access to this data.
None of the Planetary Computer's assets are exclusive to Azure infrastructure, however, and external users can access everything it has to offer with some minor rate limits.
In this how-to article, you will learn how to access individual `tif` assets indexed in the Planetary Computer's Metadata Query Engine (MQE) through the Data Query Engine (DQE).
We will look at the metadata which is important for understanding what an item represents and constructing Data Query Engine API calls.
Turning to the Data Query Engine, we will inspect the tif asset's tiling metadata and construct a URL template which can be used to provide tiles for a [tiled web map](https://en.wikipedia.org/wiki/Tiled_web_map).
### Prerequisites
- [How-to generate SAS token/sign requests](./how-to-generate-sas-token-sign-requests.md)
- (optional) Get an account (TODO)
- [How to read a STAC Item in the Planetary Computer STAC catalog](./how-to-preview-stac-entry.md)
### Tiling an asset
The simplest endpoint on the DQE provides asset information necessary to construct a tiling URL specific to one of the supported projections.
For the purposes of this how-to we'll use this item from the STAC catalog: [PQE_METADATA_URL/collections/naip/items/md_m_3807619_se_18_060_20181025_20190211](PQE_METADATA_URL/collections/naip/items/md_m_3807619_se_18_060_20181025_20190211).
This URL includes the collection (naip) as well as the item ID (md_m_3807619_se_18_060_20181025_20190211).
To access asset information, '/map/tiles' is added to the metadata URL.
Additionally, `item` and `assets` query parameters specifying the item and asset for which metadata should be retrieved must be supplied.
In this case, the relevant asset found on the STAC API is labeled 'image'.
With these modifications, the updated URL is [PQE_DATA_URL/collections/naip/map/tiles?item=md_m_3807619_se_18_060_20181025_20190211&assets=image](PQE_DATA_URL/collections/naip/map/tiles?item=md_m_3807619_se_18_060_20181025_20190211&assets=image).
An extent property is provided which describes the region for which meaningful tiles can be expected.
```json
{
"spatial":{
"bbox":[
[
-76.69198556156623,
38.621369461223104,
-76.6200684427915,
38.69162323586947
]
]
}
}
```
In addition, a list of tile matrix sets is provided, each of which defines a [tiling pyramid](https://northstar-www.dartmouth.edu/doc/idl/html_6.2/Image_Tiling.html) supported by the Planetary Computer DQE.
The most common of the supported tile matrix sets is 'WebMercatorQuad', otherwise known as [Web Mercator](https://en.wikipedia.org/wiki/Web_Mercator_projection) or [EPSG:3857](https://spatialreference.org/ref/sr-org/epsg3857-wgs84-web-mercator-auxiliary-sphere/).
This full list is located under the key 'tileMatrixSetLinks':
```json
[
{
"tileMatrixSet":"LINZAntarticaMapTilegrid",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/LINZAntarticaMapTilegrid"
},
{
"tileMatrixSet":"CanadianNAD83_LCC",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/CanadianNAD83_LCC"
},
{
"tileMatrixSet":"NZTM2000",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/NZTM2000"
},
{
"tileMatrixSet":"UPSAntarcticWGS84Quad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/UPSAntarcticWGS84Quad"
},
{
"tileMatrixSet":"WorldCRS84Quad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/WorldCRS84Quad"
},
{
"tileMatrixSet":"WorldMercatorWGS84Quad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/WorldMercatorWGS84Quad"
},
{
"tileMatrixSet":"UPSArcticWGS84Quad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/UPSArcticWGS84Quad"
},
{
"tileMatrixSet":"WebMercatorQuad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/WebMercatorQuad"
},
{
"tileMatrixSet":"UTM31WGS84Quad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/UTM31WGS84Quad"
},
{
"tileMatrixSet":"EuropeanETRS89_LAEAQuad",
"tileMatrixSetURI":"PQE_DATA_URL/tileMatrixSets/EuropeanETRS89_LAEAQuad"
}
]
```
The final piece of information provided by the DQE API's info endpoint is a templated URL which can be used by Leaflet, OpenLayers, Bing Maps or just about any web mapping library to display this NAIP asset at varying resolutions and projections.
This template can be found in the array of links with the key 'links':
```json
[
{
"title":"Slippy Map Tiles",
"href":"PQE_DATA_URL/collections/naip/map/tiles/{tileMatrixSetId}/{tileMatrixZoom}/{tileRow}/{tileCol}.{format}?item=md_m_3807619_se_18_060_20181025_20190211&assets=image",
"rel":"item",
"type":"image/png",
"templated":true
}
]
```
The template in this case is "PQE_DATA_URL/collections/naip/map/tiles/{tileMatrixSetId}/{tileMatrixZoom}/{tileRow}/{tileCol}.{format}?item=md_m_3807619_se_18_060_20181025_20190211&assets=image".
A bit more information is required for this to work well with web mapping libraries.
First, tileMatrixSetId needs to be one of the supported tile matrix sets above.
Let's keep things simple: we'll use Web Mercator.
Its ID is 'WebMercatorQuad', so our updated URL should be "PQE_DATA_URL/collections/naip/map/tiles/WebMercatorQuad/{tileMatrixZoom}/{tileRow}/{tileCol}.{format}?item=md_m_3807619_se_18_060_20181025_20190211&assets=image"
Actually viewing this image will require the selection of some specific subset of the available bands on the tif.
One or more bands will need to be provided via query parameter (just like we did with assets above).
To provide a single band, the updated query parameters might be "?item=md_m_3807619_se_18_060_20181025_20190211&assets=image&bidx=1".
In this case, however, we want to view an RGB composite.
Luckily, that's exactly what the DQE assumes when three bands are provided.
The only thing necessary is to separate bands with a comma.
The updated template should now look like this: "PQE_DATA_URL/collections/naip/map/tiles/WebMercatorQuad/{tileMatrixZoom}/{tileRow}/{tileCol}.{format}?item=md_m_3807619_se_18_060_20181025_20190211&assets=image&bidx=1,2,3"
The final necessary modification is to convert tileMatrixZoom, tileRow, and tileCol to the expected template values for the web mapping library being used to display this tile layer and to select a format for the returned imagery.
For this how-to we'll use a very simple [leaflet map tile layer](https://leafletjs.com/reference-1.7.1.html#tilelayer) (if you prefer one of the alternatives, don't worry: the other web mapping libraries are quite similar).
![Leaflet tile layer docs](images/leaflet-tile-layer-docs.png)
Leaflet's documentation indicates that {z}, {x}, and {y} are the preferred template values for tile pyramids, so we'll switch tileMatrixZoom, tileRow, and tileCol to those, respectively.
That should leave us with the template URL "PQE_DATA_URL/collections/naip/map/tiles/WebMercatorQuad/{z}/{x}/{y}.{format}?item=md_m_3807619_se_18_060_20181025_20190211&assets=image&bidx=1,2,3"
The only remaining decision is what format we'd like.
Available extensions are: `png`, `npy` (yes, numpy tiles are supported!), `tif`, `jpg`, `jp2`, `webp`, and `pngraw`.
Web mapping libraries typically play nicely with `png`, so let's specify that.
At this point, the template should be ready for use in Leaflet: "PQE_DATA_URL/collections/naip/map/tiles/WebMercatorQuad/{z}/{x}/{y}.png?item=md_m_3807619_se_18_060_20181025_20190211&assets=image&bidx=1,2,3"
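As a sanity check, the template can be filled for a single tile in plain Python before wiring it into a mapping library. The service root below is an assumption standing in for `PQE_DATA_URL`, and the z/x/y values are arbitrary illustration:

```python
# Sketch: fill the templated tile URL for one concrete tile.
PQE_DATA_URL = "https://example.com/data/v1"  # assumption, not the real root

template = (
    PQE_DATA_URL
    + "/collections/naip/map/tiles/WebMercatorQuad/{z}/{x}/{y}.png"
    + "?item=md_m_3807619_se_18_060_20181025_20190211&assets=image&bidx=1,2,3"
)

# Leaflet performs this substitution itself; doing it by hand for one tile:
tile_url = template.format(z=13, x=2351, y=3141)
```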
Copy the HTML below into a new file named "index.html" and open it with your browser of choice to see a tiled web map for the selected NAIP asset.
```html
<html>
<head>
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.7.1/dist/leaflet.css"
integrity="sha512-xodZBNTC5n17Xt2atTPuE1HxjVMSvLVW9ocqUKLsCC5CXdbqCmblAshOMAS6/keqq/sMZMZ19scR4PsZChSR7A=="
crossorigin=""/>
</head>
<body>
<div id="map" style="position:fixed;right:0px;left:0px;height:100%;">
</div>
<script src="https://unpkg.com/leaflet@1.7.1/dist/leaflet.js"
integrity="sha512-XQoYMqMTK8LvdxXYG3nZ448hOEQiglfqkJs1NOQV44cWnUrBc8PkAOcXy20w0vlaXaVUearIOBhiXZ5V3ynxwA=="
crossorigin="">
</script>
<script>
var map = L.map('map', {
center: {lat: 38.652024950540856, lng: -76.65505472919905},
zoom: 13
});
var Stamen_TonerLite = L.tileLayer('http://stamen-tiles-{s}.a.ssl.fastly.net/toner-lite/{z}/{x}/{y}.png', {
attribution: 'Map tiles by <a href="http://stamen.com">Stamen Design</a>, <a href="http://creativecommons.org/licenses/by/3.0">CC BY 3.0</a> &mdash; Map data &copy; <a href="http://www.openstreetmap.org/copyright">OpenStreetMap</a>',
subdomains: 'abcd',
minZoom: 0,
maxZoom: 20
}).addTo(map);
var mosaicLayer = L.tileLayer("PQE_DATA_URL/collections/naip/map/tiles/WebMercatorQuad/{z}/{x}/{y}.png?item=md_m_3807619_se_18_060_20181025_20190211&assets=image&bidx=1,2,3", {
minZoom: 12,
maxZoom: 18
}).addTo(map);
</script>
</body>
</html>
```


@@ -1,49 +0,0 @@
# How to access a TIF asset with QGIS through the WMTS protocol
The Planetary Computer Asset Catalog is an index and store of a variety of geospatial assets available for users to incorporate in computational pipelines and applications.
If accessed from within the appropriate Azure infrastructure, there are almost no limits on the amounts or rates of access to this data.
None of the Planetary Computer's assets are exclusive to Azure infrastructure, however, and external users can access everything it has to offer with some minor rate limits.
In this how-to article, you will learn how to access individual `tif` assets indexed in the Planetary Computer's STAC catalog with the help of QGIS and the [WMTS protocol](https://www.ogc.org/standards/wmts).
### Prerequisites
- [Install QGIS](https://qgis.org/en/site/forusers/download.html)
- [How-to generate SAS token/sign requests](./how-to-generate-sas-token-sign-requests.md)
- (optional) Get an account (TODO)
- [How to read a STAC Item in the Planetary Computer STAC catalog](./how-to-preview-stac-entry.md)
### Constructing the WMTS URL
First, refer to the [API documentation](PC_OPENAPI_URL).
There, the URL template for WMTS connections is available: `PC_URL/data/v1/pctiler/{TileMatrixSet}/WMTSCapabilities.xml`
In addition to the base URL, a few query parameters will be necessary: the collection, item, asset, and set of bands which jointly define the minimal set of information necessary to view a TIF stored in the Planetary Computer's STAC catalog.
Here's what the query parameter template might look like: `?collection={collection_id}&items={item_id}&assets={asset}&bidx={bands}`.
For the purposes of this how-to, we'll look at the item al_m_3008503_ne_16_060_20191118_20200114 within the naip collection.
The asset which refers to the full scale TIF is 'image' and the first three bands are R, G, and B.
Using the popular WebMercatorQuad projection and tile matrix set, that leaves the URL: PC_URL/data/v1/pctiler/WebMercatorQuad/WMTSCapabilities.xml?collection=naip&items=al_m_3008503_ne_16_060_20191118_20200114&assets=image&bidx=1,2,3
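The same URL can be assembled with `urllib`, which also takes care of query-string encoding. `PC_URL` is this doc's unresolved placeholder, so the root below is an assumption:

```python
# Sketch: build the WMTS capabilities URL for the NAIP item above.
from urllib.parse import urlencode

PC_URL = "https://example.com"  # assumption, not the real service root

params = {
    "collection": "naip",
    "items": "al_m_3008503_ne_16_060_20191118_20200114",
    "assets": "image",
    "bidx": "1,2,3",
}
wmts_url = (
    f"{PC_URL}/data/v1/pctiler/WebMercatorQuad/WMTSCapabilities.xml"
    f"?{urlencode(params)}"
)
```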
Next, we will use this URL to add a WMTS layer to QGIS.
### Adding a WMTS layer to QGIS
With QGIS open, go to `Layer > Add Layer > Add WMS/WMTS layer...`
![Add a WMS/WMTS layer](images/add_layer_dialog_wmts.png)
Select `New` to add a new WMTS layer source
![Create a new WMTS layer](images/create_new_wmts_connection.png)
Add a name/label to keep track of this layer, and paste the URL constructed above (PC_URL/data/v1/pctiler/WebMercatorQuad/WMTSCapabilities.xml?collection=naip&items=al_m_3008503_ne_16_060_20191118_20200114&assets=image&bidx=1,2,3) into the URL field.
Hit "OK" once the Name and URL fields are completed
![Create new WMTS layer dialog](images/add_wmts_url.png)
Connect to the newly defined layer and hit 'Add'
![Connect to the newly created WMTS layer](images/connect_to_wmts.png)
That's it!
You should have a tiled layer available for inspection in your QGIS client.
![Inspect the newly added layer](images/wmts_success.png)


@@ -1,33 +0,0 @@
# Rate limits and access restrictions
To prevent abuse, the SAS token related endpoints ([token](https://planetarycomputer.microsoft.com/data/v1/token) and [sign](https://planetarycomputer.microsoft.com/data/v1/sign)) are rate limited by requesting IP. There are two boolean variables that control which rate limiting policy is applied to incoming requests:
* Whether or not the request is originating from within the same datacenter as the Planetary Computer service
* Whether or not a valid API subscription key has been supplied on the request
These two variables are used to determine the tier of rate limiting which is applied to requests, as well as the valid length of time for issued SAS tokens.
## Within datacenter
The Planetary Computer service is running within the Azure West Europe datacenter. The IP address of incoming requests is checked against the published set of IP ranges for the West Europe datacenter. For example, an Azure VM running within the West Europe datacenter would pass this check. Because the data is kept local to the datacenter, a less stringent rate limiting policy is applied. This check is based purely on the IP of the incoming request, and no additional headers or query parameters need to be supplied.
## Supplied subscription key
Signing up for an API subscription key is optional, however supplying one on a request helps the system track usage and prevent abuse, and therefore allows for a less stringent rate limiting/token expiration policy. There are two ways to supply a subscription key on requests:
* Supply it in an `Ocp-Apim-Subscription-Key` request header
* Supply it in a `subscription-key` query parameter
Alternatively, the [Planetary Computer SDK for Python](https://github.com/microsoft/planetary-computer-sdk-for-python) may be used to aid in this.
## Rate limits and expirations
Rate limits and expiration values table based on the two variables defined above:
| Variables | Requests per minute | Token expiration minutes |
|-------------------------------------|---------------------|--------------------------|
|Within datacenter, with subscription | 120 | 60 * 24 * 32 (~1 month) |
|Within datacenter, no subscription | 60 | 60 (1 hour) |
|Outside datacenter, with subscription| 10 | 60 (1 hour) |
|Outside datacenter, no subscription | 5 | 5 |
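For programmatic use, the table reduces to a lookup keyed on the two booleans; an unofficial sketch (not part of the service's API):

```python
# Sketch (unofficial): the rate-limit table as a lookup keyed on the two
# booleans described above.
POLICIES = {
    # (within_datacenter, has_subscription): (requests/minute, expiry minutes)
    (True, True): (120, 60 * 24 * 32),  # ~1 month
    (True, False): (60, 60),
    (False, True): (10, 60),
    (False, False): (5, 5),
}

def policy(within_datacenter: bool, has_subscription: bool):
    return POLICIES[(within_datacenter, has_subscription)]
```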