Prototype OIDC auth and proxy flow (#235)
* Proof of concept OIDC authentication workflow * BFF token-proxy implementation using azure functions * oAuth2 flow implementation * Frontend toggle for login and secure session proxy * Feature flag WIP Pin Layer feature * Update functions version and containers for dev * Use azure-tables for session persistence * Add CSRF protection, use POST for state changes * Implement oAuth2 state and OIDC nonce checks * Update readme
Parent: 00bec4adcb
Commit: 9cd5491efd
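The pieces listed above fit together as a standard authorization-code ("backend for frontend") exchange. The following is an illustrative sketch only, not code from this commit: the step names and the `/api/auth/*` routes match the functions in this diff, while the identity-provider URL, client id/secret, and redirect URL are placeholder values.

```python
# Minimal sketch of the OIDC authorization-code flow the functions below
# implement. Placeholder values only; real settings come from
# local.settings.json / the Static Web App configuration.
import secrets
from urllib.parse import urlencode

import requests

PCID_CONFIG_URL = "https://idp.example.com/.well-known/openid-configuration"  # placeholder
PCID_CLIENT_ID = "placeholder-client-id"
PCID_CLIENT_SECRET = "placeholder-client-secret"
PCID_REDIRECT_URL = "http://localhost:4280/api/auth/login/callback"


def build_login_redirect():
    """Step 1 (GET /api/auth/login): redirect the browser to the IdP with state + nonce."""
    config = requests.get(PCID_CONFIG_URL).json()
    state, nonce = secrets.token_hex(), secrets.token_hex()
    query = urlencode(
        {
            "client_id": PCID_CLIENT_ID,
            "redirect_uri": PCID_REDIRECT_URL,
            "scope": "openid email",
            "response_type": "code",
            "state": state,
            "nonce": nonce,
        }
    )
    # state and nonce are also written to a short-lived cookie so the callback
    # can verify them (see make_oidc_state_nonce_cookie below).
    return f"{config['authorization_endpoint']}?{query}", state, nonce


def exchange_code(code: str) -> dict:
    """Step 2 (login callback): trade the auth code for tokens server-side."""
    config = requests.get(PCID_CONFIG_URL).json()
    resp = requests.post(
        config["token_endpoint"],
        data={
            "grant_type": "authorization_code",
            "code": code,
            "client_id": PCID_CLIENT_ID,
            "client_secret": PCID_CLIENT_SECRET,
            "redirect_uri": PCID_REDIRECT_URL,
        },
    )
    resp.raise_for_status()
    # The callback verifies the id_token, stores it server-side in azure-tables,
    # and hands the browser only an opaque session-id cookie.
    return resp.json()
```

Authenticated frontend calls then go through the `/api/px/{*restOfPath}` proxy function, which looks up the stored token by session id and forwards the request upstream with a bearer header.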
@ -374,6 +374,7 @@ public/_downloads/
|
|||
# misc
|
||||
.DS_Store
|
||||
.env*
|
||||
.vscode/settings.json
|
||||
|
||||
npm-debug.log*
|
||||
yarn-debug.log*
|
||||
|
|
README.md (28 lines changed)
|
@ -1,6 +1,6 @@
|
|||
# Planetary Computer Data Catalog
|
||||
|
||||
⚠️ Note: This repository serves as a reference implementation for interacting with
|
||||
Note: This repository serves as a reference implementation for interacting with
|
||||
Planetary Computer APIs on Azure. This code supports the production deployment of
|
||||
the Planetary Computer Data Catalog and Explorer applications. This repository
|
||||
is not meant to be reusable in other situations without significant
|
||||
|
@ -22,38 +22,32 @@ A homepage, data catalog, and visualizations for the Planetary Computer.
|
|||
|
||||
## Requirements
|
||||
|
||||
- Node v14.15 (LTS)
|
||||
- Yarn
|
||||
- Docker
|
||||
- docker-compose
|
||||
|
||||
## Getting started
|
||||
|
||||
The easiest way to ensure your node environment matches the requirements is to use [nvm](https://github.com/nvm-sh/nvm#installing-and-updating), and in the project root, run:
|
||||
|
||||
```sh
|
||||
nvm use
|
||||
```
|
||||
|
||||
You can install `yarn` globally for that node version with:
|
||||
|
||||
```sh
|
||||
npm install -g yarn
|
||||
```
|
||||
|
||||
Now install the app dependencies and build the docs and external notebooks for the project:
|
||||
The entire development environment is created as part of a multi-container docker-compose setup. To
|
||||
fetch and build the images, run:
|
||||
|
||||
```sh
|
||||
./scripts/update
|
||||
|
||||
```
|
||||
|
||||
Now you can start the development server, and the site should be accessible at <http://localhost:3000>.
|
||||
Now you can start the development server, and the site should be accessible at <http://localhost:4280>.
|
||||
|
||||
```sh
|
||||
./scripts/server
|
||||
```
|
||||
|
||||
If you want to run just the frontend development server on your host, ensure you have Node 14 installed and run:
|
||||
|
||||
```sh
|
||||
yarn install
|
||||
yarn start
|
||||
```
|
||||
|
||||
To build the latest docs or external notebooks, or if any new dependencies have been added, re-run `./scripts/update` (you may need to refresh the app in your browser if the site was running).
|
||||
|
||||
### Developing
|
||||
|
|
|
@ -0,0 +1,25 @@
|
|||
FROM mcr.microsoft.com/azure-functions/python:3.0-python3.8
|
||||
|
||||
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
|
||||
AzureFunctionsJobHost__Logging__Console__IsEnabled=true
|
||||
|
||||
# Install azure functions core tools
|
||||
ENV DEBIAN_VERSION=10
|
||||
|
||||
RUN wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg
|
||||
RUN mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/
|
||||
RUN wget -q https://packages.microsoft.com/config/debian/$DEBIAN_VERSION/prod.list
|
||||
RUN mv prod.list /etc/apt/sources.list.d/microsoft-prod.list
|
||||
RUN chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg
|
||||
RUN chown root:root /etc/apt/sources.list.d/microsoft-prod.list
|
||||
|
||||
RUN apt-get update
|
||||
|
||||
# Must use functions v3 to keep support for function proxies
|
||||
RUN apt-get install azure-functions-core-tools-3
|
||||
|
||||
COPY requirements.txt /
|
||||
RUN pip install -r /requirements.txt
|
||||
|
||||
COPY requirements-dev.txt /
|
||||
RUN pip install -r /requirements-dev.txt
|
|
@ -18,8 +18,17 @@ The `local.settings.json` file has the following keys in the Values section:
|
|||
|`AuthAdminUrl`| | URL to the PC ID admin page which contains the signup table. Used in the Teams notification message.
|
||||
|`SignupUrl`| | URL to POST new user content to on submission
|
||||
|`SignupToken` | `pc-id--request-auth-token` | Bearer token required to make the above POST request
|
||||
|`PCID_CLIENT_ID`| | The local or remote oAuth2 application id in pc-id for the front end|
|
||||
|`PCID_CLIENT_SECRET` | | The client secret associated with the client id above |
|
||||
|`PCID_CONFIG_URL`| | The URL to the pc-id well-known OIDC configuration endpoint (<https://planetarycomputer-staging.microsoft.com/id/o/.well-known/openid-configuration/>)|
|
||||
|`PCID_REDIRECT_URL` | | The login callback for the oAuth flow (<http://localhost:4280/api/auth/login/callback>)
|
||||
|`PROXY_API_ROOT` | | The URL to be used as the proxy upstream (<https://planetarycomputer-staging.microsoft.com/api>)
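A quick way to sanity-check these values (a throwaway sketch, not part of this commit) is to fetch the well-known document behind `PCID_CONFIG_URL` and print the endpoints the auth functions read via `get_oidc_prop`:

```python
import os

import requests

# Assumes the values above are exported in the environment, as they are when
# the function host loads local.settings.json.
config = requests.get(os.environ["PCID_CONFIG_URL"], timeout=10).json()

print("authorization_endpoint:", config["authorization_endpoint"])
print("token_endpoint:", config["token_endpoint"])
print("jwks_uri:", config["jwks_uri"])
```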
|
||||
|
||||
### Production
|
||||
|
||||
These values, or their production equivalents, are configured in the Static
|
||||
Web App `App Configuration` section.
|
||||
|
||||
Additionally, `SESSION_CONN_STR` and `SESSION_TABLE_NAME` should be configured
|
||||
in the production/staging setting. In the development environment, they are set
|
||||
in the docker-compose file.
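Before starting the host in a new environment, a small sketch like the following (not part of the repository) can confirm that all of the settings named above are actually present:

```python
import os

# Setting names from this README; SESSION_* come from docker-compose in dev
# and from the Static Web App configuration in staging/production.
required = [
    "PCID_CLIENT_ID",
    "PCID_CLIENT_SECRET",
    "PCID_CONFIG_URL",
    "PCID_REDIRECT_URL",
    "PROXY_API_ROOT",
    "SESSION_CONN_STR",
    "SESSION_TABLE_NAME",
]

missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing settings: {', '.join(missing)}")
print("All auth/proxy settings are present.")
```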
|
||||
|
|
|
@ -0,0 +1,28 @@
|
|||
import json
|
||||
import logging
|
||||
import azure.functions as func
|
||||
from azure.core.exceptions import ResourceNotFoundError
|
||||
|
||||
from ..pccommon.csrf import check_csrf
|
||||
from ..pccommon.session_manager import InvalidSessionCookie, SessionManager
|
||||
from ..pccommon.csrf import check_csrf, CSRFException
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
try:
|
||||
check_csrf(req)
|
||||
_ = SessionManager(req)
|
||||
# TODO: Refresh token flow
|
||||
return func.HttpResponse(
|
||||
body=json.dumps({"isLoggedIn": True}),
|
||||
mimetype="application/json",
|
||||
)
|
||||
except (InvalidSessionCookie, ResourceNotFoundError, CSRFException):
|
||||
logging.exception("Invalid session or CSRF")
|
||||
return func.HttpResponse(
|
||||
status_code=401,
|
||||
body=json.dumps(
|
||||
{"isLoggedIn": False},
|
||||
),
|
||||
mimetype="application/json",
|
||||
)
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["post"],
|
||||
"route": "auth/refresh"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,47 @@
|
|||
import secrets
|
||||
from urllib.parse import urlparse
|
||||
|
||||
import requests
|
||||
import azure.functions as func
|
||||
|
||||
from ..pccommon.auth import (
|
||||
make_auth_url,
|
||||
get_oidc_prop,
|
||||
make_oidc_state_nonce_cookie,
|
||||
)
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
"""Initiate the login sequence with the identity provider."""
|
||||
auth_endpoint = get_oidc_prop("authorization_endpoint")
|
||||
state_nonce = secrets.token_hex()
|
||||
token_nonce = secrets.token_hex()
|
||||
url = make_auth_url(
|
||||
auth_endpoint,
|
||||
scopes=["openid", "email"],
|
||||
state=state_nonce,
|
||||
nonce=token_nonce,
|
||||
)
|
||||
resp = requests.get(url, allow_redirects=False)
|
||||
|
||||
pcid_parsed = urlparse(auth_endpoint)
|
||||
redirect_loc = (
|
||||
f"{pcid_parsed.scheme}://{pcid_parsed.netloc}{resp.headers['Location']}"
|
||||
)
|
||||
|
||||
if resp.status_code == 302:
|
||||
return func.HttpResponse(
|
||||
status_code=302,
|
||||
headers={
|
||||
"Location": redirect_loc,
|
||||
# Note that there is an issue with adding multiple cookie
|
||||
# headers, so nonce and state are combined.
|
||||
# https://github.com/Azure/azure-functions-python-worker/issues/892
|
||||
"Set-Cookie": make_oidc_state_nonce_cookie(state_nonce, token_nonce),
|
||||
},
|
||||
)
|
||||
|
||||
return func.HttpResponse(
|
||||
status_code=404,
|
||||
body="oAuth Flow redirect not found",
|
||||
)
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["get"],
|
||||
"route": "auth/login"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,127 @@
|
|||
from typing import cast, Tuple
|
||||
from datetime import datetime
|
||||
from http.cookies import SimpleCookie
|
||||
from urllib.parse import unquote
|
||||
|
||||
import logging
|
||||
import os
|
||||
import secrets
|
||||
|
||||
import azure.functions as func
|
||||
import jwt
|
||||
import requests
|
||||
|
||||
from ..pccommon.auth import (
|
||||
NONCE_SEPERATOR,
|
||||
OAUTH_NONCE_COOKIE,
|
||||
generate_rsa_pub,
|
||||
get_oidc_prop,
|
||||
make_session_cookie,
|
||||
make_token_form,
|
||||
)
|
||||
from ..pccommon.session_table import SessionTable
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
"""
|
||||
Exchange the authorization code for an access token and set the secure
|
||||
session cookie on the client.
|
||||
"""
|
||||
# Get the auth code from the OIDC provider
|
||||
# NB: The code param is `_code` and proxied to this function to avoid
|
||||
# the limitation described here:
|
||||
# https://github.com/Azure/static-web-apps/issues/165
|
||||
authorization_code = req.params.get("_code")
|
||||
|
||||
if authorization_code is None:
|
||||
logging.info("oAuth2 code flow failed to provide callback code")
|
||||
return func.HttpResponse(status_code=500, body="")
|
||||
|
||||
if not is_response_state_valid(req):
|
||||
raise Exception("Invalid oAuth2 state exchange")
|
||||
|
||||
# Exchange the auth code for an access token
|
||||
client_id = os.environ.get("PCID_CLIENT_ID")
|
||||
token_url = get_oidc_prop("token_endpoint")
|
||||
headers = {"Content-Type": "application/x-www-form-urlencoded"}
|
||||
form = make_token_form(authorization_code)
|
||||
|
||||
resp = requests.post(token_url, data=form, headers=headers)
|
||||
|
||||
if resp.status_code == 200:
|
||||
# Get the jwt from the access token and verify its signature
|
||||
jwt_encoded = resp.json().get("id_token")
|
||||
|
||||
kid = jwt.get_unverified_header(jwt_encoded)["kid"]
|
||||
public_key = generate_rsa_pub(kid)
|
||||
try:
|
||||
|
||||
jwt_decoded = jwt.decode(
|
||||
jwt_encoded,
|
||||
public_key,
|
||||
audience=client_id,
|
||||
options={"verify_exp": True},
|
||||
algorithms=["RS256"],
|
||||
)
|
||||
|
||||
# Verify that the client nonce matches the nonce in the token
|
||||
_, client_nonce = get_client_nonces(req)
|
||||
if client_nonce != jwt_decoded.get("nonce"):
|
||||
raise Exception("Nonce mismatch in token.")
|
||||
|
||||
# Create a session in the session table to store the jwt
|
||||
# this is not stored in the cookie
|
||||
session_id = secrets.token_hex()
|
||||
with SessionTable() as client:
|
||||
client.set_session_data(session_id, jwt_decoded, jwt_encoded)
|
||||
|
||||
expiry = datetime.fromtimestamp(cast(float, jwt_decoded.get("exp")))
|
||||
age = expiry - datetime.now()
|
||||
headers = {
|
||||
"Location": "/",
|
||||
"Set-Cookie": make_session_cookie(session_id, age.seconds),
|
||||
}
|
||||
|
||||
return func.HttpResponse(status_code=302, headers=headers)
|
||||
except Exception as e:
|
||||
logging.exception("Error decoding JWT")
|
||||
return func.HttpResponse(status_code=500, body=str(e))
|
||||
|
||||
logging.info(f"oAuth2 code flow failed: {resp.status_code} - {resp.text}")
|
||||
return func.HttpResponse(
|
||||
status_code=resp.status_code, body="Bad response for access token"
|
||||
)
|
||||
|
||||
|
||||
def is_response_state_valid(req: func.HttpRequest) -> bool:
|
||||
"""
|
||||
Check that the state value returned from the auth provider matches the
|
||||
value generated for the request and stored in the client cookie
|
||||
"""
|
||||
resp_state_nonce = req.params.get("state")
|
||||
state, _ = get_client_nonces(req)
|
||||
|
||||
return state == resp_state_nonce
|
||||
|
||||
|
||||
def get_client_nonces(req: func.HttpRequest) -> Tuple[str, str]:
|
||||
"""
|
||||
Get the oAuth state and nonce values from the client cookie
|
||||
"""
|
||||
cookie_reader: SimpleCookie = SimpleCookie(req.headers.get("Cookie"))
|
||||
client_nonce = cookie_reader.get(OAUTH_NONCE_COOKIE)
|
||||
|
||||
if client_nonce is None:
|
||||
raise Exception("No oAuth2 nonce in cookie")
|
||||
|
||||
try:
|
||||
# NB: The cookie wasn't explicitly url-encoded, but the AZ function
|
||||
# runtime does that automatically. The value needs to be unquoted.
|
||||
state, nonce = unquote(client_nonce.value).split(NONCE_SEPERATOR)
|
||||
assert state
|
||||
assert nonce
|
||||
|
||||
return state, nonce
|
||||
except Exception:
|
||||
logging.exception("Failed to get state and nonce from cookie")
|
||||
raise Exception("Invalid oAuth2 state/nonce cookie")
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["get", "post"],
|
||||
"route": "auth/login/callback-proxy"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,38 @@
|
|||
from urllib.parse import urlparse
|
||||
import azure.functions as func
|
||||
|
||||
from ..pccommon.auth import get_invalidated_session_cookie, get_oidc_prop
|
||||
from ..pccommon.session_manager import InvalidSessionCookie, SessionManager
|
||||
from ..pccommon.csrf import check_csrf
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
"""
|
||||
Initiate the logout sequence by redirecting the user to the PCID logout
|
||||
endpoint.
|
||||
"""
|
||||
|
||||
check_csrf(req)
|
||||
|
||||
try:
|
||||
# Terminate the session
|
||||
session = SessionManager(req)
|
||||
session.destroy()
|
||||
|
||||
# Redirect to PCID logout endpoint - currently not used
|
||||
# Do we want to log out of PCID or just the front end?
|
||||
# ==============
|
||||
# auth_endpoint = get_oidc_prop("authorization_endpoint")
|
||||
# id_url = urlparse(auth_endpoint)
|
||||
# host = req.headers.get("Host")
|
||||
# client_logout_url = f"{id_url.scheme}://{host}/api/auth/logout/callback"
|
||||
# id_logout_url = f"{id_url.scheme}://{id_url.netloc}/id/accounts/logout/?next={client_logout_url}" # noqa E501
|
||||
# "Location": id_logout_url,
|
||||
|
||||
headers = {
|
||||
"Set-Cookie": get_invalidated_session_cookie(),
|
||||
}
|
||||
return func.HttpResponse(status_code=200, headers=headers)
|
||||
|
||||
except InvalidSessionCookie:
|
||||
return func.HttpResponse(status_code=401)
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["post"],
|
||||
"route": "auth/logout"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,6 @@
|
|||
import azure.functions as func
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
"""Receive callback from the id app after the user has logged out."""
|
||||
return func.HttpResponse(status_code=302, headers={"Location": "/"})
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["get", "post"],
|
||||
"route": "auth/logout/callback"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,29 @@
|
|||
import json
|
||||
import azure.functions as func
|
||||
|
||||
from azure.core.exceptions import ResourceNotFoundError
|
||||
|
||||
from ..pccommon.session_manager import InvalidSessionCookie, SessionManager
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
"""Return userinfo"""
|
||||
|
||||
try:
|
||||
session = SessionManager(req)
|
||||
body = {
|
||||
"isLoggedIn": True,
|
||||
"email": session.email,
|
||||
"expires": session.expires.isoformat(),
|
||||
}
|
||||
return func.HttpResponse(
|
||||
body=json.dumps(body),
|
||||
status_code=200,
|
||||
mimetype="application/json",
|
||||
)
|
||||
except (InvalidSessionCookie, ResourceNotFoundError):
|
||||
return func.HttpResponse(
|
||||
body=json.dumps({"isLoggedIn": False}),
|
||||
status_code=200,
|
||||
mimetype="application/json",
|
||||
)
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["get", "post"],
|
||||
"route": "auth/me"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,86 @@
|
|||
from typing import Union
|
||||
import os
|
||||
import requests
|
||||
import azure.functions as func
|
||||
|
||||
from azure.core.exceptions import ResourceNotFoundError
|
||||
|
||||
from ..pccommon.session_manager import InvalidSessionCookie, SessionManager
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
"""Proxies the request to the API and includes the bearer token"""
|
||||
# The proxy provides the templated path parameters in the route "restOfPath"
|
||||
rest = req.route_params.get("restOfPath")
|
||||
upstream_url = os.environ.get("PROXY_API_ROOT")
|
||||
url = f"{upstream_url}/{rest}"
|
||||
|
||||
try:
|
||||
# Get the jwt from the session store and apply it to the auth header
|
||||
session = SessionManager(req)
|
||||
token = session.id_token
|
||||
headers = get_request_headers(req.headers, token)
|
||||
data = req.get_body() if req.method == "POST" else None
|
||||
params = reconstruct_params(req.params)
|
||||
|
||||
# Fetch from upstream
|
||||
resp = requests.request(
|
||||
req.method, url, params=params, headers=headers, data=data, stream=True
|
||||
)
|
||||
|
||||
# Let the proxied response set the length and encoding of the body
|
||||
if "content-length" in resp.headers:
|
||||
del resp.headers["content-length"]
|
||||
if "content-encoding" in resp.headers:
|
||||
del resp.headers["content-encoding"]
|
||||
|
||||
# Proxy the response back to the client
|
||||
return func.HttpResponse(
|
||||
body=resp.content,
|
||||
status_code=resp.status_code,
|
||||
headers=resp.headers,
|
||||
mimetype=resp.headers["Content-Type"],
|
||||
)
|
||||
except (InvalidSessionCookie, ResourceNotFoundError):
|
||||
return func.HttpResponse(
|
||||
status_code=401,
|
||||
)
|
||||
|
||||
|
||||
def get_request_headers(headers, token: str) -> dict:
|
||||
"""
|
||||
Construct headers for the proxied request from the incoming request
|
||||
headers. Add the token to the Authorization header.
|
||||
"""
|
||||
headers = {k: v for k, v in headers.items()}
|
||||
headers = {
|
||||
"user-agent": headers.get("user-agent"),
|
||||
"accept": headers.get("accept"),
|
||||
"cache-control": headers.get("cache-control"),
|
||||
"accept-language": headers.get("accept-language"),
|
||||
"connection": headers.get("connection"),
|
||||
"sec-fetch-site": headers.get("sec-fetch-site"),
|
||||
"sec-fetch-user": headers.get("sec-fetch-user"),
|
||||
"sec-fetch-mode": headers.get("sec-fetch-mode"),
|
||||
"sec-fetch-dest": headers.get("sec-fetch-dest"),
|
||||
"x-forwarded-for": headers.get("x-forwarded-for"),
|
||||
"authorization": f"Bearer {token}",
|
||||
}
|
||||
|
||||
return headers
|
||||
|
||||
|
||||
def reconstruct_params(params: dict) -> Union[dict, None]:
|
||||
"""
|
||||
Azure Functions collapses duplicate query string keys into a single
comma-separated value. Split known multi-value keys back into lists so the
upstream request carries them as repeated parameters again.
|
||||
"""
|
||||
if not params:
|
||||
return None
|
||||
splitable_list = ["assets"]
|
||||
formatted = {}
|
||||
for key, value in params.items():
|
||||
formatted[key] = value.split(",") if key in splitable_list else value
|
||||
|
||||
return formatted
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": ["get", "post"],
|
||||
"route": "px/{*restOfPath}"
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -0,0 +1,34 @@
|
|||
import logging
|
||||
|
||||
import azure.functions as func
|
||||
import requests
|
||||
|
||||
from ..pccommon.session_table import SessionTable
|
||||
|
||||
|
||||
def main(req: func.HttpRequest) -> func.HttpResponse:
|
||||
logging.info("Python HTTP trigger function processed a request.")
|
||||
|
||||
query = req.params.get("query")
|
||||
ql = 0
|
||||
|
||||
px = req.params.get("px")
|
||||
pxl = 0
|
||||
if px:
|
||||
logging.info("using proxy param")
|
||||
resp = requests.get(
|
||||
f"https://planetarycomputer-staging.microsoft.com/api/stac/v1/collections/{px}/items?limit=1"
|
||||
)
|
||||
pxl = len(resp.text)
|
||||
|
||||
if query:
|
||||
logging.info("using query param")
|
||||
with SessionTable() as client:
|
||||
session_data = client.get_session_data(query)
|
||||
ql = True
|
||||
|
||||
if ql or pxl > 0:
|
||||
return func.HttpResponse(f"query: {ql}, px: {pxl}")
|
||||
|
||||
else:
|
||||
return func.HttpResponse("Baseline no-op function", status_code=200)
|
|
@ -0,0 +1,20 @@
|
|||
{
|
||||
"scriptFile": "__init__.py",
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": [
|
||||
"get",
|
||||
"post"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "$return"
|
||||
}
|
||||
]
|
||||
}
|
|
@ -4,17 +4,16 @@
|
|||
"fileLoggingMode": "always",
|
||||
"logLevel": {
|
||||
"default": "Information",
|
||||
"Function": "Debug"
|
||||
"Function": "Information"
|
||||
},
|
||||
"applicationInsights": {
|
||||
"samplingSettings": {
|
||||
"isEnabled": true,
|
||||
"excludedTypes": "Request"
|
||||
"isEnabled": true
|
||||
}
|
||||
}
|
||||
},
|
||||
"extensionBundle": {
|
||||
"id": "Microsoft.Azure.Functions.ExtensionBundle",
|
||||
"version": "[1.*, 2.0.0)"
|
||||
"version": "[2.*, 3.0.0)"
|
||||
}
|
||||
}
|
||||
|
|
|
@ -5,6 +5,11 @@
|
|||
"AuthAdminUrl": "",
|
||||
"SignupUrl": "",
|
||||
"SignupToken": "",
|
||||
"PCID_CLIENT_ID": "",
|
||||
"PCID_CLIENT_SECRET": "",
|
||||
"PCID_CONFIG_URL": "",
|
||||
"PCID_REDIRECT_URL": "",
|
||||
|
||||
"FUNCTIONS_WORKER_RUNTIME": "python"
|
||||
}
|
||||
}
|
||||
|
|
|
@ -0,0 +1,128 @@
|
|||
import logging
|
||||
import os
|
||||
|
||||
from typing import List
|
||||
from functools import lru_cache
|
||||
|
||||
import jwt
|
||||
import requests
|
||||
|
||||
|
||||
SESSION_COOKIE = "mspc_session_id"
|
||||
OAUTH_NONCE_COOKIE = "mspc_oauth_nonce"
|
||||
NONCE_SEPERATOR = ":"
|
||||
|
||||
|
||||
@lru_cache(maxsize=1)
|
||||
def get_oidc_config():
|
||||
"""Return the OIDC configuration from the identity provider."""
|
||||
config_url = os.environ.get("PCID_CONFIG_URL")
|
||||
resp = requests.get(config_url)
|
||||
|
||||
if resp.ok:
|
||||
logging.info("Connected to OIDC config endpoint")
|
||||
return resp.json()
|
||||
|
||||
raise Exception("Failed to get PCID config")
|
||||
|
||||
|
||||
def get_oidc_prop(prop_name: str):
|
||||
"""Get a specific property value from the OIDC configuration."""
|
||||
config = get_oidc_config()
|
||||
return config.get(prop_name)
|
||||
|
||||
|
||||
def make_auth_url(auth_endpoint: str, scopes: List[str], state: str, nonce: str):
|
||||
"""Construct an authorization request URL for the provided auth endpoint."""
|
||||
client_id = os.environ.get("PCID_CLIENT_ID")
|
||||
redirect_uri = os.environ.get("PCID_REDIRECT_URL")
|
||||
|
||||
oidc_scopes = " ".join(scopes)
|
||||
params = {
|
||||
"client_id": client_id,
|
||||
"redirect_uri": redirect_uri,
|
||||
"scope": oidc_scopes,
|
||||
"response_type": "code",
|
||||
"response_mode": "form_post",
|
||||
"state": state,
|
||||
"nonce": nonce,
|
||||
}
|
||||
qs = "&".join([f"{k}={v}" for k, v in params.items()])
|
||||
|
||||
return f"{auth_endpoint}?{qs}"
|
||||
|
||||
|
||||
def make_token_form(authorization_code: str):
|
||||
"""Form encoded payload to exchange auth code for access token"""
|
||||
client_id = os.environ.get("PCID_CLIENT_ID")
|
||||
client_secret = os.environ.get("PCID_CLIENT_SECRET")
|
||||
redirect_uri = os.environ.get("PCID_REDIRECT_URL")
|
||||
|
||||
return "".join(
|
||||
[
|
||||
f"client_id={client_id}",
|
||||
f"&redirect_uri={redirect_uri}",
|
||||
"&grant_type=authorization_code",
|
||||
f"&code={authorization_code}",
|
||||
f"&client_secret={client_secret}",
|
||||
]
|
||||
)
|
||||
|
||||
|
||||
def generate_rsa_pub(kid: str) -> str:
|
||||
"""
|
||||
Generate an RSA public key from the published OIDC JSON Web Key
|
||||
configuration for the key identifier (kid) used to sign the JWT.
|
||||
"""
|
||||
jwks_uri = get_oidc_prop("jwks_uri")
|
||||
jwks = requests.get(jwks_uri).json()
|
||||
jwk = [jwk for jwk in jwks["keys"] if jwk["kid"] == kid]
|
||||
|
||||
if len(jwk) > 0:
|
||||
return jwt.algorithms.RSAAlgorithm.from_jwk(jwk[0])
|
||||
|
||||
raise Exception("Failed to get public key for provided token kid")
|
||||
|
||||
|
||||
def get_invalidated_session_cookie():
|
||||
"""Returns a session cookie header that expires immediately."""
|
||||
frags = {
|
||||
SESSION_COOKIE: "",
|
||||
"Max-Age": "1",
|
||||
"Expires": "Thu, 01 Jan 1970 00:00:00 GMT",
|
||||
"Path": "/",
|
||||
"SameSite": "strict",
|
||||
}
|
||||
crumbs = ";".join([f"{k}={v}" for k, v in frags.items()])
|
||||
return f"{crumbs}; HttpOnly; Secure"
|
||||
|
||||
|
||||
def make_session_cookie(session_id: str, max_age: int = 3600):
|
||||
"""
|
||||
Returns a secure, httponly cookie header with an expiration
|
||||
to identify the session.
|
||||
"""
|
||||
frags = {
|
||||
SESSION_COOKIE: session_id,
|
||||
"Max-Age": max_age,
|
||||
"Path": "/",
|
||||
"SameSite": "strict",
|
||||
}
|
||||
crumbs = ";".join([f"{k}={v}" for k, v in frags.items()])
|
||||
return f"{crumbs}; HttpOnly; Secure"
|
||||
|
||||
|
||||
def make_oidc_state_nonce_cookie(state: str, nonce: str, max_age: int = 40):
|
||||
"""
|
||||
Returns a secure, httponly cookie header with a quick expiration
|
||||
to preserve the oAuth state and nonce for a single login attempt.
|
||||
"""
|
||||
cookie_value = f"{state}{NONCE_SEPERATOR}{nonce}"
|
||||
frags = {
|
||||
OAUTH_NONCE_COOKIE: cookie_value,
|
||||
"Max-Age": max_age,
|
||||
"Path": "/",
|
||||
"SameSite": "lax",
|
||||
}
|
||||
crumbs = ";".join([f"{k}={v}" for k, v in frags.items()])
|
||||
return f"{crumbs}; HttpOnly; Secure"
|
|
@ -0,0 +1,19 @@
|
|||
import logging
|
||||
from urllib.parse import urlparse
|
||||
|
||||
import azure.functions as func
|
||||
|
||||
|
||||
class CSRFException(Exception):
|
||||
pass
|
||||
|
||||
|
||||
def check_csrf(req: func.HttpRequest) -> None:
|
||||
"""Check that the origin of the request matches the host"""
|
||||
origin = urlparse(req.headers.get("Origin") or "")
|
||||
|
||||
# Functions are proxied, so can't use host directly
|
||||
host = urlparse(req.headers.get("x-ms-original-url"))
|
||||
|
||||
if not origin.netloc or origin.netloc != host.netloc:
|
||||
raise CSRFException(f"CSRF request made from {origin.netloc} to {host.netloc}")
|
|
@ -0,0 +1,51 @@
|
|||
from datetime import datetime
|
||||
from http.cookies import SimpleCookie
|
||||
from typing import Optional
|
||||
|
||||
import azure.functions as func
|
||||
|
||||
from .auth import SESSION_COOKIE
|
||||
from .session_table import SessionTable
|
||||
|
||||
|
||||
class InvalidSessionCookie(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class SessionManager:
|
||||
def __init__(self, req: func.HttpRequest):
|
||||
cookie_reader: SimpleCookie = SimpleCookie()
|
||||
cookie_header = req.headers.get("Cookie")
|
||||
|
||||
if cookie_header is None:
|
||||
raise InvalidSessionCookie("No cookie header found")
|
||||
|
||||
cookie_reader.load(cookie_header)
|
||||
session_cookie = cookie_reader.get(SESSION_COOKIE)
|
||||
|
||||
if session_cookie is None:
|
||||
raise InvalidSessionCookie()
|
||||
|
||||
self.session_id = None
|
||||
self.session_data = {}
|
||||
|
||||
if session_cookie:
|
||||
self.session_id = session_cookie.value
|
||||
with SessionTable() as client:
|
||||
self.session_data = client.get_session_data(self.session_id)
|
||||
|
||||
@property
|
||||
def id_token(self) -> Optional[str]:
|
||||
return self.session_data.get("token")
|
||||
|
||||
@property
|
||||
def email(self) -> Optional[str]:
|
||||
return self.session_data.get("email")
|
||||
|
||||
@property
|
||||
def expires(self) -> Optional[datetime]:
|
||||
return self.session_data.get("expires")
|
||||
|
||||
def destroy(self) -> None:
|
||||
with SessionTable() as client:
|
||||
client.delete_session_data(self.session_id)
|
|
@ -0,0 +1,52 @@
|
|||
import json
|
||||
import logging
|
||||
import os
|
||||
|
||||
from datetime import datetime
|
||||
from typing import Any, cast
|
||||
|
||||
from azure.data.tables import TableClient, UpdateMode, TableEntity
|
||||
from azure.core.exceptions import ResourceNotFoundError
|
||||
|
||||
|
||||
PKEY = "sessions"
|
||||
|
||||
|
||||
class SessionTable:
|
||||
def __init__(self):
|
||||
self.table_name = os.environ.get("SESSION_TABLE_NAME")
|
||||
self.conn_str = os.environ.get("SESSION_CONN_STR")
|
||||
|
||||
def get_session_data(self, session_id: str) -> TableEntity:
|
||||
return self.client.get_entity(partition_key=PKEY, row_key=session_id)
|
||||
|
||||
def set_session_data(self, session_id: str, session_data: dict, encoded_jwt: str):
|
||||
expires = datetime.fromtimestamp(cast(float, session_data.get("exp")))
|
||||
email = session_data.get("email")
|
||||
|
||||
entity = {
"PartitionKey": PKEY,
"RowKey": session_id,
# Store the raw encoded JWT; the proxy replays it as a bearer token
"token": encoded_jwt,
"expires": expires,
"email": email,
}
|
||||
|
||||
self.client.upsert_entity(mode=UpdateMode.REPLACE, entity=entity)
|
||||
|
||||
def delete_session_data(self, session_id: str) -> None:
|
||||
try:
|
||||
self.client.delete_entity(partition_key=PKEY, row_key=session_id)
|
||||
|
||||
except ResourceNotFoundError:
|
||||
logging.exception("Unable to delete session data")
|
||||
|
||||
def __enter__(self):
|
||||
self.client = TableClient.from_connection_string(
|
||||
conn_str=self.conn_str, table_name=self.table_name
|
||||
)
|
||||
return self
|
||||
|
||||
def __exit__(self, *args: Any) -> None:
|
||||
if self.client:
|
||||
self.client.close()
|
|
@ -1,4 +1,16 @@
|
|||
{
|
||||
"$schema": "http://json.schemastore.org/proxies",
|
||||
"proxies": {}
|
||||
"proxies": {
|
||||
"login_callback_proxy": {
|
||||
"matchCondition": {
|
||||
"methods": ["GET"],
|
||||
"route": "/api/auth/login/callback"
|
||||
},
|
||||
"backendUri": "http://localhost/api/auth/login/callback-proxy",
|
||||
"requestOverrides": {
|
||||
"backend.request.querystring.code": "",
|
||||
"backend.request.querystring._code": "{request.querystring.code}"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
|
@ -0,0 +1,4 @@
|
|||
# Dev dependencies
|
||||
black
|
||||
mypy
|
||||
types-requests
|
|
@ -3,4 +3,6 @@
|
|||
# Manually managing azure-functions-worker may cause unexpected issues
|
||||
|
||||
azure-functions
|
||||
requests
|
||||
azure-data-tables==12.2
|
||||
requests
|
||||
pyjwt[crypto]
|
||||
|
|
|
@ -0,0 +1,23 @@
|
|||
import os
|
||||
|
||||
from azure.data.tables import TableClient
|
||||
from azure.core.exceptions import ResourceExistsError
|
||||
|
||||
|
||||
def setup_session_table():
|
||||
table_name = os.environ.get("SESSION_TABLE_NAME")
|
||||
conn_str = os.environ.get("SESSION_CONN_STR")
|
||||
|
||||
with TableClient.from_connection_string(
|
||||
conn_str=conn_str, table_name=table_name
|
||||
) as table_client:
|
||||
|
||||
try:
|
||||
table_client.create_table()
|
||||
print(f"Created table_name: {table_name}")
|
||||
except ResourceExistsError:
|
||||
print(f"{table_name} already exists")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
setup_session_table()
|
|
@ -0,0 +1,18 @@
|
|||
#!/bin/bash
|
||||
|
||||
# Start a local http proxy that forwards :7072 to the api container on :7071
|
||||
# This is necessary because the swa cli, which is the main process in this
|
||||
# container, does not accept non-localhost connections. The api is running in
|
||||
# its own container, so this proxy is run as a background process to facilitate
|
||||
# that communication.
|
||||
./node_modules/http-proxy-cli/bin/http-proxy.js --port 7072 http://api:7071 &
|
||||
|
||||
# Start the swa cli emulator. It fronts the dev server on :3000 (started via
# the --run "yarn start" flag) and the api proxy on :7072 started above.
|
||||
./node_modules/@azure/static-web-apps-cli/dist/cli/bin.js start http://localhost:3000 --run "yarn start" --api-location "http://localhost:7072" &
|
||||
|
||||
# Wait for any process to exit
|
||||
wait -n
|
||||
|
||||
# Exit with status of process that exited first
|
||||
exit $?
|
|
@ -1,5 +1,51 @@
|
|||
version: "2.4"
|
||||
services:
|
||||
app:
|
||||
image: node:14-slim
|
||||
working_dir: /usr/src
|
||||
ports:
|
||||
- "3000:3000"
|
||||
- "4280:4280"
|
||||
environment:
|
||||
- CHOKIDAR_USEPOLLING=true
|
||||
- CHOKIDAR_INTERVAL=100
|
||||
volumes:
|
||||
- .:/usr/src
|
||||
networks:
|
||||
- pcdc-network
|
||||
command: ["./wait-for-it.sh", "-t", "45", "api:7071", "--", "./app_init.sh"]
|
||||
|
||||
api:
|
||||
build:
|
||||
context: api
|
||||
dockerfile: Dockerfile
|
||||
working_dir: /usr/src
|
||||
ports:
|
||||
- "7071:7071"
|
||||
environment:
|
||||
# Azurite uses well-known connection strings for local dev
|
||||
- SESSION_CONN_STR=DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;TableEndpoint=http://azurite:10022/devstoreaccount1
|
||||
- SESSION_TABLE_NAME=pcdcsessions
|
||||
volumes:
|
||||
- ./api:/usr/src
|
||||
networks:
|
||||
pcdc-network:
|
||||
depends_on:
|
||||
- azurite
|
||||
command: func host start --script-root ./ --cors "*" --port 7071
|
||||
|
||||
azurite:
|
||||
container_name: pcdc-azurite
|
||||
image: mcr.microsoft.com/azure-storage/azurite
|
||||
hostname: azurite
|
||||
command: "azurite --silent --tableHost 0.0.0.0 --tablePort 10022 -l /workspace"
|
||||
ports:
|
||||
- "10022:10022"
|
||||
volumes:
|
||||
- "pcdc-azurite-data:/workspace"
|
||||
networks:
|
||||
pcdc-network:
|
||||
|
||||
etl:
|
||||
build:
|
||||
context: etl
|
||||
|
@ -17,3 +63,9 @@ services:
|
|||
volumes:
|
||||
- ./mockstac/app:/app
|
||||
- ./mockstac/data:/data
|
||||
|
||||
volumes:
|
||||
pcdc-azurite-data: null
|
||||
|
||||
networks:
|
||||
pcdc-network:
|
||||
|
|
|
@ -91,9 +91,11 @@
|
|||
]
|
||||
},
|
||||
"devDependencies": {
|
||||
"@azure/static-web-apps-cli": "^0.8.2",
|
||||
"copy-webpack-plugin": "^6.4.1",
|
||||
"cypress": "^8.5.0",
|
||||
"html-loader": "^1.3.2",
|
||||
"http-proxy-cli": "2.1.0",
|
||||
"js-yaml": "^4.1.0",
|
||||
"json-loader": "^0.5.7",
|
||||
"markdown-loader": "^6.0.0",
|
||||
|
|
|
@ -0,0 +1,25 @@
|
|||
#!/bin/bash
|
||||
|
||||
set -e
|
||||
|
||||
function usage() {
|
||||
echo -n "Usage: $(basename "$0")
|
||||
Run code format checks on the api Python code
|
||||
"
|
||||
}
|
||||
|
||||
function run() {
|
||||
|
||||
docker-compose run --rm --no-deps api black --check .
|
||||
# docker-compose run --rm --no-deps api mypy .
|
||||
}
|
||||
|
||||
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
|
||||
case "${1}" in
|
||||
--help)
|
||||
usage
|
||||
;;
|
||||
*)
|
||||
run ;;
|
||||
esac
|
||||
fi
|
|
@ -5,13 +5,16 @@ set -e
|
|||
function usage() {
|
||||
echo -n "Usage: $(basename "$0")
|
||||
Run webpack dev server with hot reloading for development
|
||||
--local: run a frontend-only dev server on host
|
||||
"
|
||||
}
|
||||
|
||||
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
|
||||
if [[ "${1:-}" == "--help" ]]; then
|
||||
usage
|
||||
else
|
||||
elif [[ "${1:-}" == "--local" ]]; then
|
||||
yarn start
|
||||
else
|
||||
docker-compose up app api azurite
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
|
|
|
@ -14,13 +14,19 @@ Options:
|
|||
function run() {
|
||||
|
||||
# Install JS dependencies on host
|
||||
yarn
|
||||
docker-compose \
|
||||
-f docker-compose.yml \
|
||||
run --rm --no-deps app \
|
||||
yarn
|
||||
|
||||
# Ensure container images are current
|
||||
docker-compose build
|
||||
|
||||
# Run etl to build documentation and external notebook/md files
|
||||
docker-compose run --rm --no-deps etl "${params[@]}"
|
||||
|
||||
# Ensure azurite tables are up to date
|
||||
docker-compose run --rm api python scripts/setup_tables.py
|
||||
}
|
||||
|
||||
params=()
|
||||
|
@ -36,4 +42,4 @@ if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
|
|||
*)
|
||||
run ;;
|
||||
esac
|
||||
fi
|
||||
fi
|
||||
|
|
|
@ -1,16 +1,22 @@
|
|||
import React from "react";
|
||||
import { getFlags } from "../utils/featureFlags";
|
||||
|
||||
const Feature = ({ name, fallback, children }) => {
|
||||
interface FeatureProps {
|
||||
name: string;
|
||||
fallback?: React.ReactNode;
|
||||
}
|
||||
|
||||
const Feature: React.FC<FeatureProps> = ({ name, children, fallback = null }) => {
|
||||
const flags = getFlags();
|
||||
const feature = flags.find(feature => feature.name === name);
|
||||
|
||||
if (feature) {
|
||||
if (feature.active) {
|
||||
return children;
|
||||
return <>{children}</>;
|
||||
}
|
||||
|
||||
if (fallback) {
|
||||
return fallback;
|
||||
return <>{fallback}</>;
|
||||
}
|
||||
}
|
||||
|
|
@ -1,5 +1,6 @@
|
|||
import { Stack, Text, useTheme } from "@fluentui/react";
|
||||
import HeaderLink from "./components/HeaderLink";
|
||||
import { useSession } from "components/auth/hooks/SessionContext";
|
||||
|
||||
import siteConfig from "config/site.yml";
|
||||
import {
|
||||
|
@ -13,9 +14,13 @@ import {
|
|||
rightAligned,
|
||||
} from "./styles";
|
||||
import { gridContentStyle, offGridContentStyle } from "styles";
|
||||
import Feature from "components/Feature";
|
||||
import Login from "components/auth/Login";
|
||||
import Logout from "components/auth/Logout";
|
||||
|
||||
const Header = ({ onGrid = true }) => {
|
||||
const theme = useTheme();
|
||||
const { status } = useSession();
|
||||
const navClass = onGrid ? gridContentStyle : offGridContentStyle;
|
||||
|
||||
return (
|
||||
|
@ -66,9 +71,17 @@ const Header = ({ onGrid = true }) => {
|
|||
Documentation
|
||||
</HeaderLink>
|
||||
<div className={rightAligned}>
|
||||
<HeaderLink asButton to="/account/request">
|
||||
Request access
|
||||
</HeaderLink>
|
||||
<Stack horizontal verticalAlign="center" tokens={{ childrenGap: 4 }}>
|
||||
{!status.isLoggedIn && (
|
||||
<HeaderLink asButton to="/account/request">
|
||||
Request access
|
||||
</HeaderLink>
|
||||
)}
|
||||
<Feature name="login">
|
||||
<Login />
|
||||
<Logout />
|
||||
</Feature>
|
||||
</Stack>
|
||||
</div>
|
||||
</Stack>
|
||||
</nav>
|
||||
|
|
|
@ -0,0 +1,16 @@
|
|||
import { Link, Text } from "@fluentui/react";
|
||||
import { useSession } from "./hooks/SessionContext";
|
||||
|
||||
const Login: React.FC = () => {
|
||||
const { status } = useSession();
|
||||
const login = (
|
||||
<>
|
||||
<Text>or, </Text>
|
||||
<Link href="/api/auth/login">Sign in</Link>
|
||||
</>
|
||||
);
|
||||
|
||||
return !status.isLoggedIn ? login : null;
|
||||
};
|
||||
|
||||
export default Login;
|
|
@ -0,0 +1,27 @@
|
|||
import axios from "axios";
|
||||
import { DefaultButton } from "@fluentui/react";
|
||||
import { useMutation } from "react-query";
|
||||
|
||||
import { useSession } from "./hooks/SessionContext";
|
||||
import { useEffect } from "react";
|
||||
|
||||
const Logout: React.FC = () => {
|
||||
const session = useSession();
|
||||
const mutation = useMutation(() => axios.post("/api/auth/logout"));
|
||||
|
||||
const handleOnClick = () => {
|
||||
mutation.mutate();
|
||||
};
|
||||
|
||||
useEffect(() => {
|
||||
if (mutation.isSuccess) {
|
||||
session.logout();
|
||||
}
|
||||
}, [mutation.isSuccess, session]);
|
||||
|
||||
const logout = <DefaultButton onClick={handleOnClick}>Sign out</DefaultButton>;
|
||||
|
||||
return session.status.isLoggedIn ? logout : null;
|
||||
};
|
||||
|
||||
export default Logout;
|
|
@ -0,0 +1,82 @@
|
|||
import React, { createContext, useState, useEffect } from "react";
|
||||
import { useAuthRefresh, useAuthStatus } from "./useAuthRefresh";
|
||||
|
||||
type Session = {
|
||||
isLoggedIn: boolean;
|
||||
email: string;
|
||||
};
|
||||
|
||||
type SessionContextType = {
|
||||
status: Session;
|
||||
logout: () => void;
|
||||
};
|
||||
|
||||
const initialSession: Session = {
|
||||
isLoggedIn: false,
|
||||
email: "",
|
||||
};
|
||||
|
||||
const initialContext = {
|
||||
status: initialSession,
|
||||
logout: () => {},
|
||||
};
|
||||
|
||||
const DEFAULT_REFRESH_MS = 1 * 1000 * 60;
|
||||
const SessionContext = createContext<SessionContextType>(initialContext);
|
||||
|
||||
export const SessionProvider: React.FC = ({ children }) => {
|
||||
const [session, setSession] = useState<Session>(initialSession);
|
||||
const [refreshInterval, setRefreshInterval] = useState(0);
|
||||
|
||||
const { data: statusData, isLoading: isStatusLoading } = useAuthStatus();
|
||||
const {
|
||||
data: refreshData,
|
||||
isLoading: isRefreshLoading,
|
||||
isError: isRefreshError,
|
||||
} = useAuthRefresh(refreshInterval);
|
||||
|
||||
// Turn off the refresh interval when the request fails (will likely be a 401)
|
||||
if (isRefreshError && refreshInterval !== 0) {
|
||||
setSession(initialSession);
|
||||
setRefreshInterval(0);
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
if (isRefreshLoading) return;
|
||||
|
||||
setSession(refreshData || initialSession);
|
||||
|
||||
// Not logged in, cease polling/refreshing
|
||||
if (!refreshData?.isLoggedIn) {
|
||||
setRefreshInterval(0);
|
||||
}
|
||||
}, [isRefreshLoading, refreshData]);
|
||||
|
||||
useEffect(() => {
|
||||
if (isStatusLoading) return;
|
||||
setSession(statusData || initialSession);
|
||||
|
||||
// Logged in but not polling for refresh, start polling for refresh
|
||||
if (statusData?.isLoggedIn && refreshInterval === 0) {
|
||||
setRefreshInterval(DEFAULT_REFRESH_MS);
|
||||
}
|
||||
}, [isStatusLoading, refreshInterval, statusData]);
|
||||
|
||||
const context = {
|
||||
status: session,
|
||||
logout: () => {
|
||||
setSession(initialSession);
|
||||
},
|
||||
};
|
||||
return (
|
||||
<SessionContext.Provider value={context}>{children}</SessionContext.Provider>
|
||||
);
|
||||
};
|
||||
|
||||
export const useSession = () => {
|
||||
const context = React.useContext(SessionContext);
|
||||
if (context === undefined) {
|
||||
throw new Error("useSession must be used within a SessionProvider");
|
||||
}
|
||||
return context;
|
||||
};
|
|
@ -0,0 +1,32 @@
|
|||
import axios from "axios";
|
||||
import { useQuery } from "react-query";
|
||||
|
||||
export const useAuthRefresh = (refetchInterval: number) => {
|
||||
return useQuery(
|
||||
"api/auth/refresh",
|
||||
async () => {
|
||||
const res = await axios.post("/api/auth/refresh");
|
||||
return res.data;
|
||||
},
|
||||
{
|
||||
enabled: refetchInterval > 0,
|
||||
refetchInterval: refetchInterval,
|
||||
refetchOnWindowFocus: false,
|
||||
refetchOnMount: false,
|
||||
retry: false,
|
||||
}
|
||||
);
|
||||
};
|
||||
|
||||
export const useAuthStatus = () => {
|
||||
return useQuery(
|
||||
"user",
|
||||
async () => {
|
||||
const res = await axios.get("/api/auth/me");
|
||||
return res.data;
|
||||
},
|
||||
{
|
||||
retry: false,
|
||||
}
|
||||
);
|
||||
};
|
|
@ -9,6 +9,7 @@ function CollectionProvider({ collection, children }) {
|
|||
</CollectionContext.Provider>
|
||||
);
|
||||
}
|
||||
|
||||
function useStac() {
|
||||
const context = React.useContext(CollectionContext);
|
||||
if (context === undefined) {
|
||||
|
|
|
@ -10,13 +10,16 @@ import "./styles/index.css";
|
|||
import App from "./App";
|
||||
|
||||
import { QueryClient, QueryClientProvider } from "react-query";
|
||||
import { SessionProvider } from "components/auth/hooks/SessionContext";
|
||||
|
||||
const queryClient = new QueryClient();
|
||||
|
||||
ReactDOM.render(
|
||||
<ThemeProvider>
|
||||
<QueryClientProvider client={queryClient}>
|
||||
<App />
|
||||
<SessionProvider>
|
||||
<App />
|
||||
</SessionProvider>
|
||||
</QueryClientProvider>
|
||||
</ThemeProvider>,
|
||||
document.getElementById("root")
|
||||
|
|
|
@ -9,7 +9,7 @@ import {
|
|||
import { useCallback } from "react";
|
||||
import { useBoolean } from "react-use";
|
||||
import { IStacItem } from "types/stac";
|
||||
import { makeItemPreviewUrl } from "utils";
|
||||
import { useItemPreviewUrl } from "utils";
|
||||
import { useExploreSelector } from "../state/hooks";
|
||||
import { selectCurrentMosaic } from "../state/mosaicSlice";
|
||||
|
||||
|
@ -31,9 +31,9 @@ const ItemPreview = ({ item, size = 100, border = "side" }: ItemPreviewProps) =>
|
|||
[setLoading]
|
||||
);
|
||||
|
||||
if (!renderOption) return null;
|
||||
const previewUrl = useItemPreviewUrl(item, renderOption, size);
|
||||
|
||||
const previewUrl = makeItemPreviewUrl(item, renderOption, size);
|
||||
if (!renderOption) return null;
|
||||
|
||||
return (
|
||||
<>
|
||||
|
|
|
@ -6,6 +6,7 @@ import {
|
|||
IStackTokens,
|
||||
Stack,
|
||||
} from "@fluentui/react";
|
||||
import Feature from "components/Feature";
|
||||
import { useExploreDispatch } from "pages/Explore/state/hooks";
|
||||
import {
|
||||
pinCurrentMosaic,
|
||||
|
@ -52,21 +53,22 @@ const LegendCmdBar = ({
|
|||
onClick={handleShowOptionsClick}
|
||||
styles={buttonStyles}
|
||||
/>
|
||||
<IconButton
|
||||
aria-label={pin.title}
|
||||
title={pin.title}
|
||||
iconProps={{ iconName: pin.icon }}
|
||||
onClick={handlePin}
|
||||
styles={buttonStyles}
|
||||
/>
|
||||
|
||||
<IconButton
|
||||
aria-label={expand.title}
|
||||
title={expand.title}
|
||||
iconProps={{ iconName: expand.icon }}
|
||||
onClick={handleExpandClick}
|
||||
styles={buttonStyles}
|
||||
/>
|
||||
<Feature name="pin">
|
||||
<IconButton
|
||||
aria-label={pin.title}
|
||||
title={pin.title}
|
||||
iconProps={{ iconName: pin.icon }}
|
||||
onClick={handlePin}
|
||||
styles={buttonStyles}
|
||||
/>
|
||||
<IconButton
|
||||
aria-label={expand.title}
|
||||
title={expand.title}
|
||||
iconProps={{ iconName: expand.icon }}
|
||||
onClick={handleExpandClick}
|
||||
styles={buttonStyles}
|
||||
/>
|
||||
</Feature>
|
||||
</Stack>
|
||||
);
|
||||
};
|
||||
|
|
|
@ -11,8 +11,6 @@ const useMosaicLayer = (
|
|||
mapReady: boolean
|
||||
) => {
|
||||
const { detail } = useExploreSelector(s => s);
|
||||
// const currentLayer = useExploreSelector(selectCurrentMosaic);
|
||||
// const { collection, query, renderOption } = currentLayer;
|
||||
const mosaics = useExploreSelector(s => s.mosaic.layers);
|
||||
|
||||
// If we are showing the detail as a tile layer, craft the tileJSON request
|
||||
|
|
|
@ -14,6 +14,7 @@ import { loadingStyle } from "./SearchResultsPane";
|
|||
import QueryInfo from "./QueryInfo";
|
||||
import { selectCurrentMosaic } from "pages/Explore/state/mosaicSlice";
|
||||
import PinLayer from "../PinLayer";
|
||||
import Feature from "components/Feature";
|
||||
|
||||
interface SearchResultsHeaderProps {
|
||||
results: IStacSearchResult | undefined;
|
||||
|
@ -64,7 +65,9 @@ const SearchResultsHeader = ({ results, isLoading }: SearchResultsHeaderProps) =
|
|||
>
|
||||
{resultsText}
|
||||
<Stack horizontal verticalAlign="center">
|
||||
<PinLayer />
|
||||
<Feature name="pin">
|
||||
<PinLayer />
|
||||
</Feature>
|
||||
<QueryInfo />
|
||||
</Stack>
|
||||
</Stack>
|
||||
|
|
|
@ -1,19 +1,20 @@
|
|||
import axios from "axios";
|
||||
import { useSession } from "components/auth/hooks/SessionContext";
|
||||
import { QueryFunctionContext, useQuery, UseQueryResult } from "react-query";
|
||||
import { IStacFilter, IStacSearchResult } from "types/stac";
|
||||
import { STAC_URL } from "utils/constants";
|
||||
import { getStacUrl } from "utils/constants";
|
||||
|
||||
const getStacItems = async (
|
||||
queryParam: QueryFunctionContext<[string, IStacFilter | undefined]>
|
||||
queryParam: QueryFunctionContext<[string, IStacFilter | undefined, string]>
|
||||
): Promise<IStacSearchResult> => {
|
||||
// eslint-disable-next-line
|
||||
const [_, search] = queryParam.queryKey;
|
||||
const [_, search, stacUrl] = queryParam.queryKey;
|
||||
|
||||
if (typeof search === "undefined") {
|
||||
return Promise.reject();
|
||||
}
|
||||
|
||||
const resp = await axios.post(`${STAC_URL}/search`, search);
|
||||
const resp = await axios.post(`${stacUrl}/search`, search);
|
||||
|
||||
return resp.data;
|
||||
};
|
||||
|
@ -21,7 +22,9 @@ const getStacItems = async (
|
|||
export const useStacSearch = (
|
||||
search: IStacFilter | undefined
|
||||
): UseQueryResult<IStacSearchResult, Error> => {
|
||||
return useQuery(["items", search], getStacItems, {
|
||||
const { status } = useSession();
|
||||
const stacUrl = getStacUrl(status.isLoggedIn) as string;
|
||||
return useQuery(["items", search, stacUrl], getStacItems, {
|
||||
keepPreviousData: true, // intended to not clear out search results when panning the map
|
||||
refetchOnWindowFocus: false,
|
||||
refetchOnMount: false,
|
||||
|
|
|
@ -15,4 +15,15 @@ export const DATA_URL = apiRoot.endsWith("stac")
|
|||
? tilerRoot.replace("stac", "data")
|
||||
: `${tilerRoot}/api/data/v1`;
|
||||
|
||||
export const AUTH_STAC_URL = "/api/px/stac/v1";
|
||||
export const AUTH_DATA_URL = "/api/px/data/v1";
|
||||
|
||||
export const HUB_URL = process.env.REACT_APP_HUB_URL || "";
|
||||
|
||||
export const getStacUrl = isLoggedIn => {
|
||||
return isLoggedIn ? AUTH_STAC_URL : STAC_URL;
|
||||
};
|
||||
|
||||
export const getDataUrl = isLoggedIn => {
|
||||
return isLoggedIn ? AUTH_DATA_URL : DATA_URL;
|
||||
};
|
||||
|
|
|
@ -1,20 +0,0 @@
|
|||
export const initializeFeatures = () => {
|
||||
const flags = getFlags();
|
||||
// {
|
||||
// name:,
|
||||
// description:,
|
||||
// active:
|
||||
// }
|
||||
const initialFlags = [];
|
||||
|
||||
// Merge old flags with new flags, keeping the existing `active` setting
|
||||
const updatedFlags = initialFlags.map(f => {
|
||||
return { ...f, ...flags.find(fl => fl.name === f.name) };
|
||||
});
|
||||
|
||||
localStorage.setItem("flags", JSON.stringify(updatedFlags));
|
||||
};
|
||||
|
||||
export const getFlags = () => {
|
||||
return JSON.parse(localStorage.getItem("flags")) || [];
|
||||
};
|
|
@ -0,0 +1,28 @@
|
|||
type FeatureFlag = {
|
||||
name: string;
|
||||
description: string;
|
||||
active: boolean;
|
||||
};
|
||||
|
||||
export const initializeFeatures = () => {
|
||||
const flags = getFlags();
|
||||
const initialFlags: FeatureFlag[] = [
|
||||
{ name: "login", description: "Test login", active: false },
|
||||
{ name: "pin", description: "Pin layer to map", active: false },
|
||||
];
|
||||
|
||||
// Merge old flags with new flags, keeping the existing `active` setting
|
||||
const updatedFlags = initialFlags.map(initialFlag => {
|
||||
return {
|
||||
...initialFlag,
|
||||
...flags.find(existingFlag => existingFlag.name === initialFlag.name),
|
||||
};
|
||||
});
|
||||
|
||||
localStorage.setItem("flags", JSON.stringify(updatedFlags));
|
||||
};
|
||||
|
||||
export const getFlags = (): FeatureFlag[] => {
|
||||
const storedFlags = localStorage.getItem("flags");
|
||||
return storedFlags ? JSON.parse(storedFlags) : [];
|
||||
};
|
|
@ -2,10 +2,11 @@ import dayjs, { Dayjs } from "dayjs";
|
|||
import utc from "dayjs/plugin/utc";
|
||||
|
||||
import { IStacCollection, IStacItem } from "types/stac";
|
||||
import { DATA_URL, HUB_URL } from "./constants";
|
||||
import { DATA_URL, getDataUrl, HUB_URL } from "./constants";
|
||||
import * as qs from "query-string";
|
||||
import { IMosaic, IMosaicRenderOption } from "pages/Explore/types";
|
||||
import { DEFAULT_MIN_ZOOM } from "pages/Explore/utils/constants";
|
||||
import { useSession } from "components/auth/hooks/SessionContext";
|
||||
|
||||
dayjs.extend(utc);
|
||||
export { dayjs };
|
||||
|
@ -232,14 +233,15 @@ export const makeTileJsonUrl = (
|
|||
return `${DATA_URL}/mosaic/${query.searchId}/tilejson.json?&${scaleParam}&${renderParams}${minZoom}${collectionParam}${format}`;
|
||||
};
|
||||
|
||||
export const makeItemPreviewUrl = (
|
||||
export const useItemPreviewUrl = (
|
||||
item: IStacItem,
|
||||
renderOption: IMosaicRenderOption,
|
||||
renderOption: IMosaicRenderOption | null,
|
||||
size?: number
|
||||
) => {
|
||||
const { status } = useSession();
|
||||
const maxSize = size ? `&max_size=${size}` : "";
|
||||
const url = encodeURI(`${DATA_URL}/item/preview.png`);
|
||||
const renderParams = encodeRenderOpts(removeMercatorAssets(renderOption.options));
|
||||
const url = encodeURI(`${getDataUrl(status.isLoggedIn)}/item/preview.png`);
|
||||
const renderParams = encodeRenderOpts(removeMercatorAssets(renderOption?.options));
|
||||
|
||||
const params = `?collection=${item.collection}&item=${item.id}&${renderParams}${maxSize}`;
|
||||
|
||||
|
@ -263,6 +265,6 @@ const encodeRenderOpts = (renderOpts: string | undefined) => {
|
|||
};
|
||||
|
||||
// Remove the suffix that designates the mercator assets from the render options
|
||||
const removeMercatorAssets = (renderOpts: string) => {
|
||||
const removeMercatorAssets = (renderOpts: string = "") => {
|
||||
return renderOpts.replaceAll("_wm", "");
|
||||
};
|
||||
|
|
|
@ -3,7 +3,8 @@ import { makeFilterBody } from "pages/Explore/utils/hooks/useStacFilter";
|
|||
import { collectionFilter } from "pages/Explore/utils/stac";
|
||||
import { useQuery } from "react-query";
|
||||
import { makeTileJsonUrl } from "utils";
|
||||
import { DATA_URL, STAC_URL } from "./constants";
|
||||
import { getDataUrl, getStacUrl } from "./constants";
|
||||
import { useSession } from "components/auth/hooks/SessionContext";
|
||||
|
||||
// Query content can be prefetched if it's likely to be used
|
||||
export const usePrefetchContent = () => {
|
||||
|
@ -11,7 +12,9 @@ export const usePrefetchContent = () => {
|
|||
};
|
||||
|
||||
export const useCollections = () => {
|
||||
return useQuery(["stac", STAC_URL], getCollections, {
|
||||
const { status } = useSession();
|
||||
const stacUrl = getStacUrl(status.isLoggedIn);
|
||||
return useQuery(["stac", stacUrl], getCollections, {
|
||||
refetchOnWindowFocus: false,
|
||||
refetchOnMount: false,
|
||||
});
|
||||
|
@ -46,8 +49,9 @@ export const registerStacFilter = async (collectionId, queryInfo, cql) => {
|
|||
registerCancelToken && registerCancelToken();
|
||||
|
||||
// Make a new request
|
||||
const dataUrl = getDataUrl();
|
||||
const body = makeFilterBody([collectionFilter(collectionId)], queryInfo, cql);
|
||||
const r = await axios.post(`${DATA_URL}/mosaic/register`, body, {
|
||||
const r = await axios.post(`${dataUrl}/mosaic/register`, body, {
|
||||
cancelToken: new axios.CancelToken(c => (registerCancelToken = c)),
|
||||
});
|
||||
return r.data.searchid;
|
||||
|
|
|
@ -0,0 +1,182 @@
|
|||
#!/usr/bin/env bash
|
||||
# Use this script to test if a given TCP host/port are available
|
||||
|
||||
WAITFORIT_cmdname=${0##*/}
|
||||
|
||||
echoerr() { if [[ $WAITFORIT_QUIET -ne 1 ]]; then echo "$@" 1>&2; fi }
|
||||
|
||||
usage()
|
||||
{
|
||||
cat << USAGE >&2
|
||||
Usage:
|
||||
$WAITFORIT_cmdname host:port [-s] [-t timeout] [-- command args]
|
||||
-h HOST | --host=HOST Host or IP under test
|
||||
-p PORT | --port=PORT TCP port under test
|
||||
Alternatively, you specify the host and port as host:port
|
||||
-s | --strict Only execute subcommand if the test succeeds
|
||||
-q | --quiet Don't output any status messages
|
||||
-t TIMEOUT | --timeout=TIMEOUT
|
||||
Timeout in seconds, zero for no timeout
|
||||
-- COMMAND ARGS Execute command with args after the test finishes
|
||||
USAGE
|
||||
exit 1
|
||||
}
|
||||
|
||||
wait_for()
|
||||
{
|
||||
if [[ $WAITFORIT_TIMEOUT -gt 0 ]]; then
|
||||
echoerr "$WAITFORIT_cmdname: waiting $WAITFORIT_TIMEOUT seconds for $WAITFORIT_HOST:$WAITFORIT_PORT"
|
||||
else
|
||||
echoerr "$WAITFORIT_cmdname: waiting for $WAITFORIT_HOST:$WAITFORIT_PORT without a timeout"
|
||||
fi
|
||||
WAITFORIT_start_ts=$(date +%s)
|
||||
while :
|
||||
do
|
||||
if [[ $WAITFORIT_ISBUSY -eq 1 ]]; then
|
||||
nc -z $WAITFORIT_HOST $WAITFORIT_PORT
|
||||
WAITFORIT_result=$?
|
||||
else
|
||||
(echo -n > /dev/tcp/$WAITFORIT_HOST/$WAITFORIT_PORT) >/dev/null 2>&1
|
||||
WAITFORIT_result=$?
|
||||
fi
|
||||
if [[ $WAITFORIT_result -eq 0 ]]; then
|
||||
WAITFORIT_end_ts=$(date +%s)
|
||||
echoerr "$WAITFORIT_cmdname: $WAITFORIT_HOST:$WAITFORIT_PORT is available after $((WAITFORIT_end_ts - WAITFORIT_start_ts)) seconds"
|
||||
break
|
||||
fi
|
||||
sleep 1
|
||||
done
|
||||
return $WAITFORIT_result
|
||||
}
|
||||
|
||||
wait_for_wrapper()
|
||||
{
|
||||
# In order to support SIGINT during timeout: http://unix.stackexchange.com/a/57692
|
||||
if [[ $WAITFORIT_QUIET -eq 1 ]]; then
|
||||
timeout $WAITFORIT_BUSYTIMEFLAG $WAITFORIT_TIMEOUT $0 --quiet --child --host=$WAITFORIT_HOST --port=$WAITFORIT_PORT --timeout=$WAITFORIT_TIMEOUT &
|
||||
else
|
||||
timeout $WAITFORIT_BUSYTIMEFLAG $WAITFORIT_TIMEOUT $0 --child --host=$WAITFORIT_HOST --port=$WAITFORIT_PORT --timeout=$WAITFORIT_TIMEOUT &
|
||||
fi
|
||||
WAITFORIT_PID=$!
|
||||
trap "kill -INT -$WAITFORIT_PID" INT
|
||||
wait $WAITFORIT_PID
|
||||
WAITFORIT_RESULT=$?
|
||||
if [[ $WAITFORIT_RESULT -ne 0 ]]; then
|
||||
echoerr "$WAITFORIT_cmdname: timeout occurred after waiting $WAITFORIT_TIMEOUT seconds for $WAITFORIT_HOST:$WAITFORIT_PORT"
|
||||
fi
|
||||
return $WAITFORIT_RESULT
|
||||
}
|
||||
|
||||
# process arguments
|
||||
while [[ $# -gt 0 ]]
|
||||
do
|
||||
case "$1" in
|
||||
*:* )
|
||||
WAITFORIT_hostport=(${1//:/ })
|
||||
WAITFORIT_HOST=${WAITFORIT_hostport[0]}
|
||||
WAITFORIT_PORT=${WAITFORIT_hostport[1]}
|
||||
shift 1
|
||||
;;
|
||||
--child)
|
||||
WAITFORIT_CHILD=1
|
||||
shift 1
|
||||
;;
|
||||
-q | --quiet)
|
||||
WAITFORIT_QUIET=1
|
||||
shift 1
|
||||
;;
|
||||
-s | --strict)
|
||||
WAITFORIT_STRICT=1
|
||||
shift 1
|
||||
;;
|
||||
-h)
|
||||
WAITFORIT_HOST="$2"
|
||||
if [[ $WAITFORIT_HOST == "" ]]; then break; fi
|
||||
shift 2
|
||||
;;
|
||||
--host=*)
|
||||
WAITFORIT_HOST="${1#*=}"
|
||||
shift 1
|
||||
;;
|
||||
-p)
|
||||
WAITFORIT_PORT="$2"
|
||||
if [[ $WAITFORIT_PORT == "" ]]; then break; fi
|
||||
shift 2
|
||||
;;
|
||||
--port=*)
|
||||
WAITFORIT_PORT="${1#*=}"
|
||||
shift 1
|
||||
;;
|
||||
-t)
|
||||
WAITFORIT_TIMEOUT="$2"
|
||||
if [[ $WAITFORIT_TIMEOUT == "" ]]; then break; fi
|
||||
shift 2
|
||||
;;
|
||||
--timeout=*)
|
||||
WAITFORIT_TIMEOUT="${1#*=}"
|
||||
shift 1
|
||||
;;
|
||||
--)
|
||||
shift
|
||||
WAITFORIT_CLI=("$@")
|
||||
break
|
||||
;;
|
||||
--help)
|
||||
usage
|
||||
;;
|
||||
*)
|
||||
echoerr "Unknown argument: $1"
|
||||
usage
|
||||
;;
|
||||
esac
|
||||
done
|
||||
|
||||
if [[ "$WAITFORIT_HOST" == "" || "$WAITFORIT_PORT" == "" ]]; then
|
||||
echoerr "Error: you need to provide a host and port to test."
|
||||
usage
|
||||
fi
|
||||
|
||||
WAITFORIT_TIMEOUT=${WAITFORIT_TIMEOUT:-15}
|
||||
WAITFORIT_STRICT=${WAITFORIT_STRICT:-0}
|
||||
WAITFORIT_CHILD=${WAITFORIT_CHILD:-0}
|
||||
WAITFORIT_QUIET=${WAITFORIT_QUIET:-0}
|
||||
|
||||
# Check to see if timeout is from busybox?
|
||||
WAITFORIT_TIMEOUT_PATH=$(type -p timeout)
|
||||
WAITFORIT_TIMEOUT_PATH=$(realpath $WAITFORIT_TIMEOUT_PATH 2>/dev/null || readlink -f $WAITFORIT_TIMEOUT_PATH)
|
||||
|
||||
WAITFORIT_BUSYTIMEFLAG=""
|
||||
if [[ $WAITFORIT_TIMEOUT_PATH =~ "busybox" ]]; then
|
||||
WAITFORIT_ISBUSY=1
|
||||
# Check if busybox timeout uses -t flag
|
||||
# (recent Alpine versions don't support -t anymore)
|
||||
if timeout &>/dev/stdout | grep -q -e '-t '; then
|
||||
WAITFORIT_BUSYTIMEFLAG="-t"
|
||||
fi
|
||||
else
|
||||
WAITFORIT_ISBUSY=0
|
||||
fi
|
||||
|
||||
if [[ $WAITFORIT_CHILD -gt 0 ]]; then
|
||||
wait_for
|
||||
WAITFORIT_RESULT=$?
|
||||
exit $WAITFORIT_RESULT
|
||||
else
|
||||
if [[ $WAITFORIT_TIMEOUT -gt 0 ]]; then
|
||||
wait_for_wrapper
|
||||
WAITFORIT_RESULT=$?
|
||||
else
|
||||
wait_for
|
||||
WAITFORIT_RESULT=$?
|
||||
fi
|
||||
fi
|
||||
|
||||
if [[ $WAITFORIT_CLI != "" ]]; then
|
||||
if [[ $WAITFORIT_RESULT -ne 0 && $WAITFORIT_STRICT -eq 1 ]]; then
|
||||
echoerr "$WAITFORIT_cmdname: strict mode, refusing to execute subprocess"
|
||||
exit $WAITFORIT_RESULT
|
||||
fi
|
||||
exec "${WAITFORIT_CLI[@]}"
|
||||
else
|
||||
exit $WAITFORIT_RESULT
|
||||
fi
|
yarn.lock (676 lines changed)
File diff suppressed because it is too large