Mirror of https://github.com/microsoft/CCF.git
Remove `proposal_generator.py` (#3400)
This commit is contained in:
Parent
204a05fa6c
Commit
4b67651af3
@ -1 +1 @@
Daily please
Daily please!
@ -10,6 +10,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
### Remove
- The `ccf` Python package no longer provides utilities to issue requests to a running CCF service. This is because CCF supports widely-used client-server protocols (TLS, HTTP) that should already be provided by libraries for all programming languages. The `ccf` Python package can still be used to audit the ledger and snapshot files (#3386).
- The `proposal_generator` has been removed from the `ccf` Python package. The majority of proposals can be trivially constructed in existing client tooling, without needing to invoke Python. This also introduces parity between the default constitution and custom constitution actions - all should be constructed and called from the same governance client code. Some jinja templates are included in `samples/templates` for constructing careful ballots from existing proposals.

## [2.0.0-dev8]
@ -592,7 +592,9 @@ if(BUILD_TESTS)
PYTHON_SCRIPT ${CMAKE_SOURCE_DIR}/tests/governance.py
CONSENSUS cft
CONSTITUTION ${CONSTITUTION_ARGS}
ADDITIONAL_ARGS --oe-binary ${OE_BINDIR} --initial-operator-count 1
ADDITIONAL_ARGS
--oe-binary ${OE_BINDIR} --initial-operator-count 1
--jinja-templates-path ${CMAKE_SOURCE_DIR}/samples/templates
)

add_e2e_test(
@ -131,14 +131,8 @@ foreach(UTILITY ${CCF_UTILITIES})
endforeach()

# Copy utilities from tests directory
set(CCF_TEST_UTILITIES
tests.sh
cimetrics_env.sh
upload_pico_metrics.py
test_install.sh
test_python_cli.sh
docker_wrap.sh
config.jinja
set(CCF_TEST_UTILITIES tests.sh cimetrics_env.sh upload_pico_metrics.py
test_install.sh docker_wrap.sh config.jinja
)
foreach(UTILITY ${CCF_TEST_UTILITIES})
configure_file(
@ -14,28 +14,53 @@ Setting up a token issuer with manual key refresh
|
|||
|
||||
Before adding public token signing keys to a running CCF network, the IdP has to be stored as a token issuer with a ``set_jwt_issuer`` proposal:
|
||||
|
||||
.. code-block:: bash
|
||||
.. code-block:: json
|
||||
|
||||
$ cat issuer.json
|
||||
{
|
||||
"issuer": "my-issuer",
|
||||
"auto_refresh": false
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_jwt_issuer",
|
||||
"args": {
|
||||
"issuer": "my_issuer",
|
||||
"key_filter": "all",
|
||||
"auto_refresh": false
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
$ python -m ccf.proposal_generator set_jwt_issuer issuer.json
|
||||
|
||||
The ``issuer`` field is an arbitrary identifier and should be used during token validation to differentiate between multiple issuers.
|
||||
|
||||
Note that ``issuer.json`` has some additional optional fields for more advanced scenarios.
|
||||
Note that this action takes some additional optional args for more advanced scenarios.
|
||||
See :ref:`build_apps/auth/jwt:Advanced issuer configuration` for details.
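Such proposals are plain JSON and can also be assembled programmatically with any JSON library rather than written by hand; a minimal Python sketch (the argument values and output file name are illustrative):

.. code-block:: python

# Build the set_jwt_issuer proposal shown above and write it to a file,
# which can then be submitted by any governance client.
import json

proposal = {
    "actions": [
        {
            "name": "set_jwt_issuer",
            "args": {
                "issuer": "my_issuer",
                "key_filter": "all",
                "auto_refresh": False,
            },
        }
    ]
}

with open("set_jwt_issuer_proposal.json", "w", encoding="utf-8") as f:
    json.dump(proposal, f, indent=2)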
|
||||
|
||||
After this proposal is accepted, signing keys for an issuer can be updated with a ``set_jwt_public_signing_keys`` proposal:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
$ ISSUER="my-issuer"
|
||||
$ python -m ccf.proposal_generator set_jwt_public_signing_keys $ISSUER jwks.json
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_jwt_public_signing_keys",
|
||||
"args": {
|
||||
"issuer": "my_issuer",
|
||||
"jwks": {
|
||||
"keys": [
|
||||
{
|
||||
"kty": "RSA",
|
||||
"kid": "my_kid",
|
||||
"x5c": [
|
||||
"MIICrDCCAZSgAwIBAgIUcj2cyqhj1U8XzZ0gvV1sF4e4vtowDQYJKoZIhvcNAQELBQAwEDEOMAwGA1UEAwwFZHVtbXkwHhcNMjIwMTEyMTM0MzMzWhcNMjIwMTIyMTM0MzMzWjAQMQ4wDAYDVQQDDAVkdW1teTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAK/RrDSau3y4uI0AMKRfeC/aflZ7LfqFeaWOrj3WrDd1lWIJNJfXw4yrjyLq/NxDPF/3Rk4JBA4FPxUuQ2gwiLaZr9/OJjG2e+1sT9Sj243IC4tKvm8ilUtbgx9f9GvyoP5UhnZHa3GQ2MnRpTtzOq2u+XhjrQBfadGEVjUCpbwRaV7vTUr2WZQ/e1HbLFg0vdApP/U2Z/p5LUyRooLIu12mgMFZAd8zYWXsHGx+D4F8DeRpuPDrF5CKOeA2HvhIeYy+kKMhSuE2wBn3lHtTstZfoJoJJuDPwr5F58jlBHBdRL7BtfB2O8jK5iPNb3bZnyAgQ4xJVr/4wUcbeI7SA28CAwEAATANBgkqhkiG9w0BAQsFAAOCAQEAXlso7E6vqH0W9w3jMrtIWJiZWZi1O+JQ3TkcVX0qTHHQRSjFyY+wjjL11ERZqz2XnrfQZDQdlpB4g7EXh24lN7QE5XuxN6gfRVDPVrY8+mg5qXBW6yz70Kgt7iuy5QjTOMU3GjqYBjObcXBqp+/p6mU0cTFHSt7xQjK2XI1AdOuxeAtUsko6+pvMwZus2RARzJurSAuWh4mU2ozALmUlAD+looOsvqui4s6CNUUakfSUlDj5drnYOHqqrRXDVonzDCDJgN/ZJubwFkTmZ1Urr5OTR5/WdC9G69RU27eqyRFuWlCafy8+hwyo6sZykDJpa6FBbeXWfm7s8Sj77y0Gdg=="
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
``jwks.json`` contains the signing keys as JWKS (`JSON Web Key Set <https://tools.ietf.org/html/rfc7517>`_) document.
|
||||
The ``"jwks"`` field contains the signing keys as a JWKS (`JSON Web Key Set <https://tools.ietf.org/html/rfc7517>`_) document.
|
||||
|
||||
Setting up a token issuer with automatic key refresh
|
||||
----------------------------------------------------
|
||||
|
@ -48,23 +73,39 @@ The following extra conditions must be true compared to setting up an issuer wit
|
|||
- The ``issuer`` must be an OpenID Connect issuer URL, for example ``https://login.microsoftonline.com/common/v2.0``. During auto-refresh, the keys are fetched from that URL by appending ``/.well-known/openid-configuration``.
|
||||
- A CA certificate for the issuer URL must be stored so that the TLS connection to the IdP can be validated during key refresh.
|
||||
|
||||
The CA certificate is stored with a ``set_ca_cert`` proposal:
|
||||
The CA certificate is stored with a ``set_ca_cert_bundle`` proposal:
|
||||
|
||||
.. code-block:: bash
|
||||
.. code-block:: json
|
||||
|
||||
$ python -m ccf.proposal_generator set_ca_cert jwt_ms cacert.pem
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_ca_cert_bundle",
|
||||
"args": {
|
||||
"name": "jwt_ms",
|
||||
"cert_bundle": "-----BEGIN CERTIFICATE-----\nMIICtDCCAZygAwIBAgIUD7xmXLQWbN/q+tuH97Aq2krO0GAwDQYJKoZIhvcNAQEL\nBQAwFDESMBAGA1UEAwwJbG9jYWxob3N0MB4XDTIyMDExMjEzNDMzNloXDTIyMDEy\nMjEzNDMzNlowFDESMBAGA1UEAwwJbG9jYWxob3N0MIIBIjANBgkqhkiG9w0BAQEF\nAAOCAQ8AMIIBCgKCAQEAoWXwixcQ0CrZQAD9Ojo0kxKtrsJB0dmxwKGx/JH2VQYh\nYQ9+8zSuXKW7L0dJL3Qf9R7eJvj1w4i/gPHSggsgrp+MbYLos3DK1M3wdATpsn/r\nhVFCuVpq9nVOZQh9Uiq1fbsYBpoJZ+aSpRJrqK8VaQDr/zPVnU72zYSxgEvwll+e\nvw1+erna3nZevf02hGvD1HU2DBEIkyj50yRzfKufGbw70ySxDAxCpkM+Qsw+WD5/\ncI2D8mhMFA7NdPIbB0OWwCOqrFxtwkA2N11nqJlodzFmcdCDE/fyZc2/Fer+C4ol\nhnYBXVqEodlbytmYHIWB3+XbymDrbqPeCvr2I6nK2QIDAQABMA0GCSqGSIb3DQEB\nCwUAA4IBAQBrHD9cUy5mfkelWzJRknaK3BszUWSwOjjXYh0vFTW8ixZUjKfQDpbe\nPEL3aV3IgnBEwFnormhGCLcOatAGLCgZ//FREts8KaNgyrObKyuMLPQi5vf5/ucG\n/68mGwq2hdh0+ysVqcjjLQCTfbPJPUQ5V2hOh79jOy29JdavcBGR4SeRdOgzdcwA\nd9/T8VuoC6tjt2OF7IJ59JOSBWMcxCbr7KyyJjuxykzyjDa/XQs2Egt4WE+ZVUgc\nav1tQB2leiJGbjhswhLMe7NbuOtwcELsILpPo3pbdKEMlRFngj7H80IFurxtdu/M\nN2D/+LkySi6UDM8q6ADSdjG+cnNzSjEo\n-----END CERTIFICATE-----\n"
|
||||
}
|
||||
}
|
||||
]
|
||||
}
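The ``cert_bundle`` value is simply the contents of the PEM file, with newlines escaped as ``\n`` by the JSON serializer. A minimal Python sketch for producing this proposal from a local ``cacert.pem`` (the optional parse check mirrors the removed ``proposal_generator`` helper; names are illustrative):

.. code-block:: python

# Build a set_ca_cert_bundle proposal from a local PEM bundle.
import json

from cryptography import x509
import cryptography.hazmat.backends as crypto_backends

with open("cacert.pem", encoding="utf-8") as f:
    cert_bundle_pem = f.read()

# Optional sanity check that every certificate in the bundle parses as PEM,
# as the removed proposal_generator helper did.
delim = "-----END CERTIFICATE-----"
for cert_pem in cert_bundle_pem.split(delim):
    if not cert_pem.strip():
        continue
    x509.load_pem_x509_certificate(
        (cert_pem + delim).encode(), crypto_backends.default_backend()
    )

proposal = {
    "actions": [
        {
            "name": "set_ca_cert_bundle",
            "args": {"name": "jwt_ms", "cert_bundle": cert_bundle_pem},
        }
    ]
}

with open("set_ca_cert_bundle_proposal.json", "w", encoding="utf-8") as f:
    json.dump(proposal, f, indent=2)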
|
||||
|
||||
Now the issuer can be created with auto-refresh enabled:
|
||||
|
||||
.. code-block:: bash
|
||||
.. code-block:: json
|
||||
|
||||
$ cat issuer.json
|
||||
{
|
||||
"issuer": "https://login.microsoftonline.com/common/v2.0",
|
||||
"auto_refresh": true,
|
||||
"ca_cert_name": "jwt_ms"
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_jwt_issuer",
|
||||
"args": {
|
||||
"issuer": "https://login.microsoftonline.com/common/v2.0",
|
||||
"key_filter": "all",
|
||||
"ca_cert_bundle_name": "jwt_ms",
|
||||
"auto_refresh": true
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
$ python -m ccf.proposal_generator set_jwt_issuer issuer.json
|
||||
|
||||
.. note::
|
||||
|
||||
|
@ -77,8 +118,16 @@ If an issuer should not be used anymore, then a ``remove_jwt_issuer`` proposal c
|
|||
|
||||
.. code-block:: bash
|
||||
|
||||
$ ISSUER="https://login.microsoftonline.com/common/v2.0"
|
||||
$ python -m ccf.proposal_generator remove_jwt_issuer $ISSUER
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "remove_jwt_issuer",
|
||||
"args": {
|
||||
"issuer": "https://login.microsoftonline.com/common/v2.0"
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
Validating tokens
|
||||
-----------------
|
||||
|
@ -101,20 +150,26 @@ After the signing certificates have been stored, token validation follows the sa
|
|||
|
||||
CCF validates embedded SGX evidence if a key policy is given in the issuer metadata:
|
||||
|
||||
.. code-block:: bash
|
||||
.. code-block:: json
|
||||
|
||||
$ cat issuer.json
|
||||
{
|
||||
"issuer": "https://shareduks.uks.attest.azure.net",
|
||||
"key_filter": "sgx",
|
||||
"key_policy": {
|
||||
"sgx_claims": {
|
||||
"signer_id": "5e5410aaf99a32e32df2a97d579e65f8310f274816ec4f34cedeeb1be410a526",
|
||||
"attributes": "0300000000000000"
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_jwt_issuer",
|
||||
"args": {
|
||||
"issuer": "https://shareduks.uks.attest.azure.net",
|
||||
"key_filter": "sgx",
|
||||
"key_policy": {
|
||||
"sgx_claims": {
|
||||
"signer_id": "5e5410aaf99a32e32df2a97d579e65f8310f274816ec4f34cedeeb1be410a526",
|
||||
"attributes": "0300000000000000"
|
||||
}
|
||||
},
|
||||
"auto_refresh": false
|
||||
}
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
$ python -m ccf.proposal_generator set_jwt_issuer issuer.json
|
||||
|
||||
All claims contained in ``key_policy.sgx_claims`` must be identical to the ones embedded in the certificate.
|
||||
Any attempt to add a certificate with mismatching claims in a ``set_jwt_public_signing_keys`` proposal for that issuer would result in failure.
|
||||
|
|
|
@ -134,25 +134,196 @@ It validates the request body and returns the result of a mathematical operation
|
|||
Deployment
|
||||
----------
|
||||
|
||||
An app bundle can be wrapped into a governance proposal with the Python client for deployment:
|
||||
An app bundle must be wrapped into a JSON object for submission as a ``set_js_app`` proposal to deploy the application code onto a CCF service.
|
||||
For instance, a proposal which deploys the example app above would look like:
|
||||
|
||||
.. code-block:: bash
|
||||
.. code-block:: json
|
||||
|
||||
$ python -m ccf.proposal_generator set_js_app my-app/
|
||||
SUCCESS | Writing proposal to ./set_js_app_proposal.json
|
||||
SUCCESS | Wrote vote to ./set_js_app_vote_for.json
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_js_app",
|
||||
"args": {
|
||||
"bundle": {
|
||||
"metadata": {
|
||||
"endpoints": {
|
||||
"/compute": {
|
||||
"post": {
|
||||
"js_module": "math.js",
|
||||
"js_function": "compute",
|
||||
"forwarding_required": "never",
|
||||
"authn_policies": [
|
||||
"user_cert"
|
||||
],
|
||||
"mode": "readonly",
|
||||
"openapi": {
|
||||
"requestBody": {
|
||||
"required": true,
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"properties": {
|
||||
"op": {
|
||||
"type": "string",
|
||||
"enum": [
|
||||
"add",
|
||||
"sub",
|
||||
"mul"
|
||||
]
|
||||
},
|
||||
"left": {
|
||||
"type": "number"
|
||||
},
|
||||
"right": {
|
||||
"type": "number"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"op",
|
||||
"left",
|
||||
"right"
|
||||
],
|
||||
"type": "object",
|
||||
"additionalProperties": false
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"responses": {
|
||||
"200": {
|
||||
"description": "Compute result",
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"properties": {
|
||||
"result": {
|
||||
"type": "number"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"result"
|
||||
],
|
||||
"type": "object",
|
||||
"additionalProperties": false
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"400": {
|
||||
"description": "Client-side error",
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"properties": {
|
||||
"error": {
|
||||
"description": "Error message",
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"error"
|
||||
],
|
||||
"type": "object",
|
||||
"additionalProperties": false
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"/compute2/{op}/{left}/{right}": {
|
||||
"get": {
|
||||
"js_module": "math.js",
|
||||
"js_function": "compute2",
|
||||
"forwarding_required": "never",
|
||||
"authn_policies": [
|
||||
"user_cert"
|
||||
],
|
||||
"mode": "readonly",
|
||||
"openapi": {
|
||||
"parameters": [
|
||||
{
|
||||
"name": "op",
|
||||
"in": "path",
|
||||
"required": true,
|
||||
"schema": {
|
||||
"type": "string",
|
||||
"enum": [
|
||||
"add",
|
||||
"sub",
|
||||
"mul"
|
||||
]
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "left",
|
||||
"in": "path",
|
||||
"required": true,
|
||||
"schema": {
|
||||
"type": "number"
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "right",
|
||||
"in": "path",
|
||||
"required": true,
|
||||
"schema": {
|
||||
"type": "number"
|
||||
}
|
||||
}
|
||||
],
|
||||
"responses": {
|
||||
"default": {
|
||||
"description": "Default response"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"modules": [
|
||||
{
|
||||
"name": "math.js",
|
||||
"module": "function compute_impl(op, left, right) {\n let result;\n if (op == \"add\") result = left + right;\n else if (op == \"sub\") result = left - right;\n else if (op == \"mul\") result = left * right;\n else {\n return {\n statusCode: 400,\n body: {\n error: \"unknown op\",\n },\n };\n }\n\n return {\n body: {\n result: result,\n },\n };\n}\n\nexport function compute(request) {\n const body = request.body.json();\n\n if (typeof body.left != \"number\" || typeof body.right != \"number\") {\n return {\n statusCode: 400,\n body: {\n error: \"invalid operand type\",\n },\n };\n }\n\n return compute_impl(body.op, body.left, body.right);\n}\n\nexport function compute2(request) {\n const params = request.params;\n\n // Type of params is always string. Try to parse as float\n let left = parseFloat(params.left);\n if (isNaN(left)) {\n return {\n statusCode: 400,\n body: {\n error: \"left operand is not a parseable number\",\n },\n };\n }\n\n let right = parseFloat(params.right);\n if (isNaN(right)) {\n return {\n statusCode: 400,\n body: {\n error: \"right operand is not a parseable number\",\n },\n };\n }\n\n return compute_impl(params.op, left, right);\n}\n"
|
||||
}
|
||||
]
|
||||
},
|
||||
"disable_bytecode_cache": false
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
The key fields are:
|
||||
|
||||
- ``args.bundle.metadata``: The object contained in ``app.json``, defining the HTTP endpoints that access the app.
|
||||
- ``args.bundle.modules``: The contents of all JS files (including scripts and any modules they depend on) which define the app's functionality.
|
||||
- ``args.disable_bytecode_cache``: Whether the bytecode cache should be enabled for this app. See below for more detail.
|
||||
|
||||
Once :ref:`submitted and accepted <governance/proposals:Submitting a New Proposal>`, a ``set_js_app`` proposal atomically (re-)deploys the complete JavaScript application.
|
||||
Any existing application endpoints and JavaScript modules are removed.
|
||||
|
||||
By default, the source code is pre-compiled into bytecode and both the source code and the bytecode are stored in the Key Value store. To disable precompilation and remove any existing cached bytecode, add ``--disable-bytecode-cache`` to the above command. See :ref:`Resource Usage <operations/resource_usage:Memory>` for a discussion on latency vs. memory usage.
|
||||
If you are using ``npm`` or similar to build your app, it may make sense to convert the app into a proposal-ready JSON bundle during packaging.
|
||||
For an example of how this could be done, see `this example script <https://github.com/microsoft/CCF/tree/main/tests/npm-app/build_bundle.js>`_ from one of CCF's test applications, called by ``npm build`` from the corresponding `package.json <https://github.com/microsoft/CCF/tree/main/tests/npm-app/package.json>`_.
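The same packaging can also be done in a few lines of Python; a minimal sketch mirroring the logic of the removed ``proposal_generator`` (it reads ``app.json`` and every file under ``src/`` from a bundle directory; the ``my-app`` path is illustrative):

.. code-block:: python

# Convert an app bundle directory (app.json + src/) into a set_js_app proposal.
import glob
import json
import os

bundle_dir = "my-app"  # illustrative bundle directory

# Collect every JS module under src/, keyed by its path relative to src/.
modules = []
src_dir = os.path.join(bundle_dir, "src")
for path in glob.glob(f"{src_dir}/**/*", recursive=True):
    if os.path.isfile(path):
        name = os.path.relpath(path, src_dir).replace("\\", "/")
        with open(path, encoding="utf-8") as f:
            modules.append({"name": name, "module": f.read()})

# The endpoint metadata is the contents of app.json.
with open(os.path.join(bundle_dir, "app.json"), encoding="utf-8") as f:
    metadata = json.load(f)

proposal = {
    "actions": [
        {
            "name": "set_js_app",
            "args": {
                "bundle": {"metadata": metadata, "modules": modules},
                "disable_bytecode_cache": False,
            },
        }
    ]
}

with open("set_js_app_proposal.json", "w", encoding="utf-8") as f:
    json.dump(proposal, f)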
|
||||
|
||||
Bytecode cache
|
||||
~~~~~~~~~~~~~~
|
||||
|
||||
By default, the source code is pre-compiled into bytecode and both the source code and the bytecode are stored in the Key Value store. To disable precompilation and remove any existing cached bytecode, set ``"args.disable_bytecode_cache": true`` in the above proposal. See :ref:`Resource Usage <operations/resource_usage:Memory>` for a discussion on latency vs. memory usage.
|
||||
|
||||
If CCF is updated and introduces a newer JavaScript engine version, then any pre-compiled bytecode is not used anymore and must be re-compiled by either re-deploying the JavaScript application or issuing a proposal for re-compilation:
|
||||
|
||||
.. code-block:: bash
|
||||
.. code-block:: json
|
||||
|
||||
$ python -m ccf.proposal_generator refresh_js_app_bytecode_cache
|
||||
SUCCESS | Writing proposal to ./refresh_js_app_bytecode_cache_proposal.json
|
||||
SUCCESS | Wrote vote to ./refresh_js_app_bytecode_cache_vote_for.json
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "refresh_js_app_bytecode_cache"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
.. note:: The operator RPC :http:GET:`/js_metrics` returns the size of the bytecode and whether it is used. If it is not used, then either no bytecode is stored or it needs to be re-compiled due to a CCF update.
|
||||
|
|
|
@ -46,52 +46,200 @@ For transparency and auditability, all governance operations (including votes) a
|
|||
Creating a Proposal
|
||||
-------------------
|
||||
|
||||
For custom proposals with multiple actions and precise conditional requirements you will need to write the proposal script by hand.
|
||||
For simple proposals there is a helper script in the CCF Python package - ``proposal_generator.py``.
|
||||
This can be used to create proposals for common operations like adding members and users, without writing any JSON.
|
||||
It also produces sample vote scripts, which validate that the executed proposed actions exactly match what is expected.
|
||||
These sample proposals and votes can be used as a syntax and API reference for producing more complex custom proposals.
|
||||
A proposal's body contains a JSON object with a list of desired actions.
|
||||
The actions are identified by name, matching a function from the constitution which should be called to verify and apply this action.
|
||||
Each action may have associated arguments.
|
||||
The schema of these arguments is determined by the constitution which handles them, so they should be constructed with reference to a target constitution.
|
||||
Some examples of proposals which could be sent to the default sample constitution provided with CCF:
|
||||
|
||||
Assuming the CCF Python package has been installed in the current Python environment, the proposal generator can be invoked directly as ``ccf.proposal_generator``. With no further argument it will print help text, including the list of possible actions as subcommands:
|
||||
.. code-block:: json
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
usage: proposal_generator.py [-h] [-po PROPOSAL_OUTPUT_FILE] [-vo VOTE_OUTPUT_FILE] [-pp] [-i] [-v]
|
||||
{add_node_code,remove_ca_cert_bundle,remove_js_app,remove_jwt_issuer,remove_member,remove_node,remove_node_code,remove_user,set_ca_cert_bundle,set_constitution,set_js_app,set_jwt_issuer,set_jwt_public_signing_keys,set_member,set_member_data,set_recovery_threshold,set_user,set_user_data,transition_node_to_trusted,transition_service_to_open,trigger_ledger_rekey,trigger_recovery_shares_refresh}
|
||||
|
||||
Additional detail is available from the ``--help`` option. You can also find the script in a checkout of CCF:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
$ python CCF/python/ccf/proposal_generator.py
|
||||
|
||||
Some of these subcommands require additional arguments, such as the node ID or user certificate to add to the service. Additional options allow the generated votes and proposals to be redirected to other files or pretty-printed:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
$ python -m ccf.proposal_generator transition_node_to_trusted 6d566123a899afaea977c5fc0f7a2a9fef33f2946fbc4abefbc3e10ee597343f 211019154318Z
|
||||
SUCCESS | Writing proposal to ./trust_node_proposal.json
|
||||
SUCCESS | Wrote vote to ./trust_node_vote_for.json
|
||||
|
||||
$ cat trust_node_proposal.json
|
||||
{"actions": [{"name": "transition_node_to_trusted", "args": {"node_id": "6d566123a899afaea977c5fc0f7a2a9fef33f2946fbc4abefbc3e10ee597343f", "valid_from": "211019154318Z"}}]}
|
||||
|
||||
$ python -m ccf.proposal_generator --pretty-print --proposal-output-file add_pedro.json --vote-output-file vote_for_pedro.json set_user pedro_cert.pem
|
||||
SUCCESS | Writing proposal to ./add_pedro.json
|
||||
SUCCESS | Wrote vote to ./vote_for_pedro.json
|
||||
|
||||
$ cat add_pedro.json
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_user",
|
||||
"args": {
|
||||
"cert": "-----BEGIN CERTIFICATE-----\nMIIBsjCCATigAwIBAgIUOiTU32JZsA0dSv64hW2mrKM0phEwCgYIKoZIzj0EAwMw\nEDEOMAwGA1UEAwwFdXNlcjIwHhcNMjEwNDE0MTUyODMyWhcNMjIwNDE0MTUyODMy\nWjAQMQ4wDAYDVQQDDAV1c2VyMjB2MBAGByqGSM49AgEGBSuBBAAiA2IABBFf+FD0\nUGIyJubt8j+f8+/BP7IY6G144yF/vBNe7CJpNNRyiMZzEyN6wmEKIjsn3gU36A6E\nqNYBlbYbXD1kzlw4q/Pe/Wl3o237p8Es6LD1e1MDUFp2qUcNA6vari6QLKNTMFEw\nHQYDVR0OBBYEFDuGVragGSHoIrFA44kQRg/SKIcFMB8GA1UdIwQYMBaAFDuGVrag\nGSHoIrFA44kQRg/SKIcFMA8GA1UdEwEB/wQFMAMBAf8wCgYIKoZIzj0EAwMDaAAw\nZQIxAPx54LaqQevKrcZIr7QSCZKGFJgSxfVxovSfEqTMD+sKdWzNTqJtJ1SDav1v\nImA4iwIwBsrdevSQj4U2ynXiTJKljviDnyc47ktJVkg/Ppq5cMcEZHO4Q0H/Wq3H\nlUuVImyR\n-----END CERTIFICATE-----\n"
|
||||
"cert": "-----BEGIN CERTIFICATE-----\nMIIBszCCATigAwIBAgIUeYsXeSyujwWWSySPlaVxP0pfO/EwCgYIKoZIzj0EAwMw\nEDEOMAwGA1UEAwwFdXNlcjMwHhcNMjIwMTEyMTAxOTM0WhcNMjMwMTEyMTAxOTM0\nWjAQMQ4wDAYDVQQDDAV1c2VyMzB2MBAGByqGSM49AgEGBSuBBAAiA2IABLWb5TWU\nX9+ldfOZAyEZkbgb7n5CDZcfWXkyL6QXQI7OJb0uF9P6AOuErd/q5Vv2Mqg8LnJs\nmZafY9qZ1Z9XbfOkh5DI08PipIgDBIQ7BYIgstWege/rppcFKuqgjGm1waNTMFEw\nHQYDVR0OBBYEFOhjbOPTvy4iZ7+PFXvYY8Sm1lxcMB8GA1UdIwQYMBaAFOhjbOPT\nvy4iZ7+PFXvYY8Sm1lxcMA8GA1UdEwEB/wQFMAMBAf8wCgYIKoZIzj0EAwMDaQAw\nZgIxAJHzWMG/CeEg+lfI7gwCv4GEPqc1mZj5PT9uIvFso5NQe36L1UFhMCJDx4g0\nx7rQdwIxAJ5145d33LLc+Row4lOEAiHJpzivurLl4y5Kx6SkY3JMQbmGPJaslPWm\nxfWXoAcGhQ==\n-----END CERTIFICATE-----\n",
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
.. code-block:: json
|
||||
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "set_node_certificate_validity",
|
||||
"args": {
|
||||
"node_id": "ba9faac9683f7854c2cf0a97f57e63c260bf8d06f8183772c5655093c0af6e19",
|
||||
"valid_from": "220112101937Z",
|
||||
"validity_period_days": 366
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
.. code-block:: json
|
||||
|
||||
{
|
||||
"actions": [
|
||||
{
|
||||
"name": "transition_node_to_trusted",
|
||||
"args": {
|
||||
"node_id": "ba9faac9683f7854c2cf0a97f57e63c260bf8d06f8183772c5655093c0af6e19",
|
||||
"valid_from": "220101120000Z"
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "transition_node_to_trusted",
|
||||
"args": {
|
||||
"node_id": "5d5b09f6dcb2d53a5fffc60c4ac0d55fabdf556069d6631545f42aa6e3500f2e",
|
||||
"valid_from": "220101120000Z"
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "transition_service_to_open"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
Most HTTP client libraries and tools should have functionality for constructing and providing these JSON objects, and constitutions should be written to provide clear validation errors if a proposal is malformed.
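For example, a ``set_user`` proposal like the one shown above can be produced from a certificate file in a few lines of Python (the JSON serializer takes care of escaping the PEM's newlines as ``\n``; file names follow the earlier example):

.. code-block:: python

import json

# Read the user's certificate and embed it in a set_user proposal.
with open("pedro_cert.pem", encoding="utf-8") as f:
    cert_pem = f.read()

proposal = {"actions": [{"name": "set_user", "args": {"cert": cert_pem}}]}

with open("add_pedro.json", "w", encoding="utf-8") as f:
    json.dump(proposal, f, indent=2)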
|
||||
|
||||
A ballot's body contains a JS function which evaluates a given proposal, embedded inside a JSON object.
|
||||
These may try to confirm the precise content equality of the proposal they are considering, or put some constraints on its parameters.
|
||||
They could also be simple positive/negative votes, in a model where members fetch and validate a proposal offline before submitting their votes.
|
||||
Some example ballots which could apply to the proposals above:
|
||||
|
||||
.. code-block:: json
|
||||
|
||||
{
|
||||
"ballot": "export function vote (rawProposal, proposerId)\n
|
||||
{\n
|
||||
// Accepts any proposal\n
|
||||
return true;\n
|
||||
}"
|
||||
}
|
||||
|
||||
.. code-block:: json
|
||||
|
||||
{
|
||||
"ballot": "export function vote (rawProposal, proposerId)\n
|
||||
{\n
|
||||
// Refuses every proposal\n
|
||||
return false;\n
|
||||
}"
|
||||
}
|
||||
|
||||
.. code-block:: json
|
||||
|
||||
{
|
||||
"ballot": "export function vote (rawProposal, proposerId)\n
|
||||
{\n
|
||||
// Accepts 'set_node_certificate_validity' proposals with a max validity period of 1 year\n
|
||||
let proposal = JSON.parse(rawProposal);\n
|
||||
let action = proposal[\"actions\"][0];\n
|
||||
if (action[\"name\"] === \"set_node_certificate_validity\") {\n
|
||||
let action_args = action[\"args\"];\n
|
||||
if (action_args[\"validity_period_days\"] <= 365) {\n
|
||||
return true;\n
|
||||
}\n
|
||||
}\n
|
||||
return false;\n
|
||||
}"
|
||||
}
|
||||
|
||||
The CCF repository includes a sample Jinja template which will automatically build a ballot, doing a structural equality check against a target proposal. For example if this was run for the ``set_node_certificate_validity`` proposal above:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
# Relies on jinja-cli:
|
||||
# pip install jinja-cli
|
||||
$ jinja ballot_script.js.jinja -d proposal.json
|
||||
|
||||
export function vote (rawProposal, proposerId) {
|
||||
let proposal = JSON.parse(rawProposal);
|
||||
if (!("actions" in proposal))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
let actions = proposal["actions"];
|
||||
if (actions.length !== 1 )
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
// Check that the "set_node_certificate_validity" action is exactly what was expected
|
||||
{
|
||||
let action = actions[0];
|
||||
if (!("name" in action))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
if (action.name !== "set_node_certificate_validity")
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
|
||||
if (!("args" in action))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
let args = action.args;
|
||||
|
||||
// Check each argument
|
||||
{
|
||||
if (!("node_id" in args))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
// Compare stringified JSON representation, to cover object equality
|
||||
const expected = JSON.stringify("ba9faac9683f7854c2cf0a97f57e63c260bf8d06f8183772c5655093c0af6e19");
|
||||
if (JSON.stringify(args["node_id"]) !== expected)
|
||||
{
|
||||
return false;
|
||||
}
|
||||
}
|
||||
// Check each argument
|
||||
{
|
||||
if (!("valid_from" in args))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
// Compare stringified JSON representation, to cover object equality
|
||||
const expected = JSON.stringify("220112101937Z");
|
||||
if (JSON.stringify(args["valid_from"]) !== expected)
|
||||
{
|
||||
return false;
|
||||
}
|
||||
}
|
||||
// Check each argument
|
||||
{
|
||||
if (!("validity_period_days" in args))
|
||||
{
|
||||
return false;
|
||||
}
|
||||
|
||||
// Compare stringified JSON representation, to cover object equality
|
||||
const expected = JSON.stringify(366);
|
||||
if (JSON.stringify(args["validity_period_days"]) !== expected)
|
||||
{
|
||||
return false;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
The ``ballot.json.jinja`` template will additionally embed this script in a JSON object.
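These templates can also be rendered directly from Python with ``jinja2``, which is what CCF's own end-to-end tests do; a short sketch, assuming the templates from ``samples/templates`` are on the loader path and ``proposal.json`` holds the proposal to be checked:

.. code-block:: python

# Render the sample ballot template against an existing proposal.
import json

import jinja2

env = jinja2.Environment(
    loader=jinja2.FileSystemLoader("samples/templates"),
    undefined=jinja2.StrictUndefined,
)

with open("proposal.json", encoding="utf-8") as f:
    proposal = json.load(f)

# ballot.json.jinja embeds the generated vote script in a JSON object.
ballot_body = env.get_template("ballot.json.jinja").render(**proposal)

with open("ballot.json", "w", encoding="utf-8") as f:
    f.write(ballot_body)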
|
||||
|
||||
These proposals and votes should be sent as the body of HTTP requests as described below.
|
||||
|
||||
Submitting a New Proposal
|
||||
|
|
|
@ -1,427 +0,0 @@
|
|||
# Copyright (c) Microsoft Corporation. All rights reserved.
|
||||
# Licensed under the Apache 2.0 License.
|
||||
|
||||
import argparse
|
||||
import inspect
|
||||
import json
|
||||
import glob
|
||||
import os
|
||||
import sys
|
||||
import shutil
|
||||
import tempfile
|
||||
import jinja2
|
||||
from typing import Optional, Any, List
|
||||
|
||||
from cryptography import x509
|
||||
import cryptography.hazmat.backends as crypto_backends
|
||||
from loguru import logger as LOG # type: ignore
|
||||
|
||||
|
||||
DEFAULT_PROPOSAL_OUTPUT = "{proposal_name}_proposal.json"
|
||||
DEFAULT_VOTE_OUTPUT = "{proposal_name}_vote_for.json"
|
||||
|
||||
|
||||
def complete_proposal_output_path(
|
||||
proposal_name: str,
|
||||
proposal_output_path: Optional[str] = None,
|
||||
common_dir: str = ".",
|
||||
):
|
||||
if proposal_output_path is None:
|
||||
proposal_output_path = DEFAULT_PROPOSAL_OUTPUT.format(
|
||||
proposal_name=proposal_name
|
||||
)
|
||||
|
||||
if not proposal_output_path.endswith(".json"):
|
||||
proposal_output_path += ".json"
|
||||
|
||||
proposal_output_path = os.path.join(common_dir, proposal_output_path)
|
||||
|
||||
return proposal_output_path
|
||||
|
||||
|
||||
def complete_vote_output_path(
|
||||
proposal_name: str, vote_output_path: Optional[str] = None, common_dir: str = "."
|
||||
):
|
||||
if vote_output_path is None:
|
||||
vote_output_path = DEFAULT_VOTE_OUTPUT.format(proposal_name=proposal_name)
|
||||
|
||||
if not vote_output_path.endswith(".json"):
|
||||
vote_output_path += ".json"
|
||||
|
||||
vote_output_path = os.path.join(common_dir, vote_output_path)
|
||||
|
||||
return vote_output_path
|
||||
|
||||
|
||||
def build_proposal(
|
||||
proposed_call: str,
|
||||
args: Optional[Any] = None,
|
||||
inline_args: bool = False,
|
||||
):
|
||||
LOG.trace(f"Generating {proposed_call} proposal")
|
||||
|
||||
template_loader = jinja2.PackageLoader("ccf", "templates")
|
||||
template_env = jinja2.Environment(
|
||||
loader=template_loader, undefined=jinja2.StrictUndefined
|
||||
)
|
||||
|
||||
action = {"name": proposed_call, "args": args}
|
||||
actions = [action]
|
||||
|
||||
proposals_template = template_env.get_template("proposals.json.jinja")
|
||||
proposal = proposals_template.render(actions=actions)
|
||||
|
||||
vote_template = template_env.get_template("ballots.json.jinja")
|
||||
vote = vote_template.render(actions=actions)
|
||||
|
||||
LOG.trace(f"Made {proposed_call} proposal:\n{proposal}")
|
||||
LOG.trace(f"Accompanying vote:\n{vote}")
|
||||
|
||||
return proposal, vote
|
||||
|
||||
|
||||
def cli_proposal(func):
|
||||
func.is_cli_proposal = True
|
||||
return func
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_member(
|
||||
member_cert_path: str,
|
||||
member_enc_pubk_path: str = None,
|
||||
member_data: Any = None,
|
||||
**kwargs,
|
||||
):
|
||||
member_info = {"cert": open(member_cert_path, encoding="utf-8").read()}
|
||||
if member_enc_pubk_path is not None:
|
||||
member_info["encryption_pub_key"] = open(
|
||||
member_enc_pubk_path, encoding="utf-8"
|
||||
).read()
|
||||
if member_data is not None:
|
||||
member_info["member_data"] = member_data
|
||||
|
||||
return build_proposal("set_member", member_info, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_member(member_id: str, **kwargs):
|
||||
args = {"member_id": member_id}
|
||||
return build_proposal("remove_member", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_member_data(member_id: str, member_data: Any, **kwargs):
|
||||
proposal_args = {"member_id": member_id, "member_data": member_data}
|
||||
return build_proposal("set_member_data", proposal_args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_user(user_cert_path: str, user_data: Any = None, **kwargs):
|
||||
user_info = {"cert": open(user_cert_path, encoding="utf-8").read()}
|
||||
if user_data is not None:
|
||||
user_info["user_data"] = user_data
|
||||
return build_proposal("set_user", user_info, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_user(user_id: str, **kwargs):
|
||||
args = {"user_id": user_id}
|
||||
return build_proposal("remove_user", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_user_data(user_id: str, user_data: Any, **kwargs):
|
||||
proposal_args = {"user_id": user_id, "user_data": user_data}
|
||||
return build_proposal("set_user_data", proposal_args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_constitution(constitution_paths: List[str], **kwargs):
|
||||
concatenated = "\n".join(
|
||||
open(path, "r", encoding="utf-8").read() for path in constitution_paths
|
||||
)
|
||||
proposal_args = {"constitution": concatenated}
|
||||
return build_proposal("set_constitution", proposal_args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_js_app(bundle_path: str, disable_bytecode_cache: bool = False, **kwargs):
|
||||
# read modules
|
||||
if os.path.isfile(bundle_path):
|
||||
tmp_dir = tempfile.TemporaryDirectory(prefix="ccf")
|
||||
shutil.unpack_archive(bundle_path, tmp_dir.name)
|
||||
bundle_path = tmp_dir.name
|
||||
modules_path = os.path.join(bundle_path, "src")
|
||||
modules = read_modules(modules_path)
|
||||
|
||||
# read metadata
|
||||
metadata_path = os.path.join(bundle_path, "app.json")
|
||||
with open(metadata_path, encoding="utf-8") as f:
|
||||
metadata = json.load(f)
|
||||
|
||||
# sanity checks
|
||||
module_paths = set(module["name"] for module in modules)
|
||||
for url, methods in metadata["endpoints"].items():
|
||||
for method, endpoint in methods.items():
|
||||
module_path = endpoint["js_module"]
|
||||
if module_path not in module_paths:
|
||||
raise ValueError(
|
||||
f"{method} {url}: module '{module_path}' not found in bundle"
|
||||
)
|
||||
|
||||
proposal_args = {
|
||||
"bundle": {"metadata": metadata, "modules": modules},
|
||||
"disable_bytecode_cache": disable_bytecode_cache,
|
||||
}
|
||||
|
||||
return build_proposal("set_js_app", proposal_args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_js_app(**kwargs):
|
||||
return build_proposal("remove_js_app", **kwargs)
|
||||
|
||||
|
||||
def read_modules(modules_path: str) -> List[dict]:
|
||||
modules = []
|
||||
for path in glob.glob(f"{modules_path}/**/*", recursive=True):
|
||||
if not os.path.isfile(path):
|
||||
continue
|
||||
rel_module_name = os.path.relpath(path, modules_path)
|
||||
rel_module_name = rel_module_name.replace("\\", "/") # Windows support
|
||||
with open(path, encoding="utf-8") as f:
|
||||
js = f.read()
|
||||
modules.append({"name": rel_module_name, "module": js})
|
||||
return modules
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def refresh_js_app_bytecode_cache(**kwargs):
|
||||
return build_proposal("refresh_js_app_bytecode_cache", **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def transition_node_to_trusted(
|
||||
node_id: str, valid_from: str, validity_period_days: Optional[int] = None, **kwargs
|
||||
):
|
||||
args = {"node_id": node_id, "valid_from": valid_from}
|
||||
if validity_period_days is not None:
|
||||
args["validity_period_days"] = validity_period_days # type: ignore
|
||||
return build_proposal("transition_node_to_trusted", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_node(node_id: str, **kwargs):
|
||||
return build_proposal("remove_node", {"node_id": node_id}, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def add_node_code(code_id: str, **kwargs):
|
||||
return build_proposal("add_node_code", {"code_id": code_id}, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_node_code(code_id: str, **kwargs):
|
||||
return build_proposal("remove_node_code", {"code_id": code_id}, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def transition_service_to_open(**kwargs):
|
||||
return build_proposal("transition_service_to_open", **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def trigger_ledger_rekey(**kwargs):
|
||||
return build_proposal("trigger_ledger_rekey", **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def trigger_recovery_shares_refresh(**kwargs):
|
||||
return build_proposal("trigger_recovery_shares_refresh", **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_recovery_threshold(threshold: int, **kwargs):
|
||||
proposal_args = {"recovery_threshold": threshold}
|
||||
return build_proposal("set_recovery_threshold", proposal_args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_ca_cert_bundle(
|
||||
cert_bundle_name, cert_bundle_path, skip_checks: bool = False, **kwargs
|
||||
):
|
||||
with open(cert_bundle_path, encoding="utf-8") as f:
|
||||
cert_bundle_pem = f.read()
|
||||
|
||||
if not skip_checks:
|
||||
delim = "-----END CERTIFICATE-----"
|
||||
for cert_pem in cert_bundle_pem.split(delim):
|
||||
if not cert_pem.strip():
|
||||
continue
|
||||
cert_pem += delim
|
||||
try:
|
||||
x509.load_pem_x509_certificate(
|
||||
cert_pem.encode(), crypto_backends.default_backend()
|
||||
)
|
||||
except Exception as exc:
|
||||
raise ValueError("Cannot parse PEM certificate") from exc
|
||||
|
||||
args = {"name": cert_bundle_name, "cert_bundle": cert_bundle_pem}
|
||||
return build_proposal("set_ca_cert_bundle", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_ca_cert_bundle(cert_bundle_name, **kwargs):
|
||||
args = {"name": cert_bundle_name}
|
||||
return build_proposal("remove_ca_cert_bundle", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_jwt_issuer(json_path: str, **kwargs):
|
||||
with open(json_path, encoding="utf-8") as f:
|
||||
obj = json.load(f)
|
||||
args = {
|
||||
"issuer": obj["issuer"],
|
||||
"key_filter": obj.get("key_filter", "all"),
|
||||
"key_policy": obj.get("key_policy"),
|
||||
"ca_cert_bundle_name": obj.get("ca_cert_bundle_name"),
|
||||
"auto_refresh": obj.get("auto_refresh", False),
|
||||
"jwks": obj.get("jwks"),
|
||||
}
|
||||
return build_proposal("set_jwt_issuer", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def remove_jwt_issuer(issuer: str, **kwargs):
|
||||
args = {"issuer": issuer}
|
||||
return build_proposal("remove_jwt_issuer", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_jwt_public_signing_keys(issuer: str, jwks_path: str, **kwargs):
|
||||
with open(jwks_path, encoding="utf-8") as f:
|
||||
jwks = json.load(f)
|
||||
if "keys" not in jwks:
|
||||
raise ValueError("not a JWKS document")
|
||||
args = {"issuer": issuer, "jwks": jwks}
|
||||
return build_proposal("set_jwt_public_signing_keys", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_node_certificate_validity(
|
||||
node_id: str, valid_from: str, validity_period_days: Optional[int] = None, **kwargs
|
||||
):
|
||||
args = {"node_id": node_id, "valid_from": valid_from}
|
||||
if validity_period_days is not None:
|
||||
args["validity_period_days"] = validity_period_days # type: ignore
|
||||
return build_proposal("set_node_certificate_validity", args, **kwargs)
|
||||
|
||||
|
||||
@cli_proposal
|
||||
def set_all_nodes_certificate_validity(
|
||||
valid_from: str, validity_period_days: Optional[int] = None, **kwargs
|
||||
):
|
||||
args = {"valid_from": valid_from}
|
||||
if validity_period_days is not None:
|
||||
args["validity_period_days"] = validity_period_days # type: ignore
|
||||
return build_proposal("set_all_nodes_certificate_validity", args, **kwargs)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
parser = argparse.ArgumentParser()
|
||||
|
||||
parser.add_argument(
|
||||
"-po",
|
||||
"--proposal-output-file",
|
||||
type=str,
|
||||
help=f"Path where proposal JSON object (request body for POST /gov/proposals) will be dumped. Default is {DEFAULT_PROPOSAL_OUTPUT}",
|
||||
)
|
||||
parser.add_argument(
|
||||
"-vo",
|
||||
"--vote-output-file",
|
||||
type=str,
|
||||
help=f"Path where vote JSON object (request body for POST /gov/proposals/{{proposal_id}}/ballots) will be dumped. Default is {DEFAULT_VOTE_OUTPUT}",
|
||||
)
|
||||
parser.add_argument(
|
||||
"-i",
|
||||
"--inline-args",
|
||||
action="store_true",
|
||||
help="Create a fixed proposal script with the call arguments as literals inside "
|
||||
"the script. When not inlined, the parameters are passed separately and could "
|
||||
"be replaced in the resulting object",
|
||||
)
|
||||
parser.add_argument("-v", "--verbose", action="store_true")
|
||||
|
||||
# Auto-generate CLI args based on the inspected signatures of generator functions
|
||||
module = inspect.getmodule(inspect.currentframe())
|
||||
proposal_generators = inspect.getmembers(module, predicate=inspect.isfunction)
|
||||
subparsers = parser.add_subparsers(
|
||||
title="Possible proposals", dest="proposal_type", required=True
|
||||
)
|
||||
|
||||
for func_name, func in proposal_generators:
|
||||
# Only generate for decorated functions
|
||||
if not hasattr(func, "is_cli_proposal"):
|
||||
continue
|
||||
|
||||
subparser = subparsers.add_parser(func_name)
|
||||
parameters = inspect.signature(func).parameters
|
||||
func_param_names = []
|
||||
for param_name, param in parameters.items():
|
||||
param_name_or_flag = param_name
|
||||
add_argument_extras = {}
|
||||
if param.kind == param.VAR_POSITIONAL or param.kind == param.VAR_KEYWORD:
|
||||
continue
|
||||
if param.annotation == param.empty:
|
||||
param_type = None
|
||||
elif param.annotation == dict or param.annotation == Any:
|
||||
param_type = json.loads
|
||||
elif param.annotation == List[str]:
|
||||
add_argument_extras["nargs"] = "+"
|
||||
param_type = str # type: ignore
|
||||
elif param.annotation == bool:
|
||||
assert param.default in [
|
||||
inspect.Parameter.empty,
|
||||
False,
|
||||
], f"{param_name} must have no default value or False"
|
||||
param_type = None
|
||||
param_name_or_flag = "--" + param_name.replace("_", "-")
|
||||
add_argument_extras["action"] = "store_true"
|
||||
else:
|
||||
param_type = param.annotation
|
||||
if param.default is None:
|
||||
add_argument_extras["nargs"] = "?"
|
||||
add_argument_extras["default"] = param.default # type: ignore
|
||||
if param_type is not None:
|
||||
add_argument_extras["type"] = param_type # type: ignore
|
||||
subparser.add_argument(param_name_or_flag, **add_argument_extras) # type: ignore
|
||||
func_param_names.append(param_name)
|
||||
subparser.set_defaults(func=func, param_names=func_param_names)
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
LOG.remove()
|
||||
LOG.add(
|
||||
sys.stdout,
|
||||
format="<level>[{time:YYYY-MM-DD HH:mm:ss.SSS}] {level} | {message}</level>",
|
||||
level="TRACE" if args.verbose else "INFO",
|
||||
)
|
||||
|
||||
proposal, vote = args.func(
|
||||
**{name: getattr(args, name) for name in args.param_names},
|
||||
inline_args=args.inline_args,
|
||||
)
|
||||
|
||||
proposal_path = complete_proposal_output_path(
|
||||
args.proposal_type, proposal_output_path=args.proposal_output_file
|
||||
)
|
||||
LOG.success(f"Writing proposal to {proposal_path}")
|
||||
with open(proposal_path, "w", encoding="utf-8") as f:
|
||||
f.write(proposal)
|
||||
|
||||
vote_path = complete_vote_output_path(
|
||||
args.proposal_type, vote_output_path=args.vote_output_file
|
||||
)
|
||||
LOG.success(f"Writing vote to {vote_path}")
|
||||
with open(vote_path, "w", encoding="utf-8") as f:
|
||||
f.write(vote)
|
|
@ -1,2 +0,0 @@
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the Apache 2.0 License.
@ -39,7 +39,6 @@ setup(
python_requires=">=3.8",
install_requires=requirements,
scripts=[
path.join(PACKAGE_NAME, "proposal_generator.py"),
path.join(PACKAGE_NAME, "read_ledger.py"),
path.join(PACKAGE_NAME, "ledger_viz.py"),
path.join(PACKAGE_NAME, "split_ledger.py"),
@ -49,13 +48,5 @@ setup(
path.join(UTILITIES_PATH, "submit_recovery_share.sh"),
path.join(UTILITIES_PATH, "verify_quote.sh"),
],
package_data={
"jinja_templates": [
path.join(TEMPLATES_PATH, "ballot_script.json.jinja"),
path.join(TEMPLATES_PATH, "ballots.json.jinja"),
path.join(TEMPLATES_PATH, "macros.jinja"),
path.join(TEMPLATES_PATH, "proposals.json.jinja"),
]
},
include_package_data=True,
)
@ -0,0 +1,10 @@
{
"actions": [
{
"name": "set_user",
"args": {
"cert": {% filter tojson %}{% include cert %}{% endfilter %}
}
}
]
}
@ -7,7 +7,6 @@ import infra.proc
|
|||
import infra.net
|
||||
import infra.e2e_args
|
||||
import suite.test_requirements as reqs
|
||||
import ccf.proposal_generator
|
||||
import json
|
||||
|
||||
from loguru import logger as LOG
|
||||
|
@ -25,7 +24,9 @@ def test_cert_store(network, args):
|
|||
f.write("foo")
|
||||
f.flush()
|
||||
try:
|
||||
ccf.proposal_generator.set_ca_cert_bundle(cert_name, f.name)
|
||||
network.consortium.set_ca_cert_bundle(
|
||||
primary, cert_name, f.name, skip_checks=False
|
||||
)
|
||||
except ValueError:
|
||||
pass
|
||||
else:
|
||||
|
|
|
@ -11,6 +11,7 @@ import infra.e2e_args
|
|||
import suite.test_requirements as reqs
|
||||
import infra.logging_app as app
|
||||
import json
|
||||
import jinja2
|
||||
import requests
|
||||
import infra.crypto
|
||||
from datetime import datetime
|
||||
|
@ -105,7 +106,7 @@ def test_user(network, args, verify=True):
|
|||
# Note: This test should not be chained in the test suite as it creates
|
||||
# a new user and uses its own LoggingTxs
|
||||
primary, _ = network.find_nodes()
|
||||
new_user_local_id = f"user{3}"
|
||||
new_user_local_id = "user3"
|
||||
new_user = network.create_user(new_user_local_id, args.participants_curve)
|
||||
user_data = {"lifetime": "temporary"}
|
||||
network.consortium.add_user(primary, new_user.local_id, user_data)
|
||||
|
@ -123,6 +124,50 @@ def test_user(network, args, verify=True):
|
|||
return network
|
||||
|
||||
|
||||
@reqs.description("Validate sample Jinja templates")
|
||||
@reqs.supports_methods("log/private")
|
||||
def test_jinja_templates(network, args, verify=True):
|
||||
primary, _ = network.find_primary()
|
||||
|
||||
new_user_local_id = "bob"
|
||||
new_user = network.create_user(new_user_local_id, args.participants_curve)
|
||||
|
||||
with primary.client(new_user_local_id) as c:
|
||||
r = c.post("/app/log/private", {"id": 42, "msg": "New user test"})
|
||||
assert r.status_code == http.HTTPStatus.UNAUTHORIZED.value
|
||||
|
||||
template_loader = jinja2.ChoiceLoader(
|
||||
[
|
||||
jinja2.FileSystemLoader(args.jinja_templates_path),
|
||||
jinja2.FileSystemLoader(os.path.dirname(new_user.cert_path)),
|
||||
]
|
||||
)
|
||||
template_env = jinja2.Environment(
|
||||
loader=template_loader, undefined=jinja2.StrictUndefined
|
||||
)
|
||||
|
||||
proposal_template = template_env.get_template("set_user_proposal.json.jinja")
|
||||
proposal_body = proposal_template.render(cert=os.path.basename(new_user.cert_path))
|
||||
proposal = network.consortium.get_any_active_member().propose(
|
||||
primary, proposal_body
|
||||
)
|
||||
|
||||
ballot_template = template_env.get_template("ballot.json.jinja")
|
||||
ballot_body = ballot_template.render(**json.loads(proposal_body))
|
||||
network.consortium.vote_using_majority(primary, proposal, ballot_body)
|
||||
|
||||
with primary.client(new_user_local_id) as c:
|
||||
r = c.post("/app/log/private", {"id": 42, "msg": "New user test"})
|
||||
assert r.status_code == http.HTTPStatus.OK.value
|
||||
|
||||
network.consortium.remove_user(primary, new_user.service_id)
|
||||
with primary.client(new_user_local_id) as c:
|
||||
r = c.get("/app/log/private")
|
||||
assert r.status_code == http.HTTPStatus.UNAUTHORIZED.value
|
||||
|
||||
return network
|
||||
|
||||
|
||||
@reqs.description("Add untrusted node, check no quote is returned")
|
||||
def test_no_quote(network, args):
|
||||
untrusted_node = network.create_node("local://localhost")
|
||||
|
@ -245,6 +290,8 @@ def test_each_node_cert_renewal(network, args):
|
|||
or args.maximum_node_certificate_validity_days,
|
||||
)
|
||||
except Exception as e:
|
||||
if expected_exception is None:
|
||||
raise e
|
||||
assert isinstance(e, expected_exception)
|
||||
continue
|
||||
else:
|
||||
|
@ -298,6 +345,7 @@ def gov(args):
|
|||
test_member_data(network, args)
|
||||
test_quote(network, args)
|
||||
test_user(network, args)
|
||||
test_jinja_templates(network, args)
|
||||
test_no_quote(network, args)
|
||||
test_ack_state_digest_update(network, args)
|
||||
test_invalid_client_signature(network, args)
|
||||
|
@ -324,7 +372,15 @@ def js_gov(args):
|
|||
|
||||
|
||||
if __name__ == "__main__":
|
||||
cr = ConcurrentRunner()
|
||||
|
||||
def add(parser):
|
||||
parser.add_argument(
|
||||
"--jinja-templates-path",
|
||||
help="Path to directory containing sample Jinja templates",
|
||||
required=True,
|
||||
)
|
||||
|
||||
cr = ConcurrentRunner(add)
|
||||
|
||||
cr.add(
|
||||
"session_auth",
|
||||
|
|
|
@ -15,13 +15,37 @@ import infra.node
|
|||
import infra.crypto
|
||||
import infra.member
|
||||
from ccf.ledger import NodeStatus
|
||||
import ccf.proposal_generator
|
||||
import ccf.ledger
|
||||
from infra.proposal import ProposalState
|
||||
import shutil
|
||||
import tempfile
|
||||
import glob
|
||||
|
||||
from cryptography import x509
|
||||
import cryptography.hazmat.backends as crypto_backends
|
||||
|
||||
from loguru import logger as LOG
|
||||
|
||||
|
||||
def slurp_file(path):
|
||||
return open(path, encoding="utf-8").read()
|
||||
|
||||
|
||||
def slurp_json(path):
|
||||
return json.load(open(path, encoding="utf-8"))
|
||||
|
||||
|
||||
def read_modules(modules_path):
|
||||
modules = []
|
||||
for path in glob.glob(f"{modules_path}/**/*", recursive=True):
|
||||
if not os.path.isfile(path):
|
||||
continue
|
||||
rel_module_name = os.path.relpath(path, modules_path)
|
||||
rel_module_name = rel_module_name.replace("\\", "/") # Windows support
|
||||
modules.append({"name": rel_module_name, "module": slurp_file(path)})
|
||||
return modules
|
||||
|
||||
|
||||
class Consortium:
|
||||
def __init__(
|
||||
self,
|
||||
|
@ -116,26 +140,40 @@ class Consortium:
|
|||
for member in self.members:
|
||||
member.authenticate_session = flag
|
||||
|
||||
def make_proposal(self, proposal_name, *args, **kwargs):
|
||||
func = getattr(ccf.proposal_generator, proposal_name)
|
||||
proposal, vote = func(*args, **kwargs)
|
||||
def make_proposal(self, proposal_name, **kwargs):
|
||||
action = {
|
||||
"name": proposal_name,
|
||||
}
|
||||
if kwargs:
|
||||
args = {}
|
||||
for k, v in kwargs.items():
|
||||
if v is not None:
|
||||
args[k] = v
|
||||
action["args"] = args
|
||||
|
||||
proposal_body = {"actions": [action]}
|
||||
|
||||
trivial_vote_for = (
|
||||
"export function vote (rawProposal, proposerId) { return true }"
|
||||
)
|
||||
ballot_body = {"ballot": trivial_vote_for}
|
||||
|
||||
proposal_output_path = os.path.join(
|
||||
self.common_dir, f"{proposal_name}_proposal.json"
|
||||
)
|
||||
vote_output_path = os.path.join(
|
||||
ballot_output_path = os.path.join(
|
||||
self.common_dir, f"{proposal_name}_vote_for.json"
|
||||
)
|
||||
|
||||
LOG.debug(f"Writing proposal to {proposal_output_path}")
|
||||
with open(proposal_output_path, "w", encoding="utf-8") as f:
|
||||
f.write(proposal)
|
||||
json.dump(proposal_body, f, indent=2)
|
||||
|
||||
LOG.debug(f"Writing vote to {vote_output_path}")
|
||||
with open(vote_output_path, "w", encoding="utf-8") as f:
|
||||
f.write(vote)
|
||||
LOG.debug(f"Writing ballot to {ballot_output_path}")
|
||||
with open(ballot_output_path, "w", encoding="utf-8") as f:
|
||||
json.dump(ballot_body, f, indent=2)
|
||||
|
||||
return f"@{proposal_output_path}", f"@{vote_output_path}"
|
||||
return f"@{proposal_output_path}", f"@{ballot_output_path}"
|
||||
|
||||
def activate(self, remote_node):
|
||||
for m in self.members:
|
||||
|
@ -159,11 +197,15 @@ class Consortium:
|
|||
|
||||
proposal_body, careful_vote = self.make_proposal(
|
||||
"set_member",
|
||||
os.path.join(self.common_dir, f"{new_member_local_id}_cert.pem"),
|
||||
os.path.join(self.common_dir, f"{new_member_local_id}_enc_pubk.pem")
|
||||
cert=slurp_file(
|
||||
os.path.join(self.common_dir, f"{new_member_local_id}_cert.pem")
|
||||
),
|
||||
encryption_pub_key=slurp_file(
|
||||
os.path.join(self.common_dir, f"{new_member_local_id}_enc_pubk.pem")
|
||||
)
|
||||
if recovery_member
|
||||
else None,
|
||||
member_data,
|
||||
member_data=member_data,
|
||||
)
|
||||
|
||||
proposal = self.get_any_active_member().propose(remote_node, proposal_body)
|
||||
|
@ -286,7 +328,7 @@ class Consortium:
|
|||
LOG.info(f"Retiring node {node_to_retire.local_node_id}")
|
||||
proposal_body, careful_vote = self.make_proposal(
|
||||
"remove_node",
|
||||
node_to_retire.node_id,
|
||||
node_id=node_to_retire.node_id,
|
||||
)
|
||||
proposal = self.get_any_active_member().propose(remote_node, proposal_body)
|
||||
self.vote_using_majority(remote_node, proposal, careful_vote)
|
||||
|
@ -302,9 +344,9 @@ class Consortium:
|
|||
|
||||
proposal_body, careful_vote = self.make_proposal(
|
||||
"transition_node_to_trusted",
|
||||
node_id,
|
||||
valid_from,
|
||||
validity_period_days,
|
||||
node_id=node_id,
|
||||
valid_from=valid_from,
|
||||
validity_period_days=validity_period_days,
|
||||
)
|
||||
proposal = self.get_any_active_member().propose(remote_node, proposal_body)
|
||||
self.vote_using_majority(
|
||||
|
@ -322,7 +364,7 @@ class Consortium:
|
|||
def remove_member(self, remote_node, member_to_remove):
|
||||
LOG.info(f"Retiring member {member_to_remove.local_id}")
|
||||
proposal_body, careful_vote = self.make_proposal(
|
||||
"remove_member", member_to_remove.service_id
|
||||
"remove_member", member_id=member_to_remove.service_id
|
||||
)
|
||||
proposal = self.get_any_active_member().propose(remote_node, proposal_body)
|
||||
self.vote_using_majority(remote_node, proposal, careful_vote)
|
||||
|
@ -346,8 +388,8 @@ class Consortium:
|
|||
def add_user(self, remote_node, user_id, user_data=None):
|
||||
proposal, careful_vote = self.make_proposal(
|
||||
"set_user",
|
||||
self.user_cert_path(user_id),
|
||||
user_data,
|
||||
cert=slurp_file(self.user_cert_path(user_id)),
|
||||
user_data=user_data,
|
||||
)
|
||||
|
||||
proposal = self.get_any_active_member().propose(remote_node, proposal)
|
||||
|
@ -356,8 +398,7 @@ class Consortium:
|
|||
def add_users_and_transition_service_to_open(self, remote_node, users):
|
||||
proposal = {"actions": []}
|
||||
for user_id in users:
|
||||
with open(self.user_cert_path(user_id), encoding="utf-8") as cf:
|
||||
cert = cf.read()
|
||||
cert = slurp_file(self.user_cert_path(user_id))
|
||||
proposal["actions"].append({"name": "set_user", "args": {"cert": cert}})
|
||||
proposal["actions"].append({"name": "transition_service_to_open"})
|
||||
proposal = self.get_any_active_member().propose(remote_node, proposal)
|
||||
|
@ -373,7 +414,9 @@ class Consortium:
        when trying to use ccf.ledger to read ledger entries.
        """
        proposal, _ = self.make_proposal(
            "set_user", self.user_cert_path("user0"), {"padding": "x" * 4096 * 5}
            "set_user",
            cert=slurp_file(self.user_cert_path("user0")),
            user_data={"padding": "x" * 4096 * 5},
        )
        m = self.get_any_active_member()
        p = m.propose(remote_node, proposal)

@ -384,7 +427,7 @@ class Consortium:
            self.add_user(remote_node, u)

    def remove_user(self, remote_node, user_id):
        proposal, careful_vote = self.make_proposal("remove_user", user_id)
        proposal, careful_vote = self.make_proposal("remove_user", user_id=user_id)

        proposal = self.get_any_active_member().propose(remote_node, proposal)
        self.vote_using_majority(remote_node, proposal, careful_vote)

@ -392,8 +435,8 @@ class Consortium:
    def set_user_data(self, remote_node, user_id, user_data):
        proposal, careful_vote = self.make_proposal(
            "set_user_data",
            user_id,
            user_data,
            user_id=user_id,
            user_data=user_data,
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal)
        self.vote_using_majority(remote_node, proposal, careful_vote)

@ -401,69 +444,136 @@ class Consortium:
    def set_member_data(self, remote_node, member_service_id, member_data):
        proposal, careful_vote = self.make_proposal(
            "set_member_data",
            member_service_id,
            member_data,
            member_id=member_service_id,
            member_data=member_data,
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal)
        self.vote_using_majority(remote_node, proposal, careful_vote)

    def set_constitution(self, remote_node, constitution_paths):
        concatenated = "\n".join(slurp_file(path) for path in constitution_paths)
        proposal_body, careful_vote = self.make_proposal(
            "set_constitution", constitution_paths
            "set_constitution", constitution=concatenated
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def set_js_app(self, remote_node, app_bundle_path, disable_bytecode_cache=False):
    def set_js_app_from_dir(
        self, remote_node, bundle_path, disable_bytecode_cache=False
    ):
        if os.path.isfile(bundle_path):
            tmp_dir = tempfile.TemporaryDirectory(prefix="ccf")
            shutil.unpack_archive(bundle_path, tmp_dir.name)
            bundle_path = tmp_dir.name
        modules_path = os.path.join(bundle_path, "src")
        modules = read_modules(modules_path)

        # read metadata
        metadata_path = os.path.join(bundle_path, "app.json")
        with open(metadata_path, encoding="utf-8") as f:
            metadata = json.load(f)

        # sanity checks
        module_paths = set(module["name"] for module in modules)
        for url, methods in metadata["endpoints"].items():
            for method, endpoint in methods.items():
                module_path = endpoint["js_module"]
                if module_path not in module_paths:
                    raise ValueError(
                        f"{method} {url}: module '{module_path}' not found in bundle"
                    )

        proposal_body, careful_vote = self.make_proposal(
            "set_js_app", app_bundle_path, disable_bytecode_cache
            "set_js_app",
            bundle={"metadata": metadata, "modules": modules},
            disable_bytecode_cache=disable_bytecode_cache,
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        # Large apps take a long time to process - wait longer than normal for commit
        return self.vote_using_majority(remote_node, proposal, careful_vote, timeout=30)

    def set_js_app_from_json(
        self, remote_node, json_path, disable_bytecode_cache=False
    ):
        proposal_body, careful_vote = self.make_proposal(
            "set_js_app",
            bundle=slurp_json(json_path),
            disable_bytecode_cache=disable_bytecode_cache,
        )

        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        # Large apps take a long time to process - wait longer than normal for commit
        return self.vote_using_majority(remote_node, proposal, careful_vote, timeout=30)

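A minimal sketch of the `bundle` value these two helpers pass to the `set_js_app` action, inferred from the checks in `set_js_app_from_dir` above; the endpoint path, module name and module source are hypothetical:

    # Hypothetical bundle shape: app.json metadata plus the module sources under src/
    bundle = {
        "metadata": {
            "endpoints": {
                "/compute": {"post": {"js_module": "endpoints/math.js"}}
            }
        },
        "modules": [
            {
                "name": "endpoints/math.js",
                "module": "export function compute(request) { /* ... */ }",
            }
        ],
    }
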
    def remove_js_app(self, remote_node):
        proposal_body, careful_vote = ccf.proposal_generator.remove_js_app()
        proposal_body, careful_vote = self.make_proposal("remove_js_app")
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def refresh_js_app_bytecode_cache(self, remote_node):
        (
            proposal_body,
            careful_vote,
        ) = ccf.proposal_generator.refresh_js_app_bytecode_cache()
        proposal_body, careful_vote = self.make_proposal(
            "refresh_js_app_bytecode_cache"
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def set_jwt_issuer(self, remote_node, json_path):
        proposal_body, careful_vote = self.make_proposal("set_jwt_issuer", json_path)
        obj = slurp_json(json_path)
        args = {
            "issuer": obj["issuer"],
            "key_filter": obj.get("key_filter", "all"),
            "key_policy": obj.get("key_policy"),
            "ca_cert_bundle_name": obj.get("ca_cert_bundle_name"),
            "auto_refresh": obj.get("auto_refresh", False),
            "jwks": obj.get("jwks"),
        }
        proposal_body, careful_vote = self.make_proposal("set_jwt_issuer", **args)
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def remove_jwt_issuer(self, remote_node, issuer):
        proposal_body, careful_vote = self.make_proposal("remove_jwt_issuer", issuer)
        proposal_body, careful_vote = self.make_proposal(
            "remove_jwt_issuer", issuer=issuer
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def set_jwt_public_signing_keys(self, remote_node, issuer, jwks_path):
        obj = slurp_json(jwks_path)
        proposal_body, careful_vote = self.make_proposal(
            "set_jwt_public_signing_keys", issuer, jwks_path
            "set_jwt_public_signing_keys", issuer=issuer, jwks=obj
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

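If `make_proposal` follows the same actions/args convention, a `set_jwt_public_signing_keys` proposal presumably serializes to something like the sketch below; the issuer name and JWKS contents are illustrative only:

    # Illustrative proposal body; the jwks value is whatever slurp_json(jwks_path) returns
    proposal = {
        "actions": [
            {
                "name": "set_jwt_public_signing_keys",
                "args": {
                    "issuer": "my_issuer",
                    "jwks": {"keys": [{"kty": "RSA", "kid": "key-1"}]},
                },
            }
        ]
    }
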
    def set_ca_cert_bundle(
        self, remote_node, cert_name, cert_pem_path, skip_checks=False
        self, remote_node, cert_name, cert_bundle_path, skip_checks=False
    ):
        if not skip_checks:
            cert_bundle_pem = slurp_file(cert_bundle_path)
            delim = "-----END CERTIFICATE-----"
            for cert_pem in cert_bundle_pem.split(delim):
                if not cert_pem.strip():
                    continue
                cert_pem += delim
                try:
                    x509.load_pem_x509_certificate(
                        cert_pem.encode(), crypto_backends.default_backend()
                    )
                except Exception as exc:
                    raise ValueError("Cannot parse PEM certificate") from exc

        proposal_body, careful_vote = self.make_proposal(
            "set_ca_cert_bundle", cert_name, cert_pem_path, skip_checks=skip_checks
            "set_ca_cert_bundle",
            name=cert_name,
            cert_bundle=slurp_file(cert_bundle_path),
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def remove_ca_cert_bundle(self, remote_node, cert_name):
        proposal_body, careful_vote = self.make_proposal(
            "remove_ca_cert_bundle", cert_name
            "remove_ca_cert_bundle", name=cert_name
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

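Under the same assumption, the `set_ca_cert_bundle` proposal built above presumably carries the bundle name and the raw PEM text; both values below are placeholders:

    # Placeholder values; cert_bundle is the verbatim contents of the PEM bundle file
    proposal = {
        "actions": [
            {
                "name": "set_ca_cert_bundle",
                "args": {
                    "name": "my_ca_bundle",
                    "cert_bundle": "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n",
                },
            }
        ]
    }
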
@ -511,7 +621,7 @@ class Consortium:

    def set_recovery_threshold(self, remote_node, recovery_threshold):
        proposal_body, careful_vote = self.make_proposal(
            "set_recovery_threshold", recovery_threshold
            "set_recovery_threshold", recovery_threshold=recovery_threshold
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        proposal.vote_for = careful_vote

@ -521,12 +631,16 @@ class Consortium:
        return r

    def add_new_code(self, remote_node, new_code_id):
        proposal_body, careful_vote = self.make_proposal("add_node_code", new_code_id)
        proposal_body, careful_vote = self.make_proposal(
            "add_node_code", code_id=new_code_id
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

    def retire_code(self, remote_node, code_id):
        proposal_body, careful_vote = self.make_proposal("remove_node_code", code_id)
        proposal_body, careful_vote = self.make_proposal(
            "remove_node_code", code_id=code_id
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)


@ -535,9 +649,9 @@ class Consortium:
    ):
        proposal_body, careful_vote = self.make_proposal(
            "set_node_certificate_validity",
            node_to_renew.node_id,
            valid_from,
            validity_period_days,
            node_id=node_to_renew.node_id,
            valid_from=valid_from,
            validity_period_days=validity_period_days,
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        return self.vote_using_majority(remote_node, proposal, careful_vote)

@ -547,8 +661,8 @@ class Consortium:
    ):
        proposal_body, careful_vote = self.make_proposal(
            "set_all_nodes_certificate_validity",
            valid_from,
            validity_period_days,
            valid_from=valid_from,
            validity_period_days=validity_period_days,
        )
        proposal = self.get_any_active_member().propose(remote_node, proposal_body)
        r = self.vote_using_majority(remote_node, proposal, careful_vote)

@ -564,12 +678,10 @@ class Consortium:
        current_status = r.body.json()["service_status"]
        current_cert = r.body.json()["service_certificate"]

        expected_cert = open(
            os.path.join(self.common_dir, "networkcert.pem"), "rb"
        ).read()
        expected_cert = slurp_file(os.path.join(self.common_dir, "networkcert.pem"))

        assert (
            current_cert == expected_cert[:-1].decode()
            current_cert == expected_cert[:-1]
        ), "Current service certificate did not match with networkcert.pem"
        assert (
            current_status == status.value

@ -81,6 +81,7 @@ def get_common_folder_name(workspace, label):
class UserInfo:
    local_id: int
    service_id: str
    cert_path: str


class Network:

@ -421,8 +422,8 @@ class Network:
        self.consortium.activate(self.find_random_node())

        if args.js_app_bundle:
            self.consortium.set_js_app(
                remote_node=self.find_random_node(), app_bundle_path=args.js_app_bundle
            self.consortium.set_js_app_from_dir(
                remote_node=self.find_random_node(), bundle_path=args.js_app_bundle
            )

        for path in args.jwt_issuer:

@ -664,11 +665,10 @@ class Network:
            log_output=False,
        ).check_returncode()

        with open(
            os.path.join(self.common_dir, f"{local_user_id}_cert.pem"), encoding="utf-8"
        ) as c:
        cert_path = os.path.join(self.common_dir, f"{local_user_id}_cert.pem")
        with open(cert_path, encoding="utf-8") as c:
            service_user_id = infra.crypto.compute_cert_der_hash_hex_from_pem(c.read())
        new_user = UserInfo(local_user_id, service_user_id)
        new_user = UserInfo(local_user_id, service_user_id, cert_path)
        if record:
            self.users.append(new_user)

@ -43,7 +43,7 @@ def test_module_import(network, args):

    # Update JS app, deploying modules _and_ app script that imports module
    bundle_dir = os.path.join(THIS_DIR, "basic-module-import")
    network.consortium.set_js_app(primary, bundle_dir)
    network.consortium.set_js_app_from_dir(primary, bundle_dir)

    with primary.client("user0") as c:
        r = c.post("/app/test_module", {})

@ -60,7 +60,7 @@ def test_dynamic_module_import(network, args):

    # Update JS app, deploying modules _and_ app script that imports module
    bundle_dir = os.path.join(THIS_DIR, "dynamic-module-import")
    network.consortium.set_js_app(primary, bundle_dir)
    network.consortium.set_js_app_from_dir(primary, bundle_dir)

    with primary.client("user0") as c:
        r = c.post("/app/test_module", {})

@ -78,7 +78,9 @@ def test_bytecode_cache(network, args):
    bundle_dir = os.path.join(THIS_DIR, "basic-module-import")

    LOG.info("Verifying that app works without bytecode cache")
    network.consortium.set_js_app(primary, bundle_dir, disable_bytecode_cache=True)
    network.consortium.set_js_app_from_dir(
        primary, bundle_dir, disable_bytecode_cache=True
    )

    with primary.client("user0") as c:
        r = c.get("/node/js_metrics")

@ -92,7 +94,9 @@ def test_bytecode_cache(network, args):
    assert r.body.text() == "Hello world!"

    LOG.info("Verifying that app works with bytecode cache")
    network.consortium.set_js_app(primary, bundle_dir, disable_bytecode_cache=False)
    network.consortium.set_js_app_from_dir(
        primary, bundle_dir, disable_bytecode_cache=False
    )

    with primary.client("user0") as c:
        r = c.get("/node/js_metrics")

@ -106,7 +110,9 @@ def test_bytecode_cache(network, args):
    assert r.body.text() == "Hello world!"

    LOG.info("Verifying that redeploying app cleans bytecode cache")
    network.consortium.set_js_app(primary, bundle_dir, disable_bytecode_cache=True)
    network.consortium.set_js_app_from_dir(
        primary, bundle_dir, disable_bytecode_cache=True
    )

    with primary.client("user0") as c:
        r = c.get("/node/js_metrics")

@ -146,7 +152,7 @@ def test_app_bundle(network, args):
    bundle_path = shutil.make_archive(
        os.path.join(tmp_dir, "bundle"), "zip", bundle_dir
    )
    set_js_proposal = network.consortium.set_js_app(primary, bundle_path)
    set_js_proposal = network.consortium.set_js_app_from_dir(primary, bundle_path)

    assert (
        raw_module_name

@ -197,7 +203,7 @@ def test_dynamic_endpoints(network, args):
    bundle_dir = os.path.join(PARENT_DIR, "js-app-bundle")

    LOG.info("Deploying initial js app bundle archive")
    network.consortium.set_js_app(primary, bundle_dir)
    network.consortium.set_js_app_from_dir(primary, bundle_dir)

    valid_body = {"op": "sub", "left": 82, "right": 40}
    expected_response = {"result": 42}

@ -238,7 +244,7 @@ def test_dynamic_endpoints(network, args):
    ]
    with open(metadata_path, "w", encoding="utf-8") as f:
        json.dump(metadata, f, indent=2)
    network.consortium.set_js_app(primary, modified_bundle_dir)
    network.consortium.set_js_app_from_dir(primary, modified_bundle_dir)

    LOG.info("Checking modified endpoint is accessible without auth")
    with primary.client() as c:

@ -280,8 +286,10 @@ def test_npm_app(network, args):
    )

    LOG.info("Deploying npm app")
    bundle_dir = os.path.join(app_dir, "dist")
    network.consortium.set_js_app(primary, bundle_dir)
    bundle_path = os.path.join(
        app_dir, "dist", "bundle.json"
    )  # Produced by build step of test npm-app
    network.consortium.set_js_app_from_json(primary, bundle_path)

    LOG.info("Calling npm app endpoints")
    with primary.client("user0") as c:

@ -249,7 +249,9 @@ def run_code_upgrade_from(
    LOG.info("Upgrade to new JS app")
    # Upgrade to a version of the app containing an endpoint that
    # registers custom claims
    network.consortium.set_js_app(primary, args.new_js_app_bundle)
    network.consortium.set_js_app_from_dir(
        primary, args.new_js_app_bundle
    )
    LOG.info("Run transaction with additional claim")
    # With wait_for_sync, the client checks that all nodes, including
    # the minority of old ones, have acked the transaction

@ -6,7 +6,7 @@ import re
import infra.e2e_args
import infra.network
import infra.consortium
import ccf.proposal_generator
import infra.clients
from infra.proposal import ProposalState

import suite.test_requirements as reqs

@ -178,10 +178,9 @@ def test_governance(network, args):
    LOG.info("New non-active member should get insufficient rights response")
    current_recovery_thresold = network.consortium.recovery_threshold
    try:
        (
            proposal_recovery_threshold,
            careful_vote,
        ) = ccf.proposal_generator.set_recovery_threshold(current_recovery_thresold)
        proposal_recovery_threshold, careful_vote = network.consortium.make_proposal(
            "set_recovery_threshold", recovery_threshold=current_recovery_thresold
        )
        new_member.propose(node, proposal_recovery_threshold)
        assert False, "New non-active member should get insufficient rights response"
    except infra.proposal.ProposalNotCreated as e:

@ -199,10 +198,9 @@ def test_governance(network, args):
    assert proposal.state == infra.proposal.ProposalState.ACCEPTED

    LOG.info("New member makes a new proposal")
    (
        proposal_recovery_threshold,
        careful_vote,
    ) = ccf.proposal_generator.set_recovery_threshold(current_recovery_thresold)
    proposal_recovery_threshold, careful_vote = network.consortium.make_proposal(
        "set_recovery_threshold", recovery_threshold=current_recovery_thresold
    )
    proposal = new_member.propose(node, proposal_recovery_threshold)

    LOG.debug("Other members (non proposer) are unable to withdraw new proposal")

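For reference, the recovery-threshold proposal the test now builds through `make_proposal` is presumably just the following JSON once serialized; the threshold value is illustrative:

    # Illustrative body for the set_recovery_threshold proposal used in the test above
    proposal_recovery_threshold = {
        "actions": [
            {"name": "set_recovery_threshold", "args": {"recovery_threshold": 2}}
        ]
    }
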
@ -0,0 +1,53 @@
import { readdirSync, statSync, readFileSync, writeFileSync } from "fs";
import { join } from "path";

const args = process.argv.slice(2);

const getAllFiles = function (dirPath, arrayOfFiles) {
  arrayOfFiles = arrayOfFiles || [];

  const files = readdirSync(dirPath);
  for (const file of files) {
    const filePath = join(dirPath, file);
    if (statSync(filePath).isDirectory()) {
      arrayOfFiles = getAllFiles(filePath, arrayOfFiles);
    } else {
      arrayOfFiles.push(filePath);
    }
  }

  return arrayOfFiles;
};

const removePrefix = function (s, prefix) {
  return s.substr(prefix.length);
};

const rootDir = args[0];

const metadataPath = join(rootDir, "app.json");
const metadata = JSON.parse(readFileSync(metadataPath, "utf-8"));

const srcDir = join(rootDir, "src");
const allFiles = getAllFiles(srcDir);

// The trailing / is included so that it is trimmed in removePrefix.
// This produces "foo/bar.js" rather than "/foo/bar.js"
const toTrim = srcDir + "/";

const modules = allFiles.map(function (filePath) {
  return {
    name: removePrefix(filePath, toTrim),
    module: readFileSync(filePath, "utf-8"),
  };
});

const bundlePath = join(args[0], "bundle.json");
const bundle = {
  metadata: metadata,
  modules: modules,
};
console.log(
  `Writing bundle containing ${modules.length} modules to ${bundlePath}`
);
writeFileSync(bundlePath, JSON.stringify(bundle));
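The `bundle.json` written by this script is the input expected by `set_js_app_from_json`; a hedged usage sketch, with hypothetical paths and node names:

    # Hypothetical usage: submit the bundle produced by `node build_bundle.js dist`
    bundle_path = "dist/bundle.json"
    network.consortium.set_js_app_from_json(primary, bundle_path)
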
@ -1,7 +1,8 @@
{
  "private": true,
  "scripts": {
    "build": "del-cli -f dist/ && rollup --config && cp app.json dist/"
    "build": "del-cli -f dist/ && rollup --config && cp app.json dist/ && node build_bundle.js dist/",
    "bundle": "node build_bundle.js dist"
  },
  "type": "module",
  "dependencies": {

@ -64,9 +64,6 @@ python3.8 -m venv env
source env/bin/activate
python -m pip install -e ../../../python

# Test Python package CLI
../../../tests/test_python_cli.sh > test_python_cli.out

# Poll until service has died
while [ "$(service_http_status)" == "200" ]; do
  echo "Waiting for service to close..."

@ -1,33 +0,0 @@
#!/bin/bash
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the Apache 2.0 License.
set -ex

# This only checks that the following commands do not throw errors.
# It is expected that other tests cover correctness of the generated
# proposals, this just checks the basic usability of the CLI.

keygenerator.sh --help
keygenerator.sh --name alice
keygenerator.sh --name bob --gen-enc-key

python -m ccf.proposal_generator --help

python -m ccf.proposal_generator set_member --help
python -m ccf.proposal_generator set_member bob_cert.pem bob_enc_pubk.pem
python -m ccf.proposal_generator set_member bob_cert.pem bob_enc_pubk.pem '"Arbitrary data"'
python -m ccf.proposal_generator set_member bob_cert.pem bob_enc_pubk.pem '{"Interesting": {"nested": ["da", "ta"]}}'

python -m ccf.proposal_generator set_user --help
python -m ccf.proposal_generator set_user alice_cert.pem
python -m ccf.proposal_generator set_user alice_cert.pem '"ADMIN"'
python -m ccf.proposal_generator set_user alice_cert.pem '{"type": "ADMIN", "friendlyName": "Alice"}'

python -m ccf.proposal_generator transition_service_to_open --help
python -m ccf.proposal_generator transition_service_to_open

python -m ccf.proposal_generator transition_node_to_trusted --help
python -m ccf.proposal_generator transition_node_to_trusted 42 211019154318Z

python -m ccf.proposal_generator add_node_code --help
python -m ccf.proposal_generator add_node_code 1234abcd