Bug: Move commands under app to cluster (#82)

* moved azb spark app commands to azb spark cluster

* fixed docs to reflect new cli structure

* renamed commands for clarity
Jacob Freck 2017-09-19 08:29:23 -07:00 committed by GitHub
Parent bd2878ecdb
Commit 43dd8e57f1
7 changed files with 19 additions and 38 deletions

View file

@@ -87,7 +87,7 @@ More information regarding using a cluster can be found in the [cluster document
 Now you can submit jobs to run against the cluster:
 ```
-azb spark app submit \
+azb spark cluster submit \
 --id <my-cluster-id> \
 --name <my-job-name> \
 [options] \
@@ -103,7 +103,7 @@ The output of spark-submit will be streamed to the console. Use the `--no-wait`
 If you decided not to tail the log when submitting the job, or want to read it again, you can use this command.
 ```bash
-azb spark app logs \
+azb spark cluster logs \
 --id <my-cluster-id> \
 --name <my-job-name>
 [--tail] # Tails the log if the task is still running
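For example, with the placeholders filled in (the cluster id `spark` and job name `pipy` are reused from the submit examples later in these docs):

```sh
# Read back the log of a job that already ran
azb spark cluster logs --id spark --name pipy

# Or keep streaming the log while the job is still running
azb spark cluster logs --id spark --name pipy --tail
```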

View file

@@ -2,16 +2,16 @@
 Submitting a job to your Spark cluster in Thunderbolt mimics the experience of a typical standalone cluster. A Spark job will be submitted to the system and run to completion.
 
 ## Spark-Submit
-The spark-submit experience is mostly the same as any regular Spark cluster with a few minor differences. You can take a look at azb spark app --help for more detailed information and options.
+The spark-submit experience is mostly the same as any regular Spark cluster with a few minor differences. You can take a look at azb spark cluster --help for more detailed information and options.
 
 Run a Spark job:
 ```sh
-azb spark app submit --id <name_of_spark_cluster> --name <name_of_spark_job> <executable> <executable_params>
+azb spark cluster submit --id <name_of_spark_cluster> --name <name_of_spark_job> <executable> <executable_params>
 ```
 
 For example, run a local pi.py file on a Spark cluster:
 ```sh
-azb spark app submit --id spark --name pipy example-jobs/python/pi.py 100
+azb spark cluster submit --id spark --name pipy example-jobs/python/pi.py 100
 ```
 
 NOTE: The job name (--name) must be at least 3 characters long, may contain only alphanumeric characters and hyphens (underscores are not allowed), and cannot contain uppercase letters.
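A few illustrative job names against those rules (hypothetical values, not from this commit):

```sh
azb spark cluster submit --id spark --name pi-py-100 example-jobs/python/pi.py 100  # valid
# Rejected by the rules above: "pi" (too short), "pi_py" (underscore), "PiPy" (uppercase)
```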
@@ -24,7 +24,7 @@ If you have set up a [SSH tunnel](./10-clusters.md#SSH%20and%20Port%20Forwarding
 The default setting when running a job is --wait. This will simply submit a job to the cluster and wait for the job to run. If you want to just submit the job and not wait, use the --no-wait flag and tail the logs manually:
 ```sh
-azb spark app submit --id spark --name pipy --no-wait example-jobs/pi.py 1000
+azb spark cluster submit --id spark --name pipy --no-wait example-jobs/pi.py 1000
 ```
 ```sh

View file

@@ -1,26 +0,0 @@
-import argparse
-import typing
-
-from . import submit
-from . import app_logs
-
-
-def setup_parser(parser: argparse.ArgumentParser):
-    subparsers = parser.add_subparsers(
-        title="Actions", dest="app_action", metavar="<app_action>")
-    submit_parser = subparsers.add_parser(
-        "submit", help="Submit a new spark job")
-    logs_parser = subparsers.add_parser(
-        "logs", help="Action on an app")
-
-    submit.setup_parser(submit_parser)
-    app_logs.setup_parser(logs_parser)
-
-
-def execute(args: typing.NamedTuple):
-    actions = dict(
-        submit=submit.execute,
-        logs=app_logs.execute,
-    )
-    func = actions[args.app_action]
-    func(args)
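Deleting this module removes the old app action group entirely; a before/after sketch of the equivalent invocations, using the values from the docs above:

```sh
# Before this commit (removed):
azb spark app submit --id spark --name pipy example-jobs/python/pi.py 100
# After:
azb spark cluster submit --id spark --name pipy example-jobs/python/pi.py 100
```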

View file

@@ -6,6 +6,8 @@ from . import cluster_delete
 from . import cluster_get
 from . import cluster_list
 from . import cluster_ssh
+from . import cluster_app_logs
+from . import cluster_submit
 
 
 class ClusterAction:
@@ -15,6 +17,8 @@ class ClusterAction:
     get = "get"
     list = "list"
     ssh = "ssh"
+    app_logs = "app-logs"
+    submit = "submit"
 
 
 def setup_parser(parser: argparse.ArgumentParser):
@@ -32,8 +36,13 @@ def setup_parser(parser: argparse.ArgumentParser):
         ClusterAction.get, help="Get information about a cluster")
     list_parser = subparsers.add_parser(
         ClusterAction.list, help="List clusters in your account")
+    app_logs_parser = subparsers.add_parser(
+        "app-logs", help="Get the logs from a submitted app")
     ssh_parser = subparsers.add_parser(
         ClusterAction.ssh, help="SSH into the master node of a cluster")
+    submit_parser = subparsers.add_parser(
+        "submit", help="Submit a new spark job to a cluster")
 
     cluster_create.setup_parser(create_parser)
     cluster_add_user.setup_parser(add_user_parser)
@@ -41,6 +50,8 @@ def setup_parser(parser: argparse.ArgumentParser):
     cluster_get.setup_parser(get_parser)
     cluster_list.setup_parser(list_parser)
     cluster_ssh.setup_parser(ssh_parser)
+    cluster_submit.setup_parser(submit_parser)
+    cluster_app_logs.setup_parser(app_logs_parser)
 
 
 def execute(args: typing.NamedTuple):
@@ -52,6 +63,8 @@ def execute(args: typing.NamedTuple):
     actions[ClusterAction.get] = cluster_get.execute
     actions[ClusterAction.list] = cluster_list.execute
     actions[ClusterAction.ssh] = cluster_ssh.execute
+    actions[ClusterAction.submit] = cluster_submit.execute
+    actions[ClusterAction.app_logs] = cluster_app_logs.execute
 
     func = actions[args.cluster_action]
     func(args)
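With these registrations, the cluster action group now also exposes submit and app-logs. A minimal sketch of the new invocations; cluster_app_logs.setup_parser is not shown in this diff, so the --id/--name flags for app-logs are an assumption based on the documented logs command:

```sh
azb spark cluster submit --id spark --name pipy example-jobs/python/pi.py 100
azb spark cluster app-logs --id spark --name pipy  # flags assumed, see note above
```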

View file

@@ -2,8 +2,6 @@ import argparse
 import typing
 
 from . import cluster
-from . import submit
-from . import app
 
 
 def setup_parser(parser: argparse.ArgumentParser):
@@ -13,17 +11,13 @@ def setup_parser(parser: argparse.ArgumentParser):
     cluster_parser = subparsers.add_parser(
         "cluster", help="Commands to manage a cluster")
 
-    app_parser = subparsers.add_parser(
-        "app", help="Action on an app")
 
     cluster.setup_parser(cluster_parser)
-    app.setup_parser(app_parser)
 
 
 def execute(args: typing.NamedTuple):
     actions = dict(
         cluster=cluster.execute,
-        app=app.execute,
     )
     func = actions[args.action]
     func(args)
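After this change, cluster is the only action left under azb spark, so command discovery starts there (as the updated docs suggest):

```sh
azb spark cluster --help
```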