Changelog

0.9.1 (2018-10-05)

Bug Fixes

  • pin all node dependencies not in Pipfile (#667) (0606598), closes #667
  • vsts integration tests block (#657) (4a60c8a), closes #657
  • vsts multiline secrets (#668) (cb62207), closes #668

0.9.0 (2018-08-30)

Breaking Changes

  • spark roll back scheduling disable (#653) (93615d9), closes #653
  • remove custom scripts (#650) (442228a), closes #650
  • 0.9.0 deprecated code removal (#645) (eef36dc), closes #645
  • SDK refactor (#622) (b18eb69), closes #622

Features

  • Add ability to specify docker run options in toolkit config (#613) (9d554c3), closes #613 #3
  • add brief flag to debug tool (#634) (b7bdd8c), closes #634
  • first run docs update (#644) (9098533), closes #644
  • SDK refactor (#622) (b18eb69), closes #622

Bug Fixes

  • diagnostics function write error result bug (#649) (293f297), closes #649
  • expose get cluster configuration API (#648) (7c14648), closes #648
  • remove bad node scripts import (#652) (0a9ce94), closes #652
  • typo in vsts build (#654) (7c37b06), closes #654
  • update incompatible dependencies in setup.py (#639) (f98d037), closes #639

0.8.1 (2018-06-20)

Bug Fixes

  • docs links version (#614) (a8f8e92), closes #614
  • set defaults for SparkConfiguration, add tests (#606) (5306a2a), closes #606
  • spark debug tool filter out .venv, make debug tool testable (#612) (4e0b1ec), closes #612
  • Suppress msrest warnings (#611) (883980d), closes #611

0.8.0 (2018-06-12)

Deprecated Features

  • ClusterConfiguration fields vm_count and vm_count_low_pri have been renamed to size and size_low_priority (see the sketch after this list)
  • command line flag --size-low-pri for aztk spark cluster create has been replaced with --size-low-priority
  • the default secrets.yaml block has been deprecated; place all of its child parameters directly at the root of the file
  • Spark version 1.6 has been deprecated
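
For reference, a minimal sketch of the new field names. The constructor signature is assumed, and the cluster id and VM size below are placeholders:

```python
# Sketch only: size / size_low_priority follow the rename above; the other
# values are placeholders and the constructor signature is assumed.
from aztk.spark import models

cluster_config = models.ClusterConfiguration(
    cluster_id="example-cluster",  # placeholder cluster id
    vm_size="standard_d2_v2",      # placeholder VM size
    size=2,                        # previously vm_count
    size_low_priority=0,           # previously vm_count_low_pri
)
```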

Added Features

  • add cluster list quiet flag, ability to compose with delete (#581) (88d0419), closes #581
  • add node run command (#572) (af449dc), closes #572
  • Add VSTS CI (#561) (66037fd), closes #561
  • Disable scheduling on group of nodes (#540) (8fea9ce), closes #540
  • New Models design with auto validation, default and merging (#543) (02f336b), closes #543
  • nvBLAS and OpenBLAS plugin (#539) (603a413), closes #539
  • pure python ssh (#577) (f16aac0), closes #577
  • Support passing of remote executables via aztk spark cluster submit (#549) (f6735cc), closes #549
  • TensorflowOnSpark python plugin (#525) (1527929), closes #525
  • Conda, Apt-Get and Pip Install Plugins (#594) (fbf1bab), closes #594
  • Warnings show stacktrace on verbose (#587) (b9a863b), closes #587

Bug Fixes

  • add toolkit to sdk docs and example (d688c9c)
  • --size-low-pri being ignored (#593) (fa3ac0e), closes #593
  • fix typos (#595) (7d7a814), closes #595
  • getting started script reuse aad application (#569) (3d16cf3), closes #569
  • models v2 deserialization (#584) (1eeff23), closes #584
  • optimize start task (#582) (e5e529a), closes #582
  • remove deprecated vm_count call (#586) (dbde8bc), closes #586
  • Remove old spark-defaults.conf jars (#567) (8b8cd62), closes #567
  • set logger to stdout (#588) (3f0c8f9), closes #588
  • switch create user to pool wide (#574) (49a890a), closes #574
  • switch from pycryptodome to pycryptodomex (#564) (19dde42), closes #564
  • allow cluster config to be printed when no username has been set (#597) (1cc71c7), closes #597

0.7.1 (2018-05-11)

Bug Fixes

  • create virtual environment even if container exists (2db7b00)
  • gitattributes for jar files (#548) (a18660b), closes #548
  • pass docker repo command back to the cluster config (#538) (a99bbe1), closes #538

0.7.0 (2018-05-01)

AZTK is now published on pip! Documentation has migrated to readthedocs.

This release includes a number of breaking changes. Please follow the migration guide when upgrading from 0.6.0.

Breaking Changes

  • Moved docker_repo under a new toolkit key. docker_repo is now only used for custom Docker images; use toolkit for supported images (see the sketch after this list).
  • Docker images have been refactored and moved to a different Dockerhub repository. The new supported images are not backwards compatible. See the documentation on configuration files.
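
As an illustration, a sketch of the same split in the SDK, assuming a Toolkit model with software, version, and docker_repo fields; the version strings and image name below are placeholders. In cluster.yaml the equivalent change is moving docker_repo under the toolkit key:

```python
# Sketch only: assumes aztk.spark.models.Toolkit exposes these fields.
from aztk.spark import models

# Supported image: select it through the toolkit, no docker_repo needed.
toolkit = models.Toolkit(software="spark", version="2.2.0")  # placeholder version

# Custom image: docker_repo is still available, but only for this case.
custom_toolkit = models.Toolkit(
    software="spark",
    version="2.2.0",                    # placeholder version
    docker_repo="myrepo/custom-image",  # placeholder custom image
)
```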

Added Features

  • add internal flag to node commands (#482) (1eaa1b6), closes #482
  • Added custom scripts functionality for plugins with the CLI (deprecates custom scripts) (#517) (c98df7d), closes #517
  • disable msrestazure keyring log (#509) (3cc43c3), closes #509
  • enable mixed mode for jobs (#442) (8d00a2c), closes #442
  • getting started script (#475) (7ef721f), closes #475
  • JupyterLab plugin (#459) (da61337), closes #459
  • managed storage for clusters and jobs (#443) (8aa1843), closes #443
  • match cluster submit exit code in cli (#478) (8889059), closes #478
  • Plugin V2: Running plugin on host (#461) (de78983), closes #461
  • Plugins (#387) (c724d94), closes #387
  • Pypi auto deployment (#428) (c237501), closes #428
  • Readthedocs support (#497) (e361c3b), closes #497
  • refactor docker images (#510) (779bffb), closes #510
  • Spark add output logs flag (#468) (32de752), closes #468
  • spark debug tool (#455) (44a0765), closes #455
  • spark ui proxy plugin (#467) (2e995b4), closes #467
  • Spark vnet custom dns hostname fix (#490) (61e7c59), closes #490
  • New Toolkit configuration (#507) (7a7e63c), closes #507

Bug Fixes

  • add gitattributes file (#470) (82ad029), closes #470
  • add plugins to cluster_install_cmd call (#423) (216f63d), closes #423
  • add spark.history.fs.logDirectory to required keys (#456) (4ef3dd0), closes #456
  • add support for jars, pyfiles, files in Jobs (#408) (2dd7891), closes #408
  • add timeout handling to cluster_run and copy (#524) (47000a5), closes #524
  • azure file share not being shared with container (#521) (07ac9b7), closes #521
  • Dependency issue with keyring not having good dependencies (#504) (5e79a2c), closes #504
  • filter job submission clusters out of cluster list (#409) (1c31335), closes #409
  • fix aztk cluster submit paths, imports (#464) (c1f43c7), closes #464
  • fix broken spark init command (#486) (a33bdbc), closes #486
  • fix job submission cluster data issues (#533) (9ccc1c6), closes #533
  • fix spark job submit path (#474) (ee1e61b), closes #474
  • make node scripts upload in memory (#519) (0015e22), closes #519
  • pypi long description (#450) (db7a2ef), closes #450
  • remove unnecessary example (#417) (f1e3f7a), closes #417
  • Remove unused ssh plugin flags (#488) (be8cd2a), closes #488
  • set explicit file open encoding (#448) (5761a36), closes #448
  • Spark shuffle service worker registration fail (#492) (013f6e4), closes #492
  • throw error if submitting before master elected (#479) (a59fe8b), closes #479
  • hdfs using wrong conditions (#515) (a00dbb7), closes #515
  • AZTK_IS_MASTER not set on worker and failing (#506) (b8a3fcc), closes #506
  • VNet required error now showing if using mixed mode without it (#440) (9253aac), closes #440
  • Worker on master flag ignored and standardize boolean environment (#514) (5579d95), closes #514
  • Fix job configuration option for aztk spark job submit command (#435) (4be5ac2), closes #435
  • Fix keyring (#505) (12450fb), closes #505
  • Fix the endpoint (#437) (bcefca3), closes #437
  • Fix typo in command_builder 'expecity' -> 'explicitly' (#447) (27822f4), closes #447
  • Fix typo load_aztk_screts -> load_aztk_secrets (#421) (6827181), closes #421
  • Update file to point at master branch (#501) (4ba3c9d), closes #501
  • Update storage sdk from 0.33.0 to 1.1.0 (#439) (f2eb1a4), closes #439

Internal Changes

  • Internal: Cluster data helpers and upload_node_script into cluster_data module (#401) (2bed496), closes #401
  • Internal: Move node scripts under aztk and upload all aztk to cluster (#433) (dfbfead), closes #433

0.6.0 Mixed Mode, Cluster Run & Copy

Features:

  • aztk spark init customization flags
  • aztk spark cluster run command added
  • aztk spark cluster copy command added
  • enable Spark dynamic allocation by default
  • add SDK support for file-like objects
  • add Spark integration tests
  • add worker_on_master flag for cluster and job submission modes
  • Spark driver now runs on the master node in single-application job submission mode

Bug Fixes:

  • load jars in .aztk/jars/ in job submission mode
  • replace outdated error in cluster_create
  • fix type error crash if no jars are specified in job submission
  • stop using mutable default parameters
  • print job application code if exit_code is 0
  • job submission crash if executor or driver cores specified
  • wrong error thrown if user added before master node picked

0.5.1 Job Submission, AAD, VNET

Major Features:

  • job submission
  • Azure Active Directory (AAD) authentication
  • virtual network (VNet) support

Breaking Changes:

  • The inputs to SecretsConfiguration have changed; check the SDK for the new format (a rough sketch follows below)
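
For orientation, a rough sketch of the service-principal variant, with class and field names assumed from later releases and every value a placeholder; the SDK remains the authoritative reference:

```python
# Sketch only: ServicePrincipalConfiguration and its field names are assumed
# from later releases; replace every placeholder with real values.
from aztk.spark import models

secrets = models.SecretsConfiguration(
    service_principal=models.ServicePrincipalConfiguration(
        tenant_id="<tenant-id>",
        client_id="<client-id>",
        credential="<client-secret>",
        batch_account_resource_id="<batch-account-resource-id>",
        storage_account_resource_id="<storage-account-resource-id>",
    ),
    ssh_pub_key="<public-ssh-key>",
)
```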

0.5.0 SDK

0.3.1 Cluster list now only lists Spark clusters

0.3.0 New CLI with one command aztk

0.2.0 Spark use start task instead of

0.1.0 Initial release