* fix function name

* update version

* changelog updates

* set minor version to correct number

* update changelog

* changelog formatting

* typos

* whitespace

* remove print debug statement

* whitespace

* whitespace
This commit is contained in:
Jacob Freck 2018-10-29 12:28:26 -07:00 committed by GitHub
Parent e486536919
Commit 6d2e6c5f7b
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
3 changed files with 20 additions and 7 deletions


@@ -1,5 +1,18 @@
# Changelog
## 0.10.0 (2018-10-29)
**Breaking Changes**
* Remove deprecated SDK API code (#671) ([fc50536](https://github.com/Azure/aztk/commit/fc50536)), closes [#671](https://github.com/Azure/aztk/issues/671)
* Remove custom scripts (#673) ([9e32b4b](https://github.com/Azure/aztk/commit/9e32b4b)), closes [#673](https://github.com/Azure/aztk/issues/673)
* Replaced states with Enums for ClusterState, JobState, ApplicationState (#677) ([e486536](https://github.com/Azure/aztk/commit/e486536)), closes [#677](https://github.com/Azure/aztk/issues/677)
**Features**
* Spark retry docker pull (#672) ([18b74e4](https://github.com/Azure/aztk/commit/18b74e4)), closes [#672](https://github.com/Azure/aztk/issues/672)
* Spark scheduling target (#661) ([4408c4f](https://github.com/Azure/aztk/commit/4408c4f)), closes [#661](https://github.com/Azure/aztk/issues/661)
* Spark submit scheduling internal (#674) ([8c2bf0c](https://github.com/Azure/aztk/commit/8c2bf0c)), closes [#674](https://github.com/Azure/aztk/issues/674)
## 0.9.1 (2018-10-5)
**Bug Fixes**
* Fix: pin all node dependencies not in Pipfile (#667) ([0606598](https://github.com/Azure/aztk/commit/0606598)), closes [#667](https://github.com/Azure/aztk/issues/667)


@@ -22,8 +22,8 @@
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
major = 0
-minor = 9
-patch = 1
+minor = 10
+patch = 0
suffix = ""
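For context, these four fields typically combine into a semantic version string. The sketch below is a hypothetical illustration of that composition; the actual aztk code that assembles the version is not part of this diff:

```python
# Hypothetical sketch: assembling a semantic version string from the
# fields changed in this diff. Names here are illustrative only; the
# real aztk composition logic is outside the shown hunk.
major = 0
minor = 10
patch = 0
suffix = ""

version = "{0}.{1}.{2}{3}".format(major, minor, patch, suffix)
print(version)  # prints "0.10.0"
```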


@@ -156,7 +156,7 @@ class SshConfig:
"Please supply a username either in the ssh.yaml configuration file or with a parameter (--username)")
-def __convert_to_path(path: str):
+def _convert_to_path(path: str):
if path:
abs_path = os.path.abspath(os.path.expanduser(path))
if not os.path.exists(abs_path):
@@ -226,10 +226,10 @@ class JobConfig:
spark_configuration = config.get("spark_configuration")
if spark_configuration:
-self.spark_defaults_conf = __convert_to_path(spark_configuration.get("spark_defaults_conf"))
-self.spark_env_sh = __convert_to_path(spark_configuration.get("spark_env_sh"))
-self.core_site_xml = __convert_to_path(spark_configuration.get("core_site_xml"))
-self.jars = [__convert_to_path(jar) for jar in spark_configuration.get("jars") or []]
+self.spark_defaults_conf = _convert_to_path(spark_configuration.get("spark_defaults_conf"))
+self.spark_env_sh = _convert_to_path(spark_configuration.get("spark_env_sh"))
+self.core_site_xml = _convert_to_path(spark_configuration.get("core_site_xml"))
+self.jars = [_convert_to_path(jar) for jar in spark_configuration.get("jars") or []]
def _read_config_file(self, path: str = aztk.utils.constants.DEFAULT_SPARK_JOB_CONFIG):
"""
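The "fix function name" commit matters because of Python's name mangling: any identifier with two leading underscores that appears inside a class body is rewritten to `_ClassName__identifier`, so a module-level `__convert_to_path` cannot be called by that name from inside `JobConfig`. A minimal repro, assumed for illustration and not taken from the aztk codebase:

```python
# Assumed minimal repro of the bug this diff fixes: a module-level
# helper with a double-underscore name, referenced inside a class body.

def __helper(path):
    # Defined in module globals under the literal name "__helper".
    return path.strip()

def _helper(path):
    # Single leading underscore: not subject to name mangling.
    return path.strip()

class Demo:
    def mangled(self):
        try:
            # Inside a class body, __helper is compiled as _Demo__helper,
            # which does not exist at module scope -> NameError.
            return __helper(" x ")
        except NameError:
            return "NameError"

    def ok(self):
        # The renamed helper resolves normally.
        return _helper(" x ")

d = Demo()
print(d.mangled())  # prints "NameError"
print(d.ok())       # prints "x"
```

Renaming the helper to a single leading underscore, as this commit does, keeps it conventionally private while avoiding the mangling entirely.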