fix broken links to headers
Jun Ki Min 2019-03-28 18:37:24 -04:00 committed by GitHub
Parent 132c43ac27
Commit c98b5c54cb
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
1 changed file with 2 additions and 2 deletions


@@ -9,7 +9,7 @@ This document describes how to setup all the dependencies to run the notebooks i
 * [Compute environments](#compute-environments)
 * [Setup guide for Local or DSVM](#setup-guide-for-local-or-dsvm)
-* [Setup Requirements](#setup-requirements)
+* [Requirements](#requirements)
 * [Dependencies setup](#dependencies-setup)
 * [Register the conda environment as a kernel in Jupyter](#Register-the-conda-environment-as-a-kernel-in-Jupyter)
 * [Troubleshooting for the DSVM](#troubleshooting-for-the-dsvm)
@@ -136,7 +136,7 @@ SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true, -Dspark.worker.cleanup.a
 ## Setup guide for Azure Databricks
-### Requirements
+### Requirements of Azure Databricks
 * Databricks Runtime version 4.3 (Apache Spark 2.3.1, Scala 2.11) or greater
 * Python 3
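
For reference on why these edits repair the links (an illustrative note, not part of the commit itself): Markdown renderers such as GitHub's derive each header's anchor by lowercasing the header text and replacing spaces with hyphens, so table-of-contents links must match the current header wording. A minimal sketch of the convention, using the headers touched above:

```markdown
<!-- The anchor is generated from the header text: lowercased, spaces turned into hyphens -->
### Requirements
<!-- reachable as #requirements -->

### Requirements of Azure Databricks
<!-- reachable as #requirements-of-azure-databricks -->

<!-- Table-of-contents entries must point at the generated anchors -->
* [Requirements](#requirements)
* [Requirements of Azure Databricks](#requirements-of-azure-databricks)
```

Renaming the second header also avoids two identical "Requirements" anchors, which the renderer would otherwise disambiguate with a numeric suffix (e.g. #requirements-1).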