add gpu env readme and setup
This commit is contained in:
Parent: ff208c61fc
Commit: f4b7e9132b
@@ -68,6 +68,7 @@ This project welcomes contributions and suggestions. Before contributing, please
| Build Type | Branch | Status | | Branch | Status |
| --- | --- | --- | --- | --- | --- |
| **Linux CPU** | master | [![Status](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_apis/build/status/nightly?branchName=master)](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_build/latest?definitionId=4792) | | staging | [![Status](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_apis/build/status/nightly_staging?branchName=staging)](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_build/latest?definitionId=4594) |
| **Linux GPU** | master | [![Status](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_apis/build/status/nightly_gpu?branchName=master)](https://msdata.visualstudio.com/DefaultCollection/AlgorithmsAndDataScience/_build/latest?definitionId=4997) | | staging | [![Status](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_apis/build/status/nightly_gpu_staging?branchName=staging)](https://msdata.visualstudio.com/DefaultCollection/AlgorithmsAndDataScience/_build/latest?definitionId=4998)|
| **Linux Spark** | master | [![Status](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_apis/build/status/nightly_spark?branchName=master)](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_build/latest?definitionId=4804) | | staging | [![Status](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_apis/build/status/nightly_spark_staging?branchName=staging)](https://msdata.visualstudio.com/AlgorithmsAndDataScience/_build/latest?definitionId=4805)|
**NOTE** - the tests are executed every night; we use `pytest` for testing the Python utilities in [reco_utils](reco_utils) and [notebooks](notebooks).
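
To illustrate the style of test that `pytest` picks up nightly, here is a minimal sketch; `precision_at_k` is a hypothetical stand-in for illustration only, not an actual `reco_utils` function:

```python
# Hypothetical sketch of a pytest-style unit test, in the spirit of the
# nightly tests over reco_utils. precision_at_k is an illustrative
# stand-in, not the repo's actual implementation.
def precision_at_k(recommended, relevant, k):
    """Fraction of the top-k recommended items that are relevant."""
    hits = sum(1 for item in recommended[:k] if item in set(relevant))
    return hits / k

def test_precision_at_k():
    # top-2 is ["a", "b"]; only "b" is relevant -> 1/2
    assert precision_at_k(["a", "b", "c", "d"], ["b", "d"], k=2) == 0.5
```

Any function named `test_*` in a file that `pytest` collects is run automatically, which is what makes the nightly builds above fail on a regression.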
SETUP.md (24 changed lines)

@@ -22,6 +22,7 @@ We have different compute environments, depending on the kind of machine

Environments supported to run the notebooks on the DSVM:
* Python CPU
* Python GPU
* PySpark
Environments supported to run the notebooks on Azure Databricks:
@@ -58,6 +59,18 @@ Assuming the repo is cloned as `Recommenders` in the local system, to install th

</details>
<details>
<summary><strong><em>Python GPU environment</em></strong></summary>
Assuming you have a GPU machine, install the Python GPU environment (which also installs the CPU environment by default):
```
cd Recommenders
./scripts/generate_conda_file.sh --gpu
conda env create -n reco_gpu -f conda_gpu.yaml
```
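
Before creating the GPU environment, it can help to confirm that an NVIDIA driver is actually visible on the machine. A minimal, stdlib-only sketch (not part of the repo) that probes for `nvidia-smi`:

```python
import shutil
import subprocess

def gpu_available():
    """Best-effort check for a usable NVIDIA driver by probing nvidia-smi."""
    if shutil.which("nvidia-smi") is None:
        return False
    # nvidia-smi exits non-zero when no device or driver is present
    return subprocess.run(["nvidia-smi"], capture_output=True).returncode == 0

print("GPU detected:", gpu_available())
```

If this prints `False`, the `--gpu` environment will still install, but GPU-dependent notebooks will fall back to errors at runtime, so it is worth checking first.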
</details>
<details>
<summary><strong><em>PySpark environment</em></strong></summary>
@@ -86,6 +99,17 @@ unset PYSPARK_DRIVER_PYTHON
```
</details>
<details>
<summary><strong><em>All environments</em></strong></summary>
To install all three environments:
```
cd Recommenders
./scripts/generate_conda_file.sh --gpu --pyspark
conda env create -n reco_full -f conda_full.yaml
```
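
After any of the `conda env create` commands above, the environment still has to be activated before use (standard conda workflow; `reco_full` here matches the name used in the example above):

```
conda activate reco_full
# confirm the interpreter now comes from the new environment
python -c "import sys; print(sys.executable)"
```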
</details>
### Register the conda environment in Jupyter notebook
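
A common way to register a conda environment as a Jupyter kernel uses `ipykernel` (assuming it is installed in the environment; the name `reco_gpu` matches the GPU example above):

```
conda activate reco_gpu
python -m ipykernel install --user --name reco_gpu --display-name "Python (reco_gpu)"
```

The kernel then appears under that display name in the Jupyter notebook kernel menu.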