Click on the following menus to see details:

<details>
<summary><strong><em>Set PySpark environment variables on Linux or MacOS</em></strong></summary>
To set these variables every time the environment is activated, we can follow the steps of this [guide](https://conda.io/docs/user-guide/tasks/manage-environments.html#macos-and-linux).
First, get the path where the `reco_pyspark` environment is installed:

```bash
# Find the installation path of the reco_pyspark environment
RECO_ENV=$(conda env list | grep reco_pyspark | awk '{print $NF}')

# Create the directories where the activation/deactivation scripts live
mkdir -p $RECO_ENV/etc/conda/activate.d
mkdir -p $RECO_ENV/etc/conda/deactivate.d
```
You also need to find where Spark is installed and set the `SPARK_HOME` variable; on the DSVM, `SPARK_HOME=/dsvm/tools/spark/current`.
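If you are not on the DSVM and are unsure where Spark is installed, one way to locate it is to resolve the `spark-submit` binary back to its installation root. This is only a sketch, assuming `spark-submit` is on your `PATH`; note that `readlink -f` requires GNU coreutils, so on macOS use `greadlink -f` from the `coreutils` package:

```bash
# Hypothetical helper: derive SPARK_HOME from the location of spark-submit.
# spark-submit normally lives in $SPARK_HOME/bin, so strip two path levels.
dirname "$(dirname "$(readlink -f "$(which spark-submit)")")"
```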
Then, create the file `$RECO_ENV/etc/conda/activate.d/env_vars.sh` and add:

```bash
#!/bin/sh

RECO_ENV=$(conda env list | grep reco_pyspark | awk '{print $NF}')

# Point PySpark and its driver to the Python interpreter of the conda environment
export PYSPARK_PYTHON=$RECO_ENV/bin/python
export PYSPARK_DRIVER_PYTHON=$RECO_ENV/bin/python

# Back up any existing SPARK_HOME so it can be restored on deactivation
export SPARK_HOME_BACKUP=$SPARK_HOME
export SPARK_HOME=/dsvm/tools/spark/current
```
This will export the variables every time we do `conda activate reco_pyspark`. To unset these variables when we deactivate the environment, create the file `$RECO_ENV/etc/conda/deactivate.d/env_vars.sh` and add:
```bash
#!/bin/sh

unset PYSPARK_PYTHON
unset PYSPARK_DRIVER_PYTHON

# Restore the SPARK_HOME that was backed up on activation
export SPARK_HOME=$SPARK_HOME_BACKUP
unset SPARK_HOME_BACKUP
```
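To check that both hooks took effect, reactivate the environment and inspect the variables; the expected values below assume the DSVM path used above:

```bash
conda deactivate
conda activate reco_pyspark
echo $PYSPARK_PYTHON   # expected: <path to reco_pyspark>/bin/python
echo $SPARK_HOME       # expected: /dsvm/tools/spark/current
```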
</details>