* conda

* Guide

* correct tag

* Update README.md

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

* Update docs/source/installation.md

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

* Sylvain's comments

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Lysandre Debut 2020-12-03 14:28:49 -05:00 committed by GitHub
Parent 9ad6194318
Commit 0c5615af66
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
5 changed files with 131 additions and 5 deletions

.github/conda/build.sh (new file, 1 addition)

@@ -0,0 +1 @@
$PYTHON setup.py install # Python command to install the script.

.github/conda/meta.yaml (new file, 48 additions)

@@ -0,0 +1,48 @@
{% set name = "transformers" %}

package:
  name: "{{ name|lower }}"
  version: "{{ TRANSFORMERS_VERSION }}"

source:
  path: ../../

build:
  noarch: python

requirements:
  host:
    - python
    - pip
    - numpy
    - dataclasses
    - packaging
    - filelock
    - requests
    - tqdm >=4.27
    - sacremoses
    - regex !=2019.12.17
    - protobuf
    - tokenizers ==0.9.4
  run:
    - python
    - numpy
    - dataclasses
    - packaging
    - filelock
    - requests
    - tqdm >=4.27
    - sacremoses
    - regex !=2019.12.17
    - protobuf
    - tokenizers ==0.9.4

test:
  imports:
    - transformers

about:
  home: https://huggingface.co
  license: Apache License 2.0
  license_file: LICENSE
  summary: "🤗Transformers: State-of-the-art Natural Language Processing for Pytorch and TensorFlow 2.0."
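The `{{ TRANSFORMERS_VERSION }}` value is not defined in the recipe itself; the release workflow below exports it before calling `conda build`. A minimal local sketch of the same build (assuming `conda-build` is installed and the command is run from the repository root):

```bash
# Supply the version the recipe expects, then build the package locally.
export TRANSFORMERS_VERSION=$(python setup.py --version)
conda build .github/conda
```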

.github/workflows/release-conda.yml (new file, 43 additions)

@@ -0,0 +1,43 @@
name: Release - Conda

on:
  push:
    tags:
      - v*

env:
  ANACONDA_API_TOKEN: ${{ secrets.ANACONDA_API_TOKEN }}

jobs:
  build_and_package:
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v1

      - name: Install miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          auto-update-conda: true
          auto-activate-base: false
          activate-environment: "build-transformers"
          channels: huggingface

      - name: Setup conda env
        run: |
          conda install -c defaults anaconda-client conda-build

      - name: Extract version
        run: echo "TRANSFORMERS_VERSION=`python setup.py --version`" >> $GITHUB_ENV

      - name: Build conda packages
        run: |
          conda info
          conda build .github/conda

      - name: Upload to Anaconda
        run: anaconda upload `conda build .github/conda --output` --force
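Once the workflow has published a release, one way to confirm the package is visible on the channel (an optional sanity check, not part of the workflow itself) is to query it from any machine:

```bash
# List the transformers builds served by the huggingface channel
conda search -c huggingface transformers
```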

README.md

@@ -137,14 +137,16 @@ The model itself is a regular [Pytorch `nn.Module`](https://pytorch.org/docs/sta
## Installation
### With pip
This repository is tested on Python 3.6+, PyTorch 1.0.0+ (PyTorch 1.3.1+ for [examples](https://github.com/huggingface/transformers/tree/master/examples)) and TensorFlow 2.0.
You should install 🤗 Transformers in a [virtual environment](https://docs.python.org/3/library/venv.html). If you're unfamiliar with Python virtual environments, check out the [user guide](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/).
First, create a virtual environment with the version of Python you're going to use and activate it.
Then, you will need to install at least one of TensorFlow 2.0, PyTorch or Flax.
Please refer to [TensorFlow installation page](https://www.tensorflow.org/install/pip#tensorflow-2.0-rc-is-available), [PyTorch installation page](https://pytorch.org/get-started/locally/#start-locally) and/or [Flax installation page](https://github.com/google/flax#quick-install) regarding the specific install command for your platform.
When TensorFlow 2.0 and/or PyTorch has been installed, 🤗 Transformers can be installed using pip as follows:
@@ -154,6 +156,18 @@ pip install transformers
If you'd like to play with the examples, you must [install the library from source](https://huggingface.co/transformers/installation.html#installing-from-source).
### With conda
Since Transformers version v4.0.0, we now have a conda channel: `huggingface`.
🤗 Transformers can be installed using conda as follows:
```bash
conda install -c huggingface transformers
```
Follow the installation pages of TensorFlow, PyTorch or Flax to see how to install them with conda.
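As an illustration (a minimal sketch based on the PyTorch conda instructions, not part of this commit), installing a CPU-only PyTorch alongside Transformers could look like:

```bash
# Illustrative: CPU-only PyTorch from the official pytorch channel,
# then Transformers from the huggingface channel.
conda install -c pytorch pytorch cpuonly
conda install -c huggingface transformers
```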
## Model architectures
**[All the model checkpoints](https://huggingface.co/models)** provided by 🤗 Transformers are seamlessly integrated from the huggingface.co [model hub](https://huggingface.co) where they are uploaded directly by [users](https://huggingface.co/users) and [organizations](https://huggingface.co/organizations).

docs/source/installation.md

@@ -12,9 +12,10 @@ must install it from source.
## Installation with pip
First you need to install one of, or both, TensorFlow 2.0 and PyTorch.
Please refer to [TensorFlow installation page](https://www.tensorflow.org/install/pip#tensorflow-2.0-rc-is-available),
[PyTorch installation page](https://pytorch.org/get-started/locally/#start-locally) and/or
[Flax installation page](https://github.com/google/flax#quick-install)
regarding the specific install command for your platform.
When TensorFlow 2.0 and/or PyTorch has been installed, 🤗 Transformers can be installed using pip as follows:
@@ -34,6 +35,12 @@ or 🤗 Transformers and TensorFlow 2.0 in one line with:
pip install transformers[tf-cpu]
```
or 🤗 Transformers and Flax in one line with:
```bash
pip install transformers[flax]
```
To check 🤗 Transformers is properly installed, run the following command:
```bash
@@ -66,6 +73,19 @@ python -c "from transformers import pipeline; print(pipeline('sentiment-analysis
to check 🤗 Transformers is properly installed.
## With conda
Since Transformers version v4.0.0, we now have a conda channel: `huggingface`.
🤗 Transformers can be installed using conda as follows:
```
conda install -c huggingface transformers
```
Follow the installation pages of TensorFlow, PyTorch or Flax to see how to install them with conda.
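To double-check which build conda actually installed (an optional verification step, not part of the official docs), you can query the environment directly:

```bash
# Show the installed transformers package and the channel it came from
conda list transformers
# Print the installed version from Python
python -c "import transformers; print(transformers.__version__)"
```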
## Caching models
This library provides pretrained models that will be downloaded and cached locally. Unless you specify a location with