Fix torch include in `op_builder/mlu/fused_adam.py` and update no-torch workflow triggers (#6584)

Changes from #6472 caused the no-torch workflow, which serves as an example of
how we build the DeepSpeed release package, to fail (so we caught this
before a release; see #6402 for more). These changes also copy the style
used to import torch in other accelerator op_builder implementations,
such as npu
[here](https://github.com/microsoft/DeepSpeed/blob/master/op_builder/npu/fused_adam.py#L8)
and hpu (op_builder/hpu/fused_adam.py at 828ddfbbda, L15).

This also updates the no-torch workflow to run on all changes to the
op_builder directory. The test runs quickly and shouldn't add any
additional testing burden there.
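The guarded-import style referenced above can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual DeepSpeed code; the class and method names are hypothetical:

```python
# Sketch of the guarded torch import used by accelerator op_builder
# modules: the module stays importable even when torch is not installed,
# e.g. in a no-torch release-packaging environment.
try:
    import torch
except ImportError:
    # Defer the failure: code paths that never touch torch still work.
    torch = None


class FusedAdamBuilderSketch:
    """Hypothetical builder; only consults torch when actually needed."""

    def torch_available(self) -> bool:
        # True only if the guarded import above succeeded.
        return torch is not None
```

With this pattern, importing the module never raises, and any hard dependency on torch is deferred to the first call that actually needs it.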

Resolves: #6576
This commit is contained in:
Logan Adams 2024-09-27 13:32:48 -07:00, committed by GitHub
Parent: 828ddfbbda
Commit: 8cded575a9
GPG key ID: B5690EEEBB952194
2 changed files: 6 additions, 1 deletion

.github/workflows/no-torch.yml

@@ -5,6 +5,7 @@ on:
pull_request:
paths:
- '.github/workflows/no-torch.yml'
- 'op_builder/**'
schedule:
- cron: "0 0 * * *"

op_builder/mlu/fused_adam.py

@@ -5,7 +5,11 @@
# DeepSpeed Team
 from .builder import MLUOpBuilder

-import torch
+try:
+    import torch
+except ImportError as e:
+    pass

 class MLUFusedAdam: