Docs: fix broken links and redirects (#2267)

Adam J. Stewart 2024-08-31 14:38:28 +02:00 committed by GitHub
Parent 16d90f29c8
Commit c5380abe28
No known key found for this signature
GPG key ID: B5690EEEBB952194
25 changed files: 60 additions and 69 deletions

View file

@@ -215,7 +215,7 @@
"id": "dWidC6vDrMON"
},
"source": [
"If you do not want to utilize the `ClassificationTask` functionality for your experiments, you can also just create a [timm](https://github.com/rwightman/pytorch-image-models) model with pretrained weights from TorchGeo as follows:"
"If you do not want to utilize the `ClassificationTask` functionality for your experiments, you can also just create a [timm](https://github.com/huggingface/pytorch-image-models) model with pretrained weights from TorchGeo as follows:"
]
},
{

View file

@@ -52,7 +52,7 @@ Tests
TorchGeo uses `GitHub Actions <https://docs.github.com/en/actions>`_ for Continuous Integration. We run a suite of unit tests on every commit to ensure that pull requests don't break anything. If you submit a pull request that adds or modifies any Python code, we require unit tests for that code before the pull request can be merged.
For example, if you add a new dataset in ``torchgeo/datasets/foo.py``, you'll need to create corresponding unit tests in ``tests/datasets/test_foo.py``. The easiest way to do this is to find unit tests for similar datasets and modify them for your dataset. These tests can then be run with `pytest <https://docs.pytest.org/>`_:
For example, if you add a new dataset in ``torchgeo/datasets/foo.py``, you'll need to create corresponding unit tests in ``tests/datasets/test_foo.py``. The easiest way to do this is to find unit tests for similar datasets and modify them for your dataset. These tests can then be run with `pytest <https://docs.pytest.org/en/stable/>`_:
.. code-block:: console
@@ -79,7 +79,7 @@ For example, if you add a new dataset in ``torchgeo/datasets/foo.py``, you'll ne
From this output, you can see that all tests pass, but many lines of code in ``torchgeo/datasets/foo.py`` are not being tested, including 376--403, 429--496, etc. In order for this pull request to be merged, additional tests will need to be added until there is 100% test coverage.
These tests require `pytest <https://docs.pytest.org/>`_ and `pytest-cov <https://pytest-cov.readthedocs.io/>`_ to be installed.
These tests require `pytest <https://docs.pytest.org/en/stable/>`_ and `pytest-cov <https://pytest-cov.readthedocs.io/en/latest/>`_ to be installed.
.. note:: If you add a new dataset, the tests will require some form of data to run. This data should be stored in ``tests/data/<dataset>``. Please don't include real data, as this may violate the license the data is distributed under, and can involve very large file sizes. Instead, create fake data examples using the instructions found `here <https://github.com/microsoft/torchgeo/blob/main/tests/data/README.md>`__.
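For context on the testing convention described above, here is a minimal sketch of what a unit test file like ``tests/datasets/test_foo.py`` might contain. ``FakeFoo`` is a hypothetical stand-in defined inline for illustration; a real test would import the actual dataset class and point it at the fake data in ``tests/data/foo``:

```python
import os


# Hypothetical stand-in for the dataset under test; a real
# tests/datasets/test_foo.py would import torchgeo.datasets.Foo instead.
class FakeFoo:
    def __init__(self, root: str) -> None:
        self.root = root
        self.files = ['a.tif', 'b.tif']

    def __len__(self) -> int:
        return len(self.files)

    def __getitem__(self, index: int) -> dict:
        return {'image': self.files[index]}


def test_len() -> None:
    ds = FakeFoo(root=os.path.join('tests', 'data', 'foo'))
    assert len(ds) == 2


def test_getitem() -> None:
    ds = FakeFoo(root='.')
    assert ds[0] == {'image': 'a.tif'}
```

pytest collects and runs the ``test_*`` functions automatically; pytest-cov then reports which lines of the real module they exercised.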
@@ -91,7 +91,7 @@ Linters
In order to remain `PEP-8 <https://peps.python.org/pep-0008/>`_ compliant and maintain a high-quality codebase, we use a couple of linting tools:
* `ruff <https://docs.astral.sh/ruff/>`_ for code formatting
* `mypy <https://mypy.readthedocs.io/>`_ for static type analysis
* `mypy <https://mypy.readthedocs.io/en/stable/>`_ for static type analysis
* `prettier <https://prettier.io/docs/en/>`_ for code formatting
These tools should be used from the root of the project to ensure that our configuration files are found. Ruff is relatively easy to use, and will automatically fix most issues it encounters:
@@ -142,7 +142,7 @@ Now, every time you run ``git commit``, pre-commit will run and let you know if
Documentation
-------------
All of our documentation is hosted on `Read the Docs <https://readthedocs.org/>`_. If you make non-trivial changes to the documentation, it helps to build the documentation yourself locally. To do this, make sure the dependencies are installed:
All of our documentation is hosted on `Read the Docs <https://about.readthedocs.com/>`_. If you make non-trivial changes to the documentation, it helps to build the documentation yourself locally. To do this, make sure the dependencies are installed:
.. code-block:: console
@@ -164,7 +164,7 @@ The resulting HTML files can be found in ``_build/html``. Open ``index.html`` in
Tutorials
---------
TorchGeo has a number of tutorials included in the documentation that can be run in `Google Colab <https://colab.research.google.com/>`_. These Jupyter notebooks are tested before each release to make sure that they still run properly. To test these locally, install `pytest <https://docs.pytest.org/>`_ and `nbmake <https://github.com/treebeardtech/nbmake>`_ and run:
TorchGeo has a number of tutorials included in the documentation that can be run in `Google Colab <https://colab.research.google.com/>`_. These Jupyter notebooks are tested before each release to make sure that they still run properly. To test these locally, install `pytest <https://docs.pytest.org/en/stable/>`_ and `nbmake <https://github.com/treebeardtech/nbmake>`_ and run:
.. code-block:: console

View file

@@ -1,7 +1,7 @@
Installation
============
TorchGeo is simple and easy to install. We support installation using the `pip <https://pip.pypa.io/>`_, `conda <https://docs.conda.io/>`_, and `spack <https://spack.io/>`_ package managers.
TorchGeo is simple and easy to install. We support installation using the `pip <https://pip.pypa.io/en/stable/>`_, `conda <https://docs.conda.io/en/latest/>`_, and `spack <https://spack.io/>`_ package managers.
pip
---
@@ -34,7 +34,7 @@ By default, only required dependencies are installed. TorchGeo has a number of o
$ pip install torchgeo[style,tests]
$ pip install torchgeo[all]
See the ``pyproject.toml`` for a complete list of options. See the `pip documentation <https://pip.pypa.io/>`_ for more details.
See the ``pyproject.toml`` for a complete list of options. See the `pip documentation <https://pip.pypa.io/en/stable/>`_ for more details.
conda
-----
@@ -82,4 +82,4 @@ Optional dependencies can be installed by enabling build variants:
$ spack install py-torchgeo+datasets
$ spack install py-torchgeo+style+tests
Run ``spack info py-torchgeo`` for a complete list of variants. See the `spack documentation <https://spack.readthedocs.io/>`_ for more details.
Run ``spack info py-torchgeo`` for a complete list of variants. See the `spack documentation <https://spack.readthedocs.io/en/latest/>`_ for more details.

View file

@@ -189,7 +189,7 @@ filterwarnings = [
# https://github.com/pytorch/vision/pull/5898
"ignore:.* is deprecated and will be removed in Pillow 10:DeprecationWarning:torchvision.transforms.functional_pil",
"ignore:.* is deprecated and will be removed in Pillow 10:DeprecationWarning:torchvision.transforms._functional_pil",
# https://github.com/rwightman/pytorch-image-models/pull/1256
# https://github.com/huggingface/pytorch-image-models/pull/1256
"ignore:.* is deprecated and will be removed in Pillow 10:DeprecationWarning:timm.data",
# https://github.com/pytorch/pytorch/issues/72906
# https://github.com/pytorch/pytorch/pull/69823

View file

@@ -3,8 +3,8 @@
"""TorchGeo: datasets, samplers, transforms, and pre-trained models for geospatial data.
This library is part of the `PyTorch <http://pytorch.org/>`_ project. PyTorch is an open
source machine learning framework.
This library is part of the `PyTorch <https://pytorch.org/>`_ project. PyTorch is an
open source machine learning framework.
The :mod:`torchgeo` package consists of popular datasets, model architectures, and
common image transformations for geospatial data.

View file

@@ -64,8 +64,8 @@ class ADVANCE(NonGeoDataset):
"""
urls = (
'https://zenodo.org/record/3828124/files/ADVANCE_vision.zip?download=1',
'https://zenodo.org/record/3828124/files/ADVANCE_sound.zip?download=1',
'https://zenodo.org/records/3828124/files/ADVANCE_vision.zip?download=1',
'https://zenodo.org/records/3828124/files/ADVANCE_sound.zip?download=1',
)
filenames = ('ADVANCE_vision.zip', 'ADVANCE_sound.zip')
md5s = ('a9e8748219ef5864d3b5a8979a67b471', 'a2d12f2d2a64f5c3d3a9d8c09aaf1c31')
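The Zenodo URL changes above (and in the datasets below) all follow one mechanical pattern: the legacy ``/record/`` path now redirects to ``/records/``. A small sketch of that rewrite as a plain string transformation:

```python
import re


def modernize_zenodo_url(url: str) -> str:
    """Rewrite legacy Zenodo /record/ URLs to the current /records/ form."""
    return re.sub(r'(zenodo\.org)/record/', r'\1/records/', url)


old = 'https://zenodo.org/record/3828124/files/ADVANCE_vision.zip?download=1'
print(modernize_zenodo_url(old))
# → https://zenodo.org/records/3828124/files/ADVANCE_vision.zip?download=1
```

The pattern matches ``record/`` only when followed by a slash, so URLs already using ``/records/`` pass through unchanged.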

View file

@@ -22,7 +22,7 @@ class CDL(RasterDataset):
"""Cropland Data Layer (CDL) dataset.
The `Cropland Data Layer
<https://data.nal.usda.gov/dataset/cropscape-cropland-data-layer>`__, hosted on
<https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php>`__, hosted on
`CropScape <https://nassgeodata.gmu.edu/CropScape/>`_, provides a raster,
geo-referenced, crop-specific land cover map for the continental United States. The
CDL also includes a crop mask layer and planting frequency layers, as well as
@@ -37,7 +37,7 @@ class CDL(RasterDataset):
If you use this dataset in your research, please cite it using the following format:
* https://www.nass.usda.gov/Research_and_Science/Cropland/sarsfaqs2.php#Section1_14.0
* https://www.nass.usda.gov/Research_and_Science/Cropland/sarsfaqs2.php#what.1
"""
filename_glob = '*_30m_cdls.tif'

View file

@@ -330,7 +330,7 @@ class ChesapeakeCVPR(GeoDataset):
additional layer of data to this dataset containing a prior over the Chesapeake Bay
land cover classes generated from the NLCD land cover labels. For more information
about this layer see `the dataset documentation
<https://zenodo.org/record/5866525>`_.
<https://zenodo.org/records/5866525>`_.
If you use this dataset in your research, please cite the following paper:
@@ -340,7 +340,7 @@ class ChesapeakeCVPR(GeoDataset):
subdatasets = ('base', 'prior_extension')
urls: ClassVar[dict[str, str]] = {
'base': 'https://lilablobssc.blob.core.windows.net/lcmcvpr2019/cvpr_chesapeake_landcover.zip',
'prior_extension': 'https://zenodo.org/record/5866525/files/cvpr_chesapeake_landcover_prior_extension.zip?download=1',
'prior_extension': 'https://zenodo.org/records/5866525/files/cvpr_chesapeake_landcover_prior_extension.zip?download=1',
}
filenames: ClassVar[dict[str, str]] = {
'base': 'cvpr_chesapeake_landcover.zip',

View file

@@ -61,7 +61,7 @@ class DeepGlobeLandCover(NonGeoDataset):
If you use this dataset in your research, please cite the following paper:
* https://arxiv.org/pdf/1805.06561.pdf
* https://arxiv.org/pdf/1805.06561
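The arXiv link fixes in this commit likewise drop a redundant ``.pdf`` suffix, since ``https://arxiv.org/pdf/<id>`` now serves the PDF directly. A sketch of that normalization (assuming versionless arXiv IDs, as in the links changed here):

```python
def normalize_arxiv_pdf_url(url: str) -> str:
    """Drop a trailing '.pdf' from arxiv.org/pdf/ links."""
    prefix = 'https://arxiv.org/pdf/'
    if url.startswith(prefix) and url.endswith('.pdf'):
        return url[: -len('.pdf')]
    return url


print(normalize_arxiv_pdf_url('https://arxiv.org/pdf/1805.06561.pdf'))
# → https://arxiv.org/pdf/1805.06561
```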
.. note::

View file

@@ -43,7 +43,7 @@ class DFC2022(NonGeoDataset):
* DEMs collected from the
`IGN RGE ALTI database <https://geoservices.ign.fr/documentation/donnees/alti/rgealti/>`_
* Labels collected from the
`UrbanAtlas 2012 database <https://land.copernicus.eu/local/urban-atlas/urban-atlas-2012/view/>`_
`UrbanAtlas 2012 database <https://land.copernicus.eu/en/products/urban-atlas/urban-atlas-2012>`_
* Data collected from 19 regions in France
Dataset format:

View file

@@ -47,7 +47,7 @@ class EnviroAtlas(GeoDataset):
.. versionadded:: 0.3
"""
url = 'https://zenodo.org/record/5778193/files/enviroatlas_lotp.zip?download=1'
url = 'https://zenodo.org/records/5778193/files/enviroatlas_lotp.zip?download=1'
filename = 'enviroatlas_lotp.zip'
md5 = 'bfe601be21c7c001315fc6154be8ef14'

View file

@@ -21,13 +21,8 @@ from .utils import Path, check_integrity, extract_archive
class EUDEM(RasterDataset):
"""European Digital Elevation Model (EU-DEM) Dataset.
The `EU-DEM
<https://land.copernicus.eu/imagery-in-situ/eu-dem/eu-dem-v1.1?tab=mapview>`__
dataset is a Digital Elevation Model of reference for the entire European region.
The dataset can be downloaded from this `website
<https://land.copernicus.eu/imagery-in-situ/eu-dem/eu-dem-v1.1?tab=mapview>`_
after making an account. A dataset factsheet is available
`here <https://land.copernicus.eu/user-corner/publications/eu-dem-flyer/view>`__.
`EU-DEM <https://www.eea.europa.eu/en/datahub/datahubitem-view/d08852bc-7b5f-4835-a776-08362e2fbf4b>`__
is a Digital Elevation Model of reference for the entire European region.
Dataset features:
@@ -41,10 +41,6 @@ class EUDEM(RasterDataset):
* DEMs are single-channel tif files
If you use this dataset in your research, please give credit to:
* `Copernicus <https://land.copernicus.eu/imagery-in-situ/eu-dem/eu-dem-v1.1>`_
.. versionadded:: 0.3
"""

View file

@@ -137,12 +137,12 @@ class IDTReeS(NonGeoDataset):
}
metadata: ClassVar[dict[str, dict[str, str]]] = {
'train': {
'url': 'https://zenodo.org/record/3934932/files/IDTREES_competition_train_v2.zip?download=1',
'url': 'https://zenodo.org/records/3934932/files/IDTREES_competition_train_v2.zip?download=1',
'md5': '5ddfa76240b4bb6b4a7861d1d31c299c',
'filename': 'IDTREES_competition_train_v2.zip',
},
'test': {
'url': 'https://zenodo.org/record/3934932/files/IDTREES_competition_test_v2.zip?download=1',
'url': 'https://zenodo.org/records/3934932/files/IDTREES_competition_test_v2.zip?download=1',
'md5': 'b108931c84a70f2a38a8234290131c9b',
'filename': 'IDTREES_competition_test_v2.zip',
},

View file

@@ -26,7 +26,7 @@ class INaturalist(GeoDataset):
If you use an iNaturalist dataset in your research, please cite it according to:
* https://www.inaturalist.org/pages/help#cite
* https://help.inaturalist.org/en/support/solutions/articles/151000170344-how-should-i-cite-inaturalist-
.. versionadded:: 0.3
"""

View file

@@ -63,17 +63,17 @@ class LoveDA(NonGeoDataset):
info_dict: ClassVar[dict[str, dict[str, str]]] = {
'train': {
'url': 'https://zenodo.org/record/5706578/files/Train.zip?download=1',
'url': 'https://zenodo.org/records/5706578/files/Train.zip?download=1',
'filename': 'Train.zip',
'md5': 'de2b196043ed9b4af1690b3f9a7d558f',
},
'val': {
'url': 'https://zenodo.org/record/5706578/files/Val.zip?download=1',
'url': 'https://zenodo.org/records/5706578/files/Val.zip?download=1',
'filename': 'Val.zip',
'md5': '84cae2577468ff0b5386758bb386d31d',
},
'test': {
'url': 'https://zenodo.org/record/5706578/files/Test.zip?download=1',
'url': 'https://zenodo.org/records/5706578/files/Test.zip?download=1',
'filename': 'Test.zip',
'md5': 'a489be0090465e01fb067795d24e6b47',
},

View file

@@ -32,7 +32,7 @@ class OpenBuildings(VectorDataset):
r"""Open Buildings dataset.
The `Open Buildings
<https://sites.research.google/open-buildings/#download>`__ dataset
<https://sites.research.google/open-buildings/>`__ dataset
consists of computer generated building detections across the African continent.
Dataset features:
@@ -48,10 +48,10 @@ class OpenBuildings(VectorDataset):
* meta data geojson file
The data can be downloaded from `here
<https://sites.research.google/open-buildings/#download>`__. Additionally, the
`meta data geometry file
<https://sites.research.google/open-buildings/tiles.geojson>`_ also needs to be
placed in `root` as `tiles.geojson`.
<https://sites.research.google/open-buildings/#open-buildings-download>`__.
Additionally, the `meta data geometry file
<https://openbuildings-public-dot-gweb-research.uw.r.appspot.com/public/tiles.geojson>`_
also needs to be placed in `root` as `tiles.geojson`.
If you use this dataset in your research, please cite the following technical
report:

View file

@@ -117,7 +117,7 @@ class PASTIS(NonGeoDataset):
}
directory = 'PASTIS-R'
filename = 'PASTIS-R.zip'
url = 'https://zenodo.org/record/5735646/files/PASTIS-R.zip?download=1'
url = 'https://zenodo.org/records/5735646/files/PASTIS-R.zip?download=1'
md5 = '4887513d6c2d2b07fa935d325bd53e09'
prefix: ClassVar[dict[str, str]] = {
's2': os.path.join('DATA_S2', 'S2_'),

View file

@@ -57,7 +57,7 @@ class ReforesTree(NonGeoDataset):
"""
classes = ('other', 'banana', 'cacao', 'citrus', 'fruit', 'timber')
url = 'https://zenodo.org/record/6813783/files/reforesTree.zip?download=1'
url = 'https://zenodo.org/records/6813783/files/reforesTree.zip?download=1'
md5 = 'f6a4a1d8207aeaa5fbab7b21b683a302'
zipfilename = 'reforesTree.zip'

View file

@@ -35,7 +35,7 @@ class SeasonalContrastS2(NonGeoDataset):
If you use this dataset in your research, please cite the following paper:
* https://arxiv.org/pdf/2103.16607.pdf
* https://arxiv.org/pdf/2103.16607
"""
all_bands = (
@@ -56,13 +56,13 @@ class SeasonalContrastS2(NonGeoDataset):
metadata: ClassVar[dict[str, dict[str, str]]] = {
'100k': {
'url': 'https://zenodo.org/record/4728033/files/seco_100k.zip?download=1',
'url': 'https://zenodo.org/records/4728033/files/seco_100k.zip?download=1',
'md5': 'ebf2d5e03adc6e657f9a69a20ad863e0',
'filename': 'seco_100k.zip',
'directory': 'seasonal_contrast_100k',
},
'1m': {
'url': 'https://zenodo.org/record/4728033/files/seco_1m.zip?download=1',
'url': 'https://zenodo.org/records/4728033/files/seco_1m.zip?download=1',
'md5': '187963d852d4d3ce6637743ec3a4bd9e',
'filename': 'seco_1m.zip',
'directory': 'seasonal_contrast_1m',

View file

@@ -25,7 +25,7 @@ class Sentinel(RasterDataset):
If you use this dataset in your research, please cite it using the following format:
* https://asf.alaska.edu/data-sets/sar-data-sets/sentinel-1/sentinel-1-how-to-cite/
* https://asf.alaska.edu/datasets/daac/sentinel-1/
"""
@@ -33,7 +33,7 @@ class Sentinel1(Sentinel):
r"""Sentinel-1 dataset.
The `Sentinel-1 mission
<https://sentinel.esa.int/web/sentinel/missions/sentinel-1>`_ comprises a
<https://sentiwiki.copernicus.eu/web/s1-mission>`_ comprises a
constellation of two polar-orbiting satellites, operating day and night
performing C-band synthetic aperture radar imaging, enabling them to
acquire imagery regardless of the weather.
@@ -50,16 +50,16 @@ class Sentinel1(Sentinel):
Product Types:
* `Level-0
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-0>`_:
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-0>`_:
Raw (RAW)
* `Level-1
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-1>`_:
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-1>`_:
Single Look Complex (SLC)
* `Level-1
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-1>`_:
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-1>`_:
Ground Range Detected (GRD)
* `Level-2
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-2>`_:
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/product-types-processing-levels/level-2>`_:
Ocean (OCN)
Polarizations:
@@ -72,13 +72,13 @@ class Sentinel1(Sentinel):
Acquisition Modes:
* `Stripmap (SM)
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/stripmap>`_
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/stripmap>`_
* `Interferometric Wide (IW) swath
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/interferometric-wide-swath>`_
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/interferometric-wide-swath>`_
* `Extra Wide (EW) swath
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/extra-wide-swath>`_
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/extra-wide-swath>`_
* `Wave (WV)
<https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/wave>`_
<https://sentinels.copernicus.eu/en/web/sentinel/user-guides/sentinel-1-sar/acquisition-modes/wave>`_
.. note::
At the moment, this dataset only supports the GRD product type. Data must be
@@ -255,7 +255,7 @@ class Sentinel2(Sentinel):
"""Sentinel-2 dataset.
The `Copernicus Sentinel-2 mission
<https://sentinel.esa.int/web/sentinel/missions/sentinel-2>`_ comprises a
<https://sentiwiki.copernicus.eu/web/s2-mission>`_ comprises a
constellation of two polar-orbiting satellites placed in the same sun-synchronous
orbit, phased at 180° to each other. It aims at monitoring variability in land
surface conditions, and its wide swath width (290 km) and high revisit time (10 days

View file

@@ -36,7 +36,7 @@ class FarSeg(Module):
If you use this model in your research, please cite the following paper:
* https://arxiv.org/pdf/2011.09766.pdf
* https://arxiv.org/pdf/2011.09766
"""
def __init__(

View file

@@ -122,7 +122,7 @@ Weights.__deepcopy__ = lambda *args, **kwargs: args[0]
class ResNet18_Weights(WeightsEnum): # type: ignore[misc]
"""ResNet-18 weights.
For `timm <https://github.com/rwightman/pytorch-image-models>`_
For `timm <https://github.com/huggingface/pytorch-image-models>`_
*resnet18* implementation.
.. versionadded:: 0.4
@@ -301,7 +301,7 @@ class ResNet18_Weights(WeightsEnum): # type: ignore[misc]
class ResNet50_Weights(WeightsEnum): # type: ignore[misc]
"""ResNet-50 weights.
For `timm <https://github.com/rwightman/pytorch-image-models>`_
For `timm <https://github.com/huggingface/pytorch-image-models>`_
*resnet50* implementation.
.. versionadded:: 0.4
@@ -597,7 +597,7 @@ class ResNet50_Weights(WeightsEnum): # type: ignore[misc]
class ResNet152_Weights(WeightsEnum): # type: ignore[misc]
"""ResNet-152 weights.
For `timm <https://github.com/rwightman/pytorch-image-models>`_
For `timm <https://github.com/huggingface/pytorch-image-models>`_
*resnet152* implementation.
.. versionadded:: 0.6
@@ -663,7 +663,7 @@ def resnet18(
If you use this model in your research, please cite the following paper:
* https://arxiv.org/pdf/1512.03385.pdf
* https://arxiv.org/pdf/1512.03385
.. versionadded:: 0.4
@@ -697,7 +697,7 @@ def resnet50(
If you use this model in your research, please cite the following paper:
* https://arxiv.org/pdf/1512.03385.pdf
* https://arxiv.org/pdf/1512.03385
.. versionchanged:: 0.4
Switched to multi-weight support API.
@@ -732,7 +732,7 @@ def resnet152(
If you use this model in your research, please cite the following paper:
* https://arxiv.org/pdf/1512.03385.pdf
* https://arxiv.org/pdf/1512.03385
.. versionadded:: 0.6

View file

@@ -37,7 +37,7 @@ Weights.__deepcopy__ = lambda *args, **kwargs: args[0]
class ViTSmall16_Weights(WeightsEnum): # type: ignore[misc]
"""Vision Transformer Small Patch Size 16 weights.
For `timm <https://github.com/rwightman/pytorch-image-models>`_
For `timm <https://github.com/huggingface/pytorch-image-models>`_
*vit_small_patch16_224* implementation.
.. versionadded:: 0.4
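Several of the GitHub link fixes in this commit track repository moves under organization renames (rwightman → huggingface for timm, deepmind → google-deepmind for the BYOL reference implementation). A sketch of a lookup-table rewrite for such moves; the table holds only the two renames visible in this diff:

```python
# Old org/repo -> new org/repo, taken from the renames in this commit.
RENAMED_REPOS = {
    'rwightman/pytorch-image-models': 'huggingface/pytorch-image-models',
    'deepmind/deepmind-research': 'google-deepmind/deepmind-research',
}


def update_github_url(url: str) -> str:
    """Rewrite GitHub URLs whose repository has moved to a new organization."""
    for old, new in RENAMED_REPOS.items():
        url = url.replace(f'github.com/{old}', f'github.com/{new}')
    return url


print(update_github_url('https://github.com/rwightman/pytorch-image-models'))
# → https://github.com/huggingface/pytorch-image-models
```

GitHub redirects the old URLs for now, but updating them avoids depending on redirects that can break if the old name is reused.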

View file

@@ -286,7 +286,7 @@ class BYOLTask(BaseTask):
Reference implementation:
* https://github.com/deepmind/deepmind-research/tree/master/byol
* https://github.com/google-deepmind/deepmind-research/tree/master/byol
If you use this trainer in your research, please cite the following paper:

View file

@@ -74,7 +74,7 @@ class AppendNBR(AppendNormalizedDifferenceIndex):
If you use this index in your research, please cite the following paper:
* https://www.sciencebase.gov/catalog/item/4f4e4b20e4b07f02db6abb36
* https://www.yumpu.com/en/document/view/24226870/the-normalized-burn-ratio-and-relationships-to-burn-severity-/7
.. versionadded:: 0.2
"""