This directory contains fake data used to test torchgeo. Depending on the type of dataset, fake data can be created in multiple ways:
## GeoDataset

GeoDataset data can be created like so. We first open an existing data example and use it to copy the driver/CRS/transform to the fake data.

### Raster data
```python
import os

import numpy as np
import rasterio as rio

ROOT = "data/landsat8"
FILENAME = "LC08_L2SP_023032_20210622_20210629_02_T1_SR_B1.TIF"
SIZE = 64

with rio.open(os.path.join(ROOT, FILENAME), "r") as src:
    dtype = src.profile["dtype"]
    Z = np.random.randint(np.iinfo(dtype).max, size=(SIZE, SIZE), dtype=dtype)
    with rio.open(FILENAME, "w", **src.profile) as dst:
        for i in dst.indexes:
            dst.write(Z, i)
```
Optionally, if the dataset has a colormap, this can be copied like so:

```python
cmap = src.colormap(1)
dst.write_colormap(1, cmap)
```
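
For context, here is a minimal sketch of how those two lines fit into the raster example above. The `ROOT`/`FILENAME` values are hypothetical placeholders for a dataset whose rasters actually carry a colormap on band 1.

```python
import os

import numpy as np
import rasterio as rio

# Hypothetical placeholders: point these at a dataset whose rasters have a colormap.
ROOT = "data/cdl"
FILENAME = "example.tif"
SIZE = 64

with rio.open(os.path.join(ROOT, FILENAME), "r") as src:
    dtype = src.profile["dtype"]
    Z = np.random.randint(np.iinfo(dtype).max, size=(SIZE, SIZE), dtype=dtype)
    cmap = src.colormap(1)  # raises ValueError if band 1 has no colormap
    with rio.open(FILENAME, "w", **src.profile) as dst:
        for i in dst.indexes:
            dst.write(Z, i)
        dst.write_colormap(1, cmap)
```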
### Vector data

```python
import os
from collections import OrderedDict

import fiona

ROOT = "data/cbf"
FILENAME = "Ontario.geojson"

rec = {
    "type": "Feature",
    "id": "0",
    "properties": OrderedDict(),
    "geometry": {
        "type": "Polygon",
        "coordinates": [[(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]],
    },
}

with fiona.open(os.path.join(ROOT, FILENAME), "r") as src:
    src.meta["schema"]["properties"] = OrderedDict()
    with fiona.open(FILENAME, "w", **src.meta) as dst:
        dst.write(rec)
```
## NonGeoDataset

NonGeoDataset data can be created like so.

### RGB images
```python
import numpy as np
from PIL import Image

DTYPE = np.uint8
SIZE = 64

arr = np.random.randint(np.iinfo(DTYPE).max, size=(SIZE, SIZE, 3), dtype=DTYPE)
img = Image.fromarray(arr)
img.save("01.png")
```
### Grayscale images

```python
import numpy as np
from PIL import Image

DTYPE = np.uint8
SIZE = 64

arr = np.random.randint(np.iinfo(DTYPE).max, size=(SIZE, SIZE), dtype=DTYPE)
img = Image.fromarray(arr)
img.save("02.jpg")
```
### Audio wav files

```python
import numpy as np
from scipy.io import wavfile

audio = np.random.randn(1).astype(np.float32)
wavfile.write("01.wav", rate=22050, data=audio)
```
### HDF5 datasets

```python
import h5py
import numpy as np

DTYPE = np.uint8
SIZE = 64
NUM_CLASSES = 10

images = np.random.randint(np.iinfo(DTYPE).max, size=(SIZE, SIZE, 3), dtype=DTYPE)
masks = np.random.randint(NUM_CLASSES, size=(SIZE, SIZE), dtype=DTYPE)

with h5py.File("data.hdf5", "w") as f:
    f.create_dataset("images", data=images)
    f.create_dataset("masks", data=masks)
```
### LAS Point Cloud files

```python
import laspy
import numpy as np

num_points = 4

las = laspy.read("0.las")
las.points = las.points[:num_points]

# las.x/y/z are scaled coordinate views, so generate plain integers and let
# laspy apply the header's scale/offset on assignment
points = np.random.randint(low=0, high=100, size=(num_points,))
las.x = points
las.y = points
las.z = points

if hasattr(las, "red"):
    colors = np.random.randint(low=0, high=10, size=(num_points,), dtype=las.red.dtype)
    las.red = colors
    las.green = colors
    las.blue = colors

las.write("0.las")
```