heat.utils.data.partial_dataset

Tools for using a dataset that does not fit into memory with neural networks

Module Contents

class PartialH5Dataset(file: str, comm: heat.core.communication.MPICommunication = MPI_WORLD, dataset_names: str | List[str] = 'data', transforms: List[Callable] = None, use_gpu: bool = True, validate_set: bool = False, initial_load: int = 7000, load_length: int = 1000)

Bases: torch.utils.data.Dataset

Create a Dataset object which loads portions of its data from an HDF5 file. Very similar to heat.utils.data.datatools.Dataset. Two threads are created: one loads data from the target file and one converts items before they are passed to the network; the conversion is driven by the iterator. A portion of the data of length initial_load is loaded upon initialization, and the remaining data is loaded in the background after the loaded batches have been returned by PartialH5DataLoaderIter(). This iterator is used automatically by heat.utils.data.datatools.DataLoader() for this type of dataset.
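
A minimal construction sketch; the file name and dataset names below are placeholders assumed for illustration, not part of the API:

    from heat.utils.data.partial_dataset import PartialH5Dataset

    # "data.h5", "images" and "labels" are placeholder names for this sketch.
    dataset = PartialH5Dataset(
        "data.h5",                           # HDF5 file on disk
        dataset_names=["images", "labels"],  # datasets to read from the file
        initial_load=7000,                   # elements loaded at initialization
        load_length=1000,                    # elements loaded per refill in the iterator
    )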

Notes

HDF5 datasets require the GIL to load data. This can become a bottleneck when data must be loaded repeatedly, as is the case when using this dataset. For this reason it is recommended to preprocess the data in another way and avoid H5 files where possible.

Parameters:
  • file (str) – H5 file to use

  • comm (MPICommunication) – Global MPI communicator generated by HeAT

  • dataset_names (Union[str, List[str]], optional) – Name(s) of the dataset(s) to load from the file. If a string is given, it will be the only dataset loaded. Default is “data”.

  • transforms (List[Callable], optional) – Transforms to apply to the data after it is retrieved from the loaded data and before it is passed to the network. This should be a list of callable torch functions, one for each item returned by the __getitem__ function of the individual dataset. If a list element is None, no transform is applied to the corresponding element returned by __getitem__; e.g. if __getitem__ returns an image and a label, the list would be transforms = [image_transforms, None]. If this is None, no transforms are applied to any elements (see the sketch after this parameter list). Default is None.

  • use_gpu (bool, optional) – Use GPUs if available. Defaults to True.

  • validate_set (bool, optional) – Load the entire dataset onto each node upon initialization and skip the partial loading in the iterator. This is typically needed for validation sets, where the network should be tested against the whole dataset. Default is False.

  • initial_load (int, optional) – Number of elements (along dimension 0) to load from the file upon initialization. Default is 7000 elements.

  • load_length (int, optional) – Number of elements (along dimension 0) to load from the file during each load in the iterator. Default is 1000 elements.
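
A hedged sketch of the transforms parameter for a dataset whose __getitem__ returns an (image, label) pair; the torchvision pipeline and the file/dataset names are placeholders, not prescribed by this class:

    import torchvision.transforms as T
    from heat.utils.data.partial_dataset import PartialH5Dataset

    # Transform applied to the image element only; the label element gets None.
    image_transforms = T.Compose([
        T.RandomHorizontalFlip(),  # assumes image items are tensors or PIL images
    ])

    train_set = PartialH5Dataset(
        "train.h5",                           # placeholder file name
        dataset_names=["images", "labels"],   # placeholder dataset names
        transforms=[image_transforms, None],  # one entry per item from __getitem__
    )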

Shuffle()

Send half of the local data to the process self.comm.rank + 1 if available, else wrap around. After receiving the new data, shuffle the local tensor.

Not implemented for the partial dataset.

Ishuffle()

Send half of the local data to the process self.comm.rank + 1 if available, else wrap around. After receiving the new data, shuffle the local tensor.

Not implemented for the partial dataset.

__getitem__(index: int | slice | List[int] | torch.Tensor) → torch.Tensor

This must be defined by the user at runtime. The function must be designed such that the data lie along dimension 0 and the given indices only index dimension 0 (see the sketch below).
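
A minimal sketch of a user-defined subclass, assuming the loaded HDF5 datasets are exposed as attributes named after dataset_names; self.images and self.labels are assumptions for illustration:

    from heat.utils.data.partial_dataset import PartialH5Dataset

    class MyPartialDataset(PartialH5Dataset):
        def __getitem__(self, index):
            # Assumes the loaded data live in attributes named after
            # `dataset_names`; adjust to how your data is actually stored.
            img = self.images[index]      # index only along dimension 0
            label = self.labels[index]
            return img, label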

__len__() → int

Get the total length of the dataset

thread_replace_converted_batches()

Replace the elements of the dataset with newly loaded elements. PartialH5DataLoaderIter records the indices it has consumed in the used_indices attribute; that list is reset to an empty list once the corresponding elements have been overwritten with new data.

class PartialH5DataLoaderIter(loader)

Bases: object

Iterator to be used with PartialH5Dataset. It closely mirrors the standard torch iterator while automatically loading new data to replace batches that have already been used. It also pre-fetches batches and begins their preparation, collation, and device placement in the background.
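
This iterator is not constructed directly by the user; it is produced when iterating a heat.utils.data.datatools.DataLoader that wraps a PartialH5Dataset. A usage sketch follows; the DataLoader argument names are assumptions, so check its documentation for the exact signature:

    from heat.utils.data.datatools import DataLoader

    # `train_set` is a PartialH5Dataset as in the earlier sketches.
    loader = DataLoader(train_set, batch_size=32)

    # Iterating yields a PartialH5DataLoaderIter under the hood; used batches
    # are replaced with freshly loaded data in the background.
    for images, labels in loader:
        pass  # forward/backward pass would go here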

__len__()

Get the length of the iterator

_next_data()
__next__()

Get the next batch of data. Shamelessly taken from torch.

__iter__()

Get a new iterator of this class

Return type:

PartialH5DataLoaderIter

__thread_convert_all(index_list)