MNISTAutoEncoder

This class implements an autoencoder model for the MNIST dataset using PyTorch Lightning. It consists of an encoder and a decoder component. The training_step method defines the forward pass, calculates the Mean Squared Error loss between the reconstructed output and the original input, and logs the training loss. The configure_optimizers method sets up the Adam optimizer for training.

Attributes

  • encoder: torch.nn.Module

    • The encoder part of the autoencoder.
  • decoder: torch.nn.Module

    • The decoder part of the autoencoder.

Constructors

  • Initializes the MNISTAutoEncoder with an encoder and a decoder.

  • Parameters

    • encoder: torch.nn.Module
      • The encoder module for the autoencoder.
    • decoder: torch.nn.Module
      • The decoder module for the autoencoder.
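The constructor simply stores the two submodules, so any compatible pair of torch.nn.Module objects can be passed in. A hypothetical pairing for flattened 28x28 MNIST images (layer sizes are assumptions, not from the source):

```python
import torch
import torch.nn as nn

# A simple MLP encoder/decoder pair; 64 is an assumed latent size.
encoder = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 64))
decoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 28 * 28))

# These would then be passed to the class constructor:
# model = MNISTAutoEncoder(encoder=encoder, decoder=decoder)
```

The only requirement is that the decoder's input size matches the encoder's output size, so the reconstruction has the same shape as the flattened input.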

Methods

def training_step(batch: tuple, batch_idx: int) -> torch.Tensor
  • Performs a single training step.

Args:
  • batch: The current batch of data, containing input images and labels.
  • batch_idx: The index of the current batch.

Returns: The calculated training loss for the current step.

  • Parameters

    • batch: tuple
      • A tuple containing the input images and their corresponding labels.
    • batch_idx: int
      • The index of the current batch.
  • Return Value: torch.Tensor

    • The training loss.
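The core of training_step can be mirrored with plain PyTorch. The helper below is hypothetical (it is not part of the class) and assumes the images are flattened before being passed through the encoder:

```python
import torch
import torch.nn.functional as F


def reconstruction_loss(batch, encoder, decoder):
    """Mirror of training_step's logic: drop labels, reconstruct, score with MSE."""
    x, _ = batch                   # labels are unused by the autoencoder
    x = x.view(x.size(0), -1)      # flatten images to vectors (assumed)
    x_hat = decoder(encoder(x))
    return F.mse_loss(x_hat, x)    # scalar loss tensor
```

A perfect reconstruction yields a loss of zero, and the returned scalar tensor is what Lightning backpropagates through.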
def configure_optimizers() -> torch.optim.Optimizer
  • Configures the optimizer for the model.

Returns: The configured optimizer.

  • Return Value: torch.optim.Optimizer
    • The Adam optimizer.
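Constructing the Adam optimizer is a one-liner over the model's parameters. The learning rate below is an assumption, since the source does not state one:

```python
import torch

# Stand-in parameters; in the class this would be self.parameters().
params = [torch.nn.Parameter(torch.zeros(3))]
optimizer = torch.optim.Adam(params, lr=1e-3)  # lr value assumed
```

Lightning calls configure_optimizers once at the start of fitting and uses the returned optimizer for every training step.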