
PyTorch important APIs

Nikhil Verma
3 min read · Jan 15, 2023

The heart of the PyTorch DL framework is its functional APIs, which give access to various neural layers ranging from linear and convolutional to recurrent ones. It also provides basic regularization layers such as dropout, batch norm, pooling and many others, along with a range of optimisers and learning-rate schedulers, and data-processing functionality via Dataset and DataLoader. Here is a quick glance at the modules under which you can find the related classes and functions.
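As a rough sketch of where these pieces live (the specific layers and optimiser shown here are only illustrative choices, not a fixed recommendation):

import torch.nn as nn                      # layers: nn.Linear, nn.Conv2d, nn.LSTM, ...
import torch.nn.functional as F            # stateless ops: F.relu, F.cross_entropy, ...
import torch.optim as optim                # optimisers: optim.SGD, optim.Adam, ...
from torch.optim import lr_scheduler       # schedulers: lr_scheduler.StepLR, ...
from torch.utils.data import Dataset, DataLoader  # data pipeline

# regularization layers mentioned above
dropout = nn.Dropout(p=0.5)                # randomly zeroes activations during training
batch_norm = nn.BatchNorm1d(num_features=16)
pool = nn.MaxPool2d(kernel_size=2)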

Using the above APIs, one can load data, create a neural network and train a model of their choice.

A generic training and evaluation loop looks like this:
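In code, a minimal sketch of such a loop could be the function below; the model, loss_fn, optimizer and the two DataLoaders are placeholders to be supplied by the caller, not names from a specific library:

import torch

def train_and_evaluate(model, loss_fn, optimizer, train_loader, val_loader, num_epochs):
    for epoch in range(num_epochs):
        # ---- training ----
        model.train()
        for inputs, targets in train_loader:
            optimizer.zero_grad()                 # reset accumulated gradients
            loss = loss_fn(model(inputs), targets)
            loss.backward()                       # back-propagate
            optimizer.step()                      # update parameters

        # ---- evaluation ----
        model.eval()
        with torch.no_grad():                     # no gradient tracking during validation
            val_loss = sum(loss_fn(model(x), y).item() for x, y in val_loader)
        print(f"epoch {epoch}: avg val loss = {val_loss / len(val_loader):.4f}")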

There are other low-effort, high-impact tricks one can use while training a neural network.
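The tricks themselves are not listed here, but as an illustration (my assumption of what such tricks could include), gradient clipping and a learning-rate scheduler are each roughly one line in PyTorch:

import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 1)                       # placeholder model for illustration
optimizer = optim.SGD(model.parameters(), lr=0.1)

# decay the learning rate by 10x every 30 epochs
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

# in a real training loop these calls would follow loss.backward() each step
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)  # gradient clipping
optimizer.step()
scheduler.step()                               # advance the schedule once per epoch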

Attached is a working example of training a NN on a simple function.

Loading required libraries and modules

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torch.utils.data import Dataset, DataLoader

Defining the Dataset

class Data(Dataset):
    def __init__(self, data):
        self.data = data
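The member-only preview cuts the example off at this point. Reusing the imports above, a minimal completion of this Dataset and the DataLoader it would feed might look like the sketch below; the __len__/__getitem__ bodies and the target function y = 2x + 1 are my assumptions, not necessarily what the original continues with:

class Data(Dataset):
    """Wraps a list of (x, y) pairs for use with a DataLoader."""
    def __init__(self, data):
        self.data = data

    def __len__(self):
        return len(self.data)                 # number of samples

    def __getitem__(self, idx):
        x, y = self.data[idx]                 # one (input, target) pair
        return torch.as_tensor(x, dtype=torch.float32), torch.as_tensor(y, dtype=torch.float32)

# example: samples from an assumed simple function y = 2x + 1
samples = [(x, 2 * x + 1) for x in torch.linspace(-1, 1, 100)]
loader = DataLoader(Data(samples), batch_size=16, shuffle=True)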


Written by Nikhil Verma

Knowledge shared is knowledge squared | My Portfolio https://lihkinverma.github.io/portfolio/ | My blogs are living document, updated as I receive comments
