CIFAR in PyTorch

Dec 16, 2024 05:15 PM

*My post explains CIFAR-100.

CIFAR100() loads the CIFAR-100 dataset as shown below:

*Memos:

  • The 1st argument is root (Required-Type: str or pathlib.Path). *An absolute or relative path is possible.
  • The 2nd argument is train (Optional-Default: True-Type: bool). *If it's True, the train data (50,000 images) is used, while if it's False, the test data (10,000 images) is used.
  • The 3rd argument is transform (Optional-Default: None-Type: callable). *A usage sketch with transform is shown after the example code below.
  • The 4th argument is target_transform (Optional-Default: None-Type: callable).
  • The 5th argument is download (Optional-Default: False-Type: bool). *Memos:
    • If it's True, the dataset is downloaded from the internet and extracted (unzipped) to root.
    • If it's True and the dataset is already downloaded, it's only extracted.
    • If it's True and the dataset is already downloaded and extracted, nothing happens.
    • It should be False if the dataset is already downloaded and extracted, because that's faster.
    • You can manually download and extract the dataset (cifar-100-python.tar.gz) from here to data/cifar-100-python/.
from torchvision.datasets import CIFAR100

train_data = CIFAR100(
    root="data"
)

train_data = CIFAR100(
    root="data",
    train=True,
    transform=None,
    target_transform=None,
    download=False
)

test_data = CIFAR100(
    root="data",
    train=False
)

len(train_data), len(test_data)
# (50000, 10000)

train_data
# Dataset CIFAR100
#     Number of datapoints: 50000
#     Root location: data
#     Split: Train

train_data.root
# 'data'

train_data.train
# True

print(train_data.transform)
# None

print(train_data.target_transform)
# None

train_data.download
# <bound method CIFAR10.download of Dataset CIFAR100
#    Number of datapoints: 50000
#    Root location: data
#    Split: Train>

len(train_data.classes), train_data.classes
# (100,
#  ['apple', 'aquarium_fish', 'baby', 'bear', 'beaver', 'bed',
#   'bicycle', 'bottle', 'bowl', ..., 'wolf', 'woman', 'worm'])

train_data[0]
# (<PIL.Image.Image image mode=RGB size=32x32>, 19)

train_data[1]
# (<PIL.Image.Image image mode=RGB size=32x32>, 29)

train_data[2]
# (<PIL.Image.Image image mode=RGB size=32x32>, 0)

train_data[3]
# (<PIL.Image.Image image mode=RGB size=32x32>, 11)

train_data[4]
# (<PIL.Image.Image image mode=RGB size=32x32>, 1)
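
Each sample is a (PIL image, label index) pair. As a minimal sketch reusing train_data and the classes attribute shown above, the integer label can be mapped back to its class name:

image, label = train_data[0]

type(image), label
# (<class 'PIL.Image.Image'>, 19)

train_data.classes[label]
# the class name for label index 19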

import matplotlib.pyplot as plt

def show_images(data, main_title=None):
    # Display the first 10 samples of the dataset in a 2x5 grid,
    # titling each image with its label index.
    plt.figure(figsize=(10, 5))
    plt.suptitle(t=main_title, y=1.0, fontsize=14)
    for i, (im, lab) in enumerate(data, start=1):
        plt.subplot(2, 5, i)
        plt.title(label=lab)
        plt.imshow(X=im)
        if i == 10:  # stop after the first 10 images
            break
    plt.tight_layout()
    plt.show()

show_images(data=train_data, main_title="train_data")
show_images(data=test_data, main_title="test_data")

[Figure: 2x5 grids of sample CIFAR-100 images with their label indices, as displayed by show_images() for train_data and test_data]
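
In the code above, transform and target_transform are left at None, so each sample stays a (PIL image, int) pair. Below is a minimal sketch, assuming torchvision's ToTensor() transform and a standard torch.utils.data.DataLoader (neither appears in the example above), of loading the images as tensors in mini-batches. The same transform argument is where resizing or normalization would go:

from torch.utils.data import DataLoader
from torchvision.datasets import CIFAR100
from torchvision.transforms import ToTensor

train_data = CIFAR100(
    root="data",
    train=True,
    transform=ToTensor()  # converts each PIL image to a [3, 32, 32] float tensor
)

train_dataloader = DataLoader(dataset=train_data, batch_size=64, shuffle=True)

images, labels = next(iter(train_dataloader))

images.shape, labels.shape
# (torch.Size([64, 3, 32, 32]), torch.Size([64]))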
