Abstract Proceedings of ICIRESM – 2019
CLASSIFICATION AND RECOGNITION OF IMAGES USING NEURAL NETWORKS AND DEEP LEARNING ON CIFAR-10
PyTorch models are neural network models trained on large reference datasets such as ImageNet. We focus on using PyTorch models to predict the class (label) of an input image, a process called model inference, and we discuss the steps involved. The whole process consists of the following main steps:

1. Read the input image.
2. Apply image transformations, for example resizing, center cropping, and normalization.
3. Perform a forward pass: compute the output vector using the PyTorch model's weights. Each element of this output vector describes the confidence with which the model predicts that the input image belongs to a certain class.
4. Display the predictions based on the scores obtained (the elements of the output vector from the previous step), using image-classification tooling such as PyTorch or OpenCV.

The CIFAR-10 dataset we use is a collection of images from 10 different categories such as cars, birds, dogs, horses, ships, and trucks. The idea of the project is to create an image classification model that can identify which category an input image belongs to. Image classification is used in many applications and is a good starting point for deep learning. CIFAR-10 is a very popular computer vision dataset that has been well studied in many deep learning works on object recognition. It consists of 60,000 images divided into 10 object classes, each containing 6,000 images. The low resolution (32×32) of the images allows researchers to experiment quickly with new algorithms, using categories such as dogs, cats, etc.
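The inference steps described above can be sketched in PyTorch. The small CNN below is a hypothetical stand-in for a pretrained model (its weights are random, so the predicted label is arbitrary); in practice one would load pretrained weights instead, and the normalization mean/std values here are illustrative placeholders rather than the dataset's actual statistics:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# CIFAR-10 class labels, per the dataset definition
CIFAR10_CLASSES = ["airplane", "automobile", "bird", "cat", "deer",
                   "dog", "frog", "horse", "ship", "truck"]

class SmallCNN(nn.Module):
    """Toy CNN used here in place of a pretrained model."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.fc = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # 32x32 -> 16x16
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # 16x16 -> 8x8
        return self.fc(x.flatten(1))

def classify(model, image):
    """Run the main inference steps on a 3x32x32 image tensor."""
    # Step 2: normalize the input (placeholder mean/std)
    mean = torch.tensor([0.5, 0.5, 0.5]).view(3, 1, 1)
    std = torch.tensor([0.5, 0.5, 0.5]).view(3, 1, 1)
    x = (image - mean) / std
    # Step 3: forward pass to obtain the score (logit) vector
    model.eval()
    with torch.no_grad():
        logits = model(x.unsqueeze(0))  # add a batch dimension
    # Step 4: convert scores to per-class confidences, pick the best
    probs = F.softmax(logits, dim=1).squeeze(0)
    return CIFAR10_CLASSES[probs.argmax().item()], probs

model = SmallCNN()
label, probs = classify(model, torch.rand(3, 32, 32))
print(label, float(probs.sum()))  # the confidences sum to 1
```

The softmax turns the raw output vector into the per-class confidences mentioned above; reading the image from disk (step 1) would typically be done with torchvision or OpenCV before calling `classify`.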
Keywords: cutting-edge research, PyTorch weights
30/08/2019