smart_garbage_classifier

Smart Garbage Segregation is a project that uses AI/ML to efficiently and effectively sort waste into categories such as plastic, glass, and metal, accelerated with oneDNN.

Project status: Under Development

oneAPI, Artificial Intelligence, Cloud

Intel Technologies
DevCloud, oneAPI, Intel Python


Overview / Usage

Abstract

oneDNN provides highly optimized routines for various deep learning operations, including convolution, pooling, normalization, and activation functions. By using oneDNN, you can expect faster execution times and better performance on modern CPUs, especially Intel processors.

In this project, the line os.environ['TF_ENABLE_ONEDNN_OPTS'] = '1' sets the environment variable TF_ENABLE_ONEDNN_OPTS to '1', which enables Intel's oneAPI Deep Neural Network Library (oneDNN) optimizations in TensorFlow on the system where the code runs. oneDNN is a high-performance library designed to optimize deep neural network computations across a variety of hardware platforms. With these optimizations enabled, the Conv2D and Dense layers are automatically optimized by oneDNN, which should result in faster training and inference times on compatible hardware.
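As a minimal sketch, enabling the optimization looks like this; note the flag must be set before TensorFlow is imported, since TensorFlow reads it once at import time:

```python
import os

# Enable Intel oneDNN optimizations for TensorFlow.
# This must run before TensorFlow is imported.
os.environ['TF_ENABLE_ONEDNN_OPTS'] = '1'

# import tensorflow as tf  # imported only after the flag is set
```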

The tensorflow.keras module is used to create a convolutional neural network (CNN) model for image classification. The model architecture consists of three convolutional blocks, each followed by a max pooling layer, and three fully connected layers with dropout for regularization.

Finally, the model.compile method is called to configure the optimizer, loss function, and evaluation metric for the model. The optimizer used is Adam, and the loss function used is sparse categorical cross-entropy. The model is also evaluated using the accuracy metric.
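The architecture and compilation step described above can be sketched as follows. This is a hypothetical reconstruction: the input size (224x224), filter counts, dense-layer widths, dropout rate, and the six-class output are assumptions, not the project's exact values.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Sketch of the described CNN; layer sizes are assumptions.
model = models.Sequential([
    # Three convolutional blocks, each followed by a max pooling layer
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(224, 224, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    # Three fully connected layers with dropout for regularization
    layers.Flatten(),
    layers.Dense(256, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(128, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(6, activation='softmax'),  # number of waste classes assumed
])

# Adam optimizer, sparse categorical cross-entropy loss, accuracy metric
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```

Sparse categorical cross-entropy is used because the class labels are integer indices rather than one-hot vectors.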

Methodology / Approach

These are the steps involved in making this project:

  • Importing Libraries
  • Data Importing
  • Data Exploration
  • Data Configuration
  • Preparing the Data
    • Creating a Generator for Training Set
    • Creating a Generator for Testing Set
  • Writing the labels into a text file 'Labels.txt'
  • Model Creation
  • Model Compilation
  • Training the Model (batch_size = 32, epochs = 10)
  • Testing Predictions
  • Saving model as 'model.h5'
  • Deploying the Model as a Web Application using Streamlit
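The labels step above might look like the sketch below, so the deployed Streamlit app can map predicted indices back to class names. The class names are an assumption; in the project they would come from the training generator's class_indices mapping rather than being hard-coded.

```python
# Hypothetical class_indices mapping, as a training generator would produce it.
class_indices = {'cardboard': 0, 'glass': 1, 'metal': 2,
                 'paper': 3, 'plastic': 4, 'trash': 5}

# Sort by index so that line i of Labels.txt names class i
labels = sorted(class_indices, key=class_indices.get)

with open('Labels.txt', 'w') as f:
    f.write('\n'.join(labels))
```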

Technologies Used

These are the frameworks/libraries used to bootstrap this project.

  • oneAPI / oneDNN
  • Intel DevCloud
  • Python
  • Streamlit

Repository

https://github.com/raison024/Smart-Garbage-Segregation.git
