ResNets Using Gradient-Boosted Blocks of 2-Layer Neural Networks

Nandini Ramanan

Dallas, Texas


Investigating how the representation changes across successive layers of a deep network, and analyzing the effect of training those layers jointly versus individually.

Project status: Under Development

Artificial Intelligence

Intel Technologies
Intel Opt ML/DL Framework, Movidius NCS

Overview / Usage

An alternative way to construct ResNets.

Methodology / Approach

According to existing work, the key difference between ResNets and gradient boosting methods is that gradient boosting directly updates the predictor at each iteration, whereas ResNets iteratively refine the feature representation by stacking residual blocks while the final predictor stays fixed. We are studying how to construct ResNets by replacing the residual blocks with gradient-boosted weak models (2-layer neural networks).
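As a minimal sketch of this idea, the loop below fits a stack of 2-layer networks by functional gradient boosting on squared error: each new block is trained on the residual of the current ensemble, much as a residual block adds a learned correction x + f(x) to its input. This is an illustration under assumed choices (a regression task, squared loss, Keras); the function names and hyperparameters here are ours, not from the project repository.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def make_weak_learner(input_dim, hidden_units=16):
    """A 2-layer neural network: one hidden layer, one linear output."""
    inputs = keras.Input(shape=(input_dim,))
    hidden = layers.Dense(hidden_units, activation="relu")(inputs)
    output = layers.Dense(1)(hidden)
    return keras.Model(inputs, output)

def fit_boosted_stack(x, y, n_blocks=5, shrinkage=0.5):
    """Functional gradient boosting with 2-layer NN weak models.

    For squared error, the negative gradient at the current prediction F(x)
    is the residual y - F(x); each round fits a weak model to that residual
    and adds a shrunken copy to the ensemble, analogous to the additive
    update performed by a ResNet block.
    """
    blocks, prediction = [], np.zeros(len(y), dtype=np.float32)
    for _ in range(n_blocks):
        residual = y - prediction  # negative gradient of 0.5 * (y - F)^2
        weak = make_weak_learner(x.shape[1])
        weak.compile(optimizer="adam", loss="mse")
        weak.fit(x, residual, epochs=50, verbose=0)
        prediction += shrinkage * weak.predict(x, verbose=0).ravel()
        blocks.append(weak)
    return blocks, prediction

# Toy usage on a synthetic regression task.
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 4)).astype(np.float32)
y = (np.sin(x[:, 0]) + 0.5 * x[:, 1]).astype(np.float32)
blocks, fitted = fit_boosted_stack(x, y)
```

Stacking the resulting blocks end to end and then fine-tuning them jointly, rather than one round at a time, gives the comparison between joint and individual training that the project investigates.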

Technologies Used

Keras, TensorFlow, Movidius Neural Compute Stick

Repository

https://github.com/sridas123/Residual-Networks-Vs-GB
