
Math for Deep Learning

ebook
Math for Deep Learning provides the essential math you need to understand deep learning discussions, explore more complex implementations, and make better use of deep learning toolkits.
With Math for Deep Learning, you’ll learn the essential mathematics used by, and serving as a foundation for, deep learning.
You’ll work through Python examples to learn key deep-learning-related topics in probability, statistics, linear algebra, differential calculus, and matrix calculus, as well as how to implement data flow in a neural network, backpropagation, and gradient descent. You’ll also use Python to work through the mathematics that underlies those algorithms and even build a fully functional neural network.
In addition, you’ll find coverage of gradient descent, including variations commonly used by the deep learning community: SGD, Adam, RMSprop, and Adagrad/Adadelta.
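As a small taste of the Python-driven approach the description mentions, here is a minimal, hypothetical sketch of plain gradient descent; the objective function, starting point, and learning rate are illustrative assumptions, not examples taken from the book.

    # A minimal sketch of plain gradient descent, minimizing
    # f(x) = (x - 3)^2, whose derivative is f'(x) = 2(x - 3).
    # All values here are illustrative assumptions.
    def grad(x):
        return 2.0 * (x - 3.0)  # derivative of (x - 3)^2

    x = 0.0    # assumed starting guess
    lr = 0.1   # assumed learning rate
    for _ in range(100):
        x -= lr * grad(x)  # step against the gradient

    print(x)   # approaches the minimizer x = 3

The optimizers named above (SGD, Adam, RMSprop, and Adagrad/Adadelta) all elaborate on this same update rule, largely by adapting the step size per parameter.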
 

Publisher: No Starch Press

OverDrive Read

  • ISBN: 9781718501911
  • Release date: November 23, 2021

EPUB ebook

  • ISBN: 9781718501911
  • File size: 26716 KB
  • Release date: November 23, 2021

Formats

OverDrive Read
EPUB ebook

Languages

English
