Submitted by SimonJDPrince t3_xurvaq in MachineLearning

Major update to draft of "Understanding Deep Learning" textbook is now available via:

udlbook.github.io/udlbook/

New material includes convolutional networks, residual connections, BatchNorm, transformers, and graph neural networks.

There's lots of stuff in here now that is rarely covered in other textbooks including double descent, implicit regularization, transformers for vision, why residual connections help, how BatchNorm works, graph attention networks, etc.
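For readers skimming the thread, two of the topics mentioned (residual connections and BatchNorm) can be sketched in a few lines. This is a toy NumPy illustration, not code from the book; the `residual_block` function and its shapes are made up for this example:

```python
import numpy as np

def batchnorm(x, eps=1e-5):
    # Normalize each feature over the batch dimension to zero mean, unit variance.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

def residual_block(x, w):
    # y = x + f(x): the skip connection lets the identity (and its gradient)
    # bypass the transformed branch f entirely.
    h = batchnorm(x @ w)
    return x + np.maximum(h, 0.0)  # ReLU nonlinearity

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))   # batch of 8, feature dimension 4
w = rng.normal(size=(4, 4))
y = residual_block(x, w)
print(y.shape)  # (8, 4)
```

The skip connection requires the input and output dimensions to match, which is why the weight matrix here is square.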

I learned a lot writing it, and you'll probably learn a lot reading it.

There are also lots of slides if you are teaching a course on Deep Learning this semester.

Feedback welcome! Thanks to all who have commented so far.


Comments


curiousshortguy t1_iqxws3o wrote

I love that you're making the figures available separately, too. The TOC and topics you mention indeed go deeper than the average material.

I'm looking forward to the notebooks, though; that stuff usually makes things really actionable.


SeucheAchat9115 t1_iqzdfk2 wrote

I really like the material. I think it would be helpful if you provided solutions for the problems you give in each section.


SimonJDPrince OP t1_ir048nc wrote

I have to keep some solutions back so that it can be used by instructors, but I'm going to make about half of them available and might add other problems to the website that aren't in the book and have answers. I haven't written out any of the answers yet, so it's possible that one or two of them aren't well-formulated. If you struggle with any of them, you can always email me.


Erosis t1_ir0tjwg wrote

I believe there's a small typo in figure 3.8.

> g-h) The clipped planes are then weighted

should be:

> g-i) The clipped planes are then weighted

Let me know if I'm mistaken here. Good stuff so far!

Edit: Added a GitHub issue regarding this.


MuffinB0y t1_iqxbv9m wrote

Looks good! Is a printed version projected?


SimonJDPrince OP t1_ir04cnn wrote

Late 2023... unfortunately, it takes them a while to print it.
