Errata
This errata list contains errors, and their corrections, that were found after the product was released. If an error was corrected in a later version or reprint, the date of the correction is shown in the "Date corrected" column.
The following errata were submitted by our customers and approved as valid errors by the author or editor.
| Version | Location | Description | Submitted By | Date submitted | Date corrected |
|---|---|---|---|---|---|
| Printed | Page 11, last line | `print ans` | Kenneth T. Hall | Sep 11, 2017 | Sep 15, 2017 |
|  | Page 16, first equation | `[xw0 ... xw9] = xW` | Anonymous | Oct 31, 2017 | Apr 13, 2018 |
|  | Page 21, Figure 2-3 | The figure is supposed to represent the model described in the text. | Paolo Baronti | Jan 24, 2018 | Apr 13, 2018 |
|  | Page 26, Table 3-1 | The definition of the subtraction shortcut. | Paolo Baronti | Jan 07, 2018 | Apr 13, 2018 |
| Printed | Page 36, 2nd line from the bottom | `A = tf. constant ...` | Yevgeniy Davletshin | Sep 19, 2017 | Apr 13, 2018 |
|  | Page 37, middle of page (code) | The author created a 2x2x3 array. | Clem Wang | Sep 23, 2017 | Apr 13, 2018 |
| Printed | Page 41, 2nd line from the top | In the formula `f(xi) = w.T xi + b` | Yevgeniy Davletshin | Sep 19, 2017 | Apr 13, 2018 |
|  | Page 45, line 5 (formula) |  | Clem Wang | Sep 23, 2017 | Apr 13, 2018 |
| Printed | Page 46, Logistic Regression | On page 46, Logistic Regression. | THEOPHILUS SIAMEH | Dec 07, 2017 | Apr 13, 2018 |
|  | Page 50, line 9 (line 2 of code on the page) | The line is missing a minus sign for the first term (see the note after the table). | Clem Wang | Sep 24, 2017 | Apr 13, 2018 |
|  | Page 57, 2nd release, 4th paragraph |  | Paolo Baronti | Jan 28, 2018 | Apr 13, 2018 |
| Printed | Page 60, 2nd paragraph, 1st sentence | "Next we have two consecutive layers of convolution and pooling, each with 5x5 convolutions and 64 feature maps, followed by a single fully connected layer with 1,024 units." | Kenneth T. Hall | Oct 16, 2017 | Apr 13, 2018 |
| Printed | Page 60, 2nd line of code near page bottom | `cross_entropy = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(y_conv, y_))` (see the sketch after the table). | Kenneth T. Hall | Oct 16, 2017 | Apr 13, 2018 |
| Printed | Page 61, 1st paragraph (after the code); also in the footnote on the same page | "epoc" should be "epoch". | Kenneth T. Hall | Oct 16, 2017 | Apr 13, 2018 |
| Printed, PDF, ePub | Page 68, code example at top of page |  | Jeff Kriske | Dec 12, 2017 | Apr 13, 2018 |
| Printed | Page 93, Stacking multiple LSTMs |  | Michal Steuer | Mar 16, 2018 | Apr 13, 2018 |
|  | Page 131, last paragraph | "here we simply specify 'categorical_crossentreopy'" (see the example after the table). | Edberg | Feb 01, 2018 | Apr 13, 2018 |
| Printed | Page 134, second line of code starting `print...` | `testY` --> `Y_test` | Michal Steuer | Mar 29, 2018 | Apr 13, 2018 |
|  | Page 158, 3rd paragraph | "Note that if we were to run xs.eval() one more time" | Edberg | Feb 18, 2018 | Apr 13, 2018 |
|  | Page 204, middle of the code example | There is a line with only one '.' | Edberg | Feb 25, 2018 | Apr 13, 2018 |
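On the Page 50 report (the missing minus sign): the line quoted in the original submission is not reproduced above, but for reference, the cross-entropy loss carries a leading minus sign on every term of the sum. A generic form, not necessarily the book's exact expression, is:

```latex
% Generic cross-entropy over classes i; the minus sign applies to the
% first (and every) term of the sum.
L(y, \hat{y}) = -\sum_{i} y_i \log \hat{y}_i
```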
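On the second Page 60 report: in TensorFlow 1.x, `tf.nn.softmax_cross_entropy_with_logits` takes `labels` and `logits` as named arguments, so passing `(y_conv, y_)` positionally would feed the logits in as labels. A minimal sketch of the corrected call, assuming `y_conv` holds the model's raw logits and `y_` the one-hot labels as elsewhere in the chapter (the placeholder shapes below are illustrative, not the book's exact code):

```python
import tensorflow as tf

# Illustrative placeholders standing in for the chapter's tensors;
# shapes assume 10-class one-hot labels as in the MNIST examples.
y_conv = tf.placeholder(tf.float32, shape=[None, 10])  # raw (unscaled) logits
y_ = tf.placeholder(tf.float32, shape=[None, 10])      # one-hot target labels

# Named arguments make the roles explicit: labels vs. logits.
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y_conv))
```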
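On the Page 131 report: the correctly spelled Keras loss identifier is `'categorical_crossentropy'`. A minimal sketch of where the string is used, assuming a standard `model.compile` call (the toy model, optimizer, and metrics here are illustrative assumptions, not the book's code):

```python
from keras.models import Sequential
from keras.layers import Dense

# Toy single-layer classifier; only the loss string is the point here.
model = Sequential([Dense(10, activation='softmax', input_shape=(784,))])
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',  # correct spelling
              metrics=['accuracy'])
```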