Vishwas1 committed on
Commit
b465120
1 Parent(s): ef7ab22

Upload dataset_chunk_118.csv with huggingface_hub

Files changed (1)
  1. dataset_chunk_118.csv +2 -0
dataset_chunk_118.csv ADDED
@@ -0,0 +1,2 @@
1
+ text
2
+ "). this work is subject to a creative commons cc-by-nc-nd license. (c) mit press.11.2 residual connections and residual blocks 189 shatteredgradientspresumablyarisebecausechangesinearlynetworklayersmodify theoutputinanincreasinglycomplexwayasthenetworkbecomesdeeper. thederivative of the output y with respect to the first layer f of the network in equation 11.1 is: 1 ∂y ∂f ∂f ∂f = 4 3 2. (11.3) ∂f ∂f ∂f ∂f 1 3 2 1 whenwechangetheparametersthatdeterminef ,allofthederivativesinthissequence 1 can change since layers f ,f , and f are themselves computed from f . consequently, 2 3 4 1 theupdatedgradientateachtrainingexamplemaybecompletelydifferent, andtheloss function becomes badly behaved.1 11.2 residual connections and residual blocks residual or skip connections are branches in the computational path, whereby the input to each network layer f[•] is added back to the output (figure 11.4a). by analogy to equation 11.1, the residual network is defined as: h = x+f [x,ϕ ] 1 1 1 h = h +f [h ,ϕ ] 2 1 2 1 2 h = h +f [h ,ϕ ] 3 2 3 2 3 y = h +f [h ,ϕ ], (11.4) 3 4 3 4 where the first term on the right-hand side of each line is the residual connection. each function f learns an additive change to the current representation. it follows that their k outputs must be the same size as their inputs. each additive combination of the input and the processed output is known as a residual block or residual layer. once more, we can write this as a single function by substituting in the expressions problem11.1 for the intermediate quantities h : k y=x + f [x] (11.5) 1(cid:2) (cid:3) + f x+f [x] 2h 1 (cid:2) (cid:3)i + f x+f [x]+f x+f [x] 3 1 2 1 (cid:20) (cid:2) (cid:3) h (cid:2) (cid:3)i(cid:21) + f x+f [x]+f x+f [x] +f x+f [x]+f x+f [x] , 4 1 2 1 3 1 2 1 where we have omitted the parameters ϕ for clarity. we can think of this equation as • “unraveling” the network (figure 11.4b). we see that the final network output is a sum of the input and four smaller networks, corresponding to each line of the equation; one 1inequations11.3and11.6,weoverloadnotationtodefinef astheoutputofthefunctionf [•]. k k draft: please send errata to [email protected] 11 residual networks figure 11.4 residual connections. a) the output of each function f [x,ϕ ] is k k addedbacktoitsinput,whichispassedviaaparallelcomputationalpathcalled a residual or skip connection. hence, the function computes an additive change totherepresentation. b)uponexpanding(unraveling)thenetworkequations,we findthattheoutputisthesumoftheinputplusfoursmallernetworks(depicted in white, orange, gray, and cyan, respectively, and corresponding to terms in equation 11.5); we can think of this as an ensemble of networks. moreover, the output from the cyan network is itself a transformation f [•,ϕ ] of another 4 4 ensemble,andsoon. alternatively,wecanconsiderthenetworkasacombination of16differentpathsthroughthecomputationalgraph. oneexampleisthedashed path from input x to output y, which is the same in panels (a) and (b). this work is subject to a creative commons cc-by-nc-nd license. (c) mit press.11.2 residual connections and residual blocks 191 figure 11.5orderofoperationsinres"