Summaries for Advanced Learning Algorithms (1)

Neural Networks





input layer ---- hidden layer (activation values) ---- output layer



A nice property of a neural network is that when you train it on data, you don't need to explicitly decide which intermediate features, such as affordability and so on, it should compute. The network figures out on its own which features to use in its hidden layer.
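As a minimal sketch of what a hidden layer actually computes, the snippet below implements a single dense layer producing activation values with a sigmoid. The input features and zero-valued weights are placeholders (in practice the weights are learned from data, which is exactly how the network "decides" its own features):

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, squashes values into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def dense(a_in, W, b):
    """Compute one layer's activation values: a_out = g(W @ a_in + b)."""
    return sigmoid(W @ a_in + b)

# Hypothetical demand-prediction example: 4 input features feed a
# hidden layer of 3 units. What each hidden unit ends up representing
# (e.g. "affordability") is determined by the learned weights, not
# chosen by the designer.
x = np.array([200.0, 17.0, 0.8, 0.5])   # made-up input features
W1 = np.zeros((3, 4))                   # placeholder weights (normally learned)
b1 = np.zeros(3)
a1 = dense(x, W1, b1)                   # hidden-layer activation values
```

With all-zero placeholder weights, every hidden unit outputs sigmoid(0) = 0.5; trained weights would produce informative activations instead.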


computer vision application 



neural network




More complex neural networks
4 layers = 3 hidden layers + 1 output layer (the input layer is not counted)
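A tiny sketch of this counting convention, using hypothetical unit counts per layer:

```python
# Layer-counting convention: hidden layers + output layer;
# the input layer is not counted. Unit counts below are made up.
layer_sizes = [25, 15, 10, 1]     # three hidden layers, then the output layer
num_layers = len(layer_sizes)     # 4 layers total
num_hidden = num_layers - 1       # 3 hidden layers
```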






Forward propagation: making predictions



Since this computation moves from left to right, you start with a_1, followed by a_2, and then a_3. This process is known as forward propagation because it propagates neuron activations forward through the network. It contrasts with backpropagation, which is used to adjust weights during training.
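The left-to-right computation can be sketched as a loop over layers, where each layer's activation vector feeds the next. The three-layer network and its random weights below are illustrative assumptions; real weights would come from training:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, params):
    """Propagate activations forward: a_1, then a_2, then a_3, ..."""
    a = x
    for W, b in params:
        a = sigmoid(W @ a + b)   # each layer's output is the next layer's input
    return a

# Hypothetical 3-layer network (two hidden layers, one output unit).
# Weights are random placeholders, normally learned via backpropagation.
rng = np.random.default_rng(0)
params = [
    (rng.standard_normal((3, 2)), np.zeros(3)),  # layer 1: 2 inputs -> 3 units
    (rng.standard_normal((2, 3)), np.zeros(2)),  # layer 2: 3 -> 2 units
    (rng.standard_normal((1, 2)), np.zeros(1)),  # layer 3: 2 -> 1 output unit
]
y_hat = forward(np.array([0.5, -1.2]), params)   # final activation a_3
```

Note that the same `forward` function works for any number of layers: the architecture is entirely described by the list of (W, b) pairs.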

Additionally, a common architectural choice is to give earlier hidden layers more units, with the number of units gradually decreasing toward the output layer.
