Problem 58882. Neural Nets: Activation functions
Return the values of the selected activation function for a scalar, vector, or matrix input.
y = Activation(x, id); where id is 1:4 for ReLU, sigmoid, hyperbolic tangent, and Softmax.
ReLU: Rectified Linear Unit; clips negatives, max(0, x). Trains faster than sigmoid.
Sigmoid: exponential normalization to [0, 1]; sigmoid(x) = 1/(1 + exp(-x))
HyperTan: normalization to [-1, 1]; tanh(x)
Softmax: normalizes the output to sum to 1, with individual values in [0, 1].
Used on the output node.
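The problem asks for a MATLAB solution; as a sketch only, the four-way id dispatch above can be written in NumPy like this (the function name `activation` and the max-shift stabilization in the softmax branch are my own choices, not part of the problem statement):

```python
import numpy as np

def activation(x, act_id):
    """Apply activation act_id: 1 ReLU, 2 sigmoid, 3 tanh, 4 softmax.
    Works for scalars, vectors, and matrices; softmax normalizes each row."""
    x = np.asarray(x, dtype=float)
    if act_id == 1:
        return np.maximum(0.0, x)            # ReLU: clip negatives to 0
    if act_id == 2:
        return 1.0 / (1.0 + np.exp(-x))      # sigmoid -> (0, 1)
    if act_id == 3:
        return np.tanh(x)                    # tanh -> (-1, 1)
    if act_id == 4:
        e = np.exp(x - x.max(axis=-1, keepdims=True))  # shift for stability
        return e / e.sum(axis=-1, keepdims=True)       # each row sums to 1
    raise ValueError("id must be 1..4")
```

The stability shift in the softmax branch does not change the result, since exp(x - c)/sum(exp(x - c)) = exp(x)/sum(exp(x)) for any constant c per row.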
Working through a series of Neural Net challenges from Perceptron, Hidden Layers, Backpropagation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from MNIST.
It might take a day or two to completely cover Neural Nets in a MATLAB-centric fashion.
Essentially, Out = Softmax(ReLU(X*W)*WP).
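That one-line forward pass can be sketched in NumPy as follows; the layer sizes and random weights here are illustrative placeholders, not values from the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
X  = rng.standard_normal((5, 4))   # 5 samples, 4 input features (hypothetical)
W  = rng.standard_normal((4, 8))   # input -> hidden weights
WP = rng.standard_normal((8, 3))   # hidden -> output weights

H = np.maximum(0.0, X @ W)         # hidden layer: ReLU(X*W)
Z = H @ WP                         # output logits: ReLU(X*W)*WP
E = np.exp(Z - Z.max(axis=1, keepdims=True))
Out = E / E.sum(axis=1, keepdims=True)   # Softmax: each row sums to 1
```

Each row of Out is then a probability distribution over the 3 output classes for the corresponding sample.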
Problem Comments
Richard Zapor (21 Aug 2023):
Multi-Case Softmax should be y=exp(x)./sum(exp(x),2)
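The point of this correction is that when x holds multiple cases as rows, the sum must run along dimension 2 (across each row) rather than over the whole matrix. A NumPy sketch of the same expression, with axis=1 playing the role of MATLAB's sum(..., 2):

```python
import numpy as np

x = np.array([[1.0, 2.0, 3.0],    # case 1
              [0.0, 0.0, 0.0]])   # case 2
# y = exp(x) ./ sum(exp(x), 2) in MATLAB terms:
y = np.exp(x) / np.exp(x).sum(axis=1, keepdims=True)
```

Each row of y now sums to 1 independently; with a global sum, the two cases would incorrectly normalize against each other.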