
The softmax function

Math version

Good read here

\sigma(z)_i = \frac{e^{z_i}}{\sum\limits_{j=1}^{K} e^{z_j}}

\text{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}
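
As a quick sanity check (the numbers are arbitrary), applying the formula to z = (1, 2, 3) gives

\sigma(z) = \left(\frac{e^1}{e^1+e^2+e^3},\; \frac{e^2}{e^1+e^2+e^3},\; \frac{e^3}{e^1+e^2+e^3}\right) \approx (0.090,\ 0.245,\ 0.665)

which sums to 1, as a probability distribution should.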

LaTeX version:

{\sigma(z)_i} = \frac
% Numerator
{e^{z_i}}
% Denominator
{\sum\limits_{j=1}^K e^{z_j}}

Python version


Example of a function for calculating the softmax of a list of numbers:

from numpy import exp

# calculate the softmax of a vector
def softmax(vector):
    e = exp(vector)
    return e / e.sum()
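
A quick usage sketch (the input values are arbitrary); the result is a numpy array of probabilities that sums to 1:

probabilities = softmax([1.0, 2.0, 3.0])
print(probabilities)        # approximately [0.090 0.245 0.665]
print(probabilities.sum())  # ≈ 1.0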

PyTorch version

import torch

def softmax(x):
    return torch.exp(x) / torch.sum(torch.exp(x), dim=1).view(-1, 1)
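
A quick usage sketch (tensor values chosen arbitrarily). Because the sum runs over dim=1, the input is expected to be a 2-D batch of logits, and each row comes out normalised; the built-in torch.softmax(logits, dim=1) gives the same result:

logits = torch.tensor([[1.0, 2.0, 3.0],
                       [1.0, 1.0, 1.0]])
print(softmax(logits))               # each row sums to 1
print(torch.softmax(logits, dim=1))  # built-in equivalent

Note that exponentiating raw logits can overflow for large values; the usual remedy is to subtract the row maximum before taking exp, which the built-in version handles for you.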