A deep dive into activation functions in neural networks: comparing Sigmoid, Tanh, ReLU, and Softmax to understand how the first three introduce non-linearity, and how Softmax turns raw outputs into a probability distribution.
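To make the comparison concrete, here is a minimal NumPy sketch of the four functions; the function names and the sample input are illustrative assumptions, not part of the original text.

```python
import numpy as np

def sigmoid(x):
    """Squashes inputs to (0, 1); commonly used for binary probabilities."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Squashes inputs to (-1, 1); zero-centered, unlike sigmoid."""
    return np.tanh(x)

def relu(x):
    """Passes positive values through unchanged; zeros out negatives."""
    return np.maximum(0.0, x)

def softmax(x):
    """Maps a vector of logits to a probability distribution.
    Subtracting the max before exponentiating is a standard
    numerical-stability trick; it does not change the result."""
    exps = np.exp(x - np.max(x))
    return exps / np.sum(exps)

# Quick check on a small sample input (values chosen for illustration).
z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # ~[0.1192 0.5    0.8808] -> each value in (0, 1)
print(tanh(z))     # ~[-0.964 0.     0.964 ] -> zero-centered
print(relu(z))     # [0. 0. 2.]              -> negatives clipped to 0
print(softmax(z))  # ~[0.0159 0.1173 0.8668] -> sums to 1
```

Note how the outputs reflect each function's role: Sigmoid and Tanh saturate at their bounds, ReLU is identity for positive inputs, and Softmax is the only one whose outputs depend on the whole vector, since it normalizes across all entries.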