Author: Sam Tsang

  • Universal Approximation

    Universal approximation is the ability of a certain class of mathematical functions or models to approximate any other function to arbitrary precision, given a proportionate degree of complexity. It is best known in machine learning and artificial intelligence: in the case of neural networks, the universal approximation…

  • Gradient descent convergence

    Gradient descent is one of the most widely used optimization algorithms across fields such as machine learning, where it serves to reduce a model's error. The idea is to repeatedly update the parameters in the direction of steepest descent of the cost function and stop when certain conditions, such as the cost…
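
    The update-then-stop loop described above can be sketched in a few lines. This is a minimal illustration, not code from the article; the function names, the learning rate, and the choice of stopping on a small update size are all assumptions for the sake of the example.

    ```python
    def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
        """Minimize a 1-D function by stepping against its gradient.

        Stops when the update becomes smaller than `tol` (one possible
        convergence condition) or after `max_iter` iterations.
        """
        x = x0
        for i in range(max_iter):
            step = lr * grad(x)  # steepest-descent step
            x = x - step
            if abs(step) < tol:  # convergence: parameters barely move
                break
        return x

    # Example: minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
    x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
    ```

    Here the iterate converges to the minimizer at x = 3; in practice the stopping condition might instead monitor the change in the cost value itself.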

  • Gradient Descent

    Gradient descent is an optimization algorithm that minimizes the cost function in various machine learning and optimization tasks. It is essential for training and fine-tuning machine learning models, especially when the model's parameters need to be adjusted and the size of…

  • Multi-Layer Perceptrons

    Multi-layer perceptrons (MLPs) are one kind of artificial neural network (ANN) built as a layered structure of neurons, or nodes. Every node in one layer connects to all the neurons in the following layer, forming a feed-forward network. MLPs are mainly applied in supervised learning tasks, where…
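
    The fully connected feed-forward structure described above can be sketched as a plain NumPy forward pass. This is an illustrative sketch, not code from the article; the layer sizes, ReLU activation, and zero-initialized biases are assumptions chosen for brevity.

    ```python
    import numpy as np

    def mlp_forward(x, weights, biases):
        """Forward pass of a fully connected MLP.

        Each layer applies an affine transform; hidden layers use ReLU,
        and the final layer is left linear.
        """
        a = x
        for i, (W, b) in enumerate(zip(weights, biases)):
            z = a @ W + b                     # every node feeds every node in the next layer
            a = np.maximum(z, 0.0) if i < len(weights) - 1 else z
        return a

    # A small 3-4-2 network: one hidden layer of 4 nodes.
    rng = np.random.default_rng(0)
    sizes = [3, 4, 2]
    weights = [rng.standard_normal((m, n)) for m, n in zip(sizes, sizes[1:])]
    biases = [np.zeros(n) for n in sizes[1:]]
    out = mlp_forward(np.ones(3), weights, biases)  # output has 2 components
    ```

    Training such a network (choosing the weights by backpropagation) is what the supervised-learning setting mentioned above refers to; this sketch shows only the forward computation.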