Source: https://neuton.ai/main
Whether you are learning a language or a concept, understanding the basic vocabulary is a must. The simple neural network is an essential concept in machine learning, and aspirants who want to build a career in data science or machine learning should know these terms. In this post, we will look at some of the important terms necessary to understand a simple neural network.

Neurons: A neuron is a unit that carries information in a neural network. A neuron works as a mathematical operation: it takes its inputs, multiplies each by its weight, sums the results, and passes that sum through an activation function on to the other neurons.

Perceptron: In machine learning, the perceptron is an algorithm for supervised learning of binary classifiers. A binary classifier is a function that decides whether or not an input, represented by a vector of numbers, belongs to a specific class.

Activation function: This function is applied while passing information through a neuron.
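As a sketch of the ideas above, a neuron's weighted-sum-and-activation behaviour and the perceptron's binary classifier can be written in a few lines of plain Python. The weights and bias here are illustrative, chosen by hand so the perceptron behaves like an AND gate:

```python
def neuron(inputs, weights, bias, activation):
    """A single neuron: weighted sum of inputs plus bias, passed through an activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return activation(total)

def step(z):
    """Step activation used by the classic perceptron binary classifier."""
    return 1 if z >= 0 else 0

# A hand-weighted perceptron acting as an AND gate on two binary inputs.
weights = [1.0, 1.0]
bias = -1.5
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights, bias, step))  # 1 only when a = b = 1
```

In a trained network the weights and bias would be learned from data rather than set by hand; the structure of the computation stays the same.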
Sigmoid: A sigmoid function is a mathematical function with a characteristic S-shaped curve, or sigmoid curve. Its output ranges between 0 and 1. Neurons are sometimes referred to as "sigmoid neurons," which means they use the sigmoid activation function.

Tanh: This is the hyperbolic tangent function, which ranges between -1 and 1. It is the hyperbolic analogue of the circular tan function used throughout trigonometry, and it is mathematically defined as the ratio of the hyperbolic sine to the hyperbolic cosine.

Rectified Linear Unit (ReLU): The Rectified Linear Unit, also called the rectifier, is an activation function defined as the positive part of its argument, where x is the input to the neuron. The ramp function is another term for the rectifier.

Tensor: A tensor establishes the connection between two neurons that are in sequential layers.
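The three activation functions just described can be sketched directly from their definitions, using only the Python standard library:

```python
import math

def sigmoid(x):
    """S-shaped curve; output always lies strictly between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Hyperbolic tangent: ratio of hyperbolic sine to hyperbolic cosine; ranges between -1 and 1."""
    return math.sinh(x) / math.cosh(x)

def relu(x):
    """Rectified Linear Unit: the positive part of the argument, max(0, x)."""
    return max(0.0, x)

print(sigmoid(0))    # 0.5, the midpoint of the S-curve
print(relu(-2.0))    # 0.0, negative inputs are clipped
print(relu(3.0))     # 3.0, positive inputs pass through unchanged
```

Sigmoid squashes any real input into (0, 1), tanh into (-1, 1), while ReLU simply zeroes out negative inputs, which is why it is also called the ramp function.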
Cost function: The cost function measures the performance of a neural network. It depends on variables such as the weights and biases. A cost function produces a single value, not a vector, because it rates how well the neural network as a whole is performing and indicates what changes could improve it.

Mean squared error: As a neural network runs, it predicts an output. The prediction is then compared with the actual output, and the errors are sent back to the nodes; this process is called backpropagation. Mean squared error is one method of measuring those errors.

The list above does not include every term related to the simple neural network; you can research further on Google. Understand these terminologies before actually learning data science or neural networks.
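Mean squared error as a cost function can be sketched in a few lines: square the difference between each prediction and its target, then average. The predicted and actual values below are made-up numbers for illustration:

```python
def mean_squared_error(predicted, actual):
    """Average of the squared differences between predictions and targets.

    Returns a single number, as a cost function should: one score
    summarising the network's overall performance on these examples.
    """
    n = len(predicted)
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / n

# Illustrative values: network outputs vs. the true targets.
predicted = [0.9, 0.2, 0.8]
actual = [1.0, 0.0, 1.0]
print(mean_squared_error(predicted, actual))  # close to 0.03
```

During training, backpropagation uses this single error value to decide how to adjust each weight and bias; the lower the cost, the better the network fits the data.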