Perceptron Algorithm for NOT Logic Gate


Introduction

Within the domain of artificial intelligence and machine learning, the Perceptron Algorithm holds a special place as one of the foundational building blocks. Although it may seem basic in comparison to present-day complex neural networks, understanding the Perceptron Algorithm is essential because it forms the basis for many modern learning techniques. In this article, we will explore the Perceptron Algorithm with a focus on its application to the NOT logic gate. We will dig into the theory behind the algorithm, its components, and how it can be used to implement the logical NOT operation.

The Logical NOT Gate

Before we dive into the implementation of the Perceptron Algorithm, let's briefly revisit the NOT logic gate. The NOT gate is a basic logic gate that takes a single binary input and produces the inverted output.

Input A    Output
0          1
1          0

Implementation of NOT Logic Gate

Implementing the NOT Gate using a Perceptron: Now, let's apply the Perceptron Algorithm to implement the NOT logic gate. As mentioned earlier, the NOT gate takes a single binary input, so we have one input (x) and one weight (w) associated with it. We also have a bias term (b) for shifting the decision boundary. We can summarize the implementation of the Perceptron Algorithm for the NOT gate as follows.
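For instance, with the weight w = -1 and bias b = 0.5 used in the example below, the perceptron computes (-1)(0) + 0.5 = 0.5 for input 0, which the step function maps to 1, and (-1)(1) + 0.5 = -0.5 for input 1, which it maps to 0. This is exactly the NOT truth table.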

Algorithm Steps

Step 1: Initialize the Perceptron. Begin by setting up the initial parameters for the perceptron. This includes defining the weight and bias.

Step 2: Define the Activation Function. This step function converts the output of the perceptron into a binary value (0 or 1).

Step 3: Calculate the Output of the Perceptron, using the formula:

output = step_function((input * weight) + bias)

Step 4: Train the Perceptron. Training involves updating the weight and bias iteratively based on the prediction error at each step.

Step 5: Test the Perceptron. Make sure to apply the step function to obtain the final binary output.

Example

import numpy as np

# Step activation: maps the weighted sum to a binary value (0 or 1)
def step_function(x):
    return 1 if x >= 0 else 0

class PerceptronNOT:
    def __init__(self):
        self.weight = -1.0  # initial weight
        self.bias = 0.5     # initial bias

    def predict(self, input_data):
        # Weighted sum of the single input plus the bias
        summation = input_data * self.weight + self.bias
        return step_function(summation)

    def train(self, input_data, target_output, learning_rate=0.1, epochs=100):
        for epoch in range(epochs):
            total_error = 0
            for input_val, target in zip(input_data, target_output):
                prediction = self.predict(input_val)
                error = target - prediction
                total_error += abs(error)
                # Perceptron learning rule: adjust weight and bias by the error
                self.weight += learning_rate * error * input_val
                self.bias += learning_rate * error
            # Stop early once every training example is classified correctly
            if total_error == 0:
                break


# Training data for the NOT gate: input 0 -> 1, input 1 -> 0
input_data = np.array([0, 1])
target_output = np.array([1, 0])

# Create and train the perceptron
not_gate = PerceptronNOT()
not_gate.train(input_data, target_output)

# Verify the learned behavior on both possible inputs
print("Testing Perceptron NOT gate:")
for input_val in input_data:
    output = not_gate.predict(input_val)
    print(f"Input: {input_val}, Output: {output}")

Output

Testing Perceptron NOT gate:
Input: 0, Output: 1
Input: 1, Output: 0
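Note that in this example the initial parameters (w = -1, b = 0.5) already satisfy the NOT truth table, so training converges immediately. In general, any negative weight w with a bias b satisfying 0 <= b < -w implements NOT, since the weighted sum is then non-negative for input 0 and negative for input 1.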

The Perceptron Algorithm

The Perceptron Algorithm is a simple yet foundational supervised learning algorithm used for binary classification tasks. Proposed by Frank Rosenblatt in 1957, it mimics the working of a single neuron in a biological brain. The algorithm is particularly well-suited for linearly separable datasets, where two classes can be cleanly separated by a straight line.

The algorithm starts by initializing the weights and bias to small random values. The weights are multiplied by the input features, and the bias is added to compute the weighted sum. The activation function, typically a step function, determines whether the output neuron fires or not.

During training, the perceptron iteratively updates its weights and bias based on the prediction errors, using a learning rate as a scaling factor for the updates.
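The update rule itself is only a couple of lines. Below is a minimal sketch for a generic multi-input perceptron with a step activation; the function name update_step and the parameter eta are illustrative and not part of the example above.

def update_step(weights, bias, x, target, eta=0.1):
    # Weighted sum plus bias, passed through the step activation
    prediction = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias >= 0 else 0
    error = target - prediction  # +1, 0, or -1 for binary targets
    # Scale the correction by the learning rate eta
    new_weights = [w + eta * error * xi for w, xi in zip(weights, x)]
    new_bias = bias + eta * error
    return new_weights, new_bias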

Despite its simplicity, the Perceptron Algorithm paved the way for more sophisticated neural network architectures, laying the groundwork for modern deep learning techniques.

To restate the core idea: the Perceptron Algorithm is a supervised learning algorithm for binary classification problems, based on a simplified artificial neuron called a perceptron. The perceptron takes multiple inputs, each multiplied by its respective weight, and produces a binary output depending on whether the weighted sum of the inputs crosses a certain threshold.
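In vectorized form, this decision rule is a single dot product. The sketch below assumes NumPy arrays w and x of matching length; with w = [-1.0] and b = 0.5 it reproduces the NOT gate implemented earlier.

import numpy as np

def perceptron_output(w, x, b):
    # Fire (output 1) only when the weighted sum reaches the threshold of 0
    return 1 if np.dot(w, x) + b >= 0 else 0

print(perceptron_output(np.array([-1.0]), np.array([0]), 0.5))  # 1
print(perceptron_output(np.array([-1.0]), np.array([1]), 0.5))  # 0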

Conclusion

In conclusion, the Perceptron Algorithm is a fundamental concept in machine learning, and understanding it is essential for grasping more advanced techniques. By implementing the Perceptron Algorithm for the NOT logic gate, we are able to observe how a simple model can be trained to mimic the behavior of a basic logical operation. Through its historical significance and theoretical underpinnings, the Perceptron Algorithm remains a cornerstone in the ever-expanding field of artificial intelligence and machine learning.
