Perceptron Algorithm for OR Logic Gate with 2-bit Binary Input


Introduction

The field of artificial intelligence has made noteworthy strides toward emulating human intelligence through different algorithms and models. Among these, the Perceptron Algorithm stands as an essential building block of neural networks, mimicking the behavior of a single neuron in the human brain. In this article, we delve into the workings of the Perceptron Algorithm and illustrate its application in solving the OR logic gate problem with 2-bit binary input. By understanding this simple yet capable algorithm, we can begin to unlock the potential of the more complex neural networks used in today's AI landscape.

The algorithm is especially well-suited for linearly separable datasets, where the two classes can be cleanly separated by a straight line. The algorithm begins by initializing the weights and bias to small random values. The weights are multiplied by the input features, and the bias is added to compute the output. The activation function, typically a step function, decides whether the output neuron fires or not.
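As a minimal sketch of this forward pass (the variable names and numeric values here are illustrative, not part of any fixed API):

import numpy as np

weights = np.array([0.4, 0.6])   # example weights (arbitrary values)
bias = -0.5                      # example bias (arbitrary value)
x = np.array([1, 0])             # one 2-bit input

# Weighted sum of the inputs plus the bias
z = np.dot(x, weights) + bias

# Step activation: fire (1) if the weighted sum reaches the threshold 0
output = 1 if z >= 0 else 0
print(output)   # prints 0 for these example values, since z = -0.1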

During training, the perceptron iteratively updates its weights and bias based on the prediction errors, using a learning rate as a scaling factor for the updates. The process continues until the algorithm converges, i.e., until the weights and bias have adjusted to correctly map the inputs to their outputs.
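Concretely, the update applied for each training example is the standard perceptron learning rule, which the implementation later in this article also uses:

error = target - prediction
weights = weights + learning_rate × error × inputs
bias = bias + learning_rate × error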

Despite its simplicity, the Perceptron Algorithm has paved the way for more sophisticated neural network architectures, laying the groundwork for modern deep learning techniques.

The Perceptron Algorithm is a supervised learning algorithm used for binary classification problems. It is based on the concept of a simplified artificial neuron called a perceptron. The perceptron takes multiple inputs, each multiplied by its respective weight, and produces a binary output based on whether the weighted sum of inputs exceeds a certain threshold or not.

Understanding the OR Logic Gate

The truth table for the OR gate with 2-bit binary inputs is as follows:

Input A   Input B   Output
0         0         0
0         1         1
1         0         1
1         1         1

Implementation of the OR Logic Gate

To apply the Perceptron Algorithm, we need a dataset that represents the OR logic gate's behavior. For the OR gate, the input combinations and their corresponding outputs match the truth table above: (0, 0) → 0, (0, 1) → 1, (1, 0) → 1, and (1, 1) → 1.

Algorithm Steps

Step 1: Initialize weights and bias. Initially, the weights and bias can be set to small random values or initialized to zero.

Step 2: Calculate the weighted sum. For each input data point, calculate the weighted sum of inputs using the equation z = (w1 × x1) + (w2 × x2) + b, where w1 and w2 are the weights, x1 and x2 are the inputs, and b is the bias.

Step 3: Apply the activation function. Using the calculated weighted sum, apply the step activation function to determine the output of the perceptron: output 1 if z >= 0, and 0 otherwise.

Step 4: Update weights and bias. Compare the predicted output with the actual output from the dataset, and adjust the weights and bias to reduce the error using the learning rule described earlier.

Step 5: Repeat the above steps for multiple epochs, or until the algorithm converges to a solution (i.e., all training examples are classified correctly).
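To make the update concrete, here is one hand-worked iteration under hypothetical starting values (these exact numbers are illustrative, not produced by the code below). Suppose w1 = 0.2, w2 = 0.2, b = -0.5, and the learning rate is 0.1. For the input (1, 0), the weighted sum is z = (0.2 × 1) + (0.2 × 0) - 0.5 = -0.3, so the step function outputs 0 while the target is 1, giving an error of 1. The update then yields w1 = 0.2 + 0.1 × 1 × 1 = 0.3, w2 = 0.2 (unchanged, since x2 = 0), and b = -0.5 + 0.1 = -0.4, moving the perceptron closer to firing on this input.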

Example

import numpy as np


# Step activation: fire (1) if the weighted sum reaches the threshold 0
def step_function(x):
    return 1 if x >= 0 else 0


class PerceptronOR:
    def __init__(self, input_size):
        # Initialize weights and bias to small random values in [0, 1)
        self.weights = np.random.rand(input_size)
        self.bias = np.random.rand()

    def predict(self, inputs):
        # Weighted sum of inputs plus bias, passed through the step function
        summation = np.dot(inputs, self.weights) + self.bias
        return step_function(summation)

    def train(self, inputs, target_output, learning_rate=0.1, epochs=100):
        for epoch in range(epochs):
            total_error = 0
            for input_data, target in zip(inputs, target_output):
                prediction = self.predict(input_data)
                error = target - prediction
                total_error += abs(error)
                # Perceptron learning rule: nudge the weights and bias
                # in proportion to the error and the learning rate
                self.weights += learning_rate * error * input_data
                self.bias += learning_rate * error
            # Stop early once an epoch passes with no misclassifications
            if total_error == 0:
                break


# Training data: the OR gate truth table
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
target_output = np.array([0, 1, 1, 1])


# Train the perceptron on the OR gate data
or_gate = PerceptronOR(input_size=2)
or_gate.train(inputs, target_output)


# Verify the trained perceptron against all four input combinations
print("Testing Perceptron OR gate:")
for input_data in inputs:
    output = or_gate.predict(input_data)
    print(f"Input: {input_data}, Output: {output}")

Output

Testing Perceptron OR gate:
Input: [0 0], Output: 0
Input: [0 1], Output: 1
Input: [1 0], Output: 1
Input: [1 1], Output: 1
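Note that because the weights and bias are initialized randomly, the learned values differ between runs, although the predictions above are the same once training converges. To inspect what the perceptron actually learned, you can print its attributes directly (the exact numbers will vary per run):

print(or_gate.weights, or_gate.bias)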

Conclusion

The Perceptron Algorithm is a fundamental concept in machine learning and neural networks. In this article, we explored the basics of the Perceptron Algorithm and its application to solving the OR logic gate problem with 2-bit binary input. By training a simple perceptron, we were able to make accurate predictions about the OR gate's behavior. Though the Perceptron Algorithm has its limitations, such as its inability to handle non-linearly separable data (the XOR gate being the classic example), it paved the way for more advanced neural network architectures like multilayer perceptrons (MLPs) and deep learning models. As AI continues to advance, understanding the Perceptron Algorithm and its variants is essential for grasping the fundamental principles of artificial neural networks.
