Prediction of Wine type using Deep Learning

In recent years, deep learning has gained significant attention for its ability to analyze complex datasets and make accurate predictions. One intriguing application is the prediction of wine types based on various chemical attributes. By leveraging the power of deep learning algorithms, researchers have been able to develop models capable of classifying wines with high accuracy.

This article explores the use of deep learning techniques, such as neural networks, to predict wine types using attributes like alcohol content, acidity, and phenolic compounds. By harnessing the potential of deep learning, wine producers and enthusiasts can enhance their decision-making processes and improve wine quality assessment, ultimately leading to better customer satisfaction and industry advancements.

Dataset Overview

The wine dataset contains 178 samples with 13 chemical features, including alcohol content, malic acid, ash, alcalinity of ash, magnesium, total phenols, and flavanoids. The target variable represents three wine types (classes 0, 1, and 2), each corresponding to a different cultivar.

Implementation Steps

Step 1: Import Required Libraries

We need pandas for data manipulation, NumPy for numerical operations, TensorFlow/Keras for building the neural network, scikit-learn for the dataset and preprocessing, and Matplotlib for visualization:

import pandas as pd
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder, StandardScaler
from sklearn.datasets import load_wine

# Load the wine dataset (built-in scikit-learn dataset)
wine_data = load_wine()
X = wine_data.data
y = wine_data.target

print(f"Dataset shape: {X.shape}")
print(f"Target classes: {np.unique(y)}")
Output:
Dataset shape: (178, 13)
Target classes: [0 1 2]
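Although the steps below work directly on the raw NumPy arrays, pandas (imported above) makes it easy to inspect the data as a table first. A minimal sketch using only the built-in scikit-learn dataset:

```python
import pandas as pd
from sklearn.datasets import load_wine

# Load the wine dataset and wrap it in a DataFrame for inspection
wine_data = load_wine()
df = pd.DataFrame(wine_data.data, columns=wine_data.feature_names)
df["target"] = wine_data.target  # wine type: 0, 1, or 2

print(df.shape)    # (178, 14) -> 13 features plus the target column
print(df.head(3))  # first three samples
```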

Step 2: Data Analysis and Preprocessing

Let's examine the dataset structure and prepare it for training:

# Dataset information
num_features = X.shape[1]
num_classes = len(np.unique(y))
num_samples = X.shape[0]

print(f"Number of samples: {num_samples}")
print(f"Number of features: {num_features}")
print(f"Number of classes: {num_classes}")
print(f"Feature names: {wine_data.feature_names[:5]}...")  # First 5 features
Output:
Number of samples: 178
Number of features: 13
Number of classes: 3
Feature names: ['alcohol', 'malic_acid', 'ash', 'alcalinity_of_ash', 'magnesium']...
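Before splitting, it is also worth checking whether the three classes are balanced; a heavily skewed distribution would call for a stratified split. A quick check with NumPy:

```python
import numpy as np
from sklearn.datasets import load_wine

y = load_wine().target

# Count samples per class (index = class label)
counts = np.bincount(y)
for label, count in enumerate(counts):
    print(f"Class {label}: {count} samples")
# The classes are moderately balanced (59 / 71 / 48)
```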

Step 3: Train-Test Split and Feature Scaling

Split the data and standardize the features for better neural network performance:

# Split the dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Standardize the features
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

print(f"Training set shape: {X_train_scaled.shape}")
print(f"Test set shape: {X_test_scaled.shape}")
Output:
Training set shape: (142, 13)
Test set shape: (36, 13)
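It is easy to verify that the scaler did what we expect: after fit_transform, every training feature should have mean ≈ 0 and standard deviation ≈ 1, while the test set (transformed with the training statistics) is only approximately standardized. A sketch reproducing the split above:

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

# Training features are standardized exactly ...
print(np.abs(X_train_scaled.mean(axis=0)).max())  # effectively 0
print(X_train_scaled.std(axis=0).round(4))        # all 1.0
# ... while test features are only approximately so
print(X_test_scaled.mean(axis=0).round(2))
```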

Step 4: Build and Train the Neural Network

Create a deep learning model with multiple dense layers for wine classification:

# Create the neural network model
model = Sequential([
    Dense(64, activation='relu', input_shape=(num_features,)),
    Dense(32, activation='relu'),
    Dense(num_classes, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam', 
              loss='sparse_categorical_crossentropy', 
              metrics=['accuracy'])

# Train the model
history = model.fit(X_train_scaled, y_train, 
                   batch_size=16, 
                   epochs=100, 
                   validation_split=0.2, 
                   verbose=0)

print("Training completed!")
print(f"Final training accuracy: {history.history['accuracy'][-1]:.4f}")
print(f"Final validation accuracy: {history.history['val_accuracy'][-1]:.4f}")
Output:
Training completed!
Final training accuracy: 1.0000
Final validation accuracy: 0.9643
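Matplotlib (imported in Step 1) can visualize the learning curves stored in `history.history`. The sketch below uses a small placeholder dict so it runs standalone; in the notebook you would pass `history.history` directly:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; remove in a notebook
import matplotlib.pyplot as plt

# Placeholder standing in for history.history returned by model.fit
history_dict = {
    "accuracy":     [0.55, 0.80, 0.92, 0.97, 1.00],
    "val_accuracy": [0.50, 0.75, 0.89, 0.93, 0.96],
}

plt.figure(figsize=(6, 4))
plt.plot(history_dict["accuracy"], label="train")
plt.plot(history_dict["val_accuracy"], label="validation")
plt.xlabel("Epoch")
plt.ylabel("Accuracy")
plt.title("Wine classifier learning curves")
plt.legend()
plt.savefig("learning_curves.png")
```

A widening gap between the two curves would signal overfitting; here both climb together, matching the accuracies printed above.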

Step 5: Model Evaluation and Prediction

Evaluate the model performance and make predictions on new data:

# Evaluate on test set
test_loss, test_accuracy = model.evaluate(X_test_scaled, y_test, verbose=0)
print(f"Test Loss: {test_loss:.4f}")
print(f"Test Accuracy: {test_accuracy:.4f}")

# Make predictions on test set
predictions = model.predict(X_test_scaled, verbose=0)
predicted_classes = np.argmax(predictions, axis=1)

print(f"Sample predictions: {predicted_classes[:5]}")
print(f"Actual labels: {y_test[:5]}")

# Predict wine type for a new sample
new_sample = X_test_scaled[0:1]  # Use first test sample
prediction = model.predict(new_sample, verbose=0)
predicted_class = np.argmax(prediction)
confidence = np.max(prediction)

print(f"Predicted wine type: {predicted_class}")
print(f"Confidence: {confidence:.4f}")
Output:
Test Loss: 0.0423
Test Accuracy: 1.0000
Sample predictions: [1 0 1 0 1]
Actual labels: [1 0 1 0 1]
Predicted wine type: 1
Confidence: 0.9999
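Accuracy alone can hide per-class errors; scikit-learn's confusion_matrix and classification_report give a class-by-class view. The sketch below applies them to the same test split, using a quick logistic-regression stand-in so it runs without the trained Keras model; in the notebook you would pass `predicted_classes` from Step 5 instead of `pred`:

```python
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, classification_report

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
scaler = StandardScaler()
X_train_s = scaler.fit_transform(X_train)
X_test_s = scaler.transform(X_test)

# Stand-in classifier; substitute the Keras model's predicted_classes here
clf = LogisticRegression(max_iter=1000).fit(X_train_s, y_train)
pred = clf.predict(X_test_s)

cm = confusion_matrix(y_test, pred)
print(cm)  # rows = true class, columns = predicted class
print(classification_report(y_test, pred))
```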

Model Architecture

The neural network stacks three dense layers:

  • First hidden layer: 64 neurons with ReLU activation, receiving the 13 input features
  • Second hidden layer: 32 neurons with ReLU activation
  • Output layer: 3 neurons with softmax activation (one per wine type)
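The trainable-parameter count of this architecture can be worked out by hand: each Dense layer has inputs × units weights plus units biases. A quick check, matching what Keras's model.summary() would report for these layer sizes:

```python
# Parameters of a Dense layer = inputs * units + units (biases)
layer_shapes = [(13, 64), (64, 32), (32, 3)]  # (inputs, units) per layer

total = 0
for inputs, units in layer_shapes:
    params = inputs * units + units
    total += params
    print(f"Dense({units}) from {inputs} inputs: {params} parameters")

print(f"Total trainable parameters: {total}")  # -> 3075
```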

Key Benefits

  • High Accuracy: Achieves near-perfect classification on the wine dataset
  • Feature Learning: Automatically learns important chemical patterns
  • Scalability: Can handle larger datasets with more wine varieties
  • Real-time Prediction: Fast inference for new wine samples

Conclusion

Deep learning proves highly effective for wine type prediction, achieving excellent accuracy by learning complex patterns in chemical composition data. This approach enables wine producers to automate quality assessment and classification processes, leading to improved decision-making and enhanced wine production efficiency.

Updated on: 2026-03-27T09:47:46+05:30
