How can the minimum of a scalar function be found using SciPy in Python?

Finding the minimum of a scalar function is a fundamental optimization problem in scientific computing. SciPy's scipy.optimize module provides several algorithms for this task, including the unified minimize() function, the one-dimensional minimize_scalar(), and legacy routines such as fmin_bfgs().

Example

Let's find the minimum of a scalar function using SciPy's optimization tools −

import matplotlib.pyplot as plt
from scipy import optimize
import numpy as np

print("The function is defined")

def my_func(a):
    return a**2 + 10 * np.sin(a)

# Create data points for plotting
a = np.linspace(-10, 10, 400)
plt.plot(a, my_func(a))
plt.title('Function: f(x) = x² + 10sin(x)')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.grid(True)
print("Plotting the graph")
plt.show()

# Find minimum using BFGS algorithm
result = optimize.fmin_bfgs(my_func, 0)
print("Minimum found at:", result)

Output

The function is defined
Plotting the graph
Optimization terminated successfully.
         Current function value: -7.945823
         Iterations: 5
         Function evaluations: 21
         Gradient evaluations: 7
Minimum found at: [-1.30644001]
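One caveat worth demonstrating: gradient-based methods like BFGS only converge to a local minimum, so the answer depends on the initial guess. A minimal sketch (the starting points 0 and 3 are my own choices for illustration) −

```python
import numpy as np
from scipy import optimize

def my_func(a):
    return a**2 + 10 * np.sin(a)

# Starting near 0 descends into the global minimum around x = -1.3064...
res_global = optimize.minimize(my_func, x0=0, method='BFGS')

# ...but starting at 3 converges to a different, local minimum near x = 3.84
res_local = optimize.minimize(my_func, x0=3, method='BFGS')

print("From x0=0:", res_global.x, res_global.fun)
print("From x0=3:", res_local.x, res_local.fun)
```

For functions with many local minima, try several starting points or a global optimizer such as scipy.optimize.basinhopping.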

Using scipy.optimize.minimize()

The modern approach uses the unified minimize() function with different methods −

from scipy.optimize import minimize
import numpy as np

def my_func(x):
    return x**2 + 10 * np.sin(x)

# Using BFGS method
result_bfgs = minimize(my_func, x0=0, method='BFGS')
print("BFGS Method:")
print(f"Minimum at x = {result_bfgs.x[0]:.6f}")
print(f"Function value = {result_bfgs.fun:.6f}")
print(f"Success: {result_bfgs.success}")

# Using Nelder-Mead method
result_nm = minimize(my_func, x0=0, method='Nelder-Mead')
print("\nNelder-Mead Method:")
print(f"Minimum at x = {result_nm.x[0]:.6f}")
print(f"Function value = {result_nm.fun:.6f}")

Output

BFGS Method:
Minimum at x = -1.306440
Function value = -7.945823
Success: True

Nelder-Mead Method:
Minimum at x = -1.306533
Function value = -7.945823
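Beyond x and fun, the OptimizeResult object returned by minimize() carries useful diagnostics. A small sketch of the attributes I find most useful (same function as above) −

```python
from scipy.optimize import minimize
import numpy as np

def f(x):
    return x**2 + 10 * np.sin(x)

res = minimize(f, x0=0, method='BFGS')

# OptimizeResult exposes diagnostics beyond x and fun
print(res.message)   # human-readable termination reason
print(res.nit)       # number of iterations performed
print(res.nfev)      # number of function evaluations
print(res.jac)       # gradient at the solution (near zero at a minimum)
```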

Common Optimization Methods

Method        Type                 Best For
------------  -------------------  -----------------------------
BFGS          Gradient-based       Smooth functions
Nelder-Mead   Direct search        Non-smooth functions
Powell        Direction set        Functions without derivatives
CG            Conjugate gradient   Large-scale problems
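To see the table in action, here is a rough comparison sketch running each listed method on the same objective (the function and starting point are my own choices, not prescribed by any method) −

```python
from scipy.optimize import minimize
import numpy as np

def f(x):
    return x[0]**2 + 10 * np.sin(x[0])

# Run each method from the same starting point and compare results
results = {}
for method in ['BFGS', 'Nelder-Mead', 'Powell', 'CG']:
    res = minimize(f, x0=[0.0], method=method)
    results[method] = res
    print(f"{method:12s} x = {res.x[0]: .6f}  f = {res.fun: .6f}")
```

On a smooth one-dimensional function like this, all four methods should agree to several decimal places; the differences show up in evaluation counts and in robustness on noisier objectives.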

Key Parameters

  • x0 − Initial guess for the minimum
  • method − Optimization algorithm to use
  • options − Dictionary of solver-specific options
  • bounds − Constraints on variable values (for bounded methods)

Conclusion

SciPy's optimize.minimize() function provides a unified interface for scalar function minimization. Choose BFGS for smooth functions or Nelder-Mead for non-smooth cases. Always provide a reasonable initial guess for better convergence.
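For purely one-dimensional problems like the one in this article, SciPy also provides minimize_scalar(), which needs no initial guess at all; a minimal sketch −

```python
from scipy.optimize import minimize_scalar
import numpy as np

def f(x):
    return x**2 + 10 * np.sin(x)

# Default Brent's method: brackets and refines a minimum automatically
res = minimize_scalar(f)
print(res.x, res.fun)

# Bounded search confines the minimum to a given interval
res_b = minimize_scalar(f, bounds=(-10, 10), method='bounded')
print(res_b.x, res_b.fun)
```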

Updated on: 2026-03-25T13:18:10+05:30
