You're tasked with dividing an array into subarrays in the most cost-efficient way possible. This is a strategic optimization problem where the order and size of divisions significantly impact the total cost.
Given two integer arrays nums and cost of equal size, and an integer k, you need to divide nums into consecutive subarrays. Each division comes with a cost calculated using this formula:
Cost of subarray i = (sum of that subarray's nums elements + k × i) × (sum of that subarray's cost elements)
Where i represents the subarray's position (1st, 2nd, 3rd, etc.). The challenge is to find the minimum total cost across all possible ways to divide the array.
For example, if you divide [1,2,3] into [1] and [2,3], the first subarray's nums sum receives the additive penalty k×1, while the second receives k×2 — so subarrays placed later in the division carry a larger penalty.
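The minimum can be found with a straightforward dynamic program over (prefix length, number of subarrays so far). The sketch below is one possible O(n³) implementation of the cost formula as stated above; the function name and structure are illustrative, not part of the problem, and faster formulations exist.

```python
def min_division_cost(nums, cost, k):
    """Minimum total cost to split nums into consecutive subarrays,
    where the i-th subarray (1-indexed) costs
    (sum of its nums + k * i) * (sum of its cost)."""
    n = len(nums)
    # Prefix sums so any subarray sum is an O(1) lookup.
    pn = [0] * (n + 1)
    pc = [0] * (n + 1)
    for idx, (a, b) in enumerate(zip(nums, cost)):
        pn[idx + 1] = pn[idx] + a
        pc[idx + 1] = pc[idx] + b

    INF = float("inf")
    # dp[j][i] = min cost to split the first j elements into exactly i subarrays.
    dp = [[INF] * (n + 1) for _ in range(n + 1)]
    dp[0][0] = 0
    for j in range(1, n + 1):
        for i in range(1, j + 1):
            # l = length of the prefix before the i-th subarray begins.
            for l in range(i - 1, j):
                if dp[l][i - 1] == INF:
                    continue
                seg_nums = pn[j] - pn[l]
                seg_cost = pc[j] - pc[l]
                c = dp[l][i - 1] + (seg_nums + k * i) * seg_cost
                if c < dp[j][i]:
                    dp[j][i] = c
    # The number of subarrays is free, so take the best over all counts.
    return min(dp[n][i] for i in range(1, n + 1))
```

With nums = [1,2,3], cost = [1,1,1], k = 1, splitting into three singletons gives (1+1)·1 + (2+2)·1 + (3+3)·1 = 12, which this DP returns as the minimum.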
Input & Output
Constraints
- 1 ≤ nums.length ≤ 1000
- nums.length = cost.length
- 1 ≤ nums[i], cost[i] ≤ 100
- 0 ≤ k ≤ 10³
- The array must be divided into at least one subarray