Minimum Cost to Divide Array Into Subarrays - Problem
You are given two integer arrays, nums and cost, of the same size, and an integer k.
You must divide nums into contiguous subarrays that together cover the entire array. The cost of the i-th subarray, consisting of elements nums[l..r], is:
(nums[l] + nums[l+1] + ... + nums[r] + k * i) * (cost[l] + cost[l+1] + ... + cost[r])
Note that i represents the order of the subarray: 1 for the first subarray, 2 for the second, and so on.
Return the minimum total cost possible from any valid division.
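As a concreteness check, the cost of any fixed division can be computed directly from the definition above (a minimal sketch; the function name and the `(l, r)` pair representation are our own choices):

```python
def division_cost(nums, cost, k, parts):
    """Total cost of a division given as a list of (l, r) index pairs, inclusive.

    Each subarray's cost is (sum of its nums values + k * i) times the sum of
    its cost values, where i is the 1-based order of the subarray.
    """
    total = 0
    for i, (l, r) in enumerate(parts, start=1):
        total += (sum(nums[l:r + 1]) + k * i) * sum(cost[l:r + 1])
    return total
```

For instance, `division_cost([1, 3, 2], [2, 3, 1], 1, [(0, 0), (1, 1), (2, 2)])` reproduces the arithmetic in Example 1 below.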
Input & Output
Example 1 — Basic Division
Input:
nums = [1,3,2], cost = [2,3,1], k = 1
Output:
24
💡 Note:
Optimal division: [1] | [3] | [2]. First subarray: (1+1*1)*2 = 4. Second subarray: (3+1*2)*3 = 15. Third subarray: (2+1*3)*1 = 5. Total: 4+15+5 = 24.
Example 2 — Single Element
Input:
nums = [5], cost = [2], k = 3
Output:
16
💡 Note:
Only one way to divide: single subarray [5]. Cost = (5 + 3*1) * 2 = 8 * 2 = 16
Example 3 — All Separate vs Together
Input:
nums = [1,1], cost = [1,1], k = 2
Output:
8
💡 Note:
Option 1: [1] | [1] gives (1+2*1)*1 + (1+2*2)*1 = 3 + 5 = 8. Option 2: [1,1] gives (1+1+2*1)*2 = 4*2 = 8. Both divisions give the same total, so the minimum is 8.
Constraints
- 1 ≤ nums.length ≤ 1000
- nums.length == cost.length
- -1000 ≤ nums[i] ≤ 1000
- 1 ≤ cost[i] ≤ 1000
- 1 ≤ k ≤ 1000
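With n up to 1000, an O(n²) dynamic program fits comfortably. The key observation is that summing k·i over subarrays in order is the same as charging k times the suffix cost sum each time a new subarray begins, which removes the order index i from the DP state. The sketch below assumes this transformation (the function name is our own):

```python
def min_division_cost(nums, cost, k):
    """O(n^2) DP. dp[j] = minimum total cost of partitioning the first j elements.

    Identity used: sum over subarrays of k * i * (cost sum of subarray i)
    equals sum over subarrays of k * (cost sum from that subarray's start
    to the end of the array), so the subarray index i need not be tracked.
    """
    n = len(nums)
    SN = [0] * (n + 1)  # prefix sums of nums
    SC = [0] * (n + 1)  # prefix sums of cost
    for i in range(n):
        SN[i + 1] = SN[i] + nums[i]
        SC[i + 1] = SC[i] + cost[i]

    INF = float("inf")
    dp = [INF] * (n + 1)
    dp[0] = 0
    for j in range(1, n + 1):
        for l in range(j):  # last subarray covers nums[l..j-1]
            candidate = (dp[l]
                         + (SN[j] - SN[l]) * (SC[j] - SC[l])  # sum(nums) * sum(cost)
                         + k * (SC[n] - SC[l]))               # redistributed k*i term
            dp[j] = min(dp[j], candidate)
    return dp[n]
```

On the examples above, `min_division_cost([1, 3, 2], [2, 3, 1], 1)` returns 24 and `min_division_cost([5], [2], 3)` returns 16, matching the expected outputs.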