Find the Perfect Break of an Array
There is an integer array arr of length n. Two arrays b and c are called a perfect break of arr if:
The lengths of arrays b and c are n.
Elements in array b are in non-decreasing order.
Elements in array c are in non-increasing order.
They satisfy b[i] ≥ 0, c[i] ≥ 0, and b[i] + c[i] = arr[i] for each i where 0 ≤ i < n.
Find the number of possible perfect breaks of arr. Since the number can be large, return the value modulo 1000000007 (10⁹ + 7).
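A direct way to read these conditions is as a checker. The Python sketch below verifies all four requirements for given arr, b, and c; the helper name is_perfect_break is illustrative and not part of the problem statement.

def is_perfect_break(arr, b, c):
    n = len(arr)
    # Condition 1: b and c both have length n.
    if len(b) != n or len(c) != n:
        return False
    # Condition 2: b is non-decreasing.
    if any(b[i] > b[i + 1] for i in range(n - 1)):
        return False
    # Condition 3: c is non-increasing.
    if any(c[i] < c[i + 1] for i in range(n - 1)):
        return False
    # Condition 4: non-negative elements whose element-wise sum equals arr.
    return all(b[i] >= 0 and c[i] >= 0 and b[i] + c[i] == arr[i] for i in range(n))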
Example:
Input:
n = 3
arr = [2, 3, 2]
b = [0, 1, 1], c = [2, 2, 1] is a perfect break because it satisfies all 4 conditions.
-> b and c each have 3 elements.
-> b is in non-decreasing order.
-> c is in non-increasing order.
-> the elements of b and c are 0 or more, and the element-wise sums b + c = [2, 3, 2], which matches arr.
Similarly,
b = [0, 1, 2], c = [2, 2, 0] is a perfect break.
b = [0, 2, 2], c = [2, 1, 0] is a perfect break.
b = [1, 2, 2], c = [1, 1, 0] is a perfect break.
b = [1, 3, 2], c = [1, 0, 0] is not a perfect break because b is not non-decreasing.
b = [1, 2, 2], c = [2, 1, 0] is not a perfect break because b[0] + c[0] != arr[0].
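One way to count perfect breaks is a dynamic program over the value chosen for b at each index: rewriting the conditions, b[i] must satisfy max(b[i-1], b[i-1] + arr[i] - arr[i-1]) ≤ b[i] ≤ arr[i], and c is then determined by c[i] = arr[i] - b[i]. The Python sketch below follows that idea; the function name count_perfect_breaks is illustrative, and it assumes the values in arr are small enough for an O(n * max(arr)) table, which may not hold for the real constraints.

MOD = 1_000_000_007  # 10^9 + 7

def count_perfect_breaks(arr):
    n = len(arr)
    if n == 0:
        return 0
    top = max(arr)
    # dp[v] = number of valid prefixes b[0..i] whose last value b[i] is v.
    dp = [1 if v <= arr[0] else 0 for v in range(top + 1)]
    for i in range(1, n):
        diff = arr[i] - arr[i - 1]
        # Prefix sums so each new state is a range sum over the old states.
        pref = [0] * (top + 2)
        for v in range(top + 1):
            pref[v + 1] = (pref[v] + dp[v]) % MOD
        ndp = [0] * (top + 1)
        for v in range(arr[i] + 1):
            # b[i] = v is reachable from b[i-1] = u iff u <= v and u + diff <= v,
            # i.e. u <= v - max(0, diff).
            hi = v - max(0, diff)
            if hi >= 0:
                ndp[v] = pref[min(hi, top) + 1]
        dp = ndp
    return sum(dp) % MOD

print(count_perfect_breaks([2, 3, 2]))  # prints 4, matching the four perfect breaks listed above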