Description
You are given two arrays rowSum and colSum of non-negative integers where rowSum[i] is the sum of the elements in the ith row and colSum[j] is the sum of the elements of the jth column of a 2D matrix. In other words, you do not know the elements of the matrix, but you do know the sums of each row and column.
Find any matrix of non-negative integers of size rowSum.length x colSum.length that satisfies the rowSum and colSum requirements.
Return a 2D array representing any matrix that fulfills the requirements. It’s guaranteed that at least one matrix that fulfills the requirements exists.
Example 1:
Input: rowSum = [3,8], colSum = [4,7]
Output: [[3,0],
         [1,7]]
Explanation:
0th row: 3 + 0 = 3 == rowSum[0]
1st row: 1 + 7 = 8 == rowSum[1]
0th column: 3 + 1 = 4 == colSum[0]
1st column: 0 + 7 = 7 == colSum[1]
The row and column sums match, and all matrix elements are non-negative.
Another possible matrix is: [[1,2],
                             [3,5]]
Example 2:
Input: rowSum = [5,7,10], colSum = [8,6,8]
Output: [[0,5,0],
[6,1,0],
[2,0,8]]
Example 3:
Input: rowSum = [14,9], colSum = [6,9,8]
Output: [[0,9,5],
         [6,0,3]]
Example 4:
Input: rowSum = [1,0], colSum = [1]
Output: [[1],
         [0]]
Example 5:
Input: rowSum = [0], colSum = [0]
Output: [[0]]
Note:
1 <= rowSum.length, colSum.length <= 500
0 <= rowSum[i], colSum[i] <= 10^8
sum(rowSum) == sum(colSum)
Analysis
The problem describes a matrix only through the sum of each row, rowSum, and the sum of each column, colSum, without giving any individual entries, and asks us to reconstruct any non-negative matrix whose row and column sums match rowSum and colSum. More than one matrix may satisfy the requirements; we only need to return one of them. The idea is simple, using a greedy algorithm (a step-by-step trace follows the list):
- Initialize a result matrix of all zeros with rowSum.length rows and colSum.length columns.
- Iterate over every position in result. The value placed at (i, j) can be at most min(remaining sum of row i, remaining sum of column j), so tentatively set result[i][j] = min(rowSum[i], colSum[j]).
- Subtract result[i][j] from both the remaining row sum and the remaining column sum, then move on to the next position.
- The matrix obtained when the traversal finishes is the answer. Each placement exhausts at least one of the two remaining sums, and since sum(rowSum) == sum(colSum), every remaining sum reaches zero by the end.
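To make the greedy step concrete, here is a small trace sketch of Example 1 (this snippet and the helper name trace_greedy are additions for illustration, not part of the original solution); it fills the matrix and prints the remaining sums after each placement:

def trace_greedy(rowSum, colSum):
    # Work on copies so the caller's lists are not modified.
    rows, cols = rowSum[:], colSum[:]
    result = [[0] * len(cols) for _ in range(len(rows))]
    for i in range(len(rows)):
        for j in range(len(cols)):
            # Place as much as both the row and the column still allow.
            result[i][j] = min(rows[i], cols[j])
            rows[i] -= result[i][j]
            cols[j] -= result[i][j]
            print("placed", result[i][j], "at", (i, j), "remaining rows:", rows, "cols:", cols)
    return result

print(trace_greedy([3, 8], [4, 7]))
# placed 3 at (0, 0) remaining rows: [0, 8] cols: [1, 7]
# placed 0 at (0, 1) remaining rows: [0, 8] cols: [1, 7]
# placed 1 at (1, 0) remaining rows: [0, 7] cols: [0, 7]
# placed 7 at (1, 1) remaining rows: [0, 0] cols: [0, 0]
# [[3, 0], [1, 7]]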
Solution
class Solution(object):
    def restoreMatrix(self, rowSum, colSum):
        """
        :type rowSum: List[int]
        :type colSum: List[int]
        :rtype: List[List[int]]
        """
        M = len(rowSum)
        N = len(colSum)
        result = [[0] * N for _ in range(M)]
        for i in range(M):
            for j in range(N):
                # Greedily place the largest value the remaining row and
                # column sums allow, then deduct it from both.
                result[i][j] = min(rowSum[i], colSum[j])
                rowSum[i] -= result[i][j]
                colSum[j] -= result[i][j]
        return result
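A quick sanity check (this snippet and the check_sums helper are additions for illustration, not from the original post): run the solution on Example 2 and recompute the row and column sums of the returned matrix. The greedy answer differs from the output shown in Example 2, which is fine because any valid matrix is accepted.

def check_sums(matrix, rowSum, colSum):
    # Every row and every column of the matrix must add up as required.
    rows_ok = all(sum(row) == r for row, r in zip(matrix, rowSum))
    cols_ok = all(sum(col) == c for col, c in zip(zip(*matrix), colSum))
    return rows_ok and cols_ok

rowSum, colSum = [5, 7, 10], [8, 6, 8]
matrix = Solution().restoreMatrix(rowSum[:], colSum[:])  # pass copies; restoreMatrix mutates its inputs
print(matrix)                              # [[5, 0, 0], [3, 4, 0], [0, 2, 8]]
print(check_sums(matrix, rowSum, colSum))  # True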
Result
Runtime: faster than the Python online submissions for Find Valid Matrix Given Row and Column Sums.
Memory Usage: 18.2 MB, less than 66.00% of Python online submissions for Find Valid Matrix Given Row and Column Sums.
Original link: leetcode.com/problems/fi…
Your support is my biggest motivation