Program to count minimum number of operations to flip columns to make target in Python
Suppose we have a binary matrix M and a target matrix T with the same number of rows and columns. In one operation we can flip a column of M, turning every 1 into a 0 and every 0 into a 1. Rows of M may be reordered for free. We need to find the minimum number of operations required to turn M into T; if there is no solution, return -1.
Problem Example
Given matrices M and T:
Matrix M:
| 0 | 0 |
| 1 | 0 |
| 1 | 1 |
Target T:
| 0 | 1 |
| 1 | 0 |
| 1 | 1 |
The output will be 1. First, we reorder the rows of M to:
| 0 | 0 |
| 1 | 1 |
| 1 | 0 |
Then flip the second column (index 1) to get the target matrix.
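The example above can be checked directly. This small sketch hard-codes the reordered rows of M and verifies that flipping the second column produces T:

```python
# Rows of M after the free reordering described above
M_reordered = [[0, 0], [1, 1], [1, 0]]
T = [[0, 1], [1, 0], [1, 1]]

# Flip the second column: every bit in index 1 is inverted
flipped = [[row[0], 1 - row[1]] for row in M_reordered]
assert flipped == T  # a single column flip suffices
```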
Algorithm Approach
To solve this problem, we follow these steps:
- Convert each row to a binary number for easier comparison
- For each possible row mapping, calculate the XOR pattern needed
- Count the number of column flips required (number of 1s in XOR pattern)
- Return the minimum operations needed
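The first step can be sketched with a small helper (the name `row_to_int` is illustrative, not part of the final solution; the implementation below reads rows right to left via `pop()`, which is equally valid as long as both matrices use the same bit order):

```python
def row_to_int(row):
    # Interpret the row left to right as the bits of an integer.
    # Any consistent bit order works, provided it is applied to
    # both the matrix and the target.
    value = 0
    for bit in row:
        value = (value << 1) | bit
    return value

print(row_to_int([1, 0, 1]))  # → 5
```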
Implementation
from collections import Counter

class Solution:
    def solve(self, matrix, target):
        nums1 = []
        nums2 = []
        # Convert matrix rows to binary numbers. pop() reads the row from
        # the right, so the bit order is reversed; this is fine because the
        # same encoding is applied to both matrices.
        for row in matrix:
            row_copy = row.copy()  # Don't modify the original row
            binary_num = 0
            while row_copy:
                binary_num = (binary_num << 1) + row_copy.pop()
            nums1.append(binary_num)
        # Convert target rows the same way
        for row in target:
            row_copy = row.copy()  # Don't modify the original row
            binary_num = 0
            while row_copy:
                binary_num = (binary_num << 1) + row_copy.pop()
            nums2.append(binary_num)
        min_operations = float('inf')
        # Try matching each matrix row against the first target row
        for num in nums1:
            counts = Counter(nums1)
            counts[num] -= 1
            # XOR pattern: the set of columns that must be flipped
            xor_pattern = num ^ nums2[0]
            # Check whether every remaining target row can be matched
            # by some unused matrix row under this pattern
            for i in range(1, len(nums2)):
                needed = xor_pattern ^ nums2[i]
                if not counts[needed]:
                    break
                counts[needed] -= 1
            else:
                # All rows matched: count the column flips this pattern needs
                min_operations = min(min_operations, bin(xor_pattern).count('1'))
        return min_operations if min_operations != float('inf') else -1

# Test the solution
solution = Solution()
M = [
    [0, 0],
    [1, 0],
    [1, 1]
]
T = [
    [0, 1],
    [1, 0],
    [1, 1]
]
print(solution.solve(M, T))
Output
1
How It Works
The algorithm converts each row to a binary number representation. It then tries each matrix row as a match for the first target row and computes the XOR pattern needed to transform one into the other. The XOR pattern tells us which columns need to be flipped: each bit position with value 1 indicates a column that must be flipped.
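The XOR step can be shown in isolation (the values here are the encoded rows from the example above):

```python
a = 0b00          # encoded matrix row [0, 0]
b = 0b10          # encoded target row it is matched to
pattern = a ^ b   # bits that differ are exactly the columns to flip
flips = bin(pattern).count('1')
print(flips)  # → 1
```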
The solution uses a Counter to efficiently check if all required row transformations are possible with the available rows in the matrix.
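As a small illustration of that multiset check, using the encoded rows from the example (matrix rows 0, 1, 3; remaining target rows 1 and 3 under an XOR pattern of 2):

```python
from collections import Counter

counts = Counter([0, 1, 3])  # encoded matrix rows
counts[0] -= 1               # row 0 already matched to the first target row
for target_row in [1, 3]:    # remaining encoded target rows
    needed = 2 ^ target_row  # the matrix row this target row requires
    assert counts[needed] > 0, "no matching row left: this mapping fails"
    counts[needed] -= 1      # consume the matched row
```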
Conclusion
This solution efficiently finds the minimum number of column flips by encoding rows as binary numbers and using XOR operations to determine flip patterns. The time complexity is O(n²), where n is the number of rows, plus O(n·c) to encode the rows, where c is the number of columns.
