I have a mathematical question: suppose I rotate an image around its center by 30° using OpenCV with the following commands:
M = cv2.getRotationMatrix2D((cols/2, rows/2), 30, 1)
img_rotate = cv2.warpAffine(img, M, (cols, rows))
If I take the pixel (40, 40) of img_rotate, how can I know which is the corresponding pixel in the original image?
In other words, when I apply the rotation to an image I obtain the transformed image. Is it possible to obtain the mapping between points? For example, the (x, y) point of the new image corresponds to the (x', y') point of the original image.
Just use the affine transform with the inverse matrix.
import cv2
import numpy as np

# The inverse of a pure rotation is a rotation by the opposite angle
# around the same center (this example uses the center (100/2, 300/2)).
M_inv = cv2.getRotationMatrix2D((100/2, 300/2), -30, 1)

# Points in the rotated image, given as (x, y) pairs
points = np.array([[35., 0.],
                   [175., 0.],
                   [105., 200.],
                   [105., 215.],
                   ])

# Append a column of ones to work in homogeneous coordinates
ones = np.ones(shape=(len(points), 1))
points_ones = np.hstack([points, ones])

# Apply the inverse matrix: each row of the result is the
# corresponding point in the original image
transformed_points = M_inv.dot(points_ones.T).T
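As a concrete check for the pixel (40, 40) from the question, the minimal sketch below inverts the forward matrix directly with cv2.invertAffineTransform instead of building a second rotation matrix; the image size used here (rows = 300, cols = 100) is only an assumption for illustration and should be replaced with your own dimensions.

import cv2
import numpy as np

# Assumed image size, for illustration only; substitute your own rows/cols.
rows, cols = 300, 100

# Forward rotation from the question
M = cv2.getRotationMatrix2D((cols/2, rows/2), 30, 1)

# Invert the 2x3 affine matrix directly
M_inv = cv2.invertAffineTransform(M)

# Map the rotated-image pixel (40, 40) back to the original image
x, y = 40, 40
x_orig, y_orig = M_inv.dot(np.array([x, y, 1.0]))
print(x_orig, y_orig)

For a pure rotation with scale 1 both approaches give the same result; inverting the matrix directly is handy when the transform is more general than a simple rotation and the "negate the angle" shortcut no longer applies.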