
I need an OpenCV function in Python that crops an image to given dimensions.

I am using cv2.UMat() to load images onto the GPU and process them there, so the index-slicing approach below won't work on UMat objects:

image[CROP_DIMS[0]:-CROP_DIMS[1], CROP_DIMS[2]:-CROP_DIMS[3]]

To take full advantage of the GPU, I need an OpenCV function that crops the image directly on the GPU; otherwise I have to download the image from the GPU, crop it, and upload it again once cropped.

Versions: Python 3.6.4, OpenCV 3.4.1

I have already searched Stack Overflow and Google for this but didn't find a suitable answer. Everyone points either to index slicing or to functions from older OpenCV versions.
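For reference, this is the expensive round trip I am trying to avoid. A minimal sketch, assuming CROP_DIMS holds the (top, bottom, left, right) margins to trim:

import cv2

CROP_DIMS = (10, 10, 20, 20)  # example margins: top, bottom, left, right

gpu_img = cv2.UMat(cv2.imread("input.png"))  # upload to the GPU

# The round trip: download to host memory, slice with NumPy, re-upload
cpu_img = gpu_img.get()  # GPU -> host copy
cropped = cpu_img[CROP_DIMS[0]:-CROP_DIMS[1], CROP_DIMS[2]:-CROP_DIMS[3]]
gpu_cropped = cv2.UMat(cropped)  # host -> GPU copy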

1 Answer

Try this:

import cv2

# Load the image as a NumPy array (CPU-side)
img = cv2.imread("input.png")

# (x, y) is the top-left corner of the crop; w and h are its width and height
x, y, w, h = 100, 50, 200, 150  # example values
crop_img = img[y:y+h, x:x+w]  # NumPy slicing: rows first, then columns

cv2.imshow("cropped", crop_img)
cv2.waitKey(0)


Comments

I am already using this approach, but it means I have to retrieve the image from the GPU, crop it, and load it back onto the GPU. That is far too expensive, so I am looking for an OpenCV function that does the same thing.
Can't you do that on the GPU side itself?
Of course not! A cv2.UMat object doesn't allow index slicing.
How would I know that? You hadn't mentioned anything relevant to that. This answer goes along with what you asked. Update your question so that we can understand what you really want.
Updated. But I simply asked for an OpenCV function to crop an image. It isn't different from what I asked: it's the same operation I mentioned as not working, and I'm looking for an OpenCV alternative to it.
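Note: OpenCV's Python bindings expose a cv2.UMat constructor that accepts row and column ranges, which may allow creating a GPU-side ROI without the download/upload round trip. A minimal sketch, assuming this constructor form is available in your OpenCV build (the crop rectangle here is a made-up example):

import cv2

gpu_img = cv2.UMat(cv2.imread("input.png"))

# Assumed form: cv2.UMat(src, [rowStart, rowEnd], [colStart, colEnd]),
# selecting rows y..y+h and columns x..x+w without leaving the GPU
x, y, w, h = 20, 10, 200, 150  # example crop rectangle
gpu_crop = cv2.UMat(gpu_img, [y, y + h], [x, x + w])

cropped = gpu_crop.get()  # download only if a host copy is actually needed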