I am using Image.point and Image.fromarray to do exactly the same operation on an image: increase the value of all pixels by the same amount. The thing is that I get two absolutely different images.
Using point:

import math

def getValue(val):
    # 50 -> floor(255 * 0.5) = 127
    return math.floor(255 * float(val) / 100)

def func(i):
    return int(i + getValue(50))

out = img.point(func)
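If I understand point correctly, it builds a lookup table by calling func once for every possible value 0-255, so the brighter entries map past 255. I am assuming the probe below mirrors what PIL builds internally; I have not checked what it then does with out-of-range entries:

lut = [func(i) for i in range(256)]   # what point presumably builds for each band
print(max(lut))                       # 382, well outside the 0-255 range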
Using an array and numpy:

import math
import numpy as np
from PIL import Image

arr = np.asarray(img).astype('float')
value = math.floor(255 * float(50) / 100)   # 127, the same offset as above
arr[..., 0] += value
arr[..., 1] += value
arr[..., 2] += value
out = Image.fromarray(arr.astype('uint8'), 'RGB')
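As a sanity check on the array version, the float values clearly exceed 255 before the cast back to uint8 (arr here is the float array from the snippet above):

# any channel that started at 129 or more is now above 255
print(arr.min(), arr.max())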
I am using the same image (a jpg).
The initial image:

The image with point:

The image with arrays:

How can they be so different?
uint8... what do you want those values to become in the image?
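If that hint is pointing at overflow: as far as I can tell, casting an out-of-range float to uint8 in numpy wraps around instead of clipping, which would explain bright areas coming back dark. A tiny check (the exact overflow behaviour may be platform-dependent):

import numpy as np

a = np.array([100.0, 200.0]) + 127   # [227.0, 327.0]
print(a.astype('uint8'))             # 327 does not fit in uint8 and wraps rather than clipping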