
I want to build an array that will contain arrays of increasing sizes, based on the current values of another array.

For example, with

 current_array = [100, 33]

and

 limit = n // 10

with n = current_array (thus limit_array = [10, 3]),

I want my output_array to be: [[1,...,10],[1,2,3]]

I wanted to avoid for loops, so I tried to use arange like this:

 output_array = np.arange(current_array, limit_array, 1, dtype=int)

I understand it is not possible to do this, since the first two arguments of arange have to be scalars, but then how would you do it?
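
For reference, each individual arange call only accepts scalar start/stop values, so per element the goal is something like this small sketch; what is missing is a way to do it for all elements at once:

 np.arange(1, 11)   # array([ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10])
 np.arange(1, 4)    # array([1, 2, 3])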

subsidiary questions:

1) I am not sure NumPy can handle arrays of different sizes. If it cannot, I can do this with an array.array of arrays (https://docs.python.org/3/library/array.html), but I then have to do an array multiplication on this array of arrays.
So will this be slower than array.arrays processed together? Or should I definitely find another solution?

2) As I said, I have a third np.array, [1,2,3], that I have to multiply the previous one by.
Will I obtain something like
[[[1,...,10]*1,[1,2,3]],[[1,...,10]*2,[2,4,6]],[[1,...,10]*3,[3,6,9]]] ?

Edit: I also came up with

 result_array = np.array([])
 result_array = np.append(np.arange(current_array, currentlimit, 1) for currentlimit in limit_array)

but I am not sure it can work.
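
For what it's worth, a minimal sketch of what an append-based version would have to look like just to run (following the example above; note that np.append flattens its inputs, so this produces one long 1-D array, not the nested structure I am after):

 import numpy as np

 limit_array = [10, 3]
 result_flat = np.array([], dtype=int)
 for current_limit in limit_array:
     # np.append returns a new, flattened array on every call
     result_flat = np.append(result_flat, np.arange(1, current_limit + 1))
 # result_flat -> array([ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10,  1,  2,  3])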

4 Comments
  • Why not a list of arrays? And stay away from np.append. Commented Sep 15, 2015 at 13:01
  • I want to use an array of arrays so that I can perform array operations on the nested arrays, which would be impossible with a list of arrays. Besides, why is np.append taboo? Commented Sep 15, 2015 at 16:09
  • But only a subset of array operations work on nested arrays, and the ones that do work might not be any faster than list comprehensions. np.append is just a frontend for np.concatenate and is often misused by new users; see the short example after these comments. Commented Sep 15, 2015 at 16:39
  • This is something to test, indeed, but I would like to stay with a uniform collection type. Thanks for the pointer to np.concatenate. Commented Sep 15, 2015 at 17:06
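
To make the np.append / np.concatenate remark above concrete, a short illustrative sketch (the two arrays are arbitrary examples):

 import numpy as np

 a = np.arange(1, 4)   # array([1, 2, 3])
 b = np.arange(1, 6)   # array([1, 2, 3, 4, 5])

 # np.append(a, b) boils down to np.concatenate on the flattened inputs
 np.append(a, b)          # array([1, 2, 3, 1, 2, 3, 4, 5])
 np.concatenate([a, b])   # array([1, 2, 3, 1, 2, 3, 4, 5])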

1 Answer


I do not think you can avoid for loops as such, but you can use a list comprehension with np.arange(), which should be a bit faster than a plain for loop. Example -

np.array([np.arange(1, x+1) for x in limit_array])

Demo -

In [34]: import numpy as np

In [35]: ca = np.array([100,33])

In [39]: na = ca // 10

In [40]: na
Out[40]: array([10,  3], dtype=int32)

In [47]: result_array = np.array([np.arange(1, x+1) for x in na])

In [48]: result_array
Out[48]: array([array([ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10]), array([1, 2, 3])], dtype=object)
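
As a side note: on newer NumPy releases (roughly 1.24 and later), building a ragged array like this without an explicit dtype raises an error rather than just warning, so the object dtype has to be spelled out:

np.array([np.arange(1, x+1) for x in na], dtype=object)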

You can use a list comprehension for your second use case as well -

In [55]: new_arr = np.array([1,2,3])

In [56]: new_result_array = np.array([result_array * x for x in new_arr])

In [57]:

In [57]: new_result_array
Out[57]:
array([[array([ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10]), array([1, 2, 3])],
       [array([ 2,  4,  6,  8, 10, 12, 14, 16, 18, 20]), array([2, 4, 6])],
       [array([ 3,  6,  9, 12, 15, 18, 21, 24, 27, 30]), array([3, 6, 9])]], dtype=object)

2 Comments

Ok; that makes sense. In my edit I came up with a similar idea using the append function, but yours seems simpler. Anyway, I am not sure how to proceed if I wanted to remove some elements from the nested arrays at generation time, but I will work from this. Thanks
I think that some of these answers stackoverflow.com/questions/4151128/… could also help people. But there is one drawback to the method we came up with: we can't do array multiplication element-wise.
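
For the element-wise multiplication drawback mentioned in the last comment, one possible workaround is to pair each nested array with its own factor explicitly; a small sketch (the scale values are hypothetical, and result_array is the object array built in the answer):

scales = np.array([2, 5])   # one hypothetical factor per nested array
scaled = np.array([arr * s for arr, s in zip(result_array, scales)], dtype=object)
# -> [array([ 2,  4,  6,  8, 10, 12, 14, 16, 18, 20]), array([ 5, 10, 15])]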
