
I now have the updated code as follows:

    # Hyperparameters
    random_seed = 123
    learning_rate = 0.01
    num_epochs = 10
    batch_size = 128

    device = torch.device("cuda:1" if torch.cuda.is_available() else "cpu")

    for epoch in range(num_epochs):
        model = resnet34.train()
        for batch_idx, (features, targets) in enumerate(train_generator):

            features = features.to(device)
            targets = targets.to(device)

            ### FORWARD AND BACK PROP
            logits = model(features)
            cost = torch.nn.functional.cross_entropy(logits, targets)
            optimizer.zero_grad()

            cost.backward()

            ### UPDATE MODEL PARAMETERS
            optimizer.step()

            ### LOGGING
            if not batch_idx % 50:
                print('Epoch: %03d/%03d | Batch %03d/%03d | Cost: %.4f'
                      % (epoch+1, num_epochs, batch_idx,
                         len(datagen)//batch_size, cost))

        model = model.eval()  # eval mode to prevent updating batchnorm params during inference
        with torch.set_grad_enabled(False):  # save memory during inference
            print('Epoch: %03d/%03d training accuracy: %.2f%%' % (
                  epoch+1, num_epochs,
                  compute_accuracy(model, train_generator)))

The code runs fine when I have only one image, but when I add another image or more, I get the following error:

    features = features.to(device)
    targets = targets.to(device)
    AttributeError: 'numpy.ndarray' object has no attribute 'to'
  • Can you show how train_generator was defined? Commented Jun 9, 2022 at 12:32
  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. Commented Jun 9, 2022 at 15:18
  • Obviously it's a numpy array, not a PyTorch tensor, but you're treating it as one by calling .to(device) on it, which numpy arrays don't support. Commented Jun 9, 2022 at 20:39

1 Answer


It would be nice to see your train_generator code for clarity, but it does not seem to be a torch DataLoader. In this case, you should probably convert your arrays to tensors manually. There are several ways to do so:

  • torch.from_numpy(numpy_array) - for numpy arrays;
  • torch.as_tensor(list) - for common lists and tuples;
  • torch.tensor(array) should also work but the above ways will avoid copying the data when possible.
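For instance, assuming train_generator yields (features, targets) as numpy arrays (an assumption, since its definition isn't shown), the conversion could go right at the top of the batch loop. Here is a minimal sketch with a hypothetical stand-in generator (make_train_generator is made up for illustration):

    import numpy as np
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Hypothetical stand-in for the question's train_generator:
    # yields batches of (features, targets) as numpy arrays.
    def make_train_generator(num_batches=3, batch_size=8):
        for _ in range(num_batches):
            features = np.random.rand(batch_size, 3, 224, 224).astype(np.float32)
            targets = np.random.randint(0, 10, size=(batch_size,))
            yield features, targets

    for batch_idx, (features, targets) in enumerate(make_train_generator()):
        # Convert the numpy arrays to tensors *before* calling .to(device)
        features = torch.from_numpy(features).to(device)        # float32 features
        targets = torch.from_numpy(targets).long().to(device)   # int64 class labels
        # ... forward pass, loss, backward pass, optimizer step as in the question ...

If the data already lives in numpy arrays, another option is to convert once up front and wrap the tensors in a torch.utils.data.TensorDataset served by a DataLoader, so every batch already comes out as tensors and the training loop in the question works unchanged.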