
Is it necessary to convert tensors and models to CUDA with tensor.to in colab when I've chosen runtime type as GPU?

I want to use CUDA for training my model.

1 Answer

  • tensor.to(device) transfers data to the given device.
  • Yes — choosing the GPU runtime only makes a GPU available; it does not move anything automatically. You still need to transfer the model, inputs, labels, etc. to whichever device you intend to use.
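A minimal sketch of the usual device-agnostic pattern (the model, shapes, and variable names here are illustrative, not from the question):

```python
import torch
import torch.nn as nn

# Use the GPU if the runtime actually exposes one, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(10, 2).to(device)        # moves the module's parameters
inputs = torch.randn(4, 10).to(device)     # returns a tensor on the device
labels = torch.randint(0, 2, (4,)).to(device)

outputs = model(inputs)                    # all operands now live on `device`
loss = nn.functional.cross_entropy(outputs, labels)
```

If any of the model, inputs, or labels is left on a different device, the forward pass raises a runtime error about tensors being on different devices — which is the symptom you see when you forget a `.to(device)` call.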

1 Comment

Do note: torch.Tensor.to makes a copy while torch.nn.Module.to is an in-place operation.
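A small sketch illustrating the comment's point: converting a tensor with `.to` returns a new tensor and leaves the original untouched, while `.to` on a module converts its parameters and returns the same module object (the dtype conversion here stands in for a device move, so it also runs without a GPU):

```python
import torch
import torch.nn as nn

t = torch.zeros(3)
t2 = t.to(torch.float64)   # new tensor; `t` keeps its original dtype

m = nn.Linear(3, 1)
m2 = m.to(torch.float64)   # converts parameters in place; `m2 is m`
```

This is why `x = x.to(device)` is required for tensors (the reassignment matters), whereas `model.to(device)` works even without capturing the return value.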
