Working with torch in autonlab

Chirag Nagpal chiragn at cs.cmu.edu
Wed Oct 4 15:51:22 EDT 2017


Try this command on one of the GPU machines:

$ python2.7 -c "import torch; print torch.cuda.is_available()"

This assumes you mean PyTorch, not Lua Torch.
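A slightly more defensive version of the same check, sketched here as a small script rather than a one-liner: it reports gracefully when PyTorch is missing instead of crashing with an ImportError (and uses the Python 3 print function, so it works under both python2.7 and python3):

```python
# Check whether PyTorch is installed and whether it can see a CUDA GPU.
# Assumes nothing about the environment; torch may be absent entirely.
try:
    import torch
    cuda_ok = torch.cuda.is_available()
    print("PyTorch found; CUDA available:", cuda_ok)
except ImportError:
    cuda_ok = None
    print("PyTorch is not installed in this environment")
```

Note that torch.cuda.is_available() can return False even on a GPU machine if the installed wheel is CPU-only or the driver is too old, so a False result does not by itself mean the hardware is missing.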

Chirag


On Wed, Oct 4, 2017 at 3:44 PM, Eti Rastogi <erastogi at andrew.cmu.edu> wrote:

> Hello all
>
> I am new to the Auton Lab. I wanted to know whether anyone is using Torch on
> the autonlab GPU machines. I can see that Torch is not supported on the RedHat
> distribution, so I was wondering if anybody has found a workaround for
> this?
>
> Regards
> Eti Rastogi
>



-- 

*Chirag Nagpal* Graduate Student, Language Technologies Institute
School of Computer Science
Carnegie Mellon University
cs.cmu.edu/~chiragn

