Working with torch in autonlab

Hanqi Sun hanqis at andrew.cmu.edu
Wed Oct 4 15:59:37 EDT 2017


Hi Eti,

I installed and used Lua Torch last year. You can simply follow the
instructions at http://torch.ch/docs/getting-started.html and skip the
"bash install-deps" command, since we already have all the dependencies
on the GPU machines.
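
If it helps, the steps on that page (minus install-deps) were roughly the
following; treat this as a sketch from memory rather than the exact commands
I ran, and adjust the install directory to taste:

  # clone the Torch distro (I am assuming ~/torch as the install location)
  git clone https://github.com/torch/distro.git ~/torch --recursive
  cd ~/torch
  # skip "bash install-deps", since the dependencies are already on the GPU machines
  ./install.sh
  # reload your shell config so the "th" command is on your PATH
  source ~/.bashrc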

It worked for me last year but I am not sure whether it still works now.
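
If you actually mean PyTorch rather than Lua Torch, Chirag's check below is
the right one; a slightly fuller version (assuming PyTorch is installed for
python2.7 on those machines) would be something like:

  python2.7 -c "import torch; print torch.__version__; print torch.cuda.is_available(); print torch.cuda.device_count()"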

Best,
Hanqi

On Wed, Oct 4, 2017 at 3:51 PM, Chirag Nagpal <chiragn at cs.cmu.edu> wrote:

> Try this command on the GPU
>
> $ python2.7 -c "import torch; print torch.cuda.is_available()"
>
> This is assuming you are talking about PyTorch and not Lua Torch.
>
> Chirag
>
>
> On Wed, Oct 4, 2017 at 3:44 PM, Eti Rastogi <erastogi at andrew.cmu.edu>
> wrote:
>
>> Hello all
>>
>> I am new to the Auton Lab. I wanted to know whether anyone is using Torch
>> on the autonlab GPU machines. I can see that Torch is not supported on the
>> Red Hat distribution, so I was wondering whether anybody has found a
>> workaround for this?
>>
>> Regards
>> Eti Rastogi
>>
>
>
>
> --
>
> Chirag Nagpal
> Graduate Student, Language Technologies Institute
> School of Computer Science
> Carnegie Mellon University
> cs.cmu.edu/~chiragn
>

