Lua Torch
Manzil Zaheer
manzil at cmu.edu
Mon Mar 26 21:02:18 EDT 2018
Thanks for the detailed analysis, but I am using PyTorch; I have not tried Lua Torch. Can you please check? Thanks again!
Sent from my Samsung Galaxy smartphone.
-------- Original message --------
From: Predrag Punosevac <predragp at andrew.cmu.edu>
Date: 3/26/18 9:00 PM (GMT-05:00)
To: Manzil Zaheer <manzil at cmu.edu>
Cc: Barnabas Poczos <bapoczos at andrew.cmu.edu>, users at autonlab.org
Subject: Re: Lua Torch
Manzil Zaheer <manzil at cmu.edu> wrote:
> Hi Predrag,
>
> I am not able to use any GPUs on gpu5, 6, 7, or 9. I tried all three versions of CUDA, but I get the following error:
>
I was able to build it after adding this:
export TORCH_NVCC_FLAGS="-D__CUDA_NO_HALF_OPERATORS__"
per
https://github.com/torch/torch7/issues/1086
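For the record, the rebuild went roughly like the sketch below. The checkout path and the use of the torch/distro `clean.sh`/`install.sh` scripts are assumptions about the local setup; adjust to wherever the Torch tree actually lives.

```shell
# Disable half-precision operators so nvcc stops choking on them
# (workaround from torch/torch7 issue 1086).
export TORCH_NVCC_FLAGS="-D__CUDA_NO_HALF_OPERATORS__"

# TORCH_DIR is an assumed checkout of https://github.com/torch/distro;
# skip the rebuild entirely if it is not present.
TORCH_DIR="$HOME/torch"
if [ -d "$TORCH_DIR" ]; then
    cd "$TORCH_DIR"
    ./clean.sh      # wipe previous build artifacts
    ./install.sh    # rebuild torch and its CUDA packages with the flag set
fi
```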
When I try to run it I get errors about missing Lua packages (probably
due to my path variables). I have a vague recollection that Simon and I
helped you with this once in the past. IIRC it was very picky about the
versions of some Lua packages and required its own versions, not the
ones that come with yum.
Anyhow, I am forwarding this to users at autonlab in the hope that
somebody is using it and might be of more help. Please stop by NSH 3119
and let us try to debug this.
Predrag
> THCudaCheck FAIL file=/pytorch/torch/lib/THC/THCGeneral.c line=70 error=30 : unknown error
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> File "/zfsauton/home/manzilz/local/lib/python3.6/site-packages/torch/cuda/__init__.py", line 384, in _lazy_new
> _lazy_init()
> File "/zfsauton/home/manzilz/local/lib/python3.6/site-packages/torch/cuda/__init__.py", line 142, in _lazy_init
> torch._C._cuda_init()
> RuntimeError: cuda runtime error (30) : unknown error at /pytorch/torch/lib/THC/THCGeneral.c:70
>
> Can you kindly look into it?
>
> Thanks,
> Manzil
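For anyone hitting this in the archives: the "(30)" in the traceback is the numeric value of the pre-CUDA-10 `cudaError_t` enum, where 30 is `cudaErrorUnknown` (frequently a driver or device-node problem rather than a code bug). A small sketch decoding a few of the legacy codes; the selection of codes is illustrative, not exhaustive.

```python
# Rough decoder for a few legacy (pre-CUDA 10) runtime error codes,
# to make logs like "cuda runtime error (30)" easier to read.
# Names follow the cudaError_t enum in older CUDA toolkits.
LEGACY_CUDA_ERRORS = {
    2: "cudaErrorMemoryAllocation",
    30: "cudaErrorUnknown",            # often a driver/device-node issue
    35: "cudaErrorInsufficientDriver", # driver older than the runtime
    38: "cudaErrorNoDevice",           # no CUDA-capable device visible
}

def describe(code):
    """Map a numeric CUDA runtime error code to its enum name."""
    return LEGACY_CUDA_ERRORS.get(code, "unrecognized code %d" % code)

print(describe(30))  # cudaErrorUnknown
```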