<div dir="ltr"><span id="docs-internal-guid-b2a10cbe-9a92-4d2f-f5d9-c1794e7ba070"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">NIPS WORKSHOP 2015 CALL FOR ABSTRACTS</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent"><span class="Apple-tab-span" style="white-space:pre">   </span></span><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Statistical Methods for Understanding Neural Systems</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent"><span class="Apple-tab-span" style="white-space:pre">     </span></span><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Friday, December 11th, 2015</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent"><span class="Apple-tab-span" style="white-space:pre">      </span></span><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Montreal, Canada</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span 
style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">--------</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Organizers: Allie Fletcher, Jakob Macke, Ryan Adams, Jascha Sohl-Dickstein</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">--------</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Overview:</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap">Recent advances in neural recording technologies, including calcium imaging and high-density electrode arrays, have made it possible to simultaneously record neural activity from large populations of neurons for extended periods of time. These developments promise unprecedented insights into the collective dynamics of neural populations and thereby the underpinnings of brain-like computation. However, this new large-scale regime for neural data brings significant methodological challenges. This workshop seeks to explore the statistical methods and theoretical tools that will be necessary to study these data, build new models of neural dynamics, and increase our understanding of the underlying computation. 
We have invited researchers across a range of disciplines in statistics, applied physics, machine learning, and both theoretical and experimental neuroscience, with the goal of fostering interdisciplinary insights. We hope that active discussions among these groups can set in motion new collaborations and facilitate future breakthroughs on fundamental research problems.</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap">Call for Papers</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap">We invite high quality submissions of extended abstracts on topics including, but not limited to, the following fundamental questions:</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-style:italic;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">How can we deal with incomplete data in a principled manner?</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">In most experimental settings, even advanced neural recording methods can only sample a small fraction of all neurons that might be involved in a task, and the observations are often indirect and noisy. As a result, many recordings are from neurons that receive inputs from neurons that are not themselves directly observed, at least not over the same time period. How can we deal with this ‘missing data’ problem in a principled manner? 
How does this sparsity of recordings influence what we can and cannot infer about neural dynamics and mechanisms?</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-style:italic;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">How can we incorporate existing models of neural dynamics into neural data analysis?</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Theoretical neuroscientists have intensely studied neural population dynamics for decades, resulting in a plethora of models. However, most analysis methods for neural data do not directly incorporate any models of neural dynamics, but rather build on generic methods for dimensionality reduction or time-series modelling. How can we incorporate existing models of neural dynamics? Conversely, how can we design neural data analysis methods such that they explicitly constrain models of neural dynamics?</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-style:italic;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">What synergies are there between analyzing biological and artificial neural systems?</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">The rise of ‘deep learning’ methods has shown that hard computational problems can be solved by machine learning algorithms that are built by cascading many nonlinear units. 
Although artificial neural systems are fully observable, it has proven challenging to provide a theoretical understanding of how they solve computational problems and which features of a neural network are critical for its performance. While such ‘deep networks’ differ from biological neural networks in many ways, they provide an interesting testing ground for evaluating strategies for understanding neural processing systems. Are there synergies between methods for analyzing biological and artificial neural systems? Has the resurgence of deep learning resulted in new hypotheses or strategies for understanding biological neural networks?</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Confirmed Speakers</span><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">:</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Matthias Bethge</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Mitya Chklovskii</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">John Cunningham</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span
style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Surya Ganguli</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Neil Lawrence</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Guillermo Sapiro</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Tatyana Sharpee</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Richard Zemel</span></p><br><p dir="ltr" style="line-height:1.656;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Workshop Website: </span><a href="https://users.soe.ucsc.edu/%7Eafletcher/neuralsysnips.html" style="text-decoration:none"><span style="font-size:13.3333px;font-family:Arial;color:rgb(17,85,204);font-weight:700;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">https://users.soe.ucsc.edu/~afletcher/neuralsysnips.html</span></a></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Email: </span><span style="font-size:13.3333px;font-family:Arial;color:rgb(56,106,42);font-weight:700;text-decoration:underline;vertical-align:baseline;white-space:pre-wrap;background-color:transparent"><a href="mailto:smnips2015@rctn.org">smnips2015@rctn.org</a></span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Submission details</span><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">:</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Submissions should be in the NIPS 2015 format (</span><a href="http://nips.cc/Conferences/2015/PaperInformation/StyleFiles" style="text-decoration:none"><span style="font-size:13.3333px;font-family:Arial;color:rgb(17,85,204);text-decoration:underline;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">http://nips.cc/Conferences/2015/PaperInformation/StyleFiles</span></a><span
style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent">) with a maximum of four pages, not including references. Submissions will be considered for both poster and oral presentation. Submit at </span><a href="https://cmt.research.microsoft.com/SMN2015/Protected/Author/" style="text-decoration:none"><span style="font-size:13.3333px;font-family:Arial;color:rgb(17,85,204);text-decoration:underline;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">https://cmt.research.microsoft.com/SMN2015/Protected/Author/</span></a><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent"> or via the link on the workshop website.</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);font-weight:700;vertical-align:baseline;white-space:pre-wrap;background-color:transparent">Important dates:</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent"> Submission deadline: 10 October, 2015 11:59 PM PDT (UTC -7 hours)</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:13.3333px;font-family:Arial;color:rgb(0,0,0);vertical-align:baseline;white-space:pre-wrap;background-color:transparent"> Acceptance notification: 24 October, 2015</span></p><br></span></div>