Connectionists: Program for SNL 2017 (Nagoya) July 7-8

David McAllester mcallester at ttic.edu
Fri May 19 12:28:41 EDT 2017


The program is now available for the conference on Symbolic-Neural Learning
<http://www.ttic.edu/SNL2017/> (SNL), to be held in Nagoya, Japan, on July 7-8.

Registration deadline: Friday, June 9, 2017

Program:

First International Workshop on Symbolic-Neural Learning (July 7-8)
Nagoya Congress Center, Japan

(Friday, July 7, 2017)

13:00-13:10 Opening
13:10-14:10 Keynote talk I (invited): Yoshua Bengio
14:10-14:30 Coffee break
14:30-15:30 Keynote talk II: Masashi Sugiyama
15:30-15:40 Break
15:40-17:20 Session 1: Computer Vision
Weihua Hu, Takeru Miyato, Seiya Tokui, Eiichi Matsumoto and Masashi
Sugiyama, Learning Discrete Representations via Information Maximizing
Self-Augmented Training
Ruotian Luo and Gregory Shakhnarovich, Comprehension-guided referring
expressions
Tristan Hascoet, Yasuo Ariki and Tetsuya Takiguchi, Semantic Web and
Zero-Shot Learning of Large Scale Visual Classes
Bowen Shi, Taehwan Kim, Jonathan Keane, Weiran Wang, Hao Tang, Gregory
Shakhnarovich, Diane Brentari and Karen Livescu, Neural Models for
Fingerspelling Recognition from Video

18:00-20:00 Banquet

(Saturday, July 8, 2017)

9:00-10:00 Keynote talk III (invited): William Cohen
10:00-10:20 Coffee break
10:20-12:00 Session 2: Speech & Language
Andrea F. Daniele, Mohit Bansal and Matthew Walter, Navigational
Instruction Generation as Inverse Reinforcement Learning with Neural
Machine Translation
Kurt Junshean Espinosa, Riza Batista-Navarro and Sophia Ananiadou, 5Ws:
What Went Wrong with Word Embeddings
Shubham Toshniwal, Hao Tang, Liang Lu and Karen Livescu, Multitask Learning
with Low-Level Auxiliary Tasks for Encoder-Decoder Based Speech Recognition
John Wieting, Mohit Bansal, Kevin Gimpel and Karen Livescu, Neural
Architectures for Modeling Compositionality in Natural Language

12:00-13:30 Lunch break
13:30-14:30 Keynote talk IV: Jun-ichi Tsujii
14:30-15:30 Poster Session
Matthew Holland and Kazushi Ikeda, Variance reduction using biased gradient
estimates
Zhenghang Cui, Issei Sato and Masashi Sugiyama, Stochastic Divergence
Minimization for Biterm Topic Model
Yin Jun Phua, Sophie Tourret and Katsumi Inoue, Learning Logic Program
Representation from Delayed Interpretation Transition Using Recurrent
Neural Networks
Haiping Huang, Alireza Goudarzi and Taro Toyoizumi, Combining DropConnect
and Feedback Alignment for Efficient Regularization in Deep Networks
Jen-Tzung Chien and Ching-Wei Huang, Variational and Adversarial Domain
Adaptation
Miki Ueno, Comic Book Interpretation based on Deep Neural Networks
Nicholas Altieri, Sherdil Niyaz, Samee Ibraheem and John DeNero, Improved
Word and Symbol Embedding for Part-of-Speech Tagging
Marc Evrard, Makoto Miwa and Yutaka Sasaki, TTI's Approaches to
Symbolic-Neural Learning

15:30-17:35 Session 3: New Learning Approaches
Tomoya Sakai, Marthinus Christoffel Du Plessis, Gang Niu and Masashi
Sugiyama, Semi-Supervised Classification based on Positive-Unlabeled
Classification
Han Bao, Masashi Sugiyama and Issei Sato, Multiple Instance Learning from
Positive and Unlabeled Data
Tetsuya Hada, Akifumi Okuno and Hidetoshi Shimodaira, Deep Multi-view
Representation Learning Based on Adaptive Weighted Similarity
Makoto Yamada, Koh Takeuchi, Tomoharu Iwata, John Shawe-Taylor and Samuel
Kaski, Localized Lasso for High-Dimensional Regression
Tim Rocktäschel and Sebastian Riedel, End-to-end Differentiable Proving
17:40-17:50 Closing