<div dir="ltr"><b>TL;DR – <span class="gmail-il" style="margin:0px;padding:0px;border:0px">CFP</span></b><span style="margin:0px;padding:0px;border:0px">: Neural Networks journal </span><span class="gmail-il" style="margin:0px;padding:0px;border:0px">special</span><span style="margin:0px;padding:0px;border:0px"> </span><span class="gmail-il" style="margin:0px;padding:0px;border:0px">issue</span><span style="margin:0px;padding:0px;border:0px"> on </span><span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span><br><b>GUEST EDITORS</b><span style="margin:0px;padding:0px;border:0px">: </span><font size="2" style="margin:0px;padding:0px;border:0px">Ariel Ruiz-Garcia, Jürgen Schmidhuber, Vasile Palade, Clive Cheong Took, Danilo Mandic</font><br><b>SUBMISSION DEADLINE</b><span style="margin:0px;padding:0px;border:0px">: 30 November 2019 [final]</span><br><span style="margin:0px;padding:0px;border:0px">------------------------------</span><span style="margin:0px;padding:0px;border:0px">----</span><br><div id="gmail-m_1760598177216006221gmail-m_6482979249326675986gmail-m_-4593170810047768539m_-978178437002378006:1cv" style="margin:0px;padding:0px;border:0px"><br><b>CALL FOR PAPERS [final reminder]</b>: Neural Networks Journal<br><br>  <font size="4" style="margin:0px;padding:0px;border:0px"><span style="margin:0px;padding:0px;border:0px;color:rgb(7,55,99)"><b><span class="gmail-il" style="margin:0px;padding:0px;border:0px">Special</span> <span class="gmail-il" style="margin:0px;padding:0px;border:0px">Issue</span> on Deep Neural Network Representation and Generative Adversarial Learning</b></span></font><br><br>Generative Adversarial Networks (<span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span>) have proven to be effective systems for data generation. 
Their success is achieved by exploiting a minimax learning concept, which had already proved to be an effective paradigm in earlier works, such as predictability minimization, in which two networks compete with each other during the learning process. One of the main advantages of <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> over other deep learning methods is their ability to generate new data from noise and to imitate virtually any data distribution. However, generating realistic data using <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> remains a challenge, particularly when specific features are required; for example, constraining the latent aggregate distribution space does not guarantee that the generator will produce an image with a specific attribute. On the other hand, new advancements in deep representation learning (RL) can help improve the learning process in Generative Adversarial Learning (GAL). For instance, RL can help address <span class="gmail-il" style="margin:0px;padding:0px;border:0px">issues</span> such as dataset bias and network co-adaptation, and identify the set of features best suited to a given task.<br><br>Despite their obvious advantages and their application to a wide range of domains, <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> have yet to overcome several challenges. They often fail to converge and are very sensitive to parameter and hyper-parameter initialization. Simultaneous learning of a generator and a discriminator network often results in overfitting. Moreover, the generator model is prone to mode collapse, which results in a failure to generate diverse data. 
Accordingly, new theoretical methods in deep RL and GAL are required to improve the learning process and generalization performance of <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span>, as well as to yield new insights into how <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> learn data distributions.<br><br>This <span class="gmail-il" style="margin:0px;padding:0px;border:0px">special</span> <span class="gmail-il" style="margin:0px;padding:0px;border:0px">issue</span> on Deep Neural Network Representation and Generative Adversarial Learning invites researchers and practitioners to present novel contributions addressing theoretical and practical aspects of deep representation and generative adversarial learning. The <span class="gmail-il" style="margin:0px;padding:0px;border:0px">special</span> <span class="gmail-il" style="margin:0px;padding:0px;border:0px">issue</span> will feature a collection of high-quality theoretical articles on improving the learning process and generalization of generative neural networks. State-of-the-art applications based on deep generative adversarial networks are also very welcome. 
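The minimax concept behind GANs pits a generator (minimising) against a discriminator (maximising) over the objective E<sub>x</sub>[log D(x)] + E<sub>z</sub>[log(1 − D(G(z)))]. The convergence sensitivity mentioned above can be illustrated on a toy two-player game; the sketch below is purely illustrative (not part of this call) and uses the bilinear game f(x, y) = x·y, whose equilibrium is (0, 0), to show how the mere scheduling of the two players' updates changes the training dynamics:

```python
# Toy minimax game f(x, y) = x * y: player x minimises, player y maximises.
# The unique saddle-point equilibrium is (0, 0). This caricatures why
# GAN-style adversarial training is sensitive to how updates are scheduled.

def simultaneous_gda(x, y, lr=0.1, steps=200):
    """Both players step from the same iterate (gradients computed jointly)."""
    for _ in range(steps):
        gx, gy = y, x              # df/dx = y, df/dy = x
        x, y = x - lr * gx, y + lr * gy
    return x, y

def alternating_gda(x, y, lr=0.1, steps=200):
    """Player y reacts to x's fresh update (like the usual D-step/G-step loop)."""
    for _ in range(steps):
        x = x - lr * y             # minimising player moves first
        y = y + lr * x             # maximising player responds to the new x
    return x, y

if __name__ == "__main__":
    sim = simultaneous_gda(1.0, 1.0)
    alt = alternating_gda(1.0, 1.0)
    print("simultaneous:", (sim[0]**2 + sim[1]**2) ** 0.5)  # spirals away from (0, 0)
    print("alternating: ", (alt[0]**2 + alt[1]**2) ** 0.5)  # stays on a bounded orbit
```

With lr = 0.1, each simultaneous step multiplies the distance from the equilibrium by √(1 + lr²), so the iterates diverge, whereas the alternating update matrix has determinant 1 and the orbit stays on a bounded ellipse: a one-dimensional caricature of the oscillation and non-convergence observed in GAN training.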
Topics of interest for this <span class="gmail-il" style="margin:0px;padding:0px;border:0px">special</span> <span class="gmail-il" style="margin:0px;padding:0px;border:0px">issue</span> include, but are not limited to:<br><br>    Representation learning methods and theory;<br>    Adversarial representation learning for domain adaptation;<br>    Network interpretability in adversarial learning;<br>    Adversarial feature learning;<br>    RL and GAL for data augmentation and class imbalance;<br>    New GAN models and new GAN learning criteria;<br>    RL and GAL in classification;<br>    Adversarial reinforcement learning;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for noise reduction;<br>    Recurrent GAN models;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for imitation learning;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for image segmentation and image completion;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for image super-resolution;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for speech and audio processing;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for object detection;<br>    <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for Internet of Things;<br>    RL and <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for image and video synthesis;<br>    RL and <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for speech and audio synthesis;<br>    RL and <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for text to audio or text to image synthesis;<br>    RL and <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> for inpainting and sketch to image;<br>    RL and GAL in neural machine 
translation;<br>    RL and <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span> in other application domains. <br><br><b>Important Dates:</b><br>    30 November 2019 – Submission deadline <br>    28 February 2020 – First decision notification<br>    30 April 2020 – Revised version deadline<br>    30 June 2020 – Final decision notification<br>    September 2020 – Publication (<span style="margin:0px;padding:0px;border:0px;color:rgb(255,0,0)"><b>papers available online as soon as accepted</b></span>).<br><br><b>Guest Editors:</b><br>Dr Ariel Ruiz-Garcia<br>Coventry University, UK<br>Email: <a href="mailto:ariel.9arcia@gmail.com" target="_blank" style="margin:0px;padding:0px;border:0px;text-decoration-line:none">ariel.9arcia@gmail.com</a><br><br>Professor Jürgen Schmidhuber<br>NNAISENSE,<br>Swiss AI Lab IDSIA,<br>USI & SUPSI, Switzerland<br>Email: <a href="mailto:juergen@idsia.ch" target="_blank" style="margin:0px;padding:0px;border:0px;text-decoration-line:none">juergen@idsia.ch</a><br><br>Professor Vasile Palade<br>Coventry University, UK<br>Email: <a href="mailto:vasile.palade@coventry.ac.uk" target="_blank" style="margin:0px;padding:0px;border:0px;text-decoration-line:none">vasile.palade@coventry.ac.uk</a><br><br>Dr Clive Cheong Took<br>Royal Holloway (University of London), UK<br>Email: <a href="mailto:Clive.CheongTook@rhul.ac.uk" target="_blank" style="margin:0px;padding:0px;border:0px;text-decoration-line:none">Clive.CheongTook@rhul.ac.uk</a><br><br>Professor Danilo Mandic<br>Imperial College London, UK<br>Email: <a href="mailto:d.mandic@imperial.ac.uk" target="_blank" style="margin:0px;padding:0px;border:0px;text-decoration-line:none">d.mandic@imperial.ac.uk</a><br><br><b>Submission Procedure:</b><br><br>Prospective authors should follow the standard author instructions for Neural Networks, and submit manuscripts online at <a href="http://ees.elsevier.com/neunet/" rel="noreferrer" target="_blank" 
style="margin:0px;padding:0px;border:0px;text-decoration-line:none">http://ees.elsevier.com/neunet/</a>. Authors should select “VSI:RL and <span class="gmail-il" style="margin:0px;padding:0px;border:0px">GANs</span>” when they reach the “Article Type” step and the “Request Editor” step in the submission process.<br><br>For any questions related to the <span class="gmail-il" style="margin:0px;padding:0px;border:0px">special</span> <span class="gmail-il" style="margin:0px;padding:0px;border:0px">issue</span>, please email Dr Ariel Ruiz-Garcia (<a href="mailto:ariel.9arcia@gmail.com" target="_blank" style="margin:0px;padding:0px;border:0px;text-decoration-line:none">ariel.9arcia@gmail.com</a>).</div></div>