<div dir="ltr"><div class="gmail_default" style="font-family:trebuchet ms,sans-serif">apologies for cross-listings</div><div><span id="gmail-docs-internal-guid-08b35e27-7fff-96de-8535-8cee608cde0d"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"><b style=""><font face="trebuchet ms, sans-serif"><span style="background-color:transparent">Full-time Research Assistant position in Xiao Lab at American University, Washington DC.</span><span style="font-size:26pt;background-color:transparent"> </span><span style="font-size:12pt"> </span></font></b></span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Position Overview</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> </span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">The </span><a href="https://sites.google.com/site/beixiao/" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline;white-space:pre-wrap">Xiao Computational Perception Lab</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> in the </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Department of Computer Science</span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> at American University is seeking a full-time Research Assistant (RA) / Lab Programmer for an NIH-funded project on the computational modeling of human material perception.</span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> </span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Job Description  
</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">The RA is to pursue research projects of his/her own as well as provide support for research carried out in the Xiao lab. </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Possible duties include:</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">  </span></p><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Building VR/AR experimental interfaces with Unity3D</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Python coding for behavioral data analysis</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Collecting data for psychophysical experiments</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre;margin-left:36pt"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Training machine learning models</span></p></li></ul><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">This is an ideal position for someone interested in gaining 
research experience in perception science and computational modeling before applying to graduate school. </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">The position comes with a salary and full benefits. This position is initially for a </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">one-year </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">contract. Starting date is</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline;white-space:pre-wrap"> September 1st, 2023, or soon after</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">.</span></p><br><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Position Requirements:</span></p><ul style="margin-top:0px;margin-bottom:0px"><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">The ideal candidate should have a Bachelor's degree in computer science, engineering, neuroscience, cognitive science, or a related field. </span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">The candidate should have strong programming skills in Python and is familiar with Numpy, Pandas, Sklearn. Having experience with PyTorch is a plus.  
</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Experience with statistical methods (linear models, multivariate analysis, etc.).</span></p></li><li dir="ltr" style="list-style-type:disc;font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre"><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt" role="presentation"><span style="font-size:12pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Experience with psychophysics is not required but would be useful. </span></p></li></ul><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">The Lab and Facility</span></p><p dir="ltr" style="line-height:1.2;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Xiao Lab studies both human and computer vision with an emphasis on material perception and recognition. 
The Xiao Lab studies both human and computer vision with an emphasis on material perception and recognition. The lab currently has several ongoing research projects:

- Learning latent representations of human perception of material properties (NIH R15, PI)
- Prediction of clinical trial outcomes with human experts and machine learning models (NSF SBE Core, PI)
- Material and object perception in infants and children (internally funded by American University, in collaboration with Dr. Laurie Bayet, https://www.american.edu/profiles/faculty/bayet.cfm)
- Volumetric Capture Studio (NSF MRI, Co-PI)
- Uncertainty estimation in few-shot learning for text classification

The Xiao Lab is located in a state-of-the-art technology building that is home to computer science, physics, math, and a design-and-build lab. The lab has high-performance GPU workstations, haptic phantom devices, VR headsets, and 3D printers. We also have access to a new NSF-funded Volumetric Capture Studio.

Washington, DC, is the US capital and has a vibrant computational cognition and computer vision research community (e.g., NIH, NIST, Johns Hopkins University, George Washington University, George Mason University, Georgetown University, and the University of Maryland).
How to apply

Please submit your application to Prof. Bei Xiao at bxiao@american.edu no later than July 20th, 2023. The application should include a CV, a cover letter describing your background, computational skills, experience, and motivation (preferably in PDF format), and the names of two references who have agreed to be contacted.

Representative Recent Publications:
style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Liao, C, Sawayama, M, </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Xiao, B.</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">  (2023) </span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Unsupervised learning reveals interpretable latent representations for translucency perception</span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">. </span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">PLOS Computational Biology</span><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">. Feb 8, 2023. 
</span><a href="https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1010878" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline;white-space:pre-wrap">PDF.</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(32,32,32);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> </span></p><p dir="ltr" style="line-height:1.38;margin-left:36pt;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">2.</span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> </span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"><span class="gmail-Apple-tab-span" style="white-space:pre">    </span></span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Liao, C, Sawayama, M, </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Xiao, B. </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> (2022) Crystal or Jelly? Effect of Color on the Perception of Translucent Materials with Photographs of Real-world Objects</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">.</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> </span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Journal of Vision</span><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">. 
</span><a href="https://jov.arvojournals.org/Article.aspx?articleid=2778489" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline;white-space:pre-wrap">PDF</span></a><span style="font-size:12pt;font-family:Arial;color:rgb(33,33,33);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">.</span></p><p dir="ltr" style="line-height:1.38;margin-left:36pt;margin-top:0pt;margin-bottom:0pt"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">3.</span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> </span><span style="font-size:7pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"><span class="gmail-Apple-tab-span" style="white-space:pre">        </span></span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">He, J. Zhang, X., Shuo L. Wang, S, Huang, Q., Lu, C-T, </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-weight:700;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Xiao, B</span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">. (2022) Semantic Editing On Segmentation Map Via Multi-Expansion Loss. </span><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-style:italic;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Neurocomputing. 501,306-317. 
</span><a href="https://arxiv.org/abs/2010.08128" style="text-decoration-line:none"><span style="font-size:12pt;font-family:Arial;color:rgb(0,0,255);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;text-decoration-line:underline;vertical-align:baseline;white-space:pre-wrap">PDF.</span></a></p></span><br class="gmail-Apple-interchange-newline"></div><span class="gmail_signature_prefix">-- </span><br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div><font face="garamond, serif" size="2">Bei Xiao, PhD<br>Associate Professor </font><div><font face="garamond, serif" size="2">Computer Science & Center for Behavioral Neuroscience</font></div><div><font face="garamond, serif" size="2">American University, Washington DC</font></div><div><font face="garamond, serif" size="2"><br></font></div><div><font><font face="garamond, serif" size="2">Homepage: <a href="https://sites.google.com/site/beixiao/" style="color:rgb(17,85,204)" target="_blank">https://sites.google.com/site/beixiao/</a></font><br></font></div></div><div><br></div><div><br></div><div><br></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div>