Connectionists: PhD Fellowship in Computational Vision at American University, Washington DC.
bei.xiao at gmail.com
Thu Aug 31 15:14:29 EDT 2023
The Xiao Computational Perception Lab
<https://sites.google.com/site/beixiao/> at American University, Washington
DC, has an opening for an NIH-funded Ph.D. fellowship in Neuroscience
(application deadline: December 1, 2023;
<https://www.american.edu/cas/psychology/behavioral/requirements.cfm>).
The program is situated in the interdisciplinary Behavior, Cognition, &
Neuroscience Graduate Program at AU
<https://www.american.edu/cas/psychology/behavioral/index.cfm>.
Overview
The main topic of this Ph.D. is to understand how humans estimate object
properties and plan interactions in immersive multimodal environments,
using generative AI and human psychophysics. We use a combination of
psychophysics, deep learning, generative AI, VR/AR, computer graphics,
image acquisition, and volumetric capture. The candidate will receive
hands-on training in one or several of these areas.
Beyond this topic, we also study the visual development of material and
object perception in infancy and early childhood through a collaboration
with Laurie Bayet <https://www.bayetlab.com/index.php/people/> in the
Department of Neuroscience. The Xiao Lab also studies computer vision and
collaborates with researchers at Virginia Tech, NIH/NEI, George Mason
University, the University of Giessen, and the University of Tokyo. These
collaborations give the candidate the opportunity to work on a variety of
related topics (see Our Lab below).
Your role in research:
- Performing research on mechanisms of human perception and computational
modeling of perception, using a combination of psychophysics, machine
learning, computer graphics, and VR/AR methods
- Collaborating with computer scientists and neuroscientists within and
outside the lab
- Analyzing multimodal data
- Implementing and evaluating deep learning models
- Preparing and presenting research at conferences and in peer-reviewed
publications
Qualifications:
- The candidate should have completed a Bachelor's degree in quantitative
psychology, vision science, neuroscience, cognitive science, computer
science, physics, engineering, or a related field.
- The candidate should be interested in research on perceptual mechanisms
and human behavior.
- The candidate should have solid programming skills in Python. Experience
with statistical methods (linear models, multivariate analysis, etc.) is a
plus.
- Prior experience working in a research lab is a strong plus.
Our Lab
Prof. Bei Xiao has a primary appointment in computer science but is a core
member of the Center for Behavioral Neuroscience
<https://www.american.edu/cas/center-neuroscience/members.cfm>.
The lab is located in a state-of-the-art building that houses the
departments of computer science, physics, and applied math, as well as a
design-and-build lab. The lab has high-performance GPU workstations,
tactile devices, VR headsets, 3D printers, and access to an NSF-funded
Volumetric Capture Studio <https://www.american.edu/centers/ideas-institute/>.
We are part of both the Computer Science Department and the Center for
Behavioral Neuroscience and collaborate with faculty from both units. The
student can take courses in both departments and attend research seminars
across campus.
Washington, DC, is the US capital and has a vibrant computational
cognition and computer vision research community (e.g., NIH/NEI/NIMH,
NIST, Johns Hopkins University, George Washington University, Georgetown
University, and the University of Maryland).
The Xiao Lab studies both human and computer vision, with an emphasis on
material perception and recognition. The lab currently has several ongoing
research projects:
- Learning latent representations of human perception of material
properties
- Material and object perception in infants and children with behavioral
and EEG methods
- Volumetric capture
- Uncertainty estimation in few-shot learning for text classification
- Prediction of clinical trial outcomes with human experts and machine
learning models
How to apply
Prospective graduate students should contact me directly and are required
to apply to the Behavior, Cognition, & Neuroscience (BCaN) Graduate Program
at AU <https://www.american.edu/cas/psychology/behavioral/index.cfm>. More
details and the graduate application are at:
https://www.american.edu/cas/psychology/behavioral/requirements.cfm
More details about the Xiao Lab: https://sites.google.com/site/beixiao/
More details about the AU neuroscience department:
https://www.american.edu/cas/neuroscience/faculty.cfm
To inquire about the position (ideally well before the application
deadline), please send a CV, a cover letter describing your background,
experience, and motivation (in PDF format), and the names of two references
who have agreed to be contacted. International candidates need a TOEFL
score above 100. The application deadline is December 1st, 2023.
Representative Recent Publications:
1. Liao, C., Sawayama, M., Xiao, B. (2023) Unsupervised learning reveals
interpretable latent representations for translucency perception. PLOS
Computational Biology. Feb 8, 2023. PDF:
<https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1010878>
2. Zhang, X., Lei, S., Alhamadni, A., Chen, F., Xiao, B., and Lu, C.T.
(2023) CLUR: Uncertainty Estimation for Few-Shot Text Classification with
Contrastive Learning. ACM SIGKDD 2023. PDF:
<https://dl.acm.org/doi/10.1145/3580305.3599276>
3. Liao, C., Sawayama, M., Xiao, B. (2022) Crystal or Jelly? Effect of Color
on the Perception of Translucent Materials with Photographs of Real-world
Objects. Journal of Vision. PDF:
<https://jov.arvojournals.org/Article.aspx?articleid=2778489>
--
Bei Xiao, PhD
Associate Professor
Computer Science & Center for Behavioral Neuroscience
American University, Washington DC
Homepage: https://sites.google.com/site/beixiao/