Connectionists: PhD Position in Machine Learning at Orebro University (Sweden)

Denis Kleyko denis.kleyko at gmail.com
Mon Dec 8 08:42:48 EST 2025


The Machine Perception and Interaction Lab invites applications for a PhD position at the intersection of machine learning and neuroscience at the Department of Computer Science, Orebro University (Sweden). The project will focus on machine learning, specifically on developing novel neuro-inspired and computationally efficient methods.

The successful candidate should demonstrate strong independent problem-solving skills and the ability to think critically. Good communication and collaboration skills are also required. Courses, a thesis, or publications in any of the following areas will be considered an advantage: digital signal processing, electrical engineering, computer vision, machine learning, artificial intelligence, cognitive science, or robotics.

The position offers competitive working conditions, and a generous travel budget is available.

University of Registration: Orebro University (Sweden)

Application deadline: January 16, 2026.

To apply, follow the link: https://www.oru.se/english/career/available-positions/job/?jid=20250375

Project Outline: 

Modern machine learning methods often depend on enormous datasets and extensive computing resources. This trend limits their scalability, increases their environmental footprint, and makes advanced machine learning accessible only to those with significant computational power. Meanwhile, biological neural systems, such as insect or human brains, operate under extreme energy constraints yet still handle complex tasks remarkably well. They achieve this by leveraging computational principles such as structural organization, recurrence (memory over time), and even randomness.
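To give a flavor of how recurrence and randomness can replace expensive training, here is a minimal reservoir-computing (echo state network) sketch. It is purely illustrative and not part of the project description: the recurrent weights are random and fixed, so memory over time comes for free, and only a cheap linear readout is trained. All names and parameter values below are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: recurrence provides memory over time,
# randomness removes the need to train the recurrent weights.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Only the linear readout is trained (ridge regression) --
# far cheaper than backpropagating through the recurrence.
readout = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
pred = X @ readout
print("mean squared error:", np.mean((pred - y) ** 2))
```

The design point is that the only learned parameters are the readout weights, which is why such models can run on resource-constrained hardware.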

Integrating these biological principles into machine learning opens a path toward methods that require far less computation while still delivering strong processing capabilities. To make this possible, both theoretical and practical advances are needed: a deeper understanding of how learning systems can build compact and selective memories, how they can leverage prior knowledge to interpret new data efficiently, and how state-of-the-art architectures like transformers could incorporate these principles to better handle challenges such as long-range temporal dependencies, limited training data, and restricted computing budgets.

The goal of the project is to explore how structured prior knowledge, memories of past inputs, and randomized representations can be combined to create high-performing machine learning models that run even on resource-constrained devices. This work aims to produce a new framework for lightweight machine learning: methods that remain competitive with state-of-the-art models while dramatically reducing computational cost. The project will demonstrate its impact in demanding domains such as long-term forecasting of dynamical systems and the processing of biomedical signals from wearable devices.
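As one hedged illustration of how randomized representations can form compact memories, here is a toy vector-symbolic (hyperdimensional computing) sketch. It is an example of the general idea, not the project's actual method: random bipolar vectors are bound (elementwise product) into role-filler pairs and bundled (summed) into a single vector, from which a stored item can later be recovered by similarity search. The role and filler names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10_000  # high-dimensional random vectors are quasi-orthogonal

def rand_vec():
    """A random bipolar (+1/-1) vector."""
    return rng.choice([-1, 1], size=d)

# Random codebooks for roles and fillers.
roles = {k: rand_vec() for k in ("subject", "verb", "object")}
fillers = {k: rand_vec() for k in ("bee", "finds", "flower")}

# Bind each role to its filler (elementwise product), then bundle
# (sum) everything into ONE vector -- a compact memory of the record.
memory = sum(roles[r] * fillers[f]
             for r, f in [("subject", "bee"),
                          ("verb", "finds"),
                          ("object", "flower")])

# Query: unbinding with a role (self-inverse for bipolar vectors)
# makes the matching filler dominate; recover it by dot-product similarity.
query = roles["object"] * memory
best = max(fillers, key=lambda k: int(fillers[k] @ query))
print(best)  # → flower
```

Because binding is self-inverse and the unrelated terms behave like noise, the correct filler stands out by a margin that grows with the dimensionality, which is what makes such memories both compact and selective.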