
June 12, 2025
Deniz Kucukahmetler visits ICLR 2025 in Singapore
The Thirteenth International Conference on Learning Representations (ICLR 2025) in Singapore offered PhD student Deniz Kucukahmetler an opportunity to engage with current developments in deep learning and to participate in international exchange. As one of the leading conferences in representation learning, ICLR brings together researchers from around the world, covering work that ranges from theoretical advances in neural networks to applications in neuroscience, robotics, language, and more. For Deniz, who joined SECAI last October and is pursuing a PhD at the intersection of computer science and neuroscience, attending ICLR provided new methodological perspectives, the chance to discuss open research questions, and an opportunity to refine her approach through dialogue with the broader community.
With a background in Computer Science and Neuroscience, Deniz Kucukahmetler now conducts research at the intersection of these fields, focusing on the development of deep learning methods to recover latent neural dynamics, with the long-term goal of linking neuronal activity to cognition and behavior. Her work emphasizes deep latent-variable models and uncertainty-aware representation learning to uncover meaningful low-dimensional structure in neural data. This includes modeling brain processes through system identification approaches in the emerging field of NeuroAI, while also drawing on concepts from geometric deep learning and explainable AI to interpret the geometry of latent spaces and enhance transparency and performance.
Deniz Kucukahmetler at the International Conference on Learning Representations
For Deniz Kucukahmetler, an early-stage PhD student, attending ICLR 2025 in Singapore was an opportunity to engage directly with researchers whose work forms the foundation of her own. These exchanges sparked discussions around open questions, introduced new methodological perspectives, and helped broaden her research toolkit and refine the focus of her ongoing work.
The International Conference on Learning Representations (ICLR) is one of the premier global forums for advancing representation learning – an area of artificial intelligence more commonly known as deep learning. ICLR is internationally recognized for spotlighting cutting-edge research across a wide range of topics, from the theoretical foundations of AI and statistical modeling to applications in computer vision, computational biology, speech recognition, text understanding, robotics, and gaming. This year’s edition in Singapore was dominated by research on large language models (LLMs) and reinforcement learning, alongside work on robust evaluation of generative models and a growing interest in biologically inspired and energy-efficient learning systems.
This year’s keynote by Yi Ma on Pursuing the Nature of Intelligence especially resonated with Deniz. “It elegantly connected mathematics, modeling, and philosophy – reminding me why I chose this field,” she reflects. The poster sessions and community-led meetups provided the greatest insights for her, encouraging exchange and discussion. “It was my first international conference, and the scale was eye-opening. Interacting with thousands of researchers from every continent underscored how quickly the field moves and how diverse the approaches are,” says Deniz.