Keynote speakers
Implicit Neural Representations Meet Semantic Communications

Short Biography
Deniz Gündüz received his Ph.D. from the NYU Tandon School of Engineering in 2007. In 2012, he joined the Electrical and Electronic Engineering Department at Imperial College London, UK, where he is currently a Professor of Information Processing and serves as the deputy head of the Intelligent Systems and Networks Group. He has previously held positions at the University of Modena and Reggio Emilia (part-time faculty member, 2019-22), the University of Padova (visiting professor, 2018, 2020), the Centre Tecnologic de Telecomunicacions de Catalunya (CTTC) (research associate, 2009-12), Princeton University (postdoctoral researcher, 2007-09; visiting researcher, 2009-11) and Stanford University (research assistant professor, 2007-09). His research interests lie in the areas of information theory, machine learning, wireless communications and privacy. Dr. Gündüz is a Fellow of the IEEE. He is an elected member of the IEEE Signal Processing Society Signal Processing for Communications and Networking (SPCOM) and Machine Learning for Signal Processing (MLSP) Technical Committees. He chairs the UK and Ireland Chapter of the IEEE Information Theory Society and serves as an area editor for the IEEE Transactions on Information Theory. In the past, he served in editorial roles for the IEEE Transactions on Communications, the IEEE Transactions on Wireless Communications, the IEEE Journal on Selected Areas in Communications and the IEEE Journal on Selected Areas in Information Theory. He received the IEEE Communications Society Communication Theory Technical Committee (CTTC) Early Achievement Award in 2017 and the Starting (2016), Consolidator (2022) and Proof-of-Concept (2023) Grants of the European Research Council (ERC), and has co-authored several award-winning papers, most recently winners of the IEEE Communications Society Young Author Best Paper Award (2022) and the IEEE International Conference on Communications Best Paper Award (2023).
He received the Imperial College London - President's Award for Excellence in Research Supervision in 2023.
Abstract
As deep neural networks (DNNs) continue to revolutionize computing, a critical challenge emerges: how can we efficiently deliver these increasingly large models over bandwidth-constrained mobile networks? This challenge is particularly pressing as state-of-the-art DNNs become more specialized for specific times, locations, and tasks, requiring on-demand delivery to users' devices. Similarly, distributed learning scenarios require frequent exchange of model parameters across wireless networks. I will highlight that these problems are examples of 'semantic communications', where the goal is not to recover the network parameters reliably, but to ensure that they serve the desired task at the receiver.
In this talk, I will first present our recent breakthroughs in efficient DNN parameter compression and transmission over wireless channels, demonstrating how these techniques can reduce bandwidth requirements while preserving model performance. Building on these foundations, I will then explore how our techniques extend naturally to implicit neural representations (INRs), which treat signals as continuous functions represented by neural networks. This innovative approach bridges traditional signal processing with modern deep learning, yielding remarkable results for practical applications. Specifically, I'll demonstrate how our INR-based methods achieve state-of-the-art performance in MIMO channel state feedback and image compression tasks, reducing transmission overhead while maintaining signal fidelity.