| Michael Gastpar | |
|---|---|
| Name | Michael Gastpar |
| Fields | Information theory, signal processing, communications |
| Workplaces | École Polytechnique Fédérale de Lausanne, University of California, Berkeley |
| Alma mater | ETH Zurich, University of Illinois at Urbana–Champaign, École Polytechnique Fédérale de Lausanne |
| Doctoral advisor | Martin Vetterli |
| Known for | Network information theory, joint source–channel coding, compute-and-forward, distributed source coding, sensor networks |
Michael Gastpar is a researcher and academic recognized for contributions to information theory, signal processing, and communication networks. He has held faculty positions in Switzerland and the United States, and his work connects data compression, distributed estimation, and wireless communication. It spans both rigorous mathematical analysis and applications to practical systems such as wireless sensor networks.
Gastpar studied electrical engineering at ETH Zurich and received a master's degree from the University of Illinois at Urbana–Champaign before undertaking doctoral work at the École Polytechnique Fédérale de Lausanne under the supervision of Martin Vetterli. His doctoral research built upon foundations in Shannon theory, rate–distortion theory, and the multi-terminal source coding problems that are central to network information theory.
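Rate–distortion theory, mentioned above, has a simple closed form in the canonical Gaussian case: a memoryless Gaussian source of variance σ² can be described at distortion D using R(D) = ½·log₂(σ²/D) bits per sample. The following is an illustrative sketch of that formula, not code from Gastpar's own work:

```python
import math

def gaussian_rate_distortion(variance: float, distortion: float) -> float:
    """Rate-distortion function R(D) of a memoryless Gaussian source
    under squared-error distortion, in bits per sample:
    R(D) = max(0, 0.5 * log2(variance / D))."""
    if distortion <= 0:
        raise ValueError("distortion must be positive")
    return max(0.0, 0.5 * math.log2(variance / distortion))

# Halving the allowed distortion costs an extra half bit per sample.
r_coarse = gaussian_rate_distortion(1.0, 0.25)   # 1.0 bit/sample
r_fine = gaussian_rate_distortion(1.0, 0.125)    # 1.5 bits/sample
```

Note the clipping at zero: once the allowed distortion exceeds the source variance, describing the source with zero bits (always outputting the mean) already meets the target.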
Gastpar is a professor at the École Polytechnique Fédérale de Lausanne (EPFL) in the School of Computer and Communication Sciences, where his research bridges communication theory and statistical signal processing. Before joining EPFL, he was a faculty member in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He has supervised doctoral students who have gone on to positions in industry research laboratories and academia, and he has been active in organizing programs and workshops at venues including the IEEE International Symposium on Information Theory.
Gastpar's research addresses central problems in information theory and signal processing for networks. He developed analytical frameworks for distributed source coding and joint source–channel coding that shaped the understanding of multi-node systems such as sensor networks and cooperative wireless networks. A recurring theme is the tradeoff between compression and communication in networks, extending Claude Shannon's foundational results on source and channel coding.
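A classic instance of this tradeoff, central to the "to code, or not to code" question associated with Gastpar and Vetterli's work: for a white Gaussian source sent over a bandwidth-matched AWGN channel, simple uncoded (scaled analog) transmission achieves the same squared-error distortion, σ²/(1 + SNR), as the best possible separate source and channel codes. A sketch of the comparison, with hypothetical parameter values:

```python
def separation_distortion(sigma2: float, snr: float) -> float:
    # Best distortion via separate source and channel coding:
    # D = sigma^2 * 2^(-2C) with C = 0.5*log2(1+snr), i.e. sigma^2/(1+snr).
    return sigma2 / (1.0 + snr)

def uncoded_distortion(sigma2: float, snr: float) -> float:
    # Scale the source to meet the power constraint, send it raw, and
    # apply the MMSE estimator at the receiver:
    # D = sigma^2 * N/(P+N) = sigma^2/(1+snr).
    return sigma2 * (1.0 / (1.0 + snr))

sigma2, snr = 1.0, 10.0  # illustrative values
gap = separation_distortion(sigma2, snr) - uncoded_distortion(sigma2, snr)
```

In this matched Gaussian setting the gap is exactly zero, so the elaborate digital architecture buys nothing over analog transmission; in mismatched settings the comparison changes, which is precisely what makes the question interesting.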
One strand of his work concerns in-network compression strategies, relating to classical problems such as the CEO problem and the multiterminal rate–distortion theory developed by Toby Berger and collaborators. He investigated schemes in which nodes perform local processing and encoding before forwarding their data to a fusion center, connecting to results in distributed detection and estimation. Another significant contribution concerns the capacity and efficiency of relay and cooperative networks, building on the relay channel of Cover and El Gamal and on network coding; with Bobak Nazer, he developed the compute-and-forward relaying strategy, in which relays decode integer linear combinations of transmitted codewords rather than individual messages.
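The fusion-center setting has a CEO-problem flavor that is easy to illustrate numerically: several sensors observe the same source through independent noise, each forwards only a coarsely quantized reading, and the fusion center averages what it receives. A hypothetical Monte Carlo sketch (the noise levels and quantizer are invented for illustration, not taken from Gastpar's papers):

```python
import random

def fusion_mse(num_sensors: int, num_trials: int = 5000,
               step: float = 0.5, seed: int = 1) -> float:
    """Monte Carlo mean-squared error of a fusion center that averages
    coarsely quantized sensor readings of a common Gaussian source."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_trials):
        x = rng.gauss(0.0, 1.0)                      # common source sample
        readings = []
        for _ in range(num_sensors):
            y = x + rng.gauss(0.0, 0.5)              # local observation noise
            readings.append(round(y / step) * step)  # uniform scalar quantizer
        estimate = sum(readings) / num_sensors       # fusion: simple average
        total += (estimate - x) ** 2
    return total / num_trials
```

Because both the observation noise and (roughly) the quantization errors are independent across sensors, the averaged estimate improves as more sensors report, even though each individual report is coarse; trading off this gain against the rate each sensor consumes is exactly the kind of question multiterminal rate–distortion theory formalizes.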
Gastpar also explored approximate and scalable algorithms for large-scale systems, drawing connections to compressed sensing and to applications in machine learning. His analyses typically combine probabilistic models of sources and channels with information-theoretic bounds.
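The core idea of compressed sensing is that a sparse signal can be recovered from far fewer random linear measurements than its ambient dimension. A minimal sketch of the simplest case, a single iteration of orthogonal matching pursuit recovering a 1-sparse vector (an illustrative toy, not an algorithm from the work described above):

```python
import random

def recover_1sparse(A, y):
    """One matched-filter step (a single orthogonal-matching-pursuit
    iteration): pick the column of A most correlated with y, then
    project y onto it to estimate the nonzero coefficient."""
    m, n = len(A), len(A[0])
    def col(j): return [A[i][j] for i in range(m)]
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    best = max(range(n), key=lambda j: abs(dot(col(j), y)))
    c = col(best)
    return best, dot(c, y) / dot(c, c)

rng = random.Random(0)
m, n = 30, 50                        # 30 measurements of a 50-dim signal
A = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(m)]
true_index, true_value = 7, 3.0      # unknown 1-sparse signal
y = [A[i][true_index] * true_value for i in range(m)]  # noiseless measurements
idx, val = recover_1sparse(A, y)
```

With a random Gaussian measurement matrix, the true column is overwhelmingly the most correlated one, so both the support and the coefficient are recovered even though only 30 of 50 coordinates were measured; general k-sparse recovery repeats this select-and-subtract step k times.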
He has collaborated with researchers from diverse institutions including ETH Zurich, Columbia University, University of Maryland, Microsoft Research, and Télécom ParisTech, expanding the interdisciplinary reach of his work into areas such as distributed control and energy-constrained sensing.
Gastpar's contributions have been recognized through awards and invitations to speak at leading venues, including invited lectures at the IEEE International Symposium on Information Theory. He has served on technical program committees for conferences such as IEEE ISIT and the Allerton Conference on Communication, Control, and Computing.
- Gastpar, M., "On the capacity of large wireless networks", IEEE Transactions on Information Theory, addressing scaling laws for multi-node communication.
- Gastpar, M., "Sparse sensing and in-network compression for sensor networks", presented at the ACM/IEEE IPSN workshop, linking to the distributed source coding literature.
- Gastpar, M. and Vetterli, M., "Source–channel communication in sensor networks", on joint source–channel coding.
- Gastpar, M., "Uncoded transmission and optimality conditions", exploring when analog transmission schemes meet the limits established by Shannon theory.
Category:Information theorists
Category:ETH Zurich alumni
Category:University of Illinois Urbana-Champaign alumni
Category:École Polytechnique Fédérale de Lausanne faculty