I am now a Researcher in the Media Computing Group at Microsoft Research Asia, Beijing. I received my PhD in 2024 from the University of Science and Technology of China (USTC), advised by Prof. Zhibo Chen (Page). Prior to that, I spent my undergraduate years (2015-2019) in the Department of Electronic Engineering and Information Science, also at USTC, and was named an Outstanding Graduate.
During my PhD, I was a visiting scholar in the Machine Learning Group of the Computational and Biological Learning (CBL) Lab at the University of Cambridge, advised by Prof. José Miguel Hernández-Lobato (Page) and collaborating with Gergely Flamich.
Feel free to email me if you are interested in my research.
Quick links: Email, Google Scholar, Resume.
As a researcher in machine learning, I focus on developing effective and practical methods for neural data compression, a topic theoretically grounded in probabilistic generative models and Bayesian inference. My interests span multiple areas, including implicit neural representations [1, 2], Bayesian neural networks [1], and various probabilistic generative models such as variational autoencoders [5] and diffusion models.
Recently, I have become interested in compression for AGI. Claude Shannon, the pioneer of information theory, modeled natural language as a stochastic process in his seminal 1948 paper on communication over bandwidth-limited channels. The preference for limited bandwidth reflects the human propensity to convey information using as little energy as possible. Humans are adept at identifying relationships within data, allowing us to compress information into shorter descriptions; this act, I believe, is a manifestation of intelligence (the textbook identity below makes the prediction-compression link precise). If you share this interest, I welcome your thoughts via email.
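As an aside, this link is standard information theory rather than a claim of my own: under a probabilistic model $p$, an outcome $x$ can be encoded in roughly $-\log_2 p(x)$ bits, so the expected code length over the true data distribution $q$ is the cross-entropy

$$\mathbb{E}_{x \sim q}\bigl[-\log_2 p(x)\bigr] = H(q) + D_{\mathrm{KL}}(q \,\|\, p),$$

which is minimized, down to the irreducible entropy $H(q)$, exactly when the model $p$ matches $q$. Better prediction therefore yields shorter descriptions.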
October 2022 - September 2023, Cambridge, UK
Visiting Scholar, University of Cambridge
Topics: probabilistic machine learning and neural compression
September 2019 - June 2024, Hefei, China
PhD, University of Science and Technology of China
Topics: neural compression with probabilistic generative models
December 2021 - December 2022, Beijing, China
Research Intern, Microsoft Research Asia
Topics: implicit neural representations and probabilistic machine learning
September 2015 - June 2019, Hefei, China
Bachelor's, University of Science and Technology of China
Department of Electronic Engineering and Information Science
December 2018 - March 2019, Beijing, China
Research Intern, JD AI Lab
Topics: attention mechanisms and their applications in vision
June 2017 - July 2017, Vladivostok, Russia
Seminar at Far Eastern Federal University (FEFU)
Seminar for young student leaders from Pacific Rim universities
Zongyu Guo#, Gergely Flamich#, Jiajun He, Zhibo Chen, José Miguel Hernández-Lobato
Paper, Code. TL;DR: Compresses data as variational Bayesian implicit neural representations via relative entropy coding, a new approach that supports joint rate-distortion optimization.
Zongyu Guo, Cuiling Lan, Zhizheng Zhang, Yan Lu, Zhibo Chen.
Paper, Code. TL;DR: An efficient neural process method for learning implicit neural representations of various signals, including complex 3D scenes.
Zongyu Guo#, Runsen Feng#, Zhizheng Zhang, Xin Jin, Zhibo Chen
Paper, Code. TL;DR: Improves inter-prediction for neural video compression and shares many practical engineering insights.
Runsen Feng, Zongyu Guo, Weiping Li, Zhibo Chen
Paper. TL;DR: Scalar quantization with a nonlinear transform cannot match vector quantization in the latent space.
Zongyu Guo, Zhizheng Zhang, Runsen Feng, Zhibo Chen
Paper. TL;DR: A comprehensive analysis of quantization in neural image compression, with a simple yet effective strategy proposed to resolve the train-test mismatch of quantization.
Zongyu Guo, Zhizheng Zhang, Runsen Feng, Zhibo Chen
Paper. TL;DR: Strong performance for neural image compression, achieved with several advanced techniques such as a masked transformer.
Zongyu Guo#, Yaojun Wu#, Runsen Feng, Zhizheng Zhang, Zhibo Chen
Paper. TL;DR: Fourth place in the learned image compression challenge in terms of the perceptual metric, and first place in terms of MS-SSIM.
Talks:
Teaching Assistant:
Reviewer: