Zongyu Guo (郭宗昱)


I'm a PhD candidate at the University of Science and Technology of China (USTC), advised by Prof. Zhibo Chen (Page). I expect to receive my PhD degree in June 2024. Before that, I spent my undergraduate years (2015 to 2019) in the Department of Electronic Engineering and Information Science, also at USTC, and was named an Outstanding Graduate.

I was a visiting scholar in the machine learning group of the Computational and Biological Learning (CBL) Lab at the University of Cambridge, advised by Prof. José Miguel Hernández-Lobato (Page) and collaborating with Gergely Flamich. Together we developed a new approach to neural compression [1].

I am actively looking for a position in AI research starting in the summer of 2024. Feel free to email me if you are interested in my research.
Quick links: Email, Google Scholar, Resume

Research


As a researcher in machine learning, I focus on developing effective and practical methods for neural data compression, a topic theoretically grounded in probabilistic generative models and Bayesian inference. My interests span multiple areas, including implicit neural representations [1, 2], Bayesian neural networks [1], and various probabilistic generative models such as variational autoencoders [5] and diffusion models.

Recently, I have become interested in compression for AGI. Claude Shannon, the pioneer of information theory, studied the statistical modeling of language in his seminal 1948 paper, framing communication as transmission over a bandwidth-limited channel. This notion of limited bandwidth reflects the human propensity to convey information with as little effort as possible. Humans are adept at identifying relationships within data, allowing us to compress information into shorter descriptions, an ability that, I believe, is a manifestation of intelligence. If you share this interest, I welcome you to send me your thoughts via email.
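As a toy illustration of this view (my own sketch, not taken from any of the papers below): the ideal Shannon code length of a string, -log2 p(x), shrinks as the probabilistic model better captures the data's statistics — a better model is a better compressor.

```python
import math
from collections import Counter

def ideal_code_length_bits(text, model):
    """Shannon code length: -sum of log2 p(symbol) under the given model."""
    return -sum(math.log2(model[ch]) for ch in text)

text = "abracadabra"

# Uniform model: each of the 26 letters is equally likely.
uniform = {ch: 1 / 26 for ch in "abcdefghijklmnopqrstuvwxyz"}

# Empirical model: letter frequencies estimated from the text itself.
counts = Counter(text)
empirical = {ch: n / len(text) for ch, n in counts.items()}

bits_uniform = ideal_code_length_bits(text, uniform)
bits_empirical = ideal_code_length_bits(text, empirical)

# A model that captures the data's statistics yields a shorter description.
assert bits_empirical < bits_uniform
```

Under the uniform model the string costs 11 × log2(26) ≈ 51.7 bits, while the frequency-aware model needs only about 22.4 bits; real compressors close this gap further with richer generative models.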

Experience


  • October 2022 - September 2023 (expected), Cambridge, UK
    Visiting Scholar, University of Cambridge (currently visiting in person)
    Topics: probabilistic machine learning and neural compression

  • September 2019 - June 2024 (expected), Hefei, China
    PhD, University of Science and Technology of China
    Topics: neural compression with probabilistic generative models

  • December 2021 - December 2022, Beijing, China
    Research Intern, Microsoft Research Asia
    Topics: implicit neural representations and probabilistic machine learning

  • September 2015 - June 2019, Hefei, China
    Bachelor, University of Science and Technology of China
    Department of Electronic Engineering and Information Science

  • December 2018 - March 2019, Beijing, China
    Research Intern, JD AI Lab
    Topics: attention mechanism and its applications in vision

  • June 2017 - July 2017, Vladivostok, Russia
    Seminar at Far Eastern Federal University (FEFU)
    Seminar for young student leaders from Pacific Rim universities

Selected Publications


[1] Compression with Bayesian Implicit Neural Representations (NeurIPS'23 Spotlight)

Zongyu Guo#, Gergely Flamich#, Jiajun He, Zhibo Chen, José Miguel Hernández-Lobato
Paper, Code. TL;DR: Compress data as variational Bayesian implicit neural representations with relative entropy coding, a new approach that supports joint rate-distortion optimization.

[2] Versatile Neural Processes for Learning Implicit Neural Representations (ICLR'23)

Zongyu Guo, Cuiling Lan, Zhizheng Zhang, Yan Lu, Zhibo Chen.
Paper, Code. TL;DR: An efficient neural process method for learning implicit neural representations of various signals, including complex 3D scenes.

[3] Learning Cross-Scale Weighted Prediction for Efficient Neural Video Compression (TIP'23)

Zongyu Guo#, Runsen Feng#, Zhizheng Zhang, Xin Jin, Zhibo Chen
Paper, Code. TL;DR: Improves inter-prediction for neural video compression and shares many practical engineering insights.

[4] NVTC: Nonlinear Vector Transform Coding (CVPR'23)

Runsen Feng, Zongyu Guo, Weiping Li, Zhibo Chen
Paper. TL;DR: Scalar quantization with nonlinear transforms cannot match vector quantization in latent space.

[5] Soft then Hard: Rethinking the Quantization in Neural Image Compression (ICML'21)

Zongyu Guo, Zhizheng Zhang, Runsen Feng, Zhibo Chen
Paper. TL;DR: A comprehensive analysis of quantization in neural image compression, with a simple yet effective strategy proposed to solve the train-test mismatch of quantization.

[6] Causal Contextual Prediction for Learned Image Compression (TCSVT'21)

Zongyu Guo, Zhizheng Zhang, Runsen Feng, Zhibo Chen
Paper. TL;DR: Strong performance for neural image compression, employing several advanced techniques such as a masked transformer.

[7] 3-D Context Entropy Model for Improved Practical Image Compression (CVPR'20 Workshop)

Zongyu Guo#, Yaojun Wu#, Runsen Feng, Zhizheng Zhang, Zhibo Chen
Paper. TL;DR: Fourth place in the learned image compression challenge in terms of the perceptual metric, and first place in terms of MS-SSIM.



Talks and Service


Talks:

  • Talk at Microsoft Research Asia on my work [2] (Slides).
  • Talk at Tsinghua SIGS on my works [5] and [6] (Slides).

Teaching Assistant:

  • Elements of Video Technology - 2021 Postgraduate Course.

Reviewer:

  • Conference Reviewer: NeurIPS, ICLR, CVPR, AAAI, VCIP, etc.
  • Journal Reviewer: IJCV, TIP, TNNLS, TCSVT, Neurocomputing.