Optimizing Decoding Paths in Masked Diffusion Models by Quantifying Uncertainty
Authors
Ziyu Chen; Xinbei Jiang; Peng Sun; Tao Lin
Scores
Rationale
The paper introduces Denoising Entropy, a novel metric for quantifying uncertainty in Masked Diffusion Models (MDMs), which is an original contribution to the field. It addresses a significant issue in non-autoregressive generation, namely the sensitivity of output quality to the choice of decoding path, and uses the metric to improve model performance, which is technically significant. The proposed method is designed primarily for MDMs but may carry over to other generative models, suggesting moderate transferability. The work aligns well with current trends toward improving the robustness and efficiency of generative models, as evidenced by its focus on reasoning and planning tasks. The experiments cover a broad range of benchmarks and provide strong support for the claims. A principled metric such as Denoising Entropy could influence future research directions, indicating high long-term impact potential.
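To make the reviewed idea concrete, the sketch below illustrates one generic way an entropy-based uncertainty score could guide decoding order in a masked denoiser: compute the Shannon entropy of the model's predictive distribution at each still-masked position and unmask the least uncertain one first. This is a minimal illustration under assumed tensor shapes, not the paper's actual Denoising Entropy definition or decoding algorithm; the function names and the low-entropy-first heuristic are hypothetical.

```python
import torch

def token_entropies(logits: torch.Tensor) -> torch.Tensor:
    """Shannon entropy (in nats) of the predictive distribution at each position.

    logits: [seq_len, vocab_size] denoiser scores for one sequence.
    Returns: [seq_len] per-position entropy.
    """
    log_probs = torch.log_softmax(logits, dim=-1)
    probs = log_probs.exp()
    return -(probs * log_probs).sum(dim=-1)

def pick_next_position(logits: torch.Tensor, masked: torch.Tensor) -> int:
    """Pick the masked position whose prediction is least uncertain.

    masked: [seq_len] boolean tensor, True where the token is still masked.
    """
    ent = token_entropies(logits)
    ent = ent.masked_fill(~masked, float("inf"))  # ignore already-decoded slots
    return int(ent.argmin().item())
```

Summing (or averaging) such per-step entropies along a decoding trajectory would give one plausible scalar measure of how uncertain that path was overall, which is the kind of quantity a path-selection criterion could optimize.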