The paper “Only Strict Saddles in the Energy Landscape of Predictive Coding Networks?” has been accepted at this year’s NeurIPS conference in Vancouver.


Predictive coding (PC) is an energy-based learning algorithm that improves the loss landscape by making saddles easier to escape, potentially leading to faster convergence than backpropagation. This work shows that PC inference reshapes the energy landscape, making it more robust to vanishing gradients, though challenges remain in scaling the inference to larger models.
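To make the idea concrete, here is a minimal sketch of PC inference: activities of the hidden layer are relaxed by gradient descent on a squared-error energy while input and output are clamped, after which the weights are updated locally from the equilibrium errors. The network sizes, learning rates, and tanh nonlinearity are illustrative assumptions, not the paper’s experimental setup.

```python
import numpy as np

# Energy for a 3-layer network: E = sum_l ||x_{l+1} - W_l f(x_l)||^2 / 2.
# All hyperparameters below are assumptions for illustration only.
rng = np.random.default_rng(0)
sizes = [4, 8, 3]  # input, hidden, output dimensions (assumed)
W = [rng.normal(0, 0.1, (sizes[l + 1], sizes[l])) for l in range(2)]

def f(x):
    return np.tanh(x)

def df(x):
    return 1.0 - np.tanh(x) ** 2

def pc_inference(x_in, x_out, W, steps=50, lr=0.1):
    """Relax the hidden activity toward a minimum of the PC energy,
    with the input and output layers clamped to data."""
    x = [x_in, rng.normal(0, 0.1, sizes[1]), x_out]
    for _ in range(steps):
        # Prediction error at each connection.
        eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
        # Gradient of the energy w.r.t. the hidden activity x[1]:
        # a bottom-up error term plus a top-down error term.
        grad = eps[0] - df(x[1]) * (W[1].T @ eps[1])
        x[1] = x[1] - lr * grad
    return x, eps

x_in = rng.normal(size=sizes[0])
x_out = rng.normal(size=sizes[2])
x, eps = pc_inference(x_in, x_out, W)

# After inference, weights are updated locally from the
# equilibrium errors (a Hebbian-like rule).
lr_w = 0.01
for l in range(2):
    W[l] = W[l] + lr_w * np.outer(eps[l], f(x[l]))
```

The key contrast with backpropagation is that learning happens only after the inference phase has settled, so the weight update at each layer depends purely on local errors and activities rather than on a global backward pass.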

You can find the preprint of this paper here.