Hessian-based analysis probes the loss landscape of a neural network by computing the Hessian matrix of the loss function with respect to the network's parameters, i.e., the matrix of second derivatives. Evaluated at a given point in parameter space, the Hessian describes the local curvature of the loss surface around that point.
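As a concrete illustration, here is a minimal PyTorch sketch that assembles the explicit Hessian of a mean-squared-error loss for a deliberately tiny model by differentiating the gradient one entry at a time. The model, data, and `full_hessian` helper are illustrative assumptions, not a standard API; one backward pass per parameter is only affordable when the parameter count is very small.

```python
import torch

def full_hessian(loss, params):
    # Differentiate twice: first get the gradient with a graph attached,
    # then differentiate each gradient entry again. This costs one backward
    # pass per parameter, so it is only feasible for tiny models.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat_grad = torch.cat([g.reshape(-1) for g in grads])
    rows = []
    for i in range(flat_grad.numel()):
        row = torch.autograd.grad(flat_grad[i], params, retain_graph=True)
        rows.append(torch.cat([r.reshape(-1) for r in row]))
    return torch.stack(rows)              # (P, P) symmetric matrix

# Tiny model and synthetic data, purely for illustration.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)             # 5 parameters: 4 weights + 1 bias
x, y = torch.randn(32, 4), torch.randn(32, 1)
loss = torch.nn.functional.mse_loss(model(x), y)

H = full_hessian(loss, list(model.parameters()))
print(H.shape)                            # torch.Size([5, 5])
```

For a realistic network the explicit matrix is far too large to form or store, which is why most analyses work with Hessian-vector products instead, as sketched below.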
One way to characterize this curvature is to compute the eigenvalues and eigenvectors of the Hessian. Each eigenvector is a direction in parameter space, and its eigenvalue gives the curvature of the loss along that direction: large positive eigenvalues mark sharp directions, values near zero mark flat ones, and negative eigenvalues indicate directions of negative curvature, as at a saddle point. Perturbing the parameters along the top one or two eigenvectors and re-evaluating the loss yields a slice of the loss landscape around the current point that can be rendered as a 2D curve or a 3D surface plot.
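In practice the leading eigenpairs are usually estimated without ever forming the matrix, using Hessian-vector products. The sketch below, again on an assumed toy model, uses plain power iteration (a hypothetical `top_eigenpair` helper) to approximate the dominant eigenvalue and eigenvector and then traces a 1D slice of the loss along that eigenvector; libraries such as PyHessian implement more robust variants of the same idea.

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)                     # tiny model, illustrative only
x, y = torch.randn(32, 4), torch.randn(32, 1)
params = list(model.parameters())
loss = torch.nn.functional.mse_loss(model(x), y)

def top_eigenpair(loss, params, iters=100):
    # Power iteration on the Hessian using Hessian-vector products,
    # so the full matrix is never materialized.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat_grad = torch.cat([g.reshape(-1) for g in grads])
    v = torch.randn(flat_grad.numel())
    v /= v.norm()
    eigval = 0.0
    for _ in range(iters):
        hv = torch.autograd.grad(torch.dot(flat_grad, v), params, retain_graph=True)
        hv = torch.cat([h.reshape(-1) for h in hv])  # H @ v via double backward
        eigval = torch.dot(hv, v).item()             # Rayleigh quotient estimate
        v = hv / hv.norm()
    return eigval, v

eigval, v = top_eigenpair(loss, params)

# 1D slice of the loss along the top eigenvector: perturb the weights,
# re-evaluate the loss, then restore the original parameters.
theta0 = torch.nn.utils.parameters_to_vector(params).detach()
slice_losses = []
for alpha in torch.linspace(-1.0, 1.0, 41):
    torch.nn.utils.vector_to_parameters(theta0 + alpha * v, params)
    with torch.no_grad():
        slice_losses.append(torch.nn.functional.mse_loss(model(x), y).item())
torch.nn.utils.vector_to_parameters(theta0, params)  # restore original weights
print(f"top eigenvalue ~ {eigval:.3g}")
```

Plotting `slice_losses` against the offsets gives the 1D curvature profile; repeating the perturbation over a grid spanned by the top two eigenvectors gives the 2D slice and 3D surface plot described above.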
Another way to summarize the local landscape is the condition number of the Hessian: the ratio of its largest to its smallest eigenvalue (in magnitude), which measures how unevenly the loss curves in different directions. A high condition number means the loss is far steeper along some directions than others, so a step size that is stable in the sharp directions makes almost no progress along the flat ones and first-order optimizers converge slowly; a condition number close to one means the curvature is similar in every direction and the landscape is locally well conditioned.
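Continuing with the same toy setup, a straightforward (if brute-force) way to estimate this is to diagonalize the explicit Hessian and take the ratio of the largest to smallest eigenvalue magnitudes. The construction below repeats the double-backward trick from the first sketch and is, again, only practical for very small parameter counts.

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)                     # tiny model, illustrative only
x, y = torch.randn(32, 4), torch.randn(32, 1)
params = list(model.parameters())
loss = torch.nn.functional.mse_loss(model(x), y)

# Build the explicit (P, P) Hessian row by row, as in the first sketch.
grads = torch.autograd.grad(loss, params, create_graph=True)
flat_grad = torch.cat([g.reshape(-1) for g in grads])
H = torch.stack([
    torch.cat([r.reshape(-1)
               for r in torch.autograd.grad(flat_grad[i], params, retain_graph=True)])
    for i in range(flat_grad.numel())
])

eigvals = torch.linalg.eigvalsh(H)                # real spectrum, ascending order
# Note: near-zero eigenvalues (flat directions) make this ratio blow up.
cond = eigvals.abs().max() / eigvals.abs().min()  # |lambda|_max / |lambda|_min
print(f"spectrum: [{eigvals.min():.3g}, {eigvals.max():.3g}], condition number: {cond:.3g}")
```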
Overall, Hessian-based analysis of the loss landscape can provide valuable insight into how a neural network behaves during training and can help identify issues such as saddle points, poorly conditioned regions, or undesirable local minima that slow convergence or hinder performance.