Discrete Hamilton-Jacobi Theory
We develop a discrete analogue of Hamilton-Jacobi theory in the framework of discrete Hamiltonian mechanics. The resulting discrete Hamilton-Jacobi equation is discrete only in time. We describe a discrete analogue of Jacobi’s solution and also prove a discrete version of the geometric Hamilton-Jacobi theorem. The theory applied to discrete linear Hamiltonian systems yields the discrete Riccati equation as a special case of the discrete Hamilton-Jacobi equation. We also apply the theory to discrete optimal control problems, and recover some well-known results, such as the Bellman equation (discrete-time HJB equation) of dynamic programming and its relation to the costate variable in the Pontryagin maximum principle. This relationship between the discrete Hamilton-Jacobi equation and Bellman equation is exploited to derive a generalized form of the Bellman equation that has controls at internal stages.
💡 Research Summary
The paper presents a systematic development of a discrete analogue of Hamilton‑Jacobi (HJ) theory within the framework of discrete Hamiltonian mechanics. Starting from a discrete Lagrangian (L_d(q_k,q_{k+1})) and its associated discrete Legendre transforms, the authors work with a (right) discrete Hamiltonian (H_d(q_k,p_{k+1})) that preserves the canonical duality between state and costate variables across each time step. Applying a discrete variational principle yields the discrete Euler‑Lagrange equations and the corresponding symplectic map (\Phi_d) that advances the phase‑space point ((q_k,p_k)) to ((q_{k+1},p_{k+1})).
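As a concrete illustration of this construction (a sketch, not code from the paper), the following applies the midpoint discrete Lagrangian to a unit harmonic oscillator and advances ((q_k,p_k)) through the discrete Legendre transforms (p_k = -D_1 L_d), (p_{k+1} = D_2 L_d). The step size and the choice of system are illustrative assumptions.

```python
# Midpoint discrete Lagrangian for the harmonic oscillator
#   L(q, v) = v^2/2 - q^2/2,  L_d(q0, q1) = h * L((q0+q1)/2, (q1-q0)/h).
# The discrete Legendre transforms define the symplectic update:
#   p0 = -D1 L_d(q0, q1),  p1 = D2 L_d(q0, q1).
def step(q0, p0, h):
    # The first Legendre transform, p0 = (q1 - q0)/h + (h/4)*(q0 + q1),
    # is linear in q1 for this L_d and can be solved in closed form.
    alpha, beta = 1.0 / h - h / 4.0, 1.0 / h + h / 4.0
    q1 = (alpha * q0 + p0) / beta
    # The second Legendre transform gives the momentum at the new time.
    p1 = (q1 - q0) / h - (h / 4.0) * (q0 + q1)
    return q1, p1

def energy(q, p):
    return 0.5 * (p * p + q * q)

if __name__ == "__main__":
    h, q, p = 0.1, 1.0, 0.0
    E0 = energy(q, p)
    for _ in range(1000):
        q, p = step(q, p, h)
    print(abs(energy(q, p) - E0))  # drift stays near machine precision
```

For this quadratic Lagrangian the resulting one-step map is linear with unit Jacobian determinant, which is one way to see the symplecticity of (\Phi_d) directly.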
The core contribution is the formulation of the discrete Hamilton‑Jacobi equation (HJE). Unlike the continuous HJ equation (\partial_t S + H(q,\partial_q S)=0), the discrete version is a functional difference equation for a sequence of generating functions (S^k):

(S^{k+1}(q_{k+1}) - S^k(q_k) - DS^{k+1}(q_{k+1}) \cdot q_{k+1} + H_d\big(q_k, DS^{k+1}(q_{k+1})\big) = 0,)

where (p_{k+1} = DS^{k+1}(q_{k+1})) recovers the momentum, so the equation is discrete in time while remaining a partial differential equation in the configuration variables.
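The abstract notes that for discrete optimal control the theory recovers the Bellman equation, and that for linear systems it specializes to the discrete Riccati equation. The following minimal sketch (the system matrices A, B, Q, R are illustrative assumptions, not taken from the paper) shows how the quadratic value-function ansatz V_k(x) = x^T P_k x turns the Bellman recursion into the standard discrete Riccati recursion:

```python
import numpy as np

# Bellman recursion for the discrete LQR problem
#   x_{k+1} = A x_k + B u_k,  cost = sum_k (x_k^T Q x_k + u_k^T R u_k).
# With V_k(x) = x^T P_k x, minimizing over u_k in the Bellman equation
# yields the discrete Riccati recursion below.
def riccati_step(P, A, B, Q, R):
    # optimal feedback gain: u = -K x
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P_new = Q + A.T @ P @ A - A.T @ P @ B @ K
    return P_new, K

if __name__ == "__main__":
    A = np.array([[1.0, 0.1], [0.0, 1.0]])  # discretized double integrator
    B = np.array([[0.005], [0.1]])
    Q, R = np.eye(2), np.array([[1.0]])
    P = np.zeros((2, 2))
    for _ in range(500):  # iterate backward to the stationary solution
        P, K = riccati_step(P, A, B, Q, R)
    print(np.round(P, 3))
```

Iterating the recursion to a fixed point gives the discrete algebraic Riccati equation, whose solution defines the stationary value function and the stabilizing feedback gain.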