# The Magnus Expansion

*15 Jun 2017*

The goal of all real-time electronic dynamics methods is to solve the time-dependent Schrödinger equation (TDSE)

$$ i\hbar \frac{\partial \psi(t)}{\partial t} = H(t)\,\psi(t) $$

where $H(t)$ is the time-dependent Hamiltonian and $\psi(t)$ is the
time-dependent wave function. The goal of the Magnus expansion is to
find a general solution for the time-dependent wave function in the case
where $H(t)$ is time-dependent, and, more crucially, when $H(t)$ does not
commute with itself at different times, e.g. when
$[H(t_1), H(t_2)] \neq 0$. In the following we will follow
closely the notation of Blanes, *et al.*
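
To make the non-commutativity concrete, here is a small numerical aside (an illustration, not part of the original derivation): for two matrices that fail to commute, the exponential of a sum is not the product of the exponentials, which is exactly why the time-ordering of $H(t)$ cannot be ignored. The matrices $A$ and $B$ below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.linalg import expm

# Two non-commuting matrices (raising and lowering operators).
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

print(np.allclose(A @ B, B @ A))                    # False: [A, B] != 0
print(np.allclose(expm(A) @ expm(B), expm(A + B)))  # False: exponentials don't factor
```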

First, for simplicity we redefine $\tilde{H}(t) = -\frac{i}{\hbar}H(t)$ and introduce a scalar $\lambda$ as a bookkeeping device, so that

$$ \frac{\partial \psi(t)}{\partial t} = \lambda \tilde{H}(t)\,\psi(t) $$

At the heart of the Magnus expansion is the idea of solving the TDSE by using the quantum propagator $U(t, t_0)$ that connects wave functions at different times, e.g.

$$ \psi(t) = U(t, t_0)\, \psi(t_0) $$

Furthermore, the Magnus expansion assumes that $U(t, t_0)$ can be represented as an exponential,

$$ U(t, t_0) = \exp\left(\Omega(t, t_0)\right) $$

This yields the modified TDSE

$$ \frac{\partial}{\partial t} U(t, t_0) = \lambda \tilde{H}(t)\, U(t, t_0) $$

Now, for scalar $\tilde{H}$ and $U$, the above has a simple solution, namely

$$ U(t, t_0) = \exp\left( \lambda \int_{t_0}^{t} \tilde{H}(t')\, \mathrm{d}t' \right) $$

However, if $\tilde{H}$ and $U$ are matrices this is not necessarily true. In other words, for a given matrix $\Omega(t)$ the following expression does not necessarily hold:

$$ \frac{\partial}{\partial t} \exp\left(\Omega(t)\right) = \frac{\partial \Omega(t)}{\partial t} \exp\left(\Omega(t)\right) $$

because the matrix $\Omega(t)$ and its derivatives do not necessarily commute. Instead, Magnus proved that $\Omega(t)$ in general satisfies

$$ \frac{\partial \Omega(t)}{\partial t} = \sum_{n=0}^{\infty} \frac{B_n}{n!}\, \mathrm{ad}_{\Omega}^{\,n}\!\left( \lambda \tilde{H}(t) \right), \qquad \mathrm{ad}_{\Omega}^{0}(\tilde{H}) = \tilde{H}, \quad \mathrm{ad}_{\Omega}^{\,n+1}(\tilde{H}) = \left[ \Omega, \mathrm{ad}_{\Omega}^{\,n}(\tilde{H}) \right] $$

where $B_n$ are the Bernoulli numbers and $\mathrm{ad}_{\Omega}^{\,n}$ denotes the $n$-fold nested commutator. This equation may be solved by integration and iterative substitution of $\Omega(t)$. While it may appear that we are worse off than when we started, collecting like powers of $\lambda$ (and setting $\lambda = 1$) allows us to obtain a power-series expansion for $\Omega(t)$,

$$ \Omega(t) = \int_{t_0}^{t} \tilde{H}_1\, \mathrm{d}t_1 \;+\; \frac{1}{2} \int_{t_0}^{t} \mathrm{d}t_1 \int_{t_0}^{t_1} \mathrm{d}t_2\, \left[ \tilde{H}_1, \tilde{H}_2 \right] \;+\; \frac{1}{6} \int_{t_0}^{t} \mathrm{d}t_1 \int_{t_0}^{t_1} \mathrm{d}t_2 \int_{t_0}^{t_2} \mathrm{d}t_3 \left( \left[ \tilde{H}_1, \left[ \tilde{H}_2, \tilde{H}_3 \right] \right] + \left[ \tilde{H}_3, \left[ \tilde{H}_2, \tilde{H}_1 \right] \right] \right) + \cdots $$

This is the Magnus expansion, and here we have given terms up to third order. We have also made the notational simplification $\tilde{H}(t_i) \equiv \tilde{H}_i$. This is the basis for nearly all numerical methods to integrate the many-body TDSE in molecular physics. Each subsequent order in the Magnus expansion is a correction that accounts for the proper time-ordering of the Hamiltonian.
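
As a quick numerical check (a sketch with an assumed toy two-level Hamiltonian, $H(t) = \sigma_z + \cos(t)\,\sigma_x$ with $\hbar = 1$, not taken from the text), we can verify that adding the second Magnus term $\Omega_2$ to $\Omega_1$ brings $\exp(\Omega)$ much closer to the exact time-ordered propagator:

```python
import numpy as np
from scipy.linalg import expm

# Assumed toy two-level Hamiltonian (hbar = 1): H(t) = sigma_z + cos(t) sigma_x,
# so Htilde(t) = -i H(t) fails to commute with itself at different times.
sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)

def Htilde(t):
    return -1j * (sz + np.cos(t) * sx)

T, n = 0.5, 400
dt = T / n
ts = np.linspace(0.0, T, n + 1)
A = np.array([Htilde(t) for t in ts])

# Reference propagator: compose many small midpoint steps; this converges
# to the exact time-ordered exponential as dt -> 0.
U_ref = np.eye(2, dtype=complex)
for k in range(n):
    U_ref = expm(dt * Htilde(ts[k] + 0.5 * dt)) @ U_ref

# Omega1 = int_0^T Htilde(t) dt, by the trapezoid rule.
Omega1 = sum(0.5 * dt * (A[k] + A[k + 1]) for k in range(n))

# Omega2 = (1/2) int_0^T dt1 int_0^t1 dt2 [Htilde(t1), Htilde(t2)],
# by a simple Riemann double sum over the ordered region t2 < t1.
Omega2 = np.zeros((2, 2), dtype=complex)
for i in range(n + 1):
    for j in range(i):
        Omega2 += 0.5 * dt * dt * (A[i] @ A[j] - A[j] @ A[i])

err1 = np.linalg.norm(expm(Omega1) - U_ref)
err2 = np.linalg.norm(expm(Omega1 + Omega2) - U_ref)
print(err1, err2)  # err2 is markedly smaller than err1
```

Note that $\exp(\Omega_1)$ alone is already unitary, but it misses the time-ordering; the commutator term $\Omega_2$ supplies the leading correction.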

The Magnus expansion immediately suggests a route to many numerical integrators. The simplest would be to approximate the first term by

$$ \Omega_1(t + \Delta t, t) = \int_{t}^{t + \Delta t} \tilde{H}(t')\, \mathrm{d}t' \approx \Delta t\, \tilde{H}(t) $$

leading to a forward-Euler-like time integrator of

$$ \psi(t + \Delta t) = \exp\left( \Delta t\, \tilde{H}(t) \right) \psi(t) $$

which we can re-write as

$$ \psi_{k+1} = \exp\left( \Delta t\, \tilde{H}_{k} \right) \psi_{k} $$

where subscript $k$ gives the node of the time-step stencil. This gives a first-order method with error $\mathcal{O}(\Delta t)$. A more accurate second-order method can be constructed by approximating the first term in the Magnus expansion by the midpoint rule, leading to an $\mathcal{O}(\Delta t^2)$ time integrator

$$ \psi_{k+1} = \exp\left( \Delta t\, \tilde{H}_{k + 1/2} \right) \psi_{k} $$

Modifying the stencil to eliminate the need to evaluate the Hamiltonian at fractional time steps (e.g. changing the time step from $\Delta t$ to $2\Delta t$) leads to the modified midpoint unitary transformation (MMUT) method

$$ \psi_{k+1} = \exp\left( 2\Delta t\, \tilde{H}_{k} \right) \psi_{k-1} $$

which is a leapfrog-type unitary integrator. Note that the midpoint
method assumes $\tilde{H}(t)$ is linear over its time interval, so that the
higher-order terms (containing the commutators) in this approximation go to zero. There are many other types of integrators
based on the Magnus expansion that can be found in the
literature. The key point for all of these
integrators is that they are *symplectic*, meaning they preserve
phase-space relationships. This has the practical effect of conserving
energy (within some error bound) in long-time dynamics, whereas
non-symplectic methods such as Runge-Kutta will experience energetic
“drift” over long times. A final note: in each of these schemes it is
necessary to evaluate the exponential of the Hamiltonian. In real-time
methods, this requires computing a matrix exponential. This is not a
trivial task, and, aside from the construction of the Hamiltonian
itself, is often the most expensive step in the numerical solution of
the TDSE. However, many elegant solutions to the construction of the
matrix exponential can be found in the literature.
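
As an illustration of both points (a sketch with an assumed toy Hamiltonian, $H(t) = \sigma_z + \cos(t)\,\sigma_x$, $\hbar = 1$, not from the original post), the Euler-type and midpoint integrators above can be compared directly. Here the matrix exponential is formed by diagonalizing the Hermitian Hamiltonian, since $\exp(-i\,\Delta t\, H) = V e^{-i\,\Delta t\,\varepsilon} V^{\dagger}$ when $H = V \varepsilon V^{\dagger}$:

```python
import numpy as np

# Assumed toy two-level Hamiltonian (hbar = 1).
sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)

def H(t):
    return sz + np.cos(t) * sx

def U_step(Hmat, dt):
    """exp(-1j * dt * Hmat) for Hermitian Hmat, via eigendecomposition."""
    e, V = np.linalg.eigh(Hmat)
    return (V * np.exp(-1j * dt * e)) @ V.conj().T

def propagate(hamiltonian_at, psi0, T, n):
    """Step psi from 0 to T in n steps; hamiltonian_at(t, dt) chooses the
    Hamiltonian used for the step starting at time t."""
    dt = T / n
    psi = psi0.copy()
    for k in range(n):
        psi = U_step(hamiltonian_at(k * dt, dt), dt) @ psi
    return psi

psi0 = np.array([1.0, 0.0], dtype=complex)
T = 1.0
euler_H = lambda t, dt: H(t)            # first-order: left endpoint
mid_H = lambda t, dt: H(t + 0.5 * dt)   # second-order: midpoint rule

# Reference: very fine midpoint propagation stands in for the exact answer.
ref = propagate(mid_H, psi0, T, 10000)

errs = {}
for n in (100, 200):
    errs[n] = (np.linalg.norm(propagate(euler_H, psi0, T, n) - ref),
               np.linalg.norm(propagate(mid_H, psi0, T, n) - ref))
print(errs)  # halving dt roughly halves the Euler error but quarters the midpoint error
```

Because each step is a true unitary (an exact exponential of a Hermitian matrix), the norm of $\psi$ is conserved by construction in both schemes; the two methods differ only in how well they respect time-ordering.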

## References

Blanes, S., Casas, F., Oteo, J.A. and Ros, J., 2010. A pedagogical approach to the Magnus expansion. European Journal of Physics, 31(4), p.907.

Magnus, W., 1954. On the exponential solution of differential equations for a linear operator. Communications on pure and applied mathematics, 7(4), pp.649-673.