Recently, I wondered whether it is possible for a linear operator on a continuous (infinite-dimensional) vector space to have a finite determinant. By "continuous vector space", I mean one in which the identity operator can be resolved in terms of a complete orthonormal basis \( |\phi(x) \rangle \) for all \( x \), satisfying \( \langle \phi(x), \phi(x') \rangle = \delta(x - x') \), as \( \hat{1} = \int |\phi(x)\rangle\langle \phi(x)|~\mathrm{d}x \). If an operator \( \hat{A} \) has continuous matrix elements \( A(x, x') = \langle \phi(x), \hat{A}\phi(x') \rangle \), then it is easy to see that the condition for its trace \( \operatorname{trace}(\hat{A}) = \int A(x, x)~\mathrm{d}x \) to be finite is that the integral must converge: the "function" \( A(x, x) \) must asymptotically approach 0 strictly faster than \( 1/|x| \) as \( |x| \to \infty \), and any singularities at finite points \( x_{0} \) must diverge strictly slower than \( 1/|x - x_{0}| \). This can be seen as the continuum limit of a sum over the diagonal. However, the determinant is harder to express in this way because it involves signed products of matrix elements over all permutations, which are harder to express in a continuum space.
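To make the continuum-limit picture of the trace concrete, here is a quick numerical sketch. The diagonal kernel \( A(x, x) = e^{-x^{2}} \) is my own hypothetical example (it decays far faster than \( 1/|x| \) and has no singularities, so the trace should converge, here to \( \sqrt{\pi} \)):

```python
import numpy as np

# Hypothetical diagonal kernel A(x, x) = exp(-x^2), chosen because it
# decays strictly faster than 1/|x|, so the trace integral converges.
def diag_kernel(x):
    return np.exp(-x**2)

# Approximate trace(A) = ∫ A(x, x) dx as a Riemann sum over a grid --
# the continuum limit of summing the diagonal of a matrix.
for half_width in (5.0, 10.0, 20.0):
    x = np.linspace(-half_width, half_width, 4001)
    dx = x[1] - x[0]
    trace = np.sum(diag_kernel(x)) * dx
    print(f"half-width {half_width:5.1f}: trace ≈ {trace:.6f}")

print(f"exact value sqrt(pi) = {np.sqrt(np.pi):.6f}")
```

The sum stabilizes as the grid widens, which is exactly the convergence condition stated above.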

For this post, I will only consider Hermitian positive-definite operators. The conditions that I will list are sufficient for the determinant of such operators to exist, but I am not convinced that they are necessary. If such an operator has an eigenvalue decomposition \( \hat{A} = \int a(x) |\phi(x)\rangle\langle \phi(x)|~\mathrm{d}x \), where the vectors \( \{ |\phi(x) \rangle \} \) form a complete orthonormal basis and the eigenvalues satisfy \( a(x) > 0 \) for all \( x \), then one can make use of the identity \( \ln(\det(\hat{A})) = \operatorname{trace}(\ln(\hat{A})) \) to say that \( \ln(\det(\hat{A})) = \int \ln(a(x))~\mathrm{d}x \). For the right-hand side to converge, \( \ln(a(x)) \) must asymptotically approach 0 strictly faster than \( 1/|x| \) as \( |x| \to \infty \), which means that \( a(x) \) must asymptotically approach 1 strictly faster than \( \exp(1/x) \) does (which is *not* the same as \( e^{-x} \)), and \( \ln(a(x)) \) can at most have singularities at finite points \( x_{0} \) that diverge strictly slower than \( 1/|x - x_{0}| \), which means that \( a(x) \) must either diverge to \( \infty \) strictly slower than \( \exp(1/|x - x_{0}|) \) or drop to 0 strictly slower than \( \exp(-1/|x - x_{0}|) \). For example, \( a(x) = \exp(1/(x^{2} + x_{0}^{2})) \) fits the bill; note that this is *not* the same as the Gaussian kernel \( \exp(-(x^{2} + x_{0}^{2})) \). Intuitively, this condition makes sense: for a finite-dimensional diagonal matrix, as the dimension becomes arbitrarily large, the diagonal elements must mostly be exactly or very close to 1 for the determinant to neither grow arbitrarily large nor shrink to 0 with the dimension.
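As a sanity check on that example, the integral \( \int \ln(a(x))~\mathrm{d}x = \int \mathrm{d}x/(x^{2} + x_{0}^{2}) \) has the closed form \( \pi/|x_{0}| \), so the determinant should be \( \exp(\pi/|x_{0}|) \). A short numerical sketch (choosing \( x_{0} = 1 \), so the exact answer is \( \pi \)) bears this out:

```python
import numpy as np

# Eigenvalue function a(x) = exp(1/(x^2 + x0^2)) from the post:
# ln(a(x)) = 1/(x^2 + x0^2) decays like 1/x^2, strictly faster than
# 1/|x|, and has no singularities on the real line, so ln(det(A))
# should be finite.
x0 = 1.0
x = np.linspace(-1e4, 1e4, 2_000_001)  # wide grid; truncated tails ~ 2e-4
dx = x[1] - x[0]
log_a = 1.0 / (x**2 + x0**2)

# ln(det(A)) = ∫ ln(a(x)) dx, approximated as a Riemann sum.
log_det = np.sum(log_a) * dx

# Analytically, ∫ dx / (x^2 + x0^2) = pi/|x0|, so det(A) = exp(pi/|x0|).
print(f"ln(det(A)) ≈ {log_det:.4f}  (exact pi/x0 = {np.pi / x0:.4f})")
print(f"det(A)     ≈ {np.exp(log_det):.4f}")
```

So this operator, despite acting on an infinite-dimensional space, has the perfectly finite determinant \( e^{\pi} \approx 23.14 \).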

In finite-dimensional vector spaces, it is also easy to compute the determinants of triangular matrices simply as the products of the diagonal elements. (This is why the determinant is most often computed by an algorithm like first computing the LU decomposition and then taking the product of the diagonal elements of the upper-triangular matrix, which for an \( N \times N \) matrix involves \( O(N^{3}) \) operations, as opposed to the Leibniz formula over every permutation, which involves \( O(N! \cdot N) \) operations.) In infinite-dimensional vector spaces, the determinant of a matrix that is triangular in a countable basis can be computed similarly to the finite-dimensional case: if an operator \( \hat{A} \) in that basis has elements \( A_{ij} \), then using the definition \( \ln(|\det(\hat{A})|) = \sum_{i} \ln(|A_{ii}|) \), the determinant converges as long as the diagonal elements \( |A_{ii}| \) are mostly exactly or very close to 1, specifically such that as \( |i| \to \infty \), \( \ln(|A_{ii}|) \) decays to 0 strictly faster than \( 1/|i| \). (Note that \( i \) is an integer index written in slanted font, not the imaginary unit \( \operatorname{i} \) written in upright font.) However, I am not sure how to generalize this to operators that are expressed as triangular matrices in continuous bases.
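As an illustration of the countable-basis case, consider a hypothetical triangular operator (my own example) with diagonal entries \( A_{ii} = \exp(1/i^{2}) \) for \( i \geq 1 \): since \( \ln(|A_{ii}|) = 1/i^{2} \) decays strictly faster than \( 1/i \), the determinants of the \( N \times N \) truncations should converge, here to \( \exp(\pi^{2}/6) \) by the Basel sum:

```python
import numpy as np

# Hypothetical upper-triangular operator with diagonal entries
# A_ii = exp(1/i^2) for i = 1, 2, ...; its off-diagonal entries do
# not matter for the determinant of a triangular matrix.
def log_det_truncation(n):
    i = np.arange(1, n + 1)
    # For a triangular matrix the determinant is the product of the
    # diagonal, so ln|det| is the sum of ln|A_ii| = 1/i^2.
    return np.sum(1.0 / i**2)

for n in (10, 100, 10_000):
    print(f"N = {n:6d}: ln|det| ≈ {log_det_truncation(n):.6f}")

# The partial sums converge to sum 1/i^2 = pi^2/6 (the Basel problem),
# so the determinant of the full operator is exp(pi^2/6).
print(f"limit pi^2/6 = {np.pi**2 / 6:.6f}")
```

Had the diagonal instead decayed like \( \ln(|A_{ii}|) = 1/i \), the partial sums would grow like the harmonic series and the determinant would diverge, which is exactly the boundary the \( 1/|i| \) condition describes.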