T(1) theorem
In mathematics, the T(1) theorem, first proved by David and Journé (1984), describes when an operator T given by a kernel can be extended to a bounded linear operator on the Hilbert space L2(Rn). The name T(1) theorem refers to a condition on the distribution T(1), given by the operator T applied to the function 1.
Statement
Suppose that T is a continuous operator from Schwartz functions on Rn to tempered distributions, so that T is given by a kernel K which is a distribution. Assume that the kernel is standard, which means that off the diagonal it is given by a function satisfying certain size and smoothness conditions, made explicit after the list below. Then the T(1) theorem states that T can be extended to a bounded operator on the Hilbert space L2(Rn) if and only if the following conditions are satisfied:
- T(1) is of bounded mean oscillation (the BMO condition is recalled below).
- T*(1) is of bounded mean oscillation, where T* is the adjoint of T.
- T is weakly bounded, a weak condition that is easy to verify in practice; a concrete formulation is sketched after this list.
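For concreteness, here is the usual form of the standard kernel conditions referred to above. This is one common formulation rather than the only one; the Hölder exponent δ ∈ (0, 1] and the constant C > 0 are part of the assumption and vary between sources. Off the diagonal x = y, the kernel K is a function satisfying a size estimate and smoothness estimates in each variable:

\[
|K(x,y)| \le \frac{C}{|x-y|^{n}},
\]
\[
|K(x,y) - K(x',y)| \le \frac{C\,|x-x'|^{\delta}}{|x-y|^{n+\delta}} \quad \text{whenever } |x-x'| \le \tfrac{1}{2}|x-y|,
\]
\[
|K(x,y) - K(x,y')| \le \frac{C\,|y-y'|^{\delta}}{|x-y|^{n+\delta}} \quad \text{whenever } |y-y'| \le \tfrac{1}{2}|x-y|.
\]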
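For reference, a locally integrable function f on Rn is of bounded mean oscillation (in BMO) when its mean oscillation over cubes is uniformly bounded:

\[
\|f\|_{\mathrm{BMO}} = \sup_{Q} \frac{1}{|Q|} \int_{Q} \bigl| f(x) - f_{Q} \bigr| \, dx < \infty,
\qquad
f_{Q} = \frac{1}{|Q|} \int_{Q} f(y) \, dy,
\]

where the supremum is over all cubes Q in Rn. Since the constant function 1 is not a Schwartz function, T(1) and T*(1) must be interpreted as distributions defined modulo additive constants; this ambiguity is harmless because the BMO seminorm does not distinguish functions differing by a constant.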
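Weak boundedness also admits a concrete formulation, though the exact class of test functions varies between sources; the following is one common version. T has the weak boundedness property if there is a constant C such that for all x_0 in Rn, all r > 0, and all smooth functions φ, ψ supported in the unit ball whose derivatives up to some fixed order are bounded by 1,

\[
\bigl| \langle T \varphi_{x_0, r}, \psi_{x_0, r} \rangle \bigr| \le C \, r^{n},
\qquad \text{where } \varphi_{x_0, r}(x) = \varphi\!\left(\frac{x - x_0}{r}\right),
\]

so that the pairing grows no faster than the volume of the ball of radius r on which the rescaled bump functions live.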