# Antiderivative

Figure: the slope field of F(x) = x³/3 − x²/2 − x + C, showing three of the infinitely many solutions that can be produced by varying the arbitrary constant C.


In calculus, an antiderivative, primitive integral or indefinite integral of a function f is a differentiable function F whose derivative is equal to f, i.e., F ′ = f. The process of solving for antiderivatives is called antidifferentiation (or indefinite integration) and its opposite operation is called differentiation, which is the process of finding a derivative. Antiderivatives are related to definite integrals through the fundamental theorem of calculus: the definite integral of a function over an interval is equal to the difference between the values of an antiderivative evaluated at the endpoints of the interval.

The discrete equivalent of the notion of antiderivative is antidifference.

## Example

The function F(x) = x³/3 is an antiderivative of f(x) = x², since the derivative of x³/3 is x². Because the derivative of a constant is zero, x² has infinitely many antiderivatives, such as x³/3 + 0, x³/3 + 7, x³/3 − 42, and x³/3 + 293. Thus, all antiderivatives of x² can be obtained by changing the value of C in F(x) = x³/3 + C, where C is an arbitrary constant known as the constant of integration. The graphs of the antiderivatives of a given function are vertical translations of each other, with each graph's vertical position determined by the value of C.
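A quick numeric sketch of this fact: every choice of the constant C gives a function whose derivative is still x². (The central-difference derivative and the particular test points are arbitrary choices for illustration.)

```python
def numerical_derivative(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

f = lambda x: x**2

# Antiderivatives differing only by a constant C all have derivative f.
for C in (0, 7, -42, 293):
    F = lambda x, C=C: x**3 / 3 + C
    for x in (-2.0, 0.5, 3.0):
        assert abs(numerical_derivative(F, x) - f(x)) < 1e-4
```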

In physics, the integration of acceleration yields velocity plus a constant. The constant is the initial velocity term that would be lost upon taking the derivative of velocity because the derivative of a constant term is zero. This same pattern applies to further integrations and derivatives of motion (position, velocity, acceleration, and so on).
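For instance, under constant acceleration a the velocity is v(t) = v₀ + a·t, and the initial velocity v₀ is exactly the constant of integration. A minimal numeric illustration (the values a = 9.8 and v0 = 3.0 are arbitrary example values, not from the text):

```python
a = 9.8   # constant acceleration (m/s^2), arbitrary example value
v0 = 3.0  # initial velocity (m/s) -- the constant of integration

def velocity(t):
    # Antiderivative of the constant function a, plus the constant v0.
    return v0 + a * t

# Differentiating velocity recovers the acceleration; v0 is lost,
# because the derivative of a constant term is zero.
h = 1e-6
t = 2.0
dv_dt = (velocity(t + h) - velocity(t - h)) / (2 * h)
assert abs(dv_dt - a) < 1e-6
```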

## Uses and properties

Antiderivatives are important because they can be used to compute definite integrals, using the fundamental theorem of calculus: if F is an antiderivative of the integrable function f and f is continuous over the interval [a, b], then:

$\int _{a}^{b}f(x)\,\mathrm {d} x=F(b)-F(a).$

Because of this, each of the infinitely many antiderivatives of a given function f is sometimes called the "general integral" or "indefinite integral" of f and is written using the integral symbol with no bounds:

$\int f(x)\,\mathrm {d} x.$

If F is an antiderivative of f, and the function f is defined on some interval, then every other antiderivative G of f differs from F by a constant: there exists a number C such that G(x) = F(x) + C for all x. C is called the arbitrary constant of integration. If the domain of F is a disjoint union of two or more intervals, then a different constant of integration may be chosen for each of the intervals. For instance

$F(x)={\begin{cases}-{\frac {1}{x}}+C_{1}\quad x<0\\-{\frac {1}{x}}+C_{2}\quad x>0\end{cases}}$

is the most general antiderivative of f(x) = 1/x², whose domain is the disjoint union of the intervals (−∞, 0) and (0, ∞).

Every continuous function f has an antiderivative, and one antiderivative F is given by the definite integral of f with variable upper boundary:

$F(x)=\int _{0}^{x}f(t)\,\mathrm {d} t.$

Varying the lower boundary produces other antiderivatives (but not necessarily all possible antiderivatives). This is another formulation of the fundamental theorem of calculus.
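A sketch of this construction, approximating F(x) = ∫₀ˣ f(t) dt with a midpoint Riemann sum and checking that its numerical derivative recovers f. (The choice f(t) = cos t is arbitrary; for it, F should agree with sin x, the antiderivative vanishing at 0.)

```python
import math

f = math.cos

def F(x, n=10_000):
    """Approximate F(x) = integral of f from 0 to x (midpoint rule)."""
    dt = x / n
    return sum(f((i + 0.5) * dt) for i in range(n)) * dt

# F agrees with sin(x), the antiderivative of cos that vanishes at 0 ...
assert abs(F(1.0) - math.sin(1.0)) < 1e-6
# ... and the numerical derivative of F recovers f, as the
# fundamental theorem of calculus asserts.
h = 1e-4
assert abs((F(2.0 + h) - F(2.0 - h)) / (2 * h) - f(2.0)) < 1e-4
```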

There are many functions whose antiderivatives, even though they exist, cannot be expressed in terms of elementary functions (like polynomials, exponential functions, logarithms, trigonometric functions, inverse trigonometric functions and their combinations). Examples of these are

$\int e^{-x^{2}}\,\mathrm {d} x,\qquad \int \sin x^{2}\,\mathrm {d} x,\qquad \int {\frac {\sin x}{x}}\,\mathrm {d} x,\qquad \int {\frac {1}{\ln x}}\,\mathrm {d} x,\qquad \int x^{x}\,\mathrm {d} x.$

From left to right, the first four are the error function, the Fresnel integral, the sine integral, and the logarithmic integral function.
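For example, the antiderivative of e^(−x²) is not elementary, but up to scaling it is the error function, which Python exposes as `math.erf`: ∫₀ˣ e^(−t²) dt = (√π/2) erf(x). A numeric check against a Riemann sum (the evaluation point x = 1 is an arbitrary choice):

```python
import math

f = lambda t: math.exp(-t * t)

x = 1.0
n = 100_000
dt = x / n
# Midpoint Riemann sum for the integral of exp(-t^2) from 0 to x.
integral = sum(f((i + 0.5) * dt) for i in range(n)) * dt

# The same value expressed via the (non-elementary) error function.
assert abs(integral - math.sqrt(math.pi) / 2 * math.erf(x)) < 1e-9
```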

## Techniques of integration

Finding antiderivatives of elementary functions is often considerably harder than finding their derivatives. For some elementary functions, it is impossible to find an antiderivative in terms of other elementary functions. See the article on elementary functions for further information.

There are various methods available, including:

• the linearity of integration, which breaks complicated integrals into simpler ones
• integration by substitution, often combined with trigonometric identities or the natural logarithm
• integration by parts, to integrate products of functions
• the method of partial fractions, to integrate rational functions
• the Cauchy formula for repeated integration, which expresses the n-fold antiderivative of a function as a single integral:

$\int _{x_{0}}^{x}\int _{x_{0}}^{x_{1}}\dots \int _{x_{0}}^{x_{n-1}}f(x_{n})\,\mathrm {d} x_{n}\dots \,\mathrm {d} x_{2}\,\mathrm {d} x_{1}=\int _{x_{0}}^{x}f(t){\frac {(x-t)^{n-1}}{(n-1)!}}\,\mathrm {d} t.$

## Antiderivatives of non-continuous functions

Non-continuous functions can have antiderivatives. While there are still open questions in this area, it is known that:

• Some highly pathological functions with large sets of discontinuities may nevertheless have antiderivatives.
• In some cases, the antiderivatives of such pathological functions may be found by Riemann integration, while in other cases these functions are not Riemann integrable.
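As a concrete (if mild) instance of a discontinuous function with an antiderivative, consider the classic example F(x) = x² sin(1/x) with F(0) = 0, which is differentiable everywhere, yet whose derivative f oscillates and is discontinuous at 0. (This example is an illustration of mine, not drawn from the text above.)

```python
import math

def F(x):
    """F(x) = x^2 * sin(1/x), extended by F(0) = 0."""
    return x * x * math.sin(1.0 / x) if x != 0 else 0.0

def f(x):
    # Derivative of F, computed by hand; discontinuous at 0.
    if x == 0:
        return 0.0
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# F'(0) = 0: the difference quotient F(h)/h = h*sin(1/h) is bounded by |h|.
for h in (1e-3, 1e-6, 1e-9):
    assert abs(F(h) / h) <= abs(h) + 1e-15

# f oscillates between roughly -1 and 1 arbitrarily close to 0
# (so it is not continuous there), yet it has the antiderivative F.
values = [f(1.0 / (k * math.pi)) for k in range(100, 110)]
assert max(values) > 0.9 and min(values) < -0.9
```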

Assuming that the domains of the functions are open intervals:

• A necessary, but not sufficient, condition for a function f to have an antiderivative is that f have the intermediate value property. That is, if [a, b] is a subinterval of the domain of f and C is any real number between f(a) and f(b), then f(c) = C for some c between a and b. To see this, let F be an antiderivative of f and consider the continuous function
$g(x)=F(x)-Cx$ on the closed interval [a, b]. Then g must attain either a maximum or a minimum at some c in the open interval (a, b), and so

$0=g'(c)=f(c)-C.$

• If f has an antiderivative F on a closed interval [a, b], then for any partition a = x₀ < x₁ < … < xₙ = b, choosing the sample points $x_{i}^{*}$ as specified by the mean value theorem (so that $f(x_{i}^{*})(x_{i}-x_{i-1})=F(x_{i})-F(x_{i-1})$) makes the Riemann sum telescope to the value F(b) − F(a):

{\begin{aligned}\sum _{i=1}^{n}f(x_{i}^{*})(x_{i}-x_{i-1})&=\sum _{i=1}^{n}[F(x_{i})-F(x_{i-1})]\\&=F(x_{n})-F(x_{0})=F(b)-F(a)\end{aligned}}

However, if f is unbounded, or if f is bounded but the set of discontinuities of f has positive Lebesgue measure, a different choice of sample points $x_{i}^{*}$ may give a significantly different value for the Riemann sum, no matter how fine the partition. See Example 4 below.
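A minimal sketch of this telescoping, using f(x) = x² with F(x) = x³/3, where the mean-value sample point in each subinterval can be solved for in closed form: x* = sqrt((x₁³ − x₀³)/(3·(x₁ − x₀))). (The interval [0, 2] and the coarse partition are arbitrary choices.)

```python
import math

f = lambda x: x**2
F = lambda x: x**3 / 3  # an antiderivative of f

a, b, n = 0.0, 2.0, 10
xs = [a + (b - a) * i / n for i in range(n + 1)]

riemann = 0.0
for x0, x1 in zip(xs, xs[1:]):
    # Mean value theorem: some x* in (x0, x1) satisfies
    # f(x*) * (x1 - x0) = F(x1) - F(x0); for f(x) = x^2 it is explicit.
    x_star = math.sqrt((x1**3 - x0**3) / (3 * (x1 - x0)))
    riemann += f(x_star) * (x1 - x0)

# Even with a coarse partition (n = 10), the sum telescopes exactly
# to F(b) - F(a) = 8/3, up to floating-point rounding.
assert abs(riemann - (F(b) - F(a))) < 1e-12
```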