{{two other uses|generalized functions in mathematical analysis|the probability meaning|Probability distribution|artificial landscapes|Test functions for optimization|other uses|Distribution (disambiguation)}}
|
| |
|
In [[mathematical analysis]], '''distributions''' (or '''[[generalized functions]]''') are objects that generalize [[function (mathematics)|function]]s. Distributions make it possible to [[derivative|differentiate]] functions whose derivatives do not exist in the classical sense. In particular, any [[locally integrable]] function has a distributional derivative. Distributions are widely used to formulate generalized solutions of [[partial differential equation]]s. Where a classical solution may not exist or may be very difficult to establish, a distributional solution is often much easier to obtain. Distributions are also important in [[physics]] and [[engineering]], where many problems naturally lead to differential equations whose solutions or initial conditions are distributions, such as the [[Dirac delta]] function (which is historically called a "function" even though it is not considered a proper function mathematically).
| | |
| According to {{harvtxt|Kolmogorov|Fomin|1999}}, generalized functions were introduced by [[Sergei Lvovich Sobolev|Sergei Sobolev]] in 1935-1936. They were re-introduced in the late 1940s by [[Laurent Schwartz]], who developed a comprehensive theory of distributions.
| |
| | |
[[Hyperfunction|Sato's hyperfunctions]] are another extension of distributions.
| |
| | |
| == Basic idea ==
| |
| | |
| [[Image:Mollifier illustration.png|right|thumb|280px|A typical test function, the [[bump function]] Ψ(''x''). It is [[smooth function|smooth]] (infinitely differentiable) and has [[compact support]] (is zero outside an interval, in this case the interval [−1, 1]).]]
| |
| | |
| Distributions are a class of [[linear functional]]s that map a set of ''test functions'' (conventional and [[well-behaved]] functions) onto the set of real numbers. In the simplest case, the set of test functions considered is D('''R'''), which is the set of functions φ : '''R''' → '''R''' having two properties:
| |
| * φ is [[smooth function|smooth]] (infinitely differentiable);
| |
| * φ has [[compact support]] (is identically zero outside some bounded interval).
| |
| Then, a distribution ''d'' is a linear mapping D('''R''') → '''R'''. Instead of writing ''d''(φ), where φ is a test function in D('''R'''), it is conventional to write <math>\langle d,\varphi \rangle</math>. A simple example of a distribution is the [[Dirac delta]] δ, defined by
| |
| | |
| : <math>\delta(\varphi) = \left\langle \delta, \varphi \right\rangle = \varphi(0).</math>
| |
| | |
| There are straightforward mappings from both [[locally integrable function]]s and [[probability distribution]]s to corresponding distributions, as discussed below. However, not all distributions can be formed in this manner.
| |
| | |
| Suppose that ''f'' : '''R''' → '''R''' is a [[locally integrable function]], and let φ : '''R''' → '''R''' be a test function in D('''R'''). We can then define a corresponding distribution ''T<sub>f</sub>'' by:
| |
| | |
: <math>\left\langle T_{f}, \varphi \right\rangle = \int_\mathbf{R} f(x) \varphi(x) \,dx. </math>
| | |
| This integral is a [[real number]] which [[linear operator|linearly]] and [[Continuous function|continuously]] depends on φ. This suggests the requirement that a distribution should be linear and continuous over the space of test functions D('''R'''), which completes the definition. In a conventional [[abuse of notation]], ''f'' may be used to represent both the original function ''f'' and the distribution ''T<sub>f</sub>'' derived from it.
| |
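The pairing <math>\langle T_f, \varphi\rangle</math> is an ordinary Lebesgue integral, so it can be explored numerically. The sketch below is a purely illustrative computation (the bump function and the use of SciPy's <code>quad</code> routine are ad hoc choices, not part of the theory): it pairs a standard bump test function with the locally integrable function ''f''(''x'') = |''x''|, and it shows how pairing against narrower and narrower Gaussians approaches ⟨δ, φ⟩ = φ(0), even though δ itself is given by no function.

<syntaxhighlight lang="python">
# Purely numerical illustration, not part of the formal theory.
import math
from scipy.integrate import quad

def phi(x):
    """A standard bump test function: smooth and zero outside (-1, 1)."""
    return math.exp(-1.0 / (1.0 - x * x)) if abs(x) < 1.0 else 0.0

# The pairing <T_f, phi> for the locally integrable function f(x) = |x|
# is just the integral of |x| * phi(x).
pairing, _ = quad(lambda x: abs(x) * phi(x), -1.0, 1.0, points=[0.0])
print("<T_|x|, phi> ~=", pairing)

# The Dirac delta is not given by any function, but pairing phi against
# narrower and narrower Gaussians approaches <delta, phi> = phi(0).
for eps in (0.5, 0.1, 0.02):
    gauss = lambda t, e=eps: math.exp(-t * t / (2 * e * e)) / (e * math.sqrt(2 * math.pi))
    val, _ = quad(lambda t: gauss(t) * phi(t), -1.0, 1.0, points=[0.0])
    print("eps =", eps, "  pairing ~=", round(val, 6), "  phi(0) =", round(phi(0.0), 6))
</syntaxhighlight>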
| | |
| Similarly, if ''p'' is a [[probability distribution]] on the reals and φ is a test function, then a corresponding distribution ''T<sub>p</sub>'' may be defined by:
| |
| | |
| : <math>\left\langle T_p, \varphi \right\rangle = \int_{\mathbf{R}} \varphi\, dp </math>
| |
| | |
| Again, this integral continuously and linearly depends on φ, so that ''T<sub>p</sub>'' is in fact a distribution.
| |
| | |
| Such distributions may be multiplied with real numbers and can be added together, so they form a real [[vector space]]. In general it is not possible to define a multiplication for distributions, but distributions may be multiplied with infinitely differentiable functions.
| |
| | |
It is desirable to choose a definition for the derivative of a distribution which, at least for distributions derived from continuously differentiable functions, has the property that <math>T'_f = T_{f'}</math>. If φ is a test function, we can use [[integration by parts]] to see that
| |
| | |
| :<math>\left\langle f', \varphi\right\rangle = \int_{\mathbf{R}}{}{f'\varphi \,dx} = \left[ f(x) \varphi(x) \right]_{-\infty}^\infty - \int_{\mathbf{R}}{}{f\varphi' \,dx} = -\left\langle f, \varphi' \right\rangle</math>
| |
| | |
| where the last equality follows from the fact that φ is zero outside of a bounded set. This suggests that if ''S'' is a ''distribution'', we should define its derivative ''S''′ by
| |
| | |
| : <math>\left\langle S', \varphi \right\rangle = - \left\langle S, \varphi' \right\rangle.</math>
| |
| | |
| It turns out that this is the proper definition; it extends the ordinary definition of derivative, every distribution becomes infinitely differentiable and the usual properties of derivatives hold.
| |
| | |
| '''Example:''' Recall that the [[Dirac delta]] (so-called Dirac delta function) is the distribution defined by
| |
| | |
| : <math>\left\langle \delta, \varphi \right\rangle = \varphi(0)</math>
| |
| | |
| It is the derivative of the distribution corresponding to the [[Heaviside step function]] ''H'': For any test function φ,
| |
| | |
| : <math>\left\langle H', \varphi \right\rangle = - \left\langle H, \varphi' \right\rangle = - \int_{-\infty}^{\infty} H(x) \varphi'(x) dx = - \int_{0}^{\infty} \varphi'(x) dx = \varphi(0) - \varphi(\infty) = \varphi(0) = \left\langle \delta, \varphi \right\rangle,</math>
| |
| | |
so ''H''′ = δ. Note that φ(∞) = 0 because φ has compact support. Similarly, the derivative of the Dirac delta is the distribution
| |
| | |
| :<math>\langle\delta',\varphi\rangle= -\varphi'(0).</math>
| |
| | |
| This latter distribution is our first example of a distribution which is derived from neither a function nor a probability distribution.
| |
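As a further worked example, consider the absolute value function ''f''(''x'') = |''x''|, which is locally integrable but not differentiable at the origin in the classical sense. For any test function φ, integrating by parts separately on (−∞, 0] and [0, ∞) gives

: <math>\left\langle |x|', \varphi \right\rangle = -\int_{-\infty}^{\infty} |x|\,\varphi'(x)\,dx = \int_{0}^{\infty}\varphi(x)\,dx - \int_{-\infty}^{0}\varphi(x)\,dx = \int_{-\infty}^{\infty}\operatorname{sgn}(x)\,\varphi(x)\,dx,</math>

so the distributional derivative of |''x''| is the [[sign function]]. Since sgn = 2''H'' − 1, differentiating once more gives |''x''|″ = 2δ.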
| | |
| == Test functions and distributions ==
| |
| | |
| In the following, real-valued distributions on an [[open set|open subset]] ''U'' of '''R'''<sup>''n''</sup> will be formally defined. With minor modifications, one can also define complex-valued distributions, and one can replace '''R'''<sup>''n''</sup> by any ([[paracompactness|paracompact]]) [[smooth manifold]].
| |
| | |
| The first object to define is the space D(''U'') of test functions on ''U''. Once this is defined, it is then necessary to equip it with a [[topology]] by defining the [[limit of a sequence]] of elements of D(''U''). The space of distributions will then be given as the space of [[continuous linear functional]]s on D(''U'').
| |
| | |
| === Test function space ===
| |
| | |
| The space D(''U'') of '''test functions''' on ''U'' is defined as follows. A function φ : ''U'' → '''R''' is said to have [[Compact support#Compact_support|compact support]] if there exists a [[compact space|compact]] subset ''K'' of ''U'' such that φ(''x'') = 0 for all ''x'' in ''U'' \ ''K''. The elements of D(''U'') are the infinitely differentiable functions φ : ''U'' → '''R''' with compact support – also known as [[bump function]]s. This is a real [[vector space]]. It can be given a [[topology]] by defining the [[limit of a sequence]] of elements of D(''U''). A sequence (φ<sub>''k''</sub>) in D(''U'') is said to converge to φ ∈ D(''U'') if the following two conditions hold {{harv|Gelfand|Shilov|1966–1968|loc=v. 1, §1.2}}:
| |
| | |
| * There is a compact set ''K'' ⊂ ''U'' containing the supports of all φ<sub>''k''</sub>:
| |
| | |
| ::<math>\bigcup\nolimits_k \operatorname{supp}(\varphi_k)\subset K.</math>
| |
| | |
| * For each [[multi-index]] α, the sequence of partial derivatives ''D''<sup>α</sup>φ<sub>''k''</sub> tends [[uniform convergence|uniformly]] to ''D''<sup>α</sup>φ.
| |
| | |
| With this definition, D(''U'') becomes a [[completeness (topology)|complete]] [[locally convex]] [[topological vector space]] satisfying the [[Heine–Borel theorem|Heine–Borel property]] {{harv|Rudin|1991|loc=§6.4–5}}.
| |
| | |
| This topology can be placed in the context of the following general construction: let
| |
| | |
| :<math>X = \bigcup\nolimits_i X_i</math>
| |
| | |
| be a countable increasing union of locally convex topological vector spaces and ι<sub>''i''</sub> : ''X<sub>i</sub>'' → ''X'' be the inclusion maps. In this context, the [[inductive limit]] topology, or [[final topology]], τ on ''X'' is the finest locally convex vector space topology making all the inclusion maps <math>\iota_i</math> continuous. The topology τ can be explicitly described as follows: let β be the collection of convex balanced subsets ''W'' of ''X'' such that ''W'' ∩ ''X<sub>i</sub>'' is open for all ''i''. A base for the inductive limit topology τ then consists of the sets of the form ''x'' + ''W'', where ''x'' in ''X'' and ''W'' in β.
| |
| | |
The proof that τ is a vector space topology makes use of the assumption that each ''X<sub>i</sub>'' is locally convex. By construction, β is a local base for τ. Any locally convex vector space topology on ''X'' under which the inclusion maps remain continuous is necessarily contained in τ, so τ is the finest such topology. One can also show that, for each ''i'', the subspace topology ''X<sub>i</sub>'' inherits from τ coincides with its original topology. When each ''X<sub>i</sub>'' is a [[Fréchet space]], (''X'', τ) is called an [[LF space]].
| |
| | |
| Now let ''U'' be the union of ''U<sub>i</sub>'' where {''U<sub>i</sub>''} is a countable nested family of open subsets of ''U'' with compact closures ''K<sub>i</sub>'' = {{overline|''U''}}<sub>''i''</sub>. Then we have the countable increasing union
| |
| | |
| :<math>\mathrm{D}(U) = \bigcup\nolimits_i \mathrm{D}_{K_i} </math>
| |
| | |
| where D<sub>''K<sub>i</sub>''</sub> is the set of all smooth functions on ''U'' with support lying in ''K<sub>i</sub>''. On each D<sub>''K<sub>i</sub>''</sub>, consider the topology given by the seminorms
| |
| | |
:<math>\| \varphi \|_{\alpha} = \max_{x \in K_i} \left |D^{\alpha} \varphi \right | ,</math>
| |
| | |
| i.e. the topology of uniform convergence of derivatives of arbitrary order. This makes each D<sub>''K<sub>i</sub>''</sub> a [[Fréchet space]]. The resulting [[LF space]] structure on D(''U'') is the topology described in the beginning of the section.
| |
| | |
| On D(''U''), one can also consider the topology given by the [[seminorm]]s
| |
| | |
:<math>\| \varphi \|_{\alpha, K_i} = \max_{x \in K_i} \left |D^{\alpha} \varphi \right | .</math>
| |
| | |
However, this topology has the disadvantage of not being complete. On the other hand, because of the particular features of the D<sub>''K<sub>i</sub>''</sub>, a set is bounded with respect to τ if and only if it is contained in some D<sub>''K<sub>i</sub>''</sub> and is bounded there. The completeness of (D(''U''), τ) then follows from that of the D<sub>''K<sub>i</sub>''</sub>.
| |
| | |
| The topology τ is not [[metrizable]] by the [[Baire category theorem]], since D(''U'') is the union of subspaces of the [[first category]] in D(''U'') {{harv|Rudin|1991|loc=§6.9}}.
| |
| | |
| === Distributions ===
| |
| | |
| A '''distribution''' on ''U'' is a [[linear functional]] ''S'' : D(''U'') → '''R''' (or ''S'' : D(''U'') → '''C'''), such that
| |
| | |
| :<math>\lim_{n\to\infty}S(\varphi_n)= S\left(\lim_{n\to\infty}\varphi_n\right)</math>
| |
| | |
| for any convergent sequence φ<sub>''n''</sub> in D(''U''). The space of all distributions on ''U'' is denoted by D′(''U''). Equivalently, the vector space D′(''U'') is the [[continuous dual space]] of the topological vector space D(''U'').
| |
| | |
| The dual pairing between a distribution ''S'' in D′(''U'') and a test function φ in D(''U'') is denoted using [[angle brackets]] thus:
| |
| | |
| :<math>\begin{cases}
| |
| \mathrm{D}'(U) \times \mathrm{D}(U) \to \mathbf{R} \\
| |
| (S, \varphi) \mapsto \langle S, \varphi \rangle.
| |
| \end{cases}</math>
| |
| | |
| Equipped with the [[weak-* topology]], the space D′(''U'') is a [[locally convex]] topological vector space. In particular, a sequence (''S<sub>k</sub>'') in D′(''U'') converges to a distribution ''S'' if and only if
| |
| | |
| :<math>\langle S_k, \varphi\rangle \to \langle S, \varphi\rangle</math>
| |
| | |
| for all test functions φ. This is the case if and only if ''S<sub>k</sub>'' [[uniform convergence|converges uniformly]] to ''S'' on all bounded subsets of D(''U''). (A subset ''E'' of D(''U'') is bounded if there exists a compact subset ''K'' of ''U'' and numbers ''d<sub>n</sub>'' such that every φ in ''E'' has its support in ''K'' and has its ''n''-th derivatives bounded by ''d<sub>n</sub>''.)
| |
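Convergence in D′(''U'') is much weaker than pointwise or uniform convergence of functions. For example, the functions sin(''kx''), viewed as distributions on '''R''', converge to the zero distribution as ''k'' → ∞, since ⟨sin(''kx''), φ⟩ = ∫ sin(''kx'')φ(''x'') d''x'' → 0 for every test function φ (by the [[Riemann–Lebesgue lemma]], or by repeated integration by parts), even though sin(''kx'') converges at almost no point. The following sketch is a numerical illustration only; the shifted bump function and the use of SciPy's oscillatory quadrature are ad hoc choices.

<syntaxhighlight lang="python">
# Illustrative sketch: the functions sin(kx) converge to 0 in D'(R),
# because <sin(kx), phi> -> 0 for every test function phi, even though
# sin(kx) does not converge pointwise.
import math
from scipy.integrate import quad

def bump(x):
    """Smooth function, zero outside (-1, 1)."""
    return math.exp(-1.0 / (1.0 - x * x)) if abs(x) < 1.0 else 0.0

phi = lambda x: bump(x - 0.5)    # test function supported in (-0.5, 1.5)

for k in (1, 10, 100, 1000):
    # quad's 'sin' weight integrates phi(x) * sin(k x) over the support
    val, _ = quad(phi, -0.5, 1.5, weight='sin', wvar=k)
    print("k =", k, "   <sin(kx), phi> ~=", val)
</syntaxhighlight>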
| | |
| === Functions as distributions ===
| |
| | |
| The function ''f'' : ''U'' → '''R''' is called '''locally integrable''' if it is [[Lebesgue integration|Lebesgue integrable]] over every compact subset ''K'' of ''U''. This is a large class of functions which includes all continuous functions and all [[Lp space|''L<sup>p</sup>'' functions]]. The topology on D(''U'') is defined in such a fashion that any locally integrable function ''f'' yields a continuous linear functional on D(''U'') – that is, an element of D′(''U'') – denoted here by ''T<sub>f</sub>'', whose value on the test function φ is given by the Lebesgue integral:
| |
| | |
| :<math>\langle T_f,\varphi \rangle = \int_U f\varphi\,dx.</math>
| |
| | |
| Conventionally, one [[abuse of notation|abuses notation]] by identifying ''T<sub>f</sub>'' with ''f'', provided no confusion can arise, and thus the pairing between ''f'' and φ is often written
| |
| | |
| :<math>\langle f, \varphi\rangle = \langle T_f,\varphi\rangle.</math>
| |
| | |
| If ''f'' and ''g'' are two locally integrable functions, then the associated distributions ''T<sub>f</sub>'' and ''T<sub>g</sub>'' are equal to the same element of D′(''U'') if and only if ''f'' and ''g'' are equal [[almost everywhere]] (see, for instance, {{harvtxt|Hörmander|1983|loc=Theorem 1.2.5}}). In a similar manner, every [[Radon measure]] μ on ''U'' defines an element of D′(''U'') whose value on the test function φ is ∫φdμ. As above, it is conventional to abuse notation and write the pairing between a Radon measure μ and a test function φ as ⟨μ, φ⟩. Conversely, as shown in a theorem by Schwartz (similar to the [[Riesz representation theorem]]), every distribution which is non-negative on non-negative functions is of this form for some (positive) Radon measure.
| |
| | |
| The test functions are themselves locally integrable, and so define distributions. As such they are [[dense (topology)|dense]] in D′(''U'') with respect to the topology on D′(''U'') in the sense that for any distribution ''S'' ∈ D′(''U''), there is a sequence φ<sub>''n''</sub> ∈ D(''U'') such that
| |
| | |
| :<math>\langle\varphi_n,\psi\rangle\to \langle S,\psi\rangle</math>
| |
| | |
| for all ψ ∈ D(''U''). This follows at once from the [[Hahn–Banach theorem]], since by an elementary fact about weak topologies the dual of D′(''U'') with its weak-* topology is the space D(''U'') {{harv|Rudin|1991|loc=Theorem 3.10}}. This can also be proven more constructively by a convolution argument.
| |
| | |
| == Operations on distributions ==
| |
| | |
| Many operations which are defined on smooth functions with compact support can also be defined for distributions. In general, if ''T'' : D(''U'') → D(''U'') is a linear mapping of vector spaces which is continuous with respect to the weak-* topology, then it is possible to extend ''T'' to a mapping ''T'' : D′(''U'') → D′(''U'') by passing to the limit. (This approach works for more general non-linear mappings as well, provided they are assumed to be [[uniformly continuous]].)
| |
| | |
| In practice, however, it is more convenient to define operations on distributions by means of the [[adjoint of an operator|transpose]] (or adjoint transformation) ({{harvnb|Strichartz|1994|loc=§2.3}}; {{harvnb|Trèves|1967}}). If ''T'' : D(''U'') → D(''U'') is a continuous linear operator, then the transpose is an operator ''T*'' : D(''U'') → D(''U'') such that
| |
| | |
| :<math>\langle T\varphi,\psi\rangle = \langle\varphi, T^*\psi\rangle</math>
| |
| | |
| for all φ, ψ ∈ D(''U''). If such an operator ''T*'' exists, and is continuous, then the original operator ''T'' may be extended to distributions by defining
| |
| | |
| :<math>Tf(\psi) = f(T^*\psi).\,</math>
| |
| | |
| === Differentiation ===
| |
| | |
If ''T'' : D(''U'') → D(''U'') is given by the partial derivative operator
| |
| | |
| :<math>T\varphi = \frac{\partial\varphi}{\partial x_k}.</math>
| |
| | |
then integration by parts shows that, for all φ and ψ in D(''U''),
| |
| | |
| :<math>\langle T\varphi,\psi\rangle=\left\langle\frac{\partial\varphi}{\partial x_k},\psi\right\rangle = -\left\langle\varphi,\frac{\partial\psi}{\partial x_k}\right\rangle</math>
| |
| | |
| so that ''T*'' = −''T''. This is a continuous linear transformation D(''U'') → D(''U''). So, if ''S'' ∈ D′(''U'') is a distribution, then the partial derivative of ''S'' with respect to the coordinate ''x<sub>k</sub>'' is defined by the formula
| |
| | |
| :<math>\left\langle \frac{\partial S}{\partial x_{k}}, \varphi \right\rangle = - \left\langle S, \frac{\partial \varphi}{\partial x_{k}} \right\rangle</math>
| |
| | |
| for all test functions φ. In this way, every distribution is infinitely differentiable, and the derivative in the direction ''x<sub>k</sub>'' is a [[linear operator]] on D′(''U''). In general, if α = (α<sub>1</sub>, ..., α<sub>''n''</sub>) is an arbitrary [[multi-index]] and ∂<sup>α</sup> denotes the associated mixed partial derivative operator, the mixed partial derivative ∂<sup>α</sup>''S'' of the distribution ''S'' ∈ D′(''U'') is defined by
| |
| | |
| :<math>\left\langle \partial^{\alpha} S, \varphi \right\rangle = (-1)^{| \alpha |} \left\langle S, \partial^{\alpha} \varphi \right\rangle \mbox{ for all } \varphi \in \mathrm{D}(U).</math>
| |
| | |
| Differentiation of distributions is a ''continuous'' operator on D′(''U''); this is an important and desirable property that is not shared by most other notions of differentiation.
| |
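As a concrete check of this formula (a rough numerical sketch only, with an ad hoc bump function and a finite-difference second derivative), take ''U'' = '''R''', ''S'' = ''T''<sub>|''x''|</sub> and α = (2). Since the second distributional derivative of |''x''| is 2δ (see the worked example in the basic-idea section), the quantity ⟨∂<sup>2</sup>''S'', φ⟩ = ⟨''S'', φ″⟩ = ∫|''x''|φ″(''x'') d''x'' should be close to 2φ(0) for any test function φ.

<syntaxhighlight lang="python">
# Rough numerical check of <d^2 S/dx^2, phi> = <S, phi''> for S = T_{|x|}.
import math
from scipy.integrate import quad

def phi(x):
    """A smooth test function supported in (-1, 1)."""
    return math.exp(-1.0 / (1.0 - x * x)) if abs(x) < 1.0 else 0.0

def phi_dd(x, h=1e-4):
    """Second derivative of phi, approximated by a central difference."""
    return (phi(x + h) - 2.0 * phi(x) + phi(x - h)) / (h * h)

# Since |x|'' = 2*delta in the distributional sense, the integral of
# |x| * phi''(x) should be close to 2*phi(0).
lhs, _ = quad(lambda x: abs(x) * phi_dd(x), -1.0, 1.0, points=[0.0])
print("integral |x| phi''(x) dx ~=", lhs)
print("2 phi(0)                  =", 2.0 * phi(0.0))
</syntaxhighlight>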
| | |
| === Multiplication by a smooth function ===
| |
| | |
| If ''m'' : ''U'' → '''R''' is an infinitely differentiable function and ''S'' is a distribution on ''U'', then the product m''S'' is defined by (''mS'')(φ) = ''S''(''m''φ) for all test functions φ. This definition coincides with the transpose transformation of
| |
| | |
| :<math>T_m : \varphi\mapsto m\varphi</math>
| |
| | |
| for φ ∈ D(''U''). Then, for any test function ψ
| |
| | |
| :<math>\langle T_m\varphi,\psi\rangle = \int_U m(x)\varphi(x)\psi(x)\,dx = \langle\varphi, T_m\psi\rangle</math>
| |
| | |
| so that ''T*<sub>m</sub>'' = ''T<sub>m</sub>''. Multiplication of a distribution ''S'' by the smooth function ''m'' is therefore defined by
| |
| | |
| :<math>mS(\psi) = \langle mS, \psi\rangle = \langle S, m\psi\rangle = S(m\psi).</math>
| |
| | |
| Under multiplication by smooth functions, D′(''U'') is a [[module (mathematics)|module]] over the [[ring (mathematics)|ring]] C<sup>∞</sup>(''U''). With this definition of multiplication by a smooth function, the ordinary product rule of calculus remains valid. However, a number of unusual identities also arise. For example, the Dirac delta distribution δ is defined on '''R''' by ⟨δ, φ⟩ = φ(0), so that ''m''δ = ''m''(0)δ. Its derivative is given by ⟨δ′, φ⟩ = −⟨δ, φ′⟩ = −φ′(0). But the product mδ′ of m and δ′ is the distribution
| |
| | |
| :<math>m\delta' = m(0)\delta' - m'\delta = m(0)\delta' - m'(0)\delta.\,</math>
| |
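This identity can be checked directly from the definitions of multiplication and differentiation: for any test function φ,

: <math>\left\langle m\delta', \varphi \right\rangle = \left\langle \delta', m\varphi \right\rangle = -(m\varphi)'(0) = -m'(0)\varphi(0) - m(0)\varphi'(0) = \left\langle m(0)\delta' - m'(0)\delta, \varphi \right\rangle.</math>

In particular, taking ''m''(''x'') = ''x'' gives ''x''δ′ = −δ, even though ''x''δ = 0.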
| | |
| This definition of multiplication also makes it possible to define the operation of a linear [[differential operator]] with smooth coefficients on a distribution. A linear differential operator takes a distribution ''S'' ∈ D′(''U'') to another distribution given by a sum of the form
| |
| | |
| :<math>PS = \sum\nolimits_{|\alpha|\le k} p_\alpha \partial^\alpha S</math>
| |
| | |
| where the coefficients ''p''<sub>α</sub> are smooth functions in ''U''. If ''P'' is a given differential operator, then the minimum integer ''k'' for which such an expansion holds for every distribution ''S'' is called the '''order''' of ''P''. The transpose of ''P'' is given by
| |
| | |
| :<math>\left\langle \sum\nolimits_\alpha p_\alpha \partial^\alpha S,\varphi\right\rangle = \left\langle S,\sum\nolimits_\alpha (-1)^{|\alpha|} \partial^\alpha(p_\alpha\varphi)\right\rangle.</math>
| |
| | |
| The space D′(''U'') is a [[D-module]] with respect to the action of the ring of linear differential operators.
| |
| | |
| === Composition with a smooth function ===
| |
| | |
| Let ''S'' be a distribution on an open set ''U'' ⊂ '''R'''<sup>''n''</sup>. Let ''V'' be an open set in '''R'''<sup>''n''</sup>, and ''F'' : ''V'' → ''U''. Then provided ''F'' is a [[submersion (mathematics)|submersion]], it is possible to define
| |
| | |
| :<math>S\circ F \in \mathrm{D}'(V).</math>
| |
| | |
| This is the '''composition''' of the distribution ''S'' with ''F'', and is also called the '''[[pullback (differential geometry)|pullback]]''' of ''S'' along ''F'', sometimes written
| |
| | |
| :<math>F^\sharp : S\mapsto F^\sharp S = S\circ F.</math>
| |
| | |
| The pullback is often denoted ''F*'', but this notation risks confusion with the above use of '*' to denote the transpose of a linear mapping.
| |
| | |
| The condition that ''F'' be a submersion is equivalent to the requirement that the [[Jacobian]] derivative ''dF''(''x'') of ''F'' is a [[surjective]] linear map for every ''x'' ∈ ''V''. A necessary (but not sufficient) condition for extending ''F''<sup>#</sup> to distributions is that ''F'' be an [[open mapping]] {{harv|Hörmander|1983|loc=Theorem 6.1.1}}. The [[inverse function theorem]] ensures that a submersion satisfies this condition.
| |
| | |
| If ''F'' is a submersion, then ''F''<sup>#</sup> is defined on distributions by finding the transpose map. Uniqueness of this extension is guaranteed since ''F''<sup>#</sup> is a continuous linear operator on D(''U''). Existence, however, requires using the [[integration by substitution|change of variables]] formula, the inverse function theorem (locally) and a [[partition of unity]] argument; see {{harvtxt|Hörmander|1983|loc=Theorem 6.1.2}}.
| |
| | |
| In the special case when ''F'' is a [[diffeomorphism]] from an open subset ''V'' of '''R'''<sup>''n''</sup> onto an open subset ''U'' of '''R'''<sup>''n''</sup> change of variables under the integral gives
| |
| | |
| :<math>\int_V\varphi\circ F(x) \psi(x)\,dx = \int_U\varphi(x) \psi \left (F^{-1}(x) \right ) \left |\det dF^{-1}(x) \right |\,dx.</math>
| |
| | |
| In this particular case, then, ''F''<sup>#</sup> is defined by the transpose formula:
| |
| | |
| :<math>\left \langle F^\sharp S,\varphi \right \rangle = \left \langle S, \left |\det d(F^{-1}) \right | \varphi\circ F^{-1} \right \rangle.</math>
| |
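For example, take ''n'' = 1 and ''F''(''x'') = ''ax'' with ''a'' ≠ 0, a linear change of scale. Then ''F'' is a diffeomorphism of '''R''' with ''F''<sup>−1</sup>(''x'') = ''x''/''a'' and |det ''dF''<sup>−1</sup>| = 1/|''a''|, and the transpose formula yields the familiar scaling property of the delta distribution:

: <math>\left\langle F^\sharp \delta, \varphi \right\rangle = \left\langle \delta, \tfrac{1}{|a|}\,\varphi(\,\cdot\,/a) \right\rangle = \frac{\varphi(0)}{|a|}, \qquad \text{informally written } \delta(ax) = \frac{\delta(x)}{|a|}.</math>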
| | |
| ==Localization of distributions==
| |
| There is no way to define the value of a distribution in D′(''U'') at a particular point of ''U''. However, as is the case with functions, distributions on ''U'' restrict to give distributions on open subsets of ''U''. Furthermore, distributions are ''locally determined'' in the sense that a distribution on all of ''U'' can be assembled from a distribution on an open cover of ''U'' satisfying some compatibility conditions on the overlap. Such a structure is known as a [[sheaf (mathematics)|sheaf]].
| |
| | |
| ===Restriction===
| |
| Let ''U'' and ''V'' be open subsets of '''R'''<sup>''n''</sup> with ''V'' ⊂ ''U''. Let ''E<sub>VU</sub>'' : D(''V'') → D(''U'') be the operator which ''extends by zero'' a given smooth function compactly supported in ''V'' to a smooth function compactly supported in the larger set ''U''. Then the restriction mapping ρ<sub>''VU''</sub> is defined to be the transpose of ''E<sub>VU</sub>''. Thus for any distribution ''S'' ∈ D′(''U''), the restriction ρ<sub>''VU''</sub>''S'' is a distribution in the dual space D′(''V'') defined by
| |
| | |
| :<math>\langle \rho_{VU}S,\varphi\rangle = \langle S, E_{VU}\varphi\rangle</math>
| |
| | |
| for all test functions φ ∈ D(''V'').
| |
| | |
| Unless ''U'' = ''V'', the restriction to ''V'' is neither [[injective]] nor [[surjective]]. Lack of surjectivity follows since distributions can blow up towards the boundary of ''V''. For instance, if ''U'' = '''R''' and ''V'' = (0, 2), then the distribution
| |
| | |
| :<math>S(x) = \sum_{n=1}^\infty n\,\delta\left(x-\frac{1}{n}\right)</math>
| |
| | |
| is in D′(''V'') but admits no extension to D′(''U'').
| |
| | |
| === Support of a distribution ===
| |
| | |
| Let ''S'' ∈ D′(''U'') be a distribution on an open set ''U''. Then ''S'' is said to vanish on an open set ''V'' of ''U'' if ''S'' lies in the [[kernel (algebra)|kernel]] of the restriction map ρ<sub>''VU''</sub>. Explicitly ''S'' vanishes on ''V'' if
| |
| | |
| :<math>\langle S,\varphi\rangle = 0</math>
| |
| | |
for all test functions φ ∈ D(''U'') with support contained in ''V''. Now let ''V'' be the largest open set on which the distribution ''S'' vanishes; that is, ''V'' is the union of all open sets on which ''S'' vanishes. The '''support''' of ''S'' is the complement of ''V'' in ''U''. Thus
| |
| | |
| :<math>\operatorname{supp}\,S = U - \bigcup\left\{V \mid \rho_{VU}S = 0\right\}.</math>
| |
| | |
| The distribution ''S'' has '''compact support''' if its support is a compact set. Explicitly, ''S'' has compact support if there is a compact subset ''K'' of ''U'' such that for every test function φ whose support is completely outside of ''K'', we have ''S''(φ) = 0. Compactly supported distributions define continuous linear functionals on the space C<sup>∞</sup>(''U''); the topology on C<sup>∞</sup>(''U'') is defined such that a sequence of test functions φ<sub>''k''</sub> converges to 0 if and only if all derivatives of φ<sub>''k''</sub> converge uniformly to 0 on every compact subset of ''U''. Conversely, it can be shown that every continuous linear functional on this space defines a distribution of compact support. The embedding of C<sub>c</sub><sup>∞</sup>(''U'') into C<sup>∞</sup>(''U''), where the spaces are given their respective topologies, is continuous and has dense image. Thus compactly supported distributions can be identified with those distributions that can be extended from C<sub>c</sub><sup>∞</sup>(''U'') to C<sup>∞</sup>(''U'').
| |
| | |
| == Tempered distributions and Fourier transform {{anchor|Tempered distribution}} ==
| |
| By using a larger space of test functions, one can define the '''tempered distributions''', a subspace of D′('''R'''<sup>''n''</sup>). These distributions are useful if one studies the [[Fourier transform]] in generality: all tempered distributions have a Fourier transform, but not all distributions have one.
| |
| | |
| The space of test functions employed here, the so-called [[Schwartz space]] ''S''('''R'''<sup>''n''</sup>), is the function space of all infinitely differentiable functions that are [[rapidly decreasing]] at infinity along with all partial derivatives. Thus {{nowrap|φ : '''R'''<sup>''n''</sup> → '''R'''}} is in the Schwartz space provided that any derivative of φ, multiplied with any power of |''x''|, converges towards 0 for |''x''| → ∞. These functions form a complete [[topological vector space]] with a suitably defined family of [[seminorm]]s. More precisely, let
| |
| | |
| :<math> p_{\alpha , \beta} (\varphi) = \sup_{x \in \mathbf{R}^n} | x^\alpha D^\beta \varphi(x)| </math>
| |
| | |
| for α, β [[multi-indices]] of size ''n''. Then φ is a Schwartz function if all the values
| |
| | |
| :<math> p_{\alpha, \beta} (\varphi) < \infty.</math>
| |
| | |
The family of seminorms ''p''<sub>α, β</sub> defines a [[locally convex]] topology on the Schwartz space. The seminorms are, in fact, [[norm (mathematics)|norms]] on the Schwartz space, since Schwartz functions are smooth. The Schwartz space is [[metrizable]] and [[complete space|complete]]. Because the Fourier transform converts differentiation into multiplication by the coordinate functions, and vice versa (up to constant factors), this symmetry implies that the Fourier transform of a Schwartz function is again a Schwartz function.
| |
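A standard example of a Schwartz function is the Gaussian <math>\varphi(x) = e^{-\pi|x|^2}</math>: every derivative of φ is a polynomial multiple of the same Gaussian, so all the seminorms ''p''<sub>α, β</sub>(φ) are finite. Moreover, with one common normalization of the Fourier transform (the convention is not fixed by the discussion here),

: <math>(F\varphi)(\xi) = \int_{\mathbf{R}^n} \varphi(x)\, e^{-2\pi i x\cdot\xi}\,dx, \qquad F\left(e^{-\pi|x|^2}\right) = e^{-\pi|\xi|^2},</math>

so this Gaussian is its own Fourier transform, a concrete instance of the symmetry described above.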
| | |
| The space of '''tempered distributions''' is defined as the (continuous) [[dual space|dual]] of the Schwartz space. In other words, a distribution ''F'' is a tempered distribution if and only if
| |
| | |
: <math> \lim_{m\to\infty} F(\varphi_m)=0 </math>

whenever
| |
| | |
| : <math> \lim_{m\to\infty} p_{\alpha , \beta} (\varphi_m) = 0 </math>
| |
| | |
| holds for all [[multi-indices]] α, β.
| |
| | |
| The derivative of a tempered distribution is again a tempered distribution. Tempered distributions generalize the bounded (or slow-growing) locally integrable functions; all distributions with compact support and all [[square-integrable]] functions are tempered distributions. More generally, all functions that are products of polynomials with elements of [[Lp space|''L<sup>p</sup>''('''R'''<sup>''n''</sup>)]] for ''p'' ≥ 1 are tempered distributions.
| |
| | |
| The ''tempered distributions'' can also be characterized as ''slowly growing''. This characterization is ''dual'' to the ''rapidly falling'' behaviour, e.g. <math>\propto |x|^n \cdot \exp (- x^2)</math>, of the test functions.
| |
| | |
| To study the Fourier transform, it is best to consider ''complex''-valued test functions and complex-linear distributions. The ordinary [[continuous Fourier transform]] ''F'' yields then an [[automorphism]] of Schwartz function space, and we can define the '''Fourier transform''' of the tempered distribution ''S'' by (''FS'')(ψ) = ''S''(''F''ψ) for every test function ψ. ''FS'' is thus again a tempered distribution. The Fourier transform is a continuous, linear, bijective operator from the space of tempered distributions to itself. This operation is compatible with differentiation in the sense that
| |
| | |
| :<math>F\dfrac{dS}{dx}=ixFS</math>
| |
| | |
and also with convolution: if ''S'' is a tempered distribution and ψ is a ''slowly increasing'' infinitely differentiable function on '''R'''<sup>''n''</sup> (meaning that all derivatives of ψ grow at most as fast as [[polynomial]]s), then ''S''ψ is again a tempered distribution and
| |
| | |
| :<math>F(S\psi)=FS*F\psi\,</math>
| |
| | |
| is the convolution of ''FS'' and ''F''ψ. In particular, the Fourier transform of the unity function is the δ distribution.
| |
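As a worked example, with a normalization of the Fourier transform for which (''F''ψ)(0) = ∫ψ(''x'') d''x'' (such as the one used in the Gaussian example above), the Fourier transform of the Dirac delta follows directly from the transpose definition: for every test function ψ,

: <math>\langle F\delta, \psi\rangle = \langle \delta, F\psi\rangle = (F\psi)(0) = \int_{\mathbf{R}^n} \psi(x)\,dx = \langle 1, \psi\rangle,</math>

so ''F''δ is the constant function 1, consistent with the statement that the Fourier transform of the unity function is δ.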
| | |
| ==Convolution==
| |
| | |
| Under some circumstances, it is possible to define the [[convolution]] of a function with a distribution, or even the convolution of two distributions.
| |
| | |
| ;Convolution of a test function with a distribution
| |
| | |
| If ''f'' ∈ D('''R'''<sup>''n''</sup>) is a compactly supported smooth test function, then convolution with ''f'',
| |
| | |
| :<math>\begin{cases}
| |
| C_f : \mathrm{D}(\mathbf{R}^n)\to \mathrm{D}(\mathbf{R}^n) \\
| |
g \mapsto f * g
| |
| \end{cases}</math>
| |
| | |
| defines a linear operator which is [[continuous function|continuous]] with respect to the [[LF space]] topology on D('''R'''<sup>''n''</sup>).
| |
| | |
| Convolution of ''f'' with a distribution ''S'' ∈ D′('''R'''<sup>''n''</sup>) can be defined by taking the transpose of ''C<sub>f</sub>'' relative to the duality pairing of D('''R'''<sup>''n''</sup>) with the space D′('''R'''<sup>''n''</sup>) of distributions {{harv|Trèves|1967|loc=Chapter 27}}. If ''f'', ''g'', φ ∈ D('''R'''<sup>''n''</sup>), then by [[Fubini's theorem]]
| |
| | |
| :<math>\left \langle C_fg, \varphi \right \rangle = \int_{\mathbf{R}^n}\varphi(x)\int_{\mathbf{R}^n}f(x-y)g(y)\,dydx = \left \langle g, C_{\widetilde{f}}\varphi \right \rangle</math>
| |
| | |
| where <math>\scriptstyle{\widetilde{f}(x) = f(-x)}</math>. Extending by continuity, the convolution of ''f'' with a distribution ''S'' is defined by
| |
| | |
| :<math>\langle f*S, \varphi\rangle = \left \langle S, \widetilde{f}*\varphi \right \rangle</math>
| |
| | |
| for all test functions φ ∈ D('''R'''<sup>''n''</sup>).
| |
| | |
| An alternative way to define the convolution of a function ''f'' and a distribution ''S'' is to use the translation operator τ<sub>''x''</sub> defined on test functions by
| |
| | |
| :<math>\tau_x \varphi(y) = \varphi(y-x)</math>
| |
| | |
| and extended by the transpose to distributions in the obvious way {{harv|Rudin|1991|loc=§6.29}}. The convolution of the compactly supported function ''f'' and the distribution ''S'' is then the function defined for each ''x'' ∈ '''R'''<sup>''n''</sup> by
| |
| | |
| :<math>(f*S)(x) = \left \langle S, \tau_x\widetilde{f} \right \rangle.</math>
| |
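For example, convolving with the Dirac delta recovers ''f'': since <math>\tau_x\widetilde{f}(y) = f(x-y)</math>, one has

: <math>(f*\delta)(x) = \left\langle \delta, \tau_x\widetilde{f} \right\rangle = \left(\tau_x\widetilde{f}\right)(0) = f(x),</math>

so δ acts as an identity element for convolution.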
| | |
| It can be shown that the convolution of a compactly supported function and a distribution is a smooth function. If the distribution ''S'' has compact support as well, then ''f''∗''S'' is a compactly supported function, and the [[Titchmarsh convolution theorem]] {{harv|Hörmander|1983|loc=Theorem 4.3.3}} implies that
| |
| | |
:<math>\operatorname{ch}\operatorname{supp}(f*S) = \operatorname{ch}\operatorname{supp}(f) + \operatorname{ch}\operatorname{supp}(S)</math>

where ch supp denotes the [[convex hull]] of the [[support (mathematics)|support]] and the sum on the right is the [[Minkowski addition|Minkowski sum]] of sets.
| |
| | |
| ;Distribution of compact support
| |
| | |
| It is also possible to define the convolution of two distributions ''S'' and ''T'' on '''R'''<sup>''n''</sup>, provided one of them has compact support. Informally, in order to define ''S''∗''T'' where ''T'' has compact support, the idea is to extend the definition of the convolution ∗ to a linear operation on distributions so that the associativity formula
| |
| | |
| :<math>S*(T*\varphi) = (S*T)*\varphi</math>
| |
| | |
| continues to hold for all test-functions φ. {{harvtxt|Hörmander|1983|loc=§IV.2}} proves the uniqueness of such an extension.
| |
| | |
| It is also possible to provide a more explicit characterization of the convolution of distributions {{harv|Trèves|1967|loc=Chapter 27}}. Suppose that it is ''T'' that has compact support. For any test function φ in D('''R'''<sup>''n''</sup>), consider the function
| |
| | |
| :<math>\psi(x) = \langle T, \tau_{-x}\varphi\rangle.</math>
| |
| | |
| It can be readily shown that this defines a smooth function of ''x'', which moreover has compact support. The convolution of ''S'' and ''T'' is defined by
| |
| | |
| :<math>\langle S * T,\varphi\rangle = \langle S, \psi\rangle.</math>
| |
| | |
| This generalizes the classical notion of [[convolution]] of functions and is compatible with differentiation in the following sense:
| |
| | |
| :<math>\partial^\alpha(S*T)=(\partial^\alpha S)*T=S*(\partial^\alpha T).</math>
| |
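For instance, taking ''T'' = δ (which has compact support) gives ψ(''x'') = ⟨δ, τ<sub>−''x''</sub>φ⟩ = φ(''x''), so that ⟨''S''∗δ, φ⟩ = ⟨''S'', φ⟩ for every test function φ; in other words, δ is the identity element for convolution of distributions. Combined with the differentiation rule just stated, this gives

: <math>\partial^\alpha S = \partial^\alpha(S*\delta) = S*(\partial^\alpha\delta),</math>

so differentiating a distribution amounts to convolving it with the corresponding derivative of δ.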
| | |
| This definition of convolution remains valid under less restrictive assumptions about ''S'' and ''T''; see for instance {{harvtxt|Gel'fand|Shilov|1966–1968|loc=v. 1, pp. 103–104}} and {{harvtxt|Benedetto|1997|loc=Definition 2.5.8}}.
| |
| | |
| ==Distributions as derivatives of continuous functions==
| |
| | |
| The formal definition of distributions exhibits them as a subspace of a very large space, namely the topological dual of D(''U'') (or S('''R'''<sup>''d''</sup>) for tempered distributions). It is not immediately clear from the definition how exotic a distribution might be. To answer this question, it is instructive to see distributions built up from a smaller space, namely the space of continuous functions. Roughly, any distribution is locally a (multiple) derivative of a continuous function. A precise version of this result, given below, holds for distributions of compact support, tempered distributions, and general distributions. Generally speaking, no proper subset of the space of distributions contains all continuous functions and is closed under differentiation. This says that distributions are not particularly exotic objects; they are only as complicated as necessary.
| |
| | |
| | |
| ===Tempered distributions===
| |
| | |
| If ''f'' ∈ ''S''′('''R'''<sup>''n''</sup>) is a tempered distribution, then there exists a constant ''C'' > 0, and positive integers ''M'' and ''N'' such that for all Schwartz functions φ ∈ ''S''('''R'''<sup>''n''</sup>)
| |
| | |
:<math>|\langle f, \varphi\rangle| \le C\sum\nolimits_{|\alpha|\le N, |\beta|\le M}\sup_{x\in\mathbf{R}^n} \left |x^\alpha D^\beta \varphi(x) \right |=C\sum\nolimits_{|\alpha|\le N, |\beta|\le M}p_{\alpha,\beta}(\varphi).</math>
| |
| | |
| This estimate along with some techniques from functional analysis can be used to show that there is a continuous slowly increasing function ''F'' and a multi-index α such that
| |
| | |
| :<math>f=D^\alpha F.\,</math>
| |
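A simple instance on '''R''': the Dirac delta δ is a tempered distribution, and it arises as the second distributional derivative of the continuous, slowly increasing ramp function ''F''(''x'') = max(''x'', 0):

: <math>\delta = \frac{d^2}{dx^2}\max(x,0),</math>

since the first derivative of the ramp is the Heaviside function ''H'', and ''H''′ = δ as computed above.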
| | |
| | |
| ===Restriction of distributions to compact sets===
| |
| | |
If ''f'' ∈ D′('''R'''<sup>''n''</sup>), then for any compact set ''K'' ⊂ '''R'''<sup>''n''</sup>, there exists a continuous function ''F'' compactly supported in '''R'''<sup>''n''</sup> (possibly on a larger set than ''K'' itself) and a multi-index α such that ''f'' = ''D''<sup>α</sup>''F'' on C<sub>c</sub><sup>∞</sup>(''K''). This follows from the previously quoted result on tempered distributions by means of a localization argument.
| |
| | |
| ===Distributions with point support===
| |
| | |
| If ''f'' has support at a single point {''P''}, then ''f'' is in fact a finite linear combination of distributional derivatives of the δ function at ''P''. That is, there exists an integer ''m'' and complex constants ''a''<sub>α</sub> for [[multi-index|multi-indices]] |α| ≤ ''m'' such that
| |
| | |
| :<math> f = \sum\nolimits_{|\alpha|\le m}a_\alpha D^\alpha(\tau_P\delta)</math>
| |
| | |
| where τ<sub>''P''</sub> is the translation operator.
| |
| | |
| ===General distributions===
| |
| | |
A version of the above theorem holds locally in the following sense {{harv|Rudin|1991}}. Let ''S'' be a distribution on ''U''. Then, for every multi-index α, one can find a continuous function ''g''<sub>α</sub> such that
| |
| | |
| : <math>\displaystyle S = \sum\nolimits_{\alpha} D^{\alpha} g_{\alpha}</math>
| |
| | |
| and that any compact subset ''K'' of ''U'' intersects the supports of only finitely many ''g''<sub>α</sub>; therefore, to evaluate the value of ''S'' for a given smooth function ''f'' compactly supported in ''U'', we only need finitely many ''g''<sub>α</sub>; hence the infinite sum above is well-defined as a distribution. If the distribution ''S'' is of finite order, then one can choose ''g''<sub>α</sub> in such a way that only finitely many of them are nonzero.
| |
| | |
| == Using holomorphic functions as test functions ==
| |
| | |
| The success of the theory led to investigation of the idea of [[hyperfunction]], in which spaces of [[holomorphic function]]s are used as test functions. A refined theory has been developed, in particular [[Mikio Sato]]'s [[algebraic analysis]], using [[sheaf theory]] and [[several complex variables]]. This extends the range of symbolic methods that can be made into rigorous mathematics, for example [[Path integral formulation|Feynman integrals]].
| |
| | |
| == Problem of multiplication ==
| |
| | |
| A possible limitation of the theory of distributions (and hyperfunctions) is that it is a purely linear theory, in the sense that the product of two distributions cannot consistently be defined (in general), as has been proved by [[Laurent Schwartz]] in the 1950s. For example, if p.v. 1/''x'' is the distribution obtained by the [[Cauchy principal value]]
| |
| | |
| :<math>\left(\operatorname{p.v.}\frac{1}{x}\right)[\phi] = \lim_{\epsilon\to 0^+} \int_{|x|\ge\epsilon} \frac{\phi(x)}{x}\, dx</math>
| |
| | |
for all φ ∈ ''S''('''R'''), and δ is the Dirac delta distribution, then
| |
| | |
| : <math>\left(\delta \times x \right) \times \operatorname{p.v.} \frac{1}{x} = 0</math>
| |
| | |
| but
| |
| | |
| : <math>\delta \times \left( x \times \operatorname{p.v.} \frac{1}{x} \right) = \delta</math>
| |
| | |
| so the product of a distribution by a smooth function (which is always well defined) cannot be extended to an [[associativity|associative]] product on the space of distributions.
| |
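The two bracketed factors are computed directly from the definition of multiplication by the smooth function ''x'': for every test function φ,

: <math>\langle x\,\delta, \varphi\rangle = \langle \delta, x\varphi\rangle = 0\cdot\varphi(0) = 0, \qquad \left\langle x\,\operatorname{p.v.}\frac{1}{x}, \varphi\right\rangle = \lim_{\epsilon\to 0^+} \int_{|x|\ge\epsilon} \varphi(x)\, dx = \int_{\mathbf{R}} \varphi(x)\,dx = \langle 1, \varphi\rangle,</math>

so ''x''δ = 0 and ''x'' · p.v. 1/''x'' = 1, which is why the two ways of bracketing the triple product give different results.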
| | |
Thus, nonlinear problems cannot in general be posed, and hence cannot be solved, within distribution theory alone. In the context of [[quantum field theory]], however, solutions can be found. In more than two spacetime dimensions the problem is related to the [[Regularization (physics)|regularization]] of [[Ultraviolet divergence|divergences]]. Here [[Henri Epstein]] and [[Vladimir Glaser]] developed the mathematically rigorous (but extremely technical) ''[[causal perturbation theory]]''. This does not solve the problem in other situations, however; many other interesting theories are nonlinear, for example the [[Navier–Stokes equations]] of [[fluid dynamics]].
| |
| | |
In view of this, several not entirely satisfactory theories of '''[[algebra (ring theory)|algebra]]s''' of [[generalized function]]s have been developed, among which [[Colombeau algebra|Colombeau's (simplified) algebra]] is perhaps the most widely used today.
| |
| | |
| A simple solution of the multiplication problem is dictated by the [[path integral formulation]] of [[quantum mechanics]]. Since this is required to be equivalent to the [[Schrödinger]] theory of [[quantum mechanics]] which is invariant under coordinate transformations, this property must be shared by path integrals. This fixes all products of distributions as shown by {{harvtxt|Kleinert|Chervyakov|2001}}. The result is equivalent to what can be derived from [[dimensional regularization]] {{harv|Kleinert|Chervyakov|2000}}.
| |
| | |
| ==See also==
| |
| | |
| *[[Current (mathematics)]]
| |
| *[[Distribution (number theory)]]
| |
| *[[Colombeau algebra]]
| |
| *[[Dual vector]]
| |
| *[[Gelfand triple]]
| |
| *[[Generalized function]]
| |
| *[[Homogeneous distribution]]
| |
| *[[Hyperfunction]]
| |
| *[[Laplacian of the indicator]]
| |
| *[[Malgrange–Ehrenpreis theorem]]
| |
| *[[Pseudodifferential operator]]
| |
| *[[Riesz representation theorem]]
| |
| *[[Vague topology]]
| |
| *[[Weak solution]]
| |
| | |
| ==References==
| |
| *{{citation|first=J.J.|last=Benedetto|title=Harmonic Analysis and Applications|publisher=CRC Press|year=1997}}.
| |
| *{{citation|first1=I.M.|last1=Gel'fand|first2=G.E.|last2=Shilov|title=Generalized functions|volume=1–5|publisher=Academic Press|year=1966–1968}}.
| |
| *{{citation|mr=0717035|first=L.|last= Hörmander|authorlink=Lars Hörmander|title=The analysis of linear partial differential operators I|series= Grundl. Math. Wissenschaft. |volume= 256 |publisher= Springer |year=1983|isbn=3-540-12104-8 }}.
| |
| * {{citation
| |
| | title = Rules for integrals over products of distributions from coordinate independence of path integrals
| |
| | first1 = H.|last1=Kleinert
| |
| | first2 = A.|last2=Chervyakov| journal = Europ. Phys. J.
| |
| | volume = C 19
| |
| | issue = 4
| |
| | pages = 743–747
| |
| | year = 2001
| |
| | doi = 10.1007/s100520100600
| |
| | url = http://www.physik.fu-berlin.de/~kleinert/kleiner_re303/wardepl.pdf|authorlink1=Hagen Kleinert| bibcode=2001EPJC...19..743K|arxiv = quant-ph/0002067 }}.
| |
| * {{citation
| |
| | title = Coordinate Independence of Quantum-Mechanical Path Integrals
| |
| | first1 = H.|last1=Kleinert
| |
| | first2 = A.|last2=Chervyakov
| |
| | journal = Phys. Lett.
| |
| | volume = A 269
| |
| | issue =
| |
| | pages = 63
| |
| | year = 2000
| |
| | doi = 10.1016/S0375-9601(00)00475-8
| |
| | url = http://www.physik.fu-berlin.de/~kleinert/305/klch2.pdf|authorlink1=Hagen Kleinert|bibcode = 2000PhLA..273....1K }}.
| |
| *{{citation|first1=A. N.|last1=Kolmogorov|author1-link=Andrey Kolmogorov|first2=S. V.|last2=Fomin|author2-link=Sergei Fomin|year=1999|title=Elements of the Theory of Functions and Functional Analysis|publisher=Dover Books}}.
| |
| * {{citation|first=W.|last=Rudin|authorlink=Walter Rudin|title=Functional Analysis|edition=2nd|publisher=McGraw–Hill|year=1991|isbn=0-07-054236-8}}.
| |
| * {{citation|first=L.|last=Schwartz|year=1954|authorlink=Laurent Schwartz|title=Sur l'impossibilité de la multiplications des distributions|journal=C.R.Acad. Sci. Paris|volume=239|pages=847–848}}.
| |
| * {{citation|first=L.|last=Schwartz|authorlink=Laurent Schwartz|title=Théorie des distributions|volume=1–2|publisher=Hermann|year=1950–1951}}.
| |
| * {{citation|first1=Elias|last1=Stein|authorlink1=Elias Stein|first2=Guido|last2=Weiss|title=Introduction to Fourier Analysis on Euclidean Spaces|publisher=Princeton University Press|year=1971|isbn=0-691-08078-X}}.
| |
| * {{citation|first=R.|last=Strichartz|year=1994|title=A Guide to Distribution Theory and Fourier Transforms|publisher=CRC Press|isbn=0-8493-8273-4}}.
| |
| *{{citation|first=François|last=Trèves|title=Topological Vector Spaces, Distributions and Kernels|publisher=Academic Press|year=1967|pages=126 ff}}.
| |
| | |
| ==Further reading==
| |
| * M. J. Lighthill (1959). ''Introduction to Fourier Analysis and Generalised Functions''. Cambridge University Press. ISBN 0-521-09128-4 (requires very little knowledge of analysis; defines distributions as limits of sequences of functions under integrals)
| |
| * [[Hagen Kleinert|H. Kleinert]], ''Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets'', 4th edition, [http://www.worldscibooks.com/physics/6223.html World Scientific (Singapore, 2006)](also available online [http://www.physik.fu-berlin.de/~kleinert/b5 here]). See Chapter 11 for defining products of distributions from the physical requirement of coordinate invariance.
| |
| * [[Vasily Vladimirov|V.S. Vladimirov]] (2002). ''Methods of the theory of generalized functions''. Taylor & Francis. ISBN 0-415-27356-0
| |
| * {{springer|id=G/g043810|title=Generalized function|first=V.S.|last=Vladimirov| author-link= Vasilii Sergeevich Vladimirov|year=2001}}.
| |
| * {{springer|id=G/g043840|title=Generalized functions, space of|first=V.S.|last=Vladimirov| author-link= Vasilii Sergeevich Vladimirov|year=2001}}.
| |
| * {{springer|id=G/g043820|title=Generalized function, derivative of a|first=V.S.|last=Vladimirov| author-link= Vasilii Sergeevich Vladimirov|year=2001}}.
| |
| * {{springer|id=G/g043830|title=Generalized functions, product of|first=V.S.|last=Vladimirov| author-link= Vasilii Sergeevich Vladimirov|year=2001}}.
| |
| * {{springer|id=G/g130030|title=Generalized function algebras|first=Michael|last=Oberguggenberger|year=2001}}.
| |
| | |
| [[Category:Generalized functions]]
| |
| [[Category:Functional analysis]]
| |
| [[Category:Smooth functions]]
| |