{{distinguish|discrete Chebyshev polynomials}}
{{Merge from|Dickson polynomial|discuss=Talk:Chebyshev polynomials#Dickson polynomial|date=September 2011}}
In [[mathematics]], the '''Chebyshev polynomials''', named after [[Pafnuty Chebyshev]],<ref>Chebyshev polynomials were first presented in: P. L. Chebyshev (1854) "Théorie des mécanismes connus sous le nom de parallélogrammes," ''Mémoires des Savants étrangers présentés à l’Académie de Saint-Pétersbourg'', vol. 7, pages 539–586.</ref> are a [[polynomial sequence|sequence]] of [[orthogonal polynomials]] which are related to [[de Moivre's formula]] and which can be defined [[Recursion|recursive]]ly. One usually distinguishes between '''Chebyshev polynomials of the first kind''' which are denoted {{math|''T''<sub>''n''</sub>}} and '''Chebyshev polynomials of the second kind''' which are denoted {{math|''U''<sub>''n''</sub>}}. The letter T is used because of the alternative [[transliteration]]s of the name ''Chebyshev'' as ''Tchebycheff'', ''Tchebyshev'' (French) or ''Tschebyschow'' (German).
 
The Chebyshev polynomials {{math|''T''<sub>''n''</sub>}} or {{math|''U''<sub>''n''</sub>}} are polynomials of degree {{mvar|n}} and the [[sequence]] of Chebyshev polynomials of either kind composes a [[polynomial sequence]].
 
Chebyshev polynomials are the polynomials with the largest possible leading coefficient subject to the condition that their absolute value on the interval [&minus;1, 1] is bounded by 1. They are also the extremal polynomials for many other properties.<ref>Rivlin, Theodore J. ''The Chebyshev Polynomials''. Pure and Applied Mathematics. Wiley-Interscience [John Wiley & Sons], New York–London–Sydney, 1974. Chapter 2, "Extremal Properties", pp. 56–123.</ref>
 
Chebyshev polynomials are important in [[approximation theory]] because the roots of the Chebyshev polynomials of the first kind, which are also called [[Chebyshev nodes]], are used as nodes in [[polynomial interpolation]]. The resulting interpolation polynomial minimizes the problem of [[Runge's phenomenon]] and provides an approximation that is close to the polynomial of best approximation to a [[continuous function]] under the [[maximum norm]]. This approximation leads directly to the method of [[Clenshaw–Curtis quadrature]].
 
In the study of [[differential equation]]s they arise as the solution to the [[Chebyshev equation|Chebyshev differential equations]]
 
:<math>(1-x^2)\,y'' - x\,y' + n^2\,y = 0 \,\!</math>
and
:<math>(1-x^2)\,y'' - 3x\,y' + n(n+2)\,y = 0 \,\!</math>
 
for the polynomials of the first and second kind, respectively. These equations are special cases of the [[Sturm&ndash;Liouville problem|Sturm&ndash;Liouville differential equation]].
 
==Definition==
The '''Chebyshev polynomials of the first kind''' are defined by the [[recurrence relation]]
 
:<math>
\begin{align}
T_0(x) & = 1 \\
T_1(x) & = x \\
T_{n+1}(x) & = 2xT_n(x) - T_{n-1}(x).
\end{align}
</math>
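For illustration, the recurrence can be carried out directly on coefficient lists. The following minimal Python sketch (not part of the standard presentation; the function name and the lowest-degree-first coefficient convention are chosen here for convenience) builds ''T''<sub>''n''</sub> in the monomial basis:

<syntaxhighlight lang="python">
# Illustrative sketch of the first-kind recurrence
# T_0 = 1, T_1 = x, T_{n+1} = 2x T_n - T_{n-1},
# acting on coefficient lists (index = power of x, lowest degree first).

def chebyshev_T(n):
    """Return the monomial coefficients of T_n(x), lowest degree first."""
    if n == 0:
        return [1]
    if n == 1:
        return [0, 1]
    t_prev, t_curr = [1], [0, 1]
    for _ in range(1, n):
        # 2x * T_k: shift the coefficients up one degree and double them
        t_next = [0] + [2 * c for c in t_curr]
        # subtract T_{k-1}
        for i, c in enumerate(t_prev):
            t_next[i] -= c
        t_prev, t_curr = t_curr, t_next
    return t_curr

print(chebyshev_T(4))  # [1, 0, -8, 0, 8], i.e. T_4(x) = 8x^4 - 8x^2 + 1
</syntaxhighlight>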
 
The ordinary [[generating function]] for ''T''<sub>''n''</sub> is
 
:<math>\sum_{n=0}^{\infty}T_n(x) t^n = \frac{1-tx}{1-2tx+t^2}; \,\!</math>
 
the exponential [[generating function]] is
 
:<math>\sum_{n=0}^{\infty}T_n(x) \frac{t^n}{n!} = \tfrac{1}{2}\left( e^{(x-\sqrt{x^2 -1})t}+e^{(x+\sqrt{x^2 -1})t}\right)
= e^{tx} \cosh(t \sqrt{x^2-1}). \,\!</math>
 
The generating function relevant for 2-dimensional [[potential theory]] and [[Cylindrical multipole moments|multipole expansion]] is
 
:<math>\sum\limits_{n=1}^{\infty }T_{n}\left( x\right) \frac{t^{n}}{n}=\ln \frac{1}{\sqrt{1-2tx+t^{2}}}.</math>
 
The '''Chebyshev polynomials of the second kind''' are defined by the [[recurrence relation]]
 
:<math>
\begin{align}
U_0(x) & = 1 \\
U_1(x) & = 2x \\
U_{n+1}(x) & = 2xU_n(x) - U_{n-1}(x).
\end{align}
</math>
 
The ordinary [[generating function]] for ''U''<sub>''n''</sub> is
 
:<math>\sum_{n=0}^{\infty}U_n(x) t^n = \frac{1}{1-2 t x+t^2}; \,\!</math>
the exponential [[generating function]] is
 
:<math>\sum_{n=0}^{\infty}U_n(x) \frac{t^n}{n!} = e^{tx}
\left( \cosh(t \sqrt{x^2-1}) + \frac{x}{\sqrt{x^2-1}} \sinh(t \sqrt{x^2-1}) \right). \,\!</math>
 
===Trigonometric definition===
The Chebyshev polynomials of the first kind can be defined as the unique polynomials satisfying
:<math>T_n(x)=\cos(n \arccos x)=\cosh(n\,\mathrm{arccosh}\,x) \,\!</math>
or, in other words, as the unique polynomials satisfying
:<math>T_n(\cos(\vartheta))=\cos(n\vartheta) \,\!</math>
for ''n'' = 0, 1, 2, 3, ..., which is a variant (equivalent transpose) of [[Schröder's equation]]: ''T<sub>n</sub>''(''x'') is functionally conjugate to ''nx'', as codified in the nesting property below. Compare also the [[spread polynomials]], discussed in a section below.
 
The polynomials of the second kind satisfy:
:<math> U_n(\cos(\vartheta)) = \frac{\sin((n+1)\vartheta)}{\sin\vartheta} \, ,</math>
which is structurally quite similar to the [[Dirichlet kernel]] <math>D_n(x) \,\!</math>:
:<math> D_n(x) = \frac{\sin((2n+1)(x/2))}{\sin (x/2)} = U_{2n}(\cos (x/2))\, .</math>
 
That cos(''nx'') is an ''n''th-degree polynomial in cos(''x'') can be seen by observing that cos(''nx'') is the real part of one side of [[de Moivre's formula]], and the real part of the other side is a polynomial in cos(''x'') and sin(''x''), in which all powers of sin(''x'') are even and thus replaceable through the identity cos<sup>2</sup>(''x'') + sin<sup>2</sup>(''x'') = 1.
 
This identity is quite useful in conjunction with the recursive generating formula, inasmuch as it enables one to calculate the cosine of any integral multiple of an angle solely in terms of the cosine of the base angle.
 
Evaluating the first two Chebyshev polynomials,
:<math>T_0(\cos\vartheta)=\cos(0\,\vartheta) =1</math>
and
:<math>T_1(\cos\vartheta)=\cos\vartheta \, ,</math>
one can straightforwardly determine that
:<math>
\cos(2 \vartheta)=2\cos\vartheta \cos\vartheta - \cos(0 \vartheta) = 2\cos^{2}\,\vartheta - 1 \,\!
</math>
:<math>
\cos(3 \vartheta)=2\cos\vartheta \cos(2\vartheta) - \cos\vartheta = 4\cos^3\,\vartheta - 3\cos\vartheta \,  ,</math>
and so forth.
 
Two immediate corollaries are the ''composition identity'' (or '''nesting property''' specifying a [[semigroup]])
::<math>T_n(T_m(x)) = T_{nm}(x)\, ;</math>
 
and the expression of complex exponentiation in terms of Chebyshev polynomials: given&nbsp;''z''&nbsp;=&nbsp;''a''&nbsp;+&nbsp;''bi'',
:<math>
\begin{align}
z^n & = |z|^n \left(\cos \left(n\arccos \frac a{|z|}\right) + i \sin \left(n\arccos \frac a{|z|}\right)\right) \\
& = |z|^n T_n\left(\frac a{|z|}\right) + ib\ |z|^{n - 1}\ U_{n-1}\left(\frac a{|z|}\right).
\end{align}
</math>
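Both corollaries are easy to spot-check symbolically; for instance, the nesting property can be verified with the following short sketch (illustrative only, assuming SymPy is available):

<syntaxhighlight lang="python">
# Symbolic spot-check of the nesting property T_n(T_m(x)) = T_{nm}(x).
import sympy as sp

x = sp.symbols('x')
for n in range(1, 5):
    for m in range(1, 5):
        lhs = sp.chebyshevt(n, sp.chebyshevt(m, x))
        assert sp.expand(lhs - sp.chebyshevt(n * m, x)) == 0
print("nesting property verified for n, m = 1..4")
</syntaxhighlight>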
 
===Pell equation definition===
The Chebyshev polynomials can also be defined as the solutions to the [[Pell equation]]
 
:<math>T_n(x)^2 - (x^2-1) U_{n-1}(x)^2 = 1 \,\!</math>
 
in a ring R[''x''].<ref>Jeroen Demeyer [http://cage.ugent.be/~jdemeyer/phd.pdf Diophantine Sets over Polynomial Rings and Hilbert's Tenth Problem for Function Fields], Ph.D. theses (2007), p.70.</ref> Thus, they can be generated by the standard technique for Pell equations of taking powers of a fundamental solution:
 
:<math>T_n(x) + U_{n-1}(x) \sqrt{x^2-1} = (x + \sqrt{x^2-1})^n. \,\!</math>
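The Pell-type identity itself can likewise be spot-checked for small ''n'' (an illustrative sketch, assuming SymPy is available):

<syntaxhighlight lang="python">
# Spot-check of T_n(x)^2 - (x^2 - 1) U_{n-1}(x)^2 = 1 for small n.
import sympy as sp

x = sp.symbols('x')
for n in range(1, 8):
    expr = sp.chebyshevt(n, x)**2 - (x**2 - 1) * sp.chebyshevu(n - 1, x)**2
    assert sp.expand(expr) == 1
print("Pell identity verified for n = 1..7")
</syntaxhighlight>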
 
==Relation between Chebyshev polynomials of the first and second kinds==
The Chebyshev polynomials of the first and second kind are closely related by the following equations
 
:<math>\tfrac{d}{dx} \, T_n(x) = n U_{n-1}(x) \mbox{ , } n=1,\ldots</math>
 
:<math>T_n(x) = \tfrac{1}{2} (U_n(x) - \, U_{n-2}(x)). </math>
 
:<math>T_{n+1}(x) = xT_n(x) - (1 - x^2)U_{n-1}(x)\,</math>
 
:<math>T_n(x) = U_n(x) - x \, U_{n-1}(x), </math>
 
:<math> U_n(x) =2\sum_{j\,\, \text{odd}}^n T_j(x)  </math>, where ''n'' is odd.

:<math> U_n(x) =2\sum_{j\, \text{even}}^n T_j(x)-1  </math>, where ''n'' is even.
 
A recurrence relation for the derivatives of the Chebyshev polynomials can be derived from these relations:

:<math>2 T_n(x) = \frac{1}{n+1}\; \frac{d}{dx} T_{n+1}(x) - \frac{1}{n-1}\; \frac{d}{dx} T_{n-1}(x) \mbox{ , }\quad n=2,3,\ldots</math>
 
This relationship is used in the [[Chebyshev spectral method]] of solving differential equations.
 
Equivalently, the two sequences can also be defined from a pair of '''mutual recurrence''' equations:
 
:<math>T_0(x) = 1\,\!</math>
 
:<math>U_{-1}(x) = 0\,\!</math>
 
:<math>T_{n+1}(x) = xT_n(x) - (1 - x^2)U_{n-1}(x)\,</math>
 
:<math>U_n(x) = xU_{n-1}(x) + T_n(x)\,</math>
 
These can be derived from the trigonometric formulae; for example, if <math>x = \cos\vartheta</math>, then
 
:<math>\begin{align}
T_{n+1}(x) &= T_{n+1}(\cos(\vartheta)) \\
&= \cos((n + 1)\vartheta) \\
&= \cos(n\vartheta)\cos(\vartheta) - \sin(n\vartheta)\sin(\vartheta) \\
&= T_n(\cos(\vartheta))\cos(\vartheta) - U_{n-1}(\cos(\vartheta))\sin^2(\vartheta) \\
&= xT_n(x) - (1 - x^2)U_{n-1}(x). \\
\end{align}</math>
 
Note that both these equations and the trigonometric equations take a simpler form if we, like some works, follow the alternate convention of denoting our ''U''<sub>''n''</sub> (the polynomial of degree ''n'') with ''U''<sub>''n''+1</sub> instead.
 
[[Turán's inequalities]] for the Chebyshev polynomials are
:<math>T_n(x)^2-T_{n-1}(x) T_{n+1}(x)= 1-x^2>0 \text{ for } -1<x<1\!</math> and
:<math>U_n(x)^2-U_{n-1}(x) U_{n+1}(x)= 1>0.\!</math>
 
==Explicit expressions==
Different approaches to defining Chebyshev polynomials lead to different explicit expressions such as:
 
:<math>T_n(x) =
\begin{cases}
\cos(n\arccos(x)), & \ |x| \le 1 \\
\cosh(n \, \mathrm{arccosh}(x)), & \ x \ge 1 \\
(-1)^n \cosh(n \, \mathrm{arccosh}(-x)), & \ x \le -1 \\
\end{cases} \,\!
</math>
 
<!-- extra blank line for legibility -->
 
:<math>
\begin{align}
T_n(x) & = \frac{(x-\sqrt{x^2-1})^n+(x+\sqrt{x^2-1})^n}{2} \\
& = \sum_{k=0}^{\lfloor n/2\rfloor} \binom{n}{2k} (x^2-1)^k x^{n-2k} \\
& = x^n \sum_{k=0}^{\lfloor n/2\rfloor} \binom{n}{2k} (1 - x^{-2})^k \\
& = \tfrac{n}{2}\sum_{k=0}^{\lfloor n/2\rfloor}(-1)^k \frac{(n-k-1)!}{k!(n-2k)!}~(2x)^{n-2k} \quad (n>0) \\
& = n \sum_{k=0}^{n}(-2)^{k} \frac{(n+k-1)!} {(n-k)!(2k)!}(1 - x)^k \quad (n>0)\\
& = \, _2F_1\left(-n,n;\frac 1 2; \frac{1-x} 2 \right) \\
\end{align}
</math>
 
<!-- extra blank line for legibility -->
 
:<math>
\begin{align}
U_n(x) & = \frac{(x+\sqrt{x^2-1})^{n+1} - (x-\sqrt{x^2-1})^{n+1}}{2\sqrt{x^2-1}} \\
& = \sum_{k=0}^{\lfloor n/2\rfloor} \binom{n+1}{2k+1} (x^2-1)^k x^{n-2k} \\
& = x^n \sum_{k=0}^{\lfloor n/2\rfloor} \binom{n+1}{2k+1} (1 - x^{-2})^k \\
& =\sum_{k=0}^{\lfloor n/2\rfloor} \binom{2k-(n+1)}{k}~(2x)^{n-2k} \quad (n>0)\\
& =\sum_{k=0}^{\lfloor n/2\rfloor}(-1)^k \binom{n-k}{k}~(2x)^{n-2k} \quad (n>0)\\
& = \sum_{k=0}^{n}(-2)^{k} \frac{(n+k+1)!} {(n-k)!(2k+1)!}(1 - x)^k \quad (n>0)\\
& = (n+1) \, _2F_1\left(-n,n+2; \tfrac{3}{2}; \tfrac{1}{2}\left[1-x\right] \right)
\end{align}
</math>
 
where <math>_2F_1</math> is a [[hypergeometric function]].
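These closed forms agree with the recursive definition; as an illustrative check (assuming SymPy is available), the fourth expression for ''T''<sub>''n''</sub> above can be compared against SymPy's built-in Chebyshev polynomials:

<syntaxhighlight lang="python">
# Spot-check of T_n(x) = (n/2) * sum_k (-1)^k (n-k-1)!/(k! (n-2k)!) (2x)^(n-2k).
import sympy as sp

x = sp.symbols('x')
for n in range(1, 9):
    s = sum((-1)**k * sp.factorial(n - k - 1)
            / (sp.factorial(k) * sp.factorial(n - 2 * k)) * (2 * x)**(n - 2 * k)
            for k in range(n // 2 + 1))
    assert sp.expand(sp.Rational(n, 2) * s - sp.chebyshevt(n, x)) == 0
print("explicit formula matches chebyshevt for n = 1..8")
</syntaxhighlight>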
 
==Properties==
 
===Roots and extrema===
A Chebyshev polynomial of either kind with degree ''n'' has ''n'' different simple roots, called '''Chebyshev roots''', in the interval [&minus;1,1]. The roots of the Chebyshev polynomial of the first kind are sometimes called [[Chebyshev nodes]] because they are used as ''nodes'' in polynomial interpolation. Using the trigonometric definition and the fact that
 
:<math>\cos\left(\tfrac{\pi}{2}\,(2k+1)\right)=0</math>
 
one can easily prove that the roots of ''T''<sub>''n''</sub> are
 
:<math> x_k = \cos\left(\tfrac{\pi}{2}\,\frac{2k-1}{n}\right),\quad k=1,\ldots,n.</math>
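Numerically, these roots are straightforward to confirm; a small sketch (assuming NumPy is available; the degree 7 is an arbitrary choice) evaluates ''T''<sub>''n''</sub> at the claimed roots:

<syntaxhighlight lang="python">
# Evaluate T_7 at x_k = cos(pi (2k-1) / (2*7)) and confirm the values are ~0.
import numpy as np
from numpy.polynomial import chebyshev as C

n = 7
k = np.arange(1, n + 1)
nodes = np.cos(np.pi * (2 * k - 1) / (2 * n))      # claimed roots of T_n

coeffs = np.zeros(n + 1)
coeffs[-1] = 1.0                                   # Chebyshev series consisting of T_n alone
print(np.max(np.abs(C.chebval(nodes, coeffs))))    # of order 1e-15: the nodes are roots of T_7
</syntaxhighlight>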
 
Similarly, the roots of ''U''<sub>''n''</sub> are
 
:<math> x_k = \cos\left(\frac{k}{n+1}\pi\right),\quad k=1,\ldots,n.</math>
 
The [[Maxima and minima|extrema]] of ''T''<sub>''n''</sub> on the interval {{nowrap|&minus;1 ≤ ''x'' ≤ 1}} are located at
 
:<math> x_k = \cos\left(\tfrac{k}{n}\pi\right),\quad k=0,\ldots,n.</math>
 
One unique property of the Chebyshev polynomials of the first kind is that on the interval {{nowrap|&minus;1 ≤ ''x'' ≤ 1}} all of the [[Maxima and minima|extrema]] have values that are either &minus;1 or 1. Thus these polynomials have only two finite [[critical value]]s, the defining property of [[Shabat polynomial]]s. Both the first and second kinds of Chebyshev polynomial have extrema at the endpoints, given by:
 
:<math>T_n(1) = 1\,</math>
 
:<math>T_n(-1) = (-1)^n\,</math>
 
:<math>U_n(1) = n + 1\,</math>
 
:<math>U_n(-1) = (n + 1)(-1)^n.\,</math>
 
===Differentiation and integration===
The derivatives of the polynomials can be less than straightforward. By differentiating the polynomials in their trigonometric forms, it's easy to show that:
 
:<math>\frac{d T_n}{d x} = n U_{n - 1}\,</math>
 
:<math>\frac{d U_n}{d x} = \frac{(n + 1)T_{n + 1} - x U_n}{x^2 - 1}\,</math>
 
:<math>\frac{d^2 T_n}{d x^2} = n \frac{n T_n - x U_{n - 1}}{x^2 - 1} = n \frac{(n + 1)T_n - U_n}{x^2 - 1}.\,</math>
 
The last two formulas can be numerically troublesome due to the division by zero (0/0 [[indeterminate form]], specifically) at {{nowrap|''x'' {{=}} 1}} and {{nowrap|''x'' {{=}} &minus;1}}. It can be shown that:
 
:<math>\frac{d^2 T_n}{d x^2} \Bigg|_{x = 1} \!\! = \frac{n^4 - n^2}{3},</math>
 
:<math>\frac{d^2 T_n}{d x^2} \Bigg|_{x = -1} \!\! = (-1)^n \frac{n^4 - n^2}{3}.</math>
 
:Proof
----
The second derivative of the [[Chebyshev polynomial]] of the first kind is
 
:<math>T''_n = n \frac{n T_n - x U_{n - 1}}{x^2 - 1}</math>
 
which, if evaluated as shown above, poses a problem because it is [[indeterminate form|indeterminate]] at ''x'' = ±1. Since the function is a polynomial, (all of) the derivatives must exist for all real numbers, so taking the limit of the expression above should yield the desired value:
 
:<math>T''_n(1) = \lim_{x \to 1} n \frac{n T_n - x U_{n - 1}}{x^2 - 1}</math>
 
where only <math>x = 1</math> is considered for now. Factoring the denominator:
 
:<math>T''_n(1) = \lim_{x \to 1} n \frac{n T_n - x U_{n - 1}}{(x + 1)(x - 1)} =
\lim_{x \to 1} n \frac{\frac{n T_n - x U_{n - 1}}{x - 1}}{x + 1}.</math>
 
Since the limit as a whole must exist, the limit of the numerator and denominator must independently exist, and
 
:<math>T''_n(1) = n \frac{\lim_{x \to 1} \frac{n T_n - x U_{n - 1}}{x - 1}}{\lim_{x \to 1} (x + 1)} =
\frac{n}{2} \lim_{x \to 1} \frac{n T_n - x U_{n - 1}}{x - 1}
.</math>
 
The denominator still tends to zero, which implies that the numerator must also tend to zero, i.e. <math>U_{n - 1}(1) = n T_n(1) = n</math>, which will be useful later on. Since the numerator and denominator both tend to zero, [[L'Hôpital's rule]] applies:
 
:<math>\begin{align}
T''_n(1) & = \frac{n}{2} \lim_{x \to 1} \frac{\frac{d}{dx}(n T_n - x U_{n - 1})}{\frac{d}{dx}(x - 1)} \\
& = \frac{n}{2} \lim_{x \to 1} \frac{d}{dx}(n T_n - x U_{n - 1}) \\
& = \frac{n}{2} \lim_{x \to 1} \left(n^2 U_{n - 1} - U_{n - 1} - x \frac{d}{dx}(U_{n - 1})\right) \\
& = \frac{n}{2} \left(n^2 U_{n - 1}(1) - U_{n - 1}(1) - \lim_{x \to 1} x \frac{d}{dx}(U_{n - 1})\right) \\
& = \frac{n^4}{2} - \frac{n^2}{2} - \frac{1}{2} \lim_{x \to 1} \frac{d}{dx}(n U_{n - 1}) \\
& = \frac{n^4}{2} - \frac{n^2}{2} - \frac{T''_n(1)}{2} \\
T''_n(1) & = \frac{n^4 - n^2}{3}. \\
\end{align}</math>
 
The proof for <math>x = -1</math> is similar, with the fact that <math>T_n(-1) = (-1)^n</math> being important.
----
Indeed, the following, more general formula holds:
 
:<math>\frac{d^p T_n}{d x^p} \Bigg|_{x = \pm 1} \!\! = (\pm 1)^{n+p}\prod_{k=0}^{p-1}\frac{n^2-k^2}{2k+1}.</math>
 
This latter result is of great use in the numerical solution of eigenvalue problems.
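The endpoint formula can be spot-checked symbolically (an illustrative sketch, assuming SymPy is available; the range of ''n'' is arbitrary):

<syntaxhighlight lang="python">
# Check d^p T_n / dx^p at x = 1 against prod_{k=0}^{p-1} (n^2 - k^2)/(2k + 1).
import sympy as sp

x = sp.symbols('x')
for n in range(1, 6):
    for p in range(1, n + 1):
        lhs = sp.diff(sp.chebyshevt(n, x), x, p).subs(x, 1)
        rhs = sp.Integer(1)
        for k in range(p):
            rhs *= sp.Rational(n**2 - k**2, 2 * k + 1)
        assert sp.simplify(lhs - rhs) == 0
print("endpoint derivative formula verified for n = 1..5")
</syntaxhighlight>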
 
Concerning integration, the first derivative of the ''T''<sub>''n''</sub> implies that
 
:<math>\int U_n\, dx = \frac{T_{n + 1}}{n + 1}\,</math>
 
and the recurrence relation for the first kind polynomials involving derivatives establishes that
 
:<math>\int T_n\, dx = \frac{1}{2} \left(\frac{T_{n + 1}}{n + 1} - \frac{T_{n - 1}}{n - 1}\right) = \frac{n T_{n + 1}}{n^2 - 1} - \frac{x T_n}{n - 1} \mbox{ , }\quad n=2,3,\ldots\,</math>
 
===Orthogonality===
Both the {{math|''T''<sub>''n''</sub>}} and the {{math|''U''<sub>''n''</sub>}} form a sequence of [[orthogonal polynomials]]. The polynomials of the first kind are orthogonal with respect to the weight (which, up to a linear change of variable, is the [[Jeffreys prior]] of the Bernoulli parameter)
 
:<math>\frac{1}{\sqrt{1-x^2}}, \,\!</math>
 
on the interval [&minus;1,1], i.e. we have:
 
:<math>\int_{-1}^1 T_n(x)T_m(x)\,\frac{dx}{\sqrt{1-x^2}}=
\begin{cases}
0 &: n\ne m \\
\pi &: n=m=0\\
\pi/2 &: n=m\ne 0
\end{cases}
</math>
 
This can be proven by letting {{math|''x'' {{=}} cos (''θ'')}} and using the defining identity {{math|''T''<sub>''n''</sub>(cos (''θ'')) {{=}} cos (''nθ'')}}.
Similarly, the polynomials of the second kind are orthogonal with respect to the weight
 
:<math>\sqrt{1-x^2} \,\!</math>
 
on the interval [&minus;1,1], i.e. we have:
 
:<math>\int_{-1}^1 U_n(x)U_m(x)\sqrt{1-x^2}\,dx =
\begin{cases}
0 &: n\ne m, \\
\pi/2 &: n=m.
\end{cases}
</math>
 
(Note that the measure <math>\sqrt{1-x^2} \,dx</math> is, to within a normalizing constant, the [[Wigner semicircle distribution]]).
 
The {{math|''T''<sub>''n''</sub>}}  also satisfy a discrete orthogonality condition:
 
:<math> \sum_{k=0}^{N-1}{T_i(x_k)T_j(x_k)} =
\begin{cases}
0 &: i\ne j \\
N &: i=j=0 \\
N/2 &: i=j\ne 0
\end{cases} \,\!
</math>
 
where the {{math|''x''<sub>''k''</sub>}} are the ''N'' zeros of {{math|''T''<sub>''N''</sub>(''x'')}} (the [[Chebyshev nodes|Chebyshev&ndash;Gauss points]])
 
:<math> x_k=\cos\left(\frac{\pi\left(k+\frac{1}{2}\right)}{N}\right) .</math>
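The discrete orthogonality condition is easy to check numerically; a short sketch follows (assuming NumPy is available; the grid size ''N'' = 16 is arbitrary):

<syntaxhighlight lang="python">
# Check sum_k T_i(x_k) T_j(x_k) over the N zeros x_k = cos(pi (k + 1/2) / N) of T_N.
import numpy as np

N = 16
k = np.arange(N)
xk = np.cos(np.pi * (k + 0.5) / N)

def T(i, x):
    return np.cos(i * np.arccos(x))   # trigonometric definition, valid for |x| <= 1

for i in range(5):
    for j in range(5):
        s = np.sum(T(i, xk) * T(j, xk))
        expected = 0.0 if i != j else (N if i == 0 else N / 2)
        assert abs(s - expected) < 1e-10
print("discrete orthogonality verified for i, j = 0..4 with N = 16")
</syntaxhighlight>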
 
===Minimal ∞-norm===
For any given ''n'' ≥ 1, among the polynomials of degree ''n'' with leading coefficient 1,
 
:<math>f(x) = \frac1{2^{n-1}}T_n(x)</math>
 
is the one whose maximal absolute value on the interval [&minus;1,&nbsp;1] is minimal.
 
This maximal absolute value is
 
:<math>\frac1{2^{n-1}}</math>
 
and |ƒ(''x'')| reaches this maximum exactly {{nowrap|''n'' + 1}} times at
 
: <math>x = \cos \frac{k\pi}{n}\text{ for }0 \le k \le n.</math>
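This extremal property is easy to observe numerically; the following sketch (assuming NumPy is available; the degree and the random seed are arbitrary) compares the monic Chebyshev polynomial against a few random monic polynomials:

<syntaxhighlight lang="python">
# Compare max |2^{1-n} T_n| on [-1, 1] with a few random monic polynomials of degree n.
import numpy as np
from numpy.polynomial import chebyshev as C, polynomial as P

n = 6
xs = np.linspace(-1.0, 1.0, 2001)

cheb = np.zeros(n + 1)
cheb[-1] = 1.0                                          # Chebyshev series equal to T_n
monic_cheb = C.chebval(xs, cheb) / 2.0 ** (n - 1)
print("2^(1-n) T_n :", np.max(np.abs(monic_cheb)))      # 0.03125 = 2^(1-n)

rng = np.random.default_rng(0)
for _ in range(3):
    c = np.append(rng.normal(size=n), 1.0)              # random monic polynomial (monomial basis)
    print("random monic:", np.max(np.abs(P.polyval(xs, c))))   # never smaller than 2^(1-n)
</syntaxhighlight>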
 
:Proof
----
Suppose that <math>w_n(x)</math> is a polynomial of degree ''n'' with leading coefficient 1 whose maximal absolute value on the interval [&minus;1,&nbsp;1] is less than <math>\frac1{2^{n-1}}</math>.
 
Define
 
:<math>f_n(x) = \frac1{2^{n-1}}T_n(x) - w_n(x)</math>
 
Because at the extreme points of <math>T_n</math> we have <math>|w_n(x)| < \left|\frac1{2^{n-1}}T_n(x)\right|</math>, it follows that
 
:<math>f_n(x) > 0 \text{ for } x = \cos \frac{2k\pi}{n} \text{ where } 0 \le 2k \le n</math>
 
:<math>f_n(x) < 0 \text{ for } x = \cos \frac{(2k + 1)\pi}{n} \text{ where } 0 \le 2k + 1 \le n</math>
 
From the [[intermediate value theorem]], <math>f_n(x)</math> has at least ''n'' roots.  However, this is impossible, as <math>f_n(x)</math> is a polynomial of degree {{nowrap|''n'' − 1}}, so the [[fundamental theorem of algebra]] implies it has at most {{nowrap|''n'' − 1}} roots.
 
===Other properties===
The Chebyshev polynomials are a special case of the ultraspherical or [[Gegenbauer polynomials]], which themselves are a special case of the [[Jacobi polynomials]]:
* <math>T_n(x)= \frac 1{{n-\frac 1 2 \choose n}} P_n^{-\frac 1 2, -\frac 1 2}(x)= \frac n 2 \lim_{q \to 0} \frac 1 q \, C_n^q(x) \quad (n \ge 1),</math>
* <math>U_n(x)= \frac {n+1}{{n+\frac 1 2 \choose n}} P_n^{\frac 1 2,  \frac 1 2}(x)= C_n^1(x).</math>
 
For every nonnegative integer ''n'', ''T''<sub>''n''</sub>(''x'') and ''U''<sub>''n''</sub>(''x'') are both polynomials of degree ''n''. They are [[Even and odd functions|even or odd functions]] of ''x'' according to whether ''n'' is even or odd, so when written as polynomials of ''x'', they have only even or odd degree terms, respectively.  In fact,
:<math>T_{2n}(x)=T_n\left(2x^2-1\right)=2 T_n(x)^2-1</math>
and
:<math>2 x U_n\left(1-2x^2\right)= (-1)^n U_{2n+1}(x).</math>
 
The leading coefficient of ''T''<sub>''n''</sub> is {{nowrap|2<sup>''n'' &minus; 1</sup>}} if {{nowrap|''n'' ≥ 1}}, but 1 if {{nowrap|''n'' {{=}} 0}}.
 
''T''<sub>''n''</sub> are a special case of [[Lissajous curve]]s with frequency ratio equal to ''n''.
 
Several polynomial sequences like [[Lucas polynomials]] (''L''<sub>''n''</sub>), [[Dickson polynomials]]  (''D''<sub>''n''</sub>), [[Fibonacci polynomials]] (''F''<sub>''n''</sub>) are related to Chebyshev polynomials ''T''<sub>''n''</sub> and ''U''<sub>''n''</sub>.
 
The Chebyshev polynomials of the first kind satisfy the relation
 
:<math> T_j(x) T_k(x) = \tfrac{1}{2}\left( T_{j+k}(x) + T_{|k-j|}(x)\right),\quad\forall j,k\ge 0,\,</math>
 
which is easily proved from the [[List_of_trigonometric_identities#Product-to-sum_and_sum-to-product_identities|product-to-sum formula]] for the cosine. The polynomials of the second kind satisfy the similar relation
 
:<math> T_j(x) U_k(x) = \tfrac{1}{2}\left( U_{j+k}(x) + U_{k-j}(x)\right),\quad k \ge j \ge 0 \,</math>

(with the convention <math>U_{-1}(x) \equiv 0</math>, the relation also holds for <math>k = j - 1</math>).
 
Similar to the formula
 
:<math> T_n\left(\cos\theta\right) = \cos(n \theta), </math>
 
we have the analogous formula
 
:<math> T_{2n+1}\left(\sin\theta\right) = (-1)^n \sin((2n+1)\theta) </math>.
 
For <math> x\ne 0</math>,
:<math> T_n\left(\tfrac{1}{2}\left[x+x^{-1}\right]\right)=\tfrac{1}{2}\left(x^n+x^{-n}\right)</math> and
:<math>x^n= T_n\left(\tfrac{1}{2}\left[x+x^{-1}\right]\right)+ \tfrac{1}{2}\left(x-x^{-1}\right) U_{n-1}\left(\tfrac{1}{2}\left[x+x^{-1}\right]\right)</math>,
which follows from the fact that this holds by definition for <math> x = e^{i\theta}</math>.
 
Let
:<math>C_n(x)=2T_n \left(\frac{x}{2}\right)</math>.
Then <math> C_n(x)</math> and <math> C_m(x)</math> are commuting polynomials:
:<math>C_n\left(C_m(x)\right)=C_m(C_n(x))</math>,
as is evident in the [[Abelian group|Abelian]] nesting property specified above.
 
==Examples==
=== First kind ===
[[File:Chebyshev Polynomials of the 1st Kind (n=0-5, x=(-1,1)).svg|thumb|300px|The first few Chebyshev polynomials of the first kind in the domain {{nowrap|&minus;1 < ''x'' < 1}}:
The flat <span style="color:purple;">''T''<sub>0</sub></span>, <span style="color:red;">''T''<sub>1</sub></span>, <span style="color:blue;">''T''<sub>2</sub></span>, <span style="color:green;">''T''<sub>3</sub></span>, <span style="color:orange;">''T''<sub>4</sub></span> and <span style="color:black;">''T''<sub>5</sub></span>.]]
 
The first few Chebyshev polynomials of the first kind are {{OEIS2C|A028297}}
 
:<math> T_0(x) = 1 \,</math>
 
:<math> T_1(x) = x \,</math>
 
:<math> T_2(x) = 2x^2 - 1 \,</math>
 
:<math> T_3(x) = 4x^3 - 3x \,</math>
 
:<math> T_4(x) = 8x^4 - 8x^2 + 1 \,</math>
 
:<math> T_5(x) = 16x^5 - 20x^3 + 5x \,</math>
 
:<math> T_6(x) = 32x^6 - 48x^4 + 18x^2 - 1 \,</math>
 
:<math> T_7(x) = 64x^7 - 112x^5 + 56x^3 - 7x \,</math>
 
:<math> T_8(x) = 128x^8 - 256x^6 + 160x^4 - 32x^2 + 1 \,</math>
 
:<math> T_9(x) = 256x^9 - 576x^7 + 432x^5 - 120x^3 + 9x. \,</math>
 
:<math> T_{10}(x) = 512x^{10} - 1280x^8 + 1120x^6 - 400x^4 + 50x^2-1. \,</math>
 
:<math> T_{11}(x) = 1024x^{11} - 2816x^9 + 2816x^7 - 1232x^5 +220x^3 - 11x. \,</math>
 
=== Second kind ===
[[File:Chebyshev Polynomials of the 2nd Kind (n=0-5, x=(-1,1)).svg|thumb|300px|The first few Chebyshev polynomials of the second kind in the domain &minus;1&nbsp;<&nbsp;''x''&nbsp;<&nbsp;1: The flat <span style="color:purple;">''U''<sub>0</sub></span>, <span style="color:red;">''U''<sub>1</sub></span>, <span style="color:blue;">''U''<sub>2</sub></span>, <span style="color:green;">''U''<sub>3</sub></span>, <span style="color:orange;">''U''<sub>4</sub></span> and <span style="color:black;">''U''<sub>5</sub></span>.
Although not visible in the image, {{nowrap|''U''<sub>''n''</sub>(1) {{=}} ''n'' + 1}} and {{nowrap|''U''<sub>''n''</sub>(&minus;1) {{=}} (''n'' + 1)(&minus;1)<sup>''n''</sup>}}.]]
 
The first few Chebyshev polynomials of the second kind are
 
:<math> U_0(x) = 1 \,</math>
 
:<math> U_1(x) = 2x \,</math>
 
:<math> U_2(x) = 4x^2 - 1 \,</math>
 
:<math> U_3(x) = 8x^3 - 4x \,</math>
 
:<math> U_4(x) = 16x^4 - 12x^2 + 1 \,</math>
 
:<math> U_5(x) = 32x^5 - 32x^3 + 6x \,</math>
 
:<math> U_6(x) = 64x^6 - 80x^4 + 24x^2 - 1 \,</math>
 
:<math> U_7(x) = 128x^7 - 192x^5 + 80x^3 - 8x \,</math>
 
:<math> U_8(x) = 256x^8 - 448 x^6 + 240 x^4 - 40 x^2 + 1 \,</math>
 
:<math> U_9(x) = 512x^9 - 1024 x^7 + 672 x^5 - 160 x^3 + 10 x. \,</math>
 
==As a basis set==
[[Image:ChebyshevExpansion.png|thumb|right|240px|The non-smooth function (top) {{nowrap|''y'' {{=}} &minus;''x''<sup>3</sup>''H''(&minus;''x'')}}, where ''H'' is the [[Heaviside step function]], and (bottom) the 5th partial sum of its Chebyshev expansion. The 7th sum is indistinguishable from the original function at the resolution of the graph.]]
 
In the appropriate [[Sobolev space]], the Chebyshev polynomials form an [[orthogonal basis]], so that a function in the same space can, on {{nowrap|&minus;1 ≤ ''x'' ≤ 1}}, be expressed via the expansion:<ref name = boyd>{{cite book|title = Chebyshev and Fourier Spectral Methods| first = John P.| last = Boyd|isbn = 0-486-41183-4|edition = second|year = 2001| publisher = Dover| url = http://www-personal.umich.edu/~jpboyd/aaabook_9500may00.pdf}}</ref>
 
:<math>f(x) = \sum_{n = 0}^\infty a_n T_n(x).</math>
 
Furthermore, as mentioned previously, the Chebyshev polynomials form an [[orthogonal]] basis which (among other things) implies that the coefficients ''a''<sub>''n''</sub> can be determined easily through the application of an [[inner product]]. This sum is called a '''Chebyshev series''' or a '''Chebyshev expansion'''.
 
Since a Chebyshev series is related to a [[Fourier cosine series]] through a change of variables, all of the theorems, identities, etc. that apply to [[Fourier series]] have a Chebyshev counterpart.<ref name=boyd/> These attributes include:
 
* The Chebyshev polynomials form a [[Complete metric space|complete]] orthogonal system.
* The Chebyshev series converges to ƒ(''x'') if the function is [[piecewise]] [[Smooth function|smooth]] and [[Continuous function|continuous]]. The smoothness requirement can be relaxed in most cases — as long as there are a finite number of discontinuities in ƒ(''x'') and its derivatives.
* At a discontinuity, the series will converge to the average of the right and left limits.
 
The abundance of the theorems and identities inherited from [[Fourier series]] makes the Chebyshev polynomials important tools in [[numerical analysis]]; for example, they are the most popular general purpose basis functions used in the [[spectral method]],<ref name=boyd/> often preferred over trigonometric series because of generally faster convergence for continuous functions ([[Gibbs' phenomenon]] is still a problem).
 
===Example 1===
 
Consider the Chebyshev expansion of <math> \log(1+x) </math>. One can express
 
:<math> \log(1+x) = \sum_{n = 0}^\infty a_n T_n(x). </math>
 
One can find the coefficients <math> a_n </math> either through the application of an [[inner product]] or by the discrete orthogonality condition. For the inner product,
 
:<math>
\int_{-1}^{+1}\frac{T_m(x)\log(1+x)}{\sqrt{1-x^2}}dx = \sum_{n=0}^{\infty}a_n\int_{-1}^{+1}\frac{T_m(x)T_n(x)}{\sqrt{1-x^2}}dx,
</math>
 
which gives
 
:<math>
a_n=
\begin{cases}
-\log(2) &:n = 0 \\
\frac{-2(-1)^n}{n} &: n > 0.
\end{cases}
</math>
 
Alternatively, when the inner product of the function being approximated cannot be evaluated, the discrete orthogonality condition gives
 
:<math>
a_n=\frac{2-\delta_{0n}}{N}\sum_{k=0}^{N-1}T_n(x_k)\log(1+x_k),
</math>
 
where <math> \delta_{ij} </math> is the [[Kronecker delta]] function and the <math> x_k </math> are the ''N'' zeros of <math> T_N(x) </math> (the Chebyshev&ndash;Gauss points)
 
:<math> x_k=\cos\left(\frac{\pi\left(k+\frac{1}{2}\right)}{N}\right) .</math>
 
This allows us to compute the coefficients <math> a_n </math> very efficiently through the [[discrete cosine transform]]
 
:<math>
a_n=\frac{2-\delta_{0n}}{N}\sum_{k=0}^{N-1}\cos\left(\frac{n\pi\left(k+\frac{1}{2}\right)}{N}\right)\log(1+x_k).
</math>
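As an illustrative sketch (assuming NumPy is available; the truncation at 10 coefficients and ''N'' = 4096 nodes are arbitrary choices), the discrete coefficients can be computed and compared with the closed form above:

<syntaxhighlight lang="python">
# Compute the discrete Chebyshev coefficients of log(1 + x) and compare with
# a_0 = -log 2, a_n = -2 (-1)^n / n.
import numpy as np

N = 4096
k = np.arange(N)
theta = np.pi * (k + 0.5) / N
xk = np.cos(theta)                     # zeros of T_N
fk = np.log1p(xk)                      # samples of log(1 + x)

n = np.arange(10)
A = np.cos(np.outer(n, theta))         # A[n, k] = T_n(x_k)
a = (A @ fk) * 2.0 / N                 # a_n = (2 - delta_{0n})/N * sum_k T_n(x_k) f(x_k)
a[0] /= 2.0

exact = np.empty(10)
exact[0] = -np.log(2.0)
m = np.arange(1, 10)
exact[1:] = -2.0 * (-1.0) ** m / m
# agreement is only O(1/N) here because log(1 + x) is singular at x = -1
print(np.max(np.abs(a - exact)))
</syntaxhighlight>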
 
===Example 2===
To provide another example:
 
:<math>\begin{align}(1-x^2)^\alpha=& -\frac 1 {\sqrt \pi}\frac{\Gamma(\frac 1 2+\alpha)}{\Gamma(\alpha+1)}+ 2^{1-2\alpha} \sum_{n=0} (-1)^n {2\alpha \choose \alpha-n} T_{2n}(x)\\=& 2^{-2\alpha}\sum_{n=0} (-1)^n {2\alpha+1 \choose \alpha-n} U_{2n}(x).\end{align}</math>
 
===Partial sums===
The partial sums of
 
:<math>f(x) = \sum_{n = 0}^\infty a_n T_n(x)</math>
 
are very useful in the [[approximation theory|approximation]] of various functions and in the solution of [[differential equation]]s (see [[spectral method]]). Two common methods for determining the coefficients ''a''<sub>''n''</sub> are through the use of the [[inner product]] as in [[Galerkin's method]] and through the use of [[collocation method|collocation]] which is related to [[interpolation]].
 
As an interpolant, the ''N'' coefficients of the {{nowrap|(''N'' &minus; 1)<sup>th</sup>}} partial sum are usually obtained on the Chebyshev&ndash;Gauss&ndash;Lobatto<ref>[http://www.joma.org/images/upload_library/4/vol6/Sarra/Chebyshev.html Chebyshev Interpolation: An Interactive Tour]</ref> points (or Lobatto grid), which results in minimum error and avoids [[Runge's phenomenon]] associated with a uniform grid. This collection of points corresponds to the extrema of the highest order polynomial in the sum, plus the endpoints and is given by:
 
:<math>x_k = -\cos\left(\frac{k \pi}{N - 1}\right) ; \qquad \ k = 0, 1, \dots, N - 1.</math>
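As an illustration (assuming NumPy is available; the choice of the Runge function and ''N'' = 21 points is arbitrary), interpolating on this grid avoids the large oscillations that a uniform grid produces near the endpoints:

<syntaxhighlight lang="python">
# Interpolate the Runge function 1/(1 + 25 x^2) on the Chebyshev-Gauss-Lobatto grid.
import numpy as np
from numpy.polynomial import chebyshev as C

f = lambda x: 1.0 / (1.0 + 25.0 * x**2)
N = 21
k = np.arange(N)
xk = -np.cos(k * np.pi / (N - 1))            # Lobatto grid, endpoints included

coeffs = C.chebfit(xk, f(xk), N - 1)         # degree N-1 interpolant through the N points
xs = np.linspace(-1.0, 1.0, 1001)
print(np.max(np.abs(C.chebval(xs, coeffs) - f(xs))))   # modest error, decreasing with N
</syntaxhighlight>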
 
===Polynomial in Chebyshev form===
An arbitrary polynomial of degree ''N'' can be written in terms of the Chebyshev polynomials of the first kind. Such a polynomial ''p''(''x'') is of the form
 
:<math>p(x) = \sum_{n=0}^{N} a_n T_n(x).</math>
 
Polynomials in Chebyshev form can be evaluated using the [[Clenshaw algorithm]].
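A minimal sketch of the Clenshaw recurrence in Python (illustrative only; the function name is chosen here for convenience):

<syntaxhighlight lang="python">
# Evaluate p(x) = sum_{n=0}^{N} a_n T_n(x) by the Clenshaw recurrence,
# without constructing the T_n explicitly.
def clenshaw_chebyshev(a, x):
    """Evaluate the Chebyshev series with coefficients a = [a_0, ..., a_N] at x."""
    b_next, b_next2 = 0.0, 0.0                  # b_{k+1}, b_{k+2}
    for ak in reversed(a[1:]):
        b_next, b_next2 = ak + 2.0 * x * b_next - b_next2, b_next
    return a[0] + x * b_next - b_next2

# sanity check against T_3(x) = 4x^3 - 3x at x = 0.3
print(clenshaw_chebyshev([0.0, 0.0, 0.0, 1.0], 0.3), 4 * 0.3**3 - 3 * 0.3)
</syntaxhighlight>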
 
==Spread polynomials==
The [[spread polynomials]] are in a sense equivalent to the Chebyshev polynomials of the first kind, but enable one to avoid square roots and conventional trigonometric functions in certain contexts, notably in [[rational trigonometry]].
 
==See also==
*[[Chebyshev filter]]
*[[Chebyshev cube root]]
*[[Dickson polynomials]]
*[[Legendre polynomials]]
*[[Hermite polynomials]]
*[[Chebyshev rational functions]]
*[[Approximation theory]]
*[[Chebfun|The Chebfun system]]
*[[Discrete Chebyshev transform]]
 
==Notes==
{{Reflist}}
 
==References==
* {{Abramowitz Stegun ref|22|773}}
* Dette, Holger (1995), [http://journals.cambridge.org/download.php?file=%2FPEM%2FPEM2_38_02%2FS001309150001912Xa.pdf&code=c12b9a2fc1aba9e27005a5003ed21b36 A Note on Some Peculiar Nonlinear Extremal Phenomena of the Chebyshev Polynomials], ''Proceedings of the Edinburgh Mathematical Society''  '''38''', 343-355
*Eremenko, A.; Lempert, L. (1994), [http://www.math.purdue.edu/~eremenko/dvi/lempert.pdf An Extremal Problem For Polynomials],  ''Proceedings of the American Mathematical Society'', Volume 122,  Number  1, 191-193
*{{dlmf|id=18|title=Orthogonal Polynomials|first=Tom H. |last=Koornwinder|first2=Roderick S. C.|last2= Wong|first3=Roelof |last3=Koekoek||first4=René F. |last4=Swarttouw}}
* Remes, Eugene, [http://www.math.technion.ac.il/hat/fpapers/remeztrans.pdf On an Extremal Property of Chebyshev Polynomials]
*{{eom|id=C/c021940|first=P.K.|last= Suetin}}
 
==External links==
* {{MathWorld|urlname=ChebyshevPolynomialoftheFirstKind|title=Chebyshev Polynomial of the First Kind}}
* [http://math.fullerton.edu/mathews/n2003/ChebyshevPolyMod.html Module for Chebyshev Polynomials by John H. Mathews]
* [http://www.joma.org/images/upload_library/4/vol6/Sarra/Chebyshev.html Chebyshev Interpolation: An Interactive Tour], includes illustrative [[Java applet]].
* Numerical Computing with Functions: [http://www.maths.ox.ac.uk/chebfun/ The Chebfun Project]
* [http://mathoverflow.net/questions/25534/is-there-an-intuitive-explanation-for-an-extremal-property-of-chebyshev-polynomia Is there an intuitive explanation for an extremal property of Chebyshev polynomials?]
 
[[Category:Special hypergeometric functions]]
[[Category:Orthogonal polynomials]]
[[Category:Numerical analysis]]
[[Category:Approximation theory]]
