In [[mathematics]], two [[function (mathematics)|functions]] <math>f</math> and <math>g</math> are called '''orthogonal''' if their [[inner product]] <math>\langle f,g\rangle</math> is zero. How the inner product of two functions is defined may vary depending on context; however, a typical definition of an inner product for functions is
 
:<math> \langle f,g\rangle = \int f^*(x) g(x)\,dx  </math>
 
with appropriate [[integral|integration]] boundaries. Here, the asterisk indicates the [[complex conjugate]] of ''f''.
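As a standard worked example (not drawn from this article): on the interval [−π,&nbsp;π], the real functions sin and cos are orthogonal under this inner product. Both functions are real, so the conjugate has no effect, and

:<math> \langle \sin, \cos\rangle = \int_{-\pi}^{\pi} \sin(x)\cos(x)\,dx = \frac{1}{2}\int_{-\pi}^{\pi} \sin(2x)\,dx = 0. </math>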
 
For another perspective on this inner product, suppose approximating vectors <math>\vec{f}</math> and <math>\vec{g}</math> are created whose entries are the values of the functions ''f'' and ''g'', sampled at equally spaced points. Then this inner product between ''f'' and ''g'' can be roughly understood as the dot product between approximating vectors <math>\vec{f}</math> and <math>\vec{g}</math>, in the limit as the number of sampling points goes to infinity. Thus, roughly, two functions are orthogonal if their approximating vectors are perpendicular (under this common inner product).[http://maze5.net/?page_id=369]
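The sampling picture above can be checked numerically. The following is a minimal Python sketch (the helper name <code>sampled_inner_product</code> is hypothetical, chosen for illustration): it forms the scaled dot product of sample vectors, which approximates the integral inner product as the number of samples grows.

```python
import math

def sampled_inner_product(f, g, a, b, n=10000):
    """Approximate <f, g> on [a, b] by a scaled dot product of
    sample vectors taken at n equally spaced (midpoint) samples."""
    h = (b - a) / n
    xs = [a + (k + 0.5) * h for k in range(n)]
    return sum(f(x) * g(x) for x in xs) * h

# sin and cos are orthogonal on [-pi, pi]: the dot product of their
# sample vectors tends to zero as n grows.
print(sampled_inner_product(math.sin, math.cos, -math.pi, math.pi))

# By contrast, <sin, sin> on [-pi, pi] approaches pi (the squared norm).
print(sampled_inner_product(math.sin, math.sin, -math.pi, math.pi))
```

As the text suggests, the first result is (numerically) zero because the two sample vectors are nearly perpendicular, while the second is the squared length of the sin sample vector in the continuum limit.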
 
Solutions of linear [[differential equation]]s with boundary conditions can often be written as a weighted sum of orthogonal solution functions (a.k.a. [[eigenfunction]]s).
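The weights in such an expansion are found by projecting onto each orthogonal function: by orthogonality, the coefficient of each eigenfunction is the inner product with it divided by its squared norm. A small Python sketch (assuming the sine eigenfunctions <math>\sin(nx)</math> on [−π,&nbsp;π], a standard example not taken from this article; <code>sine_coefficient</code> is a hypothetical helper name):

```python
import math

def sine_coefficient(f, n, samples=20000):
    """Coefficient of sin(n x) in the expansion of f on [-pi, pi]:
    <f, sin(n.)> / <sin(n.), sin(n.)>, computed by midpoint sampling."""
    h = 2 * math.pi / samples
    xs = [-math.pi + (k + 0.5) * h for k in range(samples)]
    num = sum(f(x) * math.sin(n * x) for x in xs) * h
    den = sum(math.sin(n * x) ** 2 for x in xs) * h  # equals pi
    return num / den

# For f(x) = x the exact Fourier sine coefficients are 2 * (-1)**(n+1) / n.
for n in range(1, 4):
    print(n, sine_coefficient(lambda x: x, n))
```

Because the sines are mutually orthogonal, each coefficient can be computed independently of all the others; this is exactly what makes such expansions practical.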
 
Examples of sets of orthogonal functions:
*[[Fourier series|Sines and cosines]]
*[[Bessel function]]s
*[[Hermite polynomials]]
*[[Legendre polynomials]]
*[[Spherical harmonics]]
*[[Walsh function]]s
*[[Zernike polynomials]]
*[[Chebyshev polynomials]]
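Orthogonality of one family in this list can be verified numerically. The sketch below (Python, helper names hypothetical) builds [[Legendre polynomials]] from the standard three-term recurrence <math>(n+1)P_{n+1}(x) = (2n+1)x\,P_n(x) - n\,P_{n-1}(x)</math> and checks that distinct orders are orthogonal on [−1,&nbsp;1]:

```python
import math

def legendre(n, x):
    """P_n(x) via the recurrence (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}."""
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

def inner(m, n, samples=20000):
    """Midpoint approximation of the inner product of P_m and P_n on [-1, 1]."""
    h = 2.0 / samples
    xs = [-1 + (k + 0.5) * h for k in range(samples)]
    return sum(legendre(m, x) * legendre(n, x) for x in xs) * h

print(inner(2, 3))  # distinct orders: numerically zero
print(inner(2, 2))  # same order: the known squared norm 2/(2n+1) = 2/5
```

The same test applied to any two distinct members of the family returns (numerically) zero, which is what makes these families useful as building blocks for expansions.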
 
==Generalization of vectors==
It can be shown that orthogonality of functions is a generalization of the concept of orthogonality of vectors. Suppose we define ''V'' to be the set of variables on which the functions ''f'' and ''g'' operate. (In the example above, ''V''&nbsp;=&nbsp;{''x''}, since ''x'' is the only parameter to ''f'' and ''g''. Because there is one parameter, one integral sign is required to determine orthogonality. If ''V'' contained two variables, it would be necessary to integrate twice, over a range of each variable, to establish orthogonality.) If ''V'' is the empty set, then ''f'' and ''g'' are just constant vectors, and there are no variables over which to integrate; the equation then reduces to the ordinary inner product of the two vectors.
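Both ends of this spectrum can be illustrated numerically. The Python sketch below (helper name <code>inner2d</code> is hypothetical; the two-variable functions are chosen only as an example) uses a double midpoint sum when ''V'' has two variables, and a plain dot product when ''V'' is empty:

```python
import math

def inner2d(f, g, n=200):
    """With V = {x, y}, orthogonality needs a double integral;
    approximate it by a midpoint sum over an n-by-n grid on [-pi, pi]^2."""
    h = 2 * math.pi / n
    total = 0.0
    for i in range(n):
        x = -math.pi + (i + 0.5) * h
        for j in range(n):
            y = -math.pi + (j + 0.5) * h
            total += f(x, y) * g(x, y)
    return total * h * h

# sin(x)sin(y) and sin(x)cos(y) are orthogonal on the square [-pi, pi]^2.
print(inner2d(lambda x, y: math.sin(x) * math.sin(y),
              lambda x, y: math.sin(x) * math.cos(y)))

# With V empty, f and g are constant vectors and the inner product
# is the ordinary dot product: 1*2 + 2*1 + 2*(-2) = 0.
f_vec, g_vec = (1, 2, 2), (2, 1, -2)
print(sum(a * b for a, b in zip(f_vec, g_vec)))
```

The two cases use the same underlying idea, multiply componentwise and sum, with the integral playing the role of the sum when the "components" form a continuum.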
 
==See also==
* [[Hilbert space]]
* [[Harmonic analysis]]
* [[Orthogonal polynomials]]
* [[Orthonormal basis]]
* [[Eigenfunction]]
* [[Eigenvalues and eigenvectors]]
{{Unreferenced|date=January 2008}}
 
[[Category:Functional analysis]]
