In [[machine learning]], a '''margin classifier''' is a [[classifier]] that is able to give an associated distance from the decision boundary for each example. For instance, if a [[linear classifier]] (e.g. a [[perceptron]] or [[linear discriminant analysis]]) is used, the distance (typically the [[Euclidean distance]], though others may be used) of an example from the separating hyperplane is the margin of that example.
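As a concrete sketch of this definition, the signed distance of an example from a linear decision boundary can be computed directly. The weight vector and example below are purely illustrative, not from any particular trained model:

```python
import numpy as np

# Hypothetical linear classifier with decision boundary w.x + b = 0.
w = np.array([3.0, 4.0])
b = -2.0

def signed_margin(x, y):
    """Signed Euclidean distance of example (x, y) from the hyperplane.

    Positive when the example lies on the side matching its label y in {-1, +1}.
    """
    return y * (w @ x + b) / np.linalg.norm(w)

x = np.array([2.0, 1.0])     # w.x + b = 6 + 4 - 2 = 8, ||w|| = 5
print(signed_margin(x, +1))  # 1.6  (correctly labeled: positive margin)
print(signed_margin(x, -1))  # -1.6 (incorrectly labeled: negative margin)
```

Dividing by the norm of the weight vector is what turns the raw classifier score into a geometric distance.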
 
The notion of margin is important in several machine learning classification algorithms, as it can be used to bound the [[generalization error]] of the classifier.  These bounds are frequently shown using the [[VC dimension]].  Of particular prominence is the generalization [[error bound]] on [[Boosting (meta-algorithm)|boosting]] algorithms and [[support vector machine]]s.
 
==Support vector machine definition of margin==
See [[support vector machine]]s and [[maximum-margin hyperplane]] for details.
 
==Margin for boosting algorithms==
The margin for an iterative [[Boosting (machine learning)|boosting]] algorithm given a set of examples with two classes can be defined as follows.  The classifier is given an example pair <math>(x,y)</math>, where <math>x \in X</math> is a point in the domain space and <math>y \in Y = \{-1, +1\}</math> is the label of the example.  The iterative boosting algorithm then selects a classifier <math>h_j \in C</math> at each iteration <math>j</math>, where <math>C</math> is a space of possible classifiers that predict real values. This hypothesis is then weighted by <math>\alpha_j \in \mathbb{R}</math>, as selected by the boosting algorithm. At iteration <math>t</math>, the margin of a labeled example <math>(x,y)</math> can thus be defined as
 
: <math>\frac{y \sum_{j=1}^t \alpha_j h_j (x)}{\sum_{j=1}^t |\alpha_j|}.</math>
 
By this definition, the margin is positive if the example is labeled correctly and negative if the example is labeled incorrectly.
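The definition above can be evaluated directly for a toy ensemble. The decision stumps and weights below are hypothetical, chosen only to make the arithmetic easy to follow:

```python
# Margin of one example under an ensemble of t weighted hypotheses,
# following the definition above: y * sum_j alpha_j h_j(x) / sum_j |alpha_j|.
def boosting_margin(x, y, hypotheses, alphas):
    """Normalized margin; lies in [-1, 1] when each h_j(x) is in [-1, 1]."""
    num = sum(a * h(x) for a, h in zip(alphas, hypotheses))
    den = sum(abs(a) for a in alphas)
    return y * num / den

# Three toy decision stumps on a scalar input, with illustrative weights.
hs = [lambda x: 1 if x > 0 else -1,
      lambda x: 1 if x > 2 else -1,
      lambda x: 1 if x > -1 else -1]
alphas = [0.5, 0.2, 0.3]

# For x = 1.0 the stumps vote +1, -1, +1, so the weighted vote is
# (0.5 - 0.2 + 0.3) / 1.0 = 0.6; the sign of y flips the margin's sign.
print(boosting_margin(1.0, +1, hs, alphas))  # approximately 0.6
print(boosting_margin(1.0, -1, hs, alphas))  # approximately -0.6
```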
 
This definition may be modified and is not the only way to define margin for boosting algorithms.  However, there are reasons why this definition may be appealing.<ref name="Statistics 1686">Robert E. Schapire, Yoav Freund, Peter Bartlett and Wee Sun Lee (1998). "Boosting the margin: A new explanation for the effectiveness of voting methods", ''The Annals of Statistics'', 26(5):1651–1686.</ref>
 
==Examples of margin-based algorithms==
Many classifiers can give an associated margin for each example.  However, only some classifiers make use of this margin information while learning from a data set.
 
Many boosting algorithms rely on the notion of a margin to assign weights to examples.  If a convex loss is utilized (as in [[AdaBoost]], [[LogitBoost]], and all members of the [[AnyBoost]] family of algorithms), an example with a higher margin will receive less (or equal) weight than an example with a lower margin. This leads the boosting algorithm to focus its weight on low-margin examples.  In nonconvex algorithms (e.g. [[BrownBoost]]), the margin still dictates the weighting of an example, though the weighting is non-monotone with respect to the margin.  There exist boosting algorithms that provably maximize the minimum margin (e.g. see <ref>Manfred Warmuth, Karen Glocer and Gunnar Rätsch. Boosting Algorithms for Maximizing the Soft Margin. In Proceedings of Advances in Neural Information Processing Systems 20, 2007, pp. 1585–1592.</ref>).
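The monotone relationship between margin and weight under a convex loss can be sketched with AdaBoost's exponential loss, where an example's weight is proportional to <math>\exp(-y f(x))</math> for the unnormalized vote <math>f(x) = \sum_j \alpha_j h_j(x)</math>. The vote values below are illustrative, not output of a real training run:

```python
import math

# Under AdaBoost's exponential loss, an example's weight is proportional
# to exp(-y * f(x)), so lower-margin examples receive larger weight.
def adaboost_weights(votes, labels):
    """votes[i] = f(x_i); returns normalized weights over the examples."""
    raw = [math.exp(-y * f) for f, y in zip(votes, labels)]
    z = sum(raw)
    return [w / z for w in raw]

votes  = [2.0, 0.1, -1.0]  # confidently correct, barely correct, misclassified
labels = [+1,  +1,   +1]
ws = adaboost_weights(votes, labels)
# The misclassified example (margin -1.0) gets the most weight, and weight
# decreases monotonically as the margin grows.
```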
 
[[Support vector machine]]s provably maximize the margin of the separating hyperplane.  Support vector machines trained on noisy data (for which no perfect separation exists in the given space) maximize the soft margin instead.  More discussion of this can be found in the [[support vector machine]] article.
 
The [[voted-perceptron]] algorithm is a margin maximizing algorithm based on an iterative application of the classic [[perceptron]] algorithm.
 
==Generalization error bounds==
One theoretical motivation behind margin classifiers is that their [[generalization error]] may be bounded in terms of parameters of the algorithm and a margin term.  An example of such a bound is the one for the AdaBoost algorithm.<ref name="Statistics 1686"/> Let <math>S</math> be a set of <math>m</math> examples sampled independently at random from a distribution <math>D</math>. Assume the VC dimension of the underlying base classifier is <math>d</math> and <math>m \geq d \geq 1</math>.  Then, with probability <math>1-\delta</math>, we have the bound
 
: <math>P_D\left( \frac{y \sum_{j=1}^t \alpha_j h_j (x)}{\sum_{j=1}^t |\alpha_j|} \leq 0\right) \leq P_S\left(\frac{y \sum_{j=1}^t \alpha_j h_j (x)}{\sum_{j=1}^t |\alpha_j|} \leq \theta\right) + O\left(\frac{1}{\sqrt{m}} \sqrt{d\log^2(m/d)/ \theta^2 + \log(1/\delta)}\right)</math>
 
for all <math>\theta > 0</math>.
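To see how the complexity term of the bound behaves, it can be evaluated numerically with the constant hidden by the <math>O(\cdot)</math> notation suppressed. The parameter values below are illustrative only:

```python
import math

# The big-O complexity term of the bound above, with the hidden constant
# set to 1: (1/sqrt(m)) * sqrt(d * log(m/d)^2 / theta^2 + log(1/delta)).
def complexity_term(m, d, theta, delta):
    return math.sqrt(d * math.log(m / d) ** 2 / theta ** 2
                     + math.log(1 / delta)) / math.sqrt(m)

# More samples shrink the term; demanding a larger margin theta on the
# empirical term lets the bound tighten, while a smaller theta inflates it.
a = complexity_term(m=10_000,  d=10, theta=0.1,  delta=0.05)
b = complexity_term(m=100_000, d=10, theta=0.1,  delta=0.05)
c = complexity_term(m=10_000,  d=10, theta=0.01, delta=0.05)
# b < a < c: growing m decreases the term, shrinking theta increases it.
```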
 
==References==
{{reflist}}
 
[[Category:Classification algorithms]]
[[Category:Statistical classification]]

Latest revision as of 13:58, 5 May 2014

