{{Primary sources|date=January 2012}} <!-- relies entirely on Collani and Wurzburg affiliates -->
'''Bernoulli stochastics''' is a new branch of science dealing with human uncertainty about future developments.<ref>Elart von Collani, State of the Art and Future of Stochastics, in: Ivor Grattan-Guinness and B.S. Yadav (eds.): ''History of Mathematical Sciences'', Hindustan Book Agency, New Delhi, pp. 171–190, 2004.</ref> It aims at developing quantitative models of the transition from past to future in order to make reliable and accurate predictions. Bernoulli stochastics should not be confused with [[stochastics]], a branch of mathematics covering [[probability theory]], the theory of [[stochastic processes]] and [[mathematical statistics]].

Bernoulli stochastics is based on Jakob Bernoulli's [[quantification of randomness]] and has mainly been developed by Elart von Collani<ref>Elart von Collani (ed.), Defining the Science of Stochastics, Heldermann, Lemgo, 2004.</ref><ref>Elart von Collani and Xiaomin Zhai, ''Stochastics'', Beijing Publisher Group, Beijing, 2005 (in Chinese).</ref> during the last two decades. Since uncertainty about the future constitutes one of the main problems of mankind, Bernoulli stochastics occupies an exceptional position: it provides the means to define and measure uncertainty and thus makes it possible to handle risks adequately and to prevent catastrophic developments.

{{TOC limit|3}}
==Scope==

Uncertainty about the future constitutes the main problem not only for individuals but also for societies and for science. Therefore, Bernoulli stochastics, which develops models of uncertainty, can be considered a universal approach to problem solving, since it provides rules for dealing with uncertainty and the indeterminate future.

The quantitative models of uncertainty developed according to the rules of Bernoulli stochastics cover the two sources of human uncertainty about future developments: human [[ignorance]] about the past, i.e., the initial conditions, and [[randomness]], which affects the future. Ignorance represents the internal source of human uncertainty, while randomness is the external source. Ignorance is characteristic of man, while randomness is characteristic of the universe.

{{Quote box
| quote = “For decisions on the most important projects, like the safety of space travel or the prospects for global warming, it is still the custom to rely on the opinion of experts to evaluate risks. They are known to make mistakes, but no method has been found to replace them.”
| source = [[James Franklin]], ''The Science of Conjecture'', The Johns Hopkins University Press, Baltimore (2000), p. 369.
| width = 50%
| align = right
}}

There are two types of methods in Bernoulli stochastics. The first type allows a glance into the future, while the second type allows a look into the past. The first type is called [[stochastic prediction procedure]], the second type [[stochastic measurement procedure]]. These stochastic procedures, which are based on a model that objectively reflects reality, aim at replacing belief and opinion, which are still the prevailing means of coping with the problems generated by uncertainty and risk.

For understanding and applying Bernoulli stochastics the prevailing [[causal thinking]] must be abandoned in favor of [[stochastic thinking]].<ref>Elart von Collani, [http://pubs.amstat.org/doi/pdfplus/10.1198/tast.2010.09190 "Response to ‘Desired and Feared—What Do We Do Now and Over the Next 50 Years’ by Xiao-Li Meng"], ''The American Statistician'', 2010, 64(1): 23–25.</ref> In fact, adopting stochastic thinking constitutes the major difficulty in understanding and applying Bernoulli stochastics.
==History==

The development of Bernoulli stochastics started more than 300 years ago with the theologian and mathematician [[Jakob Bernoulli]] (1655–1705)<ref>Usually the year of Jakob Bernoulli's birth is given as 1654; however, this is not correct if the nowadays valid Gregorian calendar is applied.</ref> from Basel in Switzerland. Jakob Bernoulli succeeded in quantifying the randomness of future events.<ref>Elart von Collani, [http://isi.cbs.nl/bnews/06b/index.html "Jacob Bernoulli Deciphered"], Bernoulli News, 2006, Vol. 13/2.</ref> Due to randomness a future event may or may not occur. Randomness can be observed by repeating the same experiment several times: some events will then occur frequently and others more rarely. Bernoulli explained the randomness of a future event by "the degree of certainty of the occurrence of the event" and called this degree the "probability of the event". He planned to develop a science based on the concept of [[probability]] and named this science in Latin "Ars conjectandi" or in Greek "stochastike", i.e. the "science of prediction". Unfortunately, he died too early, and his masterpiece [[Ars conjectandi]] was only published posthumously in 1713.<ref>Jacob Bernoulli, The Art of Conjecturing, translated by Edith Dudley Sylla, 2006, Johns Hopkins University Press, Baltimore.</ref> His proposal was not taken up by science; instead, "[[probability theory]]" as a branch of mathematics and "[[statistics]]" as a branch of empirical science were developed.<ref>Elart von Collani, The forgotten science of prediction, in: V. Nithyanantha Bhat, T. Thrivikraman, V. Madhikar Mallayya, and S. Madhavan (eds.), ''History and Heritage of Mathematical Sciences'', Sukrtindra Oriental Research Institute, Kerala, pp. 54–70, 2009.</ref>

Bernoulli stochastics was introduced in 2000 during the BS Symposium<ref>During the World Mathematical Year 2000 a number of international conferences and workshops were organized under the aegis of the [[Bernoulli Society]], [http://isi.cbs.nl/bnews/00b/bn_4.html].</ref> on "Defining the Science of Stochastics, in Memoriam Jakob Bernoulli". The revised and updated versions of the lectures delivered at the symposium were published in 2004. Since then Bernoulli stochastics has been further developed and its methods have been successfully applied in various areas of science and technology, for example in metrology,<ref>[http://ib.ptb.de/8/84/MATHMET2010/VORTRAEGE/MathMet2010_Collani.pdf]</ref> quality control,<ref>Elart von Collani and Karl Baur, Was zum Teufel ist Qualität?, Heldermann Verlag, Lemgo, 2007.</ref> wind energy<ref>Elart von Collani, A. Binder, W. Sans, A. Heitmann, K. Al-Ghazali, Design Load Definition by LEXPOL, Wind Energy 11, 637–653, 2008.</ref> and nuclear technology.<ref>Elart von Collani and Karl Baur, Brennstabauslegung und Brennstabmodellierung – Teil 1 (Fuel rod design and modeling of fuel rods – Part I), ''Kerntechnik'', Vol. 64, 253–260, 2004.</ref><ref>Elart von Collani and Karl Baur, Brennstabauslegung und Brennstabmodellierung – Teil 2 (Fuel rod design and modeling of fuel rods – Part II), ''Kerntechnik'', Vol. 70, 158–167, 2005.</ref>

In 2002 the company Stochastikon GmbH was founded and started to develop Bernoulli stochastics further. In 2008 the first PhD thesis<ref>Andreas Binder, ''Die stochastische Wissenschaft und zwei Teilsysteme eines Web-basierten Informations- und Anwendungssystems zu ihrer Etablierung'', Ph.D. Thesis, Faculty of Mathematics and Computer Science, University of Würzburg, 2006, [http://www.opus-bayern.de/uni-wuerzburg/volltexte/2008/2614/].</ref> by Andreas Binder was published, dealing with Bernoulli stochastics and two subsystems of a web-based information and application system for its establishment. In April 2011 the second PhD thesis<ref>Xiaomin Zhai, ''Design, Development and Evaluation of a Virtual Classroom and Teaching Contents for Bernoulli Stochastics'', Ph.D. Thesis, Faculty of Mathematics and Computer Science, University of Würzburg, 2011, [http://www.opus-bayern.de/uni-wuerzburg/volltexte/2011/5610/].</ref> on Bernoulli stochastics, by Xiaomin Zhai, was completed, dealing with the design, development and evaluation of a virtual classroom and teaching contents for Bernoulli stochastics.
==Overview==

{{Quote box
| quote = “... it is only the manipulation of uncertainty that interests us. We are not concerned with the matter that is uncertain. Thus we do not study the mechanism of rain; only whether it will rain.”
| source = [[Dennis Lindley]], "The Philosophy of Statistics", ''The Statistician'' (2000).
| width = 50%
| align = right
}}
The models of Bernoulli stochastics describe the change from the past to the future, including the entire uncertainty, as well as the available information permits. When developing a model of uncertainty one should always keep in mind that, as expressed by Dennis Lindley, it is not the mechanism of the considered process that is of primary interest, but the future events. Developing such a model proceeds in two steps:
# The first step consists of identifying the aspect of the future development which is of interest. Since the future development is subject to randomness, it is quantified by a variable ''X'', which is called a random variable. The future value of a random variable is indeterminate and generally varies when an experiment is repeated.
# In a second step the relevant aspects of the past must be identified and represented by a variable ''D''. Since the past is determinate, the variable ''D'' is called a deterministic variable.

The model refers to the stochastic relation between the deterministic variable ''D'' and the random variable ''X''. In order to describe the uncertainty of the future development realistically, it must necessarily cover both sources of uncertainty, i.e., ignorance about the past and randomness of the future.
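The two steps can be illustrated by a small, purely hypothetical example (not taken from the cited sources): consider ten flips of a coin whose probability of heads is not exactly known. The aspect of the future that is of interest is the number of heads, so ''X'' is this count; the relevant aspect of the past is the fixed but not exactly known probability of heads, so ''D'' is this probability. A minimal sketch in Python, assuming this coin-flip setting:

<syntaxhighlight lang="python">
import random

# Hypothetical illustration: X = number of heads in n flips (random variable),
# D = probability of heads (deterministic variable: fixed, but not exactly known).
n = 10          # number of flips in one run of the experiment
d_true = 0.55   # true but unknown value of the deterministic variable D

def observe_x(d: float, n: int) -> int:
    """Run the experiment once and return a realization of the random variable X."""
    return sum(random.random() < d for _ in range(n))

# Repeating the experiment shows how X varies although D stays the same.
print([observe_x(d_true, n) for _ in range(5)])
</syntaxhighlight>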
==Stochastic model==

The stochastic model of uncertainty specifies what is known about the past, i.e., what is known about the value of the deterministic variable ''D'', and what can occur in the future with respect to the random variable ''X''. A stochastic model describes quantitatively the relation between past and future by taking into account the entire uncertainty generated by ignorance and randomness. The model is called [[Bernoulli space]] and is denoted by <math>\mathcal{B}_{X,D}</math>. It consists of three components.
* The [[ignorance space]], denoted <math>\mathcal{D}</math>, is a bounded set that contains all those values of the deterministic variable ''D'' which, according to the available knowledge, cannot be excluded. The ignorance space thus describes quantitatively the existing ignorance about the initial conditions. Each subset of the ignorance space represents a certain level of knowledge or, equivalently, of ignorance, where the singletons represent complete knowledge.
* The [[variability function]], denoted <math>\mathcal{X}</math>, assigns to each level of knowledge (= subset of the ignorance space) a corresponding [[range of variability]] of the random variable ''X''. In other words, each image of the variability function consists of those values of ''X'' which might occur in the future, under the condition that the true but unknown value of ''D'' is an element of the considered subset of the ignorance space.
* The [[random structure function]], denoted <math>\mathcal{P}</math>, assigns to each level of knowledge (= subset of the ignorance space) a corresponding probability distribution over the corresponding image of the variability function.

A Bernoulli space <math>\mathcal{B}_{X,D} = (\mathcal{D}, \mathcal{X}, \mathcal{P})</math> refers to the pair of variables <math>(X,D)</math>, where ''X'' represents the future and ''D'' the past. The ignorance space <math>\mathcal{D}</math> specifies the available knowledge about the past, the variability function <math>\mathcal{X}</math> gives the amount of variability in the future as a function of the available knowledge, and the random structure function <math>\mathcal{P}</math> specifies the probabilities of future events, again as a function of the available knowledge about the initial conditions.
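The three components can be written down explicitly for the hypothetical coin-flip setting used above (the names and the discretization of the ignorance space are illustrative choices, not part of the cited definitions): the ignorance space contains the candidate values of ''D'' that cannot be excluded, the variability function returns the possible numbers of heads, and the random structure function returns a binomial probability for each candidate value.

<syntaxhighlight lang="python">
from dataclasses import dataclass
from math import comb
from typing import Callable, List

@dataclass
class BernoulliSpace:
    """Minimal sketch of a Bernoulli space for a discrete random variable X."""
    ignorance_space: List[float]                      # values of D that cannot be excluded
    variability: Callable[[List[float]], List[int]]   # subset of ignorance space -> possible values of X
    random_structure: Callable[[float, int], float]   # (d, x) -> P(X = x) for the singleton level of knowledge {d}

n = 10  # hypothetical experiment: number of coin flips

def variability(candidates: List[float]) -> List[int]:
    # For any nonempty set of candidate values of D, every count 0..n may occur.
    return list(range(n + 1)) if candidates else []

def binomial_pmf(d: float, x: int) -> float:
    return comb(n, x) * d ** x * (1.0 - d) ** (n - x)

# Ignorance space: it is only known that the probability of heads lies between 0.4 and 0.7
# (discretized to a grid for simplicity).
coin_space = BernoulliSpace(
    ignorance_space=[round(0.4 + 0.01 * i, 2) for i in range(31)],
    variability=variability,
    random_structure=binomial_pmf,
)

print(coin_space.variability(coin_space.ignorance_space))  # possible future values of X
print(coin_space.random_structure(0.55, 6))                # P(X = 6) if D were 0.55
</syntaxhighlight>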
===Learning theory===

Knowledge or, equivalently, ignorance refers to facts, i.e., to the past, since the future does not yet exist, and which of the many possible future developments will actually occur is subject to randomness and therefore in principle impossible to know.

Learning means increasing knowledge or, equivalently, reducing ignorance about facts. This implies that modelling a learning process is possible only if ignorance and randomness are explicitly incorporated into the model. In the case of a Bernoulli space, ignorance is modelled by the ignorance space <math>\mathcal{D}</math> and randomness by the variability function <math>\mathcal{X}</math> and the random structure function <math>\mathcal{P}</math>. It follows that the stochastic model given by the Bernoulli space may be used as a basis for developing a theory of learning.
===Natural laws===

[[Physical law|Natural laws in physics]] are quantitative models of the transition from the past to the future and therefore competitors of the Bernoulli space. Thus, it is of interest to compare the two approaches.

A natural law is a function which maps the initial conditions, represented by the variable ''D'', onto the future outcome of the variable ''X''. For any natural law it is assumed that the initial conditions are known exactly, i.e., the ignorance space is assumed to be a singleton <math>\mathcal{D} = \{d\}</math>. Moreover, most natural laws assume that the future is a mere transformation of the past, i.e., that for given initial conditions ''d'' there is exactly one possible future outcome, implying that the range of variability of ''X'' is given by a singleton <math>\{x(d)\}</math>. The image of the random structure function, which in general is a probability distribution, then degenerates to a one-point distribution. Thus, natural laws prove to be degenerate limiting cases of the Bernoulli space.
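This degeneration can be sketched with a deliberately trivial example (the law of free fall and the numbers are merely illustrative): for exactly known initial conditions a deterministic law yields a singleton ignorance space, a singleton range of variability and a one-point distribution.

<syntaxhighlight lang="python">
# Illustrative only: the deterministic law s = 0.5 * g * t**2 cast into the three
# components of a Bernoulli space; every component collapses to a singleton.
G = 9.81   # constant assumed to be known exactly
d = 2.0    # initial condition (fall time in seconds), assumed to be known exactly

ignorance_space = {d}                          # singleton: no ignorance at all

def variability(subset):                       # singleton range of variability
    return {0.5 * G * t ** 2 for t in subset}

def random_structure(subset, x):               # one-point (degenerate) distribution
    return 1.0 if x in variability(subset) else 0.0

print(variability(ignorance_space))                         # {19.62}: the only possible outcome
print(random_structure(ignorance_space, 0.5 * G * d ** 2))  # 1.0
</syntaxhighlight>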
None of the natural laws invented in physics incorporates the always existing human ignorance. On the contrary, natural laws assume complete knowledge and therefore do not admit improvement by learning. Consequently, the approach that results in natural laws turns out to be one of the most serious obstacles to any learning process.
==Procedures of Bernoulli stochastics==

There are two main types of procedures in Bernoulli stochastics, corresponding to the two main types of problems mankind is confronted with. The first type consists of prediction procedures, the second type of measurement procedures. Prediction procedures allow a look into the future, while measurement procedures allow a look into the past. The future is characterized by indeterminate events, while the past is characterized by determinate facts.

===Stochastic prediction procedure===

A stochastic prediction procedure aims at reducing the uncertainty about the future development of interest, represented by the random variable ''X''. A Bernoulli space is developed in order to enable reliable and accurate predictions about the future development, i.e., about the indeterminate outcome of the random variable ''X''. A stochastic prediction procedure is a function, denoted <math>A_X</math>, which assigns to each level of knowledge (= subset of the ignorance space) a prediction, i.e., a subset of the corresponding range of variability of ''X''. The quality of a prediction is determined by its reliability and its accuracy: the reliability of a prediction is defined by the probability of its occurrence, and the accuracy of a prediction is defined by its size. A stochastic prediction procedure <math>A_X</math> is derived in such a way that it meets the following two requirements:
* Reliability requirement: A stochastic prediction procedure yields predictions that will occur with a probability of at least <math>\beta</math>, where the lower bound <math>\beta</math> is called the [[reliability level]] of the prediction procedure.
* Accuracy requirement: The size of the predictions obtained by a stochastic prediction procedure is minimal.

The first condition guarantees a sufficiently large reliability of the predictions, while the second condition ensures that the accuracy of the obtained predictions is optimal. A prediction procedure meeting the reliability requirement given by the reliability level <math>\beta</math> is called a <math>\beta</math>-prediction procedure, denoted <math>A_X^{(\beta)}</math>.
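A minimal numerical sketch of such a procedure, again assuming the hypothetical coin-flip Bernoulli space (binomial counts) and complete knowledge of ''D'': the most probable values of ''X'' are collected until their total probability reaches the reliability level <math>\beta</math>, which yields a smallest prediction set that occurs with probability at least <math>\beta</math>.

<syntaxhighlight lang="python">
from math import comb

n = 10  # hypothetical experiment: number of coin flips

def binomial_pmf(d: float, x: int) -> float:
    return comb(n, x) * d ** x * (1.0 - d) ** (n - x)

def prediction(d: float, beta: float) -> set:
    """Smallest set of values of X whose total probability under D = d is at least beta."""
    ranked = sorted(range(n + 1), key=lambda x: binomial_pmf(d, x), reverse=True)
    chosen, total = set(), 0.0
    for x in ranked:
        chosen.add(x)
        total += binomial_pmf(d, x)
        if total >= beta:
            break
    return chosen

# A 0.95-prediction of the number of heads, assuming the value D = 0.55 were known exactly.
print(sorted(prediction(0.55, 0.95)))
</syntaxhighlight>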
===Stochastic measurement procedure===

A stochastic measurement procedure aims at reducing the ignorance about the true but unknown value of the deterministic variable ''D''. Reducing ignorance is equivalent to learning, and learning is possible only by a learning process, which is here called a measurement process. The measurement process has an indeterminate outcome, which is represented by a random variable ''X'', and for deriving a suitable stochastic measurement procedure the uncertainty related to the measurement process must be described by a Bernoulli space <math>\mathcal{B}_{X,D}</math>. The Bernoulli space makes it possible to derive, for any possible value ''d'' in the ignorance space <math>\mathcal{D}</math> and any reliability level <math>\beta</math>, a prediction <math>A^{(\beta)}_X(\{d\})</math>.

A stochastic measurement procedure assigns to each outcome of the measurement process, i.e., a subset of the range of variability of ''X'', a measurement result, i.e., a subset of the ignorance space. Thus, a measurement procedure is a function, denoted by <math>C_D</math>. A stochastic measurement procedure meets the following three requirements:

* Reliability requirement: The probability of obtaining a correct result when applying a stochastic measurement procedure is not smaller than a prescribed reliability level <math>\beta</math>, where a result is called correct if it contains the true value of the deterministic variable. If a measurement procedure meets this condition it is called a <math>\beta</math>-measurement procedure, denoted <math>C^{(\beta)}_D</math>.
* Completeness requirement: Each possible, i.e., observable, outcome of the measurement process yields a meaningful measurement result, i.e., a nonempty subset of the ignorance space.
* Accuracy requirement: The measurement results of a stochastic <math>\beta</math>-measurement procedure are on average most accurate, i.e., have on average minimum size.

Any stochastic <math>\beta</math>-measurement procedure <math>C^{(\beta)}_D</math> is based on a suitable stochastic <math>\beta</math>-prediction procedure <math>A_X^{(\beta)}</math> by means of the following relation:

:::<math>C^{(\beta)}_D(\{x\}) = \{d \mid x \in A_X^{(\beta)}(\{d\})\}</math>

where <math>\{x\}</math> is the observed outcome of the measurement process. The above relation means that the measurement result <math>C^{(\beta)}_D(\{x\})</math> contains every value ''d'' of the deterministic variable ''D'' for which the observation <math>\{x\}</math> had been predicted.

The reliability requirement of the measurement procedure is met through the reliability level <math>\beta</math> of the involved prediction procedure. The completeness and accuracy requirements are met by rather complicated mathematical optimization procedures. Because of these two requirements, the prediction procedures used for constructing measurement procedures differ from those derived for prediction alone.
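Under the same hypothetical coin-flip assumptions, the defining relation can be sketched directly: a discretized ignorance space is scanned, and every candidate value of ''D'' whose <math>\beta</math>-prediction contains the observed outcome is kept. The sketch ignores the completeness and accuracy optimizations mentioned above and only illustrates the inversion of the prediction procedure.

<syntaxhighlight lang="python">
from math import comb

n = 10  # hypothetical experiment: number of coin flips

def binomial_pmf(d: float, x: int) -> float:
    return comb(n, x) * d ** x * (1.0 - d) ** (n - x)

def prediction(d: float, beta: float) -> set:
    """Smallest set of values of X with total probability at least beta, given D = d."""
    ranked = sorted(range(n + 1), key=lambda x: binomial_pmf(d, x), reverse=True)
    chosen, total = set(), 0.0
    for x in ranked:
        chosen.add(x)
        total += binomial_pmf(d, x)
        if total >= beta:
            break
    return chosen

def measurement(x_observed: int, ignorance_space, beta: float) -> list:
    """Measurement result: all candidate values d whose beta-prediction contains x_observed."""
    return [d for d in ignorance_space if x_observed in prediction(d, beta)]

# Discretized ignorance space; suppose 7 heads were observed in the 10 flips.
grid = [round(0.01 * i, 2) for i in range(1, 100)]
print(measurement(7, grid, 0.95))
</syntaxhighlight>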
==Stochastic thinking==

Bernoulli stochastics explicitly admits randomness as a characteristic feature of the real world. However, as shown in the subsection "Natural laws", the stochastic model also covers deterministic relations, albeit as degenerate limiting cases. The stochastic approach emanates from the almost obvious fact that everything in the universe is connected with everything else. This universal connectivity excludes causal relations, since any change is simultaneously cause and effect.

Furthermore, universal connectivity is a property of the entire universe and not of any part of it. It follows that the whole cannot be understood by investigating parts of it, and in particular not by investigating elementary particles, i.e., the smallest parts of the universe. Bernoulli stochastics therefore represents not only a stochastic but also a holistic approach, in contrast to physics, which is based on determinism and reductionism.

As already mentioned, applying Bernoulli stochastics requires abandoning causal thinking in favor of stochastic thinking. The difficulty is that almost everybody seems to understand causal thinking, but only very few can explain stochastic thinking. The main differences are therefore listed below.

* Causal thinking means tracing the occurrence of a problem back to a culprit, i.e., to a part of the system. In contrast, according to stochastic thinking the design of the system yields a positive probability for the problem.
* The solution of a problem based on causal thinking consists of eliminating the culprit while maintaining the system. In contrast, according to stochastic thinking the situation can only be improved by changing the design of the system so as to reduce the probability of the problem.
* Causal thinking means explaining developments by cause-and-effect chains which refer to isolated parts of the system. Stochastic thinking does not explain particular partial developments, but looks at the entire system and its stochastic evolution rules.

Similarly to the subsection "Learning theory", the above list illustrates that the stochastic approach represents a learning approach, while the causal approach appears as an insurmountable obstacle to learning.
== References ==
<!--- See http://en.wikipedia.org/wiki/Wikipedia:Footnotes on how to create references using <ref></ref> tags which will then appear here automatically -->
{{Reflist}}

== External links ==
* Stochastikon Encyclopedia, [http://www.encyclopedia.stochastikon.com]
* E-Learning Programme Stochastikon Magister, [http://www.magister.stochastikon.com]
* Homepage of Stochastikon GmbH, [http://www.stochastikon.com/]
* Economic Quality Control, [http://www.heldermann-verlag.de/eqc/eqc23/eqc23003.pdf]
* Journal of Uncertain Systems, [http://www.worldacademicunion.com/journal/jus/jusVol02No3paper05.pdf]

<!--- Categories --->
[[Category:Information, knowledge, and uncertainty]]
[[Category:Randomness]]
[[Category:Probability interpretations]]