{{Distinguish|Valuation risk}}
:'''''VaR''' redirects here. For the statistical technique '''VAR''', see [[Vector autoregression]]. For the statistic denoted '''Var''' or '''var''', see [[Variance]].''
 
[[Image:VaR diagram.JPG|thumb|300px|The 5% Value at Risk of a hypothetical profit-and-loss probability density function]]
In [[financial mathematics]] and [[financial risk management]], '''value at risk''' ('''VaR''') is a widely used [[risk measure]] of the [[market risk|risk of loss]] on a specific [[Portfolio (finance)|portfolio]] of financial assets. For a given portfolio, [[probability]] and time horizon, VaR is defined as a threshold value such that the probability that the [[Mark to market accounting|mark-to-market]] loss on the portfolio over the given time horizon exceeds this value (assuming normal markets and no trading in the portfolio) is the given probability level.<ref name="Jorion">{{cite book|last=Jorion|first=Philippe|title=Value at Risk: The New Benchmark for Managing Financial Risk|edition=3rd|publisher=McGraw-Hill|year=2006|isbn=978-0-07-146495-6}}</ref> {{clarify|date=September 2012}}
 
For example, if a portfolio of stocks has a one-day 5% VaR of $1 million, there is a 0.05 probability that the portfolio will fall in value by more than $1 million over a one-day period if there is no trading. Informally, a loss of $1 million or more on this portfolio is expected on 1 day out of 20 (because of the 5% probability). A loss which exceeds the VaR threshold is termed a “VaR break.”<ref name="Holton">{{cite book|first=Glyn|last=Holton|title=Value-at-Risk: Theory and Practice|publisher=Academic Press|year=2003|isbn=978-0-12-354010-2}}</ref>
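This arithmetic can be checked numerically. A minimal sketch, assuming a hypothetical P&L series simulated from a normal distribution purely for illustration (the dollar figures are made up, not taken from any real portfolio):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily P&L (in dollars) for 1,000 trading days,
# simulated from a normal distribution purely for illustration.
pnl = rng.normal(loc=0.0, scale=600_000, size=1_000)

# Losses are the negative of P&L; VaR is conventionally reported positive.
losses = -pnl

# One-day 5% VaR: the loss threshold exceeded with 5% probability,
# i.e. the 95th percentile of the loss distribution.
var_5pct = np.quantile(losses, 0.95)

# A "VaR break" is a day whose loss exceeds the threshold; by construction
# roughly 1 day in 20 breaks.
breaks = int(np.sum(losses > var_5pct))
print(round(var_5pct), breaks / len(losses))
```

On this simulated series the break frequency comes out at roughly 5%, matching the "1 day out of 20" reading.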
 
VaR has four main uses in [[finance]]: [[risk management]], financial [[Comptroller|control]], [[Financial statements|financial reporting]] and computing [[capital requirement|regulatory capital]]. VaR is sometimes used in non-financial applications as well.<ref name="McNeil">{{cite book|first1=Alexander|last1=McNeil|first2=Rüdiger|last2=Frey|first3=Paul|last3=Embrechts|title=Quantitative Risk Management: Concepts Techniques and Tools|publisher=Princeton University Press|year=2005|isbn=978-0-691-12255-7}}</ref>
 
Important related ideas are [[economic capital]], [[backtesting]], [[stress testing]], [[expected shortfall]], and [[tail conditional expectation]].<ref name="Dowd">{{cite book|last=Dowd|first=Kevin|title=Measuring Market Risk|year=2005|publisher=John Wiley & Sons|isbn=978-0-470-01303-8}}</ref>
 
== Details ==
 
Common parameters for VaR are 1% and 5% probabilities and one day and two week horizons, although other combinations are in use.<ref name="Pearson">{{cite book|first=Neil|last=Pearson|title=Risk Budgeting: Portfolio Problem Solving with Value-at-Risk|publisher=John Wiley & Sons|year=2002|isbn=978-0-471-40556-6}}</ref>
 
The reason for assuming normal markets and no trading, and for restricting loss to things measured in [[financial statements|daily accounts]], is to make the loss [[Observability|observable]]. In some extreme financial events it can be impossible to determine losses, either because market prices are unavailable or because the loss-bearing institution breaks up. Some longer-term consequences of disasters, such as lawsuits, loss of market confidence and employee morale, and impairment of brand names, can take a long time to play out, and may be hard to allocate among specific prior decisions. VaR marks the boundary between normal days and extreme events. Institutions can lose far more than the VaR amount; all that can be said is that they will not do so very often.<ref name="Unbearable" />
 
The probability level is specified about equally often as one minus the probability of a VaR break, so that the VaR in the example above would be called a one-day 95% VaR instead of a one-day 5% VaR. This generally does not lead to confusion because the probability of VaR breaks is almost always small, certainly less than 0.5.<ref name="Jorion" />
 
Although it virtually always represents a loss, VaR is conventionally reported as a positive number. A negative VaR would imply the portfolio has a high probability of making a profit, for example a one-day 5% VaR of negative $1 million implies the portfolio has a 95% chance of making more than $1 million over the next day.<ref name="Crouhy">{{cite book|first1=Michel|last1=Crouhy|first2=Dan|last2=Galai|first3=Robert|last3=Mark|title=The Essentials of Risk Management|publisher=McGraw-Hill|year=2001|isbn=978-0-07-142966-5}}</ref>
 
Another inconsistency is that VaR is sometimes taken to refer to profit-and-loss at the end of the period, and sometimes as the maximum loss at any point during the period. The original definition was the latter, but in the early 1990s when VaR was aggregated across trading desks and time zones, end-of-day valuation was the only reliable number, so the former became the ''de facto'' definition. As people began using multiday VaRs in the second half of the 1990s, they almost always estimated the distribution at the end of the period only. It is also theoretically easier to deal with a point-in-time estimate than with a maximum over an interval. Therefore, the end-of-period definition is the most common in both theory and practice today.<ref name="Lopez">{{cite journal|author=Jose A. Lopez|title=Regulatory Evaluation of Value-at-Risk Models|publisher=Wharton Financial Institutions Center Working Paper 96-51|date=September 1996}}</ref>
 
== Varieties of VaR ==
 
The definition of VaR is [[Constructive proof|nonconstructive]]; it specifies a [[Property (philosophy)|property]] VaR must have, but not how to compute VaR. Moreover, there is wide scope for interpretation in the definition.<ref name="Roundtable I">{{cite conference|first1=Joe|last1=Kolman|first2=Michael|last2=Onak|first3=Philippe|last3=Jorion|first4=Nassim|last4=Taleb|first5=Emanuel|last5=Derman|first6=Blu|last6=Putnam|first7=Richard|last7=Sandor|first8=Stan|last8=Jonas|first9=Ron|last9=Dembo|first10=George|last10=Holt|first11=Richard|last11=Tanenbaum|first12=William|last12=Margrabe|first13=Dan|last13=Mudge|first14=James|last14=Lam|first15=Jim|last15=Rozsypal|title=Roundtable: The Limits of VaR|publisher=Derivatives Strategy|date=April 1998}}</ref> This has led to two broad types of VaR, one used primarily in [[risk management]] and the other primarily for risk measurement. The distinction is not sharp, however, and hybrid versions are typically used in financial [[Comptroller|control]], [[Financial statements|financial reporting]] and computing [[capital requirement|regulatory capital]].<ref name="Brown">{{Citation|author=[[Aaron Brown (financial author)|Aaron Brown]]|title=The Next Ten VaR Disasters|publisher=Derivatives Strategy|date=March 1997}}</ref>
 
To a risk manager, VaR is a system, not a number. The system is run periodically (usually daily) and the published number is compared to the computed price movement in opening positions over the time horizon. There is never any subsequent adjustment to the published VaR, and there is no distinction between VaR breaks caused by input errors (including [[Information Technology]] breakdowns, [[fraud]] and [[rogue trader|rogue trading]]), computation errors (including failure to produce a VaR on time) and market movements.<ref name="Wilmott">{{cite book|first1=Paul|last1=Wilmott|author-link1=Paul Wilmott|title=Paul Wilmott Introduces Quantitative Finance|publisher=Wiley|year=2007|isbn=978-0-470-31958-1}}</ref>
 
A [[Frequency probability|frequentist]] claim is made that the long-term frequency of VaR breaks will equal the specified probability, within the limits of sampling error, and that the VaR breaks will be [[Statistical independence|independent]] in time and independent of the level of VaR. This claim is validated by a [[backtesting|backtest]], a comparison of published VaRs to actual price movements. In this interpretation, many different systems could produce VaRs with equally good [[backtesting|backtests]], but wide disagreements on daily VaR values.<ref name="Jorion" />
 
For risk measurement a number is needed, not a system. A [[Bayesian probability]] claim is made that, given the information and beliefs at the time, the [[Bayesian probability|subjective probability]] of a VaR break was the specified level. VaR is adjusted after the fact to correct errors in inputs and computation, but not to incorporate information unavailable at the time of computation.<ref name="Crouhy" /> In this context, “[[backtesting|backtest]]” has a different meaning. Rather than comparing published VaRs to actual market movements over the period of time the system has been in operation, VaR is retroactively computed on scrubbed data over as long a period as data are available and deemed relevant. The same position data and pricing models are used for computing the VaR as for determining the price movements.<ref name="Holton" />
 
Although some of the sources listed here treat only one kind of VaR as legitimate, most of the recent ones seem to agree that risk management VaR is superior for making short-term and tactical decisions today, while risk measurement VaR should be used for understanding the past, and making medium term and strategic decisions for the future. When VaR is used for [[Comptroller|financial control]] or [[Financial statements|financial reporting]] it should incorporate elements of both. For example, if a [[trader (finance)|trading desk]] is held to a VaR limit, that is both a risk-management rule for deciding what risks to allow today, and an input into the risk measurement computation of the [[trader (finance)|desk’s]] risk-adjusted [[Return (finance)|return]] at the end of the reporting period.<ref name="Dowd" />
 
=== VaR in governance ===
VaR can also be applied to the governance of endowments, trusts, and pension plans. Essentially, trustees adopt portfolio Value-at-Risk metrics for the entire pooled account and for the diversified parts managed individually. Instead of probability estimates, they simply define maximum acceptable levels of loss for each. Doing so provides an easy metric for oversight and adds accountability, since managers are directed to manage within the portfolio mandate but with the additional constraint of avoiding losses beyond a defined risk parameter. Used in this way, VaR adds relevance, as well as a means of monitoring risk that is far more intuitive than the standard deviation of return. Use of VaR in this context, as well as a worthwhile critique of board governance practices as they relate to investment management oversight in general, can be found in ''Best Practices in Governance.''<ref>{{Citation|title=Best Practices in Governance|author=Lawrence York|year=2009}}</ref>
 
== Mathematical definition ==
 
Given a confidence level <math>\alpha \in (0,1)</math>, the VaR of the portfolio at the confidence level <math>\alpha</math> is given by the smallest number <math>l</math> such that the probability that the loss <math>L</math> exceeds <math>l</math> is at most <math>(1-\alpha)</math>.<ref name="McNeil" />  Mathematically, if <math>L</math> is the loss of a portfolio, then <math>\operatorname{VaR}_{\alpha}(L)</math> is the level <math>\alpha</math>-[[quantile]], i.e.
 
:<math>\operatorname{VaR}_\alpha(L)=\inf\{l \in \mathbb{R}:P(L>l)\le 1-\alpha\}=\inf\{l\in \mathbb{R}:F_L(l) \ge \alpha\}.</math><ref>{{cite journal|last=Artzner|first=Philippe|last2=Delbaen|first2=Freddy|last3=Eber|first3=Jean-Marc|last4=Heath|first4=David|year=1999|title=Coherent Measures of Risk|journal=Mathematical Finance|volume=9|issue=3|pages=203–228|url=http://www.math.ethz.ch/~delbaen/ftp/preprints/CoherentMF.pdf|format=pdf|accessdate=February 3, 2011}}</ref>
 
The left equality is a definition of VaR. The right equality assumes an underlying probability distribution, which makes it true only for parametric VaR. Risk managers typically assume that some fraction of the bad events will have undefined losses, either because markets are closed or illiquid, or because the entity bearing the loss breaks apart or loses the ability to compute accounts. Therefore, they do not accept results based on the assumption of a well-defined probability distribution.<ref name="Unbearable">{{Citation|author=[[Aaron Brown (financial author)|Aaron Brown]]|title=The Unbearable Lightness of Cross-Market Risk|publisher=Wilmott Magazine|date=March 2004}}</ref> [[Nassim Taleb]] has labeled this assumption "charlatanism".<ref name="Taleb II">{{Citation|author=Nassim Taleb|title=The World According to Nassim Taleb|publisher=Derivatives Strategy|date=December 1996/January 1997|url=http://www.derivativesstrategy.com/magazine/archive/1997/1296qa.asp}}</ref> On the other hand, many academics prefer to assume a well-defined distribution, albeit usually one with [[Kurtosis|fat tails]].<ref name="Jorion" /> This point has probably caused more contention among VaR theorists than any other.<ref name="Roundtable I" />
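For a finite sample, the infimum in the definition reduces to an order statistic: the smallest observed loss whose empirical CDF reaches <math>\alpha</math>. A minimal sketch (the loss sample is hypothetical, chosen only to make the infimum easy to verify by hand):

```python
import numpy as np

def var_alpha(losses, alpha):
    """Empirical VaR at confidence level alpha: the smallest loss l
    with P(L > l) <= 1 - alpha, i.e. the alpha-quantile of L."""
    s = np.sort(np.asarray(losses, dtype=float))
    n = len(s)
    # The empirical CDF satisfies F_L(s[k]) = (k + 1) / n, so the infimum
    # is the first order statistic whose CDF value reaches alpha.
    k = int(np.ceil(alpha * n)) - 1
    return s[k]

# Hypothetical loss sample (positive numbers are losses).
sample = [-2.0, -1.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 10.0]
# At alpha = 0.95: P(L > l) = 0.1 for 5 <= l < 10, which exceeds 0.05,
# so the infimum is 10.0.
print(var_alpha(sample, 0.95))  # → 10.0
# At alpha = 0.85: P(L > 5) = 0.1 <= 0.15, so the infimum is 5.0.
print(var_alpha(sample, 0.85))  # → 5.0
```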
 
Value at risk can also be written as a [[distortion risk measure]] given by the [[distortion function]] <math>g(x) = \begin{cases}0 & \text{if }0 \leq x < 1-\alpha\\ 1 & \text{if }1-\alpha \leq x \leq 1\end{cases}.</math><ref name="Wirch">{{cite web|title=Distortion Risk Measures: Coherence and Stochastic Dominance|author=Julia L. Wirch|author2=Mary R. Hardy|url=http://pascal.iseg.utl.pt/~cemapre/ime2002/main_page/papers/JuliaWirch.pdf|format=pdf|accessdate=March 10, 2012}}</ref><ref name="PropertiesDRM">{{cite doi|10.1007/s11009-008-9089-z}}</ref>
 
== Risk measure and risk metric ==
 
The term “VaR” is used both for a risk [[Measure (mathematics)|measure]] and a [[risk metric]]. This sometimes leads to confusion. Sources earlier than 1995 usually emphasize the risk measure; later sources are more likely to emphasize the metric.
 
The VaR risk measure defines risk as [[Mark to market accounting|mark-to-market]] loss on a fixed portfolio over a fixed time horizon, assuming normal markets. There are many alternative risk measures in finance. Instead of mark-to-market, which uses market prices to define loss, loss is often defined as change in [[Intrinsic value (finance)|fundamental value]]. For example, if an institution holds a [[loan]] that declines in market price because [[interest]] rates go up, but has no change in cash flows or credit quality, some systems do not recognize a loss. Or we could try to incorporate the [[Economics|economic]] cost of things not measured in daily [[financial statements]], such as loss of market confidence or employee morale, impairment of brand names or lawsuits.<ref name="Dowd" />
 
Rather than assuming a fixed portfolio over a fixed time horizon, some risk measures incorporate the effect of expected trading (such as a [[Order (exchange)|stop loss order]]) and consider the expected holding period of positions. Finally, some risk measures adjust for the possible effects of abnormal markets, rather than excluding them from the computation.<ref name="Dowd" />
 
The VaR risk metric summarizes the [[Probability distribution|distribution]] of possible losses by a [[Quantile function|quantile]], a point with a specified probability of greater losses. Common alternative metrics are [[standard deviation]], mean [[absolute deviation]], [[expected shortfall]] and [[Sortino ratio|downside risk]].<ref name="Jorion" />
 
== VaR risk management ==
 
Supporters of VaR-based risk management claim the first and possibly greatest benefit of VaR is the improvement in [[systems]] and modeling it forces on an institution. In 1997, Philippe Jorion [http://www.derivativesstrategy.com/magazine/archive/1997/0497fea2.asp wrote]:<ref name="Jorion I">{{cite conference|first1=Philippe|last1=Jorion|title=The Jorion-Taleb Debate|publisher=Derivatives Strategy|date=April 1997}}</ref><blockquote>[T]he greatest benefit of VAR lies in the imposition of a structured methodology for critically thinking about risk. Institutions that go through the process of computing their VAR are forced to confront their exposure to financial risks and to set up a proper risk management function. Thus the process of getting to VAR may be as important as the number itself.</blockquote>
 
Publishing a daily number, on time and with specified [[Statistics|statistical]] properties, holds every part of a trading organization to a high objective standard. Robust backup systems and default assumptions must be implemented. Positions that are reported, modeled or priced incorrectly stand out, as do data feeds that are inaccurate or late and systems that are too frequently down. Anything that affects profit and loss that is left out of other reports will show up either in inflated VaR or excessive VaR breaks. “A risk-taking institution that ''does not'' compute VaR might escape disaster, but an institution that ''cannot'' compute VaR will not.”<ref name="Einhorn I">{{cite journal|author=[[Aaron Brown (financial author)|Aaron Brown]]|title=Private Profits and Socialized Risk|journal=GARP Risk Review|date=June–July 2008}}</ref>
 
The second claimed benefit of VaR is that it separates risk into two [[regime]]s. Inside the VaR limit, conventional [[statistical]] methods are reliable. Relatively short-term and specific data can be used for analysis. Probability estimates are meaningful, because there are enough data to test them. In a sense, there is no true risk because you have a sum of many [[Statistical independence|independent]] observations with a left bound on the outcome. A casino doesn't worry about whether red or black will come up on the next roulette spin. Risk managers encourage productive risk-taking in this regime, because there is little true cost. People tend to worry too much about these risks, because they happen frequently, and not enough about what might happen on the worst days.<ref name="Haug" />
 
Outside the VaR limit, all bets are off. Risk should be analyzed with [[stress testing]] based on long-term and broad market data.<ref name="Zask">{{Citation|author=Ezra Zask|title=Taking the Stress Out of Stress Testing|publisher=Derivative Strategy|date=February 1999}}</ref> Probability statements are no longer meaningful.<ref name="Roundtable II">{{cite journal|first1=Joe|last1=Kolman|first2=Michael|last2=Onak|first3=Philippe|last3=Jorion|first4=Nassim|last4=Taleb|first5=Emanuel|last5=Derman|first6=Blu|last6=Putnam|first7=Richard|last7=Sandor|first8=Stan|last8=Jonas|first9=Ron|last9=Dembo|first10=George|last10=Holt|first11=Richard|last11=Tanenbaum|first12=William|last12=Margrabe|first13=Dan|last13=Mudge|first14=James|last14=Lam|first15=Jim|last15=Rozsypal|title=Roundtable: The Limits of Models|journal=Derivatives Strategy|date=April 1998}}</ref>  Knowing the distribution of losses beyond the VaR point is both impossible and useless. The risk manager should concentrate instead on making sure good plans are in place to limit the loss if possible, and to survive the loss if not.<ref name="Jorion" />
 
One specific system uses three regimes.<ref name="Size">{{cite journal|author=[[Aaron Brown (financial author)|Aaron Brown]]|title=On Stressing the Right Size|journal=GARP Risk Review|date=December 2007}}</ref>
 
# One to three times VaR are normal occurrences. You expect periodic VaR breaks. The loss distribution typically has [[Kurtosis|fat tails]], and you might get more than one break in a short period of time. Moreover, markets may be abnormal and trading may exacerbate losses, and you may take losses not measured in daily [[financial statements|marks]] such as lawsuits, loss of employee morale and market confidence and impairment of brand names. So an institution that can't deal with three times VaR losses as routine events probably won't survive long enough to put a VaR system in place.
# Three to ten times VaR is the range for [[stress testing]]. Institutions should be confident they have examined all the foreseeable events that will cause losses in this range, and are prepared to survive them. These events are too rare to estimate probabilities reliably, so risk/return calculations are useless.
# Foreseeable events should not cause losses beyond ten times VaR. If they do, they should be [[Hedge (finance)|hedged]] or insured, or the business plan should be changed to avoid them, or VaR should be increased. It's hard to run a business if foreseeable losses are orders of magnitude larger than very large everyday losses. It's hard to plan for these events, because they are out of scale with daily experience. Of course there will be unforeseeable losses more than ten times VaR, but it's pointless to anticipate them; you can't know much about them, and anticipating them results in needless worrying. Better to hope that the discipline of preparing for all foreseeable three-to-ten times VaR losses will improve chances for surviving the unforeseen and larger losses that inevitably occur.
 
"A risk manager has two jobs: make people take more risk the 99% of the time it is safe to do so, and survive the other 1% of the time. VaR is the border."<ref name="Einhorn I" />
 
== VaR risk measurement ==
 
The VaR risk measure is a popular way to aggregate risk across an institution. Individual business units have risk measures such as [[Bond duration|duration]] for a [[fixed income]] [[Portfolio (finance)|portfolio]] or [[Beta (finance)|beta]] for an [[Stock|equity]] business. These cannot be combined in a meaningful way.<ref name="Jorion" /> It is also difficult to aggregate results available at different times, such as positions marked in different [[time zone]]s, or a high-frequency trading desk with a business holding relatively [[Market liquidity|illiquid]] positions. But since every business contributes to profit and loss in an [[Additive function|additive]] fashion, and many [[Finance|financial]] businesses mark-to-market daily, it is natural to define firm-wide risk using the distribution of possible losses at a fixed point in the future.<ref name="Dowd" />
 
In risk measurement, VaR is usually reported alongside other risk metrics such as [[standard deviation]], [[expected shortfall]] and “[[Greeks (finance)|greeks]]” ([[partial derivative]]s of portfolio value with respect to market factors). VaR is a [[Non-parametric statistics|distribution-free]] metric, that is it does not depend on assumptions about the probability distribution of future gains and losses.<ref name="Einhorn I" /> The probability level is chosen deep enough in the left tail of the loss distribution to be relevant for risk decisions, but not so deep as to be difficult to estimate with accuracy.<ref name="Glasserman">{{cite book|author=Paul Glasserman|title=Monte Carlo Methods in Financial Engineering|publisher=Springer|year=2004|isbn=978-0-387-00451-8}}</ref>
 
== Computation methods ==
 
VaR can be estimated either parametrically (for example, [[variance]]-[[covariance]] VaR or [[Greeks (finance)#Delta|delta]]-[[Greeks (finance)#Gamma|gamma]] VaR) or nonparametrically (for example, historical [[simulation]] VaR or [[Resampling|resampled]] VaR).<ref name="Dowd" /><ref name="Unbearable" /> Nonparametric methods of VaR estimation are discussed in Markovich<ref name="Markovich">{{Citation|last=Markovich|first=N.|title=Nonparametric analysis of univariate heavy-tailed data|publisher=Wiley|year=2007}}</ref> and Novak.<ref name="Novak">{{cite book|last=Novak|first=S.Y.|title=Extreme value methods with applications to finance|publisher=Chapman & Hall/CRC Press|year=2011|isbn=978-1-4398-3574-6}}</ref>
 
A McKinsey report<ref name="McKinsey">{{cite web|title=McKinsey Working Papers on Risk, Number 32|author=McKinsey & Company|url=http://www.mckinsey.com/~/media/McKinsey/dotcom/client_service/Risk/Working%20papers/Working_Papers_on_Risk_32.ashx|format=pdf}}</ref> published in May 2012 estimated that 85% of large banks were using [[historical simulation]]. The other 15% used Monte Carlo methods.
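The parametric and historical-simulation approaches can be contrasted on the same return series. A minimal sketch, assuming a hypothetical $10M portfolio and simulated normal daily returns (all figures are illustrative, not any institution's production method):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

# Hypothetical daily returns of a $10M portfolio (2,000 simulated days).
returns = rng.normal(loc=0.0005, scale=0.01, size=2_000)
value = 10_000_000

# Parametric (variance-covariance) VaR: assume returns are normal and scale
# the sample standard deviation by the normal quantile.
mu, sigma = returns.mean(), returns.std(ddof=1)
z = NormalDist().inv_cdf(0.95)          # ≈ 1.645 for one-day 5% VaR
parametric_var = value * (z * sigma - mu)

# Historical-simulation VaR: the empirical 95th percentile of losses,
# with no distributional assumption.
historical_var = value * np.quantile(-returns, 0.95)

print(round(parametric_var), round(historical_var))
```

Because the simulated returns really are normal here, the two estimates agree closely; on real fat-tailed data they can diverge substantially, which is the point of the distinction.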
 
== History of VaR ==
 
The problem of risk measurement is an old one in [[statistics]], [[economics]] and [[finance]]. Financial risk management has been a concern of regulators and financial executives for a long time as well. Retrospective analysis has found some VaR-like concepts in this history. But VaR did not emerge as a distinct concept until the late 1980s. The triggering event was the stock market [[Black Monday (1987)|crash of 1987]]. This was the first major financial crisis in which a lot of academically trained [[Quantitative analyst|quants]] were in high enough positions to worry about firm-wide survival.<ref name="Jorion" />
 
The crash was so unlikely given standard [[statistical]] models, that it called the entire basis of [[Quantitative analyst|quant]] finance into question. A reconsideration of history led some quants to decide there were recurring crises, about one or two per decade, that overwhelmed the statistical assumptions embedded in models used for [[Trader (finance)|trading]], [[investment management]] and [[Derivative (finance)|derivative]] pricing. These affected many markets at once, including ones that were usually not [[correlation|correlated]], and seldom had discernible economic cause or warning (although after-the-fact explanations were plentiful).<ref name="Roundtable II" /> Much later, they were named "[[Black Swan theory|Black Swans]]" by [[Nassim Nicholas Taleb|Nassim Taleb]] and the concept extended far beyond [[finance]].<ref name="Black Swan">{{cite book | author=Taleb, Nassim Nicholas | title=[[The Black Swan: The Impact of the Highly Improbable]] | publisher=[[Random House]] | location=New York | year=2007 | isbn=978-1-4000-6351-2}}</ref>
 
If these events were included in [[quantitative analysis (finance)|quantitative analysis]] they dominated results and led to strategies that did not work day to day. If these events were excluded, the profits made in between "Black Swans" could be much smaller than the losses suffered in the crisis. Institutions could fail as a result.<ref name="Einhorn I" /><ref name="Roundtable II" /><ref name="Black Swan" />
 
VaR was developed as a systematic way to segregate extreme events, which are studied qualitatively over long-term history and broad market events, from everyday price movements, which are studied quantitatively using short-term data in specific markets. It was hoped that "Black Swans" would be preceded by increases in estimated VaR or increased frequency of VaR breaks, in at least some markets. The extent to which this has proven to be true is controversial.<ref name="Roundtable II" />
 
Abnormal markets and trading were excluded from the VaR estimate in order to make it observable.<ref name="Haug">{{cite book|author=[[Espen Gaarder Haug|Espen Haug]]|title=Derivative Models on Models|publisher=John Wiley & Sons|year=2007|isbn=978-0-470-01322-9}}</ref> It is not always possible to define loss if, for example, markets are closed as after [[September 11 attacks|9/11]], or severely illiquid, as happened several times in 2008.<ref name="Einhorn I" /> Losses can also be hard to define if the risk-bearing institution fails or breaks up.<ref name="Haug" /> A measure that depends on traders taking certain actions, and avoiding other actions, can lead to [[self reference]].<ref name="Jorion" />
 
This is risk management VaR. It was well established in [[Quantitative analyst|quantitative trading]] groups at several financial institutions, notably [[Bankers Trust]], before 1990, although neither the name nor the definition had been standardized. There was no effort to aggregate VaRs across trading desks.<ref name="Roundtable II" />
 
The financial events of the early 1990s found many firms in trouble because the same underlying bet had been made at many places in the firm, in non-obvious ways. Since many trading desks already computed risk management VaR, and it was the only common risk measure that could be both defined for all businesses and aggregated without strong assumptions, it was the natural choice for reporting firmwide risk. [[JPMorgan Chase|J. P. Morgan]] CEO [[Dennis Weatherstone]] famously called for a “4:15 report” that combined all firm [[risk]] on one page, available within 15 minutes of the market close.<ref name="Roundtable I" />
 
Risk measurement VaR was developed for this purpose. Development was most extensive at [[JPMorgan Chase|J. P. Morgan]], which published the methodology and gave free access to estimates of the necessary underlying parameters in 1994. This was the first time VaR had been exposed beyond a relatively small group of quants. Two years later, the methodology was spun off into an independent for-profit business now part of [http://www.riskmetrics.com/ RiskMetrics Group].<ref name="Roundtable I" />
 
In 1997, the [[U.S. Securities and Exchange Commission]] ruled that public corporations must disclose quantitative information about their [[Derivative (finance)|derivatives]] activity. Major [[bank]]s and dealers chose to implement the rule by including VaR information in the notes to their [[financial statements]].<ref name="Jorion" />
 
Worldwide adoption of the [[Basel II Accord]], beginning in 1999 and nearing completion today, gave further impetus to the use of VaR. VaR is the preferred [[Measure (mathematics)|measure]] of [[market risk]], and concepts similar to VaR are used in other parts of the accord.<ref name="Jorion" />
 
== Criticism ==
 
VaR has been controversial since it moved from trading desks into the public eye in 1994. A famous 1997 [http://www.derivativesstrategy.com/magazine/archive/1997/0497fea2.asp debate] between [[Nassim Nicholas Taleb|Nassim Taleb]] and Philippe Jorion set out some of the major points of contention. Taleb claimed VaR:<ref name="Taleb Criticism">{{Citation|author=Nassim Taleb|title=The Jorion-Taleb Debate|publisher=Derivatives Strategy|date=April 1997}}</ref>
 
# Ignored 2,500 years of experience in favor of untested models built by non-traders
# Was charlatanism because it claimed to estimate the risks of rare events, which is impossible
# Gave false confidence
# Would be exploited by traders
 
More recently, [[David Einhorn (hedge fund manager)|David Einhorn]] and [[Aaron Brown (financial author)|Aaron Brown]] debated VaR in [http://www.garpdigitallibrary.org/download/GRR/2012.pdf Global Association of Risk Professionals Review].<ref name="Einhorn I" /><ref name="Einhorn II">{{Citation|author=David Einhorn|title=Private Profits and Socialized Risk|publisher=GARP Risk Review|date=June–July 2008}}</ref> Einhorn compared VaR to “an airbag that works all the time, except when you have a car accident.” He further charged that VaR:
 
# Led to excessive risk-taking and leverage at financial institutions
# Focused on the manageable risks near the center of the distribution and ignored the tails
# Created an incentive to take “excessive but remote risks”
# Was “potentially catastrophic when its use creates a false sense of security among senior executives and watchdogs.”
 
[[The New York Times|New York Times]] reporter [[Joseph Nocera|Joe Nocera]] wrote an extensive piece, [http://www.nytimes.com/2009/01/04/magazine/04risk-t.html?pagewanted=1&_r=1 Risk Mismanagement],<ref name="Nocera">{{Citation|author=[[Joe Nocera]]|title=Risk Mismanagement|publisher=[[The New York Times]] Magazine|date=January 4, 2009}}</ref> on January 4, 2009, discussing the role VaR played in the [[Financial crisis of 2007-2008]]. After interviewing risk managers (including several of the ones cited above), the article suggests that VaR was very useful to risk experts, but nevertheless exacerbated the crisis by giving false security to bank executives and regulators. A powerful tool for professional risk managers, VaR is portrayed as both easy to misunderstand, and dangerous when misunderstood.
 
In 2009, Taleb testified in Congress, asking for VaR to be banned on two grounds: first, that "tail risks are non-measurable" scientifically, and second, that for [[anchoring]] reasons VaR leads to higher risk-taking.<ref>http://gop.science.house.gov/Media/hearings/oversight09/sept10/taleb.pdf</ref>
 
A common complaint among academics is that VaR is not [[Subadditivity|subadditive]].<ref name="Dowd" /> That means the VaR of a combined portfolio can be larger than the sum of the VaRs of its components. To a practising risk manager this makes sense. For example, the average bank branch in the United States is robbed about once every ten years. A single-branch bank therefore has roughly a 0.04% chance of being robbed on a specific day, so the risk of robbery would not figure into one-day 1% VaR. It would not even be within an order of magnitude of that, so it is in the range where the institution should not worry about it; instead, it should insure against it and take advice from insurers on precautions. The whole point of insurance is to aggregate risks that are beyond individual VaR limits, and bring them into a large enough portfolio to get statistical predictability. It does not pay for a one-branch bank to have a security expert on staff.
 
As institutions get more branches, the risk of a robbery on a specific day rises to within an order of magnitude of VaR. At that point it makes sense for the institution to run internal stress tests and analyze the risk itself. It will spend less on insurance and more on in-house expertise. For a very large banking institution, robberies are a routine daily occurrence. Losses are part of the daily VaR calculation, and tracked statistically rather than case-by-case. A sizable in-house security department is in charge of prevention and control; the general risk manager just tracks the loss like any other cost of doing business.
 
As portfolios or institutions get larger, specific risks change from low-probability/low-predictability/high-impact to statistically predictable losses of low individual impact. That means they move from the range of far outside VaR, to be insured, to near outside VaR, to be analyzed case-by-case, to inside VaR, to be treated statistically.<ref name="Einhorn I" />
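The non-subadditivity complaint above can be made concrete with a toy calculation (a hypothetical sketch, not drawn from the article's sources): two independent positions whose individual 5% VaRs are both zero can combine into a portfolio whose 5% VaR is large.

```python
from itertools import product

# Hypothetical positions: each loses 100 with probability 0.04, else 0
p_default, loss, alpha = 0.04, 100.0, 0.05

def var(outcomes, alpha):
    """Smallest t with Pr(loss <= t) >= 1 - alpha, for a discrete
    distribution given as (probability, loss) pairs."""
    total = 0.0
    for prob, x in sorted(outcomes, key=lambda o: o[1]):
        total += prob
        if total >= 1 - alpha:
            return x

single = [(1 - p_default, 0.0), (p_default, loss)]
# Joint distribution of the sum of two independent such positions
combined = [(pa * pb, xa + xb)
            for (pa, xa), (pb, xb) in product(single, single)]

# Each position alone: Pr(loss = 0) = 0.96 >= 0.95, so its 5% VaR is 0.
# Combined: Pr(loss = 0) = 0.9216 < 0.95, so the 5% VaR jumps to 100.
print(var(single, alpha), var(combined, alpha))  # 0.0 100.0
```

Here VaR(X + Y) = 100 exceeds VaR(X) + VaR(Y) = 0, violating subadditivity.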
 
Even VaR supporters generally agree there are common abuses of VaR:<ref name="Unbearable" /><ref name="Roundtable I" />
 
# Referring to VaR as a "worst-case" or "maximum tolerable" loss. In fact, you expect two or three losses per year that exceed one-day 1% VaR.
# Making VaR control or VaR reduction the central concern of risk management. It is far more important to worry about what happens when losses exceed VaR.
# Assuming plausible losses will be less than some multiple, often three, of VaR. The entire point of VaR is that losses can be extremely large, and sometimes impossible to define, once you get beyond the VaR point. To a risk manager, VaR is the level of losses at which you stop trying to guess what will happen next, and start preparing for anything.
# Reporting a VaR that has not passed a [[backtesting|backtest]]. Regardless of how VaR is computed, it should have produced the correct number of breaks (within [[sampling error]]) in the past. A common specific violation of this is to report a VaR based on the unverified assumption that everything follows a [[multivariate normal distribution]].
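The backtest in the last point can be sketched in a few lines (a minimal illustration on simulated data; a real backtest would use the desk's actual P&L history and reported VaR series):

```python
import math
import random

random.seed(0)  # reproducible hypothetical data

days, alpha = 250, 0.01
var_reported = 2.326  # one-day 1% VaR under a standard-normal loss assumption

# Simulated realized daily losses stand in for the P&L history
losses = [random.gauss(0.0, 1.0) for _ in range(days)]

# Count "breaks": days on which the realized loss exceeded reported VaR
breaks = sum(loss > var_reported for loss in losses)

expected = days * alpha                                 # about 2.5 breaks
tolerance = 2 * math.sqrt(days * alpha * (1 - alpha))   # ~2 binomial std devs

# A VaR whose break count falls far outside this band fails the backtest
print(breaks, expected, round(tolerance, 1))
```

The same count-and-compare logic underlies formal exceedance tests such as Kupiec's proportion-of-failures test.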
 
===VaR, CVaR and EVaR===
 
The VaR is not a [[coherent risk measure]] since it violates the sub-additivity property, which is
 
<math>\mathrm{If}\; X,Y \in \mathbf{L} ,\; \mathrm{then}\; \rho(X + Y) \leq \rho(X) + \rho(Y).</math>
 
However, it can be bounded by coherent risk measures like [[Conditional Value-at-Risk]] (CVaR) or [[entropic value at risk]] (EVaR). In fact, for <math> X\in \mathbf{L}_{M^+} </math> (with <math>\mathbf{L}_{M^+} </math> the set of all [[Borel measure|Borel]] [[measurable function]]s whose [[moment-generating function]] exists for all positive real values) we have
 
<math>\text{VaR}_{1-\alpha}(X)\leq\text{CVaR}_{1-\alpha}(X)\leq\text{EVaR}_{1-\alpha}(X),</math>
 
where
 
<math>
\begin{align}
&\text{VaR}_{1-\alpha}(X):=\inf_{t\in\mathbf{R}}\{t:\text{Pr}(X\leq t)\geq 1-\alpha\},\\
&\text{CVaR}_{1-\alpha}(X) := \frac{1}{\alpha}\int_0^{\alpha} \text{VaR}_{1-\gamma}(X)d\gamma,\\
&\text{EVaR}_{1-\alpha}(X):=\inf_{z>0}\{z^{-1}\ln(M_X(z)/\alpha)\},
\end{align}
</math>
 
in which <math> M_X(z) </math> is the moment-generating function of <math> X </math> at <math> z </math>.  In the above equations the variable <math>X</math> denotes the financial loss, rather than wealth as is typically the case.
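The chain of inequalities can be checked concretely in the special case of a normally distributed loss, where all three measures have closed forms (a sketch using Python's standard library; the normal assumption is for illustration only and is exactly the assumption abuse #4 above warns against relying on):

```python
import math
from statistics import NormalDist

# Hypothetical illustration: loss X ~ N(mu, sigma^2), confidence 1 - alpha
mu, sigma, alpha = 0.0, 1.0, 0.05
std = NormalDist()
q = std.inv_cdf(1 - alpha)  # standard-normal (1 - alpha)-quantile

# VaR: the (1 - alpha)-quantile of the loss distribution
var = mu + sigma * q

# CVaR: average of VaR over the alpha tail; for a normal loss this
# integral reduces to mu + sigma * pdf(q) / alpha
cvar = mu + sigma * std.pdf(q) / alpha

# EVaR: inf over z > 0 of ln(M_X(z)/alpha)/z, where for a normal loss
# M_X(z) = exp(mu z + sigma^2 z^2 / 2); the infimum is attained at
# z = sqrt(-2 ln alpha)/sigma, giving mu + sigma * sqrt(-2 ln alpha)
evar = mu + sigma * math.sqrt(-2.0 * math.log(alpha))

assert var <= cvar <= evar  # the chain VaR <= CVaR <= EVaR
print(round(var, 3), round(cvar, 3), round(evar, 3))  # 1.645 2.063 2.448
```

The ordering holds for any admissible loss distribution, not just the normal case used here for the closed forms.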
 
== See also ==
* [[Capital Adequacy Directive]]
* [[Valuation risk]]
* [[Conditional value-at-risk]]
* [[Entropic value at risk]]
 
== References ==
<!--This section uses the Cite.php citation mechanism. If you would like more information on how to add references to this article, please see http://meta.wikimedia.org/wiki/Cite/Cite.php -->
{{reflist|30em}}
 
== External links ==
;Discussion
* [http://www.wilmott.com/blogs/satyajitdas/enclosures/perfectstorms%28may2007%291.pdf “Perfect Storms” – Beautiful & True Lies In Risk Management], [[Satyajit Das]]
* [http://www.gloriamundi.org/ “Gloria Mundi” – All About Value at Risk], Barry Schachter
* [http://www.nytimes.com/2009/01/04/magazine/04risk-t.html?dlbk=&pagewanted=all Risk Mismanagement], [[Joe Nocera]] [[NYTimes]] article.
* [http://www.savvysoft.com/registerpagevarnothard.htm "VaR Doesn't Have To Be Hard"], Rich Tanenbaum
;Tools
* [http://www.cba.ua.edu/~rpascala/VaR/VaRForm.php Online real-time VaR calculator], Razvan Pascalau, [[University of Alabama]]
* [http://simonbenninga.com/wiener/MiER74.pdf Value-at-Risk (VaR)], Simon Benninga and Zvi Wiener. (Mathematica in Education and Research Vol. 7 No. 4 1998.)
* [http://www.derivativesstrategy.com/magazine/archive/1998/0298fea2.asp Derivatives Strategy Magazine. "Inside D. E. Shaw"] Trading and Risk Management 1998
{{Financial risk}}
 
[[Category:Actuarial science]]
[[Category:Mathematical finance]]
[[Category:Financial risk]]
