



  Bestsellers

  • Article (Open Access)

    AN AI APPROACH TO MEASURING FINANCIAL RISK

    Artificial intelligence (AI) brings new quantitative techniques to assess the state of an economy. Here, we describe a new measure for systemic risk: the Financial Risk Meter (FRM). This measure is based on the penalization parameter (λ) of a linear quantile lasso regression. The FRM is calculated by taking the average of the penalization parameters over the 100 largest US publicly traded financial institutions. We demonstrate the suitability of this AI-based risk measure by comparing the proposed FRM to other measures of systemic risk, such as VIX, SRISK and Google Trends. We find that mutual Granger causality exists between the FRM and these measures, which supports the validity of the FRM as a systemic risk measure. The implementation of this project is carried out using parallel computing; the code is published on www.quantlet.de under the keyword FRM. The R package RiskAnalytics is another tool for integrating and facilitating the research, calculation and analysis methods around the FRM project. The visualization and the up-to-date FRM can be found at hu.berlin/frm.
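A toy sketch of the pipeline described above, not the authors' implementation: fit an L1-penalized linear quantile regression per institution, select the penalization parameter λ by validation pinball loss over a small grid, and average the selected λ's. The subgradient-descent fitter, grid, synthetic data and number of firms below are all illustrative assumptions.

```python
import random

def pinball(y, yhat, tau):
    """Pinball (quantile) loss for a single observation."""
    d = y - yhat
    return tau * d if d >= 0 else (tau - 1.0) * d

def fit_quantile_lasso(X, y, tau, lam, steps=400, lr=0.05):
    """Linear tau-quantile regression with an L1 penalty, fitted by subgradient descent."""
    p, n = len(X[0]), len(y)
    beta, b0 = [0.0] * p, 0.0
    for t in range(steps):
        gb, g0 = [0.0] * p, 0.0
        for xi, yi in zip(X, y):
            r = yi - (b0 + sum(bj * xj for bj, xj in zip(beta, xi)))
            s = -tau if r >= 0 else (1.0 - tau)   # subgradient of the pinball loss
            g0 += s / n
            for j in range(p):
                gb[j] += s * xi[j] / n
        step = lr / (1.0 + 0.01 * t)
        b0 -= step * g0
        for j in range(p):
            sign = (beta[j] > 0) - (beta[j] < 0)
            beta[j] -= step * (gb[j] + lam * sign)
    return b0, beta

def select_lambda(X, y, tau, grid):
    """Choose the penalization parameter minimizing out-of-sample pinball loss."""
    cut = int(0.7 * len(y))
    best_lam, best_loss = grid[0], float("inf")
    for lam in grid:
        b0, beta = fit_quantile_lasso(X[:cut], y[:cut], tau, lam)
        loss = sum(pinball(yi, b0 + sum(bj * xj for bj, xj in zip(beta, xi)), tau)
                   for xi, yi in zip(X[cut:], y[cut:]))
        if loss < best_loss:
            best_lam, best_loss = lam, loss
    return best_lam

random.seed(0)
grid = [0.0, 0.01, 0.1, 1.0]
lambdas = []
for firm in range(3):                       # toy stand-in for the 100 institutions
    X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(120)]
    y = [0.5 * x1 - 0.2 * x2 + random.gauss(0, 0.3) for x1, x2 in X]
    lambdas.append(select_lambda(X, y, tau=0.05, grid=grid))
frm = sum(lambdas) / len(lambdas)           # FRM-style average of the lambdas
```

The actual FRM uses rolling windows of returns and macro covariates per institution; this sketch only shows the select-then-average mechanics.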

  • Article (No Access)

    RISK MANAGEMENT UNDER A FACTOR STOCHASTIC VOLATILITY MODEL

    In this paper, we study risk measures and portfolio problems based on a Stochastic Volatility Factor Model (SVFM). We analyze the sensitivity of Value at Risk (VaR) and Expected Shortfall (ES) to changes in the parameters of the model. We compare the positions of a linear portfolio whose assets follow an SVFM, a Black–Scholes model and a model with a constant dependence structure. We consider an application to a portfolio of three selected Asian funds.

  • Article (No Access)

    RELIABILITY DESIGN AND RVaR

    This paper provides an alternative valuation approach to reliability design based on the concept of Value at Risk (VaR) used in finance to measure risk exposure. An accounting of the direct and indirect costs, combined with the costs needed to meet contingent claims in case of default, is used to redefine the traditional reliability design problem, resulting in a nonlinear optimization problem which can be solved by the usual methods. Applications demonstrate the approach.

  • Article (No Access)

    EQUITY ALLOCATION AND PORTFOLIO SELECTION IN INSURANCE: A SIMPLIFIED PORTFOLIO MODEL

    A quadratic discrete-time probabilistic model for optimal portfolio selection under risk constraint is introduced in the context of (re)insurance and finance. The portfolio is composed of contracts with arbitrary underwriting and maturity times. For positive values of underwriting levels, the expected value of the accumulated final result is optimized under constraints on its variance and on annual returns on equity. Existence of a unique solution is proved and a Lagrangian formalism is given. An effective method for solving the Euler–Lagrange equations is developed. The approximate determination of the multipliers is discussed. This basic model, which can include both assets and liabilities, is an important building block for more general models with constraints also on non-solvency probabilities, market shares, shortfall distributions and Values at Risk.

  • Article (No Access)

    TIME VARYING SENSITIVITIES ON A GRID ARCHITECTURE

    We investigate the gains obtained by using GRID, an innovative web-based technology for parallel computing, in a risk management application. We show, by estimating a parametric Value at Risk, how GRID computing offers an opportunity to enhance the solution of computationally demanding problems with decentralized data retrieval. Furthermore, we provide an analysis of the risk factors in the US market by empirically testing, on the Fama and French database, a classic one-factor model augmented with a time-varying specification of beta.
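The time-varying beta specification can be sketched with a rolling OLS estimate, beta_t = cov(r, f) / var(f) over a moving window; the window length and data below are illustrative, not the paper's Fama and French setup.

```python
def rolling_beta(asset, factor, window):
    """Rolling OLS beta of asset returns on factor returns: cov(r, f) / var(f)."""
    betas = []
    for t in range(window, len(asset) + 1):
        r, f = asset[t - window:t], factor[t - window:t]
        mr, mf = sum(r) / window, sum(f) / window
        cov = sum((ri - mr) * (fi - mf) for ri, fi in zip(r, f)) / window
        var = sum((fi - mf) ** 2 for fi in f) / window
        betas.append(cov / var)
    return betas

factor = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, -0.005, 0.01]
asset = [2.0 * f for f in factor]            # a pure-beta asset with true beta 2
betas = rolling_beta(asset, factor, window=4)
```

Each window yields one beta estimate, so the series of betas traces the time variation the abstract refers to.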

  • Article (No Access)

    THE STRESS-DEPENDENT RANDOM WALK

    A log-normal random walk with parameters that are functions of market stress naturally accounts for volatility clustering and fat-tailed return distributions. Fitting this model to a stock and a bond index, we find no evidence of significant misspecification despite the fact that the model has no adjustable parameters. This model can be interpreted as a stochastic volatility model without latent variables. We obtain a closed-form expression for the Value at Risk (VaR) that accommodates returns of any magnitude and discuss several other applications.
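The paper's stress-dependent closed form is not reproduced here, but the baseline case is easy to illustrate: with constant parameters, a log-normal walk already yields a closed-form VaR valid for returns of any magnitude (all parameter values below are assumptions).

```python
from math import exp, sqrt
from statistics import NormalDist

def lognormal_var(s0, mu, sigma, horizon, alpha):
    """Closed-form VaR of holding s0 when S_T = s0*exp(mu*h + sigma*sqrt(h)*Z), Z ~ N(0,1)."""
    z = NormalDist().inv_cdf(alpha)
    worst_price = s0 * exp(mu * horizon - sigma * sqrt(horizon) * z)
    return s0 - worst_price                  # loss not exceeded with probability alpha

var99 = lognormal_var(s0=100.0, mu=0.0, sigma=0.2, horizon=1.0, alpha=0.99)
```

Unlike a Gaussian-return approximation, the loss here can never exceed the initial position s0, which is what "accommodates returns of any magnitude" buys you.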

  • Article (No Access)

    THEORETICAL SENSITIVITY ANALYSIS FOR QUANTITATIVE OPERATIONAL RISK MANAGEMENT

    We study the asymptotic behavior of the difference between the values at risk VaR_α(L) and VaR_α(L+S) for heavy-tailed random variables L and S as α ↑ 1, for application in sensitivity analysis of quantitative operational risk management within the framework of the advanced measurement approach of Basel II (and III). Here, L describes the loss amount of the present risk profile and S describes the loss amount caused by an additional loss factor. We obtain different types of results according to the relative magnitudes of the thicknesses of the tails of L and S. In particular, if the tail of S is sufficiently thinner than that of L, then the difference between prior and posterior risk amounts, VaR_α(L+S) − VaR_α(L), is asymptotically equivalent to the expectation (expected loss) of S.

  • Article (No Access)

    A LIQUIDATION RISK ADJUSTMENT FOR VALUE AT RISK AND EXPECTED SHORTFALL

    This paper proposes an intuitive and flexible framework to quantify liquidation risk for financial institutions. We develop a model where the “fundamental” dynamics of assets are modified by price impacts from fund liquidations. We characterize mathematically the liquidation schedule of financial institutions and study in detail the fire sales resulting endogenously from margin constraints when a financial institution trades through an exchange. Our study enables us to obtain tractable formulas for the value at risk and expected shortfall of a financial institution in the presence of fund liquidation. In particular, we find an additive decomposition for liquidation-adjusted risk measures. We show that such a measure can be expressed as a “fundamental” risk measure plus a liquidation risk adjustment that is proportional to the size of fund positions as a fraction of asset market depths. Our results can be used by risk managers in financial institutions to better handle liquidity events arising from fund liquidations and to adjust their portfolio allocations to liquidation risk more accurately.

  • Article (No Access)

    Volatility Forecast by Volatility Index and Its Use as a Risk Management Tool Under a Value-at-Risk Approach

    This paper examines the predictive power of the volatility indexes VIX and VHSI on the future volatilities (also called realized volatility, σ_{30,r}) of their respective underlying indexes, the S&P 500 Index (SPX) and the Hang Seng Index (HSI). It is found that the volatility indexes VIX and VHSI are, on average, numerically greater than the realized volatilities of SPX and HSI, respectively. Further analysis indicates that realized volatility, if used for pricing options, would on some occasions result in greatest losses of 2.21% and 1.91% of the spot price of SPX and HSI, respectively, while the greatest profits are 2.56% and 2.93% of the spot price of SPX and HSI, respectively, making it not an ideal benchmark for validating volatility forecasting techniques in relation to option pricing. Hence, a new benchmark (fair volatility, σ_f) that considers the premium of the option and the cost of dynamically hedging the position is proposed accordingly. It reveals that, on average, options priced by volatility indexes contain a risk premium demanded by the option sellers. However, the options could on some occasions result in greatest losses of 4.85% and 3.60% of the spot price of SPX and HSI, respectively, while the greatest profits are 4.60% and 5.49% of the spot price of SPX and HSI, respectively. Nevertheless, it can still be a valuable tool for risk management. z-values of various significance levels for value-at-risk and conditional value-at-risk have been statistically determined for the US, Hong Kong, Australia, India, Japan and Korea markets.

  • Article (No Access)

    Convolution Approach for Value at Risk Estimation

    Formally, Value at Risk (VaR) measures the worst expected loss over a given horizon under normal market conditions at a given confidence level. Very often, daily data are used to compute VaR, which is then scaled up to the required time horizon with the square-root-of-time adjustment. This gives rise to an important question when we perform VaR estimation: whether the values of VaR (i.e., “loss”) should be interpreted as (1) exactly on the i-th day or (2) within i days. This research attempted to answer the above question using actual data for SPX and HSI. It was found that, in determining the proportionality of the values (i.e., slopes) of VaR versus the holding period, the slopes for “within i days” are generally greater than those for “exactly on the i-th day”. This has great implications for risk managers, as it would be inappropriate to simply scale up the one-day volatility by multiplying by the square root of time (or the number of days) ahead to determine the risk over a longer horizon of i days.

    The evolution of the log return distribution over time using actual data has also been examined empirically. It provides a better understanding than a series of VaR values. However, the number of samples in actual data is limited: there may not be enough data to draw reliable observations after the distribution has been evolved a few times. Numerical simulation can help solve the problem by generating, say, one million log returns. It has been used to provide many insights into how the distribution evolves over time and reveals an interesting dynamic of minimum cumulative returns.

    Numerical simulation is time consuming for performing the evolution of a distribution. The convolution approach provides an efficient way to analyze the evolution of the whole data distribution, which encompasses VaR and other measures. The convolution approach with a modified/scaled input distribution has been developed, and it matches the results of numerical simulation perfectly for independent data, both exactly on the i-th day and within i days. As the autocorrelation of a single index is mostly close to zero, results show that the convolution approach is able to match empirical results to a large extent.
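A minimal sketch of the convolution idea under the simplifying assumption of i.i.d. daily log returns on a uniform grid (the paper's modified/scaled input distributions are not reproduced): convolving the one-day distribution with itself h − 1 times gives the h-day distribution, from which VaR is read off as a quantile instead of being scaled by the square root of time.

```python
def convolve(p, q):
    """PMF of the sum of two independent discretized returns."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def grid_var(pmf, x0, dx, alpha):
    """Lower (1 - alpha) quantile of the return grid, reported as a positive loss."""
    acc = 0.0
    for k, pk in enumerate(pmf):
        acc += pk
        if acc >= 1.0 - alpha:
            return -(x0 + k * dx)
    return -(x0 + (len(pmf) - 1) * dx)

# Illustrative discretized one-day return PMF on the grid x0 + k*dx (bell-shaped, mean zero).
dx, x0 = 0.01, -0.03
one_day = [0.05, 0.1, 0.2, 0.3, 0.2, 0.1, 0.05]

h = 5
pmf, lo = one_day[:], x0
for _ in range(h - 1):                       # h-fold convolution: the h-day distribution
    pmf = convolve(pmf, one_day)
    lo += x0                                 # grid origin shifts with each convolution

var_1 = grid_var(one_day, x0, dx, 0.99)
var_h = grid_var(pmf, lo, dx, 0.99)          # compare with sqrt(h) * var_1 scaling
```

Because variances of independent summands add exactly under convolution, the h-day variance is h times the one-day variance, while the tail quantile need not follow the square-root-of-time rule.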

  • Article (No Access)

    Do Mutual Funds Reward Downside Risk? Evidence from an Emerging Economy

    The increasing participation in mutual funds of retail investors, who generally have limited risk-bearing ability, has surprisingly not motivated rigorous examination of the relationship between the downside risk and forward returns of these funds, which this study aims to accomplish.

    We use a survivorship-bias-free database of returns and portfolio holdings of Indian equity mutual funds covering the period April 2008–November 2018, with the Cornish–Fisher expansion of Value-at-Risk (VaR) as a measure of downside risk, to examine the downside risk–forward return relationship while controlling for fund characteristics such as age, size and style. To complete the story, we also test for the presence of Downside Risk Timing (DRT) ability among fund managers.

    We find that downside risk does predict the future returns of mutual funds and that the result holds across fund sizes, ages and styles, particularly strongly in the large- and mid-cap styles. We also find that the manager of the median fund exhibits positive DRT at the 1- and 6-month horizons and negative DRT at the 3- and 12-month horizons. The mixed evidence on DRT suggests that fund managers either did not possess the requisite risk-timing ability or failed in its application. The findings have important implications for both investors and fund managers.
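The downside risk measure used in the study, a Cornish–Fisher modified VaR, can be sketched as follows; the formula is the standard fourth-order expansion in skewness (skew) and excess kurtosis (exkurt), and the moment values below are illustrative.

```python
from statistics import NormalDist

def cornish_fisher_var(mu, sigma, skew, exkurt, alpha=0.95):
    """Modified VaR: the Gaussian quantile corrected for skewness and excess kurtosis."""
    z = NormalDist().inv_cdf(1.0 - alpha)    # lower-tail Gaussian quantile (negative)
    z_cf = (z
            + (z ** 2 - 1) * skew / 6
            + (z ** 3 - 3 * z) * exkurt / 24
            - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
    return -(mu + z_cf * sigma)              # loss not exceeded with probability alpha

gaussian = cornish_fisher_var(0.01, 0.06, 0.0, 0.0)      # collapses to the plain VaR
left_skewed = cornish_fisher_var(0.01, 0.06, -0.8, 1.5)  # fatter left tail, larger VaR
```

With zero skewness and excess kurtosis the expansion reduces exactly to the Gaussian VaR, which is a convenient sanity check.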

  • Article (No Access)

    A New Coherent Risk Measure of Entropic Value at Risk for Uncertain Systems

    Uncertain risk analysis was initially explored by Liu, who introduced the concepts of loss function and risk index within uncertainty theory. Various risk measures, such as value at risk (VaR), tail VaR (TVaR), expected loss and entropic VaR (EVaR), have been proposed within probability theory. To date, Peng has extended the concepts of VaR and TVaR, while Liu and Ralescu have extended the notion of expected loss for uncertain systems. This paper extends the concept of EVaR as a novel uncertain coherent risk measure. It is shown to be the tightest upper bound on the VaR obtainable from the Chernoff inequality. In addition, we verify the properties of EVaR under uncertain measure, including positive homogeneity, translation invariance, monotonicity and subadditivity under independence. Furthermore, this paper provides a comparison of VaR, TVaR and EVaR under uncertain measure. Two examples are given to illustrate this comparison.
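The probabilistic (not uncertain-measure) versions of the three measures are easy to compare in the Gaussian case, where each has a closed form and the Chernoff-bound characterization makes EVaR the largest; a sketch under that Gaussian assumption:

```python
from math import exp, log, pi, sqrt
from statistics import NormalDist

def gaussian_risk_measures(mu, sigma, alpha):
    """VaR, ES (TVaR) and EVaR of a Gaussian loss L ~ N(mu, sigma^2)."""
    z = NormalDist().inv_cdf(alpha)
    phi = exp(-z * z / 2) / sqrt(2 * pi)             # standard normal density at z
    var_ = mu + sigma * z
    es = mu + sigma * phi / (1 - alpha)
    evar = mu + sigma * sqrt(-2 * log(1 - alpha))    # infimum of the Chernoff bound
    return var_, es, evar

v, e, ev = gaussian_risk_measures(0.0, 1.0, 0.95)    # ordered: VaR <= ES <= EVaR
```

The ordering VaR ≤ TVaR ≤ EVaR visible here is exactly the comparison the paper carries over to uncertain measure.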

  • Article (No Access)

    The sum of two independent polynomially-modified hyperbolic secant random variables with application in computational finance

    In this paper, by reshaping the hyperbolic secant distribution using Hermite polynomials, we devise a polynomially-modified hyperbolic secant distribution which is more flexible than the secant distribution in capturing the skewness, heavy-tailedness and kurtosis of data. As a portfolio may consist of multiple assets, the distribution of the sum of independent polynomially-modified hyperbolic secant random variables is derived. In special cases, we evaluate risk measures such as value at risk (VaR) and expected shortfall (ES) for the sum of two independent polynomially-modified hyperbolic secant random variables. Finally, using real datasets from four international computer stocks, namely Adobe Systems, Microsoft, Nvidia and Symantec, the effectiveness of the proposed model is shown by the goodness of the Gram–Charlier-like expansion of the hyperbolic secant law for the performance of VaR and ES estimation, both in and out of the sample period.

  • Article (Free Access)

    Alleviating Coordination Problems and Regulatory Constraints Through Financial Risk Management

    Seeing the firm as a nexus of activities and projects, we propose a characterization of the firm where variations in the market price of risk should induce adjustments in the firm's portfolio of projects. In a setting where managers disagree with respect to what investment maximizes value, changing the portfolio of projects generates coordination costs. We then propose a new role for financial risk management based on the idea that the use of financial derivatives reduces coordination costs by moving the organization's expected cash flows and risks toward a point where coordination in favor of real changes is easier to achieve. We find empirical support for this new rationale for the use of financial derivatives, after controlling for the traditional variables explaining the need for financial risk management.

  • Article (No Access)

    MARKET RISK OF INVESTMENT IN US SUBPRIME CRISIS: COMPARISON OF A PURE DIFFUSION AND A PURE JUMP MODEL

    We consider the oldest financial model for estimating the market risk of investment underlying the world indexes and compare its risk management features with those of a newer model. Our concern is the risk underlying the world indexes in the recent US subprime crisis period. We illustrate how the recent variance gamma (VG) pure jump model compares with one of the earliest pure diffusion models, the Bachelier (BC) model, in estimating investment risk in financial markets using the tail risk measure value-at-risk (VaR) and its coherent version, expected shortfall (ES). We observe that for the pure jump VG model the single-quantile VaR is consistently a better performer across indexes; however, for the tail-average risk measure ES, VG is not consistently better; the pure diffusion Bachelier model gives ES estimates which are often, though not always, better than VG's. This provides one more empirical indication that a combination of diffusion and jumps is likely to be more effective in turbulent times, e.g., in the US subprime crisis period.
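A Monte Carlo sketch of the comparison, with illustrative parameters: the VG return is Brownian motion with drift evaluated at a gamma time change, the Bachelier return is Gaussian, and VaR and ES are read off the simulated loss distributions.

```python
import random

def empirical_var_es(losses, alpha):
    """Empirical VaR (quantile) and ES (mean loss beyond the VaR)."""
    s = sorted(losses)
    k = int(alpha * len(s))
    tail = s[k:]
    return s[k], sum(tail) / len(tail)

random.seed(42)
n, t = 20000, 1.0
sigma, theta, nu = 0.2, -0.1, 0.3            # illustrative VG parameters

# Bachelier: arithmetic Brownian motion, so the return over [0, t] is Gaussian.
bachelier = [-sigma * random.gauss(0.0, t ** 0.5) for _ in range(n)]

# Variance gamma: Brownian motion with drift at a gamma-distributed time change.
vg = []
for _ in range(n):
    g = random.gammavariate(t / nu, nu)      # gamma subordinator, mean t
    x = theta * g + sigma * (g ** 0.5) * random.gauss(0.0, 1.0)
    vg.append(-x)                            # loss = negative return

var_bc, es_bc = empirical_var_es(bachelier, 0.99)
var_vg, es_vg = empirical_var_es(vg, 0.99)
```

Comparing var/es across the two loss samples mirrors the paper's VaR-versus-ES contrast between the pure jump and pure diffusion models.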

  • Article (No Access)

    MODELING THE DYNAMICS OF INTERNATIONAL AGRICULTURAL COMMODITY PRICES: A COMPARISON OF GARCH AND STOCHASTIC VOLATILITY MODELS

    In this study, we employ generalized autoregressive conditional heteroscedasticity (GARCH) and stochastic volatility models to investigate the dynamics of wheat, corn and soybean prices. We find that the stochastic volatility model provides the highest persistence of the volatility estimates in all cases. In addition, based on monthly data, we find that jump processes and asymmetric effects are absent from agricultural commodity prices. Finally, by estimating Value at Risk (VaR) for these agricultural commodities, we find that the upsurge in agricultural prices in 2008 may have been caused by financialization.
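The GARCH side of the comparison can be sketched with the GARCH(1,1) recursion and the one-step-ahead parametric VaR it implies; the parameters below are assumed, not estimated from the commodity data.

```python
from statistics import NormalDist

def garch_var(returns, omega, a, b, alpha=0.99):
    """GARCH(1,1) recursion sigma2_t = omega + a*r_{t-1}^2 + b*sigma2_{t-1},
    followed by the one-step-ahead Gaussian VaR z_alpha * sigma_{T+1}.
    Assumes a + b < 1 so the unconditional variance exists."""
    sigma2 = omega / (1.0 - a - b)           # start at the unconditional variance
    for r in returns:
        sigma2 = omega + a * r * r + b * sigma2
    return NormalDist().inv_cdf(alpha) * sigma2 ** 0.5

returns = [0.01, -0.03, 0.02, -0.05, 0.04, -0.01]
var_next = garch_var(returns, omega=1e-5, a=0.08, b=0.90)
```

The persistence the abstract refers to is a + b: the closer it is to one, the more slowly a volatility shock decays through the recursion.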

  • Article (Open Access)

    ACCURACY VERSUS COMPLEXITY TRADE-OFF IN VaR MODELING: COULD TECHNICAL ANALYSIS BE A SOLUTION?

    Accurate Value at Risk (VaR) estimations are crucial for the robustness and stability of a financial system. Even though significant advances have been made in the field of risk modeling, many crises have emerged during the same period, and one explanation is that the advanced models are not widely applied in the financial industry due to their mathematical complexity. In contrast to mathematically complex models that torture the data in the output stage, we suggest a new approach that filters the data inputs based on Technical Analysis (TA) signals. When the trading signals suggest that conditions are positive (negative) for investments, we use data from previously documented positive (negative) periods in order to calculate the VaR. In this way, we use input data that are more representative of the financial conditions under examination, and thus the VaR estimations are more accurate and more representative (nonprocyclical) than those of conventional models that use the last x nonfiltered daily observations. Testing our assumptions in the US stock market for the period 2000–2017, we find that the empirical data confirm our hypothesis. Moreover, we suggest specific legislative adjustments that would contribute to more accurate and representative VaR estimations: (i) an extra backtesting procedure at a confidence level lower than 99% as a procyclicality test and (ii) easing the minimum requirement of 250 observations that is currently the input threshold, because it leads to less accurate VaR estimations.
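A minimal sketch of the filtering idea, with a simple moving-average crossover standing in for the TA signal set (the study's actual signals, windows and data are not reproduced): the sample is split by regime and a historical VaR is computed from each regime's observations rather than from the last x days.

```python
import random

def historical_var(returns, alpha=0.99):
    """Plain historical VaR: the empirical lower-tail quantile, as a positive loss."""
    s = sorted(returns)
    return -s[int((1.0 - alpha) * len(s))]

def filtered_var(prices, returns, window=5, alpha=0.99):
    """Split the sample by a moving-average signal; one historical VaR per regime."""
    pos, neg = [], []
    for t in range(window, len(returns)):
        ma = sum(prices[t - window:t]) / window
        (pos if prices[t] > ma else neg).append(returns[t])
    full = returns[window:]                  # fallback if a regime has no observations
    return historical_var(pos or full, alpha), historical_var(neg or full, alpha)

random.seed(1)
prices, returns = [100.0], [0.0]
for _ in range(300):                         # synthetic price path for illustration
    r = random.gauss(0.0005, 0.01)
    returns.append(r)
    prices.append(prices[-1] * (1.0 + r))

var_pos, var_neg = filtered_var(prices, returns)
```

When the current signal is positive, var_pos is the estimate to use, so the VaR reflects the prevailing regime instead of a fixed recent window.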

  • Article (No Access)

    Optimal trade execution under displaced diffusions dynamics across different risk criteria

    We solve a version of the optimal trade execution problem when the mid asset price follows a displaced diffusion (DD). Optimal strategies in the adapted class under various risk criteria, namely value-at-risk (VaR), expected shortfall (ES) and a new criterion called squared asset expectation (SAE), related to a version of the cost variance measure, are derived and compared. It is well known that DDs exhibit dynamics that are in between arithmetic Brownian motion (ABM) and geometric Brownian motion (GBM), depending on the choice of the shift parameter. Furthermore, the DD allows for changes in the support of the mid asset price distribution, making it possible to include a minimum permitted value for the mid price, either positive or negative. We study the dependence of the optimal solution on the choice of the risk aversion criterion. Optimal solutions across criteria and asset dynamics are comparable, although the differences are not negligible for high levels of risk aversion and low-market-impact assets. This is illustrated with numerical examples.

  • Chapter (No Access)

    Chapter 72: Non-Parametric Inference on Risk Measures for Integrated Returns

    When evaluating the market risk of long-horizon equity returns, it is always difficult to provide a statistically sound solution due to the limitation of the sample size. To solve the problem for the value-at-risk (VaR) and the conditional tail expectation (CTE), Ho et al. (2016, 2018) introduce a general multivariate stochastic volatility return model from which asymptotic formulas for the VaR and the CTE are derived for integrated returns with the length of integration increasing to infinity. Based on the formulas, simple non-parametric estimators for the two popular risk measures of the long-horizon returns are constructed. The estimates are easy to implement and shown to be consistent and asymptotically normal. In this chapter, we further address the issue of testing the equality of the CTEs of integrated returns. Extensive finite-sample analysis and real data analysis are conducted to demonstrate the efficiency of the test statistics we propose.