In this paper, we study risk measures and portfolio problems based on a Stochastic Volatility Factor Model (SVFM). We analyze the sensitivity of Value at Risk (VaR) and Expected Shortfall (ES) to changes in the parameters of the model. We compare the positions of a linear portfolio whose assets follow an SVFM, a Black–Scholes model, and a model with a constant dependence structure. We consider an application to a portfolio of three selected Asian funds.
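As a concrete illustration of the quantities involved, the following sketch computes VaR and ES for a linear portfolio by Monte Carlo under a simple one-factor stochastic-volatility simulation. It is not the paper's SVFM: the dynamics, weights, betas, and all parameters (`kappa`, `theta`, `xi`) are hypothetical, and a sensitivity analysis would amount to re-running it with perturbed parameters.

```python
# Minimal Monte Carlo sketch of VaR and ES for a linear portfolio under
# a one-factor stochastic-volatility simulation. Illustrative only:
# not the paper's SVFM; all parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_sims, horizon = 100_000, 10          # 10-day horizon
weights = np.array([0.5, 0.3, 0.2])    # linear portfolio weights
betas = np.array([1.0, 0.8, 1.2])      # exposures to the common factor

# simple log-vol factor: dv = kappa*(theta - v)dt + xi*dW (Euler scheme)
kappa, theta, xi, dt = 2.0, np.log(0.2), 0.3, 1 / 252
v = np.full(n_sims, theta)
pnl = np.zeros(n_sims)
for _ in range(horizon):
    v += kappa * (theta - v) * dt + xi * np.sqrt(dt) * rng.standard_normal(n_sims)
    factor = np.exp(v) * np.sqrt(dt) * rng.standard_normal(n_sims)
    idio = 0.05 * np.sqrt(dt) * rng.standard_normal((n_sims, 3))
    pnl += (betas * factor[:, None] + idio) @ weights

alpha = 0.99
var = -np.quantile(pnl, 1 - alpha)     # Value at Risk (positive number)
es = -pnl[pnl <= -var].mean()          # Expected Shortfall of the tail
print(f"VaR({alpha:.0%}) = {var:.4f}, ES = {es:.4f}")
```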
Often, in situations of uncertainty in portfolio management, it is difficult to apply numerical methods based on the linearity principle. When this happens, it is possible to use non-numerical techniques to assess the situation from a nonlinear perspective. One of the concepts that can be used in these situations is that of grouping. Over the last thirty years, several studies have tried to give good solutions to the problem of homogeneous groupings; we could mention, for example, the Pichat algorithm, affinity algorithms, and several studies developed by the authors of this work. In this paper, we use some topological axioms to develop an algorithm that reduces the number of elements of the power sets of the related sets by connecting them to the sets that form the topologies. We apply this algorithm to the grouping of securities listed on the stock exchange, or to its dual problem.
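The following toy sketch conveys the flavor of the grouping step only: it builds a boolean affinity relation from return correlations of hypothetical securities and extracts homogeneous groups as connected components. It is a generic illustration, not the authors' topology-based algorithm.

```python
# Toy illustration of homogeneous grouping of listed securities by
# pairwise affinity (a correlation threshold). Generic sketch for
# intuition only; not the topology-based algorithm of the paper.
import numpy as np

rng = np.random.default_rng(1)
common = rng.standard_normal((250, 2))
# six hypothetical securities driven by two latent sectors
loadings = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]])
returns = common @ loadings.T + 0.5 * rng.standard_normal((250, 6))

corr = np.corrcoef(returns.T)
affine = corr > 0.6                      # boolean affinity relation

# groups = connected components of the affinity relation
groups, unassigned = [], set(range(6))
while unassigned:
    seed = unassigned.pop()
    group, frontier = {seed}, {seed}
    while frontier:
        i = frontier.pop()
        linked = {j for j in unassigned if affine[i, j]}
        unassigned -= linked
        group |= linked
        frontier |= linked
    groups.append(sorted(group))
print(groups)   # expected: [[0, 1, 2], [3, 4, 5]]
```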
The model parameters in optimal asset allocation problems are often assumed to be deterministic. This is not a realistic assumption, since most parameters are not known exactly and therefore have to be estimated. We consider investment opportunities which are modeled as local geometric Brownian motions whose drift terms may be stochastic and not necessarily observable. The drift terms of the risky assets are assumed to be affine functions of some arbitrary factors. These factors themselves may be stochastic processes. They are modeled as having mean-reverting behavior. We consider two types of factors, namely observable and unobservable ones. The closed-form solution of the general problem is derived. The investor is assumed to have either constant relative risk aversion (CRRA) or constant absolute risk aversion (CARA). The optimal asset allocation under partial information is derived by transforming the problem into a full-information problem, where the solution is well known. The analytical result is empirically tested in a real-world application. In our case, we consider the optimal management of a balanced fund mandate. The unobservable risk factors are estimated with a Kalman filter. We compare the results of the partial-information strategy with the corresponding full-information strategy. We find that using a partial-information approach yields much better results in terms of Sharpe ratios than the full-information approach.
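A minimal sketch of the filtering step is given below: a scalar Kalman filter estimating a single unobservable mean-reverting drift factor from observed returns. The dynamics and all parameters (`lam`, `xbar`, `sig_x`, `sig_s`) are hypothetical and simpler than the paper's multi-factor setting.

```python
# Minimal sketch of the filtering step: a scalar Kalman filter tracking
# one unobservable mean-reverting drift factor from observed returns.
# All parameters are hypothetical; the paper's setting is multi-factor.
import numpy as np

rng = np.random.default_rng(2)
T, dt = 500, 1 / 252
lam, xbar, sig_x = 3.0, 0.05, 0.4    # factor mean reversion and vol
sig_s = 0.2                          # asset return volatility

# simulate latent factor x and observed returns r (drift affine in x)
x, r = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x[t] = x[t-1] + lam * (xbar - x[t-1]) * dt \
           + sig_x * np.sqrt(dt) * rng.standard_normal()
    r[t] = x[t] * dt + sig_s * np.sqrt(dt) * rng.standard_normal()

# scalar Kalman recursion: x_t = a x_{t-1} + b + noise, r_t = h x_t + noise
a, b = 1 - lam * dt, lam * xbar * dt
q, h, m = sig_x**2 * dt, dt, sig_s**2 * dt
xhat, p, est = 0.0, 1.0, np.zeros(T)
for t in range(1, T):
    xpred, ppred = a * xhat + b, a * a * p + q     # predict
    k = ppred * h / (h * h * ppred + m)            # Kalman gain
    xhat = xpred + k * (r[t] - h * xpred)          # update with return
    p = (1 - k * h) * ppred
    est[t] = xhat
print("RMSE of filtered drift factor:", np.sqrt(np.mean((est - x) ** 2)))
```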
The paper offers a new perspective on optimal portfolio choice by investigating how and to what extent knowledge of an investor's desirable initial investment choice can be used to determine his future optimal portfolio allocations. Optimality of investment decisions is built on the so-called forward investment performance criteria and, in particular, on the time-monotone ones. It is shown that for this class of forward criteria the desired initial allocations completely characterize the future optimal investment strategies. The analysis uses the connection between a nonlinear equation, satisfied by the local risk tolerance, and the backward heat equation. Complete solutions are provided as well as various examples.
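For orientation, the connection alluded to can be written as follows, in the notation used in related work on time-monotone forward criteria (the paper's exact formulation may differ): the local risk tolerance solves a fast-diffusion-type equation that is linearized through a solution of the backward heat equation.

```latex
% Hedged sketch of the stated connection; notation follows related
% work on time-monotone forward criteria, not necessarily this paper.
\[
  r_t + \tfrac{1}{2}\, r^{2}\, r_{xx} = 0,
  \qquad r(x,t) = \text{local risk tolerance},
\]
\[
  r\bigl(h(z,t),\,t\bigr) = h_z(z,t),
  \qquad \text{where } h \text{ solves } h_t + \tfrac{1}{2}\, h_{zz} = 0 .
\]
```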
This paper introduces a new functional optimization approach to portfolio optimization problems by treating the unknown weight vector as a function of past values, instead of treating it as a vector of fixed unknown coefficients, as in the majority of studies. We first show that the optimal solution, in general, is not a constant function. We give the optimal conditions for a vector function to be the solution, and hence give the conditions for a plug-in solution (replacing the unknown mean and variance by certain estimates based on past values) to be optimal. After showing that plug-in solutions are sub-optimal in general, we propose gradient-ascent algorithms to solve the functional optimization for mean–variance portfolio management, with convergence theorems provided. Simulations and empirical studies show that our approach can perform significantly better than the plug-in approach.
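The following sketch illustrates the functional idea in miniature: portfolio weights are a (here, softmax-linear) function of trailing mean returns, fitted by numerical gradient ascent on a sample mean–variance objective. It is a generic illustration under hypothetical data and parameters, not the paper's algorithm or convergence setup.

```python
# Minimal sketch of the functional idea: weights as a function of past
# returns, fitted by gradient ascent on a sample mean-variance
# objective. Generic illustration; all parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
T, n = 600, 3
rets = 0.0004 + 0.01 * rng.standard_normal((T, n))
look = 20
feats = np.array([rets[t - look:t].mean(axis=0) for t in range(look, T)])
fwd = rets[look:]                       # next-period returns

gamma = 5.0
def objective(A):
    w = np.exp(feats @ A)               # positive weights...
    w /= w.sum(axis=1, keepdims=True)   # ...normalized each period
    rp = (w * fwd).sum(axis=1)
    return rp.mean() - 0.5 * gamma * rp.var()

A = np.zeros((n, n))
eps, lr = 1e-5, 50.0
for step in range(200):                 # numerical gradient ascent
    grad = np.zeros_like(A)
    for i in range(n):
        for j in range(n):
            Ap = A.copy(); Ap[i, j] += eps
            grad[i, j] = (objective(Ap) - objective(A)) / eps
    A += lr * grad
print("objective after fitting:", objective(A))
```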
We apply impulse control techniques to a cash management problem within a mean-variance framework. We consider the strategy of an investor who is trying to minimise both fixed and proportional transaction costs, whilst minimising the tracking error with respect to an index portfolio. The cash weight is constantly fluctuating due to the stochastic inflow and outflow of dividends and liabilities. We show the existence of an optimal strategy and compute it numerically.
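Optimal strategies in impulse-control problems of this kind typically take a band form: intervene only when the state leaves a no-trade region. The sketch below simulates such a hypothetical band rule for the cash weight; the bands, costs, and noise level are illustrative, not the paper's computed optimum.

```python
# Sketch of a band-type impulse rule for the cash weight, of the kind
# that typically solves such impulse-control problems. The bands and
# costs below are hypothetical, not the paper's computed solution.
import numpy as np

rng = np.random.default_rng(4)
target, lo, hi = 0.05, 0.01, 0.09    # rebalance to target outside [lo, hi]
fixed, prop = 0.0002, 0.001          # fixed and proportional costs
sigma, T = 0.004, 2520               # cash-flow noise, ten years daily

w, costs, tracking = target, 0.0, 0.0
for _ in range(T):
    w += sigma * rng.standard_normal()        # dividend/liability flows
    tracking += (w - target) ** 2             # tracking error vs. index
    if w < lo or w > hi:                      # impulse: rebalance
        costs += fixed + prop * abs(w - target)
        w = target
print(f"total costs {costs:.4f}, mean sq. tracking {tracking / T:.6f}")
```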
As socio-economic systems become increasingly interdependent, a crucial question arises: is an interconnected world a safer or a more dangerous place to live? Over the last few years, we have witnessed the dark side of increasing interdependencies. As such, there is a growing need to focus on how to mitigate networked risk and to enhance system resilience to the impact of a large-scale shock. The traditional engineering approach has been to design systems that are less vulnerable to damage from hazard events. System resilience, by contrast, is the ability to recover from failure and provide continuity of system function. The goal of the present paper is to investigate the gain from risk sharing. We propose a mechanism of risk sharing that may enhance the resilience of networked systems. The proposed risk-sharing protocols are based on coordinated incentives for agents to survive collectively by absorbing external shocks. The key issue we analyze is how the gain from risk sharing depends on the capacity of each agent to absorb shocks and on the interconnection patterns among agents under the risk-sharing rules. We demonstrate that risk sharing is beneficial from a systems point of view when the agents' capacity to absorb shocks is high and detrimental when it is low. In particular, we evaluate the effectiveness of risk sharing in two domains. In the first domain, in which networked agents are subject to cascading failures, risk sharing is useful in mitigating systemic failure, especially if the agents are running at high load. In the second domain, we evaluate the proportion of safe agents who invest in risky portfolios or projects collectively. In this case, risk sharing is only beneficial if the agents' risk-absorbing capacity is high.
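The central trade-off can be reproduced in a few lines: the toy simulation below compares agents absorbing i.i.d. shocks alone against pooling them equally, for low and high absorption capacities. The network-free pooling rule and all parameters are hypothetical, not the paper's protocol, but the outcome matches the stated finding: sharing helps when capacity exceeds the mean shock and hurts when it does not.

```python
# Toy simulation of the risk-sharing question: agents absorb shocks
# alone versus pooling them equally. Generic sketch; the sharing rule
# and parameters are hypothetical, not the paper's protocol.
import numpy as np

rng = np.random.default_rng(5)
n_agents, n_trials = 100, 2000

def failure_rate(capacity, share):
    fails = 0
    for _ in range(n_trials):
        shocks = rng.exponential(1.0, n_agents)         # mean shock = 1
        if share:
            shocks = np.full(n_agents, shocks.mean())   # equal pooling
        fails += (shocks > capacity).sum()
    return fails / (n_trials * n_agents)

for cap in (0.5, 1.5, 3.0):       # capacity below / above the mean shock
    print(f"capacity {cap}: alone {failure_rate(cap, False):.3f}, "
          f"shared {failure_rate(cap, True):.3f}")
```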
In credit scoring, it is sometimes desirable to implement a new score in place of an existing score. Let Y denote the existing score and let X denote the new score. It is almost always the case that the score ranges for X and Y are different. For example, the score range for X might be 0 to 100, while the score range for Y might be 300 to 600. It follows from this difference in score ranges that a major difficulty in implementing any new score will be in training those who use the existing score to use the new score.
Score calibration is the process of constructing a calibration function g(·) in such a way that the calibrated score, g(X), is a score which mimics the behavior of Y. In applications, it is commonly found that problems in the distribution of g(X) arise as a result of the calibration process. There is a great need in production for an algorithm which automates the task of discovering these problems and correcting them. In this paper, we propose such a fully automatic algorithm. This algorithm has been successfully tested in many applications.
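One common way to construct such a calibration function, sketched below under simulated scores, is a monotone quantile mapping so that g(X) matches the distribution of Y. This is a generic illustration; the paper's automatic problem-detection and correction algorithm is not reproduced here.

```python
# Minimal sketch of one common calibration construction: a monotone
# quantile mapping so that g(X) matches Y's distribution. Generic
# illustration, not the paper's automatic correction algorithm.
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(50, 15, 10_000).clip(0, 100)      # new score, range 0-100
y = rng.normal(450, 60, 10_000).clip(300, 600)   # existing score, 300-600

qs = np.linspace(0, 1, 101)
gx_knots = np.quantile(x, qs)                    # calibration function g
gy_knots = np.quantile(y, qs)                    # as a piecewise-linear map

def g(new_scores):
    return np.interp(new_scores, gx_knots, gy_knots)

print(g(np.array([20.0, 50.0, 80.0])))           # calibrated scores
```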
An algorithm is derived for the development of a portfolio decision-support information service. The algorithm automates evaluations for the definition and solution of portfolio problems. A small set of historical asset-return data and a limited set of assets are used for the portfolio, which is the typical case for a non-institutional portfolio manager. The algorithm applies analytical relations to decrease the computational workload for the estimation of the market parameters, exploiting the limited number of assets. The subjective expert views in the Black–Litterman (BL) model are defined from additional assessment of the historical asset-return data. The algorithm compares the results for active portfolio management from the mean-variance (MV) model, the BL model, and the equal-weighted investment strategy. The benefits of the algorithm, namely the use of a small set of historical data and a limited number of assets, are demonstrated over a rolling investment horizon.
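The core Black–Litterman computation such an algorithm rests on is the posterior blending of equilibrium returns with expert views; a minimal sketch under hypothetical inputs (covariance, prior returns, a single view, and `tau`) follows.

```python
# Sketch of the Black-Litterman posterior-return computation at the
# core of such an algorithm. The equilibrium returns, the view, and
# all parameters below are hypothetical.
import numpy as np

sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.03, 0.01],
                  [0.00, 0.01, 0.05]])       # asset covariance
pi = np.array([0.05, 0.04, 0.06])            # equilibrium (prior) returns
tau = 0.05

# one expert view, e.g. derived from historical data: asset 1 will
# outperform asset 2 by 2%, with confidence expressed via omega
P = np.array([[1.0, -1.0, 0.0]])
q = np.array([0.02])
omega = np.array([[0.001]])

ts_inv = np.linalg.inv(tau * sigma)
post_prec = ts_inv + P.T @ np.linalg.inv(omega) @ P
mu_bl = np.linalg.solve(post_prec, ts_inv @ pi + P.T @ np.linalg.inv(omega) @ q)
print("posterior expected returns:", mu_bl)

# unconstrained mean-variance weights from the posterior
risk_aversion = 3.0
w = np.linalg.solve(risk_aversion * sigma, mu_bl)
print("MV weights (unnormalized):", w)
```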
This study analyzes the relationship between Project Portfolio Management (PPM) and Information Technology (IT) strategic alignment in conjunction with the intensity of technology usage. The methodological approach was survey-based research, using structural equation modeling on a field study involving 158 responses. The results indicate that PPM has a positive relationship with IT strategic alignment and that the intensity of technology usage in the organization does not affect this relationship. This original practical and theoretical finding adds to PPM studies an important perspective to be considered by organizations and in future research.
Most research on the management of innovation portfolios has focused on new product portfolios, whereas the management of new service portfolios has not been researched correspondingly. This paper addresses this literature gap by exploring portfolio management of New Service Development (NSD) activities empirically. The paper applies a qualitative research design in which data were collected in 52 in-depth interviews with managers and employees involved with NSD. The study finds that the portfolio management activities and processes were carried out in parallel with the NSD process, and that the most important stakeholders in the NSD portfolio management organization were top managers not involved in the daily NSD operations. Findings reveal that the firms used a great variety of criteria when making portfolio decisions. However, contrary to prescriptions based on new product development research, the decision process observed for NSD was assisted only to a limited degree by explicit portfolio management tools. We explicate our findings in five propositions.
This paper investigates issues of asset allocation and equity trading risk in the Gulf Cooperation Council (GCC) stock markets. The intent of this work is to bridge the gap in current asset market liquidity risk management methodologies and to assist GCC financial institutions in developing proactive asset market liquidity risk management techniques to assess potential market risks in light of the fallout of the recent financial crisis. Using daily data of the main market indicators for the period 2004–2009 and the Liquidity-Adjusted Value at Risk (L-VaR) model, the author finds that the distribution of equity returns in the GCC stock markets is far from normal, which justifies using the L-VaR model, combined with other methods such as stress testing, to incorporate the remaining risks. Furthermore, the author shows that although there is a clear departure from normality, asset market liquidity risk can be estimated without the need for complex mathematical and analytical procedures. To this end, several financial modeling strategies are pursued with the objective of creating a realistic framework for equity trading risk measurement, in addition to the introduction of a practical iterative optimization technique for the calculation of maximum authorized L-VaR limits, subject to meaningful real-world operational constraints. The modeling technique and empirical analysis have important implications for the GCC financial markets and can aid local financial institutions in developing advanced internal risk models and in complying with the requirements of the Basel II committee on capital adequacy.
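One standard liquidity adjustment of this general kind assumes a position is unwound linearly over t days, so the P&L variance scales by sum_{k=1..t} (k/t)^2 per-day variance rather than by t; the sketch below applies it to a hypothetical position. The paper's L-VaR model may differ in detail.

```python
# Sketch of a liquidity-adjusted VaR of the general kind used in such
# models: a position unwound linearly over t days has P&L variance
# sum_{k=1..t} (k/t)^2 per-day variance, giving the adjustment below.
# All inputs are hypothetical; the paper's model may differ in detail.
import numpy as np
from scipy.stats import norm

position = 10_000_000        # position value
daily_vol = 0.015            # daily return volatility
alpha = 0.99
t = 5                        # liquidation horizon in days

var_1d = position * daily_vol * norm.ppf(alpha)
adj = np.sqrt((t + 1) * (2 * t + 1) / (6 * t))   # vs sqrt(t) for instant unwind
lvar = var_1d * adj
print(f"1-day VaR: {var_1d:,.0f}  L-VaR over {t} days: {lvar:,.0f}")
```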
The aim of this paper is to develop an optimization technique for the assessment of downside-risk limits and investable financial portfolios under crisis-driven outlooks, subject to meaningful financial and operational constraints. The simulation and testing methods are based on the concept of liquidity-adjusted value-at-risk (LVaR), along with the development of an optimization risk algorithm using matrix-algebra techniques. To demonstrate the effectiveness of the LVaR and stress-testing techniques, real-world quantitative analyses of structured equity portfolios are presented for the Gulf Cooperation Council (GCC) financial markets. To this end, several structured simulation studies are conducted with the goal of establishing a realistic financial modeling algorithm for the calculation of downside-risk parameters and to empirically assess portfolio managers' optimal and investable portfolios. The developed methodology and risk valuation algorithms can aid in advancing risk assessment and portfolio management practices in emerging markets, particularly in the wake of the most recent credit crunch and the subsequent financial turmoil.
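A minimal sketch of the kind of constrained optimization described follows, under hypothetical inputs: minimize a liquidity-adjusted portfolio VaR (here via simple per-asset liquidity multipliers, an assumption) subject to a budget, a return target, and box constraints.

```python
# Sketch of a constrained downside-risk optimization of the type
# described: minimize a liquidity-adjusted portfolio VaR subject to a
# budget, a return target, and box constraints. All inputs, including
# the per-asset liquidity multipliers, are hypothetical.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.10, 0.12])                # expected returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.06, 0.02],
                [0.00, 0.02, 0.09]])             # annual covariance
liq_adj = np.array([1.05, 1.20, 1.40])           # liquidity multipliers
z = 2.326                                        # 99% normal quantile

def lvar(w):
    wl = w * liq_adj                             # liquidity-scaled exposures
    return z * np.sqrt(wl @ cov @ wl)

cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: w @ mu - 0.09}]   # return target
res = minimize(lvar, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
               constraints=cons, method="SLSQP")
print("weights:", res.x.round(3), " L-VaR:", lvar(res.x).round(4))
```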
This study investigates whether cryptocurrencies can be considered a viable addition to pension funds. Using the regulatory setting of Switzerland, it is assessed whether adding crypto-components to a standard pension fund portfolio has positive effects on the fund’s risk and return figures. The empirical data supports the notion that cryptocurrency components may well increase the yield of a pension fund portfolio, yet this enhancement of yield comes at slightly higher risk levels. This increase in risk can be mitigated by adding an actively managed crypto-component to the portfolio rather than a passive investment product. The paper contributes to the ongoing debate in the area of financial innovations on the purpose and solidity of cryptocurrencies as an asset class.
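The portfolio arithmetic being tested is easy to sketch: blend a small crypto sleeve into a base fund and compare annualized return, volatility, and Sharpe ratio. The figures below are simulated and hypothetical, not the study's Swiss pension data.

```python
# Back-of-the-envelope sketch of the portfolio effect being tested:
# blending a small crypto sleeve into a standard pension portfolio.
# Simulated, hypothetical figures; not the study's Swiss data.
import numpy as np

rng = np.random.default_rng(7)
T = 252 * 5
base = 0.0002 + 0.006 * rng.standard_normal(T)      # pension portfolio
crypto = 0.0015 + 0.04 * rng.standard_normal(T)     # crypto component

def stats(r, label):
    ann_ret, ann_vol = r.mean() * 252, r.std() * np.sqrt(252)
    print(f"{label}: return {ann_ret:.2%}, vol {ann_vol:.2%}, "
          f"Sharpe {ann_ret / ann_vol:.2f}")

stats(base, "base fund ")
for w in (0.01, 0.03, 0.05):
    stats((1 - w) * base + w * crypto, f"{w:.0%} crypto")
```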
Investment in Islamic portfolios and funds must be subject to the provisions of Islamic Sharia standards in all their activities. The main objective of Islamic investors and Islamic portfolio managers is not only to achieve the conventional investment goal but to achieve it in accordance with Sharia standards. In the absence of Islamic financial markets with Islamic financial instruments, Islamic investors and portfolio managers need to know the Islamic criteria for forming and managing different types of Islamic portfolios and funds. This study provides them with useful insights and a practical framework for the Sharia standards and Islamic contract terms, agreed upon by most scholars and Sharia boards, for investing in various types of financial instruments.
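By way of illustration only, screening instruments for Sharia compliance often involves financial-ratio filters; the sketch below uses example thresholds of the kind commonly cited (a maximum debt ratio and a maximum non-compliant income share). Actual standards vary across Sharia boards, and these numbers are not taken from this study.

```python
# Illustrative sketch of financial-ratio screening of the kind applied
# when selecting instruments for a Sharia-compliant portfolio. The
# thresholds are examples of commonly cited screens; actual standards
# vary across Sharia boards and are not taken from this study.
def passes_screens(debt, market_cap, noncompliant_income, total_income,
                   max_debt_ratio=0.33, max_impure_income=0.05):
    """Return True if the issuer passes both hypothetical screens."""
    return (debt / market_cap <= max_debt_ratio and
            noncompliant_income / total_income <= max_impure_income)

# hypothetical issuer: 25% leverage, 2% non-compliant income
print(passes_screens(debt=250, market_cap=1000,
                     noncompliant_income=2, total_income=100))
```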
This chapter extends the research literature on modern portfolio risk management techniques by presenting robust modeling algorithms for nonlinear dynamic asset allocation and management under extreme events of illiquidity and adverse market outlooks. This research study examines, from the portfolio managers' perspective, the performance of liquidity-adjusted risk modeling in obtaining optimum and coherent economic capital structures, subject to meaningful operational and financial constraints. Specifically, this chapter examines robust quantitative modeling methods for optimum economic capital allocation in a liquidity-adjusted value at risk (L-VaR) framework, particularly from the perspective of trading portfolios that have both long and short-sales trading positions. The empirical results for emerging Gulf Cooperation Council (GCC) financial markets strongly confirm the importance of enforcing financially and operationally meaningful nonlinear and dynamic constraints, when they are available, on the L-VaR optimization procedure. The implemented optimization techniques and risk assessment algorithms can aid in advancing risk management practices in emerging markets, particularly in the wake of the 2007–2009 financial turmoil. Furthermore, the proposed risk management technique and optimization algorithms can have important applications for financial technology (FinTech) and reinforcement machine learning in big data environments.
The objective of this chapter is to examine reinforcement machine learning quadratic optimization techniques for the computation of downside-risk limits and investable portfolios after the 2007–2009 global financial meltdown. The modeling techniques are based on the notion of Liquidity-adjusted Value-at-Risk (LVaR), together with the application of reinforcement machine learning optimization algorithms with meaningful financial and operational constraints. In this chapter, some simulation case studies are presented for the computation of downside-risk limits and investable portfolios. The applied risk valuation techniques and quadratic optimization algorithms can help in advancing reinforcement machine learning methods, risk computations, and portfolio management practices in the wake of the 2007–2009 global financial turmoil.
Earnings forecasting data has been a consistent, and highly statistically significant, source of excess returns. This chapter discusses CTEF, a composite model of earnings forecasts, revisions, and breadth; this model of forecasted earnings acceleration was developed in 1997 to identify mispriced stocks. Our most important result is that the forecasted earnings acceleration variable has produced statistically significant Active and Specific Returns in the post-Global Financial Crisis period. Simple earnings revisions and forecasted yields have not enhanced returns in the past 7–20 years, leading many financial observers to declare earnings research passé. We disagree! Moreover, earnings forecasting models complement fundamental data (earnings, book value, cash flow, sales, dividends, liquidity) and price momentum strategies in a composite model for stock selection. The composite model strategy's excess returns are greater in international stocks than in US stocks. The models reported in Guerard and Mark (2003) are highly statistically significant in their post-publication time period, including booms, recessions, and highly volatile market conditions.
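A composite score of this general form can be sketched as a weighted blend of standardized earnings-forecast signals; the weights and simulated inputs below are hypothetical, not the published CTEF specification.

```python
# Sketch of a composite earnings-forecast score of this general form:
# a weighted blend of forecasted earnings yield, revisions, and
# breadth, standardized cross-sectionally. Weights and inputs are
# hypothetical, not the published CTEF specification.
import numpy as np

rng = np.random.default_rng(8)
n = 500
fep = rng.standard_normal(n)       # forecasted earnings yield
erev = rng.standard_normal(n)      # one-month forecast revisions
ebreadth = rng.standard_normal(n)  # breadth: up- minus down-revisions

def zscore(v):
    return (v - v.mean()) / v.std()

weights = np.array([0.4, 0.3, 0.3])            # hypothetical blend
ctef_like = np.column_stack([zscore(fep), zscore(erev),
                             zscore(ebreadth)]) @ weights
top_decile = np.argsort(ctef_like)[-n // 10:]  # candidate buy list
print("score range:", ctef_like.min().round(2), ctef_like.max().round(2))
print("buy-list size:", len(top_decile))
```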
Artificial Intelligence (AI) is a highly evolved area of computer science that strives to create intelligent machines that can replicate certain human behavior without its irrationalities, for better predictability and consistency. Advanced AI that utilizes machine learning makes it possible for machines to learn from previous data (experience), adjust to new inputs (instructions), and perform tasks through updated algorithms. Through sophisticated algorithms, modern AI systems can be trained to accomplish specific tasks by processing large amounts of data and extracting insights and recognizable patterns in the data to act upon. As such, AI has become a hot topic, with much interest in its advantages for the highly regulated financial services industry.
Similarly, blockchain technology also has the potential to both enrich and improve financial processes and asset management systems, and progressive corporations have invested and devoted resources to utilize and incorporate blockchain into their businesses. The use of distributed ledgers or blockchains has been explored in areas such as compliance and securities settlement, and these technologies could also be used to improve efficiencies in asset management.
In this chapter, we provide a short discussion of AI and blockchain applications in asset management, examine the benefits and the resulting shift in processes, and consider the challenges that must be overcome for practical applications of AI and blockchain, as well as how to approach such innovations.
Over the last 20 years, investors have come to approach investment decision-making in an increasingly mechanical manner. Optimizers are filled up with historical return data and the "optimal" portfolio follows almost automatically. In this paper, we argue that such an approach can be extremely dangerous, especially when alternative investments such as hedge funds are involved. Proper hedge fund investing requires a much more elaborate approach to investment decision-making than currently in use by most investors. The available data on hedge funds should not be taken at face value, but should first be corrected for various types of biases and autocorrelation. Tools like mean–variance analysis and the Sharpe ratio that many investors have become accustomed to over the years are no longer appropriate when hedge funds are involved, as they concentrate on the good part while completely skipping over the bad part of the hedge fund story. Investors also have to find a way to factor in the long lock-up and advance notice periods, which make hedge fund investments highly illiquid. In addition, investors will have to give weight to the fact that without more insight into the way in which hedge funds generate their returns, it is very hard to say something sensible about hedge funds' future longer-run performance. The tools to accomplish this formally are not all there yet, meaning that more than ever investors will have to rely on common sense and doing their homework.
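One of the corrections mentioned, adjusting for autocorrelation in reported returns, is commonly done with a Geltner-type unsmoothing: r_true,t = (r_obs,t - a·r_obs,t-1) / (1 - a), with a the estimated first-order autocorrelation. The sketch below demonstrates it on simulated data and shows how smoothing understates volatility; it is an illustration, not the paper's full bias-correction procedure.

```python
# Sketch of one standard correction mentioned here: unsmoothing an
# autocorrelated hedge fund return series (Geltner-type adjustment).
# Data are simulated, not real fund returns.
import numpy as np

rng = np.random.default_rng(9)
true = 0.006 + 0.02 * rng.standard_normal(240)     # 20 years monthly
obs = np.empty_like(true)                          # smoothed reporting
obs[0] = true[0]
for t in range(1, 240):
    obs[t] = 0.4 * obs[t - 1] + 0.6 * true[t]

a = np.corrcoef(obs[1:], obs[:-1])[0, 1]           # estimated smoothing
unsmoothed = (obs[1:] - a * obs[:-1]) / (1 - a)    # Geltner adjustment
print(f"vol observed {obs.std():.4f}  unsmoothed {unsmoothed.std():.4f}"
      f"  true {true.std():.4f}")
```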