  • Article (No Access)

    Comparison of Design Models: A Systematic Mapping Study

    Context: Model comparison plays a central role in many software engineering activities. However, a comprehensive understanding of the state of the art is still required. Goal: This paper aims to classify and perform a thematic analysis of the current literature. Method: To this end, we followed well-established empirical guidelines to define and perform a systematic mapping study. Results: A notable share of the studies (14 out of 40) provide generic model comparison techniques rather than techniques specific to UML diagrams. Conclusion: Fine-grained techniques are still required to support the ever-present and complex model comparison tasks that arise during the evolution of design models.

  • Article (No Access)

    OPTIMAL HRF AND SMOOTHING PARAMETERS FOR FMRI TIME SERIES WITHIN AN AUTOREGRESSIVE MODELING FRAMEWORK

    The analysis of time series obtained by functional magnetic resonance imaging (FMRI) may be approached by fitting predictive parametric models, such as nearest-neighbor autoregressive models with exogenous input (NNARX). As part of the modeling procedure, it is possible to apply instantaneous linear transformations to the data. Spatial smoothing, a common preprocessing step, may be interpreted as such a transformation. The autoregressive parameters may be constrained so that they produce a response behavior corresponding to the canonical haemodynamic response function (HRF). We present an algorithm for estimating the parameters of the linear transformations and of the HRF within a rigorous maximum-likelihood framework. Using this approach, the optimal amount of spatial smoothing and the optimal HRF can be estimated simultaneously for a given FMRI data set. An example from a motor-task experiment is discussed. For this data set, weak but non-zero spatial smoothing is found to be optimal. Furthermore, it is demonstrated that activated regions can be estimated within the maximum-likelihood framework.
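    As a rough illustration of the idea, the sketch below jointly estimates an autoregressive coefficient, a stimulus gain, and a spatial-smoothing width by maximum likelihood. It is a deliberately simplified stand-in for the paper's NNARX algorithm: a plain ARX(1) model, synthetic data, and 1-D circulant Gaussian smoothing, with a log-determinant (Jacobian) term that keeps likelihoods comparable across smoothing widths. All names and settings are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter1d
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n_vox, n_t = 32, 200
        u = (np.arange(n_t) % 40 < 20).astype(float)        # block-design stimulus
        Y = 0.5 * u + rng.normal(0.0, 1.0, (n_vox, n_t))    # toy voxel time series

        def smooth_logdet(sigma_s):
            # log|det S| of the circulant smoothing operator, via its DFT eigenvalues
            impulse = np.zeros(n_vox); impulse[0] = 1.0
            kernel = gaussian_filter1d(impulse, sigma_s, mode="wrap")
            return np.sum(np.log(np.abs(np.fft.fft(kernel))))

        def neg_log_lik(params):
            a, b, log_sig_s, log_sig_e = params
            sig_s = np.exp(log_sig_s)
            Ys = gaussian_filter1d(Y, sig_s, axis=0, mode="wrap")  # spatial smoothing
            resid = Ys[:, 1:] - a * Ys[:, :-1] - b * u[1:]         # ARX(1) residuals
            var = np.exp(2.0 * log_sig_e)
            nll = 0.5 * np.sum(resid**2 / var + np.log(2.0 * np.pi * var))
            return nll - (n_t - 1) * smooth_logdet(sig_s)   # Jacobian of the transform

        res = minimize(neg_log_lik, x0=[0.3, 0.1, 0.0, 0.0], method="Nelder-Mead")
        a, b, sig_s = res.x[0], res.x[1], np.exp(res.x[2])
        print(f"AR coef {a:.2f}, stimulus gain {b:.2f}, optimal smoothing {sig_s:.2f}")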

  • Article (No Access)

    EEG-fMRI INTEGRATION: A CRITICAL REVIEW OF BIOPHYSICAL MODELING AND DATA ANALYSIS APPROACHES

    The diverse nature of cerebral activity, as measured using neuroimaging techniques, has long been recognised. Single-modality recordings are inherently limited in capturing this complex nature, and it has been argued that moving to a multimodal approach will allow neuroscientists to better understand its dynamics and structure. Integrating information from different techniques, such as electroencephalography (EEG) and the blood oxygen level dependent (BOLD) signal recorded with functional magnetic resonance imaging (fMRI), nevertheless represents an important methodological challenge. In this work, we review the approaches developed thus far for EEG/fMRI integration. This leads us to examine the conditions under which such integration can work or fail, and to identify the types of scientific questions one could (and could not) hope to answer with it.

  • Article (No Access)

    MODELING THE CHANGE OF BEACH PROFILE UNDER TSUNAMI WAVES: A COMPARISON OF SELECTED SEDIMENT TRANSPORT MODELS

    In contrast to the efforts made to develop hydrodynamic models for large-scale tsunami propagation and run-up, little has been done to develop, test, and validate the sediment transport models used to simulate tsunami-induced sediment movement. In this study, the performance of six widely-used sediment transport formulas is evaluated through case studies using the open source code XBeach, which is based on 2D depth-averaged nonlinear shallow water equations. Another open source code, Delft3D, is also used to assess the extent to which XBeach gives reliable results. The benchmarks include three laboratory experiments and one field observation from a post-tsunami survey conducted after the 2004 Indian Ocean tsunami. Our results show that most of the surveyed sediment transport formulas give good results for laboratory-scale problems, but for real-scale problems none of the six formulas produces results as good as those obtained under laboratory conditions. For laboratory-scale problems, both XBeach and Delft3D predict satisfactory results with properly-chosen model parameters. For real tsunamis, high suspended sediment concentrations may occur, and density stratification and hindered settling effects play an important role; therefore Delft3D, which accounts for both hindered settling and density stratification, may perform better than XBeach. The findings reported here will be useful for researchers and practitioners working on tsunami hazard mitigation.
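    For a sense of what such formulas look like, the sketch below implements one classic bedload relation, the Meyer-Peter and Mueller (1948) formula. It is shown purely as an illustration of the genre and is not necessarily one of the six formulas evaluated in the study.

        import numpy as np

        RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81  # water/sediment density (kg/m^3), gravity

        def mpm_bedload(tau_b, d50, theta_cr=0.047):
            """Volumetric bedload transport per unit width (m^2/s), Meyer-Peter & Mueller."""
            s = RHO_S / RHO_W
            theta = tau_b / ((RHO_S - RHO_W) * G * d50)  # Shields parameter
            excess = max(theta - theta_cr, 0.0)          # no transport below threshold
            q_star = 8.0 * excess**1.5                   # dimensionless transport rate
            return q_star * np.sqrt((s - 1.0) * G * d50**3)

        # Example: bed shear stress of 5 Pa acting on 0.2 mm sand
        print(f"q_b = {mpm_bedload(5.0, 2e-4):.2e} m^2/s")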

  • Article (Open Access)

    BEYOND 2020 — STRATEGIES AND COSTS FOR TRANSFORMING THE EUROPEAN ENERGY SYSTEM

    The Energy Modeling Forum 28 (EMF28) study systematically explores the energy system transition required to meet the European goal of reducing greenhouse gas (GHG) emissions by 80% by 2050. The 80% scenario is compared to a reference case that aims to achieve a 40% GHG reduction target. The paper investigates mitigation strategies beyond 2020 and the interplay between different decarbonization options. The models present different technology pathways for the decarbonization of Europe, but a common finding across the scenarios and models is the prominent role of energy efficiency and renewable energy sources. In particular, wind power and bioenergy increase considerably beyond current deployment levels. Up to 2030, the transformation strategies are similar across all models and for both levels of emission reduction. However, mitigation becomes more challenging after 2040. With some exceptions, our analysis agrees with the main findings of the "Energy Roadmap 2050" presented by the European Commission.

  • Article (Open Access)

    OVERVIEW OF THE EMF 32 STUDY ON U.S. CARBON TAX SCENARIOS

    The Energy Modeling Forum (EMF) 32 study on carbon tax scenarios analyzed a set of illustrative policies in the United States that place an economy-wide tax on fossil-fuel-related carbon dioxide (CO2) emissions, a carbon tax for short. Eleven modeling teams ran these stylized scenarios, which vary by the initial carbon tax rate, the rate at which the tax escalates over time, and the use of the revenues. Modelers reported the effects of the policies on emissions, economic activity, and outcomes within the U.S. energy system, relative to a reference scenario without a carbon tax. This paper explains the scenario design, presents an overview of the results, and compares outcomes across the participating models, including emissions, revenue, gross domestic product, sectoral impacts, and welfare.
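    The scenario structure is simple to reproduce: each scenario is defined by an initial tax rate and a constant escalation rate. The sketch below computes such stylized trajectories; the specific rates and years are illustrative choices, not necessarily the study's exact design.

        def carbon_tax_path(initial_rate, growth, start_year=2020, end_year=2050):
            """Carbon tax ($/ton CO2) by year, escalating at a constant real rate."""
            return {year: initial_rate * (1.0 + growth) ** (year - start_year)
                    for year in range(start_year, end_year + 1)}

        for rate, growth in [(25, 0.01), (25, 0.05), (50, 0.01), (50, 0.05)]:
            path = carbon_tax_path(rate, growth)
            print(f"${rate} at {growth:.0%}/yr -> ${path[2050]:.0f}/ton in 2050")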

  • Article (Open Access)

    POLICY INSIGHTS FROM THE EMF 32 STUDY ON U.S. CARBON TAX SCENARIOS

    The Stanford Energy Modeling Forum exercise 32 (EMF 32) used 11 different models to assess emissions, energy, and economic outcomes from a plausible range of economy-wide carbon price policies to reduce carbon dioxide (CO2) emissions in the United States. Here we discuss the most policy-relevant results of the study, mindful of the strengths and weaknesses of current models. Across all models, carbon prices lead to significant reductions in CO2 emissions and conventional pollutants, with the vast majority of the reductions occurring in the electricity sector. Importantly, emissions reductions do not significantly depend on the rebate or tax cut used to return revenues to the economy. Expected economic costs, as measured by either GDP or welfare, are modest but vary across models. These costs are offset by benefits from avoided climate damages and health benefits from reductions in conventional air pollution. Using revenues to reduce preexisting capital or labor taxes reduces costs in most models relative to lump-sum rebates, but the size of the cost reductions varies significantly. Devoting at least some revenue to household rebates can significantly reduce adverse impacts on low-income households. Carbon prices at $25/ton or even lower cause significant shifts away from coal as an energy source, with the responses of other energy sources highly dependent upon technology cost assumptions. Beyond 2030, we conclude that model uncertainties are too large for quantitative results to be useful for near-term policy design. We close with recommendations for policymakers on interacting with model results in the future.

  • Article (Open Access)

    DISTRIBUTIONAL IMPLICATIONS OF A NATIONAL CO2 TAX IN THE U.S. ACROSS INCOME CLASSES AND REGIONS: A MULTI-MODEL OVERVIEW

    This paper presents a multi-model assessment of the distributional impacts of carbon pricing. A set of harmonized representative CO2 taxes and tax revenue recycling schemes is implemented in five large-scale economy-wide general equilibrium models. Recycling schemes include various combinations of uniform transfers to households and labor and capital income tax reductions. Particular focus is placed on equity — the distribution of impacts across household incomes — and on efficiency, evaluated in terms of household welfare. Despite important differences in the assumptions underlying the models, we find general agreement on the ranking of recycling schemes in terms of both efficiency and equity. All models identify a clear trade-off between efficient but regressive capital tax reductions and progressive but costly uniform transfers to households; all agree on the inferiority of labor tax reductions in terms of welfare efficiency; and all agree that different combinations of capital tax reductions and household transfers can be used to balance efficiency and distributional concerns. A subset of the models goes further, finding that equity concerns, particularly regarding the impact of the tax on low-income households, can be alleviated without sacrificing much of the double-dividend benefit offered by capital tax rebates. There is, however, less agreement regarding the progressivity of CO2 taxation net of revenue recycling. Regionally, the models agree that abatement and welfare impacts will vary considerably across regions of the U.S. and generally agree on their broad geographical distribution. There is, however, little agreement on which regions would profit most from the various recycling schemes.
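    A toy calculation makes the equity trade-off concrete: the same tax revenue is returned either as a uniform per-household transfer or in proportion to capital income. The numbers below are invented for illustration and abstract away the general-equilibrium efficiency effects the models capture.

        # Three stylized households; carbon-tax burden assumed proportional to income
        incomes = [20_000, 50_000, 120_000]
        capital = [500, 5_000, 60_000]            # capital income rises with income
        burden = [0.025 * inc for inc in incomes]
        revenue = sum(burden)

        uniform = revenue / len(incomes)          # equal per-household transfer
        cap_total = sum(capital)
        for inc, cap, b in zip(incomes, capital, burden):
            net_uniform = b - uniform                      # net cost, progressive
            net_capital = b - revenue * cap / cap_total    # net cost, regressive
            print(f"income {inc:>7}: uniform transfer {100 * net_uniform / inc:+.2f}% "
                  f"of income, capital rebate {100 * net_capital / inc:+.2f}%")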

  • Article (No Access)

    Analyzing Future Water Scarcity in Computable General Equilibrium Models

    Incorporating water into a computable general equilibrium (CGE) model operating at global scale can be extremely demanding, owing to the absence of standardized data, the sheer dimensions created by intersecting river basins with countries, and the difficulty of modeling the demand for and supply of water. This has led many authors to introduce water into their CGE modeling frameworks in different ways and at different spatial and sectoral aggregation levels. Of course, simplifying the market for water and sacrificing geographical realism risks introducing errors caused by inappropriate aggregation. In this paper, we use an elaborate global CGE model to investigate the three most commonly practiced simplifications: (1) tackling global questions in a national-level model; (2) collapsing irrigated and rainfed crop production into a single sector; and (3) removing river basin boundaries within a country. In each case, we compare their performance in predicting the impacts of future irrigation scarcity on international trade, crop output, land use change, and welfare, relative to the full-scale model. As might be expected, the single-region model does a good job of matching outcomes for that region, although changes in bilateral trade can entail significant errors. When it comes to eliminating sub-national river basins and irrigation location, we find that, if the research question concerns changes in national-scale trade, production, and welfare, it may be sufficient to ignore sub-national hydrological boundaries in global economic analyses of water scarcity. However, when decision makers are interested in the distribution of inputs and outputs within a region, preserving river basin and sectoral detail in the model brings considerable added value to the analysis.

  • Chapter (No Access)

    HIDDEN MARKOV EXPERTS

    Most approaches to forecasting merely try to predict the next value of a time series. In contrast, this paper presents a framework for predicting the full probability distribution. It is expressed as a mixture model: the dynamics within each state are modeled with so-called "experts" (potentially non-linear neural networks), and the dynamics between states are modeled using a hidden Markov approach. The full density predictions are obtained by a weighted superposition of the individual densities of each expert. This model class is called "hidden Markov experts".
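    The sketch below illustrates the mechanics under simplifying assumptions: two Gaussian experts stand in for the neural networks, the transition matrix is fixed rather than learned, and the state weights come from standard forward filtering. It is an illustrative reconstruction, not the paper's implementation.

        import numpy as np
        from scipy.stats import norm

        A = np.array([[0.98, 0.02],   # state-transition matrix (calm vs. volatile)
                      [0.05, 0.95]])
        mu = np.array([0.0005, -0.001])   # each expert's mean return
        sigma = np.array([0.005, 0.02])   # each expert's volatility

        def predictive_density(returns, y_grid):
            """One-step-ahead density p(y_{t+1} | y_1..y_t) on a grid of values."""
            belief = np.array([0.5, 0.5])          # initial state probabilities
            for y in returns:                      # forward (filtering) recursion
                belief = (belief @ A) * norm.pdf(y, mu, sigma)
                belief /= belief.sum()
            w = belief @ A                         # predicted state weights
            # weighted superposition of the experts' densities
            return sum(w[k] * norm.pdf(y_grid, mu[k], sigma[k]) for k in range(2))

        rng = np.random.default_rng(1)
        toy_returns = rng.normal(0.0, 0.01, 250)   # stand-in for S&P500 returns
        grid = np.linspace(-0.05, 0.05, 201)
        dens = predictive_density(toy_returns, grid)
        print(f"density integrates to {dens.sum() * (grid[1] - grid[0]):.3f}")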

    Results are presented for daily S&P500 data. While the predictive accuracy of the mean does not improve over simpler models, evaluating the prediction of the full density shows a clear out-of-sample improvement both over a simple GARCH(1,1) model (which assumes Gaussian distributed returns) and over a "gated experts" model (which expresses the weighting for each state non-recursively as a function of external inputs). Several interpretations are given: the blending of supervised and unsupervised learning, the discovery of hidden states, the combination of forecasts, the specialization of experts, the removal of outliers, and the persistence of volatility.