The aim of long-term mine planning (LTMP) is two-fold: to maximize the net present value (NPV) of profits and to determine how ores are sequentially processed over the life of the mine. This scheduling task is computationally complex as it is rife with variables, constraints, periods, uncertainties, and unique operations. In this paper, we survey trends in the literature over the past decade. One trend is the shift from deterministic toward stochastic formulations, which better reflect real-world complexities. A complexity of growing concern is sustainable mine planning. Another trend is the shift from traditional operational research solutions, which rely on exact or (meta)heuristic methods, toward hybrid methods. These methods are compared in terms of the scope of the problem formulation and discussed with respect to solution quality, efficiency, and remaining gaps. We conclude with opportunities to incorporate artificial intelligence (AI)-based methods, motivated by their paucity in the literature, the need to handle multiple operational uncertainties simultaneously, the quantification of sustainability indicators, and the need for benchmark instances.
We study a novel representation of semiautomata, which is motivated by the method of trace-assertion specifications of software modules. Each state of the semiautomaton is represented by an arbitrary word leading to that state, the canonical word. The transitions of the semiautomaton give rise to a right congruence, the state-equivalence, on the set of input words of the semiautomaton: two words are state-equivalent if and only if they lead to the same state. We present a simple algorithm for finding a set of generators for state-equivalence. Directly from this set of generators, we construct a confluent prefix-rewriting system which permits us to transform any word to its canonical representative. In general, the rewriting system may allow infinite derivations. To address this issue, we impose the condition of prefix-continuity on the set of canonical words. A set is prefix-continuous if, whenever a word w and a prefix u of w are in the set, then all the prefixes of w longer than u are also in the set. Prefix-continuous sets include prefix-free and prefix-closed sets as special cases. We prove that the rewriting system is Noetherian if and only if the set of canonical words is prefix-continuous. Furthermore, if the set of canonical words is prefix-continuous, then the set of rewriting rules is irredundant. We show that each prefix-continuous canonical set corresponds to a spanning forest of the semiautomaton.
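The prefix-continuity condition can be checked directly from the definition above; the following is a minimal sketch in Python (the example word sets are hypothetical and not drawn from the paper).

```python
def is_prefix_continuous(words):
    """Check prefix-continuity: whenever a word w and a prefix u of w are both
    in the set, every prefix of w longer than u must also be in the set."""
    word_set = set(words)
    for w in word_set:
        # lengths of prefixes of w (including "" and w itself) that are present
        present = [k for k in range(len(w) + 1) if w[:k] in word_set]
        for shorter, longer in zip(present, present[1:]):
            # a gap between two present prefixes violates the condition
            if longer - shorter > 1:
                return False
    return True

# Hypothetical examples over the alphabet {a, b}:
print(is_prefix_continuous({"", "a", "ab", "abb"}))  # True: prefix-closed sets qualify
print(is_prefix_continuous({"ab", "abb", "abba"}))   # True: a contiguous chain of prefixes
print(is_prefix_continuous({"a", "abb"}))            # False: "ab" is missing between them
```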
In the paper [1], Theorem 23 and Example 24 are incorrect. The corrected version is presented in this note.
Two mobile agents, starting at arbitrary, possibly different times from arbitrary nodes of an unknown network, have to meet at some node. Agents move in synchronous rounds: in each round an agent can either stay at the current node or move to one of its neighbors. Agents have different labels which are positive integers. Each agent knows its own label, but not the label of the other agent. In traditional formulations of the rendezvous problem, meeting is accomplished when the agents get to the same node in the same round. We want to achieve a more demanding goal, called rendezvous with detection: agents must become aware that the meeting is accomplished, simultaneously declare this and stop. This awareness depends on how an agent can communicate to the other agent its presence at a node. We use two variations of the arguably weakest model of communication, called the beeping model, introduced in [8]. In each round an agent can either listen or beep. In the local beeping model, an agent hears a beep in a round if it listens in this round and if the other agent is at the same node and beeps. In the global beeping model, an agent hears a loud beep in a round if it listens in this round and if the other agent is at the same node and beeps, and it hears a soft beep in a round if it listens in this round and if the other agent is at some other node and beeps.
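As a minimal sketch of the two communication variants just described, the following Python fragment encodes what a single agent hears in one synchronous round; the node names and two-agent setup are illustrative assumptions, not taken from the paper.

```python
def hears_local(my_action, my_node, other_action, other_node):
    """Local beeping model: a beep is heard only if the listener and the
    beeping agent occupy the same node in this round."""
    if my_action == "listen" and other_action == "beep" and my_node == other_node:
        return "beep"
    return None

def hears_global(my_action, my_node, other_action, other_node):
    """Global beeping model: a loud beep when colocated, a soft beep when the
    other agent beeps from some other node."""
    if my_action != "listen" or other_action != "beep":
        return None
    return "loud beep" if my_node == other_node else "soft beep"

# Example round: this agent listens at node u while the other agent beeps at node v.
print(hears_local("listen", "u", "beep", "v"))   # None: not colocated
print(hears_global("listen", "u", "beep", "v"))  # "soft beep"
print(hears_global("listen", "u", "beep", "u"))  # "loud beep"
```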
We first present a deterministic algorithm of rendezvous with detection working, even for the local beeping model, in an arbitrary unknown network in time polynomial in the size of the network and in the length of the smaller label (i.e., in the logarithm of this label). However, in this algorithm, agents spend a lot of energy: the number of moves that an agent must make is proportional to the time of rendezvous. It is thus natural to ask whether bounded-energy agents, i.e., agents that can make at most c moves, for some integer c, can always achieve rendezvous with detection as well. This is impossible for some networks of unbounded size. Hence we rephrase the question: can bounded-energy agents always achieve rendezvous with detection in bounded-size networks? We prove that the answer to this question is positive, even in the local beeping model, but, perhaps surprisingly, this ability comes at a steep price in time: the meeting time of bounded-energy agents is exponentially larger than that of unrestricted agents. By contrast, we show an algorithm for rendezvous with detection in the global beeping model that works for bounded-energy agents (in bounded-size networks) as fast as for unrestricted agents.
A philosophically consistent axiomatic approach to classical and quantum mechanics is given. The approach realizes a strong formal implementation of Bohr's correspondence principle. In all instances, classical and quantum concepts are fully parallel: the same general theory has a classical realization and a quantum realization. Extending the "probability via expectation" approach of Whittle to noncommuting quantities, this paper defines quantities, ensembles, and experiments as mathematical concepts and shows how to model complementarity, uncertainty, probability, nonlocality and dynamics in these terms. The approach carries no connotation of unlimited repeatability; hence it can be applied to unique systems such as the universe. Consistent experiments provide an elegant solution to the reality problem, confirming the insistence of the orthodox Copenhagen interpretation that there is nothing but ensembles, while avoiding its elusive reality picture. The weak law of large numbers explains the emergence of classical properties for macroscopic systems.
Learning causal relations from observational data is a fundamental task in knowledge discovery. Recently, an Information Geometric Causal Inference (IGCI) framework was proposed, which represents causal relations by deterministic functions, defines causal mechanisms by a cause-mapping independence postulate and learns causal directions via an information-geometric formulation. Observing IGCI's limitation in representing general causal relations and its ambiguity in inferring causal directions, we generalize IGCI's original postulate and propose a new dependence causal inference (DCI) method in which linear correlation and a new cross likelihood (CL) measure are introduced. We prove that the CL dependence measure incorporates IGCI as a special case in a certain sense. Experimental results on synthetic and real-world data verify the effectiveness of our generalization and methods, and show that our method is more robust to disturbances on real data sets.
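For context, the standard slope-based IGCI estimator (with a uniform reference measure) can be sketched as follows; this is the baseline IGCI rule, not the authors' DCI/CL method, and the synthetic data are purely illustrative.

```python
import numpy as np

def igci_slope(x, y):
    """Slope-based IGCI score for the direction x -> y: the average log-derivative
    of the empirical map from x to y, computed on data sorted by x.
    A smaller score favours that causal direction."""
    order = np.argsort(x)
    xs, ys = np.asarray(x, float)[order], np.asarray(y, float)[order]
    dx, dy = np.diff(xs), np.diff(ys)
    mask = (dx != 0) & (dy != 0)          # skip ties to avoid division by zero / log(0)
    return np.mean(np.log(np.abs(dy[mask] / dx[mask])))

def infer_direction(x, y):
    # Normalise both variables to [0, 1], as in the uniform-reference-measure variant.
    def unit(v):
        v = np.asarray(v, float)
        return (v - v.min()) / (v.max() - v.min())
    x, y = unit(x), unit(y)
    return "x -> y" if igci_slope(x, y) < igci_slope(y, x) else "y -> x"

# Synthetic example: y is a deterministic nonlinear function of x.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 500)
y = x ** 3
print(infer_direction(x, y))   # expected: "x -> y"
```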
We analyzed pupil size versus time from six subjects using pupillography and nonlinear techniques. The correlation dimensions ranged from 4.08 to 5.7. The Hurst exponents ranged from 0.132 to 0.546. All data sets contained at least one positive Lyapunov exponent. The use of surrogate data yielded statistically significant differences for the correlation dimension. Phase space analysis yields a definite flow, and in subject two, period-doubling is evident. The accumulated evidence supports the notion that the dynamics of pupil size are governed by deterministic chaos rather than a stochastic or linear process. This implies that one might distinguish between healthy and diseased states using pupillography and that the dynamics can be mechanistically modeled.
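As an illustration of one of the nonlinear measures mentioned above, a rescaled-range (R/S) estimate of the Hurst exponent can be sketched as follows; the windowing scheme and test series are illustrative assumptions, not the study's pupillographic data.

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    regress log(R/S) on log(window size) over dyadic window sizes."""
    series = np.asarray(series, float)
    n = len(series)
    sizes, rs_values = [], []
    size = min_window
    while size <= n // 2:
        rs_per_window = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()               # range of cumulative deviations
            s = chunk.std()                         # standard deviation of the window
            if s > 0:
                rs_per_window.append(r / s)
        if rs_per_window:
            sizes.append(size)
            rs_values.append(np.mean(rs_per_window))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

# Hypothetical check: white noise should give a Hurst exponent near 0.5.
rng = np.random.default_rng(1)
print(round(hurst_rs(rng.standard_normal(4096)), 2))
```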
Labelling line drawings is a useful technique in several applications, including that of automated interpretation of line drawings as polyhedral solid objects. Algorithms exist which can in practice label line drawings of trihedral polyhedra correctly in low-order polynomial time. However, the restriction to trihedral polyhedra is too limiting for practical applications.
We are primarily interested in interpreting line drawings of engineering objects, and many of these objects contain non-trihedral (usually tetrahedral) vertices. We therefore investigate two alternative algorithms for labelling drawings of such objects, one deterministic and the other probabilistic. We find that our deterministic solution achieves better results in terms of correctness but can be unacceptably slow, and that our probabilistic solution, while acceptably quick, is less successful in terms of correctness.
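For illustration, the constraint-propagation (Waltz-filtering) style on which deterministic labelling algorithms are typically built can be sketched as follows; the drawing and junction catalogue below are a tiny hypothetical fragment, not the full trihedral or tetrahedral junction dictionary.

```python
# Each edge starts with all candidate labels; each junction admits only certain
# label tuples for its incident edges; labels unsupported at any junction are pruned.
LABELS = {"+", "-", ">", "<"}            # convex, concave, and the two occluding senses

# Hypothetical two-junction drawing: junction -> ordered incident edges,
# and junction -> admissible label tuples for those edges.
junction_edges = {"J1": ("e1", "e2"), "J2": ("e2", "e3")}
junction_catalogue = {
    "J1": {("+", "+"), (">", "+")},
    "J2": {("+", "-"), ("+", ">")},
}

def waltz_filter(junction_edges, catalogue):
    """Prune each edge's candidate labels until every surviving label is supported
    by at least one admissible tuple at every incident junction."""
    candidates = {e: set(LABELS) for edges in junction_edges.values() for e in edges}
    changed = True
    while changed:
        changed = False
        for j, edges in junction_edges.items():
            for i, e in enumerate(edges):
                supported = {t[i] for t in catalogue[j]
                             if all(t[k] in candidates[edges[k]] for k in range(len(edges)))}
                pruned = candidates[e] & supported
                if pruned != candidates[e]:
                    candidates[e] = pruned
                    changed = True
    return candidates

print(waltz_filter(junction_edges, junction_catalogue))
# e2 is forced to "+" (the only label consistent with both junctions),
# which in turn restricts e1 to {"+", ">"} and e3 to {"-", ">"}.
```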
The geographic distribution of different viruses has expanded widely, giving rise to an escalating number of cases during the past two decades. Deterministic Susceptible, Exposed, Infectious (SEI) models can demonstrate the spatio-temporal dynamics of such diseases and have been used extensively in modern mathematical and mechano-biological simulations. This article presents a functional technique to model the stochastic effects and seasonal forcing in a reliable manner by satisfying the Lipschitz criteria. We emphasize that graphical portrayal can be a powerful tool for demonstrating the stability analysis of the deterministic as well as the stochastic model. Emphasis is placed on the dynamical effects of the force of infection. Such analysis, based on a parametric sweep, can prove helpful in predicting disease spread in urban as well as rural areas and should be of interest to mathematical biosciences researchers.
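For reference, the deterministic SEI skeleton underlying such models can be sketched as follows; the parameter values are hypothetical, and the article's stochastic effects and seasonal forcing are not reproduced here.

```python
def sei(beta=0.4, sigma=0.2, mu=0.01, s0=0.99, e0=0.01, i0=0.0, dt=0.1, days=200):
    """Deterministic SEI model integrated with a simple Euler scheme.
    beta: transmission rate, sigma: rate of progression E -> I,
    mu: birth/death rate; S, E, I are population fractions."""
    s, e, i = s0, e0, i0
    trajectory = []
    for step in range(int(days / dt)):
        force = beta * i                      # force of infection
        ds = mu - force * s - mu * s
        de = force * s - (sigma + mu) * e
        di = sigma * e - mu * i
        s, e, i = s + dt * ds, e + dt * de, i + dt * di
        trajectory.append((step * dt, s, e, i))
    return trajectory

# Hypothetical run: report the final state of the epidemic trajectory.
final_t, s, e, i = sei()[-1]
print(f"t={final_t:.0f}  S={s:.3f}  E={e:.3f}  I={i:.3f}")
```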
We present a stable and deterministic quantum key distribution (QKD) system based on differential phase shift. With three cascaded Mach-Zehnder interferometers with different arm-length differences for creating key, its key creation efficiency can be improved to be 7/8, more than other systems. Any birefringence effects and polarization-dependent losses in the long-distance fiber are automatically compensated with a Faraday mirror. Added an eavesdropping check, this system is more secure than some other phase-coding-based QKD systems. Moreover, the classical information exchanged is reduced largely and the modulation of phase shifts is simplified. All these features make this QKD system more convenient than others in a practical application.
In this study, the seismic hazards of Myanmar are analyzed based on both deterministic and probabilistic scenarios. The area of the Sumatra–Andaman Subduction Zone is newly defined, and the fault lines proposed previously are grouped into nine earthquake sources that might affect the Myanmar region. The earthquake parameters required for the seismic hazard analysis (SHA) were determined from seismicity data, including paleoseismological information. Using previously determined suitable attenuation models, SHA maps were developed. For the deterministic SHA, the earthquake hazard in Myanmar varies from 0.1 g in the eastern part to 0.45 g along the western part (Arakan Yoma Thrust Range). Moreover, the probabilistic SHA revealed that, for a 2% probability of exceedance in 50 and 100 years, the levels of ground shaking along the remote area of the Arakan Yoma Thrust Range are 0.35 and 0.45 g, respectively. Meanwhile, the main cities of Myanmar located near the Sagaing Fault Zone, such as Mandalay, Yangon, and Naypyidaw, may be subjected to peak horizontal ground acceleration levels of around 0.25 g.
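For context, the quoted exceedance probabilities correspond to return periods under the usual Poissonian assumption; the minimal sketch below illustrates only this standard conversion, not the paper's hazard computation.

```python
import math

def return_period(p_exceedance, years):
    """Return period implied by a probability of exceedance over a given exposure
    time, under a Poissonian occurrence model:
    P = 1 - exp(-t / T)  =>  T = -t / ln(1 - P)."""
    return -years / math.log(1.0 - p_exceedance)

# 2% probability of exceedance in 50 years and in 100 years:
print(round(return_period(0.02, 50)))    # ~2475 years
print(round(return_period(0.02, 100)))   # ~4950 years
```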
In this paper, a new hybrid method based on DWT and WNN methods is proposed to develop an accurate model for short-term load forecasting. The wavelet filters are used to decompose the load series into deterministic and fluctuation components. The deterministic component models the general pattern of the load signal, while the fluctuation component models its random characteristics. The WNN technique is then used to find and weight similar data to predict the next-day load. The effectiveness of the proposed methodology in forecasting the next-day load is illustrated by applying it to practical load data from the California and Spain energy markets. The performance is compared with the corresponding non-wavelet model and Naive predictors using the weekly mean absolute percentage error metric and the weekly error variance.
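The decomposition step described above can be sketched with a discrete wavelet transform using the PyWavelets package; the wavelet family, decomposition level, and synthetic load series below are illustrative assumptions, not the paper's actual settings or data.

```python
import numpy as np
import pywt

# Hypothetical hourly load series for one week: daily cycle plus noise.
hours = np.arange(24 * 7)
load = (1000 + 200 * np.sin(2 * np.pi * hours / 24)
        + 30 * np.random.default_rng(2).standard_normal(len(hours)))

# Multilevel DWT: coefficients ordered as [cA3, cD3, cD2, cD1].
coeffs = pywt.wavedec(load, "db4", level=3)

# Deterministic (general-pattern) component: keep the approximation, zero the details.
approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
deterministic = pywt.waverec(approx_only, "db4")[: len(load)]

# Fluctuation component: what remains after removing the general pattern.
fluctuation = load - deterministic
print(deterministic[:3].round(1), fluctuation[:3].round(1))
```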
This chapter highlights the impact and manageability of the rapid, constant/continuous and unique changes taking place in humanity that are affecting the existence of individuals, as well as all categories of human organizations. It has been observed that the traditional Newtonian mindset, and its associated reductionist hypothesis and design paradigm, which have served humanity ‘well’, are manifesting their limits/constraints, vulnerability and disparities. The crux of the issue is escalating complexity density, incoherency, a greater mismatch among current thinking, principles, values, structure, dynamic, and hierarchical dominance, limited predictability, and the overall changing ‘reality’.
Vividly, order and linearity are not the only inherent attributes of humanity. Consequently, the significance, appropriateness and exploration of certain properties of complexity theory are introduced, partially to better identify, analyze, comprehend and manage the accelerating gaps of inconsistency — in particular, to nurture a new mindset. Arising from the new mindset, human organizations/systems are confirmed as intrinsic composite complex adaptive systems (composite CAS, nonlinear adaptive dynamical systems) comprising human beings/agents that are CAS. In this respect, leadership, governance, management, and strategic approaches adopted by all human organizations must be redefined.
Concurrently, a special focus on intelligence (and its associated consciousness), the first inherent strength of all human agents, and on its role as the key latent impetus/driver, is vital. This recognition indicates that a change in era is inevitable. Humanity is entering the new intelligence era — the core of the knowledge-intensive and complexity-centric period. Overall, an integrated intelligence/consciousness-centric, complexity-centric and network-centric approach is essential. It adopts a complexity-intelligence-centric path that focuses on the optimization of all intense intrinsic intelligence/consciousness sources (human thinking systems), better exploitation of the co-existence of order and complexity, and integration of networks in human organizations (certain spaces of complexity must be better utilized, coherency of the network of networks must be achieved, and preparation for punctuation points must be elevated).
The new holistic (multi-dimensional) strategy of the intelligent organization theory (IO theory) is the complexity-intelligence strategy, and the new mission focuses on the new intelligence advantage.
This chapter is an introduction to complexity theory (encompassing chaos — a subset of complexity), a nascent domain, although it possesses historical roots. Some fundamental properties of chaos/complexity (including complexity mindset, nonlinearity, interconnectedness, interdependency, far-from-equilibrium, butterfly effect, determinism/in-determinism, unpredictability, bifurcation, deterministic chaotic dynamic, complex dynamic, complex adaptive dynamic, dissipation, basin of attraction, attractor, chaotic attractor, strange attractor, phase space, rugged landscape, red queen race, holism, self-organization, self-transcending constructions, scale invariance, historical dependency, constructionist hypothesis and emergence), and its development, are briefly examined. In particular, the similarities (sensitive dependence on initial conditions, unpredictability) and differences between deterministic chaotic systems (DCS) and complex adaptive systems (CAS) are analyzed. The edge of emergence (2nd critical value, a new concept) is also conceived to provide a more comprehensive explanation of the complex adaptive dynamic (CAD) and emergence. Subsequently, a simplified system spectrum is introduced to illustrate the attributes of, and summarize the relationships among, the various categories of common systems.
Next, the recognition that human organizations are nonlinear living systems (high finite dimensionality CAS) with adaptive and thinking agents is examined. This new comprehension indicates that a re-calibration in thinking is essential. In the human world, high levels of human intelligence/consciousness (the latent impetus that is fundamentally stability-centric) drive a redefined human adaptive and evolution dynamic encompassing better potentials of self-organization or self-transcending constructions, autocatalysis, circular causation, localized spaces/networks, hysteresis, futuristic, and the emergence of new order (involving a multi-layer structure and dynamic) — vividly indicating that intelligence/consciousness-centricity is extremely vital. Simultaneously, complexity-associated properties/characteristics in human organizations must be better scrutinized and exploited — that is, establishing appropriate complexity-intelligence linkages is a significant necessity. In this respect, nurturing the intelligence mindset and developing the associated paradigmatic shift are inevitable.
A distinct attempt (the basic strategic approach) of the new intelligence mindset is to organize around human intrinsic intelligence — intense intelligence-intelligence linkages that exploit human intelligence/consciousness sources individually and collectively by focusing on intelligence/consciousness-centricity, complexity-centricity, network-centricity, complexity-intelligence linkages, collective intelligence, org-consciousness, complex networks, and spaces of complexity (better risk management <=> new opportunities <=> higher sustainability), and that prepare for punctuation points (better crisis management <=> collectively more intelligent <=> higher resilience/sustainability) concurrently — illustrating the significance of self-organizing capability and emergence-intelligence capacity. The conceptual development introduced will serve as the basic foundation of the intelligent organization (IO) theory.