The paper characterizes the family of homomorphisms under which the deterministic context-free languages, the LL context-free languages, and the unambiguous context-free languages are closed. The family of deterministic context-free languages is closed under a homomorphism h if and only if h is either a code of bounded deciphering delay or a homomorphism mapping the images of all symbols to powers of the same string. The same characterization holds for the LL context-free languages. The unambiguous context-free languages are closed under h if and only if either h is a code or the images of all symbols under h are powers of the same string.
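To make the two branches of this dichotomy concrete, here is a minimal sketch in Python; the morphisms h1 and h2 below are hypothetical examples of ours, not taken from the paper. h1 is a prefix code, hence a code with deciphering delay 0 (decodable greedily and online), while h2 sends every symbol to a power of the same string and is not a code.

```python
# Hypothetical example morphisms illustrating the two closure conditions.

# h1 is a prefix code: no image is a prefix of another, so it is a code
# with deciphering delay 0 and can be decoded greedily, symbol by symbol.
h1 = {"a": "0", "b": "10", "c": "11"}

# h2 maps every letter to a power of the same string "01"; it is not a
# code (h2("aa") == h2("b")), yet it falls under the second branch.
h2 = {"a": "01", "b": "0101"}

def apply_hom(h, w):
    """Apply the homomorphism h (letter -> string) to the word w."""
    return "".join(h[x] for x in w)

def greedy_decode(h, v):
    """Decode v under a prefix code h by matching the unique image that
    begins the remaining input; valid because h has deciphering delay 0."""
    inv = {img: a for a, img in h.items()}
    out, i = [], 0
    while i < len(v):
        j = i + 1
        while v[i:j] not in inv:
            j += 1
        out.append(inv[v[i:j]])
        i = j
    return "".join(out)

w = "abcab"
v = apply_hom(h1, w)
assert greedy_decode(h1, v) == w
print(w, "->", v)
```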
We identify the different styles of texting in Filipino short message service (SMS) texts and analyze the change in unigram and bigram frequencies due to these styles. Style preference vectors for sample texts were calculated and used to identify the style combination used by an average individual. The change in Shannon entropy of the SMS text is explained in light of a coding process.
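As a hedged illustration of the quantities involved (the sample strings below are invented, not drawn from the Filipino SMS corpus), the following sketch computes unigram and bigram frequencies and the per-symbol Shannon entropy of a standard spelling and a shortened texting-style variant.

```python
from collections import Counter
from math import log2

def unigrams(text):
    return Counter(text)

def bigrams(text):
    return Counter(text[i:i + 2] for i in range(len(text) - 1))

def shannon_entropy(counts):
    """Entropy in bits per symbol of the empirical distribution."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

standard = "where are you now"
texting = "wer r u na"   # hypothetical shortened "texting style" rendering
for t in (standard, texting):
    print(repr(t),
          "H(unigram) =", round(shannon_entropy(unigrams(t)), 3),
          "H(bigram) =", round(shannon_entropy(bigrams(t)), 3))
```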
Complex systems, made up of many interacting entities that emerge and evolve through self-organization in a myriad of contexts, exhibit subtleties on a global scale and thereby steer the way we understand complexity, a cumulative, evolving subject in which order is viewed as the unifying framework. Their components are strikingly non-separable: a complex system cannot be understood in terms of the properties of its individual, isolated constituents; it must instead be approached at multiple levels, as a system whose emergent behavior and patterns transcend the characteristics of the units composing it. This observation marks a change of scientific paradigm, showing that a reductionist perspective does not imply a constructionist one; in that vein, complex systems science, concerned with multiscale problems, can be regarded as the ascendancy of emergence over reductionism, with mechanistic insight carried up to the level of the complex system itself.

Since evolvability ties this picture to biology, species and humans owing their existence to their ancestors' capacity to adapt, emerge, and evolve, and since the complexity of models, designs, and visualizations bears on optimality, the study of complexity must provide a horizon broad enough to take such subtleties into account and make practical solutions applicable. On such views, a future science of complexity may best be regarded as seeking a minimal history congruent with observable variation, that is, the most parallelizable or symmetric process that can turn random inputs into regular outputs. Chaos and nonlinear systems enter this picture as cousins of complexity, since the many components of a complex system interact with one another nonlinearly. In mathematics, a relation is a way of connecting two or more things (numbers, sets, or other mathematical objects), and it is relations, by describing how things are interrelated, that help us make sense of complex mathematical systems. Accordingly, mathematical modeling and scientific computing have proven to be principal tools for solving the problems that arise in exploring complex systems, with data science, a discipline tailored to making sense of voluminous (big) data, contributing sound, stimulating, and innovative perspectives.

Regarding the computation of the complexity of a mathematical model, run-time analysis depends on the kind of data chosen and on the methods employed; it makes it possible to examine the data used in a study, subject to the capacity of the computer at hand. Computers of different capacities affect the results, so the step-by-step application of the method in code must also be taken into account. Evaluating complexity over different datasets therefore lends the definition a broader range of applicability, with more realism and convenience, because the process rests on concrete mathematical foundations. All of this indicates that methods need to be investigated together with their mathematical foundations; in that way, the level of complexity that will emerge for any given dataset becomes foreseeable.
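The run-time analysis described above can be made concrete with a small experiment; the sketch below is a hedged illustration with an algorithm (Python's built-in sort) and input sizes of our own choosing. It times the algorithm on inputs of growing size and reads off an empirical growth exponent, exactly the kind of machine-dependent measurement the paragraph cautions about.

```python
import random
import time
from math import log

def measure(f, sizes, trials=3):
    """Median wall-clock time of f on a fresh random list of each size."""
    results = []
    for n in sizes:
        data = [random.random() for _ in range(n)]
        ts = []
        for _ in range(trials):
            t0 = time.perf_counter()
            f(list(data))                  # copy so each trial sees unsorted input
            ts.append(time.perf_counter() - t0)
        results.append((n, sorted(ts)[len(ts) // 2]))
    return results

times = measure(sorted, [10_000, 20_000, 40_000, 80_000])
for (n1, t1), (n2, t2) in zip(times, times[1:]):
    # the slope of log(time) against log(size) approximates the complexity exponent
    print(f"n {n1} -> {n2}: empirical exponent ~ {log(t2 / t1) / log(n2 / n1):.2f}")
```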
Turning to fractals, fractal theory and analysis aim to assess the fractal characteristics of data, and several methods exist for assigning fractal dimensions to datasets. From that perspective, fractal analysis expands our knowledge of the functions and structures of complex systems, serving both as a means of evaluating new areas of research and as a way of capturing the roughness of objects, their nonlinearity, their randomness, and so on. The idea of fractional-order integration and differentiation, and the inverse relationship between them, gives fractional calculus applications in fields spanning science, medicine, and engineering, among others. Within mathematics-informed frameworks built to yield reliable understanding of complex processes that span many temporal and spatial scales, the fractional-calculus approach notably supplies novel, applicable fractional-order models for optimization methods. Computational science and modeling, in turn, are oriented toward simulating and investigating complex systems by computer, drawing on domains ranging from mathematics and physics to computer science; a computational model comprising the many variables that characterize the system under consideration allows numerous simulated experiments to be performed.

Furthermore, Artificial Intelligence (AI) techniques, whether or not combined with fractal and fractional analysis or with mathematical models, have enabled a wide range of applications, from predicting mechanisms in living organisms to solving real-world complex problems at both local and global scales. While maximizing model accuracy, AI can also minimize quantities such as computational burden. Relatedly, the notion of level of complexity, often employed in computer science for decision-making and problem-solving, evaluates the difficulty of algorithms and thereby helps determine the resources and time required to complete a task. Computational (algorithmic) complexity, the measure of the computing resources (memory and storage) a given algorithm consumes when run, captures the intrinsic cost of the algorithm, giving an approximate sense of the volume of resources needed as the input data vary in value and size. Through search algorithms and solution landscapes, computational complexity ultimately points toward reductions and universality as ways of exploring problems with different degrees of predictability. Taken together, this line of sophisticated, computer-assisted reasoning can satisfy the requirements of accuracy, interpretability, predictability, and grounding in the mathematical sciences, with AI and machine learning at its foundation and at the intersection of many domains, alongside current technical analyses, computing processes, computational foundations, and mathematical modeling.
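Of the methods for assigning a fractal dimension mentioned above, box counting is among the simplest; the sketch below is a minimal illustration on a synthetic point set, a chaos-game sample of the Sierpinski triangle, which is our choice of example (its known dimension is about 1.585), not data from any cited study.

```python
import random
from math import log

def sierpinski_points(n=20000):
    """Sample points via the chaos game; the limit set has dimension ~1.585."""
    verts = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
    x, y = 0.25, 0.25
    pts = []
    for _ in range(n):
        vx, vy = random.choice(verts)
        x, y = (x + vx) / 2, (y + vy) / 2
        pts.append((x, y))
    return pts

def box_count(pts, eps):
    """Number of eps-sized grid boxes containing at least one point."""
    return len({(int(x / eps), int(y / eps)) for x, y in pts})

pts = sierpinski_points()
for eps in (0.1, 0.05, 0.025, 0.0125):
    n = box_count(pts, eps)
    # log N(eps) / log(1/eps) estimates the box-counting dimension
    print(f"eps={eps}: boxes={n}, dimension estimate = {log(n) / log(1 / eps):.3f}")
```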
Consequently, and as distinct from other collections, our special issue series offers a novel direction for stimulating, refreshing, and innovative interdisciplinary, multidisciplinary, and transdisciplinary understanding and research in both model-based and data-driven modes, aimed at obtaining feasible and accurate solutions, designed simulations, optimization processes, and more. We therefore address theoretical reflections on how all of these processes are modeled, bringing together advanced methods, mathematical analyses, computational technologies, and quantum means, and elaborating the implications of applicable approaches for real-world systems and related domains.
Bernal et al. provided a necessary and sufficient condition for a linear code to be realized as an ideal in a finite group algebra, and De La Cruz and Willems proved a similar result for ideals in twisted group algebras. In this paper, we extend this characterization to crossed products. Furthermore, we determine conditions for some crossed product codes to be self-dual.
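The untwisted cyclic case underlying these results can be checked directly: for G = Z_n, the group algebra F_q[G] is F_q[x]/(x^n - 1), and its ideals are exactly the cyclic codes. The sketch below verifies this for the Hamming [7,4] code over F_2; it is our illustration of the classical case and does not reproduce the paper's crossed-product construction.

```python
# Ideals of F_2[Z_7] = F_2[x]/(x^7 - 1) are cyclic codes; the generator
# g(x) = 1 + x + x^3 divides x^7 - 1 and yields the Hamming [7,4] code.
from itertools import product

N = 7
G = [1, 1, 0, 1, 0, 0, 0]   # coefficients of g(x), lowest degree first

def mult_mod(a, b, n=N):
    """Multiply two polynomials over GF(2) modulo x^n - 1 (cyclic convolution)."""
    c = [0] * n
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                c[(i + j) % n] ^= bj
    return c

# the ideal generated by g: all f * g with deg f < N - deg g = 4
codewords = {tuple(mult_mod(list(f) + [0] * 3, G)) for f in product([0, 1], repeat=4)}
print(len(codewords), "codewords")   # 16 = 2^4, the Hamming [7,4] code
# ideal property: the set is closed under multiplication by x (cyclic shift)
assert all(tuple(c[-1:] + c[:-1]) in codewords for c in codewords)
```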
The possibility of encoding a classical string of information into a quantum string via the CNOT operation and the Hadamard gate is discussed. We show that maximizing or minimizing the quantum correlation of the generated quantum string depends on the values of the initial classical strings. The kernel of each string, as well as that of the total string, is examined: by applying the Hadamard gate to a classical string, one can decrease its similarity as well as the similarity of the total kernel, whereas in the absence of the Hadamard gate the final quantum string exhibits large similarity. The kernel of the final quantum string may be used as an indicator of the degree of security of the encoded information, with a large degree of security obtained at small kernel. A high degree of security can be achieved by applying the Hadamard gate to only one of the classical strings.
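A hedged sketch of this kind of encoding follows, with a gate layout of our own choosing (a Hadamard on every qubit followed by a chain of CNOTs between neighbouring qubits), not the paper's exact circuit: a classical bit string is loaded into a computational-basis state and the gates are applied as matrices.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def basis_state(bits):
    """|b0 b1 ...> as a length-2^n amplitude vector (qubit 0 most significant)."""
    v = np.zeros(2 ** len(bits))
    v[int("".join(map(str, bits)), 2)] = 1.0
    return v

def encode(bits, use_hadamard=True):
    """Hypothetical layout: H on every qubit, then CNOTs down a chain."""
    n = len(bits)
    state = basis_state(bits)
    if use_hadamard:
        U = H
        for _ in range(n - 1):
            U = np.kron(U, H)
        state = U @ state
    for i in range(n - 1):
        # CNOT with control qubit i and target qubit i+1, embedded in n qubits
        U = np.kron(np.kron(np.eye(2 ** i), CNOT), np.eye(2 ** (n - 2 - i)))
        state = U @ state
    return state

print(np.round(encode([1, 0], use_hadamard=True), 3))   # superposed amplitudes
print(np.round(encode([1, 0], use_hadamard=False), 3))  # stays a basis state
```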
In this paper, we propose a new family of graphs, matrix graphs, whose vertex set $\mathbb{F}_q^{N\times n}$ is the set of all $N\times n$ matrices over a finite field $\mathbb{F}_q$, for any positive integers $N$ and $n$; two matrices are joined by an edge if the rank of their difference is 1. We give some basic properties of these graphs and consider two coloring problems on them. Let $\chi'_d(N\times n,q)$ (resp. $\chi_d(N\times n,q)$) denote the minimum number of colors needed to color the matrix graph so that no two vertices at distance at most $d$ (resp. exactly $d$) receive the same color. These two problems arose in the study of the scalability of optical networks. In this paper, we determine the exact value of $\chi'_d(N\times n,q)$ and give some upper and lower bounds on $\chi_d(N\times n,q)$.
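As a small concrete instance of the adjacency rule (the parameters q = 2 and N = n = 2 are our choice for illustration), the sketch below builds the vertex set and computes the degree of a vertex, which equals the number of rank-1 matrices, here (2^2 - 1)(2^2 - 1)/(2 - 1) = 9.

```python
# Matrix graph over GF(2) with N = n = 2: vertices are all 2x2 binary
# matrices; two vertices are adjacent when their difference has rank 1.
from itertools import product

def rank_gf2(rows):
    """Rank via Gaussian elimination over GF(2); rows are lists of bits."""
    rows = [r[:] for r in rows]
    rank, col, m = 0, 0, len(rows[0])
    while rank < len(rows) and col < m:
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            col += 1
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
        col += 1
    return rank

def adjacent(A, B):
    diff = [[a ^ b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
    return rank_gf2(diff) == 1

verts = [[list(r0), list(r1)] for r0 in product([0, 1], repeat=2)
                              for r1 in product([0, 1], repeat=2)]
deg = sum(adjacent(verts[0], v) for v in verts if v != verts[0])
print(len(verts), "vertices; degree of the zero matrix =", deg)  # 16 and 9
```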
In this paper, we present an overview of some combinatorial objects with interesting structure, namely association schemes and strongly regular graphs. We propose an approach for obtaining a certain type of association scheme from two-weight linear codes. We use the projective dual transform of linear codes, whose basic concept is also described in the article.
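The input to such a construction, a two-weight code, is easy to verify by enumeration; the sketch below uses the first-order Reed-Muller code RM(1,3), a standard two-weight example of our choosing (nonzero weights 4 and 8), not necessarily one of the codes considered in the article.

```python
# Weight distribution of RM(1,3), an [8,4] binary code: enumerating all
# 2^4 codewords should yield exactly two nonzero weights, 4 and 8.
from itertools import product
from collections import Counter

G = [
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0, 0, 1, 1],
    [0, 1, 0, 1, 0, 1, 0, 1],
]

def codeword(msg):
    """Encode msg (k bits) as the GF(2) sum of the selected rows of G."""
    c = [0] * len(G[0])
    for m, row in zip(msg, G):
        if m:
            c = [a ^ b for a, b in zip(c, row)]
    return tuple(c)

weights = Counter(sum(codeword(m)) for m in product([0, 1], repeat=len(G)))
print(dict(weights))   # {0: 1, 4: 14, 8: 1}: exactly two nonzero weights
```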
The paper is devoted to graph-based cryptography. A notion of girth for a directed graph, the girth indicator, is defined via its smallest commutative diagram. An analogue of Erdős's Even Circuit Theorem for directed graphs allows us to establish an upper bound on the size of directed graphs with a fixed girth indicator. The size of the members of an infinite family of directed regular graphs of high girth comes close to this upper bound.
Finite automata related to members of such a family of algebraic graphs over a chosen commutative ring can be used effectively to design cryptographic algorithms for various data-security problems (stream ciphers, database encryption, public-key mode, and digital signatures).
An explicit construction of an infinite family of algebraic graphs of high girth defined over an arbitrarily chosen commutative ring is given, together with some results on their properties based on theoretical studies and software implementations.
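As a baseline for the notion refined by the girth indicator, the ordinary girth of a directed graph (the length of its shortest directed cycle) can be computed by breadth-first search from every vertex; the sketch and the six-vertex example below are ours and do not reproduce the commutative-diagram definition used in the paper.

```python
from collections import deque

def directed_girth(adj):
    """Length of the shortest directed cycle, or None if the graph is acyclic."""
    best = None
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v == s:                       # closed a cycle back to the start
                    cand = dist[u] + 1
                    best = cand if best is None else min(best, cand)
                elif v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
    return best

# hypothetical 6-vertex digraph with cycles of lengths 3 and 4
adj = {0: [1], 1: [2], 2: [0, 3], 3: [4], 4: [5], 5: [2]}
print(directed_girth(adj))   # 3
```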
We present an overview of results concerning the partial-period correlation of pseudorandom sequences, ranging from classical results on binary m-sequences to recent results on the first two partial-period correlation moments of the sequences belonging to families A, B, and C defined over Galois rings. The use of association schemes provides a new uniform technique for analyzing the sequence families A, B, and C.
These correlation moments have applications to the synchronisation performance of CDMA communication systems using phase-shift keying.
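As a hedged illustration of the quantities surveyed (the specific m-sequence is our example), the sketch below generates the binary m-sequence of period 15 from the primitive polynomial x^4 + x + 1 and contrasts the full-period autocorrelation, which is two-valued, with partial-period correlations over a shorter window, which fluctuate with the window position.

```python
# generate the period-15 m-sequence from x^4 + x + 1 (primitive over GF(2))
seq = [1, 0, 0, 0]
while len(seq) < 15:
    seq.append(seq[-4] ^ seq[-3])          # recurrence s_t = s_{t-4} XOR s_{t-3}

chips = [1 - 2 * b for b in seq]           # PSK alphabet: bit 0 -> +1, bit 1 -> -1

def partial_corr(start, length, shift, N=15):
    """Correlation of the sequence with its cyclic shift over a window."""
    return sum(chips[(start + i) % N] * chips[(start + i + shift) % N]
               for i in range(length))

# full period: the classical two-valued autocorrelation (15 at shift 0, -1 elsewhere)
print([partial_corr(0, 15, s) for s in range(4)])   # [15, -1, -1, -1]
# partial period (window length 7): values now depend on the window position
print([partial_corr(t, 7, 1) for t in range(5)])
```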