An approach to designing pseudo-random number generators (PRNGs) by combining the model of operation of plain random-access memory (PRAM) with a model of the behavior of a confined gas is proposed and investigated in the paper. It can be regarded as a transform that converts an input sequence of numbers, random or ordinary, into a new pseudo-random output sequence. The proposed method is based on writing the numbers into the memory matrix of the PRAM and subsequently reading out its content along appropriate reading paths across the memory. Categories of writing and reading patterns are systematically considered as ways of restructuring the two-dimensional content of the PRAM into a one-dimensional arrangement. The PRAM is formalized numerically by the number matrix (NM) and also abstracted to its pure geometry, in which the concept of writing-reading paths acquires its natural meaning. The proposed skew (randomized) reading and/or writing of the PRAM content adds a new degree of freedom in generating the output random number sequence. The idea for the skew reading and/or writing path comes from the trajectory of a gas molecule confined in a vessel, modeled for simplicity in the one-molecule case. The proposed method is general and enables the design of PRNGs with different characteristics, for different purposes and applications. Two such designs, the most interesting for practical applications, called the basic design and the improved design, are proposed and described. The proposed approach can also be used to improve the characteristics of simpler PRNGs with lower levels of randomness. The description of the method and the way the pseudo-random number generator (PRNG) is designed are discussed and illustrated in the paper. The quality and randomness of the sequences generated by the proposed PRNG designs were analyzed using the NIST test suite; the results show that all proposed designs are of satisfactory quality and pass the NIST tests, and some of the results are given in the paper.
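As a rough illustration of the write-then-skew-read idea (a minimal sketch, not the authors' exact construction: the matrix size, the step vector, and the reflective "molecule" trajectory below are assumptions):

```python
import numpy as np

def pram_prng(seed_sequence, rows=64, cols=64, dy=1, dx=3):
    """Sketch of the write/skew-read idea: fill a memory matrix with the
    input sequence row by row, then read it back along a path that bounces
    off the matrix edges like a gas molecule confined in a vessel.
    seed_sequence must supply at least rows * cols numbers."""
    nm = np.array(seed_sequence[:rows * cols]).reshape(rows, cols)  # number matrix (NM)
    y, x = 0, 0
    out = []
    for _ in range(rows * cols):
        out.append(nm[y, x])
        # advance the "molecule"; reflect the step direction at the walls
        if not (0 <= y + dy < rows):
            dy = -dy
        if not (0 <= x + dx < cols):
            dx = -dx
        y, x = y + dy, x + dx
    return out

# e.g. out = pram_prng(list(range(64 * 64)))
```

Unlike a plain row-by-row readout, the reflected path revisits some cells and skips others, which is precisely the extra degree of freedom the skew reading is meant to add.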
In this article, we extend Pólya's legendary inequality for the Dirichlet Laplacian to the fractional Laplacian. Pólya's argument is revealed to be a powerful tool for proving such extensions on tiling domains. As in the Dirichlet Laplacian case, Pólya's inequality for the fractional Laplacian on any bounded domain is still an open problem. Moreover, we also investigate the equivalence of several related inequalities for bounded domains by using the convexity, the Lieb–Aizenman procedure (the Riesz iteration), and some transforms such as the Laplace transform, the Legendre transform, and the Weyl fractional transform.
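For context, Pólya's inequality in its standard Weyl-asymptotics normalization, together with the natural fractional analogue (stated here for orientation; the paper's precise constants and conventions may differ), reads:

```latex
% Dirichlet eigenvalues \lambda_1 \le \lambda_2 \le \dots on a bounded domain
% \Omega \subset \mathbb{R}^d, with \omega_d the volume of the unit ball;
% proved by Pólya for tiling domains:
\lambda_k \;\ge\; \frac{4\pi^2}{\left(\omega_d\,|\Omega|\right)^{2/d}}\, k^{2/d},
\qquad k = 1, 2, \dots
% The corresponding inequality for the fractional Laplacian (-\Delta)^{s}, 0 < s < 1:
\lambda_k^{(s)} \;\ge\; \left(\frac{4\pi^2}{\left(\omega_d\,|\Omega|\right)^{2/d}}\right)^{\!s} k^{2s/d}.
```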
Current multimedia design processes suffer from the excessively large time spent testing new IP-blocks against references based on large video encoder specifications (usually several thousand lines of code). Appropriately testing a single IP-block may require converting the overall encoder from software to hardware, which is difficult to complete in the short time imposed by the competition-driven time-to-market for the adoption of a new video coding standard. This paper presents a new design flow to accelerate the conformance testing of an IP-block using the H.264/AVC software reference model. An example block implementing the simplified 8 × 8 transformation and quantization adopted in FRExt is provided as a case study demonstrating the effectiveness of the approach.
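By way of illustration, the forward 8 × 8 integer transform introduced with FRExt is commonly written as Y = C X Cᵀ with the integer matrix below (a minimal NumPy sketch; the quantization stage and normalization scaling are omitted here):

```python
import numpy as np

# Integer basis commonly cited for the H.264/AVC FRExt 8x8 forward transform
C = np.array([
    [ 8,   8,   8,   8,   8,   8,   8,   8],
    [12,  10,   6,   3,  -3,  -6, -10, -12],
    [ 8,   4,  -4,  -8,  -8,  -4,   4,   8],
    [10,  -3, -12,  -6,   6,  12,   3, -10],
    [ 8,  -8,  -8,   8,   8,  -8,  -8,   8],
    [ 6, -12,   3,  10, -10,  -3,  12,  -6],
    [ 4,  -8,   8,  -4,  -4,   8,  -8,   4],
    [ 3,  -6,  10, -12,  12, -10,   6,  -3],
])

def transform_8x8(block):
    """Forward 8x8 transform of a residual block: Y = C @ X @ C.T (unscaled)."""
    return C @ block @ C.T
```

Because the matrix is integer-valued, hardware and software realizations produce bit-exact outputs, which is what makes block-level conformance testing against the reference model tractable.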
In this paper, a new analytical formula approximating the value of American put options and their optimal exercise boundary is presented. A transform is first introduced to better deal with the terminal condition and, most importantly, with the optimal exercise price, an unknown moving boundary that is the key reason valuing American options is much harder than valuing their European counterparts. The pseudo-steady-state approximation is then used in applying the Laplace transform, converting the system of partial differential equations into a system of ordinary differential equations in the Laplace space. A simple and elegant formula is found for the optimal exercise boundary as well as for the option price of the American put with constant interest rate and volatility. Other hedge parameters, obtained as derivatives of this solution, are also presented.
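Concretely, the free-boundary problem being transformed is the standard Black–Scholes system for the American put (written here in its usual textbook form; the paper's dimensionless variables may differ):

```latex
\frac{\partial V}{\partial t}
  + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2}
  + r S \frac{\partial V}{\partial S} - r V = 0,
  \qquad S > S_f(t),\; 0 \le t < T,
```
subject to the terminal and free-boundary conditions
```latex
V(S,T) = \max(K-S,\,0), \qquad
V\bigl(S_f(t),t\bigr) = K - S_f(t), \qquad
\left.\frac{\partial V}{\partial S}\right|_{S=S_f(t)} = -1, \qquad
\lim_{S\to\infty} V(S,t) = 0.
```

It is the unknown boundary S_f(t) appearing in the conditions, rather than the PDE itself, that the initial transform and the pseudo-steady-state approximation are designed to tame before the Laplace transform is applied.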
This paper proposes a new method of dimensionality reduction for text classification, based on applying the discrete wavelet transform to the document-term frequency matrix. We analyse the features provided by the wavelet coefficients in the different orientations: (1) high-energy coefficients in the horizontal orientation correspond to terms relevant within a single document; (2) high-energy coefficients in the vertical orientation correspond to terms relevant for a single document but not for the others; (3) high-energy coefficients in the diagonal orientation correspond to terms relevant in a document in comparison with other terms. By filtering for wavelet coefficients that fulfil these three conditions simultaneously, we obtain a reduced vocabulary of the corpus with fewer dimensions than the original one. To test the effectiveness of the reduced vocabulary, we recoded the corpus with it and obtained a statistically significant level of accuracy for document classification.
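A minimal sketch of the idea with PyWavelets (the wavelet family, the quantile threshold, and the mapping from coefficient columns back to term columns are assumptions here, not the paper's exact settings):

```python
import numpy as np
import pywt

def wavelet_vocabulary(td_matrix, wavelet="haar", quantile=0.95):
    """Sketch: 2D DWT of the document-term frequency matrix, keeping terms
    whose coefficients are high-energy in all three detail orientations."""
    _, (cH, cV, cD) = pywt.dwt2(td_matrix, wavelet)  # horizontal, vertical, diagonal details
    masks = []
    for c in (cH, cV, cD):
        thr = np.quantile(np.abs(c), quantile)
        # a coefficient column is "high energy" in this orientation
        # if any of its coefficients exceeds the threshold
        masks.append((np.abs(c) >= thr).any(axis=0))
    keep_half = masks[0] & masks[1] & masks[2]
    # each single-level DWT coefficient column covers two original term columns
    keep = np.repeat(keep_half, 2)[: td_matrix.shape[1]]
    return np.flatnonzero(keep)   # indices of retained vocabulary terms
```

The returned indices define the reduced vocabulary with which the corpus is then recoded before classification.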
We discuss the performance of the Search and Fourier Transform algorithms on a hybrid computer consisting of classical and quantum processors working together. We show that this semi-quantum computer would improve over a purely classical architecture no matter how few qubits are available, suggesting a more easily implementable technology than a pure quantum computer with an arbitrary number of qubits.
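The intuition behind "no matter how few qubits" can be seen with a back-of-the-envelope query count (a simple partition-and-Grover argument, assumed here for illustration rather than taken from the paper):

```python
import math

def queries(N, k):
    """Searching N items with a k-qubit coprocessor: split the database into
    N / 2**k blocks and run Grover's algorithm (~sqrt(2**k) oracle queries)
    on each block, for ~ N / 2**(k/2) queries overall."""
    blocks = N / 2**k
    return blocks * math.sqrt(2**k)

# classical exhaustive search needs ~N queries; even k = 2 qubits halves that
print(queries(1_000_000, 0), queries(1_000_000, 2), queries(1_000_000, 10))
```

Every additional qubit thus shaves a constant factor off the classical cost, so any nonzero quantum resource already helps.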
Frequency channelization is a fundamental signal processing operation employed across various domains, including communications and radio astronomy. The polyphase filterbank (PFB) represents an efficient and versatile means of channelization. When strict constraints are placed on the allowable spectral leakage between neighboring channels, an oversampled PFB design is advantageous. A helpful consequence of the oversampling is that inversion of the PFB to recover high temporal resolution is simplified and can be accomplished accurately using Fourier transforms. We describe this inversion approach and identify key design considerations. We examine the residual error and spectral/temporal leakage behavior when a channelizer and its corresponding inverter are cascaded, concluding that near-perfect reconstruction can be approached with appropriate selection of PFB and inverter design parameters.
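A schematic sketch of the two cascaded stages (assuming an even channel count, channels stitched in ascending frequency order, and ignoring the per-frame phase rotation and edge effects a real implementation must handle):

```python
import numpy as np

def pfb_analysis(x, h, M, R):
    """Oversampled PFB analysis: hop R < M input samples per output spectrum,
    window by the prototype lowpass h (len(h) = taps * M), fold to length M,
    and take an M-point FFT."""
    L = len(h)
    n_frames = (len(x) - L) // R + 1
    X = np.empty((n_frames, M), dtype=complex)
    for m in range(n_frames):
        seg = x[m * R : m * R + L] * h
        X[m] = np.fft.fft(seg.reshape(-1, M).sum(axis=0))
    return X

def pfb_invert(X, M, R, n_fft):
    """FFT-based inversion: fine-channelize each coarse channel with an
    n_fft-point FFT along time, keep only the fraction R/M of fine bins that
    is free of the tapered band edges (the benefit of oversampling), stitch
    the passbands, and inverse-FFT back to high time resolution."""
    keep = n_fft * R // M                       # usable fine bins per channel
    fine = np.fft.fftshift(np.fft.fft(X[:n_fft, :], axis=0), axes=0)
    lo = (n_fft - keep) // 2
    stitched = fine[lo : lo + keep, :].T.reshape(-1)
    return np.fft.ifft(np.fft.ifftshift(stitched))
```

The residual error of the cascade is governed by how flat the prototype filter is across each retained passband, which is why the paper's design parameters center on the prototype and the oversampling ratio M/R.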
An efficient and robust method for identifying coronary arteries and evaluating the severity of stenosis on routine X-ray angiograms is proposed. Accurately identifying coronary arteries is challenging due to the poor signal-to-noise ratio, vessel overlap, and superimposition with various anatomical structures such as ribs, spine, or heart chambers. The proposed method consists of two major stages: (a) signal-based image segmentation and (b) vessel feature extraction. The 3D Fourier and 3D wavelet transforms are first employed to suppress the background and noisy structures in the images. A set of matched filters is then applied to enhance the coronary arteries. Finally, clustering analysis, a histogram technique, and size filtering are used to obtain a binary image containing the final segmented coronary arterial tree. To extract vessel features in terms of centerline and diameter, a gradient vector flow-based snake algorithm determines the medial axis of a vessel, followed by calculation of the vessel boundaries and width associated with the detected medial axis.
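As an illustration of the matched-filtering stage, here is a classic Gaussian-profile vessel filter bank (a sketch with assumed kernel sizes and angle count, not the paper's exact filters):

```python
import numpy as np
from scipy.ndimage import convolve, rotate

def vessel_matched_filter(img, sigma=2.0, length=9, n_angles=12):
    """Enhance line-like vessels: convolve with a zero-mean kernel whose
    cross-section matches a (dark) vessel's Gaussian intensity profile,
    rotated over several orientations; keep the maximum response."""
    xs = np.arange(-3 * int(sigma) - 1, 3 * int(sigma) + 2)
    profile = -np.exp(-xs**2 / (2 * sigma**2))   # dark-vessel cross-section
    kernel = np.tile(profile, (length, 1))       # extend along vessel direction
    kernel -= kernel.mean()                      # zero mean: flat background -> 0
    response = np.full(img.shape, -np.inf)
    for angle in np.linspace(0, 180, n_angles, endpoint=False):
        k = rotate(kernel, angle, reshape=True)
        response = np.maximum(response, convolve(img, k))
    return response
```

Thresholding this response (here via the histogram technique and size filtering the abstract mentions) yields the binary vessel tree passed on to feature extraction.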
XML has greatly propelled the development of information technology through its wide use in various areas. Many XML applications depend on XSLT transformation; as the foundational XML transformation technology, a high-performance XSLT engine would benefit many XML applications. This paper compares the main XSLT-based transformation technologies and studies XSLT pipelining. The design and implementation of Xpipe, an efficient XSLT transformation engine with pipelining capability, are described; experimental results show that Xpipe achieves higher performance than existing XSLT transformation technologies.
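Xpipe itself is not described in code here, but the pipelining idea it exploits can be sketched with lxml by chaining stylesheets so that each stage consumes the previous stage's result tree (the file names are placeholders):

```python
from lxml import etree

def xslt_pipeline(doc_path, stylesheet_paths):
    """Apply a sequence of XSLT stages; each stage transforms the result
    tree produced by the previous one."""
    tree = etree.parse(doc_path)
    for path in stylesheet_paths:
        transform = etree.XSLT(etree.parse(path))
        tree = transform(tree)
    return tree

# e.g. result = xslt_pipeline("input.xml", ["stage1.xsl", "stage2.xsl"])
```

A truly pipelined engine streams events between stages instead of materializing each intermediate tree as this sketch does, which is where the performance gains of an engine like Xpipe come from.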
Existence is one of the ultimate concepts in the study of western philosophy, and almost all well-known philosophers in history had their own understanding of it. As a new philosophical form in the new era, the philosophy of information holds that the philosophy of information and information science should develop together, and that existence in philosophy needs to be redefined. The segmentation of the existential field should be the first step in tackling the issue of existence and non-existence. In the view of the philosophy of information, there are four levels of existence: A. objective direct existence, B. objective indirect existence, C. subjective for-itself existence, and D. subjective regenerated existence. The existential levels of a given thing cover these four levels in part or in their totality. Consequently, non-existence, as the opposite concept of existence, has also changed. Existence and non-existence can be transformed into each other, and the process exhibits five properties: continuity, development, contingency, retrospectivity, and predictability.
Temporally persistent spatial patterns in the electroencephalogram (EEG) are extracted using the Karhunen-Loève transformation (KLT). Three basic patterns are shown to be sufficient to account for more than 94% of the variance in a 1.0 s segment of the EEG from both a normal individual and a patient with a malignant brain tumor. These patterns, interpolated to form topographic maps, reveal what appear to be important spatial characteristics of the EEG. The results suggest that the method may be extremely valuable not only for reducing the data collected during electroencephalography but also for delineating spatially independent brain electrical sources underlying the EEG…
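In modern terms the KLT step is a principal component analysis of the spatial covariance of the multichannel recording; a minimal sketch (channel count, segment length, and centering convention are assumptions):

```python
import numpy as np

def spatial_klt(eeg, n_patterns=3):
    """eeg: array of shape (n_samples, n_channels), e.g. a 1.0 s segment.
    Returns the leading spatial patterns and the variance fraction explained."""
    X = eeg - eeg.mean(axis=0)                  # remove per-channel mean
    cov = X.T @ X / (len(X) - 1)                # spatial covariance matrix
    evals, evecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    order = np.argsort(evals)[::-1]
    patterns = evecs[:, order[:n_patterns]]     # basic spatial patterns
    explained = evals[order[:n_patterns]].sum() / evals.sum()
    return patterns, explained
```

The abstract's figure of more than 94% of the variance corresponds to `explained` with `n_patterns=3`; interpolating each column of `patterns` over the electrode positions yields the topographic maps.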