This paper presents a structural approach for testing SRAM-based FPGAs that takes into account the configurability of these flexible devices. When SRAM-based FPGA testing is considered, two situations must first be distinguished: the Application-Oriented Test situation and the Manufacturing-Oriented Test situation. This paper concentrates on Test Pattern Generation and DFT for an Application-Oriented test of SRAM-based FPGAs.
Reversible logic and quantum-dot cellular automata (QCA) are prospective pillars of quantum computing. These paradigms can potentially reduce the size and power of future chips while maintaining high speed. The RAM cell is a crucial component of computing devices, and designing one with a blend of reversible logic and QCA technology can surpass the limitations of conventional RAM structures. This motivates us to explore the design of a RAM cell using reversible logic in the QCA framework. The performance of a reversible circuit can be improved by utilizing a resilient reversible gate. This paper presents the design of a QCA-based reversible RAM cell using an efficient, fault-tolerant and low-power reversible gate. Initially, a novel reversible gate is proposed and implemented in QCA; its QCA layout is designed using a unique multiplexer circuit. A comprehensive analysis of the gate is then carried out for standard Boolean functions, cost function and power dissipation, showing that the proposed gate is 75.43% more cost-effective and 58.54% more energy-efficient than existing reversible gates. To establish its inherent testability, the gate is rigorously tested against various faults and is found to be 69.2% fault-tolerant. For all performance parameters, the proposed gate performs considerably better than existing ones. Furthermore, the proposed gate is explicitly used to design a reversible D latch and RAM cell, which are crucial modules of sequential logic circuits. The proposed latch is 45.4% more cost-effective than the formerly reported D latch. The design of a QCA-based RAM cell using reversible logic is novel and has not been reported earlier in the literature.
Quantum-dot cellular automata (QCA) is among the most promising nanotechnologies for designing digital electronic circuits, offering a high switching frequency, low power consumption, small area, high speed and large-scale integration. Recently, much research has focused on the design of reversible logic gates; nevertheless, there remains high demand for high-speed, high-performance, low-area QCA circuits. Reversible circuits have improved notably with developments in complementary metal–oxide–semiconductor (CMOS) and QCA technologies. In QCA systems, it is important that reversible gates communicate reliably with other circuits. We therefore use efficient approaches for designing a 3×3 reversible circuit based on XOR gates; the suggested circuits can be widely used in reversible, high-performance systems. The suggested architecture for the 3×3 reversible circuit in QCA is composed of 28 cells, occupying only 0.04 μm². Compared to the state of the art, shorter delay, smaller area, higher operating frequency and better performance are the essential benefits of the suggested reversible gate design. Full simulations were conducted using QCADesigner software. Additionally, the proposed 3×3 gate has been schematized using two XOR gates.
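An n×n gate such as the ones above is logically reversible exactly when its truth table is a bijection between input and output vectors. As a neutral illustration in Python (using the standard 3×3 Toffoli gate, since the proposed gate's truth table is not reproduced here), the bijection check can be sketched as:

```python
from itertools import product

def toffoli(a, b, c):
    # Standard 3x3 Toffoli (CCNOT) gate: controls a, b pass through;
    # target c flips only when both controls are 1.
    return (a, b, c ^ (a & b))

def is_reversible(gate, width):
    # A gate is reversible iff its truth table is a bijection: every
    # distinct input vector maps to a distinct output vector.
    outputs = {gate(*bits) for bits in product((0, 1), repeat=width)}
    return len(outputs) == 2 ** width

print(is_reversible(toffoli, 3))  # True: the Toffoli gate loses no information
```

The same check applies to any proposed gate once its output functions are written down; an irreversible function such as AND collapses distinct inputs and fails it.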
Research on real-time systems now focuses on formal approaches to specify and analyze their behavior. Temporal logic is a natural candidate, since it can specify properties of event and state sequences. However, "pure" temporal logic cannot express the "quantitative" aspects of time: concepts such as eventuality and fairness are essentially "qualitative" treatments of time, and pure temporal logic makes no reference to absolute time. For real-time systems, purely qualitative specification and analysis of time are inadequate. In this paper, we present a modification of temporal logic, Event-based Real-time Logic (ERL), based on our event-based conceptual model. ERL provides a high-level framework for specifying timing properties of real-time systems and can be implemented in the Prolog programming language. In our approach to testing and debugging real-time systems, ERL is used to specify both the expected behavior (specification) and the actual behavior (execution traces) of the target system, and to verify that the target system meets its specification. A method for implementing ERL in Prolog for testing and debugging real-time systems is presented.
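The core idea of checking a quantitative timing property against an execution trace can be sketched compactly. This is a hedged illustration in Python rather than the paper's Prolog, with hypothetical event names; it checks a bounded-response property of the kind ERL targets: "every `request` is followed by a `response` within a deadline."

```python
# An execution trace as timestamped events (hypothetical data).
trace = [
    (0, "request"), (4, "response"),
    (20, "request"), (27, "response"),
]

def within_deadline(trace, cause, effect, deadline_ms):
    # Match each cause event with the next effect event in order and
    # check that the elapsed time never exceeds the deadline.
    pending = []
    for t, ev in trace:
        if ev == cause:
            pending.append(t)
        elif ev == effect and pending:
            if t - pending.pop(0) > deadline_ms:
                return False
    return not pending  # every cause must eventually be answered

print(within_deadline(trace, "request", "response", 10))  # prints True
```

The same trace violates a tighter 5 ms deadline (the second response arrives 7 ms after its request), which is exactly the kind of quantitative distinction a purely qualitative temporal logic cannot make.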
In a competitive business landscape, large organizations such as insurance companies and banks are under high pressure to innovate, improve and differentiate their products and services while continuing to reduce the time-to-market for new product introductions. Generating a single view of the customer is critical from the systems developer's perspective, because disconnected systems accumulate within an enterprise over time. Therefore, to increase revenue and optimize costs, it is important to build enterprise systems that align closely with business requirements by reusing existing systems. When building distributed applications, it is important to adopt proven processes such as the Rational Unified Process (RUP) to mitigate risks and increase system reliability. This paper presents experiences in developing applications in Java Enterprise Edition (JEE) with a customized RUP, adopted in an onsite-offshore development model along with ISO 9001 and SEI CMM Level 5 standards. It provides an RUP approach to achieve increased reliability, higher productivity and lower defect density, along with competitiveness through cost-effective custom software solutions. Early qualitative software reliability prediction is done using fuzzy expert systems, yielding the expected number of defects in the software prior to testing. The predicted results are then compared with the values obtained during the actual testing procedure.
Demand for highly reliable software is increasing day by day, which in turn has increased the pressure on software firms to deliver reliable software quickly. Ensuring high reliability normally requires prolonged testing, which consumes resources and is not feasible in the existing market situation. To overcome this, software firms provide patches after software release to fix the remaining bugs and give users a better product experience. A patch is a minor portion of software that repairs bugs; with such patches, organizations enhance the performance of the software. Delivering patches after release demands extra effort and resources, which is costly and hence not economical for firms. Moreover, an early patch release might cause incomplete bug fixes, while a delayed release may increase the chance of failures during the operational phase. Determining the optimal patch release time is therefore imperative. To address these issues, we formulate a two-dimensional, time- and effort-based cost model to determine the optimal release and patch times of software so that the total cost is minimized. The proposed model is validated on a real-life data set.
The testing life cycle poses the problem of achieving a high level of software reliability while meeting an optimal release time for the software. To enhance reliability, retain market potential and reduce testing cost, an enterprise needs to know when to release the software and when to stop testing. To achieve this, enterprises usually release their product early and then release patches subsequently. Software patching is a process through which enterprises debug, update, or enhance their software. When used as a debugging process, patching enables an optimal release for the product, increasing the reliability of the software while reducing the economic overhead of testing. Today, due to the diverse and distributed nature of software, its journey in the market is dynamic, making patching an inherent aspect of testing. A patch is a piece of software designed to update a computer program or its supporting data to fix or improve it. Researchers have worked to minimize testing cost, but reliability has not so far been considered in models for optimal time scheduling using patching. In this paper, we address reliability, a major attribute of software quality. To jointly handle testing cost, release time and a desirable reliability level, we propose a reliability growth model implementing software patching to make the software system reliable and cost-effective. A numerical illustration is given using a real-life software failure data set.
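The release-time trade-off described in the two abstracts above can be made concrete with a rough sketch: a one-dimensional Goel–Okumoto reliability growth curve with assumed parameters and hypothetical cost weights, not the authors' models. Faults fixed before release are cheap, faults escaping to the field are expensive, and each day of testing costs money; the optimum balances the three.

```python
import math

# Assumed (illustrative) parameters, not fitted to any data set.
a, b = 100.0, 0.05                 # expected total faults; detection rate/day
c_test, c_pre, c_post = 50.0, 100.0, 500.0  # testing $/day; fix cost pre/post release

def m(t):
    # Goel-Okumoto mean value function: expected faults detected by time t.
    return a * (1.0 - math.exp(-b * t))

def total_cost(T):
    # Testing effort + pre-release fixes + (costlier) field fixes.
    return c_test * T + c_pre * m(T) + c_post * (a - m(T))

# Grid search over candidate release days for the cost-minimizing time.
T_opt = min(range(1, 365), key=total_cost)
print(T_opt)
```

With these numbers the minimum falls near day 74; a patch-aware model of the kind proposed in the papers adds further decision variables (patch time, patching effort) to the same style of objective.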
This paper presents a multi-stage software design approach for fault-tolerance. In the first stage, a formalism is introduced to represent the behavior of the system by means of a set of assertions. This formalism enables an execution tree (ET) to be generated in which each path from the root to a leaf is a well-defined formula (WDF). During the automatic generation of the execution tree, properties such as completeness and consistency of the set of assertions can be verified, and design faults can consequently be revealed. In the second stage, the testing strategy is based on the set of WDFs, which represents the structural deterministic test for the model of the software system and provides a framework for generating a functional deterministic test for the code implementation of the model. This testing strategy can reveal implementation faults in the program code. In the third stage, the fault-tolerance of the software system against hardware failures is improved while preserving the design and implementation features obtained from the first two stages. The proposed approach provides a high level of user-transparency by employing the object-oriented principles of data encapsulation and polymorphism. The reliability of the software system against hardware failures is also evaluated, using a tool named the Software Fault-Injection Tool (SFIT) that was developed to estimate the reliability of a software system.
For prediction or verification of system reliability, it is often necessary to conduct individual tests of the components that comprise the system. The question then arises of how the total test effort should be allocated among the different components so as to minimize test costs. This paper describes the role of mathematical programming in obtaining optimum test plans. The problem is formulated using the notion of producer's and consumer's risks from traditional acceptance sampling plans. Examples are given for different distributions of component failure times and for a series and a parallel system.
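A minimal sketch of the component-level allocation problem, under stated assumptions: zero-failure binomial demonstration plans and a naive equal split of the system reliability target across a series system. The paper's mathematical programs would instead optimize the split against costs and risks; this only shows the structure of the decision.

```python
import math

def units_needed(R_i, C):
    # Classic zero-failure demonstration bound: to claim component
    # reliability R_i at confidence C with no observed failures,
    # n_i >= ln(1 - C) / ln(R_i) test units are required.
    return math.ceil(math.log(1.0 - C) / math.log(R_i))

def series_plan(R_system, C, costs):
    # Naive allocation: split the series-system target equally across
    # the k components (R_i = R_system ** (1/k)), then cost the plan.
    k = len(costs)
    R_i = R_system ** (1.0 / k)
    plan = [units_needed(R_i, C) for _ in costs]
    total = sum(n * c for n, c in zip(plan, costs))
    return plan, total

# Hypothetical three-component system with per-unit test costs.
plan, total = series_plan(R_system=0.95, C=0.90, costs=[100.0, 250.0, 40.0])
print(plan, total)
```

The equal split is clearly suboptimal here: the expensive component gets as many test units as the cheap one, which is precisely the inefficiency a cost-constrained mathematical program removes.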
The Appearance and Development of Commercial Laboratories in China.
Independent Medical Laboratories in China - A Sunrise Industry under the Circumstances of Healthcare Reformation.
Tracing the Rise of KingMed and its Future Route - A Correspondence with Hongbo Li.
Establishment of IML Quality Managerial System in China.
The Collaboration between IML and Community Medical Hospitals: Supplementary Service with Tests, Technologies and Beyond.
The Collaboration between IML and Major Medical Institutions - Supplement Service with Esoteric Testing.
Notice from Ministry of Health on Printing and Distributing "Basic Standards for Medical Laboratory (on Trial)" - Ministry of Health of the People's Republic of China.
Yak genome provides new insights into high altitude adaptation.
Gentris and Shanghai Institutes of Preventative Medicine expand collaboration.
Chinese researchers identify rice gene enhancing quality, productivity.
Quintiles opens new Center of Excellence in Dalian to support innovative drug development.
BGI demonstrated genomic data transfer at nearly 10 gigabits per second between US and China.
Quintiles deepens investment in China - New Quintiles China Headquarters and local lab testing solution announced.
Beike earns AABB Accreditation for cord blood and cord tissue banking.
Epigenomic differences between newborns and centenarians provide insight to the understanding of aging.
Metabolic Syndrome and Diabetes: Current Asian Perspectives.
A Crisis in the Development of Antibiotics.
The Marketing of Unapproved Stem Cell Products: An Industry-wide Challenge.
Draining the Goodwill of Science – The Direct-to-Consumer Genetic Testing Industry in East Asia.
Biodiesel – From Lab to Industry.
Cord Blood Banking – To Go Public or Stay Private.
Open Source – The Future of Drug Discovery.
VACCINES – Where are we headed?
Leveraging on External Expertise.
Asia-Pacific: Falling behind in the fight against HIV/AIDS
This paper presents the design and performance of a prototype of a new humanoid arm that has been developed at the LARM2 laboratory of the University of Rome "Tor Vergata". The new arm, called the LARMbot PK arm, is an upper limb designed for the LARMbot humanoid robot. LARMbot is a humanoid robot designed to move freely in open spaces and to adapt to its task environment. Its objective is to transport objects weighing a few kilograms in order to facilitate the restocking of workstations, to manage small warehouses, and to perform other tasks feasible for humanoids. The LARMbot PK arm is based on a parallel tripod structure using linear actuators to provide high agility of movement. It is designed with components that can be found on the market or created by 3D printing, offering a quality-to-price ratio well suited to user-oriented humanoid robots. Experimental tests with the built prototype are discussed to demonstrate the capabilities of the proposed solution in terms of agility, autonomy, and power, validating the LARMbot PK arm as a satisfactory solution for the new upper limbs of the LARMbot humanoid robot.
Emerging pathogens have no known therapies or vaccines and can therefore be controlled only via the traditional methods of contact tracing, quarantine and isolation, which require rapid and widespread testing. The most recent outbreak from an emerging pathogen is due to the highly transmissible SARS-CoV-2 virus causing COVID-19 disease, which produces no symptoms or mild symptoms in 80–90% of infected individuals, while the remainder of patients exhibit severe illness that can be lethal or persist for weeks to months after infection. The first tests to diagnose SARS-CoV-2 infection were developed soon after the genome of the virus became known, and use probes to measure viral RNA by reverse transcriptase-polymerase chain reaction (RT-PCR). These tests are highly sensitive and specific but can take several days to return results, which makes contact tracing, and more generally efforts to control the spread of the infection, very difficult. Furthermore, their sensitivity threshold is orders of magnitude below the viral load necessary for transmission; individuals recovering from the infection may therefore still have a positive test and be required to isolate unnecessarily when they are no longer infectious. Antigen tests were subsequently developed that use antibodies mostly targeted to the nucleocapsid protein of the virus. These tests are about 100 times less sensitive than RT-PCR, yet they detect viral loads about one-tenth of that needed for transmission. Furthermore, such tests are potentially much cheaper than RT-PCR and yield results in 15 min or less. Antibody (serological) testing is also available and can provide useful information about the extent to which a population has been exposed to the virus; however, it is not a good indicator of current infection and is not useful for infection control.
Viral transmission models that incorporate testing and contact tracing show that infection control is much more readily achieved by increasing testing frequency than by using higher-sensitivity tests. For example, compared to no testing at all, testing once every other week has a marginal benefit, while testing weekly can decrease the number of infections to 20–40%, and testing twice weekly or more can bring about a 95%+ reduction in infections. These lessons learned from dealing with the COVID-19 pandemic should guide future planning against potential emerging viruses.
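The frequency effect has a simple back-of-the-envelope intuition (a toy calculation with assumed numbers, not the cited transmission models): with screening every `interval` days and a reporting delay, an infected person is caught on average `interval/2 + delay` days into an infectious window, so the fraction of that window removed from circulation grows quickly as testing becomes more frequent.

```python
def fraction_prevented(interval, delay=1.0, infectious_days=8.0):
    # Assumed values: 1-day result delay, 8-day infectious window.
    # Expected time spent undetected before a periodic test catches
    # the infection, then the share of the infectious window removed.
    time_undetected = interval / 2.0 + delay
    return max(0.0, 1.0 - time_undetected / infectious_days)

for interval in (14.0, 7.0, 3.5):
    print(interval, round(fraction_prevented(interval), 2))
```

Under these assumptions biweekly testing removes essentially nothing, weekly removes roughly 40% of infectious time, and twice-weekly about two-thirds; epidemic dynamics then amplify these per-case reductions nonlinearly, consistent with the much larger population-level reductions reported above.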
This chapter complements the chapters on technical reviews and software reliability engineering in Vol. 1 of the handbook. It is primarily concerned with the verification of code by means of testing, but an example of an informal proof of a program is also given. A practitioner's view of testing is taken throughout, including an overview of how testing is done at Microsoft.
Testing is one of the most important phases in the software development process, often requiring considerable effort and resources. We propose a novel approach for generating test cases, based on requirements specification. We make use of scenarios used in the requirements specification phase, taking into consideration the various relationships that can exist between them. These relationships are represented as dependency diagrams and they play an important role both in requirements specifications and in test case generation. Using our approach we can ensure that a larger proportion of requirements are actually tested.
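One way to realize the ordering implied by scenario dependency diagrams is a topological sort of the dependency graph: scenarios are tested only after the scenarios they depend on. A minimal sketch with hypothetical scenario names, using Python 3.9+'s standard-library `graphlib` (the paper's dependency diagrams capture richer relationships than plain precedence):

```python
from graphlib import TopologicalSorter

# Each scenario maps to the scenarios it depends on (hypothetical names).
deps = {
    "checkout":    {"add_to_cart", "login"},
    "add_to_cart": {"browse"},
    "login":       set(),
    "browse":      set(),
}

# static_order() yields prerequisites before the scenarios that need them,
# giving a valid test execution sequence for the whole diagram.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Any linearization produced this way exercises every scenario while respecting the dependencies, which is one way a larger proportion of interrelated requirements ends up covered by the generated test cases.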
This paper analyzes the source and characterization method of white noise in a fiber-optic current transformer (FOCT) and investigates its statistical properties while the equipment operates in a zero-input-current environment. Time-domain and frequency-domain data are analyzed and compared under different primary input currents. The results indicate that the white noise in the FOCT follows a normal distribution with zero mean, and that the noise magnitude is independent of the primary current value. On that basis, we examine how the white noise affects FOCT test results and provide an effective method to reduce its impact.
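The zero-mean Gaussian finding can be checked on any recorded noise sequence with simple sample statistics. A hedged sketch on synthetic data (not FOCT measurements): generate samples from an assumed Gaussian and verify that the sample mean sits within a few standard errors of zero.

```python
import random
import statistics

# Synthetic stand-in for zero-input FOCT noise records (assumed sigma).
random.seed(0)
samples = [random.gauss(0.0, 0.5) for _ in range(10_000)]

mean = statistics.fmean(samples)
stdev = statistics.stdev(samples)

# For zero-mean white noise, the sample mean should lie within a few
# standard errors (stdev / sqrt(n)) of zero.
standard_error = stdev / len(samples) ** 0.5
print(mean, stdev, standard_error)
```

On real data the same check would be repeated at several primary current levels; a `stdev` that stays flat across levels supports the paper's claim that noise magnitude is independent of the primary current.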