Copper oxide (CuO) nanostructures have gained popularity in glucose biosensor development due to their excellent electrochemical properties, affordability and ease of fabrication. This review examines the progress of glucose biosensors based on CuO nanostructures and contrasts enzymatic and non-enzymatic systems. Enzyme-based glucose oxidase sensors offer high specificity but are hampered by the limited stability of the enzyme across temperature and pH and by interfering species. Non-enzymatic sensors, which oxidize glucose directly at the electrode, offer high sensitivity and long operational lifetimes but struggle with specificity. The main determinants of sensor performance are nanostructure morphology, synthesis technique, surface modification and electrode material. The shift from conventional bulk CuO to engineered nanostructures has significantly improved sensor sensitivity and durability. This review focuses on optimizing these variables to address existing limitations and to advance CuO nanostructures as candidates for next-generation glucose biosensing technologies.
This paper discusses approaches to the search for lead-free functional materials for various applications. Using solid solution systems based on alkali metal niobates as an example, the influence of position on the phase diagram of the corresponding systems, the number of components, and modification with single and combined metal oxides on their characteristics is shown. It has been established that the most effective systems in terms of piezoelectric characteristics are three- and four-component solid solutions located in or near the morphotropic region. A number of materials have been developed for highly sensitive electromechanical transducers, ultrasonic delay lines and other applications.
Some progress in shape problems based on the Helfrich model is reported. Within this model, we have predicted not only an exact solution for the discoidal shape of red blood cells but also a special kind of toroidal vesicle with the ratio of its two generation radii equal to √2. The Helfrich model can also be extended to investigate complex structures in other soft matter, such as the formation of focal conic domains in smectic liquid crystals, the tube-to-sphere transition in peptide nanostructures, and the icosahedral configuration of small virus capsids.
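For reference, the Helfrich model assigns to a membrane of mean curvature H a bending energy whose standard form is

\[
F = \frac{k_c}{2} \oint (2H + c_0)^2 \, dA + \lambda \oint dA + \Delta p \int dV ,
\]

where k_c is the bending rigidity, c_0 the spontaneous curvature, λ the surface tension, and Δp the pressure difference across the membrane; equilibrium shapes such as the discocyte and the √2 torus mentioned above are stationary points of this functional.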
Traditional computer simulation of landscape painting has the defect that it cannot completely reproduce the painting; to address this, research on the application of a local color simulation method for landscape painting based on deep-learning generative adversarial networks is proposed. Based on an improved generative adversarial network design, a hierarchical classification of semantic labels for landscape paintings is carried out. Through the training and design of the generative adversarial network, the generator and discriminator for landscape painting are designed, and a semantic segmentation algorithm for landscape painting based on generative adversarial networks is proposed. Finally, a simulation experiment on the local color simulation of landscape painting is carried out. The results show that this method outperforms traditional computer simulation: it fully reflects the realism of landscape art painting and the integrity of the picture itself, the texture detail of the generated image is sharper than that of the traditional method, and the semantic content is more accurate. It has important practical reference value.
We propose a new iterative method for inducing classification knowledge from symbolic data. Our method generates a hierarchy of clusters and does not assume a particular knowledge representation form (e.g., a conjunctive formula or a linear discriminant function). The method consists of two stages, an optimization stage and a clustering stage. The first stage maps the symbolic problem into the numerical domain based on an optimization approach. In the second stage, the examples are clustered into positive, negative, and fuzzy zones using induced membership degrees. This learning procedure is iterated until the fuzzy zone becomes empty. Our method learns "topological knowledge", which is found to be useful for evaluating the quality of the training data. Further, we demonstrate the usefulness of the method on real-world data from an industrial knowledge acquisition problem.
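To fix ideas, a minimal sketch of such a two-stage iterative loop is given below. The `embed` and `membership` callables stand in for the paper's optimization and clustering stages; they, and the thresholding rule, are assumptions for illustration, not the authors' implementation.

```python
def iterative_fuzzy_induction(examples, labels, embed, membership, threshold=0.75):
    """Sketch: embed symbolic data numerically (stage 1), split examples into
    positive / negative / fuzzy zones by membership degree (stage 2), and
    iterate on the fuzzy zone until it is empty."""
    hierarchy = []
    remaining = list(range(len(examples)))
    while remaining:
        X = embed([examples[i] for i in remaining])            # stage 1: symbolic -> numeric
        mu = membership(X, [labels[i] for i in remaining])     # stage 2: induced membership degrees
        positive = [i for i, m in zip(remaining, mu) if m >= threshold]
        negative = [i for i, m in zip(remaining, mu) if m <= 1 - threshold]
        fuzzy = [i for i in remaining if i not in positive and i not in negative]
        hierarchy.append((positive, negative))                 # one level of the cluster hierarchy
        if len(fuzzy) == len(remaining):                       # no progress: stop to avoid looping forever
            break
        remaining = fuzzy                                      # iterate until the fuzzy zone is empty
    return hierarchy
```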
In manufacturing industry, the quality characteristic under study often has only one of the two specification limits, viz. the upper specification limit (USL) or the lower specification limit (LSL). In such cases the process capability indices (PCIs) designed for bilateral specifications become inappropriate; however, only a few indices are available in the literature to address this problem. In the present paper we make an extensive study of the PCIs for unilateral specifications, with a brief discussion of their possible fields of application and drawbacks, if any. We also propose a logical formulation of a parameter of one of these indices, which reduces the subjectivity of the index and hence makes it more suitable for practical application. An example, with computed values of the various PCIs, is discussed to compare the performance of and inter-relationships between these PCIs. We conclude the paper with a discussion of the future scope of study in this field.
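For context, the most common unilateral capability indices, against which the indices surveyed here are usually compared, are the one-sided counterparts of C_p:

\[
C_{pu} = \frac{USL - \mu}{3\sigma}, \qquad C_{pl} = \frac{\mu - LSL}{3\sigma},
\]

where μ and σ denote the process mean and standard deviation.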
Pulsed laser deposition (PLD) exhibits unique advantages for the preparation of functional thin films, which are widely used in microelectronics, optoelectronics, integrated circuits, superconductors and biomedical fields. The principle and characteristics of PLD are introduced, along with its applications to ferroelectric films, high-temperature superconductors, diamond-like carbon films and superlattices. Future application trends are reviewed.
Nowadays, Enterprise Application Integration (EAI) constitutes a real and growing need for most enterprises, particularly large and dynamic ones. The major problem of EAI is heterogeneity, especially semantic heterogeneity, which is not adequately addressed by today's integration solutions, as they focus mainly on syntactic integration. Dealing with the semantic aspect, which would give EAI greater consistency and robustness, requires appropriate principles such as ontology urbanization and mediation. These principles, which constitute the main focus of this paper, aim to support semantic application integration by allowing the semantics to be correctly captured, structured, mastered and reasoned upon, which currently constitutes a major challenge for enterprises in quest of greater flexibility and manageability.
Amplified detection of nucleic acid by G-quadruplex based hybridization chain reaction.
Dow opens Photovoltaics Films Application Lab in Shanghai.
Researchers discover molecular mechanisms of left-right asymmetric control in the sea urchin.
China mulls new rule on human genetic research.
China to phase out organ donation from executed criminals.
Charles River Laboratories to expand research models business in China.
Chinese Science Academy Chief urges seizing on new technological revolution.
BGI contributes genome sequencing and bioinformatics expertise.
Taiwan government to encourage formation of smaller biotech funds.
Clinical Genomics signs Chinese colorectal screening deal.
WuXi PharmaTech laboratory testing division expands in U.S. with acquisition of XenoBiotic Laboratories.
Roche to invest 450 million Swiss Francs in new diagnostic manufacturing facility in China.
SCYNEXIS, Inc. enters into agreement with Waterstone Pharmaceutical for the development and commercialization of treatment for viral diseases.
Johnson & Johnson Innovation launches Asia Pacific Innovation Center.
ScinoPharm and Nanjing King-friend Biochemical Pharmaceutical team up to jointly enter the Chinese market for new drugs.
Ascletis gains exclusive China market rights from Presidio for clinical stage Hepatitis C virus inhibitor.
Irvine Pharmaceutical Services opens an analytical facility in Hangzhou, China.
Pioneering application of Google Glass by Renji Hospital.
WuXi PharmaTech launches representative office in Israel, forms strategic collaboration with Pontifax.
International Symposium on molecular biology of fruit trees held at Huazhong Agricultural University.
From Home to Hospital: Digitisation of Healthcare.
Microsoft with RingMD, Oneview Healthcare, Vital Images, Aruba, and Clinic to Cloud: The Ecosystem of Healthcare Solutions Providers in Asia.
Data Helps in Improving Nursing Practice, Making Better Decisions.
Launch of Asian Branch for QuintilesIMS Institute.
The study of random dynamical systems involves understanding the evolution of state variables that contain uncertainties and that are usually hidden, or not directly observable. State variables therefore have to be estimated and updated, based on system models, using information from observational data, which are themselves noisy in the sense that they contain uncertainties and disturbances due to imperfections in observational devices and to disturbances in the environment within which the data are collected. The development of efficient data assimilation methods for integrating observational data into predictions of the evolution of random state variables is thus an important aspect of the study of random dynamical systems. In this paper, we consider a particle filtering approach to nonlinear filtering in multiscale dynamical systems. Particle filtering methods [1–3] use ensembles of particles to represent the conditional density of the state variables through particle positions distributed over a sample space. The distribution of the ensemble is updated using observational data to obtain the best representation of the conditional density of the state variables of interest. Homogenization theory [4, 5], on the other hand, allows us to estimate the coarse-grained (slow) dynamics of a multiscale system on a larger timescale without having to explicitly resolve the evolution of the fast variables on the small timescale. The filter convergence results presented in [6] show that the filter of the actual state variable converges to that of a homogenized solution of the original multiscale system, and we develop a particle filtering scheme for multiscale random dynamical systems that exploits this result. This method, called the Homogenized Hybrid Particle Filter, incorporates a multiscale computation scheme, the Heterogeneous Multiscale Method developed in [7], into the novel branching particle filter described in [8–10]. By incorporating a multiscale scheme based on homogenization of the original system, the coarse-grained dynamics are estimated from observational data over a larger timescale, reducing computational time and cost both for the evolution of the state variables and for the functional evaluations required by the filter. We describe the theory behind this combined scheme and its general algorithm, and conclude with an application to the Lorenz-96 [11] atmospheric model, which mimics midlatitude geophysical dynamics with microscopic convective processes.
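As a point of reference for the filtering step, the sketch below shows one generic bootstrap particle-filter update. It is illustrative only: the Homogenized Hybrid Particle Filter replaces `propagate` with a Heterogeneous Multiscale Method step for the slow variables and uses a branching resampling rule rather than the multinomial resampling used here; all names are assumptions.

```python
import numpy as np

def bootstrap_filter_step(particles, weights, propagate, observation, likelihood, rng):
    """One generic particle-filter update: predict with the (coarse) dynamics,
    reweight by the observation likelihood, and resample if weights degenerate."""
    particles = np.array([propagate(p, rng) for p in particles])      # prediction step
    weights = weights * np.array([likelihood(observation, p) for p in particles])
    weights /= weights.sum()                                          # normalized conditional density estimate
    ess = 1.0 / np.sum(weights ** 2)                                  # effective sample size
    if ess < 0.5 * len(particles):                                    # resample when weights degenerate
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```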
Among text-oriented applications, Natural Language Processing (NLP) plays a significant role in managing and identifying particular text data, and NLP is broadly used in text mining as well. In general, text mining is the process of combining diverse techniques to characterise and transform text; hence, syntactic and semantic information are used together in NLP models to support analysing or extracting the text. Text mining models, in turn, are examined with different standard measurements, which vary with the text objective or application. With the advent of machine and deep learning models, text mining has become a hot research area in domains such as classification, recognition, sentiment analysis, and speech-related topics. Although these models are effective and not overly time-consuming, certain factors must still be considered for their further enhancement. This survey paper therefore elucidates the evaluation metrics used in text mining approaches that deploy standard algorithms. It explores the literature on previously implemented text mining approaches with respect to the evaluation metrics used, and analyses the model proposed in each surveyed paper. Further, it provides an algorithmic categorisation of existing research works, discusses the different datasets used together with the various evaluation metrics considered, and finally categorises the metrics used for analysing the performance of such text mining approaches. The survey also illustrates the merits and demerits of existing text-mining approaches. Finally, research gaps and challenging issues are presented to direct future work.
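As an example of the standard metrics surveyed here, classification-oriented text mining is commonly evaluated with precision, recall and their harmonic mean (F1), defined from true positives (TP), false positives (FP) and false negatives (FN) as

\[
\text{Precision} = \frac{TP}{TP + FP}, \qquad
\text{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}} .
\]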
The Ministry of Economy, Trade and Industry (METI) of Japan ran an R&D project on humanoid robotics, called HRP. In the project, a humanoid robotics platform was developed in the first phase, and contributors to the project followed up with research on applications of humanoid robots in various industries. In this paper, we describe five applications that we think are suitable for humanoid robots and that we expect to open up new industries.
Total quality management (TQM) and innovation have long been discussed as major contributors to organizational success. TQM has been discussed with regard to management practices, while innovation is discussed with regard to technological change. However, only a vague relationship has been established between the two, owing to the difficulty of establishing a statistical relationship. The main purpose of this research is to study empirically the application of TQM to innovation. Fourteen companies from Japan, India and Thailand are considered in an exploratory empirical study to uncover the relationship between TQM and innovation. Three factors of TQM (the human factor, the technology factor and the information factor) are considered to assess their effect on process innovation and product innovation. The findings indicate that TQM facilitates the development of innovation capabilities in organizations through harmonized management practices in a sequence of inter-related incremental innovation initiatives. Moreover, the research indicates that the innovation speed achieved through TQM is dramatic compared with traditional innovation. It is not only the speed of innovation achieved using TQM that is striking, however, but also its responsiveness to different market needs.
We present a distance metric learning algorithm for regression problems, which incorporates label information to form a biased distance metric during learning. We use Newton's optimization method to solve the optimization problem that yields this biased distance metric. Experiments show that the method can find the intrinsic variation trend of the data in a regression model from a relatively small number of samples, without any prior assumption about the structure or distribution of the data. In addition, test samples can be projected into this metric by a simple linear transformation, and the method is easy to combine with manifold learning algorithms to improve performance. Experiments are conducted on the FG-NET aging database, the UIUC-IFP-Y aging database, and the CHIL head pose database using Gaussian process regression based on the learned metric, which shows that our method is competitive with the state-of-the-art.
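A generic sketch of label-biased metric learning for regression is given below, only to illustrate the idea of fitting a linear map whose distances track the regression labels. The squared-loss objective and the use of plain gradient descent (rather than the paper's Newton method and objective) are substitutions made here for brevity.

```python
import numpy as np

def learn_biased_metric(X, y, dim_out=2, lr=1e-3, n_iter=500):
    """Fit a linear map L so that distances in the projected space track
    distances between regression labels (illustrative objective only)."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    L = rng.normal(scale=0.1, size=(dim_out, d))
    i, j = np.triu_indices(n, k=1)                       # all sample pairs
    label_gap = np.abs(y[i] - y[j])
    for _ in range(n_iter):
        diff = X[i] - X[j]                               # (n_pairs, d)
        proj = diff @ L.T                                # pairwise differences in metric space
        dist = np.linalg.norm(proj, axis=1) + 1e-12
        resid = dist - label_gap                         # bias the metric toward label distances
        grad = (resid / dist)[:, None, None] * (proj[:, :, None] * diff[:, None, :])
        L -= lr * grad.mean(axis=0)
    return L                                             # test data are projected as X_new @ L.T
```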
Fast chemical reactions involved in nanomaterials synthesis, polymerization, specialty chemicals production, reactive absorption, etc., are often difficult to control in terms of product quality, process efficiency and production consistency. After a theoretical analysis of such processes based on chemical reaction engineering fundamentals, an idea was proposed to intensify micromixing (mixing on the molecular scale) and mass transfer and thereby control the process ideally. Through experimental investigations of mass transfer and micromixing characteristics in the Rotating Packed Bed (RPB, or "HIGEE" device), we achieved uniquely intense micromixing. This led us to the invention of using the RPB as a reactor for the fabrication of nanoparticles (Chen et al., 2000).
An RPB consists mainly of a packed rotator inside a stationary casing. The high-gravity environment created by the RPB, which can be orders of magnitude stronger than normal gravity, causes aqueous reactants passing through the packing to spread or split into micro- or nano-scale droplets, threads or thin films, thereby intensifying mass transfer and micromixing to a level 1 to 3 orders of magnitude higher than in a conventional packed bed.
In 1994, the first RPB reactor was designed to synthesize nanoparticles of CaCO3 through the multiphase reaction between Ca(OH)2 slurry and CO2 gas, and nanoparticles of 15~30 nm mean size with a very uniform particle size distribution were obtained. In 1997, a pilot-scale RPB reactor was successfully set up for operation, and in 2000 the first commercial-scale RPB reactor for the synthesis of such nanoparticles came into operation in China, establishing a milestone in the use of the RPB as a reactor for the fabrication of nanomaterials (Chen et al., 2002).
Since then, the high gravity method has been employed for the synthesis of inorganic and organic nanoparticles via gas-liquid, liquid-liquid, and gas-liquid-solid multiphase reactions, e.g. inorganic nanoparticles like nanosized CaCO3, TiO2, SiO2, ZnO, Al2O3, ZnS, BaTiO3, BaCO3, SrCO3, Al(OH)3 and Mg(OH)2 flame retardants, and organic nano-pharmaceuticals including benzoic acid, salbutamol sulfate and cephradine. This technology received extensive attention in the field of nanomaterials fabrication and application. Dudukovic et al. commented, "The first large-scale application of RPB as a reactor occurred in China in production of nano CaCO3 by HGRP (high gravity reactive precipitation) of carbon dioxide and lime. Uniformly small particles were made in the RPB due to achievement of a sharp supersaturation interface and very short liquid residence times in the device." (Dudukovic et al., 2002). Date et al. said, "HGRP represents a second generation of strategies for nanosizing of hydrophobic drugs. In our opinion, among various methodologies described earlier, supercritical anti-solvent enhanced mass transfer method and HGRP method has potential to become technologies of the future owing to their simplicity, ease of scale-up and nanosizing efficiency" (Date et al., 2004).
As-synthesized nano-CaCO3 was employed as a template to synthesize silica hollow spheres (SHS) with mesostructured walls. Characterization indicated that the obtained SHS had an average diameter of about 40 nm and a surface area of 766~1050 m2·g-1 (Le et al., 2004). SHS was further investigated as a carrier to study the controlled release behavior of Brilliant Blue F (BB), used as a model drug. Loaded inside the inner core and on the surfaces of the SHS, BB was released slowly into a bulk solution over as long as 1140 min, compared with only 10 min for normal SiO2 nanoparticles, thus exhibiting a typical sustained-release pattern without any burst effect. In addition, a higher BET surface area of the carrier, a lower pH value and a lower temperature prolonged BB release from SHS, while stirring speed had little influence on the release behavior, showing the promising future of SHS in controlled drug delivery (Li et al., 2004).
Nano-CaCO3 synthesized by the high gravity method was also employed as a filler to improve the performance of organic materials. By adding CaCO3 nanoparticles into a polypropylene-ethylene copolymer (PPE) matrix, the toughness of the matrix was substantially increased. At a nanosized CaCO3 content of 12 phr (parts per hundred PPE resin by weight), the impact strength of CaCO3/PPE at room temperature reached 61.6 kJ·m-2, 3.02 times that of the unfilled PPE matrix. In the nanosized CaCO3/PPE/SBS (styrene-butadiene-styrene) system, the rubbery phase and the filler phase were independently dispersed in the PPE matrix. As a result of the addition of nanosized CaCO3, the viscosity of the PPE matrix significantly increased. The increased shear force during compounding continuously broke down the SBS particles, reducing the SBS particle size and improving the dispersion of SBS in the polymer matrix, so that the toughening effect of SBS on the matrix was improved. At the same time, the SBS provided the matrix with the good intrinsic toughness required for nanosized inorganic CaCO3 particles to toughen the polymer efficiently, thus fully exhibiting the synergistic toughening effect of nanosized CaCO3 and SBS on the PPE matrix (Chen et al., 2004).
As-prepared nano-CaCO3 was blended with TiO2 and other additives to prepare complex master batches for use in the coloring of polypropylene. It was found that the obtained nano-CaCO3 is an excellent pigment dispersant, which can partially replace TiO2 pigments in polypropylene resin coloring. Nano-CaCO3 can promote the dispersion of TiO2 in the polymer matrix, boosting the whiteness of the material without adversely affecting its UV absorbency (Guo et al., 2004). Studies on the mechanical properties of nano-CaCO3-toughened epoxy resin composites indicated that the impact strength and flexural modulus of the composite improved remarkably when 6 wt.% of nano-CaCO3 was added. Surface treatment of nano-CaCO3 with titanate coupling agents significantly improved the dispersibility of nano-CaCO3 in such a highly viscous matrix (Li et al., 2005).
Electrochemical energy storage and conversion occupy a significant position in meeting the increasing demands of societal development. The design and preparation of high-performance functional semiconductor materials have played a critical role in the development of electrochemical energy storage and conversion. In this context, nanostructures in functional semiconductor materials provide an alternative route by virtue of their unique and promising nanoscale properties. Controlling the geometrical parameters of nanostructures, and thus forming the expected nanomaterials, is very important for the investigation and application of functional semiconductor materials. In this review, we summarize the preparation and applications of such nanostructures for electrochemical energy storage and conversion, and demonstrate simple and controllable design methods for functional materials with ideal nanostructures.
Field-effect transistor (FET)-based biosensors exhibit excellent performance characteristics such as small size, ease of mass production, high versatility, and comparably low cost. In recent years, numerous FET biosensors based on various nanomaterials including silicon nanowires, carbon nanotubes, graphene, and transition metal dichalcogenides have been developed to detect a wide range of biomarkers that play a crucial role in early disease diagnosis, therapeutic monitoring and prognostic assessment. This review provides an overview of the structure, working principle, functionalization strategies and detection factors associated with FET biosensors based on diverse nanomaterials. Additionally, this paper discusses the applications of these diagnostic devices for detecting clinically relevant biomarkers such as nucleic acids, metabolites, proteins, cancer biomarkers, and hormones among others. The concluding section provides a comprehensive overview of the numerous challenges encountered in the widespread implementation of nanomaterial-based FET biosensors for clinical detection, while also presenting the future prospects for these biosensors based on current research advancements.
The carbon nano-onion (CNO), also known as onion-like carbon (OLC), which exhibits multiple enclosed fullerene shells, is one of the most promising carbon nanoforms and has attracted worldwide attention over the past decades due to its exceptional chemical and physical properties, such as non-toxicity, high chemical stability, sufficiently large surface area with low density, superior electronic and thermal conductivities, and visible photoluminescence. Functional CNOs have been applied in energy storage devices, supercapacitors, photovoltaics, light-emitting diodes and bio-imaging technology. Since the first observation of CNOs by transmission electron microscopy, as a byproduct of carbon black synthesis in 1980, numerous experimental and theoretical studies, including impressive practical applications of CNOs, have been intensively developed in modern chemistry. With respect to synthetic techniques, high-temperature annealing of nanodiamond, detonation of high-explosive molecules, arc discharge of graphite, chemical vapor deposition, laser ablation, thermal pyrolysis, hydrothermal carbonization, and microwave pyrolysis have been reported. The synthesis approach plays a key role in determining the structure of CNOs and the resultant properties. This paper reviews the development of CNOs through the major synthesis methods used for a selected wide spectrum of applications, covering both past and current progress. The contents outlined in this review will offer readers comprehensive insights into the design and development of CNO materials.