Secure and private user data have never been more important, given the explosion of online gaming platforms and the resulting deluge of user information. To protect gaming ecosystems and maintain user confidence, Heuristic Predictive Modeling provides a proactive security strategy that allows early detection and mitigation of potential risks. The ever-changing nature of games, the wide variety of user interactions, and the constantly evolving tactics of cybercriminals all contribute to the unique challenges that data management and security face in modern gaming settings. This research proposes Heuristic Predictive Modeling for Gaming Security (HPM-GS), a system that uses advanced algorithms and machine learning techniques to analyze gaming data in real time and detect trends and anomalies that could indicate security breaches. With HPM-GS, gaming platforms can keep their users safe and secure by anticipating and proactively addressing security threats. Several areas of gaming security can benefit from HPM-GS, including user authentication, cheat detection, fraud prevention, and incident response. Incorporating HPM-GS into existing security frameworks allows gaming platforms to strengthen their defenses and efficiently reduce risks, enhancing user experience and platform reliability. Extensive simulation studies assess the effectiveness of HPM-GS in gaming security. Its performance metrics, such as detection accuracy, false positive rate, and response time, are evaluated using real-world datasets and simulated attack scenarios. The simulation findings show that HPM-GS is an effective solution for protecting gaming environments from cyber-attacks, offering a proactive, elastic method for managing and securing gaming application data.
This research aims to highlight the potential of HPM-GS to improve the security posture of online gaming platforms and to ensure that players have a safer and more pleasant gaming experience. It does so by addressing the significance of HPM-GS, potential difficulties, the proposed techniques, implementations, and simulation analysis.
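The abstract does not specify HPM-GS at the code level. As a hedged illustration only, the kind of real-time anomaly flagging it describes can be sketched with a robust z-score over per-user behaviour statistics; the function name `flag_anomalies` and the actions-per-minute feature are assumptions, not the authors' method.

```python
import numpy as np

def flag_anomalies(event_rates, threshold=3.0):
    """Flag users whose event rate deviates from the population
    by more than `threshold` robust z-scores (median/MAD based)."""
    rates = np.asarray(event_rates, dtype=float)
    median = np.median(rates)
    mad = np.median(np.abs(rates - median)) or 1e-9  # guard against zero MAD
    z = 0.6745 * (rates - median) / mad              # robust z-score
    return np.where(np.abs(z) > threshold)[0]

# Example: actions-per-minute for 8 players; one obvious outlier (a bot?)
apm = [42, 38, 45, 40, 39, 44, 41, 400]
print(flag_anomalies(apm))  # → [7]
```

A production system would combine many such features and a learned model rather than a single univariate score; this only shows the flag-and-investigate pattern the abstract alludes to.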
The main idea of this framework is that it is capable of overcoming the drawbacks that are always linked with conventional cloud-based methods. Computation and storage resources in Internet of Things (IoT) networks are distributed closer to the network’s edge; therefore, the amount of data processed in real time is reduced. By decreasing the distance of the data transfers, less bandwidth is used. Structural problems, data safety, interoperability, and resource allocation-related matters denote challenges preventing the successful implementation of those ideas. The proposed work is the cloud-enabled fog computing framework (C-FCF) in data center systems based on the IoT platform. It brings cloud computing to a new level of scalability by compounding the following: Scalable architecture, uniform communication interfaces, the dynamic algorithms that allocate resources, and the data-centered approach, on the one hand, and strong security protocols, on the other. The wireless sensor network (WSN) approach to this technology represents a greater versatility of this system as it can be applied to perform different tasks in various industries like smart cities, healthcare, transportation, and industrial automation services. The application of the given services illustrates C-FCF’s capability of creating innovation, modeling effectiveness, and uncovering potential for integration within the IoT network. The virtual simulation analysis is necessary to validate C-FCF’s effectiveness in real-life scenarios. The simulations provide evidence of features, including low latency, efficient resource utilization, and overall system performance, which underlines the practical aspects of applying C-FCF in different IoT settings. Developing this advanced computing architecture, which can surpass the limitations of conventional technology and the ability to entail many different use cases, will potentially change the data processing and management paradigm in IoT-enabled settings.
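The abstract mentions dynamic resource-allocation algorithms without detailing them. A minimal sketch of one common pattern, greedy least-loaded placement of tasks onto fog nodes, is given below; the function `allocate` and the node names are hypothetical, not part of C-FCF.

```python
import heapq

def allocate(tasks, nodes):
    """Greedy dynamic allocation: each task goes to the currently
    least-loaded fog node (min-heap keyed on accumulated load)."""
    heap = [(0.0, name) for name in nodes]   # (load, node)
    heapq.heapify(heap)
    placement = {}
    for task, cost in tasks:
        load, node = heapq.heappop(heap)     # least-loaded node so far
        placement[task] = node
        heapq.heappush(heap, (load + cost, node))
    return placement

tasks = [("t1", 3.0), ("t2", 1.0), ("t3", 2.0), ("t4", 1.0)]
print(allocate(tasks, ["fog-a", "fog-b"]))
```

Real frameworks would weigh latency, bandwidth, and security constraints, not just load; this only illustrates the dynamic-allocation idea.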
In recent years, intensive computing has been central to investigations in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis, which were considered impossible only a few years ago.
This paper focuses on the strategies in use: it reviews the components necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data produced by a large scientific research project.
The paper also presents several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that must be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
In this paper, we develop an automatic compile-time computation and data decomposition technique for distributed-memory machines. Our method handles complex programs containing perfect and non-perfect loop nests, with or without loop-carried dependences. Applying our algorithms, a program is divided into collections of loop nests (called clusters), such that data redistributions are allowed only between clusters. Within each cluster of loop nests, decomposition and data locality constraints are formulated as a system of homogeneous linear equations, which is solved by polynomial-time algorithms. Our algorithm can selectively relax data locality constraints within a cluster to balance parallelism against data locality. Such relaxations are guided by exploiting the hierarchical program nesting structure, from outer to inner nesting levels, to keep communications at the outermost level possible.
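The abstract's system of homogeneous linear equations is not reproduced here, but the core numerical step, finding all solutions of C x = 0, can be sketched via an SVD null-space computation. The constraint matrix below is a hypothetical example (aligning an array reference A[i][j] with A[j][i]), not one taken from the paper.

```python
import numpy as np

def null_space(C, tol=1e-10):
    """Basis for the solution space of the homogeneous system C x = 0,
    computed via SVD; each returned column is one basis vector."""
    C = np.atleast_2d(np.asarray(C, dtype=float))
    _, s, vt = np.linalg.svd(C)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

# Hypothetical locality constraint for a 2-D loop nest: the alignment
# vector (a1, a2) must satisfy a1 - a2 = 0.
C = [[1.0, -1.0]]
basis = null_space(C)
print(basis.shape)  # one free direction: (2, 1)
```

Each null-space vector corresponds to a legal decomposition direction; an empty null space would force the compiler to relax some constraint, matching the relaxation step the abstract describes.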
In this work, a Friedmann–Robertson–Walker (FRW) universe filled with dark matter (DM) (a perfect fluid with negligible pressure) along with dark energy (DE) in the background of Galileon gravity is considered. Four DE models with different equation of state (EoS) parametrizations are employed, namely the linear, Chevallier–Polarski–Linder (CPL), Jassal–Bagla–Padmanabhan (JBP) and logarithmic parametrizations. From joint analyses of the Stern, Stern + Baryonic Acoustic Oscillation (BAO) and Stern + BAO + Cosmic Microwave Background (CMB) data, we obtain bounds on the free parameters ω0 and ω1 by minimizing the χ² function. The best-fit values and bounds of the parameters are obtained at the 66%, 90% and 99% confidence levels and are shown as closed confidence contours in the figures. For the logarithmic model, unbounded confidence contours are obtained, and hence the model parameters could not be finitely constrained. The distance modulus μ(z) is also plotted against redshift z for the predicted theoretical models with the best-fit parameter values and compared with the observed Union2 data sample and the SNe Type Ia 292 data, showing that our theoretical models are consistent with the observational datasets. The data fitting shows that at lower redshifts (z < 0.3) the SNe Type Ia 292 data fit our theoretical models better than the Union2 data sample, so the SNe Type Ia 292 data are the more favored sample given the present choice of free parameters. The study also shows that the logarithmic parametrization model is less supported by the observational data. Finally, we plot the deceleration parameter against redshift for all the theoretical models and compare the results with the work of Farooq et al. (2013).
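The observational datasets and likelihood of the paper are not reproduced here; as a hedged sketch of the χ² minimization it describes, the snippet below fits the CPL parametrization w(z) = ω0 + ω1 z/(1+z) to synthetic "observations" by brute-force grid search. The data and the helper names are illustrative assumptions only.

```python
import numpy as np

def chi2(model, data, sigma):
    """Standard chi-square: sum of squared, error-weighted residuals."""
    return np.sum(((data - model) / sigma) ** 2)

def best_fit_cpl(z, w_obs, sigma, grid=np.linspace(-2.0, 1.0, 151)):
    """Brute-force chi^2 minimisation of the CPL parametrisation
    w(z) = w0 + w1 * z / (1 + z) over a (w0, w1) grid."""
    best = (None, None, np.inf)
    for w0 in grid:
        for w1 in grid:
            c = chi2(w0 + w1 * z / (1 + z), w_obs, sigma)
            if c < best[2]:
                best = (w0, w1, c)
    return best

# Synthetic "observations" drawn from w0 = -1, w1 = 0.3 (illustrative only)
z = np.linspace(0.1, 1.5, 20)
w_obs = -1.0 + 0.3 * z / (1 + z)
w0, w1, c = best_fit_cpl(z, w_obs, sigma=np.full_like(z, 0.05))
print(round(w0, 2), round(w1, 2))  # → -1.0 0.3
```

Confidence contours such as those in the paper are the level sets of this χ² surface around its minimum (Δχ² thresholds), evaluated over the same grid.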
The design, specification, and preliminary implementation of the SEMAL language, based upon the Calculus of Self-modifiable Algorithms model of computation, are presented. The Calculus of Self-modifiable Algorithms is a universal theory for parallel and intelligent systems that integrates different styles of programming and applies to a wealth of domains of future-generation computers. It incorporates features from logic, rule-based, procedural, functional, and object-oriented programming, and has been designed to be a relatively universal tool for AI, similar to the way Hoare's Communicating Sequential Processes and Milner's Calculus of Communicating Systems are basic theories for parallel systems. The formal basis of this approach is described. The model is used to derive a new programming paradigm, so-called cost languages, and new computer architectures, cost-driven computers. As a representative of cost languages, the SEMAL language is presented.
INDIA – A novel form of gene regulation in bacteria.
INDIA – Algal biofuels are no energy panacea.
JAPAN – Medical Data Vision enhances the quality of medical care with Actian Vectorwise.
SINGAPORE – Singapore heart surgeon to receive honour from The Royal College of Surgeons of Edinburgh.
SINGAPORE – ELGA® to deliver innovative water purification at new Singapore General Hospital expansion.
AUSTRALIA – Specialised Therapeutics Australia: New drug to fight hospital superbug infection.
AUSTRALIA – Group of genes hold the clue in migraine cases.
AUSTRALIA – CT scans can triple risk of brain cancer, leukemia.
BRAZIL – Science can do more for sustainable development.
MIDDLE EAST – Particles and persecution: why we should care about Iranian physicists.
EUROPE – Medicyte coordinates EU-funded collaboration on Biomimetic Bioartificial Liver.
EUROPE – Selvita and Orion Pharma achieve a research milestone in Alzheimer's Disease Program.
EUROPE – Zinforo (ceftaroline fosamil) receives positive CHMP opinion in the European Union for the treatment of patients with serious skin infections or community acquired pneumonia.
USA – Vein grown from girl's own stem cells transplanted.
USA – Hidden vitamin in milk yields remarkable health benefits - Weill Cornell researchers show tiny vitamin in milk, in high doses, makes mice leaner, faster and stronger.
USA – New report finds biotechnology companies are participating in 39% of all projects in development for new medicines and technologies for neglected diseases.
USA – TriReme Medical receives FDA clearance for expanded matrix of sizes of Chocolate PTA balloon catheter.
USA – New data show investigational compound dapagliflozin demonstrated significant reductions in blood sugar levels when added to sitagliptin in adults with type 2 diabetes at 24 weeks, with results maintained over 48 weeks.
USA – Zalicus successfully completes Phase 1 single ascending dose study with Z944, a novel, oral T-Type Calcium Channel Blocker.
USA – Study provides clues to clinical trial cost savings.
Yak genome provides new insights into high altitude adaptation.
Gentris and Shanghai Institutes of Preventative Medicine expand collaboration.
Chinese researchers identify rice gene enhancing quality, productivity.
Quintiles opens new Center of Excellence in Dalian to support innovative drug development.
BGI demonstrated genomic data transfer at nearly 10 gigabits per second between US and China.
Quintiles deepens investment in China - New Quintiles China Headquarters and local lab testing solution announced.
Beike earns AABB Accreditation for cord blood and cord tissue banking.
Epigenomic differences between newborns and centenarians provide insight to the understanding of aging.
SINGAPORE – Intelligent Sensor Informs You to Change a Diaper via SMS
JAPAN – Tokyo Institute of Technology research: Key genetic event underlying fin-to-limb evolution
ISRAEL – Independent Results Show that BiondVax's Universal Flu Vaccine Administered in a Trial 3 Years ago Improves Immunogenicity against Current Flu H3N2 Epidemic
UNITED KINGDOM – Imperial Innovations Launches Orthonika: A Novel Knee Meniscus Replacement
UNITED KINGDOM – New Vaccine For Chlamydia to Use Synthetic Biology
CANADA – Aeterna Zentaris Announces Data and Safety Monitoring Board Scheduled to Complete Second Interim Analysis of the ZoptEC Phase 3 Trial in Endometrial Cancer in Early October
UNITED STATES – NueMD Launches Free ICD-10 Training Tool Ahead of October 1 Deadline
UNITED STATES – A new hope for Moderate and Severe Dementia: Upsher-Smith receives FDA approval for generic version of Namenda (Memantine HCL) Tablets
UNITED STATES – FDA Approves U.S. Product Labeling Update for Sprycel® (dasatinib) to Include Five-Year First-Line and Seven-Year Second-Line Efficacy and Safety Data in Chronic Myeloid Leukemia in Chronic Phase
From Home to Hospital: Digitisation of Healthcare.
Microsoft with RingMD, Oneview Healthcare, Vital Images, Aruba, and Clinic to Cloud: The Ecosystem of Healthcare Solutions Providers in Asia.
Data Helps in Improving Nursing Practice, Making Better Decisions.
Launch of Asian Branch for QuintilesIMS Institute.
Cellular Biomedicine Group (CBMG) and GE Healthcare Life Sciences China Announce Strategic Partnership to Establish Joint Technology Laboratory to Develop Control Processes for the Manufacture of CAR-T and Stem Cell Therapies.
Zuellig Pharma to Invest over $50 Million in Singapore-Based Innovation Centre.
Holmusk: Using Data to Improve Clinical Outcomes for Cardiovascular Disease in Singapore.
Singapore Eye Bank Sets Another Record in Local Cornea Donations in 2016.
Plasticell and King’s College London to Collaborate in Trials of Blood Platelet Substitute.
Merck Partners with University of California, San Diego (UCSD) to Fight Neglected Tropical Diseases.
Mundipharma Wins Approval for Antineoplastic Agent mundesine® as Treatment for Relapsed/Refractory Peripheral T-cell Lymphoma in Japan.
Asian Myeloma Network (AMN) Brings Clinical Trials to Cancer Patients in Asia and Provides Early Access to Effective Drugs.
APBN Interview with Professor Chng Wee Joo.
For the month of February 2022, APBN looks at the many ways research is paving the way for improved cancer treatment outcomes. In Features, Kenneth Tan, President of Asia-Pacific, Japan and India at Varian, shares how we can accelerate the path from cancer diagnosis to therapy, improving survivorship. Then, Lance Kawaguchi, CEO of Cure Brain Cancer Foundation, writes about the importance of cancer research, its challenges, and how the Foundation is working to support brain cancer research. For our last feature, we have Dr Ahmed Abdelal, Global Head Medical Affairs Interventional Imaging and North America Head of Medical & Regulatory Affairs, Guerbet, and Dr Binta Patel, North America Medical Affairs Manager, Guerbet, who give us a treatment update for hepatocellular carcinoma. Other topics covered in this issue include the importance of trust in the pharmaceutical industry and how companies can improve trust with technology-led transparency, how we may learn from the pandemic and better prepare for the next unknown crisis, a discussion on the incidence of Omicron and Delta variants in two case studies, and an interview with a spokesperson from Novartis on their newly approved cholesterol drug.
For the months of March and April 2022, APBN looks at how we can improve and support women's reproductive health. In Features, we have Dr Samuel Prien and Dr Linsay Penrose from Texas Tech University Health Sciences on how we may select the embryos that provide the best chances for pregnancy; Dr Alok Javali and Dr Nicolas Rivron from IMBA-Institute of Molecular Biology in Vienna, Austria on how blastoids can be used as a drug discovery tool to improve IVF procedures; and Michelle Tan Min Shuen on reproductive genetic technology. In Columns, Sumir Bhatia, President, Asia Pacific, Lenovo Infrastructure Solutions Group, talks about the future of genome sequencing, and Justin Loh, Country Director for Singapore, Veritas Technologies, tells us how we may protect patient data from data breaches. In Spotlights, we recap the highlights from the Future of Healthcare Week, discussing the path forward in a post-COVID world.
Artificial intelligence and robots are changing the economic and entrepreneurship environment in the current industrial revolution. Artificial intelligence and robotics have become prevalent in modern economic, professional, social, and daily life. Owing to its ability to renew business processes, develop innovative ideas, services, and products, and resolve difficult tasks, entrepreneurship has experienced massive development. Significant changes are occurring in entrepreneurship and economic growth due to artificial intelligence. Therefore, this paper aims to understand the overall effect and components of data entrepreneurship with the help of an artificial intelligence-based feasibility evaluation model (AI-FEM). The robot, edge, and physical resource layers are described in depth. We first deploy an edge node near the data sources to combine multiple devices' interfaces and act as a raw data filter. The framework then comprises opportunity recognition, opportunity development, and opportunity implementation processes. This paper aims to develop a basic framework for evaluating AI's potential implications for the interaction between entrepreneurship and economic growth. Industrial robots reduce basic labor costs but increase hourly compensation, suggesting that their productivity-enhancing advantage matches their wage-increasing influence. The results show that the system is feasible and performs better in real-time processing and network transmission than existing approaches in AI-based industry scenarios. The experimental results of AI-FEM show a high performance ratio of 95.5%, a productivity ratio of 96.3%, a reliability ratio of 93.4%, an employment rate of 92.6%, an efficiency ratio of 93.6%, an industrial management ratio of 90.3%, and a cost-effectiveness ratio of 20.3% compared to other methods.
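The edge node's raw-data filtering role is described only at a high level. A minimal sketch of such a filter, dropping physically implausible sensor readings locally so only usable values are forwarded upstream, is shown below; the function name and range limits are assumptions, not part of AI-FEM.

```python
def edge_filter(readings, lo, hi):
    """Edge-node raw-data filter: discard out-of-range sensor readings
    locally so only plausible values are forwarded to the cloud."""
    return [r for r in readings if lo <= r <= hi]

# Temperature stream with two sensor faults; valid range -40..85 °C assumed
print(edge_filter([21.5, 22.0, -999.0, 23.1, 500.0], lo=-40.0, hi=85.0))
# → [21.5, 22.0, 23.1]
```

Filtering at the edge is what yields the bandwidth and real-time gains the abstract claims, since faulty samples never cross the network.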
Entrepreneurship research has been criticized for a lack of methodological rigor, although evidence suggests that from a methodological perspective, it is improving (Davidsson, 2006). In this paper, we systematically review the methods used in the study of nascent entrepreneurs to identify challenges associated with the data used in these studies. We also review the field's achievements — notably, the successful use of representative sampling of populations of nascent entrepreneurs — and we raise concerns about the predominant use of secondary data sets and the use of scales originally developed for large, established firms. Drawing on methodological advancements in other fields, we offer suggestions related to study design, data collection, sampling and measurement. Although some of the challenges we note are inherent to the nature of entrepreneurship, we hope our discussion can help researchers design better studies and better interpret their findings.
The number of countries that have adopted International Financial Reporting Standards (IFRS) in some form has grown each year. However, the existing literature generally ignores the varied types and the complex timing of IFRS adoption. Our paper provides a cross-reference of IFRS adoption dates and types for 195 countries and territories around the world. This definitive data, including an extensive online dataset, was developed to help researchers better identify IFRS adoption events in the samples used in their empirical studies. Additionally, we highlight potential challenges in identifying IFRS adoption types and dates, as well as areas of future research that can benefit from our dataset, which can be accessed online at https://about.illinoisstate.edu/mktrimb/song-trimble-2022-dataset/.
Strategic Environmental Assessment (SEA) is the process through which the impacts of plans and programmes on the environment are assessed. Objectives, targets and indicators are the tools through which these environmental impacts can be measured. The same objectives, targets and indicators may be used for all planning levels, but it is also necessary to identify additional plan-specific ones. We used a workshop-based approach to provide an interface between planners and environmental scientists and to give examples of objectives, targets and indicators for biodiversity, water, air and climatic factors that could be used in SEA for national, regional and local plans. In addition, we highlight the need for careful consideration when selecting these variables, which will result in a more rigorous and robust SEA. This is a challenging process, but once completed it will maximise resources and reduce the workload later in the SEA process.
We present a thorough empirical analysis of market impact on the Bitcoin/USD exchange market using a complete dataset that allows us to reconstruct more than one million metaorders. We empirically confirm the "square-root law" for market impact, which holds over four decades of order size in spite of the quasi-absence of statistical arbitrage and market-making strategies. We show that the square-root impact holds during the whole trajectory of a metaorder and not only for the final execution price. We also attempt to decompose the order flow into an "informed" and an "uninformed" component, the latter leading to an almost complete long-term decay of impact. This study sheds light on the hypotheses and predictions of several market impact models recently proposed in the literature and promotes heterogeneous agent models as promising candidates to explain price impact on the Bitcoin market and, we believe, on other markets as well.
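The paper's metaorder dataset is not available here; as a hedged illustration of how the square-root law is typically tested, the snippet below fits the exponent of impact = A · Q^δ by log-log regression on synthetic data generated with δ = 0.5. All names and numbers are illustrative assumptions.

```python
import numpy as np

def fit_impact_exponent(Q, impact):
    """Fit impact = A * Q**delta by least squares in log-log space;
    returns (delta, A)."""
    logQ, logI = np.log(Q), np.log(impact)
    delta, logA = np.polyfit(logQ, logI, 1)
    return delta, np.exp(logA)

# Synthetic metaorder sizes spanning four decades, with noisy sqrt impact
rng = np.random.default_rng(0)
Q = np.logspace(0, 4, 200)
impact = 0.1 * np.sqrt(Q) * np.exp(rng.normal(0, 0.05, Q.size))
delta, A = fit_impact_exponent(Q, impact)
print(round(delta, 1))  # → 0.5
```

An estimated δ ≈ 0.5 across the whole size range is what "the square-root law holds over four decades" means operationally.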
3D scanning technologies are deployed to develop digital 3D models for Additive Manufacturing (AM) applications. A scanner collects data and turns it into a 3D model that can be used by designated 3D printing processes. Many scanners, ranging from low-cost alternatives to far more accurate and reliable professional series, are now available to help bring designs to reality. 3D scanning solutions enable accurate measurement of physical parts into the virtual world, allowing factory production teams and corporate offices to share critical design information. These techniques are used throughout the design process, including product design and development, reverse engineering, quality control and quality assurance. The manufacturing sector can decrease costs while accelerating time to market and resolving quality issues. This study investigates the metrological needs arising from advances in 3D scanners. The procedural steps of 3D scanning, along with specific metrological components and soft tools, are discussed briefly. Finally, various 3D scanning applications are identified and discussed in detail. Because of the overall relative advantages of these non-contact measurement techniques, 3D metrological tools are crucial for modern production. Almost every sector aims for smaller, more complex components, which are more vulnerable to contamination or damage from even the slightest touch of a probe. The market is driven by global Research and Development (R&D) investment to develop game-changing technologies and solutions. Precision inspection and quality control are significant drivers of industry progress. Smart factories will have lifetime access to 3D metrological data, allowing them to enhance quality and gain a competitive advantage in the marketplace.