With the rapid development of computer networks and information technology, government departments, financial institutions, and enterprises have become increasingly dependent on software, and software security has become a focus of attention. In this context, how to evaluate software security effectively has become an important research question for institutions both domestically and abroad. Building on an exploration of how software is defined, this paper analyzes the security of software in computer networks and the threats it faces, introduces Embedded Neural Networks (ENNs) as an evaluation tool, and combines them with the Fuzzy Analytic Hierarchy Process (FAHP) to explore a new method for software security risk assessment. By exploiting the strong pattern recognition capability of ENNs, software logs, system call sequences, and other data can be classified and analyzed to distinguish normal from abnormal behavior, which is crucial for identifying security incidents such as malware and unauthorized access. ENNs are designed with resource constraints in mind and can reduce energy consumption while maintaining performance; for software systems that must run for long periods, this means higher security and stability. Practice has shown that combining ENNs with FAHP evaluates software security more scientifically and effectively. The method not only improves the accuracy and efficiency of the evaluation, but also provides a more solid theoretical foundation and technical support for software security protection.
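To make the FAHP side of such a scheme concrete, the sketch below computes crisp criterion weights from a triangular-fuzzy pairwise comparison matrix using Buckley's geometric-mean variant of fuzzy AHP. The three criteria and the comparison judgements are hypothetical placeholders, not values from the paper.

```python
# Fuzzy AHP (Buckley's geometric-mean variant), minimal sketch.
# Each pairwise judgement is a triangular fuzzy number (l, m, u).

def fuzzy_geo_mean(row):
    """Element-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    prod = [1.0, 1.0, 1.0]
    for (l, m, u) in row:
        prod[0] *= l
        prod[1] *= m
        prod[2] *= u
    return tuple(p ** (1.0 / n) for p in prod)

def fahp_weights(matrix):
    """Crisp criterion weights from a triangular-fuzzy comparison matrix."""
    geo = [fuzzy_geo_mean(row) for row in matrix]
    # Defuzzify each fuzzy weight by the centroid (l + m + u) / 3, then normalise.
    crisp = [(l + m + u) / 3.0 for (l, m, u) in geo]
    total = sum(crisp)
    return [c / total for c in crisp]

# Hypothetical judgements for three risk criteria A, B, C
# (A strongly outweighs B, B moderately outweighs C).
M = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]
weights = fahp_weights(M)
```

In a full assessment pipeline, weights like these would scale the per-criterion risk scores produced by the neural classifier before aggregation.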
This study examines the impact of information and communication technologies (ICTs) and innovation on the export likelihood of small and medium enterprises (SMEs), using a two-stage instrumental-variable logistic estimator on a sample of 6,844 ASEAN SMEs. Adopting ICTs allows SMEs to overcome the constraints they face when exporting, while innovation allows them to gain a competitive advantage. Meanwhile, the ASEAN economies are committed to regional integration and have implemented policies to develop the SME and ICT sectors. Results indicate that both ICTs and innovation contribute positively to export likelihood, although the effect of ICTs is larger. Furthermore, the results show that ICTs can overcome the export constraints faced by marginalized businesses and can also enhance export likelihood in the manufacturing industry. Policy implications are discussed.
The slogan "information is power" has undergone a slight change: today, the focus of interest is "information updating". The largest source of information today is the World Wide Web, and fast search methods are needed to exploit this enormous resource. In this paper we describe our novel crawler, which uses support vector classification and on-line reinforcement learning. We launched crawler searches from different sites, including sites that offer, at best, very limited information about the search subject; this case may correspond to typical searches by non-experts. Results indicate that the considerable performance improvement of our crawler over other known crawlers is due to its on-line adaptation property. We used the crawler to characterize basic topic-specific properties of WWW environments and found that topic-specific regions have a broad distribution of valuable documents. Expert sites are excellent starting points, whereas mailing lists can form traps for the crawler. These properties of the WWW, together with the emergence of intelligent "high-performance" crawlers that monitor and search for novel information, predict a significant increase in communication load on the WWW in the near future.
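The on-line adaptation idea can be sketched as a priority frontier whose URL scores come from a linear model updated after every fetch. This toy stands in for the paper's SVM classifier and reinforcement learner; the features, rewards, and URLs below are all illustrative.

```python
# Toy adaptive crawl frontier: URLs are ordered by a linear relevance
# score; weights move toward the features of pages that earned reward.
import heapq

class AdaptiveFrontier:
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.lr = lr
        self.heap = []  # max-heap realised by negating scores

    def score(self, feats):
        return sum(wi * xi for wi, xi in zip(self.w, feats))

    def push(self, url, feats):
        heapq.heappush(self.heap, (-self.score(feats), url, feats))

    def pop(self):
        _, url, feats = heapq.heappop(self.heap)
        return url, feats

    def update(self, feats, reward):
        # On-line delta-rule update after fetching a page and observing reward.
        pred = self.score(feats)
        for i, xi in enumerate(feats):
            self.w[i] += self.lr * (reward - pred) * xi

frontier = AdaptiveFrontier(n_features=2)
frontier.push("http://example.org/on-topic", [1.0, 0.0])
frontier.push("http://example.org/off-topic", [0.0, 1.0])
# Simulated feedback: pages with the first feature earn reward 1, others 0.
for _ in range(20):
    frontier.update([1.0, 0.0], 1.0)
    frontier.update([0.0, 1.0], 0.0)
frontier.push("http://example.org/new-on-topic", [1.0, 0.0])
url, _ = frontier.pop()
```

After adaptation, a freshly discovered on-topic URL jumps ahead of earlier, unscored entries, which is the mechanism behind the crawler's improvement over static-ranking competitors.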
An external scanning ion microbeam system has been developed for in-air micro-PIXE analysis at JAERI Takasaki, and the analysis system has been widely used in various fields of research in recent years. The system consists of the external scanning ion microbeam system, a multi-parameter data acquisition system, a file transfer protocol (FTP) server, and analysis software. The software provides a graphical user interface for interaction between users and the experimental setup. The server is connected to the Internet and allows remote users to access the experimental data.
We present preliminary results of active measurements of Internet traffic. The main goal is to measure the effectiveness of the hierarchical caches that have been deployed in a number of countries in recent years using the Squid software. Passive measurements could not give any conclusive answer, so a procedure of active measurements was developed. First, we performed active measurements (experiments) of Internet traffic by sending http-requests taken from log files. This makes comparative measurements possible, which are usually more accurate. Next, a more sophisticated procedure was proposed, using a triangle of cache servers in which one server acts as the sole manager for the other two; the latter can be configured differently and typically serve only the odd and even queries, respectively. This procedure is probably the most precise experimental setup for a comparative study of server strategies that can be realized with standard hardware and software.
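The odd/even splitting that makes the triangle setup comparative can be sketched as a simple dispatcher: the manager replays a request log and alternates between the two sibling caches, so both configurations see statistically identical workloads. Request handling is stubbed out here; the log entries are illustrative.

```python
# Sketch of the manager in the cache-triangle experiment: odd-numbered
# queries go to cache A, even-numbered queries to cache B.

def dispatch(requests, serve_a, serve_b):
    """Replay a request log, alternating queries between two caches."""
    results = []
    for i, req in enumerate(requests, start=1):
        handler = serve_a if i % 2 == 1 else serve_b
        results.append(handler(req))
    return results

# Stand-in for requests replayed from a Squid access log.
log = [f"http://example.org/page{i}" for i in range(6)]
seen_a, seen_b = [], []
dispatch(log, seen_a.append, seen_b.append)
```

Because the two caches receive interleaved halves of the same trace, any difference in hit rate can be attributed to their configurations rather than to workload variation.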
Long-range interactions are introduced to a two-dimensional model of agents with time-dependent internal variables ei = 0, ±1 corresponding to valencies of agent emotions. Effects of spontaneous emotion emergence and emotional relaxation processes are taken into account. The valence of agent i depends on the valencies of its four nearest neighbors, but it is also influenced by long-range interactions corresponding to social relations developed, for example, by Internet contacts with a randomly chosen community. Two types of such interactions are considered. In the first model the community's emotional influence depends only on the sign of its momentary emotion. When the coupling parameter approaches a critical value, a phase transition takes place, and as a result, for larger coupling constants the mean group emotion of all agents is nonzero over long time periods. In the second model the community's influence is proportional to the magnitude of the community's average emotion. The ordered emotional phase was observed here only for a narrow set of system parameters.
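The first model variant can be sketched as an Ising-like Monte Carlo simulation: agents on a torus take valences in {-1, 0, +1}, driven by their four nearest neighbours plus a long-range term that depends only on the sign of a random community's mean emotion. The update rule, relaxation probability, community size, and parameter values below are simplifications for illustration, not the paper's exact dynamics.

```python
# Simplified sketch of the lattice emotion model with long-range
# community coupling (sign-only variant).
import random

def step(grid, L, J, p_relax=0.05, rng=random):
    i, j = rng.randrange(L), rng.randrange(L)
    if rng.random() < p_relax:          # emotional relaxation to the neutral state
        grid[i][j] = 0
        return
    # Field from the four nearest neighbours on the torus.
    nn = (grid[(i - 1) % L][j] + grid[(i + 1) % L][j]
          + grid[i][(j - 1) % L] + grid[i][(j + 1) % L])
    # Long-range coupling: only the sign of a random community's mean emotion.
    members = [grid[rng.randrange(L)][rng.randrange(L)] for _ in range(10)]
    mean = sum(members) / len(members)
    field = nn + J * (1 if mean > 0 else -1 if mean < 0 else 0)
    grid[i][j] = 1 if field > 0 else -1 if field < 0 else grid[i][j]

random.seed(0)
L, J = 20, 2.0
grid = [[random.choice((-1, 0, 1)) for _ in range(L)] for _ in range(L)]
for _ in range(20000):
    step(grid, L, J)
mean_emotion = sum(map(sum, grid)) / (L * L)
```

Sweeping J in such a simulation and tracking the long-time average of `mean_emotion` is how the transition to a nonzero group emotion would be located.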
The traditional Gibrat hypotheses were once used to model the topological fluctuations of the Internet. Although they seem to reproduce the scaling relation of the Internet's degree distribution, the detailed micro-dynamics have never been empirically validated. Here, we analyze the distribution of degree growth rates of the Internet over various time scales. We find that, in contrast to the traditional Gibrat assumptions, none of the degree growth rates is normally distributed; instead, the distribution decreases exponentially in its body and decays as a power law in its tail. Moreover, the observed growth rate distribution turns out to be independent of the initial degree when the time interval is enlarged to a year. Our observations are not consistent with the traditional Gibrat-law model and suggest a more complex fluctuation mechanism underlying the evolution of the Internet.
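The core measurement can be sketched directly: given node degrees at two snapshots, compute logarithmic growth rates r = ln(k₁/k₀) and group them by initial degree to test whether the rate distribution depends on k₀ (Gibrat's law predicts it does not). The degree snapshots below are synthetic, generated purely to exercise the code.

```python
# Degree-growth-rate measurement in the Gibrat setting, minimal sketch.
import math
import random

def growth_rates(k0, k1):
    """Log growth rates for nodes present in both degree snapshots."""
    return {n: math.log(k1[n] / k0[n]) for n in k0 if n in k1 and k0[n] > 0}

def bucket_by_initial_degree(rates, k0, edges=(1, 10, 100)):
    """Group growth rates by initial-degree band to compare their distributions."""
    buckets = {e: [] for e in edges}
    for n, r in rates.items():
        for e in reversed(edges):
            if k0[n] >= e:
                buckets[e].append(r)
                break
    return buckets

# Synthetic snapshots: 500 nodes, multiplicative lognormal degree noise.
random.seed(1)
k0 = {n: random.randint(1, 200) for n in range(500)}
k1 = {n: max(1, round(k * math.exp(random.gauss(0, 0.3)))) for n, k in k0.items()}
rates = growth_rates(k0, k1)
buckets = bucket_by_initial_degree(rates, k0)
```

Comparing the empirical shape of each bucket's rate distribution (exponential body, power-law tail versus Gaussian) is then the test the abstract describes.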
We present a fragmentation model that generates almost any inverse power-law size distribution, including dual-scaled versions, consistent with the underlying dynamics of systems with earthquake-like behavior. We apply the model to explain the dual-scaled power-law statistics observed in an Internet access dataset that covers more than 32 million requests. The non-Poissonian statistics of the requested data sizes m and the amount of time τ needed for complete processing are consistent with the Gutenberg–Richter law. Inter-event times δt between subsequent requests are also shown to exhibit power-law distributions consistent with the generalized Omori law. Thus, the dataset is similar to earthquake data except that two power-law regimes are observed. Using the proposed model, we are able to identify the underlying dynamics responsible for generating the observed dual power-law distributions. The model is universal enough to be applicable to any physical or human dynamics limited by finite resources such as space, energy, time, or opportunity.
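A small illustration of the statistic such analyses rest on: the maximum-likelihood (Hill) estimate of a continuous power-law exponent, α̂ = 1 + n / Σ ln(xᵢ/x_min), applied here to a synthetic sample drawn by inverse-transform sampling from a pure power law. The sample is artificial; recovering the planted exponent only demonstrates the estimator, not the paper's dataset.

```python
# Power-law exponent estimation on synthetic inter-event times.
import math
import random

def sample_power_law(alpha, xmin, n, rng):
    """Inverse-CDF sampling: x = xmin * (1 - u) ** (-1 / (alpha - 1))."""
    return [xmin * (1 - rng.random()) ** (-1.0 / (alpha - 1)) for _ in range(n)]

def hill_alpha(xs, xmin):
    """Continuous MLE (Hill estimator) for the power-law exponent above xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

rng = random.Random(42)
data = sample_power_law(2.5, 1.0, 20000, rng)
alpha_hat = hill_alpha(data, 1.0)
```

For a dual-scaled distribution like the one in the abstract, the same estimator would be applied separately to the two regimes, with x_min chosen at the crossover.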
The Space Telescope Science Institute (STScI) makes available a wide variety of information concerning the Hubble Space Telescope (HST) via the Space Telescope Electronic Information Service (STEIS). STEIS is accessible via anonymous ftp, gopher, WAIS, and WWW. The information on STEIS includes how to propose for time on the HST, the current status of HST, reports on the scientific instruments, the observing schedule, data reduction software, calibration files, and a set of publicly available images in JPEG, GIF, and TIFF format. STEIS serves both the astronomical community and the larger Internet community. WWW is currently the most widely used interface to STEIS. Future developments on STEIS are expected to include larger amounts of hypertext, especially HST images and educational material of interest to students, educators, and the general public, as well as the ability to query proposal status.
TechTools™ is a professional development program for science and mathematics teachers aimed at promoting a constructivist pedagogy with modern technologies: probeware, image processing, multimedia, e-mail, and the WWW. We report preliminary results on (1) changes in teachers' use of technology tools, classroom pedagogy, and attitude, and (2) systemic ingredients that catalyze or inhibit the technology reformation necessary in the educational system.
Smart cards are highly successful thanks to their unique combination of mobility and security. Based upon a single-chip microcontroller with volatile and non-volatile memories, a smart card implements a small computer system that is very portable (credit card size), easy to use, and extremely resistant to external attacks.
However, today's smart cards use proprietary protocols, application schemes, and development tools. This is due to the limitations of current technology, and it leads to a situation of "splendid isolation" in which smart cards are not regarded as an integral part of the overall IT architecture.
In this paper, we describe recent research towards "next generation" smart cards. It combines an advanced programming language (Java), novel hardware architectures that provide the required "MIPS budget" (32-bit RISC), and an implementation of key Internet protocols (IP, HTTP) on smart cards.
As a result, we show how smart cards can be seamlessly integrated within a distributed computing environment.
Network disasters induced by cascading failures have traumatized modern societies, and how to protect networks and improve their robustness against cascading failures has become a key issue. To this end, we construct a cascading model and propose an efficient mitigation strategy against cascading failures. Many real-world networks contain heterogeneous nodes, for example the hosts and the routers in the Internet, or the users and the supply-grid stations in the power grid; in previous studies, however, few cascading models were constructed to describe this fact. Including two types of nodes in a network, we present a new cascading model. We introduce a new mitigation strategy that dynamically adjusts the load, and demonstrate its efficiency on the Barabási–Albert (BA) network, the power grid, and the Internet. We show that with small dynamic adjustments of the load, the robustness of these networks can be improved dramatically. Our results are useful not only for protecting networks from a local perspective, but also for significantly improving the robustness of existing infrastructure networks.
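The generic mechanism such models build on can be sketched in a few lines: each node's initial load is its degree, its capacity is (1 + β) times that load, and the load of a failed node is shared equally among its surviving neighbours. This is the common load-capacity cascade setting, not the paper's two-node-type model or its mitigation strategy; the toy network is illustrative.

```python
# Illustrative load-redistribution cascade on a toy network.

def cascade(adj, beta, seed):
    """Fail `seed`, redistribute load to neighbours, return all failed nodes."""
    load = {n: float(len(nb)) for n, nb in adj.items()}   # initial load = degree
    cap = {n: (1 + beta) * l for n, l in load.items()}    # tolerance parameter beta
    failed = {seed}
    frontier = [seed]
    while frontier:
        nxt = []
        for f in frontier:
            alive = [n for n in adj[f] if n not in failed]
            if not alive:
                continue
            share = load[f] / len(alive)                  # equal sharing
            for n in alive:
                load[n] += share
                if load[n] > cap[n] and n not in failed:
                    failed.add(n)
                    nxt.append(n)
        frontier = nxt
    return failed

# Toy network: hub 0 connected to nodes 1..6, which also form a ring.
adj = {0: set(range(1, 7))}
for i in range(1, 7):
    adj[i] = {0, i % 6 + 1, (i - 2) % 6 + 1}
failed_small_beta = cascade(adj, beta=0.1, seed=0)   # low tolerance: global collapse
failed_big_beta = cascade(adj, beta=2.0, seed=0)     # high tolerance: failure contained
```

The contrast between the two runs shows why robustness studies sweep the tolerance parameter: a small β lets the hub's failure take down the whole network, while a large β confines the damage to the seed node.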
In order to study the development trend of the Internet's role in supporting enterprise innovation, a new method based on a deep learning algorithm and knowledge graph technology is proposed. Experiments show that the accuracy, F1 score, recall, and precision of the algorithm are distinctly improved compared with existing algorithms. The new algorithm is applied to analyze the innovation evolution of Chinese enterprises, using papers, patents, and other documents published between 1905 and 2020 as data sources. Based on the experimental results, the development can be divided into five stages, focused respectively on product R&D innovation, manufacturing innovation, marketing innovation, resource allocation innovation, and organizational innovation. It can be seen that the development of the Internet-supported enterprise innovation system is an evolutionary process from point to line, to surface, to body, and finally to ecosystem.
The Internet platform has become the most popular one on which to build integrated applications. This paper describes the design and implementation of an innovative e-Forecasting application over the Internet. The application delivers time-series forecasts on-line via the Internet, and the user can choose among classic forecasting techniques, including the Theta model. This study discusses why such an application is innovative and interesting to the IT community. The proposed architecture implements an advanced e-forecasting application and also offers web-service interfaces for third-party applications. This dual functionality makes the proposed architecture both extensible and robust.
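As a point of reference for one of the techniques on offer, the sketch below implements the classical Theta method for a univariate series: the forecast combines the linear-trend extrapolation (the θ = 0 line) with simple exponential smoothing of the θ = 2 line. The smoothing constant and the demo series are illustrative; the production system described in the paper is not reproduced here.

```python
# Classical Theta method, minimal sketch (no seasonality handling).

def linear_fit(y):
    """Least-squares line a + b*t over t = 0..n-1."""
    n = len(y)
    xb = (n - 1) / 2.0
    yb = sum(y) / n
    num = sum((i - xb) * (yi - yb) for i, yi in enumerate(y))
    den = sum((i - xb) ** 2 for i in range(n))
    b = num / den
    return yb - b * xb, b

def ses_level(y, alpha=0.5):
    """Final level of simple exponential smoothing (flat forecast)."""
    level = y[0]
    for v in y[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

def theta_forecast(y, h=1):
    a, b = linear_fit(y)
    trend = [a + b * i for i in range(len(y))]
    theta2 = [2 * yi - ti for yi, ti in zip(y, trend)]   # theta = 2 line
    ses = ses_level(theta2)
    # Average the extrapolated theta = 0 line with the SES forecast.
    return [0.5 * ((a + b * (len(y) - 1 + step)) + ses) for step in range(1, h + 1)]

series = [10.0, 12.0, 13.0, 15.0, 16.0, 18.0]
fc = theta_forecast(series, h=2)
```

Exposing a function with this shape (series in, forecasts out) behind a web-service interface is exactly the kind of third-party integration the architecture's dual functionality enables.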
The Summary-based Object-Oriented Reuse Library System (SOORLS) was developed to support both librarians who manage databases of object-oriented reusable components, and software developers who intend to use these components to develop software on the Web. This paper presents the library management functions implemented by SOORLS, with a focus on a software reuse approach based on the summary contents of the library. The cluster-based classification scheme proposed in this paper alleviates the labor-intensive domain analysis problem often associated with traditional facet-based classification schemes. We then concentrate on the facilities offered by SOORLS' tools, as well as its Web-based architecture, which allows distributed access to reusable components on servers from a variety of platforms.
This paper describes a system to distribute and retrieve multimedia knowledge on a cluster of heterogeneous high-performance architectures distributed over the Internet. The knowledge is represented using facts and rules in an associative logic-programming model. Associative computation facilitates the distribution of facts and rules and exploits coarse-grain data-parallel computation. Associative logic programming uses a flat data model that can be easily mapped onto heterogeneous architectures. The paper describes an abstract instruction set for the distributed version of associative logic programming and the corresponding implementation. The implementation uses a message-passing library for architecture independence within a cluster, object-oriented programming for modularity and portability, and Java as a front-end interface to provide a graphical user interface, multimedia capability, and remote access via the Internet. Performance results on a cluster of IBM RS 6000 workstations are presented. The results show that distributing the data improves performance almost linearly for a small number of processors in a cluster.
The SHARE project seeks to apply information technologies in helping design teams gather, organize, re-access, and communicate both informal and formal design information to establish a "shared understanding" of the design and design process. This paper presents the visions of SHARE, along with the research and strategies undertaken to build an infrastructure toward its realization. A preliminary prototype environment is being used by designers working on a variety of industry sponsored design projects. This testbed continues to inform and guide the development of NoteMail, MovieMail, and Xshare, as well as other components of the next generation SHARE environment that will help distributed design teams work together more effectively on the Internet.
This paper presents a network system for teaching English through a wireless communication (WC) based distance teaching system, an educational process capable of encouraging students to acquire knowledge voluntarily. The paper develops and implements an online intelligent English training system using artificial intelligence (AI) that helps students improve their English learning efficiency according to their knowledge and personality. The system's numerous sensor nodes can form a variety of topologies; the gathered information is transmitted over the global system for mobile communication (GSM) network to the user interface, and the operator can manage the remote sensor nodes via the GSM network. Nevertheless, existing systems lack verbal assessment, practical evaluation and signaling mechanisms, and the interactive educational platform that teachers and learners need. Addressing these issues, the paper designs a complete dialogue-based English teaching system in which teachers, students, and English teaching can be revised together, ET-AIWC (English teaching through AI and WC). To this end, the genetic algorithm is improved with an encoding technique for dynamic parameter adjustment of the iterative process. In combination with an AI expert system, suitable learning techniques were created to enable students to double the learning effect with half the effort. An online teaching assistant system was designed to monitor, regulate, and engage with students throughout the learning process, and a modified scoring system provides real-time evaluation of student speech to improve students' oral English more effectively, achieving 95.2%.
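To give a sense of the kind of parameter tuner the abstract alludes to, the sketch below is a minimal genetic algorithm with binary encoding, tournament selection, single-point crossover, and bit-flip mutation, maximising a toy fitness (the number of 1 bits). The encoding and the fitness function are placeholders, not the paper's actual tuning objective.

```python
# Minimal genetic algorithm on a onemax toy problem.
import random

def evolve(pop_size=30, n_bits=16, gens=60, rng=None):
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fit = lambda ind: sum(ind)                       # toy fitness: count of 1 bits
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            # Tournament selection of two parents (best of 3 random picks each).
            a = max(rng.sample(pop, 3), key=fit)
            b = max(rng.sample(pop, 3), key=fit)
            cut = rng.randrange(1, n_bits)           # single-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                   # bit-flip mutation
                j = rng.randrange(n_bits)
                child[j] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fit)

best = evolve()
```

In a tuning setting like the one described, the bit string would encode iteration parameters and the fitness would score the resulting learning outcome instead of counting bits.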
The opportunities offered by the Internet are employed increasingly in medicine. To obtain data on the extent to which the Internet is used by hand surgeons, survey forms were sent to 1043 participants of the Congress of the IFSSH in Vancouver in 1998. Ninety-four per cent of the respondents use the Internet. Most of the participants use the World Wide Web for literature searches, information on events and to read scientific articles. E-mail is used for general and scientific communication with colleagues and also for transmission of patient-related data. Perceived apprehensions include secure transmission of sensitive data, slow data transmission, and the lack of structure and of an authority to control the contents of the Internet. Virtual congresses and a newsgroup on hand surgery seem to be worthwhile future goals. Some problems pointed out in this survey have already been solved, at least partially, and possible solutions for the rest are discussed.
Healthcare information contained on the World Wide Web is not screened or regulated and claims may be unsubstantiated and misleading. The objective of this study was to evaluate the nature and quality of information on the Web in relation to hand surgery. Three search engines were assessed for information on three hand operations: carpal tunnel decompression, Dupuytren's release and trigger finger release. Websites were classified and evaluated for completeness, accuracy, accountability and reference to a reliable source of information. A total of 172 websites were examined. Although 85% contained accurate information, in 65% this information was incomplete. Eighty-seven per cent of websites were accountable for the information presented, but only 24% made references to reliable sources. Until an organised approach to website control is established, it is important for hand surgeons to emphasise to their patients that not everything they read is complete or accurate. Publicising sites known to be of high quality will promote safe browsing of the Web.