This paper presents a user model server based on Web Services. User model servers are important because they allow reuse of user modeling reasoning mechanisms, which are typically complex and difficult to construct from scratch. In this paper we show how the potential for interoperability, reusability and component sharing offered by Web Services technology has been exploited in the design of a user model server that performs decision making. The user modeling reasoning is based on a multi-criteria decision making theory and has been implemented as a Web Service to provide intelligent assistance to users over the Web. Reusability has been demonstrated through the successful application of the user model server to two different applications: an e-mail application and a file manager application.
Web Services technology is suitable for cross-platform and cross-application integration. To secure systems based on Web Services, a single sign-on protocol for Web Services supporting several login modes is presented. The architecture and the formalized flow of the protocol are described. The protocol is also analyzed and proven using an extended SVO logic.
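The abstract does not give the protocol's message flow, but the core single sign-on idea, an assertion issued once and then verifiable by many services, can be sketched with a symmetric token. Everything below (the key, the payload fields, the HMAC construction) is an illustrative assumption, not the paper's protocol:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"shared-idp-secret"  # hypothetical key shared by the identity provider

def issue_token(user, mode):
    """Sign an assertion so relying services can accept it without re-authenticating."""
    payload = json.dumps({"user": user, "mode": mode}, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def verify_token(token):
    """Return the asserted claims if the signature checks out, else None."""
    payload_b64, sig = token.rsplit(".", 1)
    payload = base64.b64decode(payload_b64)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return json.loads(payload) if hmac.compare_digest(sig, expected) else None

token = issue_token("alice", "password")
print(verify_token(token)["user"])          # a relying service recovers the identity
print(verify_token(token + "0"))            # a tampered token is rejected -> None
```

A real protocol with several login modes would vary how the initial authentication happens (the `mode` field here is only a placeholder) while keeping the assertion format uniform.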
The home network is one of the areas that emerged in the last century. However, the growth of the home network market is currently stagnant. This paper describes the limitations of home network systems and the requirements for overcoming them. Also described is a new home network service system known as COWS, with its easy installation and scalable operation. COWS consists of power-consumption monitoring and control devices along with a service server that is a complementary combination of the Open Service Gateway initiative (OSGi) and web services. A home network system has a dynamic, heterogeneous, distributed, and scalable topology. Service Oriented Architecture (SOA) has been proposed as a solution that satisfies these requirements, and OSGi and web services are two successful SOA-based frameworks. The included service server has a flexible architecture consisting of a core and extendable service packages. The power-consumption monitoring and control function provides useful context information for activity-based context-aware services and optimizes power consumption. The system can be installed easily into existing and new houses, overcoming the current barrier to the popularization of home network services.
This paper proposes an approach to checking behavioral compatibility between Web services. If Web service B can be used in place of Web service A such that the replacement is transparent to clients, then B is compatible with A. We use state machines with guarded transitions to specify the behaviors of Web services. To check compatibility between two Web services, we propose an extended version of the conventional methods rule, which has been used in the object-oriented paradigm. To support our approach, we have implemented a tool. First, the tool constructs a state machine for a Web service whose behavior is expressed in WSDL and WSCI. Then, the tool verifies compatibility between Web services by using the extended methods rule.
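A minimal sketch of the replaceability check, assuming deterministic machines and ignoring transition guards; the machine encoding (dict of state to operation-to-successor maps) and the example services are invented for illustration, not taken from the paper:

```python
def simulates(b, a, start_a="s0", start_b="s0"):
    """Check that machine `b` can replay every operation sequence of machine `a`
    (a simulation check for deterministic machines; guards are omitted here)."""
    seen = set()
    stack = [(start_a, start_b)]
    while stack:
        sa, sb = stack.pop()
        if (sa, sb) in seen:
            continue
        seen.add((sa, sb))
        for op, next_a in a.get(sa, {}).items():
            if op not in b.get(sb, {}):
                return False          # b cannot accept an operation a's clients may use
            stack.append((next_a, b[sb][op]))
    return True

# Service A: login then query; service B additionally offers logout.
A = {"s0": {"login": "s1"}, "s1": {"query": "s1"}}
B = {"s0": {"login": "s1"}, "s1": {"query": "s1", "logout": "s0"}}
print(simulates(B, A))   # B can transparently replace A -> True
print(simulates(A, B))   # A lacks logout, so the reverse fails -> False
```

The extended methods rule in the paper additionally compares the guards on matching transitions; this sketch only covers the transition-structure part of the check.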
Service Oriented Architectures (SOA) enable dynamic integration of Web Services (WS) to accomplish a user's need. As such, they are sensitive to user errors. This article presents a framework for mitigating the risks of user errors due to changes in the service delivery context. The underlying methodology incorporates usability in the design, testing, deployment and operation of dynamic collaborative WS, so that the error-prone elements of the User Interface (UI) are identified and eliminated. The methodology incorporates Statistical Process Control (SPC) of Web Service Indices (WSI), obtained by a Decision Support system for User Interface Design (DSUID), in which the users are elements of the control loop.
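As an illustration of SPC applied to a service index, the classic Shewhart rule flags readings that fall outside k standard deviations of a baseline. The index values and threshold below are assumptions for the sketch, not DSUID output:

```python
from statistics import mean, stdev

def out_of_control(index_samples, new_value, k=3.0):
    """Shewhart-style check: flag a Web Service Index reading that falls outside
    mean +/- k*sigma of the baseline (a minimal SPC sketch, not DSUID itself)."""
    m, s = mean(index_samples), stdev(index_samples)
    return abs(new_value - m) > k * s

baseline = [0.92, 0.95, 0.94, 0.93, 0.96, 0.94, 0.95]  # hypothetical usability index
print(out_of_control(baseline, 0.94))  # within limits -> False
print(out_of_control(baseline, 0.40))  # drastic drop after a context change -> True
```

In a closed-loop setting such as the one described, an out-of-control signal would trigger review of the UI elements contributing to the index rather than an automatic change.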
The Web Services Choreography Description Language (WS-CDL) is a specification developed by the W3C that can be viewed as a blueprint for the development of end-point services. Consequently, it is worth providing a systematic approach to its modeling, analysis and verification. The Unified Modeling Language (UML) is an industry standard for modeling, so applying UML to model WS-CDL is a promising way to bring academics and practitioners together through a single standard language. In this paper, we propose to use different UML diagrams to model WS-CDL. The UML Component Diagram models the underlying structure of WS-CDL, the UML Sequence Diagram models its activities, and the UML State Machine Diagram models the behavior of each role participating in a WS-CDL specification. We then enrich the UML State Machine Diagram with data through the UML Class Diagram. Given the UML specification of WS-CDL, we provide a systematic way of formally analyzing and verifying WS-CDL against desired properties. Experiments show that our approach can verify structural, behavioral and data properties in a medium-scale, data-enriched WS-CDL specification.
With the widespread application of web services, trust-based service selection has become a significant requirement from a requester's point of view, and the trust prediction mechanism has become a determining factor in any given service's success. But the dynamic nature of trust makes measuring trust values and making trust predictions challenging. In this paper, a new trust prediction model is proposed based on multiple decision factors. Firstly, the proposed model integrates multiple complementary decision factors to reflect the complexity and uncertainty of trust relationships. In addition to the traditional experience-based and feedback-based trust factors, the model integrates a QoS-based trust factor into global trust evaluation, giving it greater rationality and practicality than existing approaches. Meanwhile, the proposed model applies the theory of the Ordered Weighted Averaging (OWA) operator to assign classification weights to these decision factors, which gives the model strong adaptability in handling the various dynamic behaviors of service providers. Simulation results show that, compared to existing approaches, the proposed model achieves notable improvements in the accuracy of trust prediction.
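The OWA operator itself is standard: weights attach to sorted rank positions rather than to particular factors, so the same weight vector can emphasize the strongest or weakest evidence regardless of which factor supplied it. A small sketch with invented factor values (the paper's actual weight assignment is not reproduced here):

```python
def owa(values, weights):
    """Ordered Weighted Averaging: sort the inputs in descending order and take
    the dot product with the weights, which are tied to rank positions."""
    assert abs(sum(weights) - 1.0) < 1e-9 and len(values) == len(weights)
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Hypothetical trust factors for one provider: experience, feedback, QoS.
factors = [0.9, 0.6, 0.8]
cautious = owa(factors, [0.2, 0.3, 0.5])   # heavier weight on the weakest evidence
print(round(cautious, 3))                   # -> 0.72
```

With weights skewed toward the tail of the ordering, a single poor factor pulls the aggregate down sharply, which is one way such a model can react quickly to a provider's dynamic behavior.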
Access to and reuse of authoritative phylogenetic knowledge have been longstanding challenges in the evolutionary biology community, leading to a number of research efforts (e.g. focused on interoperation, standardization of formats, and development of minimum reporting requirements). The Phylotastic project was launched to provide an answer to these challenges, as an architectural concept collaboratively designed by evolutionary biologists and computer scientists. This paper describes the first comprehensive implementation of the Phylotastic architecture, based on an open platform for Web services composition. The implementation provides a portal, which composes Web services along a fixed collection of workflows, as well as an interface that allows users to develop novel workflows. The Web services composition is guided by automated planning algorithms and built on a Web services registry and an execution monitoring engine. The platform provides resilience through seamless automated recovery from failed services.
In this paper, we use attribute grammars as a formal approach for model checkers development. Our aim is to design an Alternating-Time Temporal Logic (ATL) model checker from a context-free grammar which generates the language of the ATL formulae. An attribute grammar may be informally defined as a context-free grammar which is extended with a set of attributes and a collection of semantic rules. We provide a formal definition for an attribute grammar used as input for Another Tool for Language Recognition (ANTLR) to generate an ATL model checker. The original implementation of the model-checking algorithm is based on Relational Databases and Web Services. Several database systems and Web Services technologies were used for evaluating the system performance in verification of large ATL models.
The number and size of information services available on the internet have been growing exponentially over the past few years. This growth has created an urgent need for information agents that act as brokers in the sense that they can autonomously search, gather, and integrate information on behalf of a user. To remain useful, such brokers will have to evolve throughout their lifetime to keep up with evolving and ever-changing information services. This paper proposes a framework named XIB (eXtensible Information Brokers) for building and evolving information brokers.
The XIB takes as input a description of required information services and supports the interactive generation of an integrated query interface. It also generates wrappers for each information service dynamically. Once the query interface and wrappers are in place, the user can specify a query and get back a result that integrates data from all wrapped information sources. The XIB depends heavily on XML-related techniques. More specifically, we use DTDs to model the input and output of each service, and XML to represent both input and output values. Based on these representations, the paper investigates service integration in the form of DTD integration, and studies query decomposition in the form of XML element decomposition. Within the proposed framework, it is easy to add information services to a broker or remove them, thereby facilitating the maintenance, evolution and customization of information brokers.
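The decomposition-and-merge idea can be sketched abstractly: route the fields of a query to the services whose schema covers them, then merge the per-service results. The services, wrappers, and records below are toy stand-ins for DTD-modeled services, not part of the XIB itself:

```python
# Each wrapped service answers the part of a query matching its own schema.
SERVICES = {
    "weather": {"fields": {"city", "temp"}},
    "traffic": {"fields": {"city", "congestion"}},
}

WRAPPERS = {  # stand-ins for dynamically generated wrappers returning records
    "weather": lambda city: {"city": city, "temp": 21},
    "traffic": lambda city: {"city": city, "congestion": "low"},
}

def answer(query_fields, city):
    """Decompose the query across services, invoke each wrapper, merge results."""
    result = {}
    for name, svc in SERVICES.items():
        wanted = query_fields & svc["fields"]
        if wanted:                                # this service can contribute
            record = WRAPPERS[name](city)
            result.update({f: record[f] for f in wanted})
    return result

print(answer({"temp", "congestion"}, "Oslo"))
```

Adding or removing a service here only touches the two registries, which mirrors the maintenance property the framework claims for brokers.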
We propose a specification-driven approach to Web service composition. Our framework allows users (or service developers) to start with a high-level, possibly incomplete specification of a desired (goal) service that is to be realized using a subset of the available component services. These services are represented as labeled transition systems augmented with guards over variables with infinite domains, and are used to determine a strategy for their composition that would realize the goal service functionality. However, in the event that the goal service cannot be realized using the available services, our approach identifies the cause(s) of the failure, which can then be used by the developer to reformulate the goal specification. Thus, the technique supports Web service composition through iterative reformulation of the functional specification. We present a prototype implementation in a tabled logic programming environment that illustrates the key features of the proposed approach.
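A greatly simplified sketch of the composition-with-failure-feedback idea, using unguarded transition systems and a greedy delegation strategy rather than the paper's algorithm; the component machines and action names are invented:

```python
def compose(goal_trace, components):
    """Walk the goal trace, delegating each action to a component whose machine
    can take it from its current state; report the failing action if none can
    (the 'cause of failure' a developer would use to reformulate the goal)."""
    state = {name: "s0" for name in components}
    plan = []
    for act in goal_trace:
        for name, machine in components.items():
            nxt = machine.get(state[name], {}).get(act)
            if nxt is not None:
                state[name] = nxt
                plan.append((act, name))
                break
        else:
            return None, act          # unrealizable: no component offers `act`
    return plan, None

pay = {"s0": {"authorize": "s1"}, "s1": {"charge": "s0"}}
ship = {"s0": {"dispatch": "s0"}}
plan, failure = compose(["authorize", "charge", "dispatch"], {"pay": pay, "ship": ship})
print(plan)       # each goal action mapped to a component
print(failure)    # None: the goal is realizable with these components
```

Greedy delegation can fail where a full search would succeed; the point of the sketch is only the shape of the feedback, returning the offending action instead of a bare "no".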
This paper presents a Service Oriented Architecture (SOA)-based content delivery model to facilitate mobile content delivery. The main contribution of this paper is the design and development of an SOA-equipped content delivery system based on a context-driven, access-controlled, profile-favored, and history-maintained (CAPH) model. We embody the generic model-view-controller (MVC) model to support a dynamic content adaptation technique based on mobile users' contextual environments. Self-adaptable presentation objects and modules are modeled as universal Web services resources, so that their interactions are formalized into Web services operations for high interoperability. Experimental results demonstrate that our proposed SOA-based model makes it easy to configure and construct a flexible Web content delivery system on the mobile Internet.
We examine two open engineering problems in the area of testing and formal verification of internet-enabled service oriented architectures (SOA). The first involves deciding when to formally and exhaustively verify versus when to informally and non-exhaustively test. The second concerns scalability limitations associated with formal verification, to which we propose a semi-formal technique that uses software agents. Finally, we assess how these findings can improve current software quality assurance practices.
Addressing the first problem, we present and explain two classes of tradeoffs that help decide whether to exhaustively verify or non-exhaustively test. External tradeoffs between assurance, performance, and flexibility are determined by the business needs of each application, whether in engineering, commerce, or entertainment. Internal tradeoffs between assurance, scale, and level of detail involve the technical challenges of feasibly verifying or testing an SOA.
Identifying a middle ground between testing and verification, we propose using software agents to simulate services in a composition. Technologically, this approach has the advantage of assuring the quality of compositions that are too large to exhaustively verify. Operationally, it supports testing these compositions in the laboratory without access to source code or use of network resources of third-party services. We identify and exploit the structural similarities between agents and services, examining how doing so can assure the quality of service compositions.
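The agent-as-stand-in idea can be sketched as a scripted stub that mimics a service's interface and records invocations, so a composition runs in the laboratory without live endpoints. The trip-booking composition and operation names below are hypothetical:

```python
class AgentStub:
    """A software agent simulating a third-party service: it answers from a
    scripted response table and logs traffic for later analysis."""
    def __init__(self, name, responses):
        self.name, self.responses, self.log = name, responses, []

    def invoke(self, operation, payload):
        self.log.append((operation, payload))
        return self.responses[operation]

def book_trip(flights, hotels):
    """A tiny composition, exercised against agents instead of live services."""
    f = flights.invoke("reserve", {"route": "OSL-LHR"})
    h = hotels.invoke("reserve", {"city": "London"})
    return f["ok"] and h["ok"]

flights = AgentStub("flights", {"reserve": {"ok": True}})
hotels = AgentStub("hotels", {"reserve": {"ok": True}})
print(book_trip(flights, hotels))   # the composition succeeds -> True
print(len(flights.log))             # the agent also recorded the traffic -> 1
```

Swapping a response table entry to `{"ok": False}` lets the same harness probe the composition's failure paths, which is where non-exhaustive testing of large compositions earns its keep.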
The growing quantity and distribution of bioinformatics resources mean that finding and utilizing them requires a great deal of expert knowledge, especially as many resources need to be tied together into a workflow to accomplish a useful goal. We want to formally capture at least some of this knowledge within a virtual workbench and middleware framework to assist a wider range of biologists in utilizing these resources. Different activities require different representations of knowledge. Finding or substituting a service within a workflow is often best supported by a classification; marshalling and configuring services is best accomplished using a formal description. The two representations are highly interdependent, and maintaining consistency between them by hand is difficult. We report on a description logic approach that uses property-based service descriptions in the web ontology language DAML+OIL. The ontology is founded on DAML-S and is used to dynamically create service classifications. These classifications are then used to support semantic service matching and discovery in a large grid-based middleware project. We describe the extensions necessary to DAML-S in order to support bioinformatics service description; the utility of DAML+OIL in creating dynamic classifications based on formal descriptions; and the implementation of a DAML+OIL ontology service to support partial user-driven service matching and composition.
Web Services are emerging as the standard mechanism for making information and software available programmatically via the Internet, and as building blocks for applications. A composite web service may be built using multiple component web services. Once its specification has been developed, the composite service may be orchestrated either using a centralized engine or in a decentralized fashion. Decentralized orchestration brings performance benefits, and improves scalability and concurrency. Dynamic binding coupled with decentralized orchestration adds high availability and fault tolerance to the system. However in such systems, the coordination between components needs to be carefully designed to ensure correct execution of the composite and to limit the synchronization overheads.
In this paper, we categorize different forms of concurrency and provide an algorithm to identify these forms in a composite service specification. We explore different mechanisms for transferring data between the components in the presence of different forms of concurrency. Then we experimentally evaluate the efficiency and scalability of each mechanism. We also analyze the coordination requirements of a decentralized orchestration in the presence of dynamic binding and fault propagation.
The Web is changing the way organizations conduct their business. Businesses are rushing to provide modular applications, called Web services, that can be programmatically accessed through the Web. Despite the tremendous developments achieved so far, one of the most important yet untapped potentials is the use of Web services as facilitators for inter-organizational cooperation. This promising concept, known as Web service composition, is gaining momentum as the potential silver bullet for the envisioned Semantic Web. The development of such integrated services has so far been ad hoc and time-consuming, and has required extensive low-level programming effort. In this paper, we present WebBIS (Web Base of Internet-accessible Services), a generic framework for composing and managing Web services. We combine the object-oriented and active-rules paradigms for this task. We also provide an ontology-based framework for organizing the Web service space. We finally propose a peer-to-peer mechanism for reporting, propagating, and reacting to changes in Web services.
Despite the recent rise of Web Services technology for programmatic interfaces to business-to-business (B2B) E-commerce services (e-services) over the Internet, most existing sites can only support human interaction through Hypertext Markup Language (HTML) pages in web browsers. Automating third-party client access to Web Services generally requires developing sophisticated programs that simulate human access by handling HTML pages. However, these HTML interfaces vary across web sites and are often subject to change, so client maintenance is tedious and expensive. Even for the site owner, redeveloping the underlying presentation and application logic may still require much effort. This motivates our study of the requirements for, and the formulation of, a conceptual model of such automation. Based on these requirements, we develop a novel approach to automating dialogs with web-based services (particularly for cross-organizational processes) using a high-level script language called WebXcript. The language provides features for HTML forms-based dialogs and eXtensible Markup Language (XML) messaging. The XML syntax of WebXcript further enables convenient user authoring and easy engine development with widely available XML tools. It supports expected responses and exception handling. We further propose a wrapper architecture based on WebXcript to integrate legacy sites into Web Services, where Web Services Description Language (WSDL) interfaces are generated from high-level mappings of database or WebXcript parameter definitions. We demonstrate the applicability of our approach with examples in integrating distributed information, online ordering, and XML messaging, together with discussions of our experiences and the advantages of our approach.
The web-services stack of standards is designed to support the reuse and interoperation of software components on the web. A critical step in developing applications based on web services is service discovery, i.e. the identification of existing web services that can potentially be used in the context of a new web application. Discovery through catalog-style browsing (as currently supported by web-service registries) is clearly insufficient. To support programmatic service discovery, we have developed a suite of methods that assess the similarity between two WSDL (Web Services Description Language) specifications based on the structure of their data types and operations and the semantics of their natural-language descriptions and identifiers. Given only a textual description of the desired service, a semantic information-retrieval method can be used to identify and order the most relevant WSDL specifications based on the similarity of the element descriptions of the available specifications with the query. If a (potentially partial) specification of the desired service behavior is also available, this set of likely candidates can be further refined by a semantic structure-matching step, assessing the structural similarity of the desired vs. the retrieved services and the semantic similarity of their identifiers. In this paper, we describe and experimentally evaluate our suite of service-similarity assessment methods.
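A toy illustration of structure-based similarity, reducing each WSDL to a map from operation names to message-part types and combining two Jaccard overlaps; the 50/50 weighting and the example services are assumptions for the sketch, not the paper's method:

```python
def jaccard(a, b):
    """Set overlap: |intersection| / |union| (1.0 when both sets are empty)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def service_similarity(svc_a, svc_b):
    """Crude structural score: overlap of operation names combined with overlap
    of each shared operation's message-part types."""
    ops = jaccard(svc_a, svc_b)
    shared = set(svc_a) & set(svc_b)
    structure = (sum(jaccard(svc_a[o], svc_b[o]) for o in shared) / len(shared)
                 if shared else 0.0)
    return 0.5 * ops + 0.5 * structure

# Hypothetical stock-quote services with partially overlapping interfaces.
stock_a = {"getQuote": ["tickerSymbol:string"], "listSymbols": []}
stock_b = {"getQuote": ["tickerSymbol:string"], "history": ["range:date"]}
print(round(service_similarity(stock_a, stock_b), 3))
```

The paper's methods go further by also scoring the semantics of identifiers and descriptions, which this purely structural sketch ignores entirely.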
Software solutions to automate the procurement of web services are gaining importance as technology evolves, the number of providers increases, and the needs of clients become more complex. There are several proposals in this field, but they all have important drawbacks: many of them are not able to check offers and demands for internal consistency; selecting the best offer usually relies on evaluating linear objective functions, which is quite a naive solution; the language to express offers is usually less expressive than the language to express demands; and, last but not least, providers cannot impose constraints on their clients. In this article, we present a solution to these problems that relies on constraint programming; furthermore, we present a run-time framework, some experimental results, and a comparison with other proposals.
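The consistency and matching checks can be illustrated with interval constraints, a much weaker formalism than general constraint programming but enough to show both failure modes the authors address; attribute names and bounds below are invented:

```python
def consistent(term):
    """A term is internally consistent if every attribute's interval is non-empty."""
    return all(lo <= hi for lo, hi in term.values())

def compatible(demand, offer):
    """Offer satisfies demand if, for each shared attribute, the intervals
    intersect (a toy stand-in for the constraint-satisfaction check)."""
    return all(
        max(demand[a][0], offer[a][0]) <= min(demand[a][1], offer[a][1])
        for a in demand.keys() & offer.keys()
    )

demand = {"price": (0, 100), "availability": (99, 100)}    # availability in percent
offer = {"price": (80, 120), "availability": (99.5, 99.9)}
bad_offer = {"price": (150, 120)}                          # internally inconsistent
print(consistent(bad_offer))       # caught before matching -> False
print(compatible(demand, offer))   # the intervals intersect -> True
```

Note that `compatible` is symmetric here, so the same machinery lets a provider constrain its clients, the capability the article highlights as missing from earlier proposals.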
The use of open, Internet-based communications for business-to-business (B2B) interactions requires accountability for and acknowledgment of the actions of participants. Accountability and acknowledgment can be achieved by the systematic maintenance of an irrefutable audit trail to render the interaction non-repudiable. To safeguard the interests of each party, the mechanisms used to meet this requirement should ensure fairness. That is, misbehavior should not disadvantage well-behaved parties. Despite the fact that Web services are increasingly used to enable B2B interactions, there is currently no systematic support to deliver such guarantees. This paper introduces a flexible framework to support fair non-repudiable B2B interactions based on a trusted delivery agent. A Web services implementation is presented. The role of the delivery agent can be adapted to different end user capabilities and to meet different application requirements.
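A sketch of evidence generation along the audit trail held by a delivery agent; HMACs are used here only for brevity, since genuine non-repudiation requires asymmetric signatures that bind exactly one party, and the keys, message, and log shape are all invented:

```python
import hashlib
import hmac

def evidence(key, message):
    """Produce an evidence token over a message (HMAC sketch only; a real
    non-repudiation scheme would use a per-party digital signature)."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

# Hypothetical flow through a trusted delivery agent:
msg = b"purchase-order-42"
nro = evidence(b"sender-key", msg)      # non-repudiation of origin
agent_log = [("origin", msg, nro)]      # the agent maintains the audit trail
nrr = evidence(b"receiver-key", msg)    # non-repudiation of receipt
agent_log.append(("receipt", msg, nrr))
print(len(agent_log))                          # both steps recorded -> 2
print(nro == evidence(b"sender-key", msg))     # evidence is re-verifiable -> True
```

Fairness in the framework comes from the agent releasing the message and collecting the receipt atomically, so a misbehaving party cannot end up holding evidence its counterpart lacks.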