World Scientific
Stochastic Automata Networks and Lumpable Stochastic Bounds: Bounding Availability

    https://doi.org/10.1142/9781860948947_0010 · Cited by: 1 (Source: Crossref)
    Abstract:

    The use of Markov chains to model complex systems is becoming increasingly common in many areas of science and engineering. The underlying transition matrix of the Markov chain can frequently be represented in an extremely compact form – a consequence of the manner in which the matrix is generated. This means that the definition and generation of large-scale Markov models are relatively easy and efficient in both time and memory requirements. The remaining difficulty is that of actually solving the Markov chain and deriving useful performance characteristics from it.
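The compact form alluded to here is the tensor (Kronecker) representation used for Stochastic Automata Networks. A minimal NumPy sketch of the idea, for two hypothetical independent 2-state automata with made-up rates and no synchronizing events: only the small local generators are stored, and a vector–descriptor product can be computed without ever forming the global matrix.

```python
import numpy as np

# Hypothetical local generators of two independent 2-state automata
# (e.g. up/down components; the failure/repair rates are illustrative only).
Q1 = np.array([[-0.1,  0.1],
               [ 0.5, -0.5]])
Q2 = np.array([[-0.2,  0.2],
               [ 0.3, -0.3]])
I2 = np.eye(2)

# Global generator as a sum of tensor (Kronecker) products.
# For n automata of k states each, the full matrix is k^n x k^n,
# while the descriptor stores only n small k x k matrices.
Q = np.kron(Q1, I2) + np.kron(I2, Q2)

def vec_descriptor_product(x, A, B):
    """Compute x (A ⊗ B) without forming the Kronecker product,
    via the identity x (A ⊗ B) = vec(A^T X B) with X = reshape(x)."""
    X = x.reshape(A.shape[0], B.shape[0])
    return (A.T @ X @ B).reshape(-1)

x = np.array([0.25, 0.25, 0.25, 0.25])       # a probability vector
y_compact = (vec_descriptor_product(x, Q1, I2)
             + vec_descriptor_product(x, I2, Q2))
y_full = x @ Q                               # same product, formed explicitly
assert np.allclose(y_compact, y_full)
```

This vector–descriptor multiplication is the basic kernel that makes iterative solution (and the bounding computations discussed below) feasible when the global matrix is too large to store.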

    The use of bounding procedures is one approach to alleviating this problem. Such bounds are frequently easier to obtain than the exact solution and still provide enough information to be of value. In this chapter, we show how to bound certain dependability characteristics, such as steady-state and transient availability, using an algorithm based on the storage of the Markov chain as a sum of tensor products. Our bounds are based on arguments concerning the stochastic comparison of Markov matrices rather than the usual approach, which involves sample-path arguments. The algorithm requires only a small number of vectors, each of size equal to the number of reachable states of the Markov chain.
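The abstract does not spell out the comparison construction it uses. As an illustration of matrix-level stochastic comparison (not necessarily the chapter's own algorithm), here is the classical construction of the smallest st-monotone upper bound of a discrete-time stochastic matrix, often attributed to Abu-Amsha and Vincent: row tail sums are maximized against the previous row, which yields a monotone matrix that dominates the original in the strong stochastic order.

```python
import numpy as np

def st_monotone_upper_bound(P):
    """Smallest st-monotone stochastic matrix Q with P <=_st Q row-wise
    (classical Abu-Amsha/Vincent construction, states assumed ordered)."""
    n = P.shape[0]
    # Tail sums T[i, j] = sum_{l >= j} P[i, l]
    T = np.cumsum(P[:, ::-1], axis=1)[:, ::-1]
    B = T.copy()
    for i in range(1, n):                 # enforce monotonicity row by row
        B[i] = np.maximum(T[i], B[i - 1])
    # Recover the bounding matrix from its tail sums
    Q = B - np.hstack([B[:, 1:], np.zeros((n, 1))])
    return Q

# Illustrative 3-state transition matrix (made-up values)
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
Q = st_monotone_upper_bound(P)
# Q is stochastic, st-monotone, and dominates P row-wise;
# here Q = [[0.5, 0.3, 0.2], [0.1, 0.6, 0.3], [0.1, 0.3, 0.6]]
```

A bound of this kind can then be chosen lumpable, so that the bounding chain is far smaller than the original while still yielding guaranteed bounds on availability measures.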