Finite automata that traverse graphs by moving along their edges are known as graph-walking automata (GWA). This paper investigates the state complexity of Boolean operations for this model. It is proved that the union of GWA with m and n states, with m ≤ n, operating on graphs with k labels of edge end-points, is representable by a GWA with 2km+n+1 states, and at least 2(k−3)(m−1)+n−1 states are necessary in the worst case. For the intersection, the upper bound is (2k+1)m+n and the lower bound is 2(k−3)(m−1)+n−1. The upper bound for the complementation is 2kn+1, and the lower bound is 2(k−3)(n−1).
Based on recent results from extremal graph theory, we prove that every n-state binary deterministic finite automaton can be converted into an equivalent regular expression of size O(1.742^n) using state elimination. Furthermore, we give improved upper bounds for the language operations of intersection and interleaving on regular expressions.
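The state-elimination construction behind this bound can be sketched in a few lines: repeatedly remove an interior state of a generalized NFA, rerouting every pred–succ pair through a regex that absorbs the removed state's self-loop. The two-state DFA below (over {a, b}, accepting strings ending in b) and all identifiers are illustrative, not the paper's construction:

```python
def eliminate_states(states, start, accept, trans):
    """Generalized-NFA state elimination: trans maps (p, q) to the regex
    labelling edge p -> q; states lists the interior states to eliminate."""
    def get(p, q):
        return trans.get((p, q))

    def wrap(r):                          # parenthesize top-level alternations
        return f"({r})" if "|" in r else r

    for s in states:                      # eliminate one interior state per pass
        loop = get(s, s)
        star = f"({loop})*" if loop else ""
        preds = [p for (p, q) in trans if q == s and p != s]
        succs = [q for (p, q) in trans if p == s and q != s]
        for p in preds:
            for q in succs:
                piece = wrap(get(p, s)) + star + wrap(get(s, q))
                old = get(p, q)
                trans[(p, q)] = f"{old}|{piece}" if old else piece
        trans = {k: v for k, v in trans.items() if s not in k}
    return trans.get((start, accept), "")

# Hypothetical 2-state DFA: q0/q1 track the last symbol; "" denotes epsilon.
dfa_edges = {
    ("S", "q0"): "", ("q0", "q0"): "a", ("q0", "q1"): "b",
    ("q1", "q0"): "a", ("q1", "q1"): "b", ("q1", "A"): "",
}
regex = eliminate_states(["q0", "q1"], "S", "A", dfa_edges)
```

The elimination order strongly affects the size of the resulting expression; the O(1.742^n) bound comes from choosing that order well, which this sketch does not attempt.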
For a regular language L, let Var(L) be the minimal number of nonterminals necessary to generate L by right-linear grammars. Moreover, for natural numbers k1, k2, …, kn and an n-ary regularity-preserving operation f, let the range gVarf(k1, k2, …, kn) be the set of all numbers k such that there are regular languages L1, L2, …, Ln with Var(Li) = ki for 1 ≤ i ≤ n and Var(f(L1, L2, …, Ln)) = k. We show that, for the circular shift operation Circ, gVarCirc(n) is infinite for all n, and we completely determine the set gVarCirc(2). Moreover, we give a precise range for the right quotient and a partial result for the left quotient. Furthermore, we add some values to the range for the operation of intersection, which improves the result of [2].
We propose Asymmetric Simple Exclusion Processes to analyze the traffic states around a T-shaped intersection. The system consists of six roadways connected by the intersection. There are nine control parameters, separated into three categories: injection αi, removal βi, and turning Pi (where i = 1, 2, 3). As these nine parameters change, traffic states on each roadway undergo a transition between two phases: free flow (F) and jam (J). Together, there are 2^6 = 64 possible combinations of traffic phases, of which we observe 63 distinct phases. We analyze three major causes of congestion:
(1) increase of traffic demand simulated by injection αi;
(2) decrease of roadway capacity simulated by removal βi;
(3) redistribution of traffic pattern simulated by turning Pi.
In case (1), congestion can be confined to the roadways heading toward the intersection. In case (2), spillovers can be observed and congestion pervades the whole system. In case (3), congestion can be triggered both by increasing and by decreasing Pi. The phase diagram proves a convenient tool to summarize the results of numerical simulations. We also compare the unsignalized intersection to an intersection regulated by traffic signals, and find that traffic signals are very inefficient in resolving the congestion around a T-shaped intersection.
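The flavor of the model can be conveyed by a single-lane open-boundary TASEP sketch; the paper's six-roadway T-intersection couples several such lanes through turning probabilities, which this toy (with illustrative parameter values) omits:

```python
import random

def tasep_step(road, alpha, beta, rng):
    """One random-sequential sweep of an open-boundary TASEP lane.
    alpha: injection rate at the entrance, beta: removal rate at the exit."""
    L = len(road)
    for _ in range(L + 1):
        i = rng.randrange(-1, L)
        if i == -1:                      # try to inject at the entrance
            if road[0] == 0 and rng.random() < alpha:
                road[0] = 1
        elif i == L - 1:                 # try to remove at the exit
            if road[-1] == 1 and rng.random() < beta:
                road[-1] = 0
        elif road[i] == 1 and road[i + 1] == 0:
            road[i], road[i + 1] = 0, 1  # hop one cell forward

def mean_density(alpha, beta, L=100, steps=3000, seed=1):
    """Bulk density after relaxation; low = free flow (F), high = jam (J)."""
    rng = random.Random(seed)
    road = [0] * L
    for _ in range(steps):
        tasep_step(road, alpha, beta, rng)
    return sum(road) / L
```

High injection with low removal (large α, small β) drives the lane into the jammed phase at density near 1 − β, while the opposite choice leaves it in free flow at density near α, mirroring the F/J dichotomy above.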
In real traffic, right-turning vehicles at intersections are often not controlled by signal lights, and their effects are usually neglected. In this paper, we develop a cellular automaton model to capture the complicated turning behaviors of vehicles at intersections. Simulation results accord well with observations on Beijing's 4th Ring Road. It is found that right-turning vehicles may produce a queue near the intersection; a short dedicated right-turn lane has a prominent effect in improving traffic flow, but an overly long right-turn lane cannot further decrease the stop ratio as expected. These findings deepen our understanding of the effects of right-turning vehicles and may help in the design and management of intersections.
Using the cellular automaton (CA) Nagel–Schreckenberg (NaSch) model, we numerically study the probability Pac of car accidents at a nonsignalized intersection when drivers do not respect the priority rules. We also investigate the impact of mixtures of vehicle lengths and maximum velocities on this probability. It is found that in the first case, where vehicles differ only in their lengths, car accidents start to occur above a critical density ρc. Furthermore, increasing the fraction of long vehicles FL delays the occurrence of car accidents (increasing ρc) but increases the risk of collisions when ρ > ρc. On the other hand, a mixture of maximum velocities (with the same length for all vehicles) leads to accidents at the intersection even in the free-flow regime. Moreover, increasing the fraction of fast vehicles Ff reduces the accident probability Pac. The influence of road length is also studied: we find that decreasing the road length enhances the risk of collision.
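A minimal single-lane NaSch update (uniform vehicle lengths, periodic boundary, no intersection) is a hedged sketch of the model family used in this group of papers; the two-road intersection, mixed lengths and accident counting build on these four rules:

```python
import random

def nasch_step(pos, vel, L, vmax, p, rng):
    """One parallel update of the Nagel-Schreckenberg model on a ring of L cells.
    pos/vel: per-vehicle cell index and speed; p: random-slowdown probability."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    for idx, i in enumerate(order):
        ahead = pos[order[(idx + 1) % n]]          # next vehicle downstream
        gap = (ahead - pos[i] - 1) % L             # empty cells in between
        v = min(vel[i] + 1, vmax)                  # 1. accelerate
        v = min(v, gap)                            # 2. brake to avoid collision
        if v > 0 and rng.random() < p:             # 3. random slowdown
            v -= 1
        vel[i] = v
    for i in range(n):                             # 4. move (in parallel)
        pos[i] = (pos[i] + vel[i]) % L

# Illustrative run: 10 vehicles on a 100-cell ring, deterministic (p = 0).
rng = random.Random(0)
L, vmax = 100, 5
pos = list(range(0, 100, 10))
vel = [0] * 10
for _ in range(50):
    nasch_step(pos, vel, L, vmax, 0.0, rng)
```

In the deterministic low-density regime every vehicle reaches vmax, i.e. free flow; accident studies like the one above relax rule 2 with some probability and count the resulting collision opportunities.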
Various feedback strategies have been proposed to improve traffic flow. However, most of these works did not take road safety into consideration. In this paper, we study the impact of four feedback strategies on the probability of rear-end collisions Pac. We propose a new feedback strategy, named the Accidents Coefficient Feedback Strategy (ACFS), in which dynamic information is generated and displayed on a board at the entrance of a two-route scenario with an intersection to help drivers choose the appropriate road. This new strategy can greatly improve road safety while keeping the flow as smooth as possible. Moreover, the impact of the intersection and of the boundary rates (α and β) on Pac is also studied.
Using the cellular automaton Nagel–Schreckenberg (NaSch) model, we numerically study the impact of traffic lights on the probability of car accidents Pac at the intersection of two roads. It is found that the probability Pac is more stable under variation of the green-light duration T when symmetric lights are adopted. Moreover, simulation results show the existence of a critical time Tc: below it (T < Tc), Pac increases as the injection rate α increases, whereas above it (T > Tc), increasing α decreases Pac. Furthermore, the decrease of Pac is almost always accompanied by a loss of flux, especially with asymmetric lights. To overcome this problem, we propose a strategy that greatly increases the flux while keeping the probability Pac as small as possible, especially at low and high injection rates.
In this paper, we propose a model based on connected vehicles to control traffic flow at an intersection. To evaluate this model, we study its impact on the flux and on the probability of accidents at the intersection. On the one hand, simulation results show that increasing the number of connected vehicles decreases the congestion coefficient and improves the traffic situation in the system. On the other hand, connected vehicles reduce the probability of accidents at the intersection. In this way, vehicle-to-intersection (central controller) communication can decrease congestion and enhance road safety, especially in intermediate and high traffic conditions.
We propose time-optimal algorithms for a number of convex polygon problems on meshes with multiple broadcasting. Specifically, we show that on a mesh with multiple broadcasting of size n × n, the tasks of deciding whether an n-gon is convex, deciding whether two convex n-gons edge-intersect, deciding whether one convex n-gon lies in the interior of another, as well as variants of computing the intersection and union of two convex n-gons, can be accomplished in Θ(log n) time. We also show that detecting whether two convex n-gons are separable takes O(1) time.
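Sequentially, the convexity decision reduces to checking that consecutive edge turns never change sign; a sketch of that primitive (the mesh algorithm parallelizes such per-vertex tests to reach Θ(log n)):

```python
def is_convex(poly):
    """Return True if the simple polygon (CCW or CW vertex list) is convex:
    the cross products of consecutive edge pairs must all share one sign."""
    n = len(poly)
    sign = 0
    for i in range(n):
        ax, ay = poly[i]
        bx, by = poly[(i + 1) % n]
        cx, cy = poly[(i + 2) % n]
        cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1   # first nonzero turn fixes the sign
            elif (cross > 0) != (sign > 0):
                return False                    # a turn in the other direction
    return True
```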
This paper explores the impacts of vehicle-to-infrastructure (V2I) communication on mixed traffic flow consisting of connected vehicles (CVs) and human-driven vehicles (HVs). We develop a cellular automaton model for mixed flow at a signalized intersection. In addition to the motion characteristics of CVs and the influence of HVs on the motion behavior of CVs, the model also considers the influence of signal lights. CVs determine their velocities via V2I communication in order to pass the signal light with less delay and avoid stopping. Through simulations, we find that the presence, frequency and range of V2I communication all affect the mixed flow. For example, 1-Hz communication reduces the number of vehicles within 300 m before the red light from 36 to 26, and 10-Hz communication reduces it by one more; 1-Hz communication increases the number of accelerations, but when the frequency rises to 10 Hz the number of accelerations decreases back to the value observed without V2I communication, and beyond that it increases monotonically with the frequency; traffic delay decreases and capacity increases as the frequency increases. However, as the communication range increases, the traffic characteristics remain unchanged except that the number of accelerations first decreases and then increases, reaching a minimum at about 500 m.
Based on cautious driving behavior and the rule that vehicles on the left side have priority to pass through the intersection, a two-dimensional cellular automaton model for a planar signalized intersection (NS-STCA) is established. Taking the different turning vehicles as the research objects, we investigate the effects of the left-turn probability, signal cycle, and vehicle flow density on traffic flow at the intersection.
Detecting interactions between humans and objects in images is a critical problem for obtaining a deeper understanding of the visual relationships in a scene, and a key technology in many practical applications, such as augmented reality, video surveillance and information retrieval. However, due to the fine-grained actions and objects in real scenes and the coexistence of multiple interactions in one scene, the problem is far from solved. Unlike prior approaches, which focused only on the features of instances, this paper proposes a method that utilizes a four-stream CNN network for human-object interaction (HOI) detection. More detailed visual features, spatial features and pose features from human-object pairs are extracted to address this challenging detection task. Specifically, the core idea is that the region where people interact with objects contains important identifying cues for specific action classes, and these detailed cues can be fused to facilitate HOI recognition. Experiments on two large-scale public HOI benchmarks, V-COCO and HICO-DET, show the effectiveness of the proposed method.
Octrees offer a powerful means of representing and manipulating 3-D objects. This paper presents an implementation of octree manipulations using a new approach on a shared-memory architecture. Octrees are hierarchical data structures used to model 3-D objects, and manipulating them involves performing independent computations on each node. Octrees are much easier to deal with than other representations of 3-D objects, especially where extensive manipulations are involved. When these operations are distributed among multiple processing elements (PEs) and executed simultaneously, a significant speedup may be achieved. Manipulations such as complement, union and intersection, together with operations such as finding the volume and centroid, are implemented on the Sequent Balance multiprocessor. In this approach the PEs are allocated dynamically, resulting in uniform load balancing among them. The experimental results presented illustrate the feasibility of the approach. Although this evaluation was originally done for shared-memory machines, it provides insight for the evaluation of other architectures.
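The per-node independence that makes these operations parallelizable is visible even in a small sequential sketch (region octrees with FULL/EMPTY leaves; names are illustrative, not the paper's implementation):

```python
FULL, EMPTY = "full", "empty"

def intersect(a, b):
    """Intersection of two region octrees: leaves are FULL/EMPTY,
    internal nodes are lists of 8 child octants."""
    if a == EMPTY or b == EMPTY:
        return EMPTY
    if a == FULL:
        return b
    if b == FULL:
        return a
    # The 8 recursive calls are mutually independent -- this is the work
    # that can be distributed dynamically over the PEs.
    kids = [intersect(ca, cb) for ca, cb in zip(a, b)]
    return EMPTY if all(k == EMPTY for k in kids) else kids

def volume(t, side=1.0):
    """Volume of the represented region, for a root cube of edge `side`."""
    if t == FULL:
        return side ** 3
    if t == EMPTY:
        return 0.0
    return sum(volume(k, side / 2) for k in t)
```

Union and complement follow the same recursion with the leaf cases swapped, which is why a single dynamic work-queue of nodes serves all of these manipulations.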
Differential equations that switch between different modes of behavior across a surface of discontinuity are used to model, for example, electronic switches, mechanical contact, predator–prey preference changes, and genetic or cellular regulation. Switching in such systems is unlikely to occur precisely at the ideal discontinuity surface, but instead can involve various spatiotemporal delays or noise. If a system switches between more than two modes, across a boundary formed by the intersection of discontinuity surfaces, then its motion along that intersection becomes highly sensitive to such nonidealities. If switching across the surfaces is affected by hysteresis, time delay, or discretization, then motion along the intersection can be affected by erratic variations that we characterize as “jitter”. Introducing noise, or smoothing out the discontinuity, instead leads to steady motion along the intersection well described by the so-called canopy extension of Filippov’s sliding concept (which applies when the discontinuity surface is a simple hypersurface). We illustrate the results with numerical experiments and an example from power electronics, providing explanations for the phenomenon as far as they are known.
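A one-dimensional toy (a single relay with hysteresis, not the codimension-two intersection studied here) already exhibits hysteresis-induced jitter: instead of sliding on the switching surface, the trajectory chatters in a band around it. The sketch below, with illustrative parameter values, is an assumption-laden simplification of that mechanism:

```python
def simulate_relay(eps, dt=1e-3, steps=2000):
    """Euler simulation of x' = -u with u in {+1, -1} switching across x = 0
    with hysteresis half-width eps; eps > 0 makes the trajectory 'jitter'
    in a band of width ~2*eps instead of settling on the surface."""
    x, u = 0.5, 1.0
    xs = []
    for _ in range(steps):
        if u == 1.0 and x < -eps:    # switch only after overshooting by eps
            u = -1.0
        elif u == -1.0 and x > eps:
            u = 1.0
        x -= u * dt
        xs.append(x)
    return xs
```

Shrinking eps to zero (or smoothing the sign function) recovers steady Filippov-style sliding at x = 0, matching the contrast the paper draws between jitter and canopy-extension sliding.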
A set of spanning line segments (SLS) is a subset of the edges of a finite polyhedron in E3 such that an arbitrary plane intersects the polyhedron if and only if it intersects at least one of the line segments of the SLS. In this paper an algorithm is presented for computing an almost-minimum SLS for an arbitrary polyhedron. When the number of extreme vertices of the polyhedron is odd, the computed SLS is minimum; when it is even, the size of the computed SLS is at most the minimum size plus one. The algorithm has linear-time complexity for a convex polyhedron, hence is optimal in this case; its time complexity is Θ(m log m) for an arbitrary polyhedron, where m is the number of vertices of the polyhedron.
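The defining predicate of an SLS, whether a plane meets a segment, reduces to a sign test on the endpoints' signed distances; a minimal sketch of that test (assumed notation, not the paper's code):

```python
def plane_meets_segment(n, d, a, b):
    """Does the plane n.x = d meet segment ab?  True iff the endpoints'
    signed distances differ in sign, or one endpoint lies on the plane."""
    sa = sum(ni * ai for ni, ai in zip(n, a)) - d
    sb = sum(ni * bi for ni, bi in zip(n, b)) - d
    return sa == 0 or sb == 0 or (sa > 0) != (sb > 0)
```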
In this paper, we propose new rules for advancing edges when computing the intersection of a pair of convex polygons in the plane. These rules have no ambiguities when extended to the spherical surface, unlike those of O'Rourke et al. [4]. Finally, we design a linear-time algorithm for computing the intersection of a pair of spherical convex polygons, and prove its correctness.
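For comparison, a standard alternative in the plane is half-plane clipping (Sutherland–Hodgman), which intersects two convex CCW polygons in O(nm) time rather than the linear time of advancing-edge schemes; a hedged sketch, not the paper's algorithm:

```python
def clip(subject, clipper):
    """Intersection of two convex CCW polygons by clipping the subject
    against each half-plane bounded by a clipper edge."""
    def inside(p, a, b):  # p on or left of directed line a -> b
        return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0]) >= 0

    def cross_point(p1, p2, a, b):  # segment p1p2 meets the line through a, b
        dx1, dy1 = p2[0]-p1[0], p2[1]-p1[1]
        dx2, dy2 = b[0]-a[0], b[1]-a[1]
        t = ((a[0]-p1[0])*dy2 - (a[1]-p1[1])*dx2) / (dx1*dy2 - dy1*dx2)
        return (p1[0] + t*dx1, p1[1] + t*dy1)

    out = subject
    m = len(clipper)
    for i in range(m):
        a, b = clipper[i], clipper[(i + 1) % m]
        inp, out = out, []
        for j in range(len(inp)):
            p, q = inp[j], inp[(j + 1) % len(inp)]
            if inside(q, a, b):
                if not inside(p, a, b):
                    out.append(cross_point(p, q, a, b))
                out.append(q)
            elif inside(p, a, b):
                out.append(cross_point(p, q, a, b))
        if not out:
            return []
    return out
```

On the sphere the "left of a great circle" test replaces the planar sign test, which is exactly where the ambiguities addressed by the paper arise.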
A robust and efficient surface intersection algorithm that is implementable in floating-point arithmetic, accepts surfaces algebraic or otherwise, and operates without human supervision is critical to boundary-representation solid modeling. To the author's knowledge, no such algorithm has been developed. All tolerance-based subdivision algorithms will fail on surfaces with sufficiently small intersections. Algebraic techniques, while promising robustness, are presently too slow to be practical and do not accept non-algebraic surfaces. Algorithms based on loop detection hold promise: they require no tolerances except those associated with machine arithmetic, and can handle any surface for which there is a method to construct bounds on the surface and its Gauss map. Published loop detection algorithms are, however, still too slow and do not deal with singularities. We present a new loop detection criterion and discuss its use in a surface intersection algorithm. The algorithm, like other loop-detection-based intersection algorithms, subdivides the surfaces into pairs of sub-patches that do not intersect in any closed loops. This paper presents new strategies for subdividing surfaces in a way that makes the algorithm run quickly even when the intersection curve(s) contain singularities.
Evaluating the intersection of two rational parametric surfaces is a recurring operation in solid modeling. However, surface intersection is not an easy problem and continues to be an active topic of research. The main reason lies in the fact that any good surface intersection technique has to balance the three conflicting goals of accuracy, robustness and efficiency. In this paper, we formulate the problems of curve and surface intersection using algebraic sets in a higher-dimensional space. Using results from elimination theory, we project the algebraic set to a lower-dimensional space. The projected set can be expressed as a matrix determinant. The matrix itself, rather than its symbolic determinant, is used as the representation for the algebraic set in the lower-dimensional space; this is a much more compact and efficient representation. Given such a representation, we perform matrix operations for evaluation and use results from linear algebra for geometric operations on the intersection curve. Most of the operations involve evaluating numeric determinants and computing the rank, kernel and eigenvalues of matrices. The accuracy of such operations can be improved by pivoting or other numerical techniques. We use this representation for the inversion operation, for computing the intersection of curves and surfaces, and for tracing the intersection curve of two surfaces in the lower-dimensional space.
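The idea of keeping the matrix rather than its symbolic determinant can be illustrated on a toy curve-curve intersection, chosen here purely for illustration: the circle x² + y² = 2 and the parabola y = x². Viewing both as polynomials in y, the Sylvester matrix's determinant (the resultant in y) vanishes exactly at the x-coordinates of intersection points, and it is evaluated numerically point by point instead of being expanded symbolically:

```python
def sylvester(p, q):
    """Sylvester matrix of polynomials p, q (coefficient lists, highest first)."""
    m, n = len(p) - 1, len(q) - 1
    M = [[0.0] * (m + n) for _ in range(m + n)]
    for i in range(n):                     # n shifted copies of p
        for j, c in enumerate(p):
            M[i][i + j] = c
    for i in range(m):                     # m shifted copies of q
        for j, c in enumerate(q):
            M[n + i][i + j] = c
    return M

def det(M):
    """Determinant by Gaussian elimination with partial pivoting."""
    A = [row[:] for row in M]
    n = len(A)
    d = 1.0
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(A[r][k]))
        if abs(A[piv][k]) < 1e-14:
            return 0.0
        if piv != k:
            A[k], A[piv] = A[piv], A[k]
            d = -d
        d *= A[k][k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
    return d

def resultant_at(x):
    """Resultant in y of the circle and parabola, evaluated at a numeric x."""
    f = [1.0, 0.0, x * x - 2.0]   # y^2 + (x^2 - 2)
    g = [1.0, -x * x]             # y - x^2
    return det(sylvester(f, g))
```

Here the resultant vanishes at x = ±1, the x-coordinates of the intersections (±1, 1); a root finder applied to resultant_at plays the role of tracing in the lower-dimensional space.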
This paper investigates computational aspects of the well-known convexity theorem due to Helly, which states that the existence of a point in the common intersection of n convex sets is guaranteed by the existence of points in the common intersection of each combination of d+1 of these sets. Given an oracle which accepts d+1 convex sets and either returns a point in their common intersection or reports its non-existence, we give two algorithms which compute a point in the common intersection of n such sets. The first algorithm runs in O(n^(d+1) T) time and O(n^d) space, where T is the time required for a single call to the oracle. The second algorithm is a multi-stage variant of the first by which the space complexity may be reduced to O(n) at the expense of an increase in the time complexity by a factor independent of n. We also show how these algorithms may be adapted to construct linear and spherical separators of a collection of sets, and to construct a translate of a given object which either contains, is contained by, or intersects a collection of convex sets.