Divergence (computer science)

In computer science, a computation is said to diverge if it does not terminate or terminates in an exceptional state.[1]: 377  Otherwise it is said to converge. In domains where computations are expected to be infinite, such as process calculi, a computation is said to diverge if it fails to be productive (i.e. to continue producing an action within a finite amount of time).

Definitions

Various subfields of computer science use varying, but mathematically precise, definitions of what it means for a computation to converge or diverge.

Rewriting

In abstract rewriting, an abstract rewriting system is called convergent if it is both confluent and terminating.[2] The notation t ↓ n means that t reduces to the normal form n in zero or more reductions, t↓ means that t reduces to some normal form in zero or more reductions, and t↑ means that t does not reduce to a normal form; the latter is impossible in a terminating rewriting system.

In the lambda calculus, an expression is divergent if it has no normal form.[3]

Denotational semantics

In denotational semantics, an object function f : A → B can be modelled as a mathematical function f⊥ : A ∪ {⊥} → B ∪ {⊥}, where ⊥ (bottom) indicates that the object function or its argument diverges.

Concurrency theory

In the calculus of communicating sequential processes (CSP), divergence occurs when a process performs an endless series of hidden actions.[4] For example, consider the following process, defined by CSP notation:

    Clock = tick → Clock

The traces of this process are defined as:

    traces(Clock) = {⟨⟩, ⟨tick⟩, ⟨tick, tick⟩, …} = {tick}*

Now, consider the following process, which hides the tick event of the Clock process:

    P = Clock \ tick

As P cannot do anything other than perform hidden actions forever, it is equivalent to the process that does nothing but diverge, denoted div. One semantic model of CSP is the failures-divergences model, which refines the stable failures model by distinguishing processes based on the sets of traces after which they can diverge.
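To make the productivity requirement from the introduction concrete, here is a minimal Haskell sketch (the names ticks and silent are illustrative, not from the cited sources) contrasting a productive infinite computation with a non-productive, divergent one:

    -- Productive: each demand for the next element is answered in
    -- finite time, so any finite observation such as 'take n ticks'
    -- terminates.
    ticks :: [String]
    ticks = "tick" : ticks

    -- Not productive: the definition recurses without ever producing
    -- an element, so even 'take 1 silent' diverges.
    silent :: [String]
    silent = silent

    main :: IO ()
    main = print (take 3 ticks)  -- prints ["tick","tick","tick"]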
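A standard example of a divergent lambda term is Ω = (λx. x x)(λx. x x), which beta-reduces to itself and therefore has no normal form (t↑ in the notation above). The following toy Haskell reducer demonstrates this; the representation is assumed for illustration, and its naive substitution is adequate only because Ω is closed:

    -- Untyped lambda terms.
    data Term = Var String | Lam String Term | App Term Term
      deriving (Eq, Show)

    -- Naive substitution of t for x (no capture avoidance; fine for
    -- the closed term Ω).
    subst :: String -> Term -> Term -> Term
    subst x t (Var y)   = if x == y then t else Var y
    subst x t (Lam y b) = if x == y then Lam y b else Lam y (subst x t b)
    subst x t (App f a) = App (subst x t f) (subst x t a)

    -- One leftmost-outermost beta step, if a redex exists.
    step :: Term -> Maybe Term
    step (App (Lam x b) a) = Just (subst x a b)
    step (App f a)         = case step f of
                               Just f' -> Just (App f' a)
                               Nothing -> App f <$> step a
    step (Lam x b)         = Lam x <$> step b
    step (Var _)           = Nothing

    -- Ω steps to itself, so repeated reduction never reaches a
    -- normal form.
    omega :: Term
    omega = App w w where w = Lam "x" (App (Var "x") (Var "x"))

    main :: IO ()
    main = print (step omega == Just omega)  -- True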
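The lifting f⊥ used in denotational semantics can be approximated in a programming language with an option type standing in for ⊥, with the caveat that a real ⊥ is unobservable, whereas Nothing can be inspected. A hedged Haskell sketch (liftBottom is an illustrative name):

    -- Lift f : A -> B to a function on A ∪ {⊥} using Maybe, with
    -- Nothing playing the role of bottom: a diverging argument
    -- yields a diverging result.
    liftBottom :: (a -> b) -> Maybe a -> Maybe b
    liftBottom _ Nothing  = Nothing
    liftBottom f (Just x) = Just (f x)

    main :: IO ()
    main = do
      print (liftBottom (+ 1) (Just 41))               -- Just 42
      print (liftBottom (+ 1) (Nothing :: Maybe Int))  -- Nothing

This is simply fmap for Maybe; the point of the sketch is the strict treatment of the bottom case, matching the definition of f⊥ above.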
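The divergence of P = Clock \ tick can likewise be simulated with a toy encoding of prefixing and hiding, a sketch only and not a faithful CSP semantics:

    -- A toy CSP fragment: a process is an event followed by a process.
    data Proc = Prefix String Proc

    -- Clock = tick → Clock
    clock :: Proc
    clock = Prefix "tick" clock

    -- Observable trace of a process given a set of hidden events;
    -- hidden events happen silently and contribute nothing.
    traceOf :: [String] -> Proc -> [String]
    traceOf hidden (Prefix e p)
      | e `elem` hidden = traceOf hidden p      -- internal action
      | otherwise       = e : traceOf hidden p  -- observable action

    main :: IO ()
    main = print (take 3 (traceOf [] clock))  -- ["tick","tick","tick"]
      -- In contrast, take 1 (traceOf ["tick"] clock) never returns:
      -- hiding tick leaves only an endless series of internal
      -- actions, i.e. the process behaves like div.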