We consider the system AotoYamada_05__014.

Alphabet:

  0      : [] --> b
  cons   : [b * a] --> a
  double : [] --> a -> a
  inc    : [] --> a -> a
  map    : [b -> b] --> a -> a
  nil    : [] --> a
  plus   : [b] --> b -> b
  s      : [b] --> b
  times  : [b] --> b -> b

Rules:

  plus(0) x => x
  plus(s(x)) y => s(plus(x) y)
  times(0) x => 0
  times(s(x)) y => plus(times(x) y) y
  map(f) nil => nil
  map(f) cons(x, y) => cons(f x, map(f) y)
  inc => map(plus(s(0)))
  double => map(times(s(s(0))))

This AFS is converted to an AFSM simply by replacing all free variables by
meta-variables (with arity 0).

We observe that the rules contain a first-order subset:

  plus(0) X => X
  plus(s(X)) Y => s(plus(X) Y)
  times(0) X => 0
  times(s(X)) Y => plus(times(X) Y) Y

Moreover, the system is orthogonal. Thus, by [Kop12, Thm. 7.55], we may omit
all first-order dependency pairs from the dependency pair problem (DP(R), R)
if this first-order part is terminating when seen as a many-sorted first-order
TRS.

According to the external first-order termination prover, this system is
indeed terminating:

|| proof of resources/system.trs
|| # AProVE Commit ID: d84c10301d352dfd14de2104819581f4682260f5 fuhs 20130616
||
|| Termination w.r.t. Q of the given QTRS could be proven:
||
|| (0) QTRS
|| (1) QTRSRRRProof [EQUIVALENT]
|| (2) QTRS
|| (3) RisEmptyProof [EQUIVALENT]
|| (4) YES
||
|| ----------------------------------------
||
|| (0)
|| Obligation:
|| Q restricted rewrite system:
|| The TRS R consists of the following rules:
||
|| plus(0, %X) -> %X
|| plus(s(%X), %Y) -> s(plus(%X, %Y))
|| times(0, %X) -> 0
|| times(s(%X), %Y) -> plus(times(%X, %Y), %Y)
||
|| Q is empty.
||
|| ----------------------------------------
||
|| (1) QTRSRRRProof (EQUIVALENT)
|| Used ordering:
|| Quasi precedence:
||   times_2 > plus_2 > s_1
||   times_2 > 0 > s_1
||
|| Status:
||   plus_2: multiset status
||   0: multiset status
||   s_1: multiset status
||   times_2: multiset status
||
|| With this ordering the following rules can be removed by the rule removal
|| processor [LPAR04] because they are oriented strictly:
||
|| plus(0, %X) -> %X
|| plus(s(%X), %Y) -> s(plus(%X, %Y))
|| times(0, %X) -> 0
|| times(s(%X), %Y) -> plus(times(%X, %Y), %Y)
||
|| ----------------------------------------
||
|| (2)
|| Obligation:
|| Q restricted rewrite system:
|| R is empty.
|| Q is empty.
||
|| ----------------------------------------
||
|| (3) RisEmptyProof (EQUIVALENT)
|| The TRS R is empty. Hence, termination is trivially proven.
||
|| ----------------------------------------
||
|| (4)
|| YES

We use the dependency pair framework as described in [Kop12, Ch. 6/7], with
dynamic dependency pairs. After applying [Kop12, Thm. 7.22] to denote
collapsing dependency pairs in an extended form, we thus obtain the following
dependency pair problem (P_0, R_0, minimal, formative):

Dependency Pairs P_0:

  0] map(F) cons(X, Y) =#> F(X)
  1] map(F) cons(X, Y) =#> map(F) Y
  2] map(F) cons(X, Y) =#> map#(F)
  3] inc X =#> map(plus(s(0))) X
  4] inc# =#> map#(plus(s(0)))
  5] inc# =#> plus#(s(0))
  6] double X =#> map(times(s(s(0)))) X
  7] double# =#> map#(times(s(s(0))))
  8] double# =#> times#(s(s(0)))

Rules R_0:

  plus(0) X => X
  plus(s(X)) Y => s(plus(X) Y)
  times(0) X => 0
  times(s(X)) Y => plus(times(X) Y) Y
  map(F) nil => nil
  map(F) cons(X, Y) => cons(F X, map(F) Y)
  inc => map(plus(s(0)))
  double => map(times(s(s(0))))

Thus, the original system is terminating if (P_0, R_0, minimal, formative) is
finite.

We consider the dependency pair problem (P_0, R_0, minimal, formative).
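Before the dependency graph analysis, it may help to read the rules above as an
ordinary functional program. The following Haskell transcription is purely
illustrative and not part of the prover output; encoding sort b as a Peano
type Nat and sort a as [Nat] is an assumption made only for readability.

  -- Illustrative transcription of the AFS rules (assumed encoding of sorts).
  data Nat = Z | S Nat                     -- 0 and s

  plus :: Nat -> Nat -> Nat
  plus Z     y = y                         -- plus(0) x => x
  plus (S x) y = S (plus x y)              -- plus(s(x)) y => s(plus(x) y)

  times :: Nat -> Nat -> Nat
  times Z     _ = Z                        -- times(0) x => 0
  times (S x) y = plus (times x y) y       -- times(s(x)) y => plus(times(x) y) y

  map' :: (Nat -> Nat) -> [Nat] -> [Nat]
  map' _ []      = []                      -- map(f) nil => nil
  map' f (x : y) = f x : map' f y          -- map(f) cons(x, y) => cons(f x, map(f) y)

  inc :: [Nat] -> [Nat]
  inc = map' (plus (S Z))                  -- inc => map(plus(s(0)))

  double :: [Nat] -> [Nat]
  double = map' (times (S (S Z)))          -- double => map(times(s(s(0))))

Each defining clause corresponds to one rewrite rule of R_0.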
We place the elements of P_0 in a dependency graph approximation G (see e.g.
[Kop12, Thm. 7.27, 7.29]), as follows:

  * 0 : 0, 1, 2, 3, 4, 5, 6, 7, 8
  * 1 : 0, 1, 2
  * 2 :
  * 3 : 0, 1, 2
  * 4 :
  * 5 :
  * 6 : 0, 1, 2
  * 7 :
  * 8 :

This graph has the following strongly connected components:

  P_1:

    map(F) cons(X, Y) =#> F(X)
    map(F) cons(X, Y) =#> map(F) Y
    inc X =#> map(plus(s(0))) X
    double X =#> map(times(s(s(0)))) X

By [Kop12, Thm. 7.31], we may replace any dependency pair problem
(P_0, R_0, m, f) by (P_1, R_0, m, f). Thus, the original system is terminating
if (P_1, R_0, minimal, formative) is finite.

We consider the dependency pair problem (P_1, R_0, minimal, formative).

The formative rules of (P_1, R_0) are R_1 ::=

  map(F) cons(X, Y) => cons(F X, map(F) Y)
  inc X => map(plus(s(0))) X
  double X => map(times(s(s(0)))) X

By [Kop12, Thm. 7.17], we may replace the dependency pair problem
(P_1, R_0, minimal, formative) by (P_1, R_1, minimal, formative). Thus, the
original system is terminating if (P_1, R_1, minimal, formative) is finite.

We consider the dependency pair problem (P_1, R_1, minimal, formative).

We will use the reduction pair processor [Kop12, Thm. 7.16]. As the system is
abstraction-simple and the formative flag is set, it suffices to find a tagged
reduction pair [Kop12, Def. 6.70]. Thus, we must orient:

  map(F, cons(X, Y)) >? F(X)
  map(F, cons(X, Y)) >? map(F, Y)
  inc(X) >? map(plus(s(0)), X)
  double(X) >? map(times(s(s(0))), X)
  map(F, cons(X, Y)) >= cons(F X, map(F, Y))
  inc(X) >= map(plus(s(0)), X)
  double(X) >= map(times(s(s(0))), X)

We apply [Kop12, Thm. 6.75] and use the following argument functions:

  pi( double(X) ) = #argfun-double#(map(times(s(s(0))), X), map(times(s(s(0))), X))
  pi( inc(X) ) = #argfun-inc#(map(plus(s(0)), X), map(plus(s(0)), X))

We orient these requirements with a polynomial interpretation in the natural
numbers. The following interpretation satisfies the requirements:

  #argfun-double# = \y0y1.3 + max(y0, y1)
  #argfun-inc# = \y0y1.3 + max(y0, y1)
  0 = 0
  cons = \y0y1.2 + y0 + y1
  double = \y0.0
  inc = \y0.0
  map = \G0y1.y1 + G0(y1) + 3y1G0(y1) + 3G0(0)
  plus = \y0y1.0
  s = \y0.0
  times = \y0y1.0

Using this interpretation, the requirements translate to:

  [[map(_F0, cons(_x1, _x2))]]
    = 2 + x1 + x2 + 3x1F0(2 + x1 + x2) + 3x2F0(2 + x1 + x2) + 3F0(0) + 7F0(2 + x1 + x2)
    > F0(x1)
    = [[_F0(_x1)]]

  [[map(_F0, cons(_x1, _x2))]]
    = 2 + x1 + x2 + 3x1F0(2 + x1 + x2) + 3x2F0(2 + x1 + x2) + 3F0(0) + 7F0(2 + x1 + x2)
    > x2 + F0(x2) + 3x2F0(x2) + 3F0(0)
    = [[map(_F0, _x2)]]

  [[#argfun-inc#(map(plus(s(0)), _x0), map(plus(s(0)), _x0))]]
    = 3 + x0
    > x0
    = [[map(plus(s(0)), _x0)]]

  [[#argfun-double#(map(times(s(s(0))), _x0), map(times(s(s(0))), _x0))]]
    = 3 + x0
    > x0
    = [[map(times(s(s(0))), _x0)]]

  [[map(_F0, cons(_x1, _x2))]]
    = 2 + x1 + x2 + 3x1F0(2 + x1 + x2) + 3x2F0(2 + x1 + x2) + 3F0(0) + 7F0(2 + x1 + x2)
    >= 2 + x2 + F0(x2) + 3x2F0(x2) + 3F0(0) + max(x1, F0(x1))
    = [[cons(_F0 _x1, map(_F0, _x2))]]

  [[#argfun-inc#(map(plus(s(0)), _x0), map(plus(s(0)), _x0))]]
    = 3 + x0
    >= x0
    = [[map(plus(s(0)), _x0)]]

  [[#argfun-double#(map(times(s(s(0))), _x0), map(times(s(s(0))), _x0))]]
    = 3 + x0
    >= x0
    = [[map(times(s(s(0))), _x0)]]

By the observations in [Kop12, Sec. 6.6], this reduction pair suffices; we may
thus replace the dependency pair problem (P_1, R_1) by ({}, R_1). By the empty
set processor [Kop12, Thm. 7.15] this problem may be immediately removed.

As all dependency pair problems were successfully simplified with sound (and
complete) processors until nothing remained, we conclude termination.
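As an informal cross-check (not a replacement for the argument via [Kop12,
Thm. 6.75], which quantifies over all weakly monotonic functionals), the
Haskell snippet below evaluates the two strict map-requirements for a handful
of weakly monotonic sample choices of F0 and small argument values, printing
True when every sampled instance satisfies the strict inequality. The sample
functions and ranges are chosen arbitrarily for illustration.

  -- Spot check of:
  --   [[map(F0, cons(x1, x2))]] > [[F0(x1)]]   and
  --   [[map(F0, cons(x1, x2))]] > [[map(F0, x2)]]
  mapI :: (Integer -> Integer) -> Integer -> Integer
  mapI g y1 = y1 + g y1 + 3 * y1 * g y1 + 3 * g 0   -- [[map]] = \G0 y1. y1 + G0(y1) + 3y1G0(y1) + 3G0(0)

  consI :: Integer -> Integer -> Integer
  consI y0 y1 = 2 + y0 + y1                         -- [[cons]] = \y0 y1. 2 + y0 + y1

  checks :: [Bool]
  checks =
    [ lhs > f x1 && lhs > mapI f x2
    | f  <- [const 0, id, (+ 1), (* 3), \n -> n * n]  -- weakly monotonic samples for F0
    , x1 <- [0 .. 5]
    , x2 <- [0 .. 5]
    , let lhs = mapI f (consI x1 x2)
    ]

  main :: IO ()
  main = print (and checks)                           -- prints True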
+++ Citations +++

[Kop12] C. Kop. Higher Order Termination. PhD Thesis, 2012.