We consider the system AotoYamada_05__021.

Alphabet:

  0      : [] --> a
  cons   : [a * b] --> b
  double : [b] --> b
  inc    : [b] --> b
  map    : [a -> a * b] --> b
  nil    : [] --> b
  plus   : [a] --> a -> a
  s      : [a] --> a
  times  : [a] --> a -> a

Rules:

  plus(0) x => x
  plus(s(x)) y => s(plus(x) y)
  times(0) x => 0
  times(s(x)) y => plus(times(x) y) y
  inc(x) => map(plus(s(0)), x)
  double(x) => map(times(s(s(0))), x)
  map(f, nil) => nil
  map(f, cons(x, y)) => cons(f x, map(f, y))

This AFS is converted to an AFSM simply by replacing all free variables by
meta-variables (with arity 0).
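For orientation only, the rules above can be read as the small functional
program they encode, with sort a as Peano naturals and sort b as lists of
naturals. The Haskell sketch below is our own illustration of this reading
(the names Nat, mapL and the smoke-test main are not part of the system) and
plays no role in the termination proof.

data Nat = Z | S Nat deriving Show      -- 0 and s

plus :: Nat -> Nat -> Nat               -- plus(0) x => x ; plus(s(x)) y => s(plus(x) y)
plus Z     y = y
plus (S x) y = S (plus x y)

times :: Nat -> Nat -> Nat              -- times(0) x => 0 ; times(s(x)) y => plus(times(x) y) y
times Z     _ = Z
times (S x) y = plus (times x y) y

mapL :: (Nat -> Nat) -> [Nat] -> [Nat]  -- map(f, nil) => nil ; map(f, cons(x, y)) => cons(f x, map(f, y))
mapL _ []      = []
mapL f (x : y) = f x : mapL f y

inc, double :: [Nat] -> [Nat]           -- inc(x) => map(plus(s(0)), x) ; double(x) => map(times(s(s(0))), x)
inc    = mapL (plus (S Z))
double = mapL (times (S (S Z)))

main :: IO ()
main = print (double (inc [Z, S Z]))    -- smoke test: [2, 4] in Peano notation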
We observe that the rules contain a first-order subset:

  plus(0) X => X
  plus(s(X)) Y => s(plus(X) Y)
  times(0) X => 0
  times(s(X)) Y => plus(times(X) Y) Y

Moreover, the system is orthogonal. Thus, by [Kop12, Thm. 7.55], we may omit
all first-order dependency pairs from the dependency pair problem (DP(R), R)
if this first-order part is terminating when seen as a many-sorted
first-order TRS.

According to the external first-order termination prover, this system is
indeed terminating:

|| proof of resources/system.trs
|| # AProVE Commit ID: d84c10301d352dfd14de2104819581f4682260f5 fuhs 20130616
||
||
|| Termination w.r.t. Q of the given QTRS could be proven:
||
|| (0) QTRS
|| (1) QTRSRRRProof [EQUIVALENT]
|| (2) QTRS
|| (3) RisEmptyProof [EQUIVALENT]
|| (4) YES
||
||
|| ----------------------------------------
||
|| (0)
|| Obligation:
|| Q restricted rewrite system:
|| The TRS R consists of the following rules:
||
|| plus(0, %X) -> %X
|| plus(s(%X), %Y) -> s(plus(%X, %Y))
|| times(0, %X) -> 0
|| times(s(%X), %Y) -> plus(times(%X, %Y), %Y)
||
|| Q is empty.
||
|| ----------------------------------------
||
|| (1) QTRSRRRProof (EQUIVALENT)
|| Used ordering:
|| Quasi precedence:
|| times_2 > plus_2 > s_1
|| times_2 > 0 > s_1
||
|| Status:
|| plus_2: multiset status
|| 0: multiset status
|| s_1: multiset status
|| times_2: multiset status
||
|| With this ordering the following rules can be removed by the rule removal
|| processor [LPAR04] because they are oriented strictly:
||
|| plus(0, %X) -> %X
|| plus(s(%X), %Y) -> s(plus(%X, %Y))
|| times(0, %X) -> 0
|| times(s(%X), %Y) -> plus(times(%X, %Y), %Y)
||
|| ----------------------------------------
||
|| (2)
|| Obligation:
|| Q restricted rewrite system:
|| R is empty.
|| Q is empty.
||
|| ----------------------------------------
||
|| (3) RisEmptyProof (EQUIVALENT)
|| The TRS R is empty. Hence, termination is trivially proven.
||
|| ----------------------------------------
||
|| (4)
|| YES
||

We use the dependency pair framework as described in [Kop12, Ch. 6/7], with
dynamic dependency pairs. After applying [Kop12, Thm. 7.22] to denote
collapsing dependency pairs in an extended form, we thus obtain the following
dependency pair problem (P_0, R_0, minimal, formative):

Dependency Pairs P_0:

  0] inc#(X) =#> map#(plus(s(0)), X)
  1] inc#(X) =#> plus#(s(0))
  2] double#(X) =#> map#(times(s(s(0))), X)
  3] double#(X) =#> times#(s(s(0)))
  4] map#(F, cons(X, Y)) =#> F(X)
  5] map#(F, cons(X, Y)) =#> map#(F, Y)

Rules R_0:

  plus(0) X => X
  plus(s(X)) Y => s(plus(X) Y)
  times(0) X => 0
  times(s(X)) Y => plus(times(X) Y) Y
  inc(X) => map(plus(s(0)), X)
  double(X) => map(times(s(s(0))), X)
  map(F, nil) => nil
  map(F, cons(X, Y)) => cons(F X, map(F, Y))

Thus, the original system is terminating if (P_0, R_0, minimal, formative) is
finite.

We consider the dependency pair problem (P_0, R_0, minimal, formative).

We place the elements of P_0 in a dependency graph approximation G (see e.g.
[Kop12, Thm. 7.27, 7.29]), as follows:

  * 0 : 4, 5
  * 1 :
  * 2 : 4, 5
  * 3 :
  * 4 : 0, 1, 2, 3, 4, 5
  * 5 : 4, 5

This graph has the following strongly connected components:

  P_1:

    inc#(X) =#> map#(plus(s(0)), X)
    double#(X) =#> map#(times(s(s(0))), X)
    map#(F, cons(X, Y)) =#> F(X)
    map#(F, cons(X, Y)) =#> map#(F, Y)

By [Kop12, Thm. 7.31], we may replace any dependency pair problem
(P_0, R_0, m, f) by (P_1, R_0, m, f). Thus, the original system is terminating
if (P_1, R_0, minimal, formative) is finite.

We consider the dependency pair problem (P_1, R_0, minimal, formative).

The formative rules of (P_1, R_0) are R_1 ::=

  inc(X) => map(plus(s(0)), X)
  double(X) => map(times(s(s(0))), X)
  map(F, cons(X, Y)) => cons(F X, map(F, Y))

By [Kop12, Thm. 7.17], we may replace the dependency pair problem
(P_1, R_0, minimal, formative) by (P_1, R_1, minimal, formative). Thus, the
original system is terminating if (P_1, R_1, minimal, formative) is finite.

We consider the dependency pair problem (P_1, R_1, minimal, formative).

We will use the reduction pair processor [Kop12, Thm. 7.16]. As the system is
abstraction-simple and the formative flag is set, it suffices to find a tagged
reduction pair [Kop12, Def. 6.70]. Thus, we must orient:

  inc#(X) >? map#(plus(s(0)), X)
  double#(X) >? map#(times(s(s(0))), X)
  map#(F, cons(X, Y)) >? F(X)
  map#(F, cons(X, Y)) >? map#(F, Y)
  inc(X) >= map(plus(s(0)), X)
  double(X) >= map(times(s(s(0))), X)
  map(F, cons(X, Y)) >= cons(F X, map(F, Y))

We apply [Kop12, Thm. 6.75] and use the following argument functions:

  pi( double(X) )  = #argfun-double#(map(times(s(s(0))), X))
  pi( double#(X) ) = #argfun-double##(map#(times(s(s(0))), X))
  pi( inc(X) )     = #argfun-inc#(map(plus(s(0)), X))
  pi( inc#(X) )    = #argfun-inc##(map#(plus(s(0)), X))

We orient these requirements with a polynomial interpretation in the natural
numbers. The following interpretation satisfies the requirements:

  #argfun-double#  = \y0.3 + y0
  #argfun-double## = \y0.3 + y0
  #argfun-inc#     = \y0.3 + y0
  #argfun-inc##    = \y0.3 + y0
  0       = 0
  cons    = \y0y1.3 + y0 + 2y1
  double  = \y0.0
  double# = \y0.0
  inc     = \y0.0
  inc#    = \y0.0
  map     = \G0y1.y1 + 3y1G0(y1)
  map#    = \G0y1.3 + G0(y1) + 2y1G0(y1)
  plus    = \y0y1.0
  s       = \y0.0
  times   = \y0y1.0

Using this interpretation, the requirements translate to:

  [[#argfun-inc##(map#(plus(s(0)), _x0))]] = 6 > 3 = [[map#(plus(s(0)), _x0)]]
  [[#argfun-double##(map#(times(s(s(0))), _x0))]] = 6 > 3 = [[map#(times(s(s(0))), _x0)]]
  [[map#(_F0, cons(_x1, _x2))]] = 3 + 2x1F0(3 + x1 + 2x2) + 4x2F0(3 + x1 + 2x2) + 7F0(3 + x1 + 2x2) > F0(x1) = [[_F0(_x1)]]
  [[map#(_F0, cons(_x1, _x2))]] = 3 + 2x1F0(3 + x1 + 2x2) + 4x2F0(3 + x1 + 2x2) + 7F0(3 + x1 + 2x2) >= 3 + F0(x2) + 2x2F0(x2) = [[map#(_F0, _x2)]]
  [[#argfun-inc#(map(plus(s(0)), _x0))]] = 3 + x0 >= x0 = [[map(plus(s(0)), _x0)]]
  [[#argfun-double#(map(times(s(s(0))), _x0))]] = 3 + x0 >= x0 = [[map(times(s(s(0))), _x0)]]
  [[map(_F0, cons(_x1, _x2))]] = 3 + x1 + 2x2 + 3x1F0(3 + x1 + 2x2) + 6x2F0(3 + x1 + 2x2) + 9F0(3 + x1 + 2x2) >= 3 + 2x2 + 6x2F0(x2) + max(x1, F0(x1)) = [[cons(_F0 _x1, map(_F0, _x2))]]
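As a purely numeric sanity check of these translated requirements (it is not
part of the formal argument), one can evaluate both sides for small natural
numbers and a few weakly monotonic choices of F0. The Haskell sketch below
does this; the helper names cons', mapI, mapD, argfun and holds are ours. It
uses the fact that 0, s, plus and times are all interpreted as 0, so for
instance [[map#(plus(s(0)), x0)]] is mapD (const 0) x0 = 3.

-- Interpretations taken from the table above.
cons' :: Integer -> Integer -> Integer
cons' y0 y1 = 3 + y0 + 2 * y1

mapI, mapD :: (Integer -> Integer) -> Integer -> Integer
mapI g y1 = y1 + 3 * y1 * g y1          -- [[map]]
mapD g y1 = 3 + g y1 + 2 * y1 * g y1    -- [[map#]]

argfun :: Integer -> Integer            -- all four #argfun-... symbols
argfun y0 = 3 + y0

-- All ordering requirements, instantiated at concrete values x0, x1, x2 and
-- a concrete weakly monotonic function f standing in for F0.
holds :: Integer -> Integer -> Integer -> (Integer -> Integer) -> Bool
holds x0 x1 x2 f =
  and [ argfun (mapD (const 0) x0) >  mapD (const 0) x0            -- inc# and double# pairs
      , mapD f (cons' x1 x2)       >  f x1                         -- collapsing map# pair
      , mapD f (cons' x1 x2)       >= mapD f x2                    -- recursive map# pair
      , argfun (mapI (const 0) x0) >= mapI (const 0) x0            -- inc and double rules
      , mapI f (cons' x1 x2) >= cons' (max x1 (f x1)) (mapI f x2)  -- map rule
      ]

main :: IO ()
main = print (and [ holds x0 x1 x2 f
                  | x0 <- [0 .. 3], x1 <- [0 .. 3], x2 <- [0 .. 3]
                  , f  <- [const 0, id, (+ 1), \x -> 2 * x + 3] ])  -- expect True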
By the observations in [Kop12, Sec. 6.6], this reduction pair suffices; we may
thus replace the dependency pair problem (P_1, R_1, minimal, formative) by
(P_2, R_1, minimal, formative), where P_2 consists of:

  map#(F, cons(X, Y)) =#> map#(F, Y)

Thus, the original system is terminating if (P_2, R_1, minimal, formative) is
finite.

We consider the dependency pair problem (P_2, R_1, minimal, formative).

We apply the subterm criterion with the following projection function:

  nu(map#) = 2

Thus, we can orient the dependency pairs as follows:

  nu(map#(F, cons(X, Y))) = cons(X, Y) |> Y = nu(map#(F, Y))

By [FuhKop19, Thm. 61], we may replace the dependency pair problem
(P_2, R_1, minimal, f) by ({}, R_1, minimal, f). By the empty set processor
[Kop12, Thm. 7.15] this problem may be immediately removed.

As all dependency pair problems were successfully simplified with sound (and
complete) processors until nothing remained, we conclude termination.


+++ Citations +++

[FuhKop19] C. Fuhs and C. Kop. A static higher-order dependency pair framework. In Proceedings of ESOP 2019, 2019.
[Kop12] C. Kop. Higher Order Termination. PhD thesis, 2012.