Completeness of Verification System with
Separation Logic for Recursive Procedures
Mahmudul Faisal Al Ameen
Doctor of Philosophy
Department of Informatics
School of Multidisciplinary Sciences
SOKENDAI (The Graduate University for Advanced Studies)
Tokyo, Japan
September 2015
A dissertation submitted to
the Department of Informatics
School of Multidisciplinary Sciences
SOKENDAI (The Graduate University for Advanced Studies)
in partial fulfillment of the requirements for
the degree of
Doctor of Philosophy
Advisory Committee
Makoto TATSUTA National Institute of Informatics, SOKENDAI
Zhenjiang HU National Institute of Informatics, SOKENDAI
Makoto KANAZAWA National Institute of Informatics, SOKENDAI
Shin Nakajima National Institute of Informatics, SOKENDAI
Yukiyoshi Kameyama University of Tsukuba
Thesis advisor: Makoto Tatsuta Mahmudul Faisal Al Ameen
SOKENDAI (The Graduate University for Advanced Studies)
Abstract
Completeness of Verification System with Separation Logic for
Recursive Procedures
The contributions of this dissertation are two results; the first result gives a new
complete Hoare’s logic system for recursive procedures, and the second result proves
the completeness of the verification system based on Hoare’s logic and separation
logic for recursive procedures. The first result is a complete verification system for
reasoning about while programs with recursive procedures that can be extended to
separation logic. To obtain it, this work introduces two new inference rules, shows the
derivability of an existing inference rule, and removes other redundant inference rules and an
unsound axiom in order to show completeness. The second result is a complete verifica-
tion system, which is an extension of Hoare’s logic and separation logic for mutual
recursive procedures. To obtain the second result, the language of while programs
with recursive procedures is extended with commands to allocate, access, mutate and
deallocate shared resources, and the logical system from the first result is extended
with the backward reasoning rules of Hoare’s logic and separation logic. Moreover,
it is shown that the assertion language of separation logic is expressive relative to the
programs. It also introduces a novel expression that is used to describe the complete
information of a given state in a precondition. In addition, this work uses the necessary
and sufficient precondition of a program for abort-free execution, which makes it
possible to utilize the strongest postconditions.
I would like to dedicate this thesis to my parents for their love, sac-
rifice and support all through my life.
Acknowledgments
I wish to express my deepest gratitude to my advisor, Prof. Makoto Tatsuta, for
giving me the opportunity, pre-admission supports and recommendations to study
in SOKENDAI with NII scholarship. His guidance with extreme patience is the key
driving force for my success today. His support not only encouraged but also nourished
me these last six years. Without his sincere efforts, I could not be in today's position.
His lessons helped me realize the rigid nature of mathematics and logic.
I am very grateful to my co-supervisor Asst. Prof. Makoto Kanazawa for being
one of the most important supporting forces for my work. His suggestions at the logic
seminars have been a very good basis for elaborating and structuring my research; he
gave me enough feedback to mature my ideas while always pointing me toward interesting
directions.
I would like to thank Prof. Zhenjiang Hu for providing valuable remarks, allowing
me to improve the paper drafts and clarify my arguments. He has shown me a very
important horizon of program composition that enabled me to realize programs as
mathematical objects. His care and support especially kept me optimistic in my hard
times.
I am grateful to Prof. Kazushize Terui for his suggestions and support in the early
stage of my study.
I would like to express my sincere appreciation to Dr. Daisuke Kimura for giving
me a long time support as an elder brother. I am indebted to Dr. Takayuki Koai for his
most sincere effort to make my life easy in Japan. On different occasions, their academic
tutoring, as well as mental support, is never to be forgotten. Their explanations of
several basic and advanced topics often made me understand difficult mathematical
topics in an easy way. Special thanks to Dr. Kazuhiro Inaba for an important criticism
that revealed a gap in one of my earlier solutions. Dr. Kazuyuki Asada's advice and Mr.
Toshihiko Uchida's friendly support also helped me to reach this position. I am
very grateful to all of them. Moreover, I thank Mr. Koike Atsushi for sharing his
LaTeX class file, based on which the class file for this dissertation is prepared.
I thank National Institute of Informatics for providing me the necessary financial
and resource support. Further, I also thank the NII officials at the student support section
for their administrative assistance.
I owe thanks to Prof. Dr. Shorif Uddin for driving me and directly helping me to apply to
NII for the doctoral study. I also thank Dr. Zahed Hasan and Mr. Enayetur Rahman
for their contributions in my life and research career.
Finally, I thank my parents for their love, sacrifices, long patience and endless en-
couragements. I thank my wife for her invaluable and direct care and support in the
final year. I also thank my sister, relatives and friends for their great mental support
that helped me to handle the challenging research situations.
Contents
1 Introduction
1.1 Motivation
1.2 Main Contribution
2 Background
3 New Complete System of Hoare's Logic with Recursive Procedures
3.1 Language
3.2 Semantics
3.3 Logical System
3.4 Completeness
4 Separation Logic for Recursive Procedures
4.1 Language
4.2 Semantics
4.3 Logical System
5 Soundness and Completeness
5.1 Soundness
5.2 Expressiveness
5.3 Completeness
6 Conclusion
1 Introduction
1.1 Motivation
It is widely accepted that a program needs to be verified to ensure that it is correct.
A correct program is guaranteed to perform the given task as expected. It is very
important to ensure the safety of mission-critical, medical, spacecraft, nuclear reactor,
financial, genetic-engineering and simulator programs. Moreover, everyone desires
bug-free programs.
Formal verification is intended to deliver programs which are completely free of
bugs or defects. It verifies the source code of a program statically. So formal verifi-
cation does not depend on the execution of a program. The time required to verify a
program depends on neither its runtime and memory complexity nor the magnitude
of its inputs. A program is required to be verified only once since it does not depend
on test cases. Hence, formal verification of programs is important to save both time
and expense commercially, and for its theoretical superiority. Among formal verification
approaches, model checking and Hoare's logic are prominent. Model checking
computes whether a model satisfies a given specification, whereas Hoare's logic shows
it for all models by provability [8].
Since it was proposed by Hoare [7], numerous works on Hoare’s logic have been
done [1, 4–6, 10]. Several extensions have also been proposed [1], among which
some attempted to verify programs that access the heap or shared resources. But until
the beginning of the twenty-first century, very few of them were simple enough to use. On the
other hand, since the development of programming languages like C and C++, the usage
of pointers in programs (which are called pointer programs) gained much popularity
for their ability to use shared memory and other resources directly and for faster
execution. Yet this ability also causes program crashes, because it is difficult to keep
track of each memory operation, and it may lead to unsafe heap operations.
A program crash occurs when the program tries to access a memory cell that has
already been deallocated or when a memory cell is accessed before its allocation.
So apparently it became necessary to have an extension of Hoare’s logic that can verify
such pointer programs. In 2002, Reynolds proposed separation logic [13]. It was a
breakthrough to achieve the ability to verify pointer programs. Especially it can guar-
antee safe heap operations of programs. Although recently we can find several works
on separation logic [2, 9] and its extensions and applications [3, 11, 12], there are few
works found to show their completeness [14]. Tatsuta et al. [14] shows the complete-
ness of the separation logic for pointer programs which is introduced in [13]. In this
paper, we will show the completeness of an extended logical system. Our logical sys-
tem is intended to verify pointer programs with mutual recursive procedures. Among
several versions of the same inference rule Reynolds offered in [13] for separation
logic, a concise set of backward reasoning rules has been chosen in [14]. The later
work in [14] also offers rigorous mathematical discussions. The problems regarding
the completeness of Hoare's logic, the concept of relative completeness, completeness
of Hoare's logic with recursive procedures and many other important topics have been
discussed in detail in [1]. Our work begins with [14] and [1].
In modern days, programs are written in segments with procedures, which makes
the programs shorter in size and logically structured, and increases the reusability of
code. So it is important to use procedures and heap operations (use of shared mutable
resources) both in a single program. Verifying programs with such features is the main
motivation of our work.
1.2 Main Contribution
Our goal is to give a relatively complete logical system that can be used for reason-
ing about pointer programs with mutual recursive procedures. A logical system for
software verification is called complete if every true judgment can be derived from
that system. It ensures the strength of the logical system such that no further develop-
ment is necessary for it. If all true asserted programs are provable in Hoare’s system
where all true assertions are provided, we call it a relatively complete system. We will
show the relative completeness of our system. A language is expressive if the weakest
precondition can be defined in the language. We will also show that our language of
specification is expressive for our programs. Relative completeness is discussed vastly
in [1, 5]. In this paper, relative completeness is sometimes paraphrased as complete-
ness when it is not ambiguous.
The main contributions of our paper are as follows:
(1) A new complete logical system for Hoare’s logic for recursive procedures.
(2) A new logical system for verification of pointer programs and recursive proce-
dures.
(3) Proving the soundness and the completeness theorems.
(4) Proving that our assertion language is expressive relative to our programs.
We know that Hoare’s logic with recursive procedures is complete [1]. We also
know that Hoare’s logic with separation logic is complete [14]. But we do not know
if Hoare’s logic and separation logic for recursive procedures is complete.
In order to achieve our contributions, we will first construct our logical system by
combining the axioms and inference rules of [1] and [14]. Then we will prove the
expressiveness by coding the states in a similar way to [14]. At last we will follow a
similar strategy in [1] to prove the completeness.
Although one may feel it easy to combine these two logical systems to achieve such
a complete system, in reality it is not the case. Now we will discuss some challenges
we face to prove its relative completeness.
(1) The axiom (AXIOM 9: INVARIANCE AXIOM), which is essential in [1] to
show completeness, is not sound in separation logic.
(2) In the completeness proof of the extension of Hoare’s logic for the recursive
procedures in [1], the expression −→x = −→z (−→x are all program variables and −→z are
fresh) is used to describe the complete information of a given state in a precondition.
A state in Hoare’s logic is only a store, which is a function assigning to each variable a
value. In separation logic, a state is a pair of a store and a heap. So the same expres-
sion cannot be used for a similar purpose for a heap, because a store information may
contain variables x1, . . . , xm which are assigned z1, . . . , zm respectively, while a heap
information consists of the set of the physical addresses only in the heap and their
corresponding values. The vector notation cannot express the general information of
the size of the heap and its changes because of allocation and deallocation of memory
cells.
(3) Another challenge is to utilize the strongest postcondition of a precondition
and a program. In case a program aborts in a state for which the precondition is valid,
the strongest postcondition of the precondition and the program does not exist. But
utilizing the strongest postcondition is necessary for completeness proof, because the
completeness proof of [1] depends on it.
Now it is necessary to solve these obstacles for the proof of the completeness of our
system. That is why it is quite challenging to solve the completeness theorem which
is our principal goal.
The solutions to the challenges stated above are as follows:
(1) We will give an inference rule (inv-conj) as an alternative to the axiom (AXIOM 9:
INVARIANCE AXIOM) in [1]. It will accept a pure assertion which does not
have a variable common to the program. We will also give an inference rule (exists)
that is analogous to the existential introduction rule in the first-order predicate cal-
culus. We will show that the inference rule (RULE 10: SUBSTITUTION RULE I)
in [1] is derivable in our system. Since the inference rules (RULE 11: SUBSTITU-
TION RULE II) and (RULE 12: CONJUNCTION RULE) in [1] are redundant in
our system, we will remove them. It gives us the new complete system for Hoare’s
logic for mutual recursive procedures. We will extend this system with the inference
rules in [14] to give the verification system for pointer programs with mutual recur-
sive procedures. As a result, the set of our axioms and inference rules will be quite
different from the union of those of [1] and [14].
(2) We will give an appropriate assertion to describe the complete information of
a given state in a precondition. Besides the expression −→x = −→z for the store infor-
mation, we will additionally use the expression Heap(xh) for the heap information,
where xh keeps a natural number that is obtained by a coding of the current heap.
(3) For pointer programs, it is difficult to utilize the strongest postcondition be-
cause it is impossible to assert a postcondition for A and P where P may abort in a
state for which A is true. We use {A}P{True} as the abort-free condition of A and P.
For the existence of the strongest postcondition, it is necessary for {A}P{True} to be
true. We will give the necessary and sufficient precondition WP,True(−→x ) for the fact
that the program P will never abort.
Outline of This Paper
Our background will be presented in Chapter 2. A new complete Hoare’s logic for
recursive procedures will be given in Chapter 3. We will define our languages, seman-
tics and the logical system in Chapter 4. In Chapter 5, we will prove the soundness,
expressiveness and completeness. We will conclude in Chapter 6.
2 Background
Hoare introduced an axiomatic method, Hoare’s logic, to prove the correctness of
programs in 1969 [7]. Floyd’s intermediate assertion method was behind the ap-
proach of Hoare’s logic. Besides its great influence in designing and verifying pro-
grams, it has also been used to define the semantics of programming languages.
Among many works which extended the approach of Hoare to prove a program
correct, some were not useful or difficult to use. In 1981, Apt presented a survey of
various results concerning the approach of Hoare in [1]. His work focused mainly
on the soundness and completeness issues. He first presented the proof system for
while programs along with its soundness, expressiveness, and completeness. He then
presented the extension of it to recursive procedures, local variable declarations and
procedures with parameters with corresponding soundness and completeness.
Although several extensions of Hoare's logic are found in the literature, until the work of
Reynolds in 2002 [13], an extension to pointer programs was missing. Reynolds pro-
vided the separation logic, which is an assertion language for shared mutable data
structures and resources. In his work, he also extended Hoare’s logic to pointer pro-
grams with several sets of logical rules and showed its soundness. Tatsuta chose only
the rules with backward reasoning and proved its completeness in [14]. The scope of
this logical system was limited only to while programs with memory access opera-
tions.
Since this work intends to extend Hoare’s logic and separation logic to mutual re-
cursive procedures, it will be based on [1] and [14].
Hoare’s Logic for Recursive Procedures
In this section, we will discuss the verification system for while programs with re-
cursive procedures given in [1]. Here we will present the language of the while pro-
grams with recursive procedures and the assertions, their semantics, a logical system
to reason about the programs and the completeness proof. We will extend this proof
to pointer programs with recursive procedures later.
Language
The language of assertion in [1] is the first order language with equality of Peano
arithmetic. Its variables are denoted by x, y, z, w, . . .. Expressions, denoted by e, are
defined by e ::= x | 0 | 1 | e + e | e × e. A quantifier-free formula b is defined by
b ::= e = e | e < e | ¬b | b ∧ b | b ∨ b | b → b.
The formulas (of the assertion language), denoted by A, B, C, are defined by
A ::= e = e | e < e | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA.
Recursive procedures are denoted by R. While programs extended to recursive
procedures, denoted by P, Q, are defined in [1] by
P, Q ::= x := e
| if (b) then (P) else (P)
| while (b) do (P)
| P; P
| skip
| R.
We assume that procedure R is declared with its body Q.
The basic formula of Hoare’s logic is composed with three elements. They are two
assertions A and B and a program P. It is expressed in the form
{A}P{B}
that is also called a correctness formula or an asserted program. Here A and B are
called the precondition and the postcondition of the program P respectively. When-
ever A is true before the execution of P and the execution terminates, B is true after
the execution.
Semantics
States, denoted by s, are defined in [1] as functions from the set of variables V to the
set of natural numbers N. The semantics of the programming language is defined first
by [[P]]− for the programs that do not contain procedures, which is a partial function
from States to States.
The semantics of assertions is denoted by [[A]]s, which gives us the truth value of A at
the state s.
Definition 2.0.1 The definition of the semantics of programs is given below.
[[x := e]]−(s) = s[x := [[e]]s],
[[if (b) then (P1) else (P2)]]−(s) = [[P1]]−(s) if [[b]]s = True, [[P2]]−(s) otherwise,
[[while (b) do (P)]]−(s) = s if [[b]]s = False, [[while (b) do (P)]]−([[P]]−(s)) otherwise,
[[P1; P2]]−(s) = [[P2]]−([[P1]]−(s)),
[[skip]]−(s) = s.
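To make Definition 2.0.1 concrete, the following small Python sketch (our own illustration, not part of [1]) runs a procedure-free while program, encoded as nested tuples, as a partial function on stores; a non-terminating program simply makes the interpreter loop forever:

    def run(prog, s):
        """[[P]]-(s): execute a procedure-free program on a store (a dict from
        variable names to natural numbers) and return the resulting store."""
        tag = prog[0]
        if tag == "assign":                  # ("assign", x, e) encodes x := e
            _, x, e = prog
            s2 = dict(s)
            s2[x] = e(s)                     # expressions are functions of the store
            return s2
        if tag == "if":                      # ("if", b, P1, P2)
            _, b, p1, p2 = prog
            return run(p1, s) if b(s) else run(p2, s)
        if tag == "while":                   # ("while", b, P)
            _, b, p = prog
            while b(s):
                s = run(p, s)
            return s
        if tag == "seq":                     # ("seq", P1, P2)
            _, p1, p2 = prog
            return run(p2, run(p1, s))
        if tag == "skip":
            return s
        raise ValueError("unknown construct: %r" % (tag,))

    # Example: while (0 < x) do (x := x - 1) started in a store with x = 3.
    prog = ("while", lambda s: 0 < s["x"], ("assign", "x", lambda s: s["x"] - 1))
    print(run(prog, {"x": 3}))               # prints {'x': 0}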
In order to define the semantics of programs which include recursive procedures,
Apt provided the approximation semantics of programs. He defined a procedure-less
program P(n) by induction on n:
P(0) = Ω,
P(n+1) = P[Q(n)/R].
He then defined the semantics of programs by
[[P]] = ∪i=0,...,∞ [[P[Q(i)/R]]]−.
An asserted program (or a correctness formula) {A}P{B} is defined to be true if
and only if for all states s, s′, if [[A]]s = True and [[P]](s) = s′ then [[B]]s′ = True.
Logical System
The logical system H given in [1] consists of the following axioms and inference
rules. Here Γ is used as a set of asserted programs. A judgment is defined as Γ ⊢
{A}P{B}. var(P) is defined as the set of all variables that appear in the execution of P.
AXIOM 1: ASSIGNMENT AXIOM
Γ ⊢ {A[x := e]}x := e{A}
(assignment)
RULE 2: COMPOSITION RULE
Γ ⊢ {A}P1{C}    Γ ⊢ {C}P2{B}
Γ ⊢ {A}P1; P2{B}
(composition)
RULE 3: if-then-else RULE
Γ ⊢ {A ∧ b}P1{B}    Γ ⊢ {A ∧ ¬b}P2{B}
Γ ⊢ {A}if (b) then (P1) else (P2){B}
(if-then-else)
RULE 4: while RULE
Γ ⊢ {A ∧ b}P{A}
Γ ⊢ {A}while (b) do (P){A ∧ ¬b}
(while)
RULE 5: CONSEQUENCE RULE
Γ ⊢ {A1}P{B1}
Γ ⊢ {A}P{B}
(consequence)
(A → A1, B1 → B true)
RULE 8: RECURSION RULE
Γ ∪ {{A}R{B}} ⊢ {A}Q{B}
Γ ⊢ {A}R{B}
(recursion)
AXIOM 9: INVARIANCE AXIOM
Γ ⊢ {A}R{A}
(invariance)
(FV(A) ∩ var(R) = ∅)
RULE 10: SUBSTITUTION RULE I
Γ ⊢ {A}R{B}
Γ ⊢ {A[−→z := −→y ]}R{B[−→z := −→y ]}
(substitution I)
(−→y , −→z ̸∈ var(R))
RULE 11: SUBSTITUTION RULE II
Γ ⊢ {A}R{B}
Γ ⊢ {A[−→z := −→y ]}R{B}
(substitution II)
(−→z ̸∈ var(R) ∪ FV(B))
RULE 12: CONJUNCTION RULE
Γ ⊢ {A}R{B}    Γ ⊢ {A′}R{C}
Γ ⊢ {A ∧ A′}R{B ∧ C}
(conjunction)
An asserted formula {A}x := e{A[x := e]} may seem to fit the assignment
axiom better. But it does not. For example, {x = a ∧ b = a + 1}x := b{(x = a ∧ b =
a + 1)[x := b]}, that is {x = a ∧ b = a + 1}x := b{b = a ∧ b = a + 1}, is not obviously
true. Rather {(x = b)[x := b]}x := b{x = b}, that is {b = b}x := b{x = b},
is true. The composition rule is similar in concept to the cut rule. When two programs
are executed one after another, the postcondition of the former one is the precondition
of the latter. The precondition of the former one and the postcondition of the latter
one are preserved for the execution of the composition of those two programs. The
if-then-else rule comes from the fact that the truth value of b determines the execution of
either P1 or P2. The rule itself is very natural. The while rule is a bit tricky. Here A
is called a loop invariant, which is an assertion that is preserved before and after the
execution of P. The truth of b triggers execution of P, and naturally the execution
terminates only when b is false.
The consequence rule is not an ordinary rule like the others. Here the important fact
is that a stronger precondition and a weaker postcondition may replace, respectively,
the precondition and the postcondition of a valid asserted program without affecting
its validity. An example may help to understand it better. The asserted program
{x = a}x := x + 1{x = a + 1} is indeed valid. But from the assignment axiom we may
get only {(x = a + 1)[x := x + 1]}x := x + 1{x = a + 1}, that is {x + 1 = a + 1}x :=
x + 1{x = a + 1}. Since x = a → x + 1 = a + 1, with the help of the consequence
rule we finally get {x = a}x := x + 1{x = a + 1}.
The recursion rule states that if the assumption of a valid asserted program for a
recursive procedure gives us a valid asserted program for its body, then the
asserted program for the procedure is indeed valid. The invariance axiom confirms
that the precondition is preserved in the postcondition if none of its variables is
accessed by the execution of a recursive procedure. Substitution rule I allows variable
substitution in the assertions of an asserted program if the recursive procedure
does not access the substituted and substituting variables. Substitution rule II allows
the substitution of only the variables in the precondition, provided those appear neither in
the recursive procedure nor in the postcondition. The conjunction rule allows the
preconditions and the postconditions of two asserted programs for the same recursive
procedure to be conjoined.
Soundness
In [1], ⊢H {A}P{B} denotes the fact that {A}P{B} is provable in the logical sys-
tem H, which uses the assumption that all the true assertions are provided (for conse-
quence rule). In his work, the notion of the truth of an asserted program is introduced
where he chose the standard interpretation of the assertion language with the domain
of natural numbers.
Apt called an asserted program valid if it is true under all interpretations. He also
called a proof rule sound if for all interpretations it preserves the truth of asserted pro-
grams. Since it is easy to prove that the axioms are valid and the proof rules are sound,
it can be said that the logical system is proved to be sound by induction on the length
of proofs.
His soundness theorem claims that for every asserted program {A}P{B} in the
logical system H, if ⊢H {A}P{B} is provable under the presence of all true assertions
then {A}P{B} is true.
Completeness in the sense of Cook
The strongest postcondition of an assertion and a program and the weakest precondition
of a program and an assertion have a key role in defining the completeness in
the sense of Cook of such a proof system, where general completeness does not hold.
Now we will define the strongest postcondition and the weakest precondition.
Definition 2.0.2 The strongest postcondition of an assertion A and a program P is defined
by
SP(A, P) = { s′ | ∃s([[A]]s = True ∧ [[P]](s) = s′) }.
The weakest precondition of a program P and an assertion A is defined by
WP(P, A) = { s | ∀s′([[P]](s) = s′ → [[A]]s′ = True) }.
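As a small worked instance (our own example, in line with the assignment example earlier in this chapter): for the assignment x := x + 1, the set SP(x = a, x := x + 1) consists of the states with s(x) = s(a) + 1, so it is defined by the assertion x = a + 1; conversely, WP(x := x + 1, x = a + 1) consists of the states with s(x) = s(a), so it is defined by the assertion x = a.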
Definition 2.0.3 An assertion language L is said to be expressive relative to the set of
programs P if for all assertions A ∈ L and programs P ∈ P, there exists an assertion
S ∈ L which defines the strongest postcondition SP(A, P).
Definition 2.0.4 A proof system G for a set of programs P is said to be complete in the
sense of Cook if, for all L such that L is expressive relative to P and for every asserted
formula {A}P{B}, if {A}P{B} is true then ⊢G {A}P{B} is provable.
Apt presented the proof of the completeness of the system in the sense of Cook in
[1] using two central lemmas. We will present them and discuss their proof. Assume
that −→x is the sequence of all variables which occur in P and −→z is a sequence of some
new variables and both of their lengths are same. Assume that the assertion language
is expressive for the logical system. So, there exists an assertion S that defines the
strongest postcondition of −→x = −→z and R. The asserted program {−→x = −→z }R{S}
is the most general formula for R, since any other true asserted program about R can
be derived from {−→x = −→z }R{S}. This claim is the contents of the first lemma.
Lemma 2.0.5 (Apt 1981) If {A}P{B} is true then {−→x = −→z }R{S} ⊢ {A}P{B} is
provable provided that all the true assertions are given.
Proof.
It is proved by induction on P where the most interesting case is P = R. The other
cases are similar to those of the system H in [1].
Suppose that P is R. Assume {A}R{B} is true. We have ⊢ {−→x = −→z }R{S}. Let A1
be A[−→z := −→u ] and B1 be B[−→z := −→u ] where −→u ̸∈ FV(B) ∪ var(R). By the invariance
axiom, ⊢ {A1[−→x := −→z ]}R{A1[−→x := −→z ]} is provable. By the conjunction rule,
⊢ {−→x = −→z ∧ A1[−→x := −→z ]}R{S ∧ A1[−→x := −→z ]} is provable since FV(A1[−→x := −→z ]) ∩ var(R) = ∅.
We now show that S ∧ A1[−→x := −→z ] → B1.
Assume [[S ∧ A1[−→x := −→z ]]]s = True. By definition [[S]]s = True. By the property
of the strongest postcondition, there exists a state s′ such that [[R]](s′) = s and
[[−→x = −→z ]]s′ = True.
By the invariance axiom, ⊢ {¬A1[−→x := −→z ]}R{¬A1[−→x := −→z ]} is provable. Then
by the conjunction rule ⊢ {−→x = −→z ∧ ¬A1[−→x := −→z ]}R{S ∧ ¬A1[−→x := −→z ]}. By
soundness, {−→x = −→z ∧ ¬A1[−→x := −→z ]}R{S ∧ ¬A1[−→x := −→z ]} is true. Now
suppose that [[¬A1[−→x := −→z ]]]s′ = True. Hence [[−→x = −→z ∧ ¬A1[−→x := −→z ]]]s′ =
True. Therefore, [[¬A1[−→x := −→z ]]]s = True. But [[A1[−→x := −→z ]]]s = True by the
assumption. It contradicts the assumption and hence [[A1[−→x := −→z ]]]s′ = True.
Since −→x = −→z ∧ A1[−→x := −→z ] → A1, we have [[A1]]s′ = True. Then [[A]]s′[−→z := s′(−→u )] = True.
Then [[R]](s′[−→z := s′(−→u )]) = s[−→z := s(−→u )] since −→z , −→u ̸∈ var(R). Then by
definition, [[B]]s[−→z := s(−→u )] = True. Then by definition, [[B1]]s = True. Hence S ∧ A1[−→x := −→z ] → B1 is true.
Then by the consequence rule, ⊢ {−→x = −→z ∧ A1[−→x := −→z ]}R{B1} is provable.
Then by the substitution rule II, ⊢ {−→x = −→x ∧ A1}R{B1} is provable. Then by the
consequence rule, ⊢ {A1}R{B1} is provable. By the substitution rule I, ⊢ {A1[−→u :=
−→z ]}R{B1[−→u := −→z ]}. We have A → A1[−→u := −→z ] and B1[−→u := −→z ] → B. Then by
the consequence rule, ⊢ {A}P{B} is provable. □
Lemma 2.0.6 (Apt 1981) ⊢ {−→x = −→z }R{S} is provable.
Proof.
By definition of S, {−→x = −→z }R{S} is true and hence {−→x = −→z }Q{S} is true
since [[R]] = [[Q]]. By Lemma 2.0.5, {−→x = −→z }R{S} ⊢ {−→x = −→z }Q{S} is
provable. By the recursion rule, ⊢ {−→x = −→z }R{S} is provable. □
The completeness theorem states that if an asserted program {A}P{B} is true then
⊢ {A}P{B} is provable where all the true assertions are given. It is the central concept
of completeness in the sense of Cook.
Theorem 2.0.7 (Apt 1981) If {A}P{B} is true then ⊢ {A}P{B} is provable.
Proof.
Assume {A}P{B} is true. By Lemma 2.0.5, {−→x = −→z }R{S} ⊢ {A}P{B} is
provable. By Lemma 2.0.6, ⊢ {−→x = −→z }R{S} is provable. Then ⊢ {A}P{B} is
provable. 2
3 New Complete System of Hoare's Logic with Recursive Procedures
We introduce a complete system of Hoare’s logic with recursive procedures. Apt
gave a system for the same purpose and showed its completeness in [1]. Our system
is obtained from Apt’s system by replacing the invariance axiom, the substitution rule
I, the substitution rule II, and the conjunction rule by the rules (inv-conj) and (ex-
ists). Apt suggested without proofs that one could replace them by his substitution
rule I, (inv-conj), and (exists) to get another complete system. We prove that the sub-
stitution rule I can actually be derived in our system. We also give a detailed proof of
the completeness of our system.
3.1 Language
Our assertion is a formula A of Peano arithmetic. We define the language L as
follows.
Definition 3.1.1 Formulas A are defined by
A ::= e = e | e < e | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA
We will sometimes call a formula an assertion.
We define FV(A) as the set of free variables in A. We define FV(e) similarly.
Our program is a while-program with parameterless recursive procedures.
Definition 3.1.2 Programs, denoted by P,Q, are defined by
P, Q ::= x := e
| if (b) then (P) else (P)
| while (b) do (P)
| P; P
| skip
| Ri.
b is a formula without the quantifiers. Ri is a parameter-less procedure name having
Qi as its definition body. We define the language L−
as L excluding the construct R.
An asserted program is defined by {A}P{B}, which means the partial correctness.
3.2 Semantics
Definition 3.2.1 We define the semantics of our programming language. For a program
P, its meaning [[P]] is defined as a partial function from States to States. We will define
[[P]](r0) as the resulting state after termination of the execution of P with the initial state r0.
If the execution of P with the initial state r0 does not terminate, we leave [[P]](r0) undefined.
In order to define [[P]], we would like to define [[P]]− for all P in the language L−. We define
[[P]]− by induction on P in L− as follows:
[[x := e]]−(s) = s[x := [[e]]s],
[[if (b) then (P1) else (P2)]]−(s) = [[P1]]−(s) if [[b]]s = True, [[P2]]−(s) otherwise,
[[while (b) do (P)]]−(s) = s if [[b]]s = False, [[while (b) do (P)]]−([[P]]−(s)) otherwise,
[[P1; P2]]−(s) = [[P2]]−([[P1]]−(s)),
[[skip]]−(s) = s.
Definition 3.2.2 For an asserted program {A}P{B}, the meaning of {A}P{B} is defined
as True or False. {A}P{B} is defined to be True if the following holds.
For all s and s′, if [[A]]s = True and [[P]](s) = s′, then [[B]]s′ = True.
Definition 3.2.3 The semantics of P in L is defined by
[[P]](s) = s′ if { [[P(i)]]−(s) | i ≥ 0 } = {s′}, and [[P]](s) is undefined if { [[P(i)]]−(s) | i ≥ 0 } = ∅.
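As an illustration of Definition 3.2.3 (our own example, assuming the unfolding P(i) is defined as in the previous chapter): suppose the only declaration is Procedure R1(if (x = 0) then (skip) else (x := x − 1; R1)). Then [[R1(k)]]−(s) is defined exactly when s(x) < k, and in that case it equals s[x := 0]; hence { [[R1(i)]]−(s) | i ≥ 0 } = {s[x := 0]} for every store s, and therefore [[R1]](s) = s[x := 0].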
3.3 Logical System
This section defines the logical system.
We will write A[x := e] for the formula obtained from A by replacing x by e.
Definition 3.3.1 Our logical system consists of the following inference rules. As mentioned
in the previous section, we will use Γ for a set of asserted programs. A judgment is defined
as Γ ⊢ {A}P{B}.
Γ ⊢ {A}skip{A}
(skip)
Γ, {A}P{B} ⊢ {A}P{B}
(axiom)
Γ ⊢ {A[x := e]}x := e{A}
(assignment)
Γ ⊢ {A ∧ b}P1{B}    Γ ⊢ {A ∧ ¬b}P2{B}
Γ ⊢ {A}if (b) then (P1) else (P2){B}
(if)
Γ ⊢ {A ∧ b}P{A}
Γ ⊢ {A}while (b) do (P){A ∧ ¬b}
(while)
Γ ⊢ {A}P1{C}    Γ ⊢ {C}P2{B}
Γ ⊢ {A}P1; P2{B}
(comp)
Γ ⊢ {A1}P{B1}
Γ ⊢ {A}P{B}
(conseq)
(A → A1, B1 → B)
Γ ∪ {{Ai}Ri{Bi} | i = 1, . . . , nproc} ⊢ {A1}Q1{B1}
...
Γ ∪ {{Ai}Ri{Bi} | i = 1, . . . , nproc} ⊢ {Anproc}Qnproc{Bnproc}
Γ ⊢ {Ai}Ri{Bi}
(recursion)
Γ ⊢ {A}P{C}
Γ ⊢ {A ∧ B}P{C ∧ B}
(inv-conj)
(FV(B) ∩ Mod(P) = ∅)
Γ ⊢ {A}P{B}
Γ ⊢ {∃x.A}P{B}
(exists)
(x ̸∈ FV(B) ∪ EFV(P))
We say {A}P{B} is provable and we write ⊢ {A}P{B}, when ⊢ {A}P{B} can be
derived by these inference rules.
The rule (exists) is analogous to the existential introduction rule of first-order predicate
calculus.
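As a small illustration of the two new rules (our own example): from Γ ⊢ {x = 0}x := x + 1{x = 1}, the rule (inv-conj) with B being y = 5 yields Γ ⊢ {x = 0 ∧ y = 5}x := x + 1{x = 1 ∧ y = 5}, since FV(y = 5) ∩ Mod(x := x + 1) = ∅. Similarly, from Γ ⊢ {x = 5 ∧ y = x}z := y{z = 5}, the rule (exists) yields Γ ⊢ {∃x(x = 5 ∧ y = x)}z := y{z = 5}, since x ̸∈ FV(z = 5) ∪ EFV(z := y).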
3.4 Completeness
Lemma 3.4.1 If {A}P1{B} is true and [[P1]] = [[P2]] then {A}P2{B} is true.
Proof.
By definition.
Definition 3.4.2 X is called the strongest postcondition of P and A if and only if the following
holds.
(1) For all s, s′, if [[A]]s = True and [[P]](s) = s′ then s′ ∈ X.
(2) For all Y, if ∀s, s′([[A]]s = True ∧ [[P]](s) = s′ → s′ ∈ Y) then X ⊆ Y.
Definition 3.4.3 SA,P(−→x ) is defined as the strongest postcondition for A and P.
SA,P(−→x ) gives the strongest assertion S such that {A}P{S} is true.
Lemma 3.4.4 If ⊢ {A}P{B} then ⊢ {A[−→x := −→z ]}P{B[−→x := −→z ]} where −→z , −→x ̸∈ EFV(P).
Proof. Assume ⊢ {A}P{B} and −→z ̸∈ EFV(P). Then by (inv-conj), ⊢ {A ∧ −→x =
−→z }P{B ∧ −→x = −→z }. We have B ∧ −→x = −→z → B[−→x := −→z ]. Then by (con-
seq), ⊢ {A ∧ −→x = −→z }P{B[−→x := −→z ]}. Then by (exists), ⊢ {∃−→z (A ∧ −→x =
−→z )}P{B[−→x := −→z ]}. We have A[−→x := −→z ] → ∃−→z (A ∧ −→x = −→z ). Then by
(conseq), ⊢ {A[−→x := −→z ]}P{B[−→x := −→z ]}. 2
Lemma 3.4.5 If {A}P{B} is true and −→z ̸∈ EFV(P) then Γ ⊢ {A}P{B} where
Γ = {{−→x = −→z }Ri{S−→x =−→z ,Ri(−→x )} | i = 1, . . . , n}, −→x = x1, . . . , xm and
{xj | j = 1, . . . , m} = EFV(P).
Proof. We will prove it by induction on P. We will consider the cases of P.
If P is other than Ri, the proofs of these cases are similar to those of the completeness
proof of H given in [1].
Case P is Ri.
Assume {A}Ri{B} is true and −→z ̸∈ EFV(Ri). We have Γ ⊢ {−→x = −→z }Ri{S−→x =−→z ,Ri(−→x )}.
Let A1 be A[−→z := −→u ] and B1 be B[−→z := −→u ] where −→u ̸∈ FV(B) ∪ EFV(Ri).
By the rule (inv-conj), Γ ⊢ {−→x = −→z ∧ A1[−→x := −→z ]}Ri{S−→x =−→z ,Ri(−→x ) ∧ A1[−→x := −→z ]}
since FV(A1[−→x := −→z ]) ∩ Mod(Ri) = ∅.
We now show that S−→x =−→z ,Ri(−→x ) ∧ A1[−→x := −→z ] → B1.
Assume [[S−→x =−→z ,Ri(−→x ) ∧ A1[−→x := −→z ]]]s = True. By definition
[[S−→x =−→z ,Ri(−→x )]]s = True. By the property of the strongest postcondition, there
exists a state s′ such that [[Ri]](s′) = s and [[−→x = −→z ]]s′ = True.
Now suppose that [[¬A1[−→x := −→z ]]]s′ = True. By (inv-conj), Γ ⊢ {−→x = −→z ∧
¬A1[−→x := −→z ]}Ri{S−→x =−→z ,Ri(−→x ) ∧ ¬A1[−→x := −→z ]}. Since Γ is true by definition,
by soundness, {−→x = −→z ∧ ¬A1[−→x := −→z ]}Ri{S−→x =−→z ,Ri(−→x ) ∧ ¬A1[−→x := −→z ]} is
true. Then by definition [[¬A1[−→x := −→z ]]]s = True. But [[A1[−→x := −→z ]]]s = True. It
contradicts the assumption and hence [[A1[−→x := −→z ]]]s′ = True.
Since −→x = −→z ∧ A1[−→x := −→z ] → A1, we have [[A1]]s′ = True. Then [[A]]s′[−→z := s′(−→u )] = True.
Then [[Ri]](s′[−→z := s′(−→u )]) = s[−→z := s(−→u )] since −→z , −→u ̸∈ EFV(Ri). Then by
definition, [[B]]s[−→z := s(−→u )] = True. Then by definition, [[B1]]s = True. Hence S−→x =−→z ,Ri(−→x ) ∧
A1[−→x := −→z ] → B1 is true. Then by the rule (conseq), Γ ⊢ {−→x = −→z ∧ A1[−→x :=
−→z ]}Ri{B1}. Then by the rule (exists), Γ ⊢ {∃−→z (−→x = −→z ∧ A1[−→x := −→z ])}Ri{B1}.
We have A1 → ∃−→z (−→x = −→z ∧ A1[−→x := −→z ]). Then by the rule (conseq), Γ ⊢
{A1}Ri{B1}. By Lemma 3.4.4, Γ ⊢ {A1[−→u := −→z ]}Ri{B1[−→u := −→z ]}. We have
A → A1[−→u := −→z ] and B1[−→u := −→z ] → B. Then by (conseq), Γ ⊢ {A}P{B}. □
The next lemma shows that the hypotheses {−→x = −→z }Ri{S−→x =−→z ,Ri(−→x )} used in
Lemma 3.4.5 are provable in our system.
Lemma 3.4.6 ⊢ {−→x = −→z }Rj{S−→x =−→z ,Rj(−→x )} for j = 1, . . . , n where −→x = x1, . . . , xm,
{xj | j = 1, . . . , m} = ∪i=1,...,n EFV(Ri) and −→z ̸∈ ∪i=1,...,n EFV(Ri).
Proof.
Assume −→z ̸∈ ∪i=1,...,n EFV(Ri) and −→x = x1, . . . , xm where {xj | j = 1, . . . , m} = ∪i=1,...,n EFV(Ri).
Fix j. Assume [[−→x = −→z ]]s = True. Assume [[Qj]](s) = r where Qj is the body
of Rj. Then by Lemma 3.4.1, [[Rj]](s) = r. By definition, [[S−→x =−→z ,Rj(−→x )]]r = True.
Then by definition, {−→x = −→z }Qj{S−→x =−→z ,Rj(−→x )} is true. By Lemma 3.4.5, {{−→x =
−→z }Ri{S−→x =−→z ,Ri(−→x )} | i = 1, . . . , n} ⊢ {−→x = −→z }Qj{S−→x =−→z ,Rj(−→x )}. Hence,
{{−→x = −→z }Ri{S−→x =−→z ,Ri(−→x )} | i = 1, . . . , n} ⊢ {−→x = −→z }Qj{S−→x =−→z ,Rj(−→x )}
for all j = 1, . . . , n. Then by the rule (recursion), ⊢ {−→x = −→z }Rj{S−→x =−→z ,Rj(−→x )}.
□
The following theorem is the key theorem of this paper. It says that our system is
complete.
Theorem 3.4.7 If {A}P{B} is true then {A}P{B} is provable.
Proof.
Assume {A}P{B} is true. Let −→z be such that −→z ̸∈ ∪i=1,...,n EFV(Ri) ∪ EFV(P)
and −→x = x1, . . . , xm where {xj | j = 1, . . . , m} = ∪i=1,...,n EFV(Ri) ∪ EFV(P). Then by
Lemma 3.4.5, {{−→x = −→z }Ri{S−→x =−→z ,Ri(−→x )} | i = 1, . . . , n} ⊢ {A}P{B}. By Lemma
3.4.6, ⊢ {−→x = −→z }Ri{S−→x =−→z ,Ri(−→x )} for i = 1, . . . , n. Then we have ⊢ {A}P{B}.
□
Apt’s system cannot be extended to separation logic, because his invariance axiom
is inconsistent with separation logic. On the other hand, we can extend our system to a
verification system with separation logic and recursive procedures in a straightforward
way.
4 Separation Logic for Recursive Procedures
4.1 Language
This section defines our programming language and our assertion language. Our
programming language inherits from the pointer programs in Reynolds’ paper [13].
Our assertion language is also the same as in [13], which is based on Peano arithmetic.
Base Language
We first define our base language, which will be used later for both a part of our
programming language and a part of our assertion language. It is essentially a first-
order language for Peano arithmetic. We call its formula a pure formula. We will use
i, j, k, l, m, n for natural numbers. Our base language is defined as follows. We have
variables x, y, z, w, . . . and constants 0, 1, null, denoted by c. The symbol null means
the null pointer. We have function symbols + and ×. Our predicate symbols, denoted
by p, are = and <. Terms, denoted by t, are defined as x | c | f(t, . . . , t). Our pure
formulas, denoted by A, are defined by
A ::= p(t, t) | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA
where p(t, t) is t = t or t < t. Here p(t1, t2) means that the predicate p holds for
terms t1, t2. The other formula constructions mean the usual logical connectives. We will
sometimes write the number n to denote the term 1 + (1 + (1 + . . . (1 + 0))) (n
times of 1 +).
Programming Language
Next we define our programming language, which is an extension of while pro-
grams to pointers and procedures. Its expressions are terms of the base language. That
is, our expressions, denoted by e, are defined by e ::= x | 0 | 1 | null | e + e | e × e.
Expressions mean natural numbers or pointers.
Its boolean expressions, denoted by b, are quantifier-free pure formulas and are defined
by b ::= e = e | e < e | ¬b | b ∧ b | b ∨ b | b → b. Boolean expressions are used as
conditions in a program.
We assume procedure names R1, . . . , Rnproc for some nproc. We will write R for these
procedure names.
Definition 4.1.1 Programs, denoted by P,Q, are defined by
P ::= x := e (assignment)
| if (b) then (P) else (P) (conditional)
| while (b) do (P) (iteration)
| P; P (composition)
| skip (no operation)
| x := cons(e, e) (allocation)
| x := [e] (lookup)
| [e] := e (mutation)
| dispose(e) (deallocation)
| R (mutual recursive procedure name)
R means a procedure name without parameters.
We write L for the set of programs. We write L−
for the set of programs that do
not contain procedure names.
The statement x := cons(e1, e2) allocates two new consecutive memory cells, puts
the values of e1 and e2 in the respective cells, and assigns the first address to x. The
statement x := [e] looks up the content of the memory cell at the address e and assigns
it to x. The statement [e1] := e2 changes the content of the memory cell at the address
e1 to e2. The statement dispose(e) deallocates the memory cell at the address e.
The programs x := e, skip, x := cons(e1, e2), x := [e], [e1] := e2 and dispose(e)
are called atomic programs. A program P is called pure if it does not contain any of
x := cons(e1, e2), x := [e], [e1] := e2 or dispose(e). It means that a pure program
never accesses memory cells directly.
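As a small illustration (our own example, not from [13]), the following program is not pure, since it uses all four heap operations:
    x := cons(1, 2);
    [x + 1] := 3;
    y := [x + 1];
    dispose(x); dispose(x + 1)
The allocation creates two consecutive cells at the addresses x and x + 1 with contents 1 and 2, the mutation changes the content of x + 1 to 3, the lookup assigns 3 to y, and the two dispose statements leave the heap empty again.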
We call Procedure R(Q) a procedure declaration where R is a procedure name and
Q is a program. The program Q is said to be the body of R. This means that we define
the procedure name R with its procedure body Q.
We assume the procedure declarations
{Procedure R1(Q1), . . . , Procedure Rnproc(Qnproc)}
that give procedure definitions to all procedure names in the rest of the paper. We
allow mutual recursive procedure calls.
Assertion Language and Asserted Programs
Our assertion language is a first-order language extended by the separating conjunction
∗ and the separating implication −∗ as well as emp and ↦. Its variables,
constants, function symbols, and terms are the same as those of the base language.
We have predicate symbols =, < and ↦ and a predicate constant emp. Our assertion
language is defined as follows.
Definition 4.1.2 Formulas A are defined by
A ::= emp | e = e | e < e | e ↦ e | ¬A | A ∧ A | A ∨ A | A → A | ∀xA |
∃xA | A ∗ A | A −∗ A
We will sometimes call a formula an assertion.
We define FV(A) as the set of free variables in A. We define FV(e) similarly.
The symbol emp means the current heap is empty. The formula e1 ↦ e2 means
the current heap has only one cell at the address e1 and its content is e2. The formula
A ∗ B means the current heap can be split into some two disjoint heaps such that the
formula A holds at one heap and the formula B holds at the other heap. The formula
A −∗ B means that for any heap disjoint from the current heap such that the formula
A holds at the heap, the formula B holds at the new heap obtained from the current
heap and the heap by combining them.
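For example (our own illustration): the assertion (x ↦ 1) ∗ (y ↦ 2) holds in a state whose heap consists of exactly the two cells at the addresses x and y, containing 1 and 2; since the two one-cell heaps must be disjoint, it forces x ≠ y. The assertion (x ↦ 1) −∗ A says that whenever the current heap is extended with a disjoint one-cell heap in which x ↦ 1 holds, the combined heap satisfies A.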
We use vector notation to denote a sequence. For example, −→e denotes the sequence
e1, . . . , en of expressions.
Definition 4.1.3 The expression {A}P{B} is called an asserted program, where A, B are
formulas and P is a program.
This means the program P with its precondition A and its postcondition B.
Unfolding of Procedures
We define the set of procedure names which are visible in a program. It will be
necessary later in defining dependency relation between two procedures.
Definition 4.1.4 The set PN(P) of procedure names in P is defined as follows.
PN(P) = ∅ if P is atomic,
PN(if (b) then (P ) else (P )) = PN(P ) ∪ PN(P ),
PN(P ; P ) = PN(P ) ∪ PN(P ),
PN(while (b) do (P)) = PN(P),
PN(Ri) = {Ri}.
We define the dependency relation between two procedures. When a procedure
name appears in the body of another procedure, we say the latter procedure depends
on the former procedure at level 1. When one procedure depends on another and the
latter one again depends on the third one, we say the first one also depends on the
third one. In this case, the level of the third dependency is determined by summing
up the levels of first and second dependencies mentioned above.
Definition 4.1.5 We define the relation Ri ⇝k Rj as follows:
Ri ⇝0 Ri,
Ri ⇝1 Rj if PN(Qi) ∋ Rj,
Ri ⇝k Rj if Ri = R′0 ⇝1 R′1 ⇝1 . . . ⇝1 R′k = Rj for some R′0, . . . , R′k.
The procedure dependency PD(Ri, k) of a procedure name Ri up to level k is defined by
PD(Ri, k) = { Rj | Ri ⇝l Rj, l ≤ k }.
This relation will be used to define EFV(P) and Mod(P) as well as the semantics of P.
Note that (1) PD(Ri, k) ⊆ PD(Ri, k + 1) for all k and (2) PD(Ri, k) ⊆ { Ri | i =
1, . . . , nproc } where nproc is the number of procedures in the declaration.
The following lemma will show that once the procedure dependencies of a procedure
up to two consecutive levels are the same, they are the same up to any higher level
too. The second claim states that nproc − 1 is sufficient for the largest level.
Lemma 4.1.6 (1) If PD(Ri, k) = PD(Ri, k + 1) then PD(Ri, k) = PD(Ri, k + l) for
all k, l ∈ N.
(2) PD(Ri, k) ⊆ PD(Ri, nproc − 1) for all k.
Proof.
(1) It is proved by induction on l.
Case 1. l is 0.
Its proof is immediate.
Case 2. l is l′ + 1.
Assume PD(Ri, k) = PD(Ri, k + 1). We can show that if R′i ∈ PD(Ri, k) then
R′i ∈ PD(Ri, k + l′ + 1). Now we will show that if R′i ∈ PD(Ri, k + l′ + 1) then
R′i ∈ PD(Ri, k). Assume R′i ∈ PD(Ri, k + l′ + 1). Then we have Rj such that Rj ∈
PD(Ri, k + l′) and Rj ⇝1 R′i. By the induction hypothesis, PD(Ri, k) = PD(Ri, k + l′).
Then Rj ∈ PD(Ri, k). Then by definition, R′i ∈ PD(Ri, k + 1). Then R′i ∈ PD(Ri, k)
by the assumption. Therefore, PD(Ri, k) = PD(Ri, k + l′ + 1).
(2) We will show that PD(Ri, m) = PD(Ri, m + 1) for some m < nproc by
contradiction. Assume for all m < nproc, PD(Ri, m) ̸= PD(Ri, m + 1). Then
PD(Ri, m) ⫋ PD(Ri, m + 1). Then we have |PD(Ri, m)| ≥ m + 1 and hence
|PD(Ri, nproc)| ≥ nproc + 1. But PD(Ri, nproc) ⊆ { Ri | i = 1, . . . , nproc } and then
|PD(Ri, nproc)| ≤ nproc. It is a contradiction. Therefore, PD(Ri, m) = PD(Ri, m + 1)
for some m < nproc. By (1), we now have PD(Ri, nproc − 1) = PD(Ri, nproc − 1 + l)
for all l.
Therefore, PD(Ri, k) ⊆ PD(Ri, nproc − 1) for all k. □
We need to define some closed program that never terminates in order to define
unfolding of a program for a specific number of times. First we will define Ω.
Definition 4.1.7 We define Ω as
while (0 = 0) do (skip)
Substitution of a program for a procedure name is defined below.
Definition 4.1.8 Let −→P′ = P′1, . . . , P′nproc where P′i is a program. P[−→P′] is defined by
induction on P as follows:
P[−→P′] = P if P is atomic,
(if (b) then (P1) else (P2))[−→P′] = (if (b) then (P1[−→P′]) else (P2[−→P′])),
(while (b) do (P))[−→P′] = (while (b) do (P[−→P′])),
(P1; P2)[−→P′] = (P1[−→P′]; P2[−→P′]),
(Ri)[−→P′] = P′i.
P[−→P′] is a program obtained from P by replacing the procedure names R1, . . . , Rnproc
by P′1, . . . , P′nproc respectively.
Unfolding transforms a program in language L into a program in language L−. Discussions
on the programs in language L can be reduced to those in L−, which are either
easy or already shown elsewhere. P(k) denotes P where each procedure name is
unfolded only k times. P(0) just replaces a procedure name by Ω, since a procedure
name is not unfolded any more, which means the procedure name is supposed not to
be executed. Here we present the unfolding of a program.
Definition 4.1.9 Let Ωi = Ω for 1 ≤ i ≤ nproc. We define P(k) for k ≥ 0 as follows:
P(0) = P[Ω1, . . . , Ωnproc],
P(k+1) = P[Q1(k), . . . , Qnproc(k)].
Sometimes we will call P(k) the k-times unfolding of the program P.
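For example (our own illustration): if the only declaration is Procedure R1(if (x = 0) then (skip) else (x := x − 1; R1)) and P is R1, then P(0) = Ω, P(1) = if (x = 0) then (skip) else (x := x − 1; Ω), and P(2) = if (x = 0) then (skip) else (x := x − 1; P(1)); each unfolding replaces the remaining procedure name once more, and the innermost occurrence is finally replaced by Ω.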
We present some basic properties of unfolded programs.
Proposition 4.1.10 (1) P(k) = P[−→R(k)].
(2) Ri(0) = Ω.
(3) Ri(k+1) = Qi[−→R(k)].
Proof.
(1) By case analysis on k.
Case 1. k = 0.
P(0) = P[−→Ω] by definition. By (2), P[−→Ω] = P[−→R(0)]. Therefore, P(0) = P[−→R(0)].
Case 2. k = k′ + 1.
P(k′+1) = P[−→Q(k′)]. Since Ri(k′+1) = Ri[−→Q(k′)] = Qi(k′), P[−→Q(k′)] = P[−→R(k′+1)].
Therefore, P(k′+1) = P[−→R(k′+1)].
(2) By definition we have Ri(0) = Ri[−→Ω] = Ω.
(3) By definition we have Ri(k+1) = Ri[−→Q(k)] = Qi(k). By (1), Qi(k) = Qi[−→R(k)].
Therefore, Ri(k+1) = Qi[−→R(k)]. □
The next two definitions will define the set of the free variables (FV(P)) and the set
of the variables that can be modified (Mod0(P)). Generally speaking, the variable on the left
of the symbol := in a program is a modifiable variable. First we will define the above-mentioned
two sets for a program by its syntactic structure. Next, they will be used to
define the set of free variables (extended free variables, EFV(P)) and the set of modifiable
variables (Mod(P)) that may appear in the execution of the program. Since 'a free
variable' has an ordinary meaning without procedure calls, we will use 'an extended
free variable' for that with procedure calls.
Definition 4.1.11 We define FV(P) for P in L−
and EFV(P) for P in L as follows:
FV(x := e) = {x} ∪ FV(e),
FV(if (b) then (P1) else (P2)) = FV(b) ∪ FV(P1) ∪ FV(P2),
FV(while (b) do (P)) = FV(b) ∪ FV(P),
FV(P1; P2) = FV(P1) ∪ FV(P2),
FV(skip) = ∅,
FV(x := cons(e1, e2)) = {x} ∪ FV(e1) ∪ FV(e2),
FV(x := [e]) = {x} ∪ FV(e),
FV([e1] := e2) = FV(e1) ∪ FV(e2),
FV(dispose(e)) = FV(e),
EFV(P) = FV(P(nproc)).
The expression FV(P) is the set of variables that occur in P. EFV(P) is the set of variables
that may be used in the execution of P.
The expression FV(O1, . . . , Om) is defined as FV(O1) ∪ . . . ∪ FV(Om) when Oi is
a formula, an expression, or a program.
Definition 4.1.12 We define Mod0(P) for P in L− and Mod(P) for P in L as follows:
Mod0(x := e) = {x},
Mod0(if (b) then (P1) else (P2)) = Mod0(P1) ∪ Mod0(P2),
Mod0(while (b) do (P)) = Mod0(P),
Mod0(P1; P2) = Mod0(P1) ∪ Mod0(P2),
Mod0(skip) = ∅,
Mod0(x := cons(e1, e2)) = {x},
Mod0(x := [e]) = {x},
Mod0([e1] := e2) = ∅,
Mod0(dispose(e)) = ∅,
Mod(P) = Mod0(P(nproc)).
The expression Mod(P) is the set of variables that may be modified by P.
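For instance (our own example): let nproc = 1, let Q1 be z := 0 and let P be x := [y]; R1. Then P(1) = (x := [y]; z := 0), so EFV(P) = FV(P(1)) = {x, y, z} and Mod(P) = Mod0(P(1)) = {x, z}.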
Lemma 4.1.13 (1) FV(Ri(k+1)) = ∪Rj∈PD(Ri,k) FV(Qj).
(2) Mod0(Ri(k+1)) = ∪Rj∈PD(Ri,k) Mod0(Qj).
Proof.
(1) It is proved by induction on k.
Case 1. k = 0.
By definition, FV(Ri(1)) = FV(Ri[−→Q(0)]) = FV(Qi(0)) = FV(Qi[−→Ω]) = FV(Qi).
Since PD(Ri, 0) = {Ri}, we have FV(Qi) = ∪Rj∈PD(Ri,0) FV(Qj).
Case 2. k is k′ + 1.
By Proposition 4.1.10 (3), FV(Ri(k′+2)) = FV(Qi[−→R(k′+1)]). Then we have
FV(Ri(k′+2)) = FV(Qi) ∪ ∪Ri⇝1Rj FV(Rj(k′+1)). By the induction hypothesis, we
have FV(Ri(k′+2)) = FV(Qi) ∪ ∪Ri⇝1Rj ∪Rm∈PD(Rj,k′) FV(Qm). Then we have
FV(Ri(k′+2)) = FV(Qi) ∪ ∪Rm∈PD(Ri,k′+1) FV(Qm). Therefore, FV(Ri(k′+2)) =
∪Rj∈PD(Ri,k′+1) FV(Qj).
(2) Its proof is similar to (1). □
Proposition 4.1.14 (1) FV(P(k)) ⊆ EFV(P) for all k.
(2) Mod0(P(k)) ⊆ Mod(P) for all k.
Proof.
(1) Fix k. If k = 0, the claim trivially holds. Assume k > 0. By Lemma 4.1.6
(2), we have PD(Ri, k − 1) ⊆ PD(Ri, nproc − 1). Then ∪Rj∈PD(Ri,k−1) FV(Qj) ⊆
∪Rj∈PD(Ri,nproc−1) FV(Qj). Then by Lemma 4.1.13 (1), FV(Ri(k)) ⊆ FV(Ri(nproc)).
Then we have FV(P) ∪ ∪Ri∈PN(P) FV(Ri(k)) ⊆ FV(P) ∪ ∪Ri∈PN(P) FV(Ri(nproc)). Then
by Proposition 4.1.10 (1) we have FV(P(k)) ⊆ FV(P(nproc)). By definition, FV(P(k)) ⊆
EFV(P).
(2) Its proof is similar to (1). □
4.2 Semantics
The semantics of our programming language and our assertion language is defined
in this section. Our semantics is based on the same structure as that in Reynolds’
paper [13] except for the following simplifications: (1) values are natural numbers, (2)
addresses are non-zero natural numbers, and (3) null is 0.
The set N is defined as the set of natural numbers. The set Vars is defined as the set
of variables in the base language. The set Locs is defined as the set {n ∈ N | n > 0}.
For sets S1, S2, f : S1 → S2 means that f is a function from S1 to S2. f : S1 →fin S2
means that f is a finite function from S1 to S2, that is, there is a finite subset S′1 of S1
and f : S′1 → S2. Dom(f) denotes the domain of the function f. The expression p(S)
denotes the power set of the set S. For a function f : A → B and a subset C ⊆ A, the
function f|C : C → B is defined by f|C(x) = f(x) for x ∈ C.
A store is defined as a function from Vars to N, and denoted by s. A heap is defined
as a finite function from Locs →fin N, and denoted by h. A value is a natural number.
An address is a positive natural number. The null pointer is 0. A store assigns a value
to each variable. A heap assigns a value to an address in its finite domain.
The store s[x1 := n1, . . . , xk := nk] is defined as s′ such that s′(xi) = ni and
s′(y) = s(y) for y ̸∈ {x1, . . . , xk}. The heap h[m1 := n1, . . . , mk := nk] is defined
as h′ such that h′(mi) = ni and h′(y) = h(y) for y ∈ Dom(h) − {m1, . . . , mk}.
The store s[x1 := n1, . . . , xk := nk] is the same as s except for the values of the variables
x1, . . . , xk. The heap h[m1 := n1, . . . , mk := nk] is the same as h except for the contents
of the memory cells at the addresses m1, . . . , mk.
We will write h = h1 + h2 when Dom(h) = Dom(h1) ∪ Dom(h2), Dom(h1) ∩
Dom(h2) = ∅, h(x) = h1(x) for x ∈ Dom(h1), and h(x) = h2(x) for x ∈ Dom(h2).
The heap h is divided into the two disjoint heaps h1 and h2 when h = h1 + h2.
A state is defined as (s, h). The set States is defined as the set of states. The state
for a pointer program is specified by the store and the heap, since pointer programs
manipulate memory heaps as well as variable assignments.
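To make these definitions concrete, here is a small Python sketch (our own illustration, not part of the formal development) modelling stores and heaps as dictionaries, together with the update operations s[x := n] and h[m := n] and the disjoint union h1 + h2:

    def store_update(s, updates):
        """s[x1 := n1, ..., xk := nk]: a store agreeing with s except on x1, ..., xk."""
        s2 = dict(s)
        s2.update(updates)
        return s2

    def heap_update(h, updates):
        """h[m1 := n1, ..., mk := nk]: a heap agreeing with h except at m1, ..., mk."""
        h2 = dict(h)
        h2.update(updates)
        return h2

    def heap_plus(h1, h2):
        """h1 + h2: defined only when Dom(h1) and Dom(h2) are disjoint."""
        if set(h1) & set(h2):
            raise ValueError("h1 + h2 is undefined: domains overlap")
        h = dict(h1)
        h.update(h2)
        return h

    # Example state (s, h): s(x) = 1, and the heap is split into two one-cell heaps.
    s = {"x": 1}
    h = heap_plus({5: 1}, {6: 2})
    h = heap_update(h, {6: 3})     # the effect of the mutation [6] := 3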
Definition 4.2.1 We define the semantics of our base language by the standard model of
natural numbers and [[null]] = 0. That is, we suppose [[0]] = 0, [[1]] = 1, [[+]] = +,
[[×]] = ×, [[=]] = (=), and [[<]] = (<). For a store s, an expression e, and a pure
formula A, according to the interpretation of a first-order language, the meaning [[e]]s is
defined as a natural number and the meaning [[A]]s is defined as True or False.
The expressions [[e]]s and [[A]]s are the value of e under the store s, and the truth value of
A under the store s, respectively.
Semantics of Programs
The relation ⊆ over the functions of type States∪{abort} → p(States∪{abort})
is necessary to define the semantics for while (b) do (P).
Definition 4.2.2 We define ⊆ for functions F, G: States ∪ {abort} → p(States ∪
{abort}). F ⊆ G is defined to hold if ∀r ∈ States (F(r) ⊆ G(r)).
Definition 4.2.3 We define the semantics of our programming language. For a program
P, its meaning [[P]] is defined as a function from States ∪ {abort} to p(States ∪ {abort}).
We will define [[P]](r0) as the set of all the possible resulting states after the execution of P
terminates with the initial state r0. In particular, if the execution of P with the initial state r0
does not terminate, we will define [[P]](r0) as the empty set ∅. The set [[P]]({r1, . . . , rm}) is
defined as ∪i=1,...,m [[P]](ri). In order to define [[P]] we would like to define [[P]]− for all P in the
language L−. We define [[P]]− by induction on P in L− as follows:
[[P]]−(abort) = {abort},
[[x := e]]−((s, h)) = {(s[x := [[e]]s], h)},
[[if (b) then (P1) else (P2)]]−((s, h)) = [[P1]]−((s, h)) if [[b]]s = True, [[P2]]−((s, h)) otherwise,
[[while (b) do (P)]]− is the least function satisfying
[[while (b) do (P)]]−(abort) = {abort},
[[while (b) do (P)]]−((s, h)) = {(s, h)} if [[b]]s = False,
[[while (b) do (P)]]−((s, h)) = ∪{ [[while (b) do (P)]]−(r) | r ∈ [[P]]−((s, h)) } otherwise,
[[P1; P2]]−((s, h)) = ∪{ [[P2]]−(r) | r ∈ [[P1]]−((s, h)) },
[[skip]]−((s, h)) = {(s, h)},
[[x := cons(e1, e2)]]−((s, h)) = {(s[x := n], h[n := [[e1]]s, n + 1 := [[e2]]s]) | n > 0, n, n + 1 ̸∈ Dom(h)},
[[x := [e]]]−((s, h)) = {(s[x := h([[e]]s)], h)} if [[e]]s ∈ Dom(h), {abort} otherwise,
[[[e1] := e2]]−((s, h)) = {(s, h[[[e1]]s := [[e2]]s])} if [[e1]]s ∈ Dom(h), {abort} otherwise,
[[dispose(e)]]−((s, h)) = {(s, h|Dom(h)−{[[e]]s})} if [[e]]s ∈ Dom(h), {abort} otherwise.
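The four heap-manipulating cases can be cross-checked with the following Python sketch (our own illustration; the helper names are hypothetical and not part of the thesis), which executes one atomic pointer command on a state (s, h), where s and h are dictionaries, and returns either a new state or 'abort':

    ABORT = "abort"

    def lookup(s, h, x, a):
        """x := [e], where a is the value of e under s: abort if a is not allocated."""
        if a not in h:
            return ABORT
        s2 = dict(s); s2[x] = h[a]
        return (s2, h)

    def mutate(s, h, a, v):
        """[e1] := e2, where a and v are the values of e1 and e2 under s."""
        if a not in h:
            return ABORT
        h2 = dict(h); h2[a] = v
        return (s, h2)

    def dispose(s, h, a):
        """dispose(e): remove the cell at address a from the heap."""
        if a not in h:
            return ABORT
        h2 = {m: n for m, n in h.items() if m != a}
        return (s, h2)

    def cons(s, h, x, v1, v2, n):
        """x := cons(e1, e2): n is any fresh address with n > 0 and n, n + 1 not in Dom(h);
        the semantics allows every such n, so this returns one of the possible outcomes."""
        assert n > 0 and n not in h and n + 1 not in h
        s2 = dict(s); s2[x] = n
        h2 = dict(h); h2[n] = v1; h2[n + 1] = v2
        return (s2, h2)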
We present two propositions which together state that the semantics of
while (b) do (P) can be constructed from the semantics of P.
Proposition 4.2.4 r ∈ [[while (b) do (P)]]−((s, h)) if and only if there exist m ≥ 1,
r1, . . . , rm such that r1 = (s, h), rm = r, [[b]]ri = True and ri+1 ∈ [[P]]−(ri) for 1 ≤ i < m,
and one of the following holds:
(1) r ̸= abort and [[b]]r = False,
(2) r = abort,
where we write [[b]](s′,h′) for [[b]]s′.
Proof. First we will show the only-if-part. Let F((s, h)) = { r | m ≥ 0, r0 = (s, h), rm = r, ri = (si, hi), ⟦b⟧_{si} = True, ri+1 ∈ ⟦P⟧⁻(ri) (0 ≤ ∀i < m), (r = (sm, hm) ∧ ⟦b⟧_{sm} = False) ∨ r = abort } and F(abort) = {abort}. We will show that F satisfies the inequations obtained from the equations for ⟦while (b) do (P)⟧⁻ by replacing = by ⊇. That is, we will show

F(abort) ⊇ {abort},
F((s, h)) ⊇ {(s, h)} if ⟦b⟧_s = False,
F((s, h)) ⊇ ∪{ F(r) | r ∈ ⟦P⟧⁻((s, h)) } if ⟦b⟧_s = True.

By the well-known least fixed point theorem [17], these inequations imply our only-if-part.

The first inequation immediately holds by the definition of F. For the second inequation, since ⟦b⟧_s = False, by taking m to be 0, we have (s, h) ∈ F((s, h)). We will show the third inequation. Assume ⟦b⟧_s = True and r is in the right-hand side. We will show r ∈ F((s, h)). We have q ∈ ⟦P⟧⁻((s, h)) and r ∈ F(q).

Case 1. q = (s′′, h′′).
By the definition of r ∈ F(q), we have m ≥ 0, r0 = (s′′, h′′), rm = r, ri = (si, hi), ⟦b⟧_{si} = True, ri+1 ∈ ⟦P⟧⁻(ri) (0 ≤ ∀i < m), and one of the following holds: either r = (sm, hm) and ⟦b⟧_{sm} = False, or r = abort.
Let m′ = m + 1, r′0 = (s, h), and r′i = ri−1 for 0 < i ≤ m′. By taking m and ri to be m′ and r′i respectively in the definition of F, we have r ∈ F((s, h)).

Case 2. q = abort.
By the definition of F we have r = abort. By taking m = 1, we have abort ∈ F((s, h)).

Next we will show the if-part by induction on m. We assume the right-hand side. We will show r ∈ ⟦while (b) do (P)⟧⁻((s, h)).
If m = 0, then we have ⟦b⟧_s = False and r = (s, h). Hence r ∈ ⟦while (b) do (P)⟧⁻((s, h)).
Suppose m > 0. We have m and r0, . . . , rm satisfying the conditions (1) or (2). If r1 = abort, we have m = 1 and abort ∈ ⟦P⟧⁻((s, h)). Hence r ∈ ⟦while (b) do (P)⟧⁻((s, h)) in this case. Suppose r1 ≠ abort. By the induction hypothesis for m − 1, we have r ∈ ⟦while (b) do (P)⟧⁻(r1). Since ⟦b⟧_s = True and r1 ∈ ⟦P⟧⁻((s, h)), by the definition we have r ∈ ⟦while (b) do (P)⟧⁻((s, h)). □
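Proposition 4.2.4 says that a terminating run of a loop is exactly a finite chain (s, h) = r0, r1, …, rm of P-steps whose last state either falsifies b or is abort. The sketch below (illustrative only; the helper name is mine) computes such a chain for the concrete loop while (x < z) do (dispose(x); x := x + 1), using the command sketches above.

```python
def loop_dispose_chain(s: Store, h: Heap):
    """Unfold while (x < z) do (dispose(x); x := x + 1) into its chain r0, ..., rm."""
    chain = [(s, h)]
    while s["x"] < s["z"]:                      # guard b holds at every intermediate ri
        r = run_dispose(lambda st: st["x"], s, h)
        if r == ABORT:                          # the loop body aborted: rm = abort
            chain.append(ABORT)
            return chain
        s, h = r
        s = dict(s); s["x"] = s["x"] + 1        # x := x + 1
        chain.append((s, h))
    return chain                                # b is false at the last state rm

chain = loop_dispose_chain({"x": 1, "z": 3}, {1: 10, 2: 20})
assert chain[-1][1] == {}                       # the final heap is empty
assert len(chain) == 3                          # m = 2 body iterations
```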
The following proposition characterizes two properties of the semantics of Ω.

Proposition 4.2.5 (1) ⟦Ω⟧⁻(abort) = {abort}.
(2) ⟦Ω⟧⁻((s, h)) = ∅.

Proof.
(1) By definition, ⟦Ω⟧⁻(abort) = {abort}.
(2) By definition, ⟦Ω⟧⁻((s, h)) = ⟦while (0 = 0) do (skip)⟧⁻((s, h)). Since ⟦skip⟧⁻((s′, h′)) ∌ abort for all s′, h′, by Proposition 4.2.4, abort ∉ ⟦while (0 = 0) do (skip)⟧⁻((s, h)). Since ⟦0 = 0⟧_{s′} = True for all s′, by Proposition 4.2.4 we have (s′, h′) ∉ ⟦while (0 = 0) do (skip)⟧⁻((s, h)) for all s′, h′. Therefore ⟦while (0 = 0) do (skip)⟧⁻((s, h)) = ∅. □
Lemma 4.2.6 If P′i, P′′i ∈ L⁻ (1 ≤ i ≤ nproc) and ⟦P′i⟧⁻ ⊆ ⟦P′′i⟧⁻ for all i, then ⟦P[→P′]⟧⁻ ⊆ ⟦P[→P′′]⟧⁻, where P ∈ L and P[→P′], P[→P′′] ∈ L⁻.

Proof. By induction on P. Assume ⟦P′i⟧⁻ ⊆ ⟦P′′i⟧⁻ for all i.

Case 1. P is atomic.
Since P[→P′] = P = P[→P′′], the claim holds.

Case 2. P is if (b) then (P1) else (P2).
We will show that ⟦P[→P′]⟧⁻(r) ⊆ ⟦P[→P′′]⟧⁻(r).
Case 2.1. r is abort. By definition, ⟦P[→P′]⟧⁻(abort) = {abort} = ⟦P[→P′′]⟧⁻(abort). Hence ⟦P[→P′]⟧⁻ ⊆ ⟦P[→P′′]⟧⁻.
Case 2.2. r is (s, h). Suppose ⟦b⟧_s = True. Then by definition ⟦P[→P′]⟧⁻(r) = ⟦P1[→P′]⟧⁻(r) and ⟦P[→P′′]⟧⁻(r) = ⟦P1[→P′′]⟧⁻(r). By the induction hypothesis, ⟦P1[→P′]⟧⁻ ⊆ ⟦P1[→P′′]⟧⁻. Therefore ⟦P[→P′]⟧⁻ ⊆ ⟦P[→P′′]⟧⁻. The case ⟦b⟧_s = False is proved similarly.

Case 3. P is while (b) do (P1).
We will show that ⟦P[→P′]⟧⁻(r) ⊆ ⟦P[→P′′]⟧⁻(r).
Case 3.1. r is abort. This case is similar to Case 2.1.
Case 3.2. r is (s, h).
Case ⟦b⟧_s = False. By definition, ⟦while (b) do (P1[→P′])⟧⁻((s, h)) = {(s, h)} = ⟦while (b) do (P1[→P′′])⟧⁻((s, h)). Hence ⟦P[→P′]⟧⁻ ⊆ ⟦P[→P′′]⟧⁻.
Case ⟦b⟧_s = True. Assume ⟦while (b) do (P1[→P′])⟧⁻((s, h)) ∋ r′. By Proposition 4.2.4, we have m ≥ 0 and r0, . . . , rm such that r0 = (s, h), rm = r′, ri+1 ∈ ⟦P1[→P′]⟧⁻(ri) and ⟦b⟧_{ri} = True for 0 ≤ i < m, and either r′ ≠ abort and ⟦b⟧_{r′} = False, or r′ = abort. By the induction hypothesis, ri+1 ∈ ⟦P1[→P′′]⟧⁻(ri). Then ⟦while (b) do (P1[→P′′])⟧⁻(r) ∋ r′. Then ⟦while (b) do (P1[→P′])⟧⁻(r) ⊆ ⟦while (b) do (P1[→P′′])⟧⁻(r). Therefore ⟦P[→P′]⟧⁻ ⊆ ⟦P[→P′′]⟧⁻.

Case 4. P is P1; P2.
By the induction hypothesis, ⟦P1[→P′]⟧⁻ ⊆ ⟦P1[→P′′]⟧⁻ and ⟦P2[→P′]⟧⁻ ⊆ ⟦P2[→P′′]⟧⁻. Therefore ⟦P[→P′]⟧⁻ ⊆ ⟦P[→P′′]⟧⁻.

Case 5. P is Ri.
We have Ri[→P′] = P′i and Ri[→P′′] = P′′i. Hence ⟦Ri[→P′]⟧⁻ ⊆ ⟦Ri[→P′′]⟧⁻. □
We have already defined the semantics of P in the language L⁻. We now define the semantics of P in L. The semantics we will define is usually called an approximating semantics.

Definition 4.2.7 The semantics of P in L is defined by ⟦P⟧(r) = ∪_{i=0}^∞ ⟦P^(i)⟧⁻(r).

Note that Qi^(k) is a program of the language L⁻ and does not contain Ri. Remark that for P in L⁻, we have ⟦P⟧ = ⟦P⟧⁻ since P^(k) = P.
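The approximating semantics can be read operationally: run the k-th unfolding P^(k), in which every procedure call nested deeper than k levels behaves like the divergent program Ω, and take the union over all k. The sketch below (illustrative; the unfolding function and its indexing are mine) mimics this for the one-procedure example used later in Section 4.3, whose body is if (x < z) then (dispose(x); x := x + 1; R1) else (skip).

```python
def R1_unfolding(k: int, s: Store, h: Heap):
    """k-th approximant of R1: calls nested deeper than k levels behave like Omega."""
    if k == 0:
        return None                              # Omega: no terminating execution
    if s["x"] < s["z"]:
        r = run_dispose(lambda st: st["x"], s, h)
        if r == ABORT:
            return ABORT
        s, h = r
        s = dict(s); s["x"] = s["x"] + 1
        return R1_unfolding(k - 1, s, h)         # the recursive call, one level shallower
    return (dict(s), dict(h))                    # else branch: skip

# [[R1]]((s, h)) is the union over k of the results of the k-th approximants:
s0, h0 = {"x": 1, "z": 3}, {1: 10, 2: 20}
assert R1_unfolding(1, s0, h0) is None                        # too few unfoldings: no result yet
assert R1_unfolding(3, s0, h0) == ({"x": 3, "z": 3}, {})      # enough unfoldings terminate
```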
Lemma 4.2.8 For all k, ⟦P^(k)⟧⁻ ⊆ ⟦P^(k+1)⟧⁻.

Proof. By induction on k.
Case 1. k = 0.
We have ⟦Ω⟧⁻(r) = ∅ for all r by Proposition 4.2.5 (2). Then for all i, we have ⟦Ω⟧⁻ ⊆ ⟦Qi^(0)⟧⁻. Then by Lemma 4.2.6, ⟦P[→Ω]⟧⁻ ⊆ ⟦P[→Q^(0)]⟧⁻. Then ⟦P^(0)⟧⁻ ⊆ ⟦P^(1)⟧⁻.
Case 2. k > 0.
Let k be k′ + 1. We have Q^(k) = Q^(k′+1) = Q[→Q^(k′)] and Q^(k+1) = Q^(k′+2) = Q[→Q^(k′+1)]. By the induction hypothesis, ⟦Q^(k′)⟧⁻ ⊆ ⟦Q^(k′+1)⟧⁻. By Lemma 4.2.6, we now have ⟦P[→Q^(k′)]⟧⁻ ⊆ ⟦P[→Q^(k′+1)]⟧⁻. Then ⟦P^(k)⟧⁻ ⊆ ⟦P^(k+1)⟧⁻. □
Semantics of Assertions

Definition 4.2.9 We define the semantics of the assertion language. For an assertion A and a state (s, h), the meaning ⟦A⟧_(s,h) is defined as True or False. ⟦A⟧_(s,h) is the truth value of A at the state (s, h). ⟦A⟧_(s,h) is defined by induction on A as follows:

⟦emp⟧_(s,h) = True if Dom(h) = ∅,
⟦e1 = e2⟧_(s,h) = (⟦e1⟧_s = ⟦e2⟧_s),
⟦e1 < e2⟧_(s,h) = (⟦e1⟧_s < ⟦e2⟧_s),
⟦e1 ↦ e2⟧_(s,h) = True if Dom(h) = {⟦e1⟧_s} and h(⟦e1⟧_s) = ⟦e2⟧_s,
⟦¬A⟧_(s,h) = (not ⟦A⟧_(s,h)),
⟦A ∧ B⟧_(s,h) = (⟦A⟧_(s,h) and ⟦B⟧_(s,h)),
⟦A ∨ B⟧_(s,h) = (⟦A⟧_(s,h) or ⟦B⟧_(s,h)),
⟦A → B⟧_(s,h) = (⟦A⟧_(s,h) implies ⟦B⟧_(s,h)),
⟦∀xA⟧_(s,h) = True if ⟦A⟧_(s[x:=m],h) = True for all m ∈ N,
⟦∃xA⟧_(s,h) = True if ⟦A⟧_(s[x:=m],h) = True for some m ∈ N,
⟦A ∗ B⟧_(s,h) = True if h = h1 + h2 and ⟦A⟧_(s,h1) = ⟦B⟧_(s,h2) = True for some h1, h2,
⟦A −∗ B⟧_(s,h) = True if h2 = h + h1 and ⟦A⟧_(s,h1) = True imply ⟦B⟧_(s,h2) = True for all h1, h2.

We say A is true when ⟦A⟧_(s,h) = True for all (s, h).
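The separating connectives are the only non-standard clauses. The sketch below (again an illustration, not the thesis' formalization) evaluates ∗ by enumerating the finitely many splittings h = h1 + h2 of a concrete heap; −∗ is not implemented because it quantifies over all extension heaps, which is not a finite search.

```python
from itertools import combinations

def splittings(h: Heap):
    """All pairs (h1, h2) with h = h1 + h2; finite because h is a finite map."""
    addrs = list(h)
    for r in range(len(addrs) + 1):
        for left in combinations(addrs, r):
            h1 = {a: h[a] for a in left}
            h2 = {a: h[a] for a in h if a not in left}
            yield h1, h2

def sat_emp(s: Store, h: Heap) -> bool:
    return h == {}

def sat_points_to(e1, e2, s: Store, h: Heap) -> bool:
    """(s, h) |= e1 |-> e2 : the heap is exactly the single cell e1 with contents e2."""
    return h == {e1(s): e2(s)}

def sat_sep(satA, satB, s: Store, h: Heap) -> bool:
    """(s, h) |= A * B : some splitting h = h1 + h2 satisfies A on h1 and B on h2."""
    return any(satA(s, h1) and satB(s, h2) for h1, h2 in splittings(h))

# Example: x |-> 7 * y |-> 8 holds exactly when the heap is those two cells.
s = {"x": 1, "y": 2}
A = lambda st, hp: sat_points_to(lambda t: t["x"], lambda t: 7, st, hp)
B = lambda st, hp: sat_points_to(lambda t: t["y"], lambda t: 8, st, hp)
assert sat_sep(A, B, s, {1: 7, 2: 8})
assert not sat_sep(A, B, s, {1: 7, 2: 8, 3: 9})   # an extra cell: no splitting works
```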
Semantics of Asserted Programs

Finally we need to define the semantics of asserted programs. It is basically the same as in [13].

Definition 4.2.10 For an asserted program {A}P{B}, the meaning of {A}P{B} is defined as True or False. {A}P{B} is defined to be True if both of the following hold:
(1) for all (s, h), if ⟦A⟧_(s,h) = True, then ⟦P⟧((s, h)) ∌ abort;
(2) for all (s, h) and (s′, h′), if ⟦A⟧_(s,h) = True and ⟦P⟧((s, h)) ∋ (s′, h′), then ⟦B⟧_(s′,h′) = True.

Remark that an asserted program whose program never terminates is always True: by the definition of the semantics, any resulting state, including abort, arises only from a terminating execution, so a non-terminating program yields no resulting states at all.
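On finite sets of states this definition can be checked directly: every A-state must avoid abort, and every resulting proper state must satisfy B. A small illustrative checker (the names and the deterministic run_P simplification are mine):

```python
def triple_holds(sat_A, run_P, sat_B, initial_states) -> bool:
    """Check {A} P {B} over the given initial states only (an illustration, not a proof)."""
    for (s, h) in initial_states:
        if not sat_A(s, h):
            continue                      # the precondition filters initial states
        outcome = run_P(s, h)
        if outcome == ABORT:
            return False                  # clause (1): no A-state may abort
        if outcome is None:
            continue                      # non-termination: nothing to check
        s2, h2 = outcome
        if not sat_B(s2, h2):
            return False                  # clause (2): results must satisfy B
    return True

# {x |-> 7} [x] := 9 {x |-> 9} holds on this sample; with emp as precondition it fails.
states = [({"x": 1}, {1: 7}), ({"x": 1}, {})]
run = lambda s, h: run_mutation(lambda t: t["x"], lambda t: 9, s, h)
assert triple_holds(lambda s, h: h == {s["x"]: 7}, run, lambda s, h: h == {s["x"]: 9}, states)
assert not triple_holds(sat_emp, run, lambda s, h: True, states)
```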
In our system, judgments are of the form Γ ⊢ {A}P{B}, where Γ is a set of asserted programs. Here we define the semantics of a judgment. We say Γ is true if all {A}P{B} in Γ are true.

Definition 4.2.11 We say Γ ⊢ {A}P{B} is true when the following holds: {A}P{B} is true if {Ai}Pi{Bi} is true for all {Ai}Pi{Bi} ∈ Γ.

The asserted program {A}P{B} means abort-free partial correctness, and it also implies partial correctness in the standard sense. Namely, it means that the execution of the program P from any initial state that satisfies A never aborts, that is, P does not access any unallocated address during the execution. This is one of the strongest features of the system.
The next lemma shows that the semantics of a procedure call (or procedure name) is the same as that of its body.

Lemma 4.2.12 ⟦Ri⟧ = ⟦Qi⟧.

Proof. By definition we have ⟦Qi⟧(r) = ∪_{k=0}^∞ ⟦Qi^(k)⟧⁻(r). Since Qi^(k) = Ri[→Q^(k)] = Ri^(k+1), we have ∪_{k=0}^∞ ⟦Qi^(k)⟧⁻(r) = ∪_{k=0}^∞ ⟦Ri^(k+1)⟧⁻(r). Then by Proposition 4.2.5 (2), ⟦Qi⟧(r) = ∪_{k=0}^∞ ⟦Ri^(k+1)⟧⁻(r) ∪ ⟦Ω⟧⁻(r). Then ⟦Qi⟧(r) = ∪_{k=0}^∞ ⟦Ri^(k+1)⟧⁻(r) ∪ ⟦Ri^(0)⟧⁻(r) by Proposition 4.1.10 (2). Then ⟦Qi⟧(r) = ∪_{k=0}^∞ ⟦Ri^(k)⟧⁻(r). Therefore ⟦Qi⟧ = ⟦Ri⟧. □
We now define equality of two stores over a set of variables. Suppose two stores are equal over the set of free variables of a program. If we execute the program from states with those stores, then the stores in the resulting states are still equal over the same set of free variables.

Definition 4.2.13 For a set V of variables, we define =V as follows: s =V s′ if and only if s(x) = s′(x) for all x ∈ V.

Definition 4.2.14 We define [s1, s2, V] to denote the store s such that s =_{V} s1 and s =_{V^c} s2, for stores s1, s2 and a set of variables V.
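So [s1, s2, V] reads its values from s1 on V and from s2 everywhere else. As a one-line illustration (the function name is mine):

```python
def combine(s1: Store, s2: Store, V) -> Store:
    """[s1, s2, V]: agree with s1 on V and with s2 on the complement of V."""
    return {x: (s1[x] if x in V else s2[x]) for x in set(s1) | set(s2)}

assert combine({"x": 1, "y": 2}, {"x": 9, "y": 8}, {"x"}) == {"x": 1, "y": 8}
```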
Lemma 4.2.15 Suppose P ∈ L⁻.
(1) If ⟦P⟧⁻((s, h)) ∋ (s1, h1), then s1 =_{Mod(P)^c} s.
(2) If s =_{FV(P)} s′, ⟦P⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(P)], then ⟦P⟧⁻((s′, h)) ∋ (s′1, h1).
(3) If s =_{FV(P)} s′ and ⟦P⟧⁻((s, h)) ∋ abort, then ⟦P⟧⁻((s′, h)) ∋ abort.
Proof.
(1) We will show the claim by induction on P. We consider cases according to P.

Case 1. x := e. Assume ⟦x := e⟧⁻((s, h)) ∋ (s1, h1). By definition, s1 = s[x := ⟦e⟧_s] and h1 = h. Then for all y ≠ x, s(y) = s1(y). By definition, Mod(x := e) = {x}. Therefore s1 =_{Mod(P)^c} s.

Case 2. P is if (b) then (P1) else (P2). Assume ⟦P⟧⁻((s, h)) ∋ (s1, h1).
Case ⟦b⟧_s = True. By definition, we have ⟦P1⟧⁻((s, h)) ∋ (s1, h1). By the induction hypothesis, s =_{Mod(P1)^c} s1. We have Mod(P)^c ⊆ Mod(P1)^c. Therefore s =_{Mod(P)^c} s1.
Case ⟦b⟧_s = False is shown in the same way.

Case 3. P is while (b) do (P1). Assume ⟦P⟧⁻((s, h)) ∋ (s1, h1).
Case ⟦b⟧_s = True. By Proposition 4.2.4, we have q0, . . . , qm and k0, . . . , km such that (s, h) = (q0, k0), (qm, km) = (s1, h1), ⟦P1⟧⁻((qi, ki)) ∋ (qi+1, ki+1) and ⟦b⟧_{qi} = True for all i = 0, . . . , m − 1, and ⟦b⟧_{qm} = False. By the induction hypothesis, qi =_{Mod(P1)^c} qi+1 for all i = 0, . . . , m − 1. Since Mod(P)^c ⊆ Mod(P1)^c, qi =_{Mod(P)^c} qi+1 for all i = 0, . . . , m − 1. Therefore s =_{Mod(P)^c} s1.
Case ⟦b⟧_s = False. Then by definition s1 = s. Then s =_{Mod(P)^c} s1.

Case 4. P1; P2. Assume ⟦P1; P2⟧⁻((s, h)) ∋ (s2, h2). By definition, we have s1, h1 such that ⟦P1⟧⁻((s, h)) ∋ (s1, h1) and ⟦P2⟧⁻((s1, h1)) ∋ (s2, h2). By the induction hypothesis, s =_{Mod(P1)^c} s1 and s1 =_{Mod(P2)^c} s2. Since Mod(P)^c ⊆ Mod(P1)^c and Mod(P)^c ⊆ Mod(P2)^c, we have s =_{Mod(P)^c} s1 and s1 =_{Mod(P)^c} s2. Therefore s2 =_{Mod(P)^c} s.

Case 5. skip. Its proof is immediate.

Case 6. x := cons(e1, e2). Assume ⟦x := cons(e1, e2)⟧⁻((s, h)) ∋ (s1, h1). By definition, (s1, h1) is in { (s[x := m], h[m := ⟦e1⟧_s, m + 1 := ⟦e2⟧_s]) | m > 0, m, m + 1 ∉ Dom(h) }. So for all y ≠ x, s(y) = s1(y). By definition, Mod(x := cons(e1, e2)) = {x}. Therefore s =_{Mod(x := cons(e1,e2))^c} s1.

Case 7. x := [e]. Assume ⟦x := [e]⟧⁻((s, h)) ∋ (s1, h1). By definition, (s1, h1) = (s[x := h(⟦e⟧_s)], h). So for all y ≠ x, s(y) = s1(y). By definition, Mod(x := [e]) = {x}. Therefore s =_{Mod(x := [e])^c} s1.

Case 8. [e1] := e2. Assume ⟦[e1] := e2⟧⁻((s, h)) ∋ (s1, h1). By definition, s1 = s. Therefore s =_{Mod([e1] := e2)^c} s1.

Case 9. dispose(e). Assume ⟦dispose(e)⟧⁻((s, h)) ∋ (s1, h1). By definition, s1 = s. Therefore s =_{Mod(dispose(e))^c} s1.
(2) We will show the claim by induction on P. We consider cases according to P.

Case 1. P is x := e. Assume s =_{FV(x:=e)} s′, ⟦x := e⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(x := e)]. Then by definition s1 = s[x := ⟦e⟧_s] and s′1 =_{FV(x:=e)} s1. Then s′1(x) = s1(x) = ⟦e⟧_s = ⟦e⟧_{s′}. We have s′1 =_{FV(x:=e)^c} s′, and for all y ∈ FV(e) with y ≠ x, s′1(y) = s1(y) = s(y) = s′(y). Then we have s′1 = s′[x := ⟦e⟧_{s′}]. Therefore ⟦x := e⟧⁻((s′, h)) ∋ (s′1, h1).

Case 2. P is if (b) then (P1) else (P2). Assume s =_{FV(P)} s′, ⟦P⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(P)].
Let ⟦b⟧_s = True. Then by definition ⟦P1⟧⁻((s, h)) ∋ (s1, h1). By (1), s =_{Mod(P1)^c} s1, and since FV(P1)^c ⊆ Mod(P1)^c, we have s =_{FV(P1)^c} s1. We also have s =_{FV(P)} s′ by assumption and s′1 =_{FV(P)} s1 by definition, and hence s′ =_{FV(P1)} s and s′1 =_{FV(P1)} s1, because FV(P1) ⊆ FV(P). Then for all y ∈ FV(P) − FV(P1), s′1(y) = s1(y) = s(y) = s′(y). Since s′1 =_{FV(P)−FV(P1)} s′ and s′1 =_{FV(P)^c} s′ hold, we have that s′1 =_{FV(P1)^c} s′ holds. Then s′1 = [s1, s′, FV(P1)]. By the induction hypothesis, ⟦P1⟧⁻((s′, h)) ∋ (s′1, h1). Then by definition ⟦P⟧⁻((s′, h)) ∋ (s′1, h1).
Again let ⟦b⟧_s = False. In the same way as above we can prove that ⟦P⟧⁻((s′, h)) ∋ (s′1, h1).

Case 3. P is while (b) do (P1). Assume s =_{FV(while (b) do (P1))} s′, ⟦while (b) do (P1)⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(while (b) do (P1))]. By Proposition 4.2.4, we have (s, h) = (q0, h′0), …, (qm, h′m) = (s1, h1) such that ⟦P1⟧⁻((qi, h′i)) ∋ (qi+1, h′i+1) and ⟦b⟧_{qi} = True for all 0 ≤ i < m, and ⟦b⟧_{qm} = False. Let q′i = [qi, s′, FV(P1)] for all 0 ≤ i ≤ m.
We will show that s′ = q′0 and s′1 = q′m. We have q′0 = [s, s′, FV(P1)]. Then q′0 =_{FV(P1)^c} s′. Since s =_{FV(P1)} s′ and q′0 =_{FV(P1)} s, we have q′0 =_{FV(P1)} s′. Therefore s′ = q′0. We have q′m = [s1, s′, FV(P1)]. We also have s′1 = [s1, s′, FV(P)]. Then q′m =_{FV(P1)} s′1. By (1), s =_{Mod(P)^c} s1. Then s =_{FV(P1)^c} s1, since Mod(P)^c = Mod(P1)^c and Mod(P1)^c ⊇ FV(P1)^c. We have q′m =_{FV(P)^c} s′ since FV(P)^c ⊆ FV(P1)^c and q′m =_{FV(P1)^c} s′. Then q′m =_{FV(P)^c} s′1. Since q′m =_{FV(P1)^c} s′, s′1 =_{FV(P)} s1 and s =_{FV(P)} s′, for all y ∈ FV(P) − FV(P1), q′m(y) = s′(y) = s(y) = s1(y) = s′1(y). Therefore s′1 = q′m. Hence s′ = q′0 and s′1 = q′m.
Since q′i =_{FV(P1)^c} s′ by definition, q′i+1 = [qi+1, q′i, FV(P1)] for 0 ≤ i < m. By the induction hypothesis, ⟦P1⟧⁻((q′i, h′i)) ∋ (q′i+1, h′i+1). We also have ⟦b⟧_{q′i} = True for 0 ≤ i < m and ⟦b⟧_{q′m} = False. Hence by Proposition 4.2.4, ⟦while (b) do (P1)⟧⁻((s′, h)) ∋ (s′1, h1).

Case 4. P is P1; P2. Assume s =_{FV(P1;P2)} s′, ⟦P1; P2⟧⁻((s, h)) ∋ (s2, h2) and s′2 = [s2, s′, FV(P1; P2)]. By definition, we know that ⟦P1⟧⁻((s, h)) ∋ (s1, h1) and ⟦P2⟧⁻((s1, h1)) ∋ (s2, h2) for some s1, h1. By (1), s =_{Mod(P1)^c} s1 and s1 =_{Mod(P2)^c} s2, and then s =_{FV(P1)^c} s1 and s1 =_{FV(P2)^c} s2. Take s′1 such that s′1 = [s1, s′, FV(P1)]. Then s′1 =_{FV(P1)} s1 and s′1 =_{FV(P1)^c} s′. Then for all y ∈ FV(P1) ∩ FV(P2), s′1(y) = s1(y). For all y ∈ FV(P2) − FV(P1), s′1(y) = s′(y) = s(y) = s1(y). So s′1 =_{FV(P2)} s1. Since s′2 =_{FV(P1;P2)} s2, we have s′2 =_{FV(P2)} s2. For all y ∈ FV(P1) − FV(P2), s′2(y) = s2(y) by s′2 = [s2, s′, FV(P1; P2)], s2(y) = s1(y) by s1 =_{FV(P2)^c} s2, and s1(y) = s′1(y) by s′1 =_{FV(P1)} s1, and hence s′2(y) = s′1(y). For all y ∉ FV(P1; P2), s′2(y) = s′(y) by s′2 = [s2, s′, FV(P1; P2)] and s′1(y) = s′(y) by s′1 = [s1, s′, FV(P1)], and hence s′2(y) = s′1(y). Then we have s′2 =_{FV(P2)^c} s′1. Then s′2 = [s2, s′1, FV(P2)].
Hence, by the induction hypothesis, ⟦P1⟧⁻((s′, h)) ∋ (s′1, h1) and ⟦P2⟧⁻((s′1, h1)) ∋ (s′2, h2). Therefore, by definition, ⟦P1; P2⟧⁻((s′, h)) ∋ (s′2, h2).

Case 5. P is skip. Its proof is immediate.

Case 6. P is x := cons(e1, e2). Assume s =_{FV(x:=cons(e1,e2))} s′, ⟦x := cons(e1, e2)⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(x := cons(e1, e2))]. Then by definition s1 = s[x := m] for some m > 0 where m, m + 1 ∉ Dom(h), and s′1 =_{FV(x:=cons(e1,e2))} s1. Then s′1(x) = s1(x) = m. We have s′1 =_{FV(x:=cons(e1,e2))^c} s′. For all y ∈ FV(x := cons(e1, e2)) − {x}, s′1(y) = s1(y) = s(y) = s′(y). Then we have s′1 = s′[x := m]. Therefore, by definition, ⟦x := cons(e1, e2)⟧⁻((s′, h)) ∋ (s′1, h1).

Case 7. P is x := [e]. Assume s =_{FV(x:=[e])} s′, ⟦x := [e]⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(x := [e])]. Then by definition s1 = s[x := h(⟦e⟧_s)] and s′1 =_{FV(x:=[e])} s1. Then s′1(x) = s1(x) = h(⟦e⟧_s) = h(⟦e⟧_{s′}). We have s′1 =_{FV(x:=[e])^c} s′, and hence for all y ≠ x with y ∈ FV(x := [e]), s′1(y) = s1(y) = s(y) = s′(y). Then we have s′1 = s′[x := h(⟦e⟧_{s′})]. Therefore ⟦x := [e]⟧⁻((s′, h)) ∋ (s′1, h1).

Case 8. P is [e1] := e2. Assume s =_{FV([e1]:=e2)} s′, ⟦[e1] := e2⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV([e1] := e2)]. Then by definition s1 = s and s′1 =_{FV([e1]:=e2)} s1. For all y ∈ FV([e1] := e2), we have s′1(y) = s1(y) = s(y) = s′(y). Then we have s′1 = s′. Since h1 = h[⟦e1⟧_s := ⟦e2⟧_s], we have h1 = h[⟦e1⟧_{s′} := ⟦e2⟧_{s′}]. Therefore ⟦[e1] := e2⟧⁻((s′, h)) ∋ (s′1, h1).

Case 9. P is dispose(e). Assume s =_{FV(dispose(e))} s′, ⟦dispose(e)⟧⁻((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, FV(dispose(e))]. Then by definition s1 = s and s′1 =_{FV(dispose(e))} s1. For all y ∈ FV(dispose(e)), we have s′1(y) = s1(y) = s(y) = s′(y). Then we have s′1 = s′. Since h1 = h|_{Dom(h)−{⟦e⟧_s}}, we have h1 = h|_{Dom(h)−{⟦e⟧_{s′}}}. Therefore ⟦dispose(e)⟧⁻((s′, h)) ∋ (s′1, h1).
(3) We will show the claim by induction on P. We consider cases according to P.

Case 1. x := e. Assume s =_{FV(x:=e)} s′ and ⟦x := e⟧⁻((s, h)) ∋ abort. By definition, ⟦x := e⟧⁻((s, h)) = {(s[x := ⟦e⟧_s], h)}. Therefore ⟦x := e⟧⁻((s, h)) ∋ abort is False. Then ⟦x := e⟧⁻((s, h)) ∋ abort implies ⟦x := e⟧⁻((s′, h)) ∋ abort.

Case 2. P is if (b) then (P1) else (P2). Assume s =_{FV(P)} s′.
Case 2.1. ⟦b⟧_s = True. Then ⟦b⟧_{s′} = True. Assume ⟦P⟧⁻((s, h)) ∋ abort. By definition, ⟦P1⟧⁻((s, h)) ∋ abort. By definition FV(P1) ⊆ FV(P), and then s =_{FV(P1)} s′. By the induction hypothesis, ⟦P1⟧⁻((s′, h)) ∋ abort. Therefore, by definition, ⟦P⟧⁻((s′, h)) ∋ abort.
Case 2.2. ⟦b⟧_s = False can be proved similarly.

Case 3. P is while (b) do (P1). Assume s =_{FV(while (b) do (P1))} s′.
Case 3.1. ⟦b⟧_s = True. Then ⟦b⟧_{s′} = True. Assume ⟦while (b) do (P1)⟧⁻((s, h)) ∋ abort. By Proposition 4.2.4, we have s0, . . . , sm and h0, . . . , hm such that (s, h) = (s0, h0), ⟦P1⟧⁻((si, hi)) ∋ (si+1, hi+1) and ⟦b⟧_{si} = True for all i = 0, . . . , m − 1, ⟦P1⟧⁻((sm, hm)) ∋ abort and ⟦b⟧_{sm} = True. By definition FV(P1) ⊆ FV(while (b) do (P1)), and then s =_{FV(P1)} s′. Let s′i = [si, s′, FV(P)] for all i = 0, . . . , m. Then s′0 =_{FV(P)^c} s′, and s′0 =_{FV(P)} s0 = s =_{FV(P)} s′, and hence s′0 = s′.
We will show that s′i+1 = [si+1, s′i, FV(P1)] for i = 0, . . . , m − 1. For that we will show that s′i+1 =_{FV(P1)^c} s′i. By (1), si+1 =_{Mod(P1)^c} si and hence si+1 =_{FV(P1)^c} si. For all y ∈ FV(P) − FV(P1), s′i+1(y) = si+1(y) = si(y) = s′i(y). For all y ∉ FV(P), s′i+1(y) = s′(y) = s′i(y). Hence we have s′i+1 =_{FV(P1)^c} s′i. Since s′i+1 =_{FV(P)} si+1, we have s′i+1 =_{FV(P1)} si+1, and then s′i+1 = [si+1, s′i, FV(P1)].
Then by (2), we have ⟦P1⟧⁻((s′i, hi)) ∋ (s′i+1, hi+1) for all i = 0, . . . , m − 1. Since s′m =_{FV(P)} sm, we also have ⟦b⟧_{s′m} = True and s′m =_{FV(P1)} sm, and then by the induction hypothesis ⟦P1⟧⁻((s′m, hm)) ∋ abort. Now we have (s′, h) = (s′0, h0), ⟦P1⟧⁻((s′i, hi)) ∋ (s′i+1, hi+1) and ⟦b⟧_{s′i} = True for all i = 0, . . . , m − 1, ⟦P1⟧⁻((s′m, hm)) ∋ abort and ⟦b⟧_{s′m} = True. Therefore, by Proposition 4.2.4, ⟦while (b) do (P1)⟧⁻((s′, h)) ∋ abort.
Case 3.2. ⟦b⟧_s = False. Then ⟦b⟧_{s′} = False. Assume ⟦while (b) do (P1)⟧⁻((s, h)) ∋ abort. By definition it is false.

Case 4. P1; P2. Assume s =_{FV(P1;P2)} s′ and ⟦P1; P2⟧⁻((s, h)) ∋ abort. By definition, ∪{ ⟦P2⟧⁻(r) | r ∈ ⟦P1⟧⁻((s, h)) } ∋ abort. Then either ⟦P1⟧⁻((s, h)) ∋ abort, or ⟦P1⟧⁻((s, h)) ∋ (s1, h1) and ⟦P2⟧⁻((s1, h1)) ∋ abort for some s1, h1.
Assume ⟦P1⟧⁻((s, h)) ∋ abort. Since FV(P1) ⊆ FV(P1; P2), we have s =_{FV(P1)} s′. By the induction hypothesis, ⟦P1⟧⁻((s′, h)) ∋ abort. Since ⟦P2⟧⁻(abort) ∋ abort, we have ⟦P1; P2⟧⁻((s′, h)) ∋ abort.
Now assume ⟦P1⟧⁻((s, h)) ∋ (s1, h1) and ⟦P2⟧⁻((s1, h1)) ∋ abort. We have s =_{FV(P1)} s′. Let s′1 = [s1, s′, FV(P1)]. Then by (2), ⟦P1⟧⁻((s′, h)) ∋ (s′1, h1). By (1), s =_{FV(P1)^c} s1 and s′ =_{FV(P1)^c} s′1. Then for all y ∈ FV(P1) ∩ FV(P2), s1(y) = s′1(y). Since FV(P2) − FV(P1) ⊆ Mod(P1)^c, for all y ∈ FV(P2) − FV(P1), we have s1(y) = s(y) = s′(y) = s′1(y). Then s1 =_{FV(P2)} s′1. By the induction hypothesis, ⟦P2⟧⁻((s′1, h1)) ∋ abort. Then ⟦P1; P2⟧⁻((s′, h)) ∋ abort.

Case 5. skip. Its proof is immediate.

Case 6. x := cons(e1, e2). Assume s =_{FV(x:=cons(e1,e2))} s′ and ⟦x := cons(e1, e2)⟧⁻((s, h)) ∋ abort. But by definition ⟦x := cons(e1, e2)⟧⁻((s, h)) ∌ abort. Then ⟦x := cons(e1, e2)⟧⁻((s, h)) ∋ abort implies ⟦x := cons(e1, e2)⟧⁻((s′, h)) ∋ abort.

Case 7. x := [e]. Assume s =_{FV(x:=[e])} s′ and ⟦x := [e]⟧⁻((s, h)) ∋ abort. Then by definition ⟦e⟧_s ∉ Dom(h). Since s =_{FV(x:=[e])} s′, we have ⟦e⟧_s = ⟦e⟧_{s′}. Then ⟦e⟧_{s′} ∉ Dom(h). Therefore ⟦x := [e]⟧⁻((s′, h)) ∋ abort.

Case 8. [e1] := e2. Assume s =_{FV([e1]:=e2)} s′ and ⟦[e1] := e2⟧⁻((s, h)) ∋ abort. Then by definition ⟦e1⟧_s ∉ Dom(h). Since s =_{FV([e1]:=e2)} s′, we have ⟦e1⟧_s = ⟦e1⟧_{s′}. Then ⟦e1⟧_{s′} ∉ Dom(h). Therefore ⟦[e1] := e2⟧⁻((s′, h)) ∋ abort.

Case 9. dispose(e). Assume s =_{FV(dispose(e))} s′ and ⟦dispose(e)⟧⁻((s, h)) ∋ abort. Then by definition ⟦e⟧_s ∉ Dom(h). Since s =_{FV(dispose(e))} s′, we have ⟦e⟧_s = ⟦e⟧_{s′}. Then ⟦e⟧_{s′} ∉ Dom(h). Therefore ⟦dispose(e)⟧⁻((s′, h)) ∋ abort. □
Lemma 4.2.16 Suppose P ∈ L.
(1) If s =_{EFV(P)} s′ and ⟦P⟧((s, h)) ∋ abort, then ⟦P⟧((s′, h)) ∋ abort.
(2) If s =_{EFV(P)} s′, ⟦P⟧((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, EFV(P)], then ⟦P⟧((s′, h)) ∋ (s′1, h1).
(3) If ⟦P⟧((s, h)) ∋ (s1, h1), then s1 =_{Mod(P)^c} s.

Proof.
(1) Assume s =_{EFV(P)} s′ and ⟦P⟧((s, h)) ∋ abort. Then by definition ⟦P^(k)⟧⁻((s, h)) ∋ abort for some k. Since we have FV(P^(k)) ⊆ EFV(P) by Proposition 4.1.14 (1), we have s =_{FV(P^(k))} s′. By Lemma 4.2.15 (3), we have ⟦P^(k)⟧⁻((s′, h)) ∋ abort. Then by definition ⟦P⟧((s′, h)) ∋ abort.
(2) Assume s =_{EFV(P)} s′, ⟦P⟧((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, EFV(P)]. Then by definition ⟦P^(k1)⟧⁻((s, h)) ∋ (s1, h1) for some k1. By definition, EFV(P) = FV(P^(nproc)), where nproc is the number of our procedure names. Let k = max(k1, nproc). Then by Lemma 4.2.8, ⟦P^(k)⟧⁻((s, h)) ∋ (s1, h1). Since FV(P^(nproc)) = FV(P^(k)) by Proposition 4.1.14 (1), we have FV(P^(k)) = EFV(P). Then s′1 = [s1, s′, FV(P^(k))]. Then by Lemma 4.2.15 (2), ⟦P^(k)⟧⁻((s′, h)) ∋ (s′1, h1). Therefore ⟦P⟧((s′, h)) ∋ (s′1, h1).
(3) We can show the claim in a similar way to (1). □
4.3 Logical System
This section defines our logical system. It is the extension of Reynolds' system presented in [13] to mutual recursive procedure calls.
We will write A[x := e] for the formula obtained from A by replacing x by e. We will write the formula e ↦ e1, e2 to denote (e ↦ e1) ∗ (e + 1 ↦ e2).
Definition 4.3.1 Our logical system consists of the following inference rules. As mentioned in the previous section, we will use Γ for a set of asserted programs. A judgment is defined as Γ ⊢ {A}P{B}.

Γ ⊢ {A}skip{A}   (skip)

Γ, {A}P{B} ⊢ {A}P{B}   (axiom)

Γ ⊢ {A[x := e]}x := e{A}   (assignment)

Γ ⊢ {A ∧ b}P1{B}    Γ ⊢ {A ∧ ¬b}P2{B}
------------------------------------------------ (if)
Γ ⊢ {A}if (b) then (P1) else (P2){B}

Γ ⊢ {A ∧ b}P{A}
------------------------------------------------ (while)
Γ ⊢ {A}while (b) do (P){A ∧ ¬b}

Γ ⊢ {A}P1{C}    Γ ⊢ {C}P2{B}
------------------------------------------------ (comp)
Γ ⊢ {A}P1; P2{B}

Γ ⊢ {A1}P{B1}
------------------------------------------------ (conseq)   (A → A1, B1 → B)
Γ ⊢ {A}P{B}

Γ ⊢ {∀x′((x′ ↦ e1, e2) −∗ A[x := x′])}x := cons(e1, e2){A}   (cons)   (x′ ∉ FV(e1, e2, A))

Γ ⊢ {∃x′(e ↦ x′ ∗ (e ↦ x′ −∗ A[x := x′]))}x := [e]{A}   (lookup)   (x′ ∉ FV(e, A))

Γ ⊢ {(∃x(e1 ↦ x)) ∗ (e1 ↦ e2 −∗ A)}[e1] := e2{A}   (mutation)   (x ∉ FV(e1))

Γ ⊢ {(∃x(e ↦ x)) ∗ A}dispose(e){A}   (dispose)   (x ∉ FV(e))

Γ ∪ {{Ai}Ri{Bi} | i = 1, . . . , nproc} ⊢ {A1}Q1{B1}
   ⋮
Γ ∪ {{Ai}Ri{Bi} | i = 1, . . . , nproc} ⊢ {Anproc}Qnproc{Bnproc}
------------------------------------------------ (recursion)
Γ ⊢ {Ai}Ri{Bi}

Γ ⊢ {A}P{C}
------------------------------------------------ (inv-conj)   (FV(B) ∩ Mod(P) = ∅ and B is pure)
Γ ⊢ {A ∧ B}P{C ∧ B}

Γ ⊢ {A}P{B}
------------------------------------------------ (exists)   (x ∉ FV(B) ∪ EFV(P))
Γ ⊢ {∃x.A}P{B}

We say {A}P{B} is provable, and we write ⊢ {A}P{B}, when ⊢ {A}P{B} can be derived by these inference rules.
We explain the inference rules by the following simple example.

Example

Let A be an abbreviation of ∀y(y ≥ x ∧ y < z ↔ ∃w(y ↦ w) ∗ True). Suppose we have the procedure declaration

Procedure R1 (if (x < z) then (dispose(x); x := x + 1; R1) else (skip)).

We will show that ⊢ {A}R1{emp} is provable in our system. This means that if the heap consists exactly of consecutively allocated cells at the addresses x, . . . , z − 1, then the program R1 deallocates them all. Let Γ be {{A}R1{emp}}. The axiom (assignment) gives Γ ⊢ {A[x := x + 1]}x := x + 1{A}. The axiom (axiom) gives Γ ⊢ {A}R1{emp}. By the inference rule (comp), we have Γ ⊢ {A[x := x + 1]}x := x + 1; R1{emp}. The axiom (dispose) gives Γ ⊢ {(∃y(x ↦ y)) ∗ A[x := x + 1]}dispose(x){A[x := x + 1]}. Then by the inference rule (comp), we have Γ ⊢ {(∃y(x ↦ y)) ∗ A[x := x + 1]}dispose(x); x := x + 1; R1{emp}. The axiom (skip) gives Γ ⊢ {A ∧ ¬(x < z)}skip{A ∧ ¬(x < z)}. We have A ∧ x < z → (∃y(x ↦ y)) ∗ A[x := x + 1] and A ∧ ¬(x < z) → emp. Then by the inference rule (conseq), we have Γ ⊢ {A ∧ x < z}dispose(x); x := x + 1; R1{emp} and Γ ⊢ {A ∧ ¬(x < z)}skip{emp}. By applying the inference rule (if), we get Γ ⊢ {A}if (x < z) then (dispose(x); x := x + 1; R1) else (skip){emp}. By the inference rule (recursion), ⊢ {A}R1{emp} is provable.
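The derived triple can be sanity-checked on a concrete state with the earlier sketches: for x = 1, z = 3 and a heap holding exactly the cells 1 and 2, the precondition A holds, and every sufficiently deep unfolding of R1 terminates with the empty heap; if a cell demanded by A is missing, the run aborts, which the triple does not forbid because the precondition filters such initial states out. (This is only an illustration with the R1_unfolding sketch from Section 4.2, not part of the formal development.)

```python
# A holds: exactly the addresses x, ..., z-1 (here 1 and 2) are allocated.
assert R1_unfolding(10, {"x": 1, "z": 3}, {1: 10, 2: 20}) == ({"x": 3, "z": 3}, {})

# A fails (cell 2 is missing): the execution aborts, and {A}R1{emp} says nothing
# about this initial state because the precondition does not hold there.
assert R1_unfolding(10, {"x": 1, "z": 3}, {1: 10}) == ABORT
```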
We have two new rules that appear neither in [1] nor in [13]: the rules (inv-conj) and (exists). We have found that the axiom (AXIOM 9: INVARIANCE AXIOM) in [1] is not sound for our system. The definition of the axiom will be given in the next chapter. The asserted program {x = 0}[y] := 1{x = 0} is a counterexample: it is provable by the axiom but it is not true, since it causes abort for the state (s, h) = (s′[x := 0, y := 1], ∅) although ⟦x = 0⟧_(s,h) is true. Another counterexample is {emp}x := cons(1, 2){emp}; although it does not abort, it is not true. So we have introduced the rule (inv-conj). It is a variant of the combination of the axiom (AXIOM 9: INVARIANCE AXIOM) and the rule (RULE 12: CONJUNCTION RULE) of [1]. That is, the truth of a pure assertion at the post-execution state is the same as at the pre-execution state if its set of free variables is disjoint from the set of modifiable variables of the program. It is a special case of the frame rule in [13]. The rule (exists) is analogous to the existential introduction rule of first-order predicate calculus.
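The failure of the invariance axiom can be replayed with the semantics sketch: the pure assertion x = 0 mentions no variable modified by [y] := 1, yet the triple {x = 0}[y] := 1{x = 0} is not true in our abort-sensitive semantics, because the mutation aborts on the empty heap. (The variable names and constants follow the counterexample above.)

```python
# Precondition x = 0 holds, the heap is empty, and [y] := 1 touches an unallocated cell,
# so the execution aborts: the triple {x = 0}[y] := 1{x = 0} is not true.
s, h = {"x": 0, "y": 1}, {}
assert run_mutation(lambda st: st["y"], lambda st: 1, s, h) == ABORT

# With the cell allocated the command is abort-free, which is the situation (inv-conj)
# is designed for: the pure conjunct x = 0 survives because x is not modified.
s2, h2 = run_mutation(lambda st: st["y"], lambda st: 1, s, {1: 3})
assert s2["x"] == 0 and h2 == {1: 1}
```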
5 Soundness and Completeness
5.1 Soundness
In this section we will prove the soundness of our system. That means we will show that if a judgment Γ ⊢ {A}P{B} is derivable in our system, then Γ ⊢ {A}P{B} is also true.
We say an unfolded program is correct when every level of the unfolding of the program is correct. For a given specification, correctness of a program is the same as that of its unfolded transformation.
It is straightforward to construct a logical system by collecting axioms and inference rules from both Hoare's logic for recursive procedures and separation logic to verify pointer programs with recursive procedures, but such a system is unsound, because the invariance axiom is not sound in separation logic. At the end of this section, we will give an unsound example.
Proposition 5.1.1 {A}P{B} is true if and only if {A}P^(k){B} is true for all k.

Proof. First we will show the direction from the left-hand side to the right-hand side. Assume {A}P{B} is true. Assume ⟦A⟧_r = True. Then by definition ⟦P⟧(r) ∌ abort. Then by definition ∪_{k=0}^∞ ⟦P^(k)⟧⁻(r) ∌ abort. Then for all k, ⟦P^(k)⟧⁻(r) ∌ abort. Fix k. Assume ⟦P^(k)⟧⁻(r) ∋ r′. Then we have ∪_{k=0}^∞ ⟦P^(k)⟧⁻(r) ∋ r′. Then again by definition ⟦P⟧(r) ∋ r′. Then ⟦B⟧_{r′} = True. Then {A}P^(k){B} is true for all k.
Next we will show the direction from the right-hand side to the left-hand side. Assume {A}P^(k){B} is true for all k. Assume ⟦A⟧_r = True. Fix k. Then by definition ⟦P^(k)⟧(r) ∌ abort. Then ⟦P^(k)⟧⁻(r) ∌ abort. Then ∪_{k=0}^∞ ⟦P^(k)⟧⁻(r) ∌ abort. Then by definition ⟦P⟧(r) ∌ abort. Assume ⟦P⟧(r) ∋ r′. Then by definition ∪_{k=0}^∞ ⟦P^(k)⟧⁻(r) ∋ r′. Then we have ⟦P^(k)⟧⁻(r) ∋ r′ for some k. Then by definition ⟦P^(k)⟧(r) ∋ r′. Then ⟦B⟧_{r′} = True. Then {A}P{B} is true. □
Definition 5.1.2 For a set Γ of asserted programs, Γ^(k) is defined by { {Ai}Pi{Bi} | 1 ≤ i ≤ n }^(k) = { {Ai}Pi^(k){Bi} | 1 ≤ i ≤ n }.
If the truth of an assertion depends only on a store, it is not altered in the presence
of any heap. This easy lemma will be necessary for some formal discussions later.
Lemma 5.1.3 ⟦A⟧_s = ⟦A⟧_(s,h) for a pure formula A, where the left-hand side is the semantics for the base language and the right-hand side is the semantics for the assertion language.

Proof. This is proved by induction on A. If A is B ∧ C, we have ⟦A⟧_s = (⟦B⟧_s and ⟦C⟧_s) and ⟦A⟧_(s,h) = (⟦B⟧_(s,h) and ⟦C⟧_(s,h)), and both sides are the same, since by the induction hypothesis we have ⟦B⟧_s = ⟦B⟧_(s,h) and ⟦C⟧_s = ⟦C⟧_(s,h). The other cases are proved similarly. □
The following lemma is important for the soundness of the inference rule (recursion).

Lemma 5.1.4 If { {Ai}Ri^(k){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi^(k){Bi} is true for all i and k, then {Ai}Ri^(k){Bi} is true for all i and k.

Proof. Assume { {Ai}Ri^(k){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi^(k){Bi} is true for all i and k. We will show that {Ai}Ri^(k){Bi} is true for all i by induction on k.
Case 1. k = 0. By Proposition 4.1.10 (2), Ri^(0) = Ω. Then {Ai}Ω{Bi} is true.
Case 2. k = k′ + 1. By the induction hypothesis, {Ai}Ri^(k′){Bi} is true for all i. By the assumption for k′, we have { {Ai}Ri^(k′){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi^(k′){Bi} true. Hence {Ai}Qi^(k′){Bi} is true. By Proposition 4.1.10 (3), Ri^(k′+1) = Qi^(k′). Hence {Ai}Ri^(k){Bi} is true for all i. □
In the following lemma we consider an assertion and a program such that the assertion has no free variable that can be modified by the program. In such a case, the truth of the assertion is preserved by the execution of the program.

Lemma 5.1.5 If A is pure, P does not contain any procedure names, ⟦A⟧_(s,h) = True, FV(A) ∩ Mod(P) = ∅ and ⟦P⟧⁻((s, h)) ∋ (s′, h′), then ⟦A⟧_(s′,h′) = True.

Proof. Assume A is pure, P does not contain any procedure names, ⟦A⟧_(s,h) = True, FV(A) ∩ Mod(P) = ∅ and ⟦P⟧⁻((s, h)) ∋ (s′, h′). By Lemma 4.2.15 (1), s =_{Mod(P)^c} s′. Since FV(A) ⊆ Mod(P)^c, we have s =_{FV(A)} s′. By Lemma 5.1.3, ⟦A⟧_s = True. Hence ⟦A⟧_{s′} = True. By Lemma 5.1.3, ⟦A⟧_(s′,h′) = True. □
The following lemma will be necessary to show the soundness of the inference rule (inv-conj).

Lemma 5.1.6 If P does not contain any procedure names, B is pure, Mod(P) ∩ FV(B) = ∅ and {A}P{C} is true, then {A ∧ B}P{C ∧ B} is true.

Proof. Assume B is pure, Mod(P) ∩ FV(B) = ∅ and {A}P{C} is true. We have to prove that {A ∧ B}P{C ∧ B} is true.
Assume ⟦A ∧ B⟧_(s,h) = True. Then ⟦A⟧_(s,h) = True and ⟦B⟧_(s,h) = True. Then ⟦P⟧((s, h)) ∌ abort. Assume ⟦P⟧((s, h)) ∋ (s′, h′). Then ⟦C⟧_(s′,h′) = True. By Lemma 5.1.5, ⟦B⟧_(s′,h′) = True. Then ⟦C ∧ B⟧_(s′,h′) = True. Hence {A ∧ B}P{C ∧ B} is true. □
Lemma 5.1.7 If Γ ⊢ {A}P{B} is provable, then Γ^(k) ⊢ {A}P^(k){B} is true for all k.
Proof. It is proved by induction on the proof. We consider cases according to the last rule.

Case 1. (skip). Its proof is immediate.

Case 2. (axiom). Assume Γ, {A}P{B} ⊢ {A}P{B} is provable. Then Γ^(k), {A}P^(k){B} ⊢ {A}P^(k){B} is true for all k.

Case 3. (assignment). Assume Γ ⊢ {A[x := e]}x := e{A} is provable. Assume ⟦A[x := e]⟧_(s,h) = True. Let n be ⟦e⟧_s. By the definition we have ⟦x := e⟧((s, h)) = {(s1, h)} where s1 = s[x := n]. Since ⟦A[x := e]⟧_(s,h) = ⟦A⟧_(s1,h), we have ⟦A⟧_(s1,h) = True. Hence {A[x := e]}x := e{A} is true. Then by definition Γ^(k) ⊢ {A[x := e]}(x := e)^(k){A} is true.

Case 4. (if). We have the judgments Γ ⊢ {A ∧ b}P1{B} and Γ ⊢ {A ∧ ¬b}P2{B}, which are provable. By the induction hypothesis, Γ^(k) ⊢ {A ∧ b}P1^(k){B} and Γ^(k) ⊢ {A ∧ ¬b}P2^(k){B} are true.
Now we will show that Γ^(k) ⊢ {A}(if (b) then (P1) else (P2))^(k){B} is true for all k. Assume Γ^(k) is true. Then {A ∧ b}P1^(k){B} and {A ∧ ¬b}P2^(k){B} are true. Assume ⟦A⟧_(s,h) is True.
Case 4.1. ⟦b⟧_s is True. By Lemma 5.1.3, ⟦b⟧_(s,h) is True. By definition, ⟦A ∧ b⟧_(s,h) is True. Assume ⟦(if (b) then (P1) else (P2))^(k)⟧((s, h)) = ⟦P1^(k)⟧((s, h)) ∋ r. Then r ≠ abort and ⟦B⟧_r is True.
Case 4.2. ⟦b⟧_s is False. Then the fact that r ≠ abort and ⟦B⟧_r is True is proved similarly to Case 4.1.
Then {A}(if (b) then (P1) else (P2))^(k){B} is true. Then by definition Γ^(k) ⊢ {A}(if (b) then (P1) else (P2))^(k){B} is true for all k.

Case 5. (while). B is of the form A ∧ ¬b, and by the (while) rule we have the judgment Γ ⊢ {A ∧ b}P{A}, which is provable. By the induction hypothesis, Γ^(k) ⊢ {A ∧ b}P^(k){A} is true for all k.
Now we will show that Γ^(k) ⊢ {A}(while (b) do (P))^(k){A ∧ ¬b} is true. Assume Γ^(k) is true. Then {A ∧ b}P^(k){A} is true.
Define the function F : States ∪ {abort} → p(States ∪ {abort}) by
F(abort) = {abort},
F((s, h)) = {(s1, h1) | ⟦A ∧ ¬b⟧_(s1,h1) = True} ∪ {r ∈ States | ⟦A⟧_(s,h) = False}.
We will show that F satisfies the inequations obtained from the equations for ⟦(while (b) do (P))^(k)⟧ by replacing = by ⊇. That is, we will show
F(abort) ⊇ {abort},
F((s, h)) ⊇ {(s, h)} if ⟦b⟧_s = False,
F((s, h)) ⊇ ∪{ F(r) | r ∈ ⟦P^(k)⟧((s, h)) } if ⟦b⟧_s = True.
The first inequation immediately holds by the definition of F. We will show the second inequation. Assume ⟦b⟧_s = False. By Lemma 5.1.3, we have ⟦b⟧_(s,h) = False. If ⟦A⟧_(s,h) = False, then the inequation holds by the definition of F. If ⟦A⟧_(s,h) = True, then ⟦A ∧ ¬b⟧_(s,h) = True, and the inequation holds by the definition of F.
We will show the last inequation. If ⟦A⟧_(s,h) = False, then it holds by the definition of F. Assume ⟦A⟧_(s,h) = True, ⟦b⟧_s = True, and r′ is in the right-hand side. We will show that r′ ∈ F((s, h)). We have some r1 such that ⟦P^(k)⟧((s, h)) ∋ r1 and F(r1) ∋ r′. By Lemma 5.1.3, we have ⟦b⟧_(s,h) = True. Hence ⟦A ∧ b⟧_(s,h) = True. Since {A ∧ b}P^(k){A} is true, we have r1 ≠ abort and ⟦A⟧_{r1} = True. Then we have r′ ≠ abort and ⟦A ∧ ¬b⟧_{r′} = True. Hence r′ ∈ F((s, h)).
We have shown that F satisfies the inequations. By the least fixed point theorem [17], we have F ⊇ ⟦(while (b) do (P))^(k)⟧. If ⟦A⟧_(s,h) = True and r′ ∈ ⟦(while (b) do (P))^(k)⟧((s, h)), then r′ ∈ F((s, h)), and we have r′ ≠ abort and ⟦A ∧ ¬b⟧_{r′} = True. Therefore Γ^(k) ⊢ {A}(while (b) do (P))^(k){A ∧ ¬b} is true.

Case 6. (comp). We have the judgments Γ ⊢ {A}P1{C} and Γ ⊢ {C}P2{B}, which are provable for some C. By the induction hypothesis, Γ^(k) ⊢ {A}P1^(k){C} and Γ^(k) ⊢ {C}P2^(k){B} are true for all k.
Now we will show that Γ^(k) ⊢ {A}(P1; P2)^(k){B} is true for all k. Assume Γ^(k) is true. Then {A}P1^(k){C} and {C}P2^(k){B} are true.
Assume ⟦A⟧_(s,h) = True and ⟦(P1; P2)^(k)⟧((s, h)) ∋ r. We will show r ≠ abort and ⟦B⟧_r = True. We have r1 such that ⟦P1^(k)⟧((s, h)) ∋ r1 and ⟦P2^(k)⟧(r1) ∋ r. Hence we have r1 ≠ abort and ⟦C⟧_{r1} = True. Since ⟦P2^(k)⟧(r1) ∋ r, we have r ≠ abort and ⟦B⟧_r = True. Hence {A}(P1; P2)^(k){B} is true. Then Γ^(k) ⊢ {A}(P1; P2)^(k){B} is true for all k.

Case 7. (conseq). We have the judgment Γ ⊢ {A1}P{B1}, which is provable, where A → A1 and B1 → B. By the induction hypothesis for the assumption, Γ^(k) ⊢ {A1}P^(k){B1} is true for all k.
Now we will show that Γ^(k) ⊢ {A}P^(k){B} is true for all k. Assume Γ^(k) is true. Then {A1}P^(k){B1} is true.
Assume ⟦A⟧_(s,h) = True and ⟦P^(k)⟧((s, h)) ∋ r. We will show r ≠ abort and ⟦B⟧_r = True. By the side condition, ⟦A → A1⟧_(s,h) = True, so we have ⟦A1⟧_(s,h) = True. Hence r ≠ abort and ⟦B1⟧_r = True. By the side condition, ⟦B1 → B⟧_r = True, so we have ⟦B⟧_r = True. Hence {A}P^(k){B} is true. Then Γ^(k) ⊢ {A}P^(k){B} is true for all k.

Case 8. (cons). Assume Γ ⊢ {∀x′((x′ ↦ e1, e2) −∗ A[x := x′])}x := cons(e1, e2){A} is provable. Assume ⟦∀x′(x′ ↦ e1, e2 −∗ A[x := x′])⟧_(s,h) = True and ⟦x := cons(e1, e2)⟧((s, h)) ∋ r. We will show r ≠ abort and ⟦A⟧_r = True. Let P be x := cons(e1, e2), n1 be ⟦e1⟧_s, and n2 be ⟦e2⟧_s. By the definition of ⟦P⟧, r ≠ abort and r = (s1, h1), where s1 = s[x := n] and h1 = h[n := n1, n + 1 := n2] for some n such that n > 0 and n, n + 1 ∉ Dom(h). By ⟦∀x′(x′ ↦ e1, e2 −∗ A[x := x′])⟧_(s,h) = True, we have ⟦x′ ↦ e1, e2 −∗ A[x := x′]⟧_(s′,h) = True, where s′ = s[x′ := n]. Let h2 = ∅[n := n1, n + 1 := n2]. We have h1 = h + h2. Then ⟦x′ ↦ e1, e2⟧_(s′,h2) = True, since x′ ∉ FV(e1, e2). Since ⟦x′ ↦ e1, e2 −∗ A[x := x′]⟧_(s′,h) = True, we have ⟦A[x := x′]⟧_(s′,h1) = True. Hence ⟦A⟧_(s1,h1) = True, since x′ ∉ FV(A). Now we have that {∀x′((x′ ↦ e1, e2) −∗ A[x := x′])}x := cons(e1, e2){A} is true. Then Γ^(k) ⊢ {∀x′((x′ ↦ e1, e2) −∗ A[x := x′])}(x := cons(e1, e2))^(k){A} is true for all k.

Case 9. (lookup). Assume Γ ⊢ {∃x′(e ↦ x′ ∗ (e ↦ x′ −∗ A[x := x′]))}x := [e]{A} is provable. Assume ⟦∃x′((e ↦ x′) ∗ ((e ↦ x′) −∗ A[x := x′]))⟧_(s,h) = True and ⟦x := [e]⟧((s, h)) ∋ r. We will show r ≠ abort and ⟦A⟧_r = True. Let n be
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis
thesis

Mais conteúdo relacionado

Semelhante a thesis

Interdisciplinarity report draft v0 8 21th apr 2010
Interdisciplinarity report draft v0 8 21th apr 2010Interdisciplinarity report draft v0 8 21th apr 2010
Interdisciplinarity report draft v0 8 21th apr 2010grainne
 
E.Leute: Learning the impact of Learning Analytics with an authentic dataset
E.Leute: Learning the impact of Learning Analytics with an authentic datasetE.Leute: Learning the impact of Learning Analytics with an authentic dataset
E.Leute: Learning the impact of Learning Analytics with an authentic datasetHendrik Drachsler
 
Analog_and_digital_signals_and_systems.pdf
Analog_and_digital_signals_and_systems.pdfAnalog_and_digital_signals_and_systems.pdf
Analog_and_digital_signals_and_systems.pdfKrishanVyas1
 
Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...
Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...
Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...Mohamed Marei
 
Design and Development of a Knowledge Community System
Design and Development of a Knowledge Community SystemDesign and Development of a Knowledge Community System
Design and Development of a Knowledge Community SystemHuu Bang Le Phan
 
Computing Science Dissertation
Computing Science DissertationComputing Science Dissertation
Computing Science Dissertationrmc1987
 
Essentials of Robust Control
Essentials of Robust ControlEssentials of Robust Control
Essentials of Robust ControlCHIH-PEI WEN
 
System_Reliability-Based_Design_and_Multiresolutio.pdf
System_Reliability-Based_Design_and_Multiresolutio.pdfSystem_Reliability-Based_Design_and_Multiresolutio.pdf
System_Reliability-Based_Design_and_Multiresolutio.pdfLeonardoNahle1
 
project sentiment analysis
project sentiment analysisproject sentiment analysis
project sentiment analysissneha penmetsa
 
A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...
A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...
A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...Jim Jimenez
 
A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...
A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...
A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...Darian Pruitt
 
CLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docx
CLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docxCLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docx
CLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docxHussainiHamza1
 
Master Thesis - Erik van Doesburg - revised
Master Thesis - Erik van Doesburg - revisedMaster Thesis - Erik van Doesburg - revised
Master Thesis - Erik van Doesburg - revisedErik van Doesburg
 

Semelhante a thesis (20)

dmo-phd-thesis
dmo-phd-thesisdmo-phd-thesis
dmo-phd-thesis
 
Kumar_Akshat
Kumar_AkshatKumar_Akshat
Kumar_Akshat
 
Interdisciplinarity report draft v0 8 21th apr 2010
Interdisciplinarity report draft v0 8 21th apr 2010Interdisciplinarity report draft v0 8 21th apr 2010
Interdisciplinarity report draft v0 8 21th apr 2010
 
E.Leute: Learning the impact of Learning Analytics with an authentic dataset
E.Leute: Learning the impact of Learning Analytics with an authentic datasetE.Leute: Learning the impact of Learning Analytics with an authentic dataset
E.Leute: Learning the impact of Learning Analytics with an authentic dataset
 
Tesi
TesiTesi
Tesi
 
rosario_phd_thesis
rosario_phd_thesisrosario_phd_thesis
rosario_phd_thesis
 
Analog_and_digital_signals_and_systems.pdf
Analog_and_digital_signals_and_systems.pdfAnalog_and_digital_signals_and_systems.pdf
Analog_and_digital_signals_and_systems.pdf
 
Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...
Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...
Mohamed Marei 120126864 Dissertation Design and Construction of Hardware Add-...
 
Design and Development of a Knowledge Community System
Design and Development of a Knowledge Community SystemDesign and Development of a Knowledge Community System
Design and Development of a Knowledge Community System
 
Computing Science Dissertation
Computing Science DissertationComputing Science Dissertation
Computing Science Dissertation
 
hdr-cucugrosjean
hdr-cucugrosjeanhdr-cucugrosjean
hdr-cucugrosjean
 
Dissertation
DissertationDissertation
Dissertation
 
Essentials of Robust Control
Essentials of Robust ControlEssentials of Robust Control
Essentials of Robust Control
 
System_Reliability-Based_Design_and_Multiresolutio.pdf
System_Reliability-Based_Design_and_Multiresolutio.pdfSystem_Reliability-Based_Design_and_Multiresolutio.pdf
System_Reliability-Based_Design_and_Multiresolutio.pdf
 
project sentiment analysis
project sentiment analysisproject sentiment analysis
project sentiment analysis
 
A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...
A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...
A DISSERTATION SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE D...
 
A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...
A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...
A Comparison Of The Rule And Case-Based Reasoning Approaches For The Automati...
 
gusdazjo_thesis
gusdazjo_thesisgusdazjo_thesis
gusdazjo_thesis
 
CLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docx
CLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docxCLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docx
CLINICAL_MANAGEMENT_SYSTEM_PROJECT_DOCUM.docx
 
Master Thesis - Erik van Doesburg - revised
Master Thesis - Erik van Doesburg - revisedMaster Thesis - Erik van Doesburg - revised
Master Thesis - Erik van Doesburg - revised
 

Mais de Mahmudul Faisal (7)

F~CompSynth
F~CompSynthF~CompSynth
F~CompSynth
 
MS Thesis of Al Ameen 1.5 2010
MS Thesis of Al Ameen 1.5 2010MS Thesis of Al Ameen 1.5 2010
MS Thesis of Al Ameen 1.5 2010
 
thesis
thesisthesis
thesis
 
predefenseslide
predefenseslidepredefenseslide
predefenseslide
 
predefenseslide
predefenseslidepredefenseslide
predefenseslide
 
main
mainmain
main
 
faisal_summery
faisal_summeryfaisal_summery
faisal_summery
 

thesis

  • 1. Completeness of Verification System with Separation Logic for Recursive Procedures Mahmudul Faisal Al Ameen Doctor of Philosophy Department of Informatics School of Multidisciplinary Sciences SOKENDAI (The Graduate University for Advanced Studies) Tokyo, Japan September 2015
  • 2. A dissertation submitted to the Department of Informatics School of Multidisciplinary Sciences SOKENDAI (The Graduate University for Advanced Studies) in partial fulfillment of the requirements for the degree of Doctor of Philosophy Advisory Committee Makoto TATSUTA National Institute of Informatics, SOKENDAI Zhenjiang HU National Institute of Informatics, SOKENDAI Makoto KANAZAWA National Institute of Informatics, SOKENDAI Shin Nakajima National Institute of Informatics, SOKENDAI Yukiyoshi Kameyama University of Tsukuba
  • 3. Thesis advisor: Makoto Tatsuta Mahmudul Faisal Al Ameen SOKENDAI (The Graduate University for Advanced Studies) Abstract Completeness of Verification System with Separation Logic for Recursive Procedures The contributions of this dissertation are two results; the first result gives a new complete Hoare’s logic system for recursive procedures, and the second result proves the completeness of the verification system based on Hoare’s logic and separation logic for recursive procedures. The first result is a complete verification system for reasoning about while programs with recursive procedures that can be extended to separation logic. To obtain it, this work introduces two new inference rules, shows derivability of an inference rule and removes other redundant inference rules and an unsound axiom for showing completeness. The second result is a complete verifica- tion system, which is an extension of Hoare’s logic and separation logic for mutual recursive procedures. To obtain the second result, the language of while programs with recursive procedures is extended with commands to allocate, access, mutate and deallocate shared resources, and the logical system from the first result is extended with the backward reasoning rules of Hoare’s logic and separation logic. Moreover, it is shown that the assertion language of separation logic is expressive relative to the programs. It also introduces a novel expression that is used to describe the complete information of a given state in a precondition. In addition, this work uses the nec- essary and sufficient precondition of a program for the abort-free execution, which enables to utilize the strongest postconditions. iii
  • 4. I would like to dedicate this thesis to my parents for their love, sac- rifice and support all through my life.
  • 5. Acknowledgments I wish to express my deepest gratitude to my advisor, Prof. Makoto Tatsuta, for giving me the opportunity, pre-admission supports and recommendations to study in SOKENDAI with NII scholarship. His guidance with extreme patience is the key driving force for my today’s success. His support not only encouraged but also nour- ished me this last six years.Without his sincere efforts, I could not be in today’s posi- tion. His lessons helped me realizing the rigid nature of mathematics and logic. I am very grateful to my co-supervisor Asst. Prof. Makoto Kanazawa for being one of the most important supporting force for my work. His suggestions at the logic seminars have been a very good basis for elaborating and structuring my research; he gave me enough feedback to mature my ideas while always pointing me interesting directions. I would like to thank Prof. Zhenjiang Hu for providing valuable remarks, allowing me to improve the papers drafts and clarify my arguments. He has shown me a very important horizon of program composition that enabled me to realize programs as mathematical objects. His care and support especially kept me optimistic in my hard times. I am grateful to Prof. Kazushize Terui for his suggestions and support in the early stage of my study. I would like to express my sincere appreciation to Dr. Daisuke Kimura for giving me a long time support as an elder brother. I am indebted to Dr. Takayuki Koai for his v
  • 6. sinceremostefforttomakemylifeeasyinJapan. Indifferentoccasions,theiracademic tutoring, as well as mental supports, are never to be forgotten. Their explanation of several basic and advanced topics often made me understand difficult mathematical topics in an easy way. Special thanks to Dr. Kazuhiro Inaba for an important criticism to show a gap in one of my earlier solution. Dr. Kazuyuki Asada’s advice and Mr. Toshihiko Uchida’s friendly support also helped me to reach in this position. I am very grateful to all of them. Moreover, I thank to Mr. Koike Atsushi for sharing his LATEXclass file, based on which the class file for this dissertation is prepared. I thank National Institute of Informatics for providing me the necessary financial andresourcesupport. Further, IalsothankNIIofficialsatthestudentsupportsection for their administrative assistances. I owe to Prof. Dr. Shorif Uddin for driving me and directly helping me to apply to NII for the doctoral study. I also thank Dr. Zahed Hasan and Mr. Enayetur Rahman for their contributions in my life and research career. Finally, I thank my parents for their love, sacrifices, long patience and endless en- couragements. I thank my wife for her invaluable and direct care and support in the final year. I also thank my sister, relatives and friends for their great mental support that helped me to handle the challenging research situations.
  • 7. Contents 1 Introduction 1 1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1 1.2 Main Contribution . . . . . . . . . . . . . . . . . . . . . . . . . . 3 2 Background 7 3 New Complete System of Hoare’s Logic with Recursive Proce- dures 17 3.1 Language . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18 3.2 Semantics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18 3.3 Logical System . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 3.4 Completeness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21 4 Separation Logic for Recursive Procedures 25 4.1 Language . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26 4.2 Semantics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35 4.3 Logical System . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52 5 Soundness and Completeness 57 5.1 Soundness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58 5.2 Expressiveness . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67 5.3 Completeness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87 6 Conclusion 97 1
  • 8.
  • 9. 1 1Introduction 1.1 Motivation It is widely accepted that a program is needed to be verified to ensure that it is cor- rect. A correct program guarantees to perform the given task as expected. It is very important to ensure the safety of the mission-critical, medical, spacecraft, nuclear re- actor, financial, genetic-engineeringandsimulatorprograms. Moreover, everyonede- sires bug-free programs. Formal verification is intended to deliver programs which are completely free of bugs or defects. It verifies the source code of a program statically. So formal verifi- cation does not depend on the execution of a program. The time required to verify a program depends on neither its runtime and memory complexity nor the magnitude of its inputs. A program is required to be verified only once since it does not depend
  • 10. 2 Chapter 1. Introduction on test cases. Hence, formal verification of programs is important to save both time and expenses commercially and for its supremacy theoretically. Among formal verifi- cation approaches, model checking and Hoare’s logic are prominent. Model checking computeswhetheramodelsatisfiesagivenspecification, whereasHoare’slogicshows it for all models by provability [8]. Since it was proposed by Hoare [7], numerous works on Hoare’s logic have been done [1, 4–6, 10]. Several extensions have also been proposed [1], among which some attempted to verify programs that access heap or shared resources. But until the twenty-first century begins, very few of them were simple enough to use. On the otherhand, sincethedevelopmentofprogramminglanguageslikeCandC++, theus- age of pointers in programs (which are called pointer programs) gained much popu- larity for their ability to use shared memory and other resources directly and for faster execution. Yet this ability also causes crashed programs for some reasons because it is difficult to keep track of each memory operation. It may lead unsafe heap operation. A program crash occurs when the program tries to access a memory cell that has al- ready been deallocated before or when a memory cell is accessed before its allocation. So apparently it became necessary to have an extension of Hoare’s logic that can verify such pointer programs. In 2002, Reynolds proposed separation logic [13]. It was a breakthrough to achieve the ability to verify pointer programs. Especially it can guar- antee safe heap operations of programs. Although recently we can find several works on separation logic [2, 9] and its extensions and applications [3, 11, 12], there are few works found to show their completeness [14]. Tatsuta et al. [14] shows the complete- ness of the separation logic for pointer programs which is introduced in [13]. In this paper, we will show the completeness of an extended logical system. Our logical sys- tem is intended to verify pointer programs with mutual recursive procedures. Among several versions of the same inference rule Reynolds offered in [13] for separation logic, a concise set of backward reasoning rules has been chosen in [14]. The later work in [14] also offers rigorous mathematical discussions. The problems regarding thecompletenessofHoare’slogic, theconceptofrelativecompleteness, completeness ofHoare’slogicwithrecursiveproceduresandmanyotherimportanttopicshavebeen discussed in detail in [1]. Our work begins with [14] and [1]. In modern days, programs are written in segments with procedures, which make
  • 11. 1.2 Main Contribution 3 the programs shorter in size and logically structured, and increases the reusability of code. Soit is importanttouseprocedures andheapoperations(useof sharedmutable resources)bothinasingleprogram. Verifyingprogramswithsuchfeaturesisthemain motivation of our work. 1.2 Main Contribution Our goal is to give a relatively complete logical system that can be used for reason- ing about pointer programs with mutual recursive procedures. A logical system for software verification is called complete if every true judgment can be derived from that system. It ensures the strength of the logical system such that no further develop- ment is necessary for it. If all true asserted programs are provable in Hoare’s system where all true assertions are provided, we call it a relatively complete system. We will show the relative completeness of our system. A language is expressive if the weakest precondition can be defined in the language. We will also show that our language of specification is expressive for our programs. Relative completeness is discussed vastly in [1, 5]. In this paper, relative completeness is sometimes paraphrased as complete- ness when it is not ambiguous. The main contributions of our paper are as follows: (1) A new complete logical system for Hoare’s logic for recursive procedures. (2) A new logical system for verification of pointer programs and recursive proce- dures. (3) Proving the soundness and the completeness theorems. (4) Proving that our assertion language is expressive relative our programs. We know that Hoare’s logic with recursive procedures is complete [1]. We also know that Hoare’s logic with separation logic is complete [14]. But we do not know if Hoare’s logic and separation logic for recursive procedures is complete. In order to achieve our contributions, we will first construct our logical system by
  • 12. 4 Chapter 1. Introduction combining the axioms and inference rules of [1] and [14]. Then we will prove the expressiveness by coding the states in a similar way to [14]. At last we will follow a similar strategy in [1] to prove the completeness. Although one may feel it easy to combine these two logical systems to achieve such a complete system, in reality it is not the case. Now we will discuss some challenges we face to prove its relative completeness. (1) The axiom (AXIOM 9: INVARIANCE AXIOM), which is essential in [1] to show completeness, is not sound in separation logic. (2) In the completeness proof of the extension of Hoare’s logic for the recursive procedures in [1], the expression −→x = −→z (−→x are all program variables and −→z are fresh) is used to describe the complete information of a given state in a precondition. A state in Hoare’s logic is only a store, which is a function assigning to each variable a value. In separation logic, a state is a pair of a store and a heap. So the same expres- sion cannot be used for a similar purpose for a heap, because a store information may contain variables x , . . . , xm which are assigned z , . . . , zm respectively, while a heap information consists of the set of the physical addresses only in the heap and their corresponding values. The vector notation cannot express the general information of the size of the heap and its changes because of allocation and deallocation of memory cells. (3) Another challenge is to utilize the strongest postcondition of a precondition and a program. In case a program aborts in a state for which the precondition is valid, the strongest postcondition of the precondition and the program does not exist. But utilizing the strongest postcondition is necessary for completeness proof, because the completeness proof of [1] depends on it. Now it is necessary to solve these obstacles for the proof of the completeness of our system. That is why it is quite challenging to solve the completeness theorem which is our principal goal. The solutions to the challenges stated above are as follows: (1) We will give an inference rule (inv-conj) as an alternative to the axiom (AX-
(AXIOM 9: INVARIANCE AXIOM) in [1]. It will accept a pure assertion which does not have a variable in common with the program. We will also give an inference rule (exists) that is analogous to the existential introduction rule in the first-order predicate calculus. We will show that the inference rule (RULE 10: SUBSTITUTION RULE I) in [1] is derivable in our system. Since the inference rules (RULE 11: SUBSTITUTION RULE II) and (RULE 12: CONJUNCTION RULE) in [1] are redundant in our system, we will remove them. It gives us the new complete system for Hoare's logic for mutual recursive procedures. We will extend this system with the inference rules in [14] to give the verification system for pointer programs with mutual recursive procedures. As a result, the set of our axioms and inference rules will be quite different from the union of those of [1] and [14].
(2) We will give an appropriate assertion to describe the complete information of a given state in a precondition. Besides the expression −→x = −→z for the store information, we will additionally use the expression Heap(xh) for the heap information, where xh keeps a natural number that is obtained by a coding of the current heap.
(3) For pointer programs, it is difficult to utilize the strongest postcondition because it is impossible to assert a postcondition for A and P where P may abort in a state for which A is true. We use {A}P{True} as the abort-free condition of A and P. For the existence of the strongest postcondition, it is necessary for {A}P{True} to be true. We will give the necessary and sufficient precondition WP,True(−→x ) for the fact that the program P will never abort.

Outline of This Paper

Our background will be presented in Chapter 2. A new complete Hoare's logic for recursive procedures will be given in Chapter 3. We will define our languages, semantics and the logical system in Chapter 4. In Chapter 5, we will prove the soundness, expressiveness and completeness. We will conclude in Chapter 6.
2 Background

Hoare introduced an axiomatic method, Hoare's logic, to prove the correctness of programs in 1969 [7]. Floyd's intermediate assertion method was behind the approach of Hoare's logic. Besides its great influence in designing and verifying programs, it has also been used to define the semantics of programming languages. Among the many works which extended the approach of Hoare to prove a program correct, some were not useful or were difficult to use. In 1981, Apt presented a survey of various results concerning the approach of Hoare in [1]. His work emphasized mainly the soundness and completeness issues. He first presented the proof system for while programs along with its soundness, expressiveness, and completeness. He then presented the extension of it to recursive procedures, local variable declarations and procedures with parameters, with the corresponding soundness and completeness. Although several extensions of Hoare's logic are found in the literature, until the work of
Reynolds in 2002 [13], an extension to pointer programs was missing. Reynolds provided separation logic, which is an assertion language for shared mutable data structures and resources. In his work, he also extended Hoare's logic to pointer programs with several sets of logical rules and showed its soundness. Tatsuta chose only the rules with forward reasoning and proved its completeness in [14]. The scope of this logical system was limited only to while programs with memory access operations. Since this work intends to extend Hoare's logic and separation logic to mutual recursive procedures, it will be based on [1] and [14].

Hoare's Logic for Recursive Procedures

In this section, we will discuss the verification system for while programs with recursive procedures given in [1]. Here we will present the language of the while programs with recursive procedures and the assertions, their semantics, a logical system to reason about the programs and the completeness proof. We will extend this proof to pointer programs with recursive procedures later.

Language

The assertion language in [1] is the first-order language with equality of Peano arithmetic. Its variables are denoted by x, y, z, w, . . .. Expressions, denoted by e, are defined by e ::= x | 0 | 1 | e + e | e × e. A quantifier-free formula b is defined by b ::= e = e | e < e | ¬b | b ∧ b | b ∨ b | b → b. The formulas (of the assertion language), denoted by A, B, C, are defined by A ::= e = e | e < e | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA.
Recursive procedures are denoted by R. While programs extended with recursive procedures, denoted by P, Q, are defined in [1] by P, Q ::= x := e | if (b) then (P) else (P) | while (b) do (P) | P; P | skip | R. We assume that procedure R is declared with its body Q. The basic formula of Hoare's logic is composed of three elements: two assertions A and B and a program P. It is expressed in the form {A}P{B}, which is also called a correctness formula or an asserted program. Here A and B are called the precondition and the postcondition of the program P respectively. Whenever A is true before the execution of P and the execution terminates, B is true after the execution.

Semantics

States, denoted by s, are defined in [1] as functions from the set of variables V to the set of natural numbers N. The semantics of the programming language is defined first by ⟦P⟧− for programs that do not contain procedures, which is a partial function from States to States. The semantics of assertions is denoted by ⟦A⟧s, which gives the truth value of A at the state s.
Definition 2.0.1 The semantics of programs is defined as follows:
⟦x := e⟧−(s) = s[x := ⟦e⟧s],
⟦if (b) then (P1) else (P2)⟧−(s) = ⟦P1⟧−(s) if ⟦b⟧s = True, and ⟦P2⟧−(s) otherwise,
⟦while (b) do (P)⟧−(s) = s if ⟦b⟧s = False, and ⟦while (b) do (P)⟧−(⟦P⟧−(s)) otherwise,
⟦P1; P2⟧−(s) = ⟦P2⟧−(⟦P1⟧−(s)),
⟦skip⟧−(s) = s.
In order to define the semantics of programs which include recursive procedures, Apt provided an approximation semantics of programs. He defined a procedure-less program P(n) by induction on n: P(0) = Ω, P(n+1) = P[Q(n)/R]. He then defined the semantics of programs by ⟦P⟧ = ∪{ ⟦P[Q(i)/R]⟧− | i ≥ 0 }. An asserted program (or a correctness formula) {A}P{B} is defined to be true if and only if for all states s, s′, if ⟦A⟧s = True and ⟦P⟧(s) = s′ then ⟦B⟧s′ = True.

Logical System

The logical system H given in [1] consists of the following axioms and inference rules. Here Γ is used as a set of asserted programs. A judgment is defined as Γ ⊢ {A}P{B}. var(P) is defined as the set of all variables that appear in the execution of P.
AXIOM 1: ASSIGNMENT AXIOM
Γ ⊢ {A[x := e]}x := e{A} (assignment)

RULE 2: COMPOSITION RULE
Γ ⊢ {A}P1{C}    Γ ⊢ {C}P2{B}
Γ ⊢ {A}P1; P2{B} (composition)

RULE 3: if-then-else RULE
Γ ⊢ {A ∧ b}P1{B}    Γ ⊢ {A ∧ ¬b}P2{B}
Γ ⊢ {A}if (b) then (P1) else (P2){B} (if-then-else)

RULE 4: while RULE
Γ ⊢ {A ∧ b}P{A}
Γ ⊢ {A}while (b) do (P){A ∧ ¬b} (while)

RULE 5: CONSEQUENCE RULE
Γ ⊢ {A1}P{B1}
Γ ⊢ {A}P{B} (consequence) (A → A1, B1 → B true)

RULE 8: RECURSION RULE
Γ ∪ {{A}R{B}} ⊢ {A}Q{B}
Γ ⊢ {A}R{B} (recursion)

AXIOM 9: INVARIANCE AXIOM
Γ ⊢ {A}R{A} (invariance) (FV(A) ∩ var(R) = ∅)
RULE 10: SUBSTITUTION RULE I
Γ ⊢ {A}R{B}
Γ ⊢ {A[−→z := −→y ]}R{B[−→z := −→y ]} (substitution I) (−→y , −→z ̸∈ var(R))

RULE 11: SUBSTITUTION RULE II
Γ ⊢ {A}R{B}
Γ ⊢ {A[−→z := −→y ]}R{B} (substitution II) (−→z ̸∈ var(R) ∪ FV(B))

RULE 12: CONJUNCTION RULE
Γ ⊢ {A}R{B}    Γ ⊢ {A′}R{C}
Γ ⊢ {A ∧ A′}R{B ∧ C} (conjunction)

An asserted formula {A}x := e{A[x := e]} may seem to fit the assignment axiom better. But it does not. For example, {x = a ∧ b = a + 1}x := b{(x = a ∧ b = a + 1)[x := b]}, that is, {x = a ∧ b = a + 1}x := b{b = a ∧ b = a + 1}, is not obviously true. Rather, {(x = b)[x := b]}x := b{x = b}, that is, {b = b}x := b{x = b}, is true. The composition rule is similar in concept to the cut rule. When two programs are executed one after another, the postcondition of the former is the precondition of the latter. The precondition of the former and the postcondition of the latter are preserved for the execution of the composition of those two programs. The if-then-else rule comes from the fact that the truth value of b determines the execution of either P1 or P2. The rule itself is very natural. The while rule is a bit tricky. Here A is called a loop invariant, which is an assertion that is preserved before and after the execution of P. The truth of b triggers the execution of P, and naturally the execution terminates only when b is false. The consequence rule is not an ordinary rule like the others. The important fact here is that a stronger precondition and a weaker postcondition may replace the precondition and the postcondition of a valid asserted program, respectively, without affecting its validity. An example may help to understand it better. The asserted program
{x = a}x := x + 1{x = a + 1} is indeed valid. But from the assignment axiom we may get only {(x = a + 1)[x := x + 1]}x := x + 1{x = a + 1}, that is, {x + 1 = a + 1}x := x + 1{x = a + 1}. Since x = a → x + 1 = a + 1, with the help of the consequence rule we finally get {x = a}x := x + 1{x = a + 1}. The recursion rule states that if the assumption of a valid asserted program for a recursive procedure gives us a valid asserted program for its body, we can say that the asserted program for the procedure is indeed valid. The invariance axiom guarantees that the precondition is preserved in the postcondition if none of its variables is accessed by the execution of a recursive procedure. Substitution rule I allows variable substitution in the assertions of an asserted program if the recursive procedure accesses neither the substituted nor the substituting variables. Substitution rule II allows the substitution of only the variables in the precondition if those variables appear neither in the recursive procedure nor in the postcondition. The conjunction rule allows the preconditions and the postconditions of two asserted programs for the same recursive procedure to be conjoined.

Soundness

In [1], ⊢H {A}P{B} denotes the fact that {A}P{B} is provable in the logical system H, which uses the assumption that all the true assertions are provided (for the consequence rule). In his work, the notion of the truth of an asserted program is introduced, where he chose the standard interpretation of the assertion language with the domain of natural numbers. Apt called an asserted program valid if it is true under all interpretations. He also called a proof rule sound if it preserves the truth of asserted programs under all interpretations. Since it is easy to prove that the axioms are valid and the proof rules are sound, the logical system is proved to be sound by induction on the length of proofs. His soundness theorem claims that for every asserted program {A}P{B} in the logical system H, if ⊢H {A}P{B} is provable under the presence of all true assertions then {A}P{B} is true.
Completeness in the sense of Cook

The strongest postcondition of an assertion and a program and the weakest precondition of a program and an assertion play a key role in defining the completeness in the sense of Cook of such a proof system, for which general completeness does not hold. Now we will define the strongest postcondition and the weakest precondition.
Definition 2.0.2 The strongest postcondition of an assertion A and a program P is defined by SP(A, P) = { s′ | ∃s(⟦A⟧s = True ∧ ⟦P⟧(s) = s′) }. The weakest precondition of a program P and an assertion A is defined by WP(P, A) = { s | ∀s′(⟦P⟧(s) = s′ → ⟦A⟧s′ = True) }.
Definition 2.0.3 An assertion language L is said to be expressive relative to the set of programs P if for all assertions A ∈ L and programs P ∈ P, there exists an assertion S ∈ L which defines the strongest postcondition SP(A, P).
Definition 2.0.4 A proof system G for a set of programs P is said to be complete in the sense of Cook if, for every L such that L is expressive relative to P and for every asserted formula {A}P{B}, if {A}P{B} is true then ⊢G {A}P{B} is provable.
Apt presented the proof of the completeness of the system in the sense of Cook in [1] using two central lemmas. We will present them and discuss their proofs. Assume that −→x is the sequence of all variables which occur in P and −→z is a sequence of new variables of the same length. Assume that the assertion language is expressive for the logical system. So there exists an assertion S that defines the strongest postcondition of −→x = −→z and R. The asserted program {−→x = −→z }R{S} is the most general formula for R, since any other true asserted program about R can be derived from {−→x = −→z }R{S}. This claim is the content of the first lemma.
Lemma 2.0.5 (Apt 1981) If {A}P{B} is true then {−→x = −→z }R{S} ⊢ {A}P{B} is provable, provided that all the true assertions are given.
Proof.
  • 23. 15 It is proved by induction on P where the most interesting case is P = R. Other cases are similar to that of the system H in [1]. SupposethatPisR. Assume{A}R{B}istrue. Wehave⊢ {−→x = −→z }R{S}. LetA be A[−→z := −→u ] and B be B[−→z := −→u ] where −→u ̸∈ FV(B) ∪ var(R). By invariance axiom, ⊢ {A [−→x := −→z ]}R{A [−→x := −→z ]} is provable. By the conjunction rule, ⊢ {−→x = −→z ∧ A [−→x := −→z ]}R{S ∧ A [−→x := −→z ]} is provable since FV(A [−→x := −→z ]) ∩ var(R) = ∅. We now show that S ∧ A [−→x := −→z ] → B . Assume S ∧ A [−→x := −→z ] s = True. By definition S s = True. By the prop- erty of the strongest postcondition, there exists a state s′ such that Ri (s′ ) = s and −→x = −→z s′ = True. By invariance axiom, ⊢ {¬A [−→x := −→z ]}R{¬A [−→x := −→z ]} is provable. The by conjunction rule ⊢ {−→x = −→z ∧ ¬A [−→x := −→z ]}Ri{S ∧ ¬A [−→x := −→z ]}. By soundness, {−→x = −→z ∧ ¬A [−→x := −→z ]}Ri{S ∧ ¬A [−→x := −→z ]} is true. Now suppose that ¬ A [−→x := −→z ] s′ = True. Hence −→x = −→z ∧ ¬A [−→x := −→z ] s′ = True. Therefore, ¬ A [−→x := −→z ] s = True. But A [−→x := −→z ] s = True by the assumption for s′ . It contradicts the assumption and hence A [−→x := −→z ] s′ = True. Since−→x = −→z ∧A [−→x := −→z ]→A ,wehave A s′ = True. Then A s′[−→z := −−→ s′(u)] = True. Then R (s′ [−→z := −−→ s′ (u)]) = s[−→z := −→ s(u)] since −→z , −→u ̸∈ var(R). Then by definition, B s[−→z :=−→u ]True. Then by definition, B s = True. Hence S ∧ A [−→x := −→z ] → B is true. Then by the consequence rule, ⊢ {−→x = −→z ∧ A [−→x := −→z ]}R{B } is provable. Then by the substitution rule II, ⊢ {−→x = −→x ∧ A }R{B } is provable. Then by the consequence rule, ⊢ {A }Ri{B } is provable. By the substitution rule I, ⊢ {A [−→u := −→z ]}Ri{B [−→u := −→z ]}. We have A → A [−→u := −→z ] and B [−→u := −→z ] → B. Then by the consequence rule, ⊢ {A}P{B} provable. 2 Lemma 2.0.6 (Apt 1981) The next lemma in [1] claims that ⊢ {−→x = −→z }R{S} is provable. Proof.
By the definition of S, {−→x = −→z }R{S} is true and hence {−→x = −→z }Q{S} is true since ⟦R⟧ = ⟦Q⟧. By Lemma 2.0.5, {−→x = −→z }R{S} ⊢ {−→x = −→z }Q{S} is provable. By the recursion rule, ⊢ {−→x = −→z }R{S} is provable. 2
The completeness theorem states that if an asserted program {A}P{B} is true then ⊢ {A}P{B} is provable when all the true assertions are given. It is the central concept of completeness in the sense of Cook.
Theorem 2.0.7 (Apt 1981) If {A}P{B} is true then ⊢ {A}P{B} is provable.
Proof. Assume {A}P{B} is true. By Lemma 2.0.5, {−→x = −→z }R{S} ⊢ {A}P{B} is provable. By Lemma 2.0.6, ⊢ {−→x = −→z }R{S} is provable. Then ⊢ {A}P{B} is provable. 2
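To make Definitions 2.0.1 and 2.0.2 concrete, the following is a minimal sketch (not part of [1]) that interprets procedure-free while programs over a small finite collection of stores and computes SP(A, P) and WP(P, A) by brute-force enumeration. The program encoding, the two-variable state space and all function names are illustrative assumptions; in the dissertation the definitions are semantic and range over all stores.

```python
from itertools import product

# A store maps variable names to natural numbers; programs are nested tuples.
# This toy state space only tracks the variables 'x' and 'a' over 0..3.
VALUES = range(4)
STORES = [dict(zip(("x", "a"), v)) for v in product(VALUES, repeat=2)]

def run(prog, s):
    """The partial function of Definition 2.0.1 for procedure-free programs."""
    kind = prog[0]
    if kind == "skip":
        return s
    if kind == "assign":                      # ("assign", x, e) with e a function of the store
        _, x, e = prog
        return {**s, x: e(s)}
    if kind == "seq":                         # ("seq", P1, P2)
        return run(prog[2], run(prog[1], s))
    if kind == "if":                          # ("if", b, P1, P2)
        return run(prog[2] if prog[1](s) else prog[3], s)
    if kind == "while":                       # ("while", b, P); may diverge, like the original
        while prog[1](s):
            s = run(prog[2], s)
        return s
    raise ValueError(kind)

def sp(A, prog):
    """SP(A, P) restricted to STORES: the states reachable from a store satisfying A."""
    return [s2 for s in STORES if A(s) for s2 in [run(prog, s)]]

def wp(prog, A):
    """WP(P, A) restricted to STORES; assumes prog terminates on every store in STORES."""
    return [s for s in STORES if A(run(prog, s))]

# Example: P is x := x + 1 and A is x = a; every state in SP(A, P) satisfies x = a + 1.
P = ("assign", "x", lambda s: s["x"] + 1)
post = sp(lambda s: s["x"] == s["a"], P)
assert all(s["x"] == s["a"] + 1 for s in post)
```

This is only meant as an executable reading of the set-theoretic definitions; the completeness argument itself never computes these sets, it only assumes an assertion S expressing SP(−→x = −→z , R).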
  • 25. 17 3 New Complete System of Hoare’s Logic with Recursive Procedures We introduce a complete system of Hoare’s logic with recursive procedures. Apt gave a system for the same purpose and showed its completeness in [1]. Our system is obtained from Apt’s system by replacing the invariance axiom, the substitution rule I, the substitution rule II, and the conjunction rule by the rules (inv-conj) and (ex- ists). Apt suggested without proofs that one could replace them by his substitution rule I, (inv-conj), and (exists) to get another complete system. We prove that the sub- stitution rule I can actually be derived in our system. We also give a detailed proof of the completeness of our system.
  • 26. 18 Chapter 3. New Complete System of Hoare’s Logic with Recursive Procedures 3.1 Language Our assertion is a formula A of Peano arithmetic. We define the language L as follows. Definition 3.1.1 Formulas A are defined by A ::= e = e | e < e | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA We will sometimes call a formula an assertion. We define FV(A) as the set of free variables in A. We define FV(e) similarly. Our program is a while-program with parameterless recursive procedures. Definition 3.1.2 Programs, denoted by P,Q, are defined by P, Q ::= x := e | if (b) then (P) else (P) | while (b) do (P) | P; P | skip | Ri. b is a formula without the quantifiers. Ri is a parameter-less procedure name having Qi as its definition body. We define the language L− as L excluding the construct R. An asserted program is defined by {A}P{B}, which means the partial correctness. 3.2 Semantics Definition 3.2.1 We define the semantics of our programming language. For a program P, its meaning P is defined as a partial function from States to States. We will define
  • 27. 3.3 Logical System 19 P (r ) as the resulting state after termination of the execution of P with the initial state r . If the execution of P with the initial state r does not terminate, we leave P (r ) undefined. In order to define P , we would like to define P − for all P in the language L− . We define P − by induction on P in L− as follows: x := e − (s) = s[x := e s], if (b) then (P ) else (P ) − (s) = { P (s) if b s = True P (s) otherwise, while (b) do (P) − = { s if b s = False while (b) do (P) − ( P − (s)) otherwise, P ; P − (s) = P − ( P − (s)) skip − (s) = s. Definition 3.2.2 For an asserted program {A}P{B}, the meaning of {A}P{B} is de- fined as True or False. {A}P{B} is defined to be True if the following holds. For all s and s′ , if A s = True and P (s) = s′ , then B s′ = True. Definition 3.2.3 The semantics of P in L is defined by P (s) = { s′ if { P(i) − (s)|i ≥ } = {s′ } undefined if { P(i) − (s)|i ≥ } = ∅ 3.3 Logical System This section defines the logical system. We will write A[x := e] for the formula obtained from A by replacing x by e. Definition 3.3.1 Our logical system consists of the following inference rules. As men- tionedinprevioussection, wewilluseΓforasetofassertedprograms. Ajudgmentisdefined as Γ ⊢ {A}P{B}.
  • 28. 20 Chapter 3. New Complete System of Hoare’s Logic with Recursive Procedures Γ ⊢ {A}skip{A} (skip) Γ, {A}P{B} ⊢ {A}P{B} (axiom) Γ ⊢ {A[x := e]}x := e{A} (assignment) Γ ⊢ {A ∧ b}P {B} Γ ⊢ {A ∧ ¬b}P {B} Γ ⊢ {A}if (b) then (P ) else (P ){B} (if ) Γ ⊢ {A ∧ b}P{A} Γ ⊢ {A}while (b) do (P){A ∧ ¬b} (while ) Γ ⊢ {A}P {C} Γ ⊢ {C}P {B} Γ ⊢ {A}P ; P {B} (comp) Γ ⊢ {A }P{B } Γ ⊢ {A}P{B} (conseq) (A → A , B → B) Γ ∪ {{Ai}Ri{Bi}|i = , . . . , nproc} ⊢ {A }Q {B } ... Γ ∪ {{Ai}Ri{Bi}|i = , . . . , nproc} ⊢ {Anproc }Qnproc {Bnproc } Γ ⊢ {Ai}Ri{Bi} (recursion) Γ ⊢ {A}P{C} Γ ⊢ {A ∧ B}P{C ∧ B} (inv-conj) (FV(B) ∩ Mod(P) = ∅)
  • 29. 3.4 Completeness 21 Γ ⊢ {A}P{B} Γ ⊢ {∃x.A}P{B} (exists) (x /∈ FV(B) ∪ EFV(P)) We say {A}P{B} is provable and we write ⊢ {A}P{B}, when ⊢ {A}P{B} can be derived by these inference rules. The rule (exists) is analogous to the rule existential introduction of propositional calculus. 3.4 Completeness Lemma 3.4.1 If {A}P {B} is true and P = P then {A}P {B} is true. Proof. By definition. Definition 3.4.2 X is called the strongest postcondition of P and A if and only if the fol- lowing holds. (1) For all s, s′ , if A s = True and P (s) = s′ then s′ ∈ X. (2) For all Y, if ∀s, s′ ( A s = True ∧ P (s) = s′ → s′ ∈ Y) then X ⊆ Y. Definition 3.4.3 SA,P(−→x ) is defined as the strongest postcondition for A and P. SA,P(−→x ) gives the strongest assertion S such that {A}P{S} is true. Lemma 3.4.4 If⊢ {A}P{B}then⊢ {A[−→x := −→z ]}P{B[−→x := −→z ]}where−→z , −→x ̸∈ EFV(P). Proof. Assume ⊢ {A}P{B} and −→z ̸∈ EFV(P). Then by (inv-conj), ⊢ {A ∧ −→x = −→z }P{B ∧ −→x = −→z }. We have B ∧ −→x = −→z → B[−→x := −→z ]. Then by (con- seq), ⊢ {A ∧ −→x = −→z }P{B[−→x := −→z ]}. Then by (exists), ⊢ {∃−→z (A ∧ −→x =
  • 30. 22 Chapter 3. New Complete System of Hoare’s Logic with Recursive Procedures −→z )}P{B[−→x := −→z ]}. We have A[−→x := −→z ] → ∃−→z (A ∧ −→x = −→z ). Then by (conseq), ⊢ {A[−→x := −→z ]}P{B[−→x := −→z ]}. 2 Lemma 3.4.5 If {A}P{B} is true and −→z ̸∈ EFV(P) then Γ ⊢ {A}P{B} where Γ = {{−→x = −→z }Ri{S−→x =−→z ,Ri (−→x )}|i = , . . . , n}, −→x = x , . . . , xm and {xj|j = , . . . , m} = EFV(P). Proof. We will prove it by induction on P. We will consider the cases of P. If P is other than Ri, the proof of these cases are similar to those of completeness proof of H given in [1]. Case P is Ri. Assume {A}Ri{B} is true and ⃗z ̸∈ EFV(Ri). We have Γ ⊢ {−→x = −→z }Ri{S−→x =−→z ,Ri (−→x )}. Let A be A[−→z := −→u ] and B be B[−→z := −→u ] where −→u ̸∈ FV(B) ∪ EFV(Ri). By the rule (inv-conj), Γ ⊢ {−→x = −→z ∧ A [−→x := −→z ]}Ri{S−→x =−→z ,Ri (−→x ) ∧ A [−→x := −→z ]} since FV(A [−→x := −→z ]) ∩ Mod(Ri) = ∅. We now show that S−→x =−→z ,Ri (−→x ) ∧ A [−→x := −→z ] → B . Assume S−→x =−→z ,R(−→x ) ∧ A [−→x := −→z ] s = True. By definition S−→x =−→z ,Ri (−→x ) s = True. By the property of the strongest postcondition, there exists a state s′ such that Ri (s′ ) = s and −→x = −→z s′ = True. Now suppose that ¬ A [−→x := −→z ] s′ = True. By (inv-conj), Γ ⊢ {−→x = −→z ∧ ¬A [−→x := −→z ]}Ri{S−→x =−→z ,Ri (−→x ) ∧ ¬A [−→x := −→z ]}. Since Γ is true by definition, by soundness, {−→x = −→z ∧ ¬A [−→x := −→z ]}Ri{S−→x =−→z ,Ri (−→x ) ∧ ¬A [−→x := −→z ]} is true. Then by definition ¬ A [−→x := −→z ] s = True. But A [−→x := −→z ] s = True. It contradicts the assumption and hence A [−→x := −→z ] s′ = True. Since−→x = −→z ∧A [−→x := −→z ]→A ,wehave A s′ = True. Then A s′[−→z := −−→ s′(u)] = True. Then Ri (s′ [−→z := −−→ s′ (u)]) = s[−→z := −→ s(u)] since −→z , −→u ̸∈ EFV(Ri). Then by definition, B s[−→z :=−→u ]True. Then by definition, B s = True. Hence S−→x =−→z ,R(−→x )∧ A [−→x := −→z ] → B is true. Then by the rule (conseq), Γ ⊢ {−→x = −→z ∧ A [−→x := −→z ]}Ri{B }. Then by the rule (exists), Γ ⊢ {∃−→z (−→x = −→z ∧A [−→x := −→z ])}Ri{B }. We have A → ∃−→z (−→x = −→z ∧ A [−→x := −→z ]). Then by the rule (conseq), Γ ⊢
  • 31. 3.4 Completeness 23 {A }Ri{B }. By Lemma 3.4.4, Γ ⊢ {A [−→u := −→z ]}Ri{B [−→u := −→z ]}. We have A → A [−→u := −→z ] and B [−→u := −→z ] → B. Then by (conseq), Γ ⊢ {A}P{B}. 2 Next lemma shows that the hypothesis {−→x = −→z (−→x )}R{S−→x =−→z ,R(−→x )} used in lemma 5.3.7 is provable in the our system. Lemma 3.4.6 ⊢ {−→x = −→z }Rj{S−→x =−→z ,Rj (−→x )} for j = , . . . , n where −→x = x , . . . , xm, {xj|j = , . . . , m} = ∪n i= EFV(Ri) and −→z ̸∈ ∪n i= EFV(Ri). Proof. Assume −→z ̸∈ ∪n i= EFV(Ri) and −→x = x , . . . , xm where {xj|j = , . . . , m} = ∪n i= EFV(Ri). Fix j. Assume −→x = −→z s = True. Assume Qj (s) = r where Qj is the body of Rj. Then by Lemma 3.4.1, Rj (s) = r. By definition, S−→x =−→z ,Rj (−→x ) r = True. Then by definition, {−→x = −→z }Qj{S−→x =−→z ,Rj (−→x )} is true. By Lemma 3.4.5, {{−→x = −→z }Ri{S−→x =−→z ,Ri (−→x )}|i = , . . . , n} ⊢ {−→x = −→z }Qj{S−→x =−→z ,Rj (−→x )}. Hence, {{−→x = −→z }Ri{S−→x =−→z ,Ri (−→x )}|i = , . . . , n} ⊢ {−→x = −→z }Qj{S−→x =−→z ,Rj (−→x )} for all j = , . . . , n. Then by the rule (recursion), ⊢ {−→x = −→z }Rj{S−→x =−→z ,Rj (−→x )}. 2 The following theorem is the key theorem of this paper. It says that our system is complete. Theorem 3.4.7 If {A}P{B} is true then {A}P{B} is provable. Proof. Assume {A}P{B} is true. Let −→z be such that −→z ̸∈ ∪n i= EFV(Ri) ∪ EFV(P) and −→x = x , . . . , xm where {xj|j = , . . . , m} = ∪n i= EFV(Ri) ∪ EFV(P). Then by Lemma3.4.5,{{−→x = −→z }Ri{S−→x =−→z ,Ri (−→x )}|i = , . . . , n} ⊢ {A}P{B}. ByLemma 3.4.6, ⊢ {−→x = −→z }Ri{S−→x =−→z ,Ri (−→x )} for i = , . . . , n. Then we have ⊢ {A}P{B}. 2 Apt’s system cannot be extended to separation logic, because his invariance axiom isinconsistentwithseparationlogic. Ontheotherhand,wecanextendoursystemtoa verificationsystemwithseparationlogicandrecursiveproceduresinastraightforward way.
4 Separation Logic for Recursive Procedures
4.1 Language

This section defines our programming language and our assertion language. Our programming language inherits from the pointer programs in Reynolds' paper [13]. Our assertion language is also the same as in [13], which is based on Peano arithmetic.

Base Language

We first define our base language, which will be used later both as a part of our programming language and as a part of our assertion language. It is essentially a first-order language for Peano arithmetic. We call its formulas pure formulas. We will use i, j, k, l, m, n for natural numbers. Our base language is defined as follows. We have variables x, y, z, w, . . . and constants 0, 1, null, denoted by c. The symbol null means the null pointer. We have function symbols + and ×. Our predicate symbols, denoted by p, are = and <. Terms, denoted by t, are defined by t ::= x | c | f(t, . . . , t). Our pure formulas, denoted by A, are defined by A ::= p(t, t) | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA, where p(t, t) is t = t or t < t. Here p(t1, t2) means that the predicate p holds for the terms t1, t2. The other formula constructions mean the usual logical connectives. We will sometimes write the number n to denote the term 1 + (1 + (1 + . . . (1 + 0))) (n times of 1 +).

Programming Language

Next we define our programming language, which is an extension of while programs to pointers and procedures. Its expressions are terms of the base language. That is, our expressions, denoted by e, are defined by e ::= x | 0 | 1 | null | e + e | e × e. Expressions mean natural numbers or pointers. Its boolean expressions, denoted by b, are quantifier-free pure formulas and defined
by b ::= e = e | e < e | ¬b | b ∧ b | b ∨ b | b → b. Boolean expressions are used as conditions in a program. We assume procedure names R1, . . . , Rnproc for some nproc. We will write R for these procedure names.
Definition 4.1.1 Programs, denoted by P, Q, are defined by
P ::= x := e (assignment)
| if (b) then (P) else (P) (conditional)
| while (b) do (P) (iteration)
| P; P (composition)
| skip (no operation)
| x := cons(e, e) (allocation)
| x := [e] (lookup)
| [e] := e (mutation)
| dispose(e) (deallocation)
| R (mutual recursive procedure name)
R means a procedure name without parameters. We write L for the set of programs. We write L− for the set of programs that do not contain procedure names.
The statement x := cons(e1, e2) allocates two new consecutive memory cells, puts the values of e1 and e2 in the respective cells, and assigns the first address to x. The statement x := [e] looks up the content of the memory cell at the address e and assigns it to x. The statement [e1] := e2 changes the content of the memory cell at the address e1 to e2. The statement dispose(e) deallocates the memory cell at the address e. The programs x := e, skip, x := cons(e1, e2), x := [e], [e1] := e2 and dispose(e) are called atomic programs. A program P is called pure if it does not contain any of x := cons(e1, e2), x := [e], [e1] := e2 or dispose(e). This means that a pure program never accesses memory cells directly. We call Procedure R(Q) a procedure declaration where R is a procedure name and
Q is a program. The program Q is said to be the body of R. This means that we define the procedure name R with its procedure body Q. We assume the procedure declarations {Procedure R1 (Q1), . . . , Procedure Rnproc (Qnproc)}, which give procedure definitions to all procedure names in the rest of the paper. We allow mutual recursive procedure calls.

Assertion Language and Asserted Programs

Our assertion language is a first-order language extended by the separating conjunction ∗ and the separating implication −∗ as well as emp and →. Its variables, constants, function symbols, and terms are the same as those of the base language. We have predicate symbols =, < and → and a predicate constant emp. Our assertion language is defined as follows.
Definition 4.1.2 Formulas A are defined by A ::= emp | e = e | e < e | e → e | ¬A | A ∧ A | A ∨ A | A → A | ∀xA | ∃xA | A ∗ A | A −∗ A. We will sometimes call a formula an assertion. We define FV(A) as the set of free variables in A. We define FV(e) similarly.
The symbol emp means that the current heap is empty. The formula e1 → e2 means that the current heap has only one cell at the address e1 and its content is e2. The formula A ∗ B means that the current heap can be split into two disjoint heaps such that the formula A holds at one heap and the formula B holds at the other heap. The formula A −∗ B means that for any heap disjoint from the current heap such that the formula A holds at that heap, the formula B holds at the new heap obtained by combining the current heap and that heap.
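As an informal illustration of these connectives, here is a minimal sketch (anticipating the finite-heap semantics given formally in Section 4.2) that evaluates emp, the points-to formula, =, ¬, ∧ and ∗ on a concrete store and heap; the separating conjunction is checked by enumerating all splits of the heap. The formula encoding and function names are illustrative assumptions, and −∗ is omitted because it quantifies over all heaps.

```python
from itertools import combinations

def splits(h):
    """All ways to split the finite heap h into two disjoint heaps h1 and h2 with h = h1 + h2."""
    addrs = list(h)
    for r in range(len(addrs) + 1):
        for dom1 in combinations(addrs, r):
            h1 = {a: h[a] for a in dom1}
            h2 = {a: h[a] for a in h if a not in dom1}
            yield h1, h2

def holds(A, s, h):
    """Truth value of an assertion A at the state (s, h); A is a nested tuple."""
    kind = A[0]
    if kind == "emp":
        return h == {}
    if kind == "pointsto":                    # ("pointsto", e1, e2): the points-to formula
        a, v = A[1](s), A[2](s)
        return set(h) == {a} and h[a] == v
    if kind == "eq":
        return A[1](s) == A[2](s)
    if kind == "not":
        return not holds(A[1], s, h)
    if kind == "and":
        return holds(A[1], s, h) and holds(A[2], s, h)
    if kind == "sep":                         # ("sep", A1, A2): A1 * A2
        return any(holds(A[1], s, h1) and holds(A[2], s, h2) for h1, h2 in splits(h))
    raise ValueError(kind)

# Example: with s(x) = 1 and heap {1: 5, 2: 7}, the assertion (x -> 5) * (x+1 -> 7) holds.
s = {"x": 1}
h = {1: 5, 2: 7}
A = ("sep",
     ("pointsto", lambda s: s["x"], lambda s: 5),
     ("pointsto", lambda s: s["x"] + 1, lambda s: 7))
assert holds(A, s, h)
```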
We use vector notation to denote a sequence. For example, −→e denotes the sequence e1, . . . , en of expressions.
Definition 4.1.3 The expression {A}P{B} is called an asserted program, where A, B are formulas and P is a program. This means the program P with its precondition A and its postcondition B.

Unfolding of Procedures

We define the set of procedure names which are visible in a program. It will be necessary later in defining the dependency relation between two procedures.
Definition 4.1.4 The set PN(P) of procedure names in P is defined as follows. PN(P) = ∅ if P is atomic, PN(if (b) then (P1) else (P2)) = PN(P1) ∪ PN(P2), PN(P1; P2) = PN(P1) ∪ PN(P2), PN(while (b) do (P)) = PN(P), PN(Ri) = {Ri}.
We define the dependency relation between two procedures. When a procedure name appears in the body of another procedure, we say the latter procedure depends on the former procedure at level 1. When one procedure depends on another and the latter one again depends on a third one, we say the first one also depends on the third one. In this case, the level of the third dependency is determined by summing up the levels of the first and second dependencies mentioned above.
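Definition 4.1.4 and the dependency relation just described can be read computationally. The following is a minimal sketch under the assumption that a program is a nested tuple and that procedure bodies are given in a list bodies indexed by procedure number; all names here are illustrative, not the dissertation's notation.

```python
def pn(P):
    """PN(P): the set of procedure names (here, indices) occurring in the program P."""
    kind = P[0]
    if kind == "call":                       # ("call", i) stands for R_i
        return {P[1]}
    if kind == "if":                         # ("if", b, P1, P2)
        return pn(P[2]) | pn(P[3])
    if kind == "while":                      # ("while", b, P1)
        return pn(P[2])
    if kind == "seq":                        # ("seq", P1, P2)
        return pn(P[1]) | pn(P[2])
    return set()                             # atomic programs contain no procedure names

def depends(bodies, i, k):
    """Procedures on which R_i depends within k levels: bounded reachability through bodies,
    intended to match the set PD(R_i, k) introduced in the following definition."""
    reached = {i}
    for _ in range(k):
        reached = reached | {j for r in reached for j in pn(bodies[r])}
    return reached

# Example: R_0 calls R_1, R_1 calls itself and R_2, and R_2 is procedure-free.
bodies = [("seq", ("skip",), ("call", 1)),
          ("if", lambda s: True, ("call", 1), ("call", 2)),
          ("skip",)]
assert depends(bodies, 0, 1) == {0, 1}
assert depends(bodies, 0, 2) == {0, 1, 2}
```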
  • 38. 30 Chapter 4. Separation Logic for Recursive Procedures Procedures dependency PD(Ri, k) of a procedure name Ri up to level k is defined by PD(Ri, k) = { Rj | Ri l ; Rj, l ≤ k }. This relation will be used to define EFV(P) and Mod(P) as well as the semantics of P. Note that (1) PD(Ri, k) ⊆ PD(Ri, k + ) for all k and (2) PD(Ri, k) ⊆ { Ri | i = , . . . , nproc } where nproc is the number of procedures in the declaration. The following first lemma will show that once procedures dependencies of a pro- cedure up to two consecutive levels are the same, it is the same up to any higher level too. The second claim states that nproc − is sufficient for the largest level. Lemma 4.1.6 (1) If PD(Ri, k) = PD(Ri, k + ) then PD(Ri, k) = PD(Ri, k + l) for all k, l ∈ N. (2) PD(Ri, k) ⊆ PD(Ri, nproc − ) for all k. Proof. (1) It is proved by induction on l. Case 1. l be . Its proof is immediate. Case 2. l be l′ + . Assume PD(Ri, k) = PD(Ri, k + ). We can show that if R′ i ∈ PD(Ri, k) then R′ i ∈ PD(Ri, k + l′ + ). Now we will show that if R′ i ∈ PD(Ri, k + l′ + ) then R′ i ∈ PD(Ri, k). Assume R′ i ∈ PD(Ri, k + l′ + ). Then we have Rj such that Rj ∈ PD(Ri, k + l′ ) and Rj ; R′ i. By induction hypothesis, PD(Ri, k) = PD(Ri, k + l′ ). Then Rj ∈ PD(Ri, k). Then by definition, R′ i ∈ PD(Ri, k + ). Then R′ i ∈ PD(Ri, k) by the assumption. Therefore, PD(Ri, k) = PD(Ri, k + l′ + ). (2) We will show that PD(Ri, m) = PD(Ri, m + ) for some m < nproc by contradiction. Assume for all m < nproc, PD(Ri, m) ̸= PD(Ri, m + ). Then PD(Ri, m) ⫋ PD(Ri, m + ). Then we have | PD(Ri, m) | ≥ m + and hence | PD(Ri, nproc) | ≥ nproc + . But PD(Ri, nproc) ⊆ { Ri | i = , . . . , nproc } and then | PD(Ri, nproc) | ≤ nproc. Itisacontradiction. Therefore, PD(Ri, m) = PD(Ri, m+ )
  • 39. 4.1 Language 31 for some m < nproc. By (1), we have now PD(Ri, nproc − ) = PD(Ri, nproc − + l) for all l. Therefore, PD(Ri, k) ⊆ PD(Ri, nproc − ) for all k. 2 We need to define some closed program that never terminates in order to define unfolding of a program for a specific number of times. First we will define Ω. Definition 4.1.7 We define Ω as while ( = ) do (skip) Substitution of a program for a procedure name is defined below. Definition 4.1.8 Let −→ P ′ = P′ , . . . , P′ nproc where P′ i is a program. P[ −→ P ′ ] is defined by induction on P as follows: P[ −→ P ′ ] = P if P is atomic, (if (b) then (P ) else (P ))[ −→ P ′ ] = (if (b) then (P [ −→ P ′ ]) else (P [ −→ P ′ ])), (while (b) do (P))[ −→ P ′ ] = (while (b) do (P[ −→ P ′ ])), (P ; P )[ −→ P ′ ] = (P [ −→ P ′ ]; P [ −→ P ′ ]), (Ri)[ −→ P ′ ] = P′ i. P[ −→ P ′ ]isaprogramobtainedfromPbyreplacingtheprocedurenamesR , . . . , Rnproc by P′ , . . . , P′ nproc respectively. UnfoldingtransformsaprograminlanguageLintoaprograminlanguageL− . Dis- cussions on the programs in language L can be reduced to those in L− , which are ei- ther easy or already shown elsewhere. P(k) denotes P where each procedure name is unfolded only k times. P( ) just replaces a procedure name by Ω, since a procedure name is not unfolded any more, which means the procedure name is supposed to be not executed. Here we present the unfolding of a program. Definition 4.1.9 Let Ωi = Ω for ≤ i ≤ nproc. We define P(k) for k ≥ as follows:
  • 40. 32 Chapter 4. Separation Logic for Recursive Procedures P( ) = P[Ω , . . . , Ωnproc ], P(k+ ) = P[Q(k) , . . . , Q(k) nproc ]. Sometimes we will call P(k) as the k-times unfolding of the program P. We present some basic properties of unfolded programs. Proposition 4.1.10 (1) P(k) = P[ −→ R(k) ]. (2) R ( ) i = Ω. (3) R (k+ ) i = Qi[ −→ R(k) ]. Proof. (1) By case analysis of k. Case 1. k = . P( ) = P[ −→ Ω] by definition. By (2), P[ −→ Ω] = P[ −→ R( ) ]. Therefore, P( ) = P[ −→ R( ) ]. Case 2. k = k′ + . P(k′+ ) = P[ −−→ Q(k′) ]. Since R (k′+ ) i = Ri[ −−→ Q(k′) ] = Q (k′) i , P[ −−→ Q(k′) ] = P[ −−−→ R(k′+ ) ]. There- fore, P(k′+ ) = P[ −−−→ R(k′+ ) ]. (2) By definition we have R ( ) i = Ri[ −→ Ω] = Ω. (3) By definition we have R (k+ ) i = Ri[ −−→ Q(k) ] = Q (k) i . By (1), Q (k) i = Qi[ −→ R(k) ]. Therefore, R (k+ ) i = Qi[ −→ R(k) ]. 2 The nexttwodefinitionswilldefinethesetofthefreevariables(FV(P))andtheset of the variables that can be modified (Mod (P)). Generally speaking, the left variable of the symbol := in a program is a modifiable variable. First we will define above mentioned two sets for a program in its syntactic structure. Next, it will be used to definethesetoffreevariables(extendedfreevariables, EFV)andthesetofmodifiable variables (Mod(P)) that may appear in the execution of the program. Since ‘a free variable’ has an ordinary meaning without procedure calls, we will use ‘an extended free variable’ for that with procedure calls.
Definition 4.1.11 We define FV(P) for P in L− and EFV(P) for P in L as follows:
FV(x := e) = {x} ∪ FV(e),
FV(if (b) then (P1) else (P2)) = FV(b) ∪ FV(P1) ∪ FV(P2),
FV(while (b) do (P)) = FV(b) ∪ FV(P),
FV(P1; P2) = FV(P1) ∪ FV(P2),
FV(skip) = ∅,
FV(x := cons(e1, e2)) = {x} ∪ FV(e1) ∪ FV(e2),
FV(x := [e]) = {x} ∪ FV(e),
FV([e1] := e2) = FV(e1) ∪ FV(e2),
FV(dispose(e)) = FV(e),
EFV(P) = FV(P(nproc)).
The expression FV(P) is the set of variables that occur in P. EFV(P) is the set of variables that may be used in the execution of P. The expression FV(O1, . . . , Om) is defined as FV(O1) ∪ . . . ∪ FV(Om) when Oi is a formula, an expression, or a program.
Definition 4.1.12 We define Mod0(P) for P in L− and Mod(P) for P in L as follows:
Mod0(x := e) = {x},
Mod0(if (b) then (P1) else (P2)) = Mod0(P1) ∪ Mod0(P2),
Mod0(while (b) do (P)) = Mod0(P),
Mod0(P1; P2) = Mod0(P1) ∪ Mod0(P2),
Mod0(skip) = ∅,
Mod0(x := cons(e1, e2)) = {x},
Mod0(x := [e]) = {x},
Mod0([e1] := e2) = ∅,
Mod0(dispose(e)) = ∅,
Mod(P) = Mod0(P(nproc)).
The expression Mod(P) is the set of variables that may be modified by P.
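The following sketch gives an executable reading of Definitions 4.1.8, 4.1.9 and 4.1.11 under simplifying assumptions: a program is a nested tuple, an expression or condition is represented only by its set of free variables, and procedure bodies are given in a list bodies; EFV is then obtained by unfolding nproc times. All names are illustrative and not the dissertation's notation.

```python
# Programs: ("assign", x, fv_e), ("lookup", x, fv_e), ("mutate", fv_e1, fv_e2),
# ("cons", x, fv_e1, fv_e2), ("dispose", fv_e), ("skip",), ("seq", P1, P2),
# ("if", fv_b, P1, P2), ("while", fv_b, P1), ("call", i).
# An expression or boolean condition is represented only by its frozenset of free variables.

def subst(P, progs):
    """P[progs]: replace every procedure name R_i by progs[i] (Definition 4.1.8)."""
    kind = P[0]
    if kind == "call":
        return progs[P[1]]
    if kind == "seq":
        return ("seq", subst(P[1], progs), subst(P[2], progs))
    if kind == "if":
        return ("if", P[1], subst(P[2], progs), subst(P[3], progs))
    if kind == "while":
        return ("while", P[1], subst(P[2], progs))
    return P                                   # atomic programs are unchanged

def unfold(P, bodies, k):
    """P^(k): unfold each procedure name k times; zero unfoldings give Omega (Definition 4.1.9)."""
    omega = ("while", frozenset(), ("skip",))  # stands in for Omega; only its free variables matter here
    layer = [omega] * len(bodies)
    for _ in range(k):
        layer = [subst(Q, layer) for Q in bodies]
    return subst(P, layer)

def fv0(P):
    """FV(P) for procedure-free programs, following Definition 4.1.11."""
    kind = P[0]
    if kind in ("assign", "lookup"):
        return {P[1]} | set(P[2])
    if kind == "cons":
        return {P[1]} | set(P[2]) | set(P[3])
    if kind == "mutate":
        return set(P[1]) | set(P[2])
    if kind == "dispose":
        return set(P[1])
    if kind == "seq":
        return fv0(P[1]) | fv0(P[2])
    if kind in ("if", "while"):
        return set(P[1]) | set().union(*(fv0(q) for q in P[2:]))
    return set()                               # skip

def efv(P, bodies):
    """EFV(P) = FV(P^(nproc))."""
    return fv0(unfold(P, bodies, len(bodies)))

# Example: R_0's body is x := y; R_1, and R_1's body is [z] := w.
bodies = [("seq", ("assign", "x", frozenset({"y"})), ("call", 1)),
          ("mutate", frozenset({"z"}), frozenset({"w"}))]
assert efv(("call", 0), bodies) == {"x", "y", "z", "w"}
```

Mod and Mod0 of Definition 4.1.12 would be computed in exactly the same recursive style, collecting only the left-hand variables of := in assignment, allocation and lookup.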
  • 42. 34 Chapter 4. Separation Logic for Recursive Procedures Lemma 4.1.13 (1) FV(R (k+ ) i ) = ∪ Rj∈PD(Ri,k) FV(Qj). (2) Mod (R (k+ ) i ) = ∪ Rj∈PD(Ri,k) Mod (Qj). Proof. (1) It is proved by induction on k. Case 1. k = . By definition, FV(R ( ) i ) = FV(Ri[ −−→ Q( ) ]) = FV(Q ( ) i ) = FV(Qi[ −→ Ω]) = FV(Qi). Since PD(Ri, ) = {Ri}, we have FV(Qi) = ∪ Rj∈PD(Ri, ) FV(Qj). Case 2. k to be k′ + . By Proposition 4.1.10 (3), FV(R (k′+ ) i ) = FV(Qi[ −−−→ R(k′+ ) ]). Then we have FV(R (k′+ ) i ) = FV(Qi) ∪ ∪ Ri;Rj FV(R (k′+ ) j ). By induction hypothesis, we have FV(R (k′+ ) i ) = FV(Qi) ∪ ∪ Ri;Rj ∪ Rm∈PD(Rj,k′) FV(Qm). Then we have FV(R (k′+ ) i ) = FV(Qi) ∪ ∪ Rm∈PD(Ri,k′+ ) FV(Qm). Therefore, FV(R (k′+ ) i ) = ∪ Rj∈PD(Ri,k′+ )(FV(Qj)). (2) Its proof is similar to (1). 2 Proposition 4.1.14 (1) FV(P(k) ) ⊆ EFV(P) for all k. (2) Mod (P(k) ) ⊆ Mod(P) for all k. Proof. (1) Fix k. If k = , the claim trivially holds. Assume k > . By Lemma 4.1.6 (2), we have PD(Ri, k − ) ⊆ PD(Ri, nproc − ). Then ∪ Rj∈PD(Ri,k− ) FV(Qj) ⊆ ∪ Rj∈PD(Ri,nproc− ) FV(Qj). Then by Proposition 4.1.13 (1), FV(R (k) i ) ⊆ FV(R (nproc) i ). Then we have FV(P) ∪ ∪ Ri∈PN(P) FV(R (k) i ) ⊆ FV(P) ∪ ∪ Ri∈PN(P) FV(R (nproc) i ). Then byProposition4.1.10(1)wehaveFV(P(k) ) ⊆ FV(P(nproc) ). Bydefinition,FV(P(k) ) ⊆ EFV(P). (2) Its proof is similar to (1). 2
4.2 Semantics

The semantics of our programming language and our assertion language is defined in this section. Our semantics is based on the same structure as that in Reynolds' paper [13] except for the following simplifications: (1) values are natural numbers, (2) addresses are non-zero natural numbers, and (3) null is 0. The set N is defined as the set of natural numbers. The set Vars is defined as the set of variables in the base language. The set Locs is defined as the set {n ∈ N | n > 0}. For sets S1, S2, f : S1 → S2 means that f is a function from S1 to S2. f : S1 →fin S2 means that f is a finite function from S1 to S2, that is, there is a finite subset S′1 of S1 and f : S′1 → S2. Dom(f) denotes the domain of the function f. The expression p(S) denotes the power set of the set S. For a function f : A → B and a subset C ⊆ A, the function f|C : C → B is defined by f|C(x) = f(x) for x ∈ C.
A store is defined as a function from Vars to N, and denoted by s. A heap is defined as a finite function from Locs to N (Locs →fin N), and denoted by h. A value is a natural number. An address is a positive natural number. The null pointer is 0. A store assigns a value to each variable. A heap assigns a value to each address in its finite domain. The store s[x1 := n1, . . . , xk := nk] is defined as the store s′ such that s′(xi) = ni and s′(y) = s(y) for y ̸∈ {x1, . . . , xk}. The heap h[m1 := n1, . . . , mk := nk] is defined as the heap h′ such that h′(mi) = ni and h′(y) = h(y) for y ∈ Dom(h) − {m1, . . . , mk}. The store s[x1 := n1, . . . , xk := nk] is the same as s except for the values of the variables x1, . . . , xk. The heap h[m1 := n1, . . . , mk := nk] is the same as h except for the contents of the memory cells at the addresses m1, . . . , mk. We will write h = h1 + h2 when Dom(h) = Dom(h1) ∪ Dom(h2), Dom(h1) ∩ Dom(h2) = ∅, h(x) = h1(x) for x ∈ Dom(h1), and h(x) = h2(x) for x ∈ Dom(h2). The heap h is divided into the two disjoint heaps h1 and h2 when h = h1 + h2. A state is defined as (s, h). The set States is defined as the set of states. The state for a pointer program is specified by the store and the heap, since pointer programs manipulate memory heaps as well as variable assignments.
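A compact way to read these definitions is to model a store as a map from variables to numbers and a heap as a finite dict on positive addresses. The following sketch, with illustrative names only, shows the update notation and the disjoint combination h = h1 + h2.

```python
def update(f, changes):
    """s[x1 := n1, ..., xk := nk] or h[m1 := n1, ..., mk := nk]: pointwise update."""
    g = dict(f)
    g.update(changes)
    return g

def combine(h1, h2):
    """h1 + h2 when Dom(h1) and Dom(h2) are disjoint; otherwise undefined (None here)."""
    if set(h1) & set(h2):
        return None
    return {**h1, **h2}

# A store: variables not mentioned are understood to hold 0; a heap: finite, positive addresses.
s = {"x": 3, "y": 7}
h1 = {1: 10}
h2 = {2: 20, 3: 30}

assert update(s, {"x": 5}) == {"x": 5, "y": 7}        # s[x := 5]
h = combine(h1, h2)                                    # h = h1 + h2
assert h == {1: 10, 2: 20, 3: 30}
assert combine(h, {1: 99}) is None                     # overlapping domains cannot be combined
```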
  • 44. 36 Chapter 4. Separation Logic for Recursive Procedures Definition 4.2.1 We define the semantics of our base language by the standard model of natural numbers and null = . That is, we suppose = , = , + = +, × = ×, = = (=), and < = (<). For a store s, an expression e, and a pure formula A, according to the interpretation of a first-order language, the meaning e s is defined as a natural number and the meaning A s is defined as True or False. The expression e s and A s are the value of e under the store s, and the truth value of A under the store s, respectively. Semantics of Programs The relation ⊆ over the functions of type States∪{abort} → p(States∪{abort}) is necessary to define the semantics for while (b) do (P). Definition 4.2.2 We define ⊆ for functions F, G: States ∪ {abort} → p(States ∪ {abort}). F ⊆ G is defined to hold if ∀r ∈ States (F(r) ⊆ G(r)). Definition 4.2.3 We define the semantics of our programming language. For a program P, its meaning P is defined as a function from States ∪ {abort} to p(States ∪ {abort}). We will define P (r ) as the set of all the possible resulting states after the execution of P terminates with the initial state r . In particular, if the execution of P with the initial state r does not terminate, we will define P (r ) as the empty set ∅. The set P ({r , . . . , rm}) is defined as ∪m i= P (ri). In order to define P we would like to define P − for all P in the
  • 45. 4.2 Semantics 37 language L− . We define P − by induction on P in L− as follows: P − (abort) = {abort}, x := e − ((s, h)) = {(s[x := e s], h)}, if (b) then (P ) else (P ) − ((s, h)) = { P ((s, h)) if b s = True, P ((s, h)) otherwise, while (b) do (P) − is the least function satisfying while (b) do (P) − (abort) = {abort}, while (b) do (P) − ((s, h)) = {(s, h)} if b s = False, while (b) do (P) − ((s, h)) = ∪ { while (b) do (P) − (r) | r ∈ P − ((s, h)) } otherwise, P ; P − ((s, h)) = ∪ { P − (r) | r ∈ P − ((s, h)) }, skip − ((s, h)) = {(s, h)}, x := cons(e , e ) − ((s, h)) = {(s[x := n], h[n := e s, n + := e s]) | n > , n, n + ̸∈ Dom(h)}, x := [e] − ((s, h)) = { {(s[x := h( e s)], h)} {abort} if e s ∈ Dom(h), otherwise, [e ] := e − ((s, h)) = { {(s, h[ e s := e s])} {abort} if e s ∈ Dom(h), otherwise, dispose(e) − ((s, h)) = { {(s, h|Dom(h)−{ e s})} {abort} if e s ∈ Dom(h), otherwise. We present two propositions which together state that the semantics of a while (b) do (P) can be constructed by the semantics of P. Proposition 4.2.4 r ∈ while (b) do (P) − ((s, h)) if and only if there exist m ≥ , r , . . . , rm such that r = (s, h), rm = r, b ri = True and ri+ ∈ P − (ri) for ≤ i < m, and one of the following holds: (1) r ̸= abort and b r = False, (2) r = abort,
  • 46. 38 Chapter 4. Separation Logic for Recursive Procedures where we write b (s′,h′) for b s′ . Proof. First we will show the only-if-part. Let F((s, h)) = { r | m ≥ , r = (s, h), rm = r, ri = (si, hi), b si = True, ri+ ∈ P − (ri) ( ≤ ∀i < m), (r = (sm, hm) ∧ b sm = False) ∨ r = abort } and F(abort) = {abort}. We will show that F satisfies the inequations obtained from the equations for while (b) do (P) − by replacing = by ⊇. That is, we will show F(abort) ⊇ {abort}, F((s, h)) ⊇ {(s, h)} if b s = False, F((s, h)) ⊇ ∪ { F(r) | r ∈ P − ((s, h)) } if b s = True. By the well-known least fixed point theorem [17], these inequations imply our only-if-part. The first inequation immediately holds by the definition of F. For the second in- equation, since b s = False, by taking m to be , we have (s, h) ∈ F((s, h)). We will show the third inequation. Assume b s = True and r is in the right-hand side. We will show r ∈ F((s, h)). We have q ∈ P − ((s, h)) and r ∈ F(q). Case 1. q = (s′′ , h′′ ). By the definition of r ∈ F(q), we have m ≥ , r = (s′′ , h′′ ), rm = r, ri = (si, hi), b si = True, ri+ ∈ P − (ri) ( ≤ ∀i < m), and one of the following holds: either r = (sm, hm) and b sm = False, or r = abort. Let m′ = m + , r′ = (s, h), and r′ i = ri− for < i ≤ m′ . By taking m and ri to be m′ and r′ i respectively in the definition of F, we have r ∈ F((s, h)). Case 2. q = abort. By the definition of F we have r = abort. By taking m = , we have abort ∈ F((s, h)).
  • 47. 4.2 Semantics 39 Next we will show the if-part by induction on m. We assume the right-hand side. We will show r ∈ while (b) do (P) − ((s, h)). If m = then we have b s = False and r = (s, h). Hence r ∈ while (b) do (P) − ((s, h)). Suppose m > . We have m and r , . . . , rm satisfying the conditions (1) or (2). If r = abort, we have m = and abort ∈ P − ((s, h)). Hence r ∈ while (b) do (P) − ((s, h)) in this case. Suppose r ̸= abort. By induction hy- pothesis for m − , we have r ∈ while (b) do (P) − (r ). Since b s = True and r ∈ P − ((s, h)), by the definition we have r ∈ while (b) do (P) − ((s, h)). 2 The following proposition characterizes the two properties of the semantics of Ω. Proposition 4.2.5 (1) Ω − (abort) = {abort}. (2) Ω − ((s, h)) = ∅. Proof. (1) By definition, Ω − (abort) = {abort}. (2) By definition, Ω − ((s, h)) = while ( = ) do (skip) − ((s, h)). Since skip − ((s′ , h′ )) ̸∋ abort for all s′ , h′ , by Proposition 4.2.4, abort ̸∈ while ( = ) do (skip) − ((s, h)). Since = s′ = True for all s′ , by Proposition 4.2.4 we have (s′ , h′ ) ̸∈ while ( = ) do (skip) − ((s, h)) for all s′ , h′ . Therefore, while ( = ) do (skip) − ((s, h)) = ∅. 2 Lemma 4.2.6 If P′ i, P′′ i ∈ L− ( ≤ i ≤ nproc) and P′ i − ⊆ P′′ i − for all i then P[ −→ P′ ] − ⊆ P[ −→ P′′ ] − where P ∈ L and P[ −→ P′ ] ∈ L− . Proof. (1) By induction on P. Assume P′ i − ⊆ P′′ i − for all i. Case 1. P is atomic. Since P[ −→ P′ ]=P=P[ −→ P′′ ], the claim holds. Case 2. P is if (b) then (P ) else (P ).
  • 48. 40 Chapter 4. Separation Logic for Recursive Procedures We will show that P[ −→ P′ ] − (r) ⊆ P[ −→ P′′ ] − (r). Case 2.1. r is abort. By definition, P[ −→ P′ ] − (abort) = {abort} = P[ −→ P′′ ] − (abort). Hence P[ −→ P′ ] − ⊆ P[ −→ P′′ ] − . Case 2.2. r is (s, h). Let r to be (s, h). Suppose b s = True. Then by definition, P[ −→ P′ ] − (r) = P [ −→ P′ ] − (r) and P[ −→ P′′ ] − (r) = P [ −→ P′′ ] − (r). By induction hypothesis, P [ −→ P′ ] − ⊆ P [ −→ P′′ ] − . Therefore, P[ −→ P′ ] − ⊆ P[ −→ P′′ ] − . In the case b s = False, it can be proved similarly. Case 3. P is while (b) do (P ). We will show that P[ −→ P′ ] − (r) ⊆ P[ −→ P′′ ] − (r). Case 3.1. r is abort. The case is similar to 2.1. Case 3.2. r to be (s, h). Case b s = False. By definition, while (b) do (P [ −→ P′ ]) − ((s, h)) = {(s, h)} = while (b) do (P [ −→ P′′ ]) − ((s, h)). Hence P[ −→ P′ ] − ⊆ P[ −→ P′′ ] − . Case b s = True. Assume while (b) do (P [ −→ P′ ]) − ((s, h)) ∋ r′ . By Proposi- tion 4.2.4, we have m ≥ , r , . . . , rm such that r = (s, h), ri+ ∈ P [ −→ P′ ] − (ri) and b ri = True for ≤ i < m such that either r′ ̸= abort and b r′ = False, or r′ = abort. By induction hypothesis, ri+ ∈ P [ −→ P′′ ] − (ri). Then while (b) do (P [ −→ P′′ ]) − (r) ∋ r′ . Then while (b) do (P [ −→ P′ ]) − (r) ⊆ while (b) do (P [ −→ P′′ ]) − (r). Therefore, P[ −→ P′ ] − ⊆ P[ −→ P′′ ] − . Case 4. P is P ; P . By induction hypothesis, P [ −→ P′ ] − ⊆ P [ −→ P′′ ] − and P [ −→ P′ ] − ⊆ P [ −→ P′′ ] − . Therefore, P[ −→ P′ ] − ⊆ P[ −→ P′′ ] − .
  • 49. 4.2 Semantics 41 Case 5. Ri. We have Ri[ −→ P′ ] = P′ i and Ri[ −→ P′′ ] = P′′ i . Hence Ri[ −→ P′ ] − ⊆ Ri[ −→ P′′ ] − . 2 We have already defined the semantics of P in the language L− . We define the semantics of P in L. The semantics we will define is usually called an approximating semantics. Definition 4.2.7 The semantics of P in L is defined by P (r) = ∪∞ i= ( P(i) − (r)). Note that Q (k) i is a program of the language L− and doesn’t contain Ri. Remark that for P in L− , we have P = P − since P(k) = P. Lemma 4.2.8 For all k, P(k) − ⊆ P(k+ ) − . Proof. By induction on k. Case 1. k = . We have Ω − (r) = ∅ for all r by Proposition 4.2.5 (2). Then for all i, we have Ω − ⊆ Q ( ) i − . Then by Lemma 4.2.6, P[ −→ Ω] ⊆ P[ −−→ Q( ) ] . Then P( ) − ⊆ P( ) − . Case 2. k > . Let k be k′ + . We have Q(k) = Q(k′+ ) = Q[ −−→ Q(k′) ] and Q(k+ ) = Q(k′+ ) = Q[ −−−→ Q(k′+ ) ]. By induction hypothesis, Q(k′) − ⊆ Q(k′+ ) − . By Lemma 4.2.6, we now have P[ −−→ Q(k′) ] − ⊆ P[ −−−→ Q(k′+ ) ] − . Then P(k) − ⊆ P(k+ ) − . 2 Semantics of Assertions Definition 4.2.9 We define the semantics of the assertion language. For an assertion A and a state (s, h), the meaning A (s,h) is defined as True or False. A (s,h) is the truth value
  • 50. 42 Chapter 4. Separation Logic for Recursive Procedures of A at the state (s, h). A (s,h) is defined by induction on A as follows: emp (s,h) = True if Dom(h) = ∅, e = e (s,h) = ( e s = e s), e < e (s,h) = ( e s < e s), e → e (s,h) = True if Dom(h) = { e s} and h( e s) = e s, ¬A (s,h) = (not A (s,h)), A ∧ B (s,h) = ( A (s,h) and B (s,h)), A ∨ B (s,h) = ( A (s,h) or B (s,h)), A → B (s,h) = ( A (s,h) implies B (s,h)), ∀xA (s,h) = True if A (s[x:=m],h) = True for all m ∈ N, ∃xA (s,h) = True if A (s[x:=m],h) = True for some m ∈ N, A ∗ B (s,h) = True if h = h + h and A (s,h ) = B (s,h ) = True for some h , h , A −∗ B (s,h) = True if h = h + h and A (s,h ) = True implies B (s,h ) = True for all h , h . We say A is true when A (s,h) = True for all (s, h). Semantics of Asserted Program Finally we need to define the semantics of the asserted programs. It is basically the same as in [13]. Definition 4.2.10 For an asserted program {A}P{B}, the meaning of {A}P{B} is de- fined as True or False. {A}P{B} is defined to be True if both of the following hold. (1) for all (s, h), if A (s,h) = True, then P ((s, h)) ̸∋ abort. (2) for all (s, h) and (s′ , h′ ), if A (s,h) = True and P ((s, h)) ∋ (s′ , h′ ), then B (s′,h′) = True. Remark that semantics of an asserted program with a non-terminating program is always True. Because, according to the definition of semantics, a resulting abort state of a program implies termination of its execution.
  • 51. 4.2 Semantics 43 In our system, judgments are in the form Γ ⊢ {A}P{B} where Γ is a set of asserted programs. Here we will define semantics of a judgment. We say Γ is true if all {A}P{B} in Γ are true. Definition 4.2.11 We say Γ ⊢ {A}P{B} is true when the following holds: {A}P{B} is true if {Ai}Pi{Bi} is true for all {Ai}Pi{Bi} ∈ Γ. The asserted program {A}P{B} means abort-free partial correctness and also im- plies partial correctness in the standard sense. Namely, it means that the execution of the program P with all the initial states which satisfy A never aborts, that is, P does not access any of the unallocated addresses during the execution. It is one of the strongest feature of the system. The next lemma shows that the semantics of a procedure call (or procedure name) is the same as that of its body. Lemma 4.2.12 Ri = Qi . Proof. By definition we have Qi (r) = ∪∞ k= ( Q (k) i − (r)). Since Q (k) i = Ri[ −−→ Q(k) ] = R (k+ ) i , we have ∪∞ k= ( Q (k) i − (r)) = ∪∞ k= ( R (k+ ) i − (r)). Then by Proposition 4.2.5 (2), Qi (r) = ∪∞ k= ( R (k+ ) i − (r)) ∪ Ω − (r). Then Qi (r) = ∪∞ k= ( R (k+ ) i − (r)) ∪ R ( ) i − (r) by Proposition 4.1.10(2). Then Qi (r) = ∪∞ k= ( R (k) i − (r)). Therefore, Qi = Ri . 2 We define the equality of two stores over a set of variables. Suppose some stores are equal over the set of free variables of a program. If we execute the program with those stores in the states, then the stores in the resulting states are still equal over the same set of free variables. Definition 4.2.13 For a set V of variables, we define =V as follows: s =V s′ if and only if s(x) = s′ (x) for all x ∈ V. Definition 4.2.14 We first define [s , s , V] to denote the store s such that s =V s and s =Vc s for stores s , s and a set of variables V. Lemma 4.2.15 Suppose P ∈ L− .
  • 52. 44 Chapter 4. Separation Logic for Recursive Procedures (1) If P − ((s, h)) ∋ (s , h ) then s =Mod (P)c s. (2) If s =FV(P) s′ , P − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(P)] then P − ((s′ , h)) ∋ (s′ , h ). (3) If s =FV(P) s′ and P − ((s, h)) ∋ abort, then P − ((s′ , h)) ∋ abort. Proof. (1) We will show the claim by induction on P. We consider cases according to P. Case 1. x := e. Assume x := e − ((s, h)) ∋ (s , h ). By definition, s = s[x := e s] and h = h . Then for all y ̸= x, s(y) = s (y). By definition, Mod (x := e) = {x}. Therefore, s =Mod (P)c s. Case 2. P is if (b) then (P ) else (P ). Assume P − ((s, h)) ∋ (s , h ). Case b s = True. By definition, we have P − ((s, h)) ∋ (s , h ). By induction hypothesis, s =Mod (P )c s . WehaveMod (P)c ⊆ Mod (P )c . Therefore, s =Mod (P)c s . Case b s = False can be shown as above. Case 3. P is while (b) do (P ). Assume P − ((s, h)) ∋ (s , h ). Case b s = True. By Proposition 4.2.4, we have s , . . . , sm, h , . . . , hm such that (s, h) = (s , h ), (s , h ) = (sm, hm), for all i = , . . . , m − , P − ((si, hi)) ∋ (si+ , hi+ ), b si = True and b sm = False. By induction hypothesis, for all i = , . . . , m− , si =Mod (P )c si+ . Since Mod (P)c ⊆ Mod (P )c , for all i = , . . . , m− , si =Mod (P)c si+ . Therefore, s =Mod (P)c s . Case b s = False. Then by definition, s = s . Then s =Mod (P)c s . Case 4. P ; P .
  • 53. 4.2 Semantics 45 Assume P ; P − ((s, h)) ∋ (s , h ). By definition, we have s , h such that P − ((s, h)) ∋ (s , h ) and P − ((s , h )) ∋ (s , h ). By induction hypothesis, s =Mod (P )c s and s =Mod (P )c s . Since Mod (P)c ⊆ Mod (P )c and Mod (P)c ⊆ Mod (P )c , we have s =Mod (P)c s and s =Mod (P)c s . Therefore, s =Mod (P)c s. Case 5. skip. Its proof is immediate. Case 6. x := cons(e , e ). Assume x := cons(e , e ) − ((s, h)) ∋ (s , h ). By definition, (s , h ) is in { (s[x := m], h[m := e s, m + := e s]) | m > , m, m + ̸∈ Dom(h) }. So, for all y ̸= x, s(y) = s (y). By definition, Mod (x := cons(e , e )) = {x}. Therefore, s =Mod (x:=cons(e ,e ))c s . Case 7. x := [e]. Assume x := [e] − ((s, h)) ∋ (s , h ). By definition, {(s[x := h( e s)], h)}. So, for all y ̸= x, s(y) = s (y). By definition, Mod (x := [e]) = {x}. Therefore, s =Mod (x:=[e])c s . Case 8. [e ] := e . Assume [e ] := e − ((s, h)) ∋ (s , h ). By definition, s = s . Therefore, s =Mod ([e ]:=e )c s . Case 9. dispose(e). Assume dispose(e) − ((s, h)) ∋ (s , h ). By definition, s = s . Therefore, s =Mod (dispose(e))c s . (2) We will show the claim by induction on P. We consider cases according to P. Case 1. P is x := e. Assume s =FV(x:=e) s′ , x := e − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(x := e)]. Then by definition, s = s[x := e s] and s′ =FV(x:=e) s . Then s′ (x) = s (x) = e s =
  • 54. 46 Chapter 4. Separation Logic for Recursive Procedures e s′ . We have s′ =FV(x:=e)c s′ and for all y ∈ FV(e) and y ̸= x, s′ (y) = s (y) = s(y) = s′ (y). Then we have s′ = s′ [x := e s′ ]. Therefore, x := e − ((s′ , h)) ∋ (s′ , h ). Case 2. P is if (b) then (P ) else (P ). Assume s =FV(if (b) then (P ) else (P )) s′ , if (b) then (P ) else (P ) − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(if (b) then (P ) else (P ))]. Let b s = True. Then by definition P − ((s, h)) ∋ (s , h ). By (1), s =Mod (P )c s and since FV(P )c ⊆ Mod (P )c , we have s =FV(P )c s . We also have s =FV(P) s′ by assumption and s′ =FV(P) s by definition and hence s′ =FV(P ) s and s′ =FV(P ) s because FV(P ) ⊆ FV(P). Then for all y ∈ FV(P) − FV(P ), s′ (y) = s(y) = s (y) = s′ (y). Since s′ =FV(P)−FV(P ) s′ and s′ =FV(P)c s′ are true, we have that s′ =FV(P )c s′ holds. Then s′ = [s , s′ , FV(P )]. By induction hypothesis, P − ((s′ , h)) ∋ (s′ , h ). Then by definition P − ((s′ , h)) ∋ (s′ , h ). Again let b s = False. In the same way as above we can prove that P − ((s′ , h)) ∋ (s′ , h ). Case 3. P is while (b) do (P ). Assume s =FV(while (b) do (P )) s′ , while (b) do (P ) − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(while (b) do (P ))]. We have (s, h) = (q , h′ ), …, (qm, h′ m) = (s , h ) such that P − ((qi, h′ i)) ∋ (qi+ , h′ i+ ) and b qi = True for all ≤ i < m and b qm = False by Proposition 4.2.4. Let q′ i = [qi, s′ , FV(P )] for all ≤ i ≤ m. We will show that s′ = q′ and s′ = q′ m. We have q′ = [s, s′ , FV(P )]. Then q′ =FV(P )c s′ . Since s =FV(P ) s′ and q′ =FV(P ) s, we have q′ =FV(P ) s′ . Therefore, s′ = q′ . We have q′ m = [s , s′ , FV(P )]. We also have s′ = [s , s′ , FV(P)]. Then q′ m =FV(P ) s′ . By (1), s =Mod (P)c s . Then s =FV(P )c s since Mod (P)c = Mod (P )c and Mod (P )c ⊇ FV(P )c . We have q′ m =FV(P)c s′ since FV(P)c ⊆ FV(P )c and q′ m =FV(P )c s′ . Then q′ m =FV(P)c s′ . Since q′ m =FV(P )c s′ , s′ =FV(P) s and s =FV(P) s′ , for all y ∈ FV(P) − FV(P ), q′ m(y) = s′ (y) = s(y) = s (y) = s′ (y). Therefore, s′ = q′ m. Hence s′ = q′ , s′ = q′ m. Since q′ i =FV(P )c s′ by definition, q′ i+ = [qi+ , q′ i, FV(P )] for ≤ i < m. By induction hypothesis, P − ((q′ i, hi)) ∋ (q′ i+ , hi+ ). We also have b q′ i =
  • 55. 4.2 Semantics 47 True for ≤ i < m and b q′ m = False. Hence by Proposition 4.2.4, while (b) do (P ) − ((s′ , h)) ∋ (s′ , h ). Case 4. P is P ; P . Assume s =FV(P ;P ) s′ , P ; P − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(P ; P )]. By definition, we know that P − ((s, h)) ∋ (s , h ) and P − ((s , h )) ∋ (s , h ) for some s , h . By (1), s =Mod (P )c s and s =Mod (P )c s and then s =FV(P )c s and s =FV(P )c s . Let take s′ such that s′ = [s , s′ , FV(P )]. Then s′ =FV(P ) s and s′ =FV(P )c s′ . Then for all y ∈ FV(P ) ∩ FV(P ), s′ (y) = s (y). For all y ∈ FV(P ) − FV(P ), s′ (y) = s′ (y) = s(y) = s (y). So, s′ =FV(P ) s . Since s′ =FV(P ;P ) s we have s′ =FV(P ) s . For all y ∈ FV(P ) − FV(P ), s′ (y) = s (y) by s′ = [s , s′ , FV(P ; P )], s (y) = s (y) by s =FV(P )c s , s (y) = s′ (y) by s′ =FV(P ) s and hence s′ (y) = s′ (y). For all y ̸∈ FV(P ; P ), s′ (y) = s′ (y) by s′ = [s , s′ , FV(P ; P )] and s′ (y) = s′ (y) by s′ = [s , s′ , FV(P )] and hence s′ (y) = s′ (y). Then we have s′ =FV(P )c s′ . Then s′ = [s , s′ , FV(P )]. Hence, by induction hypothesis P − ((s′ , h)) ∋ (s′ , h ) and P − ((s′ , h )) ∋ (s′ , h ). Therefore, by definition P ; P − ((s′ , h)) ∋ (s′ , h ). Case 5. P is skip. Its proof is immediate. Case 6. P is x := cons(e , e ). Assume s =FV(x:=cons(e ,e )) s′ , x := cons(e , e ) − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(x := cons(e , e ))]. Then by definition, s = s[x := m] for some m > where m, m + ̸∈ Dom(h) and s′ =FV(x:=cons(e ,e )) s . Then s′ (x) = s (x) = m. We have s′ =FV(x:=cons(e ,e ))c s′ . For all y ∈ FV(x := cons(e , e )) − {x}, s′ (y) = s (y) = s(y) = s′ (y). Then we have s′ = s′ [x := m]. Therefore, by defi- nition x := cons(e , e ) − ((s′ , h)) ∋ (s′ , h ). Case 7. P is x := [e]. Assume s =FV(x:=[e]) s′ , x := [e] − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(x := [e])]. Then by definition, s = s[x := h( e s)] and s′ =FV(x:=[e]) s . Then s′ (x) =
  • 56. 48 Chapter 4. Separation Logic for Recursive Procedures s (x) = h( e s) = h( e s′ ). We have s′ =FV(x:=[e])c s′ and hence for all y ̸= x and y ∈ FV(x := [e]), s′ (y) = s (y) = s(y) = s′ (y). Then we have s′ = s′ [x := h( e s′ )]. Therefore, x := [e] − ((s′ , h)) ∋ (s′ , h ). Case 8. P is [e ] := e . Assume s =FV([e ]:=e ) s′ , [e ] := e − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV([e ] := e )]. Then by definition, s = s and s′ =FV([e ]:=e ) s . For all y ∈ FV([e ] := e ), we have s′ (y) = s (y) = s(y) = s′ (y). Then we have s′ = s′ . Since h = h[ e s := e s], we have h = h[ e s′ := e s′ ]. Therefore, [e ] := e − ((s′ , h)) ∋ (s′ , h ). Case 9. P is dispose(e). Assume s =FV(dispose(e)) s′ , dispose(e) − ((s, h)) ∋ (s , h ) and s′ = [s , s′ , FV(dispose(e))]. Then by definition, s = s and s′ =FV(dispose(e)) s . For all y ∈ FV(dispose(e)), we have s′ (y) = s (y) = s(y) = s′ (y). Then we have s′ = s′ . Since h = h|Dom(h)− e s , we have h = h|Dom(h)− e s′ . Therefore, dispose(e) − ((s′ , h)) ∋ (s′ , h ). (3) We will show the claim by induction on P. We consider cases according to P. Case 1. x := e. Assume s =FV(x:=e) s′ and x := e − ((s, h)) ∋ abort. By definition, x := e − ((s, h)) = {(s[x := e s], h)}. Therefore, x := e − ((s, h)) ∋ abort is False. Then x := e − ((s, h)) ∋ abort implies x := e − ((s′ , h)) ∋ abort. Case 2. P is if (b) then (P ) else (P ). Assume s =FV(P) s′ . Case 2.1. b s = True. Then b s′ = True. Assume P − ((s, h)) ∋ abort. By definition, P ((s, h)) ∋ abort. By definition FV(P ) ⊆ FV(P) and then s =FV(P ) s′ . By induction hypothesis, P − ((s′ , h)) ∋ abort. Therefore, by defi- nition P − ((s′ , h)) ∋ abort. Case 2.2. b s = False can be proved similarly. Case 3. P is while (b) do (P ).
  • 57. 4.2 Semantics 49 Assume s =FV(while (b) do (P )) s′ . Case 3.1. b s = True. Then b s′ = True. Assume while (b) do (P ) ((s, h)) ∋ abort. By Proposition 4.2.4, we have s , . . . , sm, h , . . . , hm such that (s, h) = (s , h ), P − ((si, hi)) ∋ (si+ , hi+ ) and b si = True for all i = , . . . , m − , P − ((sm, hm)) ∋ abort and b sm = True. By definition FV(P ) ⊆ FV(while (b) do (P )) and then s =FV(P ) s′ . Let s′ i = [si, s′ , FV(P)] for all i = , . . . , m. Then s′ =FV(P)c s′ and s′ =FV(P) s = s =FV(P) s′ and hence s′ = s′ . We will show that s′ i+ = [si+ , s′ i, FV(P )] for i = , . . . , m − . For that we will show that s′ i+ =FV(P )c s′ i. By (1), si+ =Mod (P )c si and hence si+ =FV(P )c si. For all y ∈ FV(P) − FV(P ), s′ i+ (y) = si+ (y) = si(y) = s′ i(y). For all y ̸∈ FV(P), s′ i+ (y) = s′ (y) = s′ i(y). Hence we have s′ i+ =FV(P )c s′ i. Since s′ i =FV(P) si, we have s′ i+ =FV(P ) si+ and then s′ i+ = [si+ , s′ i, FV(P )]. Then by (2), we have P − ((s′ i, hi)) ∋ (s′ i+ , hi+ ) for all i = , . . . , m − . Since s′ m =FV(P) sm, we also have b s′ m = True, s′ m =FV(P ) sm and then by induction hy- pothesis P − ((s′ m, hm)) ∋ abort. Now we have (s′ , h) = (s′ , h ), P − ((s′ i, hi)) ∋ (s′ i+ , hi+ ) and b s′ i = True for all i = , . . . , m − , P − ((s′ m, hm)) ∋ abort and b s′ m = True. Therefore, by Proposition 4.2.4, while (b) do (P) − ((s′ , h)) ∋ abort. Case 3.2. b s = False. Then b s′ = False. Assume while (b) do (P) − ((s, h)) ∋ abort. By definition it is false. Case 4. P ; P . Assume s =FV(P ;P ) s′ and P ; P ((s, h)) ∋ abort. By definition, ∪ { P − (r) | r ∈ P − ((s, h)) } ∋ abort. Then either P − ((s, h)) ∋ abort or P − ((s, h)) ∋ (s , h ) and P − ((s , h )) ∋ abort for some s , h . Assume P − ((s, h)) ∋ abort. Since FV(P ) ⊆ FV(P ; P ), we have s =FV(P ) s′ . By in- duction hypothesis, P − ((s′ , h)) ∋ abort. Since P − (abort) ∋ abort, we have P ; P − ((s′ , h)) ∋ abort. Now assume, P − ((s, h)) ∋ (s , h ) and P − ((s , h )) ∋ abort. We have s =FV(P ) s′ . Let s′ = [s , s′ , FV(P )]. Then by induction hypothesis (2), P − ((s′ , h )) ∋ (s′ , h ).
  • 58. 50 Chapter 4. Separation Logic for Recursive Procedures By (1), s =FV(P )c s and s′ =FV(P )c s′ . Then for all y ∈ FV(P ) ∩ FV(P ), s (y) = s′ (y). Since FV(P ) − FV(P ) ⊆ Mod (P )c , for all y ∈ FV(P ) − FV(P ), we have s (y) = s(y) = s′ (y) = s′ (y). Then s =FV(P ) s′ . By induction hypothesis, P − ((s′ , h )) ∋ abort. Then, P ; P − ((s′ , h )) ∋ abort. Case 5. skip. Its proof is immediate. Case 6. x := cons(e , e ). Assumes =FV(x:=cons(e ,e )) s′ and x := cons(e , e ) − ((s, h)) ∋ abort. Butbydef- inition, x := cons(e , e ) − ((s, h)) ̸∋ abort. Then x := cons(e , e ) − ((s, h)) ∋ abort implies x := cons(e , e ) − ((s′ , h)) ∋ abort. Case 7. x := [e]. Assume s =FV(x:=[e]) s′ and x := [e] − ((s, h)) ∋ abort. Then by definition, e s /∈ Dom(h). Since s =FV(x:=[e]) s′ , we have e s = e s′ . Then e s′ /∈ Dom(h). Therefore, x := [e] − ((s′ , h)) ∋ abort. Case 8. [e ] := e . Assume s =FV([e ]:=e ) s′ and [e ] := e − ((s, h)) ∋ abort. Then by definition, e s /∈ Dom(h). Since s =FV([e ]:=e ) s′ , we have e s = e s′ . Then e s′ /∈ Dom(h). Therefore, [e ] := e − ((s′ , h)) ∋ abort. Case 9. dispose(e). Assume s =FV(dispose(e)) s′ and dispose(e) − ((s, h)) ∋ abort. Then by definition, e s /∈ Dom(h). Since s =FV(dispose(e)) s′ , we have e s = e s′ . Then e s′ /∈ Dom(h). Therefore, dispose(e) − ((s′ , h)) ∋ abort. 2 Lemma 4.2.16 Suppose P ∈ L. (1) If s =EFV(P) s′ and P ((s, h)) ∋ abort, then P ((s′ , h)) ∋ abort. (2) If s =EFV(P) s′ and P ((s, h)) ∋ (s , h ) and s′ = [s , s′ , EFV(P)] then P ((s′ , h)) ∋ (s′ , h ).
(3) If P ((s, h)) ∋ (s1, h1) then s1 =Mod(P)c s.
Proof. (1) Assume s =EFV(P) s′ and P ((s, h)) ∋ abort. Then by definition, P(k) − ((s, h)) ∋ abort for some k. Since we have FV(P(k)) ⊆ EFV(P) by Proposition 4.1.14 (1), we have s =FV(P(k)) s′. By Lemma 4.2.15 (3), we have P(k) − ((s′, h)) ∋ abort. Then by definition, P ((s′, h)) ∋ abort.
(2) Assume s =EFV(P) s′, P ((s, h)) ∋ (s1, h1) and s′1 = [s1, s′, EFV(P)]. Then by definition, P(k0) ((s, h)) ∋ (s1, h1) for some k0. By definition, EFV(P) = FV(P(nproc)) where nproc is the number of our procedure names. Let k = max(k0, nproc). Then by Lemma 4.2.8, P(k) ((s, h)) ∋ (s1, h1). Since FV(P(nproc)) = FV(P(k)) by Proposition 4.1.14 (1), we have FV(P(k)) = EFV(P). Then s′1 = [s1, s′, FV(P(k))]. Then by Lemma 4.2.15 (2), P(k) − ((s′, h)) ∋ (s′1, h1). Therefore, P ((s′, h)) ∋ (s′1, h1).
(3) We can show the claim in a similar way to (1). □
4.3 Logical System
This section defines our logical system. It is the extension of Reynolds’ system presented in [13] to mutual recursive procedure calls. We will write A[x := e] for the formula obtained from A by replacing x by e. We will write the formula e → e1, e2 to denote (e → e1) ∗ (e + 1 → e2).
Definition 4.3.1 Our logical system consists of the following inference rules. As mentioned in the previous section, we will use Γ for a set of asserted programs. A judgment is defined as Γ ⊢ {A}P{B}.

Γ ⊢ {A}skip{A} (skip)

Γ, {A}P{B} ⊢ {A}P{B} (axiom)

Γ ⊢ {A[x := e]}x := e{A} (assignment)

Γ ⊢ {A ∧ b}P1{B}   Γ ⊢ {A ∧ ¬b}P2{B}
Γ ⊢ {A}if (b) then (P1) else (P2){B} (if)

Γ ⊢ {A ∧ b}P{A}
Γ ⊢ {A}while (b) do (P){A ∧ ¬b} (while)

Γ ⊢ {A}P1{C}   Γ ⊢ {C}P2{B}
Γ ⊢ {A}P1; P2{B} (comp)

Γ ⊢ {A1}P{B1}
Γ ⊢ {A}P{B} (conseq) (A → A1, B1 → B)
Γ ⊢ {∀x′((x′ → e1, e2) −∗ A[x := x′])} x := cons(e1, e2){A} (cons) (x′ ∉ FV(e1, e2, A))

Γ ⊢ {∃x′(e → x′ ∗ (e → x′ −∗ A[x := x′]))} x := [e]{A} (lookup) (x′ ∉ FV(e, A))

Γ ⊢ {(∃x(e1 → x)) ∗ (e1 → e2 −∗ A)} [e1] := e2{A} (mutation) (x ∉ FV(e1))

Γ ⊢ {(∃x(e → x)) ∗ A}dispose(e){A} (dispose) (x ∉ FV(e))

Γ ∪ {{Ai}Ri{Bi} | i = 1, . . . , nproc} ⊢ {A1}Q1{B1}
. . .
Γ ∪ {{Ai}Ri{Bi} | i = 1, . . . , nproc} ⊢ {Anproc}Qnproc{Bnproc}
Γ ⊢ {Ai}Ri{Bi} (recursion)

Γ ⊢ {A}P{C}
Γ ⊢ {A ∧ B}P{C ∧ B} (inv-conj) (FV(B) ∩ Mod(P) = ∅ and B is pure)

Γ ⊢ {A}P{B}
Γ ⊢ {∃x.A}P{B} (exists) (x ∉ FV(B) ∪ EFV(P))

We say {A}P{B} is provable, and we write ⊢ {A}P{B}, when ⊢ {A}P{B} can be derived by these inference rules. We explain the inference rules by the following simple example.
Example. Let A be the abbreviation of ∀y(y ≥ x ∧ y < z ↔ ∃w(y → w) ∗ True). Suppose we have the procedure declaration
Procedure R1 (if (x < z) then (dispose(x); x := x + 1; R1) else (skip))
We will show that ⊢ {A}R1{emp} is provable in our system. This means that if the heap consists of the consecutively allocated cells x, . . . , z − 1, then the program R1 deallocates them all. Let Γ be {{A}R1{emp}}. The axiom (assignment) gives Γ ⊢ {A[x := x + 1]}x := x + 1{A}. The axiom (axiom) gives Γ ⊢ {A}R1{emp}. By the inference rule (comp), we have Γ ⊢ {A[x := x + 1]}x := x + 1; R1{emp}. The axiom (dispose) gives Γ ⊢ {(∃y(x → y)) ∗ A[x := x + 1]}dispose(x){A[x := x + 1]}. Then by the inference rule (comp), we have Γ ⊢ {(∃y(x → y)) ∗ A[x := x + 1]}dispose(x); x := x + 1; R1{emp}. The axiom (skip) gives Γ ⊢ {A ∧ ¬(x < z)}skip{A ∧ ¬(x < z)}. We have A ∧ x < z → (∃y(x → y)) ∗ A[x := x + 1] and A ∧ ¬(x < z) → emp. Then by the inference rule (conseq), we have Γ ⊢ {A ∧ x < z}dispose(x); x := x + 1; R1{emp} and Γ ⊢ {A ∧ ¬(x < z)}skip{emp}. By applying the inference rule (if), we get Γ ⊢ {A}if (x < z) then (dispose(x); x := x + 1; R1) else (skip){emp}. By the inference rule (recursion), ⊢ {A}R1{emp} is provable.
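To see concretely what the triple {A}R1{emp} claims, the following small Python sketch executes the body of R1 on a heap whose allocated cells are exactly the addresses from x up to z − 1. The dictionary encoding of stores and heaps, the function name run_R1 and the chosen initial values are our own illustrative assumptions, not the formal semantics of Section 4.2; the tail-recursive call is unfolded into a loop.

# Minimal model of the example procedure, assuming a store and a heap given as dicts.
# Procedure R1: if (x < z) then (dispose(x); x := x + 1; R1) else (skip)
def run_R1(s, h):
    s, h = dict(s), dict(h)
    while s["x"] < s["z"]:        # the tail-recursive call R1 is unfolded into a loop
        if s["x"] not in h:       # dispose(x) aborts when cell x is not allocated
            return "abort"
        del h[s["x"]]             # dispose(x)
        s["x"] += 1               # x := x + 1
    return s, h                   # else branch: skip

# A state satisfying the precondition A: cells x, ..., z - 1 are exactly the allocated ones.
store = {"x": 3, "z": 6}
heap = {3: 10, 4: 11, 5: 12}
print(run_R1(store, heap))        # ({'x': 6, 'z': 6}, {}): the final heap is empty, i.e. emp holds

Running the same sketch on a heap with a missing cell in the range returns "abort", which is why the precondition A is needed for the triple to hold.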
We have two new rules that appear in neither [1] nor [13], namely the rules (inv-conj) and (exists). We have found that the axiom (AXIOM 9: INVARIANCE AXIOM) in [1] is not sound for our system. The definition of the axiom will be given in the next chapter. The asserted program {x = 0}[y] := 1{x = 0} is a counterexample, since it is provable by the axiom although it is not true: it causes abort for the state (s, h) = (s′[x := 0, y := 1], ∅) even though x = 0 is true at (s, h). Another counterexample is {emp}x := cons(0, 1){emp}; although it does not abort, it is not true. So we have introduced the rule (inv-conj). It is a variant of the combination of the axiom (AXIOM 9: INVARIANCE AXIOM) and the rule (RULE 12: CONJUNCTION RULE) of [1]. That is, the truth of a pure assertion in the post-execution state does not change from that in the pre-execution state if its set of free variables is disjoint from the set of modifiable variables of the program. It is a special case of the frame rule in [13]. The rule (exists) is analogous to the existential introduction rule of predicate calculus.
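The failure of the invariance axiom can be seen with a minimal sketch of the mutation command alone; the encoding below (the dictionary heap and the function name mutate) is our own illustration, not the formal semantics of Section 4.2.

# Sketch of [e1] := e2: write to the cell at the address denoted by e1, or abort if it is not allocated.
def mutate(s, h, addr, value):
    if addr not in h:
        return "abort"
    h2 = dict(h)
    h2[addr] = value
    return s, h2

# x = 0 holds and the heap is empty, so [y] := 1 aborts: {x = 0}[y] := 1{x = 0} is not true,
# although the invariance axiom would derive it because [y] := 1 modifies no variables.
s = {"x": 0, "y": 1}
print(mutate(s, {}, s["y"], 1))   # prints: abort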
5.1 Soundness
In this section we will prove soundness of our system. That means we will show that if a judgment Γ ⊢ {A}P{B} is derivable in our system then Γ ⊢ {A}P{B} is also true. We say an unfolded program is correct when every level of the unfolding of the program is correct. For a given specification, correctness of a program is the same as that of its unfolded transformation.
It is straightforward to construct a logical system by collecting axioms and inference rules from both Hoare’s logic for recursive procedures and separation logic in order to verify pointer programs with recursive procedures. But this system is unsound, because the invariance axiom is not sound in separation logic. At the end of this section, we will give an unsound example.
Proposition 5.1.1 {A}P{B} is true if and only if for all k, {A}P(k){B} is true.
Proof. First we will show the direction from the left-hand side to the right-hand side. Assume {A}P{B} is true. Assume A r = True. Then by definition, P (r) ∌ abort. Then by definition, ∪∞k=0( P(k) − (r)) ∌ abort. Then for all k, P(k) − (r) ∌ abort. Fix k. Assume P(k) − (r) ∋ r′. Then we have ∪∞k=0( P(k) − (r)) ∋ r′. Then again by definition, P (r) ∋ r′. Then B r′ = True. Hence {A}P(k){B} is true for all k.
Next we will show the direction from the right-hand side to the left-hand side. Assume {A}P(k){B} is true for all k. Assume A r = True. Fix k. Then by definition, P(k) (r) ∌ abort. Then P(k) − (r) ∌ abort. Then ∪∞k=0( P(k) − (r)) ∌ abort. Then by definition, P (r) ∌ abort. Assume P (r) ∋ r′. Then by definition, ∪∞k=0( P(k) − (r)) ∋ r′. Then we have P(k) − (r) ∋ r′ for some k. Then by definition, P(k) (r) ∋ r′. Then B r′ = True. Hence {A}P{B} is true. □
Definition 5.1.2 For a set Γ of asserted programs, Γ(k) is defined by { {Ai}Pi{Bi} | 1 ≤ i ≤ n }(k) = { {Ai}Pi(k){Bi} | 1 ≤ i ≤ n }.
If the truth of an assertion depends only on a store, it is not altered in the presence of any heap. This easy lemma will be necessary for some formal discussions later.
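For instance, for the procedure R1 declared in the example of Section 4.3, Proposition 4.1.10 gives R1(0) = Ω, R1(1) = if (x < z) then (dispose(x); x := x + 1; Ω) else (skip), and in general R1(k+1) = if (x < z) then (dispose(x); x := x + 1; R1(k)) else (skip); by Proposition 5.1.1 the triple {A}R1{emp} of that example is true exactly when {A}R1(k){emp} is true for every one of these procedure-free unfoldings.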
Lemma 5.1.3 A s = A (s,h) for a pure formula A, where the left-hand side is the semantics for the base language and the right-hand side is the semantics for the assertion language.
Proof. This is proved by induction on A. If A is B ∧ C, we have A s = ( B s and C s) and A (s,h) = ( B (s,h) and C (s,h)), and both sides are the same, since by the induction hypothesis we have B s = B (s,h) and C s = C (s,h). Other cases are similarly proved. □
The following lemma is important for the soundness of the inference rule (recursion).
Lemma 5.1.4 If { {Ai}Ri(k){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi(k){Bi} is true for all i and k, then ⊢ {Ai}Ri(k){Bi} is true for all i.
Proof. Assume { {Ai}Ri(k){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi(k){Bi} is true for all i and k. We will show that {Ai}Ri(k){Bi} is true for all i by induction on k.
Case 1. k = 0. By Proposition 4.1.10 (2), Ri(0) = Ω. Then {Ai}Ω{Bi} is true.
Case 2. k = k′ + 1. We assume that { {Ai}Ri(k){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi(k){Bi} is true for all k. By the induction hypothesis, {Ai}Ri(k′){Bi} is true for all i. By the assumption for k′, we have { {Ai}Ri(k′){Bi} | 1 ≤ i ≤ nproc } ⊢ {Ai}Qi(k′){Bi} is true. Hence {Ai}Qi(k′){Bi} is true. By Proposition 4.1.10 (3), Ri(k′+1) = Qi(k′). Hence {Ai}Ri(k){Bi} is true for all i. □
In the following lemma we consider an assertion and a program such that the assertion has no free variable that can be modified by the program. In such a case, the truth of the assertion is preserved even after the execution of the program.
Lemma 5.1.5 If A is pure, P doesn’t contain any procedure names, A (s,h) = True, FV(A) ∩ Mod(P) = ∅ and P − ((s, h)) ∋ (s′, h′), then A (s′,h′) = True.
Proof. Assume A is pure, P doesn’t contain any procedure names, A (s,h) = True, FV(A) ∩ Mod(P) = ∅ and P − ((s, h)) ∋ (s′, h′). By Lemma 4.2.15 (1), s =Mod(P)c s′.
Since FV(A) ⊆ Mod(P)c, we have s =FV(A) s′. By Lemma 5.1.3, A s = True. Hence A s′ = True. By Lemma 5.1.3, A (s′,h′) = True. □
The following lemma will be necessary to show the soundness of the inference rule (inv-conj).
Lemma 5.1.6 If P doesn’t contain any procedure names, B is pure, Mod(P) ∩ FV(B) = ∅ and {A}P{C} is true, then {A ∧ B}P{C ∧ B} is true.
Proof. Assume B is pure, Mod(P) ∩ FV(B) = ∅ and {A}P{C} is true. We have to prove that {A ∧ B}P{C ∧ B} is true. Assume A ∧ B (s,h) = True. Then A (s,h) = True and B (s,h) = True. Then P ((s, h)) ∌ abort. Assume P ((s, h)) ∋ (s′, h′). Then C (s′,h′) = True. By Lemma 5.1.5, B (s′,h′) = True. Then B ∧ C (s′,h′) = True. Hence {A ∧ B}P{C ∧ B} is true. □
Lemma 5.1.7 If Γ ⊢ {A}P{B} is provable then Γ(k) ⊢ {A}P(k){B} is true for all k.
Proof. It is proved by induction on the proof. We consider cases according to the last rule.
Case 1. (skip). Its proof is immediate.
Case 2. (axiom). Assume Γ, {A}P{B} ⊢ {A}P{B} is provable. Then Γ(k), {A}P(k){B} ⊢ {A}P(k){B} is true for all k.
Case 3. (assignment). Assume Γ ⊢ {A[x := e]}x := e{A} is provable. Assume A[x := e] (s,h) = True. Let n be e s. By the definition we have x := e ((s, h)) = {(s1, h)} where s1 = s[x := n]. Since A[x := e] (s,h) = A (s1,h), we have A (s1,h) = True. Hence {A[x := e]}x := e{A} is true. Then by definition, Γ(k) ⊢ {A[x := e]}(x := e)(k){A} is true.
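The crucial step in Case 3 is the substitution property A[x := e] (s,h) = A (s[x := e s], h), used above with s1 = s[x := n]. A tiny check with one concrete pure assertion and expression (our own encoding, chosen only for illustration):

# A is x = 5 and e is x + y; A[x := e] is then x + y = 5.
def eval_e(s):
    return s["x"] + s["y"]

def A(s):
    return s["x"] == 5

def A_subst_e(s):
    return eval_e(s) == 5

s = {"x": 2, "y": 3}
s1 = dict(s, x=eval_e(s))          # s1 = s[x := value of e in s]
print(A_subst_e(s), A(s1))         # True True: A[x := e] at s agrees with A at s1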
Case 4. (if). We have the asserted programs Γ ⊢ {A ∧ b}P1{B} and Γ ⊢ {A ∧ ¬b}P2{B}, which are provable. By the induction hypothesis, Γ(k) ⊢ {A ∧ b}P1(k){B} and Γ(k) ⊢ {A ∧ ¬b}P2(k){B} are true. Now we will show that Γ(k) ⊢ {A}(if (b) then (P1) else (P2))(k){B} is true for all k. Assume Γ(k) is true. Then {A ∧ b}P1(k){B} and {A ∧ ¬b}P2(k){B} are true. Assume A (s,h) is true.
Case 4.1. b s is true. By Lemma 5.1.3, we have b (s,h) is true. By definition, A ∧ b (s,h) is true. Assume (if (b) then (P1) else (P2))(k) ((s, h)) = P1(k) ((s, h)) ∋ r. Then r ≠ abort and B r is true.
Case 4.2. b s is false. Then the fact that r ≠ abort and B r is true is proved similarly to Case 4.1.
Then {A}(if (b) then (P1) else (P2))(k){B} is true. Then by definition Γ(k) ⊢ {A}(if (b) then (P1) else (P2))(k){B} is true for all k.
Case 5. (while). B is in the form A ∧ ¬b, and by the (while) rule we have the asserted program Γ ⊢ {A ∧ b}P{A}, which is provable. By the induction hypothesis, Γ(k) ⊢ {A ∧ b}P(k){A} is true for all k. Now we will show that Γ(k) ⊢ {A}(while (b) do (P))(k){¬b ∧ A} is true. Assume Γ(k) is true. Then {A ∧ b}P(k){A} is true. Let us define the function F : States ∪ {abort} → p(States ∪ {abort}) by
F(abort) = {abort},
F((s, h)) = {(s′, h′) | A ∧ ¬b (s′,h′) = True} ∪ {r ∈ States ∪ {abort} | A (s,h) = False}.
We will show that F satisfies the inequations obtained from the equations for
(while (b) do (P))(k) by replacing = by ⊇. That is, we will show
F(abort) ⊇ {abort},
F((s, h)) ⊇ {(s, h)} if b s = False,
F((s, h)) ⊇ ∪ { F(r) | r ∈ P(k) ((s, h)) } if b s = True.
The first inequation immediately holds by the definition of F. We will show the second inequation. Assume b s = False. By Lemma 5.1.3, we have b (s,h) = False. If A (s,h) = False, then the inequation holds by the definition of F. If A (s,h) = True, then A ∧ ¬b (s,h) = True, and the inequation holds by the definition of F. We will show the last inequation. If A (s,h) = False, then it holds by the definition of F. Assume A (s,h) = True, b s = True, and r′ is in the right-hand side. We will show that r′ ∈ F((s, h)). We have some r1 such that P(k) ((s, h)) ∋ r1 and F(r1) ∋ r′. By Lemma 5.1.3, we have b (s,h) = True. Hence A ∧ b (s,h) = True. Since {A ∧ b}P(k){A} is true, we have r1 ≠ abort and A r1 = True. Then we have r′ ≠ abort and A ∧ ¬b r′ = True. Hence r′ ∈ F((s, h)). We have shown that F satisfies the inequations. By the least fixed point theorem [17], we have F ⊇ (while (b) do (P))(k). If A (s,h) = True and r′ ∈ (while (b) do (P))(k) ((s, h)), then r′ ∈ F((s, h)), and we have r′ ≠ abort and A ∧ ¬b r′ = True. Therefore Γ(k) ⊢ {A}(while (b) do (P))(k) {A ∧ ¬b} is true.
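The least fixed point used in Case 5 can be made concrete on a small finite state space. The sketch below is our own illustration (the integer states, the function while_sem and the decrementing body are assumptions, not the semantics of Section 4.2): it computes the meaning of while (b) do (P) by iterating its defining equations upward from the empty result sets, which is the least fixed point that F is compared against above.

# Toy model: states are integers, P maps a state to a set of result states.
def while_sem(b, P, states):
    # Least fixed point of: sem(s) = {s} if not b(s), else the union of sem(r) for r in P(s).
    sem = {s: set() for s in states}       # start from the least element: empty result sets
    changed = True
    while changed:                         # Kleene-style iteration; terminates on a finite space
        changed = False
        for s in states:
            if not b(s):
                new = {s}
            else:
                new = set()
                for r in P(s):
                    new |= sem[r]
            if not new <= sem[s]:
                sem[s] = sem[s] | new
                changed = True
    return sem

# Example: b(s) is s > 0 and P decrements the state, so every state terminates in state 0.
print(while_sem(lambda s: s > 0, lambda s: {s - 1}, list(range(4))))
# {0: {0}, 1: {0}, 2: {0}, 3: {0}}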
Case 6. (comp). We have the asserted programs Γ ⊢ {A}P1{C} and Γ ⊢ {C}P2{B}, which are provable for some C. By the induction hypothesis, Γ(k) ⊢ {A}P1(k){C} and Γ(k) ⊢ {C}P2(k){B} are true for all k. Now we will show that Γ(k) ⊢ {A}(P1; P2)(k){B} is true for all k. Assume Γ(k) is true. Then {A}P1(k){C} and {C}P2(k){B} are true. Assume A (s,h) = True and (P1; P2)(k) ((s, h)) ∋ r. We will show r ≠ abort and B r = True. We have r1 such that P1(k) ((s, h)) ∋ r1 and P2(k) (r1) ∋ r. Hence we have r1 ≠ abort and C r1 = True. Since P2(k) (r1) ∋ r, we have r ≠ abort and B r = True. Hence {A}(P1; P2)(k){B} is true. Then Γ(k) ⊢ {A}(P1; P2)(k){B} is true for all k.
Case 7. (conseq). We have the asserted program Γ ⊢ {A1}P{B1}, which is provable, where A → A1 and B1 → B. By the induction hypothesis for the assumption, Γ(k) ⊢ {A1}P(k){B1} is true for all k. Now we will show that Γ(k) ⊢ {A}P(k){B} is true for all k. Assume Γ(k) is true. Then {A1}P(k){B1} is true. Assume A (s,h) = True and P(k) ((s, h)) ∋ r. We will show r ≠ abort and B r = True. By the side condition, A → A1 (s,h) = True, so we have A1 (s,h) = True. Hence r ≠ abort and B1 r = True. By the side condition, B1 → B r = True, so we have B r = True. Hence {A}P(k){B} is true. Then Γ(k) ⊢ {A}P(k){B} is true for all k.
Case 8. (cons). Assume Γ ⊢ {∀x′((x′ → e1, e2) −∗ A[x := x′])}x := cons(e1, e2){A} is provable. Assume ∀x′(x′ → e1, e2 −∗ A[x := x′]) (s,h) = True and x := cons(e1, e2) ((s, h)) ∋ r. We will show r ≠ abort and A r = True. Let P be x := cons(e1, e2), n1 be e1 s, and n2 be e2 s. By the definition of P, r ≠ abort and r = (s1, h1) where s1 = s[x := n] and h1 = h[n := n1, n + 1 := n2] for some n such that n > 0 and n, n + 1 ∉ Dom(h). By ∀x′(x′ → e1, e2 −∗ A[x := x′]) (s,h) = True, we have x′ → e1, e2 −∗ A[x := x′] (s′,h) = True where s′ = s[x′ := n]. Let h2 = ∅[n := n1, n + 1 := n2]. We have h1 = h + h2. Then x′ → e1, e2 (s′,h2) = True since x′ ∉ FV(e1, e2). Since x′ → e1, e2 −∗ A[x := x′] (s′,h) = True, we have A[x := x′] (s′,h1) = True. Hence A (s1,h1) = True, since x′ ∉ FV(A). Now we have that {∀x′((x′ → e1, e2) −∗ A[x := x′])}x := cons(e1, e2){A} is true. Then Γ(k) ⊢ {∀x′((x′ → e1, e2) −∗ A[x := x′])}(x := cons(e1, e2))(k){A} is true for all k.
Case 9. (lookup). Assume Γ ⊢ {∃x′(e → x′ ∗ (e → x′ −∗ A[x := x′]))}x := [e]{A} is provable. Assume ∃x′((e → x′) ∗ ((e → x′) −∗ A[x := x′])) (s,h) = True and x := [e] ((s, h)) ∋ r. We will show r ≠ abort and A r = True. Let n be