% LaTeX 2e
% Introduction to mathematical logic - A problem solving course
% Arnold W. Miller miller@math.wisc.edu
\documentclass[12pt]{article}
\usepackage{amssymb,latexsym}
\def\Bbb#1{{\mathbb #1}}
\def\goth#1{{\mathfrak #1}}
\def\frak#1{{\mathfrak #1}}
%%%%%%% counters %%%%%%%%%%%%%%%%%%%%%%%%%%
\newcount\probno
\newcount\sectorno
\sectorno=0
\def\sector#1{\probno=1\advance\sectorno by 1
\newpage\begin{center}{\bf #1}\end{center}}
\def\prob{\par\medskip\noindent
{\kern -.5in \parbox{.45in}{\the\sectorno .\the\probno}}\advance\probno by 1}
\newtheorem{lemma}{Lemma}
%%%%%%%%% indexing %%%%%%%%%%%%%%%%%%%%%%%%
\newwrite\dexno
\immediate\openout\dexno=dexdef
\def\dex#1{{\it #1}\immediate\write\dexno{?par?noindent #1 ?dotfill
\the\sectorno.\the\probno -p.\thepage}}
\def\sdex#1{\immediate\write\dexno{?par?noindent #1 ?dotfill
\the\sectorno.\the\probno -p.\thepage}}
%%% note \sdex only writes to dexdef.tex, so use "$foo$ \sdex{$foo$}"
%%% always use $ inside sdex, ie. \sdex{$\omega$}, inside sdex change \ to ?
%%% use dexdef.tex to make defines.tex - sort and replace ? by backslash
%%% dummies:
%\def\dex#1{{\it #1}}
%\def\sdex#1{}
%%%%%%%%%%%% symbols %%%%%%%%%%%%%%
\def\ax #1.{\par\medskip\noindent{\bf #1:}}
\def\blank{\;\tilde{ }\;} \def\cc{{\frak c}} \def\conj{\mathop{\wedge}}
\def\disj{\mathop{\vee}} \def\dom{{\rm dom}} \def\dotminus{\dot{-}}
\def\elemsub{\preceq} \def\elemsup{\succeq} \def\eq{\approx}
\def\fama{{\cal A}} \def\famc{{\cal C}} \def\famf{{\cal F}}
\def\HFin{(HF,\in)} \def\implies{\rightarrow} \def\isom{\simeq}
\def\moda{{\goth A}} \def\modb{{\goth B}} \def\modc{{\goth C}}
\def\modn{{\goth N}} \def\modq{{\goth Q}} \def\modr{{\goth R}}
\def\nn{{\Bbb N}} \def\nor{\oplus} \def\ord{{\bf ORD }}
\def\partial{{\rm partial}{\;\;}}
\def\pp{{\cal P}} \def\prf{{\rm Prf}}
\def\proof{\par\noindent proof:\par} \def\proves{\vdash}
\def\qed{\nopagebreak\par\noindent\nopagebreak$\Box$\par}
\def\qq{{\Bbb Q}} \def\res{\upharpoonright}
\def\rmand{{\mbox{ and }}} \def\rmiff{{\mbox{ iff }}}
\def\rmor{{\mbox{ or }}} \def\rr{{\Bbb R}} \def\ss{{\cal S}}
\def\substruc{\subseteq} \def\ul{\ulcorner} \def\ur{\urcorner}
\def\zfc{{\rm ZFC}} \def\zz{{\Bbb Z}}
%\def\com#1{\par {\it {\bf Comment:} #1}}
\def\com#1{}
\begin{document}
\begin{flushright}
Arnold W. Miller\\
Department of Mathematics\\
University of Wisconsin\\
480 Lincoln Drive\\
Madison, WI 53706\\
miller@math.wisc.edu\\
Fall 95\\
\end{flushright}
\bigskip
\begin{center}
Introduction to Mathematical Logic
\end{center}
I have used these questions or some variations four times to
teach a beginning graduate course in Mathematical Logic. I
want to thank the many students who hopefully
had some fun doing them, especially, Michael Benedikt,
Tom Linton, Hans Mathew, Karl Peters, Mark Uscilka, Joan Hart,
Stephen Mellendorf, Ganesan Ramalingam, Steven Schwalm, Garth
Dickie, Garry Schumacher, Krung Sinapiromsaran, Stephen Young,
Brent Hetherwick, Maciej Smuga-Otto, and Stephen Tanner.
\begin{center}
Instructions
\end{center}
Do not read logic books during this semester; it is self-defeating.
You will learn proofs you have figured out yourself, and the more
you have to discover yourself, the better you will learn them.
You will probably not learn much from your fellow students' presentations
(although the one doing the presenting does). And you shouldn't!
Those who have solved the problem should be sure that the presented
solution is correct. If it doesn't look right, it probably isn't.
Don't leave this up to me; if I am the only one who objects, I will
stop doing it. For those who haven't solved the problem, you
should regard the presented solution as a hint and go and write
up for yourself a complete and correct solution. Also you might want
to present it to one of your fellow students outside the classroom, if
you can get one to listen to you.
\newpage
\begin{center}
The Moore Method
\end{center}
From P.R. Halmos\footnote{The teaching of problem solving,
Amer. Math. Monthly, (1975)82, 466-470.}:
``What then is the secret--what is the best way to learn to solve
problems? The answer is implied by the sentence I started with: solve
problems. The method I advocate is sometimes known as the `Moore
method', because R.L. Moore developed and used it at the University
of Texas. It is a method of teaching, a method
of creating the problem-solving attitude in a student, that
is a mixture of what Socrates taught
us and the fiercely competitive spirit of the Olympic games.''
\bigskip
From F.Burton Jones\footnote{The Moore method,
Amer. Math. Monthly, (1977)84, 273-278.}:
``What Moore did: $\ldots$
After stating the axioms and giving motivating examples to illustrate
their meaning he would then state definitions and theorems. He simply
read them from his book as the students copied them down.
He would then instruct the class to find proofs of their own and
also to construct examples to show that the hypotheses of the theorems
could not be weakened, omitted, or partially omitted.
$\ldots$
``When a student stated that he could prove Theorem x, he was asked to
go to the blackboard and present the proof. Then the other students,
especially those who hadn't been able to discover a proof, would
make sure that the proof presented was correct and convincing.
Moore sternly prevented heckling. This was seldom necessary because
the whole atmosphere was one of a serious community effort to understand
the argument.''
\bigskip
From D.Taylor\footnote{Creative teaching: heritage of R.L.Moore, University of
Houston, 1972, QA 29 M6 T7, p149.}:
``Criteria which characterize the Moore method of teaching include:
\noindent (1) The fundamental purpose: that of causing a student to
develop his power at rational thought.
\noindent (2) Collecting the students in classes with common mathematical
knowledge,
striking from membership of a class any student whose knowledge is too
advanced over others in the class.
\noindent (3) Causing students to perform research at their level by
confronting
the class with impartially posed questions and conjectures which are at
the limits of their capability.
\noindent (4) Allowing no collective effort on the part of the students
inside
or outside of class, and allowing the use of no source material.
\noindent (5) Calling on students for presentation of their efforts at
settling
questions raised, allowing a feeling of ``ownership'' of a theorem
to develop.
\noindent (6) Fostering competition between students over the settling
of questions raised.
\noindent (7) Developing skills of critical analysis among the class by
burdening
students therein with the assignment of ``refereeing'' an argument
presented.
\noindent (8) Pacing the class to best develop the talent among its
membership.
\noindent (9) Burdening the instructor with the obligation to not assist, yet
respond to incorrect statements, or discussions arising from incorrect
statements, with immediate examples or logically sound propositions to
make clear the objection or understanding.''
\bigskip
Taylor's (2) and (4) are a little too extreme for me.
It is OK to collaborate
with your fellow students as long as you give them credit. In fact, it
is a good idea to try out your argument first by presenting it to a
fellow student. Avoid reading logic if you can, at least this semester,
but if you do, give a reference.
For more readings on the Moore method see:
Paul R. Halmos, What is Teaching?,
Amer. Math. Monthly, 101 (1994), 848-854.
Donald R. Chalice, How to teach a class by the modified Moore method,
Amer. Math. Monthly, 102 (1995), 317-321.
\bigskip
\begin{center}
Quote From P.R. Halmos:
\end{center}
``A famous dictum of P\'{o}lya's about problem solving is that
if you can't solve a problem, then there is an easier problem that
you can't solve--find it!''
\sector{Propositional Logic and the Compactness Theorem}
The \dex{syntax} (grammar) of propositional logic is the following.
The \dex{logical symbols} are $ \conj ,\disj , \neg , \implies ,$ and $ \iff$.
The \dex{nonlogical symbols} consist of an arbitrary nonempty set
$\pp$ that we assume
is disjoint from the set of logical symbols to avoid confusion. The
set ${\pp}$ is referred to as the set of \dex{atomic sentences} or
as the set of \dex{propositional letters}.
For example, $\{P,Q,R\}$, $\{P_0,P_1,P_2,\ldots\}$, or $\{S_r:r\in\rr\}$.
The set of
\dex{propositional sentences} ${\ss}$ is the smallest set of finite strings
of symbols such that
$\pp\subseteq \ss$, and if $\theta \in \ss$ and $\psi \in \ss$, then
$ \neg \theta \in \ss, (\theta \conj \psi) \in \ss,(\theta \disj \psi) \in \ss,
(\theta \implies \psi) \in \ss,$ and $(\theta \iff \psi) \in \ss$.
The \dex{semantics} (meaning) of propositional logic consists of truth
evaluations.
A \dex{truth evaluation} is a function $e:{\ss}\to\{ T,F \}$,
that is consistent
with the following truth tables:
$$
\begin{array}{ccccccc}
\theta & \psi &\neg\theta &(\theta\conj\psi) &(\theta\disj\psi)
&(\theta\implies\psi) &(\theta\iff\psi) \\
T&T&F&T&T&T&T\\
T&F&F&F&T&F&F\\
F&T&T&F&T&T&F\\
F&F&T&F&F&T&T
\end{array}
$$
For example if $e(\theta)=T$ and $e(\psi)=F$, then
$e(\theta \implies \psi)=F$. Also $e(\neg\theta)=T$ iff $e(\theta)=F$.
For example, if $\pp=\{P_x:x\in\rr\}$ and we define
$e(P_x)=T$ if $x$ is rational and $e(P_x)=F$ if $x$ is irrational,
then $e((P_2\conj \neg P_{\sqrt{2}}))=T$. However if we
define $e^\prime(P_x)=T$ iff $x$ is an algebraic number, then
$e^{\prime}((P_2\conj \neg P_{\sqrt{2}}))=F$.
A sentence
$\theta$ is called a \dex{validity} iff for every truth evaluation $e$,
$e(\theta)=T$. A sentence
$\theta$ is called a \dex{contradiction} iff for every truth evaluation $e$,
$e(\theta)=F$.
We say that two sentences $\theta$ and $\psi$ are \dex{logically equivalent}
iff for every truth evaluation $e$, $e(\theta)=e(\psi)$.
A set of logical symbols is \dex{adequate} for propositional logic iff
every propositional sentence is logically equivalent to one whose only logical
symbols are from the given set.
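The truth-table semantics above is easy to experiment with. The following sketch (my own representation, not part of the course text: sentences as nested tuples) extends an assignment of the atomic sentences to a truth evaluation and tests validity by checking every evaluation.

```python
from itertools import product

# A sentence is a string (an atomic letter) or a tuple:
# ('not', s), ('and', s, t), ('or', s, t), ('implies', s, t), ('iff', s, t).

def evaluate(sentence, f):
    """Extend the atomic assignment f (letter -> bool) to a truth
    evaluation, following the truth tables."""
    if isinstance(sentence, str):
        return f[sentence]
    op = sentence[0]
    if op == 'not':
        return not evaluate(sentence[1], f)
    a, b = evaluate(sentence[1], f), evaluate(sentence[2], f)
    return {'and': a and b,
            'or': a or b,
            'implies': (not a) or b,
            'iff': a == b}[op]

def letters(sentence):
    """The set of atomic letters occurring in a sentence."""
    if isinstance(sentence, str):
        return {sentence}
    return set().union(*(letters(s) for s in sentence[1:]))

def is_validity(sentence):
    """True iff every truth evaluation makes the sentence true."""
    ps = sorted(letters(sentence))
    return all(evaluate(sentence, dict(zip(ps, vals)))
               for vals in product([True, False], repeat=len(ps)))
```

For instance, `is_validity(('or', 'P', ('not', 'P')))` is True, while `('implies', 'P', 'Q')` is neither a validity nor a contradiction.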
\bigskip
\prob Define ${\ss}_0={\pp}$ the atomic sentences and define
$${\ss}_{n+1}={\ss}_n\cup\{\neg\theta:\theta\in {\ss}_n\}\cup
\{(\theta\#\psi):\theta,\psi\in {\ss}_n,\#\in
\{\conj,\disj,\implies,\iff\}\}$$
Prove that ${\ss}={\ss}_0\cup {\ss}_1\cup {\ss}_2\cup\cdots$.
\prob Prove that for any function $f:{\pp}\to \{T,F\}$ there exists a
unique truth evaluation $e:\ss\to \{T,F\}$ such that $f=e\res {\pp}$.
The symbol $e\res {\pp}$
stands for the restriction of the function $e$ to $\pp$.
\sdex{$e?res {?pp}$}
\prob Let $\theta$ and $\psi$ be two propositional sentences. Show
that $\theta$ and $\psi$ are logically equivalent iff $(\theta\iff\psi)$
is a validity.
\prob Suppose $\theta$ is a propositional validity, $P$ and $Q$ are
two of the propositional letters occurring in $\theta$, and $\psi$
is the sentence obtained by replacing each occurrence of $P$ in
$\theta$ by $Q$. Prove that $\psi$ is a validity.
\prob Can you define $ \disj$ using only $\implies$? Can you define $\conj$
using only $\implies$?
\com{ $((p\implies q)\implies q)$ is equivalent to $(p\disj q)$. $\conj$ cannot
be defined (if there are at least two atomic sentences; it is definable if
there is only one). One proof is that every sentence using only $\implies$ is
made true by at least 50\% of the truth evaluations.}
\prob Show that $\{ \disj, \neg \}$ is an adequate set for propositional
logic.
\prob The definition of the logical connective \dex{nor} ( $\nor$ )
is given by
the following truth table:
$$
\begin{array}{ccc}
\theta & \psi & (\theta \nor \psi) \\
T&T&F \\
T&F&F \\
F&T&F \\
F&F&T
\end{array}
$$
Show that $\{ \nor \}$ is an adequate set for propositional logic.
\prob (Sheffer) Find another binary connective that is adequate all
by itself.
\prob Show that $\{\neg\}$ is not adequate.
\prob Show that $\{ \disj \}$ is not adequate.
\prob How many binary logical connectives are there? We assume two
connectives are the same if they have the same truth table.
\prob Show that there are exactly two binary logical connectives that are
adequate
all by themselves. Two logical connectives are the same iff they have the same
truth table.
\prob Suppose $\pp=\{P_1,P_2,\ldots,P_n\}$. How many propositional
sentences (up to logical equivalence) are there in this language?
\prob Show that every propositional sentence is equivalent to a sentence in
\dex{disjunctive normal form}, i.e. a disjunction of conjunctions
of atomic sentences or negations of atomic sentences:
$$\disj_{i=1}^m \bigl(\conj_{j=1}^{k_i}\theta_{ij}\bigr)$$
where each $\theta_{ij}$ is atomic or the negation of an atomic sentence.
The expression $\disj_{i=1}^n \psi_i$ abbreviates
$(\psi_1\disj(\psi_2\disj(\cdots\disj(\psi_{n-1}\disj\psi_{n})\cdots)))$.
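One standard construction reads the normal form off the truth table: enumerate the evaluations that make the sentence true and write one conjunction of literals per row. A sketch (function names mine, with the sentence given as a Boolean function of the assignment):

```python
from itertools import product

def dnf_rows(truth_function, ps):
    """Return, for each assignment of the letters ps that makes
    truth_function true, the list of literals that describes that row;
    the disjunction of these conjunctions is a DNF for the sentence."""
    rows = []
    for vals in product([True, False], repeat=len(ps)):
        e = dict(zip(ps, vals))
        if truth_function(e):
            rows.append([p if e[p] else 'not ' + p for p in ps])
    return rows
```

Note that a contradiction yields no rows at all; that degenerate case needs separate treatment in the problem.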
\bigskip
In the following definitions and problems $\Sigma$ is a set of propositional
sentences in some fixed language and all sentences are assumed to be in this
same fixed language.
$\Sigma$ is \dex{realizable} iff there exists a truth evaluation $e$ such that
for all $\theta \in \Sigma$, $e(\theta)=T.$
$\Sigma$ is \dex{finitely realizable}
iff every finite subset of $\Sigma$ is realizable.
$\Sigma$ is \dex{complete} iff for every sentence $\theta$ in the language of
$\Sigma$ either $\theta$ is in $\Sigma$ or $\neg\theta$ is in $\Sigma$.
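For a finite $\Sigma$ in finitely many letters, realizability can be decided by exhausting the truth evaluations. Here is a sketch (an illustration device of mine, not the official syntax: sentences are Boolean functions of the assignment):

```python
from itertools import product

def realizable(sigma, ps):
    """Brute force: is there a truth evaluation of the letters ps
    making every sentence in sigma true?"""
    for vals in product([True, False], repeat=len(ps)):
        e = dict(zip(ps, vals))
        if all(theta(e) for theta in sigma):
            return True
    return False
```

For infinite $\Sigma$ no such search is possible, which is exactly where finite realizability and the compactness theorem below come in.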
\medskip
\prob Show that if $\Sigma$ is finitely realizable and $\theta$ is
any sentence
then either
$\Sigma \cup \{ \theta \}$ is finitely
realizable or $\Sigma \cup \{ \neg\theta \}$
is finitely realizable.
\prob Show that if $\Sigma$ is finitely realizable and $(\theta \disj \psi)$
is in $\Sigma$, then either
$\Sigma \cup \{ \theta\}$ is finitely realizable or
$\Sigma \cup \{ \psi \}$ is finitely realizable.
\prob Show that if $\Sigma$ is finitely realizable
and complete and if $\theta$ and
$(\theta \implies \psi)$
are both in $\Sigma$, then $\psi$ is in $\Sigma$.
\prob Show that if $\Sigma$ is finitely realizable
and complete, then $\Sigma$ is
realizable.
\prob Suppose that the set of all sentences in our language is countable,
e.g.,
${\ss} =\{ \theta_n : n = 0,1,2,\ldots \}$. Show that if $\Sigma$ is finitely
realizable, then there
exists a complete finitely
realizable $\Sigma^\prime $ with $\Sigma\subseteq \Sigma^\prime$.
\prob ({\bf Compactness theorem for propositional logic})
Show that every finitely realizable $\Sigma$ is realizable. You may assume
there are only countably many sentences in the language.
\bigskip
A family of sets $\famc$ is a \dex{chain} iff for any $X,Y$ in $\famc$
either
$X \subseteq Y$ or $Y \subseteq X$.
The union of the family $\fama$ is
$$\bigcup \fama = \{ b : \exists c \in \fama ,b \in c \}.$$
$M$ is a \dex{maximal} member of a family $\fama$ iff $M \in \fama$ and
for every $B$ if
$B \in \fama$ and $M \subseteq B$, then $M=B$.
A family of sets $\fama$ is closed under the unions of chains iff
for every subfamily, $\famc$,
of $\fama$ which is a chain the union of the chain, $\bigcup\famc$,
is also a member of $\fama$.
\par\medskip
{\bf Maximality Principle:} Every family of sets closed
under the unions of
chains has a maximal member.
\par\medskip
\prob Show that the family of finitely realizable $\Sigma$ is closed under
unions of chains.
\prob Show how to prove the compactness theorem without the assumption
that there are only countably many sentences. (You may use the Maximality
Principle.)
\prob Suppose $\Sigma$ is a set of sentences and $\theta$ is some sentence
such that for every
truth evaluation $e$ if $e$ makes all sentences in $\Sigma$ true, then
$e$ makes
$\theta$ true.
Show that for some
finite $\{ \psi_1, \psi_2, \psi_3,\ldots, \psi_n \}\subseteq \Sigma$
the sentence
$$ ( \psi_1 \conj \psi_2 \conj \psi_3 \conj\cdots \conj \psi_n )
\implies \theta$$
is a validity.
\par\bigskip
\noindent A \dex{binary relation} $R$ on a set $A$ is a subset of $A \times A$.
Often
we write $xRy$ instead of $\langle x,y\rangle\in R$.
A binary relation $\leq$ on a set $A$ is a \dex{partial order} iff
\par a. (reflexive) $\forall a \in A \; a\leq a$;
\par b. (transitive) $ \forall a,b,c \in A \;[ ( a \leq b \conj b \leq c)
\implies a \leq c]$; and
\par c. (antisymmetric) $\forall a,b\in A\;[ (a \leq b \conj b \leq a)
\implies a=b]$.
\noindent Given a partial order $\leq$ we define the \dex{strict order}
$<$ by
$$x < y \iff (x \leq y \conj x \not= y)$$
\noindent A binary relation $\leq$ on a set $A$ is a \dex{linear order}
iff $\leq$
is a partial order and
\par d. (total) $\forall a,b \in A (a \leq b \disj b \leq a)$.
A binary relation $R$ on a set $A$ extends a binary relation
$S$ on $A$ iff $S \subseteq R$.
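For a finite set $A$ the order axioms above can be checked mechanically. A sketch (function names mine), representing a relation as a set of pairs:

```python
def is_partial_order(A, leq):
    """Check reflexivity, transitivity, and antisymmetry of the
    relation leq (a set of pairs) on the finite set A."""
    R = set(leq)
    reflexive = all((a, a) in R for a in A)
    transitive = all((a, c) in R
                     for (a, b) in R for (b2, c) in R if b == b2)
    antisymmetric = all(a == b for (a, b) in R if (b, a) in R)
    return reflexive and transitive and antisymmetric

def is_linear_order(A, leq):
    """A partial order that is also total."""
    return is_partial_order(A, leq) and \
        all((a, b) in leq or (b, a) in leq for a in A for b in A)
```

Divisibility on $\{1,2,3,4\}$, for example, is a partial order that is not linear, since $2$ and $3$ are incomparable.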
\medskip
\prob Show that for every finite set $A$ and partial order $\leq$ on $A$
there exists a
linear order $\leq ^*$ on $A$ extending $\leq$.
\prob Let $A$ be any set and let our set of atomic sentences $\pp$ be:
$$\pp= \{ P_{ab} : a,b \in A \}$$
For any truth evaluation $e$ define $\leq_e$ to be the binary relation
on $A$ defined by
$$ a \leq_e b \mbox{ iff } e(P_{ab})=T. $$
Construct a set of sentences $\Sigma$ such that for every truth
evaluation $e$,
\centerline{$e$ makes $\Sigma$ true iff $\leq_e$ is a linear order on $A$.}
\prob Without assuming the set $A$ is finite prove
for every partial order $\leq$ on $A$
there exists a linear order $\leq ^*$ on $A$ extending $\leq$.
\com{This one can be proved directly from the maximality principle; the
others I don't know about.}
\medskip
In the next problems $n$ is an arbitrary positive integer.
\par\medskip
\prob If $X \subseteq A$ and $R$ is a binary relation on $A$ then the
restriction of $R$ to $X$ is the binary relation
$S= R\cap (X \times X)$.
For a partial order $\leq$ on $A$, a set $B \subseteq A$ is an $\leq$-chain
iff the
restriction of $\leq$ to $B$ is a linear order.
Show that given a partial order $\leq$ on $A$:
the set $A$ is the union of fewer than $n$ $\leq$-chains
iff
every finite subset of $A$ is the union of fewer than $n$ $\leq$-chains.
\prob A partial order $\leq$ on a set $A$ has \dex{dimension}
less than $n+1$ iff
there exists $n$ linear
orders $\{ \leq_1, \leq_2, \leq_3,\ldots,\leq_{n} \}$ on $A$ (not
necessarily distinct) such that:
$$ \forall x,y \in A\; [ x\leq y \mbox{ iff }
( x\leq_i y \mbox{ for } i=1,2,\ldots,n) ].$$
Show that a partial order $\leq$ on a set $A$ has dimension less than $n+1$
iff for every
finite $X$ included in $A$ the restriction of $\leq$ to
$X$ has dimension less than $n+1$.
\prob
A binary relation $E$ (called the edges) on a set $V$ (called the vertices)
is a \dex{graph} iff
\par a. (irreflexive) $\forall x \in V \neg xEx $; and
\par b. (symmetric) $\forall x,y \in V \, (xEy \implies yEx)$.
\par\noindent We say $x$ and $y$ are adjacent iff $xEy$.
$(V^\prime ,E^\prime )$ is a subgraph of $(V,E)$ iff
$V^\prime\subseteq V$ and $E^\prime$ is the restriction of $E$ to $V^\prime$.
For a graph $(V,E)$ an $n$ coloring is a map
$c : V \to \{ 1,2,\ldots,n \}$ satisfying
$\forall x,y \in V (xEy \implies c(x) \not= c(y))$,
i.e. adjacent vertices have different colors.
A graph $(V,E)$ has \dex{chromatic number} $\leq n$ iff there is an $n$
coloring on its vertices.
Show that a graph has chromatic number $\leq n$ iff
every finite subgraph of it has
chromatic number $\leq n$.
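On a finite graph the definition of an $n$ coloring can be tested by brute force over all maps into the color set. A sketch (function names mine):

```python
from itertools import product

def is_n_coloring(E, c):
    """c maps vertices to colors; adjacent vertices must get
    different colors."""
    return all(c[x] != c[y] for (x, y) in E)

def chromatic_number_at_most(V, E, n):
    """Brute force: does some map V -> {1,...,n} give an n coloring?"""
    V = list(V)
    return any(is_n_coloring(E, dict(zip(V, colors)))
               for colors in product(range(1, n + 1), repeat=len(V)))
```

A triangle, for instance, has chromatic number $3$: no $2$ coloring works, but a $3$ coloring does.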
\prob A triangle in a graph $(V,E)$ is a set $\Delta=\{a,b,c\} \subseteq V$
such that $aEb$, $bEc$, and $cEa$. Suppose that every finite subset of
$V$ can be partitioned into $n$ or fewer sets none of which contain a
triangle.
Show that $V$ is the union of $n$ sets none of
which contain a triangle.
\prob (Henkin) A \dex{transversal} for a family of sets $\famf$ is a
one-to-one choice function.
That is, a one-to-one function $f$ with domain $\famf$ such that for every
$x\in \famf$, $f(x)\in x$. Suppose that $\famf$ is a family of
finite sets such that for
every finite $\famf^\prime \subseteq \famf, \famf^\prime $ has a
transversal.
Show that $\famf$ has a
transversal. Is this result true if $\famf$ contains infinite sets?
\prob Let $\famf$ be a family of subsets of a set $X$. We say
that $\famc\subseteq \famf$
is an \dex{exact cover}
of $Y \subseteq X$ iff every element of $Y$ is in a unique
element of $\famc$. Suppose that every element of $X$ is in at most
finitely many
elements of $\famf$. Show that there exists an exact cover
$\famc\subseteq \famf$ of $X$
iff for every finite $Y\subseteq X$ there exists
$\famc\subseteq \famf$ an exact
cover
of $Y$. Is it necessary that every element of $X$ is in at most finitely many
elements of $\famf$?
\com{Use $P_a$ to say ``$a\in \famc$'', so $\Sigma$ contains
$\disj_{x\in a} P_a$ for each $x\in X$ and $\neg(P_a\conj P_b)$ if
$a$ and $b$ are not disjoint. To verify finite satisfiability: given $Y$
finite, get a finite $Z\supseteq Y$ which witnesses the non-disjointness of
the elements of $\famf$ which hit $Y$; then, given an exact cover of $Z$, the
elements which
hit $Y$ are a disjoint exact cover of $Y$.}
\prob If $\famf$ is a family of subsets of $X$ and $Y\subseteq X$
then we say $Y$
\dex{splits} $\famf$ iff for any $Z\in \famf,$
$ Z \cap Y $ and $Z \setminus Y$
are both nonempty. Prove that if $\famf$ is a family of finite
subsets of $X$
then
$\famf$ is split by some $Y\subseteq X$ iff every finite
$\famf^\prime \subseteq \famf$
is split by some $Y\subseteq X$. What if $\famf$ is allowed to
have infinite sets
in it?
\prob Given a set of students and set of classes, suppose each student
wants one of a finite set of classes, and each class has a finite
enrollment limit. Show that if each finite set of students can
be accommodated, they all can be accommodated.
\prob Show that the compactness theorem of propositional logic is
equivalent to the statement that for any set $I$, the space $2^I$, with
the usual Tychonoff product topology, is compact, where $2=\{0,1\}$ has
the discrete topology. (You should skip this problem if you do not know what
a topology is.)
\sector{The Axioms of Set Theory}
Here are some. The whole system is known as \dex{ZF} for Zermelo-Fraenkel
set theory. When the axiom of choice is included it is denoted
\dex{ZFC}. It was originally developed by Zermelo to
make precise what he meant when he said that the
well-ordering principle follows from the
axiom of choice. Later Fraenkel added the axiom of replacement.
Another interesting system is GBN, which is G\"{o}del-Bernays-von Neumann
set theory.
\medskip
\ax Empty Set. $$ \exists x\, \forall y ( y \notin x )$$
The empty set is usually written $\emptyset$. \sdex{$?emptyset$}
\ax Extensionality.
$$ \forall x \forall y ( x = y \iff \forall z ( z \in x \iff z\in y)) $$
Hence there is only one empty set.
\ax Pairing. $$ \forall x \forall y \exists z \forall u
( u \in z \iff u = x \disj u = y ) $$
We usually write $z = \{ x,y \}$. \sdex{$z = ?{ x,y ?}$}
\ax Union. $$ \forall x\, \exists y\, \forall z\, ( z \in y \iff
\exists u\, ( u \in x \conj z \in u ))$$
We usually write $y = \cup x$. \sdex{$y = \cup x $} $A \cup B$
abbreviates $ \cup \{ A,B \}$. \sdex{$A \cup B$}
$z \subseteq x $ is an abbreviation
for $ \forall u \, (u\in z \implies u\in x)$. \sdex{$z ?subseteq x $}
\ax Power Set. $$ \forall x \, \exists y \, \forall z
( z \in y \iff z \subseteq x ) $$
We usually write $y = P(x)$. \sdex{$y = P(x)$}
For any set $x$, $x + 1 = x \cup \{x\}$. \sdex{$x + 1 = x ?cup ?{x?}$}
\ax Infinity. $$\exists y \, (\emptyset\in y \conj \forall x
( x \in y \implies x + 1 \in y )) $$
The smallest such $y$ is denoted $\omega$, \sdex{$?omega$} so
$\omega=\{0,1,2,\ldots\}$.
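The successor operation $x+1=x\cup\{x\}$ can be imitated with Python frozensets. In this sketch (names mine) the von Neumann natural $n$ comes out as a set with exactly $n$ elements, and each earlier natural is both an element and a proper subset of every later one.

```python
def succ(x):
    """The set-theoretic successor: x + 1 = x ∪ {x}."""
    return frozenset(x) | {frozenset(x)}

def nat(n):
    """The von Neumann natural n = {0, 1, ..., n-1}, built by
    iterating succ starting from the empty set."""
    x = frozenset()
    for _ in range(n):
        x = succ(x)
    return x
```

For example, `nat(3)` is $\{\emptyset,\{\emptyset\},\{\emptyset,\{\emptyset\}\}\}$, i.e. $\{0,1,2\}$.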
\ax Comprehension Scheme.
$$ \forall z \exists y \forall x [ x \in y \iff
( x \in z \conj \theta(x) ) ]$$
The comprehension axiom is being invoked when we say given $z$ let
$$y = \{ x \in z : \theta(x) \}.$$ \sdex{$y = ?{ x ?in z : ?theta(x) ?}$}
The formula $\theta$ may refer to $z$ and
to other sets, but not to $y$. In general given a formula $\theta(x)$
the family $\{x:\theta(x)\}$ is referred to as a \dex{class}; it may not
be a set. For example, the class of all sets is
$${\bf V}=\{x:x=x\}.$$ \sdex{$V$}
Classes that are not sets are referred to as
\dex{proper classes}.
Every set $a$ is a class, since the formula
``$x\in a$'' defines it.
The comprehension axioms say that the intersection
of a class and a set is a set. We use {\bf boldface} characters to
refer to classes.
\bigskip
\prob Define $X\cap Y$,
$X\setminus Y$, and
$\bigcap X$ and show they exist. \sdex{$X?cap Y$}
\sdex{$X?setminus Y$} \sdex{$?bigcap X$}
\prob The \dex{ordered pair} is defined by
\sdex{$?langle x,y?rangle$}
$${\langle x,y\rangle =\{\{x\},\{x,y\}\}}.$$
Show it exists.
Show that $\langle x,y\rangle = \langle u,v\rangle$ iff $x = u$ and $y = v$.
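The Kuratowski definition can be played with directly using nested frozensets (a sketch, names mine). Note how $\langle x,x\rangle$ collapses to $\{\{x\}\}$, which is why one direction of the problem needs a little care.

```python
def pair(x, y):
    """The Kuratowski ordered pair <x,y> = {{x},{x,y}}."""
    return frozenset({frozenset({x}), frozenset({x, y})})
```

Unlike the unordered pair $\{x,y\}$, this construction distinguishes `pair(1, 2)` from `pair(2, 1)`.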
\prob The \dex{cartesian product} is defined by
\sdex{$X ?times Y$}
$$ X \times Y =\{ \langle x,y\rangle : x \in X \mbox{ and } y \in Y\}.$$
Show it exists.
\prob A function is identified with its graph. For any sets $X$ and $Y$ we let
$Y^X$ be the set of all functions with domain $X$ and
range contained in $Y$. \sdex{$Y^X$} Show this
set exists.
\prob Given a function $f:A\to B$ and a set $C\subseteq A$ the
\dex{restriction} of $f$ to $C$, written $f\res C$
is the function
with domain $C$
and equal to $f$ everywhere in $C$. \sdex{$f?res C$} Show that it exists.
$f^{\prime\prime}C$ is the set
of all elements of $B$ that are in the image of $C$.
\sdex{$f^{\prime\prime}C$} Show that it exists.
\prob Prove $\omega$ exists (i.e.
that there does exist a smallest such $y$).
Prove for any formula $\theta(x)$ if $\theta(0)$ and
$\forall x\in\omega (\theta(x)\implies \theta(x+1))$, then
$\forall x\in\omega\;\theta(x)$.
\prob Suppose $G:Z\to Z$.
Show that for any $x\in Z$ there exists a unique $f:\omega\to Z$
such that $f(0)=x$ and for all $n\in\omega\;\; f(n+1)=G(f(n))$.
\prob Let $(V,E)$ be a graph. Informally, two vertices in any graph
are \dex{connected} iff (either they are the same or) there is a
finite path using the edges of the graph connecting one to the other.
Use the preceding problem to formally define and prove that
the relation $x$ is connected to $y$, written $x\sim y$,
exists and is an equivalence relation on $V$.
Equivalence classes of $\sim$ are called the
\dex{components} of the graph.
\prob Let $A$ and $B$ be disjoint sets and $f:A\to B$ and $g:B\to A$
be one-to-one functions. Consider the \dex{bipartite graph} $V$ which
has vertices $A\cup B$ and edges given by the union of the graphs of
$f$ and $g$, i.e., there is an edge between $a\in A$ and $b\in B$ iff
either $f(a)=b$ or $g(b)=a$.
Describe what the finite components of $V$ must look like as a
subgraph of $V$.
Describe the infinite components of $V$.
\prob Define $| X | = | Y |$ iff
there is a one-to-one onto map from $X$ to $Y$. \sdex{$| X | = | Y |$}
We say $X$ and $Y$ have the same cardinality.
Define ${| X | \leq | Y |}$ iff
there is a one-to-one map
from $X$ to $Y$. \sdex{$| X | ?leq | Y |$}
Define ${| X | < |Y |}$ iff $ | X | \leq | Y |$ and $ | X | \not= | Y |$.
\sdex{$| X | < |Y |$}
(Cantor-Shr\"oder-Bernstein)
Show that if $| A | \leq | B |$ and $| B | \leq | A |$ then
$| A | = | B |$.
\bigskip
\prob Show that $|A|\leq |A|$.
Show that if $| A | \leq | B |$ and $| B | \leq | C |$ then
$| A | \leq | C |$.
\prob Show that
$|P(X)| = |\{ 0,1\}^X|$.
\prob (Cantor) Show that $| X | < | P(X) |$.
\prob Show that the class of sets, ${\bf V}$, is not a set.
\prob Show that $|A \times ( B \times C )| = | ( A \times B ) \times C |$.
\prob Show that $| A^{ B \times C}| = | {( A^B)}^C |$.
\prob Show that if there is a function $f:A\to B$ that is onto,
then $|B|\leq |A|$.
\footnote{This requires the axiom of choice. It is open if it is equivalent
to AC.}
\bigskip
A set is finite iff it can be put into one-to-one correspondence with
an element of $\omega$.
A set is \dex{countable} iff it is either finite or of the same cardinality
as $\omega$. A set is \dex{uncountable} iff it is not countable.
${{\rr}}$ is the set of real numbers and we use
$\cc = |\rr|$
to denote its cardinality which is also called
the cardinality of the continuum. \sdex{${?rr}$} \sdex{$?cc$}
Below, you may use whatever set theoretic
definitions of the integers, rationals and real numbers that you know.
For example, you may regard the reals as either Dedekind cuts in
the rationals, equivalence classes of Cauchy sequences of rationals,
infinitely long decimals, or points on a line.
\bigskip
\prob Show that the set of integers $\zz$
is countable.\sdex{${?zz}$}
\prob Show that the set of odd positive integers is countable.
\prob Show that the set of points in the plane with integer coordinates is
countable.
\prob Show that the countable union of countable sets is countable.
\footnote{Do you think you needed to use the Axiom of Choice?}
\prob
For any set $X$ let $[X]^{<\omega}$
be the finite subsets of $X$. \sdex{$[X]^{<?omega}$}
Show that if $X$ is countable, then so is $[X]^{<\omega}$.
\bigskip
A linear order $(L,\leq)$ is a \dex{well ordering} iff every nonempty
subset of $L$ has a $\leq$-least element.
We use $>$ to refer to the
converse order, i.e. $x > y$ iff $y < x$.
\bigskip
\prob Let $( L, \leq )$ be a well ordering. Let $(L\times L,\leq^\prime)$
be defined in one of the following ways:
\par a. $(x,y) \leq^\prime (u,v)$ iff $x < u$ or ($x = u$ and $y \leq v$)
\par b. $(x,y) \leq^\prime (u,v)$ iff $x \leq u$ and $y \leq v$
\par c. $(x,y) \leq^\prime (u,v)$ iff $\max\{x,y\} < \max\{u,v\}$ or
[$\max\{x,y\} = \max\{u,v\}$ and ($x < u$ or ($x = u$ and $y \leq v$))].
\par\noindent Which are well-orderings?
\prob Prove:
Let $( L, \leq )$ be any well-ordering and $f: L \to L$ an
increasing function
($\forall x,y \in L\;( x < y \implies f(x) < f(y))$). Then
for every $x$ in $L$, $x \leq f(x)$.
\prob For two binary relations $R$ on $A$ and $S$ on $B$ we write
$(A,R)\isom (B,S)$
iff there exists a one-to-one onto map
$f:A\to B$
such that
\centerline{for every $x,y$ in $A$ ($xRy$ iff $f(x)Sf(y)$).}
\noindent Such a map is called an
\dex{isomorphism}. \sdex{$(A,R)?isom (B,S)$}
If $(L_1,\leq_1 )$ and $(L_2,\leq_2)$ are well-orders and
$(L_1,\leq_1)\isom (L_2,\leq_2)$, then
show that the isomorphism is unique. Is this true for all linear orderings?
\prob Let $(L,\leq)$ be a wellorder and for any $a \in L$ let
$L_a = \{ c \in L : c < a \}$. Show that
$( L, \leq )$ is not isomorphic to $( L_a, \leq )$ for any $a \in L$.
\prob (Cantor) Show that for any two wellorders exactly one
of the following occurs:
they are isomorphic, the first is isomorphic to an initial segment
of the second, or the second is isomorphic to an initial segment
of the first.
\com{ Let $(L,\leq)$ and $(K,\leq^\prime)$ be two wellorders.
Let $$G=\{\langle a,b\rangle : ( L_a,\leq ) \isom ( K_b,\leq ^\prime )\}$$
Show that $G$ is the graph of an order preserving bijection whose domain
is all of $L$ or whose range is all of $K$.}
\prob Let $(A,\leq)$ be a linear order such that
$$\forall X\subseteq A (X\isom A \rmor (A\setminus X)\isom A ).$$
Show that $A$ is a well order or an inverse well order.
\prob Show that a linear order $(L,\leq)$ is a wellorder
iff there does not exist an infinite sequence $x_n$ for
$n= 0,1,2,\ldots$ with
$x_{n+1} < x_{n}$ for every $n$. Does this use AC?
\sector{Axiom of Choice}
\bigskip
\noindent (AC) \dex{Axiom of Choice}: For every family $F$ of nonempty sets
there exists a choice function, i.e. a function $f$
with domain $F$ such that for every $x$ in $F$, $f(x) \in x$.
\bigskip
\noindent (WO) \dex{Well-ordering Principle} : Every nonempty set
can be well ordered.
\bigskip
\noindent (TL) \dex{Tukey's Lemma}: Every family of sets
with finite character has a maximal
element. A family of sets $F$ has finite character iff for every set $X$,
$X\in F$ iff for every finite $Y\subseteq X$, $Y\in F$.
\bigskip
\noindent (MP) \dex{Maximality Principle}: Every family of sets closed
under the unions of chains has a maximal member.
\bigskip
\noindent (ZL) \dex{Zorn's Lemma}: Every family of sets contains a
maximal chain.
\par\bigskip
\prob Show that ZL implies MP.
\prob Show that MP implies TL.
\prob Show that TL implies AC.
\prob (Zermelo) Show that AC implies WO.
\com{ Given $X$ use AC to get a function
$f:P(X)\mapsto X$ such that $f(A) \in X\setminus A$
for all $A$ a proper subset of $X$. Call a well ordering
$(L,\leq)$ respectful iff $L\subseteq X$ and for every
$a\in L\; f(L_a)=a$.
Show that the union of all respectful well orderings is a
well ordering of $X$. }
\prob Given a nonempty family $\famf$ let $<$ be a
strict well-ordering of $\famf$. Say that a chain $\famc\subseteq\famf$
is greedy
iff for every $a\in \famf$ if
$$\{ b\in \famc : b < a \} \cup\{a\}$$
is a chain, then $a\in \famc$. Show that the union of all
the greedy chains is a maximal chain in $\famf$.
\sector{Ordinal Arithmetic}
\prob Show for any ordinals $\alpha$, $\beta$, and $\gamma$:
\begin{eqnarray*}
(\alpha>0 \;\conj\; \alpha \beta = \alpha \gamma)
& \implies & \beta =\gamma \\
(\alpha>1\;\conj\; {\beta <\gamma})
& \implies & \alpha^{\beta} < \alpha^{\gamma}
\end{eqnarray*}
\prob For any ordinals $\alpha$ and $\beta>0$ show there exist
unique ordinals
$\gamma$ and $\delta$ such that
$\alpha = \beta \gamma + \delta$ and $\delta<\beta$.
\prob Is the previous problem true for $\alpha=\gamma\beta + \delta$?
\prob Show for any $\beta > 0$ there exist unique ordinals
$\gamma_1,\ldots,\gamma_n, d_1,\ldots,d_n$ such that
$\gamma_1 >\gamma_2 > \cdots >\gamma_n$ ;
$ 0 < d_1,d_2,\ldots,d_n < \omega$ and
$$ \beta = \omega^{\gamma_1}d_1 + \omega^{ \gamma_2}d_2 + \ldots +
\omega^{ \gamma_n}d_n $$
This is called \dex{Cantor normal form}.
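For ordinals below $\omega^\omega$ the exponents in the Cantor normal form are natural numbers, so the normal form can be manipulated directly. The following sketch (the list-of-pairs representation and the name `o_add` are our own illustration, not notation from the text) implements ordinal addition on normal forms and exhibits the familiar asymmetry $1+\omega=\omega\neq\omega+1$.

```python
# Ordinals below omega^omega in Cantor normal form, as descending lists of
# (exponent, coefficient) pairs: omega^2*3 + 5 is [(2, 3), (0, 5)], 0 is [].

def o_add(a, b):
    """Ordinal addition on normal forms: every term of a whose exponent is
    smaller than b's leading exponent is absorbed by b's leading term."""
    if not b:
        return a
    lead = b[0][0]
    a = [term for term in a if term[0] >= lead]   # absorb the small tail of a
    if a and a[-1][0] == lead:                    # merge equal exponents
        return a[:-1] + [(lead, a[-1][1] + b[0][1])] + b[1:]
    return a + b

one, omega = [(0, 1)], [(1, 1)]
assert o_add(one, omega) == omega             # 1 + omega = omega
assert o_add(omega, one) == [(1, 1), (0, 1)]  # omega + 1 > omega
```

The same representation could be used to check the sorting problem above by hand, term by term.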
\prob Sort the following set of five ordinals:
$$\begin{array}{ccc} (\omega^{\omega})(\omega+\omega) &
(\omega+\omega) ( \omega^{\omega} ) &
\omega^{\omega} \omega + \omega^{\omega} \omega\\
\omega \omega^{\omega} +\omega \omega^{\omega} &
\omega^{\omega} \omega+\omega \omega^{\omega}
\end{array} $$
\prob An ordinal $\alpha$ is \dex{indecomposable} iff it satisfies any
of the following.
a) $\exists \beta \,\,\alpha = \omega^{\beta} $
b) $\forall \beta \forall \gamma$ if $\alpha= \beta +\gamma$ then
$\alpha=\beta$ or $\alpha =\gamma$
c) $\forall X \subseteq \alpha \,\,\,[( X,<) \isom (\alpha,<)$ or
$(\alpha\setminus X,<) \isom (\alpha,<)]$
d) $\forall \beta<\alpha\;\;\beta+\alpha=\alpha$
Show they are all equivalent.
\prob (Goodstein) The complete expansion of a positive integer in a base
is gotten by writing
everything possible including exponents and exponents of exponents etc. as
powers of the base. For example the number 36 written in complete base
two is: $$ 2^{( {2^2} + 1)}+2^2$$
The same number in complete base 3 is:
$$ 3^3 + 3^2$$
Let $a_n$ be a sequence described as follows. Given $a_k$, calculate
$a_{k+1}$ by writing $a_k$ in complete base $k$, substituting $k+1$
for every $k$, and then subtracting one.
For example:
$$
a_2 =36=2^{( {2^2} + 1)}+2^2 \implies a_3 =
3^{\Bigl( {3^3} + 1 \Bigr)}+3^3 - 1
= 2.2876\ldots \times 10^{13}
$$
or for example if $a_6 = 6^4 + 2 \cdot 6^3 = 1728$, then
$$
a_7 = ( 7^4 + 2 \cdot 7^3) -1
= 7^4 +7^3 +6\cdot 7^2 + 6\cdot 7^1 + 6 = 3086
$$
Show that given any pair of positive integers $n$ and $m$, if we let
$a_n = m$ then for some $k > n$ we get $a_k = 0$.
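Although termination is a theorem about ordinals and not something a computation can verify, the base-bumping step itself is easy to program. The sketch below (function names are ours) rewrites $a_k$ in complete base $k$, replaces $k$ by $k+1$, and subtracts one; it reproduces both worked examples above.

```python
def bump(n, b):
    """Value of n after writing it in complete base b (exponents rewritten
    recursively as well) and replacing every occurrence of b by b + 1."""
    if n == 0:
        return 0
    total, e = 0, 0
    while n > 0:
        n, digit = divmod(n, b)          # digit at exponent e, digit < b
        total += digit * (b + 1) ** bump(e, b)
        e += 1
    return total

def goodstein_step(a_k, k):
    """Given a_k, compute a_{k+1}: bump the base from k to k+1, subtract 1."""
    return bump(a_k, k) - 1

assert goodstein_step(36, 2) == 3 ** (3 ** 3 + 1) + 3 ** 3 - 1
assert goodstein_step(6 ** 4 + 2 * 6 ** 3, 6) == 3086
```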
\prob Let $S$ be a countable set of ordinals. Show that
$$\{\beta:\exists \langle \alpha_n:n\in\omega\rangle\in S^{\omega}
\;\;\beta=\Sigma_{n<\omega}\alpha_n \}$$
is countable.
($\Sigma_{n<\omega}\alpha_n=sup\{\Sigma_{n<m}\alpha_n : m<\omega\}$.)
\sector{Cardinal Arithmetic}
\prob Show $(\forall n \in \omega \,\, 2^{\aleph_n}=
\aleph_{n+1}) \implies 2^{\aleph_{\omega}}=
(\aleph_{\omega})^{\omega}$.
\prob Show that $({2^{<\kappa}})^{cf(\kappa)}=2^{\kappa}$.
\prob Show $(\forall n \in \omega \,\, 2^{\aleph_n}=
\aleph_{\omega+17})\implies 2^{\aleph_{\omega}}=
\aleph_{\omega+17}$.
\prob Let $\kappa$ be the least cardinal such that
$2^{\kappa}> 2^{\omega}$. Show that $\kappa$ is regular.
\prob Prove that for every infinite regular cardinal $\kappa$, there is
a cardinal $\lambda$ such that $\aleph_{\lambda}=\lambda$ and
$\lambda$ has cofinality $\kappa$.
\prob Show that $cf(2^{<\kappa}) = cf(\kappa)$ or
$cf(2^{<\kappa}) >\kappa$.
\prob Show that if $\omega \leq \lambda \leq \kappa$ then
$(\kappa^{+})^{\lambda}= max\{\kappa^{\lambda},\kappa^{+}\}$.
\prob For any set $X$ \dex{Hartogs' ordinal} $h(X)$ is defined by:
$$ h(X) = sup\{ \alpha \in ORD: \exists {\rm \,onto\,} f: X \to \alpha\}$$
Show without AC that $h$ is well defined and with AC $h(X)=|X|^+$.
Show that AC is equivalent to the statement that for every two sets $X$ and
$Y$ either $|X|\leq|Y|$ or $|Y|\leq|X|$.
\prob Given sets $A_{\alpha} \subseteq \kappa$ of cardinality $\kappa$,
one for each $\alpha <\kappa$, show there exists $X\subseteq\kappa$ such that
$$\forall\alpha <\kappa\,\,\,\,
|A_{\alpha}\cap X|=|A_{\alpha}\setminus X|=\kappa$$
\prob Show there exists $X\subseteq \Bbb R$ which contains the rationals
and the only
automorphism of $(X,\leq)$ is the identity, i.e., any order preserving
bijection from $X$ to $X$ is the identity.
\prob Show there exists $X\subseteq \Bbb R^2$ such that for every $x\in X$
and positive $ r\in \Bbb R$ there is a unique $y\in X$ with
$d(x,y)=r$ where $d$ is Euclidean distance.
\prob (Sierpi\'{n}ski) Show that there exists
$X\subseteq \Bbb R^2$ such that for every line
$L$ in the plane, $|L\cap X|=2$.
\prob (Jech) Without using AC show that $\omega_2$ is not the countable
union of countable sets.
\sector{First Order Logic and the Compactness Theorem}
\begin{center}
Syntax
\end{center}
We begin with the syntax of first order logic. The \dex{logical symbols}
are $\disj,\neg,\exists,=$ and for each $n\in\omega$ a variable symbol
$x_n$. There are also grammatical symbols such as parentheses and commas
that we use to parse things correctly but have no meaning.
For clarity we usually use $x,y,z,u,v,$ etc. to refer to
arbitrary variables.
The \dex{nonlogical symbols} consist of a given set $L$ that
may include operation
symbols, predicate symbols, and constant symbols. The case where $L$ is
empty is referred to as the \dex{language of pure equality}.
Each symbol $s\in L$ has a nonnegative integer $\#(s)$ called its
\dex{arity}
assigned to it.
If $\#(s)=0$, then $s$ is a constant symbol. If $f$ is an operation symbol
and $\#(f)=n$ then $f$ is an $n$-ary operation symbol. Similarly if $R$ is
a predicate symbol and $\#(R)=n$ then $R$ is an $n$-ary predicate symbol.
In addition we always have that ``$=$'' is a logical binary predicate symbol.
For the \dex{theory of groups} the appropriate
language is $L=\{e,\cdot,^{-1}\}$ where ``$e$'' is a constant symbol,
so $\#(e)=0$,
``$\cdot$'' is a binary operation symbol, so $\#(\cdot)=2$, and
``$^{-1}$''
is a unary operation symbol, so $\#( ^{-1})=1$.
For the \dex{theory of partially ordered sets} we have that
$L=\{\leq\}$ where $\leq$ is a binary relation symbol, so $\#(\leq)=2$.
Our next goal is to define what it means to be a formula
of first order logic.
Let $L$ be a fixed language. An expression is a finite string of
symbols that are either logical symbols or symbols from $L$.
The set of \dex{terms} of $L$
is the smallest set of expressions that contains the variables and
constant symbols of $L$ (if any) and is closed under the formation rule:
if $t_1,t_2,\ldots,t_n$ are terms of $L$ and $f$ is an $n$-ary
operation symbol of $L$, then $t=f(t_1,t_2,\ldots,t_n)$ is a term of $L$.
If $L$ has no function symbols then the only terms of $L$ are the
variables and constant symbols. So for example if $c$ is a constant symbol,
$f$ is a 3-ary operation symbol, $g$ is a binary operation symbol, and
$h$ is a unary operation symbol, then
$$h(f(g(x,h(y)),y,h(c)))$$
is a term.
The set of \dex{atomic formulas} of $L$ is the set of all expressions of the
form $R( t_1,t_2,\ldots,t_n)$, where $t_1,t_2,\ldots,t_n$ are terms
of $L$ and $R$ is an $n$-ary predicate symbol of $L$. Since we
always have equality as
a binary relation we always have atomic formulas of the form
$t_1=t_2$.
The set of \dex{formulas} of $L$ is
the smallest set of expressions that includes
the atomic formulas and is closed under the formation rule:
if $\theta$ and $\psi$ are $L$ formulas and $x$ is any variable, then
\begin{itemize}
\item $(\theta \disj \psi)$,
\item $\neg\theta$, and
\item$\exists x \,\theta$
\end{itemize}
are $L$
formulas. We think of other logical connectives as being abbreviations,
e.g.,
\begin{itemize}
\item $(\theta \conj\psi)$ abbreviates $\neg(\neg\theta \disj \neg\psi)$,
\item $(\theta \implies\psi)$ abbreviates $(\neg\theta \disj \psi)$,
\item $\forall x \,\theta $ abbreviates $\neg\exists x \,\neg\theta$,
and so forth.
\end{itemize}
We often add and sometimes drop parentheses to improve readability.
Also we write $x\not=y$ for the formally correct but harder to
read $\neg x=y$.
It is common practice to write symbols not only in \dex{prefix} form as above
but also in \dex{postfix} and \dex{infix} forms.
For example in our example of
group theory instead of writing the term $\cdot(x,y)$ we usually write it
in infix form $x \cdot y$, and $^{-1}(x)$ is usually written
in postfix form $x^{-1}$. Similarly in the language of partially ordered sets
we usually write
$x\leq y$ instead of the prefix form $\leq(x,y)$. Binary relations
such as partial
orders and equivalence relations are most often written in infix form.
We regard the more natural forms we write as abbreviations of the more
formally correct prefix notation.
Next we want to describe the syntactical concept of \dex{substitution}.
To do so we must first describe what it means for an occurrence of a
variable $x$ in a formula $\theta$ to be \dex{free}. If an occurrence of a
variable $x$ in a formula $\theta$ is not free it is said to be \dex{bound}.
Example:
$$ (\exists x\,\, x = y \disj x = f(y) )$$
Both occurrences of $y$ are free, the first two occurrences of $x$ are bound,
and the last occurrence of $x$ is free. In the formula:
$$ \exists x\,\, (x = y \disj x = f(y) )$$
all three occurrences of $x$ are bound.
Formally we proceed as follows. All occurrences of variables
in an atomic formula are free. The free occurrences of $x$ in $\neg\theta$
are exactly the free occurrences of $x$ in $\theta$.
The free occurrences
of $x$ in $(\theta\disj\psi)$ are exactly the free occurrences of $x$ in
$\theta$ and in $\psi$. If $x$ and $y$ are distinct variables, then the free
occurrences of $x$ in $\exists y\,\,\theta$ are exactly the free occurrences
of $x$ in $\theta$. And finally no occurrence of $x$ in $\exists x\,\,\theta$
is free. This gives the inductive definition of free and bound variables.
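The inductive clauses above translate directly into a recursion on formulas. In the sketch below (the nested-tuple encoding of terms and formulas is our own device, not the text's), `free_vars` returns the set of variables with at least one free occurrence, and the two example formulas above come out as expected.

```python
# Encoding (our own):
#   terms:    ('var', 'x'), ('const', 'c'), ('fn', 'f', t1, ..., tn)
#   atomic:   ('rel', 'R', t1, ..., tn)   (equality is ('rel', '=', s, t))
#   formulas: ('or', p, q), ('not', p), ('exists', 'x', p)

def free_vars(phi):
    """Variables with at least one free occurrence, by the inductive clauses:
    all variables of an atomic formula are free; negation and disjunction
    pass freeness through; 'exists x' binds x."""
    tag = phi[0]
    if tag == 'var':
        return {phi[1]}
    if tag == 'const':
        return set()
    if tag in ('fn', 'rel'):
        return set().union(*(free_vars(t) for t in phi[2:]))
    if tag == 'not':
        return free_vars(phi[1])
    if tag == 'or':
        return free_vars(phi[1]) | free_vars(phi[2])
    if tag == 'exists':
        return free_vars(phi[2]) - {phi[1]}

x_eq_y = ('rel', '=', ('var', 'x'), ('var', 'y'))
x_eq_fy = ('rel', '=', ('var', 'x'), ('fn', 'f', ('var', 'y')))
assert free_vars(('or', ('exists', 'x', x_eq_y), x_eq_fy)) == {'x', 'y'}
assert free_vars(('exists', 'x', ('or', x_eq_y, x_eq_fy))) == {'y'}
```

In this encoding a sentence is simply a formula $\theta$ with `free_vars(theta) == set()`.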
We indicate that $x$ may occur freely in $\theta$ by
writing $\theta(x)$. If $c$ is a constant symbol the formula $\theta(c)$
is gotten by substituting $c$ for all free occurrences (if any) of
$x$ in $\theta$. For example: if $\theta(x)$ is
$$\exists y\; (y=x\conj\forall x\; x=y),$$
then $\theta(c)$ is
$$\exists y\; (y=c\conj\forall x\;x=y).$$
We usually write $\theta( x_1,x_2,\ldots,x_n)$ to indicate that the free
variables of $\theta$ are amongst the $x_1,x_2,\ldots,x_n$.
A formula is called a \dex{sentence} if no variable occurs freely in it.
\begin{center}
Semantics
\end{center}
Our next goal is to describe the semantics of first order logic.
A \dex{structure} $\moda$ for the language $L$ is a pair consisting
of a set $A$
called the universe of $\moda$ and an assignment or interpretation
function from the nonlogical symbols
of $L$ to individuals, relations, and functions on $A$. Thus
\begin{itemize}
\item for each
constant symbol $c$ in $L$ we have an assignment $c^{\moda}\in A$,
\item for each $n$-ary operation symbol $f$ in $L$ we have a function
$f^{\moda}: A^n\to A$, and
\item for each $n$-ary predicate symbol $R$ we
have a relation $R^{\moda}\subseteq A^n$.
\end{itemize}
The symbol $=$ is always
interpreted
as the binary relation of equality, which is why we consider it a
logical symbol, i.e., for any structure $\moda$ we have
$=^{\moda}$ is $\{(x,x):x\in A\}$. We use the word structure and
\dex{model} interchangeably.
For example, suppose $L$ is the language of group theory.
One structure for
this theory is
$$\modq=({\Bbb Q},+,-x,0)$$
where
\begin{itemize}
\item the universe is the rationals,
\item $\cdot^{\modq}$ is ordinary addition of rationals,
\item $^{-1^{\modq}}$ is the function which takes each rational
$r$ to $-r$, and
\item $e^{\modq}=0$.
\end{itemize}
Another structure in this language is
$$\modr=({\Bbb R}^+,\times,{1 \over x},1)$$
where
\begin{itemize}
\item the universe is the set of positive real numbers,
\item $\cdot^{\modr}$ is multiplication $\times$,
\item $^{-1^{\modr}}$ is the function which takes $x$ to $1 \over x$, and
\item $e^{\modr}=1$.
\end{itemize}
Another example is the group $S_n$ of permutations.
Here $\cdot^{S_n}$ is composition of functions,
$^{-1^{S_n}}$ is the function
which takes each permutation to its inverse, and $e^{S_n}$ is the identity
permutation. Of course there are many examples of structures in this
language which are not groups.
For another example, the language of partially ordered sets is
$L=\{\leq\}$ where $\leq$ is a binary relation symbol. The
following are all $L$-structures which happen to be partial orders:
\begin{itemize}
\item $({\Bbb R},\{(x,y)\in {\Bbb R}^2: x\leq y\})$,
\item $({\Bbb Q},\{(x,y)\in {\Bbb Q}^2: x\geq y\})$, and
\item $({\Bbb N},\{(x,y)\in {\Bbb N}^2: x\mbox{ divides }y\})$.
\end{itemize}
For any nonempty set
$A$ and $R\subseteq A^2$, $(A,R)$ is an $L$-structure. If in addition
the relation $R$ is transitive, reflexive, and antisymmetric, then
$(A,R)$ is a partial order.
\begin{center}
${\moda\models\theta}$
\end{center}
Next we define what it means for an $L$ structure $\moda$ to model or
satisfy
an $L$ sentence $\theta$ (written ${\moda\models\theta}$).
\sdex{$?moda?models?theta$}
For example,
$$({\Bbb Q},+,0)\models \forall x \exists y \; x\cdot y=e,$$
because for all $p\in {\Bbb Q}$ there exists $q\in {\Bbb Q}$ such that
$p+q=0$.
Usually it is not the case that every element of a model has a
constant symbol which names it. But suppose this just happened to
be the case, i.e., suppose that for every $a\in A$ there is a
constant symbol $c_a$ in the language $L$ so that
$c_a^{\moda}=a$.
The interpretation function can be extended to the variable free terms
of $L$ by the rule:
$$ (f( t_1,t_2,\ldots,t_n))^{\moda} =
f^{\moda}( t_1^{\moda},t_2^{\moda},\ldots,t_n^{\moda})$$
Hence for each variable free term $t$ we get an interpretation
$t^{\moda}\in A$. For example, if $L=\{S,c\}$ where $S$ is
a unary operation symbol and $c$ is a constant symbol, and
${\goth Z}$ is the $L$-structure with universe ${\Bbb Z}$ and
where $S^{\goth Z}(x)=x+1$ and $c^{\goth Z}=0$, then
$S(S(S(S(c))))^{\goth Z}=4$.
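Extending the interpretation function to variable free terms is a one-line recursion. The sketch below (Python; the tuple encoding of terms and the name `interp_term` are our own, not the text's) recomputes the $S(S(S(S(c))))^{\goth Z}=4$ example.

```python
# Variable free terms as nested tuples (our own encoding):
# ('const', 'c') for a constant symbol, ('fn', 'f', t1, ..., tn) for f(t1,...,tn).

def interp_term(t, model):
    """Compute t^A by the rule (f(t1,...,tn))^A = f^A(t1^A, ..., tn^A)."""
    if t[0] == 'const':
        return model[t[1]]
    return model[t[1]](*(interp_term(s, model) for s in t[2:]))

# The structure Z: universe the integers, S^Z(x) = x + 1 and c^Z = 0.
Z = {'c': 0, 'S': lambda x: x + 1}
SSSSc = ('fn', 'S', ('fn', 'S', ('fn', 'S', ('fn', 'S', ('const', 'c')))))
assert interp_term(SSSSc, Z) == 4
```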
Our definition of $\models$ is by induction on the \dex{logical complexity}
of the sentence $\theta$, i.e. the number of logical symbols in $\theta$.
\begin{enumerate}
\item $\moda\models R(t_1,\ldots,t_n)$ iff
$( t_1^{\moda},\ldots,t_n^{\moda})\in R^{\moda}$.
\item $\moda\models\neg\theta$ iff not $\moda\models\theta$.
\item $\moda\models(\theta\disj\psi)$ iff $\moda\models\theta$
or $\moda\models\psi$.
\item $\moda\models\exists x\,\theta(x)$ iff there exists a $b$ in
the universe $A$ such that $\moda\models\theta(c_b)$.
\end{enumerate}
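Over a finite universe, clauses 1--4 give a decision procedure for satisfaction. The sketch below (all names and the tuple encoding are our own) implements it; rather than literally adding a constant $c_b$ for every element $b$, it threads an assignment of variables to elements, an equivalent piece of bookkeeping.

```python
# Encoding (our own): ('var','x'), ('const','c'), ('fn','f',t1,...,tn) for
# terms; ('rel','R',t1,...,tn), ('or',p,q), ('not',p), ('exists','x',p)
# for formulas, with equality written as ('rel', '=', s, t).

def val(t, model, env):
    """Value of a term under an assignment env of variables to elements."""
    tag = t[0]
    if tag == 'var':
        return env[t[1]]
    if tag == 'const':
        return model[t[1]]
    return model[t[1]](*(val(s, model, env) for s in t[2:]))  # operation symbol

def holds(phi, universe, model, env=None):
    """Decide satisfaction over a finite universe by clauses 1-4; the
    assignment env plays the role of the new constants c_b."""
    env = env or {}
    tag = phi[0]
    if tag == 'rel':
        args = tuple(val(t, model, env) for t in phi[2:])
        return args[0] == args[1] if phi[1] == '=' else args in model[phi[1]]
    if tag == 'not':
        return not holds(phi[1], universe, model, env)
    if tag == 'or':
        return (holds(phi[1], universe, model, env)
                or holds(phi[2], universe, model, env))
    if tag == 'exists':
        return any(holds(phi[2], universe, model, dict(env, **{phi[1]: b}))
                   for b in universe)

# "forall x exists y  x*y = e", written with only negation, exists, and =:
inv = ('not', ('exists', 'x', ('not', ('exists', 'y',
      ('rel', '=', ('fn', '*', ('var', 'x'), ('var', 'y')), ('const', 'e'))))))
assert holds(inv, range(3), {'e': 0, '*': lambda x, y: (x + y) % 3})  # (Z_3,+)
assert not holds(inv, range(2), {'e': 1, '*': lambda x, y: x * y})    # 0 has no inverse
```

The two checks at the end mirror the $({\Bbb Q},+,0)$ example above: the first structure is a group, the second is not.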
Now we would like to define $\moda\models\theta$ for arbitrary
languages $L$ and $L$-structures $\moda$.
Let ${L_{\moda}=L\cup\{c_a: a\in A\}}$
\sdex{$L_{?moda}=L?cup?{c_a: a?in A?}$}
where each ${c_a}$ is a new constant symbol.
Let ${(\moda,a)_{a\in A}}$ be the $L_{\moda}$
\sdex{$(?moda,a)_{a?in A}$}
structure gotten by augmenting the structure $\moda$ by interpreting
each symbol $c_a$ as the element $a$.
If $\theta$ is an $L$-sentence and
$\moda$ is an $L$-structure, then we define {$\moda\models\theta$}
iff $\theta$ is true in the augmented structure, i.e., ${(\moda,a)_{a\in A}}$.
If $L_1 \subseteq L_2$ and $\moda$ is an $L_2$ structure, then the reduct of
$\moda$
to $L_1$, written $\moda\res{L_1}$, is the $L_1$ structure with the
same universe \sdex{$?moda?res{L_1}$}
as $\moda$ and same relations, operations, and constants as $\moda$
for the symbols of $L_1$.
\begin{lemma}
Let $L_1\subseteq L_2$ and $\moda$ be an $L_2$ structure. Then
for any $\theta$ an $L_1$ sentence,
$$\moda\models\theta \mbox{ iff }
\moda\res{L_1}\models\theta$$
\end{lemma}
\proof
We prove by induction on the number of logical symbols in the sentence
that for any $L_{1\moda}$ sentence $\theta$:
$$(\moda,a)_{a\in A}\models\theta \mbox{ iff }
(\moda,a)_{a\in A}\res{L_{1\moda}} \models\theta$$
Let $\moda_2=(\moda,a)_{a\in A}$ and
$\moda_1=(\moda,a)_{a\in A}\res{L_{1\moda}}$.
\medskip
Atomic sentences: By induction on the
size of the term, for any $L_{1\moda}$ variable free term $t$
we have that $t^{\moda_1}=t^{\moda_2}$.
For any $n$-ary relation
symbol $R$ in $L_1$ we have $R^{\moda_1}=R^{\moda_2}$ (since
$\moda_1$ is a reduct of $\moda_2$). Hence for any atomic
$L_{1\moda}$-sentence $R(t_1,\ldots,t_n)$,
$$\moda_1\models R(t_1,\ldots,t_n)$$
\centerline{iff}
$$\langle t_1^{\moda_1},\ldots, t_n^{\moda_1}\rangle \in R^{\moda_1}$$
\centerline{iff}
$$\langle t_1^{\moda_2},\ldots, t_n^{\moda_2}\rangle \in R^{\moda_2}$$
\centerline{iff}
$$\moda_2\models R(t_1,\ldots,t_n).$$
\medskip
Negation:
$$\moda_1\models \neg\theta$$
\centerline{iff}
$$\mbox{ not } \moda_1\models \theta$$
\centerline{iff (by induction)}
$$\mbox{ not }\moda_2\models \theta$$
\centerline{iff}
$$\moda_2\models \neg\theta.$$
\medskip
Disjunction:
$$\moda_1\models (\theta\disj\rho)$$
\centerline{iff}
$$\moda_1\models \theta \rmor \moda_1\models\rho$$
\centerline{iff (by induction)}
$$\moda_2\models \theta \rmor \moda_2\models\rho$$
\centerline{iff}
$$\moda_2\models (\theta\disj\rho).$$
\medskip
Existential quantifier:
$$\moda_1\models \exists x\theta(x)$$
\centerline{ iff}
$$\mbox{ there exists $a\in A$ such that }\moda_1\models \theta(c_a)$$
\centerline{ iff}
$$\mbox{ there exists $a\in A$ such that }\moda_2\models \theta(c_a)$$
\centerline{iff}
$$\moda_2\models \exists x\theta(x).$$
\qed
\begin{center}
Compactness Theorem
\end{center}
The compactness theorem (for countable languages)
was proved by Kurt G\"{o}del in 1930. Malcev extended it to uncountable
languages in 1936. The proof we give here was found by Henkin in 1949.
We say that a set of $L$ sentences $\Sigma$ is \dex{finitely satisfiable}
iff every finite subset of $\Sigma$ has a model. $\Sigma$ is
\dex{complete} iff for every $L$ sentence $\theta$ either
$\theta$ is in $\Sigma$ or $\neg\theta$ is in $\Sigma$.
\begin{lemma}
For every finitely satisfiable set of $L$ sentences $\Sigma$
there is a complete finitely satisfiable set of $L$ sentences
$\Sigma^\prime\supseteq \Sigma$.
\end{lemma}
\proof
Let $B=\{Q:Q\supseteq \Sigma$ is finitely satisfiable$\}$.
$B$ is closed under unions of chains, because if $C\subseteq B$ is
a chain, and $F\subseteq \cup C$ is finite then there exists
$Q\in C$ with $F\subseteq Q$, hence $F$ has a model. By the
maximality principle, there exists $\Sigma^\prime\in B$ maximal.
But for every $L$ sentence $\theta$ either $\Sigma^\prime\cup\{\theta\}$
is finitely satisfiable or $\Sigma^\prime\cup\{\neg\theta\}$ is
finitely satisfiable. Otherwise there exist finite
$F_0,F_1\subseteq\Sigma^\prime$ such that
$F_0\cup\{\theta\}$ has no model and $F_1\cup\{\neg\theta\}$
has no model. But $F_0\cup F_1$ has a model $\moda$ since
$\Sigma^\prime$ is finitely satisfiable.
Either $\moda\models\theta$ or $\moda\models\neg\theta$.
This is a contradiction.
\qed
\begin{lemma} If $\Sigma$ is a finitely satisfiable set of $L$ sentences,
and $\theta(x)$ is an $L$ formula with
one free variable $x$,
and $c$ a new constant symbol (not in $L$), then
$\Sigma \cup \{(\exists x \,\theta (x)) \implies \theta(c)\}$
is finitely satisfiable in the language $L\cup\{c\}$.
\end{lemma}
\proof
This new sentence is called a \dex{Henkin sentence} and $c$ is called the
\dex{Henkin constant}. Suppose it is not finitely satisfiable. Then
there exists $F\subseteq\Sigma$ finite such that
$F\cup\{(\exists x \,\theta (x)) \implies \theta(c)\}$ has no
model. Let $\moda$ be an $L$-structure modeling $F$.
Since the constant $c$ is not in the language $L$ we are free to
interpret it any way we like. If $\moda\models\exists x \,\theta (x)$,
choose $a\in A$ so that $(\moda,a)\models\theta(c)$, where $(\moda,a)$
is the expansion of $\moda$ interpreting $c$ as $a$; otherwise
choose $a\in A$ arbitrarily. In either case
$(\moda,a)$ models $F$ and the Henkin sentence.
\qed
\par\medskip
We say that a set of $L$ sentences $\Sigma$ is \dex{Henkin}
iff for every $L$ formula $\theta(x)$ with one free variable $x$,
there is a constant symbol $c$ in $L$ such
that $(\exists x \,\theta (x)) \implies \theta(c) \in \Sigma$.
\begin{lemma} If $\Sigma$ is a finitely satisfiable set of $L$ sentences,
then there exists
$\Sigma^\prime\supseteq \Sigma$
with $L^\prime\supseteq L$ and $\Sigma^\prime$ a finitely satisfiable
Henkin set of $L^\prime$ sentences.
\end{lemma}
\proof
For any set $\Sigma$ of $L$ sentences, let
$$\Sigma^*=\Sigma\cup\{
(\exists x \,\theta (x)) \implies \theta(c_{\theta}):
\theta(x)\mbox{ an }L\mbox{ formula with one free variable}\}$$
The language of $\Sigma^*$ contains a new constant symbol $c_{\theta}$
for each $L$ formula $\theta(x)$. $\Sigma^*$ is finitely satisfiable,
since any finite subset of it is contained
in a set of the form
$$F\cup\{(\exists x \,\theta_1 (x)) \implies \theta_1(c_{\theta_1}),\ldots,
(\exists x \,\theta_n (x)) \implies \theta_n(c_{\theta_n})\}$$
where $F\subseteq\Sigma$ is finite. To prove this set has a model
use induction on $n$ and note that
from the point of view of
$$\Sigma\cup\{(\exists x \,\theta_1 (x))
\implies \theta_1(c_{\theta_1}),\ldots,
(\exists x \,\theta_{n-1} (x)) \implies \theta_{n-1}(c_{\theta_{n-1}})\}$$
$c_{\theta_n}$ is a new constant symbol, so we can apply the last lemma.
Now let $\Sigma_0=\Sigma$ and let
$\Sigma_{m+1}=\Sigma_m^*$. Then
$$\Sigma^\prime=\cup_{m<\omega}\Sigma_m$$
is Henkin. It is also finitely satisfiable, since it is the union
of a chain of finitely satisfiable sets.
\qed
\par\medskip
If $\Sigma$ is a set of $L$ sentences, then the \dex{canonical structure}
$\moda$ built
from $\Sigma$ is the following.
\begin{itemize}
\item Let $X$ be the set of all variable free terms of $L$.
\item For $t_1,t_2\in X$ define $t_1\sim t_2$ iff $(t_1 = t_2) \in \Sigma$.
\item Assuming that $\sim$ is an equivalence relation let
$[t]$ be the equivalence
class of $t\in X$.
\item The universe of the canonical model $\moda$ is the set of
equivalence classes of $\sim$.
\item For any constant symbol $c$ we define
$$c^{\moda}=[c].$$
\item For any $n$-ary operation symbol $f$ we define
$$f^{\moda}([t_1],[t_2],\ldots,[t_n])=[f(t_1,t_2,\ldots,t_n)].$$
\item For any $n$-ary relation symbol $R$ we define
$$([t_1],[t_2],\ldots,[t_n])\in R^{\moda} \rmiff
R(t_1,t_2,\ldots,t_n)\in \Sigma.$$
\end{itemize}
\begin{lemma} If $\Sigma$ is a finitely satisfiable complete Henkin set of
$L$ sentences, then the canonical
model $\moda$ built from $\Sigma$ is well defined and for every $L$
sentence $\theta$,
$$\moda\models \theta \rmiff \theta \in \Sigma.$$
\end{lemma}
\proof
First we show that $\sim$ is an equivalence relation.
Suppose $t,t_1,t_2,t_3$ are variable free terms.
\smallskip
$t\sim t$:
If $t=t\notin \Sigma$, then
since $\Sigma$ is complete we have that $\neg t=t\in \Sigma$.
But clearly $\neg t=t$ has no models and so $\Sigma$ is not finitely
satisfiable.
\smallskip
$t_1\sim t_2$ implies $t_2\sim t_1$:
If not, by completeness of $\Sigma$ we must have
that $t_1=t_2$ and $\neg t_2=t_1$ are both in $\Sigma$. But then
$\Sigma$ is not finitely satisfiable.
\smallskip
($t_1\sim t_2$ and $t_2\sim t_3$) implies $t_1\sim t_3$:
If not, by completeness of $\Sigma$ we must have
that $t_1=t_2$, $t_2=t_3$, and $\neg t_1=t_3$ are all in $\Sigma$. But then
$\Sigma$ is not finitely satisfiable.
\smallskip
So $\sim$ is an equivalence relation. Next we show that it is a congruence
relation.
Suppose $t_1,\ldots,t_n,t_1^\prime,\ldots,t_n^\prime$ are variable free
terms and $f$ is an $n$-ary operation symbol.
If $t_1\sim t_1^\prime,\ldots,t_n\sim t_n^\prime$ then
$f(t_1,\ldots,t_n)\sim f(t_1^\prime,\ldots,t_n^\prime)$.
This amounts to saying if
$$\{t_1=t_1^{\prime},\ldots,t_n=t_n^{\prime}\}\subseteq\Sigma,$$
then
$$f(t_1,\ldots,t_n)= f(t_1^\prime,\ldots,t_n^\prime)\in\Sigma.$$
If not, then since $\Sigma$ is complete we would have
$$f(t_1,\ldots,t_n)\not= f(t_1^\prime,\ldots,t_n^\prime)\in\Sigma,$$
but
$$\{t_1=t_1^{\prime},\ldots,t_n=t_n^{\prime},
f(t_1,\ldots,t_n)\not= f(t_1^\prime,\ldots,t_n^\prime)\}$$
has no models and
so $\Sigma$ would not be finitely satisfiable.
By a similar argument: suppose
$t_1,\ldots,t_n,t_1^\prime,\ldots,t_n^\prime$ are variable free
terms and $R$ is an $n$-ary relation symbol.
If $t_1\sim t_1^\prime,\ldots,t_n\sim t_n^\prime$ then
$R(t_1,\ldots,t_n)\in\Sigma$ iff
$R(t_1^\prime,\ldots,t_n^\prime)\in\Sigma$.
This shows the canonical model is well defined.
\medskip
Now we prove
by induction on the number of logical symbols that for any $L$ sentence
$\theta$,
$$\moda\models \theta \iff \theta \in \Sigma.$$
The atomic formula case is by definition.
\smallskip
$\neg$:
$$\moda\models\neg\theta$$
\centerline{iff}
$$\mbox{ not }\moda\models\theta$$
\centerline{iff (by induction)}
$$\mbox{ not }\theta\in\Sigma$$
\centerline{iff (by completeness)}
$$\neg\theta\in\Sigma.$$
\smallskip
$\disj$:
$$\moda\models(\theta\disj\rho)$$
\centerline{iff}
$$\moda\models\theta \rmor \moda\models\rho$$
\centerline{iff (by induction)}
$$\theta\in\Sigma \rmor \rho\in\Sigma$$
\centerline{iff}
$$(\theta\disj\rho)\in\Sigma.$$
This last ``iff'' uses completeness and finite satisfiability of $\Sigma$.
If $(\theta\disj\rho)\notin\Sigma$ then by
completeness $\neg(\theta\disj\rho)\in\Sigma$ but
$\{\theta,\rho,\neg(\theta\disj\rho)\}$ has no model.
Conversely if $\theta\notin\Sigma$ and $\rho\notin\Sigma$,
then by completeness $\neg\theta\in\Sigma$ and $\neg\rho\in\Sigma$, but
$\{(\theta\disj\rho),\neg\theta,\neg\rho\}$ has no model.
\smallskip
$\exists$:
$\moda\models\exists x\theta(x)$
implies
there exists a variable free term $t$ such that $\moda\models\theta(t)$,
since every element of the universe is $[t]$ for some variable free term $t$.
This implies (by induction) $\theta(t)\in\Sigma$,
which in turn implies $\exists x\theta(x)\in\Sigma$,
since otherwise $\neg\exists x\theta(x)\in\Sigma$ but
$\{\neg\exists x\theta(x),\theta(t)\}$ has no model.
Hence $\moda\models\exists x\theta(x)$ implies $\exists x\theta(x)\in\Sigma$.
For the other direction suppose $\exists x\theta(x)\in\Sigma$. Then
since $\Sigma$ is Henkin for some constant symbol $c$
we have $(\exists x\theta(x))\implies\theta(c)\in\Sigma$. Using completeness
and finite satisfiability we must have $\theta(c)\in\Sigma$.
By induction $\moda\models\theta(c)$ hence $\moda\models\exists x\theta(x)$.
Hence $\exists x\theta(x)\in\Sigma$ implies
$\moda\models\exists x\theta(x)$.
\qed
\par\bigskip
\noindent {\bf Compactness Theorem.} For any language $L$ and
set of $L$ sentences $\Sigma$, if every finite subset of
$\Sigma$ has a model, then $\Sigma$ has a model.
\bigskip
\proof
First Henkinize $\Sigma$, then complete it. Take the canonical model
of the completion, and then take its reduct back to an $L$-structure.
\qed
\bigskip
\begin{center}
Problems
\end{center}
\prob We say that $T$ a set of $L$ sentences is an \dex{$L$ theory}
iff there exists a set $\Sigma$ of $L$ sentences such
that $T$ is the set of all $L$ sentences true in every model
of $\Sigma$. In this case we say that $\Sigma$ is an
\dex{axiomatization} of $T$ or that
$\Sigma$ \dex{axiomatizes} the theory $T$.
Prove that any theory axiomatizes itself.
Give an axiomatization of the theory of partially ordered sets.
The theory of groups is just the set of all sentences of
group theory which are true in every group. Give an axiomatization of
the theory of abelian groups.
\prob Suppose that $T$ is a theory with a model.
Show that $T$ is complete iff
for every sentence $\theta$
in the language of $T$ either every model of $T$ is a model
of $\theta$ or every model of $T$ is a model
of $\neg\theta$.
\prob A theory $T$ is \dex{finitely axiomatizable} iff
there is a finite $\Sigma$
which axiomatizes $T$. Let $T$ be the set of sentences in the language of
pure equality which are true in every infinite structure. Prove that
$T$ is a theory by finding axioms for it. Show that no finite set
of these axioms axiomatizes $T$.
\prob
The theory of $\moda$, ${Th(\moda)}$, is defined as follows:
\sdex{$Th(?moda)$}
$$Th(\moda)=\{\theta:\theta \mbox{ is an } L \mbox{ sentence and }
\moda\models\theta\}$$
Prove that $Th(\moda)$ is a complete theory.
\prob $\moda$ and $\modb$ are \dex{elementarily equivalent} (written
${\moda\equiv\modb}$) iff $Th(\moda)=Th(\modb)$.
\sdex{$?moda?equiv?modb$}
Show that $\moda\equiv\modb$ iff for every sentence
$\theta$ if $\moda\models\theta$, then $\modb\models\theta$.
\prob
Suppose $T$ is a theory with a model.
Show the following are equivalent:
\begin{itemize}
\item $T$ is complete
\item any two models of $T$ are elementarily
equivalent
\item $T=Th(\moda)$ for some model of $T$
\item $T=Th(\moda)$ for all models of $T$
\end{itemize}
\prob Show that for any set of sentences $\Sigma$ and sentence $\theta$,
every model of $\Sigma$ is a model of $\theta$ iff there exists a finite
$\Sigma^\prime\subseteq\Sigma$
such that every model of $\Sigma^{\prime}$ is a model of $\theta$.
\prob Suppose that $T$ is a finitely axiomatizable theory and
$\Sigma$ is any axiomatization of $T$.
Show that some finite $\Sigma^\prime \subseteq \Sigma$
axiomatizes $T$.
\prob (Separation) Let ${{\cal M}(T)}$
be the \sdex{${?cal M}(T)$}
class of all models of $T$.
Suppose $T$ and $T^\prime$ are theories in a language $L$ and
${\cal M}(T) \cap {\cal M}(T^\prime ) = \emptyset $. Show that
for some $L$ sentence
$\theta$,
$${\cal M}(T) \subseteq {\cal M}(\theta) \rmand
{\cal M}(\theta) \cap {\cal M}(T^\prime ) = \emptyset.$$
\prob Let $L$ be a first order language and suppose $T_i, i\in I$
are theories in $L$ such that every $L$ structure is a model of exactly
one of the $T_i$'s. If $I$ is finite does it
then follow
that each of the $T_i$'s is finitely axiomatizable? What about infinite
$I$?
\prob Show that any theory with arbitrarily large finite models
must have an infinite model.
\prob Suppose that $T$ is an $L$-theory with an infinite model.
Let $X$ be any nonempty set and $\{c_x:x\in X\}$ be new constant
symbols not appearing in $L$. Prove that the set of
sentences $T\cup\{c_x\not=c_y:x,y\in X,x\not=y\}$ has a model.
Prove that any theory with an infinite model has an uncountable
model.
\prob
Let the
language of fields be $\{ +, \cdot, 0 , 1 \}$.
Give an axiomatization for fields of characteristic $0$.
Show that this theory is not
finitely axiomatizable. Show that for any sentence $\theta$
true in all fields of
characteristic $0$ there is a $k$ such that $\theta$ is true in all
fields of characteristic $ > k$.
\prob The cardinality of a model is the number of elements in its
universe.
Give an example of a theory $T$ such that
for all $n$,
\par\centerline{$T$ has a model of cardinality $n$ iff $n$ is even.}
\par\noindent Can you find a finitely axiomatizable $T$?
\prob Is there a finitely axiomatizable theory with only infinite models?
What if the language consists of a single unary operation symbol? What
about the languages which contain only unary relation symbols?
\prob Suppose that $T$ is any theory in a language which includes a binary
relation symbol $\leq$ such that for every model $\moda$ of $T$
$\leq^{\moda}$ is a
linear order. Show that if $T$ has an infinite model then $T$ has a model
$\modb$
such that there is an order embedding of the rationals into
$\leq^{\modb}$, i.e., there is a function $f:{\Bbb Q}\to B$ such
that $p\leq q$ iff $f(p)\leq^{\modb} f(q)$ for every $p,q\in {\Bbb Q}$.
\prob Let $\moda =( A,\eq )$ be the equivalence relation with
exactly one equivalence class of cardinality $n$ for each $n=1,2,\ldots$
and no infinite equivalence classes.
Show that there exists $\modb \equiv \moda$ which has infinitely
many infinite equivalence classes.
\prob A theory is \dex{consistent} iff it has a model, i.e., it is realizable.
Let $T$ be a finitely axiomatizable theory with only a countable
number of complete consistent extensions (i.e., $T^\prime\supseteq T$)
in the language
of $T$. Prove that one of these complete
consistent extensions is finitely axiomatizable.
\prob For each of the following prove or give a counterexample:
\begin{enumerate}
\item Let $T_n$ for $n\in\omega$ be finite sets of sentences and $S$ be
a finite set of sentences.
Assume for all $n\in\omega$ that
$ T_n\subseteq T_{n+1}$, and there is a model of $T_n$ which is not a model
of $S$. Then there is a model of $\cup_{n\in\omega}T_n $ which
is not a model of $S$.
\item Let $S_n$ and $T_n$ for $n\in\omega$
be finite sets of sentences. Assume that for all
$n\in\omega,S_n \subseteq S_{n+1},
T_n\subseteq T_{n+1}$, and there is a model of $T_n$ which is not a model
of $S_n$. Then there is a model of $\cup_{n\in\omega}T_n $ which
is not a model of $\cup_{n\in\omega}S_n $.
\end{enumerate}
\prob Let $ T_0 \subseteq T_1 \subseteq T_2 \subseteq \ldots $
be a sequence of
$L$ theories such that for each $ n \in \omega $ there exists a model of
$T_{n}$ which is not a model of $T_{n+1}$.
Prove that $\cup_{n\in\omega}T_n$
is not finitely axiomatizable. If $L$ is finite, prove that
$\cup_{n\in\omega}T_n$ has an infinite model.
\sector{L\"{o}wenheim-Skolem Theorems}
The first version of the L\"{o}wenheim-Skolem Theorem was proved in 1915.
The final version that is presented here was developed by Tarski
in the 1950s.
\begin{lemma} The number of $L$ formulas is $|L|+\aleph_0$. \label{low1}
\end{lemma}
\proof
There are only countably many logical symbols. Hence if
$\kappa=|L|+\aleph_0$ then every formula may be regarded as
an element of $\kappa^{<\omega}$ and we know
$|\kappa^{<\omega}|=\kappa$.
\qed
\bigskip
\par\noindent
{\bf Theorem}
{\em
For any theory $T$ in a language $L$ if $T$ has
a model, then it has one of cardinality less than or
equal to $|L|+\aleph_0$.
}
\proof
The canonical model of the completion of
Henkinization of $T$ has cardinality $\leq|L|+\aleph_0$.
It is enough to see that the language of the Henkinization
of $T$ has cardinality $\leq|L|+\aleph_0$, since the
canonical model's universe is the set of equivalence classes
of variable free terms.
The Henkinization language is $\cup_{n<\omega}L_n$ where
$L_0=L$ and $L_{n+1}$ is $L_n$ plus one more constant symbol
for each formula of $L_n$. But by Lemma \ref{low1}
there are $|L_n|+\aleph_0$ formulas of $L_n$. It follows
that if $\kappa=|L|+\aleph_0$, then $|L_n|=\kappa$ for
all $n$ and so $|\cup_{n<\omega}L_n|=\kappa$.
\qed
${{\moda} \substruc {\modb}}$
\sdex{${?moda} ?substruc {?modb}$}
means that ${\moda}$ is a
\dex{substructure}
of ${\modb}$; equivalently
${\modb}$ is an \dex{extension} or \dex{superstructure}
of ${\moda}$. This means that
both structures are in the same language $L$, $A \subseteq B$, for
every $n$-ary
relation symbol $R$ of $L$, $ R^{\moda} = R^{\modb} \cap A^n$,
for every $n$-ary function symbol $f$ of $L$,
$f^{\moda} = f^{\modb} \res A^n $,
and for every constant symbol $c$ of $L$,
$ c^{\moda} = c^{\modb} $.
${{\moda} \elemsub {\modb}}$
\sdex{${?moda} ?elemsub {?modb}$}
means that ${\moda}$ is an
\dex{elementary substructure}
of ${\modb}$; equivalently ${\modb}$ is an \dex{elementary extension}
of ${\moda}$. This means that ${\moda} \subseteq {\modb}$
and for every formula $\theta(x_1,x_2,\ldots,x_n)$
of the language $L$ and for every $a_1,a_2,\ldots,a_n \in A$
we have
$$({\moda },a)_{a\in A} \models \theta (c_{a_1},c_{a_2},\ldots,c_{a_n})
\mbox{ iff }
({\modb },a)_{a\in A} \models \theta (c_{a_1},c_{a_2},\ldots,c_{a_n})$$
To ease the notational complexity we will write
${\moda\models\theta(a_1,\ldots,a_n)}$
\sdex{$?moda?models?theta(a_1,?ldots,a_n)$}
instead of
$({\moda },a)_{a\in A} \models \theta (c_{a_1},c_{a_2},\ldots,c_{a_n})$.
It should be kept in mind that the language $L$ may have no constant
symbols in it.
Example. Let $\moda=(\omega,<)$ and let $\modb=(Evens,<)$. Then
$\modb\subseteq\moda$ but $\modb$ is not an elementary substructure
of $\moda$. This is because $\moda\models\exists x\;(0<x\wedge x<2)$
but $\modb\models\neg\exists x\;(0<x\wedge x<2)$: the only witness
in $\moda$ is $1$, which is not in $Evens$.
\prob Find a set of sentences $T$ in an uncountable language such that
$T$ has arbitrarily large finite models but no countably infinite model.
\prob Find a theory $T$ in an uncountable language that has no
finite models and has exactly one countable model up to isomorphism, but is
not complete.
\com{let T have $\exists^{\geq n}$ and $c_\alpha=c_\beta \iff c_\delta=
c_\gamma$ for $\alpha\not=\beta$ and $\delta\not=\gamma$.}
\sector{Turing Machines and Recursive Functions}
A \dex{Turing machine} is a partial function $m$ such that for some
finite sets $A$ and
$S$ the domain of $m$ is a subset of $S \times A$ and range of $m$
is a subset of $S \times A \times \{l,r\}$.
$$\partial m: S \times A \to S \times A \times \{l,r\}$$
We call $A$ the alphabet and $S$ the states.
For example, suppose $S$ is the set of letters $\{a,b,c,\ldots,z\}$
and $A$ is the set of all natural numbers less than seventeen. Then
$$m(a,4)=(b,6,l)$$
would mean that when the machine $m$ is in state $a$
reading the
symbol $4$ it will go into state $b$, erase the symbol $4$ and
write the symbol $6$ on the tape square
where $4$ was, and then move left one square.
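A single transition of this kind can be sketched in Python. This is only an illustration of the rule just described; the dictionary encoding of $m$, the function name \verb|step|, and the use of \verb|'~'| for the blank are assumptions, not part of the text.

```python
# One step of a Turing machine: the partial function m maps
# (state, symbol) to (new state, new symbol, direction l or r).
def step(m, state, tape, pos, blank='~'):
    """Apply one transition; tape is a dict from square index to symbol."""
    key = (state, tape.get(pos, blank))
    if key not in m:                       # no instruction: the machine halts
        return None
    new_state, new_symbol, direction = m[key]
    tape[pos] = new_symbol                 # overwrite the scanned square
    return new_state, pos + (-1 if direction == 'l' else 1)

# The example from the text: m(a,4) = (b,6,l).
m = {('a', 4): ('b', 6, 'l')}
tape = {0: 0, 1: 3, 2: 4}                  # squares holding 0, 3, 4
state, pos = step(m, 'a', tape, 2)         # in state a, reading the 4
# afterwards the machine has written 6, moved left, and is in state b
```
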
\bigskip
\unitlength=1.00mm
\special{em:linewidth 0.4pt}
\linethickness{0.4pt}
\begin{picture}(69.00,30.00)(-10,0)
%\put(0,0){\framebox(69,30)[cc]{}}
\put(10.00,20.00){\framebox(10.00,10.00)[cc]{$\blank$}}
\put(20.00,20.00){\framebox(10.00,10.00)[cc]{$0$}}
\put(30.00,20.00){\framebox(10.00,10.00)[cc]{$3$}}
\put(40.00,20.00){\framebox(10.00,10.00)[cc]{$4$}}
\put(50.00,20.00){\framebox(10.00,10.00)[cc]{$\blank$}}
\put(40.00,0.00){\framebox(10.00,10.00)[cc]{}}
\put(45.00,2.00){\makebox(0,0)[cc]{head}}
\put(45.00,7.00){\makebox(0,0)[cc]{read}}
\put(45.00,10.00){\vector(0,1){10.00}}
\put(24.00,11.00){\makebox(0,0)[cc]{machine $m$}}
\put(24.00,4.00){\makebox(0,0)[cc]{in state $a$}}
\put(5.00,20.00){\line(1,0){5.00}}
\put(5.00,30.00){\line(1,0){5.00}}
\put(60.00,30.00){\line(1,0){5.00}}
\put(60.00,20.00){\line(1,0){5.00}}
\end{picture}
\bigskip
\unitlength=1.00mm
\special{em:linewidth 0.4pt}
\linethickness{0.4pt}
\begin{picture}(69.00,30.00)
%\put(0,0){\framebox(69,30)[cc]{}}
\put(10.00,20.00){\framebox(10.00,10.00)[cc]{$\blank$}}
\put(20.00,20.00){\framebox(10.00,10.00)[cc]{$\blank$}}
\put(30.00,20.00){\framebox(10.00,10.00)[cc]{$0$}}
\put(40.00,20.00){\framebox(10.00,10.00)[cc]{$3$}}
\put(50.00,20.00){\framebox(10.00,10.00)[cc]{$6$}}
\put(60.00,20.00){\framebox(10.00,10.00)[cc]{$\blank$}}
\put(40.00,0.00){\framebox(10.00,10.00)[cc]{}}
\put(45.00,2.00){\makebox(0,0)[cc]{head}}
\put(45.00,7.00){\makebox(0,0)[cc]{read}}
\put(45.00,10.00){\vector(0,1){10.00}}
\put(24.00,11.00){\makebox(0,0)[cc]{machine $m$}}
\put(24.00,4.00){\makebox(0,0)[cc]{in state $b$}}
\put(5.00,20.00){\line(1,0){5.00}}
\put(5.00,30.00){\line(1,0){5.00}}
\end{picture}
\bigskip
If $(a,4)$ is not in the domain of $m$, then the machine halts.
This is the only
way of stopping a calculation.
Let $A^{<\omega}$ be the set of all finite strings from the alphabet $A$.
The Turing machine $m$ gives rise to a partial function
$M$ from $A^{<\omega}$ to $A^{<\omega}$ as follows.
$$\partial M:A^{<\omega}\to A^{<\omega}$$
We suppose that $A$ always contains
the blank
space symbol:
$$\blank$$
and $S$ contains the starting state $a$.
Given any word $w$
from $A^{<\omega}$ we
imagine a tape with $w$ written on it and blank symbols everywhere else. We
start the machine in state $a$ and reading the leftmost symbol of $w$.
A configuration consists of what is written on the tape, which square of
tape is being read, and the state the machine is in.
Successive
configurations are obtained according to rules determined by $m$, namely
if the machine is in state $q$ reading symbol $s$ and
$m(q,s)=(q^\prime,s^\prime,d)$ then
the next configuration has the same tape except the square we were reading
now has the symbol $s^\prime$ on it, the new state is $q^\prime$,
and the square being
read is one to the left if $d=l$ and one to the right if $d=r$.
If $(q,s)$ is not in the domain of $m$, then the computation halts and
$M(w)=v$ where $v$ is what is written on the tape when the machine halts.
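The computation of $M$ from $m$ described above can be sketched as a minimal simulator. The function name and the \verb|'~'| blank are illustrative assumptions; the final example uses the fact, stated below, that the empty machine computes the identity.

```python
def run(m, word, start='a', blank='~', max_steps=10_000):
    """Compute M(word): run machine m until it halts, then read the tape."""
    tape = {i: s for i, s in enumerate(word)}
    state, pos = start, 0                   # start at the leftmost symbol
    for _ in range(max_steps):
        key = (state, tape.get(pos, blank))
        if key not in m:                    # (q,s) not in dom(m): halt
            # M(w) is what is written on the tape when the machine halts
            return ''.join(tape[i] for i in sorted(tape) if tape[i] != blank)
        state, sym, d = m[key]              # m(q,s) = (q', s', d)
        tape[pos] = sym
        pos += 1 if d == 'r' else -1
    return None                             # did not halt within max_steps

# The empty machine has no instructions, so it halts at once: M(w) = w.
assert run({}, '11') == '11'
```
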
Suppose $B$ is a finite alphabet that does not contain the blank
space symbol ($\blank$). Then
a partial function
$$\partial f:B^{<\omega}\to B^{<\omega}$$
is a \dex{partial recursive function}
iff there
is a Turing machine $m$ with an alphabet $A\supseteq B$
such that $f = M\res B^{<\omega}$. A partial recursive function
is \dex{recursive} iff it is total. A function $f:\omega \to \omega$
is recursive iff it is recursive when considered as a map
from $B^{<\omega}$ to $B^{<\omega}$
where
$B=\{1\}$. Words in $B$ can be regarded as numbers written
in base one, hence we identify the number $x$ with $x$ ones written
on the tape.
For example, the identity function is recursive, since it is computed by the
empty machine. The successor function is recursive since it is computed
by the machine:
\unitlength=1.00mm
\special{em:linewidth 0.4pt}
\linethickness{0.4pt}
\begin{picture}(128.00,24.00)
%\put(0.00,0.00){\framebox(64.00,24.00)[cc]{}}
\put(21.00,7.00){\vector(1,0){27.00}}
\put(15.00,17.00){\line(0,-1){4.00}}
\put(4.00,7.00){\vector(1,0){5.00}}
\put(4.00,7.00){\line(0,1){10.00}}
\put(4.00,17.00){\line(1,0){11.00}}
\put(18.00,19.00){\makebox(0,0)[cc]{1}}
\put(3.00,3.00){\makebox(0,0)[cc]{(1,r)}}
\put(24.00,10.00){\makebox(0,0)[cc]{$\;\tilde{}\;$}}
\put(44.00,10.00){\makebox(0,0)[cc]{(1,r)}}
\put(15.00,7.00){\circle{12.00}}
\put(55.00,7.00){\circle{12.00}}
\put(15.00,7.00){\makebox(0,0)[cc]{$a$}}
\put(55.00,7.00){\makebox(0,0)[cc]{$b$}}
\put(70,10){\shortstack[l]{ $m(a,1)=(a,1,r)$\\ $m(a,\blank)=(b,1,r)$}}
\end{picture}
In the diagram on the left, states are represented by circles.
The arrows represent the \dex{state transition function} $m$.
For example, the horizontal arrow represents the fact that when
$m$ is in state $a$ and reads ($\blank$), it
writes $1$, moves right, and goes into state $b$.
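Under the base-one convention above, the two-instruction successor machine can be checked directly. The tiny simulator here is an illustrative sketch, not part of the text.

```python
# The successor machine from the diagram:
#   m(a,1) = (a,1,r)   -- scan right across the ones
#   m(a,~) = (b,1,r)   -- at the first blank, write one more 1, stop in b
succ = {('a', '1'): ('a', '1', 'r'),
        ('a', '~'): ('b', '1', 'r')}

def run(m, word, start='a', blank='~'):
    """Tiny Turing-machine simulator (illustration only)."""
    tape = {i: s for i, s in enumerate(word)}
    state, pos = start, 0
    while (state, tape.get(pos, blank)) in m:
        state, sym, d = m[(state, tape.get(pos, blank))]
        tape[pos] = sym
        pos += 1 if d == 'r' else -1
    return ''.join(tape[i] for i in sorted(tape) if tape[i] != blank)

# In base one, n is written as n ones: 3 is '111', and its successor '1111'.
assert run(succ, '111') == '1111'
assert run(succ, '') == '1'        # the successor of 0 is 1
```
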
The set of strings of zeros and ones with
an even number of ones is recursive.
Its characteristic function (parity checker) can be computed
by the following machine:
\unitlength=.90mm
\special{em:linewidth 0.4pt}
\linethickness{0.4pt}
\begin{picture}(140.00,40.00)
%\put(00.00,000.00){\framebox(69.00,40.00)[cc]{}}
\put(15.00,025.00){\circle{10.00}}
\put(55.00,025.00){\circle{10.00}}
\put(35.00,009.00){\circle{10.00}}
\put(15.00,025.00){\makebox(0,0)[cc]{$a$}}
\put(55.00,025.00){\makebox(0,0)[cc]{$b$}}
\put(35.00,009.00){\makebox(0,0)[cc]{$c$}}
\put(20.00,028.00){\vector(1,0){30.00}}
\put(50.00,023.00){\vector(-1,0){30.00}}
\put(15.00,020.00){\vector(4,-3){15.00}}
\put(55.00,020.00){\vector(-4,-3){14.00}}
\put(15.00,030.00){\line(0,1){4.00}}
\put(15.00,034.00){\line(-1,0){10.00}}
\put(05.00,034.00){\line(0,-1){9.00}}
\put(05.00,025.00){\vector(1,0){5.00}}
\put(60.00,025.00){\line(1,0){6.00}}
\put(66.00,025.00){\line(0,1){9.00}}
\put(66.00,034.00){\line(-1,0){11.00}}
\put(55.00,034.00){\vector(0,-1){4.00}}
\put(18.00,035.00){\makebox(0,0)[cc]{$0$}}
\put(05.00,022.00){\makebox(0,0)[cc]{$(\blank,r)$}}
\put(24.00,031.00){\makebox(0,0)[cc]{$1$}}
\put(45.00,031.00){\makebox(0,0)[cc]{$(\blank,r)$}}
\put(45.00,021.00){\makebox(0,0)[cc]{$1$}}
\put(25.00,021.00){\makebox(0,0)[cc]{$(\blank,r)$}}
\put(13.00,015.00){\makebox(0,0)[cc]{$\blank$}}
\put(22.00,009.00){\makebox(0,0)[cc]{$(1,r)$}}
\put(47.00,009.00){\makebox(0,0)[cc]{$(\blank,r)$}}
\put(56.00,017.00){\makebox(0,0)[cc]{$\blank$}}
\put(63.00,022.00){\makebox(0,0)[cc]{$0$}}
\put(57.00,037.00){\makebox(0,0)[cc]{$(\blank,r)$}}
\put(80,10){\shortstack[l]{
$ m(a,0)=(a,\blank ,r)$ \\
$ m(a,1)=(b,\blank ,r)$ \\
$ m(b,0)=(b,\blank ,r)$ \\
$ m(b,1)=(a,\blank ,r)$ \\
$ m(a,\blank)=(c,1,r)$ \\
$ m(b,\blank)=(c,\blank,r)$ }}
\end{picture}
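The parity checker can likewise be run on sample inputs. The transition table below transcribes the six instructions listed beside the diagram; the simulator itself is an illustrative sketch.

```python
# The parity machine: it erases the input as it scans, and at the end
# writes a single 1 exactly when it has seen an even number of ones.
parity = {
    ('a', '0'): ('a', '~', 'r'), ('a', '1'): ('b', '~', 'r'),
    ('b', '0'): ('b', '~', 'r'), ('b', '1'): ('a', '~', 'r'),
    ('a', '~'): ('c', '1', 'r'), ('b', '~'): ('c', '~', 'r'),
}

def run(m, word, start='a', blank='~'):
    """Tiny Turing-machine simulator (illustration only)."""
    tape = {i: s for i, s in enumerate(word)}
    state, pos = start, 0
    while (state, tape.get(pos, blank)) in m:
        state, sym, d = m[(state, tape.get(pos, blank))]
        tape[pos] = sym
        pos += 1 if d == 'r' else -1
    return ''.join(tape[i] for i in sorted(tape) if tape[i] != blank)

assert run(parity, '0110') == '1'   # two ones: even, output 1
assert run(parity, '010') == ''     # one 1: odd, tape left blank
```
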
The following problems are concerned with recursive functions and
predicates on $\omega$.
\bigskip
\prob Show that any constant function is recursive.
\prob A binary function $f:\omega\times\omega\to\omega$ is recursive
iff there is a machine such that for any $x,y\in\omega$ if we input
$x$ ones and $y$ ones separated by ``,'', then the machine
eventually halts with $f(x,y)$ ones on the tape.
Show that $f(x,y) = x+y$ is recursive.
\prob Show that $g(x,y) = x y$ is recursive.
\prob Let $ x \dotminus y = \max\{ 0,x-y \}$. Show that $p(x)= x \dotminus 1$
is recursive. Show that $ q(x,y)= x\dotminus y $ is recursive.
\prob Suppose $f(x)$ and $g(x)$ are recursive. Show that $f(g(x))$
is recursive.
\prob Formalize a notion of multitape Turing machine. Show that we get
the same set of recursive functions.
\prob Show that we get the same set of recursive functions even if we
restrict our notion of computation to allow only tapes that are infinite
in one direction.
\prob Show that the family of recursive functions is closed under arbitrary
compositions, for example $f(g(x,y),h(x,z),z)$. More generally,
if $f(y_1,\ldots,y_m)$, $g_1(x_1,\ldots,x_n),\ldots$, and $g_m(x_1,\ldots,x_n)$
are all recursive, then so is
$$f(g_1(x_1,\ldots,x_n),\ldots,g_m(x_1,\ldots,x_n)).$$
\prob Define
$$ {\rm sgn}(n) = \left\{
\begin{array}{ll}
0 & \mbox{ if } n=0 \\
1 & \mbox{ otherwise}
\end{array}
\right.$$
Show it is recursive.
\prob A set is recursive iff its characteristic function is.
Show that the binary relation $x=y$ is recursive. Show that the
binary relation $ x\leq y $ is recursive. \sdex{recursive set}
\prob Show that if $A \subseteq \omega$ is recursive then so is
$\omega\setminus A.$ Show that if $A$ and $B$ are recursive, then so are
$A\cap B$ and $A\cup B$.
\prob Suppose $g(x)$ and $h(x)$ are recursive and $A$ is a recursive set. Show
that $f$ is recursive where:
$$ f(x) =
\left\{
\begin{array}{ll}
g(x) & \mbox{ if } x \in A \\
h(x) & \mbox{ if } x \notin A
\end{array}
\right.$$
\prob Show that the set of even numbers is recursive. Show that the
set of primes is recursive.
\prob Show that $e(x,y)=x^y$ is recursive. Show that $f(x)=x!$ is recursive.
\prob Suppose that $h(z)$ and $g(x,y,z)$ are recursive. Define $f$
by recursion,
\begin{itemize}
\item $f(0,z)=h(z)$
\item $f(n+1,z) = g( n,z, f(n,z) )$.
\end{itemize}
Show that $f$ is recursive.
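The recursion scheme itself can be illustrated by unwinding it in Python; this is only an illustration of how $f$ is determined by $h$ and $g$, not the Turing-machine construction the problem asks for, and the instance chosen (addition) is an assumption for the example.

```python
def rec(h, g):
    """Return the f defined by f(0,z) = h(z), f(n+1,z) = g(n, z, f(n,z))."""
    def f(n, z):
        value = h(z)                # f(0, z) = h(z)
        for k in range(n):          # unwind the recursion from 0 up to n
            value = g(k, z, value)  # f(k+1, z) = g(k, z, f(k, z))
        return value
    return f

# Addition as an instance: h(z) = z and g(n,z,w) = w+1 give f(n,z) = n+z.
add = rec(lambda z: z, lambda n, z, w: w + 1)
assert add(3, 4) == 7
```
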
\prob We say that a set $ A \subseteq \omega$ is
\dex{recursively enumerable}
(re) iff
it is either the empty set or the range of a total recursive function.
Show that a set is re
iff it is the domain of a partial recursive function.
\prob Show that every recursive set is re. Show that a set is recursive
iff it and its complement are re.
\prob Show that if $f$ is an increasing total recursive function then the
range of $f$ is recursive.
\prob Suppose that $f:\omega\to\omega$ and $g:\omega\to\omega$ are
recursive functions such that $f(m)i,j$, then we can apply Modus Ponens to get $\theta_k=\rho$.
\bigskip
{\bf proof of completeness:}
The easy direction of the completeness theorem is often called
the \dex{soundness theorem}: if $\Sigma\proves\theta$ then
every model of $\Sigma$ is a model of $\theta$. To check that MM
is sound is an easy induction on the length of the proof:
If $\theta_1,\ldots,\theta_n$ is a sequence of sentences
such that each $\theta_i$ is either an element of $\Sigma$, a logical
validity, or can be obtained by Modus Ponens from
the earlier $\theta_j$'s for $j<i$, then every model of $\Sigma$
is a model of each $\theta_i$.