Notes - Logic and Proof MT24, Rado graph


The theory of the random graph is the theory axiomatised by the sentences

\[\begin{aligned} F_1& : \exists x \exists y \lnot(x=y) \\\\ F_2& : \forall x \lnot R(x, x) \\\\ F_3& : \forall x \forall y(R(x, y) \to R(y, x)) \\\\ \\\\ H_{m,n}&: \forall x_1 \cdots \forall x_m \forall y_1 \cdots \forall y_n \left( \left(\bigwedge^m_{i=1} \bigwedge^n_{j=1} \lnot(x_i = y_j)\right) \to \exists z \left( \bigwedge^m_{i = 1}R(x_i, z) \land \bigwedge^n_{j = 1}\lnot R(y_j, z) \right) \right) \end{aligned}\]

It’s interesting for several reasons, which the flashcards below explore.

Flashcards

Suppose $\sigma$ is a signature with a single binary relation $R$. Can you define the “theory of the random graph” $\pmb T _ {RG}$ in this context, and give an intuitive description of each axiom?


The $\sigma$-theory axiomatised by the sentences

\[\begin{aligned} F_1& : \exists x \exists y \lnot(x=y) \\\\ F_2& : \forall x \lnot R(x, x) \\\\ F_3& : \forall x \forall y(R(x, y) \to R(y, x)) \\\\ \\\\ H_{m,n}&: \forall x_1 \cdots \forall x_m \forall y_1 \cdots \forall y_n \left( \left(\bigwedge^m_{i=1} \bigwedge^n_{j=1} \lnot(x_i = y_j)\right) \to \exists z \left( \bigwedge^m_{i = 1}R(x_i, z) \land \bigwedge^n_{j = 1}\lnot R(y_j, z) \right) \right) \end{aligned}\]
  • $F _ 1$ says that the graph has at least two vertices
  • $F _ 2$ says that there are no self-loops
  • $F _ 3$ says that the graph is undirected
  • $H _ {m,n}$ says that for every two disjoint collections of vertices $\{x _ 1, \ldots, x _ m\}$ and $\{y _ 1, \ldots, y _ n\}$, there exists some vertex that’s connected to every vertex in the first set, but not to any vertex in the second (see the sketch below).
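To make the extension axioms concrete, here is a small Python sketch (my own illustration, not part of the notes) that brute-forces whether a finite undirected graph, given as a dictionary from each vertex to its set of neighbours, has the $(m, n)$-extension property:

```python
from itertools import combinations

def satisfies_H(adj, m, n):
    """Check the (m, n)-extension property on a finite undirected graph:
    for every two disjoint sets of m and n vertices, some vertex z must be
    adjacent to every vertex of the first set and to none of the second.
    `adj` maps each vertex to its set of neighbours."""
    vertices = set(adj)
    for xs in combinations(vertices, m):
        for ys in combinations(vertices - set(xs), n):
            # Look for a witness z adjacent to every x_i and to no y_j.
            if not any(set(xs) <= adj[z] and not (set(ys) & adj[z])
                       for z in vertices):
                return False
    return True

# Example: the 5-cycle has the (1, 1)-extension property but not the
# (2, 1) one, since adjacent vertices have no common neighbour.
cycle5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(satisfies_H(cycle5, 1, 1))  # True
print(satisfies_H(cycle5, 2, 1))  # False
```

In a uniformly random graph on $N$ vertices this property holds with probability tending to $1$ as $N \to \infty$, which is exactly what the next flashcard proves.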

Suppose $\sigma$ is a signature with a single binary relation $R$ and $\pmb T _ {RG}$ is the theory of the random graph, described by the axioms

\[\begin{aligned} F_1& : \exists x \exists y \lnot(x=y) \\\\ F_2& : \forall x \lnot R(x, x) \\\\ F_3& : \forall x \forall y(R(x, y) \to R(y, x)) \\\\ \\\\ H_{m,n}&: \forall x_1 \cdots \forall x_m \forall y_1 \cdots \forall y_n \left( \left(\bigwedge^m_{i=1} \bigwedge^n_{j=1} \lnot(x_i = y_j)\right) \to \exists z \left( \bigwedge^m_{i = 1}R(x_i, z) \land \bigwedge^n_{j = 1}\lnot R(y_j, z) \right) \right) \end{aligned}\]

@Prove that for all $m, n \in \mathbb N$, we have

\[\lim_{N \to \infty} \mathbb P_N(H_{m,n}) = 1\]

where $\mathbb P _ N(\varphi)$ is the probability that a random graph with $N$ nodes satisfies $\varphi$.


Let $N > n+m$ and pick some tuples $\pmb a = (a _ 1, \ldots, a _ m)$, $\pmb b = (b _ 1, \ldots, b _ n)$ drawn from the set $\{1, \ldots, N\}$, with no $a _ i$ equal to any $b _ j$.

Claim: For a graph $G$ drawn uniformly at random from $\pmb G _ N$ (the set of all graphs on the vertex set $\{1, \ldots, N\}$),

\[\mathbb P\left( G \not\models \exists z \left( \bigwedge^m_{i = 1}E(a_i, z) \land \bigwedge^n_{j = 1} \lnot E(b_j, z)\right) \right) \le q^{N - m - n}\]

where

\[q := 1 - 2^{-n-m} < 1\]

Proof: For each possible choice $c$ from

\[\\{1, \ldots, N\\} \setminus \\{a_1, \ldots, a_m, b_1, \ldots, b_n\\}\]

the probability

\[G \not\models \bigwedge^m_{i = 1} E(a_i, c) \land \bigwedge^n_{j = 1} \lnot E(b_j, c)\]

is at most $q$: each of the (at most $m + n$ distinct) required conditions, namely an edge between $c$ and $a _ i$ for each $i$ and no edge between $c$ and $b _ j$ for each $j$, holds independently with probability $\frac{1}{2}$, so $c$ satisfies the conjunction with probability at least $2^{-m-n}$. Moreover, for different choices of $c$ these events involve disjoint sets of potential edges, so they are independent over the at least $N - m - n$ available choices of $c$, and the claim follows.
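As a quick sanity check on the claim, here is a Monte Carlo sketch (my own illustration; the function name and parameter choices are not from the notes) that estimates the failure probability for fixed disjoint $\pmb a$, $\pmb b$ and compares it with $q^{N - m - n}$:

```python
import random

def estimate_no_witness(N, m, n, trials=20_000, seed=0):
    """Estimate the probability that a uniformly random graph on N vertices
    has no vertex z adjacent to all of a = {0, ..., m-1} and to none of
    b = {m, ..., m+n-1}.  Only the edges between a candidate z and the
    m + n fixed vertices matter, and each of the m + n required conditions
    (edge present to an a-vertex, edge absent to a b-vertex) holds
    independently with probability 1/2."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        found = any(
            all(rng.random() < 0.5 for _ in range(m + n))
            for _z in range(m + n, N)     # the N - m - n candidate vertices
        )
        failures += not found
    return failures / trials

N, m, n = 20, 2, 1
q = 1 - 2 ** (-(m + n))
print(estimate_no_witness(N, m, n))  # Monte Carlo estimate of the failure probability
print(q ** (N - m - n))              # the claim's bound (exact here, as the a_i, b_j are distinct)
```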

Proof of original statement: Take the union bound ($\mathbb P(\bigcup A _ n) \le \sum \mathbb P(A _ n)$) over the $N^{n+m}$ possible choices of $a _ 1, \ldots, a _ m, b _ 1, \ldots, b _ n \in \{1, \ldots, N\}$. Choices where some $a _ i = b _ j$ falsify the hypothesis of $H _ {m,n}$, so they cannot witness $\lnot H _ {m,n}$; for all remaining choices the claim applies. We get

\[\mathbb P_N(\lnot H_{m,n}) \le N^{n+m} q^{N-n-m}\]

Since $q < 1$, the exponential factor $q^{N-n-m}$ decays faster than the polynomial factor $N^{n+m}$ grows, so $\mathbb P _ N(\lnot H _ {m,n}) \to 0$ and hence $\lim _ {N \to \infty} \mathbb P _ N(H _ {m,n}) = 1$.
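For intuition, here is a small numeric check (my own illustration, not part of the notes) that the union bound $N^{n+m} q^{N-n-m}$ really does collapse to zero once $N$ is large, even though it exceeds $1$ for small $N$:

```python
# The union bound N^(m+n) * q^(N-m-n) for m = n = 1, i.e. q = 3/4.
m, n = 1, 1
q = 1 - 2 ** (-(m + n))
for N in (10, 20, 50, 100, 200):
    bound = N ** (m + n) * q ** (N - m - n)
    print(f"N = {N:4d}   P_N(not H_{{1,1}}) <= {bound:.3e}")
```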

Suppose $\sigma$ is a signature with a single binary relation $R$ and $\pmb T _ {RG}$ is the theory of the random graph, described by the axioms

\[\begin{aligned} F_1& : \exists x \exists y \lnot(x=y) \\\\ F_2& : \forall x \lnot R(x, x) \\\\ F_3& : \forall x \forall y(R(x, y) \to R(y, x)) \\\\ \\\\ H_{m,n}&: \forall x_1 \cdots \forall x_m \forall y_1 \cdots \forall y_n \left( \left(\bigwedge^m_{i=1} \bigwedge^n_{j=1} \lnot(x_i = y_j)\right) \to \exists z \left( \bigwedge^m_{i = 1}R(x_i, z) \land \bigwedge^n_{j = 1}\lnot R(y_j, z) \right) \right) \end{aligned}\]

@State a result relating the limiting probability that a random graph on $N$ vertices satisfies a given $\sigma$-formula to the theory $\pmb T _ {RG}$.


For every $\sigma$-formula $\varphi$ the limit $\lim _ {N \to \infty} \mathbb P _ N(\varphi)$ exists and is either zero or one, and $\pmb T _ {RG} = \{\varphi : \lim _ {N \to \infty} \mathbb P _ N(\varphi) = 1\}$.

Suppose $\sigma$ is a signature with a single binary relation $R$ and $\pmb T _ {RG}$ is the theory of the random graph, described by the axioms

\[\begin{aligned} F_1& : \exists x \exists y \lnot(x=y) \\\\ F_2& : \forall x \lnot R(x, x) \\\\ F_3& : \forall x \forall y(R(x, y) \to R(y, x)) \\\\ \\\\ H_{m,n}&: \forall x_1 \cdots \forall x_m \forall y_1 \cdots \forall y_n \left( \left(\bigwedge^m_{i=1} \bigwedge^n_{j=1} \lnot(x_i = y_j)\right) \to \exists z \left( \bigwedge^m_{i = 1}R(x_i, z) \land \bigwedge^n_{j = 1}\lnot R(y_j, z) \right) \right) \end{aligned}\]

and let

\[\mathbb P_N(\varphi) = \frac{|\\{G \in \pmb G_N : G \models \varphi\\}|}{|\pmb G_N|}\]

(i.e. the probability that a random graph with $N$ vertices is a model for $\varphi$).

@Prove that for every $\sigma$-formula $\varphi$ the limit $\lim _ {N \to \infty} \mathbb P _ N(\varphi)$ exists and is either zero or one, and $\pmb T _ {RG} = \{\varphi : \lim _ {N \to \infty} \mathbb P _ N(\varphi) = 1\}$.

(you may assume that $\pmb T _ {RG}$ is complete, as well as the earlier results about the random graph, e.g. that $\lim _ {N \to \infty} \mathbb P _ N(H _ {m,n}) = 1$ for all $m, n$)


Since $\pmb T _ {RG}$ is complete, it suffices to show that for every $\varphi$ in $\pmb T _ {RG}$, we have

\[\lim_{N \to \infty} \mathbb P_N(\varphi) = 1\]

(this establishes $\pmb T _ {RG} \subseteq \{\varphi : \lim _ {N \to \infty} \mathbb P _ N(\varphi) = 1\}$; completeness then gives everything else: if $\varphi \notin \pmb T _ {RG}$, then $\pmb T _ {RG} \models \lnot \varphi$, so the same argument gives $\lim _ {N \to \infty} \mathbb P _ N(\lnot \varphi) = 1$, i.e. $\lim _ {N \to \infty} \mathbb P _ N(\varphi) = 0$. In particular the limit always exists, is zero or one, and equals one exactly when $\varphi \in \pmb T _ {RG}$).

By the compactness theorem for first-order logic, there exist $m, n \in \mathbb N$ such that $\{F _ 1, F _ 2, F _ 3, H _ {m,n}\}$ entails $\varphi$ (the finite subset might contain several extension axioms, but since $H _ {m, n} \models H _ {m', n'}$ for $m' \le m$ and $n' \le n$, it suffices to keep just one with $m, n$ large enough). Every graph in $\pmb G _ N$ with $N \ge 2$ satisfies $F _ 1, F _ 2, F _ 3$, so any such graph satisfying $H _ {m,n}$ also satisfies $\varphi$. Hence $\mathbb P _ N(\varphi) \ge \mathbb P _ N(H _ {m,n})$ for $N \ge 2$.

But then by the result that says for all $m, n \in \mathbb N$, $\lim _ {N \to \infty} \mathbb P _ N(H _ {m,n}) = 1$, it follows that $\lim _ {N \to \infty} \mathbb P _ N(\varphi) = 1$.

@Prove that any two countable models

\[\mathcal G = (V, E) \text{ and } \mathcal G' = (V', E')\]

of the theory $\pmb T _ \text{RG}$ of the random graph are isomorphic as graphs, i.e. there exists a bijection

\[f : V \to V'\]

such that

\[(u, v) \in E \iff (f(u), f(v)) \in E'\]

for all $u, v \in V$.


This follows from a back-and-forth argument, phrased here in terms of the Ehrenfeucht-Fraïssé game: Duplicator can keep playing forever while maintaining a partial isomorphism between the vertices chosen so far, and on countable models this yields an isomorphism.

Duplicator plays so as to maintain the following invariant: if $\pmb u = (u _ 1, \ldots, u _ n)$ and $\pmb v = (v _ 1, \ldots, v _ n)$ are the tuples of vertices picked in $\mathcal G$ and $\mathcal G'$ during the first $n$ rounds of the game, then:

  • $u _ i = u _ j$ iff $v _ i = v _ j$
  • $(u _ i, u _ j) \in E _ {\mathcal G}$ iff $(v _ i, v _ j) \in E _ {\mathcal G'}$

To maintain the invariant, suppose that Spoiler picks a vertex $u _ {n+1}$ of $\mathcal G$ after the first $n$ rounds (the case where Spoiler picks a vertex of $\mathcal G'$ is symmetric). If $u _ {n+1} = u _ i$ for some $i \le n$, then Duplicator picks $v _ {n+1} = v _ i$.

Otherwise, apply the extension axiom in $\mathcal G'$ with the set $\{v _ i : (u _ {n+1}, u _ i) \in E _ {\mathcal G}\}$ in the role of the $x$'s and the remaining $v _ i$ in the role of the $y$'s: this gives a vertex $v _ {n+1}$, not among $v _ 1, \ldots, v _ n$, such that $(u _ {n+1}, u _ i) \in E _ {\mathcal G}$ iff $(v _ {n+1}, v _ i) \in E _ {\mathcal G'}$ for all $i \le n$, so the invariant is preserved.

Now enumerate the vertices of both (countable) graphs and let Spoiler alternate: in odd rounds pick the first not-yet-chosen vertex of $\mathcal G$, in even rounds the first not-yet-chosen vertex of $\mathcal G'$. Duplicator's strategy maintains the invariant forever, every vertex of both graphs is eventually chosen, and the invariant guarantees that the map $f : V \to V'$ sending $u _ i \mapsto v _ i$ is a well-defined bijection with $(u, v) \in E \iff (f(u), f(v)) \in E'$, i.e. an isomorphism.
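The back-and-forth construction is easy to phrase algorithmically. The sketch below is my own illustration (the names `extend`, `back_and_forth` and the adjacency-dict representation are not from the notes): it implements Duplicator's single step and the alternating driver. On models of $\pmb T _ {RG}$ the witness required by `extend` always exists by the extension axioms; on an arbitrary finite graph it may not, in which case the driver returns `None`.

```python
def extend(adj_G, adj_H, matched, u_new):
    """One Duplicator step: given a partial isomorphism `matched` (a dict
    from vertices of G to vertices of H) and a new vertex u_new of G,
    find an unused vertex of H with the same adjacency pattern towards
    the already-matched vertices, or None if no such vertex exists."""
    used = set(matched.values())
    neighbours = {v for u, v in matched.items() if u in adj_G[u_new]}
    non_neighbours = {v for u, v in matched.items() if u not in adj_G[u_new]}
    for z in adj_H:
        if (z not in used
                and neighbours <= adj_H[z]
                and not (non_neighbours & adj_H[z])):
            return z
    return None

def back_and_forth(adj_G, adj_H, rounds):
    """Alternate sides for `rounds` steps: even-numbered rounds extend the
    map to the next unmatched vertex of G, odd-numbered rounds to the next
    unmatched vertex of H.  Returns the partial isomorphism built so far,
    or None if some step finds no witness."""
    forward = {}                                   # partial isomorphism G -> H
    g_order, h_order = list(adj_G), list(adj_H)    # fixed enumerations
    for r in range(rounds):
        if r % 2 == 0:
            u = next((x for x in g_order if x not in forward), None)
            if u is None:                          # all of G already matched
                break
            v = extend(adj_G, adj_H, forward, u)
        else:
            v = next((y for y in h_order if y not in forward.values()), None)
            if v is None:                          # all of H already matched
                break
            backward = {w: x for x, w in forward.items()}
            u = extend(adj_H, adj_G, backward, v)  # same step, roles swapped
        if u is None or v is None:
            return None                            # no witness available
        forward[u] = v
    return forward
```

Running the alternation over enumerations of two countable models matches every vertex of both graphs, and the union of the partial maps is exactly the isomorphism $f$ constructed in the proof above.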



