$$=\lim_{i\to\infty} \mathbb{P}[{\color{green}\sigma_i}({\color{darkorange}H}) \text{ is a subgraph of }G_i ],$$ where ${\color{green}\sigma_i}\colon V({\color{darkorange}H})\to V(G_i)$ is a random injective mapping.
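For example, for $H = K_2$ this is the edge density: taking $G_i = K_{i,i}$, a uniformly random injective map of the two vertices hits an edge with probability $\tfrac{i}{2i-1}\to\tfrac{1}{2}$.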
Maximize the edge density in a triangle-free sequence $\mathcal{G}$ of graphs of increasing size:
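In the notation above: $$\sup_{\mathcal{G}}\ \lim_{i\to\infty} \mathbb{P}[\sigma_i(K_2) \text{ is a subgraph of } G_i] \quad \text{such that no } G_i \text{ contains a triangle}.$$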
We saw that the triangle-free graphs $K_{i,i}$ attain edge density $\tfrac{1}{2}$, but how can we prove an upper bound?
To multiply two subgraph densities, we glue together the graphs:
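With the (non-induced) subgraph densities above, unlabeled graphs share no vertices when glued, so in the limit the product of two densities is the density of the disjoint union; for example, the edge density squared is the density of two disjoint edges, $K_2 \cdot K_2 = K_2 \sqcup K_2$.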
These relationships are independent of $\mathcal{G}$, motivating the definition
where a graph $H$ now represents the function $$\small H(\mathcal{G}) = \phi_{\mathcal{G}}(H)= \lim_{i\to\infty} \mathbb{P}[\sigma_i(H) \text{ is a subgraph of }G_i].$$
We can fix entries of the $\sigma_i$ to fix (flag) some vertices, and extend the gluing operation to partially labeled graphs (flags):
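For example, writing $e^{\sigma}$ for the edge with one labeled endpoint, gluing two copies along the label gives (in the limit) the path on three vertices labeled at its middle vertex: $e^{\sigma} \cdot e^{\sigma} = P_3^{\sigma}$.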
We can average flags over all choices of labels, unlabeling them:
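Writing $[\![\,\cdot\,]\!]$ for this averaging, we get for instance $[\![\, e^{\sigma} \,]\!] = K_2$ and $[\![\, e^{\sigma} \cdot e^{\sigma} \,]\!] = [\![\, P_3^{\sigma} \,]\!] = P_3$: averaging the normalized degree of the labeled vertex over all label choices gives the edge density, and averaging its square gives the density of the path on three vertices.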
We can now find an upper bound for the edge density in triangle-free graphs:
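A minimal sketch with the flags above: averaged squares are nonnegative, so $$0 \le [\![\, (e^{\sigma} - \tfrac{1}{2})^2 \,]\!] = P_3 - K_2 + \tfrac{1}{4},$$ while triangle-freeness forces the neighborhoods of two adjacent vertices to be disjoint, giving $P_3 \le \tfrac{1}{2} K_2$. Combining the two inequalities yields $K_2 \le \tfrac{1}{2}$.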
As with polynomial optimization, we can model Flag-SOS using semidefinite programming.
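One standard formulation (a sketch): fix finitely many flags $F_1, \dots, F_k$ and minimize $c$ such that $$c - K_2 - \sum_{i,j} Q_{ij}\, [\![\, F_i \cdot F_j \,]\!] \ge 0 \quad \text{for some } Q \succeq 0.$$ Expanding everything in densities of graphs on a fixed number of vertices and requiring every coefficient to be nonnegative gives constraints that are linear in $c$ and the entries of $Q$; together with $Q \succeq 0$ this is a semidefinite program.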
We saw that triangle-free graphs have edge density at most $\frac{1}{2}$.
What happens if we allow some triangles?
Let $H$ be a bipartite graph, and $\small\rho\in [0,1]$.
Of all the graphs with edge density $\small \rho$, the graph(-limit) obtained by picking edges uniformly at random is conjectured (Sidorenko) to minimize the homomorphism density of $H$.
For every bipartite graph $H$, $$H \;\ge\; K_2^{\,|E(H)|}.$$
[Razborov, 2007] introduces two derivatives:
Lovász translates the definition to graphons.
Diao, Guillot, Khare, and Rajaratnam consider Gâteaux derivatives in the space of graphons.
Let ${\color{darkorange}f}$ be a linear combination of subgraph densities, with minimizer $\mathcal{G}$.
We can add additional constraints to optimization problems!
We do not need to assume that we are in the interior of the profile.
We can find better SOS certificates in the quadratic module generated by these constraints.
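Concretely, in analogy with polynomial optimization: for constraints $g_1 \ge 0, \dots, g_m \ge 0$ one searches for certificates of the form $$f - c = [\![\, s_0 \,]\!] + \sum_{i=1}^m [\![\, s_i\, g_i \,]\!],$$ where each $s_i$ is a sum of squares of linear combinations of flags; finding such a certificate is again a semidefinite program.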
We show that the proofs generalize to more complicated actions on three or more vertices.
The resulting functions are derivations of the flag algebra (i.e. Leibniz's rule $D(fg) = D(f)\,g + f\,D(g)$ holds).
Main observation: We only ever need to apply the operation $\Theta(n^2)$ times! We can still construct "independent" sets probabilistically.
And many more complicated ones! For example, we can blow up vertices, reverse directed edges, connect vertices to all others...
We also found numerical certificates for the next two larger cases, for which no "pure" flag-SOS based proof is known.
pkg> add FlagSOS
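Once installed, the package is loaded as usual:
julia> using FlagSOS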