
Commit 3311ec1

jrchatruc and EdVeralli authored
Stark documentation FAQ (#214)
* [WIP] stark prover documentation
* Typo: double "of" in "of of only"
* More progress
* small addition
* Small correction
* Switch to katex for correct rendering
* Move to proving_system directory
* Progress
* Finish recap side
* Move to main `docs` directory
* Add Makefile target for docs and README explanation on how to serve it
* Add gh pages deployment workflow
* Test
* Restore yml
* Final touches
* [WIP] Stark prover FAQ
* Add coset section
* Minor correction

---------

Co-authored-by: EdVeralli <[email protected]>
1 parent be810eb commit 3311ec1


11 files changed (+424 −0 lines)


.github/workflows/gh-pages.yml

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
```yaml
name: github pages

on:
  push:
    branches:
      - main
  pull_request:

jobs:
  deploy:
    runs-on: ubuntu-20.04
    concurrency:
      group: ${{ github.workflow }}-${{ github.ref }}
    steps:
      - uses: actions/checkout@v2

      - name: Setup mdBook
        uses: peaceiris/actions-mdbook@v1
        with:
          mdbook-version: '0.4.10'

      - name: Install Katex
        run: cargo install mdbook-katex

      - run: mdbook build docs

      - name: Deploy
        uses: peaceiris/actions-gh-pages@v3
        if: ${{ github.ref == 'refs/heads/main' }}
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./docs/book
```

Makefile

Lines changed: 5 additions & 0 deletions
```diff
@@ -1,3 +1,5 @@
+.PHONY: test clippy docker-shell nix-shell benchmarks benchmark docs
+
 test:
 	cargo test
 
@@ -17,3 +19,6 @@ benchmarks:
 # BENCHMARK should be one of the [[bench]] names in Cargo.toml
 benchmark:
 	cargo criterion --bench ${BENCH}
+
+docs:
+	cd docs && mdbook serve --open
```

README.md

Lines changed: 8 additions & 0 deletions
````diff
@@ -103,6 +103,14 @@ If you use ```Lambdaworks``` libraries in your research projects, please cite th
 
 ### Gadgets
 
+## Documentation
+
+To serve the documentation locally, first install both [mdbook](https://rust-lang.github.io/mdBook/guide/installation.html) and the [Katex preprocessor](https://github.com/lzanini/mdbook-katex#getting-started) to render LaTeX, then run
+
+```
+make docs
+```
+
 ## 📊 Benchmarks
 
 To run the benchmarks you will need `cargo-criterion`, to install do:
````

docs/.gitignore

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
book

docs/book.toml

Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
```toml
[book]
authors = ["Javier Chatruc"]
language = "en"
multilingual = false
src = "src"
title = "docs"

[preprocessor.katex]
```

docs/src/SUMMARY.md

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
# Lambdaworks Documentation

- [Introduction](./introduction.md)

- [Proving Systems]()
  - [STARKs](./proving_systems/starks/starks.md)
    - [Recap](./proving_systems/starks/recap.md)
    - [Implementation](./proving_systems/starks/implementation.md)
    - [FAQ](./proving_systems/starks/faq.md)

docs/src/introduction.md

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
# Introduction

This site hosts the main documentation for Lambdaworks as a whole. It is still a work in progress.

docs/src/proving_systems/starks/faq.md

Lines changed: 86 additions & 0 deletions
@@ -0,0 +1,86 @@
# Frequently Asked Questions

## Why use roots of unity?

Whenever we interpolate or evaluate trace, boundary and constraint polynomials, we use some $2^n$-th roots of unity. There are a few reasons for this:

- Using roots of unity means we can use the [Fast Fourier Transform](https://en.wikipedia.org/wiki/Fast_Fourier_transform) and its inverse to evaluate and interpolate polynomials. This is much faster than naive Lagrange interpolation. Since a huge part of the STARK protocol involves both evaluating and interpolating, this is a huge performance improvement.
- When computing boundary and constraint polynomials, we divide them by their `zerofiers`, polynomials that vanish on the rows of the trace where the constraints are supposed to hold. These polynomials take the form

$$
Z(X) = \prod (X - x_i)
$$

where the $x_i$ are the points where we want it to vanish.

When implementing this, evaluating the zerofier can be very expensive, as it involves a huge product. However, if we are using roots of unity, we can use the following trick. The vanishing polynomial for all the $2^n$ roots of unity is

$$
X^{2^n} - 1
$$

Instead of expressing the zerofier as a product over the places where it should vanish, we express it as the vanishing polynomial above divided by the `exemptions` polynomial: the polynomial whose roots are the places where constraints don't need to hold.

$$
Z(X) = \dfrac{X^{2^n} - 1}{\prod{(X - e_i)}}
$$

where the $e_i$ are now the points where we don't want it to vanish. This `exemptions` polynomial in the denominator is usually much smaller, and because the vanishing polynomial in the numerator has only two terms, evaluating it is really fast.
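
Below is a minimal, self-contained sketch of this trick over the toy prime field $\mathbb{F}_{17}$: it checks that evaluating the zerofier as the vanishing polynomial divided by the `exemptions` polynomial gives the same result as the naive product. The modulus `17`, the generator `9`, the exemption point and the helper functions are illustrative choices for this example only; they are not lambdaworks' actual field types or API.

```rust
// Zerofier evaluation in the toy field F_17 (illustrative parameters only).
const P: u64 = 17;

fn pow_mod(mut base: u64, mut exp: u64) -> u64 {
    let mut acc = 1;
    base %= P;
    while exp > 0 {
        if exp & 1 == 1 {
            acc = acc * base % P;
        }
        base = base * base % P;
        exp >>= 1;
    }
    acc
}

fn inv_mod(x: u64) -> u64 {
    // Fermat's little theorem: x^(P - 2) is the inverse of x mod P.
    pow_mod(x, P - 2)
}

fn main() {
    // 9 is a primitive 8th root of unity in F_17 (its multiplicative order is 8).
    let g: u64 = 9;
    let n: u64 = 8;

    // Say a constraint is enforced on rows 0..6, so the zerofier vanishes on
    // g^0, ..., g^6 and the only exemption point is e = g^7 (the last row).
    let e = pow_mod(g, n - 1);

    // Evaluation point outside the interpolation domain.
    let x: u64 = 5;

    // Naive evaluation: multiply (x - g^i) over every root where Z vanishes.
    let mut naive: u64 = 1;
    for i in 0..(n - 1) {
        naive = naive * ((x + P - pow_mod(g, i)) % P) % P;
    }

    // Fast evaluation: (x^n - 1) / (x - e), i.e. the vanishing polynomial of
    // the whole domain divided by the exemptions polynomial.
    let fast = (pow_mod(x, n) + P - 1) % P * inv_mod((x + P - e) % P) % P;

    assert_eq!(naive, fast);
    println!("Z({x}) = {naive}");
}
```

The speedup comes from the shape of the formula: the numerator $X^{2^n} - 1$ costs a single exponentiation regardless of the domain size, while the `exemptions` product in the denominator typically has only a handful of factors.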

## What is a primitive root of unity?

The $n$-th roots of unity are the numbers $x$ that satisfy

$$
x^n = 1
$$

There are $n$ such numbers, because they are the roots of the polynomial $X^n - 1$. The set of $n$-th roots of unity always has a `generator`, a root $g$ that can be used to obtain every other root of unity by exponentiating. What this means is that the set of $n$-th roots of unity is

$$
\{g^i : 0 \leq i < n\}
$$

Any such generator $g$ is called a *primitive root of unity*. It's called primitive because it allows us to recover any other root.

Here are a few important things to keep in mind, some of which we use throughout our implementation:

- There are always several primitive roots. If $g$ is primitive, then any power $g^k$ with $k$ coprime to $n$ is also primitive. As an example, if $g$ is a primitive $8$-th root of unity, then $g^3$ is also primitive.
- We generally will not care about which primitive root we choose; what we do care about is being *consistent*. We should always choose the same one throughout our code, otherwise computations will go wrong.
- Because $g^n = 1$, the powers of $g$ wrap around. This means

$$
g^{n + 1} = g \\
g^{n + 2} = g^2
$$

and so on.
- If $w$ is a primitive $2^{n + 1}$-th root of unity, then $w^2$ is a primitive $2^n$-th root of unity. In general, if $w$ is a primitive $2^{n + k}$-th root of unity, then $w^{2^k}$ is a primitive $2^n$-th root of unity (the sketch below checks these properties in a toy field).
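
These properties are easy to check concretely in a small field. The sketch below assumes the toy field $\mathbb{F}_{17}$, whose multiplicative group has order $16$ and is generated by $3$; the field, the generator and the helper functions are illustrative only and unrelated to the prover's real parameters.

```rust
// Checking the primitive-root properties above in the toy field F_17.
const P: u64 = 17;

fn pow_mod(mut base: u64, mut exp: u64) -> u64 {
    let mut acc = 1;
    base %= P;
    while exp > 0 {
        if exp & 1 == 1 {
            acc = acc * base % P;
        }
        base = base * base % P;
        exp >>= 1;
    }
    acc
}

// Multiplicative order of x: the smallest positive k with x^k = 1.
fn order(x: u64) -> u64 {
    (1..).find(|&k| pow_mod(x, k) == 1).unwrap()
}

fn main() {
    // 3 generates the whole multiplicative group of F_17, so it is a
    // primitive 16th root of unity.
    let w: u64 = 3;
    assert_eq!(order(w), 16);

    // Powers wrap around: w^(16 + 1) = w and w^(16 + 2) = w^2.
    assert_eq!(pow_mod(w, 17), w);
    assert_eq!(pow_mod(w, 18), pow_mod(w, 2));

    // w^3 is also a primitive 16th root, because gcd(3, 16) = 1.
    assert_eq!(order(pow_mod(w, 3)), 16);

    // Squaring halves the order: w^2 is a primitive 8th root of unity,
    // and w^(2^2) = w^4 is a primitive 4th root of unity.
    assert_eq!(order(pow_mod(w, 2)), 8);
    assert_eq!(order(pow_mod(w, 4)), 4);

    println!("all properties hold in F_{P}");
}
```

In particular, squaring a primitive $16$-th root of unity yields a primitive $8$-th root, which is exactly the relation used when moving between the LDE domain and the trace domain.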

## Why use Cosets?

When we perform `FRI` on the `DEEP` composition polynomial, the low degree extension we use is not actually over a set of higher order roots of unity than the ones used for the trace, but rather over a *coset* of it. A coset is simply the set obtained by multiplying every element of the original set by the same fixed element, which we call the `offset`. In our case, a coset of the $2^n$-th roots of unity with primitive root $\omega$ and offset `h` is the set

$$
\{h \omega^i : 0 \leq i < 2^n\}
$$

So why not just do the LDE without the offset? The problem is in how we construct and evaluate the composition polynomial `H`. Let's say our trace polynomial was interpolated over the $2^n$-th roots of unity with primitive root $g$, and we are doing the LDE over the $2^{n + 1}$-th roots of unity with primitive root $\omega$, so that $\omega^2 = g$ (i.e. the blowup factor is `2`).

Recall that `H` is a sum of terms that include boundary and transition constraint polynomials, and each one of them includes a division by a `zerofier`; a polynomial that vanishes on some roots of unity $g^i$. This is because the zerofier is what tells us which rows of the trace our constraint should apply to.

When doing `FRI`, we have to provide evaluations over the LDE domain we are using. If we don't include the offset, our domain is

$$
\{\omega^i : 0 \leq i < 2^{n + 1}\}
$$

Note that, because $\omega^2 = g$, some of the elements of this set (actually, half of them) are powers of $g$. If while doing `FRI` we evaluate `H` on them, the zerofier could vanish and we'd be dividing by zero. We introduce the offset to make sure this can't happen.

NOTE: a careful reader might note that we can actually evaluate `H` on the elements $g^i$, since on a valid trace the zerofiers will divide the polynomials in their numerators exactly. The problem still remains, however, because of performance: we don't want to do polynomial division if we don't need to, as it's much cheaper to just evaluate numerator and denominator and then divide. Of course, this only works if the denominator doesn't vanish; hence, cosets.
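
To make the argument concrete, here is a toy sketch assuming the field $\mathbb{F}_{257}$, a trace domain of size $8$, an LDE domain of size $16$ and offset $h = 3$ (a generator of the whole multiplicative group, so it lies outside the LDE subgroup); these parameters are illustrative only and not the ones the prover actually uses.

```rust
// Coset LDE domain vs. plain LDE domain in the toy field F_257.
const P: u64 = 257;

fn pow_mod(mut base: u64, mut exp: u64) -> u64 {
    let mut acc = 1;
    base %= P;
    while exp > 0 {
        if exp & 1 == 1 {
            acc = acc * base % P;
        }
        base = base * base % P;
        exp >>= 1;
    }
    acc
}

fn main() {
    // 3 generates the multiplicative group of F_257 (which has order 256),
    // so 3^(256 / 16) is a primitive 16th root of unity.
    let h: u64 = 3; // coset offset, not a 16th root of unity itself
    let omega = pow_mod(h, 256 / 16); // primitive 16th root of unity (LDE domain)
    let g = pow_mod(omega, 2); // primitive 8th root of unity (trace domain)

    let trace_domain: Vec<u64> = (0..8u64).map(|i| pow_mod(g, i)).collect();
    let plain_lde: Vec<u64> = (0..16u64).map(|i| pow_mod(omega, i)).collect();
    let coset_lde: Vec<u64> = (0..16u64).map(|i| h * pow_mod(omega, i) % P).collect();

    // Without the offset, every trace-domain point also lies in the LDE domain,
    // so the zerofier denominators would vanish there.
    assert!(trace_domain.iter().all(|x| plain_lde.contains(x)));

    // With the offset, the coset misses the trace domain entirely.
    assert!(trace_domain.iter().all(|x| !coset_lde.contains(x)));

    println!("the coset LDE domain avoids the trace domain");
}
```

Any offset that does not itself belong to the LDE subgroup works the same way: the resulting coset is disjoint from the subgroup, so the zerofier denominators never vanish on it.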

----------

TODO:
- What's the ce blowup factor?
- What's the out of domain frame?

docs/src/proving_systems/starks/implementation.md

Whitespace-only changes.
