Fix relative links in notebooks and documentation #346

Open. Wants to merge 2 commits into base: main
14 changes: 7 additions & 7 deletions docs/index.md
@@ -29,7 +29,7 @@ hide:
- :fontawesome-solid-screwdriver-wrench:{ .lg .middle } __Build a circuit...__

---
[:octicons-arrow-right-24: from region graphs](https://cirkit-docs.readthedocs.io/en/latest/notebooks/region-graphs-and-parametrisation)
[:octicons-arrow-right-24: from region graphs](notebooks/region-graphs-and-parametrisation.ipynb)

<!--

@@ -40,20 +40,20 @@ hide:
- :fontawesome-solid-gears:{ .lg .middle } __Learn a circuit...__

---
[:octicons-arrow-right-24: for distribution estimation :fontawesome-solid-chart-area:{.lg}](https://cirkit-docs.readthedocs.io/en/latest/notebooks/learning-a-circuit)
[:octicons-arrow-right-24: for distribution estimation :fontawesome-solid-chart-area:{.lg}](notebooks/learning-a-circuit.ipynb)

[:octicons-arrow-right-24: for tensor compression :fontawesome-solid-file-zipper:](https://cirkit-docs.readthedocs.io/en/latest/notebooks/compression-cp-factorization)
[:octicons-arrow-right-24: for tensor compression :fontawesome-solid-file-zipper:](notebooks/compression-cp-factorization.ipynb)

[:octicons-arrow-right-24: as a (generative) multi-class classifier](https://cirkit-docs.readthedocs.io/en/latest/notebooks/generative-vs-discriminative-circuit)
[:octicons-arrow-right-24: as a (generative) multi-class classifier](notebooks/generative-vs-discriminative-circuit.ipynb)

[:octicons-arrow-right-24: ... all of the above, with PICs :fontawesome-solid-camera:{.lg}](https://cirkit-docs.readthedocs.io/en/latest/notebooks/learning-a-circuit-with-pic)
[:octicons-arrow-right-24: ... all of the above, with PICs :fontawesome-solid-camera:{.lg}](notebooks/learning-a-circuit-with-pic.ipynb)

- :material-scale-balance:{ .lg .middle }__Advanced reasoning...__

---
[:octicons-arrow-right-24: with squared circuits $($:fontawesome-solid-plug-circle-minus:{.lg}$)^2$](https://cirkit-docs.readthedocs.io/en/latest/notebooks/sum-of-squares-circuits)
[:octicons-arrow-right-24: with squared circuits $($:fontawesome-solid-plug-circle-minus:{.lg}$)^2$](notebooks/sum-of-squares-circuits.ipynb)

[:octicons-arrow-right-24: with logic circuits :fontawesome-solid-square-binary:{.lg}...](https://cirkit-docs.readthedocs.io/en/latest/notebooks/logic-circuits)
[:octicons-arrow-right-24: with logic circuits :fontawesome-solid-square-binary:{.lg}...](notebooks/logic-circuits.ipynb)

<br/>
...to enforce constraints in neural nets
2 changes: 1 addition & 1 deletion mkdocs.yml
@@ -1,5 +1,5 @@
site_name: cirkit
site_url: https://example.com
site_url: https://cirkit-docs.readthedocs.io
repo_url: https://github.com/april-tools/cirkit
nav:
- Getting Started: 'index.md'
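For reviewers checking the link targets: `docs/index.md` now links to `notebooks/<name>.ipynb` source paths, while links inside notebook cells use extension-less relative paths such as `../learning-a-circuit`. Assuming the site is built with a notebook-rendering plugin and MkDocs' default directory URLs (neither is confirmed by this diff), both styles resolve to the same rendered pages. A sketch of the relevant `mkdocs.yml` fragment under those assumptions:

```yaml
site_name: cirkit
site_url: https://cirkit-docs.readthedocs.io   # base URL used for canonical links and the sitemap
repo_url: https://github.com/april-tools/cirkit
# use_directory_urls: true   # MkDocs default: each page is served as <name>/index.html,
#                            # which is why ../learning-a-circuit resolves from a sibling notebook
```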
10 changes: 5 additions & 5 deletions notebooks/compilation-options.ipynb
@@ -13,7 +13,7 @@
"id": "78aa8c28-cc8b-45bd-9f10-3dd356c74777",
"metadata": {},
"source": [
"We explore the available options that can be specified when compiling a symbolic circuit. See the notebook [learning-a-circuit.ipynb](./learning-a-circuit.ipynb) for more details about symbolic circuit representations and their compilation. Currently, symbolic circuits can only be compiled using a PyTorch 2+ backend, which allows you to specify a few options, such as the semiring that defines how to evaluate sum and products and a couple of flags related to optimizations. Future versions of ```cirkit``` may include compilation backends other than PyTorch, each with their own set of features and compilation options. However, the philosophy of ```cirkit``` is to abstract away the design of circuits and their operators from the underlying implementation and deep learning library dependencies. This will foster opportunities arising from connecting different platforms and compiler tool chains, without affecting the rest of the library.\n",
"We explore the available options that can be specified when compiling a symbolic circuit. See the notebook [on learning a probabilistic circuit](../learning-a-circuit) for more details about symbolic circuit representations and their compilation. Currently, symbolic circuits can only be compiled using a PyTorch 2+ backend, which allows you to specify a few options, such as the semiring that defines how to evaluate sums and products, and a couple of flags related to optimizations. Future versions of ```cirkit``` may include compilation backends other than PyTorch, each with their own set of features and compilation options. However, the philosophy of ```cirkit``` is to abstract away the design of circuits and their operators from the underlying implementation and deep learning library dependencies. This will foster opportunities arising from connecting different platforms and compiler toolchains, without affecting the rest of the library.\n",
"\n",
"We start by instantiating a symbolic circuit for image data, as shown in the following code. Note that this is completely disentangled from the compilation step and the compilation options we explore next."
]
@@ -155,7 +155,7 @@
"id": "19f901a9-fb2a-4a07-97c7-398b262c79d7",
"metadata": {},
"source": [
"## (1) Choosing a Semiring"
"## Choosing a Semiring"
]
},
{
@@ -290,7 +290,7 @@
"id": "f5c5a2bd-4517-4c91-ab09-375998e5095c",
"metadata": {},
"source": [
"## (2) Folding your Circuit"
"## Folding your Circuit"
]
},
{
@@ -450,7 +450,7 @@
"id": "a13286bb-afa1-4284-8087-04fba7d65289",
"metadata": {},
"source": [
"## (3) Optimizing the Circuit Layers"
"## Optimizing the Circuit Layers"
]
},
{
@@ -636,7 +636,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.16"
}
},
"nbformat": 4,
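The markdown cells of this notebook discuss choosing a semiring at compilation time. As a cirkit-independent illustration (none of these function names are cirkit's API), the sketch below contrasts the standard sum-product evaluation of a weighted sum of products with its log-space counterpart, where products become sums of logarithms and the outer sum becomes a log-sum-exp:

```python
import math

def eval_sum_product(weights, values):
    # Standard sum-product semiring: a weighted sum of products of inputs.
    return sum(w * math.prod(v) for w, v in zip(weights, values))

def eval_lse_sum(log_weights, log_values):
    # Log-space semiring: products become sums of logs, and the outer
    # sum becomes a log-sum-exp, which avoids numerical underflow.
    terms = [lw + sum(lv) for lw, lv in zip(log_weights, log_values)]
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))

weights = [0.2, 0.8]
values = [[0.5, 0.1], [0.25, 0.3]]
p = eval_sum_product(weights, values)
logp = eval_lse_sum([math.log(w) for w in weights],
                    [[math.log(x) for x in v] for v in values])
# Both evaluations agree up to the log transform.
```

Both routes compute the same quantity; circuits with many factors typically use the log-space route for numerical stability.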
2 changes: 1 addition & 1 deletion notebooks/compression-cp-factorization.ipynb
@@ -356,7 +356,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.16"
}
},
"nbformat": 4,
6 changes: 3 additions & 3 deletions notebooks/generative-vs-discriminative-circuit.ipynb
@@ -184,7 +184,7 @@
"\n",
"### Estimating $p(\\mathbf{x} \\mid y)$\n",
"\n",
"Recall that in the [earlier notebook]((https://github.com/april-tools/cirkit/blob/main/notebooks/learning-a-circuit.ipynb)) we were modelling $p(\\mathbf{x})$; our circuit had a single log-probability output, which we interpreted as $\\log p(\\mathbf{x})$.\n",
"Recall that in the [learning a probabilistic circuit](../learning-a-circuit) notebook we were modelling $p(\\mathbf{x})$; our circuit had a single log-probability output, which we interpreted as $\\log p(\\mathbf{x})$.\n",
"In our case, we could instead fit $10$ separate circuits, one per image class. However, this is wasteful.\n",
"Images have shared characteristics that all $10$ circuits would have to learn separately, from scratch.\n",
"Can we do better?\n",
@@ -354,7 +354,7 @@
"\\operatorname{argmin}_{\\theta} \\mathcal{L}(\\theta)\\quad \\text{where} \\quad \\mathcal{L}(\\theta) = \\lambda \\mathcal{L}_{dis}(\\theta) + (1-\\lambda) \\frac{\\mathcal{L}_{gen}(\\theta)}{|X|}\n",
"$$\n",
"\n",
"where we divide $\\mathcal{L}_{gen}(\\theta)$ by the number of pixels in the image, $|X|$, such that the losses are on a comparable scale[<sup>2</sup>](#fn2).\n",
"where we divide $\\mathcal{L}_{gen}(\\theta)$ by the number of pixels in the image, $|X|$, such that the losses are on a comparable scale.[<sup>2</sup>](#fn2)\n",
"\n",
"* $\\lambda = 0 \\rightarrow$ Model trained only generatively\n",
"* $\\lambda = 1 \\rightarrow$ Model trained only discriminatively\n",
@@ -1224,7 +1224,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.16"
}
},
"nbformat": 4,
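The combined objective shown in this notebook's hunk is simple enough to sketch directly; the loss values below are made up for illustration:

```python
def hybrid_loss(loss_dis, loss_gen, num_pixels, lam):
    # L(theta) = lam * L_dis + (1 - lam) * L_gen / |X|, where dividing the
    # generative loss by the number of pixels |X| puts both terms on a
    # comparable scale.
    return lam * loss_dis + (1.0 - lam) * loss_gen / num_pixels

# lam = 0 -> trained only generatively; lam = 1 -> only discriminatively.
loss = hybrid_loss(loss_dis=2.3, loss_gen=1800.0, num_pixels=28 * 28, lam=0.5)
```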
10 changes: 5 additions & 5 deletions notebooks/learning-a-circuit-with-pic.ipynb
@@ -15,11 +15,11 @@
"source": [
"In this notebook, we will show an alternative way of learning the parameters of tensorized folded (probabilistic) circuits.\n",
"\n",
"This technique is actually based on another model class called *Probabilistic Integral Circuit* (PIC), which extends Probabilistic Circuits (PCs) by adding *integral units*, which allow modelling continuous latent variables.\n",
"This technique is actually based on another model class called [*Probabilistic Integral Circuit* (PIC)](https://arxiv.org/abs/2406.06494), which extends Probabilistic Circuits (PCs) by adding *integral units*, which allow modelling continuous latent variables.\n",
"\n",
"Fortunately enough, we do **not** need to fully understand PICs to apply them! In fact, from an application point of view, all we need to do is to replace every folded tensor parameter with a neural net whose output is an equally-sized tensor! Therefore, the actual parameters we are going to optimize are those of such neural nets, and not the original tensors. This is it -- nothing less, nothing more.\n",
"\n",
"To showcase this alternative parameter learning scheme, we will first instantiate a folded circuit as shown in ```learning-a-circuit.ipynb```."
"To showcase this alternative parameter learning scheme, we will first instantiate a folded circuit as shown in the [learning a probabilistic circuit](../learning-a-circuit) notebook."
]
},
{
@@ -65,7 +65,7 @@
"id": "73cd388c",
"metadata": {},
"source": [
"The one above is the very same circuit from ```learning-a-circuit.ipynb```. Let's now print some stuff related to its first and second layer."
"The one above is the very same circuit from the [learning a circuit](../learning-a-circuit) notebook. Let's now inspect its first and second layers."
]
},
{
@@ -277,7 +277,7 @@
"id": "dbe004cd",
"metadata": {},
"source": [
"That is it, we are done! 🎉 We can now even forget about PICs, and just train as in ```learning-a-circuit.ipynb``` as we do next. However, if you want to learn more about PICs (and understand the input arguments of ```pc2qpc```) please check: https://arxiv.org/abs/2406.06494."
"That is it, we are done! 🎉 We can now even forget about PICs, and just train as in the [learning a circuit](../learning-a-circuit) notebook as we do next. However, if you want to learn more about PICs (and understand the input arguments of ```pc2qpc```) please check [the original publication](https://arxiv.org/abs/2406.06494)."
]
},
{
@@ -437,7 +437,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.16"
}
},
"nbformat": 4,
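The reparameterization described in this notebook (each folded parameter tensor becomes the output of a neural net evaluated on fixed integration points) can be caricatured in a few lines; the shapes are illustrative and the "net" is collapsed to an affine-plus-tanh map, not what `pc2qpc` actually builds:

```python
import math

# A (num_folds x num_units) parameter tensor is not stored directly; it is
# produced from fixed quadrature points and a few net weights, so only the
# net weights are optimized. All shapes and values here are illustrative.
num_folds, num_units = 2, 4
points = [(i + 0.5) / num_units for i in range(num_units)]

def materialize(scale, shift):
    # "Neural net" reduced to an affine map followed by tanh for the sketch;
    # a real PIC uses a multi-layer net per parameter tensor.
    return [[math.tanh(scale[f] * p + shift[f]) for p in points]
            for f in range(num_folds)]

params = materialize(scale=[1.0, -2.0], shift=[0.0, 0.5])
```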
4 changes: 2 additions & 2 deletions notebooks/learning-a-circuit.ipynb
@@ -34,7 +34,7 @@
"The **symbolic circuit** is a symbolic abstraction of a tensorized circuit, i.e., a circuit consisting of sum/product/input layers, each grouping several sum/product/input units, respectively. This symbolic representation stores the connections between the layers, the number of units in each layer, and useful metadata about the parameters, such as their shape and parameterization choices. Note that a symbolic circuit does not allocate parameters and cannot be used for learning or inference. By _compiling a symbolic circuit_ using PyTorch, we will later recover a probabilistic circuit that can be learned or be used for inference purposes.\n",
"\n",
"In ```cirkit.templates```, we provide several templates that can be used to construct symbolic circuits of different structures. In this notebook, we use a high-level template to build a symbolic circuit specifically for image data. To do so, we need to specify some arguments that will possibly yield different architectures and parameterizations. That is, we specify the shape of the images, and select one of the region graphs that exploits the closeness of patches of pixels, such as the _QuadGraph_ region graph.\n",
"See the [region-graph-and-parameterisations.ipynb](region-graph-and-parameterisations.ipynb) notebook for more details about region graphs. Moreover, we select the type of input and inner layers, the number of units within them, and how to parameterize the sum layers. See comments in the code below for more details about each argument."
"See the [notebook on region graphs and sum product layers](../region-graphs-and-parametrisation) for more details about region graphs. Moreover, we select the type of input and inner layers, the number of units within them, and how to parameterize the sum layers. See comments in the code below for more details about each argument."
]
},
{
@@ -408,7 +408,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.16"
}
},
"nbformat": 4,
8 changes: 4 additions & 4 deletions notebooks/logic-circuits.ipynb
@@ -50,7 +50,7 @@
"id": "91be2877",
"metadata": {},
"source": [
"## [Compiling a propositional formula using `cirkit`](#Compiling-a-propositional-formula)\n",
"## Compiling a propositional formula using `cirkit`\n",
"\n",
"As we said before, a propositional formula can be easily represented as a circuit with a tree-structure. For instance, $\\alpha$ presented before can be represented as the following tree:\n",
"\n",
@@ -243,7 +243,7 @@
"id": "718b33d9",
"metadata": {},
"source": [
"We can now compile the symbolic circuit to a computational graph just like we did in **learning-a-circuit.ipynb** notebook. Note that, since we assumed the boolean semiring to define $\\top$ as $1$ and $\\bot$ as $0$, we have to rely on `cirkit`'s `sum-product` semiring to obtain consistent results."
"We can now compile the symbolic circuit to a computational graph just like we did in the [learning a probabilistic circuit](../learning-a-circuit) notebook. Note that, since we assumed the boolean semiring to define $\\top$ as $1$ and $\\bot$ as $0$, we have to rely on `cirkit`'s `sum-product` semiring to obtain consistent results."
]
},
{
@@ -1016,7 +1016,7 @@
"Notice that the circuit is smooth (courtesy of `cirkit`), as well as decomposable and structured decomposable (courtesy of the SDD target language).\n",
"\n",
"In logic circuits, these properties can be exploited for satisfiability checking and, at the same time, model counting - i.e., counting the interpretations $\\mathcal{I}$ that model $\\alpha$.\n",
"We can achieve this in `cirkit` by marginalizing over all the literals, similarly as we did in **learning-a-circuit.ipynb**."
"We can achieve this in `cirkit` by marginalizing over all the literals, similarly as we did in the [learning a probabilistic circuit](../learning-a-circuit) notebook."
]
},
{
@@ -1175,7 +1175,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.10.16"
}
},
"nbformat": 4,
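The marginalization-based model counting this notebook describes (sum the circuit's 0/1 output over all literals in the sum-product semiring) can be checked by brute force for a small formula; the formula here, α = (a ∨ b) ∧ (¬a ∨ c), is illustrative and is not the notebook's α:

```python
from itertools import product

def alpha(a, b, c):
    # Illustrative formula (a OR b) AND (NOT a OR c), with True as 1 and
    # False as 0, matching the boolean-semiring convention in the notebook.
    return (a or b) and ((not a) or c)

# Summing the 0/1 output over all interpretations is exactly the
# marginalization over all literals performed in the sum-product semiring.
model_count = sum(int(alpha(a, b, c))
                  for a, b, c in product([False, True], repeat=3))
print(model_count)  # prints 4
```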