Merged
2 changes: 1 addition & 1 deletion Project.toml
@@ -34,4 +34,4 @@ julia = "1.6"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test"]
test = ["Test"]
Collaborator
this is the sort of whitespace change it's nice to not include in a code review...

1 change: 0 additions & 1 deletion docs/.gitignore

This file was deleted.

3 changes: 2 additions & 1 deletion docs/Project.toml
@@ -1,6 +1,7 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Gen = "ea4f424c-a589-11e8-07c0-fd5c91b9da4a"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"

[compat]
Documenter = "0.27"
Documenter = "1"
24 changes: 24 additions & 0 deletions docs/README.md
@@ -0,0 +1,24 @@
# Website Docs
- `pages.jl` defines the skeleton (page hierarchy) of the website.
- `make.jl` builds the website.

The docs are divided into roughly four sections:
- Getting Started + Tutorials
- How-to Guides
- API = Modeling API + Inference API
- Explanations + Internals


# Developing
To build the docs, run `julia --project make.jl` from the `docs/` directory, or alternatively start up the Julia REPL and include `make.jl`. For debugging, consider setting `draft=true` in the `makedocs` call in `make.jl`.
Currently, tutorials must be written directly in the docs rather than in a separate source format (e.g., Quarto). See `getting_started` or `tutorials` for examples.

Code snippets must use triple backticks with an `@example` label to run. The environment carries over between blocks as long as the labels match. Example:

```@example tutorial_1
x = rand()
```

```@example tutorial_1
print(x)
```
6 changes: 0 additions & 6 deletions docs/build_docs_locally.sh

This file was deleted.

43 changes: 13 additions & 30 deletions docs/make.jl
@@ -1,37 +1,20 @@
# Run: julia --project make.jl
using Documenter, Gen

include("pages.jl")
makedocs(
sitename = "Gen",
modules = [Gen],
pages = [
"Home" => "index.md",
"Getting Started" => "getting_started.md",
"Tutorials" => "tutorials.md",
"Modeling Languages and APIs" => [
"Generative Functions" => "ref/gfi.md",
"Probability Distributions" => "ref/distributions.md",
"Built-in Modeling Language" => "ref/modeling.md",
"Generative Function Combinators" => "ref/combinators.md",
"Choice Maps" => "ref/choice_maps.md",
"Selections" => "ref/selections.md",
"Optimizing Trainable Parameters" => "ref/parameter_optimization.md",
"Trace Translators" => "ref/trace_translators.md",
"Extending Gen" => "ref/extending.md"
],
"Standard Inference Library" => [
"Importance Sampling" => "ref/importance.md",
"MAP Optimization" => "ref/map.md",
"Markov chain Monte Carlo" => "ref/mcmc.md",
"MAP Optimization" => "ref/map.md",
"Particle Filtering" => "ref/pf.md",
"Variational Inference" => "ref/vi.md",
"Learning Generative Functions" => "ref/learning.md"
],
"Internals" => [
"Optimizing Trainable Parameters" => "ref/internals/parameter_optimization.md",
"Modeling Language Implementation" => "ref/internals/language_implementation.md"
]
]
doctest = false,
clean = true,
warnonly = true,
format = Documenter.HTML(;
assets = String["assets/header.js", "assets/header.css", "assets/theme.css"],
collapselevel=1,
),
sitename = "Gen.jl",
pages = pages,
checkdocs=:exports,
pagesonly=true,
)

deploydocs(
Expand Down
53 changes: 53 additions & 0 deletions docs/pages.jl
@@ -0,0 +1,53 @@
pages = [
"Home" => "index.md",
"Getting Started" => [
"Example 1: Linear Regression" => "getting_started/linear_regression.md",
],
"Tutorials" => [
"Basics" => [
"tutorials/basics/modeling_in_gen.md",
"tutorials/basics/gfi.md",
"tutorials/basics/combinators.md",
"tutorials/basics/particle_filter.md",
"tutorials/basics/vi.md",
],
"Advanced" => [
"tutorials/trace_translators.md",
],
"Model Optimizations" => [
"Speeding Inference with the Static Modeling Language" => "tutorials/model_optimizations/scaling_with_sml.md",
],
],
"How-to Guides" => [
"MCMC Kernels" => "how_to/mcmc_kernels.md",
"Custom Distributions" => "how_to/custom_distributions.md",
"Custom Modeling Languages" => "how_to/custom_dsl.md",
"Custom Gradients" => "how_to/custom_derivatives.md",
"Incremental Computation" => "how_to/custom_incremental_computation.md",
],
"API Reference" => [
"Modeling Library" => [
"Generative Functions" => "api/model/gfi.md",
"Probability Distributions" => "api/model/distributions.md",
"Choice Maps" => "api/model/choice_maps.md",
"Built-in Modeling Languages" => "api/model/modeling.md",
"Combinators" => "api/model/combinators.md",
"Selections" => "api/model/selections.md",
"Optimizing Trainable Parameters" => "api/model/parameter_optimization.md",
"Trace Translators" => "api/model/trace_translators.md",
],
"Inference Library" => [
"Importance Sampling" => "api/inference/importance.md",
"MAP Optimization" => "api/inference/map.md",
"Markov chain Monte Carlo" => "api/inference/mcmc.md",
"Particle Filtering" => "api/inference/pf.md",
"Variational Inference" => "api/inference/vi.md",
"Learning Generative Functions" => "api/inference/learning.md"
],
],
"Explanation and Internals" => [
"Modeling Language Implementation" => "explanations/language_implementation.md",
"explanations/combinator_design.md"
]
]
File renamed without changes.
@@ -209,7 +209,7 @@ Then, the traces of the model can be obtained by simulating from the variational
Instead of fitting the variational approximation from scratch for each observation, it is possible to fit an *inference model* that takes the observation as input and generates a distribution over latent variables as output (as in the wake-sleep algorithm).
When we train the variational approximation by minimizing the evidence lower bound (ELBO) this is called amortized variational inference.
Variational autoencoders are an example.
It is possible to perform amortized variational inference using [`black_box_vi`](@ref) or [`black_box_vimco!`](@ref).
It is possible to perform amortized variational inference using [`black_box_vi!`](@ref) or [`black_box_vimco!`](@ref).

## References

Expand Down
File renamed without changes.
19 changes: 19 additions & 0 deletions docs/src/api/inference/mcmc.md
@@ -0,0 +1,19 @@
# Markov chain Monte Carlo (MCMC)

Gen supports standard Markov chain Monte Carlo algorithms and allows users to write their own custom kernels.
```@index
Pages = ["mcmc.md"]
```

```@docs
metropolis_hastings
mh
mala
hmc
elliptical_slice
@pkern
@kern
@rkern
reversal
involutive_mcmc
```
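For orientation, a minimal usage sketch of the operators above (not part of this PR; the toy model and the address `:x` are hypothetical):

```julia
using Gen

# Toy model: latent :x with a noisy observation :y (hypothetical example).
@gen function model()
    x ~ normal(0, 1)
    y ~ normal(x, 0.1)
end

# Condition on an observed value of :y and run a short MH chain over :x.
observations = choicemap((:y, 1.0))
trace, _ = generate(model, (), observations)
for _ in 1:100
    trace, _ = mh(trace, select(:x))
end
```

Each `mh` call proposes from the prior over the selected addresses and accepts or rejects, returning the (possibly unchanged) trace and an acceptance flag.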
File renamed without changes.
7 changes: 7 additions & 0 deletions docs/src/api/inference/vi.md
@@ -0,0 +1,7 @@
# Variational Inference
There are two procedures in the inference library for performing black box variational inference.
Each of these procedures can also train the model using stochastic gradient descent, as in a variational autoencoder.
```@docs
black_box_vi!
black_box_vimco!
```
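As a rough sketch of how these fit together (not part of this PR; the toy model, the variational family, the optimizer settings, and the assumed call pattern `black_box_vi!(model, args, observations, var_model, var_model_args, update)` are all illustrative assumptions):

```julia
using Gen

# Toy model: latent :x with a noisy observation :y (hypothetical example).
@gen function model()
    x ~ normal(0, 1)
    y ~ normal(x, 0.1)
end

# Variational family over :x with trainable parameters mu and log_std.
@gen function var_model()
    @param mu::Float64
    @param log_std::Float64
    x ~ normal(mu, exp(log_std))
end

init_param!(var_model, :mu, 0.0)
init_param!(var_model, :log_std, 0.0)

observations = choicemap((:y, 1.0))
update = ParamUpdate(GradientDescent(0.001, 1000), var_model)
elbo_estimate, traces, elbo_history = black_box_vi!(
    model, (), observations, var_model, (), update;
    iters=200, samples_per_iter=20, verbose=false)
```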
20 changes: 18 additions & 2 deletions docs/src/ref/choice_maps.md → docs/src/api/model/choice_maps.md
@@ -30,7 +30,13 @@ Choice maps also implement:
- `==`, which tests if two choice maps have the same addresses and values at those addresses.


## Mutable Choice Maps

```@docs
DynamicChoiceMap
EmptyChoiceMap
StaticChoiceMap
choicemap
```

A mutable choice map can be constructed with [`choicemap`](@ref), and then populated:
```julia
@@ -45,8 +51,18 @@ There is also a constructor that takes initial (address, value) pairs:
choices = choicemap((:x, true), ("foo", 1.25), (:y => 1 => :z, -6.3))
```


```@docs
set_value!
set_submap!
Base.merge(::ChoiceMap, ::ChoiceMap)
Base.merge(::ChoiceMap, ::Vararg{ChoiceMap})
Base.isempty(::ChoiceMap)
```

```@docs
Gen.pair
Gen.unpair
Gen.ChoiceMapNestedView
```

@@ -112,7 +112,9 @@ FunctionalCollections.PersistentVector{Any}[true, false, true, false, true]

## Recurse combinator

TODO: document me
```@docs
Recurse
```

```@raw html
<div style="text-align:center">
@@ -161,3 +163,4 @@ The resulting trace contains the subtrace from the branch with index `2` - in th
└── :z : 13.552870875213735
```

@@ -1,4 +1,11 @@
# Probability Distributions
# [Probability Distributions](@id distributions)

```@docs
random
logpdf
has_output_grad
logpdf_grad
```

Gen provides a library of built-in probability distributions, and four ways of
defining custom distributions, each of which is explained below:
@@ -39,6 +46,7 @@ piecewise_uniform
poisson
uniform
uniform_discrete
broadcasted_normal
```

## [Defining New Distributions Inline with the `@dist` DSL](@id dist_dsl)
Expand Down
55 changes: 55 additions & 0 deletions docs/src/api/model/gfi.md
@@ -0,0 +1,55 @@
# [Generative Functions](@id gfi_api)

```@docs
GenerativeFunction
Trace
```

The complete set of methods in the generative function interface (GFI) is:

```@docs
simulate
generate
update
regenerate
get_args
get_retval
get_choices
get_score
get_gen_fn
Base.getindex
project
propose
assess
has_argument_grads
has_submap
accepts_output_grad
accumulate_param_gradients!
choice_gradients
get_params
```

```@docs
Diff
NoChange
UnknownChange
SetDiff
Diffed
```

```@docs
CustomUpdateGF
apply_with_state
update_with_state
```

```@docs
CustomGradientGF
apply
gradient
```

```@docs
Gen.init_update_state
Gen.apply_update!
```
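To ground the list above, a small sketch exercising a few GFI methods (a hypothetical model; the `update` call pattern assumed here is `update(trace, args, argdiffs, constraints)`):

```julia
using Gen

@gen function model(n::Int)
    x ~ normal(0, 1)
    return x * n
end

trace = simulate(model, (2,))        # sample a complete trace
@assert get_args(trace) == (2,)
choices = get_choices(trace)         # the trace's random choices
score = get_score(trace)             # log density of those choices

# Constrain :x to a new value; the weight accounts for the density change,
# and the discard holds the choices that were replaced.
(trace2, weight, retdiff, discard) = update(
    trace, (2,), (NoChange(),), choicemap((:x, 0.5)))
```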