Commit

doc and readme fixed
rcabanasdepaz committed Aug 29, 2019
1 parent 74d4632 commit 44e77ec
Showing 3 changed files with 12 additions and 26 deletions.
26 changes: 6 additions & 20 deletions README.rst
@@ -101,8 +101,6 @@ We start by importing the required packages and defining the constant parameters
# number of observations (dataset size)
N = 1000
A model can be defined by decorating any function with ``@inf.probmodel``. The model is fully specified by
the variables defined inside this function:

@@ -113,15 +111,12 @@ the variables defined inside this function:
def nlpca(k, d0, dx, decoder):
    with inf.datamodel():
-       z = inf.Normal(tf.ones([k])*0.5, 1., name="z")  # shape = [N,k]
+       z = inf.Normal(tf.ones([k])*0.5, 1, name="z")  # shape = [N,k]
        output = decoder(z,d0,dx)
        x_loc = output[:,:dx]
        x_scale = tf.nn.softmax(output[:,dx:])
        x = inf.Normal(x_loc, x_scale, name="x")  # shape = [N,d]
The construct ``with inf.datamodel()``, which resembles **plate notation**, replicates
the enclosed variables N times, where N is the size of our data.
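
As a quick illustration of this replication, a minimal sketch (toy model with made-up shapes; not part of this commit):

.. code-block:: python

    import inferpy as inf
    import tensorflow as tf

    @inf.probmodel
    def toy():
        with inf.datamodel():
            # declared once, replicated across the N data instances
            z = inf.Normal(tf.ones([2])*0.5, 1., name="z")

    # sampling the prior yields one value of z per replicated instance
    print(toy().prior().sample())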

@@ -138,8 +133,6 @@ This might be defined outside the model as follows.
    h0 = tf.layers.dense(z, d0, tf.nn.relu)
    return tf.layers.dense(h0, 2 * dx)
Now, we can instantiate our model and obtain samples (from the prior distributions).


@@ -151,9 +144,7 @@ Now, we can instantiate our model and obtain samples (from the prior distributions).
m = nlpca(k,d0,dx, decoder)
# Sample from priors
- samples = m.sample()
+ samples = m.prior().sample()
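
Presumably ``samples`` is a dict keyed by variable name, so the prior draws can be inspected directly (names taken from the model above):

.. code-block:: python

    # prior draws for the latent codes and the observations
    print(samples["z"])
    print(samples["x"])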
In variational inference, we must define a Q-model as follows.


@@ -169,9 +160,6 @@ In variational inference, we must define a Q-model as follows.
qz = inf.Normal(qz_loc, qz_scale, name="z")
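
The remaining lines of the Q-model are collapsed in this diff; a plausible complete definition, in the spirit of the getting-started guide (the ``inf.Parameter`` variational parameters are our reconstruction, not part of the commit):

.. code-block:: python

    @inf.probmodel
    def qmodel(k):
        with inf.datamodel():
            # trainable variational parameters for the latent variable z
            qz_loc = inf.Parameter(tf.ones([k])*0.5, name="qz_loc")
            qz_scale = tf.math.softplus(inf.Parameter(tf.ones([k]), name="qz_scale"))
            qz = inf.Normal(qz_loc, qz_scale, name="z")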
Afterwards, we define the parameters of our inference algorithm and fit the model to the data.


@@ -185,9 +173,7 @@ Afterwards, we define the parameters of our inference algorithm and fit the model to the data
# learn the parameters
m.fit({"x": x_train}, VI)
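
The construction of ``VI`` is collapsed above; a sketch of what it plausibly looks like (the ``epochs`` value is an assumption):

.. code-block:: python

    # variational inference driven by the Q-model defined earlier
    VI = inf.inference.VI(qmodel(k), epochs=5000)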
- The inference method can be further configure. But, as in Keras, a core
+ The inference method can be further configured. But, as in Keras, a core
principle is to try to make things reasonably simple, while allowing the
user full control if needed.

@@ -200,6 +186,6 @@ of our data.

.. code-block:: python

-    #extract the hidden representation
-    hidden_encoding = m.posterior["z"]
-    print(hidden_encoding.sample())
+    #extract the hidden representation
+    hidden_encoding = m.posterior("z", data={"x":x_train})
+    print(hidden_encoding.sample())
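
With a 2-dimensional latent space (``k = 2`` is our assumption; the constant is defined outside this excerpt), the encoding can be plotted; a hypothetical follow-up:

.. code-block:: python

    import matplotlib.pyplot as plt

    # scatter plot of the 2-D posterior codes of the training data
    post_z = hidden_encoding.sample()
    plt.scatter(post_z[:, 0], post_z[:, 1])
    plt.show()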
8 changes: 4 additions & 4 deletions docs/notes/getting30s.rst
@@ -37,7 +37,7 @@ We start by importing the required packages and defining the constant parameters

.. literalinclude:: ../../examples/docs/getting30s/1.py
   :language: python3
-  :lines: 1-13
+  :lines: 1-11

A model can be defined by decorating any function with ``@inf.probmodel``. The model is fully specified by
the variables defined inside this function:
@@ -57,14 +57,14 @@ This might be defined outside the model as follows.

.. literalinclude:: ../../examples/docs/getting30s/1.py
   :language: python3
-  :lines: 27-30
+  :lines: 27-29


Now, we can instantiate our model and obtain samples (from the prior distributions).

.. literalinclude:: ../../examples/docs/getting30s/1.py
   :language: python3
-  :lines: 48-53
+  :lines: 48-52



@@ -74,7 +74,7 @@ In variational inference, we need to define a Q-model as follows.

.. literalinclude:: ../../examples/docs/getting30s/1.py
   :language: python3
-  :lines: 36-44
+  :lines: 36-42



4 changes: 2 additions & 2 deletions examples/docs/getting30s/1.py
@@ -49,7 +49,7 @@ def qmodel(k):
m = nlpca(k,d0,dx, decoder)

# Sample from priors
- samples = m.sample()
+ samples = m.prior().sample()


#### NOT showing 55
@@ -69,7 +69,7 @@ def qmodel(k):
#### 69

#extract the hidden representation
- hidden_encoding = m.posterior["z"]
+ hidden_encoding = m.posterior("z", data={"x":x_train})
print(hidden_encoding.sample())


