## Thursday, February 20, 2014

### What is the meaning of denotational semantics ....

An absolutely not-to-be-missed lecture by David Sankel: The Intellectual Ascent to Agda.

He talks about denotational semantics and, interestingly, describes it as: augmenting math to talk about meaning, and then extending math to program in math.
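The idea can be sketched in a few lines of Haskell: give each syntactic expression a mathematical meaning via a compositional semantic function. This is a minimal illustration of the denotational style, not anything from the lecture; the names `Expr` and `meaning` are my own.

```haskell
-- A tiny expression language and its denotation into Integers.
data Expr = Lit Integer
          | Add Expr Expr
          | Mul Expr Expr
          deriving Show

-- The meaning of an expression is built compositionally from the
-- meanings of its parts.
meaning :: Expr -> Integer
meaning (Lit n)   = n
meaning (Add a b) = meaning a + meaning b
meaning (Mul a b) = meaning a * meaning b

main :: IO ()
main = print (meaning (Add (Lit 2) (Mul (Lit 3) (Lit 4))))  -- prints 14
```

Because `meaning` maps syntax into ordinary math, we can reason about programs with ordinary mathematical tools, which is exactly the "extend math to program in math" move.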

## Saturday, December 7, 2013

## Wednesday, November 13, 2013

## Friday, October 25, 2013

### Dedication!

In "The Lady Tasting Tea", the author (David Salsburg) talks about the calculations in a statistics paper that R. A. Fisher published in the 1920s. It is mind-blowing to think that he used this machine to produce the tables and charts in the paper:

The author estimates it must have taken him 185 hours to prepare the tables.

## Wednesday, October 23, 2013

### Interesting use of probabilistic model

In these lectures the presenter describes a system of differential equations solved for a set of initial conditions. The results are used to build a probabilistic model: the distribution of values supplies the probability model for state transitions in a dynamic network. The dynamic network is then used to make inferences about the system directly, without needing to solve the differential equations.

An interesting side note: although the differential equations describing the system (in this case a biological system) are deterministic, our observations of the system's state are noisy and infrequent (once every 30 minutes), further justifying the probabilistic approach over a deterministic one.

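The pipeline can be sketched roughly like this: solve a deterministic ODE numerically, discretize the trajectory into coarse states, and count state transitions to estimate a transition distribution. This is only an illustrative toy (a single equation dx/dt = -kx with Euler steps, and the names `euler`, `state`, `transitions` are mine), not the system from the lectures.

```haskell
import qualified Data.Map as M

-- Euler integration of dx/dt = -k*x, returning n+1 trajectory points.
euler :: Double -> Double -> Double -> Int -> [Double]
euler k dt x0 n = take (n + 1) (iterate step x0)
  where step x = x + dt * (-k * x)

-- Discretize a continuous value into a coarse integer state.
state :: Double -> Int
state x = floor (x * 10)

-- Count how often each (state, nextState) pair occurs along the
-- trajectory; normalizing these counts would give transition
-- probabilities for a dynamic model.
transitions :: [Double] -> M.Map (Int, Int) Int
transitions xs =
  M.fromListWith (+) [ ((state a, state b), 1) | (a, b) <- zip xs (tail xs) ]

main :: IO ()
main = print (transitions (euler 0.5 0.1 1.0 20))
```

Once the transition counts are in hand, inference proceeds on the discrete model alone, which is the point made above: the differential equations never need to be solved again at query time.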
## Monday, September 30, 2013

## Tuesday, September 17, 2013

### Monads vs Objects

The examples in the documentation of MonadRandom do a good job of showing how monads are useful. I have cut and pasted them here:

There is some correspondence between notions in programming and in mathematics:

- random generator ~ random variable / probabilistic experiment
- result of a random generator ~ outcome of a probabilistic experiment

Thus the signature

```haskell
rx :: (MonadRandom m, Random a) => m a
```

can be considered as "rx is a random variable". In the do-notation the line

```haskell
x <- rx
```

means that "x is an outcome of rx".

In a language without higher-order functions and using a random generator "function", it is not possible to work with random variables; it is only possible to compute with outcomes, e.g. `rand()+rand()`.

In a language where random generators are implemented as objects, computing with random variables is possible but still cumbersome [as you need to wrap the generators (Composition Pattern) in objects and perform the computations as methods of the new object].

In Haskell we have both options: either computing with outcomes

```haskell
do x <- rx
   y <- ry
   return (x+y)
```

or computing with random variables

```haskell
liftM2 (+) rx ry
```

This means that liftM-like functions convert ordinary arithmetic into random-variable arithmetic [since you can't lift in OO, you would need to write your own version of each operation you would like to support]. But there is also some arithmetic on random variables which cannot be performed on outcomes [or would be ad hoc in OO]. For example, given a function that repeats an action until the result fulfills a certain property (I wonder if there is already something of this kind in the standard libraries)
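To make the quoted fragments concrete without depending on the MonadRandom package, here is a self-contained toy monad that threads a pure LCG seed, so a value of type `Rand a` acts as a "random variable". All the names (`Rand`, `rollDie`, `twoDice`) and the LCG constants are mine, for illustration only.

```haskell
-- A minimal state-threading monad over an Int seed.
newtype Rand a = Rand { runRand :: Int -> (a, Int) }

instance Functor Rand where
  fmap f (Rand g) = Rand $ \s -> let (a, s') = g s in (f a, s')

instance Applicative Rand where
  pure a = Rand $ \s -> (a, s)
  Rand f <*> Rand g = Rand $ \s ->
    let (h, s1) = f s
        (a, s2) = g s1
    in (h a, s2)

instance Monad Rand where
  Rand g >>= k = Rand $ \s ->
    let (a, s') = g s in runRand (k a) s'

-- A "random variable": a die roll in 1..6, driven by one LCG step.
rollDie :: Rand Int
rollDie = Rand $ \s ->
  let s' = (1103515245 * s + 12345) `mod` 2147483648
  in (s' `mod` 6 + 1, s')

-- Computing with random variables rather than outcomes:
twoDice :: Rand Int
twoDice = (+) <$> rollDie <*> rollDie

main :: IO ()
main = print (fst (runRand twoDice 1))
```

The point of the quote survives in miniature: `twoDice` is arithmetic on random variables themselves, with no outcome in sight until `runRand` supplies a seed.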

```haskell
untilM :: Monad m => (a -> Bool) -> m a -> m a
untilM p m = do x <- m
                if p x then return x else untilM p m
```

we can suppress certain outcomes of an experiment. E.g. if `getRandomR (-10,10)` is a uniformly distributed random variable between −10 and 10, then `untilM (0/=) (getRandomR (-10,10))` is a random variable with a uniform distribution on {−10, …, −1, 1, …, 10}.
