Nice detailed explanation of back propagation in neural networks:

- A nice post using straight python
- A couple of nice posts using numpy: a basic neural network and an RNN.
- MIT class notes
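
In the spirit of the "straight python / numpy" posts above, here is a minimal sketch of backpropagation (my own toy example, not taken from those posts): a two-layer network trained on XOR, where the backward pass is just the chain rule applied layer by layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data: 4 examples, 2 inputs, 1 output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights: input -> hidden (2x4) and hidden -> output (4x1)
W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1)       # hidden activations
    out = sigmoid(h @ W2)     # network output
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule, layer by layer
    d_out = (out - y) * out * (1 - out)   # gradient at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at hidden pre-activation

    # Gradient descent update
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(losses[0], "->", losses[-1])  # loss typically drops as training proceeds
```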

I think this video captures the essence of calculus.

"We have a geometric concept, and we want an algebraic expression for it".

It could be generalized: we have an XYZ concept, and want an algebraic expression for it.

The corresponding XYZ calculus (tensor calculus in the case of geometry, differential calculus in the case of algebra, relational calculus in the case of relational algebra, lambda calculus in the case of the algebra of functions, ...) is the tool for writing down that algebraic expression. Once one has an algebraic equation, one can reason by applying the algebraic laws of the domain to simplify the expression.
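
As a small illustration of "write down the correct identity" in the differential-calculus case (a hypothetical example of my own): the product rule gives an algebraic expression for the derivative of x² · sin x, which we can sanity-check against a numerical finite difference.

```python
import math

# The "identity" here is the product rule:
#   d/dx [x^2 * sin x] = 2x sin x + x^2 cos x

def f(x):
    return x * x * math.sin(x)

def df_algebraic(x):
    # Derivative obtained by applying the product rule algebraically
    return 2 * x * math.sin(x) + x * x * math.cos(x)

def df_numeric(x, h=1e-6):
    # Centered finite difference approximates the same derivative
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (0.5, 1.3, 2.7):
    assert abs(df_algebraic(x) - df_numeric(x)) < 1e-5
```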

The method always seems to be "write down the correct identity"... I'm struggling to see how this applies to the lambda calculus.
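
One way to read it (a toy sketch of my own, not a claim about the video): in the lambda calculus the "correct identity" is beta-reduction, (λx. body) arg = body[x := arg], and computation is just repeatedly rewriting with that identity. A minimal Python encoding, assuming variable names are unique so capture-avoiding substitution can be skipped:

```python
# Terms: ("var", name) | ("lam", name, body) | ("app", fn, arg)

def subst(term, name, value):
    # Naive substitution; assumes all bound variable names are distinct
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        return ("lam", term[1], subst(term[2], name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def beta_step(term):
    # The beta identity: (λx. body) arg  →  body[x := arg]
    if term[0] == "app" and term[1][0] == "lam":
        _, lam, arg = term
        _, x, body = lam
        return subst(body, x, arg)
    return None  # no top-level redex

def normalize(term):
    # Rewrite with the beta identity until no redex remains
    while (nxt := beta_step(term)) is not None:
        term = nxt
    return term

identity = ("lam", "x", ("var", "x"))
# (λf. f y)(λx. x)  →  (λx. x) y  →  y
expr = ("app", ("lam", "f", ("app", ("var", "f"), ("var", "y"))), identity)
print(normalize(expr))
```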

