[Insight-users] Gradient descent Question PLEASE I NEED HELP

Yixun Liu yxliuwm at gmail.com
Mon Apr 29 20:45:39 EDT 2013


Hi Alaa,
T(x, y) = (x + t_x, y + t_y)^T
theta = (t_x, t_y)^T
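
To see why the two parameters should receive different updates: by the chain rule, d/dt_x M(T(x,y)) = (dM/dx)*(dT_x/dt_x) = dM/dx, while d/dt_y M(T(x,y)) = dM/dy, so t_x is driven by the x-component of the moving-image gradient and t_y by the y-component. Below is a minimal, self-contained C++ sketch (plain arrays rather than ITK classes; the toy blob images, step size, and iteration count are illustrative assumptions, not the derivation from the attachment) of gradient descent on the sum-of-squared-differences metric for this translation transform.

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Image {
    int w, h;
    std::vector<double> px;                      // row-major pixel values

    // Bilinear interpolation, clamped at the image borders.
    double at(double x, double y) const {
        x = std::min(std::max(x, 0.0), (double)(w - 1));
        y = std::min(std::max(y, 0.0), (double)(h - 1));
        int x0 = (int)x, y0 = (int)y;
        int x1 = std::min(x0 + 1, w - 1), y1 = std::min(y0 + 1, h - 1);
        double fx = x - x0, fy = y - y0;
        double top = (1 - fx) * px[y0 * w + x0] + fx * px[y0 * w + x1];
        double bot = (1 - fx) * px[y1 * w + x0] + fx * px[y1 * w + x1];
        return (1 - fy) * top + fy * bot;
    }
};

int main() {
    // Toy data: a Gaussian blob; the moving image is the fixed image shifted by (+3, -2).
    const int W = 64, H = 64;
    Image fixedImg{W, H, std::vector<double>(W * H)};
    Image movingImg{W, H, std::vector<double>(W * H)};
    auto blob = [](double x, double y, double cx, double cy) {
        return std::exp(-((x - cx) * (x - cx) + (y - cy) * (y - cy)) / 50.0);
    };
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            fixedImg.px[y * W + x]  = blob(x, y, 32.0, 32.0);
            movingImg.px[y * W + x] = blob(x, y, 35.0, 30.0);
        }

    double tx = 0.0, ty = 0.0;   // theta = (t_x, t_y)
    const double step = 0.01;    // learning rate, hand-tuned for this toy example
    const double eps  = 0.5;     // finite-difference step for the image gradient

    for (int iter = 0; iter < 300; ++iter) {
        double gx = 0.0, gy = 0.0;   // dSSD/dt_x and dSSD/dt_y
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {
                double mx = x + tx, my = y + ty;   // T(x, y) = (x + t_x, y + t_y)
                double diff = movingImg.at(mx, my) - fixedImg.px[y * W + x];
                // Chain rule: d/dt_x M(T) = dM/dx * 1 and d/dt_y M(T) = dM/dy * 1,
                // so the two parameters pick up DIFFERENT gradient components.
                double dMdx = (movingImg.at(mx + eps, my) - movingImg.at(mx - eps, my)) / (2 * eps);
                double dMdy = (movingImg.at(mx, my + eps) - movingImg.at(mx, my - eps)) / (2 * eps);
                gx += 2.0 * diff * dMdx;
                gy += 2.0 * diff * dMdy;
            }
        tx -= step * gx;   // plain gradient-descent update on each parameter
        ty -= step * gy;
    }
    std::printf("recovered translation: t_x = %.2f, t_y = %.2f (expected +3, -2)\n", tx, ty);
    return 0;
}

The finite-difference image gradient is only for brevity here; in a real implementation you would typically precompute or analytically evaluate the moving-image gradient. If your derivation adds the same value to both displacements, check whether it uses the gradient magnitude (or the sum of both components) for each parameter instead of the separate dM/dx and dM/dy terms.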

Yixun

----- Original Message ----- 
From: "alaamegawer" <alaamegawer at yahoo.com>
To: <insight-users at itk.org>
Sent: Monday, April 29, 2013 5:58 PM
Subject: [Insight-users] Gradient descent Question PLEASE I NEED HELP


> Please, all,
>
> I want to implement my own gradient descent rather than use the one
> already implemented in ITK, just to start learning.
> Please take a look at the attachment, which contains my derivation.
> The problem is that this derivation leads to adding the same value to the
> displacement in both x and y, so please help me find where the error is.
>
>
> Thanks In advance
>
> Alaa
> <http://itk-insight-users.2283740.n2.nabble.com/file/n7582905/GradientOptimazer.png>
>
>
>
>
>


