Switch to Conjugate Gradient

Since I posted about the autoencoder neural network on my blog two years ago, many people have visited my GitHub for that code. Hooray!! Thank you very much, everyone. And I have a new update for that code: I switched to Conjugate Gradient instead of plain gradient-descent back-propagation. Some may wonder whether it really took two years to make this update. No, I was just too lazy.

While writing the autoencoder article, I realized that we needed something better than plain back-propagation. So I started exploring what would be an easy way to change it, and which algorithm to switch to. I found that the scipy library provides optimization algorithms in its scipy.optimize module, so you can swap in whatever optimization algorithm scipy.optimize ships with. This link refers to the GitHub page of the old autoencoder, but the Conjugate Gradient version is on the conjugate branch.
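As a rough sketch of the approach, here is how an optimizer from scipy.optimize can be swapped in via the `method` string. The quadratic cost below is a toy stand-in for the autoencoder cost (not the actual cost from the repository); `scipy.optimize.minimize` only needs the cost function, its gradient, and an initial parameter vector.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the autoencoder cost: f(w) = ||w - target||^2.
target = np.array([1.0, -2.0, 3.0])

def cost(w):
    diff = w - target
    return np.dot(diff, diff)

def grad(w):
    # Analytic gradient of the cost above.
    return 2.0 * (w - target)

w0 = np.zeros(3)  # initial parameters

# method='CG' selects nonlinear Conjugate Gradient; changing the string
# (e.g. to 'BFGS' or 'L-BFGS-B') swaps in a different optimizer without
# touching the cost or gradient code.
result = minimize(cost, w0, jac=grad, method='CG')
print(result.x)
```

In a real network, `w` would be the flattened weight vector and `cost`/`grad` would run forward and back-propagation passes.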

Why did I switch to Conjugate Gradient? While studying the advanced optimization part of the UFLDL lesson, I came across a statement that Conjugate Gradient is better than Gradient Descent (classic back-propagation). So I started studying Conjugate Gradient and other advanced optimization algorithms, but I didn't understand them well enough to implement them myself. Finally, I found a workaround: improve my implementation by using the scipy library. If anyone has any suggestions, please comment. Thanks.
