I read the documentation on restoring a session, but one thing is still not clear to me. Do save/restore only cover the weights and variables defined via tf.Variable, or do they also capture other state, such as the learning rate? I noticed that the learning rate keeps decaying from one training session into the following restored session, while my expectation was that it would restart.

I looked into this because I noticed very different results between a single session of, say, 20000 steps with a learning rate that decayed every 2500 steps, and two sessions where the second restored the first, using the same network and parameters, obviously. Since the two-session run is consistently better (in terms of loss/accuracy), I assumed the learning rate was reset in the second session; instead, when I printed it, I saw that it keeps decaying from the last value of the first session. As for the data, the first session largely covers all of it, and the second session uses the same data. Are there other parameters that could affect these results? Thanks
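For context on the behaviour described above: tf.train.Saver checkpoints every tf.Variable in the graph (or the subset passed to it), and that typically includes the global_step variable fed to tf.train.exponential_decay. Restoring the checkpoint therefore restores global_step, so the decayed learning rate continues rather than restarting. A minimal pure-Python sketch of the staircase schedule (the initial learning rate 0.1 and decay rate 0.5 are assumed values, not from the question) illustrates the difference:

```python
def exponential_decay(initial_lr, global_step, decay_steps, decay_rate):
    # Mirrors tf.train.exponential_decay with staircase=True:
    # lr = initial_lr * decay_rate ** floor(global_step / decay_steps)
    return initial_lr * decay_rate ** (global_step // decay_steps)

# One long run of 20000 steps, decaying every 2500 steps (as in the question):
lr_single = exponential_decay(0.1, 20000, 2500, 0.5)

# Two runs of 10000 steps each. Because the checkpoint restores global_step,
# the second session resumes at step 10000 and the schedule simply continues:
lr_resumed = exponential_decay(0.1, 10000 + 10000, 2500, 0.5)
assert lr_resumed == lr_single  # identical to one uninterrupted run

# If global_step were NOT restored, the second session would restart the
# schedule at step 0 and the learning rate would jump back up:
lr_restarted = exponential_decay(0.1, 0, 2500, 0.5)
print(lr_single, lr_restarted)
```

So if the two-session run behaves differently from the single run, the learning-rate schedule is not the cause; things like input-pipeline shuffling order or optimizer slot variables not being saved are more likely candidates.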