About as open source as a binary blob without the training data
Fushuan [he/him] @ fushuan @lemm.ee Posts 2Comments 1,238Joined 2 yr. ago
Deleted
Permanently Deleted
Hey, I have trained several models in PyTorch, Darknet, and TensorFlow.
With the same dataset and the same training parameters, the same final iteration of training actually does return the same weights. There's no randomness unless you specifically add random layers, and that's not really a good idea with RNNs (at least it wasn't when I was working with them). In any case, the weights should converge to a very similar point even if randomness is introduced, or else the RNN is pretty much worthless.
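To illustrate the point about determinism: here is a minimal PyTorch sketch (the model, data, and hyperparameters are made up for the example) showing that two training runs started from the same seed, with the same data and the same settings, end with identical weights. This is a toy CPU example; on GPUs you may additionally need PyTorch's deterministic-algorithm settings to get bit-identical results.

```python
import torch
import torch.nn as nn

def train_once(seed: int) -> torch.Tensor:
    """Train a tiny linear model from a fixed seed and return its final weights."""
    torch.manual_seed(seed)  # fixes weight initialization and data generation
    model = nn.Linear(4, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    # synthetic dataset, also derived from the same seed
    x = torch.randn(64, 4)
    y = x.sum(dim=1, keepdim=True)
    for _ in range(100):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    return model.weight.detach().clone()

# Two runs with the same seed, data, and hyperparameters
# reach the same final weights.
w1 = train_once(0)
w2 = train_once(0)
print(torch.equal(w1, w2))
```

The same seeding idea applies in TensorFlow (`tf.random.set_seed`) and other frameworks: once every source of randomness (initialization, shuffling, dropout) is pinned, training is a deterministic function of the data and hyperparameters.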