I'm following this tutorial (section 6: Tying it All Together), with my own dataset. I can get the example in the tutorial working, no problem, with the sample dataset provided. With my own data, though, I'm getting a binary cross-entropy error that is negative, and no improvements as epochs progress. I'm pretty sure binary cross-entropy should always be positive, and I should see some improvement in the loss. Others seem to run into similar problems sometimes when training CNNs, but I didn't see a clear solution in my case. I've truncated the sample output (and code call) below to 5 epochs.

```python
testset = np.loadtxt('test_rows.csv', delimiter=",")
# split into input (X) and output (Y) variables
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, Y, validation_data=(X_test, Y_test), epochs=5, batch_size=128)
```

Answer: If you're getting negative losses, the solution is to coerce the target to be 0 or 1. In the OP's case, the fix was subtracting 1 from the target/label (e.g. `Y = Y - 1`). Depending on the use case, there are other ways to "coerce" the target to be either 1 or 0, for example when constructing `Y` with `np.array`.
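To see why labels outside {0, 1} drive the loss negative, here is a small NumPy sketch of the binary cross-entropy formula. The 1/2 label encoding and the predicted probabilities are assumptions for illustration (the OP's exact data isn't shown); the `binary_crossentropy` helper is a plain reimplementation of the standard formula, not a Keras API.

```python
import numpy as np

def binary_crossentropy(y_true, y_pred):
    # Mean of -(y*log(p) + (1-y)*log(1-p)), with clipping for stability.
    eps = 1e-7
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return float(np.mean(-(y_true * np.log(y_pred)
                           + (1 - y_true) * np.log(1 - y_pred))))

# Hypothetical labels encoded as 1/2 instead of 0/1:
y_bad = np.array([1.0, 2.0, 1.0, 2.0])
p = np.array([0.9, 0.8, 0.7, 0.9])

# For y=2 the per-sample term -(2*log(p) - log(1-p)) can go below zero,
# so the mean loss can be negative.
loss_bad = binary_crossentropy(y_bad, p)

# Coerce the target to 0/1 by subtracting 1, as in the accepted fix:
y_good = y_bad - 1.0
assert set(np.unique(y_good)) <= {0.0, 1.0}  # sanity check before training
loss_good = binary_crossentropy(y_good, p)

print(loss_bad, loss_good)  # loss_bad is negative, loss_good is positive
```

A quick `np.unique(Y)` check like the one above, run before calling `model.fit`, catches this class of bug early.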