---
description: List of cost functions in Neataptic
authors: Thomas Wagenaar
keywords: cost function, loss function, mse, cross entropy, optimize
---
Cost functions play an important role in neural networks: they give the network an indication of 'how wrong' it is, i.e. how far it is from the desired output. Cost functions also play an important role in fitness functions.
## Methods
At the moment, there are 7 built-in cost methods (all for networks):
The formulas below are the standard definitions of these loss functions, where tᵢ is a target value, oᵢ the corresponding actual output, and n the number of output values:

Name | Function |
---|---|
methods.cost.CROSS_ENTROPY | -(1/n) Σᵢ ( tᵢ · ln(oᵢ) + (1 − tᵢ) · ln(1 − oᵢ) ) |
methods.cost.MSE | (1/n) Σᵢ (tᵢ − oᵢ)² |
methods.cost.BINARY | the number of outputs that round to the wrong class |
methods.cost.MAE | (1/n) Σᵢ \|tᵢ − oᵢ\| |
methods.cost.MAPE | (1/n) Σᵢ \|(tᵢ − oᵢ) / tᵢ\| |
methods.cost.MSLE | (1/n) Σᵢ ( ln(tᵢ + 1) − ln(oᵢ + 1) )² |
methods.cost.HINGE | (1/n) Σᵢ max(0, 1 − tᵢ · oᵢ) |
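To make the table concrete, here is a minimal standalone sketch of the shape a cost method takes: a function that receives an array of target values and an array of actual output values, and returns a single error number. The function name `mse` here is illustrative, not part of Neataptic's API.

```javascript
// Illustrative sketch (not Neataptic source): a cost method maps
// (targets, outputs) to one error number.
function mse(targets, outputs) {
  let error = 0;
  for (let i = 0; i < outputs.length; i++) {
    // Squared difference between desired and actual output
    error += Math.pow(targets[i] - outputs[i], 2);
  }
  // Average over all output values
  return error / outputs.length;
}

console.log(mse([0, 1], [0.1, 0.9])); // ≈ 0.01: both outputs are off by 0.1
```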
## Usage
Before experimenting with any of the loss functions, note that not every loss function will 'work' for your network. Some networks have nodes with activation functions that can output negative values; this produces strange error values with some cost methods. So if you don't know what you're doing, stick to any of the first three cost methods!
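To see why negative outputs cause trouble, consider cross-entropy: it takes the natural logarithm of the output, and ln(x) is undefined for x ≤ 0, so a node that can output negative values (e.g. a TANH-activated node) yields NaN. The sketch below uses the standard binary cross-entropy formula for illustration; it is not Neataptic's source.

```javascript
// Standard binary cross-entropy; assumes outputs lie strictly in (0, 1).
function crossEntropy(targets, outputs) {
  let error = 0;
  for (let i = 0; i < outputs.length; i++) {
    error -= targets[i] * Math.log(outputs[i]) +
             (1 - targets[i]) * Math.log(1 - outputs[i]);
  }
  return error / outputs.length;
}

console.log(crossEntropy([1], [0.9]));  // ≈ 0.105: a sensible error value
console.log(crossEntropy([1], [-0.5])); // NaN: ln(-0.5) is undefined
```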
```javascript
myNetwork.train(trainingData, {
  log: 1,
  iterations: 500,
  error: 0.03,
  rate: 0.05,
  cost: methods.cost.METHOD
});
```
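The choice of cost method also changes which mistakes training emphasizes. As a standalone illustration using the standard MAE and MSE formulas (not Neataptic's source): squaring makes one large miss cost more than several small ones, even when the total absolute error is identical.

```javascript
// Mean absolute error: average of |target - output|.
function mae(targets, outputs) {
  let error = 0;
  for (let i = 0; i < outputs.length; i++) {
    error += Math.abs(targets[i] - outputs[i]);
  }
  return error / outputs.length;
}

// Mean squared error: average of (target - output)^2.
function mse(targets, outputs) {
  let error = 0;
  for (let i = 0; i < outputs.length; i++) {
    error += Math.pow(targets[i] - outputs[i], 2);
  }
  return error / outputs.length;
}

const targets = [1, 1, 1, 1];
const even = [0.9, 0.9, 0.9, 0.9]; // four small misses of 0.1
const spiky = [1, 1, 1, 0.6];      // one big miss of 0.4

console.log(mae(targets, even), mae(targets, spiky)); // ≈ 0.1 and ≈ 0.1: same MAE
console.log(mse(targets, even), mse(targets, spiky)); // ≈ 0.01 vs ≈ 0.04: MSE punishes the outlier
```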