Keras layers take input data, perform computations on it, and produce output. Building a complete layer involves the following requirements:
- Input Shape: The input shape describes the dimensions of the data fed to the model. Raw input of any type, such as an image, text, or video, is first converted into an array of numbers, and the dimensions of that array are specified with a shape tuple. The following example shows how a shape tuple of (22, 9) defines a two-dimensional array of numbers; a sketch of how the same tuple is passed to a Keras layer follows the output:
>>> import numpy as np
>>> input_shape = (22, 9)         # shape tuple: 22 rows, 9 columns
>>> data = np.zeros(input_shape)  # two-dimensional array with that shape
>>> print(data)
[[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 0.]]
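In Keras itself, such a shape tuple is passed to the first layer of a model through the input_shape argument, which describes a single sample; the batch dimension is added automatically. The following is a minimal sketch (the layer size of 512 simply matches the later examples):
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(512, input_shape = (22, 9)))
# The leading None is the batch dimension, which is not fixed in advance.
print(model.input_shape)   # (None, 22, 9)
print(model.output_shape)  # (None, 22, 512)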
- Initializers: Initializers set the initial weights of a layer before training begins. Keras provides several initializer functions, some of which are listed below; a usage sketch follows the list:
- RandomNormal: Generates initial weight values drawn from a normal (Gaussian) distribution.
- RandomUniform: Generates initial weight values drawn from a uniform distribution.
- Zeros: Initializes all weights to zero.
- Ones: Initializes all weights to one.
- Constant: Initializes all weights to a constant value specified by the user.
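These initializers are passed to a layer through its kernel_initializer and bias_initializer arguments. The following is a minimal sketch, reusing the Dense layer size and input shape from the later examples; the mean and standard deviation shown are illustrative values only:
from keras.models import Sequential
from keras.layers import Dense
from keras import initializers

model = Sequential()
# Weights are drawn from a normal distribution; biases start at zero.
model.add(Dense(512, input_shape = (22, 9),
                kernel_initializer = initializers.RandomNormal(mean = 0.0, stddev = 0.05),
                bias_initializer = initializers.Zeros()))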
- Regularizers: Regularizers apply penalties on layer parameters during optimization; these penalties are added to the loss that the network minimizes. A usage sketch follows this item.
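As a minimal sketch (assuming, as in the later examples, a Dense layer of 512 units and an input shape of (22, 9)), a regularizer is attached to a layer through the kernel_regularizer argument; the penalty factor of 0.01 here is an arbitrary illustrative value:
from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers

model = Sequential()
# Adds 0.01 * sum(weight ** 2) to the loss during training (L2 penalty).
model.add(Dense(512, input_shape = (22, 9),
                kernel_regularizer = regularizers.l2(0.01)))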
- Activations: An activation function determines whether, and how strongly, a given neuron fires. It applies a nonlinear transformation to a layer's output, which allows the network to learn more complex patterns. Some common activation functions are described below:
- linear: This applies a linear function as follows:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'linear', input_shape = (22, 9)))
- elu: It applies the exponential linear unit (ELU), as shown below:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'elu', input_shape = (22, 9)))
- relu: This function applies the rectified linear unit (ReLU). Refer to the code below:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'relu', input_shape = (22, 9)))
- selu: It applies the scaled exponential linear unit (SELU) to the learning model:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'selu', input_shape = (22, 9)))
- softsign: It applies the softsign function as follows:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'softsign', input_shape = (22, 9)))
- softmax: This applies the softmax function as shown below:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'softmax', input_shape = (22, 9)))
- softplus: It applies the softplus function, as in the following example:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'softplus', input_shape = (22, 9)))
- exponential: The following configuration applies the exponential function:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(512, activation = 'exponential', input_shape = (22, 9)))
Notice that in the examples above only the value of the activation argument changes while the rest of the code stays the same: the activation function is selected simply by passing its name as a string.
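The Activation class imported in the examples above can also be added as a standalone layer instead of passing the name through the activation argument. The relu example could equivalently be written as the following sketch:
from keras.models import Sequential
from keras.layers import Activation, Dense

model = Sequential()
model.add(Dense(512, input_shape = (22, 9)))
# Apply the activation as a separate layer rather than via the string argument.
model.add(Activation('relu'))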