'''Build the LeNet architecture by studying its description in summary() format:

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 28, 28, 6)         156
_________________________________________________________________
average_pooling2d (AveragePo (None, 14, 14, 6)         0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 10, 10, 16)        2416
_________________________________________________________________
average_pooling2d_1 (Average (None, 5, 5, 16)          0
_________________________________________________________________
flatten (Flatten)            (None, 400)               0
_________________________________________________________________
dense (Dense)                (None, 120)               48120
_________________________________________________________________
dense_1 (Dense)              (None, 84)                10164
_________________________________________________________________
dense_2 (Dense)              (None, 10)                850
=================================================================

The activation function in every layer except the last is the hyperbolic
tangent. Networks with this activation train better than networks with a
sigmoid. Only ReLU outperforms 'tanh', but it had not yet been introduced
when LeNet was developed.

Call summary(). Run training on a single object to make sure the code works
(this is already in the precode).
'''
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, Flatten, Dense, AvgPool2D
import matplotlib.pyplot as plt
import numpy as np

# Load the Fashion-MNIST arrays
features_train = np.load('/datasets/fashion_mnist/train_features.npy')
target_train = np.load('/datasets/fashion_mnist/train_target.npy')
features_test = np.load('/datasets/fashion_mnist/test_features.npy')
target_test = np.load('/datasets/fashion_mnist/test_target.npy')

# Add a channel dimension and scale pixel values to [0, 1]
features_train = features_train.reshape(-1, 28, 28, 1) / 255.0
features_test = features_test.reshape(-1, 28, 28, 1) / 255.0
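# Optional shape check (assumption, not part of the original precode): for the
# standard Fashion-MNIST split this should print (60000, 28, 28, 1) and
# (10000, 28, 28, 1); the course files may hold a different number of rows.
print(features_train.shape, features_test.shape)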
model = Sequential()
# With a 5x5 kernel, 'same' padding keeps the 28x28 spatial size (see the
# summary above); each 2x2 average pooling then halves the spatial dimensions.
model.add(Conv2D(filters=6, kernel_size=(5, 5), padding='same',
                 activation='tanh', input_shape=(28, 28, 1)))
model.add(AvgPool2D(pool_size=(2, 2)))
# Default 'valid' padding shrinks 14x14 to 14 - 5 + 1 = 10;
# input_shape is only required on the first layer.
model.add(Conv2D(filters=16, kernel_size=(5, 5), activation='tanh'))
model.add(AvgPool2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(units=120, activation='tanh'))
model.add(Dense(units=84, activation='tanh'))
model.add(Dense(units=10, activation='softmax'))
model.compile(loss='sparse_categorical_crossentropy', optimizer='sgd', metrics=['acc'])
model.summary()
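# Optional sanity check (assumption: the model matches the summary above
# exactly): total parameters are 156 + 2416 + 48120 + 10164 + 850 = 61706.
assert model.count_params() == 61706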
# Train on a single object just to verify that the code runs
model.fit(features_train, target_train, epochs=1, verbose=1,
          steps_per_epoch=1, batch_size=1)
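# Optional check (not part of the task): after a full training run one could
# evaluate on the held-out set loaded above, which is otherwise unused here.
# With only a single training step the accuracy will be close to chance (~0.1).
test_loss, test_acc = model.evaluate(features_test, target_test, verbose=1)
print('Test accuracy:', test_acc)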