Example of writing a simple neural network with backpropagation #16

Open
xsa-dev opened this issue Dec 2, 2023 · 0 comments
xsa-dev commented Dec 2, 2023

This example demonstrates how to build a simple neural network (3 inputs, one 4-unit hidden layer, 1 output) with backpropagation and train it on the input data X and the expected outputs y.

import numpy as np

# Input data: 4 samples with 3 features each
X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]])
# Expected outputs (XOR of the first two features)
y = np.array([[0],[1],[1],[0]])

# Activation function; with deriv=True, x is expected to already be a
# sigmoid output, so x * (1 - x) is its derivative
def sigmoid(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

# Initialize the weights randomly in [-1, 1) with a fixed seed for reproducibility
np.random.seed(1)
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

# Training loop
for j in range(60000):
    # forward propagation
    l0 = X
    l1 = sigmoid(np.dot(l0, syn0))
    l2 = sigmoid(np.dot(l1, syn1))

    # output-layer error
    l2_error = y - l2
    
    if (j % 10000) == 0:
        print("Error: " + str(np.mean(np.abs(l2_error))))

    # scale the error by the slope of the sigmoid at l2
    l2_delta = l2_error * sigmoid(l2, True)
    
    # backpropagation
    l1_error = l2_delta.dot(syn1.T)
    l1_delta = l1_error * sigmoid(l1, True)

    # update the weights
    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)

print("Результат после тренировки:")
print(l2)
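
Once the loop finishes, the learned weights syn0 and syn1 can be reused for inference by running the same forward pass on a new input. A minimal sketch, assuming the snippet above has already been executed so that sigmoid, syn0 and syn1 are in scope (the test sample below is just an illustration):

# Inference: reuse the trained weights with the same forward pass as in training
new_sample = np.array([[1, 0, 1]])               # illustrative test input
hidden = sigmoid(np.dot(new_sample, syn0))       # hidden-layer activations
prediction = sigmoid(np.dot(hidden, syn1))       # output in (0, 1)
print("Prediction for [1, 0, 1]:", prediction)   # should be close to 1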
@xsa-dev xsa-dev added the enhancement label Dec 2, 2023
@xsa-dev xsa-dev self-assigned this Dec 2, 2023