Computer Vision :)

Tag: RNN (1)

Why tanh is used in RNN and LSTM

https://stats.stackexchange.com/questions/444923/activation-function-between-lstm-layers

"Activation function between LSTM layers" (stats.stackexchange.com): "I'm aware the LSTM cell uses both sigmoid and tanh activation functions internally, however when creating a stacked LSTM architecture does it make sense to pass their outputs through an activation..."

While studying RNNs, the question of why tanh rather than sigmoid is used as the activation function..
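A common part of the answer, sketched here as an illustration (not taken from the linked post): tanh is zero-centered with outputs in (-1, 1), and its gradient peaks at 1.0 versus sigmoid's 0.25, so repeatedly multiplying gradients through time steps shrinks the signal far more slowly. A minimal check using only the standard library:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def d_sigmoid(x):
    # Derivative of sigmoid: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)

def d_tanh(x):
    # Derivative of tanh: 1 - tanh(x)^2
    return 1.0 - math.tanh(x) ** 2

# Maximum slope, reached at x = 0: tanh's gradient is 4x sigmoid's.
print(d_sigmoid(0.0))  # 0.25
print(d_tanh(0.0))     # 1.0

# Backpropagating through 10 time steps at the best case for each:
# sigmoid's factor vanishes quickly, tanh's does not.
print(0.25 ** 10)  # ~9.5e-07
print(1.0 ** 10)   # 1.0
```

This is only the saturation/gradient argument; the internal LSTM gates still use sigmoid because a (0, 1) output is what a gate (a soft on/off switch) needs.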

Machine Learning/Theory 2021.05.25

