cython_lstm

Python LSTM: a Python library for getting things done quickly, and without waiting 50 years for compilation.
cython_lstm.zip
  • cython_lstm/
  • topology.py (3.4KB)
  • layers/
  • element_wise.py (2KB)
  • slice_layer.py (1.7KB)
  • base_layer.py (1.7KB)
  • tile_layer.py (1.7KB)
  • temporal_layer.py (4.3KB)
  • __init__.py (646B)
  • recurrent_averaging_layer.py (2.8KB)
  • layer.py (7.9KB)
  • recurrent_multistage_layer.py (2.5KB)
  • linear_layer.py (5.8KB)
  • connectible_layer.py (2KB)
  • activation_layer.py (1.3KB)
  • loop_layer.py (4KB)
  • recurrent_layer.py (5.4KB)
  • error.py (1.5KB)
  • neuron.py (2.1KB)
  • dataset.py (802B)
  • __init__.py (413B)
  • network_viewer.py (4.2KB)
  • network.py (6KB)
  • cython_utils.pyx (2.9KB)
  • trainer.py (3.4KB)
  • read.t (6B)
  • README.md (1.5KB)
  • Recurrent Net.ipynb (17KB)
  • .gitignore (765B)
  • Cython LSTM.ipynb (223.7KB)
Description
Cython LSTM
-----------

@author Jonathan Raiman
@date 3rd November 2014

See the current implementation [on this notebook](http://nbviewer.ipython.org/github/JonathanRaiman/cython_lstm/blob/master/Cython%20LSTM.ipynb).

## Capabilities:

* Multi Layer Perceptrons
* Backprop over the network
* Tanh, Logistic, Softmax, Rectifier, Linear activations
* Recurrent Neural Networks (hidden states only, no memory cells)
* Backprop through time
* Draw a graph of the network using matplotlib ([see notebook](http://nbviewer.ipython.org/github/JonathanRaiman/cython_lstm/blob/master/Cython%20LSTM.ipynb#drawing-the-network))
* Training using SGD or batch gradient descent
* Tensor networks (quadratic forms connecting layers)

### Key design goals

* Mimic the simplicity and practicality of Pynnet and Cybrain / Pybrain
* Model connections using matrices, not explicit connections (to get vector algebra involved)
* Construct and run million-parameter LSTM- and RNN-type models
* Be able to run AdaGrad / RMSprop on gradients easily

#### Icing on the cake

* Support dtype float32 and float64 (currently float32), and int32 / int64 for indices
* Backprop through structure
* Variable input-size indices for RNNs (so batches of different sequence lengths can be run adjacent to one another -- currently difficult given numpy array size restrictions)
* Language models / hierarchical softmax parameters
* An interface for Theano variables if needed (avoid compilation times and make everything cythonish)
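As a concrete illustration of the "hidden states only, no memory cells" recurrent model the capabilities list mentions, and of modelling connections as matrix products rather than explicit links, here is a minimal NumPy sketch of a tanh recurrent layer's forward pass. The function name, argument names, and shapes are assumptions for illustration only, not cython_lstm's actual API:

```python
import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Run a tanh recurrent layer over one sequence.

    x_seq : (T, input_dim)  input sequence
    W_xh  : (input_dim, hidden_dim)  input-to-hidden weights
    W_hh  : (hidden_dim, hidden_dim) hidden-to-hidden weights
    b_h   : (hidden_dim,)  bias
    Returns the (T, hidden_dim) stack of hidden states.
    """
    T = x_seq.shape[0]
    hidden_dim = b_h.shape[0]
    h = np.zeros(hidden_dim)  # initial hidden state
    states = []
    for t in range(T):
        # Each step is plain vector algebra: the "connections" are the
        # weight matrices, not per-neuron links.
        h = np.tanh(x_seq[t] @ W_xh + h @ W_hh + b_h)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(0)
T, input_dim, hidden_dim = 5, 3, 4
x = rng.standard_normal((T, input_dim))
W_xh = 0.1 * rng.standard_normal((input_dim, hidden_dim))
W_hh = 0.1 * rng.standard_normal((hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

H = rnn_forward(x, W_xh, W_hh, b_h)
print(H.shape)  # (5, 4)
```

Backprop through time, also listed above, would unroll this same loop in reverse, accumulating gradients for `W_xh` and `W_hh` across all timesteps before a single weight update.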