This thesis addresses the following two problems in the field of Machine Learning: the acceleration of multiple Long Short-Term Memory (LSTM) models on FPGAs, and the fault tolerance of compressed Convolutional Neural Networks (CNNs). LSTMs represent an effective solution to capture long-...
In particular, two physics-informed multi-LSTM network architectures are proposed for structural metamodeling. The performance of the proposed framework is demonstrated through two illustrative examples (nonlinear structures subjected to ground-motion excitation). It turns out...
multiLSTM for Joint NLU: implements a recurrent model for joint intent detection and slot filling for the NLU task. Requirements: gensim==3.4.0, h5py==2.8.0, Keras==2.2.0, keras-contrib==2.0.8, keras-utilities==0.5.0, numpy, tensorflow==1.9.0 ...
I have recently been running experiments with a multi-GPU LSTM and kept hitting the error "None values not supported". After consulting many references and trying various approaches, I finally solved it; below is a summary of the methods that worked, which I hope will help others. When calling your own functions inside build_model(), do not use tensors defined with tf.placeholder: with tf.device('/cpu:0'): in_data = tf.placeholder(tf.flo...
The fifth output is the hidden-state output: the concatenation of "the last hidden state of the left-to-right pass over the sequence" and "the last hidden state of the right-to-left pass over the...
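The concatenation described above (the final left-to-right hidden state joined with the final right-to-left hidden state) can be illustrated without any deep-learning framework. This is a simplified sketch using a plain tanh RNN cell instead of a full LSTM, so the state handling stays visible; all names and dimensions below are illustrative:

```python
import numpy as np

def rnn_last_state(x, W_x, W_h, reverse=False):
    """Run a simple tanh RNN over a (T, input_dim) sequence and return
    only the final hidden state, as a bidirectional wrapper would."""
    if reverse:
        x = x[::-1]
    h = np.zeros(W_h.shape[0])
    for x_t in x:
        h = np.tanh(W_x @ x_t + W_h @ h)
    return h

rng = np.random.default_rng(0)
T, input_dim, hidden = 6, 4, 3
x = rng.normal(size=(T, input_dim))
W_x = rng.normal(size=(hidden, input_dim)) * 0.5
W_h = rng.normal(size=(hidden, hidden)) * 0.5

# Run left-to-right and right-to-left, then concatenate the two
# final states -- the "fifth output" described in the snippet.
h_fwd = rnn_last_state(x, W_x, W_h)
h_bwd = rnn_last_state(x, W_x, W_h, reverse=True)
bidi_state = np.concatenate([h_fwd, h_bwd])
print(bidi_state.shape)  # (6,) = 2 * hidden
```

Frameworks such as Keras expose the same quantity via the states returned by a bidirectional recurrent layer; the point here is only that the combined state has twice the hidden width.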
In this paper we propose a transferred spatio-temporal deep model based on a multi-LSTM autoencoder to fill in missing air-pollution data values. The model combines transfer learning, an autoencoder, and LSTMs, wherein the LSTM can efficiently learn temporal information from long-term dependencies, th...
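The paper's model combines transfer learning, an autoencoder, and LSTMs, which is too heavy to reproduce here. The core imputation idea, however — reconstruct the series and copy the reconstruction into the missing slots while keeping observed values fixed — can be sketched with a much simpler stand-in: an iterative truncated-SVD (low-rank) reconstruction. Everything below (shapes, rank, iteration count, the toy data) is illustrative and not from the paper:

```python
import numpy as np

def lowrank_impute(X_obs, rank=2, iters=50):
    """Fill NaNs in a (stations, time) matrix by alternating a truncated-SVD
    reconstruction (the autoencoder stand-in) with re-inserting the observed
    values, so only the missing slots are overwritten."""
    mask = np.isnan(X_obs)
    filled = np.where(mask, np.nanmean(X_obs), X_obs)  # crude initial fill
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        recon = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        filled = np.where(mask, recon, X_obs)          # keep observed entries
    return filled

# Toy "pollution" data: five stations sharing one seasonal pattern
# plus per-station offsets, so the true matrix is exactly rank 2.
t = np.linspace(0, 4 * np.pi, 40)
amps = np.array([1.0, 0.8, 0.6, 0.4, 0.2])
offs = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
X = amps[:, None] * np.sin(t) + offs[:, None]

X_missing = X.copy()
X_missing[0, 10:14] = np.nan   # a gap at station 0
X_missing[3, 25] = np.nan      # a single dropout at station 3

X_hat = lowrank_impute(X_missing, rank=2)
print(X_hat.shape)  # (5, 40)
```

An LSTM autoencoder plays the role of the SVD step here: it exploits temporal structure rather than linear cross-station correlation, but the fill-and-reconstruct loop is the same shape.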
Uses wavelet transforms, an attention mechanism, and LSTMs for (supervised) anomaly detection on BGP data. 2.2 Motivation: previous LSTM-based time-series anomaly detection has mostly used stacked LSTMs, which leads to a large parameter count and an overly complex model. The authors instead first use a wavelet transform to extract multi-scale information from the raw sequence, then use an LSTM with attention to extract information at each scale, and finally use a single-layer LSTM for classification...
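The multi-scale extraction step above would normally use a wavelet library such as PyWavelets; as a dependency-free illustration, here is a sketch of repeated single-level Haar decomposition (the simplest wavelet). The Haar choice, the number of levels, and the toy signal are all assumptions for illustration, not details from the paper:

```python
import numpy as np

def haar_step(x):
    """One Haar DWT level: pairwise averages (approximation) and
    pairwise differences (detail), each scaled by 1/sqrt(2)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_multiscale(x, levels):
    """Return the detail coefficients at each scale plus the final coarse
    approximation -- the multi-scale inputs a per-scale LSTM could consume."""
    details = []
    approx = x
    for _ in range(levels):
        approx, d = haar_step(approx)
        details.append(d)
    return details, approx

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * rng.normal(size=64)
details, coarse = haar_multiscale(signal, levels=3)
print([d.shape[0] for d in details], coarse.shape[0])  # [32, 16, 8] 8
```

Because the scaled Haar transform is orthonormal, the total energy of the detail bands plus the coarse band equals the energy of the input, which makes it a clean way to split a series into scales before feeding each band to its own recurrent branch.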
Jánoki, Imre, Ádám Nagy, Péter Földesy, Ákos Zarándy, Máté Siket, Judit Varga, and Miklós Szabó. 2023. "Neonatal Activity Monitoring by Camera-Based Multi-LSTM Network." Engineering Proceedings 55, no. 1: 16. https://doi.org/10.3390/engproc2023055016 ...
LSTM and GRU (Gated Recurrent Unit) are such recurrent models. For our experiments, we implemented both an LSTM stack and a GRU stack, extended with fully connected layers. The stacks constructed from these recurrent architectures were suitable for working on the multivariate input features. The ...
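As a concrete picture of what one layer of such a stack computes, here is a minimal numpy sketch of a single LSTM cell's forward pass with random weights, stacked two layers deep so that layer 2 consumes layer 1's hidden-state sequence. The dimensions, gate ordering, and initialization are illustrative assumptions, not the experimental setup above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(x_seq, W, U, b, hidden):
    """Forward pass of one LSTM layer over a (T, input_dim) sequence.
    W: (4*hidden, input_dim), U: (4*hidden, hidden), b: (4*hidden,).
    Gate order used here: input, forget, cell candidate, output."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    hs = []
    for x_t in x_seq:
        z = W @ x_t + U @ h + b
        i = sigmoid(z[0 * hidden:1 * hidden])   # input gate
        f = sigmoid(z[1 * hidden:2 * hidden])   # forget gate
        g = np.tanh(z[2 * hidden:3 * hidden])   # candidate cell state
        o = sigmoid(z[3 * hidden:4 * hidden])   # output gate
        c = f * c + i * g                        # cell-state update
        h = o * np.tanh(c)                       # hidden-state output
        hs.append(h)
    return np.stack(hs)

rng = np.random.default_rng(0)
T, input_dim, hidden = 5, 3, 4
x = rng.normal(size=(T, input_dim))

def make_params(in_dim, hid):
    return (rng.normal(size=(4 * hid, in_dim)) * 0.3,
            rng.normal(size=(4 * hid, hid)) * 0.3,
            np.zeros(4 * hid))

# Two stacked layers: layer 2 reads layer 1's full hidden sequence,
# which is what "stack" means for these recurrent architectures.
h1 = lstm_forward(x, *make_params(input_dim, hidden), hidden)
h2 = lstm_forward(h1, *make_params(hidden, hidden), hidden)
print(h2.shape)  # (5, 4)
```

The fully connected layers mentioned above would then map the last row of `h2` (or the whole sequence) to the task's output dimension.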