Code Repository for Liquid Time-Constant Networks (LTCs)
Repository for the tutorial on Sequence-Aware Recommender Systems held at TheWebConf 2019 and ACM RecSys 2018
Liquid Structural State-Space Models
Efficient Python library for Extended LSTM with exponential gating, memory mixing, and matrix memory for superior sequence modeling.
The Reinforcement-Learning-Related Papers of ICLR 2019
Contains various architectures and novel paper implementations for Natural Language Processing tasks like Sequence Modelling and Neural Machine Translation.
Implementation of GateLoop Transformer in Pytorch and Jax
Python package for Arabic natural language processing
Sequential model for polyphonic music
Pytorch implementation of Simplified Structured State-Spaces for Sequence Modeling (S5)
Repo to reproduce the First-Explore paper results
An implementation of the AWD-LSTM in PyTorch
VOGUE: Variable Order HMM with Duration
Source code for "A Lightweight Recurrent Network for Sequence Modeling"
Deep, sequential, transductive divergence metric and domain adaptation for time-series classifiers
Caption Images with Machine Learning
3rd Place Solution for 2nd AVA Challenge@IEEE MIPR 2024
Computer vision tools for analyzing behavioral data, including complex event detection in videos.
This course studies the fundamentals of distributed machine learning algorithms and of deep learning. It covers the basics of machine learning and introduces techniques and systems that enable machine learning algorithms to be efficiently parallelized.
An unofficial implementation of "TransAct: Transformer-based Realtime User Action Model for Recommendation at Pinterest" in Tensorflow