# luong-attention

Here are 10 public repositories matching this topic...


This repository contains various attention mechanisms (Bahdanau, soft attention, additive attention, hierarchical attention, etc.) implemented in PyTorch, TensorFlow, and Keras.

  • Updated Sep 23, 2021
  • Python
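Since this repository covers additive (Bahdanau-style) attention, here is a rough sketch of the scoring function it refers to, `score(h_t, h_s) = v · tanh(W1 h_t + W2 h_s)`, in plain Python. The weight matrices `W1`, `W2` and vector `v` below are illustrative toy parameters, not values from the repository:

```python
import math

def additive_score(h_t, h_s, w1, w2, v):
    """Bahdanau-style additive attention score.

    score = v . tanh(W1 @ h_t + W2 @ h_s), with W1, W2 given as
    lists of rows and h_t (decoder state), h_s (encoder state) as
    plain lists. In a real model these weights are learned.
    """
    hidden = [
        math.tanh(
            sum(w1[i][j] * h_t[j] for j in range(len(h_t)))
            + sum(w2[i][j] * h_s[j] for j in range(len(h_s)))
        )
        for i in range(len(v))
    ]
    return sum(v[i] * hidden[i] for i in range(len(v)))

# Toy example: identity weight matrices, v = [1, 1].
w_id = [[1.0, 0.0], [0.0, 1.0]]
s = additive_score([1.0, 0.0], [0.0, 1.0], w_id, w_id, [1.0, 1.0])
```

With identity weights this reduces to `tanh(1) + tanh(1)`, which makes the toy case easy to check by hand.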

Successfully established a text summarization model using Seq2Seq modeling with Luong attention, which produces short, concise summaries of global news headlines.

  • Updated May 6, 2024
  • Jupyter Notebook
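For context on what "Luong attention" means in a Seq2Seq model like this one, the simplest variant is the "dot" score from Luong et al.: the decoder state is compared to each encoder state by a dot product, the scores are softmaxed into weights, and the context vector is the weighted sum of encoder states. A minimal stdlib-only sketch (not code from this repository):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def luong_dot_attention(decoder_state, encoder_states):
    """Luong 'dot' attention: score(h_t, h_s) = h_t . h_s.

    Returns the attention weights over the source positions and
    the context vector (weighted sum of encoder states).
    """
    scores = [
        sum(a * b for a, b in zip(decoder_state, h_s))
        for h_s in encoder_states
    ]
    weights = softmax(scores)
    context = [
        sum(w * h_s[i] for w, h_s in zip(weights, encoder_states))
        for i in range(len(decoder_state))
    ]
    return weights, context

# Toy usage: the encoder state aligned with the decoder state
# receives the larger attention weight.
weights, context = luong_dot_attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

Luong's "general" variant inserts a learned matrix into the score (`h_t^T W h_s`); the dot form above is the special case with `W` as the identity.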

The modelling pipeline aims to replicate the translation fora conducted in the imperial courts, the likes of which had been rigorously formalised since the time of Dao'an 道安, as described in scholarly accounts such as the 383 AD translation of T 1543 (Abhidharma-jñānaprasthāna-śāstra) in Chang'an 長安.

  • Updated Oct 20, 2025
  • Jupyter Notebook
