Marek Biernacki

Attention Is All You Need

In this post we review "Attention Is All You Need," the ground-breaking paper that introduced the Transformer architecture, a neural network model for NLP tasks that relies entirely on attention mechanisms to process input sequences. The paper's contributions have had a significant impact on deep learning and have inspired a wave of further research and advancements in the field.
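The core operation the Transformer relies on is scaled dot-product attention: queries are compared against keys, the scores are normalized with a softmax, and the result weights the values. A minimal NumPy sketch (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns the attended output and the attention weights.
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ V, weights
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with vanishing gradients.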

Read More
Marek Biernacki

Neural Machine Translation by Jointly Learning to Align and Translate

In this post we review "Neural Machine Translation by Jointly Learning to Align and Translate," the paper that introduced the attention mechanism as a way of enhancing encoder-decoder architectures. It argues that traditional encoder-decoder models are bottlenecked in performance by compressing the entire source sentence into a single fixed-length vector.
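Instead of one fixed-length summary vector, the decoder builds a fresh context vector at each step: an alignment score is computed between the previous decoder state and every encoder hidden state, and the softmax-normalized scores weight the encoder states. A minimal sketch of this additive (Bahdanau-style) scoring, assuming NumPy and with all parameter names illustrative:

```python
import numpy as np

def additive_attention(s_prev, hs, Wa, Ua, va):
    """One decoding step of additive attention.

    s_prev: previous decoder state, shape (d_s,)
    hs:     encoder hidden states, shape (T, d_h)
    Wa, Ua, va: learned parameters; score e_j = va^T tanh(Wa s_prev + Ua h_j)
    Returns the context vector and the alignment weights.
    """
    # Alignment score of the decoder state against each source position.
    scores = np.array([va @ np.tanh(Wa @ s_prev + Ua @ h) for h in hs])
    # Softmax over source positions (max subtracted for stability).
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    # Context: a different weighted average of encoder states per step.
    context = (alpha[:, None] * hs).sum(axis=0)
    return context, alpha
```

Because the weights are recomputed at every decoder step, long sentences no longer have to be squeezed through a single vector, which is precisely the bottleneck the paper identifies.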

Read More