  1. Encoder Decoder Models — transformers 4.11.3 documentation

    Transformers provides thousands of pretrained models to perform tasks on texts such as classification, information extraction, question answering, summarization, translation, text generation, etc in 100+ …

  2. Transformer (deep learning) - Wikipedia

    Transformer (deep learning) A standard transformer architecture, showing on the left an encoder, and on the right a decoder. Note: it uses the pre-LN convention, which is different from the post-LN …
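The pre-LN vs. post-LN distinction the snippet mentions is just where LayerNorm sits relative to the residual connection. A minimal PyTorch sketch of the two conventions (class names and dimensions are illustrative, not from any of the linked pages):

```python
import torch
import torch.nn as nn

class PostLNBlock(nn.Module):
    """Post-LN (the original 2017 convention): normalize AFTER the residual add."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(x, x, x)
        return self.norm(x + out)   # LayerNorm applied to the residual sum

class PreLNBlock(nn.Module):
    """Pre-LN (common in modern models): normalize BEFORE the sublayer."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)            # LayerNorm applied to the input
        out, _ = self.attn(h, h, h)
        return x + out              # residual path stays un-normalized
```

Both blocks map `(batch, seq, d_model)` to the same shape; the pre-LN ordering is widely preferred for training stability in deep stacks, which is why the Wikipedia figure flags the difference.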

  3. Context Representation via Action-Free Transformer encoder-decoder

    1 day ago · Built on a transformer encoder decoder with rotary positional embeddings, the model captures long range temporal dependencies and robustly encodes both parametric and non …

  4. Transformer Architecture Explained: How LLMs Work

    The transformer architecture’s encoder-decoder structure provides a flexible framework for processing and generating sequential data, though modern applications often use only one component …

  5. Transformers in Machine Learning - GeeksforGeeks

    Dec 10, 2025 · 3. Encoder–Decoder Attention Queries come from the decoder. Keys and Values come from the encoder output. This lets the decoder look at important parts of the input sentence while …
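The query/key/value routing described in that snippet can be sketched directly with PyTorch's built-in attention module (the tensor shapes here are illustrative assumptions, not from the linked article):

```python
import torch
import torch.nn as nn

d_model, n_heads = 64, 4
cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

enc_out = torch.randn(2, 10, d_model)  # encoder output: (batch, src_len, d_model)
dec_h   = torch.randn(2, 7, d_model)   # decoder hidden states: (batch, tgt_len, d_model)

# Queries come from the decoder; keys and values come from the encoder output,
# so each target position attends over the source sequence.
ctx, weights = cross_attn(query=dec_h, key=enc_out, value=enc_out)

print(ctx.shape)      # (2, 7, 64)  — one context vector per target position
print(weights.shape)  # (2, 7, 10) — attention over the 10 source positions
```

The returned weights make the "look at important parts of the input sentence" behavior concrete: row `t` is a distribution over source positions for target position `t`.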

  6. Transformer Encoder and Decoder Models

    These are PyTorch implementations of Transformer based encoder and decoder models, as well as other related modules.

  7. Meet the Transformers: Encoder, Decoder, and Encoder-Decoder

    Apr 2, 2025 · The original Transformer used both an encoder and a decoder, primarily for machine translation. However, researchers quickly realized that using just one of these components, or …

  8. Transformer Architecture - Wikiversity

    Nov 28, 2025 · The transformer architecture, as introduced by Vaswani et al. (2017), has two high-level components: the encoder and decoder (see Figure 1). The encoder and decoder are both composed …

  9. Understanding Transformer Architectures: Decoder-Only, Encoder

    Nov 20, 2024 · Three primary variant configurations are decoder-only, encoder-only, and encoder-decoder transformers. Each has unique characteristics, applications, and notable models.

  10. [2403.13112] Efficient Encoder-Decoder Transformer Decoding for ...

    Mar 19, 2024 · We introduce a new configuration for encoder-decoder models that improves efficiency on structured output and decomposable tasks where multiple outputs are required for a single shared …