Llama attention extension: Revision history



16 October 2024

9 August 2024

  • (cur | prev) 11:15, 9 August 2024 XenoEngineer (talk | contribs) 11,409 bytes +11,409 Created page with "Category:Llama3.1 Category:AI Category:Python __NOTOC__ <div style="background-color:azure; border:1px outset azure; padding:0 20px; max-width:860px; margin:0 auto; "> = Programmatic Extension of Infini-Attention Feature = == Modifying Positional Encoding == To extend the context-length of LLama 3.1's inherent infini-attention mechanism, start by modifying its positional encoding vector. A good place to begin is by increasing the frequency of the s..."
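The truncated edit summary above refers to extending context length by adjusting the positional-encoding frequencies. As a minimal sketch of that general idea (the function names, dimensions, and scale factor here are illustrative assumptions, not taken from the revision itself), one common approach scales the rotary-encoding base so positions beyond the trained context remain distinguishable:

```python
import math

def rope_inv_freq(dim: int, base: float = 10000.0) -> list[float]:
    """Inverse frequencies for rotary positional encoding (RoPE)."""
    return [base ** (-2 * i / dim) for i in range(dim // 2)]

def scaled_base(base: float, dim: int, scale: float) -> float:
    """Stretch the encoding base by the context-extension factor
    (an NTK-style adjustment; a hypothetical helper for illustration)."""
    return base * scale ** (dim / (dim - 2))

dim = 128    # per-head dimension (illustrative)
scale = 4.0  # target context length / trained context length

orig = rope_inv_freq(dim)
ext = rope_inv_freq(dim, scaled_base(10000.0, dim, scale))

# Scaled frequencies are lower (longer wavelengths), so the encoding
# rotates more slowly and covers a longer usable context window.
assert all(e <= o for e, o in zip(ext, orig))
```

Whether this matches the edit the page describes cannot be confirmed from the truncated summary; it only illustrates the family of positional-encoding modifications the text alludes to.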