Contents in this wiki are for entertainment purposes only
This is not fiction ∞ this is psience of mind

User contributions for XenoEngineer


10 August 2024

9 August 2024

  • 11:51, 9 August 2024  +34  N  Template:MenuLMPrimer  (Created page with "Category:Language Model Primer")  (current)
  • 11:50, 9 August 2024  +1,913  N  Principles of Sinusoidal Positioning  (Created page with "{{menuLMPrimer}} <div style="background-color:azure; border:1px outset azure; padding:0 20px; max-width:860px; margin:0 auto; "> ===Principles of Sinusoidal Positioning=== ====Periodicity, Frequency, Orthogonality==== In natural language processing (NLP) and deep learning, positional encoding is a technique used to incorporate information about the position of each token in a sequence into its vector representation. This allows the model to capture sequential relationship...")  (current)
  • 11:15, 9 August 2024  +11,409  N  Llama attention extension  (Created page with "Category:Llama3.1 Category:AI Category:Python __NOTOC__ <div style="background-color:azure; border:1px outset azure; padding:0 20px; max-width:860px; margin:0 auto; "> = Programmatic Extension of Infini-Attention Feature = == Modifying Positional Encoding == To extend the context length of Llama 3.1's inherent infini-attention mechanism, start by modifying its positional encoding vector. A good place to begin is by increasing the frequency of the s...")
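The two entries above describe sinusoidal positional encoding and extending a model's context length by rescaling the encoding's frequencies. As a minimal sketch of the idea the logged pages reference, here is the standard sin/cos positional encoding in NumPy. The `base` parameter is the frequency knob the "Llama attention extension" entry alludes to; this is a generic textbook formulation, not the actual Llama 3.1 implementation (the page snippets are truncated, so the real code is not shown here).

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model, base=10000.0):
    """Classic sinusoidal positional encoding.

    pe[pos, 2i]   = sin(pos / base**(2i / d_model))
    pe[pos, 2i+1] = cos(pos / base**(2i / d_model))

    Increasing `base` lowers the frequencies (longer wavelengths),
    which is one way to stretch the encoding over longer contexts.
    Assumes d_model is even.
    """
    positions = np.arange(seq_len)[:, None]          # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]         # (1, d_model/2)
    angles = positions / np.power(base, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                     # even indices: sine
    pe[:, 1::2] = np.cos(angles)                     # odd indices: cosine
    return pe
```

At position 0 every sine component is 0 and every cosine component is 1; raising `base` shrinks the angle (and thus the sine value) at a given position and dimension, illustrating the frequency-scaling lever described in the log entry.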

7 August 2024

5 August 2024

4 August 2024
