Hello.

Just a student exploring the vast world of programming; a fascination with AI is what ignited this journey. Currently learning about large language models (LLMs). A quiet learner who enjoys technical discussions.

LLM NOTE CHAPTER 03

Coding Attention Mechanisms

Code repository: rasbt/LLMs-from-scratch

This chapter explores the fundamentals of self-attention mechanisms and their implementation in natural language processing, progressing from basic attention to multi-head attention. We will implement these concepts step by step in Python.

1. Attending to Different Parts of the Input with Self-Attention

Self-attention allows a model to dynamically focus on different parts of an input sequence based on their relevance. Below is a simple implementation broken into key steps. ...
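To make the preview concrete, here is a minimal sketch of the simplified, weight-free self-attention computation the chapter starts from; the toy embedding values and variable names are illustrative assumptions, not code copied from the repository:

```python
import torch

# A toy sequence of six 3-dimensional token embeddings (illustrative values).
inputs = torch.tensor(
    [[0.43, 0.15, 0.89],
     [0.55, 0.87, 0.66],
     [0.57, 0.85, 0.64],
     [0.22, 0.58, 0.33],
     [0.77, 0.25, 0.10],
     [0.05, 0.80, 0.55]])

# Step 1: attention scores as dot products between every pair of tokens.
attn_scores = inputs @ inputs.T               # shape: (6, 6)

# Step 2: softmax over each row so the attention weights sum to 1.
attn_weights = torch.softmax(attn_scores, dim=-1)

# Step 3: each context vector is a weighted sum of all input embeddings.
context_vecs = attn_weights @ inputs          # shape: (6, 3)
print(context_vecs)
```

The chapter then extends this idea with trainable query, key, and value weight matrices on the way to multi-head attention.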

March 25, 2025 · 5 min

LLM NOTE CHAPTER 02

Data Preparation Stage Workflow Analysis

Code repository: rasbt/LLMs-from-scratch

This section explains the complete workflow of the data preparation stage: raw text → token → token ID → vector. The following breaks down the core content of each stage step by step.

1. raw text → token

Stage Description

In this stage, raw text is split into tokens (words and punctuation symbols) using regular expressions. A unique vocabulary is then constructed by removing duplicates with set and sorting with sorted(). ...
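As a rough illustration of the first two stages, here is a minimal sketch; the sample sentence and the exact regular expression are assumptions for demonstration, not necessarily the pattern used in the repository:

```python
import re

raw_text = 'Hello, world. This, is a test.'  # illustrative sample text

# raw text -> token: split on punctuation and whitespace, keep the
# delimiters (via the capture group), and drop empty pieces.
tokens = [t.strip() for t in re.split(r'([,.:;?_!"()\']|--|\s)', raw_text)
          if t.strip()]

# Build the vocabulary: deduplicate with set, order with sorted(),
# and assign each unique token an integer ID.
vocab = {tok: idx for idx, tok in enumerate(sorted(set(tokens)))}

# token -> token ID: look each token up in the vocabulary.
token_ids = [vocab[t] for t in tokens]
print(tokens)
print(token_ids)
```

The final token ID → vector stage would then map each ID to a dense vector, e.g. via a torch.nn.Embedding lookup.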

March 23, 2025 · 3 min