Records of Paper Reading

  • 2025/07:

    • 2025/07/26:

      • Reading the technical blog post by Lilian Weng, "Why We Think", for the first section. (It is too long…)
    • 2025/07/27 - 2025/07/28:

      • Rereading and writing summary notes for the classic Transformer paper: Attention Is All You Need.

      • Thought: Attention pooling is the core component for understanding the power of the Transformer model: contextual meaning is learned via the process of self-attention.
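The "attention pooling" idea above can be sketched in a few lines: each output token is a weighted average (a pool) of all value vectors, with weights given by query-key similarity. This is a minimal single-head NumPy sketch for my own notes, not the paper's multi-head implementation; the projection sizes are arbitrary.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a token sequence x."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v               # project tokens to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])           # pairwise similarity, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ v                                # each output pools the values by attention weight

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                           # 4 tokens, model dim 8 (toy sizes)
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # → (4, 8)
```

Because the softmax rows sum to 1, every output vector lies in the convex hull of the value vectors, which is exactly the "pooling" view.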

    • 2025/07/29: Attended CS336 Lecture 1: Course Overview and Tokenization.

    • 2025/07/30: Read the paper on Byte-Pair Encoding; the greedy merge process is reminiscent of Huffman coding! (It yields shorter sequences and more efficient encoding.)
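The BPE idea reads well as code: count adjacent symbol pairs over the corpus, merge the most frequent pair into one symbol, and repeat. This is a toy sketch of the training loop (the word list and merge count are made up for illustration), not a production tokenizer.

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Greedy BPE training: repeatedly merge the most frequent adjacent symbol pair."""
    vocab = {tuple(w): c for w, c in Counter(words).items()}  # word -> count, split into chars
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols, count in vocab.items():
            for a, b in zip(symbols, symbols[1:]):            # count every adjacent pair
                pairs[(a, b)] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)                      # most frequent pair wins
        merges.append(best)
        new_vocab = {}
        for symbols, count in vocab.items():                  # rewrite each word with the merge applied
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = count
        vocab = new_vocab
    return merges

merges = bpe_merges(["low", "lower", "lowest", "low"], num_merges=3)
print(merges)  # → [('l', 'o'), ('lo', 'w'), ('low', 'e')]
```

Like Huffman coding, frequency drives the greedy choice, but BPE merges adjacent symbols into longer tokens rather than building a prefix tree.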

    • 2025/07/31 - 2025/08/05: Busy with the GUI-Agent project! Read a survey and technical reports on several major GUI agents, including UI-TARS from ByteDance, Qwen-VL, etc. (See the upcoming blog.)

    • 2025/08/06 - 2025/08/07: Still focusing on GUI-Agent! It is fascinating work!

    • 2025/08/16: This section will resume updating tomorrow :) — currently busy developing projects…