Karl Xie
@xieyongheng.bsky.social
he | 25 | maths graduate student | CS | AI | Physics | history & reality | languages: Chinese English German French | music: synthwave & modern classical music | videogames: RTS & FPS | writing
I've been interning for a while now: at work it's all vibe coding business code, after work there's still a pile of things waiting to get done, and scrolling the news I see the whole world getting crazier and crazier. Suddenly I feel utterly drained, as if my mental energy has run out.

Another weekend with clear skies; all I want is to sit on the balcony and do nothing but soak up the sun.
January 24, 2026 at 3:22 PM
Do the right thing. Notice and accept fear and anxiety, and let them pass through your body like wind.
February 9, 2025 at 4:04 PM
A calm sea never made a skilled sailor.
February 3, 2025 at 10:37 PM
Google Colab, short for Colaboratory, is a free, cloud-based platform that allows users to write and execute Python code in a Jupyter Notebook environment.

colab.research.google.com
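
For instance, a minimal cell one might run in a fresh Colab notebook to check the hosted Python runtime and whether a GPU was assigned; this assumes torch is preinstalled on the runtime, which it commonly is:

import sys
import torch  # commonly preinstalled on Colab runtimes

print(sys.version)                # Python version of the hosted runtime
print(torch.cuda.is_available())  # True if a GPU runtime was assigned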
January 8, 2025 at 4:42 PM
PyTorch documentation

pytorch.org/docs/stable/...
January 5, 2025 at 3:59 PM
Some words are simply more fundamental, more important, and more frequently used than others.
January 3, 2025 at 11:35 PM
Studying closely related languages side by side is a fascinating experience, for example English, French, and German.
January 3, 2025 at 11:34 PM
Words are the basic units for describing yourself and the world. The more words you can actively use in writing and speech, and the more precise and nuanced they are, the deeper your understanding of yourself and the world.
January 3, 2025 at 11:33 PM
Everyone should read the Constitution of the United States.
January 2, 2025 at 11:27 PM
Pessimism has no positive value; it only drains your mental energy.

Better to be wrong and optimistic, actively solving problems, than to be right and pessimistic, lacking the drive to act and missing unknown opportunities.
January 1, 2025 at 7:58 PM
The limits of language are the limits of precise thought.
January 1, 2025 at 7:37 PM
Understanding LLMs: A Comprehensive Overview from Training to Inference

Yiheng Liu, et al.

Source: arxiv.org/abs/2401.02038
January 1, 2025 at 4:02 PM
Xavier Glorot, Yoshua Bengio. Understanding the difficulty of training deep feedforward neural networks. AISTATS 2010

Source: proceedings.mlr.press/v9/glorot10a...
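
A quick sketch of the Glorot (Xavier) uniform initialization this paper proposes: weights are drawn from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)), which keeps activation and gradient variances roughly constant across layers. The layer sizes below are arbitrary examples.

import math
import torch
import torch.nn as nn

fan_in, fan_out = 784, 256  # example layer sizes (assumption)
a = math.sqrt(6.0 / (fan_in + fan_out))
W = torch.empty(fan_out, fan_in).uniform_(-a, a)  # manual Glorot uniform draw

# PyTorch ships the same scheme as a built-in initializer:
layer = nn.Linear(fan_in, fan_out)
nn.init.xavier_uniform_(layer.weight)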
January 1, 2025 at 2:33 PM
Inductive Representation Learning on Large Graphs

William L. Hamilton, Rex Ying, Jure Leskovec

Source: arxiv.org/abs/1706.02216
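
A minimal sketch of the GraphSAGE idea with the mean aggregator: each node's embedding is built from its own features concatenated with an aggregate of its (sampled) neighbors' features, so embeddings can be computed inductively for unseen nodes. The toy graph and dimensions below are illustrative assumptions, not the authors' reference code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SAGEMeanLayer(nn.Module):
    # One GraphSAGE layer, mean aggregator:
    # h_v = ReLU(W · [x_v ; mean(x_u for u in N(v))]), then L2-normalized.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, x, neighbors):
        # x: (num_nodes, in_dim); neighbors: one tensor of neighbor indices per node
        agg = torch.stack([x[nb].mean(dim=0) for nb in neighbors])
        h = F.relu(self.lin(torch.cat([x, agg], dim=1)))
        return F.normalize(h, p=2, dim=1)

# Toy usage: three nodes on a path graph 0-1-2.
x = torch.randn(3, 8)
neighbors = [torch.tensor([1]), torch.tensor([0, 2]), torch.tensor([1])]
h = SAGEMeanLayer(8, 16)(x, neighbors)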
January 1, 2025 at 2:31 PM
Attention Is All You Need

Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin

Source: arxiv.org/abs/1706.03762
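
The core operation of the Transformer introduced here is scaled dot-product attention, softmax(Q Kᵀ / sqrt(d_k)) V; a minimal sketch with made-up tensor shapes:

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # softmax(Q K^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (batch, seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Illustrative shapes: batch of 2, sequence length 5, model dim 64 (assumptions).
q = k = v = torch.randn(2, 5, 64)
out = scaled_dot_product_attention(q, k, v)  # (2, 5, 64)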
January 1, 2025 at 2:30 PM
How Does Batch Normalization Help Optimization?

Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, Aleksander Madry

Source: arxiv.org/abs/1805.11604
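
For reference, the operation the paper analyzes normalizes each feature over the mini-batch and then rescales it with learned parameters; a minimal training-mode sketch (no running statistics, toy shapes assumed):

import torch

def batch_norm_train(x, gamma, beta, eps=1e-5):
    # Training-mode BatchNorm over a (batch, features) input:
    # normalize each feature by its batch mean/variance, then scale and shift.
    mean = x.mean(dim=0)
    var = x.var(dim=0, unbiased=False)
    x_hat = (x - mean) / torch.sqrt(var + eps)
    return gamma * x_hat + beta

x = torch.randn(32, 10)  # toy mini-batch
out = batch_norm_train(x, torch.ones(10), torch.zeros(10))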
January 1, 2025 at 2:26 PM
Understanding LSTM Networks

Source: colah.github.io/posts/2015-0...
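
A compact sketch of the standard LSTM step that post walks through: forget, input, and output gates plus a candidate cell state, computed from the current input and previous hidden state. Sizes below are arbitrary.

import torch
import torch.nn as nn

class LSTMCellSketch(nn.Module):
    # One LSTM step: gates i, f, o and candidate g from [x_t, h_{t-1}],
    # then c_t = f*c_{t-1} + i*tanh(g) and h_t = o*tanh(c_t).
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.lin = nn.Linear(input_dim + hidden_dim, 4 * hidden_dim)

    def forward(self, x, h, c):
        i, f, o, g = self.lin(torch.cat([x, h], dim=-1)).chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c_new = f * c + i * torch.tanh(g)
        h_new = o * torch.tanh(c_new)
        return h_new, c_new

# Toy step: batch of 4, input dim 8, hidden dim 16.
cell = LSTMCellSketch(8, 16)
h, c = cell(torch.randn(4, 8), torch.zeros(4, 16), torch.zeros(4, 16))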
January 1, 2025 at 2:13 PM
Andrej Karpathy blog: A Recipe for Training Neural Networks

Source: karpathy.github.io/2019/04/25/r...
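
One concrete tip from that recipe is to verify the whole training loop by overfitting a single small batch before scaling up; a sketch of that sanity check, with a placeholder model and random data standing in for the real ones:

import torch
import torch.nn as nn

# A correct model + loop should drive the loss on one fixed batch near zero.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
xb, yb = torch.randn(16, 20), torch.randint(0, 3, (16,))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()
print(loss.item())  # should be close to zero if the pipeline is wired correctly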
January 1, 2025 at 2:12 PM
PyTorch, Dynamic Computational Graphs and Modular Deep Learning

Source: medium.com/intuitionmac...
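
The "dynamic computational graph" point means PyTorch rebuilds the graph on every forward pass, so ordinary Python control flow can change the computation per input; a small sketch (the data-dependent depth rule is an arbitrary example):

import torch
import torch.nn as nn

class DynamicDepthNet(nn.Module):
    # Applies the same layer a data-dependent number of times; autograd
    # records whatever graph this particular run actually built.
    def __init__(self, dim):
        super().__init__()
        self.layer = nn.Linear(dim, dim)

    def forward(self, x):
        steps = int(x.abs().mean().item() * 10) % 5 + 1  # depth depends on the input
        for _ in range(steps):
            x = torch.tanh(self.layer(x))
        return x

net = DynamicDepthNet(10)
out = net(torch.randn(10) * 3)
out.sum().backward()  # gradients flow through the graph built on this pass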
January 1, 2025 at 2:08 PM
Pomodoro timer + Feynman technique + at least half an hour a day of genuine relaxation and fun: a self-discipline routine that practice has shown to work well over the past year.
January 1, 2025 at 12:25 AM
Long-term plans and abstract goals absolutely must be broken down into concrete daily, weekly, and monthly actions, and then carried out.
December 31, 2024 at 12:36 PM
6 most important keys to success:

* passion
* persistence
* focus on the core issue
* attention to detail
* PDCA (plan, do, check, act)
* calm
December 27, 2024 at 11:26 PM
In the midst of the uncertainties of life, some feeling that protection might be within reach was essential. Men depended on the gods for reassurance in a capricious universe.
December 25, 2024 at 11:04 PM