From BERT Theory to Real-World RAG
nlp · bert · rag
My experience building production chatbots with RAG architecture
Sharing my journey in front-end engineering, best practices, and lessons learned across different industries. Plus occasional musings on hobbies and life beyond code.
A deep dive into the BERT architecture, exploring the Transformer encoder and the multi-head self-attention mechanism.
A white-box conceptual view of how BERT works, what it learns during pretraining, and why it is a good fit for code-mixed text.
Exploring why pretrained language models like BERT are practical when labeled data and compute are limited.