From BERT Theory to Real-World RAG
My experience building production chatbots with RAG architecture
Production RAG chatbot with 75%+ accuracy, built with AWS CDK, QuickSight, and Claude Sonnet 4.
Scalable React + TypeScript portal with CI/CD automation and image processing.
Integrated Redshift as an alternative knowledge base source for a POC chatbot.
Built from scratch with Playwright E2E testing and VAPT compliance.
Chatbot with multi-model selection, reused across subsequent demos.
Cross-team delivery with postMessage auth POC and team mentoring.
A deep dive into the BERT architecture, exploring the Transformer encoder and the multi-head self-attention mechanism
A practical guide to labeling sentiment polarity for ML annotation
A white-box conceptual view of how BERT works, what it learns during pretraining, and why it fits code-mixed text
Exploring why pretrained language models like BERT are practical when labeled data and compute are limited