Day 1

I wasn’t always into AI, but that has changed. I am a big user of GPTs, though; mainly I used them to get an understanding of a scientific research paper or to rephrase a mail or message. The turning point came when a YouTube video kindled the connection to an unfinished project from a previous gig. I had been searching for good videos that explain these relatively newer trends. I had seen some short videos on GPTs, LLMs, and RAG, but never gotten too deep into them.

At a former company, we were building a search service that parsed documents (PDFs, mails, messages, logs, etc.) so consumers could find relevant documents through keyword search and link them to an entity. Essentially, it would have created a knowledge graph. That was the pre-LLM era. Now GraphRAG can be used to do this.

I recommend watching GraphRAG: The Marriage of Knowledge Graphs and RAG by Emil Eifrem. Emil explores the evolution from early search engines to Google’s PageRank algorithm, which grew into the Knowledge Graph: things, not strings. The next big thing is GraphRAG, combining knowledge graphs with LLMs through structured relationships. This got me interested enough to go deep into the workings of the new gen AI (not the GenAI, maybe). Emil also suggested three research papers; I’ll put them up in a later post.
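To make the “knowledge graph + LLM” idea concrete for myself, here is a minimal sketch of the retrieval half of GraphRAG: instead of matching keywords, walk a tiny hand-made graph of triples and hand the connected facts to an LLM as grounded context. All triples, names, and the question below are made up for illustration; real GraphRAG pipelines do this at scale over an actual graph database.

```python
# Minimal GraphRAG-style retrieval sketch: facts live as (subject,
# predicate, object) triples; retrieval follows graph edges outward
# from an entity in the question, then the facts ground the prompt.

TRIPLES = [
    ("GraphRAG", "combines", "knowledge graphs"),
    ("GraphRAG", "combines", "LLMs"),
    ("knowledge graphs", "store", "entities and relationships"),
    ("PageRank", "evolved into", "Google Knowledge Graph"),
]

def neighborhood(entity, triples, hops=1):
    """Return every triple reachable within `hops` edges of `entity`."""
    frontier, found = {entity}, []
    for _ in range(hops):
        next_frontier = set()
        for s, p, o in triples:
            if (s in frontier or o in frontier) and (s, p, o) not in found:
                found.append((s, p, o))
                next_frontier.update((s, o))
        frontier = next_frontier
    return found

facts = neighborhood("GraphRAG", TRIPLES)
context = "\n".join(f"- {s} {p} {o}." for s, p, o in facts)

# The structured facts, not raw keyword hits, become the LLM's context:
prompt = f"Answer from these facts:\n{context}\n\nQ: What is GraphRAG?\nA:"
print(prompt)
```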

Running LLMs locally

I briefly explored how to run LLMs locally as well. I think Ollama is the choice. I have a machine with 6 GB of VRAM, which might fit llama3.2:1b (1b = 1 billion parameters). Space needed for the weights and KV cache would be ~3.5 to 4 GB. I will try it out and let you know!
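Before that, a quick back-of-the-envelope check on the estimate. This is just a sketch: the model-shape numbers below (~1.23B parameters, 16 layers, 8 KV heads, head dim 64) are my assumptions from the published Llama 3.2 1B model card, not measurements.

```python
# Rough VRAM estimate for llama3.2:1b at FP16.
params = 1.23e9                  # ~1.23 billion parameters (assumed)
bytes_per_weight = 2             # FP16 = 2 bytes per weight
weights_gb = params * bytes_per_weight / 1e9

layers, kv_heads, head_dim = 16, 8, 64   # assumed model shape (GQA)
context_tokens = 8192
# KV cache: 2 tensors (K and V) per layer, per token, at FP16
kv_gb = (2 * layers * kv_heads * head_dim
         * bytes_per_weight * context_tokens / 1e9)

print(f"weights  ~{weights_gb:.2f} GB")   # ~2.46 GB
print(f"KV cache ~{kv_gb:.2f} GB")        # ~0.27 GB
print(f"total    ~{weights_gb + kv_gb:.2f} GB")
```

That lands around 2.7 GB at FP16; runtime overhead and activations push it toward the ~3.5 to 4 GB figure, and Ollama’s default 4-bit quantization would shrink the weights well below that. Once Ollama is installed, `ollama run llama3.2:1b` pulls the model and drops you into a chat.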

AI Deep Dive: The Game Plan

Okay, time to get serious. I’m not just interested in using AI anymore; I want to crack open the hood and see how the engine works. Specifically, I want to understand the core stack that makes LLMs possible: Machine Learning (the foundation), Deep Learning, and the Transformer architecture. To do this, I need to rebuild the toolkit, starting with The Math (gotta get comfortable with the language it’s all built on): Calculus (differentials and integrals), Linear Algebra (matrices, vectors, etc.), and Probability & Stats (box and whiskers - all the whisking).

And The Code: Time to reunite with an old friend: Python. I haven’t touched it since 2019. Zero panic, just a fact. It’s time for a major brush-up. Libraries like NumPy, Pandas, and PyTorch are waiting.
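As a tiny warm-up (my own sketch, nothing canonical), here is the math list and the library list shaking hands: a matrix-vector product in NumPy for the linear algebra, and a derivative computed by PyTorch’s autograd for the calculus.

```python
# Brush-up sketch: linear algebra with NumPy, calculus with PyTorch.
import numpy as np
import torch

# Linear algebra: y = Wx (matrices and vectors)
W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([0.5, -1.0])
print("Wx =", W @ x)                    # [-1.5, -2.5]

# Calculus: f(x) = x^2, so df/dx at x = 3 should be 2x = 6
x_t = torch.tensor(3.0, requires_grad=True)
(x_t ** 2).backward()                   # autograd does the differentiation
print("df/dx =", x_t.grad)              # tensor(6.)
```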

Planning to dedicate two days a week to AI out of the extremely busy schedule of my sabbatical 😄 (10 months and counting!). On other days I am exploring computer science things like composing data structures, OS, and some backend stuff, plus documentaries like World at War, Cosmos, etc.

Will keep posting! Peace ☮️