Thoughts

Beyond Softmax: Probabilistic Foundations and Bayesian Frameworks in Hybrid Search

Introduction: In our previous exploration of probability transformations in vector search, we examined how softmax enables the normalization of disparate scoring systems into comparable probabilistic frameworks. This follow-up article delves deeper into the mathematical theory underpinning these transformations, with a specific focus on Bayesian probabilistic frameworks and their application to…

By J

Thoughts

Computational Graph Logging and Differential Analysis for LLM Function Extraction

Abstract: This research essay explores a novel approach to understanding Large Language Models (LLMs) through computational graph logging and differential analysis. We propose treating LLMs as complex mathematical functions and extracting their functional behavior by systematically logging kernel operations during inference. Our approach introduces two key innovations: (1) probabilistic kernel…

By J

Thoughts

Computational Capabilities of Large Language Models: The Universal Approximation Theorem

Introduction: The emergence of Large Language Models (LLMs) has prompted fundamental questions about their computational capabilities within the theoretical landscape of computer science. While these models demonstrate remarkable linguistic abilities, their precise classification within the hierarchy of computational systems remains an area of active exploration. Simultaneously, the Universal Approximation Theorem…

By J