Foundation Models

Foundation Models (FMs) are large deep learning neural networks trained on vast amounts of data. They are also known as “general-purpose AI” (GPAI) because they are designed to be general purpose and to serve as the starting point for further AI development. Common forms of foundation models include large language models (LLMs) and generative AI models. FMs differ from traditional ML models, or “narrow AI” models, in that they are pretrained on a wide range of tasks and data, allowing them to perform a variety of tasks unlike the specialized models of the past....

July 12, 2024 · 2 min · 293 words · Xavier Loera Flores

Retrieval-Augmented Generation

Retrieval-Augmented Generation (RAG) is a process that optimizes the output of LLMs so that they utilize and reference a specific knowledge base or domain that may not have been included in the LLM’s training data. RAG can be seen as a cost-efficient way to extend an LLM’s abilities with an organization’s knowledge base while improving its outputs so that they remain relevant, accurate, and useful in various contexts, without needing to retrain the LLM....
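The retrieve-then-augment loop described above can be sketched in a few lines. This is a minimal, hypothetical toy example (the knowledge base, keyword-overlap retriever, and prompt template are all assumptions, not from the post); real RAG systems typically use vector embeddings and a retrieval index instead of keyword matching.

```python
import re

# Toy knowledge base standing in for an organization's documents.
KNOWLEDGE_BASE = [
    "Our return policy allows refunds within 30 days of purchase.",
    "Support is available Monday through Friday, 9am to 5pm.",
]

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str) -> str:
    """Return the document sharing the most tokens with the query.
    A naive stand-in for embedding-based vector search."""
    return max(KNOWLEDGE_BASE, key=lambda doc: len(tokens(query) & tokens(doc)))

def build_prompt(query: str) -> str:
    """Augment the user query with retrieved context before it
    would be sent to the LLM for generation."""
    context = retrieve(query)
    return f"Context: {context}\n\nQuestion: {query}\nAnswer using the context."

prompt = build_prompt("What is the return policy?")
print(prompt)
```

The key design point is that only the prompt changes at inference time: the knowledge base can be updated freely, and the model itself is never retrained.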

July 2, 2024 · 3 min · 612 words · Xavier Loera Flores