A parallel timeline: the rise of large language models
While agent frameworks evolved, a parallel and convergent revolution was happening in natural language processing and machine learning:
- 2017 – transformers: The paper Attention Is All You Need (Vaswani et al. 2017) introduced the transformer architecture, which dramatically improved how machines process and generate language.
- 2022 – ChatGPT: OpenAI released a chat-based interface to GPT-3.5 called ChatGPT, which enabled natural, interactive conversation with a general-purpose AI system.
- 2023 – open source LLMs: The releases of Llama, Falcon, and Mistral made powerful models widely accessible and accelerated the development of agent frameworks in open source and enterprise environments.
These innovations turned language models into reasoning engines capable of parsing context, planning actions, and chaining responses. As a result, LLMs became key enablers of intelligent software agents.