AWS re:Invent 2025 - Symbolic AI in the age of LLMs (DAT443)
Symbolic AI in the Age of Large Language Models (LLMs)
Overview
The presentation covers the history, key elements, and modern applications of symbolic AI, and its relationship to the recent advancements in non-symbolic AI, particularly large language models (LLMs).
Symbolic AI, also known as "good old-fashioned AI" (GOFAI), has been around for over 80 years, going through cycles of excitement and disappointment.
While LLMs have gained significant attention, the speaker argues that symbolic AI techniques remain highly relevant and can be combined with LLMs to address their limitations.
History and Evolution of AI
Over its 80-plus-year history, the field has alternated between "AI summers" and "AI winters" as expectations have repeatedly risen and fallen.
Many technologies that were once considered AI, such as rule-based systems and heuristic search, have now become mainstream and are no longer viewed as AI.
The field of AI has also given rise to new programming paradigms, such as functional programming and logic programming.
Symbolic AI Concepts
Symbolic reasoning and logical reasoning are core elements of symbolic AI, which uses mathematical logic and set theory to represent and reason about the world.
Knowledge graphs and ontologies are prominent manifestations of symbolic AI today, allowing for the capture and organization of knowledge in a structured, machine-readable format.
Ontologies consist of concepts, properties, relationships, logical constraints, and individuals, and are defined using standardized languages such as RDFS and OWL, which build on the RDF data model.
Ontologies capture the semantics of a domain, enabling the operationalization and automation of data processing, the uncovering of implicit information, and serving as documentation.
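As an illustration of "uncovering implicit information", the following is a minimal sketch (not from the talk; all names like ex:Dog are invented) of a tiny ontology stored as RDF-style triples, with a naive inference loop applying two RDFS entailment rules to a fixpoint:

```python
# Illustrative sketch: a small class hierarchy plus one individual,
# expressed as subject-predicate-object triples.
SUBCLASS = "rdfs:subClassOf"
TYPE = "rdf:type"

triples = {
    ("ex:Dog", SUBCLASS, "ex:Mammal"),
    ("ex:Mammal", SUBCLASS, "ex:Animal"),
    ("ex:Rex", TYPE, "ex:Dog"),
}

def infer(kg):
    """Apply two RDFS entailment rules until nothing new is derived:
    (1) subClassOf is transitive;
    (2) members of a subclass are also members of its superclasses."""
    kg = set(kg)
    while True:
        new = set()
        for (s, p1, o1) in kg:
            for (s2, p2, o2) in kg:
                if o1 == s2:
                    if p1 == SUBCLASS and p2 == SUBCLASS:
                        new.add((s, SUBCLASS, o2))
                    elif p1 == TYPE and p2 == SUBCLASS:
                        new.add((s, TYPE, o2))
        if new <= kg:
            return kg
        kg |= new

closure = infer(triples)
# The implicit fact "Rex is an Animal" was never stated explicitly;
# the reasoner derives it from the class hierarchy.
print(("ex:Rex", TYPE, "ex:Animal") in closure)  # True
```

Production reasoners (e.g. those attached to triple stores) implement far more entailment rules and do so efficiently, but the principle is the same: explicit semantics in the ontology let a machine derive facts that were never written down.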
Working with Ontologies
When building a knowledge graph, organizations can adopt an existing ontology as-is, extend an existing ontology, or create a new ontology from scratch.
Assessing the suitability of an existing ontology involves considering use cases, competency questions, coverage, ontological commitments, and expressive power.
The standards in this space serve different purposes: OWL follows the open-world assumption and supports logical inference, while SHACL follows a closed-world assumption and is used to validate data against constraints; both build on RDF.
Techniques like class hierarchies, concept schemes, and reasoning can be used to model and work with ontologies effectively.
Symbolic AI and LLMs
LLMs have introduced new challenges, such as hallucinations, anthropomorphism, and computational inefficiency, which can be addressed by combining symbolic AI techniques with LLMs.
Symbolic AI can provide correct answers, accountability, trustworthiness, and explainability, which are important features for many real-world applications.
Hybrid "neuro-symbolic" approaches that combine symbolic and non-symbolic techniques are emerging as a promising direction, leveraging the strengths of both paradigms.
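One common neuro-symbolic pattern is using a curated knowledge graph as a guardrail for model output. The sketch below (not the speaker's code; the model call is a stub and the facts are illustrative) keeps only the claims an LLM makes that the symbolic store can verify, discarding a hallucinated one:

```python
# A curated, trusted knowledge graph acts as the symbolic component.
knowledge_graph = {
    ("Amazon Neptune", "isA", "graph database"),
    ("SPARQL", "queries", "RDF data"),
}

def mock_llm_extract(_question):
    """Stand-in for a real model call; the second triple is a
    deliberate 'hallucination' not supported by the graph."""
    return [
        ("Amazon Neptune", "isA", "graph database"),
        ("SPARQL", "isA", "programming language"),
    ]

def grounded_answers(question, kg):
    """Keep only model claims the symbolic store can verify."""
    return [t for t in mock_llm_extract(question) if t in kg]

print(grounded_answers("What is Neptune?", knowledge_graph))
# [('Amazon Neptune', 'isA', 'graph database')]
```

In a real system the stub would be an LLM extracting candidate triples from generated text, and the filter step is what supplies the accountability and explainability the purely neural component lacks: every surviving claim can be traced back to a statement in the graph.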
Key Takeaways
Symbolic AI has a long history and continues to be relevant, even in the age of LLMs.
Ontologies and knowledge graphs are powerful tools for capturing and organizing knowledge, enabling data integration, and supporting reasoning and automation.
Combining symbolic AI techniques with LLMs can help address the limitations of LLMs, such as hallucinations, lack of accountability, and computational inefficiency.
Hybrid "neuro-symbolic" approaches that integrate symbolic and non-symbolic AI are a promising direction for future AI systems.