What is it about?
Smart buildings are equipped with thousands of sensors that monitor everything from temperature to air quality. However, interacting with these systems usually requires specialized technical knowledge or complex software dashboards. While consumer voice assistants like Alexa or Google Home are great for basic tasks, they lack a deep understanding of the unique, complex layouts and sensor networks of commercial smart buildings. This paper introduces the "Talking Buildings" chatbot framework. It allows anyone to type a natural question—such as, "How does this smart building monitor air quality in the east zone?" The system uses Artificial Intelligence to instantly translate this English question into a specific computer database query, retrieves the exact data from the building's sensors, and then uses another AI model to write a clear, conversational summary back to the user. This makes interacting with a complex building as easy as texting a friend.
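The three-stage pipeline described above (question to query, query to data, data to summary) can be sketched in a few lines of code. This is a minimal illustration only: the paper's system uses fine-tuned transformer models and queries a real building model, whereas here both AI stages are replaced with simple keyword rules and the sensor database is a hard-coded dictionary. All sensor names and readings below are invented.

```python
# Illustrative sketch of a "Talking Buildings"-style pipeline.
# Every name and value here is invented for demonstration purposes.

# Stand-in for the building's sensor database.
SENSOR_DB = {
    ("air_quality", "east_zone"): {"sensor": "AQ-East-01", "co2_ppm": 612},
    ("temperature", "east_zone"): {"sensor": "T-East-04", "celsius": 21.5},
}

def question_to_query(question: str) -> tuple:
    """Stand-in for the NL-to-query model: map keywords to a lookup key."""
    q = question.lower()
    metric = "air_quality" if "air quality" in q else "temperature"
    zone = "east_zone" if "east" in q else "west_zone"
    return (metric, zone)

def execute_query(key: tuple) -> dict:
    """Stand-in for running the generated query against the building's data."""
    return SENSOR_DB.get(key, {})

def summarise(question: str, result: dict) -> str:
    """Stand-in for the response-generation model that writes the reply."""
    if not result:
        return "No matching sensor data was found."
    details = ", ".join(f"{k}={v}" for k, v in result.items())
    return f"Answer to '{question}': {details}"

question = "How does this smart building monitor air quality in the east zone?"
print(summarise(question, execute_query(question_to_query(question))))
```

The point of the sketch is the division of labour: one model translates the question, a query engine fetches the data, and a second model phrases the answer conversationally.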
Why is it important?
As smart buildings scale, the complexity of their data management systems creates a barrier for everyday users and facility managers alike. This work is timely because it bridges the gap between advanced natural language processing and the physical built environment. What makes the research distinctive is its methodology: rather than relying on generic, cloud-based AI that guesses at answers, it combines transformer models (BART and T5) with a strict semantic blueprint of the building (the Brick Schema), translating human intent directly into precise database queries (SPARQL). As the foundational research that established this pipeline, paving the way for future frameworks such as OntoSage, the paper provides a roadmap for researchers and developers. It shows that building data can be democratized, allowing anyone to monitor real-time information and detect anomalies without needing a degree in computer science.
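To make the Brick-plus-SPARQL idea concrete, a query for "air quality sensors in the east zone" might look like the fragment below. The `brick:` prefix and the `Air_Quality_Sensor` class and `isPointOf` relationship come from the Brick Schema; the `bldg:` namespace and `East_Zone` name are illustrative placeholders, since the queries the system generates depend on the specific building's Brick model.

```sparql
PREFIX brick: <https://brickschema.org/schema/Brick#>
PREFIX bldg:  <http://example.org/building#>

SELECT ?sensor WHERE {
  ?sensor a brick:Air_Quality_Sensor ;
          brick:isPointOf bldg:East_Zone .
}
```

Because the query is grounded in the building's own semantic model rather than free-text guessing, the answer refers to real, identifiable sensors.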
Perspectives
Developing the "Talking Buildings" framework highlighted a fundamental flaw in how we currently interact with smart environments: consumer-grade AI simply does not understand architectural context. A major takeaway from this project was realizing that a chatbot cannot effectively manage a building unless it possesses a semantic understanding of the physical space—knowing exactly how a specific sensor connects to a specific room. My personal perspective is that the future of Human-Building Interaction (HBI) relies on anchoring AI to these structured knowledge bases (ontologies). By forcing the AI to query the building's specific digital blueprint rather than generating answers from the broader internet, we ensure accuracy and build user trust. Ultimately, giving a building a "voice" transforms it from a passive, rigid structure into a collaborative partner that actively helps occupants live and work more comfortably.
Mr. Suhas Prakash Devmane
Cardiff University
Read the Original
This page is a summary of: Talking Buildings: Interactive Human-Building Smart-Bot for Smart Buildings, November 2024, Springer Science + Business Media, DOI: 10.1007/978-981-96-0579-8_28.