
Intelligent software, healing environments

A human-centred perspective into thinking about buildings

Health and safety in the built environment is an urgent, ubiquitous, and costly issue: in 1999 the National Academy of Sciences reported that annual fatal accidents in US hospitals outnumbered deaths in motor vehicle accidents. There is significant potential for information technology (IT) solutions to assist architects and hospital directors in enhancing health and safety: computer-aided architectural design (CAAD) software tools can support decision-making through automatic analysis of building floorplans and software simulations of daily operations.

However, current CAAD tools cannot automatically check essential safety requirements involving people (e.g. how they move, what they see and hear) and the activities they engage in, such as:

  • "Doors should not open onto areas where a person may be occupied with some activity, such as washing hands at a sink."
  • "The oncology ward must be safely and easily accessible to a patient who is visually or mobility impaired."

The core issue is that current CAAD tools only consider the geometry and structure of buildings: the positions and lengths of walls, doors, corridors, and so on. Concepts of people, perception, and activities are completely absent from current CAAD building models, so current CAAD safety-analysis tools cannot incorporate scientific findings about health and safety from architectural design research (i.e. evidence-based healthcare design).

In this DFF FPT1 project, PhD candidate Beidi Li, Carl Schultz (PI), and Peter Gorm Larsen (PI) are directly addressing these issues by developing novel decision-support software tools, with real-world impact, for analysing the health and safety of public buildings (risk identification, assessment, and mitigation): specifically, two large hospitals and a large research and learning facility.

Through three key partners, this project has:

  • high interdisciplinarity: bringing together expert partners from both (a) architectural engineering, and (b) software engineering, information technology, and artificial intelligence;
  • real-world impact: conducting use cases on real-world, large scale buildings including the newly opened Urban Sciences Building (53 million GBP construction cost), and the New Parkland hospital ($1.3 billion US construction cost);
  • industry outreach: direct contacts with the architecture firms New Parkland (US), C.F. Møller (Denmark), and Hawkins/Brown (UK).

Theoretical foundations
Contemporary Computer-Aided Architectural Design (CAAD) systems explicitly represent objects such as doors, walls, and slabs based on a domain-specific data model referred to as the Building Information Model (BIM), using a standard range of geometric constructs: points, line segments, polygons, meshes, and other complex aggregates of basic geometric primitives. Building analysis and advanced simulation are conducted on BIMs with highly specialised algorithms for ensuring structural consistency and for investigating aspects such as access, lines of sight, egress, energy, air flow, and cost estimation.

However, state-of-the-art CAAD systems and advanced structural engineering and simulation methods lack a human-centred, semantic perspective of the building: they retain their essentially structure-centred, "geometric" character. Building data models have no concepts of "people" or perceptual locomotion, making it impossible to express human-centred constraints about health, safety, and well-being that are based on e.g. visibility, movement, and function (whereas positional and structural constraints about doors, walls, etc. can be easily expressed). Thus, decades of research in cognitive psychology, evidence-based design, and architectural engineering are completely disconnected from state-of-the-art CAAD tools.

A further critical computational challenge is constraint checking large industry-scale building models (e.g. thousands of BIM objects, over 1 million geometric mesh faces) across an enormous number of combinations of situations: different people (wheelchair users, hearing impaired, etc.), activities (washing hands, moving, etc.), and contexts (time of day, (un)crowded corridors, etc.). We address these limitations by (1) modelling the semantics of empty space, and (2) tightly integrating geometric constraint solving into logic programming languages.
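The scale of the combinatorial challenge can be sketched concretely. In the following Python fragment, the catalogues of person types, activities, and contexts are purely illustrative (real models distinguish many more cases), yet even these modest numbers multiplied over thousands of BIM objects yield hundreds of thousands of object-level checks:

```python
from itertools import product

# Illustrative catalogues only; not drawn from an actual BIM taxonomy.
people = ["ambulant", "wheelchair_user", "visually_impaired", "hearing_impaired"]
activities = ["moving", "washing_hands", "operating_door", "waiting"]
contexts = ["day_uncrowded", "day_crowded", "night_uncrowded", "night_crowded"]
n_bim_objects = 5_000  # a large hospital model easily reaches this

# Every (person, activity, context) combination is a distinct situation
# in which constraints may have to be re-evaluated.
situations = len(list(product(people, activities, contexts)))
checks = situations * n_bim_objects
print(situations, checks)  # 64 situations -> 320,000 object-level checks
```

This is why brute-force evaluation over full mesh geometry is infeasible, and why the search-space pruning discussed below matters.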

The shape of empty space
Consider the region of empty space around an object such as a laptop or washbasin: this region is meaningful because a person must be located in it to perform a particular act (e.g. washing hands). The geometry of this functional space depends on properties of the person (consider wheelchair users, children, etc.), the task, and the object. Doors have an operational space required for opening and closing; people and sensors have range spaces (which can be further refined into visibility space, hearing space, etc.); and so on. These are examples of spatial artefacts: regions of empty space that are rich with perceptual-locomotive semantics.
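One way to read this idea in data-model terms is to treat an artefact as a named, typed region owned by an object or person. The sketch below is a minimal illustration with hypothetical class names, not an existing BIM API; regions are simplified to axis-aligned boxes, whereas a real system would use polygons or 3D meshes:

```python
from dataclasses import dataclass, field

# Simplified region: an axis-aligned box (xmin, ymin, xmax, ymax).
Region = tuple

@dataclass
class SpatialArtefact:
    kind: str        # "functional", "operational", "range", "visibility", ...
    region: Region   # geometry of the empty-space region
    owner: str       # the object or person the artefact belongs to

@dataclass
class BuildingObject:
    name: str
    geometry: Region
    artefacts: list = field(default_factory=list)

basin = BuildingObject("washbasin_01", (2.0, 2.0, 2.6, 2.5))
# Functional space: where a person must stand to wash their hands.
basin.artefacts.append(
    SpatialArtefact("functional", (2.0, 1.2, 2.6, 2.0), basin.name)
)
print(basin.artefacts[0].kind)  # functional
```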

Key Contribution: Our key insight is to introduce spatial artefacts as first-class objects within building data models, on the same ontological level as material objects such as walls and doors. By augmenting BIM with these primitive building blocks of perceptual locomotion, we can readily formalise human-centred qualitative constraints, e.g. for safety: "The functional space of the washbasin must not intersect with the operational space of the door".

Key Contribution: Provide a unifying formal framework for checking human-centred architecture design constraints that overcomes the challenges of BIM scale and combinatorial explosion. We will implement these augmented building data models in an advanced logic programming language from artificial intelligence: Answer Set Programming extended to support spatial reasoning, ASPMT(QS). ASP is specifically designed to handle NP-hard problems and offers many advanced search-space pruning features (e.g. DPLL with learnt constraints, back-jumping, symmetry breaking). Our extension will enable the seamless integration of semantic-level search-space pruning (e.g. hierarchical decomposition of buildings) and highly efficient geometric constraint solving (reasoning about polyhedra, 3D meshes, moving points, etc.).
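The effect of semantic-level pruning via hierarchical decomposition can be illustrated with a toy example: artefact pairs are compared only when the bounding boxes of their rooms already overlap, so geometric tests between artefacts in disjoint rooms are never run. All room and artefact geometry below is invented for illustration:

```python
from itertools import combinations

def boxes_intersect(a, b):
    # Axis-aligned boxes as (xmin, ymin, xmax, ymax).
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# (room, artefact name, artefact geometry) -- illustrative data.
artefacts = [
    ("ward_a", "basin_fs", (2, 1, 3, 2)),     # washbasin functional space
    ("ward_a", "door_os",  (2, 0, 4, 1.5)),   # door operational space
    ("ward_b", "door_os",  (21, 0, 23, 2)),
]
room_box = {"ward_a": (0, 0, 10, 10), "ward_b": (20, 0, 30, 10)}

naive = pruned = 0
violations = []
for (ra, na, ga), (rb, nb, gb) in combinations(artefacts, 2):
    naive += 1
    if not boxes_intersect(room_box[ra], room_box[rb]):
        continue  # hierarchical pruning: disjoint rooms cannot share a violation
    pruned += 1
    if boxes_intersect(ga, gb):
        violations.append((na, nb))

print(naive, pruned, violations)  # 3 pair checks naively, only 1 after pruning
```

In a full-scale model with thousands of objects, this kind of decomposition (here a single room level; in practice a hierarchy of storeys, zones, and rooms) is what keeps exhaustive constraint checking tractable.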

Patient visibility space (red region), investigating privacy and visual access to outdoors. Figure: Carl Schultz, AU.
Taking the functional space of objects (blue regions) into account when simulating agent paths through the environment. Figure: Carl Schultz, AU.
Washbasin functional space (blue region) intersects door operational space (red region). Figure: Carl Schultz, AU.
Paths through campus (blue line) interacting with shadows and sunlight (yellow region). Figure: Carl Schultz, AU.
Hazards (e.g. steep ramps) on paths (red line) computed from wheelchair movement spaces. Figure: Carl Schultz, AU.