The advent of the ‘Information Age’ was predicted as early as the 1970s, when sociologists such as Webber and Rittel foresaw monumental changes in society due to an increased emphasis on data exchange and the sharing of information across many aspects of human life. Naturally, the advent of the ‘Information Age’ has had its impact on the architectural profession as well as on urban design. Fast forward 40+ years (during which we adopted PCs, the internet, and social media via smartphones and tablets) and one can see that contemporary architects apply entirely different methods to source, manipulate, and share project-related information than they did in the past.
Technology has been one key enabler in this transition, serving architects as an instrument to facilitate novel ways of engaging with design-related information. If in the 1980s and 1990s designers used computational methods mainly to expand or rationalise their design representation capabilities (via 2D CAD and 3D renders), the new millennium has seen the progressive use of design technology to explore design ideas parametrically, applying tools such as McNeel’s Rhino/Grasshopper (or similar), and to coordinate detailed building component and system information virtually via collaborative platforms through Building Information Modelling (BIM).
Hand in hand with the proliferation of these methods for design exploration and delivery has come an increase in data interoperability across different software applications. Interoperability between software of similar purpose (such as from Autodesk’s Revit to Graphisoft’s ArchiCAD) is a rather mundane challenge that users should not have to struggle with (but often still do), whereas interoperability across tools of different purpose holds far greater promise.
Project information is now readily accessible by anyone (with the right permission) at any time and from anywhere (via the cloud or other means of collaboration). What this means for designers is that they can increasingly make informed decisions on design changes based on feedback from performance analysis and simulation that ties directly to their ‘live’ design and delivery models. We are moving from the ‘Information Age’ to the ‘Age of Validation’.
The key difference is the increasing use of models ‘for’ exploring design concepts instead of them serving mainly as representational models ‘of’ a project. Sound judgements about design trends can now be made without second-guessing or reliance on gut feeling, particularly on larger or more complex projects. The information that can be associated with the digital models we use to shape our ideas offers useful feedback which can substantially impact the design process. The principle behind this morphogenetic approach to design is not new, and it has gone through at least two decades of development, yet its consolidation across a great number of otherwise individual and distinct performance assessment (or other) applications is unprecedented.
A quick audit of auxiliary applications tying into McNeel’s Grasshopper software reveals several hundred applications, mostly based on open-source software developments. They now complement Grasshopper’s core functions with a near limitless tool infrastructure for validating a great number of design aspects. Similarly, on the delivery side, data from BIM tools such as Revit, ArchiCAD or others can increasingly be exchanged with analytical tools for cost validation, environmental sustainability optimisation, construction programming, as well as building system automation and asset tracking for Facilities Management. The interfacing capabilities among these applications are progressing, and the technologies enabling designers (and clients) to interact with them are symptomatic of our increased appetite for validation.
It is not only architects who validate; for engineers it is their bread and butter: they must ensure their scientific calculations stack up to norms and regulations, ultimately ensuring structural integrity, environmental comfort and other key performance criteria. The developments described above allow them to step away from a role as facilitators of a design predefined by the architect, towards greater involvement in the design process and even co-authorship. Engineers and others can become more proactive in their engagement with designers, who can now draw them into a more informed conversation.
Alongside the engineers, contractors use the increased opportunities of information integration to validate not only the detailed construction/installation information they receive from the various trades they work with, but also the coordination of the construction programme based on feedback about the movement of goods and human resources from their supply chain. In their case it is 4D (time) and 5D (cost) data extraction from BIM that helps them to de-risk their construction coordination. On site, contractors now use laser set-out to validate the positioning of construction elements against their ‘virtual counterparts’ in a 3D BIM.
The onslaught of validation does not stop there. Some industry commentators argue that the stakeholder group with the highest potential to benefit from validating the added information content on medium- to large-scale architectural projects is the client. Clients and their Facilities Managers are only just waking up to the possibilities that can be expected to emerge out of well-formatted BIM from their supply chain. The more they learn how to tie that information to relevant benchmark data from their core business, the more they will be enabled to gain value from the information inherent to BIM and Computer Aided Facilities Management (CAFM) in operating and maintaining their assets. Beyond that, validation of benchmark data across their portfolio will help them to make more informed decisions on future developments, whether they remain ‘owner/operator’ of any given asset or plan to sell it off after completion.
On an even larger scale, societies have much to gain from validating the information stemming from the built environment. Government agencies pursuing e-commerce and the digital economy can take data stemming from the built environment and cross-reference it against a broad range of environmental, societal and macro-economic activities. We are only beginning to ask the right questions about how ‘Big Data’ may be interrogated for the common good and what it may mean for architects.
These opportunities do not come without a warning. One might easily mistake the potential of information sharing and validation for an infallible approach to achieving highly informed, and thereby the best, design outcomes. Questions arise as to whether the role of the architect will be reduced to that of a mere filter of quantifiable data. There is a simple response to such a thought: good design will never stem entirely from consensus and a democratic process of informed decision-making. A strong reliance on quantifiable data that is not critically interpreted by experts will simply yield ‘garbage in, garbage out’ results. Qualitative aspects such as an architect’s imagination and experience will remain essential for dealing intuitively with highly complex design concepts, juggling a range of options while considering a great number of inputs. Being able to quickly validate assumptions and adjust one’s thinking accordingly will become increasingly helpful the more information we are able to capture for validation in the first place.
What will change for architects is that they will increasingly be expected to engage with the value proposition of design aspects that can be validated in a quantifiable fashion. Instead of being defensive about this prospect, they may as well start looking for ways to complement their existing design strengths with the added opportunities made available in this ‘Age of Validation’. New partnerships will form, and new types of practices are likely to gain traction with members who embrace the ‘Age of Validation’ and turn its challenges to their advantage.