7:50 am Chair’s Opening Remarks

Quality Assurance & Standards

8:00 am Driving Internal Awareness of Computation & Setting Training Expectations to Accelerate Adoption


• Building internal knowledge and understanding of how computation can be applied so architects and engineers want it on their projects
• Budgeting time for training and setting expectations so designers are motivated to learn it for themselves
• Creating training resources that factor in and support the different learning abilities of a diverse range of staff

9:20 am Optimizing the Extent of Standardization in Computation to Ease Sharing of Work & Quality Assurance Without Hindering Creativity & Innovation

  • Elliot Glassman Senior Associate, National Leader of Computational Design for Building Systems, WSP


• Establishing minimum standards to ensure work done can be understood by project partners and other members of staff
• Identifying the line beyond which standardization limits the ability of designers to explore new solutions and push boundaries: when do you allow this line to be crossed?
• Exploring how standards should be updated to keep pace with changing technology and best practices, without being revised so often as to cause confusion

10:00 am Morning Refreshments

10:30 am Implementing Time-Effective Quality Assurance Mechanisms for Computation & Digital Solutions to Reduce Risk

  • Sean Page Partner, Computational Designer, Architect, RDG Planning & Design


• Exploring how to quality assure an algorithm or script to ensure that the designs being generated can be trusted
• Outlining quality assurance mechanisms for dashboards, documentation and other digital assets being shared with clients and design partners
• Ensuring quality assurance mechanisms are time-effective and do not undermine the time-saving benefits of using computation in the first place

11:10 am Identifying Which Tasks Should be Automated to Maximize ROI & Gain Buy-In to Computation


• Working with your people and assessing data to determine which tasks are most repetitive and would benefit most from automation
• Hearing what data and tools are needed to enable automation: the steps you need to take
• Using automation as an introduction to computation, stepping non-technical staff towards other applications

11:50 am Using Automation to Update BIM Models & Digital Twins Through Design, Fabrication, Construction & Facility Management

  • Scott Overall Senior Associate, Computational Design, SHoP Architects


• Understanding what data and level of detail needs to be included in the model
• Appreciating the role of computation and automation in keeping models up to date to ensure the digital deliverable aligns with the built asset
• Defining the business case for digital twins and automation: why should your client pay for this?

12:30 pm Lunch

1:30 pm Collecting, Cleaning & Improving the Quality of Your Data to Prepare for Machine Learning Functionality


• Understanding what data you need to inform the machine learning algorithms
• Developing formatting standards so data can be cleaned and integrated in a form that machine learning models can interpret
• Overcoming the challenges of data ownership and access: forging collaborations across companies to broaden the data pool available

2:10 pm Establishing the Business Case & Internal Skillset to Develop In-House Software to Step Up Computation Beyond the Limitations of 3rd Party Tools

  • Will Wang Design + R&D Lead, EDG Architecture & Engineering


• Understanding the rationale for creating an in-house team of developers who can write Python code and create in-house computation solutions: what are the capabilities beyond ‘off the shelf’ tools?
• Hearing how to hire the right people, set expectations and integrate software developers into the organizational structure of a traditional design firm
• Assessing whether internally developed solutions should be proprietary or open source: what are the pros and cons of each approach?

2:50 pm Panel: Benchmarking the Functionality, Ease of Use & Interoperability of Tools to Determine Which You Should Use for Each Task


• Reviewing the use cases and limitations of popular tools such as Grasshopper and Dynamo to ensure you’re maximizing the value of your core tools
• Revealing plug-ins and niche tools that have been proven to add value or offer interesting functionality that could be applied to your future projects
• Exploring the ease of use and interoperability of each tool: how do you integrate it into your existing technology suite, processes and people?

3:30 pm End of Conference