Hands-on Prototyping for BUILDing Forward

Read about the unique opportunity for geometry analysis, fabrication, and the resulting gallery installation as initially reported on the Tocci Blog...

Image Credit: Jamie Farrell

On July 27th, an opening reception was held for Autodesk’s BUILDing Forward exhibit at the Boston Society of Architects. This exhibit celebrates digital craft in the greater Boston community and highlights the research projects made possible by the Autodesk BUILDSpace — a state-of-the-art research and development facility in the Design Center.

Tocci partnered with Sasaki Associates to research and develop a prototype called WinterLight, a proposal for a temporary winter pavilion for the Rose Kennedy Greenway. Currently in the early design phase, WinterLight is a warming hut designed to encourage activation of the city’s public realm during the winter months. The structure is a semi-dome with strategic openings in customized masonry blocks, designed to shield visitors from winter winds while they enjoy the warmth of an interior fire pit. The pavilion will be located in Boston; the exact site is still to be determined.

Image Credit: Lucca Townsend, Sasaki Associates

This project required extensive computational design from Sasaki staff to strike a balance between desired aesthetic and regularity of the blocks. Tocci’s role was to assist with geometry analysis and support the design process through construction feasibility studies. With each new design iteration, we utilized Dynamo Studio to extract total pavilion dimensions, overall block quantities, block sizes, repeatable types, total volume, total weight, and other metrics.
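As a rough illustration of the kind of tally we pulled on each pass, the sketch below recreates the block metrics in plain Python; the block dimensions, concrete density, and grouping tolerance are all hypothetical stand-ins, not values from the actual pavilion model.

```python
from collections import Counter

# Hypothetical block list: (length, width, height) in feet for each masonry unit.
blocks = [(1.33, 0.67, 0.67), (1.33, 0.67, 0.67), (1.50, 0.67, 0.60), (1.25, 0.70, 0.67)]

CONCRETE_DENSITY_PCF = 150.0   # assumed unit weight of concrete, lb/ft^3
TOLERANCE = 0.05               # blocks within ~1/2" of each other count as the same type

def block_type(dims, tol=TOLERANCE):
    """Round each dimension to the tolerance so near-identical blocks group together."""
    return tuple(round(d / tol) * tol for d in dims)

volumes = [l * w * h for (l, w, h) in blocks]
types = Counter(block_type(b) for b in blocks)

print("block count:  ", len(blocks))
print("unique types: ", len(types))
print("repeatable types (count > 1):", sum(1 for c in types.values() if c > 1))
print("total volume: %.2f cf" % sum(volumes))
print("total weight: %.0f lb" % (sum(volumes) * CONCRETE_DENSITY_PCF))
```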

Image Credit: Lucca Townsend, Sasaki Associates

Image: Color analysis and block types

The BUILDing Forward exhibit provided the perfect opportunity to experiment with fabrication methods and materials for producing the unique geometry of the blocks. Sasaki chose a section of nine blocks, comprising five unique types, from the overall pavilion to demonstrate scale and geometric variation. They first generated a digital model of the composition, then processed the individual shapes into toolpaths for cutting profiles from Medium Density Fiberboard (MDF) using a three-axis Computer Numeric Control (CNC) router. They then cut an ingenious system of registration holes into the MDF sheets so each piece could be aligned on threaded rod, creating a negative form of each block shape for pouring concrete. Each form also incorporated removable sections and a hole at the top for the pour. Finally, they sanded and coated the interior surfaces of the forms with an epoxy sealer to facilitate the release of the cured concrete.

Image: Molds disassembled

To prep for pouring, we disassembled the forms to coat the interior surfaces with form release. They were then reassembled on the threaded rod guides and tightened using nuts and washers.

Image: Molds coated

We blended Portland cement and silica sand to create a concrete mixture that could support the weight of the stacked blocks and maintain a smooth, gallery-quality finish. As each five-gallon bucket of concrete was poured through the top, a team tapped the sides of the forms, agitating the mixture and forcing trapped air bubbles to the surface.

Image Credit: Christine Dunn, Sasaki Associates

At times the form release did not work properly, forcing us to pry the blocks from their forms.

Image: Block releasing

Rotating shifts of the Sasaki-Tocci team spent a week producing the prototype, as each block required 24 hours to cure. With one last round of chiseling and sanding, all nine blocks were ready for their BSA Space debut.

Image: Blocks stacked

The opening reception was well attended. It was inspiring to see so many creative projects coming out of the BUILDSpace and local AEC community. BUILDing Forward will be on display at the BSA until October 5th, 2017 if you would like to see our work and all the other excellent projects.

Stay tuned for more as the WinterLight project evolves into a full-scale realization.

Check out this Sasaki blog post about the BUILDing Forward event for even more information.

Dynamo-litia Boston - September 2016

This installment of Dynamo-litia featured Timon Hazell, Sr. BIM Engineer at Silman (Washington DC).

Bringing Engineers and Architects Together Through Digital Design
Design changes that took weeks to coordinate are now happening in hours. We are now able to create new iterations of complex designs in seconds. This speed has its benefits, but it also adds complexity to current collaboration practices. How can we work better as a single design team? How can we use conceptual abstract models to generate documentation models? How can we model non-planar framing directly in Revit? You know the answer to many of these involves Dynamo! Join us as Timon Hazell from Silman shares his experiences and talks through a few case studies using Revit, Rhino, Dynamo and Grasshopper.

When: September 22, 2016
Where: Shepley Bulfinch - Boston

More information at the Boston Society of Architects.

Due to A/V difficulties, a few portions of the presentation did not make the video. To follow along AND see upcoming announcements, make sure to download the presentation slides HERE.

Automated Room Placement From Existing Drawings

Sometimes the only resource for existing conditions on a project is a set of scans of the original architectural drawings, often produced decades earlier. Scanning the drawings converts them into a digital form, but these are flattened images from which no smart information can be extracted. Tracing walls, stair locations, and other building elements on top of a linked image underlay is relatively easy in Revit. Rooms, however, present a more difficult challenge because they are represented only by a text label, and many rooms may therefore occupy the same open space. For example, a corridor may contain several appendages or alcoves that do not have physical elements separating them. The task becomes much more time consuming when placing a few hundred rooms across several levels of an existing building, which is what I was recently asked to do.

I began this investigation by converting the PDF scans into JPG files and opening them in Photoshop, where I could quickly isolate just the room names. Once they were isolated, I used a combination of Gaussian blur, inverse selection, and a black color fill to convert each room name location into a larger black blob, then exported each floor as a new JPG.

In Dynamo, import the JPG containing the black blobs and scan the image for dark pixels using the Image.Pixels and Color.Brightness nodes. You may have to try several sample values — too large a number in the xSamples input will generate an enormous pixel array and take a long time to run, while too small a number may cause the node to miss some of the black blobs.
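If you want to prototype the pixel-sampling logic outside Dynamo, a rough Pillow-based sketch of the same idea is below; the file name, sample count, and darkness threshold are assumptions rather than values from the actual definition.

```python
from PIL import Image  # pip install Pillow

# Hypothetical input: the exported floor plan with room names blurred into black blobs.
img = Image.open("level_1_blobs.jpg").convert("L")  # grayscale, 0 = black, 255 = white

X_SAMPLES = 200                       # mirrors the xSamples idea: coarser = faster, finer = fewer misses
step = max(1, img.width // X_SAMPLES)
DARK_THRESHOLD = 0.5                  # brightness (0..1) below which a pixel counts as part of a blob

dark_points = []
for y in range(0, img.height, step):
    for x in range(0, img.width, step):
        brightness = img.getpixel((x, y)) / 255.0
        if brightness < DARK_THRESHOLD:
            # Store the point and its "darkness" (1 - brightness), analogous to inverting Color.Brightness.
            dark_points.append(((x, y), 1.0 - brightness))

print(len(dark_points), "dark samples found")
```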

Rooms from Existing - Dynamo Definition (hi-res image available HERE)

The Color.Brightness node returns a list of values between 0 and 1 corresponding to the brightness of each pixel. Using some list management techniques, the list is inverted, all white pixels (0s) are filtered out, and only the darkest (now largest) values are kept to isolate the points where text labels sit on the existing plan. Circles are created at all of the remaining points, with the radius defined by the corresponding darkness values. The entire list of circles should be matched against itself to group all intersecting circles, because the Color.Brightness node may have read multiple dark pixels within each text blob. Then extrude all the circles as solids, intersect any joining geometry, and use the Solid.Centroid node to determine the center point of each solid, which in theory should be the location of each text label on the existing plans. A Count node can be used to evaluate the resulting number of text location points and determine whether the total count of black blobs read in Dynamo closely matches the total number of room names labeled in the PDF scan.
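The circle-intersection grouping can be approximated outside Dynamo with a simple distance-based clustering pass, sketched below as a continuation of the previous snippet; the cluster radius is an assumed value.

```python
# Minimal stand-in for the circle-intersection grouping: cluster dark samples that fall
# within a radius of one another and average each cluster down to a single "room label" point.
# dark_points comes from the previous sketch; the radius below is an assumption.

CLUSTER_RADIUS = 15  # pixels; roughly the size of one blurred text blob

clusters = []
for (pt, darkness) in dark_points:
    for cluster in clusters:
        cx, cy = cluster["centroid"]
        if (pt[0] - cx) ** 2 + (pt[1] - cy) ** 2 <= CLUSTER_RADIUS ** 2:
            cluster["points"].append(pt)
            n = len(cluster["points"])
            cluster["centroid"] = (sum(p[0] for p in cluster["points"]) / n,
                                   sum(p[1] for p in cluster["points"]) / n)
            break
    else:
        clusters.append({"points": [pt], "centroid": pt})

label_points = [c["centroid"] for c in clusters]
print(len(label_points), "candidate room label locations")  # compare against the room count on the scan
```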

If the counts are significantly different, adjust the pixel value sliders at the beginning of the definition to create a more or less dense field of points and re-trace the process up to this point. If the counts are close, the next step is to query the traced existing-conditions walls from the Revit model for the corresponding level. This can be done using a combination of GetAllElementsOfCategory with the Walls category and then filtering for only the elements on the same level.
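For reference, the same wall query can also be written as a Python node inside Dynamo using the Revit API directly; this is only a sketch, and it assumes the level name arrives through IN[0].

```python
# A hedged Python-node equivalent of the wall query: collect all wall instances, then keep only
# those whose associated level matches the level name passed in through IN[0].
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import FilteredElementCollector, BuiltInCategory
clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager

doc = DocumentManager.Instance.CurrentDBDocument
level_name = IN[0]  # e.g. "Level 1" -- an assumed input, wired up however your graph supplies it

walls = FilteredElementCollector(doc) \
    .OfCategory(BuiltInCategory.OST_Walls) \
    .WhereElementIsNotElementType() \
    .ToElements()

same_level = []
for w in walls:
    lvl = doc.GetElement(w.LevelId)          # the wall's base level, if it has one
    if lvl is not None and lvl.Name == level_name:
        same_level.append(w)

OUT = same_level
```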

Once the walls appear in Dynamo, you will most likely see that the points derived from the existing plans are not at the same scale, orientation, or location as the Revit walls. Placing Geometry.Scale and Geometry.Rotate nodes between the list of solids and the Solid.Centroid node allows you to experiment with various scale and rotation values prior to the creation of the final points. After some trial and error and visual approximation, you should be able to scale and orient the cluster of points to a configuration that matches the scale of the Revit model -- each point should look like it lands in the center of its room.

Even after scaling and rotating the points, they may still be located off to the side of the Revit walls. To coordinate locations between Revit and Dynamo, begin by going to Manage > Additional Settings > Line Styles in Revit and create a new line style called Dynamo. In the floor plan view of the level you are working on, pick a prominent element (such as a corner of the building), draw a model line, and change it to the newly created Dynamo line style. Back in Dynamo, use the Select Model Lines by Style node from the Archi-Lab package to locate the "Dynamo" line in Revit. Get the location of the start of that line using the Curve.Start node and the location of the origin in Dynamo with the Point.Origin node. A vector can now be established between these two points and used to translate all of the room locations scanned from the JPG into the same location as the Revit model. Note that moving from the Dynamo origin to the line drawn at the level in plan should also move the room points to the correct elevation.
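The translation step itself is just vector arithmetic; here is a minimal sketch with placeholder coordinates standing in for whatever the Curve.Start and Point.Origin nodes return on a real project.

```python
# A quick sketch of the translation math: the vector from the Dynamo origin to the start point of
# the "Dynamo" reference line moves every scanned room point into position over the Revit walls.
# All coordinates below are placeholders, not values from a real project.
line_start = (153.2, 87.6, 10.0)   # from Curve.Start on the model line (x, y, elevation of the level)
dynamo_origin = (0.0, 0.0, 0.0)    # from Point.Origin

translation = tuple(a - b for a, b in zip(line_start, dynamo_origin))

room_points = [(12.4, 30.1, 0.0), (48.9, 22.7, 0.0)]   # scaled and rotated points from the scan
translated = [tuple(c + t for c, t in zip(pt, translation)) for pt in room_points]
print(translated)   # these land at the level's elevation, as noted above
```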

Once everything appears to line up correctly, the Tool.CreateRoomAtPointAndLevel node from the Steam Nodes package will place rooms at each point in the Revit model. For open areas such as corridors, multiple rooms will now be overlapping, potentially causing Warnings. The last step is to go through the model and draw room separation lines at logical points where divisions should occur between the room elements.
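Under the hood, placing a room at a point boils down to a single Revit API call; the Python-node sketch below shows one hedged way to do it, assuming the translated points arrive through IN[0] and the target Level through IN[1]. It approximates the workflow, not the internals of the Steam Nodes package.

```python
# A hedged sketch of the placement step as a Dynamo Python node. Points are assumed to already be
# in the model's internal feet; convert first if your graph works in other units.
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import UV
clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

doc = DocumentManager.Instance.CurrentDBDocument
points = IN[0]                 # translated room location points
level = UnwrapElement(IN[1])   # the Revit Level that will host the rooms

TransactionManager.Instance.EnsureInTransaction(doc)
rooms = []
for p in points:
    # NewRoom only needs a plan location; the room finds its own bounding walls on that level.
    rooms.append(doc.Create.NewRoom(level, UV(p.X, p.Y)))
TransactionManager.Instance.TransactionTaskDone()

OUT = rooms
```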

After every room has been separated in Revit, populating Room Names, Numbers, and other parameters is a manual process. However, if you are lucky enough to possess a CAD file of the existing conditions, Dynamo can also be used to populate the newly created Room elements with text parameters. Begin by importing the CAD file into the Revit model and moving it to the correct location. As a best practice, I tend to delete and purge all unnecessary layers in the CAD file prior to import -- in this case you might save a one-off version containing room text only and import that instead. In Revit, if you explode an imported CAD drawing, the text becomes actual Revit text objects (learn more HERE). With Dynamo, all text objects can be grouped into clusters based on shared location, matched up with the Room element center points, and then used to populate the associated elements with parameter information: Name, Number, Space Type, etc. Due to incongruent alignment or the use of leader lines, this workflow will most likely not work for every text item from the CAD plan, but it may alleviate a large portion of the manual data entry required to populate the Revit room elements. More about this process in a future post...
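A minimal sketch of the matching step is below, in plain Python with made-up coordinates and labels: each exploded CAD text item is assigned to the nearest room location point, skipping anything too far away to be trusted (leader lines, notes, and so on).

```python
import math

# Hypothetical inputs: exploded CAD text items as (x, y, "NAME NUMBER") tuples and the
# room location points created earlier. Names and coordinates here are illustrative only.
text_items = [(10.2, 4.8, "CORRIDOR 100"), (25.0, 12.1, "OFFICE 101")]
room_points = [(10.0, 5.0), (24.6, 12.4)]

MAX_OFFSET = 3.0  # feet; skip text whose label sits too far from any room point

def nearest_room(x, y):
    best = min(room_points, key=lambda rp: math.hypot(rp[0] - x, rp[1] - y))
    return best if math.hypot(best[0] - x, best[1] - y) <= MAX_OFFSET else None

assignments = {}
for (x, y, label) in text_items:
    room = nearest_room(x, y)
    if room is not None:
        name, _, number = label.rpartition(" ")  # naive split: last token is the number
        assignments[room] = {"Name": name, "Number": number}

print(assignments)
```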

Although a Dynamo-based approach requires some trial and error, it allows you to quickly place a large quantity of Revit Room elements in the exact same location as a scanned drawing. Knowing that the room locations are correct allows for quicker naming and parameter manipulation using Dynamo or other means and reduces a portion of monotonous work.

RTCNA2016 Recap - "Computational Design for the 99%"


Several weeks ago I had the privilege of presenting at Revit Technology Conference – North America 2016. My presentation frequently repeated the phrase “Because Nobody Went to Architecture School to…” We have all been there at some point in our careers – continuously repeating the same manual alteration to a Revit model, changing parameter information one click at a time, or performing tedious data entry for hours on end – these are the moments when you wonder whether the practice of architecture is really what you dreamed about in architecture school. For all the advancements that BIM has introduced to the AEC industry, production validation and maintaining uniformity of information are still difficult undertakings. Tasks that require hours and days of individual modifications are not always professionally rewarding and monopolize time that could better be spent on the overall quality of the design and documentation. I often tell colleagues that if you find yourself asking, “There has to be a more efficient way to do this,” chances are good that Dynamo can help.

I did not come from a computer programming background but instead began teaching myself Dynamo to address specific problems frequently encountered in Revit. After achieving a basic understanding of how Dynamo works, I was able to investigate tasks of increasing complexity that began with simple changes to the model and evolved to automating entire processes. As my Dynamo experience continued to grow I began exploring ways that Revit could interact with other software platforms and how data could be manipulated and visualized. My skillset eventually evolved to where I understood more advanced concepts of geometry and parametricism for design but this was all built on the foundational knowledge acquired from researching daily production tasks.


REVIT MODEL ANALYSIS
In my presentation I proceeded to share a sample of workflows that respond to specific challenges encountered on projects and tell the story of tedious-task automation and process improvement for architectural practice. A highlight was the opportunity to collect data on a very large healthcare project, which I developed into a workflow for tracking Revit model metrics. The goal was to look for correlations between various model metrics and how long it takes to sync or open the model — one of the most significant factors on workshared projects, because the extra seconds and minutes it takes to sync a slow model, multiplied by all the users on the project, add up to many hours of lost productivity over the course of the project. Dynamo is used to track the overall size of the .RVT file, query and count various elements and categories, and parse the Warnings export file, then export all the information to an Excel file. In addition to collecting these general model metrics, the Dynamo task updated two additional spreadsheets tracking every warning in the model over time and every placed family in the model over time. All three of these spreadsheets were linked into Microsoft Power BI along with data from IMAGINiT Clarity’s Model Metrics tool, which tracks how long the model takes to open over time. Over the course of three months, I ran this Dynamo definition on a daily basis for a total of 68 exports.
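The daily export itself does not need anything exotic; the sketch below shows the general shape of it in plain Python, with hypothetical file paths and metric values standing in for the data the actual Dynamo definition gathered.

```python
import csv, os
from datetime import date

# Hypothetical inputs: the central file path and counts gathered elsewhere in the definition
# (element totals, Warning count, placed family count). None of these values are real project data.
model_path = r"C:\Projects\Hospital\Central.rvt"
metrics = {
    "date": date.today().isoformat(),
    "file_size_mb": round(os.path.getsize(model_path) / (1024 * 1024), 1),
    "warnings": 412,
    "placed_families": 18650,
    "rooms": 1240,
}

log_path = r"C:\Projects\Hospital\model_metrics.csv"
write_header = not os.path.exists(log_path)
with open(log_path, "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(metrics.keys()))
    if write_header:
        writer.writeheader()
    writer.writerow(metrics)  # one row per daily export, ready to link into Power BI
```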

The final takeaway will not be a surprise to those who are familiar with Revit model performance… the data revealed that Auditing and Compacting the model, as well as Purging Unused Families, had the greatest overall impact on the time it takes to open and sync the model. Although this may not be a significant breakthrough, these real-time analysis tools help monitor the health of the model and indicate the best time to intervene.

The last step was to find an easy way to communicate the status of the model to the production team. Since it is the responsibility of the project's Model Lead to audit the Central file, Warnings are the only characteristic that individual team members have the opportunity to impact. The project from which this data was collected happens to be a children’s hospital, so we placed an image of a Minion on the Message Board with a visibility parameter tied to the number of Warnings. The final Dynamo task overwrites the Warning count parameter in the Revit model and the Minion changes accordingly. Now the team knows that if the Minion is purple when they open the model at the beginning of a workday, the Warnings have exceeded 400 and some time needs to be set aside to resolve them.

In the end, RTC was an excellent experience. I thoroughly enjoyed sharing my perspective and bonding with colleagues from all over the world.

Special thanks to everyone who helped contribute to my work:
RTC & Committee
Shepley Bulfinch
Jim Martin
Jim Chambers
Jessica Purcell
Christina Tully
Margaret Gammill
PJ Centofanti
Jamie Farrell

Design Space Exploration with Dynamo


One of the most challenging aspects of the architectural design process is determining how to organize form to fit an overall parti. Facing endless possible geometric configurations, it can be difficult to make sequential alterations toward a fitting result without a means to measure suitability. During the initial phases of design research, an architect gathers essential information such as program requirements to meet a client's needs, zoning and code information for a given site, environmental and material influences, and aesthetic preferences. These assets serve as the foundation for a constraints-based design approach in which parameters can be assigned in an effort to influence and control form.

Constraints in design are rules or vocabularies that influence form through the design process. An inherent feature of the architectural process is that design must be performed within a set of given parameters. Parameters help to focus the scope of an architect by narrowing the range forms and formal relationships may take within a design solution... Constraint based design takes the parameters associated with a design problem and links them to the attributes of the formal components and relationships of a solution. (Dustin Eggink, http://goo.gl/EktbQ1 )

Dynamo is an ideal platform for constraints-based design because the visual programming environment allows you to build a parametric model that can be quickly adjusted with changes to input values.

Once you have a functioning Dynamo definition, all of the nodes can be consolidated into one Custom Node by dragging a selection window over everything and going to Edit > Create Node From Selection. This will transition everything to the custom node editing mode -- you can always tell when you are in this mode because the background is yellow.

To create a custom node, the first step is to give it a name, a description of what it does, and a category (where it will be saved in the Library). All of the input number blocks (far left side) must be swapped out for Input nodes, generally named for the variables they represent. Output nodes also need to be added after the final nodes in the definition (far right side) that provide the finalized geometry. When these steps are complete, save the node. Back in the Dynamo node space -- also known as the canvas -- number sliders can be added to the newly created custom node. It is helpful to click the down arrow on the left side of each slider to set the minimum, maximum, and step interval, because large numbers can take a while to process or crash Dynamo, while zeros will often create null values and turn the majority of your definition yellow with warnings. Now you have a fully parametric custom node that allows you to explore a range of formal configurations with the simple adjustment of number sliders.

Developing custom nodes for form making allows for use with the Dynamo Customizer -- a web-based viewer, currently in beta, for viewing and interacting with Dynamo models in real time. This platform has a lot of potential for sharing designs in the future and allowing colleagues or clients to experiment with their own manipulations of the design.

Check out this example for the twisting tower here: Dynamo Customizer - Twisting Tower.
DISCLAIMER: you will have to request Beta access and sign in with your Autodesk ID to view this. For step-by-step instructions, visit: http://dynamobim.org/dynamo-customizer-beta-now-available/.

After guiding parameters have been established, a design space can be generated for testing all possible variations of a few select design variables. Design space exploration is a concept involving a virtual -- or non-physical -- space of possible design outcomes. This allows the designer to simultaneously see a wide range of options and extract only those that satisfy pre-determined criteria of fitness.

The core of this workflow is the Cartesian product, which facilitates comparison of all possible pairings of variables. This mathematical operation can be understood as an array of combinations between x, y, z and 1, 2, 3 (below left) or as a slope graph of all possible correlations between the two lists of variables (below right).
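For anyone more comfortable reading code than node graphs, the same operation is a one-liner in Python's itertools; the height and twist ranges below are hypothetical design variables, not values from the tower study.

```python
from itertools import product

# The same pairing logic as List.CartesianProduct, shown with the x/y/z vs 1/2/3 example above.
letters = ["x", "y", "z"]
numbers = [1, 2, 3]

combinations = list(product(letters, numbers))
print(combinations)
# [('x', 1), ('x', 2), ('x', 3), ('y', 1), ('y', 2), ('y', 3), ('z', 1), ('z', 2), ('z', 3)]

# With design variables the idea is identical: every height gets paired with every twist angle.
heights = range(100, 301, 50)   # hypothetical tower heights
twists = range(0, 91, 15)       # hypothetical twist angles in degrees
design_space = list(product(heights, twists))
print(len(design_space), "design options")  # 5 x 7 = 35 options
```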

The List.CartesianProduct node calculates all possible combinations of the number range values; however, all of the geometry is instantiated at the Dynamo origin point, making it appear that only one object was created even though the count shows 132 (below left). Thanks to Zach Kron and the Design Options Layout node from the Buildz package, the nested list clusters of geometric objects are arrayed according to the Grid Size spacing value (below right). Using list management logic -- such as List.Transpose, List.Map with List.Transpose, etc. -- before the Design Options Layout node will re-arrange the list structure and result in different compositions of objects.

To set up a design space in Dynamo, the inputs to the custom node are fixed values. Whichever variables you want to test must be left empty on the custom node, with number ranges connected to the List.CartesianProduct node instead. The number of list inputs on List.CartesianProduct must match the number of inputs left open on the custom node. It is also important to note that the order of the number ranges in the List.CartesianProduct node must correspond to the order of the inputs on the custom node. The values in each number range determine the scale and form of the resulting geometry, while the count of values in each list determines the overall size and shape of the array of objects -- this is critical to remember because an excessive number of input values may take several minutes to process or potentially crash Dynamo. After the ranges of values have been set up, the List.CartesianProduct node is connected to the Design Options Layout node, which arrays all possible combinations in 3D space. Depending on the geometry being tested, the Grid Size input determines the spacing between objects. When everything is connected correctly, Dynamo will display an array of forms which can be altered by changing the number range inputs and re-running the definition. If Dynamo crashes, geometry disappears, or there is an insufficient amount of variation in the forms, continue to calibrate the number ranges and explore the limitations of the parameters in your custom node.

A successful design space arrays all possible options along two or more axes, utilizing the concept of dimensionality. Design space is theoretically unlimited; however, the visualization of the virtual design space is limited by the constraints of graphic representation. Color can be added to provide visual differentiation of a third dimension, such as an analysis of the generated outcomes, or it could represent any of the associated properties of the variables. Criteria for the evaluation of fitness refers to the means by which the best solution is determined.

For example, a calculation of the height of the twisting tower forms can be colorized on a minimum-to-maximum gradient (below left). Another representational technique is selective omission or hierarchical modification of the representation (below right).
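The gradient itself is simple normalization; a minimal sketch with placeholder height values is below.

```python
# A minimal version of the min-to-max color gradient: normalize each tower's computed height
# and map it to a color (heights here are placeholders, not project data).
heights = [120.0, 145.0, 180.0, 210.0, 300.0]

lo, hi = min(heights), max(heights)

def gradient_color(value):
    t = (value - lo) / (hi - lo) if hi > lo else 0.0   # 0 at the minimum, 1 at the maximum
    return (int(255 * t), 0, int(255 * (1 - t)))       # blue (low) blended toward red (high)

colors = [gradient_color(h) for h in heights]
print(list(zip(heights, colors)))
```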

Ultimately, design space and its representation are nothing more than tools; designers still have to make decisions. Design space should function as a method of exploration for making informed, confident, substantiated decisions.


Portions of this blog post were developed in collaboration with Jamie Farrell for our course Advanced Revit and Computational Workflows taught at the Boston Architectural College.

AEC Technology Symposium 2015


AEC Technology Symposium 2015
Hosted by Thornton Tomasetti (NYC)
Baruch College
September 25th, 2015

RESEARCH & DEVELOPMENT IN AEC SESSION 1:
Measurement Moxie
Christopher Connock - Kieran Timberlake

Christopher Connock emphasized the importance of approaching each architecture project as an experiment — an opportunity to test new technology and ideas — a philosophy that Kieran Timberlake incorporates into all of its projects. They are particularly exploring the frontier of data capture, using a wireless sensor network to gather building performance analytics. The comparison of plant diversity and placement against soil moisture and temperature sensors in a green roof can help assess drainage, plant health, and solar gain over time. Temperature and relative humidity sensors can be used to investigate an entire space or focus on a particular application such as the performance of materials in a building envelope. Hundreds of different sensors placed throughout a building can track and transmit environmental changes across a given day or even across seasons. Kieran Timberlake implemented a wireless sensor network to help inform the renovation of their new offices in the Ortlieb Bottling Plant in Philadelphia and then developed an in-house app to capture post-occupancy feedback from their own employees about the overall comfort of the space and to identify abnormal conditions. All of this research contributed to the development of Tally — a life-cycle assessment (LCA) app and add-in for Revit that evaluates the environmental impact of building materials across design options and promotes a much more eco-conscious approach to design.

Grow Up, Grasshopper!
Andrew Heumann - NBBJ

Andrew Heumann believes in the need to change the perception of design technology in the AEC industry and to integrate it further into practice. He showcased an extensive portfolio of projects that have used Grasshopper for Rhino and custom-written apps to simulate in-office traffic patterns and the importance of sight lines, to apply human location and city data to urban planning, and to track digital tool usage in the office to identify focus areas for development and support. All scripts and tools developed in-house at NBBJ are documented and packaged into products for use by project teams. In addition, custom dashboards and user interfaces help reduce intimidation and increase universal adoption — for example, reducing a complex Grasshopper script to a series of slider bars that control the inputs of a parametric design. Andrew also advocated for the use of hackathons and similar hands-on user meeting formats to promote design technology as a facet of culture and process. He shared an example of how a brief hackathon with senior partners at NBBJ led to the funding of a proposal for further development of an innovative tool for optimizing healthcare patient room configurations.

Evolving Modes of R+D in Practice
Stephen Van Dyck and Scott Crawford - LMN Architects / LMN Tech Studio

The Tech Studio was founded to support the prominent role of research and development at LMN Architects in Seattle, which has led to an expanded use of analytic and generative tools to drive design. They have not only embraced the use of custom digital tools for the creation and visualization of complex forms but also regularly construct scale models for material testing and to explore modular strategies as part of their iterative design process. Working with fabrication in mind facilitates improved precision in collaboration with engineers and consultants. In addition, they have found that a thorough digital process and physical models help better communicate design ideas, resulting in more positive community feedback. The development of a ray-tracing tool for exterior acoustics studies, custom panel creation to balance musical acoustics and aesthetics, and a highly parametric pedestrian bridge spanning a major Seattle highway are a few examples of projects that demonstrate how research is a guiding principle for design at LMN.

OPEN-SOURCE DATA AND APPLICATION:
Collaboration and Open Source - How the Software Industry’s Approach to Open Sourcing Non-Core Technology Has Created Innovation
Gareth Price - Ready Set Rocket

This presentation provided insight into the current state of technological innovation through the lens of a digital advertising agency. Gareth Price emphasized that individuals should not be hesitant to share ideas out of fear that another company will benefit from them. Particularly in the AEC industry, companies do not have the overhead to pay for tool creation and the requisite support, nor can they cover the cost of an outside software consultant. The reality is that other people are busy with their own work and do not have the time or resources to steal your ideas and commodify them. More importantly, it is advantageous to share ideas for a project because they may elicit constructive criticism or inspire others to contribute to those ideas and improve them. Also, do not get too entrenched in one idea, and know when to pivot — the next great idea may come as an unexpected derivative of the original intention.

Key quotes:
"purpose is the DNA of innovation"
"failure is the new R&D"

How Open Source Enables Innovation
Mostapha Roudsari and Ana Garcia Puyol - CORE studio / Thornton Tomasetti

Mostapha Roudsari and Ana Garcia Puyol exhibited many examples of digital tools that have emerged out of CORE studio - the research and development arm of Thornton Tomasetti. The majority of the examples presented originated during previous CORE AEC Technology Hackathons and were then further developed into more robust products. Nearly every tool required collaboration among multiple individuals with expertise in a diverse mix of software platforms, oftentimes representing different companies. The takeaway from this presentation was the value of open source and hackathons as a means of getting a group of talented people into one room to create new tools for AEC design and representation. Mostapha wanted to make it clear that more important than software tools, code, and machines is the strength and power of the user community. If you want to be at the forefront of the movement, be a developer; but the community is just as important, so make an effort to share ideas and spread adoption.

Here are some of the many tools presented:

  • vAC3: open source, browser-based 3D model viewer. This project led TT to further develop Spectacles.
  • Spectacles: exports BIM models to a web interface where you can orbit in 3D, select layers, and access embedded BIM information (demo HERE)
  • VRX (Virtual Reality eXchange): a method for exporting BIM models for virtual reality viewing via Google Cardboard
  • DynamoSAP: a parametric interface that enables interoperability between SAP2000 (structural analysis and design), Dynamo, and Revit
  • Design Explorer: "an open-source web interface for exploring multi-dimensional design spaces"
  • Pollination: "an open source energy simulation batch generator for quickly searching the parameter space in building design"

For more information, check out TT CORE Studio's GitHub, Projects, and Apps pages

Open Source: Talk 3
Matt Jezyk - Autodesk

Matt Jezyk provided an introduction to Dynamo, including its history and the most recent developments. Dynamo may have started as a visual programming add-in for Revit, but it is quickly transforming into a powerful tool for migrating data and geometry across numerous software platforms. The talk highlighted the role open source has played in empowering independent developers to create custom content that expands capabilities and makes interoperability possible. By keeping Dynamo open source, it has benefited from contributions by individuals with a wide range of expertise looking to satisfy specific requirements. As part of a larger lesson taken from the growth of Dynamo, Matt emphasized that the key to the emerging role of technology in practice, and in the AEC industry as a whole, is less about learning specific tools and more about codifying a way of thinking — tools are only the implementation of a greater plan.

DATA-DRIVEN DESIGN
Beyond Exchanging Data: Scaling the Design Process
Owen Derby - Flux.io

Flux has been a frequent topic of conversation lately. The company initially marketed a product called Flux Metro, which boasted the potential for collecting the construction limitations of any property based on zoning, code, municipal restrictions, and property records — an ideal tool for developers and architects to assess feasibility or use as a starting point for the design process.

The company has since pivoted to focus on creating a pipeline for migrating and hosting large quantities of data across many software formats. Their new product line features an array of plugins for transferring data between Excel, Grasshopper, and Dynamo, with plans to release additional tools that connect to AutoCAD, SketchUp, Revit, 3DS Max, and more in the near future. Data exported from these programs is hosted in a repository in the cloud, where it can be archived and organized for design iterations and option investigation. Flux has great potential for achieving seamless interoperability of data and geometry between software platforms and significantly improving the efficiency of the AEC design and production process.

Holly Whyte Meets Big Data: The Quantified Community as Computational Urban Design
Constantine Kontokosta - NYU Center for Urban Science + Progress (CUSP)

The NYU Center for Urban Science + Progress (CUSP) is using research to learn more about the way cities function. Buildings, parks, and urban plans are all experiments built on assumptions whose true results don’t emerge until years or decades later. How do you measure the “pulse” of a city? How do macro observables arise from micro behavior? Constantine and CUSP have set out to test these questions by collecting and analyzing NYC public internet wireless access points, Citi Bike share data, 311 complaint reporting, biometric fitness devices, and social media. They use these urban data sources to make better decisions and form initiatives for future community improvement projects. The results also have positive implications for city planning, city operations, and resilience preparation.

Data-Driven Design and the Mainstream
Nathan Miller - Proving Ground

Nathan Miller is the founder of Proving Ground, a technology consultancy for architecture, engineering, construction, and ownership companies. Drawing on his experience providing training and technological solutions, he professes the importance of equipping staff with the right tools and knowledge to adequately approach projects. There is an intersection between the managers and leaders responsible for projects and staffing and those who are actually doing the work. It is imperative to focus on outcomes and not get deterred by the process.

The Biggest IoT Opportunity In Buildings Is Closer Than You Think
Josh Wentz - Lucid

Energy consumption, mechanical systems data, thermal retention, and other metrics are not recorded for the majority of buildings worldwide. These are incredible missed opportunities for evaluating the overall performance of a building and collecting real-time research that can inform better construction techniques. Lucid has developed a product called BuildingOS that offers 170 hardware integration options to collect robust building data. This data has the potential for helping facilities management departments better track efficiency and maintenance of their systems, in addition to contributing to the international pool of data to help us better understand how materials and systems perform over time.

RESEARCH & DEVELOPMENT IN AEC SESSION 2:
Capturing Building Data - From 3D Scanning to Performance Prediction
Dan Reynolds and Justin Nardone - CORE studio / Thornton Tomasetti

This presentation highlighted CORE Studio's use of various technologies for capturing existing-conditions data and testing architectural responses through computation. They have utilized drones to capture the condition of damaged buildings and structures by assembling fly-by photos into a point cloud. The development of in-house GPS sensor technology accurate to 1 centimeter anywhere in the world has enabled measuring the built environment and construction assemblies to a high level of precision. CORE Studio has also investigated the use of machine learning for exploring all possible combinations of building design parameters and calculating embodied energy predictions. All of these design technology advancements are helping Thornton Tomasetti design more accurate, better-informed systems.

Data-Driven Design
Luc Wilson - Kohn Pedersen Fox Associates PC

Luc Wilson thinks of data as an “Urban MRI” - a diagnostic tool for measuring the existing configuration of cities and predicting future growth. Multiple FAR and urban density studies were presented that exhibited how comparison to precedents and benchmarks helps to conceptualize the data and make visual sense of the analysis. The key to prediction is the ability to test thousands of designs quickly, which Luc has perfected by developing digital tools for rapidly computing all possible combinations of input parameters and producing measurable outcomes for comparison. One of the most exciting portions of the presentation was the mention of a 3D urban analysis tool called Urbane, which Kohn Pedersen Fox is developing with NYU — could this be the new replacement for Flux Metro?

Cellular Fabrication of Building Scale Assemblies Using Freeform Additive Manufacturing
Platt Boyd - Branch Technology

Platt Boyd founded Branch Technology after realizing the potential for 3D printing at a large scale by imitating structures found in nature. Branch has dubbed their technique “cellular fabrication” where economical material is extruded with geometric complexity to construct wall panels that are lightweight and easy to transport. Their process seeks to reduce the thickness required by traditional 3D printing technologies and the intricate geometric structure provides equivalent strength to that of a printed solid. The wall panels are printed free-form with a robotic arm on a linear track and then are installed onsite where insulation, sheathing, and finish material are added to reflect the same condition as traditional wood or metal stud construction. It will be really interesting to see Branch continue to refine their methods and start to tackle complex wall conditions for use in real-life building projects in the near future.


Watch videos of all the presentations HERE.

Automated Feasibility Project


A common circumstance of working with commercial development clients is producing feasibility studies to examine the value of pursuing a project on a particular site. Factors such as local zoning ordinances, building code, desired program, and site constraints all inform the buildable potential of a site and determine whether it is financially and strategically advantageous to invest in a project. Feasibility studies require the right balance of accuracy and efficient time management given the likelihood that many projects will not come to fruition.

My colleague Jason Weldon and I are currently embarking on a project to investigate the potential use of Dynamo for automating portions of the feasibility process. The two essential advantages of this approach are buying back time for deeper investigation and validating proposed schemes through iteration. Automation through Dynamo will significantly reduce the initial preparation and manual input of information required to start a BIM model. Altering computational constraints enacts changes to the model for testing schemes, and instantaneous calculations provide a report for each iteration. This process promotes efficiency and substantiates certainty.

Phase 1
Zoning Setbacks, Levels, Floor-to-Floor Height, & Courtyard

All studies begin with the same ingredients: a plot plan or civil drawing and preliminary zoning research. Working from these items alone, a maximum allowable building envelope can be established. Floor Area Ratio allowances and incentive zoning can be incorporated to evaluate proportional modifications to the overall envelope.

Animation: Automated feasibility - Phase 1

Here I start by collecting the property line for a site from Revit. From the property line, I am able to establish setbacks on all five edges of the site. Constraints for the overall number of floors and the floor-to-floor height allow me to quickly explore different massing compositions. Inserting a courtyard improves the residential potential of the site through increased light and ventilation. By adjusting the distance of the courtyard from the exterior face of the building, it is possible to experiment with the proportional balance of form and floor area efficiency.
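To make the arithmetic concrete, the sketch below runs a heavily simplified, rectangular version of the Phase 1 calculation in plain Python; all dimensions are invented, and the real definition works from the actual five-sided property line pulled from Revit.

```python
# Setbacks shrink the parcel to a footprint, a courtyard offset removes the center, and the level
# count times the floor-to-floor height gives the massing. All dimensions are placeholders.
site_width, site_depth = 220.0, 160.0        # ft
setback = 15.0                               # ft, applied to every edge
courtyard_offset = 45.0                      # ft from the exterior face of the building
floors, floor_to_floor = 6, 11.0             # count, ft

foot_w = site_width - 2 * setback
foot_d = site_depth - 2 * setback
court_w = max(0.0, foot_w - 2 * courtyard_offset)
court_d = max(0.0, foot_d - 2 * courtyard_offset)

floor_plate = foot_w * foot_d - court_w * court_d
print("building height:  %.0f ft" % (floors * floor_to_floor))
print("floor plate area: %.0f sf" % floor_plate)
print("gross floor area: %.0f sf" % (floor_plate * floors))
print("plate efficiency: %.0f%% of full footprint" % (100 * floor_plate / (foot_w * foot_d)))
```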

automated feasibility_definition

Download the open source definition here.

BLDGS=DATA


BLDGS=DATA
Hosted by CASE Inc. (NYC)
The Standard High Line Hotel
May 28, 2015

Data For Understanding Cities
Blake Shaw, Head of Data Science - Foursquare

For those of you who have never heard of Foursquare, it was originally a mobile device app for sharing locations and activities with your social network - occasionally creating the opportunity for chance encounters when two friends find themselves in the same vicinity. Nowadays Foursquare has evolved into a powerful mechanism for tracking human behavior. With the exponential rise of people connected to the internet via mobile devices, how will the constant production of "data exhaust" be harvested and what can it tell us? How does a city behave as a living organism? What is the most popular activity on a hot summer day (answer: getting ice cream)? Foursquare continues to examine correlations among human behaviors and asks how we can better interact with the buildings we inhabit. Everyone experiences an environment differently but how can data derived from countless previous experiences be used to inform future experiences and provide valuable recommendations? Data is the key to optimizing the potential for enjoyable experience.

Data for Building Insight - Panel Discussion
Brian Cheng, Designer & Associate - HDR Architects
Jennifer Downey, National BIM Manager - Turner Construction
Peter Raymond, CEO - Human Condition

This session started off with each panelist sharing a little about the technology efforts at their company. HDR is utilizing a combination of a custom dashboard and parametric modeling to analyze health care program and massing test fits; in addition, advanced model sharing and co-location methods enable instantaneous coordination with engineers and consultants. Turner presented an example of how LEAN strategies, aggressive coordination and scheduling, and thorough communication, combined with robust project data, have made a significant impact at their company and played a large part in the world-record-setting concrete pour at the Wilshire Grand in LA. Human Condition demonstrated a construction safety vest that tracks body position, biometrics, and worker location. With the further development of wearable technology, real-time information can be gathered on every worker at the construction site and a holistic safety culture can be established through incentives for exemplary performance.

During the discussion it was pointed out that currently in the AEC industry there is a culture of "commodification of mistakes," meaning that contingencies are written into contracts, numbers are carried for unforeseen costs, and there is a standing assumption of labor inefficiencies and injuries on the job. How can BIM be better utilized to mitigate these costly errors and how can new technologies improve job safety and productivity? Perhaps tools like clash detection and co-location make for a more streamlined design and construction process. Furthermore, BIM as a platform needs to become a mode of communication between project constituents and facilitate a timely transmission of data. Another question that emerged was how can decades of on-site construction knowledge and experience be gathered and implemented much earlier on in the design and documentation phases? Strategies like pulling seasoned construction workers into the coordination meetings and using tools like a company intranet to archive knowledge and solutions were suggested. It is also imperative to seek feedback and document the process in order to ensure continuous improvement.

Data for Retail Roll-Out
Scott Anderson, VP Global Corporate Store Planning & Development - Estee Lauder Companies
Melissa Miller, Exec. Director Corporate Store Planning & Development - Estee Lauder Companies

This team is responsible for identifying new opportunities for brand positioning within retail department stores and carrying out the requisite construction. After years of managing projects through email, Word, Excel, and Gantt charts it became apparent that tracking the transfer of information across multiple platforms was incredibly inefficient. The team set out to construct a custom dashboard that managed all communications, actions, specific information, and progress by project. Now project managers and company executives can enter the system at any time and review progress. The new system has enabled transparency and drastically reduced the duration of projects.

Data for Indoor Positioning
Andy Payne, Senior Building Information Specialist - CASE
Steve Sanderson, Partner & Director of Strategy - CASE

With the emergence of indoor positioning systems that triangulate mobile device location using Bluetooth, wireless, and GPS, a team at CASE Inc. has embarked on a project to harness indoor location data. Using a custom app created to track employee movement throughout the workday, CASE recorded one month of data and produced this analysis. From this data, it was determined that only two-thirds of the space in the brand-new office the company had just moved into is being actively utilized. In addition, some of the program was not being used as originally intended, or was seldom used at all. Setting aside the potential for future company growth, CASE has wondered how the results of this post-occupancy analysis would have affected the planning of the office layout prior to signing their lease. This led to a larger conversation about opportunities for implementation in design. For example, how could this technology be applied to doctors and nurses in a health care setting to monitor their daily routines and learn more about the way spaces are truly used? The potential for better understanding human behavior and developing theoretical simulations to analyze building program is very exciting.

More about the development of the app and beacon technology...

Data for Building Buildings - Panel Discussion
John Moebes, Director of Construction - Crate&Barrel
Doug Chambers, CEO - FieldLens
Todd Wynne, Construction Technology Manager - Rogers-O'Brien Construction

These three gentlemen discussed coordination and the use of data to avoid significant delays in project timelines. At Crate&Barrel, many Autodesk software products are used by a small project team to design and build new stores throughout the world. Careful documentation allows Crate&Barrel to bring the procurement of steel and materials in-house at a significant cost savings and drastically reduce the possibility of mistakes in the field that affect valuable components of the retail design. FieldLens is a task management product that allows construction managers to better orchestrate the construction process. With the ability to assign particular tasks to specific individuals, save notes and images, review a 3D model and construction documents, and track workers on site, a superintendent can keep much better tabs on every aspect of the job and managers have a continual update on how the work is progressing.

Data for Galactic Growth
Roni Bahar, Exec. Vice President of Development & Special Projects - WeWork

WeWork is a company that offers coworking office space worldwide via an hourly or monthly subscription model. In the last four years they have seen exponential growth, leading to construction on an unprecedented scale to accommodate demand (12 new office locations in just the last year). In an attempt to manage this frenzy they have embraced modular construction as a method for standardizing construction technique, aesthetics, and material cost regardless of location or contractor. The kitchen units, cubicles, conference rooms, bathrooms, and common area furniture are all modular components built in Revit, complete with detailed finish information, material takeoffs, and construction details. As much of a well-oiled machine as the procurement and development arm of WeWork is, it was fascinating to hear that the one lacking component of the process is hard data and feedback. With such rapid growth and a relatively small project team, the company is building offices faster than research can be conducted to determine the success of the spaces they are producing. In the next few years, as WeWork begins to catch up with the pace, it will be interesting to see how they aggregate data to substantiate the success of the experience beyond the sheer number of offices and dollars.

More on WeWork...

The BUILTRFEED team was also at the event and posted an excellent summary. Check it out!

The Glass Cage

I recently read the book "The Glass Cage: Automation and Us" by Nicholas Carr and was inspired by his message of the growing advantages and dangers of technology, particularly pertaining to design and architecture.

Carr says, "Technology is as crucial to the work of knowing as it is to the work of production. The human body, in its native, unadorned state, is a feeble thing. It's constrained in its strength, its dexterity, its sensory range, its calculative prowess, its memory. It quickly reaches the limits of what it can do. But the body encompasses a mind that can imagine, desire, and plan for achievements the body alone can't fulfill. The tension between what the body can accomplish and what the mind can envision is what gave rise to and continues to propel and shape technology. It's the spur for humankind's extension of itself and elaboration of nature. Technology isn't what makes us 'posthuman' or 'transhuman', as some writers and scholars have recently suggested. It's what makes us human. Technology is in our nature. Through our tools we give our dreams form. We bring them into the world. The practicality of technology may distinguish it from art, but both spring from a similar, distinctly human yearning."

Autodesk University 2014

Autodesk University
AU2014 Summary
December 2-4
Mandalay Bay, Las Vegas

Fusion 360 Digital Fabrication Workflows:

This course highlighted several features of Autodesk Fusion 360, a program that facilitates easy manipulation of 3D geometry that is otherwise difficult to achieve in Revit. After exporting a conceptual mass family from Revit to Fusion 360, an undulating wall form was created that could then be imported back into Revit, populated with curtain wall using adaptive components, and rendered in a perspective street view with a Google Maps background. That same form was imported into a program called Meshmixer, which provides advanced options for preparing an STL file for 3D printing and allows you to apply custom supports. Altogether, the integrated workflow across several software platforms was relatively seamless, and the course exhibited proof that very complex and customized geometry can be created in Revit for any project type.

Building a Good Foundation with Revit Templates:

Members of the architecture, engineering, construction, and manufacturing industries gathered for this round table discussion about best practices for starting a project in Revit. Two methods were compared: the use of a Template File and a Default Project. At Shepley Bulfinch, we use a template file at the outset of every project that contains a minimal set of views, families, and general standards to provide a good starting point. A default project has the advantage of carrying much more initial information, including pre-placed families and objects, but requires a significant time investment to keep the content current. Overall, the common sentiment in the room was that template files are the easiest to maintain and offer the most versatility for any project type. Additionally, it is preferable not to “front load” Revit models and start out with unnecessary file size, when one of the biggest challenges across all projects is keeping the model as small and responsive as possible.

Energy Analysis for Revit:

Are you familiar with the native energy analysis tool in Revit (hint: it’s on the Analyze tab of the ribbon)? This tool has the potential to be very helpful for early feedback to help drive the design. The task can be farmed out to the cloud for faster processing and for posting reports on multiple options. For more in-depth analysis, the Revit model can also be exported to the gbXML format and opened in Green Building Studio, a cloud-based energy simulation platform. Relatively specific configurations are required within a model for the analysis to run successfully, and one of the predominant takeaways of the course was the emphasis on modeling with the energy analysis outcome in mind from the start of the project.

Challenges of LEAN Design and Computational Analysis:

This very engaging roundtable discussion examined the emerging role of computational analysis and generative design in helping make more efficient design decisions. The keynote address at the beginning of the conference featured the use of "machine learning algorithms" in which information and constraints are entered into a computer and simulations are run to determine an optimal design outcome. To start this session, we identified wastes and ineffective behaviors within each profession and in the collaboration process between them. After a predictable round of architect-bashing, the question was posed: "Does computation and simulation allow us to come to confident solutions earlier in the design process and reduce waste?" If existing-condition information, user requirements, code constraints, and many of the other variables that influence the design process can be programmed to generate permutations, is this a promising direction for the future of the profession? The group came to the conclusion that computational analysis and simulation will never be reliable enough to deliver a comprehensive design solution but may be helpful in providing direction at challenging moments in the process.

Practical Uses For Dynamo Within Revit:

Dynamo is a visual programming environment that allows you to make custom changes within Revit and extract information otherwise unattainable with the native program features. The program utilizes a user-friendly graphic interface to make adjustments through the Revit API (the "back end" that contains all the building blocks for how the program functions). This course demonstrated many entry-level uses for Dynamo, including:

  • quickly making changes to all instances of a family type in a model (example: adjusting the offset height of all columns at once)
  • advanced family geometry (example: controlling profile order to create cantilevered and wrapped swept blends)
  • wrapping structure along curved surfaces
  • generating separate finish floor on top of slab automatically from room boundaries


Utilizing Revit Models for Architectural Visualization:

This course covered workflow and best practices for exporting a Revit model to Unity 3D, a game engine that enables real-time visualization and walk-throughs. The first step is preparing the Revit model for export by cropping down to only what you need with a section box and turning off unnecessary categories in Visibility/Graphics. Export the model to FBX and import it into 3DS Max, where materials, cameras, and lights are applied. Lastly, the model is imported into Unity, where perspectives, walk-throughs, and animations can be utilized. In summary, Unity 3D provides a compelling presentation piece that may appeal to some clients, but it is important to consider the time investment that goes into the preparation process.

Dynamic Energy Modeling:

An energy and environmental analysis consultant presented a multitude of methods for assessing daylighting, wind, weather, energy consumption and other performance characteristics of a design. Specific tools covered included eQUEST, Green Building Studio, Autodesk360 Lighting Analysis, raytracing and raycasting, Rushforth Tools Library, Autodesk Ecotect and more. Although these programs were generally too advanced for the level of in-house analysis we use at Shepley Bulfinch, I enjoyed learning about numerous ways information can be extracted from Revit and used to help inform the architectural design process.

Revit + Dynamo = The Building Calculator:

Beyond parametric modeling and making tweaks within Revit, Dynamo can be used to extract much of the information stored within a model. By using an "export to Excel" function, areas, quantities, dimensions, room lists, and much more can be exported and analyzed with the powerful tools Excel has to offer. Schedules can be created, complex building calculations can be scripted and automatically updated upon every change within the model, or checks and balances for code and zoning can be integrated to produce reports. Items can then be adjusted, renamed, or resized in Excel and pushed back into Revit to make direct changes to the model. Dynamo provides a giant step forward in the pursuit of harvesting the full potential of BIM.

The Great Dynamo Dig: Mine Your Revit Model:

With all this excitement surrounding Dynamo, did you know there is also a SQL export function? This allows for the creation of a much more comprehensive database that can be thoroughly organized using database management software and mined for analytics and appealing visual graphics in Tableau.