AEC Technology Hackathon 2016

Last month I had the pleasure of attending the fourth annual AEC Technology Symposium and Hackathon put on by Thornton Tomasetti's CORE Studio in New York City. The symposium kicked off with many fantastic speakers; I highly recommend checking out the full videos of the presentations over on the TT CORE Studio YouTube playlist. As with last year's symposium, I was personally most impressed with the work presented by Luc Wilson and Mondrian Hsieh, demonstrating the use of computational design and custom digital tools for urban planning and visual analysis with Kohn Pedersen Fox's Urban Interface.

This year was also my first ever participation in a hackathon. I registered with the goal of teaming up with technology enthusiasts and individuals from other disciplines to see if I could help develop a solution for some of the pain points frequently encountered during the design and documentation process. My hope was that I could leverage my Dynamo knowledge and my experience uncovering barriers in architectural practice, while learning something about coding bespoke applications and user interfaces from those more familiar with the software side of the industry.

Image courtesy of Thornton Tomasetti CORE Studio

The goals of the hackathon were simple...
This event is organized for programmers, web developers, and AEC experts to collaborate on novel ideas and processes for the AEC industry. The focus will be on digital/computational technologies that have been used on projects, the lessons learned from them, and how it impacted the overall project workflows. The Hackathon aims for attendees to learn new skills, generate new ideas and processes for the AEC community through data-driven design and customized applications.

Everyone had approximately 24 hours to assemble teams, formulate an idea, and get to work trying to create a prototype. After the 24 hours, each team was to report out on what they had created and a panel of judges would determine the winners. For the first hour, individuals from the group of 60 or so hackers had a chance to pitch their ideas and attempt to attract a team. Following introductions, everyone mingled and quickly decided which topics they found most interesting and figured out what skill sets were required to fulfill the goals of the project.

The team I joined forces with was drawn to an idea originally proposed by Timon Hazell:
When continuously exchanging Revit models among constituents on a building project, it is a time-consuming process to track down what changed between versions. In an ideal world, the architects, engineers, or consultants who are sending the updated model will write a summary or list the changes but this rarely actually occurs. Therefore, the traditional approach typically involves a painstaking process of opening both models simultaneously on separate monitors and spotting differences via visual comparison. Is there a better way to see what has changed between two versions of a Revit model and analyze just how much has changed throughout the project?

We ended up with a fantastic team of diverse perspectives to tackle this problem:
Me - represented the architecture side: knowledge of project delivery, recurring challenges, and opportunities for process optimization
Timon - represented the engineering side: spends significant time receiving and interpreting design intent from the architect with little documentation of changes
Charles - represented the consultant side: acoustician with decades of architecture experience, also regularly receives design intent from the architect and must interpret it
Matt - represented the software side: experience developing custom digital tools and troubleshooting prepackaged software solutions to enhance AEC production

From left to right: Kyle Martin, Matt Mason, Charles Prettyman, Timon Hazell

The first step was to define the problem: what are all the factors that constitute a change in a Revit model? After some brainstorming we identified 4 key change types:

  1. Elements added to the model
  2. Elements deleted from the model
  3. Family type or information parameter changes
  4. Geometry changes: location, shape, or size

We set out to create a two-part solution to this problem. First, a C# Revit add-in that essentially acts as a "diff" to compare all Revit elements between two models and generate a list of viewable items. Second, a JSON file and accompanying Dynamo workflow that would produce a data visualization for targeting concentrations of changes throughout the project.
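The core "diff" idea can be sketched in a few lines. This is a hypothetical illustration of the concept, not the actual add-in code: two model snapshots are represented as dictionaries keyed by a stable element id, and changes are classified into the types identified above (the field names `params` and `geometry` are assumptions for this sketch).

```python
# Hypothetical sketch of the model "diff" concept (not the actual add-in code).
# Each snapshot maps a stable element id to its recorded state.

def diff_models(old, new):
    """Classify changes between two element snapshots."""
    changes = []
    for eid in new.keys() - old.keys():
        changes.append({"id": eid, "type": "added"})
    for eid in old.keys() - new.keys():
        changes.append({"id": eid, "type": "deleted"})
    for eid in new.keys() & old.keys():
        if new[eid]["params"] != old[eid]["params"]:
            changes.append({"id": eid, "type": "parameter"})
        if new[eid]["geometry"] != old[eid]["geometry"]:
            changes.append({"id": eid, "type": "geometry"})
    return changes

old = {
    1: {"params": {"Type": "Basic Wall"}, "geometry": (0, 0, 10, 0)},
    2: {"params": {"Type": "Door A"}, "geometry": (5, 0)},
}
new = {
    1: {"params": {"Type": "Basic Wall"}, "geometry": (0, 0, 12, 0)},  # geometry change
    3: {"params": {"Type": "Window B"}, "geometry": (2, 0)},           # added
}
print(diff_models(old, new))
```

The resulting change records are exactly the kind of data that can be serialized to JSON for downstream visualization.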

Our trusty C# guru Matt immediately began coding the Revit add-in while the rest of the team created sample Revit models and cartooned out the data visualization component. After many hours of relentless coding, the first add-in prototype was ready to test. After a few rounds of troubleshooting, we were able to isolate the first list of altered Revit elements and export the first JSON file. The parameters associated with each Revit element contained within the JSON file allowed us to start building a Dynamo definition to restructure and visualize the data using the Mandrill package from Konrad Sobon. By early morning we had a working Revit add-in that mostly accomplished what we were looking for, and we began working out the kinks in the Dynamo workflow. As time began to evaporate in the final hours, we scrambled to test and troubleshoot the tools, assemble our presentation, and develop documentation. Ultimately we decided on the name Metamorphosis to represent the transformation of Revit models over time and their evolution into thoroughly-coordinated built form.

At the end of the Hackathon, approximately a dozen projects were presented to the judges in 5-minute maximum time allotments. Our team tried our best to efficiently explain the initial idea and walk the crowd through how the tools we developed were a viable solution that would be easy to deploy to the average Revit user. After some deliberation, the winners were announced and we were thrilled to find out that we took second place, in large part because of the practicality of the problem we chose and our willingness to share the solution as open source.

And the development didn't stop there...

Following the hackathon, the Revit add-in code was improved to fine-tune some of the desired features. In addition, the Dynamo definition was cleaned of its hackathon-induced spaghetti and properly labeled. And most importantly, everything was updated and organized into a GitHub repository.

INTRODUCING "METAMORPHOSIS" - An Open Source Revit Change Analysis Tool
Running the model comparison add-in results in a list of Revit elements that can be filtered and re-sorted. Clicking on the categories and individual elements adds them to the selection in the active view and zooms to their location.

List of changed elements sorted By Category (left) or By Change Type (right)

Clicking the Color Elements button will apply Override Color in View to all elements that fall under 3 change types:

  1. Green - New Elements
  2. Blue - Geometry Change (size or shape)
  3. Red - All Other Changes: modifications to parameters, location, rotation, etc.
The Color Elements feature works any view type: plan, RCP, section, 3D axon, etc.

For some of the change types, an Analysis Visualization Framework (AVF) object appears:

  1. A box for an element that has been removed
  2. Arrow(s) for an element that has changed location (in the case of elements like walls, there are two location points so you get two arrows)
  3. A symbol if an element has been rotated

On the Dynamo side, opening the .dyn file and browsing to the exported JSON file will process the accompanying data for visualization in Mandrill. Clicking "Launch Window" in the Report Window node to the far right will open up the interactive data visualization window containing 4 chart types:

  1. Donut Chart (upper left) - number of changes by element type
  2. Stacked Bar Graph (upper right) - number of changes by change type
  3. Bar Graph (lower left) - percentage of items changed vs. total number of items for each category
  4. Parallel Coordinates (lower right) - number of changes for each level, each overlapping line represents a different Revit element category
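Behind each of those charts is a simple aggregation of the exported change records. Here is a hedged sketch of that step in Python, with hypothetical field names (`category`, `changeType`, `level`) standing in for whatever the actual JSON schema uses:

```python
# Hedged sketch: aggregating exported change records (hypothetical field names)
# into the kinds of counts the Mandrill charts would display.
import json
from collections import Counter

records_json = """[
  {"category": "Walls",   "changeType": "geometry", "level": "Level 1"},
  {"category": "Walls",   "changeType": "added",    "level": "Level 2"},
  {"category": "Doors",   "changeType": "deleted",  "level": "Level 1"},
  {"category": "Windows", "changeType": "geometry", "level": "Level 1"}
]"""

records = json.loads(records_json)
by_category = Counter(r["category"] for r in records)       # donut chart
by_change_type = Counter(r["changeType"] for r in records)  # stacked bar graph
by_level = Counter(r["level"] for r in records)             # parallel coordinates

print(by_category, by_change_type, by_level)
```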


Key benefits of the workflow:

  • Colorize elements in any active view to quickly identify changes; far more efficient than visual comparison across monitors
  • Color by change type to target specific kinds of changes
  • Sorting, filtering, and element selection in the add-in interface allow for quick location and isolation of elements
  • Quickly evaluate where most changes are occurring with analytics and visualization; particularly useful when the model arrives with no documentation
  • Compare the current state to any previous model; helpful for telling the story of where, and how much, the design has changed over time
  • Beyond coordinating and viewing changes, the tool helps ensure you cloud revisions as you go if the drawing set has already been issued

Interested in trying this tool out? Here is where you can access the datasets and learn more:

GitHub Repository
DevPost page
Presentation slides
YouTube Screen Capture Demonstration

In the end I got exactly what I wanted out of the hackathon experience. I was able to work with three individuals who possessed completely different skill sets than my own. I provided the team with background context and understanding of the problem from an architectural perspective so that we could devise a technological solution. More specifically, Timon and I pushed ourselves to utilize tools that we would not regularly encounter in practice and capitalized on the opportunity to learn the Mandrill package for Dynamo, JSON data formatting, Revit add-in configurations, and establishing a GitHub workflow for sharing and maintaining associated files.

A huge thank you goes out to the Thornton Tomasetti crew who worked so hard to put on such a well-executed event. Thanks to the judges who volunteered their time to hear all of our frenzied, sleep-deprived 5-minute-plus presentations. Lastly, shout out to my teammates who all worked tirelessly to make our idea a reality!

Recapping an Unforgettable Week of Design Technology

On Monday, November 14 I took the stage in front of 80 eager faces to help explain what in the world Dynamo is and how it applies to the work that I do. For the fourth consecutive year, Autodesk University in Las Vegas began with a pre-conference Computational BIM Workshop. This year I received an invitation from the Dynamo production team (headquartered in Boston) to co-teach one of 3 sessions -- 2 Beginner and 1 Advanced. After a general introduction to what Dynamo is and how it functions, we covered topics such as: basic parametric principles, geometry generation, Revit element instantiation, Excel interoperability, and BIM parameter modification. It was really fun not only to teach such a large group of brand new adopters but to walk around the room and see the diversity of experience and computer skills. With nearly 250 combined workshop attendees and over 49 Dynamo course offerings at Autodesk University this year, it is clear that Dynamo is emerging as an essential tool in the AEC industry.

Monday night concluded with several networking events where I had the pleasure of meeting many renowned Dynamo enthusiasts from around the world with whom I had only communicated through social media, blogs, and email. Highlights included: many members of the Dynamo team, the founders of several other Dynamo user groups, the development team, and various other colleagues, consultants, computational designers, and technologists.

East & West Coast user groups unite. Me with Cesar Escalante, founder of the San Francisco Dynamo User Group.

On Tuesday I had the fortune of attending a full schedule of courses including:
Revit Analytics with Dynamo
Revit API for Designers -- Use Cases for Extending Creativity
Design Strategies with FormIt 360
The Future of BIM Will Not Be BIM -- and It's Coming Faster Than You Think

The entire trip culminated with my participation in a Dynamo Design Slam in front of a live audience in the Exhibit Hall. We were tasked with the following challenge:
Design the Las Vegas Strip’s newest attraction, hotel, or casino using Dynamo. But, do it live, on stage, Iron Chef-style against three other people, all in 30 minutes!

After a colorful introduction for each champion, in which I was affectionately dubbed "The Colossus of Color" (among other things), every move in the heat of competition was commentated by Anthony Hauck and Ian Keough.

Mid-Slam - all the competitors racing to finish their geometry using Dynamo.

Post-Slam - from left to right: Ian Keough, Me (competitor), John Pierson (Winner), Aparajit Pratap, Ritesh Chandawar, Ian Siegel (competitor), Racel Williams, Zach Kron, Colin McCrone (competitor), and Anthony Hauck. Image courtesy of Aparajit Pratap.

The competition was an absolute blast, but in the end I did not emerge the winner. Congratulations, John! 30 minutes is barely enough time to make anything significant in Dynamo, but here is the result I was aiming for:

Full Dynamo definition available HERE.

The moment the Design Slam ended, I hopped in an Uber for the airport and caught the red eye flight back to Boston to prepare for my looming ABX presentation...

After a short day of recovering from Autodesk University, I prepared for an 8am presentation at ABX2016 on Thursday, 11/17 with my colleague Jason Weldon [LinkedIn]. Our talk was titled "Arrested Development: Design Technology & Expediting Process" and showcased the use of various technological approaches for performing feasibility studies, data visualization, and design validation.

We started things off by exposing the audience to others' work that has inspired us over the last year and then walked through a case study that demonstrates how tools easily available to most offices--Revit, Dynamo, Excel, PowerBI, and web visualization--can significantly enhance and expedite tasks commonly encountered in architectural practice. Our intention was to help strengthen the Boston AEC community through our message that these tools are approachable and highly beneficial.

To learn more about our case study, please check out my blog post Automated Feasibility Project - Part 2.

In the months leading up to ABX2016, I joined forces with MakeTANK to help with the design of a pavilion for ABX. The MakeTANK is a committee at the Boston Society of Architects committed to exploring the surging role of digital fabrication, making, and material innovation in the Boston AEC community. The Pavilion was an overwhelming success and I was very proud to have participated in its conception alongside individuals from: Jaywalk Studio, Sasaki, Shepley Bulfinch, CW Keller, SMMA, Studio NYL, Bluebird, and many other Greater Boston Area offices. On Thursday afternoon I helped the team deconstruct the Pavilion to be stored for future appearances at other local design events.

For more information about the founding of MakeTANK at the BSA and the creative process that went into the realization of the Pavilion, you can read my blog post MakeTANK Pavilion.

DLR GROUP - Des Moines:
By Friday, 11/18 I found myself in Des Moines, Iowa preparing for some rest and relaxation on a personal vacation; however, it would not be a successful journey without stopping by DLR Group to visit my friend and fellow Design Technology enthusiast, Ryan Cameron.

Ryan generously offered the opportunity to be a guest in a series he has been running internally called DLR Dynamo Next. I enjoyed presenting virtually to a handful of offices about the value and ease of use of Dynamo. I even got to drop a plug for the Dynamo-litia and encourage participation in the global user community. It was really enjoyable to meet some new folks and share my experiences.

...And to my complete surprise, a few weeks later a coffee mug with my personal logo mysteriously arrived at my desk back in Boston. As an act of gratitude for my visit -- and a perfect example of classic Midwestern generosity -- it turns out that Ryan was the sender. Thanks Ryan!

In the end, the entire week was a whirlwind adventure but the experience of a lifetime!

Automated Feasibility Project - Part 2

One of the earliest personal research projects for me using Dynamo was the concept of automating the traditional feasibility study process in architecture. When working on commercial and multi-family residential projects, developer clients often explore several properties to see which have the most potential for financial returns on investment. The client typically requires a feasibility report in a matter of days or even hours to quickly gain insight on whether to pursue a property or not. Providing this data often requires us to maximize efficiency within the time allotted.

The feasibility study process is highly repetitive and ripe for automation. Regardless of the site, conditions such as lot size, zoning, code requirements, public right-of-ways, and various other factors set constraints that can serve as a starting point for a parametric model. Once built in Dynamo, a multitude of inputs can be flexed to explore different outcomes. More often than not the developer is looking for total area, number of levels, specific program mixture, floor-area-ratio (FAR), and other factors to create pro formas and evaluate the financial viability of pursuing the project. Having a flexible, easy-to-adjust parametric model allows for rapid test fits and option generation. Emerging web and data visualization tools further enable sharing and input with team members and clients.

In August of 2015, my colleague Jason Weldon and I made a first pass at building an approach using Dynamo based on experiences we had encountered in practice. This first Dynamo definition was error-prone but built a foundational understanding of how to structure for parametricism and how the resultant information could be extracted to satisfy the typical deliverable. It was this baseline project that inspired a second round of research for a presentation at Architecture Boston eXpo 2016. In preparation for ABX, we made the decision to specifically utilize a tool set readily available to most architecture offices: Revit, Dynamo, Excel, PowerBI, and web visualization.

The first step was to develop a system in Dynamo that could apply setbacks to any edge of a property shape regardless of size, angles, direction, and other characteristics of the geometry. The original attempt at this project revealed that using visible geometry -- points, lines, surfaces, solids -- was unreliable and at times computationally taxing. Dynamo currently has tools to evenly offset a shape in a chosen direction, but there is no easy way to pick and choose some edges to offset while keeping others in their original location. The most efficient approach would be to use vector calculations and plot the new corner point coordinates, given the property boundary lines and desired offset distances.

For any given corner, this mathematical strategy calculates the offset distance along the adjacent/opposite angles (SOH-CAH-TOA) and plots the new point location. After each corner is repositioned, a brand new polygon is established that represents the maximum build area for that site.
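The same result can be reached by offsetting each edge inward along its normal and intersecting the adjacent offset lines to plot each new corner. This is a hedged sketch of that vector approach, not the actual Dynamo definition; it assumes a counter-clockwise polygon, and all names are illustrative:

```python
# Hedged sketch of the vector-based variable setback idea: offset each property
# edge inward by its own distance, then intersect adjacent offset lines to plot
# the new corner points. Assumes counter-clockwise (CCW) vertex order.
import math

def intersect(l1, l2):
    """Intersection of two infinite lines, each given by two points."""
    (x1, y1), (x2, y2) = l1
    (x3, y3), (x4, y4) = l2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
    return (px, py)

def offset_polygon(points, distances):
    """points: CCW polygon vertices; distances[i]: setback for edge i -> i+1."""
    n = len(points)
    lines = []
    for i in range(n):
        (x1, y1), (x2, y2) = points[i], points[(i + 1) % n]
        dx, dy = x2 - x1, y2 - y1
        length = math.hypot(dx, dy)
        # Inward normal of a CCW edge is (-dy, dx) normalized
        nx, ny = -dy / length, dx / length
        d = distances[i]
        lines.append(((x1 + nx * d, y1 + ny * d), (x2 + nx * d, y2 + ny * d)))
    # Each new corner is where the previous and current offset edges meet
    return [intersect(lines[i - 1], lines[i]) for i in range(n)]

# A 10 x 10 lot: 3-unit setback on the first edge, 1 unit everywhere else.
lot = [(0, 0), (10, 0), (10, 10), (0, 10)]
build_area = offset_polygon(lot, [3, 1, 1, 1])
print(build_area)
```

The returned polygon is the maximum build area, ready to be extruded into massing.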

This Dynamo definition now becomes a universal mechanism for applying setbacks to any property shape. Polygons from software platforms such as Revit, Adobe Illustrator, or CAD can be fed into Dynamo to begin a feasibility analysis. As a test, I created an array of various 6-sided polygons and applied a different offset to 2 of the 6 sides. The resultant shapes could then be extruded to a maximum allowable height and produce floor plates at set floor-to-floor heights in order to assess the total area of each site.

Once the foundation was established for a Dynamo approach, we kicked off a case study by looking at the criteria of fitness for evaluating what constitutes a satisfactory design outcome. Factors like density, usable area, design aesthetic, budget, and function all play an important role in the success of a project, and it is crucial to set targets early on. A sample site was chosen and a list of site assumptions was compiled to begin establishing Dynamo constraints.

From this property shape we decided to isolate and compare 3 schemes for their proximity to a target FAR -- thus revealing the most efficient design. The steps in Dynamo included applying the required setbacks to the property boundary, specifying the maximum building width to allow for circulation and residential unit size, generating floor plates for as many levels as can fit within the maximum allowable building height, and locating the corridor and means of egress, which would eventually be subtracted from the overall residential area.
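The comparison itself reduces to simple arithmetic once the geometry is resolved. Here is an illustrative sketch with made-up numbers (not the actual case-study values) showing how close each scheme comes to a target FAR without exceeding it:

```python
# Hedged sketch of the scheme-comparison math; all areas are illustrative.

def scheme_far(footprint_area, max_height, floor_to_floor, core_area, lot_area):
    """Gross buildable FAR: (footprint minus core) times floor count."""
    levels = int(max_height // floor_to_floor)
    gross = (footprint_area - core_area) * levels
    return gross / lot_area, levels

lot_area = 20000.0  # sf, assumed
target_far = 3.0
schemes = {
    "Bar": scheme_far(8000, 70, 10, 1200, lot_area),
    "L":   scheme_far(9000, 70, 10, 1400, lot_area),
    "U":   scheme_far(10000, 70, 10, 1600, lot_area),
}
for name, (far, levels) in schemes.items():
    print(f"{name}: {levels} levels, FAR {far:.2f} (target {target_far})")

# The winner: the highest FAR that does not exceed the target.
best = max((s for s in schemes if schemes[s][0] <= target_far),
           key=lambda s: schemes[s][0])
print("Closest under target:", best)
```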

Creating a flexible and reliable Dynamo definition paves the way for further analysis and visualization. The file can be posted to the Dynamo Customizer -- a shareable web interface where others can orbit around the 3D model and change the input sliders to adjust the geometry in real time. The Customizer is an excellent way to keep team members or clients up-to-date with the latest schemes or elicit feedback, allowing them more agency in the design process. Another powerful tool that can be used -- Project Fractal -- runs your Dynamo definition through the cloud to analyze a design space of possible inputs and help the designer determine the configurations that best meet the criteria of fitness. For this case study we put the U-shape scheme in Project Fractal to test which arrangement of input dimensions results in a total Gross Area closest to the target FAR (without going over).

Check out this Project Fractal visualization for yourself HERE (must sign in with an Autodesk ID to access).

The geometry generated within Dynamo provides an excellent visual representation of early design explorations, but ultimately the geometry is only a placeholder for corresponding calculations of area and program mixture. Calculations from the geometry should be set up not only to establish a simultaneous readout within Dynamo but to allow for export to other calculation and visualization platforms such as Excel and PowerBI. For our case study, the finalized Dynamo geometry was exported to a blank tab in Excel and queried by a color-coded table of calculations. Based on the fixed Approximate Unit Areas for each residential unit type and desired Percentage of Total Units, the gross residential area extracted from each floor in Dynamo can be divided up to approximate the number of total units of each type and total beds per floor. At the very bottom of the table, the approximated unit mixture can again be compared to the target value to see how much area was omitted during the calculation process -- this example returned a total unit efficiency of 97%, not bad! Remember that this table only represents a preliminary study. As the design goes into Revit for bespoke modeling, and when a diversification of unit mixture among floors is desired, a similar strategy between Revit, Dynamo, and Excel can be established to evaluate the precision of those changes.
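The unit-mix arithmetic in that Excel table can be sketched as follows. All areas and percentages here are illustrative stand-ins, not the actual case-study values:

```python
# Hedged sketch of the unit-mix arithmetic described above (illustrative values).

floor_gross_residential = 12000.0  # sf per floor, as if extracted from Dynamo

unit_types = {            # approximate unit area (sf), share of total units
    "Studio":    (450, 0.20),
    "1-Bedroom": (650, 0.50),
    "2-Bedroom": (950, 0.30),
}

# Average area of one unit given the target mixture
avg_unit_area = sum(area * share for area, share in unit_types.values())
units_per_floor = int(floor_gross_residential // avg_unit_area)

# Approximate unit counts per floor, then the resulting efficiency
mix = {name: round(units_per_floor * share)
       for name, (area, share) in unit_types.items()}
used_area = sum(unit_types[name][0] * count for name, count in mix.items())
efficiency = used_area / floor_gross_residential

print(mix, f"efficiency {efficiency:.0%}")
```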

Raw data populated in Excel from Dynamo also opens up the option for more powerful visual analytics tools like PowerBI. PowerBI is a free tool from Microsoft that allows you to develop customized dashboards for any data source. Areas per level, unit mixture, program, FAR, and other feasibility metrics can populate easy-to-read graphics to share with the client and project constituents. If data is captured over time from all the design iterations, charts can be created that illustrate the evolution of the project.

"Hey, this doesn't look like architecture..." You're correct, read about this sample PowerBI dashboard HERE.

An important final step is the seamless transition from feasibility massing to a Revit model. The geometry established in Dynamo can easily be used to begin instantiating generic floor, wall, roof, glazing, and other Revit elements that kick-start the documentation process and eliminate the need for project staff to tediously translate the entire design.

Image courtesy of Shepley Bulfinch.

Image courtesy of Shepley Bulfinch.

To wrap up this case study, we decided to also address the use of Dynamo to test out various parking garage configurations. Several months ago I built a definition for evaluating the maximum possible parking spaces that could fit within given site dimensions. The system was designed for a single-helix structure and would adjust the number of parking bays in accordance with minimum code distances for parking spot size, drive aisle width, ramp slope, turn radius, etc. If a required number of spots is established during the feasibility study process for a particular development project, an array of garage layouts and dimension configurations could be fed through Project Fractal to see which combination arrives closest to the target quantity. While this does not necessarily design the entire parking garage automatically, it is an excellent means to rapidly assess dozens or even hundreds of possibilities and validate a particular decision.
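A deliberately simplified version of that capacity check is shown below. The code minimums are illustrative; a real definition would also model ramps, turn radii, column grids, and circulation losses:

```python
# Hedged back-of-the-envelope version of the parking capacity check.
# Dimensions in feet; stall and aisle minimums are illustrative assumptions.

def estimate_spaces(site_width, site_depth, levels,
                    stall_width=9.0, stall_depth=18.0, aisle_width=24.0):
    """Double-loaded bays: two stall rows plus one drive aisle per module."""
    module_depth = 2 * stall_depth + aisle_width   # 60 ft per double-loaded bay
    bays = int(site_depth // module_depth)
    stalls_per_row = int(site_width // stall_width)
    return bays * 2 * stalls_per_row * levels

spaces = estimate_spaces(site_width=180, site_depth=125, levels=4)
print(spaces)
```

Feeding ranges of these inputs through Project Fractal is what turns the single estimate into a searchable design space.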

So why is all of this research important? Feasibility studies are a perfect example of a regularly repeated process in architecture built on a list of simple variables. Computers and emerging digital tools offer the opportunity to create a formula for rapidly producing a similar outcome. Furthermore, test fits can be altered, iterated, and shared faster than ever before -- buying back time that can be re-invested in delivering more thoughtful and innovative design.

Shout out to Eric Rudisaile for helping me think through the final steps of the vector approach to variable property offsets in Dynamo. Vectors are extremely efficient, but their invisibility makes them tricky to troubleshoot; Eric helped me revisit math I hadn't used in a decade!

MakeTANK Pavilion

In June 2016 a new group emerged at the Boston Society of Architects dedicated to exploring the expanding role of digital fabrication and making in the AEC industry. Their mission was stated as follows:

MakeTank! supports the maker cultural revolution by exploring and advancing knowledge of making as an extension of architectural investigation through presentations, small group discussions and member-run training sessions with this friendly peer learning group.

MakeTANK Launch event at the BSA Space. Image courtesy of Chris Hardy.

By the second meeting, MakeTANK had identified the perfect opportunity to bring together designers and makers from all over Boston -- design a pavilion for Architecture Boston eXpo 2016! The group dove into materials, patterning, and precedents. After several weeks of discussion and preliminary ideas, I was asked if there is any way Dynamo can contribute to the modeling and testing phase of the design.

To be able to use Dynamo in any way to support the development of the pavilion, I first needed to gain a better understanding of the design concept. The team had decided to focus on a simple catenary arch shape for stability and ease of construction, with an emphasis on material and structural innovation. Early in the process, we realized that the use of any flexible material would create more propensity for bowing/buckling on the sides, so strategies for a wider and sturdier base were necessary.

I began by producing a few hand sketches -- as I do so often in preparation for complex tasks in Dynamo -- before deciding that the quickest way to test out my design ideas was to create a sketch model out of paper. This allowed me to play with the proportions and scale, and think through the parametric constraints that eventually evolved into a Dynamo "sketch model" featuring a more elegant solution to stabilizing the base with tent-flap-like elements.
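For reference, a catenary profile follows y = a*cosh(x/a). Here is a hedged sketch of how points along such an arch could be generated (the span and shape parameter are illustrative, not the pavilion's actual dimensions):

```python
# Hedged sketch: sampling points along a catenary arch, y = a*cosh(x/a),
# flipped so it reads as an arch with its apex at x = 0. Illustrative values.
import math

def catenary_points(span, a, samples=9):
    """Sample the catenary across [-span/2, span/2], apex up, feet at y = 0."""
    pts = []
    for i in range(samples):
        x = -span / 2 + span * i / (samples - 1)
        y = a * math.cosh(x / a)
        pts.append((x, y))
    low = min(p[1] for p in pts)            # cosh minimum, at x = 0
    rise = max(p[1] for p in pts) - low     # total height of the arch
    # Flip: height above ground at each x
    return [(x, rise - (y - low)) for x, y in pts]

arch = catenary_points(span=12.0, a=4.0)
print(arch)
```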

The Dynamo file for this "sketch model" can be downloaded HERE.

When the planning reached the consideration of materiality, Haik Tokatlyan and Steve Listwon from Jaywalk Studio came through with a brilliant idea for a rigid structure with flexible connection points that would allow for a gentle flexing of the pavilion, humorously referred to as the "Jiggle Factor" -- or "J-factor" for short. They developed a few mock-ups of an assembly that fused wooden members with cast silicone nodes by pouring the uncured silicone through holes in the ends of the intersecting wood pieces, then allowing it to cure into a hardened but pliable joint.

The flexible pavilion concept was a hit with the group and they set out to create more tests for the proper amount of flexibility, structural support, and steadfast adhesion to the wood. Finding the right durometer required mad-scientist experimentation with hardening agents, cure time, and temperature to strike the perfect balance. Ultimately Haik and Steve settled on using a urethane compound for its improved rigidity over silicone and the translucent red color was a nice bonus.

A casualty of the material testing process.

Another critical consideration to ensure structural whimsy and avoid a catastrophe was the orientation of the nodes around the structure. Several options were considered: arranging nodes in horizontal or vertical striations to provide stiffer rib segments; a gradient of increasingly flexible nodes to introduce more jiggle towards the top of the pavilion while maintaining stability at the base; and varying the stiffness of targeted joints to produce different movement throughout. Ultimately the team settled on an even distribution of nodes with a uniform durometer, leading to an emphasis on the chosen pattern for stability.

Many patterns were investigated in the effort to find equilibrium between aesthetics and structural integrity. Of all the triangle, rectangle, pentagon, and hexagon permutations, the classic Cairo pentagonal pattern was chosen as having an ideal balance of regularity and simplicity. Although at first the logic of the pattern seems to consist of simple overlapping elongated hexagons, there is actually a precise mathematical relationship between the angles and shapes. Dynamo does not yet have any nodes or custom packages for Cairo patterning, so I set out to develop a definition that could generate the pattern along any input surface. In the end I got something that resembled the pattern applied to a barrel vault but was not accurate enough for fabrication or structural analysis.

Image courtesy of Ilaria Giardiello.

The Dynamo definitions for these pattern experiments can be downloaded HERE.

Once materials, pattern, and node distribution had been decided, an easy way to test variations was to create physical models using the laser cutter. This approach allowed for rapid prototyping and much more flexibility in testing different pattern combinations. The addition of scale figures helped the team get a sense of the space underneath the vault and imagine how the pattern would eventually look at human scale.

When it came time to begin refining the parts, the folks over at Sasaki and Jaywalk Studio used a CNC Router to develop different versions of the wood members and carve molds out of high-density plastic for casting the urethane nodes. The CNC was also used to cut smaller components as the assembly became more intricate and a parts catalog was eventually created to keep track of all the pieces.

Image courtesy of Felipe Francisco.

Image courtesy of Brad Prestbo.

Image courtesy of Chris Winkler.

Early mock-ups helped inform material modifications and introduced the need for strategic reinforcement in certain portions. With a looming deadline, a smaller segment of the pavilion was constructed using finished pieces to test the resilience of the materials under the stress of extreme jiggle and make final tweaks.

Image courtesy of Haik Tokatlyan.

Following the final testing phase, the team gathered in several evening and weekend sessions to assist in the casting, cutting, sanding, and small parts assembly processes.

Image courtesy of Brad Prestbo.

Image courtesy of Brad Prestbo.

In the week leading up to ABX, a few final meet-ups helped work out the kinks of assembling the finalized components and choreographing the installation at the convention center. This was also the first time the team got to see the full creation and test out the J-factor.

Image courtesy of Brad Prestbo.

Image courtesy of Sasaki.

The pavilion ultimately turned out to be an overwhelming success at ABX2016. With prime real estate near the entry to the Exhibit Hall, many attendees were instantly drawn to the bright pop of red color and the tantalizing sway of the canopy. It served as a perfect venue for several talks and presentations throughout the week, including live demonstrations of the casting process by Jaywalk Studio and occasional presentations of the fabrication process from the MakeTANK team. Additional pedestals and pieces of furniture throughout the space further promoted methods of making and provided an inviting reprieve from the busy showroom floor.

I am very proud to have been a contributing member of this project and thoroughly enjoyed meeting and working alongside colleagues from all over the Greater Boston Area who share a passion for craft and design. Keep an eye out for more inspiring creations to emerge from MakeTANK in the future!

Image courtesy of Jessica Purcell, Shepley Bulfinch.

A BIG thank you goes out to Brad Prestbo and Ryan Salvas, who founded the MakeTANK group, and especially to Brad for all of the organization and leadership during the pavilion project. This thing would never have been possible without the generous contributions of local companies and individuals who donated money, materials, and countless hours of personal time. I would also like to thank my employer Shepley Bulfinch for graciously co-sponsoring the pavilion and providing funds to make it a reality.

For more information about the founding of MakeTANK at the Boston Society of Architects and the creative process that went into the realization of the pavilion, I highly recommend checking out the profile pieces written by the Neue Guild -- MAKETANK: PART 1 and MAKETANK: PART 2 (coming soon, stay tuned). While you're over there, you might as well create a user profile and join the community of talented designers and the collaboration opportunities they regularly promote.

Space Planning Update

In October 2015, an approach I developed for space planning and programming using Dynamo was posted on the Dynamo Blog and now resides as a Use Case on the Dynamo website -- you can read about the original version in THIS BLOG POST. Since then, I have corresponded with dozens of individuals who have reached out with questions and clarifications about the workflow. As one of my earliest attempts at annotating a Dynamo definition, the flood of inquiries I received soon after pointed out some basic mishaps in the notes. In addition, the original Dynamo file was created in Dynamo 0.8.2, and the combination of new Dynamo version releases and custom package upgrades caused the occasional headache. Despite the minor glitches and clarifications, I have proudly watched this workflow continue to execute flawlessly over time with little to no modification and inspire many others to adapt it to their specific needs.

One of the first people to share with me how they had put this approach to work was Brian Nickel who applied it to his studies in the architecture program at Montana State University.

Image courtesy of Brian Nickel. Read more about this image on his blog The Revit Saver.

Nearly a year later, Brian followed up with an impressive demonstration of how the same functionality in Dynamo could be piped into FormIt 360 for reshaping into architectural form. This is an excellent example of how massing shapes generated from a program list can seamlessly evolve into architecture without a lengthy translation process across mediums.

Image courtesy of Brian Nickel.

Another fantastic example of how the space planning workflow can be utilized came from Ryan Cameron, who showcased the use of Flux to populate the program and visualize the blocks.

Image courtesy of Ryan Cameron.

With the recent debut of within-node list management functions in Dynamo release 1.2.0, an update to the Space Planning definition was long overdue. Zach Kron came through with an adaptation that drastically reduced the overall number of nodes and satisfied the majority of user needs with simple object spacing and an override-color-in-view (rather than material) approach.

Image courtesy of Zach Kron.

I finally found the time to evaluate, annotate, and update the original definition, and sample datasets can be obtained HERE. Some re-organization and general cleanup occurred, but the new definition essentially functions the same way as the original.

Now how could I possibly compare the number of nodes in each version to see how much more efficient the update is? I remembered that a while ago I saw a Tweet from Andreas Dieckmann showing how to evaluate the composition of nodes used by parsing the .dyn file with Dynamo... genius! Analyzing both the original and new .dyn files reveals nearly 25% fewer nodes (32 total) in the new version.
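
Dynamo 1.x .dyn files are plain XML, so the node census can also be reproduced outside of Dynamo with a few lines of Python. A sketch, assuming the 1.x schema where each node is a child of the Elements element (Dynamo 2.x switched to JSON and would need a different parser):

```python
import xml.etree.ElementTree as ET

def count_nodes(dyn_xml):
    """Count the node elements in a Dynamo 1.x .dyn file (XML format)."""
    root = ET.fromstring(dyn_xml)
    elements = root.find("Elements")  # nodes live under <Elements> in 1.x
    return 0 if elements is None else len(list(elements))

# A stripped-down stand-in for a real .dyn file:
sample = """<Workspace Version="1.2.0">
  <Elements>
    <Dynamo.Graph.Nodes.CodeBlockNodeModel nickname="Code Block"/>
    <Dynamo.Graph.Nodes.ZeroTouch.DSFunction nickname="Point.ByCoordinates"/>
  </Elements>
</Workspace>"""
print(count_nodes(sample))  # 2
```

Running the same function over the old and new definitions and comparing the two counts gives the reduction percentage directly.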

Parsing and comparing the two .dyn files. A sample definition can be obtained HERE.

I am thankful to the always-amazing Dynamo community for its open source spirit and insatiable curiosity. I am glad the space planning definition has helped so many, and I look forward to seeing people show off their future interpretations.

Fuzzy String Matching with Dynamo

Eric Rudisaile and I recently collaborated on a post for the Dynamo Blog highlighting a package he recently published called FuzzyDyno. This package includes a few nodes for fuzzy string matching, an algorithmic approach to approximating matches between words. Tools like this are a perfect example of how computer science principles are gradually being incorporated into Dynamo through the hard work of independent developers, advancing its usefulness for architectural production. Read the full article HERE.

Dynamo-litia Boston - October 2016

This second Dynamo-litia workshop featured a live demonstration of modifying Revit parameters using Dynamo.

Workshop Description:
Back by popular demand! This session will showcase several practical workflows for everyday Revit production. If you are still wondering how Dynamo applies to the work regularly performed in architecture firms, this is the perfect chance to find out. The majority of the meeting will be devoted to a live demonstration and attendees will be encouraged to follow along. No prior Dynamo experience necessary; users of all levels welcome.

When: October 20, 2016
Where: BSA Space - Boston

More information at the Boston Society of Architects.

Apologies for the abrupt ending. The battery on the recording device died right before resolving and fully explaining the Element.SetParameter function but this video contains 99% of the relevant content. Presentation slides and datasets can be downloaded HERE.

Ceiling Alignment Made Easy(ier) with Dynamo

Rectangular ACT ceiling grid alignment is a task that frequently occurs during the design and documentation of architecture projects in Revit. After a ceiling has been placed, the typical approach involves creating a dimension string between two parallel walls and one of the gridlines in the ACT ceiling, selecting the dimension, and clicking the EQ symbol that automatically centers that gridline between the surrounding walls. This process must then be repeated for the perpendicular orientation.

For a more efficient workflow using Dynamo:

  • create a new Generic Model family
  • in the family editor, go to the Manage tab > Object Styles
  • in the Object Styles menu under Model Objects, click New under Modify Subcategories
  • name the new object DYNAMO and change the color to something bright that will be easily identified in the RCPs
  • in elevation, create a new reference plane and connect a dimension string between this and the Ref. Level
  • in plan, draw a rectangle the same size as one ACT tile (1’x1’, 2’x4’, etc.)
  • also draw a line at the midpoint of each direction to determine the center point of one tile
  • make sure that the lines are assigned to the upper reference plane so that they will be positioned near the ceiling
  • in the Family Types dialogue, create a new parameter for Offset Height — this will be assigned to the dimension string in elevation and will determine the offset distance from the floor to the ceiling. It can be a Type parameter (the same for EVERY instance of the family) or an Instance parameter, which would place the family at a default height and then allow adjustments for ceiling height variation in the project.

Not all projects are perfectly orthogonal. In some cases, there may be a defined angular shift in portions of the building, if not many unique angles. A secondary group of lines could be copied and pasted to the same place, then assigned On/Off visibility parameters for orthogonal and angled variations of the family. An instance parameter for the angle would allow for a custom rotation of up to 90 degrees on every instance in the project.

Once your family is completed, save and load into the Revit model. A Dynamo definition can then be built that targets ceiling elements in the model, queries their center point, and places the alignment family.
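
The center-point query at the heart of that definition is simple bounding-box math. A minimal sketch of just that step in Python (the Revit API calls and node names are omitted; this only shows the geometry):

```python
def bounding_box_center(points):
    """Center of the axis-aligned bounding box of a ceiling's boundary points.

    In the actual graph this is a couple of Dynamo nodes; the math is just
    min/max averaging in each axis.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    return (
        (min(xs) + max(xs)) / 2,
        (min(ys) + max(ys)) / 2,
        (min(zs) + max(zs)) / 2,
    )

# A 20' x 10' rectangular ceiling placed 9'-0" above the level:
corners = [(0, 0, 9), (20, 0, 9), (20, 10, 9), (0, 10, 9)]
print(bounding_box_center(corners))  # (10.0, 5.0, 9.0)
```

For rectangular ceilings this center matches the room center, which is exactly where the alignment family should land.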

Ceiling Alignment - Dynamo Definition (hi-res image available HERE)

I chose to specifically isolate ceilings by Type and also by Level. This helps cut down on the computation power required and the time it takes the task to run. Another advantage is being able to open an RCP view, watch the families instantiate, and verify that everything has been configured correctly in Dynamo.

After the alignment families have been placed, users on the team can begin the task of manually aligning the ceiling grids to the red box. If it is determined that there is not sufficient space between the perimeter walls and the nearest gridline, the centerline crosshair of the family can be used instead to perfectly center the ceiling grid in the room.

Because the lines in the alignment family have been created with the name DYNAMO under the Generic Models category, it is easy to turn off their visibility throughout the project for final documentation via the View Template. Additionally, if ceilings have shifted and the positioning of the families becomes obsolete over time, it is easy to select one instance, right-click and select all instances in the view or project, then delete them entirely.

Compared to a minimum of 5 mouse clicks for the traditional process, selecting the Align tool (or typing the AL hotkey), picking a line on the alignment family, and then picking a gridline requires a few fewer clicks. Multiplied over dozens or even hundreds of ceilings throughout the project, this approach is vastly more efficient and removes the need for extra decision making.

One must still manually account for the minimum distance between the perimeter walls and the grid. And obviously this approach is not ideal for L-shaped and irregular ceiling profiles. However, the majority of ceilings in projects are rectangles, and Dynamo can help production staff quickly work through an entire RCP full of ceilings so they can apply their time to other pressing matters.

Dynamo-litia Boston Turns 1!

This week Dynamo-litia Boston celebrated its One Year Anniversary. To celebrate, I used Dynamo to generate a virtual birthday cake.

Here are some of the highlights of the first year:

First session: September 21, 2015

7 Presentations:

  1. Introduction to Dynamo
  2. Dynamo for Production
  3. Dynamo and the Evolution of BIM
  4. Dynamo for All
  5. Dynamo and the Zen of Data Flow
  6. Work Smarter Not Harder
  7. Bringing Engineers & Architects Together Through Digital Design

1 Workshop:
Revit parameter export
Panelized surface & analysis

Did you know there is an entire Vimeo album devoted to the Dynamo-litia?

Dynamo-litia Boston Album

6 videos
1,777 Plays
51 Finishes
Average time per view: 34m,06s

Top 10 Countries:
US, UK, Spain, Canada, Brazil, Australia, Netherlands, Italy, Singapore, Germany

Lastly, this year would not have been possible without the contributions of many. Special thanks to:

Boston Society of Architects:
Conor MacDonald
Sara Garber
Revit User Group

Autodesk - Dynamo Team

Shepley Bulfinch

Zach Kron - Autodesk
Kevin Tracy - NBBJ
Christina Tully – Shepley Bulfinch
Masha Pekurovsky – Perkins Eastman
Eric Rudisaile - Microdesk
Timon Hazell - Silman

Most importantly, the Boston AEC Community! Looking forward to future sharing and collaboration.

Dynamo-litia Boston - September 2016

This installment of Dynamo-litia featured Timon Hazell, Sr. BIM Engineer at Silman (Washington DC).

Bringing Engineers and Architects Together Through Digital Design
Design changes that took weeks to coordinate are now happening in hours. We are now able to create new iterations of complex designs in seconds. This speed has its benefits, but it also adds complexity to current collaboration practices. How can we work better as a single design team? How can we use conceptual abstract models to generate documentation models? How can we model non-planar framing directly in Revit? You know the answer to many of these involves Dynamo! Join us as Timon Hazell from Silman shares his experiences and talks through a few case studies using Revit, Rhino, Dynamo and Grasshopper.

When: September 22, 2016
Where: Shepley Bulfinch - Boston

More information at the Boston Society of Architects.

Due to A/V difficulties, a few portions of the presentation did not make the video. To follow along AND see upcoming announcements, make sure to download the presentation slides HERE.

Automated Room Placement From Existing Drawings

Sometimes the only resource for existing conditions on a project is a set of scans of original architectural drawings, often produced decades earlier. Scanning the drawings converts them into a digital form, but these are flattened images from which no smart information can be extracted. Tracing walls, stair locations, and other building elements on top of a linked image underlay is relatively easy in Revit. Rooms, however, present a more difficult challenge because they are represented only by a text label, and therefore many rooms may occupy the same open space. For example, a corridor may contain several appendages or alcoves that do not have physical elements separating them. This task becomes much more time consuming when placing a few hundred rooms across several levels of an existing building, which I was recently tasked with.

I began this investigation by converting the PDF scans into JPG files and opening them in Photoshop, where I could quickly isolate the room names. Once isolated, I used a combination of a Gaussian blur, an inverse selection, and a black color fill to convert the room name locations into larger black blobs, then exported each floor as a new JPG.

In Dynamo, import the JPG containing the black blobs and scan the image for black pixels using the Image.Pixels and Color.Brightness nodes. You may have to try out several pixel values — using a large number in the xSamples input will generate too large a pixel array and take a long time to run, while using too small a number may cause the node to miss some of the black blobs.

Rooms from Existing - Dynamo Definition (hi-res image available HERE)

The Color.Brightness node returns a list of values between 0 and 1 that correspond to the brightness found in each pixel. Using some list management techniques, the list is inverted, then all white pixels (0s) are filtered out and only the darkest values (now the largest) are used to isolate the points where text values are located on the existing plan. Circles are created at all of the remaining points, with the radius defined by the corresponding darkness values. The entire list of circles should be matched against itself to group all intersecting circles, because the Color.Brightness node may have read multiple black pixels in each text blob. Then extrude all the circles as solids, intersect any joining geometry, and use the Solid.Centroid node to determine the center point of each solid, which in theory should be the location point of each text label on the existing plans. A Count node can be used to evaluate the resulting number of text location points and determine whether the total count of black blobs read in Dynamo closely matches the total number of room names labeled in the PDF scan.
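
The filter-group-centroid chain can also be sketched outside the node environment. This hypothetical Python version uses a flood fill to group adjacent dark pixels instead of intersecting circles, but the outcome (one centroid per text blob) is the same:

```python
def blob_centroids(brightness, threshold=0.5):
    """Find centroids of dark blobs in a 2D grid of brightness values (0-1).

    Mirrors the Dynamo logic: drop light pixels, group adjacent dark pixels,
    then average each group's coordinates.
    """
    rows, cols = len(brightness), len(brightness[0])
    seen = set()
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if brightness[r][c] < threshold and (r, c) not in seen:
                # Flood-fill one blob of adjacent dark pixels.
                stack, blob = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and brightness[ny][nx] < threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                centroids.append((sum(y for y, _ in blob) / len(blob),
                                  sum(x for _, x in blob) / len(blob)))
    return centroids

# Two dark blobs (0.0) on a white field (1.0):
grid = [
    [1.0, 0.0, 0.0, 1.0, 1.0],
    [1.0, 0.0, 0.0, 1.0, 1.0],
    [1.0, 1.0, 1.0, 1.0, 0.0],
]
print(blob_centroids(grid))  # [(0.5, 1.5), (2.0, 4.0)]
```

The length of the returned list plays the role of the Count node: compare it against the number of room labels on the scanned plan.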

If the counts are significantly different, adjust the pixel value sliders at the beginning of the definition to create a more or less dense field of points and re-trace the process up to this point. If the counts are close, the next step is to query the traced existing conditions walls from the Revit model for the corresponding level. This can be done using a combination of GetAllElementsOfCategory with the Walls category and then filtering for only the elements on the same level.

Once the walls appear in Dynamo, you will most likely see that the resulting points from the existing plans will not match the scale, orientation, and location of the Revit walls. Placing Geometry.Scale and Geometry.Rotate nodes between the list of solids and the Solid.Centroid node will allow you to experiment with various scale and rotation values prior to the creation of the final points. After some trial and error and visual approximation, you should be able to scale and orient the cluster of points to a configuration that matches the scale of the Revit model -- each point should look like it lands in the center of each room.

Even after scaling and rotating the points, they may still be located off to the side of the Revit walls. To coordinate locations between Revit and Dynamo, begin by going to Manage > Additional Settings > Line Styles in Revit and create a new line style called Dynamo. In the floor plan view of the level you are working on, pick a prominent element (such as a corner of the building), draw a model line, and change it to the newly created Dynamo line style. Back in Dynamo, use the Select Model Lines by Style node from the Archi-Lab package to locate the "Dynamo" line in Revit. Get the location of the start of that line using the Curve.StartPoint node and the Dynamo origin with the Point.Origin node. A vector can now be established between these two points and used to translate all of the room locations scanned from the JPG to the same location as Revit. Note that moving from the Dynamo origin to the line drawn at the Level in plan should also move the room points to the correct elevation.
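
The scale, rotate, and translate adjustments amount to a simple transform chain. A sketch of the math, assuming 2D points and illustrative values (the real graph does this with Geometry nodes):

```python
import math

def align_points(points, scale, angle_deg, line_start):
    """Scale about the origin, rotate about the origin, then translate so the
    Dynamo origin lands on the reference line's start point.

    Mirrors the scale / rotate / translate chain in the graph; the values
    used below are illustrative, not from an actual project.
    """
    a = math.radians(angle_deg)
    out = []
    for x, y in points:
        sx, sy = x * scale, y * scale                  # scale
        rx = sx * math.cos(a) - sy * math.sin(a)       # rotate
        ry = sx * math.sin(a) + sy * math.cos(a)
        out.append((rx + line_start[0], ry + line_start[1]))  # translate
    return out

# One room point, scaled 2x, rotated 90 degrees, moved to the marker line:
print(align_points([(1.0, 0.0)], 2.0, 90.0, (10.0, 5.0)))
```

Because the translation is applied last, the scale and rotation trial-and-error can happen first without disturbing the final registration against the model line.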

Once everything appears to line up correctly, the Tool.CreateRoomAtPointAndLevel node from the Steam Nodes package will place rooms at each point in the Revit model. For open areas such as corridors, multiple rooms will now be overlapping, potentially causing Warnings. The last step is to go through the model and draw room separation lines at logical points where divisions should occur between the room elements.

After every room has been separated in Revit, the process of populating Room Names, Numbers, and other parameters is a manual one. However, if you are lucky enough to possess a CAD file for the existing conditions, Dynamo can also be used to populate the newly-created Room elements with text parameters. Begin by importing the CAD file into the Revit model and moving it to the correct location. As a best practice, I tend to delete and purge all unnecessary layers in the CAD file prior to import -- in this case you may save a one-off version containing room text only and import that instead. In Revit, if you explode an imported CAD drawing, the text will become actual Revit text objects (learn more HERE). With Dynamo, all text objects can be grouped into clusters based on shared location, matched up with the Room element center points, and then used to populate the associated elements with parameter information: Name, Number, Space Type, etc. Due to incongruent alignment or the use of leader lines, this workflow will most likely not work for every text item from the CAD plan, but it may alleviate a large portion of the manual data entry required to populate the Revit room elements. More about this process in a future post...
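
The location-based matching of CAD text to rooms can be approximated with a nearest-neighbor lookup. A sketch with made-up names and coordinates:

```python
def match_labels_to_rooms(room_points, labels):
    """Assign each room center point the nearest CAD text label.

    labels is a list of (text, (x, y)) tuples. A real version should also
    reject matches beyond a distance tolerance, since leader lines can pull
    text far from the room it describes.
    """
    matches = {}
    for i, (rx, ry) in enumerate(room_points):
        best = min(labels,
                   key=lambda l: (l[1][0] - rx) ** 2 + (l[1][1] - ry) ** 2)
        matches[i] = best[0]
    return matches

rooms = [(0.0, 0.0), (50.0, 10.0)]
cad_text = [("CORRIDOR 101", (48.0, 11.0)), ("LOBBY 100", (1.0, -2.0))]
print(match_labels_to_rooms(rooms, cad_text))
# {0: 'LOBBY 100', 1: 'CORRIDOR 101'}
```

Once each room has its label text, splitting the string into Name and Number before writing the parameters is straightforward list management.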

Although a Dynamo-based approach requires some trial and error, it allows you to quickly place a large quantity of Revit Room elements in the exact same location as a scanned drawing. Knowing that the room locations are correct allows for quicker naming and parameter manipulation using Dynamo or other means and reduces a portion of monotonous work.

RTCNA2016 Recap - "Computational Design for the 99%"

Several weeks ago I had the privilege of presenting at Revit Technology Conference – North America 2016. My presentation frequently repeated the phrase “Because Nobody Went to Architecture School to…” We have all been there at some point in our career – continuously repeating the same manual alteration to a Revit model, changing parameter information one click-at-a-time, or performing tedious data entry for hours on end – these are the moments when you wonder if the practice of architecture is not exactly what you dreamed about in architecture school. For all the advancements that BIM has introduced to the AEC industry, production, validation, and maintaining uniformity of information are still difficult undertakings. Tasks that require hours and days of individual modifications are not always professionally rewarding and monopolize time that could better be spent on the overall quality of the design and documentation. I often tell colleagues that if you find yourself thinking, “There has to be a more efficient way to do this”, chances are good that Dynamo can help.

I did not come from a computer programming background but instead began teaching myself Dynamo to address specific problems frequently encountered in Revit. After achieving a basic understanding of how Dynamo works, I was able to investigate tasks of increasing complexity that began with simple changes to the model and evolved to automating entire processes. As my Dynamo experience continued to grow I began exploring ways that Revit could interact with other software platforms and how data could be manipulated and visualized. My skillset eventually evolved to where I understood more advanced concepts of geometry and parametricism for design but this was all built on the foundational knowledge acquired from researching daily production tasks.

In my presentation I proceeded to share a sample of workflows that respond to specific challenges encountered on projects and tell the story of tedious task automation and process improvement for architectural practice. A highlight was the opportunity to collect data on a very large healthcare project, which I developed into a workflow for tracking Revit model metrics. The goal was to look for correlations between various model metrics and how long it takes to sync or open the model — one of the most significant factors on workshared projects, because the extra seconds and minutes it takes to sync a slow model, multiplied by all the users on the project, add up to many hours of lost productivity over the course of the project. Dynamo is used to track the overall size of the .RVT file, query and count various elements and categories, parse the Warnings export file, then export all the information to an Excel file. In addition to collecting these general model metrics, the Dynamo task updated two additional spreadsheets with every warning in the model over time and every placed family in the model over time. All three of these spreadsheets were linked into Microsoft Power BI along with data from IMAGINiT Clarity's Model Metrics tool, which tracks the time it takes to open the model over time. Over the course of three months, I ran this Dynamo definition on a daily basis for a total of 68 exports.

The final takeaway will not be a surprise to those who are familiar with Revit model performance... the data revealed that Auditing and Compacting the model, as well as Purging Unused Families, had the most overall impact on the time it takes to open and sync the model. Although this may not be a significant breakthrough, these real-time analysis tools help monitor the health of the model and indicate when the best time to intervene may be.

The last step was to find an easy way to communicate the status of the model to the production team. Since it is the responsibility of the Model Lead on the project to audit the Central file, Warnings are the only characteristic that individual team members have the opportunity to impact. The project from which this data was collected happens to be a children's hospital, so we placed an image of a Minion on the Message Board with a visibility parameter tied to the number of Warnings. The final Dynamo task overwrites the Warning count parameter in the Revit model and the Minion changes accordingly. Now when the team opens the model at the beginning of a workday, if the Minion is purple the Warnings have exceeded 400 and some time needs to be set aside to resolve them.
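
The Minion logic itself is just a threshold check on the warning count. A tiny sketch (the 400-warning purple state comes from our setup; the intermediate tier and state names are invented for illustration):

```python
def minion_state(warning_count, thresholds=((400, "purple"), (200, "yellow"))):
    """Pick the status image to show based on the model's warning count.

    thresholds is checked highest-first; anything at or below the lowest
    tier returns the default happy state.
    """
    for limit, state in thresholds:
        if warning_count > limit:
            return state
    return "happy"

print(minion_state(450))  # purple
print(minion_state(120))  # happy
```

In the model itself, the same comparison drives Yes/No visibility parameters on the image families rather than returning a string.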

In the end RTC was an excellent experience. I thoroughly enjoyed sharing my perspective and bonding with my fellow colleagues from all over the world.

Special thanks to everyone who helped contribute to my work:
RTC & Committee
Shepley Bulfinch
Jim Martin
Jim Chambers
Jessica Purcell
Christina Tully
Margaret Gammill
PJ Centofanti
Jamie Farrell

Warnings by Level

With many people working in a Revit model at once, it is challenging to keep the time it takes to sync to Central and navigate between views to a minimum. A proven way to keep the model performing quickly is to regularly resolve warnings and keep the total number as low as possible. But this is easier said than done; project teams face many barriers when it comes to suppressing the number of Warnings, including:

  • novice users may be intimidated by the Warnings manager interface
  • team members may intentionally avoid the tedium of inspecting issues one-at-a-time
  • a solitary BIM manager may not be able to keep up with the rate that the team creates warnings
  • unfamiliarity with the entire project may make it difficult for the individuals to make critical decisions
  • priority may go to production and deadlines for which the results are more immediately discernible than warning resolution

What if there was a way to make this process easier and evenly distribute the responsibility among team members?

When working on larger architecture projects in Revit, it is common to assign portions of the building to different team members. Depending on the program, individuals may be responsible for specific areas or entire floors. This means that Dynamo can be used to parse the Warnings export file and assign the elements in conflict to separate users familiar with those levels.

Here is how the Dynamo process works:

  1. in Revit, go to the Manage tab, click on the Warnings button, and export the file
  2. open the .html file in Excel, select all cells, unclick the Merge button, and save as .xlsx — this step is necessary because Warnings where two elements are in conflict occupy two merged rows, and we will eventually need only one of the two elements
  3. in Dynamo, import the .xlsx file
  4. use list management to isolate the first of two line items for the Warnings that involve two elements then extract the unique ID number at the end
  5. using the Revit.ElementByID node from the Rutabaga package or Id to Element node from the Archi-lab/Grimshaw package, actual Revit elements can be selected
  6. the Element.Level node from the Clockwork package extracts the Revit level that each element lives on. Note that not all elements have an associated level because several Revit Warnings may apply to elements that span the entire project (system families?) like…?
  7. filter out any elements that do not return a Level and sort the list by Level
  8. export all of the warnings associated with each individual level to separate sheets in a new Excel spreadsheet using nodes from the Bumblebee package
  9. point the project team to where the Excel file is saved; they can navigate to the tabs for the levels they are responsible for, then isolate the elements causing each warning in Revit by copying the ID number from the ELEMENT column in Excel, going to the Manage tab > Select by ID, pasting the ID, and clicking Show
  10. it is wise to have the floor plan for that level already open or else it may take several minutes for Revit to automatically open a view to show the element(s) in conflict
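
Steps 3 through 8 can be sketched outside of Dynamo as well. In the plain-Python version below, a dictionary stands in for the Id-to-Element and Element.Level lookups, and the row format is illustrative only (the real export's wording varies by warning type):

```python
import re
from collections import defaultdict

def group_warnings_by_level(rows, id_to_level):
    """Group warning rows by the level of their referenced element.

    rows are strings from the Warnings export, each ending in an element id
    (e.g. "... : id 123456"); id_to_level maps element ids to level names.
    """
    by_level = defaultdict(list)
    for row in rows:
        match = re.search(r"id (\d+)\s*$", row)
        if not match:
            continue
        level = id_to_level.get(int(match.group(1)))
        if level is not None:  # skip elements with no associated level
            by_level[level].append(row)
    return dict(by_level)

rows = [
    "Walls : Basic Wall : Interior : id 101",
    "Walls : Basic Wall : Interior : id 102",
    "Pipes : Pipe Types : Standard : id 103",
]
levels = {101: "Level 1", 102: "Level 2"}  # 103 has no associated level
print(group_warnings_by_level(rows, levels))
```

Each per-level list then becomes its own worksheet in the exported Excel file.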

Warnings by Level - Dynamo Definition (hi-res image available HERE)

sample of Excel file export

After browsing through the exported Excel file, some of the floors may not have very many Warnings, meaning that it is unnecessary to interrupt the entire team. An advantage of using Dynamo to re-structure all of the warnings according to their associated level in Revit is that the Excel file can also be used to visualize the results with data analytics tools. Using Power BI, a visual dashboard can be created and quickly updated on-demand with Dynamo to show which levels are accumulating a high number of Warnings; then only the team members responsible for those areas need to set aside time for resolution. Another advantage is the ability to keep track of the frequency with which certain levels require model maintenance. If one level is continuously encountering a high number of Warnings, perhaps there is an opportunity for training or a quick refresher on best practices to avoid discrepancies? Maybe allocating an additional staff member to those levels will help avoid mistakes caused by haste? These are the data stories that begin to reveal themselves through visualization.

Warnings by Level visualization overlaid on project building section

Specifics on the PowerBI process:

  1. download the Synoptic Panel visual from the Power BI custom visuals library
  2. import the Synoptic Panel into your Power BI dashboard
  3. use Sqlbi's online tool for generating your own maps -- all you have to do is drag in an image file and create zones
  4. make sure the zone names match the values in your data exactly, or Power BI will not be able to match them
  5. to apply color blocks to an architectural building section, save the section as an SVG file and drop it into the visual in Power BI
  6. plug in your data and begin to customize
  7. lastly, assign states, compare matched and unmatched, or set a gradient
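Step 4 is the usual stumbling block: zone names in the SVG must match the level names in your data exactly. As a sanity check, the SVG can be scanned for zone ids and compared against the data before it ever reaches Power BI. This is an illustrative sketch; the SVG snippet and level names are invented:

```python
import xml.etree.ElementTree as ET

# A made-up building-section SVG with one zone (id) per level.
svg = """<svg xmlns="http://www.w3.org/2000/svg">
  <rect id="Level 1" x="0" y="40" width="100" height="20"/>
  <rect id="Level 2" x="0" y="20" width="100" height="20"/>
  <rect id="Roof"    x="0" y="0"  width="100" height="20"/>
</svg>"""

# Level names as they appear in the exported warning data.
data_levels = {"Level 1", "Level 2", "Level 3"}

# Collect every id attribute in the SVG, regardless of element type.
zone_names = {el.get("id") for el in ET.fromstring(svg).iter() if el.get("id")}

unmatched_data = data_levels - zone_names   # levels with no zone to color
unmatched_zones = zone_names - data_levels  # zones that will stay unmatched

print("data rows without a zone:", sorted(unmatched_data))
print("zones without data:", sorted(unmatched_zones))
```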
LEFT: States - number of warnings colorized by how urgently they need to be addressed, CENTER: Matched/Unmatched - level contains warnings if blue and does not contain any warnings if white, RIGHT: Gradient - shade of teal varies according to number of warnings

Often there is simply not enough time to get to every Warning. Maybe the team or model manager has a limited number of hours to invest in maintenance, or a user has an intermittent hour of free time and asks how they can help. For these scenarios, asking the team to work through the Warnings menu is not an efficient use of time and may also result in element ownership conflicts. Instead, Dynamo can be used to sort the entire list of Warnings from easiest to hardest to resolve, regardless of which Level they live on. Or, after sorting Warnings by Level (see above), each sublist can be further sorted by the easiest items to resolve. The resulting list can then be exported to Excel and organized or even color-coded from easiest to resolve to most time-consuming. Some of the easiest Warnings for users of all experience levels to resolve are:

  • Elements have duplicate 'Number' values.
  • Elements have duplicate 'Mark' values.
  • Room is not in a properly enclosed region.
  • Multiple Rooms are in the same enclosed region. The correct area and perimeter will be assigned to one Room and the others will display "Redundant Room..."
  • There are identical instances in the same place. This will result in double counting in schedules.
  • Highlighted walls overlap. One of them may be ignored when Revit finds room boundaries. Use Cut Geometry to embed one wall within the other...
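One simple way to implement the "easiest first" sort, shown here as standalone Python rather than the actual Dynamo graph, is to rank each warning by its position in a hand-maintained easy-to-hard list, with unknown warnings falling to the bottom. The ranking below is illustrative, not an official ordering:

```python
# Warning types listed from easiest to hardest to resolve (illustrative).
EASY_FIRST = [
    "Elements have duplicate 'Number' values.",
    "Elements have duplicate 'Mark' values.",
    "Room is not in a properly enclosed region.",
    "There are identical instances in the same place.",
    "Highlighted walls overlap.",
]

def difficulty(warning_text):
    """Lower rank = easier to resolve; warnings not in the list sort last."""
    for rank, easy in enumerate(EASY_FIRST):
        if warning_text.startswith(easy):
            return rank
    return len(EASY_FIRST)

# Hypothetical warning texts as exported from Revit.
warnings = [
    "Highlighted walls overlap. One of them may be ignored...",
    "Elements have duplicate 'Mark' values.",
    "Line is slightly off axis and may cause inaccuracies.",
    "There are identical instances in the same place.",
]

for w in sorted(warnings, key=difficulty):
    print(w)
```

The same key function could be applied per-level sublist to produce the "sort by Level, then by ease" variation described above.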
Warnings sorted and colored in Excel pivot table according to easiest to resolve

All of these workflows demonstrate how increased familiarity with Dynamo leads to rethinking traditional Revit approaches and discovering new ideas for efficient production. No longer are we trapped within the confines of the default Warnings manager; tools like Dynamo, PowerBI, and others allow for immediate, seamless Revit analytics. Most importantly, these tools can be distributed at scale across all BIM projects in the office and help empower users of varying experience levels to improve model management efforts, ultimately resulting in increased Revit performance and significant time savings.

This post was produced in collaboration with Jessica Purcell, Application Support Specialist at Shepley Bulfinch. Her PowerBI and data analytics expertise made all of the above dashboards possible.

Dynamo-litia Boston - July 2016

This month's Dynamo-litia meeting featured special guest Eric Rudisaile, Solutions Specialist at Microdesk.

Work Smarter Not Harder
The AEC industry is changing; thriving companies are learning to leverage technology and incorporate automation. The power of computers can remove obstacles in traditional design workflows while improving the quality and consistency of the design. How can companies re-examine their processes to find opportunities where technology increases the value of their day-to-day design process, and add new tools to their toolkit? Eric Rudisaile from Microdesk will share his perspective on how the plethora of computational tools available can each contribute in their own way to revolutionize the way you work, and how to choose what’s best for your company.

The video and presentation slides are available HERE.

More information at the Boston Society of Architects.

Dynamo-litia Boston - June 2016

This month's Dynamo-litia meeting featured special guest Masha Pekurovsky, Design Technology Specialist at Perkins Eastman (NYC).

Dynamo and the Zen of Data Flow
Dynamo data flows from left to right: START --> FINISH. No, this presentation is not going to teach you how to reverse Dynamo gravitation, but rather how to make the best out of it.

In the past two years we have seen the emergence of Bumblebee, Rhynamo, Dyno, and Operating via Dynamo. These packages enable powerful collaboration workflows between Revit, Rhino, Excel… This means that you and your team can develop custom project-specific workflows more easily than ever before!

This month’s Dynamo-litia presentation is dedicated to contemplating readily available Dynamo-enabled interoperability workflows.

The video and presentation slides are available HERE.

More information at the Boston Society of Architects.

Dynamo-litia Boston - Workshop 1

This two-part workshop kicked off with a live session on basic Revit interaction — how to access elements and information, altering parameters, and interoperability with Excel — plus an introduction to Dynamo builds and the Package Manager. The second half of the meeting explored the geometry side of Dynamo — establishing a parametric system and applying analysis criteria.

The presentation slides and datasets are available HERE.

More information at the Boston Society of Architects.

Stairway to... Mt. Washington

Every March a group of Shepley Bulfinch employees (and friends) makes an excursion to the summit of Mount Washington. Situated in the Presidential Range of the White Mountains in New Hampshire, Mount Washington is notorious for unexpected weather changes. Average temperatures for the month of March are a high of 20 degrees and a low of 5, with sustained wind gusts of up to 60 mph at the summit that can easily create a subzero windchill. Hiking at the tail end of Winter often means that crampons, ice axes, interchangeable layers, and many pairs of dry gloves are required equipment.

The hike itself lasts approximately 7-9 hours, covering 10.5 miles and nearly 4,250 vertical feet. After many months of hibernation during the blustery Boston Winter, serious training is necessary to prepare for the quad-burning ascent. Beginning 6-8 weeks prior to the climb, individuals begin hiking the egress staircase of the office building with reams of paper in their packs to simulate the weight of equipment and provisions carried on the day of the climb. This year everyone began recording their training sessions to stir up a little friendly competition and here are the results...

There are 16 floors in the office building and, after accounting for the stairs, landings, intermittent stretch breaks, etc., an average of approximately 736 steps per lap (all the way up and down once). It is therefore estimated that the entire team collectively climbed roughly 349 laps and 11,168 floors, taking 257,269 total steps.

It looks like Monday, Wednesday, and Friday were overwhelmingly popular for making the time to fit in stair training around busy work schedules and other commitments.

Here is the breakdown of number of laps per person:

...And an analysis of the average time per lap:
Notice that as weight and the number of laps increased, lap times often slowed significantly.

We lucked out this year with sunny, warm weather for the entire hike. Everyone summited successfully and it was another fun-filled team experience.

Special thanks to our fearless guide David Meek, who year after year invests hours of preparation and tireless effort to ensure a safe journey for all!

Dynamo-litia Boston - April 2016

Dynamo is a proven tool for modifying Revit information and automating repetitive tasks, yet figuring out where to get started can be an intimidating process. Please join us for the first in a two part series on how to begin using Dynamo. We will focus on what it is useful for and highlight several introductory workflows that can be understood with everyday Revit knowledge. Since Dynamo is new to the majority of the local AEC community, we will discuss how regular project challenges can be opportunities to explore principles and grow knowledge. For those who are already experts of other software platforms, see how Dynamo has made the process of transferring geometry and information easier than ever before.

The video and presentation slides are available HERE.

More information at the Boston Society of Architects.

SketchUp to Revit with Dynamo

Despite the possibility of creating complex geometry in Rhino or logic-driven forms with Grasshopper or Dynamo, the vast majority of designers in corporate architecture firms use SketchUp for 3D modeling and visualization. A common request is for someone to "convert" a SketchUp model to Revit but by no means is this an easy task.

I was recently asked if there is a quick way to transfer a custom designed furniture item that had been modeled in SketchUp to Revit for documentation purposes. I discovered the SketchUp to Dynamo package created by Maximilian Thumfart and built a tool that converts SketchUp geometry to a new Revit Furniture family as a freeform object. This allows you to quickly populate Revit with elements from SketchUp and annotate.


  1. Copy and paste only the geometry you want to export into a fresh SketchUp file.
  2. Similar to 3D printing, this workflow requires organized, continuous surfaces. It helps to break a model down to its shape in plan, then trace the outline with smooth arcs and straight lines free of fractured line segments.
  3. Extrude and finalize the shape; it should remain ungrouped. Save the file.
  4. Open the Dynamo definition, browse to the SketchUp file, select the type of Revit Family you want to create (i.e. Furniture, Generic Model, etc.), and click Run.
  5. In Revit go to the Architecture Tab > Component and place the family in the model. Lastly, make sure to save the family into your project folder by clicking Edit Family and going to Revit > Save As > Family.
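A quick way to test whether geometry meets the "organized, continuous surfaces" requirement in step 2 is the watertightness check familiar from 3D printing: in a closed mesh, every edge is shared by exactly two faces. This standalone sketch uses a made-up cube rather than actual SketchUp output:

```python
from collections import Counter

def boundary_edges(faces):
    """Return edges NOT shared by exactly two faces (empty = watertight)."""
    edges = Counter()
    for face in faces:
        for i in range(len(face)):
            # Walk each face loop; sort vertex pairs so direction is ignored.
            a, b = face[i], face[(i + 1) % len(face)]
            edges[tuple(sorted((a, b)))] += 1
    return [edge for edge, count in edges.items() if count != 2]

# A closed cube: 8 vertices (referenced by index), 6 quad faces.
cube = [
    (0, 1, 2, 3), (4, 5, 6, 7),
    (0, 1, 5, 4), (2, 3, 7, 6),
    (1, 2, 6, 5), (0, 3, 7, 4),
]

print("closed cube open edges:", boundary_edges(cube))       # none
print("cube missing a face:", boundary_edges(cube[:-1]))     # leaks reported
```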


  • there are several ways to do this using the default export and import tools in SketchUp and Revit; however, the Dynamo approach is much quicker and bypasses the handling of multiple file formats.
  • cleaning up the SketchUp geometry will result in smaller, more efficient Revit families.
  • this approach could also work for larger-scale uses like full building massings for design coordination and "tracing" with Revit elements.


  • these families are NOT parametric, meaning they do not have adjustable dimensions or on/off visibility features. However, by going to Edit Family, dimension parameters, on/off parameters, material parameters, and other constraints can be added post-process.
  • oftentimes iterative SketchUp modeling is a messy process. Unless the geometry you are attempting to import is super organized, this method may fail.

This approach may not be an ideal permanent workflow, but it does the trick for representation and as a placeholder for proper families to be built at a later time. More importantly, it opens a lot of doors for larger-scale design facilitation.