Converting Space Planning Families to Revit Room Elements


A few weeks ago at the Dynamo-litia Boston meeting, a fantastic question was raised:

After using the space planning objects for programming in Revit, what happens when you need to transition that information to rooms?

This poses a legitimate challenge because the space planning objects are families, and the associated information needs to be transferred to rooms - fundamentally different Revit elements.

Just days after the Dynamo-litia meeting I was presented with the opportunity to transform a Revit model full of intricately-arranged space planning objects across multiple Levels into Revit room elements. With some effort in Dynamo, I found a way to extract the name, department, and other parameters from each space planning object and generate Revit rooms in exactly the same spot with corresponding information.

Areas to Rooms_definition.png

Using this workflow, 639 programming families became Revit rooms in just under 90 seconds.

The Dynamo definition collects all instances of the Space Planning families in the model, groups them by Level, and locates the centroid of each. Using the Tool.CreateRoomAtPointAndLevel node from the Steamnodes package, a room is placed at each centroid with its associated level.
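
For those curious what the room-placement step looks like under the hood, here is a rough sketch of the equivalent logic in a Dynamo Python node. The input handling and variable names are assumptions for illustration, not the Steamnodes source code.

```python
# Sketch only: place a Revit room at each centroid on its associated level.
# Assumes IN[0] is a list of Dynamo Points (family centroids) and IN[1] is a
# matching list of Revit Levels. Not the actual Steamnodes implementation.
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import UV

clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

doc = DocumentManager.Instance.CurrentDBDocument
points = IN[0]
levels = [UnwrapElement(lvl) for lvl in IN[1]]

TransactionManager.Instance.EnsureInTransaction(doc)
rooms = []
for pt, level in zip(points, levels):
    # NewRoom expects a UV location on the level's work plane
    rooms.append(doc.Create.NewRoom(level, UV(pt.X, pt.Y)))
TransactionManager.Instance.TransactionTaskDone()

OUT = rooms
```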

The one caveat to successful conversion is taking the time to model Revit walls between the space planning objects. Consistency in setting up walls between objects and maintaining thorough parameter information within the objects will eliminate the possibility of creating Not Enclosed or Redundant rooms. Worst case, if rooms are overlapping within the same enclosed area after running Dynamo, you know that the centroid of the room is at the center point of the Space Planning family and you can add additional walls between those centroids to create separation between room elements.

The last step is to remove the space planning objects from the model (or hide their visibility), and then you can run Tag by Category to tag all rooms on each floor.

Dynamo-litia Boston - November 2015


Dynamo for Production

For all of the advancements that BIM has provided to the AEC industry, operational constraints in Revit consistently challenge the efficiency of production. The Dynamo visual programming add-in for Revit has been widely praised for providing a means to overcome logistical limitations. With the potential for gathering and restructuring information and elements from the model, Dynamo adds an additional layer of control and automation to every project. How are firms using the technology to optimize production? What are the most common tasks being automated?

The video and presentation slides are available HERE.

More information at the Boston Society of Architects.

Bespoke Brick Design


MODULAR FACADES WITH DYNAMO
A few weeks ago I presented how Dynamo can be used for bespoke brick design.

I chose "Cloaked in Bricks" by Admun Design & Construction Studio as a precedent to replicate using Dynamo.

Images courtesy of Archdaily - http://www.archdaily.com/775030/cloaked-in-bricks-admun-design-and-construction-studio

Since a brick is a modular unit, creating an array and selectively rotating targeted bricks is a simple exercise using visual programming and parametricism.
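
To make the idea concrete, here is a purely illustrative Python sketch (not the definition from the presentation) that builds a running-bond grid of bricks and rotates each one based on its distance to an attractor point; all dimensions and limits are assumed values.

```python
# Illustrative sketch only: a grid of bricks whose rotation is driven
# by distance to an attractor point, similar in spirit to the facade study.
import math

BRICK_LENGTH = 0.215   # meters, assumed standard brick
BRICK_HEIGHT = 0.065
MAX_ROTATION = 45.0    # degrees, assumed design limit

attractor = (3.0, 2.0)  # arbitrary point on the facade

def brick_rotation(x, y, falloff=2.5):
    """Rotation (degrees) decays with distance from the attractor."""
    d = math.hypot(x - attractor[0], y - attractor[1])
    return MAX_ROTATION * max(0.0, 1.0 - d / falloff)

placements = []
for row in range(40):                        # 40 courses
    offset = (row % 2) * BRICK_LENGTH / 2    # running bond
    for col in range(30):
        x = col * BRICK_LENGTH + offset
        y = row * BRICK_HEIGHT
        placements.append((x, y, brick_rotation(x, y)))

# Each (x, y, angle) tuple could then drive a brick family placement in Revit.
print(placements[:3])
```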

The resultant arrangement of bricks can then be easily imported into Revit for documentation.

Dynamo provides limitless opportunity for design customization. This example would be complex to construct, but if you want to explore simple brick patterning or color variation, this is an excellent methodology. Furthermore, this approach can be applied to any modular facade system, including other block types, stone, metal panel, perforated metal, precast concrete, etc. Start to finish, this entire exercise took approximately 4 hours to produce and document.

Dynamo for Digital Fabrication


CUSTOM MAGAZINE RACK
Recently I set out to create a custom magazine rack in an effort to cover up an unsightly electrical panel in the living room of my apartment. My first attempt at modeling design ideas in SketchUp proved to be very time consuming because every slight adjustment to the design required re-modeling entire portions of the geometry. Eventually I came up with the idea of building a parametric model in Dynamo that could not only explore many iterations with minimal adjustments but also simulate materials and assembly strategies.

To begin, I created a solid form in Dynamo that reflected one of the designs I had refined during earlier studies in SketchUp. I then generated two additional solids and applied a boolean difference to carve out a mail slot on the top and a slot for magazines on the front. I set the magazine slot at an angle so that magazines could easily be inserted and the top portion of the covers would remain exposed to display their titles.

Another key consideration was how the rack would mount. I ended up settling on a 12” metal picture rail with a 1/2” front lip that could be inserted into a slot in the back of the rack and attached to the wall. To incorporate the slot, I created another solid for the void space that the picture rail would require and again used a boolean difference function to subtract from the full design.

Dynamo_full definition_FINAL.png

For materiality, I just so happened to have an excess supply of 1/8” thick Baltic Birch plywood lying around so I decided the form would be nicely defined by evenly spaced vertical wood fins. To simulate the wood elements in Dynamo, I created planes set perpendicular to the wall mounting surface, copied each plane 1/8” to imitate the thickness of the material, and introduced a parameter for spacing between the pieces of wood to evaluate visual density. The planes were then used to intersect the solid geometry and leave behind the isolated slices of the original form as a representation of the wood. The parametric constraints facilitated quick adjustments and counts in the Dynamo definition ensured an economy of materials by balancing the desired look with a reasonable amount of wood used.
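
The spacing study itself is simple arithmetic. Here is a minimal sketch of that calculation with assumed dimensions rather than the actual rack measurements:

```python
# Minimal sketch of the fin spacing math with assumed dimensions.
RACK_WIDTH = 14.0        # inches, assumed overall width
FIN_THICKNESS = 0.125    # 1/8" Baltic Birch plywood
GAP = 0.75               # assumed clear spacing between fins

# How many fins fit across the rack at this spacing?
fin_count = int((RACK_WIDTH + GAP) // (FIN_THICKNESS + GAP))
used_width = fin_count * FIN_THICKNESS + (fin_count - 1) * GAP

print(fin_count, "fins, occupying", round(used_width, 3), "in of", RACK_WIDTH)
# Adjusting GAP trades visual density against the amount of plywood used.
```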

The next step was to add cross-bracing for structural stability. End cap pieces were added to the top and bottom as well as a support piece in the center of the rack tucked alongside the picture rail. To guarantee precise and equal spacing of all the vertical baffles, intersecting geometry was subtracted and tolerances were added to interlock all the pieces together.

Dynamo_ends_snip.png

After all the geometry was finalized, each piece was rotated and flattened onto the same plane for export to SVG. I opened the SVG file in Adobe Illustrator and used the vector lines to laser cut all the individual components out of corrugated cardboard to construct a mock-up. The SVG file format can also be exported to a DWG or DXF from Illustrator for use in AutoCAD.
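
For anyone reproducing this without an SVG exporter, the flattened profiles can also be written out directly; the sketch below uses placeholder geometry, not the actual rack profiles.

```python
# Minimal sketch: write flattened cut profiles to an SVG file that
# Illustrator (or a laser cutter driver) can open. Geometry is placeholder.
def polyline_to_svg_path(points):
    return "M " + " L ".join("{:.3f},{:.3f}".format(x, y) for x, y in points) + " Z"

profiles = [
    [(0, 0), (14, 0), (14, 10), (0, 10)],            # e.g. an end cap
    [(16, 0), (30, 0), (30, 10), (22, 10), (16, 6)], # e.g. one fin outline
]

paths = "\n".join(
    '  <path d="{}" fill="none" stroke="black" stroke-width="0.01"/>'.format(
        polyline_to_svg_path(p)) for p in profiles)

svg = ('<svg xmlns="http://www.w3.org/2000/svg" width="32in" height="12in" '
       'viewBox="0 0 32 12">\n{}\n</svg>'.format(paths))

with open("rack_cut_sheet.svg", "w") as f:
    f.write(svg)
```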

rack_mock-up.png

The mock-up revealed that some of the tolerances were too loose, causing the magazine rack to easily lean from side-to-side as a result of inadequate interlocking of the cross-bracing. The advantage of building a parametric model in Dynamo is that I was able to reduce the tolerances of all geometry in the model ever-so-slightly for a perfectly snug fit.

The final result was a magazine rack that is both functional and sculptural. This was an excellent learning opportunity for digital fabrication and, most importantly, provided yet another purpose for using Dynamo.

What I Use Dynamo For Nearly EVERY Day


PARAMETER CHANGE CUSTOM DEFINITION

Working on architectural projects in Revit requires changes to the parameters of model elements on a daily basis. If only a handful of items need to be adjusted, the parameters can easily be modified by clicking the elements one-at-a-time. Yet oftentimes decisions are made on the project requiring the change of dozens or even hundreds of elements throughout the model - an extremely time-consuming process when executed individually. Common methods for making mass changes to model elements are through Revit Schedules or by exporting parameter information to outside applications such as Excel via Ideate BIMLink and similar add-in products. However, there is an even easier way to accomplish this...

Using the Dynamo visual programming add-in for Revit, parameters associated with Revit elements can be isolated and changed with the simple arrangement of a few nodes. This method is much faster and provides greater control than can be achieved from schedules and other tools because specific criteria of the parameters can be targeted and isolated. Once the Dynamo definition is built, the tool can be re-used every time future parameter changes are necessary. For this reason, I implement this approach nearly every day on projects. This tutorial will show you how to build a parameter change definition of your very own.

STEP 1 - set up the definition:
The four nodes on the far left are: the Revit element category that you want to make changes to, the specific parameter that needs to change, the existing name or value, and the new name or value. The name of the parameter and the existing values must match the exact spelling and capitalization used in the model. These values can be verified by going to the Revit model and selecting one of the elements to be changed or by creating a schedule.
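
For readers who prefer code to nodes, the same four inputs map to a short Dynamo Python node along these lines. The category, parameter name, and values below are placeholders; this is a sketch of the idea, not the node-based definition shown in the screenshots.

```python
# Sketch of the parameter-change logic as a Dynamo Python node.
# Category, parameter name, and values below are placeholders.
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import FilteredElementCollector, BuiltInCategory

clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

doc = DocumentManager.Instance.CurrentDBDocument

param_name = "Comments"      # must match Revit spelling/capitalization exactly
existing_value = "Option A"
new_value = "Option B"

elements = FilteredElementCollector(doc)\
    .OfCategory(BuiltInCategory.OST_Doors)\
    .WhereElementIsNotElementType()\
    .ToElements()

TransactionManager.Instance.EnsureInTransaction(doc)
changed = []
for e in elements:
    p = e.LookupParameter(param_name)
    if p and not p.IsReadOnly and p.AsString() == existing_value:
        p.Set(new_value)
        changed.append(e.Id)
TransactionManager.Instance.TransactionTaskDone()

OUT = changed
```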

Parameter Change_1_original definition_edit.jpg

STEP 2 - package the definition into a custom node:
Drag a window around all the nodes to select them, then go to Edit > Create Custom Node.

Parameter Change_1_original definition.jpg

STEP 3 - name the custom node:
Fill out the Name of the node and write up a brief description of what it does. The Category is what the node will be filed under in the Dynamo library for future use.

Parameter Change_3_custom node details.png

STEP 4 - place custom node on the canvas:
This is what the finished product looks like.

Parameter Change_4_custom node.png

STEP 5 - input changes:
When creating a custom node from a selection, Dynamo will automatically assign inputs to all open variables. To make a tool that can easily be understood by other future users, it may be beneficial to change the names of the Input nodes.

STEP 6 - putting the custom node to use:
Match the requisite Revit element Category, parameter name, and values to quickly change all instances of a particular element in your model.

ABX2015 Workshop - Computational Design on Every Project

Image: courtesy of Paul Kassabian

Architecture Boston Expo 2015
Hosted by The Boston Society of Architects (BSA)
Boston, MA
November 19th, 2015

Welcome to Tomorrow:
Computational Design on Every Project
Paul Kassabian - Simpson Gumpertz & Heger (SGH)

Computational design technology allows designers and engineers to eliminate the repetitive "grunt work" inherent in every project. Automation was introduced during the Industrial Revolution and digital programming for automation arrived with the invention of computers, resulting in considerably increased efficiency for many industries. Tools for producing buildings have evolved immensely over the years - from analog to digital and now ever-changing software platforms - therefore it is imperative to stay flexible and ready to adapt to emerging technologies. Ultimately, the key is making tools work for us in order to ensure continuous improvement of the process for design and construction.

Current design software can be organized into three approaches:
1. Geometry - Rhino & Revit
2. Logic - Grasshopper for Rhino & Dynamo for Revit
3. Performance - software & custom add-ins for: daylight studies, structural analysis, building performance, etc.

And most computational approaches can be broken into three main categories:
1. RAD - Rapid Automated Drawings
2. RAY - Daylighting Analysis
3. RAP - Rapid Analysis & Planning

While Paul Kassabian appreciates the large and complex projects he often works on, some of his projects have been at smaller scales, such as pavilions and installations, where more experimental liberties can be taken. Regardless of size, invaluable lessons can be learned on any project.

One of the most interesting workflows presented was using custom code with Grasshopper and Rhino to generate structural beam and girder layouts for establishing maximum economy of materials. Structural members not only have prescribed sizes and connection methods but they also have an associated cost. By parametrically constraining the dimensions of each piece and including calculations for spacing and arrangement, the Grasshopper definition runs through hundreds of possible configurations simply by changing a few number sliders.
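
The general pattern is easy to sketch outside of Grasshopper as well. The following is purely illustrative, with assumed sizing rules and unit costs rather than SGH's actual numbers: sweep the beam spacing for a single bay and report the cheapest configuration.

```python
# Illustrative only: sweep beam spacing options for a simple bay and
# compare estimated steel weight/cost. Sizing rule and cost are assumed.
BAY_WIDTH = 30.0      # ft
BAY_DEPTH = 40.0      # ft (beam span)
COST_PER_LB = 1.20    # assumed installed cost, $/lb

# Assumed placeholder rule: heavier beams as tributary width (spacing) grows.
def beam_weight_plf(spacing_ft):
    return 12.0 + 2.2 * spacing_ft

results = []
for n_beams in range(3, 13):
    spacing = BAY_WIDTH / (n_beams - 1)
    beam_lbs = n_beams * BAY_DEPTH * beam_weight_plf(spacing)
    girder_lbs = 2 * BAY_WIDTH * 55.0          # assumed girder size
    cost = (beam_lbs + girder_lbs) * COST_PER_LB
    results.append((n_beams, round(spacing, 2), round(cost)))

best = min(results, key=lambda r: r[2])
print("cheapest option (beams, spacing ft, est. cost $):", best)
```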

VIDEO - Parametric Steel Frame and MEP Layout

Even more impressive, a graph at the lower portion of the floor plan displays how Estimated Steel Cost fluctuates based on changes to the inputs (minute 1:35). This is a powerful way to show cost sensitivity during the design process.

Another great project presented was a Grasshopper script that automatically generated a schematic structural layout while massing objects were manipulated in Rhino. This would allow an architecture team to model various design schemes and instantaneously understand the structural requirements. Rather than waiting days or weeks to get feasibility and cost estimates back from the structural engineer, this approach helps the architects design with structure in mind. The structural information from the finalized design can then be imported into Revit to provide a head start for the engineering team.

VIDEO - Master Planning Phase - Structural and Facade Systems Development

Lastly, Paul discussed the advantages of co-location over the traditional production process. On projects where he has the opportunity to work alongside the architect, he is able to gain a deeper understanding of the design and customize computational design tools that accommodate the intricacies of the overall vision. As opposed to the more traditional approach of exchanging digital files, the increased communication and flexibility from co-location has led to some of the most creative and alluring projects.

More information about the session can be found HERE.
About Paul Kassabian

AEC Technology Symposium 2015

AECTS2015_nametag.jpg

AEC Technology Symposium 2015
Hosted by Thornton Tomasetti (NYC)
Baruch College
September 25th, 2015

RESEARCH & DEVELOPMENT IN AEC SESSION 1:
Measurement Moxie
Christopher Connock - Kieran Timberlake

Christopher Connock emphasized the importance of approaching each architecture project as an experiment — an opportunity to test new technology and ideas — a philosophy that Kieran Timberlake incorporates into all of its projects. They are particularly exploring the frontier of data capture using a wireless sensor network to gather building performance analytics. The comparison of plant diversity and placement against soil moisture and temperature sensors in a green roof can help assess drainage, plant health, and solar gain over time. Temperature and relative humidity sensors can be used to investigate an entire space or focus on a particular application such as the performance of materials in a building envelope. Hundreds of different sensors placed throughout a building can track and transmit environmental changes across a given day or even across seasons. Kieran Timberlake implemented a wireless sensor network to help inform the renovation of their new offices in the Ortlieb Bottling Plant in Philadelphia and then developed an in-house app to capture post-occupancy feedback from their own employees about the overall comfort of the space and to identify abnormal conditions. All of this research contributed to the development of Tally — a life-cycle assessment (LCA) app and add-in for Revit that evaluates the environmental impact of building materials among design options and promotes a much more eco-conscious approach to design.

Grow Up, Grasshopper!
Andrew Heumann - NBBJ

Andrew Heumann believes in the need to change the perception of design technology in the AEC industry and integrate it more into practice. He showcased an extensive portfolio of projects that have used Grasshopper for Rhino and custom-written apps to simulate inner-office traffic patterns and sight lines, leverage human location and city data for urban planning, and track digital tool usage in the office to identify focus areas for development and support. All scripts and tools developed in-house at NBBJ are documented and packaged into products for use by project teams. In addition, custom dashboards and user interfaces help reduce intimidation and increase universal adoption — for example, reducing a complex Grasshopper script to a series of slider bars that control the inputs of a parametric design. Andrew also advocated for the use of hackathons and similar hands-on user meeting formats to promote design technology as a facet of culture and process. He shared an example of how a brief hackathon with senior partners at NBBJ led to the funding of a proposal for further development of an innovative tool for optimizing healthcare patient room configurations.

Evolving Modes of R+D in Practice
Stephen Van Dyck and Scott Crawford - LMN Architects / LMN Tech Studio

The Tech Studio was founded to support the prominent role of research and development at LMN Architects in Seattle, which led to an expanded use of analytic and generative tools to drive design. They have not only embraced the use of custom digital tools for the creation and visualization of complex forms but regularly construct scale models for material testing and to explore modular strategies as part of their iterative design process. Working with fabrication in mind facilitates improved precision for collaboration with engineers and consultants. In addition, they have found that a thorough digital process and physical models help better communicate design ideas, resulting in increased positive community feedback. The development of a ray-tracing tool for exterior acoustics studies, custom panel creation for balance in musical acoustics and aesthetics, and a highly parametric pedestrian bridge spanning a major Seattle highway are a few examples of projects that demonstrate how research is a guiding principle for design at LMN.

OPEN-SOURCE DATA AND APPLICATION:
Collaboration and Open Source - How the Software Industry’s Approach to Open Sourcing Non-Core Technology Has Created Innovation
Gareth Price - Ready Set Rocket

This presentation provided insight into the current state of technological innovation through the lens of a digital advertising agency. Gareth Price emphasized that individuals should not be hesitant to share ideas out of fear that another company will benefit from them. Particularly in the AEC industry, companies do not have the overhead to pay for tool creation and the requisite support, nor can they cover the cost of an outside software consultant. The reality is that other people are busy with their own work and do not have the time or resources to steal your ideas and commodify them. More importantly, it is advantageous to share ideas for a project because they may elicit constructive criticism, or inspire others to contribute to those ideas and improve them. Also, do not get too entrenched in one idea and know when to pivot — the next great idea may come as an unexpected derivative of the original intention.

Key quotes:
"purpose is the DNA of innovation"
"failure is the new R&D"

How Open Source Enables Innovation
Mostapha Roudsari and Ana Garcia Puyol - CORE studio / Thornton Tomasetti

Mostapha Roudsari and Ana Garcia Puyol exhibited many examples of digital tools that have emerged out of CORE studio - the research and development arm of Thornton Tomasetti. The majority of examples presented originated during previous CORE AEC Technology Hackathons and were then further developed into more robust products. Nearly every tool required collaboration from multiple individuals, with expertise in a diverse mix of software platforms, and oftentimes representing different companies. The takeaway from this presentation was the value of open source and hackathons as a means for getting a group of talented people into one room to create new tools for AEC design and representation. Mostapha wanted to make it clear that more important than software tools, code, and machining is the strength and power of the user community. If you want to be at the forefront of the movement, be a developer; however, the community is just as important, so make an effort to share ideas and spread adoption.

Here are some of the many tools presented:

  • vAC3: open source, browser-based 3D model viewer. This project led TT to further develop Spectacles.
  • Spectacles: allows you to export BIM models to a web interface that allows you to orbit in 3D, select layers, and access embedded BIM information (demo HERE)
  • VRX (Virtual Reality eXchange): a method for exporting BIM models for virtual reality viewing via Google Cardboard
  • DynamoSAP: a parametric interface that enables interoperability between SAP2000 (structural analysis and design), Dynamo, and Revit
  • Design Explorer: "an open-source web interface for exploring multi-dimensional design spaces"
  • Pollination: "an open source energy simulation batch generator for quickly searching the parameter space in building design"

For more information, check out TT CORE Studio's GitHub, Projects, and Apps pages

Open Source: Talk 3
Matt Jezyk - Autodesk

Matt Jezyk provided an introduction to Dynamo including its history and the most recent developments. Dynamo may have started as a visual programming add-in for Revit, but it is quickly transforming into a powerful tool for migrating data and geometry across numerous software platforms. The talk highlighted the role open source has played in empowering independent developers to create custom content that expands capabilities and makes interoperability possible. By keeping Dynamo open source, it has benefited from contributions by individuals with a wide range of expertise looking to satisfy specific requirements. As part of a larger lesson taken from the growth of Dynamo, Matt emphasized that the key to the emerging role of technology in practice and the AEC industry as a whole is less about learning specific tools and more about codifying a way of thinking — tools are only the implementation of a greater plan.

DATA-DRIVEN DESIGN
Beyond Exchanging Data: Scaling the Design Process
Owen Derby - Flux.io

Flux has been a frequent topic of conversation lately. The company initially marketed a product called Flux Metro, which boasted the potential for collecting the construction limitations of any property based on zoning, code, municipal restrictions, and property records — an ideal tool for developers and architects to assess feasibility or use as a starting point for the design process.

The company has since pivoted to focus on creating a pipeline for migrating and hosting large quantities of data for many software formats. Their new product line features an array of plugins for transferring data between Excel, Grasshopper, and Dynamo, with plans to release additional tools to connect to AutoCAD, SketchUp, Revit, 3DS Max, and more in the near future. Data exported from these software programs is hosted in a cloud repository where it can be archived and organized for design iterations and option investigation. Flux has great potential for achieving seamless interoperability of data and geometry between software platforms, and for significantly improving the efficiency of the AEC design and production process.

Holly Whyte Meets Big Data: The Quantified Community as Computational Urban Design
Constantine Kontokosta - NYU Center for Urban Science + Progress (CUSP)

The NYU Center for Urban Science + Progress (CUSP) is using research to learn more about the way that cities function. Buildings, parks, and urban plans are all experiments built on assumptions in which the true results don’t emerge until years and decades later. How do you measure the “pulse” of a city? How do macro observables arise from micro behavior? Constantine and CUSP have set out to test these questions by collecting and analyzing: NYC public internet wireless access points, Citi bike share, 311 complaint reporting, biometric fitness devices, and social media. They use these urban data sources to make better decisions and form initiatives for future community improvement projects. The results also have positive implications for city planning, city operations, and resilience preparation.

Data-Driven Design and the Mainstream
Nathan Miller - Proving Ground

Nathan Miller is the founder of Proving Ground, a technology consultancy for Architecture, Engineering, Construction, and Ownership companies. In his experience providing training and technological solutions, he stresses the importance of equipping staff with the right tools and knowledge to adequately approach projects. There is an intersection between the managers and leaders responsible for projects and staffing, and those who are actually doing the work. It is imperative to focus on outcomes and not get deterred by the process.

The Biggest IoT Opportunity In Buildings Is Closer Than You Think
Josh Wentz - Lucid

Energy consumption, mechanical systems data, thermal retention, and other metrics are not recorded for the majority of buildings worldwide. These are incredible missed opportunities for evaluating the overall performance of a building and collecting real-time research that can inform better construction techniques. Lucid has developed a product called BuildingOS that offers 170 hardware integration options to collect robust building data. This data has the potential for helping facilities management departments better track efficiency and maintenance of their systems, in addition to contributing to the international pool of data to help us better understand how materials and systems perform over time.

RESEARCH & DEVELOPMENT IN AEC SESSION 1
Capturing Building Data - From 3D Scanning to Performance Prediction
Dan Reynolds and Justin Nardone - CORE studio / Thornton Tomasetti

This presentation highlighted CORE Studio's use of various technologies for capturing existing conditions data and testing architectural responses through computation. They have utilized drones for capturing the condition of damaged buildings and structures by assembling fly-by photos into a point cloud. The development of in-house GPS sensor technology accurate to 1 centimeter anywhere in the world has enabled measuring the built environment and construction assemblies to a high level of precision. CORE Studio has also investigated the use of machine learning for exploring all possible combinations of building design parameters and calculating embodied energy predictions. All of these design technology advancements are helping Thornton Tomasetti design more accurate, better-informed systems.

Data-Driven Design
Luc Wilson - Kohn Pedersen Fox Associates PC

Luc Wilson thinks of data as an “Urban MRI” - a diagnostic tool for measuring the existing configuration of cities and predicting future growth. Multiple FAR and urban density studies were presented that exhibited how comparison to precedents and benchmarks helps to conceptualize the data and make visual sense of the analysis. The key to prediction is the ability to test thousands of designs quickly, which Luc has perfected by developing digital tools for quickly computing all possible combinations of input parameters and producing measurable outcomes for comparison. One of the most exciting portions of the presentation was the mention of a 3D urban analysis tool called Urbane, which Kohn Pedersen Fox is working with NYU to develop — could this be the new replacement for Flux Metro?

Cellular Fabrication of Building Scale Assemblies Using Freeform Additive Manufacturing
Platt Boyd - Branch Technology

Platt Boyd founded Branch Technology after realizing the potential for 3D printing at a large scale by imitating structures found in nature. Branch has dubbed their technique “cellular fabrication” where economical material is extruded with geometric complexity to construct wall panels that are lightweight and easy to transport. Their process seeks to reduce the thickness required by traditional 3D printing technologies and the intricate geometric structure provides equivalent strength to that of a printed solid. The wall panels are printed free-form with a robotic arm on a linear track and then are installed onsite where insulation, sheathing, and finish material are added to reflect the same condition as traditional wood or metal stud construction. It will be really interesting to see Branch continue to refine their methods and start to tackle complex wall conditions for use in real-life building projects in the near future.


Watch videos of all the presentations HERE.

Getting Started with Dynamo

I am frequently asked how I got started using Dynamo so I will take a moment to share my background and provide advice for making your first foray into the world of computational BIM.

I have been using Dynamo for less than a year with no prior visual programming experience and it has made a significant impact on the way I approach production. For all of the advancements that BIM has provided to the AEC industry, operational constraints in Revit consistently challenge the efficiency of production. Since entering the architecture profession, working on several project teams has revealed that every project requires repetitive tasks at some point, oftentimes meaning making changes to Revit elements one-at-a-time. To make matters worse, fluctuations in design direction, scope, or value engineering can necessitate a complete overhaul of portions of the model and lead to re-doing those consecutive manual adjustments. Dynamo adds an additional layer of control to overcome Revit limitations by providing the capability to gather and restructure information and elements in the model, thus creating the potential for repetitive task automation.

My best advice for getting started with Dynamo is to make time and find opportunities to use it. Identifying a specific problem will provide a framework with which to search for answers and guide your workflow. There are many wonderful developers out there who are sacrificing personal time to expand the capabilities of Dynamo and produce custom nodes for the Package Manager. In the spirit of Open Source, the worldwide community is generally willing to share knowledge and answer questions in the hope that more people will contribute ideas. It is important to keep in mind that demonstrations and documentation are intended to provide examples of what can be achieved. There isn't a universal Dynamo definition that addresses multiple problems; every task requires slight modifications and customizations to correspond to the unique conditions of your project. Focus on the underlying principles of visual programming and embrace flexibility.

If you want to learn more about a Dynamo post you encountered, you can improve the chances of a quick response by including sufficient information. I have had success in the past by including the following in an email:

  1. your question
  2. how long you have been using Dynamo
  3. what you are working on
  4. how you plan to apply this workflow
  5. what you have tried so far
  6. accompanying screenshots, sketches, diagrams

When it comes to resources, the tutorials on the Learn page of the Dynamo website are very helpful and I particularly encourage reading through the Primer to gain an understanding of the fundamental principles. Keep up with the latest news and tutorials by regularly checking the Dynamo Blog. The Dynamo Community Forum is an excellent place to post questions, responses are normally timely and everyone is friendly. Local Dynamo user groups have also been popping up around the globe, which are an excellent way to grow your network of individuals that you can reach out to for help and collaboration. There are currently groups in: Atlanta, San Francisco, Tokyo, Boston, Los Angeles, and Russia.

Lastly, there are many blogs produced by pure Dynamo enthusiasts that have helped me in my journey. I highly recommend that you check them out if you are looking for answers or inspiration:

Proving Ground (io) & The Proving Ground (org)
Archi-Lab
Buildz
Havard Vasshaug's blog
Simply Complex
What Revit Wants
Sixty Second Revit
AEC, You and Me
Jostein Olsen's blog
Revit beyond BIM
Enjoy Revit
Serial_NonStandard
Kyle Morin's blog
The Revit Kid
The Revit Saver
SolAmour's extensive list of resources

Listen to the Dynamo Team explain the history and recent popularity of Dynamo on the Designalyze Podcast.

Visualizing Rooms by Color with Dynamo

Using the Dynamo visual programming add-in, it is possible to isolate room geometry in a Revit model and color-code specific room types for the purpose of visualization. This workflow can be extremely helpful when trying to track down a lost room, visualize unit mixture, or analyze the adjacency and diversity of various rooms in the model. Ultimately this Dynamo method adds yet another tool that can be customized for a multitude of model validation tasks.

FULL DEFINITION: Step 1 (green), Step 2 (blue), Step 3 (orange), Step 4 (magenta)

Workflow:

Step 1: collect all Room elements from your Revit model by their room name parameter and then isolate the perimeter geometry of each.

Step 2: match the list of all rooms in the model against the name of one particular room you are looking for (String node in the red circle). Using the List.FilterByBoolMask approach, any null items (often due to unplaced or duplicate rooms), which can potentially break the definition later on down the line, are filtered out. The perimeter lines of the filtered rooms are then used to generate a 3D extrusion of each room shape.
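
In plain Python terms, the FilterByBoolMask step boils down to something like this simplified sketch (placeholder list, not the node graph itself):

```python
# Simplified sketch of the FilterByBoolMask step: drop null entries
# (unplaced or duplicate rooms) before building extrusions.
rooms = ["Office", None, "Office", "Conference", None]   # placeholder list
mask = [r is not None for r in rooms]

kept    = [r for r, keep in zip(rooms, mask) if keep]
dropped = [r for r, keep in zip(rooms, mask) if not keep]

print(kept)     # passed on to the extrusion nodes
print(dropped)  # nulls that would otherwise break the definition
```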

image 4.jpg

Step 3: the list of 3D room shapes is then passed to the Display.ByGeometryColor node to colorize in the Dynamo geometry preview mode. Each unique color requires an RGB color code. You can find custom RGB color values in Adobe Photoshop or Illustrator, or by going to websites such as Kuler and COLOURlovers.

image 4.jpg

Step 5: once the definition is set up for one room name, the central cluster of nodes can be copied for all other room names. If everything is correctly set up, a series of colored rooms will appear in the 3D geometry preview.

Step 6: as a means of validation, a count can be taken of all rooms that have been collected from the Revit model and fully processed. Comparing this count to a schedule in Revit will verify whether or not all targeted rooms were accounted for and visualized in Dynamo.

With this approach, all rooms in the Revit model can be queried, sorted, and visualized.

Additionally, this Dynamo definition can be simplified by utilizing advanced list management tactics, eliminating the need to copy the central cluster of nodes for each unique room name.

PRCA Rodeo Map

Hebron Harvest Fair Rodeo - Hebron, CT

Growing up in the agrarian state of Oregon, attending local rodeos during the summer was a favorite family activity. The raw athleticism, toughness, and tradition always made for good entertainment. The humility, sense of community, and incredible treatment of livestock instilled a deep admiration for the country folk whose entire livelihood is rodeo.

Unlike most professional sports, the cowboys and cowgirls are responsible for personally funding their own equipment, horses, lodging, and transportation. Their lives consist of driving hundreds, sometimes thousands, of miles between weekends to make it to the next rodeo where they compete for a paltry winner-takes-all purse in the 4-5 figure range. A missed lasso toss, a momentary loss of balance, or an overturned barrel can be the difference between earning enough to cover a few more weeks of food, gas, and supplies, or heading to the next event empty-handed. As if the meagre earnings aren’t enough of a deterrent, the physical toll and life-risking courage the sport demands are further testament to the passion and dedication these athletes possess.

Now having lived in Boston for the better part of a decade, my rodeo participation has been reduced to the occasional visit home or sporadic television broadcasts. Recently I was ecstatic to discover a rodeo in Northern Connecticut; however, I had my doubts about the level of competition and authenticity. Given that the majority of rodeos occur in the western and southern portions of the United States, I wondered how much incentive an athlete would have to drive all the way to New England to compete. My curiosity got the better of me and I decided to look for a map of all rodeos in a given season of the Professional Rodeo Cowboys Association (PRCA), only to find that no such map exists on the internet. Thus I set out to make my own map...

Data Acquisition:
The first step was to acquire the schedule for the 2015 rodeo season. The PRCA website does not contain a cohesive schedule for the entire season but does post the remaining season schedule as well as the results of each event that has already occurred. Luckily the construction of the website is very simple, which made for easy scraping of the data into a format that can be processed.

PRCA 2015 Results
PRCA 2015 Remaining Schedule
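
The scraping step amounted to little more than pulling the schedule tables out of the HTML. Here is a rough sketch of that kind of scrape; the URL and table structure are placeholders, not the actual PRCA page layout.

```python
# Rough sketch of scraping a schedule table. URL and HTML structure are
# placeholders; the real PRCA pages would need their own selectors.
import csv
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/prca/2015-schedule"   # hypothetical address

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for tr in soup.select("table tr")[1:]:            # skip header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 4:
        rows.append(cells[:4])                     # event, city, state, date

with open("prca_2015.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["event", "city", "state", "start_date"])
    writer.writerows(rows)
```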

Parsing and Re-structuring with Dynamo:
After collecting the lists of information for both the remaining schedule and results, I set out to use Autodesk Dynamo Studio 2016 to parse and re-structure the data for visualization. With the knowledge that the National Finals Rodeo event in Las Vegas in December marks the culmination of every rodeo season, I pared down all events to this timeframe, resulting in 620 total contests. This is a prime example of Dynamo as a powerful visual programming platform independent of Revit.

Visualization with Tableau:
The next step was to merge the two data sources and organize the information by: event name, city, state, country, and first day of event. Once everything was clean and consistent, I exported the data to Tableau for visualization. The Tableau map feature allowed me to position all 620 rodeos across the US and Canada and colorize them based on when they take place during the course of the season.
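
The merge itself is a routine list operation. A minimal sketch of combining the two extracts into one clean table for Tableau might look like this (file and field names are assumptions):

```python
# Minimal sketch: merge the results and remaining-schedule extracts into
# one table for Tableau. File and field names are assumptions.
import csv

def load(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

events = load("prca_2015_results.csv") + load("prca_2015_remaining.csv")

fields = ["event", "city", "state", "country", "start_date"]
with open("prca_2015_combined.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for e in events:
        e.setdefault("country", "USA")   # assumed default for missing values
        writer.writerow(e)

print(len(events), "events written")
```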

Takeaways:
In the end I was surprised to see a moderate cluster of dots in the New England region and was particularly pleased to find rodeos in Western New York, Massachusetts, Connecticut, and Maine. Perhaps there is a contingent of die-hard New England cowboys and cowgirls keeping the spirit of rodeo alive in the East? Regardless, my respect and admiration for this sport will always keep me coming back for more and I look forward to checking out the local rodeo scene. Enjoy!

Dynamo Studio "definition" for parsing and re-structuring the PRCA rodeo data.

How is Data Changing The Nature of Design?


This article was recently published on Building Design + Construction: article
And originally posted on the Shepley Bulfinch Insight Blog: article


At the BLDGS=DATA symposium in New York this spring, the discussion focused on strategies for harnessing the massive amount of data made available by modern technology. An increased capacity for analysis has led to immense data generation and an unprecedented ability to identify correlations. The AEC industry today is grappling with ways to make the best use of it and to develop standard processes for leveraging and sharing it.

This surge of data is changing the very nature of design as architects begin to embrace a much more data-driven approach. Advances in Building Information Modeling (BIM) allow for more thorough project documentation and the ability to share building information with contractors. Intricate digital models and environmental simulation enable offsite fabrication methods and building systems improvements that have the potential to increase quality and reduce construction costs. Most importantly, access to vast quantities of data is helping design teams better understand a client’s needs and can be used to validate a particular design decision beyond previously available means.

With an increased capacity for capturing data, it is imperative not to get lost in the white noise. The seemingly limitless stockpiles of information must be strategically vetted for meaningful interpretation with a focus on the value it provides to the process and the end product. One of the most evident attempts to find a balance between data and technology is the current infatuation with computational design, with its powerful new software platforms, intricate parametric tessellations, and innovative materials.

How can architecture make the most of the growing data movement? The true promise of this information age is not iteration and automation but the ability to substantiate expertise and predict outcomes.

To better position themselves to do so, architectural practices must acquire and develop new skills to be able to filter and find value in the newly available data sources. Computer science will become an essential component of design education and graduates will be encouraged to form much more integral partnerships with engineers, construction managers, and environmental sustainability experts. Architects must seek data collection and information management techniques to help inform their process, exhaust possibilities, and confirm design outcomes. The evolution of the practice of architecture is about changing our mentality and approach: broadening our thinking, not necessarily eliminating tradition. While technology is becoming a powerful tool, the most critical role belongs to the individual who, alone or as part of a greater whole, is harnessing that power.

Ultimately the AEC industry as a whole will benefit from an increasingly data-driven approach to design and construction that promotes improved communication, better quality projects, and fewer hindrances to the building delivery process.

-Kyle Martin


Kyle Martin is a member of Shepley Bulfinch’s architectural staff. He is a co-founder of the Boston Society of Architects Revit Users Group’s “Dynamo-litia” and currently teaches Advanced Revit and Computational Workflows at Boston Architectural College.

SPECIAL ANNOUNCEMENT: Dynamo-litia


Don't miss the first ever Dynamo user group in Boston!

Tuesday, September 22nd
9am - 10:30am
BSA Space - Boston Society of Architects
RSVP Here

The inaugural meeting will provide a broad introduction to the Dynamo visual programming add-in for Revit.

Automated Feasibility Project


AUTOMATED FEASIBILITY PROJECT

A common circumstance of working with commercial development clients is producing feasibility studies to examine the value of pursuing a project on a particular site. Factors such as local zoning ordinances, building code, desired program, and site constraints all inform the buildable potential of a site and determine whether it is financially and strategically advantageous to invest in a project. Feasibility studies require the right balance of accuracy and efficient time management given the likelihood that many projects will not come to fruition.

My colleague Jason Weldon and I are currently embarking on a project to investigate the potential use of Dynamo for automating portions of the feasibility process. The two essential advantages to this approach are buying back time for deeper investigation and validating proposed schemes through iteration. Automation through Dynamo will significantly reduce the initial preparation and manual input of information required to start a BIM model. Altering computational constraints will enact changes to the model for testing schemes, and instantaneous calculations provide a report for each iteration. This process promotes efficiency and substantiates certainty.

Phase 1
Zoning Setbacks, Levels, Floor-to-Floor Height, & Courtyard

All studies begin with the same ingredients: a plot plan or civil drawing and preliminary zoning research. Working from these items alone, a maximum allowable building envelope can be established. Floor Area Ratio allowances and incentive zoning can be incorporated to evaluate proportional modifications to the overall envelope.

automated feasibility_phase 1.GIF

Here I start out by collecting the property line for a site from Revit. From the property line, I am able to establish setbacks on all five edges of the site. Constraints for the overall number of floors and floor-to-floor height allow me to quickly explore different massing compositions. Inserting a courtyard facilitates improved residential potential for the site due to increased light and ventilation. By adjusting the distance of the courtyard from the exterior face of the building, it is possible to experiment with the proportional balance of form and floor area efficiency.
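
The arithmetic driving the iteration is straightforward. Here is a simplified sketch for a rectangular site with placeholder setback, floor, and courtyard values; the actual Dynamo definition works from the real property line geometry.

```python
# Simplified sketch of the feasibility math on a rectangular site.
# All dimensions and zoning numbers are placeholders.
SITE_W, SITE_D = 200.0, 150.0       # ft
SETBACK = 15.0                      # ft, applied to every edge
FLOORS = 6
FLOOR_TO_FLOOR = 11.0               # ft
COURTYARD_INSET = 30.0              # ft from exterior face of building
MAX_FAR = 3.0                       # assumed zoning allowance

foot_w = SITE_W - 2 * SETBACK
foot_d = SITE_D - 2 * SETBACK
court_w = max(0.0, foot_w - 2 * COURTYARD_INSET)
court_d = max(0.0, foot_d - 2 * COURTYARD_INSET)

floor_plate = foot_w * foot_d - court_w * court_d
gross_area = floor_plate * FLOORS
site_area = SITE_W * SITE_D

print("building height:", FLOORS * FLOOR_TO_FLOOR, "ft")
print("gross area:", round(gross_area), "sf   FAR:",
      round(gross_area / site_area, 2), "(allowed:", MAX_FAR, ")")
```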

automated feasibility_definition

Download the open source definition here.

BLDGS=DATA

bldgs-data.png

BLDGS=DATA
Hosted by CASE Inc. (NYC)
The Standard High Line Hotel
May 28, 2015

Data For Understanding Cities
Blake Shaw, Head of Data Science - Foursquare

For those of you who have never heard of Foursquare, it was originally a mobile device app for sharing locations and activities with your social network - occasionally creating the opportunity for chance encounters when two friends find themselves in the same vicinity. Nowadays Foursquare has evolved into a powerful mechanism for tracking human behavior. With the exponential rise of people connected to the internet via mobile devices, how will the constant production of "data exhaust" be harvested and what can it tell us? How does a city behave as a living organism? What is the most popular activity on a hot summer day (answer: getting ice cream)? Foursquare continues to examine correlations among human behaviors and asks how we can better interact with the buildings we inhabit. Everyone experiences an environment differently but how can data derived from countless previous experiences be used to inform future experiences and provide valuable recommendations? Data is the key to optimizing the potential for enjoyable experience.

Data for Building Insight - Panel Discussion
Brian Cheng, Designer & Associate - HDR Architects
Jennifer Downey, National BIM Manager - Turner Construction
Peter Raymond, CEO - Human Condition

This session started off with each person sharing a little bit about technology efforts at their companies. At HDR, they are utilizing a combination of a custom dashboard and parametric modeling to analyze health care program and massing test fits. In addition, advanced model sharing and co-location methods enable instantaneous coordination with engineers and consultants. Turner presented an example of how LEAN strategies, aggressive coordination and scheduling, and thorough communication, all combined with robust project data, have made a significant impact at their company and played a large part in the world-record-setting concrete pour at the Wilshire Grand in LA. Human Condition demonstrated a construction safety vest that tracks body position, biometrics, and worker location. With the further development of wearable technology, real-time information can be gathered on every worker at the construction site and a holistic safety culture can be established through incentives for exemplary performance.

During the discussion it was pointed out that currently in the AEC industry there is a culture of "commodification of mistakes," meaning that contingencies are written into contracts, numbers are carried for unforeseen costs, and there is a standing assumption of labor inefficiencies and injuries on the job. How can BIM be better utilized to mitigate these costly errors and how can new technologies improve job safety and productivity? Perhaps tools like clash detection and co-location make for a more streamlined design and construction process. Furthermore, BIM as a platform needs to become a mode of communication between project constituents and facilitate a timely transmission of data. Another question that emerged was how can decades of on-site construction knowledge and experience be gathered and implemented much earlier on in the design and documentation phases? Strategies like pulling seasoned construction workers into the coordination meetings and using tools like a company intranet to archive knowledge and solutions were suggested. It is also imperative to seek feedback and document the process in order to ensure continuous improvement.

Data for Retail Roll-Out
Scott Anderson, VP Global Corporate Store Planning & Development - Estee Lauder Companies
Melissa Miller, Exec. Director Corporate Store Planning & Development - Estee Lauder Companies

This team is responsible for identifying new opportunities for brand positioning within retail department stores and carrying out the requisite construction. After years of managing projects through email, Word, Excel, and Gantt charts it became apparent that tracking the transfer of information across multiple platforms was incredibly inefficient. The team set out to construct a custom dashboard that managed all communications, actions, specific information, and progress by project. Now project managers and company executives can enter the system at any time and review progress. The new system has enabled transparency and drastically reduced the duration of projects.

Data for Indoor Positioning
Andy Payne, Senior Building Information Specialist - CASE
Steve Sanderson, Partner & Director of Strategy - CASE

With the emergence of indoor positioning systems that triangulate mobile device location using Bluetooth, wireless, and GPS, a team at CASE Inc. has embarked on a project to harness indoor location data. Using a custom app created to track employee movement throughout the workday, CASE recorded one month of data and produced this analysis. From this data, it was determined that only 2/3 of the space is being actively utilized in this BRAND NEW office the company just moved into. In addition, some of the program was not being used as originally intended or was seldom used at all. Disregarding the potential for future company growth, CASE has wondered how the results of this post-occupancy analysis would have affected the planning of the office layout prior to signing their lease. This led to a larger conversation about the opportunity for implementation in design. For example, how can this technology be applied to doctors and nurses in a health care setting to monitor their daily routines and learn more about the way spaces are truly used? The potential for better understanding of human behavior and the development of theoretical simulations to analyze building program is very exciting.

More about the development of the app and beacon technology...

Data for Building Buildings - Panel Discussion
John Moebes, Director of Construction - Crate&Barrel
Doug Chambers, CEO - FieldLens
Todd Wynne, Construction Technology Manager - Rogers-O'Brien Construction

These three gentlemen discussed coordination and the use of data to avoid significant delays in project timelines. At Crate&Barrel, many of the Autodesk software products are used by a small project team to design and build new stores throughout the world. Careful documentation allows Crate&Barrel to bring the procurement of steel and materials in-house at a significant cost savings and drastically reduce the possibility of mistakes in the field that affect valuable components of retail design. FieldLens is a task management product that allows construction managers to better orchestrate the construction process. With the ability to assign particular tasks to specific individuals, save notes and images, review a 3D model and construction documents, and track workers on site, a superintendent can keep much better tabs on aspects of the job and managers can have a continual update on how the work is progressing.

Data for Galactic Growth
Roni Bahar, Executive Vice President of Development & Special Projects - WeWork

WeWork is a company that offers coworking office space worldwide via an hourly or monthly subscription model. In the last four years they have seen exponential growth, leading to construction on an unprecedented scale to accommodate demand (12 new office locations in just the last year). In an attempt to manage this frenzy, they have embraced modular construction as a method for standardizing construction technique, aesthetics, and material cost regardless of location or contractor. The kitchen units, cubicles, conference rooms, bathrooms, and common area furniture are all modular components built in Revit complete with detailed finish information, material takeoffs, and construction details. As much of a well-oiled machine as the procurement and development arm of WeWork is, it was fascinating to hear that the one lacking component of the process is hard data and feedback. With such rapid growth and a relatively small project team, the company is building offices faster than research can be conducted to determine the success of the spaces they are producing. In the next few years, as WeWork begins to catch up with the pace, it will be interesting to see how they aggregate data to substantiate the success of the experience beyond the sheer number of offices and dollars.

More on WeWork...

The BUILTRFEED team was also at the event and posted an excellent summary. Check it out!

The Glass Cage

I recently read the book "The Glass Cage: Automation and Us" by Nicholas Carr and was inspired by his message of the growing advantages and dangers of technology, particularly pertaining to design and architecture.

Carr says, "Technology is as crucial to the work of knowing as it is to the work of production. The human body, in its native, unadorned state, is a feeble thing. It's constrained in its strength, its dexterity, its sensory range, its calculative prowess, its memory. It quickly reaches the limits of what it can do. But the body encompasses a mind that can imagine, desire, and plan for achievements the body alone can't fulfill. The tension between what the body can accomplish and what the mind can envision is what gave rise to and continues to propel and shape technology. It's the spur for humankind's extension of itself and elaboration of nature. Technology isn't what makes us 'posthuman' or 'transhuman', as some writers and scholars have recently suggested. It's what makes us human. Technology is in our nature. Through our tools we give our dreams form. We bring them into the world. The practicality of technology may distinguish it from art, but both spring from a similar, distinctly human yearning."

Autodesk University 2014

Autodesk University
AU2014 Summary
December 2-4
Mandalay Bay, Las Vegas

Fusion 360 Digital Fabrication Workflows:

This course highlighted several features of Autodesk Fusion 360, a program that facilitates easy manipulation of 3D geometry otherwise difficult to achieve in Revit. After exporting a conceptual mass family from Revit to Fusion 360, an undulating wall form was created that could then be imported back into Revit, populated with curtain wall using adaptive components, and rendered in a perspective street view with a Google Maps background. That same form was imported into a program called Meshmixer, which provides advanced options for preparing an STL file for 3D printing and allows you to apply custom supports. Altogether, the integrated workflow across several software platforms was relatively seamless, and the course exhibited proof that very complex and customized geometry can be created in Revit for any project type.

Building a Good Foundation with Revit Templates:

Members of the architecture, engineering, construction, and manufacturing industries gathered for this round table discussion about best practices for starting a project in Revit. Two methods were compared: the use of a Template File and a Default Project. At Shepley Bulfinch, we use a template file at the outset of every project that contains a minimum amount of views, families, and general standards to provide a good starting point. A default project has the advantage of carrying much more initial information, including pre-placed families and objects, but requires a significant time investment to keep the content current. Overall, the common sentiment in the room was that template files are easiest to maintain and offer the most versatility for any project type. Additionally, it is preferable not to “front load” Revit models and start out with unnecessary file size when one of the biggest challenges among all projects is keeping the model as small and responsive as possible.

Energy Analysis for Revit:

Are you familiar with the native energy analysis tool in Revit (hint: it’s under the Analysis tab on the ribbon)? This tool has the potential to be very helpful for early feedback to help drive the design. The task can be farmed out to the cloud for faster processing and to post reports for multiple options. For more in-depth analysis, the Revit model can also be exported to gbXML format and opened in Green Building Studio, a cloud-based energy simulation platform. Relatively specific configurations are required within a model for the analysis to run successfully, and one of the predominant takeaways of the course was the emphasis on modeling with the energy analysis outcome in mind from the start of the project.

Challenges of LEAN Design and Computational Analysis:

This very engaging roundtable discussion examined the emerging role of computational analysis and generative design in helping make more efficient design decisions. The keynote address at the beginning of the conference featured the use of "machine learning algorithms" where information and constraints are entered into a computer and simulations are run to determine an optimal design outcome. To start this session, we identified wastes and ineffective behaviors within each profession and in the collaboration process between them. After a predictable round of architect-bashing, the question was posed: "Does computation and simulation allow us to come to confident solutions earlier in the design process and reduce waste?" If existing condition information, user requirements, code constraints, and many of the other variables that influence the design process can be programmed to generate permutations, is this a promising direction for the future of the profession? The group came to the conclusion that computational analysis and simulation will never be reliable enough to deliver a comprehensive design solution but may be helpful in providing direction at challenging moments in the process.

Practical Uses For Dynamo Within Revit:

Dynamo is a visual programming environment that allows you to make custom changes within Revit and extract information otherwise unattainable with the native program features. The program utilizes a user-friendly graphic interface to make adjustments within the Revit API (the "back end" which contains all the building blocks for how the program functions). This course demonstrated many entry-level uses for Dynamo including:

  • quickly making changes to all instances of a family type in a model (example: adjusting the offset height of all columns at once)
  • advanced family geometry (example: controlling profile order to create cantilevered and wrapped swept blends)
  • wrapping structure along curved surfaces
  • generating separate finish floor on top of slab automatically from room boundaries


Utilizing Revit Models for Architectural Visualization:

This course covered workflow and best practices for exporting a Revit model to Unity 3D, a game engine that enables real-time visualization and walk-throughs. The first step is preparing the Revit model for export by cropping down to only what you need with a section box and turning off unnecessary categories in Visibility Graphics. Export the model to FBX and import it into 3DS Max, where materials, cameras, and lights are applied. Lastly, the model is imported into Unity, where perspectives, walk-throughs, and animations can be utilized. In summary, Unity 3D provides a compelling presentation piece that may appeal to some clients, but it is important to consider the time investment that goes into the preparation process.

Dynamic Energy Modeling:

An energy and environmental analysis consultant presented a multitude of methods for assessing daylighting, wind, weather, energy consumption and other performance characteristics of a design. Specific tools covered included eQUEST, Green Building Studio, Autodesk360 Lighting Analysis, raytracing and raycasting, Rushforth Tools Library, Autodesk Ecotect and more. Although these programs were generally too advanced for the level of in-house analysis we use at Shepley Bulfinch, I enjoyed learning about numerous ways information can be extracted from Revit and used to help inform the architectural design process.

Revit + Dynamo = The Building Calculator:

Beyond parametric modeling and making tweaks within Revit, Dynamo can be used to extract much of the information stored within a model. By using an "export to excel" function, areas, quantities, dimensions, room lists and so much more can be exported and analyzed with the powerful tools Excel has to offer. Schedules can be created, complex building calculations can be scripted and automatically updated upon every change within the model, or checks and balances for code and zoning can be integrated to produce reports. Items can then be adjusted, renamed or resized to push back into Revit from Excel and make direct changes to the model. Dynamo provides a giant step forward in the pursuit of harvesting the full potential of BIM.
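
As a hedged sketch of the idea (not the exact node graph), a Dynamo Python node can collect the rooms, pull a few parameters, and write them straight to a file that Excel can open:

```python
# Sketch of the "building calculator" idea in a Dynamo Python node:
# collect rooms, pull number/name/area, and write a CSV for Excel.
# The output path is a placeholder.
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import (FilteredElementCollector, BuiltInCategory,
                               BuiltInParameter)

clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager

doc = DocumentManager.Instance.CurrentDBDocument
rooms = FilteredElementCollector(doc)\
    .OfCategory(BuiltInCategory.OST_Rooms)\
    .WhereElementIsNotElementType()\
    .ToElements()

lines = ["number,name,area_sf"]
for r in rooms:
    number = r.get_Parameter(BuiltInParameter.ROOM_NUMBER).AsString()
    name = r.get_Parameter(BuiltInParameter.ROOM_NAME).AsString()
    lines.append("{},{},{}".format(number, name, round(r.Area, 1)))  # Area in sf

with open(r"C:\temp\room_report.csv", "w") as f:   # placeholder path
    f.write("\n".join(lines))

OUT = lines
```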

The Great Dynamo Dig: Mine Your Revit Model:

With all this excitement surrounding Dynamo, did you know there is also a SQL export function? This allows for the creation of a much more comprehensive database that can be thoroughly organized using database management software and mined for analytics and appealing visual graphics in Tableau.