The Real Cost of Sharing Grasshopper Scripts
Molly from the traffic team approached me with what seemed like the perfect computational design opportunity. Her team was drowning in manual work, checking hundreds of routes in Google Maps to analyse how road network changes would affect residents' travel times. The moment she described their process of looking up routes, noting times, and repeating this across different transport modes, I could already see the solution taking shape in my mind. I could hardly wait for her to finish talking; when she did, I promised her a proof of concept by the next day without any hesitation.
So I got to work, connecting nodes and writing the Grasshopper script that was going to change the way they worked forever. I didn't even need a full day to finish; within a few hours, I had created the perfect Grasshopper script.
Check out the LinkedIn post about it.
The minute my script produced the right results, I practically sprinted across the office to Molly's desk. She fired up the script I had sent her. A window popped up almost immediately: "missing plugin: eleFront R6".
Not ideal, but it was just a missing plugin. I quickly walked back to my desk, sent over the missing plugin, then walked back to Molly's desk. Maybe this hiccup wasn't so bad; maybe Molly still thought computational design was the best thing ever. My thoughts were interrupted by another window. This time it read "missing framework: Accord.NET". Another error.
It wasn't the last one either. I spent the next few hours running back and forth, installing the components my script needed to run. Molly was patient the entire time, but I knew I had let her down. What should have been a perfect demonstration of computational design became a drawn-out exercise in troubleshooting. I felt like an over-glorified IT support person (no offense to any IT people). The script ran perfectly on my machine, but I hadn't considered what it would take to make it work on someone else's.
A hidden challenge in computational design
This experience exposed a challenge in computational design that we rarely talk about: the gap between creating a solution and delivering it to others. While we love talking about algorithms and showing off elegant solutions, we often overlook the complex reality of getting our tools into other people's hands. This challenge is uniquely difficult in computational design, where our scripts rely on specific combinations of software versions, plugins, and frameworks that may not exist on other machines.
It's easy to get so caught up in the excitement of problem solving that we forget to make sure others can actually use the result. You can have the best solution in the world, but it won't make a difference if other people can't run it.
Unlike traditional software development, where deployment tools and practices are well-established, computational design scripts often exist in a grey area between custom tools and professional software. We're creating sophisticated solutions using platforms that weren't necessarily built for broad distribution.
The excitement of solving complex problems can blind us to the practical realities of implementation. A script that saves hours in theory becomes worthless if it takes days to deploy and requires constant maintenance. Imagine if, every time someone needed to run the script, they had to ask you for help and you spent another couple of hours getting it to work.
I've written about a framework (really, a list of questions) that has helped me understand this better in Why we need more than just scripts.
Making deployment a priority
Since that day with Molly, I've changed how I approach creating solutions for others.
Think about delivery from day one
Before even starting, I now ask questions like "Who needs to use this?" and "What do they need to make it work?". It's about shifting my mindset to the user first, problem later, and understanding the long-term impact of the script instead of just the short-term one.
Create snapshot folders
For every important script, I try to maintain a complete deployment folder that includes:
A frozen version of all required plugins and their dependencies
Test data sets that verify the script's outputs
Some documentation on how the script works and how to use it
This gives the script a "home": a place I can point others to if they need more information when I'm not available. It also means they can learn about and install the script at any time without having to wait for me.
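For concreteness, here's a minimal Python sketch of how such a snapshot folder might be assembled. It assumes the default Grasshopper plugin folder on Windows, and the script and plugin filenames are placeholders, not the real ones from Molly's project:

```python
import hashlib
import json
import shutil
from pathlib import Path

# Assumed default Grasshopper plugin folder on Windows; adjust for your setup.
LIBRARIES = Path.home() / "AppData" / "Roaming" / "Grasshopper" / "Libraries"

# Placeholder filenames; list the actual plugins your script depends on.
REQUIRED_PLUGINS = ["elefront.gha", "Accord.dll"]

def build_snapshot(script, test_data, destination):
    """Copy the script, its plugins, and test data into one frozen folder."""
    destination.mkdir(parents=True, exist_ok=True)
    shutil.copy2(script, destination / script.name)
    shutil.copytree(test_data, destination / "test_data", dirs_exist_ok=True)

    plugin_dir = destination / "plugins"
    plugin_dir.mkdir(exist_ok=True)
    manifest = {}
    for name in REQUIRED_PLUGINS:
        source = LIBRARIES / name
        shutil.copy2(source, plugin_dir / name)
        # Record a hash so the frozen copy can be verified later.
        manifest[name] = hashlib.sha256(source.read_bytes()).hexdigest()

    (destination / "manifest.json").write_text(json.dumps(manifest, indent=2))

build_snapshot(
    Path("route_travel_times.gh"),          # hypothetical script name
    Path("test_data"),                      # known inputs with expected outputs
    Path("deploy/route_travel_times_v1"),   # the snapshot "home"
)
```

Freezing the actual plugin files, rather than just noting their names, means the snapshot keeps working even after the plugins update upstream.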
Pretend you are a user
I now try to maintain a "clean" machine that mirrors our standard office setup, and every script gets tested there before deployment. This can be as simple as borrowing a blank laptop and seeing if the script runs as expected. The idea is to mimic a user's environment first, without bothering them. This process has shown me a few surprising dependencies I would have missed (a simple preflight check, sketched after this list, can flag some of these automatically):
Custom Python libraries that I'd forgotten I'd installed years ago
Grasshopper plugins so familiar to me that I considered them standard
Operating system permissions that I have, but others don't
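One way to catch the first two kinds of gaps before a user does is a small check that runs before anything else, so missing dependencies surface as one clear message instead of the series of pop-ups Molly saw. A minimal sketch, again assuming the default Windows plugin folder and placeholder dependency names:

```python
from pathlib import Path

# Assumed default Grasshopper plugin folder on Windows; adjust per machine.
LIBRARIES = Path.home() / "AppData" / "Roaming" / "Grasshopper" / "Libraries"

# Placeholder names for the plugins and frameworks the script needs.
REQUIRED = ["elefront.gha", "Accord.dll"]

def preflight(folder=LIBRARIES, required=REQUIRED):
    """Return the names of required files that are not installed."""
    return [name for name in required if not (folder / name).exists()]

missing = preflight()
if missing:
    # One consolidated message beats a trail of surprise pop-ups.
    raise SystemExit("Missing dependencies: " + ", ".join(missing))
print("All dependencies found.")
```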
Unfortunately, Molly was my test "computer" this time. I could have saved all that time if I had just borrowed a blank computer and tested the script before going back to her. I did have the time; I just rushed to impress her.
Moving forward
As computational designers, we often measure our success by how elegantly we solve problems or how much time we save. But we should also measure it by how effectively others can use our solutions. If your solution can save two hours but takes six hours to set up every time, then it doesn't really save any time.
My experience with Molly taught me that the gap between creating and delivering solutions is just as important as the technical challenges we love to solve. Our role isn't just to create clever solutions, but to ensure they can actually make a difference in someone else's work.
The next time you're excited about solving a problem, take a moment to think about the journey your solution needs to take from your computer to someone else's.