Page Overview

Background & Transition

From architectural training to software and data-driven systems

Interests & Technical Drive

Curiosity and self-directed learning that shaped my skill development and career progression

Technical Skills in Professional Contexts

How analytical thinking and programming capabilities translated into real-world impact

Role & Work Evolution

  • Research reports and publications

  • Parametric modeling and computational workflows

  • Data engineering and full-stack systems

Selected Full-Stack Projects

  • Garmin Physiological Spatial Insights

  • Mass Timber Carbon Calculator

  • Dynamic Lands Lab

Analytical Mindset

I have long been interested in data-driven design: making decisions based on analytical, quantifiable results rather than pure intuition. In architecture school, this began as data-informed strategy and performance-based thinking.

Academic portfolio excerpt: NYC demographics inform building programming

What if I could build the tools to generate the insight?

This desire to not just use insights, but to build the analytical pipelines that make those insights possible, is largely what led me to venture into programming.

San Francisco and Exposure to the Data Visualization Community

After graduating from KU, I took my first professional job in San Francisco, which exposed me to a culture where software, data and visualization were primary creative disciplines.

I was particularly inspired by the data visualization work emerging from Uber’s data engineering and visualization teams, especially the Kepler.GL and Deck.GL ecosystems.

The translation of invisible data into insights grounded in physical space deeply resonated with my analytical and design-oriented mindset.

I found the education and career journey of Shan He, the lead engineer of Kepler.GL, to be an especially formative influence, as she also came from a formal architectural background and reoriented her career toward software and data.

Self-Directed Technical Growth

My development as a data scientist and software developer has been largely self-directed. While my formal roles did not initially require deep programming, data engineering or computational expertise, I’ve intentionally built those capabilities through personal research, independent projects and technical experimentation.

Grasshopper

Provided my first serious exposure to parametric logic and computational thinking. This trained me to think in systems, design rule-based transformations and treat design (of both physical structures and visualizations) as a dynamic process rather than a static output.

Python

Everything seemed to accelerate when I learned Python. Applying mathematics to rule-based problem solving made calculus and statistics intuitive, sparking my focus on data science, algorithms and data engineering projects.

I began to rely less on native Grasshopper for computational flows and focused more on Python-based computation, using Grasshopper as a data visualization tool.

Python unlocked mathematical modeling, algorithms, analysis and transformation, computer vision, geospatial processing and automation pipelines.

JavaScript

Wanting to bring data visualization to interactive, browser-based environments, I began working with D3.JS, Three.JS and later, Deck.GL.

This allowed me to share interactive visualizations outside of the Rhino/Grasshopper environment.

TypeScript

As projects grew more complex, such as the frontend for DynamicLandsLab, I introduced TypeScript for stronger typing and maintainability.

This is where my focus began to shift from exploratory projects toward building tools that could be used by others, beyond visualization.

React

At the same time I learned TypeScript, I began building with React to create dynamic, interactive tool experiences. React’s emphasis on reusable UI components reinforced ecosystem-level thinking in other areas of programming, including building flexible, function- and class-based tools in Python.

React expanded how I thought about coding practices and UI/UX, as I could now let users select parameters, filter data, trigger workflows and visualize results in real time.

C#

C# emerged through deeper work within the Rhino + Grasshopper ecosystem, building terrain modeling and analysis plugins for Dynamic Lands Lab. While a more recent addition to my language skillset, C# has proven to be a computationally powerful tool as I translate Python tools to C# for compilation, speed and scalability.

Timeline: 2019–2026
Personal Projects & Areas of Investigation

Image processing: moment signatures, similarity clustering and cellular automata

Moment signatures and similarity clustering (with ThreeJS visualization)

Class and object-oriented programming

Application of cellular automata to segmentation of color images and geospatial raster data

Geospatial data and WebGL rendering

Commercial Aviation Fleet Mix and Industry Pattern Analysis

In parallel with the operational modeling at SFO, I independently developed a comprehensive analysis and visualization tool to study spatial and temporal patterns in commercial aviation. The focus was on understanding how fleet mixes had changed over time and how these shifts impacted airport infrastructure. Using O/D data and aircraft type records, I highlighted how emerging aircraft/fleet trends could impact airport design requirements.

I shared this project with the early formation of Corgan’s R&D team, and the project was included in the first annual Curiosity Report. The piece sparked interest across the firm among colleagues who were unaware that this level of data-driven industry analytics was possible internally. The visibility of this project directly contributed to the R&D team opening a data scientist role, for which I was selected; I joined the team in April 2020.

Continued Application

The aviation industry analysis initiative catalyzed a broader effort within the firm to conduct large-scale data-driven market intelligence that continues today.

Professional Applications

Risk management with ML models

Expanding the firm’s analytical, computational and visualization capabilities

Predictive Operations Modeling

While embedded at Corgan’s SFO project office, I developed a predictive analysis tool to support overnight wayfinding deployment. Gate signage had to be revealed across the airport in a single night before flight operations resumed in the morning, and there was a risk of unexpected late-arrival flights disrupting work zones.

On my own time, I built a system that analyzed historical BTS flight data to model late-arrival probability and estimate passenger volume based on aircraft type. Using Grasshopper and Rhino, I mapped this data at 30-minute increments across the airport to visualize when and where late arrivals might occur.
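The core of that analysis can be sketched as a simple aggregation over BTS-style arrival records. This is an illustrative reconstruction, not the production tool: the seat counts, the 15-minute lateness threshold and the record layout are placeholder assumptions.

```python
from collections import defaultdict

# Hypothetical seat-count lookup by aircraft type code (illustrative values only).
SEATS = {"B738": 175, "A320": 150, "B77W": 368}

def late_arrival_stats(flights, threshold_min=15, bin_min=30):
    """flights: iterable of (sched_arr_min, actual_arr_min, aircraft_type),
    with times in minutes after midnight.
    Returns {bin_start: (late_probability, expected_late_passengers)}
    keyed by scheduled-arrival bin (default 30-minute bins)."""
    totals = defaultdict(int)
    late = defaultdict(int)
    late_pax = defaultdict(int)
    for sched, actual, actype in flights:
        b = (sched // bin_min) * bin_min
        totals[b] += 1
        if actual - sched > threshold_min:
            late[b] += 1
            # Estimate passenger volume from aircraft type (160-seat fallback).
            late_pax[b] += SEATS.get(actype, 160)
    return {b: (late[b] / totals[b], late_pax[b]) for b in totals}
```

Per-bin probabilities like these can then be mapped onto gate locations to flag the time windows most at risk of an active work zone conflicting with a late arrival.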

I shared this system with Corgan’s aviation VP, who has since included this project in several client discussions to demonstrate analytical and strategic capabilities.

Evolving Role

*Joined Hugo (Corgan’s Internal R&D team) in April 2020

My role has evolved through self-directed technical and strategic development

Analytics, reports & publications

Data science & visualization

Computational workflows & automation

Market forecasting analytics pipelines

Data engineering

Geospatial analysis and remote sensing

Frontend web development

Full-Stack

I design and build data-driven systems that transform complex data into analytical tools and interactive applications that expand our team’s capabilities and inform real-world decision-making.

Timeline: 2020–2026

My role has evolved significantly over time through self-directed skill development. I began primarily producing reports, publications and analytical visualizations, but became increasingly interested in how to scale analytical processes with automation. That led me to focus more on developing computational workflows and automated data processing pipelines. Over time, this expanded into full-stack development, where I now develop and implement end-to-end analytical systems spanning data engineering, modeling, automation and interactive user interfaces.

Analytics, reports & publications

A sustained contribution to research reports and publications that translate complex analysis into clear, credible insights. My work focuses on grounding claims in rigorous data analysis and presenting results through intuitive visualizations that make patterns immediately legible. Over time, this work has also helped shape the team’s visual language and analytical style across research outputs.

Data science & visualization

A continuous trajectory of expanding my analytical capabilities, from descriptive statistical analytics to machine learning, neural networks and AI tools. I maintain a strong emphasis on deeply understanding the mathematics, data structures and algorithms behind analytical systems.

Computational workflows & automation

A greater focus on streamlining processes for repeatability and usability. This began with simple single-script Python automation tools and now involves codebase- and ecosystem-level thinking.

Market forecasting analytics pipelines

Executive leadership began tasking me with building analytical systems to aid business development and strategic growth decisions. These quickly grew into large-scale, multi-temporal, geographic market analysis and forecasting tools, involving streamlined data engineering pipelines for data ingestion, harmonization, cleaning and analysis, as well as front-end interfaces that make insights actionable for non-technical stakeholders.

Data and systems engineering

As analytical scope grew, I needed to develop repeatable processes to move quickly and efficiently. This strengthened and expanded the ecosystem-level thinking I developed through building computational workflows, with a greater emphasis on data engineering pipelines.

Geospatial analysis and remote sensing

I began working with geospatial raster data from remote sensing sources including Landsat to support environmental and sustainability research efforts. Seeing the value in integrating geospatial analytics in design workflows, I have since been a strong advocate for building geospatial/GIS services at Corgan and have built technical infrastructure to make that possible.

Frontend web development

Took on front-end development projects such as the Mass Timber Carbon Calculator and began applying a greater focus on building front-end web-based interfaces to support analytical projects.

Full-Stack

Embraced a strong emphasis on building end-to-end systems and ecosystem-level thinking from data engineering, analytics and automation to UI/UX.


Reports & Publications

Parametrically generated visualizations and infographics

Market research and analytics with web scraping, text mining and NLP

Energy infrastructure mapping and analysis

Commodity flow analysis and the impact on infrastructure

Spatial analysis tools and visualizations

Interactive housing market and demographics analysis

Energy generating potential by location

Data-Driven Parametric Design Workflows


Client: Hover Energy

Scope: Develop computational approach for wind-screen designs that maximize airflow with minimal turbulence to drive rooftop wind turbine units.

Dynamic CFD responsive screening systems

Logic to maximize airflow to turbines and minimize flow turbulence


Client: EdenGreen (Vertical Farms)

Scope: Develop a streamlined approach for right-sizing vertical farm systems to meet production requirements and spatial constraints. Spatial configuration options for micro-farm integration in commercial structures.

Parametric right-sizing and layouts

Model air flow

Optimize louver profile for max airflow and minimal turbulence

Cluster facade control points with nearest airflow x brep intersections

Loft and join louver profiles

Passenger Experience Modeling with Computer Vision and IMU


Client: Ontario International Airport (ONT)

Scope: Identify design-related pain points and improvement opportunities in passenger experience. Study participants wore eye-tracking glasses to track eye movement while simulating a typical passenger journey.

This streamlined data processing and analysis pipeline ingests eye tracking metrics, recorded video and IMU data to identify when participants displayed signals of confusion and focus.

YOLO object detection and classification

Image EXIF extraction pipeline for automated detection mapping

Combined signal analysis and spatial insights
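The combined-signal step can be sketched as a threshold rule over per-window summaries of the eye-tracking and IMU streams. This is a minimal illustration only: the field names (`fixation_s`, `head_yaw_std`) and thresholds are hypothetical stand-ins, not the features or logic of the actual pipeline.

```python
def flag_confusion(windows, fix_thresh=0.8, motion_thresh=1.5):
    """windows: list of dicts summarizing one time window each, e.g.
    {'t': 5.0, 'fixation_s': 1.1, 'head_yaw_std': 2.0} (hypothetical fields).
    Flags window start times where long fixations coincide with high
    head motion, one simple proxy for wayfinding confusion."""
    return [w['t'] for w in windows
            if w['fixation_s'] > fix_thresh and w['head_yaw_std'] > motion_thresh]
```

Flagged windows can then be joined back to the mapped object detections to locate where in the terminal the confusion signals clustered.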

Full-Stack Applications

Garmin Physiological Spatial Insights

Mass Timber Carbon Calculator

Dynamic Lands Lab: Spatiotemporal land analytics and visualization

Garmin Physiological Spatial Insights

Developed for:

Corgan's R&D team and Education studio to support a comprehensive research study into the physiological impacts of chronic health conditions and the impact of learning environments on teachers.

Notable Features

Python ETL pipeline for multi-sensor time-series synchronization

CRS harmonization with Pyproj

Timestamp-based path reconstruction with recursive interpolation for path smoothing

WebGL geospatial rendering with DeckGL

React state-driven UI for interactive filtering and analysis

GeoJSON layer extrusion toggles allow the user to render the model in 2D and 3D

Buttons activate the project scope area, and tooltips display program/space names on mouseover events

Dropdown lists allow the user to filter data by individual study participant, condition group and physiological metric

Recharts bar charts dynamically update to display aggregated data summaries based on the current filter selections

Sliders connect to the DeckGL layers and control spatial aggregation bin size, allowing customized spatial resolution

Hexbin extrusion heights are scaled using the sliders for visual customization

A full-stack analytical application designed to correlate physiological signals with spatial environments. The system ingests raw Garmin activity exports and processes heart rate, walking pace, cadence, and GPS data through a Python-based transformation pipeline. Multi-sensor time-series streams are synchronized and normalized into a composite geospatial dataset, binding physiological signals directly to spatial coordinates.
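The multi-sensor synchronization step can be sketched as nearest-timestamp alignment within a tolerance. This is a simplified stdlib illustration, not the production pipeline; the stream layout and the two-second default tolerance are assumptions.

```python
import bisect

def sync_streams(base, other, tol_s=2.0):
    """Align `other` (sorted list of (timestamp, value)) onto the timestamps
    of `base` (also sorted). For each base sample, take the nearest `other`
    sample within tol_s seconds, else None, producing one merged record
    per base timestamp."""
    times = [t for t, _ in other]
    out = []
    for t, v in base:
        i = bisect.bisect_left(times, t)
        # Nearest neighbor is either the sample just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - t), default=None)
        if best is not None and abs(times[best] - t) <= tol_s:
            out.append((t, v, other[best][1]))
        else:
            out.append((t, v, None))
    return out
```

Repeating this against one master clock (e.g. the GPS stream) yields the composite records that bind heart rate, pace and cadence to spatial coordinates.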

To ensure spatial integrity, GPS data was harmonized to the WGS84 coordinate reference system. Location points were spatially resampled to mitigate signal noise and drift, then reconstructed into continuous walking paths by sequencing timestamps and recursively interpolating intermediate positions. This enabled smooth, motion-based visualizations of biometric response along movement trajectories.

The frontend application, built with React, Vite, Deck.gl, and Recharts, allows users to interactively explore these spatial-physiological relationships. Stakeholders can visualize biometric fluctuations along paths, filter metrics dynamically, and assess how environmental layout conditions may influence human response. The application is deployed via Vercel for internal use and demonstrations.

Mass Timber Carbon Calculator

Notable Features

Embedded into Corgan’s website for public use

Deployed using Azure DevOps pipelines for automated builds

Implemented custom interactive diagrams and branded data visualizations using D3.js


Development Challenges

Required multiple pages of content in a single-page application without scrolling

Corgan’s website is a Drupal CMS managed by a third-party marketing firm. The site’s architecture does not cater to dynamic webpages, so the application had to be deployed as a microsite, embedded into the Corgan website via an iframe.

Iframe embedding meant the application needed to avoid nested scrolling, and there would be no routing between pages.

This presented the challenge of designing an application with multiple pages of content, within a single, static application.

Solution

An unconventional approach mimics separate pages with React state-managed dynamic divs that expand and collapse on button click events. This provides the appearance of navigating between individual pages in the application without routing to separate pages.


Overview page with links to the white paper and expandable divs

Expanded content contains diagrams from Corgan’s marketing team, definitions and descriptions

The Forest Map section includes an embedded forestry map from the US Forest Service to provide more context about domestic forestry and wood supply

The calculator consists of a side panel for user selections and dynamically adjusted D3.js visuals

The user controls the building’s floor plate size and level count with sliders and material and waste-handling selections with dropdowns. Upon each selection, the calculator recomputes outputs and adjusts visuals in real time

The user can set their project site location and select nearby mass timber suppliers. The calculator adjusts carbon calculations to include transportation related emissions.

Finally, the user can download the results as a CSV and upload them to OneClick LCA or similar tracking tools

A full-stack analytical platform to model the true embodied carbon impacts of mass-timber construction. The tool extends conventional life-cycle assessment approaches by incorporating forestry-stage emissions, including biomass waste (“slash”) generated during timber harvesting. The system enables design teams to evaluate how species selection, transportation distance, and harvesting practices influence the carbon footprint of a timber building.

The application integrates a carbon accounting model with an interactive interface that allows users to explore scenario-based inputs and compare environmental outcomes in real time. By expanding the carbon accounting boundary back to forest harvesting processes, the tool provides a more complete understanding of the environmental implications of mass-timber design decisions.
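As a rough illustration of that expanded accounting boundary, the sketch below folds stored carbon, forestry-stage slash emissions and transport emissions into a single balance. Every factor here is a placeholder for illustration, not a value from the actual calculator or any published LCA source.

```python
# Placeholder factors (illustrative only, not Corgan's published values):
WOOD_DENSITY = 500.0   # kg of wood per m^3
SEQUESTRATION = -1.6   # kg CO2e stored per kg of wood (negative = storage)
SLASH_FACTOR = 0.15    # kg CO2e emitted per kg harvested, from slash decay
TRUCK_EF = 0.0001      # kg CO2e per kg per km of truck transport

def embodied_carbon(volume_m3, transport_km):
    """Toy cradle-to-site carbon balance for a mass-timber element (kg CO2e).
    A negative result means net storage under these placeholder factors."""
    mass = volume_m3 * WOOD_DENSITY
    stored = mass * SEQUESTRATION            # carbon held in the product
    slash = mass * SLASH_FACTOR              # forestry-stage biomass waste
    transport = mass * TRUCK_EF * transport_km
    return stored + slash + transport
```

In the real tool, each slider or dropdown change re-evaluates a model of this shape and pushes the new outputs into the D3.js visuals, which is what makes the scenario comparison feel instantaneous.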

Impact:

  • Provided Corgan with a more rigorous framework for evaluating mass-timber sustainability claims

  • Expanded internal capability around embodied carbon analytics

  • Supported sustainability research and client-facing design analysis

Dynamic Lands Lab

Dynamic Lands Lab is a personal research and development project focused on spatiotemporal analysis of geospatial data to better understand land characteristics and environmental change. The system integrates multisource datasets—including Landsat satellite imagery, LiDAR terrain data, agricultural land cover data, OpenStreetMap features, and historical weather records—to generate a multidimensional representation of terrain conditions and their evolution over time.

The project consists of a Python-based data engineering and analysis pipeline paired with an interactive web application built with React and Deck.gl. Together, these components allow users to explore land characteristics, filter terrain based on environmental indicators, and analyze spatial trends relevant to agricultural viability, land restoration, and land management.

The platform currently comprises a ~15k-line codebase spanning Python and TypeScript/JavaScript, with ongoing development focused on expanding analytical capabilities and improving performance, notably through the development of C# components.