Traditional remote sensing workflows are often bottlenecked by data volume and hardware limitations. Analyzing two decades of tree canopy change for a municipality like Clarksville, Tennessee, typically requires downloading gigabytes of raster imagery, managing local storage, and performing computationally intensive processing on a desktop workstation. This "offline" approach creates a barrier to entry for stakeholders who need immediate access to environmental insights but lack the software or hardware to process raw satellite data. The objective of this project was to bypass these physical limitations by developing a cloud-native web service. The goal was to engineer a lightweight, browser-based application that leverages server-side parallel processing to visualize 22 years of vegetation change instantly, allowing users to interact with the results without downloading a single file.
The solution was built with the Google Earth Engine (GEE) JavaScript API, using a client-server architecture in which the browser acts as the interface while Google's cloud infrastructure performs the heavy lifting. I began by scripting an automated retrieval pipeline to access the "Science Percent Tree Canopy Cover" dataset from the Earth Engine data catalog. Instead of downloading these rasters, I wrote logic to filter the collection by geometry (Clarksville) and date (2001 and 2023) directly on the server. I then implemented a differencing algorithm to quantify canopy change, applying classification thresholds to categorize the raw pixel values into semantic classes (e.g., Significant Loss, Regrowth, No Change).
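The server-side pipeline described above can be sketched in Earth Engine JavaScript. This is an illustration rather than the project's actual script: the boundary asset, dataset ID, band name, and class thresholds are all assumptions, and the code runs only inside the Earth Engine Code Editor.

```javascript
// Sketch only: the boundary asset, dataset ID, and thresholds below are
// illustrative assumptions. Runs in the Earth Engine Code Editor.
var clarksville = ee.FeatureCollection('users/example/clarksville_boundary'); // hypothetical asset
var aoi = clarksville.geometry();

var tcc = ee.ImageCollection('USGS/NLCD_RELEASES/2021_REL/TCC/v2021-4') // illustrative ID
    .select('Science_Percent_Tree_Canopy_Cover');

var canopy2001 = tcc.filterDate('2001-01-01', '2001-12-31').first();
var canopy2023 = tcc.filterDate('2023-01-01', '2023-12-31').first();

// Server-side differencing: positive values = gain, negative = loss.
var change = ee.Image(canopy2023).subtract(canopy2001);

// Threshold the continuous difference into semantic classes.
var classified = ee.Image(0)                      // 0 = No Change
    .where(change.lte(-25), 1)                    // 1 = Significant Loss
    .where(change.gt(-25).and(change.lte(-5)), 2) // 2 = Moderate Loss
    .where(change.gte(5), 3)                      // 3 = Regrowth
    .clip(aoi);

Map.centerObject(clarksville, 11);
Map.addLayer(classified,
    {min: 0, max: 3, palette: ['gray', 'red', 'orange', 'green']},
    'Canopy Change 2001-2023');
```

Nothing in this sketch moves pixels to the browser; each statement only extends a computation graph that Google's servers evaluate when the map tiles are requested.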
To transform this analysis into a user-friendly web application, I used the GEE User Interface (ui) library to build a custom frontend. I engineered a dynamic legend panel and an interactive chart interface that summarized canopy-change statistics on the fly. I also programmed the generation of a time-series animation (GIF) directly within the Code Editor console, using ui.Thumbnail to render a frame-by-frame visualization of the landscape's evolution. This ensured that the visualization was not a static output but a dynamic response to the code being executed in real time.
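A compressed sketch of that UI layer is shown below. It is again Code Editor-only, with assumed names (`tccCollection`, `aoi`) and illustrative class labels and colors:

```javascript
// Legend panel built with the ui library (class names/colors illustrative).
var legend = ui.Panel({style: {position: 'bottom-left', padding: '8px'}});
legend.add(ui.Label('Canopy Change 2001-2023', {fontWeight: 'bold'}));
[['red', 'Significant Loss'], ['orange', 'Moderate Loss'],
 ['green', 'Regrowth'], ['gray', 'No Change']].forEach(function(c) {
  var swatch = ui.Label('', {backgroundColor: c[0], padding: '8px', margin: '2px'});
  legend.add(ui.Panel([swatch, ui.Label(c[1])], ui.Panel.Layout.flow('horizontal')));
});
Map.add(legend);

// Frame-by-frame GIF in the console: passing an ImageCollection to
// ui.Thumbnail renders an animated thumbnail. 'tccCollection' and 'aoi'
// are assumed to be defined earlier in the script.
print(ui.Thumbnail(tccCollection, {
  region: aoi, dimensions: 480, framesPerSecond: 2,
  min: 0, max: 100, palette: ['white', 'darkgreen']
}));
```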
Percent Tree Canopy Cover 2001
Percent Tree Canopy Cover 2023
Classified Tree Canopy Change Map (2001–2023)
The final deliverable was a deployable script that functions as an end-to-end web service. When shared, the script instantiates a session that processes over two decades of geospatial data in seconds, rendering the classified change map directly as map tiles in the user's browser. The application successfully identified urban expansion zones where canopy coverage had declined, visualizing these patterns through the interactive map and the accompanying statistical charts.
By keeping the data and processing in the cloud, the tool eliminated the latency associated with traditional GIS workflows. The inclusion of the custom UI elements—specifically the automated legend and the area-change charts—bridged the gap between raw code and end-user accessibility. This allowed non-technical stakeholders to view not just the spatial distribution of canopy loss, but the quantitative breakdown of that change, all within a standard web page environment.
Animated Visualization of Tree Canopy Change
Area by Tree Canopy Change Class Chart
Shareable Script Example
I developed a cloud-based geospatial application using the Google Earth Engine API to analyze and visualize tree canopy change. I scripted server-side processing instructions to handle multi-temporal raster math and built a client-side user interface (charts, legends, animations) to present the results dynamically in a web browser.
This project fundamentally changed my understanding of Web GIS architecture. In traditional web mapping (like ArcGIS Online), we often publish pre-cooked "tiles" that are static. In Earth Engine, I learned how to build a service where the analysis happens on request. The challenge was learning to think asynchronously: understanding that the code I write in the browser is merely a set of instructions sent to the server, and the image I see is the server streaming the answer back. I also realized the power of the ui library; adding a simple legend or chart transforms a raw script into a polished product that feels like a professional software tool.
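That asynchronous split can be demonstrated in a few lines; a sketch that, again, runs only in the Earth Engine Code Editor:

```javascript
// This performs no arithmetic in the browser: it only builds a description
// of a computation to be sent to Google's servers.
var serverNumber = ee.Number(2).add(3);

// evaluate() ships that description to the server and invokes the callback
// asynchronously when the answer streams back.
serverNumber.evaluate(function(result) {
  print('Computed server-side:', result);
});
```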
Learning

I learned that code is the most efficient delivery mechanism for geospatial analysis. Instead of shipping a 5 GB map package, I can ship a 5 KB script that generates the same result. This "Function as a Service" (FaaS) model is the future of environmental monitoring. Moving forward, I plan to explore converting these scripts into standalone Earth Engine Apps, which remove the code editor entirely and present the user with a clean, app-like interface, further lowering the barrier for decision-makers to use remote sensing data.
For the George Washington Birthplace National Monument (GEWA), possessing robust ecological data was only half the battle; the second challenge was accessibility. Park managers and biologists needed to query fish sampling records and perform basic spatial analyses, such as determining the impact zone around specific stream reaches, without requiring access to complex desktop GIS software like ArcGIS Pro. The objective of this project was to democratize access to the park’s spatial data by deploying a lightweight Web GIS Application. The goal was to bridge the gap between the Enterprise Geodatabase and the end-user, creating a browser-based interface where non-technical staff could visualize live data and run custom geoprocessing tools on demand.
The workflow utilized ArcGIS Enterprise to transform static data into dynamic web services. I began by publishing the SQL-backed monitoring data as Feature Services to ArcGIS Server, ensuring that the web map consumed live data rather than static copies. This meant that any update to the backend database was immediately reflected in the web application. To enhance the user experience, I utilized Arcade scripting to configure intelligent pop-ups. Instead of simply listing attributes, I wrote expressions that dynamically generated hyperlinks to external species databases (such as FishBase) based on the specific species ID clicked by the user, seamlessly integrating internal park data with external scientific repositories.
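An Arcade pop-up expression along these lines could look as follows. The field name and the FishBase URL pattern are assumptions for illustration, not the project's actual configuration:

```arcade
// Build a FishBase link from the clicked record (sketch; field name assumed).
var sciName = $feature.SpeciesName;  // hypothetical field, e.g. "Lepomis macrochirus"
var parts = Split(sciName, ' ');
// FishBase summary pages follow a Genus-species.html pattern.
return 'https://www.fishbase.se/summary/' + parts[0] + '-' + parts[1] + '.html';
```

The returned string can then be referenced in the pop-up configuration as the target of a hyperlink element.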
The core technical achievement of the project was the development of a Custom Geoprocessing Service. I recognized that managers frequently needed to define "impact buffers" around sampling sites, but standard web widgets often lacked the specific logic to clip these buffers to the park boundary. I authored a model in ArcGIS Pro that accepted a user-defined distance, buffered the input points, and mathematically clipped the result to the GEWA administrative boundary. I then published this logic as a REST endpoint (GPServer), effectively exposing a server-side analytical tool that could be triggered by a simple widget in the web browser.
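From the client side, such a GPServer task can be invoked with the ArcGIS API for JavaScript. A minimal sketch, in which the service URL, parameter names, and input FeatureSet are hypothetical:

```javascript
// Sketch: calling a published GPServer task from the browser (ArcGIS API
// for JavaScript 4.x). The URL and parameter names are hypothetical.
require(["esri/rest/geoprocessor", "esri/rest/support/LinearUnit"],
  function(geoprocessor, LinearUnit) {
    var url = "https://gis.example.org/server/rest/services/" +
              "GEWA_BufferClip/GPServer/BufferClip";
    var params = {
      Input_Points: sampleSitesFeatureSet,  // assumed FeatureSet of sampling sites
      Distance: new LinearUnit({distance: 50, units: "meters"})
    };
    // execute() suits synchronous GP services; long-running asynchronous
    // services would use geoprocessor.submitJob() instead.
    geoprocessor.execute(url, params).then(function(response) {
      var clipped = response.results[0].value;  // boundary-clipped buffer
      // ...add the 'clipped' features to the map
    });
  });
```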
Geoprocessing Model
The final deliverable was a fully integrated web application that empowered park staff to perform self-service analysis. The custom geoprocessing widget functioned successfully within the browser environment, allowing users to input a buffer distance (e.g., "50 meters") and receive a visually distinct, boundary-clipped polygon layer in return. This capability reduced the dependency on GIS specialists for routine tasks. Furthermore, the service was stateless: each request was handled independently, with the server performing the heavy computational lifting so that none of the processing taxed the user's local machine. The application demonstrated a complete Web GIS lifecycle: data was managed in the database, processed on the server, and visualized in the client, creating a seamless decision-support tool for park management.
Web Application - Showcasing Observation Filter
Species Entry with Dynamic Pop-up
Building Entry with Dynamic Pop-up
Geoprocessing Service Live in Web Application
I published enterprise map and feature services to ArcGIS Server and developed a web application using Web AppBuilder. I authored a custom ModelBuilder tool for buffering and clipping spatial data, published it as a Geoprocessing Service, and integrated it into the web interface alongside Arcade-enhanced pop-ups.
This project revealed the complexity of the REST API architecture. Publishing the geoprocessing service was particularly challenging; I learned that logic that works on a desktop machine often fails on a server unless strict parameter definitions (input modes, output data types) are set correctly. I also gained an appreciation for the difference between map services (fast, rasterized tiles) and feature services (slower, but queryable vector data), and how balancing the two is key to web map performance.
I learned that Web GIS is about accessibility, not just cartography. The most beautiful map is useless if the intended audience cannot use it to answer their questions. By building the custom geoprocessing service, I realized that my role as a GIS developer is to encapsulate complex spatial logic into simple, push-button tools. Moving forward, I plan to focus more on Arcade and the ArcGIS API for Python, as I see these as the primary tools for customizing the user experience in modern enterprise environments.