Patagonia Database Modelling (SQL)
Project Timeline:
8 weeks (March - April 2025)
Team:
Jaden Louie, Joseph Lucas
Toolkit:
Draw.io, Miro, SQL, MySQL, and AI tools (e.g., ChatGPT & DeepSeek) for assistance
Role:
Co-Database Designer
Deliverables:
Topic Pitch Presentation, Database Schema, Operations Queries & Project Report
Overview
This project involved designing a real-world MySQL database for Patagonia’s online retail system. Our goal was to model and support Patagonia’s end-to-end business processes—from checkout to repair—using SQL and entity-relationship design principles.
Objective
Develop a comprehensive backend database for Patagonia’s e-commerce workflow
Capture key operations such as order checkout, repairs, shipping, and customer account management
Use SQL procedures to simulate business logic including inventory updates and refunds
Methodology
Created 6 use cases and mapped business processes into a normalized ER diagram
Designed SQL procedures for inserting orders, processing returns/refunds, tracking shipments, etc.
Used stored procedures to simulate transactional workflows
Developed inventory deduction and restocking logic
Integrated repair logistics including technician assignment and material tracking
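The project's actual procedures aren't reproduced in full here, but the inventory deduction and restocking logic mentioned above might look roughly like the following sketch. Table and column names (ProductVariant, StockQuantity, VariantID) are illustrative assumptions, not the exact project schema.

```sql
-- Hypothetical sketch of the inventory logic: deduct stock when an
-- order item is recorded, restock when a return is accepted.
DELIMITER //

CREATE PROCEDURE DeductInventory(IN p_variant_id INT, IN p_qty INT)
BEGIN
    UPDATE ProductVariant
    SET StockQuantity = StockQuantity - p_qty
    WHERE VariantID = p_variant_id
      AND StockQuantity >= p_qty;   -- guard against negative stock
END //

CREATE PROCEDURE RestockInventory(IN p_variant_id INT, IN p_qty INT)
BEGIN
    UPDATE ProductVariant
    SET StockQuantity = StockQuantity + p_qty
    WHERE VariantID = p_variant_id;
END //

DELIMITER ;
```

Wrapping both directions (deduction and restocking) in stored procedures keeps the stock-update rules in one place instead of scattering raw UPDATEs across the application.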
Project Start
Use Cases
To start the project, we identified six key use cases from Patagonia’s business model — including Checkout, Cart Management, Customer Accounts, Order Shipping, Returns, and Repairs. Then, from each use case, we created a detailed template outlining the actors, steps, and data inputs and outputs. (follow link to see generated use case template)
ER Diagram
Using the use cases as a guide, we built a fully normalized ER diagram to model Patagonia’s backend operations. It includes 23 entities and captures key relationships like customer orders, shipments, returns, and repairs. This ERD helped ensure our database could support real-world workflows like inventory updates, technician assignments, and payment processing.
Creating our Tables on SQL
To build our schema, we started by creating the independent tables first—these included tables like Customer, ProductCategory, and RepairTechnician. These tables don't rely on other entities and define primary keys and basic attributes.
Once those were complete, we moved on to dependent tables—like Product, ProductVariant, and OrderItem—which include foreign key relationships.
We followed the logical order of dependency, making sure referenced data was inserted first, like ensuring a product existed before creating its variant.
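The dependency ordering described above can be sketched as follows: independent tables are created first, then tables that reference them. The column names here are assumptions for illustration, not the exact project schema.

```sql
-- Independent tables first: no foreign keys, just primary keys
-- and basic attributes.
CREATE TABLE Customer (
    CustomerID INT PRIMARY KEY,
    Name       VARCHAR(100) NOT NULL,
    Email      VARCHAR(100) UNIQUE
);

CREATE TABLE Product (
    ProductID  INT PRIMARY KEY,
    Name       VARCHAR(100) NOT NULL
);

-- ProductVariant depends on Product, so it must be created after it;
-- the foreign key enforces that a variant's product already exists.
CREATE TABLE ProductVariant (
    VariantID  INT PRIMARY KEY,
    ProductID  INT NOT NULL,
    Size       VARCHAR(10),
    Color      VARCHAR(30),
    FOREIGN KEY (ProductID) REFERENCES Product(ProductID)
);
```

The same ordering applies at insert time: a row must exist in Product before a ProductVariant row can reference it.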
Inserting Data
Lastly, we inserted around 50 rows of dummy data into transactional tables to test queries and fulfill the project requirements. A combination of DataGrip, ChatGPT, and DeepSeek was used to assist in this part.
Finished Database: We then verified data entry by running SELECT statements against each table. As a result, we had the foundation of our Patagonia database in place, against which we can now run business-logic queries useful for managers and decision makers.
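A sanity check of this kind might look like the following join, which confirms that each order item lines up with a real customer and product variant. Table and column names are illustrative assumptions, not the project's exact schema.

```sql
-- Illustrative validation query: every OrderItem row should resolve
-- to an existing order, customer, and product variant.
-- `Order` is backtick-quoted because ORDER is a reserved word in MySQL.
SELECT o.OrderID,
       c.Name AS CustomerName,
       v.VariantID,
       oi.Quantity
FROM OrderItem oi
JOIN `Order` o        ON o.OrderID = oi.OrderID
JOIN Customer c       ON c.CustomerID = o.CustomerID
JOIN ProductVariant v ON v.VariantID = oi.VariantID;
```

If a row was inserted with a dangling reference, it would simply drop out of this inner join, making gaps easy to spot.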
Operations Queries
To support day-to-day operations and business decision-making, we designed a separate SQL script that performs essential actions aligned with each use case, such as creating customers, orders, repair requests, and more (essentially replicating our use cases). These procedures were built to simulate realistic business logic and transactional workflows, reflecting the end-to-end process flow within Patagonia’s system.
To keep things simple, I’ll walk through one example from Use Case 3: creating a new customer. This procedure simulates a real-world customer sign-up. We created a stored procedure called CreateNewCustomer that takes the customer’s details - ID, name, email, phone number, and address ID - and inserts them into the Customer table. Once executed, it returns a confirmation message showing the newly created customer ID. As shown in the result table below, “John Doe” was successfully added to the database. This kind of procedure ensures consistency and efficiency when handling new customer entries in a live system.
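The CreateNewCustomer procedure described above might be sketched roughly as follows. The parameter names and the Customer table's columns are assumptions for illustration; the procedure name and its inputs (ID, name, email, phone, address ID) come from the description above.

```sql
-- Hedged reconstruction of CreateNewCustomer: insert the customer's
-- details, then return a confirmation with the new customer ID.
DELIMITER //

CREATE PROCEDURE CreateNewCustomer(
    IN p_customer_id INT,
    IN p_name        VARCHAR(100),
    IN p_email       VARCHAR(100),
    IN p_phone       VARCHAR(20),
    IN p_address_id  INT
)
BEGIN
    INSERT INTO Customer (CustomerID, Name, Email, Phone, AddressID)
    VALUES (p_customer_id, p_name, p_email, p_phone, p_address_id);

    -- Confirmation message echoing the newly created customer ID
    SELECT CONCAT('Created customer ', p_customer_id) AS Confirmation;
END //

DELIMITER ;

-- Example call (hypothetical values):
CALL CreateNewCustomer(101, 'John Doe', 'john@example.com', '555-0100', 1);
```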
The following link shows the results of running all these operational queries (link to results). If you would like to run these queries yourself, download both SQL files, run the schema file first, then run each use case in the queries file one by one to see how each case affects the database.
Project End
Reflections
This project was instrumental in deepening my skills in SQL and database modeling. Designing Patagonia’s backend operations from scratch gave me the opportunity to not just apply theoretical knowledge, but also to think critically about how real-world business processes map to relational structures.
Throughout the project, I became more confident in:
Writing and troubleshooting complex SQL queries and stored procedures
Structuring normalized schemas and resolving many-to-many relationships
Handling foreign key dependencies and sequencing table creation logically
Conclusion
The Patagonia SQL database project gave me hands-on experience in building a fully relational backend system from the ground up. It required me to consider not just how data is stored, but how it flows across processes like checkout, returns, and repairs.
What set this project apart was the integration of practical SQL logic with thoughtful data modeling and the use of AI-assisted development. Each component - whether a stored procedure or ER diagram - was an opportunity to bridge classroom theory with scalable, maintainable systems.
Overall, this project significantly improved my technical fluency in SQL, gave me real exposure to database design thinking, and showed me how modern AI tools can complement human problem-solving in backend development.
More Data Analysis Projects
Cleaned and analyzed global road mortality data using Tableau, leveraging several techniques such as trend analysis, geospatial mapping, and interactive dashboards to create insights on high-risk regions.
Cleaned and analyzed Airbnb listing data to identify trends in occupancy, revenue, and location performance. Built interactive Tableau dashboards and conducted a mock stakeholder presentation to showcase recommendations.
Redesigned website structure and conducted traffic analysis using Google Analytics and A/B testing to create a strategic acquisition plan and improve user engagement.