The purpose of this project is to demonstrate skills in the ETL (extract, transform, load) process. For this project, our team sought out several fast food databases from different franchises (in this case: Subway, Burger King, McDonald's, and Starbucks) and loaded the information into one cleaned SQL database.
Data for this project was sourced from https://www.kaggle.com/.
Instructions on how to implement our process can be found below.
Follow these steps to create the database and transfer the data:
- Clone repo to your desktop.
- Open `pgAdmin4` and create a new database titled `FastFood_db`.
- Navigate to `FastFood_db` and open a query tool.
- Open `ERD.sql` from the cloned repo folder.
- Select all and run query.
- Close the query tool and open a new query tool from `FastFood_db`.
- Open `food_class.sql`.
- Select all and run query.
- Close query tool.
- On your desktop, navigate to the cloned repo folder.
- Right-click the repo folder and select `Git Bash Here` to open a new git bash terminal.
- Type `source activate PythonData` in the terminal and press enter.
- From the terminal, run `jupyter notebook`.
- In Jupyter Notebook, create a new text file and rename it `config.py`.
- Within `config.py`, create a variable called `username` and a variable called `password`, and assign your postgres username and password to the respective variables (see the sketch after these steps).
- Save and close `config.py`.
- Navigate to and open `Restaurant_ETL.ipynb`.
- Go to Kernel and select Restart & Run All.
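For reference, here is a minimal sketch of what `config.py` should contain, plus the kind of SQLAlchemy connection that `Restaurant_ETL.ipynb` can build from it. The host `localhost`, port `5432`, and the exact connection code are assumptions; adjust them to match your local postgres setup and the notebook itself.

```python
# config.py -- postgres credentials read by Restaurant_ETL.ipynb
# (keep this file out of version control)
username = "your_postgres_username"
password = "your_postgres_password"
```

```python
# Sketch of the connection the notebook can build from config.py
# (localhost and port 5432 are assumptions -- adjust if your server differs).
from sqlalchemy import create_engine
from config import username, password

engine = create_engine(f"postgresql://{username}:{password}@localhost:5432/FastFood_db")
```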
Example Query
To test the database, run the following example query (an optional Python check is shown after these steps):
- Open `pgAdmin4`.
- Navigate to `FastFood_db` and open a query tool.
- Open `example.sql`.
- Select all and run query.
- Close query tool.
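If you would rather sanity-check the load without pgAdmin4, the same SQLAlchemy engine from the sketch above can run a query directly. The table name `menu_items` below is a placeholder, not necessarily part of this schema; substitute a table defined in `ERD.sql`.

```python
from sqlalchemy import create_engine, text
from config import username, password

# Same assumed connection as above (localhost:5432).
engine = create_engine(f"postgresql://{username}:{password}@localhost:5432/FastFood_db")

# 'menu_items' is a placeholder table name -- use a table defined in ERD.sql.
with engine.connect() as conn:
    for row in conn.execute(text("SELECT * FROM menu_items LIMIT 5;")):
        print(row)
```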