Automating EpicBook Deployment with Terraform, Ansible, and Azure DevOps Pipelines
In modern DevOps workflows, managing infrastructure and application deployments efficiently and securely is critical. In my recent capstone project, I automated the end-to-end deployment of the EpicBook application using Azure DevOps Pipelines, Terraform, and Ansible, applying a dual-repository model to separate infrastructure and application responsibilities.
Here’s how I structured and executed the project:
Step 1️⃣ — Create and Connect Repositories
I used two GitHub repositories to organize my workflow:
epicbook-azure-with-ansible – contains all Terraform configuration files for provisioning Azure resources.
Epicbook-ansible – contains the EpicBook application source code and Ansible playbooks to configure VMs and deploy the app.
Instead of importing into Azure Repos, I connected Azure DevOps directly to my GitHub account using a GitHub Service Connection. This allowed Azure Pipelines to automatically pull the latest code and reduced manual steps, creating a seamless CI/CD workflow.
Step 2️⃣ — Create Azure Resource Manager (ARM) Service Connection
In Azure DevOps → Project Settings → Service Connections → New Service Connection → Azure Resource Manager, I created a service connection using the Service Principal option, providing:
Tenant ID
Subscription ID
Client ID
Client Secret
This service connection enabled Terraform to authenticate securely and provision Azure resources automatically during pipeline execution.
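As a sketch, the underlying service principal could also be created with the Azure CLI (the name shown is a placeholder, not the one used in this project); its output supplies the values the service connection asks for:

```shell
# Create a service principal with Contributor rights on one subscription.
# "epicbook-devops-sp" is a hypothetical name.
az ad sp create-for-rbac \
  --name "epicbook-devops-sp" \
  --role Contributor \
  --scopes "/subscriptions/<SUBSCRIPTION_ID>"

# The JSON output maps onto the service connection fields:
#   appId    -> Client ID
#   password -> Client Secret
#   tenant   -> Tenant ID
```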
Step 3️⃣ — Upload SSH Keys and Sensitive Configuration as Secure Files
To manage access securely, I uploaded sensitive files to Azure DevOps Secure Files:
id_rsa_azure – private key for VM access
id_rsa_azure.pub – public key
dev.tfvars.txt – contains sensitive Terraform variables (renamed in the pipeline to dev.tfvars)
web.yml.txt – contains Ansible variables for deployment (renamed in the pipeline to web.yml)
⚠️ Security Note: Never commit files like dev.tfvars or web.yml with secrets to your repository. Uploading them as Secure Files ensures they are encrypted and only available during pipeline runs. This is crucial for protecting credentials, database passwords, and other sensitive information.
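As a minimal sketch, downloading and renaming one of these Secure Files inside an Azure Pipelines YAML definition might look like this (the target path is an assumption):

```yaml
steps:
  # DownloadSecureFile@1 fetches the file to the agent and exposes
  # its local path via the step's secureFilePath output variable.
  - task: DownloadSecureFile@1
    name: tfvarsFile
    inputs:
      secureFile: 'dev.tfvars.txt'

  # Rename it so Terraform picks it up as dev.tfvars.
  - bash: |
      cp "$(tfvarsFile.secureFilePath)" "$(Build.SourcesDirectory)/dev.tfvars"
    displayName: 'Stage dev.tfvars'
```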
Step 4️⃣ — Infra Repository Setup (Terraform)
In epicbook-azure-with-ansible, I structured the Terraform code into modules:
network → resource group, VNet, and subnets
compute → frontend and backend Ubuntu VMs
database → MySQL database (PaaS)
I defined outputs in outputs.tf to expose:
frontend_public_ip
mysql_fqdn
These outputs were later used to configure the application via Ansible.
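A minimal sketch of what outputs.tf might contain (the module and attribute names are assumptions that depend on how the modules are written):

```hcl
output "frontend_public_ip" {
  description = "Public IP of the frontend VM"
  value       = module.compute.frontend_public_ip
}

output "mysql_fqdn" {
  description = "Fully qualified domain name of the MySQL server"
  value       = module.database.mysql_fqdn
}
```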
Step 5️⃣ — Create Infra Pipeline in Azure DevOps
I created a YAML pipeline for the infra repository:
Installed Terraform
Downloaded the SSH public key and dev.tfvars.txt from Secure Files
Renamed dev.tfvars.txt to dev.tfvars within the pipeline
Ran terraform init, terraform plan, and terraform apply
The pipeline successfully provisioned:
Frontend and backend Ubuntu VMs
MySQL database
VNet and subnets
Terraform outputs (frontend_public_ip, mysql_fqdn) were visible in the pipeline logs.
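Put together, the infra pipeline steps above could be sketched roughly as follows. The TerraformInstaller task assumes the Microsoft DevLabs Terraform extension is installed, and the secret variable names are placeholders:

```yaml
trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: TerraformInstaller@0        # from the Microsoft DevLabs extension
    inputs:
      terraformVersion: 'latest'

  - task: DownloadSecureFile@1
    name: tfvars
    inputs:
      secureFile: 'dev.tfvars.txt'

  - task: DownloadSecureFile@1
    name: sshPub
    inputs:
      secureFile: 'id_rsa_azure.pub'

  - bash: |
      cp "$(tfvars.secureFilePath)" dev.tfvars
      terraform init
      terraform plan -var-file=dev.tfvars -out=tfplan
      terraform apply -auto-approve tfplan
    displayName: 'Terraform init / plan / apply'
    env:
      # Credentials from the ARM service principal, stored as secret
      # pipeline variables (variable names here are assumptions).
      ARM_CLIENT_ID: $(armClientId)
      ARM_CLIENT_SECRET: $(armClientSecret)
      ARM_SUBSCRIPTION_ID: $(armSubscriptionId)
      ARM_TENANT_ID: $(armTenantId)
```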
Step 6️⃣ — Manual Handoff of Terraform Outputs
I copied the Terraform outputs from the infra pipeline logs and updated the app repository:
inventory.ini → frontend and backend IPs
group_vars/web.yml → prepared for database connection (kept secrets in Secure Files, not committed)
These updates ensured Ansible could connect to the correct infrastructure without exposing sensitive values in the repository.
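In practice, the handoff amounts to pasting the Terraform outputs into a file like this (IPs and the admin user are placeholders):

```ini
# inventory.ini — IPs copied from the infra pipeline's Terraform outputs
[frontend]
<frontend_public_ip> ansible_user=azureuser ansible_ssh_private_key_file=~/.ssh/id_rsa_azure

[backend]
<backend_ip> ansible_user=azureuser ansible_ssh_private_key_file=~/.ssh/id_rsa_azure
```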
Step 7️⃣ — App Repository Setup (Ansible)
The Epicbook-ansible repository contained:
EpicBook application code for frontend and backend
Ansible roles and playbooks:
common → installed common packages
epicbook → deployed the app and configured database connection
nginx → configured Nginx to serve the application
The roles were organized for reusability and clean structure, following best practices.
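A top-level playbook wiring these roles together might look like this (a sketch; the play/host grouping is an assumption about the repo layout):

```yaml
# site.yml — hypothetical entry-point playbook
- name: Install common packages on all servers
  hosts: all
  become: true
  roles:
    - common

- name: Deploy EpicBook and configure the database connection
  hosts: backend
  become: true
  roles:
    - epicbook

- name: Configure Nginx to serve the application
  hosts: frontend
  become: true
  roles:
    - nginx
```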
Step 8️⃣ — Create App Pipeline in Azure DevOps
For the app repository, the YAML pipeline:
Installed Ansible
Downloaded the SSH private key and web.yml.txt from Secure Files
Renamed web.yml.txt to web.yml and moved it to group_vars/web.yml
Set permissions on the key (chmod 600)
Ran the main Ansible playbook
The pipeline connected to both VMs and completed all tasks successfully.
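The app pipeline steps above could be sketched roughly as follows (the playbook name site.yml and the key path are assumptions):

```yaml
steps:
  - bash: sudo apt-get update && sudo apt-get install -y ansible
    displayName: 'Install Ansible'

  - task: DownloadSecureFile@1
    name: sshKey
    inputs:
      secureFile: 'id_rsa_azure'

  - task: DownloadSecureFile@1
    name: webVars
    inputs:
      secureFile: 'web.yml.txt'

  - bash: |
      mkdir -p ~/.ssh
      cp "$(sshKey.secureFilePath)" ~/.ssh/id_rsa_azure
      chmod 600 ~/.ssh/id_rsa_azure            # SSH refuses world-readable keys
      cp "$(webVars.secureFilePath)" group_vars/web.yml
      ansible-playbook -i inventory.ini site.yml
    displayName: 'Run Ansible playbook'
```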
Step 9️⃣ — Validate Application Deployment
Finally, I:
Copied the frontend VM public IP from Terraform outputs
Opened the EpicBook app in a browser
Submitted a test post to verify backend and database connectivity
✅ The full workflow, from infrastructure provisioning to app deployment, ran successfully; the only manual step was handing the Terraform outputs over to the app repository.
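A quick smoke test can also be run from the command line (the IP is a placeholder for the frontend_public_ip output):

```shell
# Expect an HTTP 200 from Nginx if the deployment succeeded
curl -I http://<frontend_public_ip>/
```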
Key Takeaways
Separation of Concerns – Using two repositories and two pipelines (Infra vs. App) mirrors enterprise best practices.
Secure Handling of Secrets – dev.tfvars and web.yml were kept out of the repository and accessed securely via Secure Files, preventing accidental leaks.
End-to-End CI/CD – Terraform provisioning feeds directly into application deployment through Azure Pipelines, demonstrating a production-grade workflow.
Reusable and Modular Code – Terraform modules and Ansible roles ensure maintainability and scalability.
Infra repository: https://github.com/Hajixhayjhay/epicbook-azure-with-ansible.git