I Stopped Copy-Pasting AWS Console Data and Let Lambda Do It for Me
It did not start as an automation project.
It started with one simple request. “Can you share the VPC and subnet details?”
I opened the AWS console. Clicked into VPCs. Checked subnets. Copied a few values. Dropped them into a spreadsheet. Done.
No complaints. No resistance.
Then the same request came again. Then it came with a deadline. Then it came when I was already busy with something else.
That is usually how these things begin.
The Moment Manual Work Starts Feeling Heavy
At first, the AWS console feels friendly. Everything is visual. Filters work. Lists make sense.
But as the environment grows, the console starts telling a different story.
More VPCs. More subnets. More availability zones. More scrolling.
You stop trusting your eyes. You double check values. You still miss things.
That is when a simple task turns into fragile work.
At this point, everything still looks manageable.
Repetition Is the Real Problem
The problem was not difficulty. The problem was repetition.
Every report followed the same pattern. Open console. Navigate. Filter. Copy. Paste. Format.
The output depended entirely on how careful someone was that day.
That is not a system. That is luck.
And engineers do not like relying on luck.
Choosing a Small Entry Point
I did not try to automate everything.
No EC2. No IAM. No billing.
Just VPCs and Subnets.
They are foundational. If networking data is wrong, everything above it becomes questionable. It was the safest place to start.
The goal became clear.
Stop asking humans. Ask AWS directly.
Before Code, the Flow Was Already Clear
Even before writing the function, the flow made sense in my head.
Lambda would run the logic. Python would talk to AWS. The data would be structured properly. Excel would be the output because everyone understands Excel. S3 would store the result so it is always available.
Simple. Linear. Predictable.
Permissions Decide Everything
Before the function could do anything useful, IAM had to be right.
Lambda needed to describe VPCs. It needed to describe subnets. It needed to upload a file to S3.
Nothing more.
Keeping permissions minimal saved time later and avoided unnecessary debugging.
The exact permissions that make the automation work fit in a few policy statements.
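A minimal sketch of such a policy is below. The bucket name is a placeholder to replace with your own, and you would add the usual CloudWatch Logs permissions if you want the function's print statements to land anywhere:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ec2:DescribeVpcs", "ec2:DescribeSubnets"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::your-report-bucket/*"
    }
  ]
}
```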
The First Real Friction Point
Then came Pandas.
Lambda does not ship with Pandas or OpenPyXL. That meant building a custom Lambda Layer.
I built it locally with only what I needed. No extra libraries. No bloated packages. Just Pandas and OpenPyXL, aligned with Python 3.11.
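Building it locally can be as simple as the commands below. This is a sketch assuming Python 3.11 and the `python/` directory layout Lambda layers expect; the zip file name is just an example:

```
mkdir -p layer/python

# Install only what the function needs, targeting the Lambda runtime platform.
pip install pandas openpyxl \
    --target layer/python \
    --platform manylinux2014_x86_64 \
    --only-binary=:all: \
    --python-version 3.11

# Zip so that "python/" sits at the root of the archive.
cd layer && zip -r ../pandas-openpyxl-layer.zip python
```

Upload the resulting zip as a layer and attach it to the function.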
This step mattered more than expected.
A clean layer meant fewer surprises later.
When Defaults Quietly Fail You
The first execution failed.
Not because of bad logic. Because of default limits. Pandas needs memory.
Excel generation needs time.
Once memory was increased to 512 MB and timeout was adjusted, the function behaved perfectly.
Serverless does not mean resource-less.
Lambda Function Setup
- Runtime: Python 3.11
- Memory: 512 MB or higher
- Timeout: 15 mins 0 seconds
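If you prefer the CLI to the console, the same settings can be applied in one command. The function name here is a placeholder:

```
aws lambda update-function-configuration \
    --function-name vpc-subnet-report \
    --memory-size 512 \
    --timeout 900
```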
Boto3 fetched VPCs. Boto3 fetched subnets. Pandas shaped the data into rows and columns. OpenPyXL created two sheets in a single Excel file.
Everything stayed in memory. No temp files. No disk usage.
The function did one thing and did it cleanly.
```python
import boto3
import pandas as pd
from io import BytesIO
import time
```
- boto3 connects to AWS APIs.
- pandas structures tabular data.
- BytesIO handles Excel creation in-memory.
- time is used for tracking execution duration.
```python
def lambda_handler(event, context):
    start = time.time()
    print("Lambda started")
```
Logs the start of execution and records time.
```python
    ec2 = boto3.client('ec2')
```
Establishes a client to interact with EC2 services.
```python
    vpcs = ec2.describe_vpcs()['Vpcs']
    print(f"Fetched {len(vpcs)} VPCs")
```
Retrieves all VPC metadata from the AWS account.
```python
    vpc_data = [{
        'VPC ID': vpc['VpcId'],
        'CIDR Block': vpc['CidrBlock'],
        'State': vpc['State'],
        'IsDefault': vpc.get('IsDefault', False)
    } for vpc in vpcs]
```
Extracts required information from the VPCs into a list of dictionaries.
```python
    subnets = ec2.describe_subnets()['Subnets']
    print(f"Fetched {len(subnets)} Subnets")
```
Retrieves all subnet data.
```python
    subnet_data = [{
        'Subnet ID': subnet['SubnetId'],
        'VPC ID': subnet['VpcId'],
        'CIDR Block': subnet['CidrBlock'],
        'Availability Zone': subnet['AvailabilityZone'],
        'State': subnet['State']
    } for subnet in subnets]
```
Gathers details like subnet ID, VPC association, AZ, etc.
Creates Excel sheets.
Finalizes and prepares Excel file for upload.
Uploads the Excel to S3.
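The three steps above might look like the sketch below. It is not the exact original code; `bucket_name` and `file_name` are placeholders you would set yourself, and the helpers are split out so the in-memory flow is easy to see:

```python
import pandas as pd
from io import BytesIO

def build_workbook(vpc_data, subnet_data):
    """Shape both datasets into a two-sheet Excel file, entirely in memory."""
    buffer = BytesIO()
    with pd.ExcelWriter(buffer, engine='openpyxl') as writer:
        pd.DataFrame(vpc_data).to_excel(writer, sheet_name='VPCs', index=False)
        pd.DataFrame(subnet_data).to_excel(writer, sheet_name='Subnets', index=False)
    buffer.seek(0)
    return buffer

def upload_report(buffer, bucket_name, file_name):
    """Push the in-memory workbook straight to S3. No temp files, no disk."""
    import boto3  # imported here so the sketch runs even without boto3 installed
    s3 = boto3.client('s3')
    s3.put_object(Bucket=bucket_name, Key=file_name, Body=buffer.getvalue())

# Demonstration with placeholder rows instead of live AWS data:
workbook = build_workbook(
    [{'VPC ID': 'vpc-123', 'CIDR Block': '10.0.0.0/16',
      'State': 'available', 'IsDefault': False}],
    [{'Subnet ID': 'subnet-456', 'VPC ID': 'vpc-123', 'CIDR Block': '10.0.1.0/24',
      'Availability Zone': 'us-east-1a', 'State': 'available'}],
)
print(workbook.getvalue()[:2])  # → b'PK' (an .xlsx file is a zip archive)
```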
```python
    duration = round(time.time() - start, 2)
    print(f"Total Execution Time: {duration} seconds")
    return {
        'statusCode': 200,
        'body': f'Report uploaded to s3://{bucket_name}/{file_name} in {duration} seconds.'
    }
```
The Classic AWS Reminder
The next failure was quick.
Access denied.
The IAM role was missing permission to upload objects to S3. One small update fixed it immediately.
AWS errors are rarely mysterious. They are usually precise.
The First Successful Run
The execution finally completed.
Status showed “Succeeded”. Logs confirmed the runtime. S3 showed a new Excel file.
No console clicking. No copy paste. No manual formatting.
The system worked.
Opening the Report Felt Different
The Excel file had a separate sheet for each service: VPCs, Subnets, EC2 servers, RDS, volumes, whichever resources the Lambda function was told to fetch.
Clean columns. Accurate data. Live infrastructure information.
This was no longer a document created by a human. It was a report generated by the system itself.
That is the real value of the automation.
What This Quietly Became
This started as a small fix for a repeating task.
It turned into a reusable reporting foundation.
Today, the same pattern can be extended easily. EC2 inventories. IAM users. Security groups. Cost summaries.
Once the pipeline exists, expansion feels natural.
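As one example of that extension, an EC2 inventory only needs one more describe call and one more shaping step. A hypothetical helper, mirroring the VPC and subnet pattern already shown:

```python
def instances_to_rows(response):
    """Flatten a describe_instances response into spreadsheet-ready rows."""
    rows = []
    for reservation in response['Reservations']:
        for instance in reservation['Instances']:
            rows.append({
                'Instance ID': instance['InstanceId'],
                'Type': instance['InstanceType'],
                'State': instance['State']['Name'],
                'VPC ID': instance.get('VpcId', ''),
                'Subnet ID': instance.get('SubnetId', ''),
            })
    return rows

# Inside the handler this would be fed by
#   rows = instances_to_rows(ec2.describe_instances())
# and written to a third sheet exactly like the others.
sample = {'Reservations': [{'Instances': [{
    'InstanceId': 'i-0abc', 'InstanceType': 't3.micro',
    'State': {'Name': 'running'}, 'VpcId': 'vpc-123', 'SubnetId': 'subnet-456',
}]}]}
print(instances_to_rows(sample)[0]['Instance ID'])  # → i-0abc
```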
Final Thoughts
AWS already knows everything about your infrastructure.
The real work is asking the right questions and shaping the answers into something humans can trust.
This automation does exactly that.
Quietly. Reliably. Every time it runs.
If you need the full Lambda code, just drop a comment and I will share the IAM role (in JSON format) and the GitHub link to the automation code.