*TL;DR: I built a battery-powered WiFi temperature sensor using ESP32-C3 that samples every 10 seconds, batches 30 readings, and transmits via UDP every 5 minutes. Aggressive power optimization (deep sleep, light sleep during sensor conversion, static IP caching) achieves ~139 mAh/month consumption. A 2500mAh 18650 battery should last well over a year. Code at github.com/mlhpdx/xiao-esp32c3-wifi-temp-sensor.*
I’ve been working on a side project that involves deploying temperature sensors at multiple sites where running power isn’t practical and where the only reliable connectivity is WiFi (no LoRa gateways, no Bluetooth hubs, just WiFi networks). Since it’s an unpaid project, the constraints are tight: keep the hardware cheap and minimize maintenance (particularly onsite visits).
I’m not a professional when it comes to embedded systems work, so I chose components with good documentation and a proven track record. The XIAO ESP32-C3 is power-efficient and WiFi-capable, and the DS18B20 temperature sensor is ubiquitous and well-understood. The XIAO supports LiPo charging, so an 18650 lithium battery made sense. Not the cheapest combination, but one I felt confident I could make work.
The Problem with WiFi
WiFi is notoriously power-hungry. An ESP32-C3 draws around 180mA when actively transmitting versus 43µA in deep sleep. That’s roughly 4,000 times more current. Waking up, connecting to WiFi, authenticating, establishing a TCP connection, sending data, waiting for an acknowledgment, and disconnecting every few minutes means a battery won’t last long.
The solution is theoretically obvious: minimize radio-on time as aggressively as possible. The implementation is all about the details.
Instead of sending each temperature reading individually, I batch 30 samples (taken every 10 seconds over roughly 5 minutes) and send them as a single UDP packet. UDP was the right choice here since it allows a true “blind send” with no TCP handshake and no waiting for an acknowledgment. With UDP the device just transmits a single packet and immediately disconnects.
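A minimal sketch of that blind send using the Arduino WiFiUDP API (the server address, port, and buffer names here are mine, not the repo’s):

#include <WiFi.h>
#include <WiFiUdp.h>

WiFiUDP udp;
udp.beginPacket(SERVER_IP, SERVER_PORT);       // no handshake, no session setup
udp.write(packetBuffer, sizeof(packetBuffer)); // the whole 30-sample batch
udp.endPacket();                               // the datagram is on the air
WiFi.disconnect(true);                         // nothing to wait for - radio off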
Another significant source of air time is DHCP. On the first send, the device uses DHCP and saves its IP configuration to RTC memory. On subsequent sends, it uses that cached configuration and connects in about 63ms instead of the 1–2 seconds DHCP requires. Combined with UDP’s fire-and-forget approach, the radio is only active for about 100ms every 5 minutes. This caching alone reduced WiFi power consumption by ~5x.
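Here’s a sketch of the caching idea, assuming the configuration is stored as raw 32-bit values in RTC memory (names are illustrative; the repo has the real layout):

RTC_DATA_ATTR uint32_t cachedIp, cachedGateway, cachedSubnet, cachedDns;

if (cachedIp != 0) {
  // Apply the saved lease and skip DHCP entirely
  WiFi.config(IPAddress(cachedIp), IPAddress(cachedGateway),
              IPAddress(cachedSubnet), IPAddress(cachedDns));
}
WiFi.begin(WIFI_SSID, WIFI_PASSWORD);

// After the first successful DHCP connection, cache the lease for next time
cachedIp      = (uint32_t)WiFi.localIP();
cachedGateway = (uint32_t)WiFi.gatewayIP();
cachedSubnet  = (uint32_t)WiFi.subnetMask();
cachedDns     = (uint32_t)WiFi.dnsIP();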
Learning Sleep Modes
Getting the sleep modes right reminded me of working with transactional memory in C++ two decades ago at Autodesk. The code needs to be carefully constructed to align with the rules about what state persists, what gets reset, and how to structure logic so the system can wake up and know where it left off (or reconstruct it).
The ESP32-C3 supports several sleep modes, but I use two: deep sleep and light sleep. Deep sleep consumes only 43µA but resets the chip completely on wake, including RAM. So all persistent state variables go into RTC memory, tagged with RTC_DATA_ATTR. I store the boot counter, network configuration, sensor address, and a buffer of 30 temperature samples there so they survive a deep sleep cycle.
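A sketch of that RTC-resident state (names and types are illustrative; the sample buffer needs one more qualifier, which I’ll get to in the Gotchas section):

struct TemperatureSample { float external; float internal; };

// Plus the cached network configuration shown earlier
RTC_DATA_ATTR int bootCount = 0;                        // which of the 30 wake cycles we're on
RTC_DATA_ATTR uint8_t sensorAddress[8];                 // DS18B20 ROM ID, discovered once
RTC_DATA_ATTR TemperatureSample temperatureSamples[30]; // survives deep sleep resets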
Light sleep was a discovery that emerged from optimizing the sensor read. The DS18B20 takes about 650ms to perform a temperature conversion with a blocking call. Then I found the sensor supports an asynchronous mode where the conversion is started with a non-blocking call and the value can be read later. So during that 650ms wait I put the ESP32 into light sleep at 800µA instead of keeping it fully active. That’s a 60x power reduction during the longest single operation in the wake cycle.
// Start async temperature conversion
sensor.setWaitForConversion(false);
sensor.requestTemperatures();

// Light sleep during DS18B20's 650ms conversion
LIGHT_SLEEP(650 * 1000);

// Conversion complete, read the result
float temp = sensor.getTempCByIndex(0);
But light sleep introduces its own complexity: GPIO states don’t persist by default. The DS18B20 needs continuous power during conversion, so I had to use GPIO hold (gpio_hold_en()) to lock the power pins in their states before sleeping. Without this, the sensor would lose power mid-conversion and return the error value of 85°C. This was one of those gotchas that only became clear through testing.
// Power on sensor
digitalWrite(SENSOR_POWER_PIN, HIGH);
digitalWrite(SENSOR_GND_PIN, LOW);

// Hold GPIO states during light sleep
gpio_hold_en((gpio_num_t)SENSOR_GND_PIN);
gpio_hold_en((gpio_num_t)SENSOR_POWER_PIN);

// Now safe to light sleep - sensor stays powered
LIGHT_SLEEP(650 * 1000);

// Release holds after conversion completes
gpio_hold_dis((gpio_num_t)SENSOR_GND_PIN);
gpio_hold_dis((gpio_num_t)SENSOR_POWER_PIN);
Power Budgeting
The power optimization plan evolved as I understood the system better, but eventually I needed to know whether it would actually work, so I did the math. A regular sampling cycle (wake, sample, sleep) consumes about 0.5 µAh. Every 30th cycle also includes a WiFi transmission, which brings that cycle to about 2.5 µAh total. Over a 30-sample batch (~5.3 minutes), that’s:
- 29 regular cycles: 14.6 µAh
- 1 WiFi cycle: 2.5 µAh
- Total: 17.1 µAh per batch
Running continuously, that works out to roughly 139 mAh per month. A 3000 mAh 18650 battery should last about 21 months — well over the one-year target.
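Spelling that estimate out (assuming a 30-day month):

43,200 minutes/month ÷ 5.3 minutes/batch ≈ 8,151 batches/month
8,151 batches × 17.1 µAh/batch ≈ 139 mAh/month
3,000 mAh ÷ 139 mAh/month ≈ 21.6 months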
These are calculated numbers, though, not measured. The real-world results depend on actual current draws (which vary by chip and conditions), self-discharge rates, and temperature effects on battery capacity. But the math gives me confidence that the design is in the right ballpark.
Despite spending 93.6% of its time in deep sleep, the device consumes 43.7% of its power during brief active CPU periods. This is why minimizing wake and transmit time matters so much. Every millisecond the CPU is active costs ~1,000 times more than deep sleep.
UDP is Efficient
The packet structure is straightforward: 8 bytes for the DS18B20’s unique ID (so the backend can identify which sensor sent the data), 4 bytes for the sample count, and then 240 bytes of temperature data. The temperature data includes 30 pairs of floats, alternating between external (DS18B20) and internal (ESP32-C3) temperatures.
Total packet size: 252 bytes.
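In C terms, the layout looks something like the struct below (field names are mine, not necessarily the repo’s; the static_assert confirms the 252-byte total):

#pragma pack(push, 1)
struct TemperaturePacket {
  uint8_t  sensorId[8];    // DS18B20 unique ROM ID
  uint32_t sampleCount;    // number of valid sample pairs
  float    samples[30][2]; // [i][0] = external (DS18B20), [i][1] = internal (ESP32-C3)
};
#pragma pack(pop)
static_assert(sizeof(TemperaturePacket) == 252, "wire format is 252 bytes");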
Binary encoding saves space and CPU time compared to JSON, and the backend can easily parse the fixed structure. I accept that some packets will be lost; trading occasional packet loss for massive power savings is absolutely worth it.
Gotchas
The implementation has a few quirks worth noting. Since deep sleep resets the chip, there’s no traditional loop(). Everything happens in setup(), which reads temperatures, updates the sample buffer, increments the boot counter, and either goes back to sleep (boots 1-29) or transmits data (boot 30).
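Condensed, the control flow looks something like this (helper names are illustrative, not the repo’s):

void setup() {
  bootCount++;
  recordSample();                    // async DS18B20 read with light sleep, as above
  if (bootCount >= SAMPLE_COUNT) {   // boot 30: time to transmit
    transmitBatch();                 // cached-IP connect + one UDP packet
    bootCount = 0;
  }
  DEEP_SLEEP(10ULL * 1000 * 1000);   // 10 seconds, in microseconds - never returns
}

void loop() {}                       // never runs: deep sleep resets the chip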
For debugging, I needed a way to observe the wake-sleep cycle without actually sleeping and losing the serial connection. That can be done with a UART adapter, but I decided to get clever instead. My approach is a DEBUG_MODE flag that conditionally compiles different versions of the sleep macros. In debug mode, DEEP_SLEEP() becomes delay() followed by goto RESTART, jumping back to the top of setup(). It’s a bit of a hack, to be sure, but it lets me see the complete cycle in the serial monitor without the hardware actually sleeping.
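The macro swap looks roughly like this (a sketch of the idea, not the repo’s exact code; setup() begins with a RESTART: label in debug builds):

#include "esp_sleep.h"

#if DEBUG_MODE
// Fake the sleep: wait, then jump back to the top of setup() so the
// serial connection survives and the whole cycle stays visible
#define DEEP_SLEEP(us) do { delay((us) / 1000); goto RESTART; } while (0)
#else
#define DEEP_SLEEP(us) do { esp_sleep_enable_timer_wakeup(us); esp_deep_sleep_start(); } while (0)
#endif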
I was surprised to find that arrays in RTC memory need the volatile keyword. Without it, compiler optimizations can corrupt array access across sleep cycles, so reads and writes don’t happen in the expected order. This caused data corruption until I tracked it down. Declaring the temperature sample buffer as RTC_DATA_ATTR volatile fixed it.
// Wrong - compiler may optimize away array accesses
RTC_DATA_ATTR TemperatureSample temperatureSamples[SAMPLE_COUNT];

// Correct - volatile prevents optimization corruption
RTC_DATA_ATTR volatile TemperatureSample temperatureSamples[SAMPLE_COUNT];
Finally, to prevent current leakage when the sensor is off, all three GPIO pins (power, ground, data) are set to INPUT mode (high-impedance) after each reading. Leaving them in OUTPUT or INPUT_PULLUP modes created a path for current to leak through the powered-down sensor.
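The teardown after each reading is just three pinMode() calls (SENSOR_DATA_PIN is my name for the third pin; the power and ground pins appeared in the snippets above):

// Park all three pins high-impedance so no path remains for current
// to leak through the powered-down sensor
pinMode(SENSOR_POWER_PIN, INPUT);
pinMode(SENSOR_GND_PIN, INPUT);
pinMode(SENSOR_DATA_PIN, INPUT);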
Will It Get There?
Some of these sensors are deployed and operational. Early monitoring suggests the power budget is realistic, but only time will tell whether they actually last 3 months, 6 months, or the full year-plus the calculations predict. Battery self-discharge, temperature effects, network conditions, and a dozen other real-world factors will influence the outcome.
But the math is solid, the optimizations are proven, and the code is stable. I’m cautiously optimistic.
The complete source code is available on GitHub, including all the details I’ve glossed over here: GPIO hold timing, RTC memory layout, the binary protocol spec, and the debug mode implementation.
If you’re building something similar, I hope this helps you avoid at least some of the gotchas I encountered.