BirdyIO: bird nesting IoT

In 2020, while watching small birds nesting in my garden, I decided that I wanted to learn more – from a time series data mining point of view – about their apparently restless activity. After some initial brainstorming I realized I had gotten into something really cool, touching on digital electronics, sensors, and signal processing & storage. Here are some implementation details:

+ Birdhouse fitted with dual-channel, pulsed-IR (38 kHz) barriers
+ ATmega328 µ-controller acting as pulse source for the IR LEDs
+ Postgres database for long-term event storage
+ NodeMCU ESP8266-12E µ-controller as master:
  • FIFO-type binary event buffer for the two event channels (IR light barriers)
  • regular-expression-style evaluation of event patterns and pattern durations (see table below)
  • detection of direction – (out > in) vs. (in > out) – and depth of the associated action (a classification sketch follows this list):
    • show: penetrate and retract from a single barrier
    • peek: penetrate and retract through both barriers
    • look: penetrate through both barriers, freeing the initial one, then retract
    • walk: penetrate through both barriers, passing both of them
  • anti-flicker filtering, suppressing repeated state alternations shorter than 10 ms
  • detection loop frequency achieved: ~800 Hz
  • NTP time sync
  • periodic sensor self-checks on IR barrier function
  • birdhouse connected to the home WiFi network
  • local buffering of up to 200 qualified events in a transactional log
  • birdhouse webservice (JSON) endpoint to deliver event logs to a backend
+ Server backend (Python) polling the BirdyIO endpoint for new events
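
The pattern evaluation can be illustrated with a compact state-sequence matcher. The following Arduino-style sketch is not the BirdyIO firmware – pin numbers, the signal polarity and the pattern encoding are all assumptions – but it shows how debounced sampling of the two barriers plus a few checks on the recorded state sequence are enough to recover direction and the show/peek/look/walk classes:

```cpp
// Illustrative sketch only: pins, polarity and the pattern encoding are
// assumptions for demonstration, not the actual BirdyIO firmware.
#include <Arduino.h>
#include <string.h>

const uint8_t  PIN_OUTER   = 4;    // hypothetical GPIO of the outer barrier receiver
const uint8_t  PIN_INNER   = 5;    // hypothetical GPIO of the inner barrier receiver
const uint32_t DEBOUNCE_MS = 10;   // suppress state alternations shorter than 10 ms

// joint state of both barriers, one char per state: '0'..'3' = (outer<<1)|inner
char     seq[32];                  // state sequence of the event in progress
uint8_t  seqLen     = 0;
uint8_t  lastState  = 0;
uint32_t lastChange = 0;

uint8_t readBarriers() {
  // assumption: a TSOP-style receiver idles LOW while it sees the 38 kHz beam,
  // so a HIGH reading means the barrier is blocked
  uint8_t outer = (digitalRead(PIN_OUTER) == HIGH);
  uint8_t inner = (digitalRead(PIN_INNER) == HIGH);
  return (uint8_t)((outer << 1) | inner);
}

void classifyAndReport() {
  if (seqLen == 0) return;
  seq[seqLen] = '\0';
  char far = (seq[0] == '2') ? '1' : '2';          // single-barrier state on the far side
  const char *dir = (seq[0] == '2') ? "out>in" : "in>out";
  const char *action;
  if (!strchr(seq, '3'))           action = "show"; // only one barrier ever touched
  else if (!strchr(seq, far))      action = "peek"; // both blocked, first never freed
  else if (seq[seqLen - 1] == far) action = "walk"; // exited on the far side
  else                             action = "look"; // freed the first one, then returned
  Serial.printf("%s %s pattern=%s\n", dir, action, seq); // printf: ESP8266 core
  seqLen = 0;
}

void setup() {
  Serial.begin(115200);
  pinMode(PIN_OUTER, INPUT);
  pinMode(PIN_INNER, INPUT);
}

void loop() {                        // polls at several hundred Hz on an ESP8266
  uint8_t  s   = readBarriers();
  uint32_t now = millis();
  if (s != lastState && (now - lastChange) >= DEBOUNCE_MS) {
    lastChange = now;
    lastState  = s;
    if (s == 0) classifyAndReport(); // both barriers clear again: event complete
    else if (seqLen < sizeof(seq) - 1) seq[seqLen++] = (char)('0' + s);
  }
}
```

With this encoding, a full passage from the outside records the sequence 2 3 1 and classifies as an out>in walk.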

Event table

What this looks like in practice

full in/out transitions and other events in the 2020 nesting season

sample JSON message delivered by BirdyIO endpoint
directed in>out / out>in transitions (blue) and other activities (red) in the 2020 season

AirStation: environmental monitoring

WiFi-enabled weather stations are available in great variety, usually covering temperature, atmospheric pressure and relative humidity. Driven by my interest in environmental factors related to COVID-19 and the possible impact of the 2020 lockdown on air quality, I decided to use my existing knowledge in IoT/digital electronics to build a comprehensive, multi-sensor data source for long-term monitoring and time series analysis.

current project state:

After a few months of test operation in late 2021, I experienced frequent dropouts and system reboots, specifically in early-morning low-temperature, high-moisture conditions. My conclusion: the sensors themselves appear tolerant to moisture and temperature variations, but the NodeMCU needs efficient sealing to withstand adverse weather conditions. Moreover, O3 sensor calibration requires temperature and rH stability over several days of sampling time; it needs to be repeated in a controlled environment.

implementation details:

  • NodeMCU ESP8266-12E µ-controller with sensors:
    • BME280, digital: temperature, pressure and relative humidity
    • SDS011, digital, laser-based: PM2.5/PM10 particulate matter
    • Winsen MQ131, analog: low-concentration O3 sensor
  • Custom library myTaskScheduler for asynchronous scheduling of data sampling (this library is also central to the StratoExplorer project).
  • µC acting as WiFi client in local network
  • OTA (over the air) updating of NodeMCU code
  • JSON endpoint to deliver sensor data:
    • average + standard deviation per sample interval (see the statistics sketch below)
    • min/max values per sample interval
    • NTP-based time information
    • sensor health status
  • atmospheric pressure reduced to sea level (QNH, hPa), corrected for temperature and humidity according to DWD standards (see the reduction sketch after this list)
  • O3 sensor corrected for temperature and humidity
  • O3 sensor (long term) calibration mode
  • Backend process (Python, cron job) polling the AirStation endpoint for data
  • Postgres database for long term storage and data analysis
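
For the pressure reduction, here is a minimal sketch of the computation, following the DWD reduction formula with the Magnus equation supplying the vapor pressure term. The constants are the published DWD/Magnus values; function and variable names are mine, not AirStation code.

```cpp
#include <cmath>
#include <cstdio>

// Reduce station pressure to sea level (QNH, hPa) following the DWD
// reduction formula; constants per DWD/Magnus, names are illustrative.
double reduceToSeaLevel(double p_hPa,   // station pressure [hPa]
                        double t_C,     // station temperature [degC]
                        double rh_pct,  // relative humidity [%]
                        double h_m)     // station altitude [m]
{
  const double g0 = 9.80665;   // standard gravity [m/s^2]
  const double Rd = 287.05;    // gas constant of dry air [J/(kg*K)]
  const double a  = 0.0065;    // standard lapse rate [K/m]
  const double Ch = 0.12;      // vapor pressure coefficient [K/hPa]

  // Magnus formula: saturation vapor pressure over water [hPa]
  double Es = 6.112 * std::exp(17.62 * t_C / (243.12 + t_C));
  double E  = rh_pct / 100.0 * Es;      // actual vapor pressure [hPa]

  double Tk = t_C + 273.15;             // station temperature [K]
  return p_hPa * std::exp(g0 * h_m / (Rd * (Tk + Ch * E + a * h_m / 2.0)));
}

int main() {
  // example: 965 hPa at 400 m altitude, 10 degC, 80 % rH -> roughly 1012 hPa
  std::printf("QNH ~ %.1f hPa\n", reduceToSeaLevel(965.0, 10.0, 80.0, 400.0));
  return 0;
}
```
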
prototype assembly (from left to right) of O3 / laser-PM / T-P-rH sensors and NodeMCU ESP8266-12E
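
The per-interval statistics delivered by the JSON endpoint (average, standard deviation, min/max) can be maintained with a small accumulator. This is a generic sketch using Welford's algorithm for a numerically stable running variance, not the actual AirStation implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Running statistics over one sample interval (Welford's algorithm).
// Generic sketch, not AirStation code.
struct IntervalStats {
  long   n = 0;
  double mean = 0.0, m2 = 0.0;
  double minV = 1e308, maxV = -1e308;

  void add(double x) {
    ++n;
    double d = x - mean;
    mean += d / n;                 // running mean
    m2   += d * (x - mean);        // running sum of squared deviations
    minV = std::min(minV, x);
    maxV = std::max(maxV, x);
  }
  double sd() const { return n > 1 ? std::sqrt(m2 / (n - 1)) : 0.0; }
  void reset() { *this = IntervalStats{}; }
};

int main() {
  IntervalStats t;                 // e.g. temperature over one sample interval
  for (double x : {20.1, 20.4, 19.9, 20.2}) t.add(x);
  std::printf("avg=%.2f sd=%.2f min=%.1f max=%.1f n=%ld\n",
              t.mean, t.sd(), t.minV, t.maxV, t.n);
  t.reset();                       // start the next sample interval
  return 0;
}
```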

StratoExplorer: payload for stratospheric balloon

As an (aero)space nerd who has experienced weightlessness on board a parabolic flight, I am keeping one big dream for the future:

I would like to witness, with my own eyes, the curvature of the Earth as seen from space

Sadly, I am conscious of the fact that this goal cannot be reached without massive funding or the support of outstanding individuals, such as Richard Branson or Elon Musk, who would offer me a ride.

Therefore, I am left with a poor man’s option – nonetheless challenging in nature – to launch a stratospheric balloon to the upper stratosphere at approximately 35 km altitude and have my own eyes replaced by panoramic digital imagery. This goal can be reached with creative mechanical and digital engineering, some limited paperwork for the air traffic control authorities (I am a pilot already – even excessive paperwork cannot scare me) and a budget roughly 2000 times lower than a ticket on Sir Richard’s Virgin Galactic.

For sure, I’ll get less bang for the buck, but I reckon it will be worth the effort!

This has actually been my entry project into IoT/µC-based digital electronics, and of course I discovered that, in order to do this cleanly, I needed to research and overcome some fundamental hurdles – which initially led me to further the other projects described above.

Here are the main requirements and challenges met:

  1. reliable, low-power flight control unit
  2. continuous reception of 3D GPS position data and derivatives
  3. monitoring of inside/outside air temperature over a [-60 °C, +30 °C] range
  4. long-range (up to 40 km) telemetry at reasonable data rates to transmit position and status
  5. panoramic color imagery with onboard storage of image data, possibly even a limited downlink capability
  6. trajectory evaluation for landing site prediction
  7. asynchronous, periodic, fail-tolerant execution of onboard tasks

so far, these have been addressed as follows:

  1. While some people choose a Raspberry Pi as an easily maintainable Unix platform for flight data management, the Pi’s power consumption and its system overhead regarding interrupt handling and bare-bones use of digital interfaces led me to choose the Arduino-compatible Teensy 3.6 (32-bit ARM Cortex-M4, 180 MHz) as the core of the flight hardware.
  2. There is a range of lightweight GPS devices available, and I chose the u-blox NEO-6M GPS, connected to the Teensy via a serial interface. It is possible to adjust the GPS mode during operations, switching from ground-based position and velocity ranges to “space”-mode operation.
  3. Two Dallas DS18B20 digital temperature sensors fit my range requirements and are attached to the Teensy via the 1-Wire interface using parasitic (phantom) power.
  4. Many people used to operate long-range telemetry with 434 MHz frequency-shift-keyed (FSK) RTTY at rates of 50-300 baud and 10 mW output. Apart from the transmitting unit being sensitive to temperature-induced shifts of the carrier frequency, the receiving part is an adventurous undertaking, using SDR (software-defined radio) and TTY decoding software – pretty much like radio amateurs decoding messages from the first generation of satellites. I got it working, but it left a fragile impression on me, which led me to switch to LoRa digital radio modules. Those are highly reliable, inexpensive, lightweight transceiver units that operate with spread-spectrum modulation, sophisticated error correction and multiple selectable bandwidths and packet lengths. Eventually I chose HopeRF 868 MHz SX1276 modules on both ends, an airborne quarter-wave ground-plane antenna and a custom-built (many thanks to Oleksandr from Ukraine!) Yagi antenna connected to the ground receiver. This is hassle-free and solid hardware, and initial tests inside a building and over a few hundred meters (with obstacles) were successful. A long-range line-of-sight test yet needs to be performed (most likely: Kalmit 49.321N 8.083E – Oberflockenbach/Weinheim 49.502N, 8.723E). A minimal transmit sketch follows this list.
  5. Imagery will use two back-to-back-mounted 2 MP ArduCam OV2640 cameras, interfacing with the Teensy via both serial and SPI. I plan to rotate the camera assembly using a servo, rapidly stepping through 3-5 positions within the 180° hemisphere. Image acquisition requires approx. 80-100 ms per position, with memory-buffered storage to the Teensy’s SD card (8 GB). Unlike other people, I am reluctant to launch expensive and heavy GoPro hardware, and I renounce video recording. Moreover, on an amateur budget, long-range image or video transmission at acceptable bandwidth is unfortunately not feasible.
  6. GPS data is fed into a 120 s buffer at 2 s intervals, and a 3D linear regression solution is computed using an eigenvector/matrix-decomposition library that runs flawlessly on the Teensy. With the trajectory vector calculated, the landing site is computed from its intersection with the terrestrial plane, and the data is included in the telemetry during payload descent (after helium balloon burst). A sketch of the fit and intersection follows below.
  7. One of the existing elements of my programming activity is a custom library called myTaskScheduler. It allows registering independent services to be triggered at repetitive absolute (NTP- or GPS-time-based) and relative time intervals. Support is provided to automatically process (average, sd, min, max) time series data collected by the registered services. Moreover, myTaskScheduler is able to perform health checks on services and, if necessary, respond to failures in a defined and reliable manner – without sending the global system into deadlock.
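
Regarding item 4: one common way to drive the SX1276 from Arduino-class hardware is the RadioHead RH_RF95 driver. Whether StratoExplorer uses RadioHead is my assumption, and the pin assignments and message format below are hypothetical; the sketch is only meant to show how compact such a telemetry transmitter can be:

```cpp
// Minimal LoRa telemetry transmitter using the RadioHead RH_RF95 driver.
// That StratoExplorer uses RadioHead is my assumption; pins are hypothetical.
#include <SPI.h>
#include <RH_RF95.h>

RH_RF95 rf95(10, 2);                 // hypothetical chip-select and IRQ pins

void setup() {
  Serial.begin(115200);
  if (!rf95.init()) { Serial.println("SX1276 init failed"); while (true) {} }
  rf95.setFrequency(868.0);          // 868 MHz band, as used on both ends
  rf95.setTxPower(13, false);        // 13 dBm via PA_BOOST
  // robust long-range setting: 125 kHz bandwidth, coding rate 4/8, SF 4096
  rf95.setModemConfig(RH_RF95::Bw125Cr48Sf4096);
}

void loop() {
  // hypothetical telemetry line: time, lat, lon, altitude, temperatures
  char msg[64];
  snprintf(msg, sizeof(msg), "T+%lus;49.3210;8.0830;351;21.5;-3.2",
           millis() / 1000UL);
  rf95.send((uint8_t *)msg, strlen(msg));
  rf95.waitPacketSent();
  delay(5000);                       // one packet every 5 s
}
```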

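Regarding item 6: the landing-site prediction boils down to fitting a line through the buffered 3D positions and intersecting it with the ground plane. The flight code uses an eigen-decomposition library on the Teensy; the self-contained sketch below obtains the dominant eigenvector with a small power iteration instead, and all names and the local metric coordinate frame are illustrative assumptions:

```cpp
#include <cmath>
#include <cstdio>

// Landing-site estimate from a buffer of recent 3D positions: fit a line
// through the points (principal axis of their covariance), then intersect
// it with the ground plane z = zGround. Coordinates are assumed to be
// local, in meters (e.g. ENU), and the trajectory descending (v_z != 0).
struct Vec3 { double x, y, z; };

Vec3 landingSite(const Vec3 *p, int n, double zGround) {
  // centroid of the buffered positions
  Vec3 c{0, 0, 0};
  for (int i = 0; i < n; ++i) { c.x += p[i].x; c.y += p[i].y; c.z += p[i].z; }
  c.x /= n; c.y /= n; c.z /= n;

  // 3x3 covariance matrix of the centered points
  double C[3][3] = {{0}};
  for (int i = 0; i < n; ++i) {
    double d[3] = {p[i].x - c.x, p[i].y - c.y, p[i].z - c.z};
    for (int r = 0; r < 3; ++r)
      for (int s = 0; s < 3; ++s) C[r][s] += d[r] * d[s];
  }

  // power iteration: converges to the dominant eigenvector = flight direction
  double v[3] = {0.0, 0.0, 1.0};
  for (int it = 0; it < 50; ++it) {
    double w[3];
    for (int r = 0; r < 3; ++r)
      w[r] = C[r][0] * v[0] + C[r][1] * v[1] + C[r][2] * v[2];
    double norm = std::sqrt(w[0] * w[0] + w[1] * w[1] + w[2] * w[2]);
    if (norm < 1e-12) break;       // degenerate buffer: keep previous estimate
    for (int r = 0; r < 3; ++r) v[r] = w[r] / norm;
  }
  if (v[2] > 0) { v[0] = -v[0]; v[1] = -v[1]; v[2] = -v[2]; } // point downward

  // intersect the line c + t*v with the plane z = zGround
  double t = (zGround - c.z) / v[2];
  return Vec3{c.x + t * v[0], c.y + t * v[1], zGround};
}

int main() {
  // toy descent: drifting east/north while falling from 1000 m altitude
  Vec3 buf[7];
  for (int i = 0; i < 7; ++i)
    buf[i] = Vec3{10.0 * i, 5.0 * i, 1000.0 - 20.0 * i};
  Vec3 s = landingSite(buf, 7, 0.0);
  std::printf("predicted landing at x=%.0f m, y=%.0f m\n", s.x, s.y);
  return 0;
}
```
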
next steps (a long way still to go …)

  • integrate LoRa and imagery control software with myTaskScheduler
  • wrap myTaskScheduler into a conditional “flight plan” logic to determine activities on launch, ascent, descent and landing
  • develop software for the hand-held receiver (recovery) unit (LoRa + ESP8266), telemetry aggregation and data display on a dashboard powered by Grafana
  • consolidate the flight hardware on a custom-made PCB (the current wired setup is not reliable enough for field testing)