I'm currently doing something
not exactly like that, but it has a similar premise. It involves live monitoring of the power consumption of several Raspberry Pis, with the ability to send commands to one of several Arduinos to power an assigned RPi on or off, along with some other features. All of this is hosted on a central Raspberry Pi.
What I would suggest is setting up a Python-based web server (Flask is good) on the RPi, or inside a Docker container, which will be your orchestrator of commands.
This gives you the ability to execute commands based on plant, time, sensor data, etc. Having the orchestrator in a central location also allows multiple plants at different locations to be watered, each with its own unique requirements.
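To make the orchestrator idea concrete, here's a minimal sketch of what such a Flask endpoint could look like. The route name, topic layout (`garden/<plant>/water`), and `send_command` helper are all illustrative assumptions, not a fixed API; in a real setup `send_command` would publish over MQTT (e.g. with the paho-mqtt library) instead of logging locally.

```python
# Minimal sketch of the Flask "orchestrator" idea.
# Route and topic names are illustrative assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for an MQTT publish; in practice you'd send each command
# to the broker, e.g. paho-mqtt's publish.single(topic, payload, hostname=...).
COMMAND_LOG = []

def send_command(topic, payload):
    COMMAND_LOG.append((topic, payload))

@app.route("/water/<plant_id>/<int:seconds>")
def water(plant_id, seconds):
    # Each plant gets its own topic, so per-plant requirements stay separate
    send_command(f"garden/{plant_id}/water", str(seconds))
    return jsonify(plant=plant_id, watering_for=seconds)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

From here you can layer on schedules or sensor-driven triggers: anything that can hit an HTTP endpoint (a cron job, a dashboard, another script) can issue a watering command.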
Then, to physically water the plants, use an Arduino with a WiFi shield (or any other WiFi-capable controller) and run an MQTT client on the device.
MQTT is a machine-to-machine communication protocol that works on a publish/subscribe basis: a device only receives messages on topics it has subscribed to, and all of the routing is handled by the broker. MQTT clients can also register handler functions, meaning a specific user-defined function is triggered whenever a message arrives. That handler can run whatever logic checks are needed to determine the intended command, and then call another function to carry it out.
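A rough sketch of that subscriber/handler pattern in Python (on a real Arduino you'd write the equivalent in C++ with an MQTT library, but the structure is the same). The topic scheme, broker address, and `parse_command` helper are my own illustrative assumptions:

```python
# Sketch of the MQTT subscriber side on the watering controller.
# Topic names and the broker address are assumptions, not a spec.

def parse_command(topic, payload):
    """Logic checks to determine the intended command.

    Returns ("water", seconds) for a valid watering message, else None.
    """
    parts = topic.split("/")            # e.g. "garden/plant1/water"
    if len(parts) == 3 and parts[2] == "water":
        try:
            return ("water", int(payload))
        except ValueError:
            return None                 # malformed payload, ignore it
    return None

def run():
    # Assumes the paho-mqtt library and a broker on the central RPi
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        # The user-defined handler: only fires for subscribed topics
        cmd = parse_command(msg.topic, msg.payload.decode())
        if cmd is not None:
            action, seconds = cmd
            print(f"{action} for {seconds}s")  # here you'd drive the pump relay

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("broker.local")       # hypothetical address of the central RPi
    client.subscribe("garden/plant1/#")  # only messages on this topic arrive
    client.loop_forever()
```

Keeping the command-parsing logic in its own function (separate from the network wiring) makes it easy to test and to extend later with new command types.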
This is a very rough explanation of a solution to the problem, and please excuse my grammar (I engineer, not spell
), but it is a fairly robust and "future proof" solution. It does involve a lot of programming, but that's also the fun part. What's nice about this approach is that once you get the core mechanisms down, like MQTT and the web server, you can expand on it and add other features, like boiling your kettle at a specific time or turning your geyser on/off, and the cost of each upgrade is just a cheap microcontroller (under R200) plus the electronic components.