Triggers are not cleaned up in modules #281
I have managed to work around this with some ugly code... Every time a script uses a module function that creates a trigger function, that module function also takes a registration parameter, which I pass in as a name from the calling script. Instead of returning the trigger function, the module function registers it internally by calling `register_trigger(reg, func)`.
Then, each script must also call the corresponding cleanup function. This leads me to believe that the core of the issue has something to do with the context the trigger is associated with. I could be way off, though, as I was unable to pinpoint where this happens in the pyscript code. There were some other interesting behaviors as well. For instance, even if I never stored the trigger function in a global variable (in the script or in the module), it would still react to events/state changes. However, without storing it, the only way to remove the trigger function was to reload the module.
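For reference, a minimal sketch of this kind of registration workaround might look like the following (pyscript-style; the file, function, and entity names here are illustrative assumptions, not the author's actual code):

```python
# modules/trigger_registry.py -- hypothetical module implementing the workaround
registered = {}

def register_trigger(reg, func):
    # remember a trigger created on behalf of the script identified by `reg`
    registered.setdefault(reg, []).append(func)

def clean_triggers(reg):
    # drop every trigger previously registered under `reg` so it can be torn down
    registered.pop(reg, None)

def make_on_change(reg, entity):
    # create a state trigger and register it inside the module instead of returning it
    @state_trigger(entity)
    def on_change(**kwargs):
        log.info(f"{entity} changed (owner: {reg})")
    register_trigger(reg, on_change)
```

```python
# scripts/some_script.py -- hypothetical caller
import trigger_registry

trigger_registry.clean_triggers(__name__)   # clear anything left from a previous reload
trigger_registry.make_on_change(__name__, "binary_sensor.front_door")
```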
Is this still an issue? It seems like this may have been caused by #457, so it could be fixed now.
I can confirm that this issue still exists with 1.4.0, so no, it is not fixed. @dlashua, could you elaborate on what you mean by
I have the problem that, while developing my pyscript module, I tend to modify and save it a lot. This seems to lead to a number of identical trigger functions existing in parallel and triggering simultaneously. Since this is caused simply by saving the file repeatedly, I do not think that I can create a function that will prevent it.
Use a dictionary instead. For example, I have triggers that dynamically get created and destroyed based on other sensors. This is the only small example I have, and it is a work in progress, but this design pattern ensures only a single instance of the trigger is set up:

```python
import akarsoft.util.dt as dt_util

scene_name = "scene.night"
triggers = {}
rising_sensors = ["sun.sun.next_dawn", "sun.sun.next_rising"]
setting_sensors = ["sun.sun.next_setting", "sun.sun.next_dusk"]
trigger_sensors = ["sun.sun.next_rising", "sun.sun.next_dusk"]


@state_trigger(*trigger_sensors)
@time_trigger
def create_time_triggers():
    scene_lights = hass.states.get(scene_name).attributes["entity_id"]
    rising_time = dt_util.as_localtimeiso(dt_util.get_midpoint(*rising_sensors))
    setting_time = dt_util.as_localtimeiso(dt_util.get_midpoint(*setting_sensors))

    @state_trigger(
        [f"{scene_light} == 'on'" for scene_light in scene_lights],
        state_check_now=True,
        state_hold_false=0,
        state_hold=5,
    )
    @time_active(f"not range({setting_time}, {rising_time})")
    def outside_lights_rising(**kwargs):
        log.debug("Rising: Triggered!")
        light.turn_off(label_id="outside_lights", transition=0)

    @state_trigger(
        [f"{scene_light} == 'off'" for scene_light in scene_lights],
        state_check_now=True,
        state_hold_false=0,
        state_hold=5,
    )
    @time_active(f"range({setting_time}, {rising_time})")
    def outside_lights_setting(**kwargs):
        log.debug("Setting: Triggered!")
        scene.turn_on(entity_id=scene_name, transition=0)

    triggers.update(
        dict(
            rising=dict(time=rising_time, trigger=outside_lights_rising),
            setting=dict(time=setting_time, trigger=outside_lights_setting),
        )
    )
```
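As I read the pattern above, the dictionary acts as the single holder for the dynamically created trigger functions. A stripped-down sketch of just that idea (hypothetical entity name, not part of the author's example):

```python
# Stripped-down sketch of the dict-keyed pattern (hypothetical entity name).
# Re-running the factory overwrites the stored entry, so at most one
# trigger function per key is kept referenced at a time.
triggers = {}

@time_trigger("startup")
def rebuild_motion_trigger():
    @state_trigger("binary_sensor.motion == 'on'")
    def on_motion(**kwargs):
        log.info("motion detected")

    # replacing the previous entry keeps a single referenced instance per key
    triggers["motion"] = on_motion
```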
After more investigation on #277, I've found this.

`modules/trig_from_mod.py`

`scripts/quick_test.py`
To reproduce: save the file and see the logging start. Change `ITER` and save the file again. Notice that both sets of logging are still occurring. Moving `make_simple_trigger` into `quick_test.py` causes the expected behavior (the logging stops on reload of `quick_test.py`).
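The two files above are not reproduced in this thread; a hypothetical sketch of the kind of module/script pair being described (the trigger spec and the contents of both files are assumptions) could look like:

```python
# modules/trig_from_mod.py -- hypothetical reconstruction, not the original file
def make_simple_trigger(owner):
    # a periodic trigger that only logs, so it is obvious whether old
    # instances keep firing after the calling script is reloaded
    @time_trigger("period(now, 10sec)")
    def simple():
        log.info(f"simple trigger from module, owner={owner}")
    return simple
```

```python
# scripts/quick_test.py -- hypothetical reconstruction
from trig_from_mod import make_simple_trigger

ITER = 1  # bump this value and save the file to force pyscript to reload the script

trig = make_simple_trigger(f"quick_test iter {ITER}")
```

In a setup like this, the trigger created by the module on the previous save keeps logging after `quick_test.py` reloads, alongside the newly created one, which matches the behavior reported above.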