MHScott
Engaged Sweeper II

Not sure if it started this week or if I just noticed it, but flows running at full scale that include a large amount of secondary data (e.g., registry keys and/or the software list) are failing with an "Execution has exceeded the maximum allowed memory consumption of 1024 MB. - EC:102" error.

It works up to around 7,500 assets but fails beyond that.

Execution of flows against >25k assets without secondary data runs fine.

 

1 ACCEPTED SOLUTION

Hi @MHScott.
This EC:102 error comes from the execution runner when an execution exceeds the memory limit. 

A best practice for workflows that process large datasets is to handle a smaller amount of data in each execution, and/or use a recursive flow that re-calls itself to process the next batch of data. For example, if the flow retrieves a large list of assets, it can pull one page of data, process it, and then call itself again to process the next page, and so on until processing is complete (with branching logic in the flow to check whether there are more results to process). This should help avoid hitting the execution runner's memory limits.

You can find one example of a recursive flow using the "List of Assets" action pagination in the template "Custom field bulk edit".
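For illustration, here is a rough Python sketch of that pagination pattern. The page size, the asset source, and the per-asset work are all placeholders, not the actual Flow Builder actions or Lansweeper API; the point is only the "process one page, then call yourself for the next page" shape.

```python
# Hypothetical sketch of paginated, batch-at-a-time processing.
# get_assets_page() and process_asset() are stand-ins for the
# "List of Assets" pagination and whatever per-asset work the flow does.

PAGE_SIZE = 500           # keep each execution's working set small
TOTAL_FAKE_ASSETS = 1200  # stand-in for a large asset list

def get_assets_page(page: int, page_size: int) -> list[dict]:
    """Stand-in for the 'List of Assets' action with pagination."""
    start = page * page_size
    end = min(start + page_size, TOTAL_FAKE_ASSETS)
    return [{"id": i} for i in range(start, end)]

def process_asset(asset: dict) -> None:
    """Stand-in for the per-asset work (e.g. reading secondary data)."""
    pass

def run_batch(page: int = 0) -> None:
    assets = get_assets_page(page, PAGE_SIZE)
    for asset in assets:
        process_asset(asset)
    # Branching logic: a full page means there may be more data,
    # so the flow "calls itself" for the next page; otherwise it stops.
    if len(assets) == PAGE_SIZE:
        run_batch(page + 1)

run_batch()
```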


4 REPLIES
MHScott
Engaged Sweeper II

No problem.  I took advantage of the holidays to play around with Flow Builder.

I'll go with the simpler flow: I first pulled all of our Windows assets and looped through them looking for devices with low disk space so I could create a ServiceNow ticket.

I got things working better (it still fails from time to time) by putting a small sleep in the code.

I've also enhanced the script to use filters to remove servers first and then look for devices with low disk space.

With the sleep, though, the scripts are now creeping up on the max time limit.
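For anyone following along, a rough Python sketch of that kind of check is below. The field names, the 10 GB threshold, the batch size, and the ServiceNow call are all made up for illustration; they are not Lansweeper's or ServiceNow's actual APIs.

```python
import time

LOW_DISK_GB = 10      # assumed threshold for "low disk space"
BATCH_SIZE = 200      # process assets in small chunks
SLEEP_SECONDS = 1     # the short pause mentioned above

def is_server(asset: dict) -> bool:
    """Stand-in filter: drop servers before the disk-space check."""
    return "server" in asset.get("os_name", "").lower()

def create_servicenow_ticket(asset: dict) -> None:
    """Stand-in for whatever ServiceNow integration the flow calls."""
    print(f"Ticket: {asset['name']} has {asset['free_gb']} GB free")

def check_low_disk(assets: list[dict]) -> None:
    workstations = [a for a in assets if not is_server(a)]
    for i in range(0, len(workstations), BATCH_SIZE):
        batch = workstations[i:i + BATCH_SIZE]
        for asset in batch:
            if asset.get("free_gb", 0) < LOW_DISK_GB:
                create_servicenow_ticket(asset)
        # Short pause between batches; note this eats into the max run time.
        time.sleep(SLEEP_SECONDS)

check_low_disk([
    {"name": "PC-001", "os_name": "Windows 11", "free_gb": 4},
    {"name": "SRV-001", "os_name": "Windows Server 2022", "free_gb": 250},
])
```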

MHScott
Engaged Sweeper II

I am trying to avoid the complexities of recursion, but I do have one flow working.

If it helps anyone else, I also had success in some scripts by putting in a one- or two-second sleep after loading or manipulating large amounts of data.

Jacob_H
Lansweeper Employee

Hey MHScott - apologies for not replying sooner; I was banned at home from using my computer during vacation. Thanks for letting us know about the memory issue - I have passed it along to the developers to see if it can be resolved. What are you trying to do, if you don't mind me asking? Maybe we can put a filter or something on there in the meantime.
