
Dynamic limit on scripting?

I need to make a script to install on 500+ machines. But I need to limit it to run on no more than 10 machines at a time, with a ten-minute gap between runs, to avoid overloading. Short of making multiple copies of the script, each with a label containing 10 machines, and staggering the installs, is there a way to tell the SMA to run only ten at a time and then delay?

Has anyone done anything like this?



Answers (2)

Posted by: Hobbsy 1 year ago
Red Belt
0

Nope, the only way you can stagger in that way would be to create multiple scripts and multiple target labels, and then run each on its own schedule.

Posted by: frank.clark@magaero.com 10 months ago
Senior White Belt
0

You can do this with the API calls (as mentioned in https://www.itninja.com/question/where-to-find-information-on-exporting-kace-inventory-information-to-other-data-applications).

You would be able to script something that gets a list of Machine IDs, then does a foreach that processes x machines (10?) at a time and waits a set amount of time before continuing.

    Connect-SmaServer -Server $KBOX -Org $Org -Credential $Cred -Verbose
    Invoke-SMAScript -ScriptID [SCRIPT ID] -TargetMachineID [MACHINE ID] | Tee-Object -Variable RunTask
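Putting those pieces together, a batching loop might look something like this. This is only a sketch: it assumes the KaceSMA cmdlets shown above, that `$KBOX`, `$Org`, and `$Cred` are already defined, that `$MachineIds` holds the Machine IDs gathered via the API, and that `$ScriptId` is the ID of the script to run (the last two are placeholder names, not anything KACE provides).

```powershell
# Sketch: run the script against 10 machines at a time, pausing ten
# minutes between batches (the limits from the original question).
# Assumes $MachineIds (array of Machine IDs) and $ScriptId are set.
$BatchSize    = 10
$DelaySeconds = 600   # ten-minute gap between batches

Connect-SmaServer -Server $KBOX -Org $Org -Credential $Cred

for ($i = 0; $i -lt $MachineIds.Count; $i += $BatchSize) {
    # Take the next slice of up to $BatchSize machine IDs
    $End   = [Math]::Min($i + $BatchSize, $MachineIds.Count) - 1
    $Batch = $MachineIds[$i..$End]

    foreach ($Id in $Batch) {
        Invoke-SMAScript -ScriptID $ScriptId -TargetMachineID $Id
    }

    # Wait before the next batch, unless this was the last one
    if ($i + $BatchSize -lt $MachineIds.Count) {
        Start-Sleep -Seconds $DelaySeconds
    }
}
```

Adjust `$BatchSize` and `$DelaySeconds` to taste; the point is simply that the pacing lives in your own loop rather than in the SMA schedule.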

This returns a RunId for the script.

You could also do one at a time and use a while loop that checks the status: while success and failure are both $null, continue. Just have it get the results and check; if not done, wait x seconds and repeat.

    Connect-SmaServer -Server $KBOX -Org $Org -Credential $Cred -Verbose
    Get-SmaScriptRunStatus -Id [SCRIPT RUNID] | Tee-Object -Variable RunResult
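The polling approach described above could be sketched like this. Again a sketch only: it assumes the same cmdlets, that `$RunTask` came from the earlier `Invoke-SMAScript ... | Tee-Object -Variable RunTask` call, and the property names (`RunId`, `Success`, `Failed`) are guesses based on the description, so check the actual objects your appliance returns.

```powershell
# Sketch: poll a single script run until it finishes.
# $RunTask is assumed to hold the object returned by Invoke-SMAScript;
# the RunId/Success/Failed property names are assumptions to verify.
$PollSeconds = 30

do {
    Start-Sleep -Seconds $PollSeconds
    $RunResult = Get-SmaScriptRunStatus -Id $RunTask.RunId
    # Keep waiting while neither success nor failure has been recorded
} while ($null -eq $RunResult.Success -and $null -eq $RunResult.Failed)
```

Combining this with the batching idea (poll each run before moving on) gives you strictly sequential execution, at the cost of slower overall rollout.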
