

Cron Schedule for Automation


Hi guys,

Is this the proper format for automation cron jobs?

I got this from SD support a year ago or so.

0 10,11 * * *

Basically I want to run it at 10am and 11am every day.

Please let me know.

  • Hey Erno!

    That is the proper format, and a cron expression checker is a good way to validate any cron line you make.

    One important thing to note is that the automation cron jobs run on UTC time, so if you want it to run at 10am and 11am UTC every day, then the cron line you posted is correct. If you want it to run at 10am and 11am Eastern time, you would need to convert the timezone: during daylight time (EDT, UTC-4) that would be 0 14,15 * * *, since 14:00 and 15:00 UTC are 10:00 and 11:00 EDT, respectively.

    Hope that helps and let me know if you have any more questions on that!
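For anyone doing this conversion regularly, here is a minimal sketch (not part of SureDone) using Python's zoneinfo module to map local wall-clock hours to the UTC hours a cron line needs. The "America/New_York" zone name and the sample June date are assumptions chosen to pin down the daylight-time rule:

```python
# Sketch: convert local schedule hours to the UTC hours a cron line needs.
# Assumes Python 3.9+ (zoneinfo). "America/New_York" stands in for Eastern time.
from datetime import datetime
from zoneinfo import ZoneInfo

def local_hours_to_utc(hours, tz_name, year=2024, month=6, day=1):
    """Map local wall-clock hours to UTC hours on a given date (DST-aware)."""
    tz = ZoneInfo(tz_name)
    utc_hours = []
    for h in hours:
        local = datetime(year, month, day, h, 0, tzinfo=tz)
        utc_hours.append(local.astimezone(ZoneInfo("UTC")).hour)
    return utc_hours

# On a June date Eastern is on daylight time (UTC-4), so 10 and 11 local
# map to 14 and 15 UTC.
utc = local_hours_to_utc([10, 11], "America/New_York")
print(utc)                                      # [14, 15]
print(f"0 {','.join(map(str, utc))} * * *")     # 0 14,15 * * *
```

Running the same conversion for a January date returns 15 and 16 instead, since Eastern standard time is UTC-5, so it's worth re-checking cron lines when daylight saving flips.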

  • Yes, I figured out the time difference when I saw it running at 3am :)

  • I have a file with 2,800,000 SKUs. No matter what we do the automation fails, because the connection between the server and SureDone is closed before the whole file is copied over to SD....

    If I run ~500,000 rows, then there is no issue....

    Is there any way to define which row numbers we want to run in the automation API? Let's say:

    Automation 1: read rows 1-500,000

    Automation 2: read rows 500,001-1,000,000

    Automation 3: read rows 1,000,001-1,500,000


    Please advise, thanks!
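Until row ranges are supported server-side, one client-side workaround is to split the export into chunk files before upload and point a separate automation at each chunk. A hedged sketch in Python, assuming a plain CSV with a header row (the file names and chunk size here are illustrative, not SureDone features):

```python
# Sketch: split a large CSV into fixed-size chunk files client-side so each
# chunk can be fed to a separate automation. Streams row by row, so the
# whole file is never held in memory.
import csv

def split_csv(path, rows_per_chunk=500_000, prefix="chunk"):
    """Write chunk files of at most rows_per_chunk data rows each,
    repeating the header in every chunk. Returns the number of chunks."""
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk_idx, row_count, out, writer = 0, 0, None, None
        for row in reader:
            if row_count % rows_per_chunk == 0:
                if out:
                    out.close()
                chunk_idx += 1
                out = open(f"{prefix}_{chunk_idx}.csv", "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)  # each chunk stays a valid CSV
            writer.writerow(row)
            row_count += 1
        if out:
            out.close()
    return chunk_idx
```

Splitting 2,800,000 rows at 500,000 per chunk would produce six files, each small enough to finish before the connection closes.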

  • Hey Erno,

    We've recently been developing some "pointer" features for Automation Engine that come close to this, but so far they apply to API parameters, not file reading.

    Can you share the automation ID for this? We can take a look and see if there's any other way around this limitation for now. 

  • The automation is Inventory Import #1003.


  • Hi Erno,

    Just getting back to you on this. Unfortunately, I tried playing around with some timeouts, but the connection will still time out after three hours, for example, and we can't set a timeout limit that high without introducing other risks within the system.

    We don't have the ability to seek to a subset of the file yet. A "start at" offset would actually be easy to implement, but an "end at" offset, which is what this file would need, is more difficult to include out of the box.

    So unfortunately there are no good workarounds for now, but we can certainly add something like seeking a subset of a file via FTP to the development pipeline. That will take some time, though, and I can't provide an ETA at the moment.

    Let me know if I can provide any other info, and again we'll add that to the development pipeline in the meantime.
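To illustrate what the "start at" / "end at" offsets discussed above would mean, here is a small sketch that streams only a row range from a CSV without loading the whole file. It is an illustration of the concept, not a SureDone API:

```python
# Sketch of "start at" / "end at" offsets applied to file reading:
# yield only data rows in [start, end), streaming via islice so memory
# use stays constant regardless of file size.
import csv
from itertools import islice

def read_row_range(path, start, end):
    """Yield data rows start..end-1 (0-based, header excluded)."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        yield from islice(reader, start, end)
```

With both offsets available, the three automations above would become calls like read_row_range(path, 0, 500_000), read_row_range(path, 500_000, 1_000_000), and so on.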

