Too many triggers in a row

Hi team,

I hope you are well. We ran into an issue with our triggers while running our app, as shown in this image. When we store our data in the Tulip table, we set a record limit of 100, expecting better performance when running the app.

While troubleshooting this error, we found that one table (limited to 100 records) had aggregations with calculation = Mode, which was associated with the error in the snapshot via triggers. The symptom when the error occurs is that no record is generated in that table (no new rows are inserted).

Here are some questions that I have:

  1. I understand the table record default limit is 1000. If we anticipate generating about 4000 records a year, what limit would you recommend? 5000?
  2. Would it cause severe performance issues if we had 15K-25K records in the same table?
  3. If we needed to keep the data for 3-5 years due to GxP requirements, for example, how do we deal with such a large number of records in the table?
  4. Could you advise on how to handle record archival within Tulip?

Thanks!

Regarding the error message, there are some interesting caveats to that 100-trigger limit. It doesn't seem to take into account any triggers before the transition that starts a loop. It does take into account inactive triggers in the loop sequence, as well as any queued triggers after your loop exit condition (such as App Start and Step Enter triggers, in which case you would get one such error message for each successive queued trigger after you hit 100).

If you can fit your loop actions into just one trigger on Step Enter, it is possible to iterate through 100 records (one record per loop iteration) and exit the loop (technically a 101st trigger) without getting the error. Theoretically you could process multiple records per trigger, but this can get messy.
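Tulip triggers aren't written as code, but the counting behaviour described above can be sketched as a small simulation. This is a hypothetical model based on the observations in this thread, not official Tulip behaviour; the limit of 100 and the "exit trigger doesn't count" detail are assumptions drawn from the post:

```python
# Hypothetical simulation of the "too many triggers in a row" limit.
# One Step Enter trigger fires per loop iteration; the exit transition
# fires one more trigger, which (per the observation above) does not
# hit the limit check.

TRIGGER_LIMIT = 100  # assumed chained-trigger limit

def run_loop(total_records):
    """Process one record per iteration, one trigger per iteration."""
    fired = 0
    processed = 0
    while processed < total_records:
        fired += 1  # Step Enter trigger for this iteration
        if fired > TRIGGER_LIMIT:
            raise RuntimeError("Too many triggers in a row")
        processed += 1
    fired += 1  # exit transition: the "101st" trigger that still succeeds
    return processed, fired

# 100 records fit exactly: 100 loop triggers plus 1 exit trigger.
print(run_loop(100))  # → (100, 101)
```

With 101 records the simulation raises on the 101st in-loop trigger, which matches why batching more than 100 records into one chained loop fails.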

Another watchout: if you use a timer on your loop-exit landing step to resume the loop and run multiple 100-iteration loops, be careful you don't run into the auto-logout timer (if enabled), as it can cause the app to terminate in the middle of your loop.

Regarding your questions:

  1. The query results limit is 1000 records.
  2. Tables can have hundreds of thousands of records in my experience with no problems.
  3. Not a problem, as long as you have a way to filter your records to get within 1000 records of the one you want. You can always export tables to CSV and locate data offline.
  4. I'm not on the IT side, but I believe there is some backup process for table data; this may differ depending on whether you are on Tulip Cloud or Tulip Customer Cloud.
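To illustrate the offline-filtering idea in point 3, here is a minimal Python sketch for filtering an exported CSV. The column name and file path are hypothetical examples, not your actual table schema:

```python
import csv

def filter_rows(path, column, value):
    """Return the rows of an exported table CSV where `column` equals `value`."""
    with open(path, newline="") as f:
        return [row for row in csv.DictReader(f) if row.get(column) == value]

# Example (hypothetical file and column):
# matches = filter_rows("table_export.csv", "Batch ID", "B1")
```

This keeps the in-app table lean while still letting you locate historical records when needed.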