I’m working on a tool tracking app and I’m currently trying to solve the following scenario:
There are transport units (carts) on which multiple tools are prepared for a designated production line. I have logic in the app to determine what has to be prepared on each transport unit.
Some tools belong to a tool kit (individual tools that have to be used together to function properly), and there are additional standalone tools. Each tool is stored in a Tulip table; tools which are part of a kit have the ToolKitId column filled, while individual tools have this column empty.
Tools with IDs BH1G9_1, BH1G9_2, … are individual tools. The others, whose ToolKitId is not empty, are part of a tool kit.
Each tool is in the Tools Tulip table and has a TransportUnitId column. The TransportUnitId is generated in the app by combining the production line id with a random string.
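As an illustration, an id scheme like the one described (production line id plus a random suffix) could be sketched as follows; the separator, suffix length, and function name are assumptions for the example:

```python
import secrets
import string


def make_transport_unit_id(production_line_id: str, suffix_len: int = 6) -> str:
    """Combine a production line id with a random suffix, e.g. 'LINE7-4F9KQZ'.

    The '-' separator and 6-character suffix are illustrative assumptions.
    """
    alphabet = string.ascii_uppercase + string.digits
    suffix = "".join(secrets.choice(alphabet) for _ in range(suffix_len))
    return f"{production_line_id}-{suffix}"
```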
I can filter out and display the correct tools that need to be on a transport unit in an Interactive Table widget. In the image below, the upper interactive table contains tool kits and the table at the bottom contains individual tools. Tools from the selected tool kits, plus the individual tools, must be prepared for the transport unit.
So finally my question:
I need to update the field TransportUnitId on all selected tools (individual tools and tools in tool kits), like shown below:
(Not an exact sample from the step image above, but I think it is clear what has to be done.)
Currently I don’t know how to do that. In SQL I would update TransportUnitId on multiple rows with a filter like "where ToolKitId in ([Id1], [Id2], …)" for tool kits and "where id in (…)" for individual tools.
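For anyone following along, the two UPDATE statements described above can be sketched against an in-memory SQLite database. Table and column names are taken from the post; the sample ids are made up, and this is plain SQL, not Tulip’s own API:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Tools (id TEXT PRIMARY KEY, ToolKitId TEXT, TransportUnitId TEXT)"
)
conn.executemany("INSERT INTO Tools VALUES (?, ?, ?)", [
    ("BH1G9_1", None, None),   # individual tool (empty ToolKitId)
    ("BH1G9_2", None, None),   # individual tool
    ("T100", "KIT_A", None),   # part of tool kit KIT_A
    ("T101", "KIT_A", None),
    ("T200", "KIT_B", None),
])


def assign_transport_unit(conn, transport_unit_id, toolkit_ids, tool_ids):
    """Mass-update TransportUnitId for whole tool kits and for individual tools."""
    kit_ph = ",".join("?" * len(toolkit_ids))
    conn.execute(
        f"UPDATE Tools SET TransportUnitId = ? WHERE ToolKitId IN ({kit_ph})",
        [transport_unit_id, *toolkit_ids],
    )
    id_ph = ",".join("?" * len(tool_ids))
    conn.execute(
        f"UPDATE Tools SET TransportUnitId = ? WHERE id IN ({id_ph})",
        [transport_unit_id, *tool_ids],
    )
    conn.commit()


# Assign kit KIT_A plus one individual tool to a transport unit in two statements.
assign_transport_unit(conn, "LINE7-4F9KQZ", ["KIT_A"], ["BH1G9_1"])
rows = conn.execute(
    "SELECT id FROM Tools WHERE TransportUnitId = 'LINE7-4F9KQZ' ORDER BY id"
).fetchall()
```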
hello @ermingut, thanks for posting your question!! based on what I’m understanding, a mass update of Table Records (similar to Queries and Aggregations, but for writing data) will solve the problem you’re describing. this feature is being developed, keep an eye out on Announcements - Tulip Community for when it’s released!!
in the meantime however, you could loop through all the elements you want to update (by switching through two steps and having Step Enter Triggers). here’s a demo of one way to implement this (which was done for a BOM, but the logic should be similar): https://youtu.be/J50Ju1Ov8nI.
would this work as a solution?? let me know if so and we can provide more information on how to do it for your use case if necessary.
I suppose I can use that, but it is quite complicated.
One issue cannot be solved this way, though…
I’m working on a multi-user app, and it can happen that multiple users are working on the same step. As far as I can see, updating table fields is quite slow.
In the meantime, another user can execute the same step with the same data while the first user is still in the middle of the process. That can lead to data inconsistencies.
Currently I’m testing connector functions with a SQL database in the background. Updates of that kind, if executed correctly, are immediate, and one user will not be able to interfere with another user’s data.
Also thanks from my side for posting the question - this is a really interesting use case. I think that Gio’s suggestion of using a loop is a good idea, but I see the issue that you have with updating.
My suggestion would be to add a status column to your Tools table and set a tool/toolkit to ‘reserved’ at the same time as you set the TransportUnitId. You can also set it to ‘in use’ for as long as the tool/toolkit is actually in use. Then you only have to check the status of a tool before you set the TransportUnitId.
This should make sure that no two operators can use a tool at the same time.
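One way to make the reserve-and-assign step safe against two operators racing each other (sketched here against a SQL backend with sqlite3; in Tulip itself the equivalent would be trigger logic) is a conditional update that only succeeds while the tool is still available:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Tools (id TEXT PRIMARY KEY, status TEXT, TransportUnitId TEXT)"
)
conn.execute("INSERT INTO Tools VALUES ('BH1G9_1', 'available', NULL)")


def try_reserve(conn, tool_id, transport_unit_id):
    """Reserve a tool only if no one else has.

    The status check lives inside the WHERE clause, so check-and-set is one
    atomic statement; rowcount tells us whether we actually won the tool.
    """
    cur = conn.execute(
        "UPDATE Tools SET status = 'reserved', TransportUnitId = ? "
        "WHERE id = ? AND status = 'available'",
        (transport_unit_id, tool_id),
    )
    conn.commit()
    return cur.rowcount == 1  # False means another user got there first


first = try_reserve(conn, "BH1G9_1", "LINE7-AAA111")   # wins the tool
second = try_reserve(conn, "BH1G9_1", "LINE7-BBB222")  # loses the race
```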
Please let me know if this solution makes sense or if I have misunderstood something.
We have run into the same use case where we need to update multiple rows in a table. We went down the same path of looping and updating each individual record, but this is slow and there is a trigger limit of 100. We envision our use case reaching as high as 5,000 records.
It was recommended that we use the new looping feature in Automations, but this too has a timeout.
The need is to mass create, update, or delete records.
The post from May of 2021 stated “this feature is being developed.” That was 4 years ago. We could use this functionality now!
Since I don’t want the app user to wait for that, I solved it by using Node-RED.
The app calls a Node-RED connector and gets an immediate 200 back (order accepted).
When Node-RED is done calling and writing the data (the beauty is that you can manipulate the data in any way you like), it calls a machine to give feedback to the Tulip app.
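The accept-immediately, process-later pattern described here could be sketched outside Node-RED with a simple background worker; the payload shape, handler name, and result format below are illustrative assumptions:

```python
import queue
import threading

jobs = queue.Queue()
results = []


def worker():
    # Background worker: performs the slow table writes after the caller
    # has already received its 200 "order accepted" response.
    while True:
        job = jobs.get()
        # ... here you would call the database or Tulip Table API ...
        results.append(
            f"updated {len(job['tool_ids'])} tools for {job['transport_unit_id']}"
        )
        jobs.task_done()


threading.Thread(target=worker, daemon=True).start()


def handle_request(payload):
    """Connector endpoint sketch: enqueue the bulk update and answer at once."""
    jobs.put(payload)
    return 200  # the real work happens asynchronously


status = handle_request(
    {"transport_unit_id": "LINE7-XYZ", "tool_ids": ["BH1G9_1", "BH1G9_2"]}
)
jobs.join()  # demo only: wait so we can inspect the worker's result
```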
Thank you, Thorsten, for suggesting middleware. It confirms one of our potential solutions. Native Tulip functionality would be more advantageous, though, and we will continue to advocate for its implementation in a future release.
But we do acknowledge this still may not be the ideal solution for bulk table updates. We’ve had some internal discussion with the Tulip product team on this recently, and I will check in on that / raise your interest in a better native solution here as well!
The time limit associated with Automations makes them unreliable. I support a regulated industry where loss of data is a significant issue.
The number of records that can be processed within the 60-second timeout depends on how much processing needs to occur, and it may change based on load. A fixed loop limit is more reliable than a time limit; that way I can ensure all of the records are processed.
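A fixed-batch loop of the kind argued for here could be sketched as follows; the batch size of 100 mirrors the trigger limit mentioned earlier in the thread, and 5,000 the expected record count:

```python
def chunks(records, size=100):
    """Yield fixed-size batches so each pass stays under a known limit."""
    for i in range(0, len(records), size):
        yield records[i:i + size]


# 5,000 records in batches of 100 -> 50 predictable passes, no time-based cutoff.
batches = list(chunks(list(range(5000)), 100))
```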