Overwrite Table in Tulip via API using an IPAAS solution

Hey Everyone,

I’m building an app that will create and store items in our LIMS system after issuing out of SAP. Our LIMS system has a frequently updated set of storage locations, and there are over 50,000 storage locations in our system. Fortunately, it is very easy for me to pull a table from our LIMS system that contains all the relevant storage information I would need to pass into Tulip for the end users to choose from.

Because this is updated frequently in our LIMS system, my current approach is to create a flow in our IPAAS solution, set to run every night, that will pull the table from our LIMS solution, delete all rows in the Tulip table, and replace them with all of the rows pulled from our LIMS system that day. (The update API does not work for us because it is not apparent which items have changed on a given day.)

Step one for me would be to use the “Delete All Records” API call, but is there a way for me to bulk-add all ~50,000 rows via the API? I want to avoid making one API call per record, which is seemingly what the “Create a Tulip Table record” endpoint requires.
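For context, here is a minimal sketch of what that nightly job might look like from the IPAAS/middleware side, assuming the Tables API really does take one POST per record. The base URL, table ID, credentials, and field names are all placeholders, and the exact `/records` endpoint paths are assumptions — check them against the Tulip Table API docs for your instance:

```python
import base64
import json
import urllib.request

BASE = "https://your-site.tulip.co/api/v3"  # placeholder instance URL
TABLE = "yourTableId"                        # placeholder table ID
API_KEY, API_SECRET = "api-key", "api-secret"  # placeholder bot credentials


def _request(method, url, payload=None):
    """Fire one authenticated call against the (assumed) Tables API, basic auth."""
    token = base64.b64encode(f"{API_KEY}:{API_SECRET}".encode()).decode()
    req = urllib.request.Request(
        url,
        method=method,
        data=json.dumps(payload).encode() if payload is not None else None,
        headers={"Authorization": f"Basic {token}",
                 "Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)


def lims_row_to_record(row):
    """Map one LIMS row to a Tulip record payload; 'id' and the field
    names here are placeholders for your table's actual columns."""
    return {"id": row["location_id"], "name": row["location_name"]}


def nightly_sync(lims_rows):
    # 1) wipe the table (the "Delete All Records" call)
    _request("DELETE", f"{BASE}/tables/{TABLE}/records")
    # 2) re-create every record -- one POST per record, looped server-side
    #    in the middleware rather than 50,000 calls driven from Tulip
    for row in lims_rows:
        _request("POST", f"{BASE}/tables/{TABLE}/records",
                 payload=lims_row_to_record(row))
```

The point of the sketch is that the per-record loop runs inside the middleware, so Tulip itself never has to make the 50,000 calls.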

Any alternative propositions are welcome!

Thanks!

Hi @jay.rack ,

do you have native access to the LIMS database? A possible solution for this would be using a Tulip Automation: running on an hourly, daily, or weekly basis, you can request the data via a SQL query and push it to the table using a loop.

Deleting the old values could be executed in the same automation, using an API call to delete all rows.

Hey @tokotu

I do have native access to it, although we prefer to route through our IPAAS solution and/or proxy server when we can, since it’s on-prem.

I also have worries about the performance of the loop in Tulip given the large number of records. I don’t have a lot of Tulip experience, but I did build an app around a year ago that used a looper, and it was pretty slow if I recall correctly (I may not be recalling correctly, so feel free to call me out!)

The in-app performance is much slower, that’s correct, but the looper in an Automation is fast. And since you mentioned this should be an automatic nightly job, you don’t have to worry about performance. But if you do: you can easily define your SQL query in Tulip to load your LIMS data in smaller packages.
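Loading the data in smaller packages could be done with keyset pagination: each page starts after the last ID seen on the previous page. A sketch, assuming a hypothetical `storage_locations` table with a numeric `location_id` key (table and column names are made up for illustration):

```python
def page_query(last_seen_id, page_size=10000):
    """Build one page of a keyset-paginated query: fetch the next
    `page_size` rows after `last_seen_id`. Unlike a growing OFFSET,
    this stays fast even at 50k+ rows."""
    return (
        "SELECT location_id, location_name, building, room "
        "FROM storage_locations "
        f"WHERE location_id > {int(last_seen_id)} "
        "ORDER BY location_id "
        f"LIMIT {int(page_size)}"
    )
```

The automation would run this repeatedly, feeding the last ID of one page into the next query, until a page comes back with fewer than `page_size` rows.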

Is it possible to route this automation through our IPAAS solution / a proxy server rather than directly to our SQL database?

Do you know of any documentation detailing this process? Sorry for all the questions, I’m just getting into using Tulip heavily

Thanks!

You can use an HTTP connector instead of a SQL connector. Does IPAAS have an API, so Tulip can request data? AND if you use a proxy, check the proxy limits too :wink:

IPAAS is essentially middleware to facilitate integrations via API, and it lets you perform any intermediate manipulations as well… I also saw that Tulip has a Snowflake connector… it just so happens that we put all of our LIMS data in Snowflake as well, so I can use that as a backup plan so long as I’m able to parse one of the columns out into different columns within Tulip.

Back to your suggestion: yes, Tulip can request data from IPAAS. My question would be, if we did that, would it still be one record at a time?

In other words, would I need to configure the response from IPAAS to Tulip to return only one record at a time because that is all Tulip can handle? Or could I send the entire table back as one response, and then Tulip could use that response to add the records one at a time within the automation? (I want to avoid making 50,000 API calls from Tulip if possible.)

Tulip can “easily” process a lot of data since you use an Automation for this. But splitting the 50k rows into smaller buckets of 10k, 10k, … seems logical, if you ask me :smiley:

The great thing about Tulip is: test it, try it, work with it :smiley:

From my side: in other scenarios I have also requested a big chunk of data, since I implemented a connection to our LLM instance.

I seem to be getting the hang of it, but it’s not looking like I have the ability to delete all table records from inside the automation.

That’s right! What I meant is to call an HTTP connector against the Tulip Table API to delete all records, as you wanted to do anyway.