Multiple record creation via Tulip API

Hello all,

There are many inquiries from customers who would like to upload many records from a CSV or Excel file. Tulip has a CSV import function, but we need to match each field manually before uploading, so it is difficult to upload a CSV file automatically.

If the Tulip API could accept an array of objects in a POST request, we would be able to upload multiple records to a table in one transaction. Right now, we have to use Node-RED to read a CSV file and send it one line at a time.
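
For reference, the per-row loop we have to run today looks roughly like this in Python (a sketch only; the instance URL, table ID, credentials, and CSV column names are placeholders, and the exact endpoint path may differ by instance and API version):

import csv
import requests

BASE_URL = "https://your-instance.tulip.co/api/v3"  # placeholder instance URL
TABLE_ID = "YOUR_TABLE_ID"                          # placeholder table ID
AUTH = ("api_key", "api_secret")                    # Table API credentials

with open("records.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One POST per CSV row: the records endpoint accepts a single
        # JSON object, not an array of objects.
        response = requests.post(
            f"{BASE_URL}/tables/{TABLE_ID}/records",
            json=row,  # assumes CSV headers match the table's column names
            auth=AUTH,
        )
        response.raise_for_status()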

Kind Regards,
Akira Takahashi

A workaround is to build your JSON payload by transforming the array into a comma-delimited string under a single « key ». We use this a lot for POST requests with an array in the payload.
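
For instance (a rough sketch; the key name "ids" is just an example):

values = ["1", "2", "3"]

# Pack the array into one comma-delimited string under a single key,
# so the POST body stays a single JSON object that the endpoint accepts.
payload = {"ids": ",".join(values)}
# payload == {"ids": "1,2,3"}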

Hello Youri,
Thank you for your answer, but I didn’t get it. Could you please give an example of what a sample multi-record payload looks like?
And if possible, the code which generates it?

Thank you

Hello Youri,
@youri.regnaud

I didn’t notice your post. Sorry. But I would like you to give us a sample of how to create msg.body for multiple records. I encountered a 400 error when I set msg.body as below.

[
  {"id": "1"},
  {"id": "2"},
  {"id": "3"}
]

I am also curious about this. I thought the only way to create multiple records was multiple API requests. Certainly there is nothing in the API docs suggesting otherwise… I am using Python, not Node-RED. I think I know what you’re talking about as far as building a string with the comma delimiter; I have to do a similar construction when querying the database to find all records matching some list of IDs.

Here is the code for anyone curious:

def list_format(id_list):
    """Creates a JSON-array string of values that can be used in Tulip API
    requests with an isIn filter. Primarily intended for taking a list of IDs
    connected to a test plan and returning all results from the Sample or
    Test Results table matching those IDs."""

    id_list_string = "["  # start the array; an empty list now yields "[]"

    for i, record_id in enumerate(id_list):
        id_list_string += f"\"{record_id}\""
        if i != len(id_list) - 1:
            id_list_string += ","  # separator between elements, none after the last

    return id_list_string + "]"
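
For example, list_format(["S-001", "S-002"]) returns the string ["S-001","S-002"], which the isIn filter can parse as a JSON array.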

Then your query parameters would contain:

parameters = {
    "filters.0.field": "id",
    "filters.0.functionType": "isIn",
    "filters.0.arg": id_list_string,
    "filterAggregator": "all",
    "limit": LIMIT,
}
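
With those parameters, the request itself looks roughly like this (a sketch; the instance URL, table ID, and credentials are placeholders, and the endpoint path may differ by instance and API version):

import requests

BASE_URL = "https://your-instance.tulip.co/api/v3"  # placeholder instance URL
TABLE_ID = "YOUR_TABLE_ID"                          # placeholder table ID
AUTH = ("api_key", "api_secret")                    # Table API credentials

response = requests.get(
    f"{BASE_URL}/tables/{TABLE_ID}/records",
    params=parameters,  # the filter parameters built above
    auth=AUTH,
)
response.raise_for_status()
matching_records = response.json()  # list of matching record objects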

Hey @tulip Team,

can you give an update on this product suggestion?

For us and many of our customers, it would be a great simplification to be able to send arrays, not just single objects/rows, to the Table API.
When developing a script to push data to the Tulip API, admins often reach the point where they have a finished array that they want to send to the Table API, and this is not possible.
Instead, they always have to build another loop to send each record individually.

Further background:

In recent years, our customers have increasingly used the Table API instead of connectors to exchange data with an ERP system.
There are various reasons for this: sometimes there is no web service on the ERP side that connectors could call, so the IT admins build a script to send data to the Tulip API instead. Or they want absolute control over which data (mostly production orders and bills of materials) is pushed to Tulip tables.
It also has another welcome side effect: it simplifies the app logic for app builders, since they only have to work with tables instead of connector functions and the resulting objects in variables/arrays.

Kind regards
Jeremy Nicholls


Hey @Jeremy and everyone else in this thread,

thanks for the feedback and suggestions regarding data import capabilities on Tulip. We are actively exploring ways to enhance data exchange between our platform and various systems beyond the current capabilities.

While the idea of allowing bulk uploads through an array of objects is appealing, there are practical considerations to address, such as API performance and client-side implementation. It’s important to note that even with bulk operations, there would still be a need to enforce limits on the number of records per transaction to ensure the API’s reliability and responsiveness. As a result, some form of looping through records on the client side would likely still be necessary.
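
To illustrate: even if a bulk endpoint existed, the client would still need to batch records, roughly like this (the bulk call shown is hypothetical and not part of the current API):

def chunked(records, batch_size):
    """Yield successive batches of at most batch_size records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# for batch in chunked(all_records, 100):
#     post_bulk_records(batch)  # hypothetical bulk call, not in today's API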

We understand that changes in this area could significantly simplify the process for relevant use cases, and it’s something we are considering carefully. However, I want to be transparent that as of now, there is no defined timeline for when work in this area might begin.

Still, I’m more than happy to discuss concrete use cases and related challenges at any point. Feel free to reach out directly at stefan(at)tulip.co and we can set up time to discuss!