Mass update a Tulip table from a CSV file


I have a master data table with basic article data across a list of fields. This data is updated continuously by an interface from a third-party system (SAP). By a new demand, I have to add a field and import data into that particular field only; the ID of the table is the article number.

So, I have this CSV file with two columns: the article ID and the values of the new field.

How can I upload it and update ONLY this field?

The table has well over 10,000 records, so updating them one by one is not practical!

Best regards,
Amit Berku

You could write some Python and use the Table API (How to Use the Table API | Tulip Knowledge Base - Support for Building Operations Apps) to update it.
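As a hedged sketch of that approach: the instance URL, table ID, field ID, and bot credentials below are placeholders, not from this thread, and the endpoint shape follows the Table API docs, so check it against your own instance before running anything.

```python
import base64
import csv
import json
from urllib import request

# Placeholders -- replace with your own instance, table, field, and bot credentials.
INSTANCE = "https://your-instance.tulip.co"
TABLE_ID = "YOUR_TABLE_ID"
FIELD_ID = "new_field"  # the one column you want to fill
API_KEY, API_SECRET = "YOUR_API_KEY", "YOUR_API_SECRET"


def read_updates(csv_path):
    """Parse the two-column CSV (article id, new value) into (id, value) pairs."""
    with open(csv_path, newline="") as f:
        return [(row[0], row[1]) for row in csv.reader(f) if row]


def update_record(record_id, value):
    """PUT only the one field; the record's other columns are left untouched."""
    url = f"{INSTANCE}/api/v3/tables/{TABLE_ID}/records/{record_id}"
    token = base64.b64encode(f"{API_KEY}:{API_SECRET}".encode()).decode()
    req = request.Request(
        url,
        data=json.dumps({FIELD_ID: value}).encode(),
        method="PUT",
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
        },
    )
    with request.urlopen(req):  # raises on HTTP error responses
        pass


def main(csv_path="new_field_values.csv"):
    # One PUT per article id found in the CSV.
    for article_id, value in read_updates(csv_path):
        update_record(article_id, value)
```

Calling `main()` walks the CSV and issues one PUT per article, so ~10,000 rows become a few minutes of API calls instead of manual edits.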

Hi there,
I’m not a Python developer and looked for an internal Tulip tool to use, not an external one, another solution I can see here, is uploading this CSV to a Tulip table with the same key, run an application with 2 steps, in the first, get all rows into an array (by a connector function) and in the second do kind of loop and pop an id, get the value from the new table and update it in the parallel original table where I wanted to update in the first place.
It’s a very long solution but it is the only one I can implement with the Tulip, if there is any other solution, please help me with that.

Amit Berku

Hi Amit,

Would the following work for you?

  • Export your current master data table from Tulip as a CSV
  • Add the additional info in Excel
  • Import the updated CSV file back (select Overwrite existing records)
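If the export/import round trip is acceptable, the merge step can also be scripted instead of done in Excel. A minimal stdlib-only sketch, where the filenames and the `article_id`/`new_field` column names are assumptions, not from this thread:

```python
import csv


def merge_new_field(export_path, values_path, out_path,
                    key="article_id", new_col="new_field"):
    """Fill `new_col` in the exported master-data CSV using a
    two-column CSV of (article id, value) pairs."""
    # Load the id -> value mapping from the two-column CSV.
    with open(values_path, newline="") as f:
        values = dict(csv.reader(f))

    # Read the exported table, keeping all existing columns.
    with open(export_path, newline="") as f:
        rows = list(csv.DictReader(f))
        fields = list(rows[0].keys()) if rows else [key]

    if new_col not in fields:
        fields.append(new_col)

    for row in rows:
        # Only touch the new column; everything else is re-imported as-is.
        row[new_col] = values.get(row[key], row.get(new_col, ""))

    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows)
```

The output file keeps every existing column unchanged, so the re-import with "Overwrite existing records" only adds the new field's values.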


Hi @Patxi,
That would be a great solution, but I have an interface that updates this table 24/7 based on changes in the main ERP (SAP). If I do that, I could lose updates or inserts, so only a direct update to this Tulip table will work.


That's probably workable, but it needs two main tools:

  • the Tulip API (to get the rows to update)
  • a Windows scheduled task (to update them)
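For the scheduled-task half, assuming the update script discussed earlier in the thread is saved somewhere like `C:\scripts\update_field.py` (path and task name are illustrative, not from this thread), Windows Task Scheduler can run it nightly:

```shell
REM Create a Windows scheduled task that runs the update script daily at 02:00.
REM The task name and script path are placeholders -- adjust to your setup.
schtasks /Create /TN "TulipFieldUpdate" /TR "python C:\scripts\update_field.py" /SC DAILY /ST 02:00
```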