Limitation on Column Count for Tables in Tulip API
I recently ran into an issue when creating a table via the Tulip API: any request defining more than 203 columns is rejected. The response indicates that the request was well-formed but failed validation because of the column-count limit:
{
  "details": "Request was well-formed, but some fields were invalid. columns: the length must be no more than 203.",
  "errorCode": "ValidationFailed",
  "fieldErrors": {
    "columns": "the length must be no more than 203"
  }
}
It turns out that it’s not possible to create tables with more than 203 columns using the API, which is quite frustrating. In some cases, especially in complex manufacturing or data-tracking scenarios, we need to manage large datasets that require many columns. Having to split the data manually across multiple tables just to fit within the limit complicates the process and increases the risk of human error.
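For anyone stuck with this today, the manual splitting can at least be automated. Below is a minimal Python sketch that chunks a wide column schema into groups of at most 203 so each group fits in one table. The endpoint path, payload shape, and auth handling are assumptions for illustration, not the documented Tulip API:

```python
# Sketch: split a wide schema into chunks of at most 203 columns,
# then create one table per chunk. The column-definition shape,
# endpoint path, and table-naming scheme here are hypothetical.

MAX_COLUMNS = 203  # current API validation limit

def chunk_columns(columns, limit=MAX_COLUMNS):
    """Split a list of column definitions into limit-sized groups."""
    return [columns[i:i + limit] for i in range(0, len(columns), limit)]

# Example: a 450-column schema becomes 3 tables (203 + 203 + 44).
columns = [{"name": f"field_{i}"} for i in range(450)]

for part, cols in enumerate(chunk_columns(columns), start=1):
    payload = {"label": f"wide_dataset_part_{part}", "columns": cols}
    # Hypothetical call -- replace with your actual API client:
    # requests.post(f"{base_url}/tables", json=payload, headers=auth_headers)
    print(f"part {part}: {len(cols)} columns")
```

You would still need a shared key column in each part to join the records back together at query time, which is exactly the extra complexity (and error risk) the post describes.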
I believe allowing more flexibility in the column count would greatly enhance Tulip's capabilities for users working with large-scale datasets. Alternatively, an option to raise the limit through special configuration, or a dedicated API for larger tables, would also help.
It would be great to see an update or workaround to support larger tables in future releases. Does anyone else face similar issues, or have any suggestions for efficiently managing such large datasets in Tulip?
Thanks!