POST /v1/tables/{data_table_key}/records/bulk

How to use it

Create your data table

To push data to a data table, you first have to import it into Qobra manually, via a CSV file uploaded from Qobra’s import page. After uploading this CSV, you’ll arrive on your new data table.

For more information, see the related help center article.

If you edit your data table, you’ll find:

  • the table-api-name, which identifies your data table and can be edited
  • the id-field-api-name, which you can edit if needed
  • the name-field-api-name, which you can edit if needed

On this page, you will also find the API key of each of your data table’s fields (field-api-key). You can edit them if needed.

Variable API keys

This endpoint is based on dynamic fields. This means that every annotation that looks like <variable-key> must be replaced by its actual API key.
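For instance, here is a hypothetical record written as a Python dict, assuming a data table whose id-field-api-name is deal_id, whose name-field-api-name is deal_name, and which has one extra field whose field-api-key is amount; substitute the API keys configured in your own data table:

    # Illustrative only: these keys are assumptions, not real Qobra defaults.
    record = {
        "deal_id": "D-001",      # <id-field-api-name>
        "deal_name": "Acme Q3",  # <name-field-api-name>
        "amount": 12000,         # <field-api-key> of a custom field
    }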

Troubleshooting

When troubleshooting a webhook import, first check that your imports appear on Qobra’s import page.

  • if your imports do not appear in the list of previous imports (after refreshing the page), check your authentication, as in the sketch below
  • if your import is in warning, you’ll see the records that failed to import, along with the details of each error
  • if your import seems successful but the data in your data table is not updated, your field API names most likely mismatch the ones configured in Qobra
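As a starting point, here is a minimal Python sketch for triaging the response of a push; the status codes and the diagnose name are assumptions, since this page does not specify the endpoint’s error codes:

    import requests

    def diagnose(resp: requests.Response) -> None:
        # Sketch: rough triage of a bulk-push response.
        if resp.status_code in (401, 403):
            # An authentication failure would explain imports never
            # appearing on the import page.
            print("Authentication failed: check your X-API-Key header")
        elif resp.ok:
            # A 2xx only means the request was accepted; field API name
            # mismatches surface later, on the import page.
            print(f"Import {resp.json()['import_id']} created; check the import page")
        else:
            print(f"HTTP {resp.status_code}: {resp.text}")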

Authorizations

X-API-Key
string
header
required

Path Parameters

data_table_key
string
required

Used to identify the data table

Query Parameters

all
boolean
default: false

Optional: if all is true, the request will delete all of the data table’s content before doing the import

debounce
integer
default: 0

Optional (minimum: 0): debounce is the number of seconds Qobra waits before launching the import. This query parameter is used in conjunction with the all query parameter: if you want to push a large number of records but delete the data table’s records first, the debounce system lets you split that large amount across several requests. For instance, if you have 300k records to push, you can set the debounce query parameter to 900 seconds (15 minutes) and split your synchronisation into 60 requests of 5k records each, as in the sketch below.
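Below is a minimal Python sketch of such a split synchronisation. The base URL and variable names are placeholders, and sending all=true with every chunk (rather than only the first request) is an assumption, not documented behavior:

    import requests

    API_KEY = "your-api-key"              # placeholder
    DATA_TABLE_KEY = "your-table-key"     # placeholder
    BASE_URL = "https://api.example.com"  # placeholder: use Qobra's actual API host
    CHUNK_SIZE = 5_000

    def push_full_sync(records: list[dict]) -> None:
        # debounce=900 (15 minutes) tells Qobra to wait before launching the
        # import, so every chunk can land in the same full-replace import.
        for start in range(0, len(records), CHUNK_SIZE):
            resp = requests.post(
                f"{BASE_URL}/v1/tables/{DATA_TABLE_KEY}/records/bulk",
                params={"all": "true", "debounce": 900},
                headers={"X-API-Key": API_KEY},
                json={"items": records[start:start + CHUNK_SIZE]},
            )
            resp.raise_for_status()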

Body

application/json
items
object[]

Array containing all the records you want to import.
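For reference, a minimal end-to-end Python sketch; the base URL and the record keys are illustrative assumptions:

    import requests

    API_KEY = "your-api-key"           # placeholder
    DATA_TABLE_KEY = "your-table-key"  # placeholder

    resp = requests.post(
        # Placeholder host: substitute Qobra's actual API base URL.
        f"https://api.example.com/v1/tables/{DATA_TABLE_KEY}/records/bulk",
        headers={"X-API-Key": API_KEY},
        json={
            "items": [
                {"deal_id": "D-001", "deal_name": "Acme Q3", "amount": 12000},
                {"deal_id": "D-002", "deal_name": "Globex Q3", "amount": 8500},
            ]
        },
    )
    resp.raise_for_status()
    body = resp.json()
    print(body["import_id"], body["objects_imported"])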

Response

200 - application/json
import_id
string
required

Unique identifier of the import you created

objects_imported
integer
required

Number of records your request contained