Genesys Cloud - Developer Announcements!


Deprecation: Datatable CSV Upload Endpoint POST /api/v2/flows/datatables/{datatableId}/import/jobs


    Summary

    What’s Changing

    The operation that starts a CSV import for a Datatable is changing. POST /api/v2/flows/datatables/{datatableId}/import/jobs is being replaced by POST /api/v2/flows/datatables/{datatableId}/import/csv/jobs.

    With the new flow, you still upload the file to the URL in uploadURI, but the upload must be a PUT request to that URL, and the request must include every header returned in uploadHeaders. The previous response exposed only the URL, which is no longer sufficient for a supported upload step.

    The POST /api/v2/flows/datatables/{datatableId}/import/jobs endpoint will be removed after the transition period.

    Deprecation

    This endpoint remains available during the deprecation period, but we recommend migrating as soon as possible:

    POST /api/v2/flows/datatables/{datatableId}/import/jobs

    API Explorer – start import job

    Relying on uploadURI alone for the upload step is deprecated and will not be supported long term.

    What to Use Instead

    Use this operation so your client receives both the upload URL and the required headers in one contract:

    POST /api/v2/flows/datatables/{datatableId}/import/csv/jobs

    API Explorer – start CSV import job

    New Endpoint – Behaviour Summary

    Starts an import of CSV rows into a Datatable. Upload the CSV by sending a PUT request to the URL in uploadURI. The PUT must include every header returned in uploadHeaders. The response also includes a token you can use to poll import status until the job completes.
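    The upload step described above can be sketched in Python. Only the uploadURI and uploadHeaders field names come from this announcement; the URL, the header names, and the header values below are placeholders invented for the example, since the real values are allocated per upload session by the API:

    ```python
    import urllib.request

    # Hypothetical job-start response. The uploadURI/uploadHeaders fields match
    # the new contract, but every value here is a placeholder.
    job = {
        "uploadURI": "https://upload.example.com/datatable/abc123",
        "uploadHeaders": {
            "Content-Type": "text/csv",
            "x-example-session": "session-token-placeholder",  # assumed per-session header
        },
    }

    def build_upload_request(job, csv_bytes):
        """Build the PUT upload: the URL comes from uploadURI, and every
        header from uploadHeaders is applied verbatim, as the API requires."""
        return urllib.request.Request(
            job["uploadURI"],
            data=csv_bytes,
            headers=dict(job["uploadHeaders"]),  # pass all headers through unchanged
            method="PUT",
        )

    req = build_upload_request(job, b"key,value\nrow1,hello\n")
    # urllib.request.urlopen(req) would then perform the actual upload.
    ```

    After the upload, the token in the job-start response can be used to poll import status until the job completes.
    
    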

    Effective Date

    Friday, April 24, 2026

    Details

    Due to infrastructure changes that improve platform stability for Datatables, a successful upload of Datatable rows now depends on HTTP headers that are allocated per upload session and must be sent with the upload request. These values cannot be inferred from the URI alone; they must be returned by the API and applied by the caller when uploading to uploadURI.

    The existing endpoint returned only the URI, so clients following the old contract would upload without the required headers, and those uploads would fail.

    Customer Impact

    After you call the new import job endpoint, the response includes an uploadURI and uploadHeaders.

    To send your CSV file:

    1. Send a PUT request to the uploadURI returned in the response.
    2. Apply every header from uploadHeaders to that PUT request (names and values must match what the API returned).
    3. Use the CSV file bytes as the request body, as required by your import documentation.

    The upload succeeds only if you send a PUT to uploadURI and include all of the uploadHeaders from the response; omitting or altering any of them will cause the upload to fail.
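    Because omitting or changing any required header fails the upload, a client can defensively verify the headers it is about to send against uploadHeaders before issuing the PUT. This check is an illustration, not part of the API, and the header names and values are placeholders:

    ```python
    def headers_complete(required, sent):
        """True only if every header in `required` appears in `sent` with an
        identical value (header names compared case-insensitively)."""
        sent_ci = {name.lower(): value for name, value in sent.items()}
        return all(sent_ci.get(name.lower()) == value
                   for name, value in required.items())

    # Placeholder uploadHeaders for the example.
    required = {"Content-Type": "text/csv", "x-example-session": "abc"}

    ok = headers_complete(required, {"content-type": "text/csv", "X-Example-Session": "abc"})
    missing = headers_complete(required, {"Content-Type": "text/csv"})  # session header omitted
    ```

    Header names are compared case-insensitively because HTTP treats them that way, but the values must match the API's response exactly.
    
    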

    The POST /api/v2/flows/datatables/{datatableId}/import/jobs endpoint will be removed after the transition period, once usage has dropped to zero.

    Impacted Resources

    POST /api/v2/flows/datatables/{datatableId}/import/jobs

    Issue References

    CW-4335

    Contacts

    @Jordon McGowan  

    Please reply to this announcement with any questions so that the wider developer community can benefit from the discussion. We encourage you to use this thread before contacting the designated person directly. Thank you for your understanding.