Legacy Dev Forum Posts


Upload CSV almost but not quite


    Posted 06-05-2025 18:08

    AllanK | 2023-03-10 07:32:18 UTC | #1

    Hi,

    So I'm triggering an import job to get a URI - that seems to be working fine; it returns a URL to POST to.

    Running that from a function library (I'll post it below).

    However, when trying to upload the CSV, I'm getting a 200 response but it appears it doesn't like my CSV. I'm trying to append data to an existing data table.

    This is a simple 2-column table, with phone (expressed as tel:+......) and inin-outbound-id.

    Here are the first couple of rows (these numbers have been randomised):

        cat ../tempfiles/outboundlists/outboundNums.csv
        phone,inin-outbound-id
        tel:+61428462115,9067161e609325abc7895de2ada21506
        tel:+61428461645,5f395e0f730453021ff176996b20fb91
        tel:+61428461897,6df7771114358d70455a6760c0a37ac1
        tel:+61428461778,91f38ff0f017139c2011565b032dc6af

    Here's my upload code

        url = uploadURI

        try:
            headers = get_token(role)  # this returns a full header dict from a function I built
            files = {'file': open(fileName, 'rb')}  # fileName refers to where I saved the file
            # (also tried files={'file': (fileName, open(fileName, 'rb'), 'text/csv')})
            headers = {'Authorization': 'Bearer ' + auth_token}
            r = requests.post(url, files=files, headers=headers, verify=True)
            print(r.status_code)
            print(r.headers)
            print(r.content)
        except Exception as er:
            print('exception')
            traceback.print_exc()

    Here's the response from the status request. So it looks like I'm sending the right headers, but something is missing. BTW, the CSV was created from a dataframe, so if I can avoid exporting to CSV that would be awesome.

        {
          "id": "14619592-f5d1-44b1-b0ba-5fb9c3f3cd33",
          "owner": {
            "id": "9f73e8dd-2030-449e-80c5-ff0b1e8584c2",
            "selfUri": "/api/v2/users/9f73e8dd-2030-449e-80c5-ff0b1e8584c2"
          },
          "status": "Succeeded",
          "dateCreated": "2023-03-10T06:46:35Z",
          "uploadURI": "https://apps.mypurecloud.com.au/uploads/v2/datatables?additionalInfo= [cut out the rest from here]
          "importMode": "Append",
          "errorInformation": {
            "message": "Success",
            "code": "SUCCESS",
            "status": 200,
            "messageWithParams": "Success",
            "details": [],
            "errors": [
              {
                "message": "Import failure at item 1 with key \"?\"",
                "code": "FLOWSDATATABLESIMPORT_FAILURE",
                "status": 400,
                "messageWithParams": "Import failure at item {itemNum} with key \"{key}\"",
                "messageParams": {
                  "itemNum": "1",
                  "key": "?"
                },
                "details": [],
                "errors": []
              },

    and the trigger call

        def triggerDataTableImport(role, tableID, mode):
            api_token = get_token_sdk(role)
            api_instance = PureCloudPlatformClientV2.ArchitectApi(api_token)
            if mode == "ReplaceAll":
                body = {"importMode": "ReplaceAll"}
            else:
                body = {"importMode": "Append"}

            try:
                api_response = api_instance.post_flows_datatable_import_jobs(tableID, body)
                return api_response.upload_uri, api_response.id
            except ApiException as e:
                print("Exception when calling ArchitectApi->post_flows_datatable_import_jobs: %s\n" % e)
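    Since the CSV is already coming from a dataframe, one way to skip the temp file entirely is to serialise it straight into memory and hand requests a (filename, bytes, content-type) tuple. A hedged sketch, assuming pandas, with the sample column names and randomised values from this post; uploadURI and auth_token would come from the trigger call and token helper above:

        import io

        import pandas as pd
        # import requests  # uncomment to actually perform the upload

        # Sample rows matching the table described above (values randomised)
        df = pd.DataFrame({
            "phone": ["tel:+61428462115", "tel:+61428461645"],
            "inin-outbound-id": ["9067161e609325abc7895de2ada21506",
                                 "5f395e0f730453021ff176996b20fb91"],
        })

        # Serialise to an in-memory buffer instead of a file on disk
        buf = io.StringIO()
        df.to_csv(buf, index=False)
        csv_bytes = buf.getvalue().encode("utf-8")

        # requests accepts (filename, bytes, content-type), so no open() is needed
        files = {"file": ("outboundNums.csv", csv_bytes, "text/csv")}
        # r = requests.post(uploadURI, files=files,
        #                   headers={"Authorization": "Bearer " + auth_token})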


    AllanK | 2023-03-10 19:54:48 UTC | #2

    BTW - I know there is the method of inserting row by row, but this job will run multiple times a day and may have quite a lot of data, so the preference is clearly to upload a CSV.
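    For comparison, the row-by-row route goes through POST /api/v2/flows/datatables/{datatableId}/rows. A minimal sketch (the helper names and row shape here are mine, not from the thread) - fine for trickle updates, but the bulk CSV import scales better for a job like this:

        import requests

        def row_url(base, table_id):
            # Build the per-row insert endpoint for a given datatable
            return f"{base}/api/v2/flows/datatables/{table_id}/rows"

        def insert_row(base, token, table_id, row):
            # row is a dict keyed by the datatable's column names,
            # e.g. {"key": "...", "phone": "tel:+..."}
            r = requests.post(row_url(base, table_id), json=row,
                              headers={"Authorization": f"Bearer {token}"})
            r.raise_for_status()
            return r.json()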

    FWIW - here is how I generate the headers:

    Authentication functions

        import base64
        import configparser
        import sys

        import requests
        import PureCloudPlatformClientV2
        from PureCloudPlatformClientV2.rest import ApiException

        connconf_obj = configparser.ConfigParser()
        connconf_obj.read("configs/connection.cfg")

        def get_token(role='devReportingRed'):
            role = 'devReportingRed' if role is None else role
            ENVIRONMENT = 'mypurecloud.com.au'  # eg. mypurecloud.com

            env_params = connconf_obj[role]
            roleFunction = env_params["function"]
            print(f"Role function is {roleFunction}")

            CLIENT_ID = env_params["id"]
            CLIENT_SECRET = env_params["secret"]
            if not CLIENT_ID:
                print('Role specified not found, using reporting read only from dev')
                CLIENT_ID = '******'
                CLIENT_SECRET = '********'

            authorization = base64.b64encode(
                bytes(CLIENT_ID + ":" + CLIENT_SECRET, "ISO-8859-1")).decode("ascii")
            request_headers = {
                "Authorization": f"Basic {authorization}",
                "Content-Type": "application/x-www-form-urlencoded",
            }
            request_body = {"grant_type": "client_credentials"}

            response = requests.post(f"https://login.{ENVIRONMENT}/oauth/token",
                                     data=request_body, headers=request_headers)
            if response.status_code == 200:
                print("Processing Request")
            else:
                print(f"Failure: {response.status_code} - {response.reason}")
                sys.exit(response.status_code)

            response_json = response.json()
            requestHeaders = {
                "Authorization": f"{response_json['token_type']} {response_json['access_token']}"
            }
            return requestHeaders


    Jerome.Saint-Marc | 2023-03-10 19:54:50 UTC | #3

    Hello,

    Your csv file (at least the column names) doesn't match what you defined for your Architect Datatable.

    A Datatable is defined with a Reference Key (attribute/field name = "key"). During the creation/configuration of the Datatable, you are asked to enter/type a "Reference Key Label", and you can then define Custom Fields.

    Let's say I define a table with Reference Key Label = "inin-outbound-id" (using this field in my example to be sure it is unique), and then define a custom field with name "phone". The Datatable row contains 2 attributes: "key" (which contains the inin-outbound-id value - but the column name is still "key") and "phone" (the name of the custom field). My csv would then reference key and phone as the column names:

        phone,key
        tel:+61428462115,9067161e609325abc7895de2ada21506
        tel:+61428461645,5f395e0f730453021ff176996b20fb91
        tel:+61428461897,6df7771114358d70455a6760c0a37ac1
        tel:+61428461778,91f38ff0f017139c2011565b032dc6af
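    On the dataframe side, that fix boils down to renaming the column before export. A small sketch, assuming pandas and the column names from this thread:

        import pandas as pd

        df = pd.DataFrame({
            "phone": ["tel:+61428462115"],
            "inin-outbound-id": ["9067161e609325abc7895de2ada21506"],
        })

        # The Datatable's reference-key column is always named "key" internally,
        # whatever label was typed in the UI
        df = df.rename(columns={"inin-outbound-id": "key"})
        header = df.to_csv(index=False).splitlines()[0]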

    Regards,


    AllanK | 2023-03-10 19:57:13 UTC | #4

    Thanks @Jerome.Saint-Marc

    Had the column name wrong, of course!

    Thanks - my head was jelly by this stage.


    system | 2023-04-10 19:57:49 UTC | #5

    This topic was automatically closed 31 days after the last reply. New replies are no longer allowed.


    This post was migrated from the old Developer Forum.

    ref: 18829