Genesys Cloud - Main


Discussion Thread View
  • 1.  Automatic export of data tables

    Posted 02-12-2025 23:50
    Hello everyone,
    I understand that it is possible to export data manually by selecting the target data table and clicking the "Export Data" button.
    However, I am looking for a way to export the data automatically, rather than manually.
    Specifically, I would like a specific data table to be exported automatically at 13:00 every day.
     
    I would appreciate any advice, including whether this is possible or not.
    Best regards,

    #API/Integrations
    #ArchitectureandDesign
    #Reporting/Analytics

    ------------------------------
    Matsumoto Shun
    Unknown
    ------------------------------


  • 2.  RE: Automatic export of data tables

    Posted 02-13-2025 08:59

    Hello Matsumoto Shun,

    This doesn't seem to be possible currently. I would recommend raising an idea on the Genesys Cloud Product Ideas Lab so that it can be considered for future development.



    ------------------------------
    Sam Jillard
    Online Community Manager/Moderator
    Genesys - Employees
    ------------------------------



  • 3.  RE: Automatic export of data tables

    Posted 02-13-2025 10:39

    You have a couple of solutions here.

    1. Export schedule. You can schedule ANY export on a recurring basis. The results can be delivered to your inbox or emailed to up to 10 addresses, and there are many scheduling options: Scheduled Exports view - Genesys Cloud Resource Center
    2. Create static link. This lets you generate a link to the displayed data, refreshed on a schedule as described above; a BI tool such as Power BI can then pull that data on its own schedule for consumption outside Genesys: Generate static link - Genesys Cloud Resource Center
    3. Export view API. There is an API, which could be scheduled from a web service or a batch file, that can export views from Genesys Cloud: https://developer.genesys.cloud/devapps/api-explorer#post-api-v2-analytics-reporting-exports
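
    As a rough illustration of option 3, a script would POST a job request body to that endpoint. Here is a minimal sketch in Python; the field names and values are assumptions based on the API explorer, so check the endpoint's schema before relying on them:

    ```python
    import json

    # Sketch of a request body for POST /api/v2/analytics/reporting/exports.
    # Field names and values are assumptions -- verify against the API explorer.
    def build_export_request(view_type, interval, name="Daily 13:00 export"):
        """Assemble the JSON body for a view export job."""
        return {
            "name": name,          # label shown on the resulting export
            "timeZone": "UTC",     # time zone the interval is interpreted in
            "exportFormat": "CSV", # CSV output, like the manual export
            "interval": interval,  # ISO-8601 date range to export
            "viewType": view_type, # which analytics view to export
        }

    body = build_export_request(
        "QUEUE_PERFORMANCE_SUMMARY_VIEW",  # example view type (assumption)
        "2025-02-12T00:00:00.000Z/2025-02-13T00:00:00.000Z",
    )
    print(json.dumps(body, indent=2))
    ```

    A scheduler (cron, Task Scheduler, etc.) could send this body with an OAuth bearer token at 13:00 each day, then poll the returned job for the finished file.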


    ------------------------------
    Robert Wakefield-Carl
    ttec Digital
    Sr. Director - Innovation Architects
    Robert.WC@ttecdigital.com
    https://www.ttecDigital.com
    https://RobertWC.Blogspot.com
    ------------------------------



  • 4.  RE: Automatic export of data tables
    Best Answer

    Posted 30 days ago

    I developed a PowerShell script that I manually run before deployments, which gets a list of all of the data table details from an org. It then loops through each data table to download the .csv file with all the records. I'm sure you could place something like my PS script on a server and run it with Task Scheduler, or perhaps run it in an Azure pipeline.
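
    The shape of that list-then-download loop, sketched generically in Python with a stubbed client (this is an illustration, not the actual script; `fetch` stands in for authenticated Platform API calls, and the endpoint paths are assumptions):

    ```python
    import csv

    # Illustrative sketch of the loop described above: list every data
    # table in the org, then save each table's rows as a CSV file.
    # fetch() is a stand-in for an authenticated HTTP GET.
    def export_all_tables(fetch, out_dir="."):
        """Download every data table as a CSV file; return the saved paths."""
        tables = fetch("/api/v2/flows/datatables")["entities"]
        saved = []
        for table in tables:
            rows = fetch(f"/api/v2/flows/datatables/{table['id']}/rows")["entities"]
            if not rows:              # nothing to write for empty tables
                continue
            path = f"{out_dir}/{table['name']}.csv"
            with open(path, "w", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=sorted(rows[0]))
                writer.writeheader()
                writer.writerows(rows)
            saved.append(path)
        return saved
    ```

    Run on a schedule, this achieves the daily 13:00 export the original question asked about.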



    ------------------------------
    Carlos Albor
    Principal PS Consultant
    ------------------------------



  • 5.  RE: Automatic export of data tables

    Posted 30 days ago
    Edited by Jose Albor 30 days ago

    This uses the gc.exe command-line application. Start by setting your region, OAuth client ID, and client secret variables, then get the list of data tables from the API.

    $varResponse = .\gc.exe flows datatables list -a --clientid $ClientId --clientsecret $ClientSecret --environment $Region -i --outputformat JSON --profile gc-cli | ConvertFrom-Json 

    Then loop through the response objects.

    $varDataTableJob = .\gc.exe flows datatables export jobs create $($_.id) --clientid $ClientId --clientsecret $ClientSecret --environment $Region --outputformat JSON --profile gc-cli | ConvertFrom-Json

    Use a Do {this operation} Until {this condition is met} loop to poll the API below, with a 5-second pause between attempts, until the export job completes.

    $varDataTableStatus = .\gc.exe flows datatables export jobs get $($_.id) $($varDataTableJob.id) --clientid $ClientId --clientsecret $ClientSecret --environment $Region --outputformat JSON --profile gc-cli | ConvertFrom-Json
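
    That poll-with-pause pattern, sketched generically in Python (`check_status` is a stand-in for the `flows datatables export jobs get` call above; per the response described below, a populated downloadURI is taken to mean the job is done):

    ```python
    import time

    # Generic poll-until-done loop, equivalent to the Do/Until described
    # above: query the job status, pause, and repeat until it finishes.
    def wait_for_export(check_status, delay=5.0, max_attempts=60):
        """Poll the export job until its response carries a downloadURI."""
        for _ in range(max_attempts):
            status = check_status()
            if status.get("downloadURI"):  # job finished; the URI is ready
                return status
            time.sleep(delay)              # 5-second pause between polls
        raise TimeoutError("export job did not complete in time")
    ```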

    You can then use the downloadURI from the response to download the file. I set $varDownloadID = $varDataTableStatus.downloadURI

    Set your $filepath (save directory + filename) variable.

    $varfile = .\gc.exe downloads get $($varDownloadID) --issueRedirect=false --clientid $ClientId --clientsecret $ClientSecret --environment $Region --profile gc-cli | ConvertFrom-Json 

    I then leveraged a PowerShell System.Net.WebClient operation to download the file, using the $varfile.url and $filepath values set earlier in my script. Keep in mind that this WebClient operation can be unreliable when downloading files, giving the script a false positive that it saved the file when it didn't actually save. To overcome this problem, I placed the WebClient portion inside a Do {this} Until {this condition is met} loop, which keeps attempting to download the data table until the file at $filepath actually exists.
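
    That retry-until-the-file-exists wrapper looks roughly like this in Python (`download` is a stand-in for the WebClient call; names are illustrative):

    ```python
    import os, time

    # Retry wrapper around an unreliable download, mirroring the Do/Until
    # described above: verify the file on disk rather than trusting the call.
    def download_with_retry(download, filepath, delay=1.0, max_attempts=10):
        """Call download(filepath) until the target file really appears."""
        for attempt in range(1, max_attempts + 1):
            download(filepath)           # may silently fail to save
            if os.path.exists(filepath): # check disk, not the return value
                return attempt
            time.sleep(delay)
        raise RuntimeError(f"failed to save {filepath} after {max_attempts} tries")
    ```

    Checking the filesystem instead of the downloader's own result is what defeats the false-positive behavior described above.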

    End of loop. 



    ------------------------------
    Carlos Albor
    Principal PS Consultant
    ------------------------------


