Genesys Cloud - Main


  • 1.  Parallelize API / Data Actions

    Posted 09-15-2025 08:43

    Hi all

We have around 10-12 API data actions pulling data from systems as a call is answered by the IVR: fraud score, customer profile, language preference, tenure, repeat call counter, etc.

We have optimized each one down to a few hundred milliseconds, but run in series they add up, and it can take 4 or 5 seconds to accumulate everything before the flow can make a routing decision.

Question - is there a way to parallelize the data actions?  What have other people done?  We are investigating collapsing a few smaller calls into one larger one, but we can't really leave any out or push them later in the flow, as we are aiming for a good level of personalization in the IVR.

    Any tips or tricks?   Would this be something to log on the ideas platform? 

    thanks
    Alex


    #API/Integrations

    ------------------------------
    Alex Whyatt
    Scrum Master
    ------------------------------


  • 2.  RE: Parallelize API / Data Actions

    Posted 09-15-2025 11:35

    Hi Alex,

    I don't think it's possible to run the APIs in parallel, but I will let others confirm and share any tips/tricks they have learned.  It might also be a good idea to ask this in the Genesys Cloud Developer Community, as they may have some good ideas for optimizing your process.



    ------------------------------
    Sam Jillard
    Online Community Manager/Moderator
    Genesys - Employees
    ------------------------------



  • 3.  RE: Parallelize API / Data Actions
    Best Answer

    Posted 09-15-2025 12:01

    Hello,

    Just to share an idea - have you had a look at Genesys Functions data actions? A Function is basically an AWS Lambda (Node.js only).

    I can't say whether that would save time compared to what you are doing today. It would imply the usual cold-start time of an AWS Lambda (if your traffic is low and the Lambda is not "reused"). But you could try combining your 10-12 API calls into a single Genesys Function, and at least save the overhead of invoking 10-12 separate Data Actions.

    That also depends on the type of auth your existing data actions use. If it is basic or user auth, running them from the Genesys Function wouldn't "consume" any more. If it is user-defined OAuth (i.e. Client Credentials Grant), you would lose the token caching that exists with the Web Services Data Actions.
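    To soften that last point, a Function can do a rough version of that caching itself: hold the token in module scope so it survives across "warm" invocations of the same Lambda instance. A minimal sketch, where `fetchTokenFn` is a hypothetical stand-in for your real token-endpoint call (not a Genesys API):

```javascript
// Sketch: cache an OAuth Client Credentials token across "warm" Lambda
// invocations by keeping it in module scope. fetchTokenFn is a hypothetical
// parameter standing in for your real token-endpoint request.
let cached = null; // { token, expiresAt } - survives while the instance is warm

async function getToken(fetchTokenFn, now = Date.now()) {
  // Reuse the cached token until 30 seconds before it expires.
  if (cached && now < cached.expiresAt - 30000) {
    return cached.token;
  }
  // Cache miss or near expiry: fetch a fresh token and remember it.
  const { access_token, expires_in } = await fetchTokenFn();
  cached = { token: access_token, expiresAt: now + expires_in * 1000 };
  return cached.token;
}
```

    This only helps while the Lambda stays warm; a cold start still pays for one token fetch.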

    Also note that the limit on concurrent Genesys Functions executions is lower than for other data actions - right now it is 25.

    Regards,



    ------------------------------
    Jerome Saint-Marc
    Senior Development Support Engineer
    ------------------------------



  • 4.  RE: Parallelize API / Data Actions

    Posted 09-15-2025 13:15

    Hello Alex,

    I have thought about that in the past, and I do believe there is something you can try.

    I'm not saying it's a good solution or anything (probably isn't to be honest), but you can take a look at it and see if it helps.

    You could trigger a Workflow to run in parallel with your Inbound Call Flow, maybe on the "Customer Start" event.

    In the Workflow you can run all the APIs (one by one, not in parallel), while in your Call Flow you play a "welcome prompt" or something similar with barge-in disabled. That way you can be sure the customer spends at least X seconds listening to that first prompt (say 5, or 10), and while that happens the Workflow should be able to finish running all the APIs. (So you should make sure this first prompt is long enough that all the API requests have always finished by the end of it.)

    When the Workflow is done, it would add all the relevant information as Participant Data (multiple attributes, or a single one with everything in it).

    And then the Inbound Call Flow would use "Get Participant Data" to get all that information and set it on the necessary variables.

    ---------

    I don't think it's a great solution... but it's something you could try.

    The reason it isn't a great solution is that the Workflow and the Call Flow have no knowledge of each other running. (So the Call Flow doesn't actually know that the Workflow started and is running the APIs... and the Workflow doesn't know if the Call Flow is still running or if the customer has already hung up, etc.)

    But I don't think there is any other option: you can't really play audio in the call flow while those 10 APIs run, because the prompt would be interrupted each time an API call finished.

    Let me know if this helps in any way, or if I misunderstood your needs!



    ------------------------------
    Marcello Jabur
    ------------------------------



  • 5.  RE: Parallelize API / Data Actions

    Posted 09-16-2025 10:42

    @Marcello Jabur, while I applaud the ingenuity of the approach, I'm not sure I'd hang my hat on that; Triggers are not guaranteed to run in real time, and if there is a delay in system processing it's entirely possible that the trigger runs later than your attempt to fetch participant data.  I also believe that the flow consumes the conversation at the ingest/start of the flow and works off the data available on the conversation at that point; if the participant data on the conversation is updated while the flow is executing, I don't believe the flow has access to those updated attributes.  I have seen people resolve this last point by transferring to another flow to "reset" the participant data to current, although I can't advise as to whether this is a good strategy.

    IMHO, the most effective way to solve this is to leverage Jerome's suggestion.  Functions were specifically designed to handle scatter/gather use cases: a single Function call can parallelize multiple HTTP calls and aggregate all of the responses into one cohesive response that gets returned to the flow.  It takes a little more setup - getting all the appropriate credentials onto the integration, structuring the request template, and developing/deploying the Node.js code to handle all the HTTP requests - but once it's complete it becomes a really elegant way to handle the use case.  The action ends up named something like "Customer Intake Query", which makes its usage within the flow easy to understand, with all the outputs bound to flow variables that can be leveraged later on.  Depending on the particulars, this might need to be split into two Functions that run serially, simply to fit under the package size limitations (see: https://help.mypurecloud.com/articles/add-function-configuration/; package size must remain below 256 MB), but that is still substantially better and more readable in the flow layout than 10-12 serial API requests.



    ------------------------------
    Richard Schott
    Product Manager
    ------------------------------



  • 6.  RE: Parallelize API / Data Actions

    Posted 09-16-2025 10:53

    Hi Richard,

    Yes, I absolutely agree that my solution is probably not a great idea, haha. I was just trying to give him some ideas. It could be useful for someone who wants to "parallelize" things like that, but as you said, it might not be the greatest approach.

    But regarding your point about getting the current participant data (that the flow might not have access to it): if that's the case, it could be addressed by using a Data Action (instead of the "Get Participant Data" component). Through the action, I believe the Participant Data returned should be the latest, so this would avoid having to transfer to another flow.

    Also, can you expand on what you mentioned about "triggers are not guaranteed to run in real time"?

    Is that a common issue, and is it expected to stay that way? I wasn't aware of it.



    ------------------------------
    Marcello Jabur
    ------------------------------



  • 7.  RE: Parallelize API / Data Actions

    Posted 09-16-2025 11:00

    @Marcello Jabur see: https://help.mypurecloud.com/faqs/how-quickly-a-trigger-invokes-the-workflow/.  Most of the time triggers run near-instantaneously, but occasionally internal systemic conditions may slow their processing.  This is partly due to the distributed nature of Trigger processing, as well as the inherent nature of the pub/sub eventing process.  In practical terms it's not likely to make a difference, but it's worth noting, especially for a use case like this, where we're aggregating information that will be used throughout the conversation lifecycle, so it just has to work.



    ------------------------------
    Richard Schott
    Product Manager
    ------------------------------



  • 8.  RE: Parallelize API / Data Actions

    Posted 09-16-2025 11:04

    Interesting, Richard, thank you for sharing! That's good to keep in mind.

    I guess that confirms my idea wouldn't really be reliable!



    ------------------------------
    Marcello Jabur
    ------------------------------



  • 9.  RE: Parallelize API / Data Actions

    Posted 09-16-2025 11:10

    Keep in mind it's not that it will be unreliable; it's just not a 100% guarantee.  Coupled with your point that these flows run fully independently, without any knowledge of each other, it gives me pause to recommend such an approach for a mission-critical operation, especially when there are other options available that avoid those potential pitfalls.



    ------------------------------
    Richard Schott
    Product Manager
    ------------------------------