Oracle Digital Assistant Design Camp: Testing Strategies
From Frank Nimphius
How well is your digital assistant performing? How smart is it? Is it getting smarter? The only way to know is by observing, measuring, and testing.
In this design camp we will dive into some of the strategies for proving the worth of your digital assistant: from the NLP to the custom components to the conversations themselves. We will show you specific testing strategies to ensure the bot correctly understands natural language - and also ensure that it doesn't incorrectly resolve phrases that are out of domain or intended for different skills.
We will demo some of the best practices we have implemented in Artie, where we can verify NLP changes through 180,000 tests which run automatically whilst we grab a coffee! And then you'll see how we can do "what if" analysis on how best to define your confidence thresholds.
We'll also take a dip into some of the new features of ODA to add more power to your testing strategies.