A search for chatbot providers yields thousands of results, leaving organizations with a difficult selection process to find the product that suits their requirements. The top selection criteria are scalability, flexibility, total cost of ownership, a wide range of interface options, AI (learning and training) algorithms, and robust analytics. Beyond these criteria, a chatbot with specialized domain training is a must, as it allows quick deployment of a well-trained and tested brain.
# Choice of AI (Enterprise vs. Cloud)
Developing the algorithms that power a good AI is critical. Given the limited control over customization available with open-source AI platforms such as Wit.ai and Api.ai, a ready-to-deploy open-source chatbot was scoped out. Instead, the architecture was developed from scratch, with the best algorithmic libraries picked, tailored, or created fresh to suit the design.
Medium and large organizations prefer to host the AI on premise because of the data security policies they are bound by. Developing the AI platform in-house therefore gave the advantage of supporting both on-premise and cloud deployments. Infrastructure requirements were also kept minimal to reduce total cost of ownership.
Large organizations often prefer HR applications from different vendors to enable best-of-breed practices. Hence, ready connectors to well-known HRMS products are a must when building a chatbot. H# is built with ready connectors for well-known HRMS platforms and accepts all open web service formats. Its ability to integrate with legacy systems also makes it easy to connect with third-party systems.
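The connector idea can be sketched as a small abstraction layer: the bot talks to one interface, and each HRMS (REST, SOAP, legacy) gets its own implementation behind it. The class and method names below are illustrative assumptions, not H#'s actual API.

```python
from abc import ABC, abstractmethod


class HRMSConnector(ABC):
    """Common interface the chatbot uses to talk to any HRMS backend."""

    @abstractmethod
    def get_employee_profile(self, employee_id: str) -> dict:
        """Fetch the employee's profile (grade, location, gender, status...)."""

    @abstractmethod
    def get_leave_balance(self, employee_id: str) -> dict:
        """Fetch remaining leave balances, keyed by leave type."""


class RestHRMSConnector(HRMSConnector):
    """Connector for an HRMS that exposes a JSON/REST web service."""

    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")

    def get_employee_profile(self, employee_id: str) -> dict:
        # A real connector would issue an HTTP GET against self.base_url;
        # a stub record is returned here for illustration.
        return {"id": employee_id, "grade": "B2", "location": "Mumbai",
                "gender": "M", "status": "confirmed"}

    def get_leave_balance(self, employee_id: str) -> dict:
        # Stubbed balances in place of a live web service call.
        return {"casual": 6, "sick": 4, "earned": 12}
```

Swapping HRMS vendors then means writing one new connector class; the bot's dialogue logic stays untouched.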
Most chatbots converse only when a human initiates the exchange. However, there are many automated or manual events that may need to be pushed to employees, such as surveys, event request notifications, and important document uploads. This feature was built to accommodate such triggers, after which the conversation flows in the normal fashion.
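One way to picture this trigger mechanism is a small event bus that delivers system events into each employee's chat session as if the bot had spoken first. This is a minimal sketch under assumed names, not the product's actual implementation.

```python
from typing import Callable, Dict


class EventBus:
    """Routes automated or manual HR events into employees' chat sessions."""

    def __init__(self) -> None:
        # employee_id -> callback that delivers a bot message to that session
        self.subscribers: Dict[str, Callable[[str], None]] = {}

    def subscribe(self, employee_id: str, deliver: Callable[[str], None]) -> None:
        self.subscribers[employee_id] = deliver

    def push(self, employee_id: str, message: str) -> None:
        """Push an event (survey, notification, document upload) as a bot message."""
        deliver = self.subscribers.get(employee_id)
        if deliver:
            deliver(message)


bus = EventBus()
inbox = []                       # stands in for the employee's chat window
bus.subscribe("E123", inbox.append)
bus.push("E123", "A new appraisal survey is waiting for you.")
# The employee then replies in the chat window, and the exchange continues
# through the bot's normal dialogue flow.
```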
One feature of a chatbot is responding to queries about documents. The ideal response picks out relevant sections of a document and displays them to the user. However, for something like a leave policy, the applicable clauses can depend on many conditions: the employee's probation status, grade or band, location, resignation status, gender, and so on.
Since all this data is available in the HRMS, the bot can read the employee's profile and give specific rather than generic responses. For example, when a male employee asks "Am I eligible for Maternity leave?", the response can be tailored to say, "Sorry, Maternity Leave is only eligible to female employees."
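The maternity-leave example above can be sketched as a small rule that consults the profile fetched from the HRMS before answering. The rules and field names here are illustrative assumptions, not the actual policy engine.

```python
def answer_leave_query(profile: dict, leave_type: str) -> str:
    """Tailor a policy answer using the employee's HRMS profile.

    `profile` is assumed to carry at least `gender` and `status`;
    the branching below is a hypothetical rule set for illustration.
    """
    if leave_type == "maternity":
        if profile.get("gender") != "F":
            return "Sorry, Maternity Leave is only eligible to female employees."
        if profile.get("status") == "resigned":
            return "Maternity Leave does not apply after resignation; please contact HR."
        return "Yes, you are eligible for Maternity Leave."
    # Fall back to generic document lookup for other leave types.
    return "Let me pull up the relevant policy section for you."


# A male employee asks the question from the example above:
print(answer_leave_query({"gender": "M", "status": "confirmed"}, "maternity"))
```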
Various analytics reports and interfaces were built to understand bot flows, usage, engagement time, and so on. All of these ultimately come down to two important aspects:
1. Improving the bot's learning capabilities: with self-learning and self-training, the bot's mind keeps getting better, which means less manual training is needed after a certain point in time.
2. Understanding user expectations: the functions a bot responds to may initially be limited by the stated business requirements. Because the bot accepts free-text messages, users may ask for functions beyond that scope, and these requests are recorded by the analytics tool, helping the organization understand what users actually expect.
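The second aspect can be sketched simply: whenever a free-text query does not match a known intent, log it, and let analytics surface the most frequent unmet requests. The intent names and handler below are hypothetical, for illustration only.

```python
from collections import Counter
from typing import Optional

# Intents the bot was scoped to handle (assumed examples).
KNOWN_INTENTS = {"leave_balance", "payslip", "holiday_list"}

# Unmatched free-text queries, counted for the analytics dashboard.
unmatched: Counter = Counter()


def handle_query(text: str, intent: Optional[str]) -> str:
    """Answer a matched intent, or record the query for analytics review."""
    if intent in KNOWN_INTENTS:
        return f"Handling intent: {intent}"
    unmatched[text.strip().lower()] += 1  # logged for later analysis
    return "I can't help with that yet, but I've noted your request."


handle_query("Can I get a gym membership?", None)
handle_query("can i get a gym membership?  ", None)
# unmatched.most_common() now shows the organization which new
# functions users are asking the bot for.
```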
Click here to continue reading Godwin Pinto’s article.