There is no formal definition of KPO. Generally, the term refers to outsourcing activities where the services delivered are more complex and service providers need to possess deep domain knowledge or higher educational qualifications. KPO services already outsourced to India include data analytics, content management, research and information services, animation, biotech and pharmaceutical research, and medical and health services. However, the standout area for KPO out of India is research: equity and financial research as well as market research.
Incorporated in 2000, eClerx claims to be India's first and only publicly listed KPO company. Its clientele includes Fortune/FT 500 companies in the high-technology industry (hardware, software and services), travel and leisure, media and entertainment, online and bricks-and-mortar retail, investment banking and hedge funds, and industrial manufacturing and distribution.
Within the broad KPO sector, eClerx has tried to identify areas that it believes are both large and growing quickly. The two businesses it has today fulfill both these criteria. One is banking and financial services, where it essentially provides middle- and back-office support to large banks globally. The other is sales and marketing support, where it provides online operational support to large manufacturing and retail companies.
Here are excerpts from an interaction with PD Mundhra, co-founder and executive director, eClerx.
GS: How has KPO evolved over time? Have you noticed any shifts in this industry?
PD: We have been in this business for about ten years. In the early days, after the dot-com boom, if you were able to provide these services and offer clients the cost reduction of operating out of India, that alone was very compelling and was sometimes enough to win mandates and business. As time passed, clients became more sophisticated, more vendors emerged and the industry matured, so pure cost arbitrage was no longer sufficient to win mandates or business. One had to provide better execution than what the client could achieve locally: cost savings along with better quality.
Now, in the most recent wave of evolution, what we see is that apart from low cost and high quality, clients expect insights into how operations can be enhanced. One has to partner with clients to evolve services to meet their emerging needs. From that perspective, the industry has grown not just in quantity but also in maturity and sophistication.
GS: As the industry grows, has it extended its presence in other geographies? Do you see more markets being added?
PD: I would say yes, because there are two or three trends that have broadened the addressable market for companies like us. One is the emergence of new technologies that make work less and less tied to a particular location. For example, with the cloud, companies and clients are hosting applications and data on large internet platforms, where the information can be accessed by the company's own employees or by vendors like us. So the shift to the cloud makes work less tied to a physical location, and that helps outsourcing.
The other thing is that, with the passage of time, more and more case studies are available showing how organizations have moved fairly complex and critical activities offshore with very successful results. When companies look around and see their competitors benefiting from this approach, they are more motivated to make those experiments themselves. So more clients are trying offshoring and more vendors are emerging, which also increases the addressable market.
GS: There are many security concerns that buyers have while outsourcing data. Comment.
PD: It is a very legitimate and valid concern, as many companies, including ours, handle very sensitive data. The industry has become more aware of the risks of data leakage, and therefore more effort is going into network architectures and security features that minimize its probability. For example, to enter any of our facilities one must pass two or three layers of authentication, including biometrics, and then two layers of username and password before logging on to our systems.
And then the data never really comes down to the desktop. Employees work in an environment where they can view the data on their screens and process it, but they cannot save it to a hard drive, print it or email it. This setup is not unique to our company; almost all established firms are doing similar things. Vendors are trying to use network and security architectures that minimize the risk of data leakage.
The other thing people are becoming more conscious of is exposing each employee only to the bare minimum of data needed to perform their task, so that no single employee has access to a large amount of information. But does all of this reduce the probability of leakage to zero? It doesn't. I think the fact that we implement these architectures, and that we hold industry-standard certifications, is what gives clients confidence.