Mr Imran is a grounded man. Speak to him about anything tech, and he will instantly highlight how India is coming to the foreground of technology more than ever before. As CTO of E2E Networks, he has had his fair share of challenges and triumphs, leading its tech team for over a decade from the hi-tech city of Hyderabad.
We had a tête-à-tête with him when he recently visited Delhi for a conference.
Can you talk a little bit about what E2E’s vision is - about growing and scaling a Swadeshi AI-First Hyperscaler - and your journey through this process?
E2E’s vision is to build a complete suite of tools that makes it easier and more efficient to deploy modern-day AI workloads. There is also a strong focus on leveraging open source technologies, both for building the tech stack and for making available open source models that can be further trained and modified according to customers’ requirements. Ultimately, manually setting up infrastructure, monitoring it, and scaling it is a complete distraction from an AI company’s core job of building AI solutions. E2E aims to be the platform that takes care of all these underlying requirements and offers businesses a one-stop solution to train and deploy their AI models.
When did you join E2E? How has your journey been?
I joined E2E at the time of its inception, around 2010, and it has been a fulfilling journey, to say the least. We started by building VMs (virtual machines) manually; then we scaled. The journey from an IaaS (Infrastructure as a Service) offering to building scalable products like storage and networking, and then further into AI solutions, has been a tremendous challenge and a really enjoyable one.
Tech is the backbone of E2E. Tell us your experience of scaling a tech team as the company grew.
At E2E, we believe that our people are the core strength of the company. Across the board, we have team members who have spent more than eight years here, right from tech to sales to finance. So, naturally, the way we scaled up was to empower our best performers with more responsibility and then build teams around them.
In tech, we have benefited from our extensive experience in open source technologies and are able to build standardised solutions. Once you are in complete control of your technology stack using open source, it's easier to scale it as well.
How much do you rely on open-source solutions, and why?
It's an intentional choice to use standards-based open source solutions, simply because of the no-strings-attached availability and licensing. That is one of the core reasons behind our ability to scale. Secondly, we actually pay for enterprise support for a lot of our open source usage - not just for the support itself, but because, as a responsible consumer of open source, we want to show solidarity with the companies building open source products. Thirdly, it's easier to grow an open source product for availability, resiliency and scale, as there is a vibrant community to bounce ideas off.
How do you build an organization that can keep up with the pace of change that’s currently happening in AI?
Ultimately, it is people, people, and people. AI is a rapidly evolving paradigm, and we have now built teams who have worked on AI for four to five years. Here again, we leverage open source offerings like Kubeflow and other products, which are pretty much setting the standard for how to do AI as a service.
The selection of the tech stack itself is a major advantage for us. Add to this our ability to extract massive performance from infrastructure and networking, and we are able to deliver highly performant AI solutions to our customers.
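To make the "AI as a service" point concrete, here is a minimal sketch of the kind of workflow Kubeflow enables, assuming the open source Kubeflow Pipelines SDK (kfp v2); the component names and steps are illustrative placeholders, not a description of E2E's actual platform.

```python
# Minimal sketch, assuming the Kubeflow Pipelines SDK (kfp v2).
# Component and pipeline names are illustrative placeholders.
from kfp import compiler, dsl


@dsl.component
def train_model(epochs: int) -> str:
    # Placeholder training step; a real component would pull data,
    # train the model, and push the artifact to object storage.
    return f"model-trained-for-{epochs}-epochs"


@dsl.component
def deploy_model(model_ref: str):
    # Placeholder deployment step, e.g. rolling the model out behind an API.
    print(f"deploying {model_ref}")


@dsl.pipeline(name="train-and-deploy")
def train_and_deploy(epochs: int = 10):
    # Chain the steps: the training output feeds the deployment step.
    trained = train_model(epochs=epochs)
    deploy_model(model_ref=trained.output)


if __name__ == "__main__":
    # Compile to a YAML spec that a Kubeflow cluster can execute.
    compiler.Compiler().compile(train_and_deploy, "train_and_deploy.yaml")
```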
What kinds of technology problems do you think will drive the next wave of innovation?
The ability to onboard a huge number of users onto AI is where the biggest challenge and opportunity lies. The question is: how do we help data scientists build models at scale (read: PBs of data), deploy them at scale (say, as a publicly available API), and monetize those offerings? This is going to drive the next wave of innovation.
The current buzz around GPT and generative AI will drive a lot of users to AI and there will be a significant opportunity to streamline this process.
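As an illustration of the "deploy at scale behind a publicly available API" step, here is a minimal sketch assuming FastAPI and the Hugging Face transformers library; the model choice and endpoint are hypothetical examples, not E2E's offering.

```python
# Minimal sketch, assuming FastAPI and Hugging Face transformers.
# Model name and endpoint path are hypothetical, for illustration only.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Load a small open source model once at startup (illustrative choice).
generator = pipeline("text-generation", model="gpt2")


class Prompt(BaseModel):
    text: str
    max_new_tokens: int = 64


@app.post("/generate")
def generate(prompt: Prompt):
    # Run inference and return only the generated text.
    output = generator(prompt.text, max_new_tokens=prompt.max_new_tokens)
    return {"completion": output[0]["generated_text"]}

# Run with: uvicorn main:app --host 0.0.0.0 --port 8000
# (assuming this file is saved as main.py)
```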
What drives clients to E2E?
Although a large share of discovery happens because of our price-to-performance ratio, our customers have started appreciating the advantage of our being a strong tech company that leverages open source solutions to build services. We are also seeing significant interest in our GPU products, with A100, A30, L4 and other cards, as well as the soon-to-be-launched latest-generation cards.
How do you go about getting employees to begin to think a little differently about what the technologies are capable of, raising that technological imagination in the team?
A lot of the foundational change in how teams think comes from transparency about the company’s vision and strategy. We have always communicated our vision and core strategy to all stakeholders internally, and that gets the teams excited. At E2E, we don't build products in silos. It's common for a sales or finance person to join a tech demo and give suggestions based on the feedback they hear from our customers.
Of course, we invest in employee training programs, encouraging employees to attend conferences etc. to widen their technological worldview. In a nutshell, transparency breeds trust; trust fuels innovation; and innovation delivers customer value.
Could you relate some experiences with your team that have been special?
We have had deployments that stretched on for days. Watching my team members self-organise to guarantee 24/7 coverage for both deployment and support was deeply satisfying; it showed we had built a team aligned with the requirements of the company, rather than one driven just by me as the manager.
E2E is one of the largest Swadeshi AI-first hyperscaler platforms. What have been the three biggest challenges of building from India for the world?
The first one would be access to capital to build an AI-first platform as the existing hyperscalers are all able to deploy millions or billions of dollars in a short span of time.
Secondly, many nations have their own privacy laws, like the GDPR (General Data Protection Regulation) in Europe, and there are similar laws in the US as well, each imposing its own compliance restrictions. Being an Indian company domiciled in India, we of course have to adhere to Indian laws above all else. A data protection framework that creates a level playing field for Indian companies would be a good starting point.
Thirdly, we need to start training our students/professionals in generative AI so they can build products and models rather than just working in service companies.
Can you relate a couple of key incidents from your career at E2E that have transformed you as a leader?
Covid had a dramatic impact on the ability to lead a team, simply because you had to reimagine everything you were used to. Secondly, the transition to doing more products rather than simple IaaS meant we had to restructure the teams to be able to deliver them.
Can you tell us what you look for in an engineer before they are hired at E2E?
The ability to find solutions and demonstrate actual proficiency in code/ops, rather than just subject matter expertise, is a key differentiator. Other things include the ability to ask smart questions, communication skills, a track record of upgrading one's technical skills, and so on.
Finally, can you talk a bit about the future of the Indian tech ecosystem, and where you see it headed?
India is going to generate massive amounts of data, and the tech ecosystem will transition to building products with these large data sets. The fundamentals of the industry will change as AI penetrates academia further and trained machine learning engineers become available at scale. That will be an epochal moment for the Indian tech industry - to leverage that talent and build more products and services suited for a global landscape. I see us transitioning out of the services model and stepping directly into products.
Do you agree with Sam Altman’s comment on India’s chances of building a foundational AI company?
AI requires massive amounts of funding, simply because of the GPU compute required to train LLMs. Sam isn't far off the mark when he says that you cannot build a foundational AI company for 10 million dollars. The real momentum in building such companies will come with increased funding, invested over a longer period of time, for building foundational AI/AGI.
We have everything it takes in terms of tech talent, availability of infrastructure, etc. to be able to build such companies. The only thing we require is sustained funding over a longer cycle to leapfrog into a foundational AI company.