In a candid conversation with India Technology News, Mr Vivek Basavegowda Ramu dives deep into the imperative role of performance testing in enabling enterprises to ensure a secure environment and address scalability demands.
Here are the excerpts from the interaction:
- What has been the impact on performance testing of the rapid acceleration of digital initiatives and application releases?
Before I start with the answer, I would like to thank the India Technology News team for organizing this interaction. Coming back to your question: previously, whenever we had a product release, it used to happen once a quarter.
But now we are living in a time where releases to production happen as often as three times a day. Here you can see how important it is to have testing, and more importantly performance testing, done to make sure the product is resilient and stable; if not, things are not going to work in production for sure.
The timeline has shrunk to a very short duration. The impact is definitely there across the entire industry, and so it is in performance testing. The way we are trying to cope with this challenge, or the way performance testing has evolved to handle it, is by introducing automation wherever possible within the performance testing ecosystem.
And we are also venturing more into CI/CD, that is, Continuous Integration and Continuous Delivery, where whenever code is pushed, pre-configured scripts and performance testing activities run on their own, without anyone having to trigger them manually. That way, we ensure the product's performance is up to the mark.
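As an illustrative sketch of such an automated gate (not from the interview; the 500 ms budget, sample data and function names are hypothetical assumptions), a CI step can compute the 95th-percentile latency from a load-test run and fail the build when it exceeds a budget:

```python
import math


def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (in ms)."""
    ranked = sorted(samples)
    # nearest-rank method: the ceil(pct/100 * n)-th smallest value, 1-indexed
    rank = max(1, math.ceil(pct / 100 * len(ranked)))
    return ranked[rank - 1]


def performance_gate(latencies_ms, p95_budget_ms=500.0):
    """Return True when the run stays within the (illustrative) p95 budget."""
    return percentile(latencies_ms, 95) <= p95_budget_ms


if __name__ == "__main__":
    run = [120, 180, 210, 250, 300, 320, 410, 480, 510, 900]
    print("p95:", percentile(run, 95), "ms, pass:", performance_gate(run))
```

In a pipeline, a `False` result would simply fail the build step, so a performance regression blocks the release the same way a failing unit test would.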
The practices have also moved more towards the cloud, so infrastructure scaling is easier to handle. The mindset has changed, the practices have changed, and we are coping with the increased releases and digitalization.
- How does performance testing eliminate latency issues for enterprises, and which industry verticals are witnessing maximum demand for performance testing?
In the last 4-5 years, we have seen a lot of smart technologies evolve, especially those related to edge technologies like wearables, self-driving cars, smart homes, smart devices and much more.
So, these things have evolved a lot and have been increasingly incorporated into our day-to-day lives. If you consider Retail, Consumer, Automobile, Healthcare etc, you will notice technology has greatly advanced. It is right in front of us in day-to-day life. Even if we take Entertainment, be it Instagram or Facebook, the time we are willing to wait for content to load, the attention span we have, is really short. So it's all about speed now.
Even if you go grocery shopping these days in the US, you can see stores like Amazon Go where you really don't have to pay at the end of the trip: you just pick up the groceries, and with the help of cameras, sensors and inventory tracking, the payment system does the rest.
Everywhere you look, there is more and more advancement happening with the introduction of virtual reality, the metaverse etc. Every year is different, too; especially after the pandemic, things have really evolved with technology, and performance testing sort of goes along with it.
No matter how well you build your product, application or software, your ultimate goal is to ensure that thousands of users will use it and be well served. So ultimately, it should be built for hundreds of thousands of users, and performance testing is the way of ensuring that this actually holds in reality. Even today, if you open your browser and a webpage does not load within five or six seconds, chances are you will either refresh it or close it.
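The "built for many concurrent users" point can be sketched in miniature. This toy simulation (my illustration, not anything from the interview) fires simulated users at a stand-in request handler with bounded concurrency; in a real test the handler would be an HTTP call against the system under test:

```python
import time
from concurrent.futures import ThreadPoolExecutor


def handle_request(user_id):
    """Stand-in for a real request; returns the observed latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # pretend the server takes ~10 ms to respond
    return time.perf_counter() - start


def simulate_users(n_users, concurrency=50):
    """Fire n_users simulated requests with at most `concurrency` in flight."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(handle_request, range(n_users)))


if __name__ == "__main__":
    latencies = simulate_users(200)
    avg_ms = sum(latencies) / len(latencies) * 1000
    print(f"completed {len(latencies)} requests, avg latency {avg_ms:.1f} ms")
```

Dedicated tools like JMeter or LoadRunner do essentially this at far greater scale, with ramp-up profiles, think times and result analysis built in.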
Whatever the vertical or sector, wherever technology goes, performance testing goes with it, and this is the only way to ensure zero- or low-latency systems for enterprises.
- Talking about new technologies: how do you think the arrival of 5G, which is creating an entirely new ecosystem of applications, will impact the performance testing segment?
Things that were unthinkable 4-5 years ago are now possible thanks to 5G technology. The speed at which it works is incredible. You can download GBs of movies in a few seconds. So that's the speed at which it can handle and transfer data. What it really does is create more opportunities and encourage more innovative thinking. 5G technology essentially provides a platform for exchanging large amounts of data, and with that comes the user expectation that everything will work at lightning speed.
So from a performance testing standpoint, what we have to ensure is that the system can process that much data at high speed and stay stable. 5G has made things more innovative, especially in smart homes or smart technology. Today, if you have a smart home (smart plugs, smart switches, smart bulbs, smart locks, smart cameras etc), you can access it from anywhere in the world with your phone or an app. More computing has to happen in the cloud, which makes today's infrastructure and communication very complex, and you cannot bring this to the world without ensuring performance.
One other example is virtual reality. Imagine the amount of data that must be processed in a split second to provide an immersive experience if you have to play a game or attend a meeting in the metaverse. This demonstrates how critical performance testing is when it comes to 5G; the more reliable these technologies are in terms of performance, the better the innovations will be.
- So if I get it right, the usage of real-time applications will surge, and that in turn will complement the performance testing segment.
Yes, you summarized it really well. For any real-time application, or anywhere you need to access data at a much faster rate, whether downloading or uploading, 5G will be required to make it happen, and performance testing to provide the assurance.
- The next question would be about the role of the cloud in improving performance testing objectives and simulating real life conditions. What are your thoughts on that?
The cloud has made it easier for technology to evolve in many ways. Previously, whenever we needed to horizontally or vertically scale infrastructure to handle additional load, it was a tedious process that took days. Now those things can be done with a click of a button; scaling and computing are areas where the cloud has really delivered and given us a lot of comfort.
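The scaling decision that cloud platforms automate behind that "click of a button" can be sketched very simply. This is my minimal illustration, with hypothetical CPU thresholds, not any provider's actual policy:

```python
def desired_instances(current, avg_cpu_pct,
                      scale_up_at=75.0, scale_down_at=25.0,
                      min_n=1, max_n=10):
    """Toy horizontal autoscaling rule: add an instance when average CPU
    runs hot, remove one when the fleet is idle, within fixed bounds.
    All thresholds here are illustrative assumptions."""
    if avg_cpu_pct > scale_up_at:
        return min(current + 1, max_n)
    if avg_cpu_pct < scale_down_at:
        return max(current - 1, min_n)
    return current
```

Real autoscalers (in AWS, Azure or Google Cloud) layer cooldown periods, multiple metrics and predictive policies on top, but the core loop is this same threshold comparison evaluated continuously.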
We can also leverage additional services in the cloud. The big cloud providers like AWS, Azure and Google Cloud have made many services such as AI/ML, analytics, security etc available, and since the data already resides in the cloud, it is easy to incorporate these services and derive maximum benefit. So for any work we have to do, the amount of effort required has shrunk because the services are already available. The cloud also guarantees an uptime of 99.95% or higher, which is an excellent achievement in terms of avoiding server outages, and it is definitely more reliable at simulating real-world data and conditions.
Cloud computing greatly aids infrastructure, but it also means more computing and the ability to build complex applications with the expectation of near-zero response time, posing new challenges for performance testing and engineering.
- What are some of the best practices for organisations in selecting the right testing tools for their specific IT environments and infrastructure?
Each application is different, and a lot depends on how it is developed from both a software and a hardware point of view. Even though it is difficult to provide a generic solution, here are a few pointers that will help you decide:
- Capability: The tool should be able to deal with the protocol(s) your application uses.
- Budget: Some tools are very sophisticated and carry a lot of licensing costs. There are open-source tools too, but they are not always very user friendly.
- Team's Skill Set: Consider your team's skill set; you don't want to bring in a tool that your team cannot understand or use to its full potential. Either upskill and build your team around the tool, or find a tool that suits your existing team's skill set.
- Tool Support: If the team is new or doesn't have the required skill maturity, then vendor support matters. Licensed tools come with dedicated support based on the agreement, whereas open-source tools generally have only the community and public forums to turn to for help.
- Integration with Other Tools: With CI/CD and DevOps, the tool must integrate well with others to enable faster deployment cycles.
- Low Code/No Code Tools: For simple, low-load UI executions, you can also consider tools like BlazeMeter, which runs directly in your browser, can record your user actions and help you with performance testing.
But again, sometimes things go way beyond this in performance testing. A highly skilled performance testing resource is advised to evaluate the best-suited tool; a few well-known industry-standard tools are LoadRunner, JMeter, NeoLoad etc.
- What would be your message for businesses that are hesitant to adopt performance testing tools? How can they gather significant ROI after implementing these tools?
I have been in situations where performance testing was not really thought about. Performance testing usually takes place almost at the fag end of the release cycle, once development, functional stability, integration testing etc are completed. Most of the time the project does not allot enough time for performance testing and engineering activities, and more often than not you find performance issues and the release cannot be signed off. It goes back to the development team with performance fix suggestions, has to be re-tested, and causes a lot of rework.
If businesses or teams are reluctant to plan for proper performance testing and deploy the project into production expecting things to go well, that's when you see a lot of problems in production and many users leaving one-star reviews, unhappy with the way the website or app is working: long buffering times, slowness, delays in processing payments and so on.
So ultimately, you are not going to make it if the performance of your application is not right. Again, I'd like to emphasize the same point: no matter how good your application is and how wonderful the features are, if hundreds of thousands of users cannot use it concurrently and be served with very high performance, your product will undoubtedly fail.
So any businesses out there that are reluctant or thinking twice should include performance testing and engineering as part of their SDLC without fail. Of late, we also see more shift-left and shift-right performance testing, in which performance testing is introduced in the early stages of development and also when an application is in production. You need to have good performance if you want to survive, or if you want to really make it big in technology.
- Cybersecurity attacks have become more prevalent in recent times with hackers discovering vulnerabilities in software applications. What will be your recommendations for organisations to prevent attacks from a testing perspective?
In the last three to four years, the position of cybersecurity engineer has gained a lot of traction. That is because all these big companies are either dealing with customer data, which is very sensitive, or with payment systems or some sort of money. So they cannot, at any point, let cybersecurity attacks take place. If your company is attacked and credit card information, payment information or proprietary information is stolen, that data is going to get circulated on the dark web. Your company will lose its reputation and revenue, and the brand value will definitely be impacted a lot. So they really do not want to be in a situation where they are the victims of cybersecurity attacks.
And in the US we have seen some big companies go through this, like Target and, very recently, Uber, which had a security breach. It is hard to believe a 17-year-old hacker was able to get into Uber's security systems. Initially it all started as finding bugs and hacking for fun, but now things have really gone bad, because if attackers are able to crack a system they can get access to a lot of data and funds.
So there is also the concept of ethical hacking, where you hack a system deliberately to find vulnerabilities in it. The best way to stop attacks, or at least be more prepared, is to begin by thoroughly educating your team; a phishing attempt or attack must begin with a single entry point, so your team and company resources should know which emails not to open and which links not to click. Corporates generally do a good job of training their resources, but because people are the first entry point, they remain the most vulnerable.
The second thing is that your application's authentication needs to be very strong. These days we are seeing multifactor authentication and some biometric authentication, which cannot be hacked very easily. That needs to be in place.
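As a sketch of how the time-based codes behind most authenticator-app MFA work under the hood, here is a minimal standard-library implementation of the TOTP scheme from RFC 6238 (my illustration of the general mechanism, not anything specific to the interview):

```python
import hashlib
import hmac
import struct
import time


def hotp(secret, counter, digits=6):
    """RFC 4226 HMAC-based one-time password over a counter value."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(secret, at_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HOTP keyed on the current 30-second time window."""
    t = time.time() if at_time is None else at_time
    return hotp(secret, int(t // step), digits)


if __name__ == "__main__":
    # RFC 6238 test key; a new code appears every 30 seconds.
    print("current code:", totp(b"12345678901234567890"))
```

The server and the authenticator app share the secret and compute the same code independently, so intercepting one code is useless 30 seconds later; that time-bound validity is what makes these codes hard to replay.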
The other thing is that your system needs to be built with much more efficiency, be it the firewall you introduce, the password maintenance structure you put in place or any networking-related protocols. You need an end-to-end encryption and decryption methodology so that even if something is hacked, it is not easy to access all the data. Frequent audits are recommended on top of that to ensure overall security.
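One concrete piece of that password maintenance structure is storing only salted, slow-hashed passwords, so that a stolen database does not reveal plaintext credentials. A minimal standard-library sketch using PBKDF2 (the iteration count and parameter choices here are illustrative, not a vetted policy):

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune to your hardware


def hash_password(password, salt=None):
    """Return (salt, digest) for storage; a fresh random salt per password
    defeats precomputed rainbow-table attacks."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password, salt, expected):
    """Recompute the hash and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)
```

The deliberately slow hash means an attacker who steals the table must spend the same heavy computation per guess, which is exactly the property that makes an exfiltrated database far less damaging.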
Furthermore, you can make a backup and disaster recovery system mandatory, with enhanced access security. These systems sync data in near real time and are maintained in different regions, so in case of an attack the primary system can be shut down and the backup systems can serve customers.
An application will have many software components installed, many of them from third parties, and most of these constantly release patches and updates that strengthen it against cybersecurity attacks; keep updating software to newer releases after due diligence. Also, have a position for a cybersecurity engineer or ethical hacker who, as part of your team, can do those activities ahead of time and see if there are any gaps in your system, so that you can stay ahead of the game. These are the steps you can incorporate to ensure that, cybersecurity-wise, you are safe.
For businesses that have mission-critical processes, performance testing becomes all the more important if they want to be successful in the long run.