Updated: Mar 29
Automation testing is essential to the modern software development approach, and its benefits are widely known, which makes it all the more desirable. In fact, the push toward zero manual testing, shift-left, and in-sprint automation is driving companies to introduce automation as early as possible in their projects. Each organization has its own approach to achieving its automation goals, yet there are some common mistakes organizations make while implementing automation.
While working on automation frameworks, I have tried to identify the common challenges organizations come across and the mistakes they tend to commit. These mistakes create a snowball effect and erode the potential return on investment (ROI) of the automation.
Perfect your automation implementation by avoiding the common mistakes
Automation testing lifecycle
To plan, implement, and maintain test automation, I divide the automation testing life cycle into four sub-phases. This helps me track and control automation in my projects. I have named these four phases:
1. Automation planning
2. Automation design/development
3. Automation implementation and execution
4. Automation framework/script maintenance and enhancement
As I look at automation as a four-step process, I will present these mistakes phase by phase.
Let's look at the common mistakes we tend to commit while implementing test automation -
1. Mistakes during Automation Planning Phase
1.1 Not calculating Return on Investment (ROI)
The first and most common mistake teams make is not knowing whether the effort they put into automation will pay back. The primary goal of automation is to reduce cost while achieving better levels of quality. Do we calculate the ROI of implementing automation in the project? And if it is negative, what is the point of automating at all? This is the fundamental check a team should do before starting automation.
Mitigation - Use the formula below to calculate the ROI of your automation effort.
ROI = Lifetime cost of manual effort saved by automation - [Cost of developing automation + Lifetime cost of maintaining automation]
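The formula above can be sketched as a small calculation. The figures here are illustrative assumptions, not real project data:

```python
# Hypothetical ROI sketch following the formula above.
# All dollar amounts and hours are illustrative assumptions.

def automation_roi(manual_effort_saved: float,
                   development_cost: float,
                   maintenance_cost: float) -> float:
    """ROI = lifetime manual effort saved - (development + maintenance cost)."""
    return manual_effort_saved - (development_cost + maintenance_cost)

# Example: a regression pack that saves 500 hours/year of manual testing
# over a 3-year lifetime, at a blended rate of $50/hour.
saved = 500 * 3 * 50          # $75,000 of manual effort saved
build = 30_000                # one-time framework/script development cost
maintain = 5_000 * 3          # $5,000/year of upkeep over the lifetime

roi = automation_roi(saved, build, maintain)
print(roi)  # positive means the automation pays for itself
```

A negative result here is the signal to rethink the scope or the tooling before writing a single script.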
1.2 Not having a formal Automation Plan/Automation Goal
They say 'a bad plan is better than no plan'. Yet, in my experience, the large majority of software projects doing automation testing don't have one, so it is no wonder that so many automation projects fail to meet expectations. Without a plan, automation has no guidelines, no clearly defined goal, and no definition of done. Automation projects are often run alongside manual testing, but think how different the two approaches are, and how different their implementation is. It takes a different mindset to perform manual testing and automation testing, and hence a different, thoughtful plan to define each approach and its success criteria.
Mitigation - Create an exclusive, dedicated, and comprehensive automation test plan.
1.3 Not formalizing and prioritizing the test scope before starting automation
There is a common saying that 'you cannot improve what you cannot measure', and it holds true for automation projects too. How can we start automation without defining its scope? Without a defined scope of automation, there is no way to benchmark and measure the progress or success of the automation, and we will falter at every step of the automation testing phase.
1.4 Not having requirements documentation for the automation framework/Unrealistic expectations
Have you ever written a requirements document for an automation framework? In my experience, the complexity of developing an automation framework is essentially the same as developing a business application. We would never start business application development without a proper formal requirements document, so why do we develop an automation solution without articulating requirements first? This leads to incorrect or unknown expectations from the automation framework, and often these expectations are unrealistic.
Automation Frameworks - Don't start automation framework development without a formal requirements document specifying the goals of the automation.
Automation Tools - Create formal requirements and expectations for the automation tool, and onboard a tool that meets them.
1.5 Automating at the incorrect Technical Layer
Often testers focus on automating UI tests to ensure end-to-end coverage instead of evaluating options to cover testing at the integration, API, or database layer. Automating at lower layers provides better, more granular coverage at a faster rate with less effort, while UI testing is slower and has a high maintenance cost. Although we cannot eliminate the need for UI testing, especially for user-facing apps and SaaS products, it should be kept to a minimum.
Mitigation - Understanding and following the test pyramid is fundamental to successful and efficient automation. It prioritizes more testing at the lower technical layers over the layers at the top of the pyramid.
1.6 Selecting a tool just because it is open source and FREE
A major reason for automation failure is selecting an automation tool or library just because it is free of cost or open source. These tools do a fabulous job if implemented properly, but that is exactly where the challenge lies.
Developing an automation framework is essentially the same process as developing any other business software application with a team of developers. It is a development project in its own right and should follow strong software development lifecycle (SDLC) standards, and automation developers should have the same level of competency as application developers. That brings the need for review processes, architecture and design approvals, comprehensive testing, coding standards, code management, and so on. If we expect high-quality output from automation, an in-house automation framework should comply with all these good SDLC practices.
All of this comes at a cost, and that cost can be quite high. The risk of automation project failure is also high, as organizations always have other business commitments. Hence a free, open-source solution does not actually come for free, and the risk of failure remains if the framework is not developed thoroughly.
Mitigation - Follow the same best practices in developing an automation framework that you follow while developing business application software.
It is also worth considering paid or low-cost tools available in the market that provide cost savings. Along these lines, we have developed BotPlay Codeless Automation, a codeless UI automation tool that provides 4x faster automation.
1.7 Selecting an incorrect Automation Tool
This is different from the last item: even the most popular automation tool may not be the best fit for your use case. The test team should therefore focus on their use case while selecting tools for automation.
Identify the use case you are looking to automate and which layer of the test pyramid it belongs to. Tool selection becomes easier if you apply the lens of the test pyramid.
Do a proof of concept (POC) using the trial versions of the tools available for your use case.
1.8 Choosing Automation Tool based on Team's Skill Set
Another common mistake is selecting an automation tool just because it matches the team's current skill set. The world is moving toward codeless solutions wherever possible, and automation testing is no different: a codeless automation tool can provide faster automation without requiring an advanced skill set.
Mitigation - The goal of automation is not to enhance the team's programming skill set; it is to save manual testing costs. If the organization's primary domain is not automation, it is usually better to onboard a tool than to develop a custom automation framework. For a small organization especially, developing and maintaining a framework will have a lower ROI than a commercial tool.
2. Mistakes in Test Design/Development Phase
2.1 Not Creating Technical Design
The classic problems faced in application development are experienced in automation framework development as well, perhaps on a larger scale, since automation developers generally have a weaker programming skill set than full-time application developers. Not having a technical design document leads to a low-quality framework because of:
Unstructured modules/Monolithic application
Improper design patterns
Not following best coding practices
No formal review process
Low code reusability
No modularity at the function level
No separate test phase for the automation framework
2.2 Missing Exception Handling
Often, debugging an automation execution failure requires probing into the framework code because exceptions were not handled in the code in the first place. As a result, the framework raises false alerts in reports and does not execute gracefully.
Mitigation - All types of exceptions should be carefully caught at each function level. Code has to be formally reviewed to ensure no function is left without exception handling.
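As a minimal sketch of catching exceptions at the function boundary, a framework might wrap every test step in a helper like the one below. The helper name and the failing step are illustrative assumptions:

```python
import logging

logger = logging.getLogger("framework")

def safe_step(step_name, action, *args, **kwargs):
    """Run one test step; on failure, log the error and return None
    instead of crashing the whole execution."""
    try:
        return action(*args, **kwargs)
    except Exception as exc:  # boundary catch so the run degrades gracefully
        logger.error("Step '%s' failed: %s", step_name, exc)
        return None

# A deliberately failing step: the error is logged, execution continues.
result = safe_step("divide totals", lambda: 10 / 0)
print(result)  # None
```

The point is that a failure in one step produces a clear log entry and a clean report status rather than an unhandled stack trace.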
2.3 Improper logging mechanism
As the automation framework executes, it should produce useful logs so errors in the framework can be traced and troubleshot without looking at the code.
Mitigation - An efficient logging mechanism has to be integrated into the automation framework, recording successful steps, errors, and warnings.
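A minimal sketch of such a mechanism using Python's standard `logging` module; the log file name and format are assumptions for illustration:

```python
import logging

def configure_run_logger(log_file="automation_run.log"):
    """Set up a run logger with timestamps, level, and logger name,
    writing to both the console and a run log file."""
    logger = logging.getLogger("automation")
    logger.setLevel(logging.DEBUG)
    fmt = logging.Formatter("%(asctime)s %(levelname)-7s %(name)s: %(message)s")
    for handler in (logging.StreamHandler(), logging.FileHandler(log_file)):
        handler.setFormatter(fmt)
        logger.addHandler(handler)
    return logger

log = configure_run_logger()
log.info("Login step passed")
log.warning("Retrying flaky locator 'submit_btn'")
log.error("Checkout step failed: element not found")
```

With timestamps and levels on every line, a failed nightly run can usually be diagnosed from the log alone.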
2.4 No code management/branching strategy and release management
An automation framework produces a large amount of code, and multiple automation developers often work on it together, so without a strategy, unstructured code and merge conflicts are inevitable. Also, once the framework is released for building test automation scripts on top of it, things become more complex and multiple versions arise.
Mitigation - Implement a branching strategy so automation developers can work together and enhance the features of the automation framework. Then, using release management with separate development and release branches, distribute the automation framework within the project for test case automation scripting.
3. Mistakes in Automation implementation and Execution
3.1 Not having a defined testing scope
Not having a defined scope leads to no planning in the automation implementation phase: there is no defined path to follow and no way to measure the progress of automation scripting.
Mitigation - Automation should have a defined scope before scripting starts. The regression pack has to be formally placed in front of the automation team as the backlog for automation.
3.2 Not managing Test Data
The framework should have a proper test data management strategy. Storing data in Excel or CSV files is an old standard and makes automation execution slow. Also, variable test data should not be hard-coded in the automation script itself.
Mitigation - The automation framework should be able to store and retrieve test data from lighter sources such as JSON or XML.
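A minimal sketch of pulling test data from JSON instead of embedding it in the script; the file name, scenarios, and values are illustrative assumptions:

```python
import json
from pathlib import Path

# Illustrative test-data file keyed by scenario; in a real project this
# would live alongside the test suite, not be generated in the script.
Path("testdata.json").write_text(json.dumps({
    "login": {"username": "qa_user", "password": "s3cret"},
    "checkout": {"currency": "USD", "max_items": 5},
}, indent=2))

def load_test_data(scenario: str, data_file: str = "testdata.json") -> dict:
    """Fetch the data block for one scenario, keeping data out of scripts."""
    with open(data_file) as fh:
        return json.load(fh)[scenario]

creds = load_test_data("login")
print(creds["username"])  # qa_user
```

Changing a test value now means editing one JSON file, with no script or framework code touched.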
3.3 Automating larger flows
Automating one large flow increases the chances of misrepresenting the health of the application under test (AUT), especially in UI testing: the whole script can fail due to one small failure.
Mitigation - Dividing large flows into smaller, testable, granular flows improves the overall stability of the execution. The pass rate of the test suite improves, and failures point correctly to the specific failing step.
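A sketch of the granular approach: each step of a hypothetical purchase flow reports its own pass/fail, so one broken step does not mask the others. The step functions and fake responses are stand-ins for real application calls:

```python
# Stand-ins for real application calls; names and logic are assumptions.
def login(user):        return user == "qa_user"        # step 1
def add_to_cart(item):  return item in {"book", "pen"}  # step 2
def checkout(total):    return total > 0                # step 3

def run_granular_suite():
    """Run each step independently and report per-step results,
    instead of one monolithic script that stops at the first failure."""
    return {
        "login": login("qa_user"),
        "add_to_cart": add_to_cart("laptop"),  # the only failing step
        "checkout": checkout(42),
    }

print(run_granular_suite())
# {'login': True, 'add_to_cart': False, 'checkout': True}
```

In a monolithic script the same defect would have failed the entire flow, hiding that login and checkout are actually healthy.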
3.4 Not doing proper testing validations
Often, automation scripts are created without validation checks. A manual tester explicitly verifies expected results during execution, but automation testers frequently miss adding actual-vs-expected checks to the script, diluting the automation testing coverage.
Mitigation - A formal sign-off should be taken from the functional owner (a business analyst or a functional QA) to ensure the coverage and correctness of automation scripts.
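The difference an explicit check makes can be shown in a few lines. `apply_discount` here is a hypothetical stand-in for a call into the application under test:

```python
# Hypothetical AUT behavior; in a real script this would be a UI or API call.
def apply_discount(price: float, percent: float) -> float:
    return round(price * (1 - percent / 100), 2)

def test_ten_percent_discount():
    """Without the assert, this 'test' passes no matter what the app returns.
    The explicit actual-vs-expected check is what provides real coverage."""
    actual = apply_discount(200.0, 10)
    expected = 180.0
    assert actual == expected, f"expected {expected}, got {actual}"

test_ten_percent_discount()
print("validation passed")
```

A script that only drives actions without such assertions reports green regardless of the application's behavior, which is exactly the coverage dilution described above.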
4. Mistakes in Automation Maintenance Phase
4.1 Not updating the code as per changes in the software functionality
A change or enhancement in functionality has a ripple effect across the SDLC, from development to testing to production. Keeping automation scripts up to date is often missed, leaving the automation pack outdated and out of sync.
Mitigation - The automation framework should be designed so that existing automation test scripts are easy to update. What is the use of a framework where, for every change in application functionality, a tester has to change framework code just to update a test script? That is the poorest possible design for an automation framework. To address such issues, BotPlay provides codeless automation development and maintenance, and it supports test automation scripting in natural language, ensuring 4x faster automation development with 80% less maintenance compared to other tools and custom frameworks.
4.2 Not running automation daily
The automation team should have an engineered way to run automation daily. In the age of agile software development, it becomes even more important to run automation after every check-in. Not running automation daily creates two problems:
We do not leverage automation to its full potential and do not test the application enough.
We cannot tell whether automation scripts are out of date with the latest code changes. Running automation daily keeps the automation pack in sync with the latest application code. It also ensures we make incremental upgrades to the automation scripts rather than realizing too late that larger, unmanageable changes are needed.
Mitigation - Running on a dedicated automation machine ensures automation can be run anytime without costing the automation tester any time. An even better approach is to integrate the automation run into the CI/CD pipeline to test every build.