About

My name is Brian Warner, and I’ve been involved in Quality Assurance (IT/Software) since 2005. Having worked for a variety of companies, including Warner Brothers, Yahoo!, Universal Music Group (UMG), eHarmony, and Toll Free Forwarding, I’ve had a wonderful opportunity to learn different approaches to software quality. From web to telecom, each domain presents its own unique opportunities for ensuring quality for customers.

My work relies heavily on writing code to test code. Automation has been part of my work since 2010. Trying to remain language agnostic, I currently write code in Python and Groovy/Grails. My view is that the best language is the one the QA team can effectively maintain and rapidly produce code in to keep up with a compressed timeline. QA often gets the short window at the end of a development cycle, so writing code quickly is key. Some prefer to architect in Java, but I prefer a language I can prototype concepts in rapidly, hence my use of Python.

My Thoughts on QA

Some managers feel that a QA automation engineer should use the same underlying technology (language) as the developers. In some cases this is indeed true, but if the QA team is automating a front end (such as a website), then the choice of language doesn’t need to be consistent with the development team’s. An argument used at eHarmony was that the developers could drop in and support QA development from time to time, and therefore having everyone use the same language (Groovy) was key. This, as it turned out, was a moot point. Developers never have time to drop in and help with QA development.

QA teams need to be self-sufficient. We, like every team, need help from IT for VM support, Jenkins/CI, and so on, but the QA team needs to handle as many of its own needs as possible. As for language choice, I feel that’s up to the QA team itself. If we’re talking website/webapp/mobile application automation, then language is less of an issue. Developers may use Rust, Java, or JS on the application side while the QA team uses JS, Python, or Ruby on the automation side. After all, the application automation is hitting the site from a black-box perspective. There’s no need to embed end-to-end automation in the same framework/language or repo as the primary development.

Communication

When I started in QA, transitioning from web development, I was greeted with hostility. There was an uneasy alliance between the QA and development teams, and I’ve since felt this at many of the jobs I’ve worked. Instead of fighting with developers, however, I chose to work with them. Working with people from a variety of backgrounds, I saw them as people, not obstacles. Yes, sometimes they hate it when I point out a flaw in code it took them three days to write, but everyone has a preferred way of hearing bad news. It doesn’t always bode well to blurt out, “your code has a flaw.” Sometimes someone worked until 3am on that code and needs a prompt in the right direction: “Great work on this, I think we’re close, but we have an issue that I’m pretty sure you could resolve.”

I know. Corny. But I live by good communication. Without it, I think a multitude of problems are introduced.

Automation Frameworks

I’m not averse to JavaScript frameworks, but they do make me nervous. In fact, all pre-made frameworks give me cause for concern. I was a huge fan of the Golem automation framework (written in Python), and then the developer just up and walked away. This is a major risk with third-party frameworks. After 2020, I ended up writing my own framework and using it instead.

The upside to writing your own framework is total control. You create it, you maintain it, and it’s much easier to know the ins and outs of your own code than to dig through and tweak another person’s framework.

The downside to writing your own framework is the maintenance. For me, it’s well worth the effort, but for some teams it may be too much of a PITA. Having a stable framework that meets your needs and gets regular community support is one way to go if you just don’t have the bandwidth to maintain a framework’s codebase.
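
To give a sense of how small a homegrown framework can start out, here is a minimal sketch of a test runner in Python. The names (run_suite, the test_ prefix) are illustrative assumptions, not taken from my actual framework:

    import importlib
    import time
    import traceback

    def run_suite(module_name, prefix="test_"):
        """Run every function in a module whose name starts with `prefix`
        and return a list of simple result records."""
        module = importlib.import_module(module_name)
        results = []
        for name in dir(module):
            if not name.startswith(prefix):
                continue
            test = getattr(module, name)
            if not callable(test):
                continue
            started = time.time()
            try:
                test()
                status, error = "pass", None
            except Exception:
                status, error = "fail", traceback.format_exc()
            results.append({
                "test": name,
                "status": status,
                "error": error,
                "duration": round(time.time() - started, 3),
            })
        return results

From a core like this, reporting, retries, and parallel execution can be layered on, as described in the sections below.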

Reporting

One clear miss in most automation frameworks is the reporting. At best you might get some HTML/XML spit out about test results. I prefer a standalone view of past test runs, names of branches under test, environments run on, and results. I want to see the DELTA of success/failure. That’s what Golem was good at… and that’s what I recreated and improved upon.

What I decided on was a local database that stores test results, along with the named environment, the time of the test, and the branch that was tested. I wrote a simple Angular web frontend to present the results, using a spreadsheet-like view that displays all this data with sortable columns.
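
To give a feel for how simple the storage side can be, here is a minimal sketch using SQLite in Python; the table and column names are illustrative assumptions, not my production schema:

    import sqlite3
    from datetime import datetime, timezone

    SCHEMA = """
    CREATE TABLE IF NOT EXISTS test_results (
        id          INTEGER PRIMARY KEY AUTOINCREMENT,
        test_name   TEXT NOT NULL,
        status      TEXT NOT NULL,   -- 'pass' or 'fail'
        environment TEXT NOT NULL,   -- e.g. 'staging'
        branch      TEXT NOT NULL,   -- branch under test
        run_at      TEXT NOT NULL    -- ISO-8601 timestamp
    )
    """

    def record_result(db_path, test_name, status, environment, branch):
        """Insert one test result row; a web frontend can then read and sort this table."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(SCHEMA)
            conn.execute(
                "INSERT INTO test_results (test_name, status, environment, branch, run_at) "
                "VALUES (?, ?, ?, ?, ?)",
                (test_name, status, environment, branch,
                 datetime.now(timezone.utc).isoformat()),
            )

With everything in one table, the frontend only has to query and sort it to show the pass/fail delta between runs.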

By writing the code myself, storing results in my local db, and presenting them in a web-based spreadsheet, I can quickly see patterns, such as increasing failures.

Parallel Processing

Another important aspect of a framework is the ability to run tests in parallel. This is especially important when you have dozens or hundreds of tests that need to be processed. Running hundreds of tests sequentially could take hours, but if we run 10 tests at a time, we cut the processing time down substantially.
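
A rough sketch of that idea using only Python’s standard library; run_test is a placeholder for whatever executes a single test, and the worker count of 10 mirrors the example above:

    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_in_parallel(test_names, run_test, workers=10):
        """Run `run_test(name)` for every test, ten at a time,
        collecting results as they finish."""
        results = {}
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(run_test, name): name for name in test_names}
            for future in as_completed(futures):
                name = futures[future]
                try:
                    results[name] = future.result()
                except Exception as exc:
                    results[name] = f"fail: {exc}"
        return results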

This is why I love using Selenoid. Yes, I know, I could use Selenium Grid, but it’s slow compared to the rapidity of Go. Selenoid is written in Go and runs in Docker. Docker makes it easy to set up, and Go makes it FAST.
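
Since Selenoid speaks the standard Selenium protocol, pointing tests at it is mostly a matter of using a remote driver. Here is a hedged sketch with the Python Selenium bindings; the hub URL and the selenoid:options capability reflect a typical local setup and may differ for yours:

    from selenium import webdriver

    options = webdriver.ChromeOptions()
    # Selenoid-specific settings live under the "selenoid:options" capability.
    options.set_capability("selenoid:options", {"enableVNC": True})

    driver = webdriver.Remote(
        command_executor="http://localhost:4444/wd/hub",  # Selenoid endpoint
        options=options,
    )
    try:
        driver.get("https://example.com")
        print(driver.title)
    finally:
        driver.quit()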

Handling Fragile Tests

The bane of automation is the brittle test. You have a test that loads a page, waits for a modal, clicks a link, fills out a form, hits submit, and waits for a response. There is always the possibility that the environment, the browser, the test load, or just the test itself may break, and the break is unrelated to the customer experience. It is a fault in the test execution. Sometimes it works, sometimes it doesn’t. We should do our best to minimize these problems, but there is also something we can add to the test cycle: RETRY LOGIC.

Retry logic will check whether a test in the queue has just failed and rerun it. I handle this by waiting 60 seconds and attempting the same test again. If it fails X times (2 or 3 times), then I log it as a final failure.
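
In code, that retry wrapper can be as simple as the sketch below; the 60-second wait and the 2-3 attempts match the numbers above, and run_test is a placeholder for whatever executes a single test:

    import time

    def run_with_retry(run_test, test_name, max_attempts=3, wait_seconds=60):
        """Re-run a failed test after a pause, logging a final failure
        only once every attempt has been exhausted."""
        for attempt in range(1, max_attempts + 1):
            try:
                run_test(test_name)
                return "pass"
            except Exception as exc:
                if attempt == max_attempts:
                    print(f"FINAL FAILURE: {test_name} after {attempt} attempts: {exc}")
                    return "fail"
                print(f"{test_name} failed (attempt {attempt}); retrying in {wait_seconds}s")
                time.sleep(wait_seconds)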

Security

My view on testing is that it should also include security violation attempts. XSS, SQL injection, directory traversal, authentication bypassing, and token stealing are all valid tests that a QA tester should be attempting on new code being introduced. This is especially important if there isn’t a dedicated cybersecurity specialist on site. Some security testing can be automated, but in general this type of testing needs to be clever and unique. Security testing lends itself to manual work, where one thinks of ways to bypass a control or iterates on a fact-finding methodology.
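
Some of these checks can still be scripted as cheap smoke tests alongside the manual work. Here is a hedged sketch of a reflected-XSS / SQL-error probe using requests; the endpoint and parameter name are made up, and this is a starting point rather than a scanner:

    import requests

    PAYLOADS = [
        "<script>alert(1)</script>",   # naive reflected-XSS probe
        "' OR '1'='1",                 # classic SQL-injection probe
    ]

    def probe_parameter(base_url, param):
        """Send each payload and flag responses that echo it back verbatim
        or leak database error text. Findings still need manual review."""
        findings = []
        for payload in PAYLOADS:
            resp = requests.get(base_url, params={param: payload}, timeout=10)
            body = resp.text.lower()
            if payload.lower() in body or "sql syntax" in body:
                findings.append((payload, resp.status_code))
        return findings

    # Hypothetical usage:
    # print(probe_parameter("https://staging.example.com/search", "q"))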

Once I discovered that a company I worked for had (unknown to them) their development environments indexed on archive.org. I scrutinized the output on archive.org and found that a few indexed dates had error output displaying information that shouldn’t be public. Attackers use these very same techniques.

At another job, attackers would use X-Forwarded-For values of fake internal IPs in order to access sites on the intranet. Tools like Burp Suite can reproduce these types of tampering attacks.
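
The same tampering is easy to reproduce in an automated check as well; here is a minimal sketch with requests, where the spoofed IP and target URL are illustrative:

    import requests

    def check_xff_bypass(url, spoofed_ip="10.0.0.5"):
        """Request the same URL with and without a spoofed X-Forwarded-For header;
        a change in status code suggests the header influences access control."""
        normal = requests.get(url, timeout=10)
        spoofed = requests.get(url, headers={"X-Forwarded-For": spoofed_ip}, timeout=10)
        if normal.status_code != spoofed.status_code:
            print(f"Possible XFF-based access control: "
                  f"{normal.status_code} -> {spoofed.status_code}")
        return normal.status_code, spoofed.status_code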

Training for Security can come from sites like:

  • Burp Suite Training
  • Hack the Box
  • Offensive Security
  • Sofia Santos

Some Security Tools:

  • Websites: Archive.org, VirusTotal, AbuseIPDb.com
  • Burp Suite
  • Distros: CSI Linux, Kali Linux, Parrot OS
  • ELK (Elastic Stack)
  • Maltego
  • Metasploit
  • Suricata

Beyond application testing, I also think it prudent that, if you work for a small company, a QA person could set up a VM as an ELK server and have it start digesting logs and displaying usage and threat data. If valuable to the company, it can become part of a live monitoring system when there is no SOC present.
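
Before a full Beats/Logstash pipeline is in place, log events can be pushed into Elasticsearch over its plain REST API. Here is a hedged sketch in Python; the host, index name, and field names are assumptions, and a real ELK install may also require TLS and authentication:

    import json
    from datetime import datetime, timezone

    import requests

    ES_URL = "http://localhost:9200"   # assumed Elasticsearch on the local VM
    INDEX = "qa-app-logs"              # illustrative index name

    def ship_log_event(source, message, level="INFO"):
        """Index a single log event so Kibana can chart usage and threat data."""
        doc = {
            "@timestamp": datetime.now(timezone.utc).isoformat(),
            "source": source,
            "level": level,
            "message": message,
        }
        resp = requests.post(
            f"{ES_URL}/{INDEX}/_doc",
            data=json.dumps(doc),
            headers={"Content-Type": "application/json"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()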

Conclusion

I think Quality Assurance, with regard to software, is more than manual testing. Today it’s even more than automation. It’s a large umbrella for ensuring software quality against a variety of issues, including security-related threats. Automation is a staple today, and staying up to date with new technology is paramount, but knowing the core logical sequence of steps to test is fundamental. Knowing how to report and communicate to developers and team members is almost as important as finding a critical flaw. QA needn’t be adversarial to development; it should be a cooperative effort.

Connect