Teams and organizations looking to get serious about (or to further improve) their software testing efforts can learn something from looking at how the "big boys" organize their testing and quality assurance activities. It stands to reason that companies such as Google, Microsoft, and Amazon would not be as successful as they are without paying proper attention to the quality of the products they're releasing into the world.
But a look at these software giants reveals that there is no single recipe for success. Here is how five of the world's best-known tech companies organize their QA, and what you can learn from them.
Google: Searching for best practices
How does the company responsible for the world's most widely used search engine organize its testing efforts? It depends on the team and the product. The team responsible for the Google search engine itself, for example, maintains a large and rigorous testing framework. Since search is Google's core business, the team wants to make sure that it keeps delivering the highest possible quality, and that it doesn't screw it up.
To that end, Google employs a four-stage testing process for changes to the search engine, consisting of:
- Testing by dedicated, internal testers (Google employees)
- Further testing on a crowdtesting platform
- "Dogfooding," which involves having Google employees use the product in their daily work
- Beta testing, which involves releasing the product to a small group of Google product end users
Even though this seems like a solid testing process, ex-Google director James Whittaker has explained that there is still room for improvement, if only because communication between the different stages and the people responsible for them is suboptimal, leading to things being tested either twice or not at all.
But the teams responsible for Google products that are further away from the company's core business employ a much less strict QA process. In some cases, the only testing is done by the developer responsible for a specific product, with no dedicated testers providing a safety net.
In any case, Google takes testing seriously. In fact, testers' and developers' salaries are equal, something you don't see everywhere.
More details about testing at Google can be found on the Google Testing Blog.
Facebook: Developer-driven testing
Facebook does not employ any dedicated testers at all. Instead, the social media giant relies on its developers to test their own (as well as one another's) work. While in the past this was mostly done manually, Facebook now employs a wide variety of automated testing tools, ranging from PHPUnit for back-end unit testing, to Jest (a JavaScript test tool developed internally at Facebook), to Watir for end-to-end testing.
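To give a flavor of what developer-driven testing looks like in practice, here is a minimal Jest-style unit test. The `formatTimestamp` helper and its expected behavior are made up for illustration; this is not Facebook code, just the kind of small, fast check a developer would write alongside a feature.

```typescript
// A minimal Jest test sketch. The formatTimestamp helper and its
// expected behavior are hypothetical examples, not Facebook code.
function formatTimestamp(epochMs: number): string {
  // Render an epoch timestamp as a YYYY-MM-DD date string.
  return new Date(epochMs).toISOString().slice(0, 10);
}

describe("formatTimestamp", () => {
  it("formats the epoch itself as an ISO date", () => {
    expect(formatTimestamp(0)).toBe("1970-01-01");
  });

  it("handles timestamps after the epoch", () => {
    // 86,400,000 ms is exactly one day.
    expect(formatTimestamp(86_400_000)).toBe("1970-01-02");
  });
});
```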
Like Google, Facebook uses dogfooding to make sure its software is usable. Furthermore, it is somewhat notorious for shaming developers who mess things up (breaking a build or causing the site to go down by accident, for example) by posting a picture of the culprit wearing a clown nose on an internal Facebook group.
Facebook recognizes that there are significant flaws in its testing process, but rather than going to great lengths to fix them, it simply accepts them, since, as the company puts it, "social media is nonessential." Focusing less on testing also frees up resources for other, more valuable work.
Rather than testing its software exhaustively before release, Facebook tends to use "canary" releases and an incremental rollout strategy to test fixes, updates, and new features in production. For example, a new feature might first be made available to only a small percentage of all users.
By tracking usage of the feature and the feedback it receives, the company decides either to widen the rollout or to disable the feature, and then either improves it or discards it altogether.
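A common way to implement this kind of incremental rollout is a deterministic, hash-based feature flag: each user is assigned a stable bucket, and the feature is enabled only for buckets below the current rollout percentage. The sketch below is a generic illustration of the technique, not Facebook's actual gating system; the feature name and user ID are hypothetical.

```typescript
import { createHash } from "crypto";

// Deterministically bucket a user into [0, 100) so the same user
// always gets the same rollout decision for a given feature.
function bucket(featureName: string, userId: string): number {
  const digest = createHash("sha256")
    .update(`${featureName}:${userId}`)
    .digest();
  return digest.readUInt32BE(0) % 100;
}

// Enable the feature for roughly `percentage` percent of users.
function isFeatureEnabled(
  featureName: string,
  userId: string,
  percentage: number
): boolean {
  return bucket(featureName, userId) < percentage;
}

// Start with a 1% canary, then widen the rollout as metrics allow.
console.log(isFeatureEnabled("new-news-feed", "user-42", 1));
```

Because the bucketing is deterministic, widening the rollout from 1 percent to 10 percent keeps the original canary users in the enabled group, so their experience stays consistent while the audience grows.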
Amazon: Deployment comes first
Like Facebook, Amazon does not have a large QA infrastructure in place. It has even been suggested (at least in the past) that Amazon does not value the QA profession. Its ratio of about one test engineer to every seven developers also suggests that testing is not considered an essential activity at Amazon.
The company itself, though, takes a different view of this. To Amazon, the ratio of testers to developers is an output variable, not an input variable. In other words, as soon as it notices that revenue is decreasing or customers are moving away due to anomalies on the website, Amazon increases its testing efforts.
The feeling at Amazon is that its development and deployment processes are so mature (the company famously deploys software every 11.6 seconds!) that there is no need for elaborate and extensive testing efforts. It is all about making software easy to deploy, and, equally if not more important, easy to roll back in case of a failure.
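Concretely, "easy to roll back" tends to mean automating the deploy, the health checks, and the revert in a single loop. Here is a minimal, generic sketch of that idea; the `deploy`, `healthy`, and `rollback` hooks are hypothetical stand-ins for whatever your pipeline provides, not Amazon's actual tooling.

```typescript
// Generic deploy-then-verify-or-rollback loop. The deploy(), healthy(),
// and rollback() functions are hypothetical stand-ins for your own
// deployment tooling; nothing here is Amazon's actual pipeline.
async function deployWithRollback(
  deploy: (version: string) => Promise<void>,
  healthy: () => Promise<boolean>,
  rollback: (version: string) => Promise<void>,
  newVersion: string,
  previousVersion: string
): Promise<void> {
  await deploy(newVersion);

  // Give the new version a short window to prove itself: five
  // health checks, one minute apart.
  for (let check = 0; check < 5; check++) {
    if (!(await healthy())) {
      await rollback(previousVersion);
      throw new Error(
        `Deploy of ${newVersion} failed health checks; rolled back`
      );
    }
    await new Promise((resolve) => setTimeout(resolve, 60_000));
  }
}
```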
Spotify: Squads, tribes, and chapters
Spotify does employ dedicated testers. They are part of cross-functional teams, each with a specific mission. At Spotify, employees are organized according to what's become known as the Spotify model, constructed of:
- Squads. A squad is basically the Spotify take on a Scrum team, with less focus on practices and more on principles. A Spotify dictum says, "Rules are a good start, but break them when needed." Some squads might have one or more testers, and others might have no testers at all, depending on the mission.
- Tribes. A tribe is a group of squads that belong together based on their business domain. Any tester who is part of a squad automatically belongs to that squad's overarching tribe.
- Chapters. Across different squads and tribes, Spotify uses chapters to group people with the same skill set, in order to promote learning and the sharing of experience. For example, testers from different squads are grouped together in a testing chapter.
Testing at Spotify is taken very seriously. Like programming, testing is considered a creative process, and something that cannot be (fully) automated. Unlike most of the other companies mentioned in this article, Spotify relies heavily on dedicated testers who explore and evaluate the product, instead of trying to automate as much as possible.
What's the future of testing at Spotify? Kristian Karl, the company's test manager and creator of the model-based testing tool GraphWalker, said:
"I think we will spend as much time on testing tomorrow as we do today, but the tools and the information that we will have and get from our automation will make our testing job different."
One final fact: To minimize the effort and cost of spinning up and maintaining test environments, Spotify does a lot of its testing in production.
Microsoft: Engineers and testers are one
Microsoft's ratio of testers to developers is currently around 2:3, and like Google, Microsoft pays testers and developers equally—except they aren't called testers; they're software development engineers in test (or SDETs).
The high ratio of testers to developers at Microsoft is explained by the fact that a very large chunk of the company's revenue comes from shippable products that are installed on client computers, rather than websites and online services. Since it's much harder (or at least much more annoying) to update these products in case of bugs or new features, Microsoft invests a lot of time, effort, and money in making sure that the quality of its products is of a high standard before shipping.
What you can learn from the big guns of IT
If the culture, views, and processes around testing and QA can vary so greatly at five of the biggest tech companies, then it may be true that there is no one right way of organizing testing efforts. All five have crafted their testing processes, choosing what fits best for them, and all five are highly successful. They must be doing something right, right?
Still, there are a few takeaways that can be derived from the stories above to apply to your testing strategy:
- There's a "testing responsibility spectrum," ranging from "We have dedicated testers who are primarily responsible for executing tests" to "Everybody is responsible for performing testing activities." Choose the point on that spectrum that best fits the skill set of your team.
- There is also a "testing importance spectrum," ranging from "Nothing goes to production untested" to "We put everything in production, and then we test there, if at all." Where your product and organization belong on this spectrum depends on the risk that comes with failure and on how easy it is to roll back and fix problems when they emerge.
- Test automation has a significant presence in all five companies. The extent to which it is implemented differs, but all five employ tools to optimize their testing efforts. You probably should too.
Finally, here's another take on the spectrum of testing activities (or "schools," as the author calls them), written by former Microsoft principal engineer Alan Page.
Keep learning
Take a deep dive into the state of quality with TechBeacon's Guide. Plus: Download the free World Quality Report 2022-23.
Put performance engineering into practice with these top 10 performance engineering techniques that work.
Find the tools you need with TechBeacon's Buyer's Guide for Selecting Software Test Automation Tools.
Discover best practices for reducing software defects with TechBeacon's Guide.
Take your testing career to the next level. TechBeacon's Careers Topic Center provides expert advice to prepare you for your next move.