weather.com is accessed by millions of people each day, so it is critical for the team to ensure each customer gets the best experience possible. For this to happen, the team must check that releases work well across their entire user base, regardless of browser, device or OS used.
Todd Eaton, Head of DevOps and Web Operations at IBM & Weather, and Vikas Thange, Lead SDET at Xpanxion LLC, hosted a webinar to share how the team achieves quality at speed.
The duo explains that the team used to have large, infrequent test releases with an unacceptable number of failures. They didn’t have enough test coverage to catch all failures, and they weren’t releasing fast enough.
Todd and Vikas describe how they revamped their test-and-release process to increase test coverage, reduce test failure rate, and release multiple times a day. This resulted in 90% automated testing (up from 30%) and a hybrid framework for high-level testing that is now a benchmark in the industry.
Along the way, they were asked questions about weather.com’s test-and-release process, automations, regressions, and best practices. Here’s a roundup of their answers:
What are some best practices for manual teams moving to automation?
- You need a robust, reliable automation framework that enables fast adoption and early success. The initial development requires a significant investment, but it pays off in the end.
- You need the approval and willingness of all stakeholders involved to migrate from manual testing to full automation.
At what stage do you start writing automation tests for new features?
Start writing automation tests for new features as soon as the app is in development. You shouldn’t consider development ‘done’ until all automated tests are in place.
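As a minimal sketch of this practice, a new feature and its automated tests can land in the same change, with the tests acting as the definition of ‘done’. The feature and test names below are hypothetical, not from the webinar:

```python
import unittest


def format_temperature(celsius: float, unit: str = "C") -> str:
    """Hypothetical new feature: render a temperature reading for display."""
    if unit == "F":
        return f"{celsius * 9 / 5 + 32:.1f}\u00b0F"
    return f"{celsius:.1f}\u00b0C"


class TestFormatTemperature(unittest.TestCase):
    """Written alongside the feature -- development is 'done' when these pass."""

    def test_celsius_default(self):
        self.assertEqual(format_temperature(21.0), "21.0\u00b0C")

    def test_fahrenheit_conversion(self):
        self.assertEqual(format_temperature(0.0, unit="F"), "32.0\u00b0F")


if __name__ == "__main__":
    unittest.main()
```

Because the tests ship in the same commit as the feature, there is never a window where the feature exists without automated coverage.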
What are the steps and best practices we need to follow before code merge?
Step 1: QA tests the functionality locally and works with the developer on fixes.
Step 2: The developer runs unit tests along with any local tests, including security, accessibility and performance tests.
Step 3: The developer builds a PR and submits it for a code review.
Step 4: A code reviewer reviews the PR. They either comment on the PR and send it back to the developer, or approve it and move the Jira ticket to ‘Dev complete’.
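The steps above form a simple gate: every check must pass, in order, before the ticket moves to ‘Dev complete’. The sketch below models that flow in Python; the check names are placeholders standing in for the real unit, security, accessibility, and performance suites, not weather.com’s actual tooling:

```python
from typing import Callable, List, Tuple

# Each check is a (name, callable) pair; the callable returns True on success.
Check = Tuple[str, Callable[[], bool]]


def run_gate(checks: List[Check]) -> bool:
    """Run pre-merge checks in order; stop at the first failure."""
    for name, check in checks:
        if not check():
            print(f"FAILED: {name} -- send the PR back to the developer")
            return False
        print(f"passed: {name}")
    print("All checks passed -- move the ticket to 'Dev complete'")
    return True


# Placeholder checks; in practice each would invoke a real test suite
# or wait on reviewer approval.
checks: List[Check] = [
    ("unit tests", lambda: True),
    ("security scan", lambda: True),
    ("accessibility audit", lambda: True),
    ("performance budget", lambda: True),
    ("code review approved", lambda: True),
]

if __name__ == "__main__":
    run_gate(checks)
```

Stopping at the first failure mirrors Step 4: a rejected review short-circuits the process and returns the PR to the developer rather than letting later steps proceed.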