Poorly Tested App Crashes the Iowa Caucus

We may not know who won the Iowa caucus – but we certainly know that a poorly tested smartphone app lost.

This Monday, the Iowa caucus was nearly rendered meaningless by the Democratic Party’s inability to reliably tally and report the results. While there have been reports that the overall caucus process and its management were deficient, the smartphone app used to report the results failed miserably.

Here is my review of the incident purely from a software testing perspective. There are many valuable lessons to be learned from it.

Do not introduce new technology at a critical time. With the entire nation watching this first vote of the 2020 Presidential election, using a brand-new app was the wrong decision. It is also unclear how much time and effort the software maker or the Democratic Party put into training the users. It is prudent and practical to try new technologies first in a smaller, limited environment to ensure both the quality of the application and user adoption.

Installation failures. It is reported that many caucus chairs were still trying to install the app on caucus day or the previous night. Software teams should ensure that users can successfully install the app on their devices well before the time of use. This can be done with timely email reminders, detailed instructions, and usage analytics collected from the app itself. Proper installation testing ensures that the app can be installed successfully across a variety of devices, models, carriers, and operating systems, and it should cover clean installs as well as reinstalls and updates.
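As a rough sketch of what automated installation testing can look like on Android, the script below exercises both a clean install and a reinstall on every device attached to the test machine. The APK name and package id are hypothetical, and it assumes adb is installed and devices (or emulators) are connected; a real test lab would run this across many models and OS versions.

```python
import subprocess

APK_PATH = "caucus_reporter.apk"          # hypothetical APK file name
PACKAGE = "com.example.caucusreporter"    # hypothetical application package id

def connected_devices():
    """Return the serial numbers of all Android devices visible to adb."""
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True).stdout
    return [line.split()[0] for line in out.splitlines()[1:]
            if line.strip().endswith("device")]

def adb(serial, *args):
    """Run an adb command against one device and fail loudly on errors."""
    result = subprocess.run(["adb", "-s", serial, *args],
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"{serial}: adb {' '.join(args)} failed: {result.stderr.strip()}")
    return result.stdout

for serial in connected_devices():
    # Clean install: remove any existing copy, then install from scratch.
    subprocess.run(["adb", "-s", serial, "uninstall", PACKAGE], capture_output=True)
    adb(serial, "install", APK_PATH)
    # Reinstall/update path: -r keeps the app's data, mimicking an upgrade.
    adb(serial, "install", "-r", APK_PATH)
    print(f"{serial}: clean install and reinstall succeeded")
```

Running a matrix job like this nightly, across the device models the caucus chairs were actually likely to own, would have surfaced installation failures long before caucus day.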

Too much security is as bad as too little security. I can understand the need to make the app attack-resistant, especially given the history of external interference. But we cannot pile on so many layers of security that frustrated users abandon the app altogether.

In this case, many caucus chairs were frustrated and felt that the series of PINs and layers of security absolutely mucked it up. Unable to use the app, they ended up calling the hotline instead. A good QA team performing usability testing can often find such issues early and work to improve ease of use.

White-box, unit, and integration testing are essential. The CEO of the company that made the app said that the “problem was caused by a bug in the code that transmits results data.” QA teams should focus their attention on the interactions at the boundaries between systems, which is usually done through robust integration testing backed by proper unit testing. Systems should also be tested for typical end-of-day operations, such as this transmission of results.
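We do not know what the app’s actual transmission code looked like, but a minimal sketch of boundary-focused unit testing might resemble the following. The `transmit_results` function, the endpoint URL, and the payload shape are all hypothetical; the point is that the data crossing the boundary, and the failure path, are both checked explicitly.

```python
import json
import unittest
from unittest.mock import patch
from urllib import request

RESULTS_URL = "https://example.org/api/results"  # hypothetical results endpoint

def transmit_results(precinct_id, tallies):
    """Hypothetical end-of-night transmission: POST precinct tallies as JSON."""
    payload = json.dumps({"precinct": precinct_id, "tallies": tallies}).encode("utf-8")
    req = request.Request(RESULTS_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=10) as resp:
        return resp.status == 200

class TransmitResultsTest(unittest.TestCase):
    @patch("__main__.request.urlopen")
    def test_payload_crosses_the_boundary_intact(self, mock_urlopen):
        """The JSON handed to the transport must match what the caller supplied."""
        mock_urlopen.return_value.__enter__.return_value.status = 200
        self.assertTrue(transmit_results("IA-042", {"candidate_a": 57, "candidate_b": 43}))
        sent = json.loads(mock_urlopen.call_args[0][0].data.decode("utf-8"))
        self.assertEqual(sent["precinct"], "IA-042")
        self.assertEqual(sent["tallies"]["candidate_a"], 57)

    @patch("__main__.request.urlopen", side_effect=OSError("network unreachable"))
    def test_transport_failure_is_not_swallowed(self, _mock):
        """A failed transmission must surface as an error, not a silent success."""
        with self.assertRaises(OSError):
            transmit_results("IA-042", {"candidate_a": 1})

if __name__ == "__main__":
    unittest.main()
```

Tests like these run in seconds on every build, which is exactly where a “bug in the code that transmits results data” should have been caught.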

Test for extreme conditions. QA teams should test mobile apps under extreme conditions such as low bandwidth, congested network traffic, intermittent connectivity, and high user loads. Teams often use tools that emulate low bandwidth and high latency when testing such applications.
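As a sketch of that idea, and assuming the client needs retry logic around its transmission call, a deterministic test double can emulate high latency and dropped connections. `FlakyTransport` and `transmit_with_retry` below are illustrative names, not part of the real app; dedicated network-shaping tools would push the same conditions onto real devices.

```python
import time

class FlakyTransport:
    """Test double emulating extreme conditions: every call is slow, and the
    first few calls drop entirely (intermittent connectivity)."""
    def __init__(self, drops_before_success=3, delay_s=0.5):
        self.remaining_drops = drops_before_success
        self.delay_s = delay_s

    def send(self, payload):
        time.sleep(self.delay_s)              # emulate high latency / low bandwidth
        if self.remaining_drops > 0:
            self.remaining_drops -= 1
            raise TimeoutError("simulated dropped connection")
        return True

def transmit_with_retry(payload, send, attempts=5, backoff_s=0.5):
    """Retry wrapper a client would need to survive congested networks."""
    for attempt in range(1, attempts + 1):
        try:
            return send(payload)
        except TimeoutError:
            if attempt == attempts:
                raise
            time.sleep(backoff_s * attempt)   # back off before trying again

if __name__ == "__main__":
    transport = FlakyTransport(drops_before_success=3)
    assert transmit_with_retry({"precinct": "IA-042"}, transport.send)
    print("transmission survived 3 simulated drops and high latency")
```

The same kind of test, scaled up with load-generation tools, would also reveal how the reporting backend behaves when hundreds of precincts transmit results in the same hour.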

In summary, using an experienced and skilled QA team to validate your applications on real devices under real-life conditions is essential. Trigent offers a full suite of testing services that covers both mobile and cloud applications. We provide testing services that match the speed of innovation.

Author

  • Chella Palaniappan

I help clients with their new technology initiatives and with their software development and QA needs across a variety of technologies, including Microsoft, Java, open source, cloud, and mobile. I often write about SharePoint and related technologies.