
Performance testing survival tips for #BlackFriday
Ferdinand Nell
Lead Consultant, Test Automation / Performance Testing - DVT Global Testing Solutions, DVT


Is your website or mobile app prepared for the high volumes of traffic expected on Black Friday (and Cyber Monday)? Be ready with these performance testing tips.




The official date for Black Friday 2018 is 23 November, the day after Thanksgiving in North America, and the event now takes place in countries all around the world, including South Africa. While it may seem months away, this favourite annual online shopping day causes many sites and apps to crash, leaving consumers disappointed and social media buzzing.


Too many companies do not prepare for the high volume of traffic that will hit their website on the day. The retailer Takealot, for example, told MyBroadband that it saw a record number of visits to its site on Black Friday in 2017, with over 2.2 million hits.


However, in November 2017 a handful of South Africa's largest e-commerce websites crashed or behaved erratically on #BlackFriday. Websites and apps went down temporarily under the overwhelming volumes, and headlines like 'The glitch that stole Christmas: Black Friday crashes online stores in SA' hit the news.


There are specific steps that businesses can take to ensure that sales run smoothly on Black Friday and to avoid reputational damage to the brand. Performance testing your website and app is essential to prevent bottlenecks and to identify application issues that may only surface under extreme conditions.


Performance testing lets you validate that your application has the capacity to perform under various loads and conditions. The following performance testing practices can help you avoid brand embarrassment or loss of revenue:


1. Obtain a Baseline, then Benchmark and Repeat
  • Identify Business as Usual processes and typical user load over a period of time.
  • Develop a suite of performance tests to simulate that and obtain a baseline.
  • Execute those tests whenever changes are made to the underlying system and compare your results with that of your baseline.
  • Understand how your system reacts to load and what affects performance.
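The baseline-and-compare loop above can be sketched in a few lines. This is an illustrative Python check, not a prescription: the sample data, the choice of the 95th percentile, and the 10% tolerance are all assumptions. It flags a regression when a new run's 95th-percentile response time is more than 10% worse than the baseline's.

```python
def percentile(samples, pct):
    """Return the pct-th percentile of a list of response times (ms)."""
    ordered = sorted(samples)
    index = min(len(ordered) - 1, round(pct / 100 * (len(ordered) - 1)))
    return ordered[index]

def regressed(baseline_ms, candidate_ms, pct=95, tolerance=0.10):
    """True if the candidate run's percentile worsened by more than tolerance."""
    base = percentile(baseline_ms, pct)
    cand = percentile(candidate_ms, pct)
    return cand > base * (1 + tolerance)

# Baseline from the BAU suite vs. a run after a code change (illustrative data).
baseline = [120, 130, 125, 140, 135, 128, 150, 145, 132, 138]
after_change = [160, 175, 170, 180, 165, 172, 190, 185, 168, 178]
print(regressed(baseline, after_change))  # flags a clear slowdown
```

Wiring a check like this into your build pipeline is what makes the "repeat" part of the step cheap: every change gets compared against the baseline automatically.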
2. Find the Ceiling – Stress Testing

Use your BAU performance suite to drive ever-increasing load against your system to determine the point at which:


  • Functional errors appear.
  • Performance degrades to unacceptable levels.
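The ramp-until-failure idea can be sketched as below. The `run_load` function here is a stand-in model with a hidden capacity of 400 users; in a real stress test those error and latency numbers come from your load-testing tool, and the thresholds are whatever your business deems acceptable.

```python
def run_load(users):
    """Stand-in for a real load-test run: returns (error_rate, p95_ms).
    This toy model assumes latency and errors climb once users exceed
    a hidden capacity of 400; real numbers come from your load tool."""
    capacity = 400
    p95 = 150 + max(0, users - capacity) * 2.5
    error_rate = 0.0 if users <= capacity else min(1.0, (users - capacity) / 1000)
    return error_rate, p95

def find_ceiling(max_error=0.01, max_p95_ms=500, step=50, limit=2000):
    """Increase load until functional errors appear or latency degrades."""
    users = step
    while users <= limit:
        error_rate, p95 = run_load(users)
        if error_rate > max_error or p95 > max_p95_ms:
            return users - step  # last load level that passed
        users += step
    return limit

print(find_ceiling())  # the model's ceiling: 400 users
```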
3. System Endurance – Soak Testing

Determine the behaviour of your system during prolonged use; garbage collection, SQL cursors, storage space and the like tend to reveal problems only over time.
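One way to spot the slow degradation a soak test looks for is to fit a trend line to resource samples taken over the run. A minimal sketch, assuming hourly memory readings and an illustrative leak threshold:

```python
def leak_suspected(memory_mb, threshold_mb_per_hour=10):
    """Fit a straight line to hourly memory samples; a sustained upward
    slope above the threshold suggests a leak that only shows over time."""
    n = len(memory_mb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(memory_mb) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, memory_mb)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope > threshold_mb_per_hour

# Hourly heap samples over a 12-hour soak (illustrative): slow, steady growth.
samples = [512, 530, 555, 570, 590, 610, 640, 660, 680, 700, 730, 750]
print(leak_suspected(samples))
```

The same trend check applies equally to open cursors, file handles or disk usage; the point of the soak is that a flat-looking metric over ten minutes can be an unmistakable ramp over twelve hours.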


4. Brace for Impact – Risk Mitigation

Determine the stress points of the system and identify points of failure.


Plan for failover and mitigate risk around those areas by:


  • Increasing hardware capacity.
  • Optimising system configuration and load balancing strategies.
  • Enhancing solution and architectural design.
5. Verify Your Solution

Rerun your performance tests to the point where they failed previously and confirm that your solutions have been effective.


6. Extrapolation and the Production Question

Extrapolation is a useful tool for determining the theoretical performance capability of a particular environment by scaling the results obtained from another.


Let’s assume performance baselines were obtained on the QA environment. If, for argument's sake, one has determined that, on paper, Production has five times the effective server capacity of QA, then one can expect Production to handle up to five times the concurrent user load that was generated on QA, give or take 15%.


Naturally, real world results may differ, but this provides a starting point from which further verification can take place.
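The extrapolation above is simple arithmetic and can be captured in a small helper. The figures below are illustrative: QA sustaining 1,000 concurrent users and a Production environment with five times QA's effective capacity, with the article's give-or-take 15% applied as a range.

```python
def extrapolate_capacity(measured_users, capacity_ratio, margin=0.15):
    """Scale a QA result to Production and return a (low, high) range.
    capacity_ratio is Production's effective server capacity relative
    to QA; margin is the give-or-take band around the estimate."""
    estimate = measured_users * capacity_ratio
    return estimate * (1 - margin), estimate * (1 + margin)

# QA sustained 1,000 concurrent users; Production has 5x the capacity on paper.
low, high = extrapolate_capacity(1000, 5)
print(f"Expect Production to handle roughly {low:.0f} to {high:.0f} users")
```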


If you still need convincing that performance testing is critical to your e-commerce site, here are a few stats from last year's Black Friday sales to make you think:


  • Amazon’s Black Friday 2017 sales hit a record high of $2.4B across its three major sites in the US, UK and Germany.
  • In a report released by Adobe, Cyber Monday hit a new record of $6.59 billion in sales, making it the largest U.S. online shopping day ever.
  • Online sales volume on the Black Friday shopping day grew 24% year-over-year, according to data from Salesforce’s retail intelligence unit.
  • According to Salesforce 42% of Black Friday orders were placed on a smartphone, and only 49% on a desktop or laptop computer. That marks the first year that computers generated less than half of all online orders.

What is apparent is that Black Friday is now a global e-commerce phenomenon and a day of massive potential for online retailers. Can you afford to miss out and have your brand name shamed? If your answer is no, then look at consulting performance testing experts so that your business can reap the benefits.


Some typical performance testing terminology and definitions:
Response time

The amount of time the website or app takes to respond to a user's request. For example, if someone searches for a product on your app, they shouldn't have to wait more than a few seconds before the product appears.


Throughput

While it depends on the load, the throughput refers to the number of transactions and data that your Application Under Test (AUT) can process within a specified period.


Have a ballpark figure beforehand that your website or application should be able to process such as 300 transactions per second, for example.
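Checking a run against such a ballpark target is straightforward. A minimal sketch, assuming an illustrative 10-minute run measured against the 300 transactions-per-second figure mentioned above:

```python
def meets_throughput_target(transactions, duration_s, target_tps=300):
    """Compare measured transactions per second against a ballpark target."""
    tps = transactions / duration_s
    return tps, tps >= target_tps

# 162,000 transactions over a 10-minute (600 s) run: 270 tps, short of 300.
print(meets_throughput_target(162000, 600))
```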



Resource utilisation

The rate at which an application’s underlying architecture utilises resources such as memory, disk input-output, and CPU. Server resources, such as network I/O, processor and memory utilisation shouldn’t be running at maximum capacity. Aim for resource utilisation to be below 85% and leave headroom for spikes in activity.
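A simple post-run check against that 85% guideline might look like this; the resource names and readings are illustrative, and in practice would come from your monitoring stack.

```python
def headroom_ok(utilisation, ceiling=0.85):
    """Flag each monitored resource as True if it stays at or below the
    target utilisation, leaving headroom for spikes in activity."""
    return {name: value <= ceiling for name, value in utilisation.items()}

# Peak utilisation observed during the test run (illustrative readings).
observed = {"cpu": 0.72, "memory": 0.81, "disk_io": 0.55, "network_io": 0.91}
print(headroom_ok(observed))  # network I/O exceeds the 85% guideline
```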


Maximum User Load

The number of concurrent users, transactions or processes that a website or application can handle simultaneously. For example, depending on your user footprint, your site or app might need to process 5,000 tasks simultaneously. This can be simulated through performance testing.
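A worker pool gives a rough feel for how concurrent load is simulated, though a real test would drive requests at the actual site or app through a load-testing tool. A minimal, self-contained sketch:

```python
from concurrent.futures import ThreadPoolExecutor

def handle_request(task_id):
    """Stand-in for one user transaction; a real test would call the
    site or app under test here."""
    return task_id * 2  # trivial work so every task completes

def simulate_concurrent_load(tasks, workers=50):
    """Fire many tasks through a worker pool and count completions."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, range(tasks)))
    return len(results)

print(simulate_concurrent_load(5000))  # all 5,000 simulated tasks completed
```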


Behaviour under Stress Conditions

This process tests whether the website or application can still interact with its dependent systems while experiencing high load. Stress testing, spike testing and peak testing are performance testing approaches that aim to replicate these conditions.


Article resources:

Takealot reveals Black Friday revenue, BusinessTech, 27 November 2017, https://businesstech.co.za/news/internet/213351/takealot-reveals-black-friday-revenue/


Black Friday Online Sales Surge to New Records With Mobile Pushing Fastest, Fortune.com, 26 November 2017, http://fortune.com/2017/11/26/black-friday-online-2017-sales-record/


Black Friday Now Global E-Commerce Phenomenon, Market Beyond, 29 November 2017, https://themarketbeyond.com/black-friday-now-global-e-commerce-phenomenon/


Cyber Monday Hits New Record At $6.6 Billion, Forbes.com, 28 November 2017, https://www.forbes.com/sites/jeanbaptiste/2017/11/28/report-cyber-monday-hits-new-record-at-6-6-billion-over-1-billion-more-than-2016/#629b2a233662


About DVT

DVT is a software development and testing company that focuses on digital transformation technology solutions for clients globally. Its services include custom software development for mobile, Web and traditional platforms, software quality assurance, automated regression testing, UX/UI design, cloud application services, BI and data analytics solutions, project management, business analysis, DevOps and agile training and consulting. Founded in 1999, DVT has grown to over 700 staff with offices in the UK (London) and South Africa (Johannesburg, Centurion, Cape Town and Durban). DVT is a company within the software and technology group Dynamic Technologies. www.dvt.co.za.


Editorial contacts:

Ferdinand Nell
Lead Consultant, Test Automation / Performance Testing
DVT Global Testing Solutions
DVT
+27 82 092 5260
fnell@dvt.co.za


Karen Heydenrych
Communications manager
DVT
+27 83 302 9494
kheydenrych@jhb.dvt.co.za