
Keynote – Hybrid Automation (API) feat. YourTestCloud.com Testing in the Hybrid Cloud & Beyond

2014 May 7
by jonathon.wright

Hybrid Automation (API) feat. YourTestCloud.com Testing in the Hybrid Cloud & Beyond

A much-overlooked and critical element of the modern-day cloud ecosystem is the interlocking web of dependencies, often hidden from and unknown to its varied clients and customers. For example, Google location services and the Amazon eCommerce and fulfilment infrastructures are present in a myriad of services, often at a level far removed from the core functionality.

Modern cloud-aware Service-Oriented Architectures (SOA) are uniquely vulnerable to interlocking failure modes caused by simple outages of services not perceived to be core, or even known to be present. The concept of private cloud computing has added an extra level of complexity to the Solution Under Test (SUT) ecosystem without solving any of the existing infrastructure challenges. Even the normal operation of an interlocking web of services can be perceived by an end user or client as an error.

The full value of both Messaging and API testing of complex ecosystems can be unlocked by utilising the power of Test Automation as a Service (TAaaS) platforms. These can be used to support testing of any ecosystem that has multiple hybrid cloud endpoints spanning numerous Platform(s) / Infrastructure(s) as a Service (PaaS / IaaS) providers.

 

Unicom – ALM – Agile Portfolio Management in the Cloud

2014 May 7
by jonathon.wright

Agile Portfolio Management in the Cloud

- Enterprise ‘Agile Portfolio Management’ in the hybrid cloud (public / private) platform;
- Introducing ‘Social enterprise’ platforms within a global technology landscape;
- Implementation of enterprise collaboration tools using industry standards – an introduction to BPMNv2.2 (Business Process Modelling Notation) and xPDL (eXtensible Process Definition Language);
- Effectively communicating to all business echelons utilising ‘Funnel Virtualisation’ leveraging BPM;
- Workflow management/solutions using XAML (Extensible Application Markup Language);
- Challenges faced with introducing ‘Agile Portfolio Management’ across multi-platforms / BYOD / mobile devices.

 

Unicom – BPM – Agile Portfolio Management in the Cloud from Automation Development Services

BCS – SIGIST – Closing Keynote – Test Automation in the Hybrid Cloud

2013 December 5
by jonathon.wright

Test Automation in the Hybrid Cloud

What is the future of test automation? The possibilities associated with cloud computing provide instant scalability, flexibility and availability for testing on demand with no upfront investment. This provides the industry with a perfect opportunity to utilise powerful high volume automated testing solutions.
The global testing cloud marketplace will allow for the joint collaboration of leading test specialists following industry best practice. This enables firms of all sizes to access the latest test approaches and methodologies, whilst providing a unified platform for domain experts to represent business processes and user story acceptance criteria in natural language with context-sensitive business validation.

 

BCS – SIGIST – Test Automation in the Hybrid Cloud from Automation Development Services

Agile-Portfolio.com – Launch

2013 August 9
by jonathon.wright

AgilePortfolioManagement

EUROSTAR 2013 – Webinar

2013 July 23
by jonathon.wright

Test Automation in the Cloud

What is the future of test automation? The possibilities associated with cloud computing provide instant scalability, flexibility and availability for testing on demand with no upfront investment. This provides the industry with a perfect opportunity to utilise powerful high volume automated testing solutions.
The global testing cloud marketplace will allow for the joint collaboration of leading test specialists following industry best practice. This enables firms of all sizes to access the latest test approaches and methodologies, whilst providing a unified platform for domain experts to represent business processes and user story acceptance criteria in natural language with context-sensitive business validation.

 

 

EuroSTAR – Test Automation in the Cloud – Webinar from Automation Development Services

How to become ‘Performance Testing As A Service (PTaaS)’ Ready

2013 June 25
by jonathon.wright

Following on from my previous blog post on “How to become ‘Automation Ready’”, I thought it would be nice to apply the same principles, but this time to ‘Performance Testing As A Service’ (PTaaS).

“Cloud-based ‘Infrastructure As A Service’ (IaaS) Feature-Driven ‘Acceptance Test Driven Development’ (ATDD) through ‘Performance by Example’ using an ‘Organisational-wide Agile’ (OA) approach in a ‘Continuous Integration, Build & Delivery’ (CIBD) environment via ‘High Volume Mobile Automated Testing’ (HVMAT)”

In layman’s terms: executing performance testing in the cloud, directly from your mobile, spinning up thousands of real end devices (both physical mobile devices and virtual clients) to test against real applications or websites.

IMG_2660

A quote from one of my favourite authors inspired me to write this blog, as I’m currently doing this for a global retail client in the States, which required performance testing featuring thousands of real end mobile client devices (a combination of iPads and Android tablets) connecting to the internet (via 3G/4G or local store WiFi).

“The best time for planning a book is while you’re doing the dishes” – Agatha Christie

Now, with only a single sprint in which to complete the non-functional testing, it was time to practise what I preach:

Phases

First, a bit of background on non-functional testing and how to define the perfect ‘Non-Functional Requirement’ (NFR), for which I defer to a good friend and colleague, Stevan Zivanovic of BTTB, who provides an excellent “Non-Functional Requirement Cube” (NFRc):

NFR

Even if you are no stranger to ‘Performance Testing’, here is the list of acronyms used within this blog:

Acronym Definition
PTaaS Performance Testing As A Service
PTM Performance Test Management
PTE Performance Test Environment
SLM Solution Lifecycle Management
ALM Application Lifecycle Management
SUT System Under Test
EUT Environment Under Test
NFR Non-Functional Requirement
NFRc Non-Functional Requirement Cube
PTS Performance Test Session
POS Performance Optimization Sessions
PTP Performance Test Profile
PUP Performance User Profiles
PIP Performance Interface Profiles
PBP Performance Background Profiles
BPM Business Process Modelling
BPS Performance Test Scenario = Business Process Scenario
BPT Business Process Test
TA.db Test Asset db
TA.c Test Asset cube
TA.j Test Asset journals
DLT Distributed Load Testing
PPT Protocol-level Performance Testing
HVAT High Volume Automated Testing
HVMAT High Volume Mobile Automated Testing
TFS Team Foundation Server / Service
IaaS Infrastructure As A Service
NAS Network-Attached Storage
CIBD Continuous Integration, Build & Delivery
DDA Dynamic Data Adapters
AI Actionable Insight
OA Organisational-wide Agile
ATDD Acceptance Test Driven Development

Some of the content in this blog was presented back in 2011 at the ‘British Computer Society’ (BCS) ‘Special Interest Group in Software Testing’ (SIGiST) conference (Conference Slides):

So the question again: how do you become ‘Performance Ready’ in 4 easy(ish) steps?

Step 1 – Select the ‘Performance Test Scenarios’ (PTS) that will make up your ‘Performance Test Profile’ (PTP). This can be done in a number of different ways:

PerformanceTestSession

Option A – Convert existing ‘Business Process Scenarios’ (BPS), which represent paths through the ‘Business Process Model’ (BPM), into ‘Performance Test Scenarios’ (PTS) by re-executing the current automation solution and encapsulating ‘Test Asset journals’ (TA.j), or directly from either the ‘Test Asset cube’ (TA.c) or a ‘Test Asset db’ (TA.db). This source can then be exported into a generic ‘WebTest’-compatible format (a feature supported by Hyper-Test.com):

BusinessProcessModel

Option B – Manually generate ‘Performance Test Scenarios’ (PTS) using a tool such as ‘Fiddler’ that can capture all the HTTP traffic:

Fiddler

TIP: Once a completed ‘Performance Test Scenario’ (PTS) has been captured in ‘Fiddler’, the session can be exported to a ‘WebTest’-compatible format:

Export

Option C – Manually generate ‘Performance Test Scenarios’ (PTS) using a Load Testing tool such as ‘Microsoft Load Testing’ (VS2010.2/VS2012.3) or ‘LoadRunner’ (v11.5):

PTaaaS_SimpleWebTest
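Whichever option you choose, the captured scenario boils down to the same thing: an ordered list of HTTP steps with recorded think times, replayed per virtual user. A minimal sketch of that idea (the URLs and the stub transport are illustrative, not a real WebTest runner):

```python
# Minimal sketch of a recorded 'Performance Test Scenario' (PTS):
# an ordered list of HTTP steps with think times, replayed per virtual user.
from dataclasses import dataclass

@dataclass
class Step:
    method: str
    url: str
    think_time_s: float = 0.0  # pause after the response, as recorded

# A hypothetical 'forgot password' journey captured from HTTP traffic
SCENARIO = [
    Step("GET",  "https://shop.example.com/login"),
    Step("POST", "https://shop.example.com/forgot-password", think_time_s=2.0),
    Step("GET",  "https://shop.example.com/login", think_time_s=1.0),
]

def replay(scenario, send):
    """Replay each recorded step through send(method, url) and
    return the status codes, in order."""
    return [send(step.method, step.url) for step in scenario]

# Stub transport for illustration; a real runner would issue HTTP requests
# and honour each step's think time between requests.
statuses = replay(SCENARIO, lambda method, url: 200)
print(statuses)  # [200, 200, 200]
```

A real load tool adds pacing, correlation of dynamic values and assertions on responses; this sketch only shows the shape of the recorded asset.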

Step 2 – Select the ‘Performance Test Profile’ (PTP), or test transaction mix, from the ‘Performance Test Scenarios’ (PTS) previously generated:

PerformanceTestProfile

TIP: The ‘Performance Test Profile’ (PTP) test transaction pacing model can be defined either by taking real figures from monitoring a snapshot of the live ‘Solution Under Test’ (SUT) or by defining the volumetric calculations manually:

Volumetics
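The manual volumetric calculation is simple arithmetic: divide the target transaction rate across the virtual users and convert to a per-user pacing interval. A back-of-the-envelope sketch (the example figures are assumptions, not from the client engagement above):

```python
# Back-of-the-envelope volumetrics: given a target transaction rate and a
# virtual-user count, derive the pacing interval each user must observe.
def pacing_seconds(target_tph: float, virtual_users: int) -> float:
    """Seconds between iteration starts per virtual user, so that the
    profile as a whole achieves target_tph transactions per hour."""
    per_user_tph = target_tph / virtual_users
    return 3600.0 / per_user_tph

# e.g. 18,000 transactions/hour spread over 100 virtual users
print(pacing_seconds(18_000, 100))  # 20.0 -> one iteration every 20 s per user
```

Pacing derived this way keeps the transaction rate constant even if response times vary, which is usually what the NFR actually specifies.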

Step 3 – Select the ‘Performance Test Session’ (PTS) by selecting one or more ‘Performance Test Profiles’ (PTP), such as ‘Performance User Profiles’ (PUP), ‘Performance Interface Profiles’ (PIP) such as ‘Batch’ or ‘WebServices’, and ‘Performance Background Profiles’ (PBP) such as ‘Ambient’ traffic:

NFR

The ‘Performance Test Session’ (PTS) also applies a session pattern, such as ‘Load/Stress/Soak’, combined with any goals, such as ‘Benchmark/Month-End/Peak’:

TIP: This approach provides full traceability of the ‘Non-Functional Requirements’ (NFR), otherwise referred to as the ‘Performance Test Reference’ (PTR):

Picture1

This approach supports ‘Acceptance Test Driven Development’ (ATDD) through abstraction: from the ‘Business Level Questions’ (User Stories) asked by the business, right down to the individual ‘Business Process Tests’ (BPT) or ‘Business Test Transactions’ (BTT) that provide the ‘Business Level Answers’ (Acceptance Criteria), all in a language that the business can understand and can use as ‘Actionable Insight’ (AI) to make informed business decisions:

BusinessProcessTransactions

Step 4 – Configure the ‘Performance Test Environment’ (PTE) ready to execute against the ‘Solution Under Test’ (SUT) or ‘Environment Under Test’ (EUT) as part of the ‘Performance Test Management’ (PTM) solution.

For this example, the ‘Performance Test Environment’ (PTE) will be in the Azure cloud:

IC590645

This is executed against a Multi-Tier ‘Solution Under Test’ (SUT), again hosted in the Azure cloud:

IC553228

The ‘Performance Test Management’ (PTM) solution will be ‘Microsoft Test Manager’, as part of the ‘Team Foundation Service’ in the Azure cloud:

IC406358

The ‘Performance Test Management’ (PTM) solution allows access to ‘Microsoft Test Manager’ via both Web and Mobile:

PTaaS_25-06-2013 10-08-23

The ‘Performance Test Management’ (PTM) solution also supports legacy access via the ‘Microsoft Test Manager’ Desktop Client, which requires ‘Visual Studio 2012.3 Premium’ or above:

PTaaS_25-06-2013 10-08-28

The ‘Performance Test Environment’ (PTE) requires access to ‘Test Controllers’ and ‘Test Agents’ to execute against a Multi-Tier ‘Solution Under Test’ (SUT):

PTaaaS_ConnectDB

The ‘Test Controller’ needs to be configured for ‘Load Testing’ and de-registered from any ‘Team Project Collection’:

PTaaaS_SQL

To support ‘High Volume Automated Testing’ (HVAT) for distributed testing, we need to provision for the required number of virtual users:

Infrastructure as a Service – Test Agent – Instance Resources:

Configuration | Component | CPU | HD | Memory
< 1000 virtual users | Test agent | 2.6 GHz | 10 GB | 2 GB
< 2000 virtual users | Test agent | Dual-processor 2.6 GHz | 10 GB | 2 GB
N x 2000 virtual users | Test agent | Scale out to N agents, each dual-processor 2.6 GHz | 10 GB | 2 GB
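The sizing table above suggests a simple scale-out rule: one dual-processor agent per ~2,000 virtual users. A sketch of that calculation (the 2,000-per-agent capacity is taken from the table, not a universal constant):

```python
import math

def agents_needed(virtual_users: int, capacity_per_agent: int = 2000) -> int:
    """Number of dual-processor test agents to scale out to, using the
    ~2,000 virtual users per agent guideline from the sizing table."""
    return max(1, math.ceil(virtual_users / capacity_per_agent))

print(agents_needed(500))   # 1
print(agents_needed(7500))  # 4
```

In practice, CPU-heavy scripts or detailed logging lower the per-agent capacity, so treat the guideline as an upper bound and verify with a calibration run.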

The ‘Test Controller’ also requires additional resources to control the required number of virtual users:

Infrastructure as a Service – Test Controller – Instance Resources:

Component | Test Controller Application Tier | Test Controller Data Tier | Test Controller Application/Data Tier
CPU | Min: 1 GHz, Rec: 2 GHz | Min: 1 GHz, Rec: 2 GHz | Min: 1 GHz, Rec: 2 GHz
Disk – System | Min: 1 GB, Rec: 1 GB | Min: 1 GB, Rec: 1 GB | Min: 1 GB, Rec: 1 GB
Disk – Install | Min: 1 GB, Rec: 48 GB | Min: 8 GB, Rec: 48 GB | Min: 8 GB, Rec: 48 GB
Memory | Min: 1 GB, Rec: 1 GB | Min: 1 GB, Rec: 1 GB | Min: 1 GB, Rec: 1 GB

The required resources also depend on the level of monitoring detail needed for the ‘Environment Under Test’ (EUT). In this case, that means the collection of ‘Dynamic Data Adapters’ (DDA), such as traditional performance counters (‘perfmon’), across a Multi-Tier ‘Solution Under Test’ (SUT) that may have a number of Application / Database Servers:

Component | Test agent | Test controller application tier | Test controller data tier | Test controller AT/DT
CPU | Depending on the test, the CPU is frequently the limiting factor. | Not heavily used. | Not heavily used. | Not heavily used.
Disk | Heavily used when detailed logging is enabled in your load tests. | Not heavily used. | 10 GB space required for 24 hours of test data. | 10 GB space required for 24 hours of test data.
Memory | Depending on the test, memory might be the limiting factor. | Not heavily used. | Heavily used by SQL. | Heavily used by SQL.

Depending on the amount of persistent storage available in the cloud, you may want to detach the ‘Load Test’ DB and locate it on cloud-based ‘Network-Attached Storage’ (NAS):

PTaaaS_SQLTest

To provide support for ‘Continuous Integration, Build & Delivery’ (CIBD), you need to create a Test Setting for ‘Distributed Load Testing’ (DLT) that contains the roles (Dynamic Data Adapters) and deployment (Build Process Template):

LabManagement

A complex ‘Performance Test Session’ (PTS) containing a number of different ‘Performance Test Profiles’ (PTP), such as ‘User’, ‘Traffic’ or ‘Ambient’, can be executed by a single command (for example):

Microsoft Load Testing ‘Protocol-level Performance Testing’ (PPT):

mstest /TestContainer:PerformanceTestProfile_Soak.loadtest /testsettings:Remote.Testsettings /resultsfile:D:\results\MyResults.trx

or GUI-based cross-browser (e.g. IE, Safari, Chrome, Firefox) ‘High Volume Automated Testing’ (HVAT):

ArtOfTest.Runner.exe list="MasterDriver.aiilist"

or GUI-based cross-platform (e.g. WP8, Android, iOS, BlackBerry) ‘High Volume Mobile Automated Testing’ (HVMAT):

mstest /testlist:MasterDriver /testmetadata:"TestProject.vsmdi" /testsettings:"MobileDevices.testsettings"
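For a CIBD build step, these one-liners are typically assembled programmatically rather than typed by hand. A hypothetical helper that builds the MSTest load-test command line (the file names are the examples from above; a real build step would hand the result to a process runner):

```python
# Build the MSTest command line for a load-test run, so a CIBD build step
# can launch it and archive the resulting .trx file.
def mstest_load_command(container: str, settings: str, results: str):
    return [
        "mstest",
        f"/TestContainer:{container}",
        f"/testsettings:{settings}",
        f"/resultsfile:{results}",
    ]

cmd = mstest_load_command(
    "PerformanceTestProfile_Soak.loadtest",
    "Remote.Testsettings",
    r"D:\results\MyResults.trx",
)
print(" ".join(cmd))
# A build step would execute this with subprocess.run(cmd, check=True).
```

Keeping the arguments as a list (rather than one string) avoids shell-quoting problems when paths contain spaces.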

tfs-build-server-setup

Configure the ‘Performance Test Environment’ (PTE) to run load tests in the cloud (this requires the VS2013 preview edition and access to the early adoption program):

IC665274

NOTE: The early adoption program is limited to 15,000 virtual user minutes:

IC666801

Execution is monitored in the normal way; the only difference is the introduction of the Load Test Manager:

IC667982

The Load Test Manager provides access to view past load test runs or currently running load tests:

IC665293

This information can also be accessed via ‘Team Foundation Service’ in the cloud:

IC665294

The ‘Performance Test Environment’ (PTE) is now ready to be managed by the ‘Performance Test Management’ (PTM) cloud-based solution on any device, anywhere in the world:

IMG_2660

Congratulations! You are now ‘Performance Testing As A Service’ (PTaaS) ready.

How to become ‘Test Automation As A Service (TaaaS)’ Ready

2013 June 14
by jonathon.wright

How to become ‘Test Automation As A Service’ ready in 4 easy steps:

Step 1) Generate a Business Process Model (BPM) using Business Process Modelling Notation v2.2 (BPMNv2.2). This can be done in a number of different ways:

Option A – Manually create the BPM using an open source modelling tool such as BizAgi:

BusinessProcessModel

Option B – Generate the BPM from existing internal documentation sources (such as Visio diagrams or functional specifications), which can be exported directly, or use a tool that automatically generates eXtensible Process Definition Language (xPDL) from sources such as XML/XLSX (a feature supported by Hyper-Test.com):

BPM_ExportVisioBPM

Option C – Generate the BPM from the Solution Under Test (SUT) source code repository; the example below uses DynoForms (XML/XSD):

AllISeeIsXML

NOTE: The advantage of using the direct source is that when it changes it automatically updates the BPM.
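However the BPM is generated, the xPDL interchange format reduces to the same two ingredients: activities and the transitions between them. A sketch of parsing that structure (the fragment below uses simplified element names for illustration; real xPDL has a richer schema and namespaces):

```python
# Hypothetical, simplified xPDL-style fragment and a sketch of turning it
# into a BPM graph (activities + transitions).
import xml.etree.ElementTree as ET

XPDL = """
<Package>
  <Activity Id="A1" Name="Login Page"/>
  <Activity Id="B1" Name="Submit Credentials"/>
  <Activity Id="C1" Name="Dashboard"/>
  <Transition From="A1" To="B1"/>
  <Transition From="B1" To="C1"/>
</Package>
"""

def load_bpm(xpdl: str):
    """Parse activities and transitions out of the (simplified) xPDL."""
    root = ET.fromstring(xpdl)
    activities = {a.get("Id"): a.get("Name") for a in root.iter("Activity")}
    transitions = [(t.get("From"), t.get("To")) for t in root.iter("Transition")]
    return activities, transitions

activities, transitions = load_bpm(XPDL)
print(activities["A1"])  # Login Page
print(transitions)       # [('A1', 'B1'), ('B1', 'C1')]
```

The same parse is what makes the "direct source" advantage possible: regenerate the graph whenever the source changes and the BPM stays current.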

Step 2) Create the Business Process Scenario (BPS) using Workflow (v4) / eXtensible Application Markup Language (XAML). This can be done in a number of ways:

Option A – Manually by overlaying the BPS paths onto the generated BPM:

SysRepublic_BusinessProcessModelling_Core_v20130613D

Example: the ‘forgot password’ path (A1 > D1 > E1 > F1 > G1 > A2 > A1 > B1 > C1) compared to a ‘simple login’ path (A1 > B1 > C1).
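The path notation can be represented directly as data: each scenario is a named sequence of BPM node IDs, and the edges it traverses give a simple coverage measure over the model. A sketch (the node labels follow the grid notation above; the scenario names are illustrative):

```python
# Business Process Scenarios (BPS) as named paths over the nodes of a
# Business Process Model (BPM).
SCENARIOS = {
    "simple login":    ["A1", "B1", "C1"],
    "forgot password": ["A1", "D1", "E1", "F1", "G1", "A2", "A1", "B1", "C1"],
}

def transitions(path):
    """Edge list a scenario exercises - useful for BPM coverage."""
    return list(zip(path, path[1:]))

covered = {edge for path in SCENARIOS.values() for edge in transitions(path)}
print(len(transitions(SCENARIOS["forgot password"])))  # 8 transitions
print(("A1", "B1") in covered)  # True - shared by both scenarios
```

Comparing `covered` against the full edge set of the BPM shows which branches of the model no scenario has yet exercised.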

Option B – Import the BPM (xPDL) into a test management tool such as HP ALM (v10 or above; note the current version is 11.5):

BPM_AltFlow

Option C – Generate the BPS (XAML) by importing the current BPM (xPDL) components, then create the workflow logic (a feature supported by Hyper-Test.com):

ADS_BusinessProcessScenarios_xPDL

Option D – Use the existing business workflow logic defined by the Application Under Test (AUT); this could already be in XAML format and can be edited in VS2012.RC3:

ADS_BusinessProcessScenarios_XAML

NOTE: The advantage again of using the direct source is that when it changes it automatically updates the BPS.

Step 3) Create the Business Process Tests (BPT) using an Automation Solution. This can be done in a number of different ways:

Option A – Use a cloud-based 5th-generation automation solution, such as ‘Test Automation as a Service’, that supports both importing and exporting of BPM (xPDL) and BPS (XAML):

TaaaS_Portal

Option B – Develop an Automation Solution that supports both BPM (xPDL) and BPS (XAML):

Framework

Option C – Use a tool that supports Business Process Testing, such as TaaaS.net, open source ATDD/BDD/TDD solutions (such as SpecFlow/JBehave), or commercial tools (Odin/Ranorex/Telerik):

ADS_BusinessProcessTesting

NOTE: Slides taken from the ‘Test Automation as a Service’ presentation given at STARWest in 2012.

Option D – Migrate your existing Automation solution into the BPM (xPDL) and BPS (XAML) format; this can be done automatically using Hyper-Test.com:

HyperTest

Step 4) Congratulations you are now ‘Automation Ready’.

Use the ‘First Day Automation’ approach, as published in the best-selling ‘Experiences in Test Automation’ book: without having to write a single line of code, just use natural language (Business Level Keywords) with context-sensitive validation (Model-Based), as provided by Test Automation as a Service and executed in a hybrid cloud that is platform, technology, client, browser, version and language agnostic and can run on any Environment Under Test (EUT):

Option A: High Volume Automated Testing (HVAT) in the cloud using Microsoft Lab Management and powered by Azure

LabManagementEnvironment

Test environments are in the Azure Cloud and can be either saved to persistent storage or spun down (PAYU)

TaaaS_AzureServerFarm_Containers

Test execution is managed by Microsoft Test Manager (2012.3/2010.2) through test agents technology:

ControlledMTM

Option B: High Volume Cross-Platform Automated Testing (HVAT) in the cloud using ‘Test Automation as a Service‘ VM dispenser technology:

Combo_GeoBased

Test execution is managed by the ‘TaaaS.net Portal’, which is accessible on any platform or device that supports HTML (v5), Metro (WPF) or Silverlight (v5):

TaaaS

The VM Dispenser technology can spin up a scalable number of test agents in the cloud and deploy the necessary components (app/web server builds and test agents, using either QT/BlueStack/Hyper-V), communicating over HTTP (WCF/WebServices):
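The dispenser idea can be sketched as concurrent provisioning of N agents. This is a toy stand-in, not the TaaaS.net API: the function and agent names are invented, and a real dispenser would create VMs (e.g. via Hyper-V) and install the test agent software on each.

```python
# Toy sketch of the 'VM dispenser' idea: provision N test agents
# concurrently and report when each is ready.
from concurrent.futures import ThreadPoolExecutor

def provision_agent(agent_id: int) -> str:
    # A real dispenser would create a VM and deploy the test agent here;
    # we just report success for illustration.
    return f"agent-{agent_id}: ready"

def dispense(count: int):
    # Provision agents in parallel, then return them in a stable order.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return sorted(pool.map(provision_agent, range(count)))

print(dispense(4))
```

The useful property being illustrated is elasticity: `count` is a parameter of the test run, not a fixed lab size, so the pool can be spun up for a session and torn down afterwards (PAYU).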

CloudExecution

Test environments are either saved to persistent storage or spun down (PAYU) in a hybrid cloud model (public/private/community).

Option C – High Volume Mobile Automated Testing (HVMAT) in the cloud with real mobile devices, using Microsoft Test Manager and powered by PerfectoMobile:

CloudMobileAutomation

Test execution is managed by Microsoft Test Manager (2012.3/2010.2) through test agents technology:

ComboRealDevices

Further information can be found on the ‘Test Automation as a Service’ website.

TaaaS.net – Evaluation

2013 June 13
by jonathon.wright

An evaluation framework for assessing automation solutions against the objectives and goals of automation:

Automation Goals

  • Maintainable – reduce the amount of test maintenance effort through use of the self-maintaining test asset loader/scraper;
  • Effective – self-validating test assets achieved using natural language with context sensitive validation against business and testing rules, workflows and data;
  • Relevant – clear traceability of the business value of Automation through the visualisation of the tests via Business Process Modelling (BPMNv2.2 compliant);
  • Reusable – Unified platform which non-domain experts can use a natural language to represent business processes and user story acceptance criteria;
  • Manageable – reports on SUT health including ratings such as percentage availability since build/release, reported errors over time and traffic to error ratio;
  • Accessible – to enable collaboration on concurrent design and development;
  • Robust – object/event/error handling and recovery, with fault tolerance built in to report and continue on different levels of fuzzy matching, combined with a non-technology-specific test definition language;
  • Portable – technology agnostic (platform, client/component, browser, version & language) and test type agnostic (smoke, regression, integration & performance);
  • Reliable – to provide fault tolerance over a number of scalable test agents;
  • Diagnosable – actionable defects provided by Environment Under Test (EUT) live pause-playback, supported by Dynamic Data Adapters (DDA), for accelerated defect investigation and resolution;
  • Measurable – provide testing dashboard along with customisable reporting.

Score Card

  • Platform Support – Support for multiple operating systems, tablets & mobile
  • Technology Support – “multi-compiler” vs. “compiler-specific” test tools;
  • Browser Support – Internet Explorer, Firefox, Google Chrome or any other browser based on web browser controls;
  • Data Source Support – obtain data from text and XML files, Excel worksheets and databases like SQL Server, Oracle and MySQL;
  • Multi-Language Support – localized solutions supporting Unicode;
  • Test Type Support – functional, non-functional and unit (e.g. NUnit & MSTest);
  • Test Approach Support – i.e. Hybrid-Keyword/Data-Driven testing;
  • Results & Reporting Integration – including images, files, databases, XML documents;
  • Test Asset / Object Management – map an object not only by its caption or identifier;
  • Class Identification – GAP analysis of object classes (generic / custom) and associated methods capabilities based on complexity, usage, risk, feasibility and re-usability;
  • Test Scenario Maintenance – manual effort (XPATH/regular expressions), self-maintaining (descriptive programming/fuzzy logic) or script less (DSLs);
  • Continuous Build & Integration / Delivery Integration – with build & delivery solution;
  • Future proofing – external encapsulation of test assets & associated metadata (XAML/xPDL), expandability (API/DLL/.NET), HTTP/WCF/COM/WSD and OCR/IR;
  • License, Support & Maintenance Costs – pricing policy along with any hidden costs.

Test Approach Support

This can also be referred to as the test automation framework / testware generation approach that is going to be used:

Picture1

Test Approach cross reference chart below:

TestApproach_background

Taken from the “Hybrid Keyword Data Driven Framework” presented at the ANZTB in 2010, updated with “Test Automation As A Service” presented at STARWest in 2012.

Webinar – Feedback – “Questioning Automation”

2013 April 12
by jonathon.wright

 

 

 

 

 

For a complete list of the feedback, or to take part in our on-line survey:

TMT Evangelist – BrightTalk – Webinar (11th April 2013 @ 4pm)

2013 April 8
by jonathon.wright

A BrightTALK Channel

Join the TMT Evangelist channel now (other presenters include Dan North).
