Digital change is inevitable, whether it is driven by you, your customers, the competition or regulation. Our end-to-end Continuous Quality approach accelerates change, giving businesses the agility to deliver faster, better and cheaper outcomes and ensuring that digital transformation has the required impact.
Performance testing has been a crucial part of overall systems quality for decades. Performance assessments are carried out for businesses of all types and sizes that rely on IT systems supporting concurrent operations – so pretty much everyone!
Enablement, through more efficient and user-friendly automation tools, has made the activity ubiquitous within the IT industry. This notwithstanding, the need for performance testing is driven by the risk that a programme’s ROI will not be realised due to issues of system responsiveness, efficiency or stability. On top of not delivering the expected financial return, other implications of releasing a system that does not achieve the required levels of performance are damage to reputation, industry standing and business growth.
Historically, when a system performed below the required levels, the cause was often some form of hardware limitation. Common factors include a lack of CPU capacity, insufficient memory, slow disk access and network bandwidth constraints.
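To make this concrete, a performance assessment typically drives concurrent load at the system and summarises response-time percentiles. The sketch below is purely illustrative: the service call is simulated with a short sleep, where a real test would issue an HTTP request against the system under test.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def call_service() -> float:
    """Stand-in for a real request; returns elapsed seconds.

    The work is simulated with a short sleep; in a real assessment
    this would call the system under test.
    """
    start = time.perf_counter()
    time.sleep(0.01)  # simulated service response time
    return time.perf_counter() - start

def run_load_test(concurrent_users: int, requests_per_user: int) -> dict:
    """Fire requests from simulated concurrent users and summarise latency."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(call_service)
                   for _ in range(concurrent_users * requests_per_user)]
        latencies = [f.result() for f in futures]
    return {
        "requests": len(latencies),
        "mean_s": statistics.mean(latencies),
        "p95_s": statistics.quantiles(latencies, n=20)[-1],  # ~95th percentile
    }

results = run_load_test(concurrent_users=5, requests_per_user=4)
```

In a genuine assessment the percentiles would be compared against agreed non-functional requirements, and the run repeated while monitoring CPU, memory, disk and network on the target.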
The advent of cloud-based computing platforms, offering near-instant and seemingly unlimited computing power through technologies such as AWS auto-scaling, leads organisations to ask themselves: is performance testing still necessary to gain confidence that a service can reliably handle peaks in utilisation? After all, you can always spin up more cloud-based computing resources to cope – right?
In our opinion, the answer is not as straightforward as that. The goal of this article is to argue the case for the continuing value of performance testing in this new cloud-based computing paradigm. We will do so by addressing, one by one, the areas where we believe performance testing will remain a crucial element of the application development and deployment process.
(Published June 2018)
Have you ever thought of your car as being an extension of your payment card?
Well, that’s the kind of transformation the global payments landscape is undergoing, through sweeping changes driven by regulatory initiatives, technology trends & dynamic consumer behaviour. Technology trends include IoT (Internet of Things), and the use of Artificial Intelligence and distributed ledger.
The rise in uptake of and interest in IoT devices has led to the ubiquity of Application Programming Interfaces (APIs). An API allows a bank to share specific data or services with third parties, ensuring cross-compatibility between apps and devices without compromising one another’s IP.
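As an illustration of this kind of controlled sharing, the sketch below models a hypothetical account-information API that returns only the data scopes a third party has been granted. All client names, scopes and figures here are invented for the example; real open-banking APIs use standardised consent and authorisation flows such as OAuth.

```python
import json

# Hypothetical consent registry: which data scopes each third party
# has been granted by the customer. Names are illustrative only.
CONSENTS = {
    "budgeting-app-123": {"balances", "transactions"},
    "price-comparison-456": {"balances"},
}

# Toy account data held by the bank.
ACCOUNT_DATA = {
    "balances": {"current": 1250.40, "currency": "GBP"},
    "transactions": [{"amount": -9.99, "merchant": "Coffee Shop"}],
}

def account_information_api(client_id: str, scope: str) -> str:
    """Return only the data the third party is consented to see."""
    granted = CONSENTS.get(client_id, set())
    if scope not in granted:
        return json.dumps({"error": "access_denied", "scope": scope})
    return json.dumps({scope: ACCOUNT_DATA[scope]})

ok = account_information_api("budgeting-app-123", "transactions")
denied = account_information_api("price-comparison-456", "transactions")
```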
APIs bring a new breed of players, such as Payment Initiation Service Providers and Account Information Service Providers, into the payments world and allow them to compete with banks directly. These new players are set to erode the revenues of banks and card networks. As APIs challenge the traditional business model and operating structure, banks can no longer view them as merely a tactical asset. They need to determine the level of API openness required to retain a competitive edge, and employ the right QA strategy to ensure the security of consumer data as they open up to the outside world.
This white paper discusses the possible new revenue streams that banks can explore, and the challenges in embracing APIs.
‘I think, therefore I am’. The seventeenth-century dictum of René Descartes could now extend to ‘things’, given the advances in Artificial Intelligence (AI).
According to a Bank of America Merrill Lynch report, advances in computing technology, machine learning and user-friendly interfaces will have a significant impact on the labour market, costing $14 trillion by 2025 and possibly resulting in the loss of 140 million jobs. Ray Kurzweil predicts that the technological singularity could be as close as 2045!
Scepticism from other quarters notwithstanding, AI raises a whole host of disturbing questions relating to research, ethics and its long-term usage, as is evident from the recent Asilomar Principles. The rise of super-intelligence also brings the question of existential risks to the fore.
While much of the discussion is focused on the effects of AI and the resultant (job) redundancies in quality assurance, there are key questions related to assuring the AI itself. Apart from the question of accuracy of algorithms, the realm of assuring ‘intelligence‘ involves hitherto unexplored areas and difficult questions related to inherent biases within the algorithms themselves.
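One concrete, if simplified, example of such a bias question is checking whether a model’s outcomes differ systematically across groups. The sketch below computes a demographic-parity gap over toy decision data; the groups and figures are invented purely for illustration, and real fairness audits involve many more metrics and much larger samples.

```python
# Toy outcome data: (group, approved) pairs from a hypothetical
# credit-scoring model. Groups and outcomes are illustrative only.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def approval_rates(records):
    """Approval rate per group."""
    totals, approved = {}, {}
    for group, ok in records:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(records):
    """Demographic-parity gap: the largest difference in approval
    rates between any two groups (0 means identical rates)."""
    rates = approval_rates(records)
    return max(rates.values()) - min(rates.values())

gap = parity_gap(decisions)  # 0.75 - 0.25 = 0.5
```

A QA team assuring an AI system might assert that such a gap stays below an agreed threshold, alongside conventional accuracy checks.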
This Whitepaper describes the use of artificial intelligence in quality assurance and the assurance aspects of AI itself.
User experience is a key ingredient of customer satisfaction across the globe. With the proliferation of financial services and the increase in customer empowerment, product companies are collaborating with end users to build successful products.
Omni-channel platforms built on cloud-based technology help companies reach end customers and enrich the design, functionality and usability of their products.
Customer beta experience provides a reality check and an increased understanding of desired features. CSBT provides a unique opportunity in the early stages of the testing life cycle to find out why target consumers use a particular product and how they use it.
This paper describes an emerging trend in QA, focusing on customer beta testing in the financial services industry. It is inspired by Android and gaming releases, where product developers partner with end users for quality assessment and acceptance confirmation.
Docker containers and Microservices have been gaining in popularity over the last few years. Combined with DevOps, they enable customers to adapt their software and roll out individual features separately, instead of relying on full-fledged code deployments.
While there are benefits to using Dockerised Microservices, there are also challenges, such as more complex testing and maintaining test portability. A broader range of testing needs to be considered: component testing, integration testing, API contract testing and so on.
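By way of illustration, an API contract test checks the response shape that consuming services depend on, rather than business logic. The sketch below is a stand-in: the endpoint is simulated by a local function, where a real contract test would call the service running in its Docker container over HTTP; the endpoint, fields and statuses are invented for the example.

```python
import json
import unittest

def get_order(order_id: int) -> str:
    """Stand-in for a microservice endpoint; a real test would issue
    an HTTP GET against the containerised service."""
    return json.dumps(
        {"id": order_id, "status": "SHIPPED", "items": [{"sku": "A1", "qty": 2}]}
    )

class OrderContractTest(unittest.TestCase):
    """Checks the response shape consumers rely on, not business logic."""

    def test_response_matches_contract(self):
        body = json.loads(get_order(42))
        # Fields and types the consuming services depend on.
        self.assertIsInstance(body["id"], int)
        self.assertIn(body["status"], {"PENDING", "SHIPPED", "DELIVERED"})
        for item in body["items"]:
            self.assertIsInstance(item["sku"], str)
            self.assertIsInstance(item["qty"], int)

suite = unittest.TestLoader().loadTestsFromTestCase(OrderContractTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Running such contract tests against each service independently is what allows features to be rolled out container by container without a full end-to-end regression every time.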
This white paper discusses the challenges of Dockerised Microservices testing, levels of Microservices testing and the tools that can be used to automate tests at each level. It also details how test containers can be used to test Microservices hosted in Docker containers, and how customers will benefit from automation and test containers, thereby expediting roll-out of their business applications.
The mass adoption of the Internet of Things (IoT) is a multibillion-dollar opportunity for product companies and the manufacturing supply chain. An estimated 30bn devices, or “Things”, will be connected to the internet by 2020, with a value estimated at $1.7tn. This is being enabled by the falling cost and Moore’s Law growth in the processing power of sensors and chips. Also playing a pivotal role is the emergence of internet gateways such as 4G/5G, Low Power WAN, Near Field Communication, Bluetooth and Zigbee. However, this also means significant increases in the number of nodes, networks and systems, creating a large attack surface and attracting the attention of hackers and other actors with malicious intent. To prevent such attacks, the security posture of these IoT devices needs significant improvement.
Not long ago, cybersecurity-related concerns tended to revolve around the very real threats of data, identity and intellectual property theft. In this space, the bad guys range from script kiddies, experts with malicious intent, hacker groups and hacktivists through to organised crime and, in some cases, nation states. The archetypal victims were typically consumer-facing retailers and financial institutions that, in some high-profile cases, saw hackers siphon off millions of customer records and account numbers.
While those cases were scary enough, they didn’t tend to target enterprises dependent on the reliable operation of very expensive physical assets, such as manufacturers. That is changing, however, as bad guys begin to threaten human safety by targeting physical assets, potentially shutting down or taking control of the physical infrastructure upon which we all increasingly depend.
This means there needs to be a step change in how security is perceived: avoidance isn’t the answer; risk management is.
This white paper discusses the security and risk management solutions necessary to avoid being hacked, prevent data loss and improve the security of the IoT ecosystems.
Good quality, improved efficiency, lower cost to market, ready-to-deploy assets, standardisation across projects, easy maintainability – if these are the qualities you are looking for in your automation framework, Automation Industrialisation is a one-stop shop for all your needs.
Industrialisation, with its five disciplines of modularisation/reuse, specialisation, standardisation, automation and continuous improvement, has long been heralded as the solution to most, if not all, of IT’s woes. It helps streamline test automation activities across verticals and geographies. While we describe the implementation details and advantages of the industrialisation approach for a specific industry, Oil and Gas, we believe it can be readily applied across verticals.
The banking industry has spent over a decade journeying towards a completely modernised, digital channel-based sales, service and delivery system. However, both the pace of progress and the time taken for banks to change have been unpredictable. Now, given the cost pressures in the industry and global economic uncertainties, there is increased urgency (or rather an opportunity) for banks to adopt an efficient approach and pursue complete digitalisation and modernisation.
However, embracing holistic digitalisation and modernisation requires a thorough understanding not only of the benefits to the banks but also of customer expectations, for example, how relevant, convenient and personalised would the product offers and services be with an enhanced user experience?
Digital banking should combine a very positive customer experience with a banking model that delivers highly effective and efficient sales and services to customers.
In view of this aim, the banking industry has also taken into consideration facts and figures comparing 2014 and 2015.
As banks are already moving towards the next big things, Banking 4.0 and cloud implementation, it is estimated that by 2017 around one billion people will use mobile banking; indeed, it has already become a way of life for many of us.
This paper looks at how the market trend is evolving with respect to digitalisation across various banking products and services, and how the digital wave is effecting change across regions globally. As these digitalisation initiatives hit the banks, they pose challenges to banks’ IT delivery teams in maintaining quality delivery and meeting customer expectations, whilst lowering costs and improving time to market. This paper also details how SQS can support banks in overcoming such quality challenges, and looks at how to ensure end-to-end digital quality for banks.
Over the last few years, “Rapid Deployment Solutions” have been gaining market share as industries opt for faster implementation of ERP solutions into the production cycle. To keep pace with this demand, we have developed an innovative SAP testing approach, i.e. industrialised rapid deployment of SAP assets using model-based testing. With this approach, the reusable test assets are managed in different industries as per their respective business process hierarchies. This means that the numerous end-to-end scenarios for particular industries are identified and the business processes required to complete the scenarios are mapped under each scenario or variant. The scenarios are structured in a hierarchical manner in our own test management tool.
In today’s business environment, end-to-end scenarios must be tested as many SAP modules integrate with each other to run an entire business. With industrialised deployment, the business processes for the customer are identified by analysing the Business Blueprint or accessing the customer’s SAP Solution Manager using a much faster technique uniquely developed by SQS. These business processes are then linked to SQS assets for SAP. These assets are then deployed with respect to the business processes to form the end-to-end scenarios for testing, ensuring maximum traceability of requirements as well as flexibility for the customer’s business team to define the ‘test scope’ or ‘test BOM’ faster and more easily. The selected and tailored end-to-end scenarios are then automated using our own industrialised and robust SAP test automation framework.
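As a simplified illustration of the idea, the sketch below models an end-to-end scenario as an ordered list of reusable business-process assets that are chained together at execution time, each step passing its results into a shared context. The scenario and asset names are invented for the example and are not actual SQS assets.

```python
# Hypothetical scenario hierarchy: an end-to-end "order to cash"
# scenario composed of reusable business-process test assets.
SCENARIOS = {
    "order_to_cash": [
        "create_sales_order", "deliver_goods", "create_invoice", "post_payment",
    ],
}

# Each asset takes the shared context and returns an updated copy;
# real assets would drive and verify the corresponding SAP process.
ASSETS = {
    "create_sales_order": lambda ctx: {**ctx, "order_id": 1001},
    "deliver_goods": lambda ctx: {**ctx, "delivered": True},
    "create_invoice": lambda ctx: {**ctx, "invoice_id": 5001},
    "post_payment": lambda ctx: {**ctx, "paid": True},
}

def run_scenario(name: str) -> dict:
    """Chain the reusable assets mapped under a scenario, passing a
    shared context from step to step."""
    ctx = {}
    for step in SCENARIOS[name]:
        ctx = ASSETS[step](ctx)
    return ctx

result = run_scenario("order_to_cash")
```

Because each asset is self-contained, the same building blocks can be re-sequenced into different end-to-end scenarios or variants, which is what gives the factory model its traceability and reuse.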
This whitepaper explains the model-based testing approach, asset deployment in a factory model, and the automation of the scenarios using an industrialised automation framework; it also details how customers will benefit from this system, thereby growing their business.