Wednesday, September 3, 2008

Compatibility Testing - Google Chrome

Google Chrome, the web browser, is here!

My initial impression is quite good. I tried it with English, Hebrew and Russian sites, graphics, video, etc.

However, as a testing guy, I wouldn't settle for that. I don't know of any serious web application that works perfectly across different browsers. JavaScript, video, fonts and exotic styles are part of the issue. There are custom plug-ins, for which Google, in their "how did we do it" comic, has so far promised only rough treatment. There are niche technologies like XFD and such. And there is an endless variety of other fruits of programmers' creativity that might stumble when approached through the new browser.

Long story short: if you want to be sure your web application works with Google Chrome - TEST IT. Compatibility tests, when planned correctly and done well, can reveal most of the problems your users will encounter - and in a fraction of the time usually required for full-blown system tests.

We at Conflair can help you - see our page on Google Chrome Compatibility Tests. If you have questions, feel free to e-mail me at michael.yudanin@conflair.com.

P.S. Just found an image link on my own Conflair webpage that would not work in Google Chrome. Works perfectly well in IE and Firefox... After some exploration, I found that the problem was the good old .onclick property for the image object.

Here is the problem and the fix:

Problem code:

// Inside the loop that builds the image array, the target URL is
// stored in the image's .onclick property:
animArr[i].onclick = "ConflairServices/" + tmpArr[0];
}

function Switch()
{
  document.FeaturedService.src = animArr[imgNumber].src;
  // Reading .onclick back as a string works in IE and Firefox, but not in Chrome:
  document.getElementById("OpenFeaturedService").href = animArr[imgNumber].onclick;
}


The fix (based on abusing the .name property of the image object):

// Same loop, but the URL now goes into the image's .name property:
animArr[i].name = "ConflairServices/" + tmpArr[0];
}

function Switch()
{
  document.FeaturedService.src = animArr[imgNumber].src;
  // .name survives the round trip in Chrome as well as in IE and Firefox:
  document.getElementById("OpenFeaturedService").href = animArr[imgNumber].name;
}


Friday, April 4, 2008

Visual Studio Team System: the Platform and the Tools

The New Role of Visual Studio

Microsoft’s Visual Studio has long been a platform of choice for developers who use Microsoft technologies such as Visual Basic and ASP, C++ with MFC, C#, VB.NET and ASPX. With Visual Studio Team System, or VSTS, Microsoft offered something much bigger than a set of sophisticated compilers: a platform for software life cycle processes and a tool for functional and performance test automation.

VSTS as a Software Life Cycle Platform

Visual Studio Team System lets you define software life cycle processes and make them easily available to all stakeholders. This is done through work items (a VSTS term) that represent the deliverables of software processes, and through workflows associated with these work items. To cover the basics, for example, we can configure:


- Requirements management process and Requirement work item with meta-data and attachments

- Change management process with Change Requests and Risk work items and a workflow with customized statuses and approvals

- Defect tracking process with custom Defect report form, flow and reports

- Test Cases with meta-data and steps, and a Test Data work item.

The main advantage of VSTS as a software life cycle platform, in my eyes, is the ease with which it can be configured and maintained. Familiar Windows GUI standards, convenient tools and the ultimate flexibility of work item definitions and workflows are a huge plus. Powerful built-in objects help a lot: the capability to trace the history of changes for each work item, e.g., a defect, comes out of the box.

VSTS has a native SharePoint portal generation capability, which makes posting guidelines and visualizing data achievable without extensive design and programming. The illustration provides an example from a real project; the programming effort invested in it was a fraction of what it would take in another environment.

VSTS also integrates with Excel. I mean really integrates. You can edit work items in Excel and update them in VSTS with two or three clicks. There is no need for unreliable and time-consuming import/export steps each time you want to analyze or update something in Excel.

VSTS as a Test Automation Tool

Visual Studio Team System for Testers, a licensing variation on VSTS, provides a number of tools that can be used for testing. The most interesting of them is the Web Test – a tool for automating functional tests whose scripts can be reused for performance testing as well.

In contrast to most of the test automation tools on the market, Web Test records HTTP transactions, not Windows API calls. When you click a [Submit] button, what’s recorded is not the click but the transaction that goes to the server as a result. You can parameterize recorded transactions with test data, add or remove transactions, add validation points (called Validation Rules in VSTS lingo), etc. While it’s not possible to perform GUI validation with this tool, the scripts can be quite reliable in terms of being oblivious to minor changes in the application user interface – the curse of functional test automation. VSTS scripts can also detect defects that would be missed with GUI tools. Consider, for example, a broken reference to a JavaScript or CSS file: it would never be discovered by another tool unless it caused a visible change in the application’s appearance, yet VSTS detects it without any extra effort.
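
To give a feel for what this looks like in code, here is a minimal sketch of a coded Web Test in C#; the URL, the class name and the validated text are made up for illustration, while the WebTest, WebTestRequest and ValidationRuleFindText classes come from the Microsoft.VisualStudio.TestTools.WebTesting namespaces:

using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

// A minimal coded Web Test: one HTTP request with one Validation Rule attached
public class LoginPageWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Hypothetical URL - substitute a page from the application under test
        WebTestRequest request = new WebTestRequest("http://www.example.com/login.aspx");

        // Validation Rule: fail the request if the expected text is not in the response
        ValidationRuleFindText findText = new ValidationRuleFindText();
        findText.FindText = "Welcome";
        findText.IgnoreCase = true;
        findText.PassIfTextFound = true;
        request.ValidateResponse += new EventHandler<ValidationEventArgs>(findText.Validate);

        yield return request;
    }
}

The same request sequence can be data-bound and reused inside a Load Test, which is what makes the functional scripts double as performance scripts.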

The scripting languages used by Web Test are C# and VB.NET, with no shortage of books and classes to learn them. That, too, is an advantage: C# is much more common than any proprietary testing tool language, and sooner rather than later it will be more common than VBA or VBScript.

One Room, Many Windows

Architecturally, VSTS follows the idea of modularity. Work items are defined as XML files, so you can change one without breaking the others. The same applies to workflows.
The Team Foundation Server, VSTS’s core, has a Web Services API, which allows development of various clients.
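
As a small illustration of what such a client could look like, here is a hedged C# sketch that uses the TFS client object model, the .NET wrapper that sits on top of those web services, to list active defects; the server URL and team project name are hypothetical:

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class ActiveDefectsReport
{
    static void Main()
    {
        // Hypothetical server URL - point it at your own Team Foundation Server
        TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer("http://tfs-server:8080");
        WorkItemStore store = (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

        // WIQL query: all active Bug work items in a (hypothetical) team project
        WorkItemCollection results = store.Query(
            "SELECT [System.Id], [System.Title] FROM WorkItems " +
            "WHERE [System.TeamProject] = 'MyProject' " +
            "AND [System.WorkItemType] = 'Bug' " +
            "AND [System.State] = 'Active'");

        foreach (WorkItem item in results)
        {
            Console.WriteLine("{0}: {1}", item.Id, item.Title);
        }
    }
}

Anything from a custom report to a full web front end can be built on the same foundation.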

TeamPlain, a web interface to VSTS, is a remarkable example. Initially developed by devBiz, it found its respectable place among VSTS downloads after Microsoft's acquisition of the company in 2007. Besides letting you handle work items, it is also very light and undemanding: no Java applets, no ActiveX components, nothing to install on the user's computer. A decent web browser is sufficient, and it works through firewalls and proxies.

Common Framework Advantages

There are a number of advantages in using a common platform for the major stakeholders of the software process: management, business analysts, developers, quality assurance and test engineers. Different sets of tools for each role, as good as they may be, will necessarily instill different concepts and different language – and as a result will require a conscious effort to communicate in common terms. Visual Studio Team System, being a common platform, provides universal terminology and one framework, and improves communication as a result. Coupled with the ease of configuration and a set of testing tools, this has serious potential for more efficient software development: people pay attention to content rather than getting bogged down in configuration details, trying to understand one another, and making tools developed by different people for different purposes talk to each other.
Add to that the significant cost savings in licensing, maintenance, and personnel training for one tool rather than three or four, and you will see that the economic argument in favor of Visual Studio Team System makes sense.


VSTS Community

Visual Studio Team System has a vibrant community around it: portals, blogs, forums and code sample sharing. Microsoft clearly decided to adopt some of the ways of the Web 2.0 community. Here are some links that will give you an idea:


http://teamsystemrocks.com/ – a great place to begin; has a number of useful tutorials.

http://msdn2.microsoft.com/en-us/library/ms182409(VS.80).aspx – in-depth articles about using VSTS for testing.

http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=1408&SiteID=1 – VSTS functional and performance testing forum.


http://msdn2.microsoft.com/en-us/teamsystem/bb500979.aspx – how to download and install TeamPlain.

Useful blogs:


http://blogs.msdn.com/dscruggs/


http://moustafa-arafa.blogspot.com/ – useful if you are into heavy customizations of VSTS.

Tuesday, March 4, 2008

Quality-as-a-Service: The Next Step?

Quality – Function or Service?

Software quality is usually considered to be a function within an organization. Whether we are talking about setting up and enforcing quality assurance processes or validating product quality by testing, we are usually thinking about a set of roles within an organization that are responsible for QA and testing. From time to time, external consultants can be brought in, or the whole testing department can be outsourced, but quality control will still be a permanent, recognizable function owned by the company.

In the same fashion, until recently, companies have owned software applications. They could be installed on an internal server or on rented space in a co-location facility, yet they would still be an organizational asset.


SaaS, or Software-as-a-Service, introduced a new trend. Instead of owning software, a company would use it as a service on a subscription basis. Lower cost of ownership makes the SaaS option quite appealing: subscriptions are usually cheaper than licensing and upgrade fees. Furthermore, new hardware for hosting the application is not required, there is no need for extra personnel to maintain it, etc.


Here is the question, though: can this model be extended to other IT areas, namely to providing quality assurance and testing services?


The Anatomy of Software Quality

Software quality can be separated into two categories: process setup and process delivery.

Process Setup entails defining the procedures for verification and validation activities throughout the software life cycle: reviews, inspections, testing, test automation, metrics and such. For each of these activities, we define its place in the software life cycle, design entry and exit criteria, and create templates and guidelines. We also configure tools to manage the deliverables: defect tracking applications, traceability links, and testing aids.

Process Delivery is about taking all these great processes and implementing them: conducting reviews, preparing test plans, designing and executing test cases, logging defects, and calculating metrics.


However, there is a problem: many of these tasks are either temporary or periodic. For example, the setup usually happens once and then requires infrequent maintenance. Requirement reviews do not happen every day or every week either. Testing has its peaks and valleys, too. The smaller the company, the more difficult and costly it is to maintain permanent personnel for these tasks. Resorting to temporary contractors does not bring much relief either: the learning curve frequently eliminates the cost advantage; attempts to shorten the learning curve negatively affect performance.


QaaS – a Possible Win-Win Solution?


Quality-as-a-Service, or QaaS, is an attempt to implement the SaaS-like model for software quality assurance and testing.


The idea is quite simple. A professional services company focusing on software QA and testing will offer its services to multiple clients on a subscription basis. No single client has enough work for full-time QA and testing personnel, yet each has its peak periods. Together, they provide enough work for a number of QA professionals and software test engineers. Just as one accountant can serve a number of small businesses, one QA architect will set up and maintain processes for multiple clients, and one test engineer will be familiar enough with a number of application lines to design and execute tests for all of them without a significant learning curve.


The client companies will pay a subscription fee for a defined scope of services, plus a discounted rate for extra effort. This is similar to SaaS clients who pay a subscription for a set number of transactions, with each additional transaction incurring an extra charge.


The Economics of QaaS


Quality-as-a-Service offers a number of benefits:


- No recruiting of QA and testing personnel: usually a costly endeavor


- No expense to maintain permanent personnel between peaks


- No significant learning curve for new contractors


- Knowledge cross-pollination: QaaS people will be able to implement lessons learned from one application to another


- More interesting work for QA and testing professionals, together with the security of a stable job


The technology for QaaS does not constitute a challenge: the work can be done on-site or off-site, depending on the client’s needs, just as it happens now. The management expertise, the metrics and their visualization already exist in professional services companies that focus on software QA and testing.


QaaS Adoption


There are no significant technological challenges to the QaaS model. Many of the concerns, such as enforcing data security, disclosing confidential information to competitors, or maintaining control over the dedication of resources, already exist and are successfully handled when companies deal with contractors, with employees who leave to work elsewhere, or with off-shoring.

In the end, the main challenge is to facilitate a paradigm shift. There is a great need to get out of the box, take a fresh look, and do something new.

I believe that companies developing and implementing software are ready for it.






Thursday, February 14, 2008

Bug of the month?

Some software defects are tough to discover. Others are difficult to describe. Yet some defects are just delightful.

After a week of convincing a programmer that the user should know when the software he is working on cannot perform one of its important functions, he went ahead and created an error message. However, the error message was rather laconic and, I should say, self-contradictory: it said "Error: OK". When the Test Engineer raised objections to the contents of the error message, the response was even more puzzling: the developer claimed that he had been told to output the system's original error message, which he did...



Wednesday, February 13, 2008

Logic in Life and Software

Logic is a great thing. Arguably, it is the anchor of our mental health. Think about it: logic, with mathematics as its branch, describes the world the one and only way it can be. 2+2 is always 4; we cannot imagine a world in which 2+2 is 3, 5 or !@#$%.

Many things, though, are not bound by logic. One of them is the content of our speech. Recently one of the presidential candidates (guess who?) suggested that the government must track illegal immigrants. This statement triggered loud applause from his excited audience, so this remarkable man continued his line of wisdom with the following: if the government cannot track illegal immigrants, the task should be outsourced to FedEx or UPS. After all, they track millions of packages every day... Even if we leave aside the moral aspects of this comparison, a second or two of applying our innate capacity for logical thinking would show that asking the government to track illegal immigrants is akin to asking UPS and FedEx to track packages that were never shipped through them.

Unfortunately, software requirements are as free from logical constraints as speech is. Throughout my career in software Quality Assurance and Testing I have been confronted with requirements that contradicted each other and even themselves.

The point of QA is not only to catch defects once they make it into the code. If we wish to be called Analysts and not Testers, we should conduct Requirements Reviews, apply our logic, and try to find contradictions before they make it into the code. It’s usually quite simple: after all, there is a difference between a QA Analyst and a cheering crowd :).



