6:33 AM

TestPartner - Product Preview

Automated, repeatable testing with TestPartner

TestPartner is an automated testing tool that accelerates functional testing and facilitates the delivery of business-critical applications.

With TestPartner, you can more rapidly validate applications before going live, verify that application updates don't introduce regressions, or even test nightly builds with automated smoke tests. TestPartner offers a storyboard-based visual approach to testing that lets application users confidently capture test scenarios. VBA scripting is available to enable advanced users to meet even the most demanding test cases. You build test assets earlier in the development life cycle, test more thoroughly and deploy applications with confidence.

TestPartner's broad environment support means that enterprises with heterogeneous environments can be confident about easily validating Microsoft, Java, web, SAP, Oracle and many other distributed technologies with speed and consistency.

6:29 AM

Features in TestDirector 7.5

Web-based Site Administrator

The Site Administrator includes tabs for managing projects, adding users and defining user properties, monitoring connected users, monitoring licenses and monitoring TestDirector server information.

Domain Management

TestDirector projects are now grouped by domain. A domain contains a group of related TestDirector projects, and assists you in organizing and managing a large number of projects.

Enhanced Reports and Graphs

Additional standard report types and graphs have been added, and the user interface is richer in functionality. The new format enables you to customize more features.

Version Control

Version control enables you to keep track of the changes you make to the testing information in your TestDirector project. You can use your version control database for tracking manual, WinRunner and QuickTest Professional tests in the test plan tree and test grid.

Collaboration Module

The Collaboration module, available to existing customers as an optional upgrade, allows you to initiate an online chat session with another TestDirector user. While in a chat session, users can share applications and make changes.

Features in TestDirector 8.0

TestDirector Advanced Reports Add-in

With the new Advanced Reports Add-in, TestDirector users are able to maximize the value of their testing project information by generating customizable status and progress reports. The Advanced Reports Add-in offers the flexibility to create custom report configurations and layouts, unlimited ways to aggregate and compare data, and the ability to generate cross-project analysis reports.

Automatic Traceability Notification

The new traceability feature automatically traces changes to testing process entities, such as requirements or tests, and notifies the user via a flag or e-mail. For example, when a requirement changes, the associated test is flagged and the tester is notified that the test may need to be reviewed to reflect the requirement changes.

6:24 AM

Basics of QTP

QTP (QuickTest Professional) lets you create tests and business components by recording operations as you perform them in your application.

Test - A compilation of steps organized into one or more actions, which we can use to verify that our application performs as expected. A test is composed of actions (QTP has three kinds of actions: non-reusable, reusable and external).

1) The first step is planning. Before starting to build a test, you should plan it and prepare the required infrastructure. For example, determine the functionality you want to test and whether you need short tests that check specific functions of the application or tests that cover the complete site. Decide how you want to organize your object repositories.

2) The second step in QTP is creating tests or components.

We can create a test or component by

a) Recording a session on your application or Web site, or

As we navigate through the application or site, QuickTest graphically displays each step we perform as a row in the Keyword View. The Documentation column of the Keyword View also displays a description of each step in easy-to-understand sentences. A step is an operation that causes a change in your site or application, such as clicking a link or image, or submitting a data form.

b) Building an object repository and using these objects to add steps manually in the Keyword View or Expert View. We can then modify the test or component with special testing options and/or with programming statements.

3) The third step is inserting checkpoints into your test or component. A checkpoint is a verification point that compares the current value of a specified property with the expected value for that property. This enables you to identify whether the Web site or application is functioning correctly.
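
QuickTest inserts checkpoints through its Keyword View and Expert View rather than through hand-written code; the short Python sketch below only illustrates the underlying idea of comparing the current value of a property with its expected value. The property name and values are invented for the example.

def checkpoint(property_name, actual_value, expected_value):
    """Compare the current value of a property with its expected value."""
    if actual_value == expected_value:
        return f"PASS: {property_name} = {actual_value!r}"
    return f"FAIL: {property_name} expected {expected_value!r}, got {actual_value!r}"

# Example: verify the text of a hypothetical welcome label after login.
print(checkpoint("WelcomeLabel.text", "Welcome, Alice", "Welcome, Alice"))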

4) The fourth step is parameterizing the test or component.

Broaden the scope of your test or component by replacing fixed values with parameters. To check how your application performs the same operations with different data, you can parameterize your test or component. When you parameterize a test or component, QuickTest substitutes the fixed values in the test or component with parameters. Each run session that uses a different set of parameterized data is called an iteration.

We can also use output values to extract data from our test or component. An output value is a value retrieved during the run session and entered into the Data Table or saved as a variable or a parameter. We can subsequently use this output value as input data in the test or component.
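
As a rough Python illustration of parameterization and output values (this is not QuickTest's Data Table API), the sketch below runs the same steps once per data row, so each pass corresponds to one iteration, and captures a value from one step to reuse as input for a later step. The data rows and step functions are made up for the example.

# Hypothetical data table: one row per iteration.
data_table = [
    {"username": "alice", "amount": 100},
    {"username": "bob", "amount": 250},
]

def create_order(username, amount):
    """Stand-in for a recorded step; returns an order id as an output value."""
    return f"ORD-{username}-{amount}"

def check_order_status(order_id):
    """Stand-in for a later step that reuses the captured output value."""
    print(f"checking status of {order_id}")

for iteration, row in enumerate(data_table, start=1):
    print(f"--- iteration {iteration} ---")
    order_id = create_order(row["username"], row["amount"])  # output value
    check_order_status(order_id)                              # reused as input data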

We can use many functional testing features of QuickTest to improve our test or component and/or add programming statements to achieve more complex testing goals.

5) The fifth step is running the test

After creating a test or component, we run it. Run the test or component to check the site or application: QuickTest connects to your Web site or application and performs each operation in the test or component, checking any text strings, objects, or tables you specified. If we parameterized the test with Data Table parameters, QuickTest repeats the test (or specific actions in the test) for each set of data values we defined. Run the test or component to debug it: we can control the run session to identify and eliminate defects in the test or component. We can use the Step Into, Step Over, and Step Out commands to run a test or component step by step. We can also set breakpoints to pause the test or component at predetermined points. We can view the value of variables in the test or component each time it stops at a breakpoint in the Debug Viewer.

6) The sixth step is analyzing the results

After we run a test or component, we can view the results of the run in the Test Results window. We can view a summary of the results as well as a detailed report. We can also report defects identified during a run session: if Quality Center is installed, we can report the defects found to its database. We can instruct QuickTest to automatically report each failed step in the test or component, or we can report them manually from the Test Results window.

6:24 AM

Comparison of Agile Testing with other methods

Agile methods are sometimes characterized as being at the opposite end of the spectrum from "plan-driven" or "disciplined" methodologies. This distinction is misleading, as it implies that agile methods are "unplanned" or "undisciplined". A more accurate distinction is to say that methods exist on a continuum from "adaptive" to "predictive". Agile methods exist on the "adaptive" side of this continuum.

Adaptive methods focus on adapting quickly to changing realities. When the needs of a project change, an adaptive team changes as well. An adaptive team will have difficulty describing exactly what will happen in the future. The further away a date is, the more vague an adaptive method will be about what will happen on that date. An adaptive team can report exactly what tasks are being done next week, but only which features are planned for next month. When asked about a release six months from now, an adaptive team may only be able to report the mission statement for the release, or a statement of expected value vs. cost.

Predictive methods, in contrast, focus on planning the future in detail. A predictive team can report exactly what features and tasks are planned for the entire length of the development process. Predictive teams have difficulty changing direction. The plan is typically optimized for the original destination and changing direction can cause completed work to be thrown away and done over differently. Predictive teams will often institute a change control board to ensure that only the most valuable changes are considered.

Agile methods have much in common with the "Rapid Application Development" techniques of the 1980s and 1990s, as espoused by James Martin and others.

6:21 AM

What is Agile Testing?

Agile software development is a conceptual framework for software engineering that promotes development iterations throughout the life-cycle of the project.

There are many agile development methods; most minimize risk by developing software in short amounts of time. Software developed during one unit of time is referred to as an iteration, which may last from one to four weeks. Each iteration is an entire software project, including planning, requirements analysis, design, coding, testing, and documentation. An iteration may not add enough functionality to warrant releasing the product to market, but the goal is to have an available release (without bugs) at the end of each iteration. At the end of each iteration, the team re-evaluates project priorities.

Agile methods emphasize face-to-face communication over written documents. Most agile teams are located in a single open office sometimes referred to as a bullpen. At a minimum, this includes programmers and their "customers" (customers define the product; they may be product managers, business analysts, or the clients). The office may include testers, interaction designers, technical writers, and managers.

Agile methods also emphasize working software as the primary measure of progress. Combined with the preference for face-to-face communication, agile methods produce very little written documentation relative to other methods. This has resulted in criticism of agile methods as being undisciplined.

Agile methods are a family of development processes, not a single approach to software development. In 2001, 17 prominent figures in the field of agile development (then called "light-weight methodologies") came together at the Snowbird ski resort in Utah to discuss ways of creating software in a lighter, faster, more people-centric way. They created the Agile Manifesto, widely regarded as the canonical definition of agile development, and accompanying agile principles.

Some of the principles behind the Agile Manifesto are:

  • Customer satisfaction by rapid, continuous delivery of useful software
  • Working software is delivered frequently (weeks rather than months)
  • Working software is the principal measure of progress
  • Even late changes in requirements are welcomed
  • Close, daily cooperation between business people and developers
  • Face-to-face conversation is the best form of communication
  • Projects are built around motivated individuals, who should be trusted
  • Continuous attention to technical excellence and good design
  • Simplicity
  • Self-organizing teams
  • Regular adaptation to changing circumstances

The publishing of the manifesto spawned a movement in the software industry known as agile software development.

In 2005, Alistair Cockburn and Jim Highsmith gathered another group of people — management experts, this time — and wrote an addendum, known as the PM Declaration of Interdependence.

5:59 AM

Solid State Group Uses uTest for Software Testing of Social Networking Site

uTest today announced that Solid State Group, a content management, web applications and services consultancy, is using uTest for a professional social networking website. In addition to the flexibility of uTest's Software-as-a-Service (SaaS) platform and the ability to meet tight deadlines, Solid State sees the global community of testers (uTesters) as a huge draw.

"Sometimes team members are too close to the project to pick up every single error, which is why it helps to have a fresh set of eyes dedicated to finding bugs and delivering a high-quality application," said Felicity Stone, QA Project Manager, Solid State Group. "uTest's unique platform affords us the opportunity to test across a vast number of platforms and environments and access to a global network of testing professionals that makes our job a lot easier. uTest simply provides us the confidence that we are releasing a thoroughly tested product to our customers."

This is the second release cycle for which Solid State Group has deployed uTest's services. Previously, the company performed all testing in-house, until it decided that a third-party review would greatly benefit it and its customers. With uTest driving the testing, Solid State Group is able to allow its staff to focus on other areas of application development.

"The business value Solid State obtains from uTest's services is further validation of the strength of our service, and the fact that they are a repeat customer is highly rewarding," said Doron Reuveni, chief executive officer, uTest. "It's wonderful that they are able to utilize our skilled testing community to obtain the superior quality and results they strive to deliver to their customers."

The uTest model represents an evolution in traditional software application testing. By offering virtual on-demand software testing services and a Pay-for-Performance business model, uTest provides its clientele a tremendous resource, with flexibility regarding platform, environment, geographic location and budget. The uTester Community of 9,000+ highly skilled software professionals across nearly 140 countries delivers unparalleled real-world testing throughout product development lifecycles.

uTest offers two pricing models, On-Demand and Annual Subscription. Both options reflect the unique Pay-for-Performance and Pay-per-Bug model. For more information please visit www.utest.com or contact the sales office at (800) 445-3914. Additionally, to view a demo of uTest's services, please visit http://www.utest.com/solutions_watch_a_demo.htm.

3:44 AM

HCL Launches First of Its Kind On-Demand Software Testing Lab

HCL Technologies Ltd. ("HCL"), India's leading global IT Services Company, today announced the launch of an innovative on-demand software testing lab at Software 2008 that allows Independent Software Vendors (ISVs) to reduce their software testing cycle times and lower their capital expenditure on testing hardware and software.

"Our experience working with more than 100 customers over the last several years has shown us that developing and maintaining in-house testing expertise leads to high capital costs to setup the infrastructure, and high resource costs due to the cyclical demand," says G.H. Rao, Corporate Vice President, R&D Services, and HCL. "To enable our customers to address these challenges, we are making an investment of USD 6 million to setup a first-of-its-kind on-demand software testing lab."

The on-demand software testing lab includes a state-of-the-art testing laboratory in India with 300 high-end servers of all popular makes (such as HP, Sun, Dell, IBM) with single/multiple CPUs, provisioned with leading software testing tools. Customers can set up, provision, perform and manage testing on the lab -- all from a remote location through a secured communication channel.

The testing lab is complemented by a pool of specialized testing professionals that provides niche testing services such as performance benchmarking and capacity planning, high-availability testing, firewall testing and security protocol testing. The testing service also includes in-house developed IP on test automation -- Automation+ -- a testing framework that automates the entire test cycle, including test environment setup and configuration, installation of related applications, and identification of the required automated test suites to be run on the installed software.

The on-demand software testing lab will provide a competitive edge to ISVs by enabling them to improve their products and take them to the market faster. This also helps them address a wider market through certification across various platforms, such as operating systems and browsers, further enhancing the revenue potential.

About HCL Technologies

HCL Technologies is one of India's leading global IT Services companies, providing software-led IT solutions, remote infrastructure management services and BPO. Having made a foray into the global IT landscape in 1999 after its IPO, HCL Technologies focuses on Transformational Outsourcing, working with clients in areas that impact and re-define the core of their business. The company leverages an extensive global offshore infrastructure and its global network of offices in 18 countries to deliver solutions across select verticals including Financial Services, Retail & Consumer, Life Sciences & Healthcare, Hi-Tech & Manufacturing, Telecom and Media & Entertainment (M&E). For the quarter ended 31st March 2008, HCL Technologies, along with its subsidiaries, had last twelve months (LTM) revenue of US $1.8 billion (Rs. 7,083 crores) and employed 49,802 professionals.

About HCL Enterprise

HCL Enterprise is a $4.8 billion (Rs. 19,640 crores) leading Global Technology and IT Enterprise that comprises two companies listed in India - HCL Technologies Ltd. and HCL Infosystems Ltd. The 3-decade-old enterprise, founded in 1976, is one of India's original IT garage start-ups. Its range of offerings spans Product Engineering, Custom & Package Applications, BPO, IT Infrastructure Services, IT Hardware, Systems Integration, and distribution of ICT products. The HCL team comprises approximately 55,703 professionals of diverse nationalities, who operate from 18 countries including 360 points of presence in India. HCL has global partnerships with several leading Fortune 1000 firms, including leading IT and Technology firms. For more information, please visit www.hcl.in.

Other product or service names mentioned herein are the trademarks of their respective owners.

For details, contact:
Citigate Cunningham for HCL Technologies
Susan Vander May
415-618-8721
Email Contact

7:31 AM

Basics of Silk test

1) What is SilkTest?

Ans. SilkTest is a software testing automation tool developed by Segue Software, Inc.

2) What is the Segue Testing Methodology?

Ans. Segue testing methodology is a six-phase testing process:

1. Plan - Determine the testing strategy and define specific test requirements.

2. Capture - Classify the GUI objects in your application and build a framework for running your tests.

3. Create - Create automated, reusable tests. Use recording and/or programming to build test scripts written in Segue's 4Test language.

4. Run - Select specific tests and execute them against the AUT.

5. Report - Analyze test results and generate defect reports.

6. Track - Track defects in the AUT and perform regression testing.


3) What is SilkTest Host?

Ans. SilkTest Host is a SilkTest component that manages and executes test scripts. The SilkTest Host usually runs on a machine separate from the one where the AUT (Application Under Test) is running.

4) What is SilkTest Agent?

Ans. SilkTest Agent is a SilkTest component that receives testing commands from the SilkTest Host and interacts with the AUT (Application Under Test) directly. The SilkTest Agent usually runs on the same machine where the AUT is running.

5) What is 4Test?

Ans. 4Test is a test scripting language used by SilkTest to compose test scripts that perform automated tests. 4Test is an object-oriented fourth-generation language. It consists of three sets of functionality:

1. A robust library of object-oriented classes and methods that specify how a testcase can interact with an application’s GUI objects.

2. A set of statements, operators and data types that you use to introduce structure and logic into a recorded testcase.

3. A library of built-in functions for performing common support tasks.

6) What is the DOM browser extension?

Ans. The Document Object Model (DOM) browser extension is a SilkTest add-on component for testing Web applications. The DOM browser extension communicates directly with the Web browser to recognize, categorize and manipulate objects on a Web page. It does this by working with the actual HTML code, rather than relying on the visual pattern recognition techniques currently employed by the Virtual Object (VO) extension.

7) What is the VO browser extension?

Ans. The Virtual Object (VO) browser extension is a SilkTest add-on component for testing Web applications. The VO browser extension uses sophisticated pattern recognition techniques to identify browser-rendered objects. The VO extension sees Web pages as they appear visually; it does not read or recognize HTML tags in the Web application code. Instead, the VO extension sees the objects in a Web page (for example, links, tables, images and compound controls) the way that you do, regardless of the technology behind them.
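
To give a feel for the difference between the two extensions, the following Python sketch identifies the links on a page by reading the HTML itself, which is the DOM-style approach, rather than by recognizing the rendered appearance as the VO extension does. The sample HTML is invented, and this is a conceptual illustration, not SilkTest code.

from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect anchor tags directly from the page's HTML (DOM-style recognition)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs).get("href"))

sample_html = '<html><body><a href="/home">Home</a> <a href="/login">Login</a></body></html>'
collector = LinkCollector()
collector.feed(sample_html)
print(collector.links)  # ['/home', '/login']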

8) Is there any problem in using scripts created on v6.0 to 6.5 or higher versions?

Ans. Moving from a lower to a higher version should not generally be a problem, but this cannot be guaranteed in every case. For example, scripts that worked in 6.5 sometimes failed to run in 7.0 because some of the recognition patterns changed, and in a few situations the script ended up with two paths performing the same action, chosen based on the version. No problems were encountered moving from 6.0 to 6.5.

9) What is a SilkTest project?

Ans. A SilkTest project is a collection of files that contains required information about a test project.

10) How to create a new SilkTest project?

Ans. 1. Run SilkTest.

2. Select Basic Workflow bar.

3. Click Open Project on the Workflow bar.

4. Select New Project.

5. Double-click the Create Project icon in the New Project dialog box.

6. In the Create Project dialog box, enter your project name and your project description.

7. Click OK.

8. SilkTest will create a new subdirectory under the SilkTest project directory, and save all files related to the new project under that subdirectory.

7:16 AM

Jonckers Successfully Deploys Borland(R) SilkTest(R) and Automates Software Testing

Jonckers Translation & Engineering, Microsoft Service Vendor of the Year 2007, today announced the successful adoption of Borland(R) SilkTest(R) for automated software testing. Borland SilkTest was evaluated over several months in a pilot functional testing project after being chosen through a competitive selection process.

Jorge Estevez, Group Engineering Manager for Jonckers, stated, "The adoption of Borland SilkTest offers our clients the ability to quickly and cost-effectively release dependable products to their global markets. Investing in this technology is critical to the long-term success of Jonckers and the localization industry at large. Borland's product proved to be far ahead of its competition both during the pilot evaluation and in ongoing customer projects."

Wolfgang Karas, Regional MD at Borland Software, said, "Jonckers is a valued partner and a recognized innovator in the localization industry, and effective software testing is a vital component of their products' dependability. The challenge for software delivery teams is to balance cost and time to market with dependable quality. It is our understanding that Jonckers selected SilkTest because it helps them effectively address this challenge -- which in turn leads to greater customer loyalty."

New automated testing processes have now been rolled out across Jonckers' global testing centers in the Czech Republic, Vietnam and China. Estevez noted, "Being able to scale software testing in this way means we can pass on financial benefits to our customers, strengthening our commitment to provide the most cost-effective solutions possible in our marketplace."

A white paper on the benefits of Automated Testing is available for download at http://www.jonckers.com.

About Jonckers

Jonckers, MS Service Vendor of the Year 2007, is focused on delivering software, eLearning and multimedia localization services to the world's best companies. Jonckers achieves localization excellence through an ERP-controlled global network of wholly owned offices spanning Asia, Europe and the US, allowing Jonckers to deliver low-cost global resources without sacrificing quality. Please visit http://www.jonckers.com for more information.

Borland, SilkTest and all other Borland brand and product names are service marks, trademarks or registered trademarks of Borland Software Corporation or its subsidiaries in the United States and other countries.

7:32 AM

LoadRunner Vuser Technology

The actions that a Vuser performs during the scenario are described in a Vuser script. When you run a scenario, each Vuser executes a Vuser script. The Vuser scripts include functions that measure and record the performance of the server during the scenario.

To measure the performance of the server, you define transactions. Transactions measure the time that it takes for the server to respond to tasks submitted by Vusers. For instance, you can define a transaction that measures the time it takes for the server to process a request to view the balance of an account and for the information to be displayed at the ATM.

You insert rendezvous points into Vuser scripts to emulate heavy user load on the server. Rendezvous points instruct multiple Vusers to perform tasks at exactly the same time. For example, to emulate peak load on the bank server, you insert a rendezvous point to instruct 100 Vusers to simultaneously deposit cash into their accounts.
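
LoadRunner expresses transactions and rendezvous points in its own Vuser scripting API; purely as a conceptual sketch, the Python fragment below times a block of work as a "transaction" and uses a barrier so that several simulated users issue the server request at roughly the same moment. The Vuser count and the deposit_cash stand-in are invented for the example.

import threading
import time

NUM_VUSERS = 5
rendezvous = threading.Barrier(NUM_VUSERS)  # all Vusers wait here, then proceed together

def deposit_cash(vuser_id):
    """Stand-in for the real request a Vuser would send to the server."""
    time.sleep(0.05)

def vuser(vuser_id):
    rendezvous.wait()                    # rendezvous point: synchronize the load spike
    start = time.perf_counter()          # start of the "deposit" transaction
    deposit_cash(vuser_id)
    elapsed = time.perf_counter() - start
    print(f"Vuser {vuser_id}: deposit transaction took {elapsed:.3f}s")

threads = [threading.Thread(target=vuser, args=(i,)) for i in range(NUM_VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()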

You use the LoadRunner Controller to manage and maintain your scenarios. Using the Controller, you control all the Vusers in a scenario from a single workstation.

When you execute a scenario, the LoadRunner Controller distributes each Vuser in the scenario to a host. The host is the machine that executes the Vuser script, enabling the Vuser to emulate the actions of a human user.

Vuser Types

LoadRunner has various types of Vusers. Each type is designed to handle different aspects of today's client/server architectures. You can use the Vuser types in any combination in a scenario in order to create a comprehensive client/server test. The following Vuser types are available:

  1. GUI (Windows and UNIX)
  2. RTE (Windows and UNIX)
  3. Database (CtLib, DbLib, Informix, Oracle, and ODBC)
  4. Web
  5. TUXEDO
  6. APPC
  7. Windows Sockets
  8. Baan
  9. Java
  10. DCOM
  11. PeopleSoft

7:18 AM

What is Client/Server Load Testing?

Modern client/server architectures are complex. While they provide an unprecedented degree of power and flexibility, these systems are difficult to test. Whereas single-user testing focuses primarily on functionality and the user interface of a single application, client/server testing focuses on performance and reliability of an entire client/server system.

For example, a typical client/server testing scenario might depict 200 users who log in to a system simultaneously on Monday morning: What is the response time of the system? Does the system crash? To be able to answer these questions -- and more -- a complete client/server performance testing solution must do the following (a minimal sketch of the multi-client case follows the list):
  1. Test a system that combines a variety of software applications and hardware platforms
  2. Determine the suitability of a server for any given application
  3. Test the server before the necessary client software has been developed
  4. Emulate an environment where multiple clients interact with a single server application
  5. Test a client/server system under the load of tens, hundreds, or even thousands of potential users
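
A minimal Python sketch of point 4, emulating multiple clients against a single server application, might look like the fragment below. The request is simulated locally so the example stays self-contained; a real load test would issue actual requests to the server under test and record errors as well as response times.

import statistics
import time
from concurrent.futures import ThreadPoolExecutor

NUM_CLIENTS = 50  # scale toward "tens, hundreds, or even thousands" in a real test

def server_request(client_id):
    """Placeholder for one client's request to the server under test."""
    start = time.perf_counter()
    time.sleep(0.01)  # stands in for real network and server processing time
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=NUM_CLIENTS) as pool:
    response_times = list(pool.map(server_request, range(NUM_CLIENTS)))

print(f"clients: {NUM_CLIENTS}")
print(f"mean response time: {statistics.mean(response_times):.4f}s")
print(f"max response time:  {max(response_times):.4f}s")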

7:16 AM

Introduction to Load Runner

LoadRunner is an industry-leading performance and load testing product by Hewlett-Packard (since it acquired Mercury Interactive in November 2006) for examining system behavior and performance while generating actual load. LoadRunner can emulate hundreds or thousands of concurrent users to put the application through the rigors of real-life user loads, while collecting information from key infrastructure components (Web servers, database servers, etc.). The results can then be analyzed in detail to explore the reasons for particular behavior.

LoadRunner is divided into three smaller applications:

The Virtual User Generator allows us to determine what actions we would like our Vusers, or virtual users, to perform within the application. We create scripts that generate a series of actions, such as logging on, navigating through the application, and exiting the program.

The Controller takes the scripts that we have made and runs them through a schedule that we set up. We tell the Controller how many Vusers to activate, when to activate them, and how to group the Vusers and keep track of them.

The Results and Analysis program gives us all the results of the load test in various forms. It allows us to see summaries of data, as well as the details of the load test for pinpointing problems or bottlenecks.

Consider the client-side application for an automated teller machine (ATM). Although each client is connected to a server, in total there may be hundreds of ATMs open to the public. There may be some peak times — such as 10 a.m. Monday, the start of the work week — during which the load is much higher than normal. In order to test such situations, it is not practical to have a testbed of hundreds of ATMs. So, given an ATM simulator and a computer system with LoadRunner, one can simulate a large number of users accessing the server simultaneously. Once activities have been defined, they are repeatable. After debugging a problem in the application, managers can check whether the problem persists by reproducing the same situation, with the same type of user interaction.