Before You Begin
Before you begin the Testing Competitions, you need to download software, set up your environment, and register as a member with TopCoder.
TopCoder uses its own UML Tool to capture diagrams during the design process. You can download the TopCoder UML Tool for free from the TopCoder website. Some testing competitions will provide UML artifacts to assist you in your test efforts.
Your submissions may be required in different formats, such as Rich Text Format (RTF), Microsoft Word documents (.doc), or Microsoft Excel spreadsheets (.xls).
You can use Microsoft Word or Microsoft WordPad to produce RTF files. Microsoft WordPad is included with the default distribution of Microsoft Windows.
You will need to download the competition distributions (.jar or .zip files) at the beginning of the competition. These files contain all the documentation you will need to compete, so make sure you have an appropriate archive tool.
To create automated scripts, you may be required to use scripting tools such as JUnit or SwingUnit (for Java).
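For Java components, an automated script deliverable is typically a JUnit test class. The following is a minimal sketch of what one such script checks; it is written in plain Java so it runs standalone, and `Calculator` is a hypothetical stand-in for the component under test. In a real submission this would be a `junit.framework.TestCase` subclass, and the framework would discover and run the `test*` methods for you.

```java
// Minimal sketch of an automated test script. Calculator is a
// hypothetical stand-in for the component under test; in a real
// submission this class would extend JUnit's TestCase.
public class CalculatorAddTest {

    // Stand-in for the component under test.
    static class Calculator {
        int add(int a, int b) { return a + b; }
    }

    // Mirrors a JUnit test method: verify one behavior and report
    // a meaningful message on failure (null means the test passed).
    public static String testAddPositiveNumbers() {
        Calculator calc = new Calculator();
        int result = calc.add(2, 3);
        if (result != 5) {
            return "add(2, 3) should return 5, was " + result;
        }
        return null;
    }

    public static void main(String[] args) {
        String failure = testAddPositiveNumbers();
        System.out.println(failure == null ? "PASS" : "FAIL: " + failure);
    }
}
```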
Please review the Test Scenarios Review Scorecard and Test Suites Review Scorecard for information on how you will be evaluated by the Review Board.
Review the Documentation
There are two types of testing competitions: Scenario competitions and Script competitions. Each uses various supporting documents and artifacts to help you complete the competition. Scenario competitions may use Requirements Specifications (including Use Cases), Activity Diagrams, and a QA Test Plan as input to create one or more written test scenarios for each QA Test Plan item (see below). Each scenario is written text describing the setup needed, the pre-conditions, the detailed steps to complete the test, the expected result, and the post-conditions. Each scenario is also marked as to whether or not it can be automated.
Script competitions may use the Requirements Specification, Activity Diagrams, QA Test Plan, and the Test Scenarios to create automated scripts that can be executed in later competitions.
- Requirement Specification - The requirement specification is formatted based on Use Cases. Each use case is defined in the logical requirements section and numbered 2.X; each activity diagram is numbered 2.X.Y under its use case.
- Use Case - Each use case should have one or more associated QA Test Plan items.
- Activity Diagrams - The activity diagrams correspond directly to the use cases. Each Activity Diagram is named after the Use Case in the following format: Use Case Name Activity Diagram Name. The purpose of this document is to record the logical use case flow.
- Design Specification - This document will detail and define all auditing, logging, and threading information for the system. Additionally, you will find supplementary information to implement your prototype conversion.
- GUI Prototype - This set of files is for GUI based tests that allow many scenarios and automated scripts to be created before the application is complete.
Details on Assembly competition testing.
Preparing Your Submission - Code
Once you have received all of your documentation and have an understanding of the competition, it is time to write your scenarios or scripts. Remember that any code written must follow TopCoder coding conventions. Our coding conventions are the published standards for Java and C#. You can view them at the following links:
Java Coding Conventions
.NET Coding Conventions
Each scenario must include:
- Reference to the Use Case number in the Requirements Specification and its name.
- Reference to the Integration/Functional/GUI/Performance Test Case number in the QA Test Plan.
- Required Setup and Tear Down for scenarios.
- Pre and Post Conditions for scenarios.
- Whether or not the scenario can be automated.
- Steps for completing each scenario.
- Expected outcome for each scenario.
A template is provided for you to fill in the scenarios.
Test Case Development
Test case development is really where you can make your submission stand out. Each language is a bit different for testing purposes.
It can be very tempting to simply aggregate all of your unit tests into one function. However, this greatly reduces the utility of your tests. Suppose, for example, that you have three test functions covering three behaviors of a "Save As" feature; you could easily combine them into a single function, testSaveAs(). Instead of having three tests, you now only have one: if any of the three behaviors is broken, the whole test fails. In larger scale testing, this composite testing methodology can lead to very confusing failure conditions, and it can become difficult to debug your testing code. The smaller and more atomic your tests become, the more obvious the failure point and probable causes generally are.
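To make the atomic style concrete, here is a sketch of three separate tests for a "Save As" feature. Everything here is hypothetical (the `Document` class, `saveAs`, and the file names are stand-ins, and the methods are plain Java rather than JUnit so the example is self-contained); the point is that each method checks exactly one behavior, so a failure names the broken behavior directly.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of atomic tests for a hypothetical "Save As" feature.
// Each test* method checks exactly one behavior; in a real
// submission these would be JUnit TestCase methods.
public class SaveAsTest {

    // Minimal stand-in for the component under test.
    static class Document {
        private final Set<String> savedNames = new HashSet<String>();
        private String currentName;

        void saveAs(String name) {
            if (name == null || name.length() == 0) {
                throw new IllegalArgumentException("name must not be empty");
            }
            savedNames.add(name);
            currentName = name;
        }

        boolean isSaved(String name) { return savedNames.contains(name); }
        String getCurrentName() { return currentName; }
    }

    // Behavior 1: saving under a new name records that name.
    public static boolean testSaveAsCreatesNewName() {
        Document doc = new Document();
        doc.saveAs("report.txt");
        return doc.isSaved("report.txt");
    }

    // Behavior 2: saving updates the document's current name.
    public static boolean testSaveAsUpdatesCurrentName() {
        Document doc = new Document();
        doc.saveAs("report.txt");
        doc.saveAs("draft.txt");
        return "draft.txt".equals(doc.getCurrentName());
    }

    // Behavior 3: an invalid (empty) name is rejected.
    public static boolean testSaveAsRejectsEmptyName() {
        Document doc = new Document();
        try {
            doc.saveAs("");
            return false; // should have thrown
        } catch (IllegalArgumentException expected) {
            return true;
        }
    }
}
```

If testSaveAsUpdatesCurrentName() fails while the other two pass, you know immediately that saving works but the current-name bookkeeping is broken; a combined testSaveAs() would only tell you "something in Save As failed."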
- ALWAYS implement a tested version of the demonstration from the component specification.
- ALWAYS provide a meaningful message in assert and fail calls.
- ALWAYS document your test code as thoroughly as you document your component code.
- Break up your tests into discrete TestCase classes. If one TestCase becomes unmanageable, don't hesitate to break it into two or more classes.
- Break up your tests within those classes into the smallest functions possible; that way, it is clear which areas of the component are failing. You will then be able to use the number of tests passing and failing as a completion metric.
- Reduce code duplication and increase robustness with setUp() and tearDown().
- Test every public function for as much valid and invalid input as time allows.
- Test expected component processes: for example, loading, processing data, and saving data.
- Don't forget to clean up your environment! Unit tests should leave the system in the same state they found it in; there should be no persistent changes. This is checked during review.
- Test classes are normal classes in every respect, except that they have special significance to the testing framework. These classes can inherit from a class intermediate between themselves and the final test. They may contain methods other than test methods. They may have state.
- Because they are in the same package or namespace as the component classes they test, the unit tests can access package-private and protected classes and their members.
- Interfaces cannot be tested directly, but methods that accept interface arguments can be presented with alternative implementations. This technique has great potential for verifying that the component does the expected things with and to its interface-typed fields and method arguments, and that it reacts correctly to exceptions thrown by methods invoked on such arguments.
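The last few points can be sketched together: a fixture that builds fresh state before each test, restores the environment afterward, and hands the component an alternative implementation of an interface it accepts, so the test can verify the component's interaction with that interface. All names here (`Processor`, `Logger`, `process`) are hypothetical stand-ins, and the fixture methods mirror what JUnit's setUp() and tearDown() would do in a real TestCase.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of fixture methods plus interface substitution.
// Processor, Logger, and process() are hypothetical stand-ins for
// a component that accepts an interface-typed constructor argument.
public class ProcessorTest {

    // Interface the component accepts; the test substitutes its own
    // implementation to observe how the component uses it.
    interface Logger {
        void log(String message);
    }

    // Minimal stand-in for the component under test.
    static class Processor {
        private final Logger logger;
        Processor(Logger logger) { this.logger = logger; }

        int process(int value) {
            logger.log("processing " + value);
            return value * 2;
        }
    }

    // Recording stub: captures every call so the test can verify the
    // component did the expected things with its interface-typed field.
    static class RecordingLogger implements Logger {
        final List<String> messages = new ArrayList<String>();
        public void log(String message) { messages.add(message); }
    }

    private RecordingLogger logger;
    private Processor processor;

    // Mirrors JUnit's setUp(): build fresh fixtures for every test
    // so tests cannot interfere with one another.
    void setUp() {
        logger = new RecordingLogger();
        processor = new Processor(logger);
    }

    // Mirrors JUnit's tearDown(): release fixtures, leaving the
    // environment in the state the test found it.
    void tearDown() {
        logger = null;
        processor = null;
    }

    // Verify both the return value and the interaction with the stub.
    public static boolean testProcessLogsAndDoubles() {
        ProcessorTest test = new ProcessorTest();
        test.setUp();
        try {
            int result = test.processor.process(21);
            return result == 42
                && test.logger.messages.size() == 1
                && "processing 21".equals(test.logger.messages.get(0));
        } finally {
            test.tearDown();
        }
    }
}
```

The recording stub is what lets you test the component's behavior *toward* the interface, not just its return values; the same pattern also works for forcing the stub to throw, verifying the component reacts correctly to exceptions from methods invoked on its arguments.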
Make sure you write a comprehensive unit testing suite. Inadequate tests can cost you dearly because there are several scorecard items pertaining to the unit test suite. Tests should cover all non-private methods thoroughly, and they should be well documented. Doing a good job on the tests can be a big win because the unit test suite is one of the areas that is more frequently neglected.
This is another spot where you could lose a LOT of points. If the engineer cannot deploy your application, you will be notified and potentially fail review. Make sure your documentation clearly defines how to build and deploy your solution. The more detail you provide here, the better. Be meticulous in your documentation and steps. This is one area of the competition where the review scorecard is very unforgiving. Please view the deployment template, found in the distribution.
You will package your submission and upload the submission to Project Submit and Review.
The submission is now out of your hands, and the Review Board will judge it based on a scorecard. Sit back, relax and wait for your chance to appeal the review.
After review is complete, you will view and can appeal the score your application was given. You'll see the comments from the Review Board on each point in the Scorecard, and you can dispute any point. Please keep in mind that you must have a very good reason to appeal or it will be denied. If the reviewer makes a statement that is in conflict with the design, or if the reviewer has made an oversight, feel free to appeal. On the other hand, matters of opinion may not be appealed. If you are the winning competitor, you will have the opportunity to discuss reviewers' judgments in Final Fixes.
Congratulations! You won the competition.
However, there is probably still work to be done. Your review scorecard will be available via Project Submit & Review, and there you will find any and all problems the Review Board found with your submission. It will contain required and recommended fixes to your submission.
All required fixes must be addressed. Your application WILL NOT be completed, and you will not receive any payment, until all required fixes have been addressed. Every recommended fix should be attempted. Obviously, recommended fixes are of lower priority than required fixes, but if there is time, they should also be completed.
More than at any other stage of the competition process, communication during Final Fixes is key.
If, for any reason, you have a question or comment on the review, post it as soon as possible in the contest forum and email the Project Manager. The PM will look into the issue and ensure that the primary reviewer is aware of the situation. If, for whatever reason, you feel a required fix is impossible within the given timeframe, you need to raise it on the forums. Do NOT just resubmit without completing a required fix, leaving only a comment in a readme. This is never acceptable. Communicate with the reviewers to avoid these conflicts! The more detail you can give, and the sooner you give it, the more smoothly this stage will go. The Review Board is always open to communication from your team; they may have overlooked something in the design or in your submission. Don't be afraid to question the Review Board. Questions and comments show that you're paying attention and actively involved; without communication, the Review Board doesn't know that you're working on the fixes and can't clarify their requirements.
Remember that ALL test cases must successfully pass in order to complete final review!
Final Submission and Review
Once you have successfully completed all of the required fixes, you should submit in the same manner as you originally did. Follow the same guidelines above for ensuring all your files are uploaded properly. Try to be as complete as possible before resubmitting. This saves everyone time, including you.
If you've met all the requirements, and all the tests still pass, the competition is complete and the application goes into preparation for deployment. Your work is done!
If anything is incomplete, you'll earn a return trip to Final Fixes.
Deployment and Support
The application will now be deployed for testing and certification. The winner will support any issues or bugs found during deployment for 30 days. After 30 days, any newly identified bugs will not be the winner's responsibility. If an enhancement is identified, TopCoder will pay the winner an additional fee to enhance the tests.