# Test procedures for the exploitation system

## Testing approach

Throughout the development of AVL, various testing methodologies are applied, each appropriate to the respective development stage.

### Requirements and use cases approach

User stories drive the initial development of AVL, as they define the capabilities to be implemented. They are primarily defined by the respective scientific experts but must be translated into specific work items by the product owner, who ensures a successful and efficient development process.

The product owner fully understands the use case to be implemented, remains in close contact with users, and creates and maintains the product backlog, a prioritized list of work items.

### Software development approach

At the beginning of each iteration cycle, the product owner prioritizes the items in the backlog of the use case and decides, together with the external users and the team, which items to implement in the next sprint. The product owner therefore interacts closely and frequently with both the development team and the users. It is thus essential that this person understands both sides in detail and can communicate with researchers from the agricultural sciences as well as with software developers and cloud engineers. Each iteration of the AVL system adds features that further implement at least one of the use cases. User stories are workflows that can be fully reproduced: first by the product owner, who is the first person external to the development team to validate any new capability of the system, and then by the other stakeholders, including the external users.

### Agile test approach

The Agricultural Virtual Laboratory is implemented following a Scrum software development approach. This approach advocates frequent releases in recurring iteration cycles, so-called sprints. The Scrum process defines clear roles: the product owner, who compiles requirements in the so-called backlog and prioritizes them; the Scrum master, who manages and facilitates the software development process in fixed-term sprints; and the team, which develops the software or service. The fixed length of the sprints leads to frequent releases, which are tested by the product owner and other users – ‘stakeholders’ in Scrum terminology. In this way, the software under development is continuously evaluated, and feedback or even new requirements can be addressed in the next iteration. Hence, the software co-evolves with the requirements of its users in an agile manner.

The AVL software development team engages in stand-up meetings, pair programming, unit testing, and test-driven development. These techniques and activities have been shown to improve software quality and responsiveness to changing customer requirements.

Test-driven development (TDD) is a technique that alerts the programmer to programming mistakes moments after they have been made – in fact a tool so powerful that it largely eliminates the need for debugging. TDD is a rapid cycle of testing, coding, and refactoring. When adding a feature, a pair of programmers may perform dozens of these cycles, implementing and refining the software in tiny steps until there is nothing left to add and nothing left to take away. Indeed, research has shown that TDD substantially reduces the incidence of defects. It also helps to improve design, document interfaces, and guard against future mistakes.
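As a minimal illustration of this cycle (the module and function names below are hypothetical examples, not part of the AVL code base), a failing test is written first:

```python
# test_indices.py -- a unit test written *before* the feature exists.
# Running it and watching it fail is the "red" step of the TDD cycle.

import pytest

from indices import ndvi  # hypothetical module under development


def test_ndvi_typical_values():
    # Healthy vegetation: NIR reflectance well above red reflectance.
    assert ndvi(nir=0.8, red=0.1) == pytest.approx(0.7778, abs=1e-4)


def test_ndvi_rejects_all_zero_input():
    with pytest.raises(ValueError):
        ndvi(nir=0.0, red=0.0)
```

The "green" step is the smallest implementation that makes these tests pass; the code is then refactored before the next cycle begins:

```python
# indices.py -- minimal implementation, added only after the tests above fail.

def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index for a single pixel."""
    if nir + red == 0:
        raise ValueError("NIR and red reflectance must not both be zero")
    return (nir - red) / (nir + red)
```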

Acceptance testing allows the product owner or other users to validate the software. The final acceptance tests are performed under the supervision of the product owner. In addition, continuous acceptance testing carried out by quality assurance engineers is integrated into the software development life cycle, revealing problems and erroneous developments at an early stage.

## Software verification

Unit-level testing ensures the correct functionality and validity of individual, usually relatively small, software units: a procedure or function in a procedural language, or a class in an object-oriented language. Explicit dependencies on external interface implementations are avoided by means of mock-up implementations, resulting in full control of the test environment. Unit-level tests are defined and carried out by software developers. The primary inputs for the definition of unit-level tests are common sense and domain-specific knowledge, as well as the requirements provided by the product owner or AVL users.
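For illustration, a unit-level test can isolate a component from an external service by substituting a mock object. The catalogue interface and the function under test in the sketch below are hypothetical and serve only to demonstrate the technique:

```python
# test_catalogue_summary.py -- sketch of a unit test that replaces an external
# interface with a mock, so the test runs without network access or real data.
# Both the catalogue interface and the function under test are illustrative,
# not actual AVL code.

from collections import Counter
from unittest.mock import Mock


def count_datasets_per_year(catalogue) -> dict:
    """Group the catalogue's dataset paths by the year component of the path."""
    paths = catalogue.list_datasets()
    return dict(Counter(path.split("/")[2] for path in paths))


def test_count_datasets_per_year_groups_by_year():
    # A Mock stands in for the external catalogue service.
    catalogue = Mock()
    catalogue.list_datasets.return_value = [
        "avl/l3b/2019/bel/S2_L3B.zarr",
        "avl/l3b/2019/fra/S2_L3B.zarr",
        "avl/l3b/2020/bel/S2_L3B.zarr",
    ]

    counts = count_datasets_per_year(catalogue)

    catalogue.list_datasets.assert_called_once()
    assert counts == {"2019": 2, "2020": 1}
```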

As part of the test-driven, incremental software development, unit-level tests are defined and coded before a new feature is implemented. A developer realises a new feature by making its associated unit-level tests run successfully.

Integration testing ensures that the developed software behaves as expected within the target environment. Quality testers define integration tests for which the primary input is the Requirements Baseline. Implicit and informal integration testing is also carried out by developers while running or debugging the software within the integrated development environment.
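For example, an integration test may open one of the published datasets in the target environment and verify its structure. The following is only a sketch: the store path, dimension names, and the use of xarray are assumptions for illustration, not a definitive description of the AVL data access layer.

```python
# test_dataset_integration.py -- sketch of an integration test that exercises
# the real data access path instead of mocks. The path and expected dimension
# names are illustrative assumptions.

import xarray as xr

DATASET_PATH = "avl/l3b/2019/bel/S2_L3B.zarr"  # assumed local or mounted copy


def test_l3b_dataset_opens_and_has_expected_structure():
    ds = xr.open_zarr(DATASET_PATH)

    # The dataset must expose geographic dimensions and at least one variable.
    assert {"time", "lat", "lon"} <= set(ds.dims)  # assumed dimension names
    assert len(ds.data_vars) > 0

    # Every variable should be chunked, so it can be read lazily and in parallel.
    for name, var in ds.data_vars.items():
        assert var.chunks is not None, f"{name} is not chunked"
```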

In addition, integration tests run automatically on our build servers. Rebuilding takes place whenever source code in the repository has changed. If there is a problem with the build, an email notification is sent to the responsible developers. Integration tests include unit-level tests for testing individual classes separately (as described above) as well as higher-level tests for verifying concepts.

Stress tests will be conducted to assess performance in extreme cases, such as reading or processing large datasets. The stress tests will be benchmarked to provide input for optimization tasks, if necessary.
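A stress test of this kind might, for instance, time the complete read of a large variable and record the throughput as a benchmark. The sketch below uses a placeholder dataset path and an arbitrary time budget; both are illustrative assumptions.

```python
# benchmark_read.py -- sketch of a simple stress/benchmark test that times
# reading a complete variable from a Zarr dataset. The path and the
# 120-second threshold are placeholders, not agreed performance requirements.

import time

import xarray as xr

DATASET_PATH = "avl/l3b/2019/bel/S2_L3B_NDVI.zarr"  # assumed path
MAX_SECONDS = 120.0                                  # placeholder budget


def test_full_read_within_budget():
    ds = xr.open_zarr(DATASET_PATH)
    variable = next(iter(ds.data_vars))   # first variable in the dataset

    start = time.perf_counter()
    data = ds[variable].load()            # force all chunks to be read
    elapsed = time.perf_counter() - start

    print(f"read {data.nbytes / 1e6:.1f} MB in {elapsed:.1f} s")
    assert elapsed < MAX_SECONDS
```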

System tests will be defined and carried out using the AVL system as installed from automatically built installer packages.

The unit tests are defined in the AVL GitHub repository and run automatically after every change in the code; the test results are published on AppVeyor.

## Software validation

Software verification is the process of ensuring that adequate specifications and inputs exist for all software development activities, and that the outputs of these activities are correct and consistent with the specifications and inputs. The final outputs of this procedure are verified software modules and an updated Software Version Procedure.

Software validation is the process of ensuring that the functions and performances of the requirements baseline are correctly and completely implemented in the final software. Software product assurance ensures software validity by testing on three major levels: unit-level testing, integration testing, and acceptance testing. All problems related to software components are documented and followed up in an issue tracking system.

## Dataset validation

Datasets are validated with the `ver` and `verreport` commands of the `avl` command-line tool, which automatically check that they conform to the AVL dataset conventions.
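A batch validation run over the published datasets might look like the following sketch. It assumes, purely for illustration, that `ver` can be invoked as `avl ver <dataset-path>` and signals failure through a non-zero exit code; the actual invocation should be taken from the tool's documentation.

```python
# validate_datasets.py -- sketch of a batch validation run over the datasets
# listed in the table below. The invocation `avl ver <path>` and the use of
# the exit code are assumptions for illustration only.

import subprocess

DATASETS = [
    "avl/l2a-s1/2019/bel/S1_L2_BCK_VH.zarr",
    "avl/l3b/2019/bel/S2_L3B.zarr",
    # ... remaining dataset paths from the table below
]

for path in DATASETS:
    result = subprocess.run(["avl", "ver", path], capture_output=True, text=True)
    status = "PASS" if result.returncode == 0 else "FAIL"
    print(f"{path} {status}")
```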

Results of the last validation run (carried out on 2022-01-04) are given in the table below.

| Dataset | Result |
|---------|--------|
| avl/l2a-s1/2019/bel/S1_L2_BCK_VH.zarr | PASS |
| avl/l2a-s1/2019/bel/S1_L2_BCK_VV.zarr | PASS |
| avl/l2a-s1/2019/bel/S1_L2_COH_VH.zarr | PASS |
| avl/l2a-s1/2019/bel/S1_L2_COH_VV.zarr | PASS |
| avl/l2a-s1/2019/fra/S1_L2_BCK_VH.zarr | PASS |
| avl/l2a-s1/2019/fra/S1_L2_BCK_VV.zarr | PASS |
| avl/l2a-s1/2019/fra/S1_L2_COH_VH.zarr | PASS |
| avl/l2a-s1/2019/fra/S1_L2_COH_VV.zarr | PASS |
| avl/l2a-s1/2020/bel/S1_L2_BCK_VH.zarr | PASS |
| avl/l2a-s1/2020/bel/S1_L2_BCK_VV.zarr | PASS |
| avl/l2a-s1/2020/bel/S1_L2_COH_VH.zarr | PASS |
| avl/l2a-s1/2020/bel/S1_L2_COH_VV.zarr | PASS |
| avl/l3b/2019/bel/S2_L3B.zarr | PASS |
| avl/l3b/2019/bel/S2_L3B_FAPAR.zarr | PASS |
| avl/l3b/2019/bel/S2_L3B_FCOVER.zarr | PASS |
| avl/l3b/2019/bel/S2_L3B_LAI.zarr | PASS |
| avl/l3b/2019/bel/S2_L3B_NDVI.zarr | PASS |
| avl/l3b/2019/fra/S2_L3B.zarr | PASS |
| avl/l3b/2019/fra/S2_L3B_FAPAR.zarr | PASS |
| avl/l3b/2019/fra/S2_L3B_FCOVER.zarr | PASS |
| avl/l3b/2019/fra/S2_L3B_LAI.zarr | PASS |
| avl/l3b/2019/fra/S2_L3B_NDVI.zarr | PASS |
| avl/l3b/2020/bel/S2_L3B.zarr | PASS |
| avl/l3b/2020/bel/S2_L3B_FAPAR.zarr | PASS |
| avl/l3b/2020/bel/S2_L3B_FCOVER.zarr | PASS |
| avl/l3b/2020/bel/S2_L3B_LAI.zarr | PASS |
| avl/l3b/2020/bel/S2_L3B_NDVI.zarr | PASS |
| s2/belgium/31ufr_31ufs.zarr | PASS |
| s2/france/30twt_30txt.zarr | PASS |