Validation

The CAM repository provides infrastructure for running unit tests and integration tests.

Unit Tests

C-based CAM components

The CAM repository uses CTest for running unit tests on the C-based CAM components cam-service, cam-uuid, and libcam. Each CAM component has a test directory that hosts its unit-test files; for example, cam-service/test hosts the files needed to run the unit tests for cam-service. Refer to CTest for more information on developing and running tests.

Before running CTest, make sure to follow the steps listed in the Build section.

Run CTest:

cd ~/cam/build
ctest -V

The output on the terminal should look like the example below; directory names might vary:

UpdateCTestConfiguration  from :/home/cam_user/cam/critical-app-monitoring/build/DartConfiguration.tcl
Parse Config file:/home/cam_user/cam/critical-app-monitoring/build/DartConfiguration.tcl
UpdateCTestConfiguration  from :/home/cam_user/cam/critical-app-monitoring/build/DartConfiguration.tcl
Parse Config file:/home/cam_user/cam/critical-app-monitoring/build/DartConfiguration.tcl
Test project /home/cam_user/cam/critical-app-monitoring/build
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
  Start 1: cam_uuid_test

1: Test command: /home/cam_user/cam/critical-app-monitoring/build/cam-uuid/test/cam_uuid_test
1: Test timeout computed to be: 1500

...

7: Suite: libcam
7:   Test: version ...passed
7:   Test: init ...passed
7:   Test: stream init ...passed
7:   Test: stream start ...passed
7:   Test: stream event ...passed
7:   Test: stream stop ...passed
7:   Test: end ...passed
7:   Test: calibration invalid case ...passed
7:   Test: calibration ...passed
7:
7: Run Summary:    Type  Total    Ran Passed Failed Inactive
7:               suites      1      1    n/a      0        0
7:                tests      9      9      9      0        0
7:              asserts    369    369    369      0      n/a
7:
7: Elapsed time =    0.001 seconds
7:
7:
7/7 Test #7: libcam-test ......................   Passed    0.05 sec

100% tests passed, 0 tests failed out of 7

Total Test time (real) =   6.85 sec

Python-based CAM component

The CAM project uses Pytest for the Python-based cam-tool component. Refer to cam-tool/test for more information on the unit tests for cam-tool. Refer to pytest for more information on developing and running tests with Pytest.
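
As a rough illustration of the Pytest style used for the unit tests, a minimal test module might look like the sketch below. The encode/decode helpers are stand-ins defined inside the sketch and are not part of the cam-tool API; they only demonstrate the plain-assert and pytest.raises patterns that Pytest tests are built on.

import json

import pytest


def encode(payload: dict) -> bytes:
    # Stand-in helper (not part of cam-tool): serialize a message to bytes.
    return json.dumps(payload).encode("utf-8")


def decode(raw: bytes) -> dict:
    # Stand-in helper (not part of cam-tool): deserialize bytes back into a dict.
    return json.loads(raw.decode("utf-8"))


def test_roundtrip():
    # Pytest discovers test_* functions and relies on plain assert statements.
    payload = {"event": "heartbeat", "id": 1}
    assert decode(encode(payload)) == payload


def test_decode_rejects_invalid_input():
    # Expected failures are expressed with pytest.raises.
    with pytest.raises(ValueError):
        decode(b"not valid json")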

Before running Pytest, make sure to follow the steps listed in the Build section. It is also required to follow the steps listed in the Setup terminal environment section in the terminal where Pytest will be run.

Run Pytest:

cd ~/cam/critical-app-monitoring
pytest cam-tool/test --junitxml=./cam-tool-report.xml

The output on the terminal should look like the example below; directory names might vary:

============================================= test session starts ==============================================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.4.0
rootdir: /home/cam_user/cam/critical-app-monitoring
collected 19 items

cam-tool/test/test_cam_csc.py ........    [ 42%]
cam-tool/test/test_cam_csd.py ......      [ 73%]
cam-tool/test/test_cam_csel.py ...        [ 89%]
cam-tool/test/test_cam_message.py ..      [100%]

---------- generated xml file: /home/cam_user/cam/critical-app-monitoring/cam-tool-report.xml -----------
============================================== 19 passed in 0.08s ==============================================

Integration Tests

The CAM project uses Pytest for running integration tests on the CAM components. The integration tests consist of a series of tests that can be found in the test directory. They are broadly classified into:

  • Calibration Tests: Various tests where cam-app-example runs in calibration mode to generate CSEL files and cam-tool performs analysis on the generated CSEL files.

  • Configuration Deploy Tests: Various tests covering the analysis and deployment of event streams by cam-tool and their effect on cam-service.

  • Static Configuration Tests: Various tests where cam-app-example starts with different configurations and the behavior of cam-service is checked.

Before running Pytest, make sure to follow the steps listed in the Build section. It is also required to follow the steps listed in the Setup terminal environment section in the terminal where Pytest will be run.

Run integration tests:

cd ~/cam/critical-app-monitoring
pytest --basetemp=./logs --install-dir=~/cam/cam-packages/ --junitxml=./cam-tests-report.xml ./test/

The output on the terminal should look like the example below; directory names might vary:

============================================= test session starts =============================================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.4.0
rootdir: /home/cam_user/cam
collected 20 items

test/test_calibration.py .......          [ 35%]
test/test_config_deploy.py ...            [ 50%]
test/test_static_config.py ..........     [100%]

--------- generated xml file: /home/cam_user/cam/critical-app-monitoring/cam-tests-report.xml ----------
======================================== 20 passed in 121.62s (0:02:01) =======================================
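
The --install-dir option above (and the --with-tcpdump option described below) is not a built-in Pytest flag; it is defined by the CAM test suite itself. The following conftest.py sketch only illustrates the usual mechanism, assuming the suite wires its options through Pytest's pytest_addoption hook; the actual implementation may differ:

import pytest


def pytest_addoption(parser):
    # Register the suite-specific command-line options with Pytest.
    parser.addoption(
        "--install-dir",
        action="store",
        help="Directory containing the built CAM packages",
    )
    parser.addoption(
        "--with-tcpdump",
        action="store",
        default=None,
        help="Absolute path to the tcpdump binary (optional)",
    )


@pytest.fixture
def install_dir(request):
    # Tests request this fixture to read the value passed on the command line.
    return request.config.getoption("--install-dir")


@pytest.fixture
def tcpdump_path(request):
    return request.config.getoption("--with-tcpdump")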

The integration tests support tcpdump, a well-known tool for recording and inspecting network traffic. Calling Pytest with the optional parameter --with-tcpdump <absolute path to tcpdump binary> starts tcpdump for the relevant tests in test_static_config.py and records the network traffic exchanged by cam-app-example and cam-service. Afterwards, it generates a dump.pcap file with the raw network data and a readable tcpdump.log file containing the recorded network exchange.

Run integration tests with tcpdump:

cd ~/cam/critical-app-monitoring
pytest --basetemp=./logs --install-dir=~/cam/cam-packages/ \
  --junitxml=./cam-tests-report.xml ./test/ --with-tcpdump <tcpdump binary path>

The output on the terminal should look like the example above; the total time might increase slightly due to the additional tcpdump execution.
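
As a rough sketch of what --with-tcpdump does under the hood (the test suite's actual implementation may differ), the capture can be wrapped in a Pytest fixture that starts tcpdump before a test, stops it afterwards, and converts the raw capture into a readable log. The interface name and file locations below are placeholders, and tcpdump usually needs elevated privileges to capture traffic:

import signal
import subprocess

import pytest


@pytest.fixture
def tcpdump_capture(tmp_path, tcpdump_path):
    # tcpdump_path is the fixture sketched in the conftest.py example above.
    if tcpdump_path is None:
        yield None  # capture disabled when --with-tcpdump is not given
        return

    pcap_file = tmp_path / "dump.pcap"
    proc = subprocess.Popen(
        [tcpdump_path, "-i", "lo", "-w", str(pcap_file)],  # record loopback traffic
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    yield pcap_file  # the test runs while tcpdump records

    proc.send_signal(signal.SIGINT)  # stop tcpdump cleanly so the pcap is flushed
    proc.wait(timeout=10)

    # Produce a readable log alongside the raw capture, similar to tcpdump.log.
    with open(tmp_path / "tcpdump.log", "w") as log:
        subprocess.run([tcpdump_path, "-r", str(pcap_file)], stdout=log, check=False)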