Scheduled Checks

Scheduled Checks #9

GitHub Actions / Test Results failed Jun 3, 2024 in 0s

1 fail, 1 pass in 15m 16s

2 tests   1 ✅  15m 16s ⏱️
1 suites  0 💤
1 files    1 ❌

Results for commit c30983f.

Annotations

Check warning on line 0 in test_bit_reproducibility.TestBitReproducibility


github-actions / Test Results

test_restart_repro (test_bit_reproducibility.TestBitReproducibility) failed

/opt/testing/checksum/test_report.xml [took 10m 16s]
Raw output
assert False
self = <test_bit_reproducibility.TestBitReproducibility object at 0x7f9cfbd57370>
output_path = PosixPath('/scratch/tm70/repro-ci/experiments/access-om2/release-1deg_jra55_iaf_bgc-2.0')
control_path = PosixPath('/scratch/tm70/repro-ci/experiments/access-om2/release-1deg_jra55_iaf_bgc-2.0/base-experiment')

    @pytest.mark.checksum
    def test_restart_repro(self, output_path: Path, control_path: Path):
        """
        Test that a run reproduces across restarts.
        """
        # First do two short (1 day) runs.
        exp_2x1day = setup_exp(control_path, output_path,
                               'test_restart_repro_2x1day')
    
        # Reconfigure to a 1 day run.
        exp_2x1day.model.set_model_runtime(seconds=86400)
    
        # Now run twice.
        exp_2x1day.setup_and_run()
        exp_2x1day.force_qsub_run()
    
        # Now do a single 2 day run
        exp_2day = setup_exp(control_path, output_path,
                             'test_restart_repro_2day')
        # Reconfigure
        exp_2day.model.set_model_runtime(seconds=172800)
    
        # Run once.
        exp_2day.setup_and_run()
    
        # Now compare the output between our two short and one long run.
        checksums_1d_0 = exp_2x1day.extract_checksums()
        checksums_1d_1 = exp_2x1day.extract_checksums(exp_2x1day.output001)
    
        checksums_2d = exp_2day.extract_checksums()
    
        # Use the model-specific comparison method for checksums
        model = exp_2day.model
        matching_checksums = model.check_checksums_over_restarts(
            long_run_checksum=checksums_2d,
            short_run_checksum_0=checksums_1d_0,
            short_run_checksum_1=checksums_1d_1
        )
    
        if not matching_checksums:
            # Write checksums out to file
            with open(output_path / 'restart-1d-0-checksum.json', 'w') as file:
                json.dump(checksums_1d_0, file, indent=2)
            with open(output_path / 'restart-1d-1-checksum.json', 'w') as file:
                json.dump(checksums_1d_1, file, indent=2)
            with open(output_path / 'restart-2d-0-checksum.json', 'w') as file:
                json.dump(checksums_2d, file, indent=2)
    
>       assert matching_checksums
E       assert False

/g/data/tm70/repro-ci/test/model-config-tests/test_bit_reproducibility.py:128: AssertionError
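The failing assertion above comes from comparing the checksums of two consecutive 1-day runs against a single 2-day run: for the model to be bitwise reproducible across restarts, the concatenated short-run checksums must match the long-run checksums exactly. A minimal sketch of that comparison is below; the function name mirrors `check_checksums_over_restarts` from the traceback, but the dict layout (field name mapped to an ordered list of checksum strings) and the merge logic are assumptions for illustration, not the actual model-config-tests implementation.

```python
def check_checksums_over_restarts(long_run_checksum,
                                  short_run_checksum_0,
                                  short_run_checksum_1):
    """Return True if two consecutive short runs reproduce one long run.

    Each argument is assumed to map a field name to an ordered list of
    per-timestep checksum strings (a hypothetical layout).
    """
    # Concatenate the short-run checksums in time order: run 0, then run 1.
    combined = {}
    for checksums in (short_run_checksum_0, short_run_checksum_1):
        for field, values in checksums.items():
            combined.setdefault(field, []).extend(values)
    # Bitwise reproducibility requires an exact match, field by field.
    return combined == long_run_checksum


# Example with made-up checksum values:
long_run = {"temp": ["0xAB", "0xCD"], "salt": ["0x01", "0x02"]}
run_0 = {"temp": ["0xAB"], "salt": ["0x01"]}
run_1 = {"temp": ["0xCD"], "salt": ["0x02"]}
print(check_checksums_over_restarts(long_run, run_0, run_1))  # True
```

A failure like the one reported here means the concatenated short-run checksums diverged from the long run, which is why the test dumps all three checksum sets to JSON files for inspection before asserting.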

Check notice on line 0 in .github


github-actions / Test Results

2 tests found

There are 2 tests, see "Raw output" for the full list of tests.
Raw output
test_bit_reproducibility.TestBitReproducibility ‑ test_bit_repro_historical
test_bit_reproducibility.TestBitReproducibility ‑ test_restart_repro