Columns: Node ID, Elapsed Time, Timestamp, Sequence #, Log Level, Log Marker, Thread, Class, Message
node0 0.000ns 2025-09-26 05:20:32.032 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node0 86.000ms 2025-09-26 05:20:32.118 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 100.000ms 2025-09-26 05:20:32.132 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 209.000ms 2025-09-26 05:20:32.241 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 215.000ms 2025-09-26 05:20:32.247 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 226.000ms 2025-09-26 05:20:32.258 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 286.000ms 2025-09-26 05:20:32.318 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 372.000ms 2025-09-26 05:20:32.404 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 387.000ms 2025-09-26 05:20:32.419 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 497.000ms 2025-09-26 05:20:32.529 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 503.000ms 2025-09-26 05:20:32.535 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 515.000ms 2025-09-26 05:20:32.547 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 633.000ms 2025-09-26 05:20:32.665 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 634.000ms 2025-09-26 05:20:32.666 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 718.000ms 2025-09-26 05:20:32.750 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 810.000ms 2025-09-26 05:20:32.842 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 826.000ms 2025-09-26 05:20:32.858 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 924.000ms 2025-09-26 05:20:32.956 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 924.000ms 2025-09-26 05:20:32.956 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 941.000ms 2025-09-26 05:20:32.973 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 947.000ms 2025-09-26 05:20:32.979 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 960.000ms 2025-09-26 05:20:32.992 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 1.389s 2025-09-26 05:20:33.421 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 1.390s 2025-09-26 05:20:33.422 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 1.506s 2025-09-26 05:20:33.538 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 871ms
node0 1.514s 2025-09-26 05:20:33.546 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 1.516s 2025-09-26 05:20:33.548 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 1.562s 2025-09-26 05:20:33.594 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 1.619s 2025-09-26 05:20:33.651 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 1.620s 2025-09-26 05:20:33.652 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 1.855s 2025-09-26 05:20:33.887 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 929ms
node2 1.863s 2025-09-26 05:20:33.895 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.866s 2025-09-26 05:20:33.898 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.906s 2025-09-26 05:20:33.938 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.966s 2025-09-26 05:20:33.998 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.967s 2025-09-26 05:20:33.999 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 2.439s 2025-09-26 05:20:34.471 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1048ms
node4 2.448s 2025-09-26 05:20:34.480 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 2.450s 2025-09-26 05:20:34.482 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 2.490s 2025-09-26 05:20:34.522 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 2.550s 2025-09-26 05:20:34.582 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 2.551s 2025-09-26 05:20:34.583 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 2.687s 2025-09-26 05:20:34.719 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 2.788s 2025-09-26 05:20:34.820 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 2.806s 2025-09-26 05:20:34.838 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 2.939s 2025-09-26 05:20:34.971 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 2.946s 2025-09-26 05:20:34.978 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 2.960s 2025-09-26 05:20:34.992 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 3.420s 2025-09-26 05:20:35.452 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 3.421s 2025-09-26 05:20:35.453 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 3.626s 2025-09-26 05:20:35.658 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 3.715s 2025-09-26 05:20:35.747 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 3.717s 2025-09-26 05:20:35.749 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 3.718s 2025-09-26 05:20:35.750 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 3.934s 2025-09-26 05:20:35.966 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 4.018s 2025-09-26 05:20:36.050 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.021s 2025-09-26 05:20:36.053 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 4.022s 2025-09-26 05:20:36.054 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.449s 2025-09-26 05:20:36.481 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.452s 2025-09-26 05:20:36.484 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 4.457s 2025-09-26 05:20:36.489 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 4.467s 2025-09-26 05:20:36.499 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.468s 2025-09-26 05:20:36.500 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.479s 2025-09-26 05:20:36.511 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1057ms
node3 4.487s 2025-09-26 05:20:36.519 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 4.491s 2025-09-26 05:20:36.523 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 4.534s 2025-09-26 05:20:36.566 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 4.564s 2025-09-26 05:20:36.596 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 4.598s 2025-09-26 05:20:36.630 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 4.599s 2025-09-26 05:20:36.631 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 4.671s 2025-09-26 05:20:36.703 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.675s 2025-09-26 05:20:36.707 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 4.676s 2025-09-26 05:20:36.708 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 4.772s 2025-09-26 05:20:36.804 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.775s 2025-09-26 05:20:36.807 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.781s 2025-09-26 05:20:36.813 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.791s 2025-09-26 05:20:36.823 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.793s 2025-09-26 05:20:36.825 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.854s 2025-09-26 05:20:36.886 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 4.963s 2025-09-26 05:20:36.995 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 4.982s 2025-09-26 05:20:37.014 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 5.125s 2025-09-26 05:20:37.157 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 5.134s 2025-09-26 05:20:37.166 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 5.149s 2025-09-26 05:20:37.181 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.526s 2025-09-26 05:20:37.558 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.529s 2025-09-26 05:20:37.561 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5.535s 2025-09-26 05:20:37.567 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 5.545s 2025-09-26 05:20:37.577 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.547s 2025-09-26 05:20:37.579 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.569s 2025-09-26 05:20:37.601 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26294707]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=138620, randomLong=5424172282803875398, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11240, randomLong=3827597745712170633, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1080950, data=35, exception=null]
OS Health Check Report - Complete (took 1020 ms)
node0 5.601s 2025-09-26 05:20:37.633 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 5.608s 2025-09-26 05:20:37.640 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 5.613s 2025-09-26 05:20:37.645 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 5.625s 2025-09-26 05:20:37.657 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 5.627s 2025-09-26 05:20:37.659 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 5.693s 2025-09-26 05:20:37.725 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ij9eew==", "port": 30124 }, { "ipAddressV4": "CoAAEg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMbUwA==", "port": 30125 }, { "ipAddressV4": "CoAAEQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I8FKkQ==", "port": 30126 }, { "ipAddressV4": "CoAAEA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ih4HYg==", "port": 30127 }, { "ipAddressV4": "CoAAFQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IkNwDw==", "port": 30128 }, { "ipAddressV4": "CoAAFg==", "port": 30128 }] }] }
node0 5.713s 2025-09-26 05:20:37.745 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 5.714s 2025-09-26 05:20:37.746 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 5.728s 2025-09-26 05:20:37.760 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 1c187dd57b8ebf1c936e62d0e6d7161bfa6c6c3aee12a4f3e158f04df7aa12aa00eacb45009c99dcf3610b97c05014d0
(root) ConsistencyTestingToolState / loan-rival-impact-broken
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
node2 5.900s 2025-09-26 05:20:37.932 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26221955]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=128270, randomLong=-3028523582152879772, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=7310, randomLong=5976280078513544428, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1048980, data=35, exception=null]
OS Health Check Report - Complete (took 1021 ms)
node2 5.932s 2025-09-26 05:20:37.964 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.939s 2025-09-26 05:20:37.971 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 5.944s 2025-09-26 05:20:37.976 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 5.958s 2025-09-26 05:20:37.990 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 5.962s 2025-09-26 05:20:37.994 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 5.967s 2025-09-26 05:20:37.999 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 5.967s 2025-09-26 05:20:37.999 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 5.968s 2025-09-26 05:20:38.000 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 5.972s 2025-09-26 05:20:38.004 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 5.973s 2025-09-26 05:20:38.005 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 5.973s 2025-09-26 05:20:38.005 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 5.975s 2025-09-26 05:20:38.007 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 5.975s 2025-09-26 05:20:38.007 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 5.976s 2025-09-26 05:20:38.008 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 5.978s 2025-09-26 05:20:38.010 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 5.979s 2025-09-26 05:20:38.011 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 192.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 5.984s 2025-09-26 05:20:38.016 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.018s 2025-09-26 05:20:38.050 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ij9eew==", "port": 30124 }, { "ipAddressV4": "CoAAEg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMbUwA==", "port": 30125 }, { "ipAddressV4": "CoAAEQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I8FKkQ==", "port": 30126 }, { "ipAddressV4": "CoAAEA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ih4HYg==", "port": 30127 }, { "ipAddressV4": "CoAAFQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IkNwDw==", "port": 30128 }, { "ipAddressV4": "CoAAFg==", "port": 30128 }] }] }
node2 6.038s 2025-09-26 05:20:38.070 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 6.038s 2025-09-26 05:20:38.070 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 6.053s 2025-09-26 05:20:38.085 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 1c187dd57b8ebf1c936e62d0e6d7161bfa6c6c3aee12a4f3e158f04df7aa12aa00eacb45009c99dcf3610b97c05014d0
(root) ConsistencyTestingToolState / loan-rival-impact-broken
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
node2 6.287s 2025-09-26 05:20:38.319 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.292s 2025-09-26 05:20:38.324 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.297s 2025-09-26 05:20:38.329 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.298s 2025-09-26 05:20:38.330 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.299s 2025-09-26 05:20:38.331 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 6.302s 2025-09-26 05:20:38.334 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 6.303s 2025-09-26 05:20:38.335 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.304s 2025-09-26 05:20:38.336 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.306s 2025-09-26 05:20:38.338 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.307s 2025-09-26 05:20:38.339 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.308s 2025-09-26 05:20:38.340 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.309s 2025-09-26 05:20:38.341 56 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 192.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.309s 2025-09-26 05:20:38.341 57 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.314s 2025-09-26 05:20:38.346 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6.657s 2025-09-26 05:20:38.689 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26226027]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=141040, randomLong=-284235024437825432, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=12560, randomLong=2445668672582282017, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1123078, data=35, exception=null]
OS Health Check Report - Complete (took 1022 ms)
node4 6.690s 2025-09-26 05:20:38.722 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 6.694s 2025-09-26 05:20:38.726 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 6.698s 2025-09-26 05:20:38.730 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6.704s 2025-09-26 05:20:38.736 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 6.781s 2025-09-26 05:20:38.813 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.784s 2025-09-26 05:20:38.816 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 6.785s 2025-09-26 05:20:38.817 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 6.785s 2025-09-26 05:20:38.817 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ij9eew==", "port": 30124 }, { "ipAddressV4": "CoAAEg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMbUwA==", "port": 30125 }, { "ipAddressV4": "CoAAEQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I8FKkQ==", "port": 30126 }, { "ipAddressV4": "CoAAEA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ih4HYg==", "port": 30127 }, { "ipAddressV4": "CoAAFQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IkNwDw==", "port": 30128 }, { "ipAddressV4": "CoAAFg==", "port": 30128 }] }] }
node4 6.807s 2025-09-26 05:20:38.839 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6.807s 2025-09-26 05:20:38.839 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 6.822s 2025-09-26 05:20:38.854 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 1c187dd57b8ebf1c936e62d0e6d7161bfa6c6c3aee12a4f3e158f04df7aa12aa00eacb45009c99dcf3610b97c05014d0
(root) ConsistencyTestingToolState / loan-rival-impact-broken
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
node1 6.953s 2025-09-26 05:20:38.985 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1325ms
node1 6.962s 2025-09-26 05:20:38.994 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 6.966s 2025-09-26 05:20:38.998 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 7.017s 2025-09-26 05:20:39.049 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 7.019s 2025-09-26 05:20:39.051 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 7.024s 2025-09-26 05:20:39.056 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 7.028s 2025-09-26 05:20:39.060 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 7.029s 2025-09-26 05:20:39.061 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 7.030s 2025-09-26 05:20:39.062 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 7.033s 2025-09-26 05:20:39.065 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 7.034s 2025-09-26 05:20:39.066 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 7.035s 2025-09-26 05:20:39.067 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 7.036s 2025-09-26 05:20:39.068 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 7.037s 2025-09-26 05:20:39.069 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 7.038s 2025-09-26 05:20:39.070 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 7.039s 2025-09-26 05:20:39.071 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 7.041s 2025-09-26 05:20:39.073 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 162.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 7.046s 2025-09-26 05:20:39.078 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 7.096s 2025-09-26 05:20:39.128 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 7.098s 2025-09-26 05:20:39.130 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 7.698s 2025-09-26 05:20:39.730 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.702s 2025-09-26 05:20:39.734 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 7.709s 2025-09-26 05:20:39.741 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 7.722s 2025-09-26 05:20:39.754 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.724s 2025-09-26 05:20:39.756 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 8.834s 2025-09-26 05:20:40.866 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26191033]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=132510, randomLong=-358946957532658061, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11160, randomLong=-4695673305978466651, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1251186, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node3 8.872s 2025-09-26 05:20:40.904 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 8.882s 2025-09-26 05:20:40.914 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 8.888s 2025-09-26 05:20:40.920 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 8.977s 2025-09-26 05:20:41.009 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node3 8.977s 2025-09-26 05:20:41.009 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ij9eew==", "port": 30124 }, { "ipAddressV4": "CoAAEg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMbUwA==", "port": 30125 }, { "ipAddressV4": "CoAAEQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I8FKkQ==", "port": 30126 }, { "ipAddressV4": "CoAAEA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ih4HYg==", "port": 30127 }, { "ipAddressV4": "CoAAFQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IkNwDw==", "port": 30128 }, { "ipAddressV4": "CoAAFg==", "port": 30128 }] }] }
node0 8.979s 2025-09-26 05:20:41.011 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 9.002s 2025-09-26 05:20:41.034 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 9.003s 2025-09-26 05:20:41.035 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 9.020s 2025-09-26 05:20:41.052 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 1c187dd57b8ebf1c936e62d0e6d7161bfa6c6c3aee12a4f3e158f04df7aa12aa00eacb45009c99dcf3610b97c05014d0 (root) ConsistencyTestingToolState / loan-rival-impact-broken 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
node3 9.251s 2025-09-26 05:20:41.283 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 9.257s 2025-09-26 05:20:41.289 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 9.263s 2025-09-26 05:20:41.295 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 9.264s 2025-09-26 05:20:41.296 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 9.266s 2025-09-26 05:20:41.298 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 9.270s 2025-09-26 05:20:41.302 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 9.272s 2025-09-26 05:20:41.304 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 9.273s 2025-09-26 05:20:41.305 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 9.275s 2025-09-26 05:20:41.307 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node1 9.276s 2025-09-26 05:20:41.308 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 9.276s 2025-09-26 05:20:41.308 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 9.277s 2025-09-26 05:20:41.309 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 9.278s 2025-09-26 05:20:41.310 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 9.279s 2025-09-26 05:20:41.311 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 188.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 9.284s 2025-09-26 05:20:41.316 58 INFO PLATFORM_STATUS <platformForkJoinThread-3> DefaultStatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 9.311s 2025-09-26 05:20:41.343 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.313s 2025-09-26 05:20:41.345 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 9.387s 2025-09-26 05:20:41.419 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 9.391s 2025-09-26 05:20:41.423 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 9.392s 2025-09-26 05:20:41.424 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 10.041s 2025-09-26 05:20:42.073 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 10.044s 2025-09-26 05:20:42.076 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 10.345s 2025-09-26 05:20:42.377 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 10.349s 2025-09-26 05:20:42.381 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 10.358s 2025-09-26 05:20:42.390 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 10.373s 2025-09-26 05:20:42.405 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 10.376s 2025-09-26 05:20:42.408 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 11.502s 2025-09-26 05:20:43.534 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26134427] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=258310, randomLong=-4989756123602575295, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=20660, randomLong=784473990697623095, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=2049750, data=35, exception=null] OS Health Check Report - Complete (took 1028 ms)
node1 11.538s 2025-09-26 05:20:43.570 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 11.548s 2025-09-26 05:20:43.580 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 11.555s 2025-09-26 05:20:43.587 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 11.657s 2025-09-26 05:20:43.689 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ij9eew==", "port": 30124 }, { "ipAddressV4": "CoAAEg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMbUwA==", "port": 30125 }, { "ipAddressV4": "CoAAEQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I8FKkQ==", "port": 30126 }, { "ipAddressV4": "CoAAEA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ih4HYg==", "port": 30127 }, { "ipAddressV4": "CoAAFQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IkNwDw==", "port": 30128 }, { "ipAddressV4": "CoAAFg==", "port": 30128 }] }] }
node1 11.685s 2025-09-26 05:20:43.717 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 11.686s 2025-09-26 05:20:43.718 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 11.706s 2025-09-26 05:20:43.738 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 1c187dd57b8ebf1c936e62d0e6d7161bfa6c6c3aee12a4f3e158f04df7aa12aa00eacb45009c99dcf3610b97c05014d0 (root) ConsistencyTestingToolState / loan-rival-impact-broken 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
node1 11.961s 2025-09-26 05:20:43.993 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 11.967s 2025-09-26 05:20:43.999 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 11.975s 2025-09-26 05:20:44.007 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 11.976s 2025-09-26 05:20:44.008 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 11.977s 2025-09-26 05:20:44.009 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 11.982s 2025-09-26 05:20:44.014 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 11.984s 2025-09-26 05:20:44.016 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 11.984s 2025-09-26 05:20:44.016 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 11.987s 2025-09-26 05:20:44.019 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 11.987s 2025-09-26 05:20:44.019 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 11.989s 2025-09-26 05:20:44.021 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 11.991s 2025-09-26 05:20:44.023 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 11.992s 2025-09-26 05:20:44.024 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 221.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 11.998s 2025-09-26 05:20:44.030 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> DefaultStatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 12.279s 2025-09-26 05:20:44.311 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 12.282s 2025-09-26 05:20:44.314 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 14.994s 2025-09-26 05:20:47.026 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 14.998s 2025-09-26 05:20:47.030 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 16.076s 2025-09-26 05:20:48.108 61 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.405s 2025-09-26 05:20:48.437 61 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.136s 2025-09-26 05:20:49.168 61 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 19.376s 2025-09-26 05:20:51.408 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 20.960s 2025-09-26 05:20:52.992 62 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 4.9 s in CHECKING. Now in ACTIVE
node0 20.962s 2025-09-26 05:20:52.994 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 20.962s 2025-09-26 05:20:52.994 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 21.026s 2025-09-26 05:20:53.058 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 21.157s 2025-09-26 05:20:53.189 62 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 21.166s 2025-09-26 05:20:53.198 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 21.168s 2025-09-26 05:20:53.200 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node1 21.171s 2025-09-26 05:20:53.203 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 21.236s 2025-09-26 05:20:53.268 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node3 21.238s 2025-09-26 05:20:53.270 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 21.372s 2025-09-26 05:20:53.404 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node2 21.374s 2025-09-26 05:20:53.406 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 21.432s 2025-09-26 05:20:53.464 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node0 21.434s 2025-09-26 05:20:53.466 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 21.452s 2025-09-26 05:20:53.484 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node4 21.454s 2025-09-26 05:20:53.486 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 21.463s 2025-09-26 05:20:53.495 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 21.468s 2025-09-26 05:20:53.500 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T05:20:49.177749819Z Next consensus number: 1 Legacy running event hash: 56ca89ca8d3eed3ef1a7a8ef7c3a369fdef069b298e1189b19e3adaa80f29c87af61266cfa45a584c475cf8fb4849433 Legacy running event mnemonic: sun-short-flame-flame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: fd4d40f629ee903ba9543532339e1f91060cd33ef5a167fc514cfe93215eab7d04b21049a2af287227a244e9aeff3ec0 (root) ConsistencyTestingToolState / filter-burst-differ-weather 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 high-enroll-inform-mountain 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 21.518s 2025-09-26 05:20:53.550 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 21.519s 2025-09-26 05:20:53.551 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 21.520s 2025-09-26 05:20:53.552 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 21.521s 2025-09-26 05:20:53.553 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
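The preconsensus event file copied above follows a self-describing naming scheme: a timestamp with ':' written as '+', then what appear to be a sequence number, minimum round, maximum round, and origin. A hedged parser for that pattern (the field meanings are inferred from the suffixes, not taken from platform documentation), using the file name from the entry above as the example:

import re

# File names look like:
#   2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
PCES_NAME = re.compile(
    r"(?P<ts>[^_/]+Z)_seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$"
)

def parse_pces_name(name: str) -> dict:
    m = PCES_NAME.search(name)
    if m is None:
        raise ValueError(f"unrecognized PCES file name: {name}")
    return {
        "timestamp": m.group("ts").replace("+", ":"),  # restore the ':' separators
        "sequence": int(m.group("seq")),
        "min_round": int(m.group("minr")),
        "max_round": int(m.group("maxr")),
        "origin": int(m.group("orgn")),
    }

print(parse_pces_name(
    "2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces"))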
node3 21.521s 2025-09-26 05:20:53.553 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 21.524s 2025-09-26 05:20:53.556 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T05:20:49.177749819Z Next consensus number: 1 Legacy running event hash: 56ca89ca8d3eed3ef1a7a8ef7c3a369fdef069b298e1189b19e3adaa80f29c87af61266cfa45a584c475cf8fb4849433 Legacy running event mnemonic: sun-short-flame-flame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: fd4d40f629ee903ba9543532339e1f91060cd33ef5a167fc514cfe93215eab7d04b21049a2af287227a244e9aeff3ec0 (root) ConsistencyTestingToolState / filter-burst-differ-weather 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 high-enroll-inform-mountain 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 21.529s 2025-09-26 05:20:53.561 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
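Entries like the one above also carry a machine-readable trailer: a JSON object followed by the payload class in brackets. A small sketch, assuming that layout, for pulling the structured payload out of such lines:

import json
import re

# Matches a trailing '{...} [com.swirlds.logging.legacy.payload.SomePayload]' suffix.
PAYLOAD = re.compile(r"(\{.*\})\s+\[(com\.swirlds\.logging\.legacy\.payload\.\w+)\]\s*$")

def extract_payload(line: str):
    m = PAYLOAD.search(line)
    if m is None:
        return None
    return m.group(2), json.loads(m.group(1))

# For the StateSavedToDiskPayload entries above this yields e.g.
#   ('com.swirlds.logging.legacy.payload.StateSavedToDiskPayload',
#    {'round': 1, 'freezeState': False, 'reason': 'FIRST_ROUND_AFTER_GENESIS', ...})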
node3 21.566s 2025-09-26 05:20:53.598 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 21.567s 2025-09-26 05:20:53.599 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 21.567s 2025-09-26 05:20:53.599 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 21.569s 2025-09-26 05:20:53.601 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 21.576s 2025-09-26 05:20:53.608 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 21.604s 2025-09-26 05:20:53.636 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 21.607s 2025-09-26 05:20:53.639 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T05:20:49.177749819Z Next consensus number: 1 Legacy running event hash: 56ca89ca8d3eed3ef1a7a8ef7c3a369fdef069b298e1189b19e3adaa80f29c87af61266cfa45a584c475cf8fb4849433 Legacy running event mnemonic: sun-short-flame-flame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: fd4d40f629ee903ba9543532339e1f91060cd33ef5a167fc514cfe93215eab7d04b21049a2af287227a244e9aeff3ec0 (root) ConsistencyTestingToolState / filter-burst-differ-weather 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 high-enroll-inform-mountain 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node2 21.638s 2025-09-26 05:20:53.670 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 21.638s 2025-09-26 05:20:53.670 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 21.638s 2025-09-26 05:20:53.670 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 21.639s 2025-09-26 05:20:53.671 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 21.645s 2025-09-26 05:20:53.677 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 21.683s 2025-09-26 05:20:53.715 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 21.686s 2025-09-26 05:20:53.718 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T05:20:49.177749819Z Next consensus number: 1 Legacy running event hash: 56ca89ca8d3eed3ef1a7a8ef7c3a369fdef069b298e1189b19e3adaa80f29c87af61266cfa45a584c475cf8fb4849433 Legacy running event mnemonic: sun-short-flame-flame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: fd4d40f629ee903ba9543532339e1f91060cd33ef5a167fc514cfe93215eab7d04b21049a2af287227a244e9aeff3ec0 (root) ConsistencyTestingToolState / filter-burst-differ-weather 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 high-enroll-inform-mountain 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 21.708s 2025-09-26 05:20:53.740 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 21.711s 2025-09-26 05:20:53.743 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1 Timestamp: 2025-09-26T05:20:49.177749819Z Next consensus number: 1 Legacy running event hash: 56ca89ca8d3eed3ef1a7a8ef7c3a369fdef069b298e1189b19e3adaa80f29c87af61266cfa45a584c475cf8fb4849433 Legacy running event mnemonic: sun-short-flame-flame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1450302654 Root hash: fd4d40f629ee903ba9543532339e1f91060cd33ef5a167fc514cfe93215eab7d04b21049a2af287227a244e9aeff3ec0 (root) ConsistencyTestingToolState / filter-burst-differ-weather 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 high-enroll-inform-mountain 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view 4 StringLeaf 1 /4 wreck-whale-old-bottom
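Every node's "Information for state written to disk" dump above reports the same round-1 root hash (fd4d40f6…), which is exactly the property the consistency run is meant to demonstrate. A log-scanning sketch, assuming the two-line layout shown above, that groups the reported root hashes by round so any mismatch stands out:

import re
from collections import defaultdict

INFO_MARKER = "SignedStateFileWriter: Information for state written to disk:"
DETAIL = re.compile(r"Round: (\d+) .*?Root hash: ([0-9a-f]+)")

def root_hashes_by_round(log_lines):
    # round number -> {node id: reported root hash}
    hashes = defaultdict(dict)
    pending_node = None
    for line in log_lines:
        if INFO_MARKER in line:
            pending_node = line.split()[0]        # e.g. "node1"
        elif pending_node is not None:
            m = DETAIL.search(line)
            if m:
                hashes[int(m.group(1))][pending_node] = m.group(2)
            pending_node = None
    return hashes

# Usage against a captured log (the file name is illustrative):
# for rnd, per_node in sorted(root_hashes_by_round(open("swirlds.log")).items()):
#     assert len(set(per_node.values())) == 1, f"root hash mismatch in round {rnd}"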
node0 21.718s 2025-09-26 05:20:53.750 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 21.718s 2025-09-26 05:20:53.750 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 21.718s 2025-09-26 05:20:53.750 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 21.719s 2025-09-26 05:20:53.751 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 21.725s 2025-09-26 05:20:53.757 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 21.744s 2025-09-26 05:20:53.776 117 INFO PLATFORM_STATUS <platformForkJoinThread-6> DefaultStatusStateMachine: Platform spent 5.3 s in CHECKING. Now in ACTIVE
node4 21.748s 2025-09-26 05:20:53.780 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 21.748s 2025-09-26 05:20:53.780 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 21.748s 2025-09-26 05:20:53.780 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 21.749s 2025-09-26 05:20:53.781 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 21.755s 2025-09-26 05:20:53.787 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 21.877s 2025-09-26 05:20:53.909 117 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 2.5 s in CHECKING. Now in ACTIVE
node4 22.018s 2025-09-26 05:20:54.050 117 INFO PLATFORM_STATUS <platformForkJoinThread-2> DefaultStatusStateMachine: Platform spent 4.9 s in CHECKING. Now in ACTIVE
node1 22.086s 2025-09-26 05:20:54.118 117 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 23.997s 2025-09-26 05:20:56.029 147 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE
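By this point all five nodes have walked STARTING_UP -> REPLAYING_EVENTS -> OBSERVING -> CHECKING -> ACTIVE. The DefaultStatusStateMachine lines follow a fixed "Platform spent <time> in <OLD>. Now in <NEW>" shape, so the per-node path is easy to reconstruct from a captured log (a sketch; the file name is illustrative):

import re
from collections import defaultdict

# Captures: node id, time spent in the previous status, old status, new status.
STATUS = re.compile(
    r"^(node\d+)\b.*Platform spent ([\d.]+ m?s) in (\w+)\. Now in (\w+)")

def status_timeline(log_lines):
    timeline = defaultdict(list)    # node id -> [(old status, new status, time spent)]
    for line in log_lines:
        m = STATUS.search(line)
        if m:
            node, spent, old, new = m.groups()
            timeline[node].append((old, new, spent))
    return timeline

# for node, steps in sorted(status_timeline(open("swirlds.log")).items()):
#     print(node, " -> ".join(new for _old, new, _spent in steps))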
node0 29.502s 2025-09-26 05:21:01.534 228 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 13 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 29.583s 2025-09-26 05:21:01.615 222 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 13 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 29.585s 2025-09-26 05:21:01.617 230 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 13 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 29.603s 2025-09-26 05:21:01.635 232 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 13 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 29.604s 2025-09-26 05:21:01.636 222 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 13 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 29.941s 2025-09-26 05:21:01.973 232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 13 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/13
node1 29.942s 2025-09-26 05:21:01.974 233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node0 29.987s 2025-09-26 05:21:02.019 240 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 13 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/13
node0 29.988s 2025-09-26 05:21:02.020 241 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node2 30.012s 2025-09-26 05:21:02.044 234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 13 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/13
node2 30.012s 2025-09-26 05:21:02.044 235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node3 30.015s 2025-09-26 05:21:02.047 234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 13 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/13
node3 30.016s 2025-09-26 05:21:02.048 235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node1 30.048s 2025-09-26 05:21:02.080 268 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node1 30.051s 2025-09-26 05:21:02.083 269 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 13 Timestamp: 2025-09-26T05:21:00.047764596Z Next consensus number: 303 Legacy running event hash: c8ed37881f41a56d77b8e483cca31ba63e6c6542b50762576faa6320d2a7eb51d279d93273bc247e52588c08ce9fee91 Legacy running event mnemonic: tiny-valve-trial-fame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2092609364 Root hash: fd8ef52dbe5242712e5e7581704277dd989177a9086953786dcbd1452ace1b4a471c6a6721ecaec1fa8ef09297170274 (root) ConsistencyTestingToolState / exact-foil-cupboard-evolve 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 clarify-parent-wait-utility 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf -108365867131662583 /3 domain-worry-glove-flag 4 StringLeaf 13 /4 hand-track-discover-save
node1 30.063s 2025-09-26 05:21:02.095 270 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 30.063s 2025-09-26 05:21:02.095 271 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 30.064s 2025-09-26 05:21:02.096 272 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 30.065s 2025-09-26 05:21:02.097 273 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 30.065s 2025-09-26 05:21:02.097 274 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 13 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/13 {"round":13,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/13/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 30.074s 2025-09-26 05:21:02.106 272 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node0 30.077s 2025-09-26 05:21:02.109 273 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 13 Timestamp: 2025-09-26T05:21:00.047764596Z Next consensus number: 303 Legacy running event hash: c8ed37881f41a56d77b8e483cca31ba63e6c6542b50762576faa6320d2a7eb51d279d93273bc247e52588c08ce9fee91 Legacy running event mnemonic: tiny-valve-trial-fame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2092609364 Root hash: fd8ef52dbe5242712e5e7581704277dd989177a9086953786dcbd1452ace1b4a471c6a6721ecaec1fa8ef09297170274 (root) ConsistencyTestingToolState / exact-foil-cupboard-evolve 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 clarify-parent-wait-utility 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf -108365867131662583 /3 domain-worry-glove-flag 4 StringLeaf 13 /4 hand-track-discover-save
node0 30.086s 2025-09-26 05:21:02.118 274 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 30.086s 2025-09-26 05:21:02.118 275 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 30.086s 2025-09-26 05:21:02.118 276 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 30.087s 2025-09-26 05:21:02.119 277 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 30.087s 2025-09-26 05:21:02.119 278 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 13 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/13 {"round":13,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/13/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 30.105s 2025-09-26 05:21:02.137 274 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node2 30.107s 2025-09-26 05:21:02.139 275 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 13 Timestamp: 2025-09-26T05:21:00.047764596Z Next consensus number: 303 Legacy running event hash: c8ed37881f41a56d77b8e483cca31ba63e6c6542b50762576faa6320d2a7eb51d279d93273bc247e52588c08ce9fee91 Legacy running event mnemonic: tiny-valve-trial-fame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2092609364 Root hash: fd8ef52dbe5242712e5e7581704277dd989177a9086953786dcbd1452ace1b4a471c6a6721ecaec1fa8ef09297170274 (root) ConsistencyTestingToolState / exact-foil-cupboard-evolve 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 clarify-parent-wait-utility 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf -108365867131662583 /3 domain-worry-glove-flag 4 StringLeaf 13 /4 hand-track-discover-save
node3 30.112s 2025-09-26 05:21:02.144 270 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node2 30.115s 2025-09-26 05:21:02.147 276 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node3 30.115s 2025-09-26 05:21:02.147 271 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 13 Timestamp: 2025-09-26T05:21:00.047764596Z Next consensus number: 303 Legacy running event hash: c8ed37881f41a56d77b8e483cca31ba63e6c6542b50762576faa6320d2a7eb51d279d93273bc247e52588c08ce9fee91 Legacy running event mnemonic: tiny-valve-trial-fame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2092609364 Root hash: fd8ef52dbe5242712e5e7581704277dd989177a9086953786dcbd1452ace1b4a471c6a6721ecaec1fa8ef09297170274 (root) ConsistencyTestingToolState / exact-foil-cupboard-evolve 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 clarify-parent-wait-utility 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf -108365867131662583 /3 domain-worry-glove-flag 4 StringLeaf 13 /4 hand-track-discover-save
node2 30.116s 2025-09-26 05:21:02.148 277 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 30.116s 2025-09-26 05:21:02.148 278 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 30.117s 2025-09-26 05:21:02.149 279 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 30.117s 2025-09-26 05:21:02.149 280 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 13 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/13 {"round":13,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/13/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 30.125s 2025-09-26 05:21:02.157 272 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 30.125s 2025-09-26 05:21:02.157 273 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 30.125s 2025-09-26 05:21:02.157 274 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 30.126s 2025-09-26 05:21:02.158 275 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 30.127s 2025-09-26 05:21:02.159 276 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 13 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/13 {"round":13,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/13/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 30.131s 2025-09-26 05:21:02.163 234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 13 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/13
node4 30.132s 2025-09-26 05:21:02.164 235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node4 30.217s 2025-09-26 05:21:02.249 266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 13
node4 30.219s 2025-09-26 05:21:02.251 267 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 13 Timestamp: 2025-09-26T05:21:00.047764596Z Next consensus number: 303 Legacy running event hash: c8ed37881f41a56d77b8e483cca31ba63e6c6542b50762576faa6320d2a7eb51d279d93273bc247e52588c08ce9fee91 Legacy running event mnemonic: tiny-valve-trial-fame Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -2092609364 Root hash: fd8ef52dbe5242712e5e7581704277dd989177a9086953786dcbd1452ace1b4a471c6a6721ecaec1fa8ef09297170274 (root) ConsistencyTestingToolState / exact-foil-cupboard-evolve 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 clarify-parent-wait-utility 1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress 2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite 3 StringLeaf -108365867131662583 /3 domain-worry-glove-flag 4 StringLeaf 13 /4 hand-track-discover-save
node4 30.226s 2025-09-26 05:21:02.258 268 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 30.226s 2025-09-26 05:21:02.258 269 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 30.227s 2025-09-26 05:21:02.259 270 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 30.227s 2025-09-26 05:21:02.259 271 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 30.228s 2025-09-26 05:21:02.260 272 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 13 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/13 {"round":13,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/13/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
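The round-13 snapshots above land under data/saved/<main class>/<node id>/123/<round>/ (the "123" path component is copied verbatim from the log; it is presumably the swirld/application id, though that is an assumption). A small sketch for listing the newest saved round per node under that root:

from pathlib import Path

SAVED_ROOT = Path("/opt/hgcapp/services-hedera/HapiApp2.0/data/saved/"
                  "com.swirlds.demo.consistency.ConsistencyTestingToolMain")

def latest_saved_rounds(root: Path = SAVED_ROOT) -> dict:
    # Directory layout assumed from the log: <node id>/123/<round>/
    latest = {}
    for round_dir in root.glob("*/*/*"):
        if round_dir.is_dir() and round_dir.name.isdigit():
            node_id = round_dir.parts[-3]
            latest[node_id] = max(latest.get(node_id, -1), int(round_dir.name))
    return latest

# Right after the writes above complete, this would report round 13 for every node,
# e.g. {'0': 13, '1': 13, '2': 13, '3': 13, '4': 13}.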
node1 1m 29.274s 2025-09-26 05:22:01.306 1353 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 110 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 29.346s 2025-09-26 05:22:01.378 1363 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 110 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 29.421s 2025-09-26 05:22:01.453 1353 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 110 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 29.493s 2025-09-26 05:22:01.525 1345 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 110 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 29.699s 2025-09-26 05:22:01.731 1366 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 110 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/110
node1 1m 29.700s 2025-09-26 05:22:01.732 1367 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node4 1m 29.705s 2025-09-26 05:22:01.737 1355 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 110 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 29.769s 2025-09-26 05:22:01.801 1376 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 110 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/110
node0 1m 29.770s 2025-09-26 05:22:01.802 1377 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node1 1m 29.800s 2025-09-26 05:22:01.832 1402 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node1 1m 29.803s 2025-09-26 05:22:01.835 1403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 110
Timestamp: 2025-09-26T05:22:00.187048Z
Next consensus number: 2862
Legacy running event hash: 50488bd3fb73f83cb01980e44ca20049e155b60cfad60c5780441ce127d2d5539cf668f184498a5a1cc137ea0dd2f300
Legacy running event mnemonic: animal-mad-museum-weasel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 116713313
Root hash: a4b86a8b585800bd1d4289b97ef7382ebc1e23de8c66862c7bd2b06f43f28200b38419c8bb94d4eddb67278b1dfb9333
(root) ConsistencyTestingToolState / harbor-fabric-arch-bless
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 awkward-laundry-mouse-rail
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 5872004138865538101 /3 clock-example-casual-false
    4 StringLeaf 110 /4 cargo-hood-pipe-dress
node1 1m 29.812s 2025-09-26 05:22:01.844 1404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 29.813s 2025-09-26 05:22:01.845 1405 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 83 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 29.813s 2025-09-26 05:22:01.845 1406 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 29.815s 2025-09-26 05:22:01.847 1407 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 29.816s 2025-09-26 05:22:01.848 1408 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 110 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/110 {"round":110,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/110/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 29.823s 2025-09-26 05:22:01.855 1358 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 110 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/110
node4 1m 29.823s 2025-09-26 05:22:01.855 1359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node2 1m 29.838s 2025-09-26 05:22:01.870 1356 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 110 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/110
node2 1m 29.839s 2025-09-26 05:22:01.871 1357 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node0 1m 29.847s 2025-09-26 05:22:01.879 1412 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node0 1m 29.849s 2025-09-26 05:22:01.881 1413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 110
Timestamp: 2025-09-26T05:22:00.187048Z
Next consensus number: 2862
Legacy running event hash: 50488bd3fb73f83cb01980e44ca20049e155b60cfad60c5780441ce127d2d5539cf668f184498a5a1cc137ea0dd2f300
Legacy running event mnemonic: animal-mad-museum-weasel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 116713313
Root hash: a4b86a8b585800bd1d4289b97ef7382ebc1e23de8c66862c7bd2b06f43f28200b38419c8bb94d4eddb67278b1dfb9333
(root) ConsistencyTestingToolState / harbor-fabric-arch-bless
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 awkward-laundry-mouse-rail
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 5872004138865538101 /3 clock-example-casual-false
    4 StringLeaf 110 /4 cargo-hood-pipe-dress
node0 1m 29.857s 2025-09-26 05:22:01.889 1414 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 29.857s 2025-09-26 05:22:01.889 1415 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 83 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 29.857s 2025-09-26 05:22:01.889 1416 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 29.859s 2025-09-26 05:22:01.891 1417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 29.860s 2025-09-26 05:22:01.892 1418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 110 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/110 {"round":110,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/110/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 29.902s 2025-09-26 05:22:01.934 1348 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 110 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/110
node3 1m 29.903s 2025-09-26 05:22:01.935 1349 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node4 1m 29.915s 2025-09-26 05:22:01.947 1394 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node4 1m 29.917s 2025-09-26 05:22:01.949 1395 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 110
Timestamp: 2025-09-26T05:22:00.187048Z
Next consensus number: 2862
Legacy running event hash: 50488bd3fb73f83cb01980e44ca20049e155b60cfad60c5780441ce127d2d5539cf668f184498a5a1cc137ea0dd2f300
Legacy running event mnemonic: animal-mad-museum-weasel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 116713313
Root hash: a4b86a8b585800bd1d4289b97ef7382ebc1e23de8c66862c7bd2b06f43f28200b38419c8bb94d4eddb67278b1dfb9333
(root) ConsistencyTestingToolState / harbor-fabric-arch-bless
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 awkward-laundry-mouse-rail
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 5872004138865538101 /3 clock-example-casual-false
    4 StringLeaf 110 /4 cargo-hood-pipe-dress
node2 1m 29.925s 2025-09-26 05:22:01.957 1392 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node4 1m 29.926s 2025-09-26 05:22:01.958 1396 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 29.927s 2025-09-26 05:22:01.959 1393 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 110
Timestamp: 2025-09-26T05:22:00.187048Z
Next consensus number: 2862
Legacy running event hash: 50488bd3fb73f83cb01980e44ca20049e155b60cfad60c5780441ce127d2d5539cf668f184498a5a1cc137ea0dd2f300
Legacy running event mnemonic: animal-mad-museum-weasel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 116713313
Root hash: a4b86a8b585800bd1d4289b97ef7382ebc1e23de8c66862c7bd2b06f43f28200b38419c8bb94d4eddb67278b1dfb9333
(root) ConsistencyTestingToolState / harbor-fabric-arch-bless
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 awkward-laundry-mouse-rail
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 5872004138865538101 /3 clock-example-casual-false
    4 StringLeaf 110 /4 cargo-hood-pipe-dress
node4 1m 29.927s 2025-09-26 05:22:01.959 1397 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 83 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 29.927s 2025-09-26 05:22:01.959 1398 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 29.929s 2025-09-26 05:22:01.961 1399 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 29.930s 2025-09-26 05:22:01.962 1400 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 110 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/110 {"round":110,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/110/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 29.936s 2025-09-26 05:22:01.968 1394 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 29.936s 2025-09-26 05:22:01.968 1395 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 83 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 29.936s 2025-09-26 05:22:01.968 1396 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 29.939s 2025-09-26 05:22:01.971 1397 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 29.939s 2025-09-26 05:22:01.971 1398 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 110 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/110 {"round":110,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/110/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 29.987s 2025-09-26 05:22:02.019 1388 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 110
node3 1m 29.989s 2025-09-26 05:22:02.021 1389 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 110
Timestamp: 2025-09-26T05:22:00.187048Z
Next consensus number: 2862
Legacy running event hash: 50488bd3fb73f83cb01980e44ca20049e155b60cfad60c5780441ce127d2d5539cf668f184498a5a1cc137ea0dd2f300
Legacy running event mnemonic: animal-mad-museum-weasel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 116713313
Root hash: a4b86a8b585800bd1d4289b97ef7382ebc1e23de8c66862c7bd2b06f43f28200b38419c8bb94d4eddb67278b1dfb9333
(root) ConsistencyTestingToolState / harbor-fabric-arch-bless
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 awkward-laundry-mouse-rail
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 5872004138865538101 /3 clock-example-casual-false
    4 StringLeaf 110 /4 cargo-hood-pipe-dress
node3 1m 29.998s 2025-09-26 05:22:02.030 1390 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 29.999s 2025-09-26 05:22:02.031 1391 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 83 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 29.999s 2025-09-26 05:22:02.031 1392 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 30.001s 2025-09-26 05:22:02.033 1393 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 30.002s 2025-09-26 05:22:02.034 1394 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 110 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/110 {"round":110,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/110/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 29.432s 2025-09-26 05:23:01.464 2449 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 206 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 29.645s 2025-09-26 05:23:01.677 2461 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 206 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 29.801s 2025-09-26 05:23:01.833 2457 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 206 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 29.833s 2025-09-26 05:23:01.865 2457 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 206 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 29.879s 2025-09-26 05:23:01.911 2477 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 206 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 30.010s 2025-09-26 05:23:02.042 2463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 206 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/206
node3 2m 30.010s 2025-09-26 05:23:02.042 2464 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node2 2m 30.080s 2025-09-26 05:23:02.112 2463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 206 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/206
node2 2m 30.081s 2025-09-26 05:23:02.113 2464 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node3 2m 30.103s 2025-09-26 05:23:02.135 2504 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node3 2m 30.105s 2025-09-26 05:23:02.137 2505 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 206
Timestamp: 2025-09-26T05:23:00.006106303Z
Next consensus number: 5367
Legacy running event hash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
Legacy running event mnemonic: snake-defy-series-soup
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1992807165
Root hash: 55bc7db546d0ae6cfbff12cdaf2e7b0bccbbb6ee15a8257171536206df913cbfc4d091adf462eee8496bf5e254f5d5b8
(root) ConsistencyTestingToolState / resource-repeat-sure-crater
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 service-jealous-butter-type
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 8483062483540997562 /3 broom-chapter-table-welcome
    4 StringLeaf 206 /4 trust-eyebrow-seminar-sound
node3 2m 30.113s 2025-09-26 05:23:02.145 2506 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 30.114s 2025-09-26 05:23:02.146 2507 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 179 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 30.114s 2025-09-26 05:23:02.146 2508 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 30.118s 2025-09-26 05:23:02.150 2509 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 30.119s 2025-09-26 05:23:02.151 2510 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 206 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/206 {"round":206,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/206/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2m 30.159s 2025-09-26 05:23:02.191 2504 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node2 2m 30.160s 2025-09-26 05:23:02.192 2505 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 206
Timestamp: 2025-09-26T05:23:00.006106303Z
Next consensus number: 5367
Legacy running event hash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
Legacy running event mnemonic: snake-defy-series-soup
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1992807165
Root hash: 55bc7db546d0ae6cfbff12cdaf2e7b0bccbbb6ee15a8257171536206df913cbfc4d091adf462eee8496bf5e254f5d5b8
(root) ConsistencyTestingToolState / resource-repeat-sure-crater
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 service-jealous-butter-type
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 8483062483540997562 /3 broom-chapter-table-welcome
    4 StringLeaf 206 /4 trust-eyebrow-seminar-sound
node2 2m 30.167s 2025-09-26 05:23:02.199 2506 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 30.167s 2025-09-26 05:23:02.199 2507 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 179 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 30.167s 2025-09-26 05:23:02.199 2508 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 30.172s 2025-09-26 05:23:02.204 2509 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 30.172s 2025-09-26 05:23:02.204 2510 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 206 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/206 {"round":206,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/206/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 2m 30.221s 2025-09-26 05:23:02.253 2483 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 206 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/206
node0 2m 30.222s 2025-09-26 05:23:02.254 2484 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node1 2m 30.294s 2025-09-26 05:23:02.326 2467 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 206 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/206
node1 2m 30.295s 2025-09-26 05:23:02.327 2468 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node0 2m 30.298s 2025-09-26 05:23:02.330 2520 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node0 2m 30.300s 2025-09-26 05:23:02.332 2521 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 206
Timestamp: 2025-09-26T05:23:00.006106303Z
Next consensus number: 5367
Legacy running event hash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
Legacy running event mnemonic: snake-defy-series-soup
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1992807165
Root hash: 55bc7db546d0ae6cfbff12cdaf2e7b0bccbbb6ee15a8257171536206df913cbfc4d091adf462eee8496bf5e254f5d5b8
(root) ConsistencyTestingToolState / resource-repeat-sure-crater
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 service-jealous-butter-type
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 8483062483540997562 /3 broom-chapter-table-welcome
    4 StringLeaf 206 /4 trust-eyebrow-seminar-sound
node0 2m 30.308s 2025-09-26 05:23:02.340 2522 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 30.308s 2025-09-26 05:23:02.340 2523 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 179 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 30.309s 2025-09-26 05:23:02.341 2524 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 30.313s 2025-09-26 05:23:02.345 2525 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 30.313s 2025-09-26 05:23:02.345 2526 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 206 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/206 {"round":206,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/206/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 30.327s 2025-09-26 05:23:02.359 2455 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 206 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/206
node4 2m 30.328s 2025-09-26 05:23:02.360 2456 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node1 2m 30.385s 2025-09-26 05:23:02.417 2504 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node1 2m 30.387s 2025-09-26 05:23:02.419 2513 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 206
Timestamp: 2025-09-26T05:23:00.006106303Z
Next consensus number: 5367
Legacy running event hash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
Legacy running event mnemonic: snake-defy-series-soup
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1992807165
Root hash: 55bc7db546d0ae6cfbff12cdaf2e7b0bccbbb6ee15a8257171536206df913cbfc4d091adf462eee8496bf5e254f5d5b8
(root) ConsistencyTestingToolState / resource-repeat-sure-crater
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 service-jealous-butter-type
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 8483062483540997562 /3 broom-chapter-table-welcome
    4 StringLeaf 206 /4 trust-eyebrow-seminar-sound
node1 2m 30.396s 2025-09-26 05:23:02.428 2514 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 30.396s 2025-09-26 05:23:02.428 2515 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 179 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 30.396s 2025-09-26 05:23:02.428 2516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 30.401s 2025-09-26 05:23:02.433 2517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 30.401s 2025-09-26 05:23:02.433 2518 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 206 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/206 {"round":206,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/206/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 30.411s 2025-09-26 05:23:02.443 2492 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 206
node4 2m 30.413s 2025-09-26 05:23:02.445 2493 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 206
Timestamp: 2025-09-26T05:23:00.006106303Z
Next consensus number: 5367
Legacy running event hash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
Legacy running event mnemonic: snake-defy-series-soup
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1992807165
Root hash: 55bc7db546d0ae6cfbff12cdaf2e7b0bccbbb6ee15a8257171536206df913cbfc4d091adf462eee8496bf5e254f5d5b8
(root) ConsistencyTestingToolState / resource-repeat-sure-crater
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 service-jealous-butter-type
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 8483062483540997562 /3 broom-chapter-table-welcome
    4 StringLeaf 206 /4 trust-eyebrow-seminar-sound
node4 2m 30.420s 2025-09-26 05:23:02.452 2494 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 30.421s 2025-09-26 05:23:02.453 2495 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 179 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 30.421s 2025-09-26 05:23:02.453 2496 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 30.425s 2025-09-26 05:23:02.457 2497 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 30.426s 2025-09-26 05:23:02.458 2498 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 206 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/206 {"round":206,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/206/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 13.838s 2025-09-26 05:23:45.870 3324 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node2 3m 13.838s 2025-09-26 05:23:45.870 3324 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node1 3m 13.839s 2025-09-26 05:23:45.871 3308 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node3 3m 13.839s 2025-09-26 05:23:45.871 3310 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node2 3m 29.261s 2025-09-26 05:24:01.293 3582 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 300 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 29.321s 2025-09-26 05:24:01.353 3580 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 300 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 29.528s 2025-09-26 05:24:01.560 3580 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 300 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 29.552s 2025-09-26 05:24:01.584 3568 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 300 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 29.904s 2025-09-26 05:24:01.936 3583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 300 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/300
node0 3m 29.905s 2025-09-26 05:24:01.937 3584 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node3 3m 29.928s 2025-09-26 05:24:01.960 3583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 300 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/300
node3 3m 29.928s 2025-09-26 05:24:01.960 3584 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node1 3m 29.974s 2025-09-26 05:24:02.006 3571 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 300 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/300
node1 3m 29.975s 2025-09-26 05:24:02.007 3572 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node0 3m 29.989s 2025-09-26 05:24:02.021 3619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node0 3m 29.991s 2025-09-26 05:24:02.023 3620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 300
Timestamp: 2025-09-26T05:24:00.177738001Z
Next consensus number: 7668
Legacy running event hash: 3ccd3025b74a872a3ba582c64ed48d56487f301303a3881a6583aaddb8c5fb7dddc504ad3d509946f6da0353239d5619
Legacy running event mnemonic: often-again-bitter-bracket
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 373112337
Root hash: 7f660cb35fa82ead22cea8d3d0cb9aa904f61e9ef4bb8437ac3e5f61cf4daec77abf04eccdf53df1594d7d32085992a6
(root) ConsistencyTestingToolState / under-nose-vital-stick
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 property-end-build-whip
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 3118790652871586168 /3 brother-easy-cart-remember
    4 StringLeaf 300 /4 pet-void-figure-shaft
node0 3m 29.998s 2025-09-26 05:24:02.030 3621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 29.998s 2025-09-26 05:24:02.030 3622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 273 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 29.998s 2025-09-26 05:24:02.030 3623 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 30.004s 2025-09-26 05:24:02.036 3624 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 30.004s 2025-09-26 05:24:02.036 3625 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 300 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/300 {"round":300,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/300/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 30.021s 2025-09-26 05:24:02.053 3615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node3 3m 30.024s 2025-09-26 05:24:02.056 3616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 300
Timestamp: 2025-09-26T05:24:00.177738001Z
Next consensus number: 7668
Legacy running event hash: 3ccd3025b74a872a3ba582c64ed48d56487f301303a3881a6583aaddb8c5fb7dddc504ad3d509946f6da0353239d5619
Legacy running event mnemonic: often-again-bitter-bracket
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 373112337
Root hash: 7f660cb35fa82ead22cea8d3d0cb9aa904f61e9ef4bb8437ac3e5f61cf4daec77abf04eccdf53df1594d7d32085992a6
(root) ConsistencyTestingToolState / under-nose-vital-stick
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 property-end-build-whip
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 3118790652871586168 /3 brother-easy-cart-remember
    4 StringLeaf 300 /4 pet-void-figure-shaft
node3 3m 30.031s 2025-09-26 05:24:02.063 3617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 30.032s 2025-09-26 05:24:02.064 3618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 273 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 30.032s 2025-09-26 05:24:02.064 3619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 30.038s 2025-09-26 05:24:02.070 3620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 30.039s 2025-09-26 05:24:02.071 3621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 300 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/300 {"round":300,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/300/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 30.070s 2025-09-26 05:24:02.102 3607 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node1 3m 30.072s 2025-09-26 05:24:02.104 3608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 300
Timestamp: 2025-09-26T05:24:00.177738001Z
Next consensus number: 7668
Legacy running event hash: 3ccd3025b74a872a3ba582c64ed48d56487f301303a3881a6583aaddb8c5fb7dddc504ad3d509946f6da0353239d5619
Legacy running event mnemonic: often-again-bitter-bracket
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 373112337
Root hash: 7f660cb35fa82ead22cea8d3d0cb9aa904f61e9ef4bb8437ac3e5f61cf4daec77abf04eccdf53df1594d7d32085992a6
(root) ConsistencyTestingToolState / under-nose-vital-stick
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 property-end-build-whip
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 3118790652871586168 /3 brother-easy-cart-remember
    4 StringLeaf 300 /4 pet-void-figure-shaft
node1 3m 30.079s 2025-09-26 05:24:02.111 3609 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 30.079s 2025-09-26 05:24:02.111 3610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 273 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 30.079s 2025-09-26 05:24:02.111 3611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 30.085s 2025-09-26 05:24:02.117 3612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 30.086s 2025-09-26 05:24:02.118 3613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 300 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/300 {"round":300,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/300/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 30.093s 2025-09-26 05:24:02.125 3595 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 300 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/300
node2 3m 30.094s 2025-09-26 05:24:02.126 3596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node2 3m 30.178s 2025-09-26 05:24:02.210 3639 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 300
node2 3m 30.179s 2025-09-26 05:24:02.211 3640 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 300
Timestamp: 2025-09-26T05:24:00.177738001Z
Next consensus number: 7668
Legacy running event hash: 3ccd3025b74a872a3ba582c64ed48d56487f301303a3881a6583aaddb8c5fb7dddc504ad3d509946f6da0353239d5619
Legacy running event mnemonic: often-again-bitter-bracket
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 373112337
Root hash: 7f660cb35fa82ead22cea8d3d0cb9aa904f61e9ef4bb8437ac3e5f61cf4daec77abf04eccdf53df1594d7d32085992a6
(root) ConsistencyTestingToolState / under-nose-vital-stick
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 property-end-build-whip
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf 3118790652871586168 /3 brother-easy-cart-remember
  4 StringLeaf 300 /4 pet-void-figure-shaft
node2 3m 30.185s 2025-09-26 05:24:02.217 3641 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 30.185s 2025-09-26 05:24:02.217 3642 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 273 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 30.185s 2025-09-26 05:24:02.217 3643 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 30.191s 2025-09-26 05:24:02.223 3644 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 30.191s 2025-09-26 05:24:02.223 3645 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 300 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/300 {"round":300,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/300/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 29.231s 2025-09-26 05:25:01.263 4632 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 390 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 29.322s 2025-09-26 05:25:01.354 4618 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 390 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 29.392s 2025-09-26 05:25:01.424 4652 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 390 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 29.498s 2025-09-26 05:25:01.530 4666 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 390 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 29.611s 2025-09-26 05:25:01.643 4655 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 390 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/390
node3 4m 29.612s 2025-09-26 05:25:01.644 4656 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node3 4m 29.727s 2025-09-26 05:25:01.759 4691 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node3 4m 29.730s 2025-09-26 05:25:01.762 4692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 390
Timestamp: 2025-09-26T05:25:00.006298Z
Next consensus number: 9191
Legacy running event hash: 260ac1d101bbb68de9d4e49d4e1b60c88753d14fae6e40a5eafa15549d70b47228dcfb57e422db89b004146e039abffe
Legacy running event mnemonic: scout-shine-danger-common
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -317044113
Root hash: c9eb9eea09f4402aeb3a1d20eece61b95be3d6304f4b4af7b00566ba733bc647d05d01b1b85aafdef6718da6ce670485
(root) ConsistencyTestingToolState / section-carpet-direct-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 scatter-stock-rib-direct
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -3163772392335456346 /3 hammer-attack-crowd-health
  4 StringLeaf 390 /4 tilt-order-index-basket
node3 4m 29.736s 2025-09-26 05:25:01.768 4693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 29.736s 2025-09-26 05:25:01.768 4694 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 361 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 29.736s 2025-09-26 05:25:01.768 4695 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 29.743s 2025-09-26 05:25:01.775 4696 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 29.744s 2025-09-26 05:25:01.776 4697 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 390 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/390 {"round":390,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/390/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 29.745s 2025-09-26 05:25:01.777 4698 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node1 4m 29.878s 2025-09-26 05:25:01.910 4621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 390 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/390
node1 4m 29.879s 2025-09-26 05:25:01.911 4622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node2 4m 29.920s 2025-09-26 05:25:01.952 4669 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 390 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/390
node2 4m 29.921s 2025-09-26 05:25:01.953 4670 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node1 4m 29.971s 2025-09-26 05:25:02.003 4661 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node1 4m 29.973s 2025-09-26 05:25:02.005 4662 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 390
Timestamp: 2025-09-26T05:25:00.006298Z
Next consensus number: 9191
Legacy running event hash: 260ac1d101bbb68de9d4e49d4e1b60c88753d14fae6e40a5eafa15549d70b47228dcfb57e422db89b004146e039abffe
Legacy running event mnemonic: scout-shine-danger-common
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -317044113
Root hash: c9eb9eea09f4402aeb3a1d20eece61b95be3d6304f4b4af7b00566ba733bc647d05d01b1b85aafdef6718da6ce670485
(root) ConsistencyTestingToolState / section-carpet-direct-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 scatter-stock-rib-direct
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -3163772392335456346 /3 hammer-attack-crowd-health
  4 StringLeaf 390 /4 tilt-order-index-basket
node1 4m 29.979s 2025-09-26 05:25:02.011 4663 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 29.979s 2025-09-26 05:25:02.011 4664 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 361 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 29.979s 2025-09-26 05:25:02.011 4665 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 29.986s 2025-09-26 05:25:02.018 4674 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 29.987s 2025-09-26 05:25:02.019 4675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 390 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/390 {"round":390,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/390/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 29.989s 2025-09-26 05:25:02.021 4676 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node2 4m 30.001s 2025-09-26 05:25:02.033 4709 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node0 4m 30.002s 2025-09-26 05:25:02.034 4635 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 390 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/390
node0 4m 30.003s 2025-09-26 05:25:02.035 4636 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node2 4m 30.003s 2025-09-26 05:25:02.035 4710 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 390
Timestamp: 2025-09-26T05:25:00.006298Z
Next consensus number: 9191
Legacy running event hash: 260ac1d101bbb68de9d4e49d4e1b60c88753d14fae6e40a5eafa15549d70b47228dcfb57e422db89b004146e039abffe
Legacy running event mnemonic: scout-shine-danger-common
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -317044113
Root hash: c9eb9eea09f4402aeb3a1d20eece61b95be3d6304f4b4af7b00566ba733bc647d05d01b1b85aafdef6718da6ce670485
(root) ConsistencyTestingToolState / section-carpet-direct-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 scatter-stock-rib-direct
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -3163772392335456346 /3 hammer-attack-crowd-health
  4 StringLeaf 390 /4 tilt-order-index-basket
node2 4m 30.011s 2025-09-26 05:25:02.043 4711 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 30.011s 2025-09-26 05:25:02.043 4712 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 361 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 30.011s 2025-09-26 05:25:02.043 4713 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 30.018s 2025-09-26 05:25:02.050 4714 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 30.018s 2025-09-26 05:25:02.050 4715 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 390 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/390 {"round":390,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/390/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 30.020s 2025-09-26 05:25:02.052 4716 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node0 4m 30.090s 2025-09-26 05:25:02.122 4675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 390
node0 4m 30.092s 2025-09-26 05:25:02.124 4676 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 390
Timestamp: 2025-09-26T05:25:00.006298Z
Next consensus number: 9191
Legacy running event hash: 260ac1d101bbb68de9d4e49d4e1b60c88753d14fae6e40a5eafa15549d70b47228dcfb57e422db89b004146e039abffe
Legacy running event mnemonic: scout-shine-danger-common
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -317044113
Root hash: c9eb9eea09f4402aeb3a1d20eece61b95be3d6304f4b4af7b00566ba733bc647d05d01b1b85aafdef6718da6ce670485
(root) ConsistencyTestingToolState / section-carpet-direct-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 scatter-stock-rib-direct
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -3163772392335456346 /3 hammer-attack-crowd-health
  4 StringLeaf 390 /4 tilt-order-index-basket
node0 4m 30.100s 2025-09-26 05:25:02.132 4677 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 30.100s 2025-09-26 05:25:02.132 4678 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 361 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 30.101s 2025-09-26 05:25:02.133 4679 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 30.107s 2025-09-26 05:25:02.139 4680 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 30.108s 2025-09-26 05:25:02.140 4681 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 390 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/390 {"round":390,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/390/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 30.110s 2025-09-26 05:25:02.142 4682 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node3 5m 29.342s 2025-09-26 05:26:01.374 5644 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 475 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 29.411s 2025-09-26 05:26:01.443 5676 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 475 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 29.481s 2025-09-26 05:26:01.513 5662 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 475 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 29.744s 2025-09-26 05:26:01.776 5614 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 475 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 29.832s 2025-09-26 05:26:01.864 5647 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 475 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/475
node3 5m 29.832s 2025-09-26 05:26:01.864 5648 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node0 5m 29.901s 2025-09-26 05:26:01.933 5665 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 475 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/475
node0 5m 29.902s 2025-09-26 05:26:01.934 5666 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node2 5m 29.902s 2025-09-26 05:26:01.934 5681 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 475 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/475
node2 5m 29.902s 2025-09-26 05:26:01.934 5682 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node1 5m 29.905s 2025-09-26 05:26:01.937 5619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 475 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/475
node1 5m 29.906s 2025-09-26 05:26:01.938 5620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node3 5m 29.926s 2025-09-26 05:26:01.958 5689 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node3 5m 29.928s 2025-09-26 05:26:01.960 5690 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 475
Timestamp: 2025-09-26T05:26:00.026510818Z
Next consensus number: 10708
Legacy running event hash: 6e8f2d79ab370a57bd187aeb2f3e71457adb371f09340a93946fa88db856f51a8a022c895b6e1eb5c075930781d89fbc
Legacy running event mnemonic: spin-quality-odor-enough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1330144627
Root hash: 41af6f8a3d0a46bc3b513743bfff1ff82cf9e464aeb74503865ce32a7555bb50788b69579e4209942486f39340c80142
(root) ConsistencyTestingToolState / curious-vanish-pattern-stuff
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 arrange-post-birth-whisper
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -5344729486344857792 /3 van-primary-rifle-change
  4 StringLeaf 475 /4 wealth-holiday-orange-grit
node3 5m 29.935s 2025-09-26 05:26:01.967 5691 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 29.935s 2025-09-26 05:26:01.967 5692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 448 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 29.935s 2025-09-26 05:26:01.967 5693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 29.943s 2025-09-26 05:26:01.975 5694 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 29.944s 2025-09-26 05:26:01.976 5695 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 475 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/475 {"round":475,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/475/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 29.945s 2025-09-26 05:26:01.977 5696 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/13
node0 5m 29.985s 2025-09-26 05:26:02.017 5707 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node0 5m 29.987s 2025-09-26 05:26:02.019 5708 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 475
Timestamp: 2025-09-26T05:26:00.026510818Z
Next consensus number: 10708
Legacy running event hash: 6e8f2d79ab370a57bd187aeb2f3e71457adb371f09340a93946fa88db856f51a8a022c895b6e1eb5c075930781d89fbc
Legacy running event mnemonic: spin-quality-odor-enough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1330144627
Root hash: 41af6f8a3d0a46bc3b513743bfff1ff82cf9e464aeb74503865ce32a7555bb50788b69579e4209942486f39340c80142
(root) ConsistencyTestingToolState / curious-vanish-pattern-stuff
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 arrange-post-birth-whisper
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -5344729486344857792 /3 van-primary-rifle-change
  4 StringLeaf 475 /4 wealth-holiday-orange-grit
node2 5m 29.987s 2025-09-26 05:26:02.019 5723 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node2 5m 29.989s 2025-09-26 05:26:02.021 5724 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 475
Timestamp: 2025-09-26T05:26:00.026510818Z
Next consensus number: 10708
Legacy running event hash: 6e8f2d79ab370a57bd187aeb2f3e71457adb371f09340a93946fa88db856f51a8a022c895b6e1eb5c075930781d89fbc
Legacy running event mnemonic: spin-quality-odor-enough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1330144627
Root hash: 41af6f8a3d0a46bc3b513743bfff1ff82cf9e464aeb74503865ce32a7555bb50788b69579e4209942486f39340c80142
(root) ConsistencyTestingToolState / curious-vanish-pattern-stuff
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 arrange-post-birth-whisper
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -5344729486344857792 /3 van-primary-rifle-change
  4 StringLeaf 475 /4 wealth-holiday-orange-grit
node1 5m 29.990s 2025-09-26 05:26:02.022 5661 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 475
node1 5m 29.992s 2025-09-26 05:26:02.024 5662 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 475
Timestamp: 2025-09-26T05:26:00.026510818Z
Next consensus number: 10708
Legacy running event hash: 6e8f2d79ab370a57bd187aeb2f3e71457adb371f09340a93946fa88db856f51a8a022c895b6e1eb5c075930781d89fbc
Legacy running event mnemonic: spin-quality-odor-enough
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1330144627
Root hash: 41af6f8a3d0a46bc3b513743bfff1ff82cf9e464aeb74503865ce32a7555bb50788b69579e4209942486f39340c80142
(root) ConsistencyTestingToolState / curious-vanish-pattern-stuff
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 arrange-post-birth-whisper
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -5344729486344857792 /3 van-primary-rifle-change
  4 StringLeaf 475 /4 wealth-holiday-orange-grit
node0 5m 29.993s 2025-09-26 05:26:02.025 5709 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 29.994s 2025-09-26 05:26:02.026 5710 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 448 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 29.994s 2025-09-26 05:26:02.026 5711 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 29.996s 2025-09-26 05:26:02.028 5725 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 29.996s 2025-09-26 05:26:02.028 5726 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 448 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 29.996s 2025-09-26 05:26:02.028 5727 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 29.998s 2025-09-26 05:26:02.030 5663 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 29.999s 2025-09-26 05:26:02.031 5664 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 448 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 29.999s 2025-09-26 05:26:02.031 5665 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 30.002s 2025-09-26 05:26:02.034 5712 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 30.002s 2025-09-26 05:26:02.034 5713 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 475 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/475 {"round":475,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/475/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 30.004s 2025-09-26 05:26:02.036 5714 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/13
node2 5m 30.004s 2025-09-26 05:26:02.036 5728 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 30.005s 2025-09-26 05:26:02.037 5729 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 475 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/475 {"round":475,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/475/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 30.006s 2025-09-26 05:26:02.038 5730 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/13
node1 5m 30.007s 2025-09-26 05:26:02.039 5666 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 30.007s 2025-09-26 05:26:02.039 5667 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 475 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/475 {"round":475,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/475/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 30.009s 2025-09-26 05:26:02.041 5668 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/13
node4 5m 54.192s 2025-09-26 05:26:26.224 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 54.285s 2025-09-26 05:26:26.317 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 54.302s 2025-09-26 05:26:26.334 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 54.422s 2025-09-26 05:26:26.454 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 54.429s 2025-09-26 05:26:26.461 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 54.443s 2025-09-26 05:26:26.475 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 54.891s 2025-09-26 05:26:26.923 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 54.892s 2025-09-26 05:26:26.924 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 56.043s 2025-09-26 05:26:28.075 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1150ms
node4 5m 56.053s 2025-09-26 05:26:28.085 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 56.056s 2025-09-26 05:26:28.088 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 56.096s 2025-09-26 05:26:28.128 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listening on port: 9999
node4 5m 56.158s 2025-09-26 05:26:28.190 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 56.159s 2025-09-26 05:26:28.191 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 58.216s 2025-09-26 05:26:30.248 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 58.305s 2025-09-26 05:26:30.337 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 58.312s 2025-09-26 05:26:30.344 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/206/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/110/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/13/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/SignedState.swh
node4 5m 58.313s 2025-09-26 05:26:30.345 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 58.313s 2025-09-26 05:26:30.345 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/206/SignedState.swh
node4 5m 58.317s 2025-09-26 05:26:30.349 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 58.322s 2025-09-26 05:26:30.354 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 58.456s 2025-09-26 05:26:30.488 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 58.460s 2025-09-26 05:26:30.492 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":206,"consensusTimestamp":"2025-09-26T05:23:00.006106303Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 58.462s 2025-09-26 05:26:30.494 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 58.464s 2025-09-26 05:26:30.496 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 58.466s 2025-09-26 05:26:30.498 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 58.473s 2025-09-26 05:26:30.505 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 58.474s 2025-09-26 05:26:30.506 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.506s 2025-09-26 05:26:31.538 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26260941]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=150450, randomLong=-7977769515591875702, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=7060, randomLong=8373442978291244586, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1037860, data=35, exception=null]
OS Health Check Report - Complete (took 1016 ms)
node4 5m 59.531s 2025-09-26 05:26:31.563 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 59.620s 2025-09-26 05:26:31.652 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 278
node4 5m 59.623s 2025-09-26 05:26:31.655 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 59.628s 2025-09-26 05:26:31.660 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 59.703s 2025-09-26 05:26:31.735 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "Ij9eew==", "port": 30124 }, { "ipAddressV4": "CoAAEg==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "aMbUwA==", "port": 30125 }, { "ipAddressV4": "CoAAEQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I8FKkQ==", "port": 30126 }, { "ipAddressV4": "CoAAEA==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ih4HYg==", "port": 30127 }, { "ipAddressV4": "CoAAFQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IkNwDw==", "port": 30128 }, { "ipAddressV4": "CoAAFg==", "port": 30128 }] }] }
node4 5m 59.724s 2025-09-26 05:26:31.756 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 8483062483540997562.
node4 5m 59.725s 2025-09-26 05:26:31.757 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 206 rounds handled.
node4 5m 59.725s 2025-09-26 05:26:31.757 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 59.726s 2025-09-26 05:26:31.758 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6.009m 2025-09-26 05:26:32.549 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 206
Timestamp: 2025-09-26T05:23:00.006106303Z
Next consensus number: 5367
Legacy running event hash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
Legacy running event mnemonic: snake-defy-series-soup
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1992807165
Root hash: 55bc7db546d0ae6cfbff12cdaf2e7b0bccbbb6ee15a8257171536206df913cbfc4d091adf462eee8496bf5e254f5d5b8
(root) ConsistencyTestingToolState / resource-repeat-sure-crater
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 service-jealous-butter-type
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf 8483062483540997562 /3 broom-chapter-table-welcome
  4 StringLeaf 206 /4 trust-eyebrow-seminar-sound
node4 6.013m 2025-09-26 05:26:32.816 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: e997ee333e206e7f3fbb5a2190ae8529e6b26a086d66b08952c261095e9ed8e4190f027c9e63767dddf88b705b011771
node4 6.013m 2025-09-26 05:26:32.830 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 179
node4 6.013m 2025-09-26 05:26:32.837 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6.013m 2025-09-26 05:26:32.838 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6.013m 2025-09-26 05:26:32.840 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6.014m 2025-09-26 05:26:32.844 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6.014m 2025-09-26 05:26:32.845 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6.014m 2025-09-26 05:26:32.846 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6.014m 2025-09-26 05:26:32.848 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 179
node4 6.014m 2025-09-26 05:26:32.853 69 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 205.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6m 1.020s 2025-09-26 05:26:33.052 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:09173685b58e BR:204), num remaining: 4
node4 6m 1.022s 2025-09-26 05:26:33.054 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:02099d2f8c56 BR:205), num remaining: 3
node4 6m 1.022s 2025-09-26 05:26:33.054 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:a791fc188397 BR:204), num remaining: 2
node4 6m 1.023s 2025-09-26 05:26:33.055 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:e38c98fb70b7 BR:205), num remaining: 1
node4 6m 1.024s 2025-09-26 05:26:33.056 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:26d7147e4c2a BR:204), num remaining: 0
node4 6m 1.407s 2025-09-26 05:26:33.439 474 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 2,621 preconsensus events with max birth round 278. These events contained 6,718 transactions. 71 rounds reached consensus spanning 43.9 seconds of consensus time. The latest round to reach consensus is round 277. Replay took 589.0 milliseconds.
node4 6m 1.409s 2025-09-26 05:26:33.441 477 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6m 1.411s 2025-09-26 05:26:33.443 478 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 587.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6m 2.324s 2025-09-26 05:26:34.356 709 INFO PLATFORM_STATUS <platformForkJoinThread-8> DefaultStatusStateMachine: Platform spent 911.0 ms in OBSERVING. Now in BEHIND
node4 6m 2.324s 2025-09-26 05:26:34.356 710 INFO RECONNECT <platformForkJoinThread-4> ReconnectController: Starting ReconnectController
node4 6m 2.326s 2025-09-26 05:26:34.358 711 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 6m 2.326s 2025-09-26 05:26:34.358 712 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 6m 2.327s 2025-09-26 05:26:34.359 713 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 6m 2.329s 2025-09-26 05:26:34.361 714 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 6m 2.329s 2025-09-26 05:26:34.361 715 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node2 6m 2.563s 2025-09-26 05:26:34.595 6288 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":2,"otherNodeId":4,"round":526} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node2 6m 2.564s 2025-09-26 05:26:34.596 6289 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 526
Timestamp: 2025-09-26T05:26:32.506415Z
Next consensus number: 11530
Legacy running event hash: c8693fcfad1a8da0165645a2cb53c086895b556aff47921f812cf0b1af440185f660d66eccff9cb178c808f9ed004716
Legacy running event mnemonic: very-wheat-fresh-equal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1807756379
Root hash: 146ce67b1f38c78a026085413133f538b59aa1de2670a2b267302ccd49359e2c34142b21d4127189d73e7a2462cb4f29
(root) ConsistencyTestingToolState / mirror-cost-train-attract
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 peasant-chest-quantum-jump
  1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
  2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
  3 StringLeaf -2879152150151860743 /3 exile-someone-decrease-balcony
  4 StringLeaf 526 /4 plastic-rookie-simple-corn
node2 6m 2.564s 2025-09-26 05:26:34.596 6290 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Sending signatures from nodes 1, 2, 3 (signing weight = 37500000000/50000000000) for state hash 146ce67b1f38c78a026085413133f538b59aa1de2670a2b267302ccd49359e2c34142b21d4127189d73e7a2462cb4f29
node2 6m 2.565s 2025-09-26 05:26:34.597 6291 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node2 6m 2.569s 2025-09-26 05:26:34.601 6292 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node2 6m 2.576s 2025-09-26 05:26:34.608 6293 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@6aa77ba6 start run()
node4 6m 2.628s 2025-09-26 05:26:34.660 716 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":276} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6m 2.630s 2025-09-26 05:26:34.662 717 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 6m 2.635s 2025-09-26 05:26:34.667 718 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 1, 2, 3
node4 6m 2.638s 2025-09-26 05:26:34.670 719 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 6m 2.638s 2025-09-26 05:26:34.670 720 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 6m 2.639s 2025-09-26 05:26:34.671 721 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 2.644s 2025-09-26 05:26:34.676 722 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@25378594 start run()
node4 6m 2.648s 2025-09-26 05:26:34.680 723 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node2 6m 2.730s 2025-09-26 05:26:34.762 6320 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@6aa77ba6 finish run()
node2 6m 2.731s 2025-09-26 05:26:34.763 6321 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: finished sending tree
node2 6m 2.731s 2025-09-26 05:26:34.763 6322 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node2 6m 2.732s 2025-09-26 05:26:34.764 6323 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@23c534e9 start run()
node4 6m 2.842s 2025-09-26 05:26:34.874 747 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 2.842s 2025-09-26 05:26:34.874 748 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 2.843s 2025-09-26 05:26:34.875 749 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@25378594 finish run()
node4 6m 2.844s 2025-09-26 05:26:34.876 750 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 2.844s 2025-09-26 05:26:34.876 751 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 2.847s 2025-09-26 05:26:34.879 752 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3a6f1468 start run()
node4 6m 2.909s 2025-09-26 05:26:34.941 753 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 6m 2.909s 2025-09-26 05:26:34.941 754 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6m 2.912s 2025-09-26 05:26:34.944 755 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 2.912s 2025-09-26 05:26:34.944 756 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 2.913s 2025-09-26 05:26:34.945 757 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 2.913s 2025-09-26 05:26:34.945 758 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 2.913s 2025-09-26 05:26:34.945 759 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 2.914s 2025-09-26 05:26:34.946 760 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 2.914s 2025-09-26 05:26:34.946 761 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node2 6m 2.982s 2025-09-26 05:26:35.014 6324 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@23c534e9 finish run()
node2 6m 2.982s 2025-09-26 05:26:35.014 6325 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: finished sending tree
node2 6m 2.985s 2025-09-26 05:26:35.017 6328 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 6m 3.069s 2025-09-26 05:26:35.101 771 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 3.070s 2025-09-26 05:26:35.102 773 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 3.070s 2025-09-26 05:26:35.102 774 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 3.070s 2025-09-26 05:26:35.102 775 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 3.071s 2025-09-26 05:26:35.103 776 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3a6f1468 finish run()
node4 6m 3.072s 2025-09-26 05:26:35.104 777 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
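The learner pulls the state in two passes: first the tree rooted at the ConsistencyTestingToolState (route []) and then the VirtualMap at route [2]. A route is simply the list of child indices walked down from the root, and in the state dump above index 2 under the root is indeed the RosterService.ROSTERS VirtualMap. The toy tree below illustrates that addressing; the Node record is a stand-in, not the platform's merkle type.

    import java.util.List;

    // Minimal illustration of route-based addressing in a merkle tree: a route is the list
    // of child indices walked down from the root. Node is a stand-in type for illustration.
    public final class RouteDemo {

        record Node(String label, List<Node> children) {
            static Node leaf(String label) { return new Node(label, List.of()); }
        }

        /** Follows a route such as [] (the root itself) or [2] (the root's third child). */
        static Node resolve(Node root, List<Integer> route) {
            Node node = root;
            for (int index : route) {
                node = node.children().get(index);
            }
            return node;
        }

        public static void main(String[] args) {
            // Shape taken from the state dump above (children 0..4 of the root).
            Node root = new Node("ConsistencyTestingToolState", List.of(
                    Node.leaf("PlatformStateService.PLATFORM_STATE"),
                    Node.leaf("RosterService.ROSTER_STATE"),
                    Node.leaf("RosterService.ROSTERS (VirtualMap)"),
                    Node.leaf("StringLeaf /3"),
                    Node.leaf("StringLeaf /4")));
            System.out.println(resolve(root, List.of()).label());  // the root itself, route []
            System.out.println(resolve(root, List.of(2)).label()); // RosterService.ROSTERS (VirtualMap), route [2]
        }
    }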
node4 6m 3.072s 2025-09-26 05:26:35.104 778 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 6m 3.072s 2025-09-26 05:26:35.104 779 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 6m 3.072s 2025-09-26 05:26:35.104 780 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 6m 3.073s 2025-09-26 05:26:35.105 781 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 6m 3.073s 2025-09-26 05:26:35.105 782 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 6m 3.073s 2025-09-26 05:26:35.105 783 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 6m 3.074s 2025-09-26 05:26:35.106 784 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 6m 3.074s 2025-09-26 05:26:35.106 785 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 6m 3.077s 2025-09-26 05:26:35.109 786 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.434,"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 3.078s 2025-09-26 05:26:35.110 787 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 6m 3.078s 2025-09-26 05:26:35.110 788 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 6m 3.081s 2025-09-26 05:26:35.113 789 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.0060558319091796875} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
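The two payloads above are easy to cross-check: totalNodes (12) is the sum of leafNodes (7) and internalNodes (5), and the reported 0.0060558319091796875 dataMegabytes works out to exactly 6350 bytes if one assumes the payload divides a byte count by 1024 * 1024 (that divisor is an assumption for illustration, not something stated in the log).

    // Quick cross-check of the two payloads above. The 1024 * 1024 divisor is an assumption
    // about how dataMegabytes is derived; the node counts come straight from the log.
    public final class ReconnectPayloadCheck {
        public static void main(String[] args) {
            int leafNodes = 7, internalNodes = 5, totalNodes = 12;        // SynchronizationCompletePayload
            System.out.println(totalNodes == leafNodes + internalNodes);  // true

            double dataMegabytes = 0.0060558319091796875;                 // ReconnectDataUsagePayload
            double bytes = dataMegabytes * 1024 * 1024;
            System.out.println(bytes);                                    // 6350.0 bytes transferred
        }
    }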
node4 6m 3.084s 2025-09-26 05:26:35.116 790 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":526,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 3.085s 2025-09-26 05:26:35.117 791 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 526
Timestamp: 2025-09-26T05:26:32.506415Z
Next consensus number: 11530
Legacy running event hash: c8693fcfad1a8da0165645a2cb53c086895b556aff47921f812cf0b1af440185f660d66eccff9cb178c808f9ed004716
Legacy running event mnemonic: very-wheat-fresh-equal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1807756379
Root hash: 146ce67b1f38c78a026085413133f538b59aa1de2670a2b267302ccd49359e2c34142b21d4127189d73e7a2462cb4f29
(root) ConsistencyTestingToolState / mirror-cost-train-attract
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 peasant-chest-quantum-jump
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -2879152150151860743 /3 exile-someone-decrease-balcony
    4 StringLeaf 526 /4 plastic-rookie-simple-corn
node4 6m 3.086s 2025-09-26 05:26:35.118 793 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 6m 3.086s 2025-09-26 05:26:35.118 794 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long -2879152150151860743.
node4 6m 3.086s 2025-09-26 05:26:35.118 795 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 526 rounds handled.
node4 6m 3.087s 2025-09-26 05:26:35.119 796 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 3.087s 2025-09-26 05:26:35.119 797 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 3.112s 2025-09-26 05:26:35.144 804 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 526 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 3.112s 2025-09-26 05:26:35.144 805 INFO PLATFORM_STATUS <platformForkJoinThread-4> DefaultStatusStateMachine: Platform spent 787.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 3.113s 2025-09-26 05:26:35.145 806 INFO STARTUP <platformForkJoinThread-3> Shadowgraph: Shadowgraph starting from expiration threshold 499
node4 6m 3.114s 2025-09-26 05:26:35.146 809 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 526 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/526
node4 6m 3.116s 2025-09-26 05:26:35.148 810 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 526
node4 6m 3.133s 2025-09-26 05:26:35.165 820 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: c8693fcfad1a8da0165645a2cb53c086895b556aff47921f812cf0b1af440185f660d66eccff9cb178c808f9ed004716
node4 6m 3.134s 2025-09-26 05:26:35.166 821 INFO STARTUP <platformForkJoinThread-7> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr278_orgn0.pces. All future files will have an origin round of 526.
node2 6m 3.154s 2025-09-26 05:26:35.186 6329 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":2,"otherNodeId":4,"round":526,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 3.269s 2025-09-26 05:26:35.301 855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 526
node4 6m 3.273s 2025-09-26 05:26:35.305 856 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 526
Timestamp: 2025-09-26T05:26:32.506415Z
Next consensus number: 11530
Legacy running event hash: c8693fcfad1a8da0165645a2cb53c086895b556aff47921f812cf0b1af440185f660d66eccff9cb178c808f9ed004716
Legacy running event mnemonic: very-wheat-fresh-equal
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1807756379
Root hash: 146ce67b1f38c78a026085413133f538b59aa1de2670a2b267302ccd49359e2c34142b21d4127189d73e7a2462cb4f29
(root) ConsistencyTestingToolState / mirror-cost-train-attract
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 peasant-chest-quantum-jump
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -2879152150151860743 /3 exile-someone-decrease-balcony
    4 StringLeaf 526 /4 plastic-rookie-simple-corn
node4 6m 3.322s 2025-09-26 05:26:35.354 857 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr278_orgn0.pces
node4 6m 3.323s 2025-09-26 05:26:35.355 858 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 499
node4 6m 3.331s 2025-09-26 05:26:35.363 859 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 526 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/526 {"round":526,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/526/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
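The preconsensus event stream file names appear to encode their metadata directly: ..._seq0_minr1_maxr278_orgn0.pces reads as sequence 0, rounds 1 through 278, origin round 0. The warning above is consistent with a copy rule that keeps only files whose maximum round reaches the lower bound: node4's single file tops out at round 278, below the bound of 499, so nothing is copied, while files that clear the bound, such as the seq1 files ending at round 5474 on the other nodes, are copied at the later snapshots. The sketch below parses those fields and applies that inferred rule; it is not the platform's actual implementation.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Sketch only: parses the metadata that the .pces file names above appear to encode
    // (sequence, min round, max round, origin round) and applies the copy rule inferred
    // from the log: keep a file when its max round is at or above the lower bound.
    // Neither the record nor the rule is the platform's real implementation.
    public final class PcesFileFilter {

        record PcesFile(long sequence, long minRound, long maxRound, long originRound) {}

        private static final Pattern NAME =
                Pattern.compile(".*_seq(\\d+)_minr(\\d+)_maxr(\\d+)_orgn(\\d+)\\.pces");

        static PcesFile parse(String fileName) {
            Matcher m = NAME.matcher(fileName);
            if (!m.matches()) throw new IllegalArgumentException(fileName);
            return new PcesFile(Long.parseLong(m.group(1)), Long.parseLong(m.group(2)),
                    Long.parseLong(m.group(3)), Long.parseLong(m.group(4)));
        }

        public static void main(String[] args) {
            long lowerBound = 499; // from the warning above
            PcesFile node4File = parse("2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr278_orgn0.pces");
            System.out.println(node4File.maxRound() >= lowerBound); // false -> nothing to copy on node4
        }
    }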
node4 6m 3.335s 2025-09-26 05:26:35.367 860 INFO PLATFORM_STATUS <platformForkJoinThread-7> DefaultStatusStateMachine: Platform spent 221.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 3.819s 2025-09-26 05:26:35.851 861 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 3.821s 2025-09-26 05:26:35.853 862 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 4.216s 2025-09-26 05:26:36.248 863 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:a7b3a85b309c BR:524), num remaining: 3
node4 6m 4.220s 2025-09-26 05:26:36.252 864 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:6aeb6efc8b9f BR:524), num remaining: 2
node4 6m 4.221s 2025-09-26 05:26:36.253 865 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:babcff7dee2e BR:524), num remaining: 1
node4 6m 4.221s 2025-09-26 05:26:36.253 866 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:42b2d4eb4bc3 BR:524), num remaining: 0
node4 6m 7.562s 2025-09-26 05:26:39.594 946 INFO PLATFORM_STATUS <platformForkJoinThread-5> DefaultStatusStateMachine: Platform spent 4.2 s in CHECKING. Now in ACTIVE
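Node4's platform status now walks BEHIND -> RECONNECT_COMPLETE -> CHECKING -> ACTIVE, spending 787 ms, 221 ms, and about 4.2 s in the intermediate states. The replay below just tabulates those observed transitions; the enum is limited to the states seen in this log and is not the platform's full status state machine.

    import java.time.Duration;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Replay of node4's observed status transitions; only the states seen in this log.
    public final class StatusReplay {

        enum Status { BEHIND, RECONNECT_COMPLETE, CHECKING, ACTIVE }

        public static void main(String[] args) {
            // Time spent in each state before the transition, taken from the three log lines above.
            Map<Status, Duration> observed = new LinkedHashMap<>();
            observed.put(Status.BEHIND, Duration.ofMillis(787));
            observed.put(Status.RECONNECT_COMPLETE, Duration.ofMillis(221));
            observed.put(Status.CHECKING, Duration.ofMillis(4200));

            Duration total = observed.values().stream().reduce(Duration.ZERO, Duration::plus);
            observed.forEach((status, time) -> System.out.println(status + " for " + time.toMillis() + " ms"));
            System.out.println("BEHIND -> ACTIVE in roughly " + total.toMillis() + " ms"); // ~5208 ms
        }
    }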
node2 6m 29.760s 2025-09-26 05:27:01.792 6808 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 569 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 29.831s 2025-09-26 05:27:01.863 1324 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 569 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 29.903s 2025-09-26 05:27:01.935 6689 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 569 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 6m 29.907s 2025-09-26 05:27:01.939 6715 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 569 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 29.910s 2025-09-26 05:27:01.942 6729 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 569 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 30.225s 2025-09-26 05:27:02.257 6692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 569 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/569
node1 6m 30.226s 2025-09-26 05:27:02.258 6693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 569
node2 6m 30.261s 2025-09-26 05:27:02.293 6811 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 569 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/569
node2 6m 30.262s 2025-09-26 05:27:02.294 6812 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 569
node4 6m 30.318s 2025-09-26 05:27:02.350 1327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 569 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/569
node1 6m 30.319s 2025-09-26 05:27:02.351 6724 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 569
node1 6m 30.321s 2025-09-26 05:27:02.353 6725 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 569
Timestamp: 2025-09-26T05:27:00.516136Z
Next consensus number: 12622
Legacy running event hash: b7443b14df2394a791bfe614de464262c8dc7986bb83e79441da034bd6f3dc0355d71f779f8aa3dfd95bb5dbc1dc8f96
Legacy running event mnemonic: bleak-glad-brick-hen
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -935597965
Root hash: cf0fb2377a2a99c3f7c628e1c6c8950caf75ae776a3b2ef77a7fbaf8b7cd4ed5d6591ef19167d3e9e562695dd0a94c19
(root) ConsistencyTestingToolState / bullet-rice-earn-sibling
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 crash-bone-brief-guess
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -8539900176799430457 /3 victory-novel-about-observe
    4 StringLeaf 569 /4 night-build-miracle-pledge
node4 6m 30.322s 2025-09-26 05:27:02.354 1328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 569
node1 6m 30.329s 2025-09-26 05:27:02.361 6726 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+26+19.042073740Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 30.330s 2025-09-26 05:27:02.362 6718 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 569 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/569
node3 6m 30.331s 2025-09-26 05:27:02.363 6719 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 569
node0 6m 30.332s 2025-09-26 05:27:02.364 6732 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 569 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/569
node0 6m 30.332s 2025-09-26 05:27:02.364 6733 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 569
node1 6m 30.332s 2025-09-26 05:27:02.364 6727 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 542 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+26+19.042073740Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 30.332s 2025-09-26 05:27:02.364 6728 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 30.334s 2025-09-26 05:27:02.366 6729 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 30.334s 2025-09-26 05:27:02.366 6730 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 569 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/569 {"round":569,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/569/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 30.336s 2025-09-26 05:27:02.368 6731 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/110
node2 6m 30.347s 2025-09-26 05:27:02.379 6855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 569
node2 6m 30.348s 2025-09-26 05:27:02.380 6856 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 569
Timestamp: 2025-09-26T05:27:00.516136Z
Next consensus number: 12622
Legacy running event hash: b7443b14df2394a791bfe614de464262c8dc7986bb83e79441da034bd6f3dc0355d71f779f8aa3dfd95bb5dbc1dc8f96
Legacy running event mnemonic: bleak-glad-brick-hen
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -935597965
Root hash: cf0fb2377a2a99c3f7c628e1c6c8950caf75ae776a3b2ef77a7fbaf8b7cd4ed5d6591ef19167d3e9e562695dd0a94c19
(root) ConsistencyTestingToolState / bullet-rice-earn-sibling
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 crash-bone-brief-guess
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -8539900176799430457 /3 victory-novel-about-observe
    4 StringLeaf 569 /4 night-build-miracle-pledge
node2 6m 30.353s 2025-09-26 05:27:02.385 6857 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+26+19.071216276Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 30.354s 2025-09-26 05:27:02.386 6858 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 542 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+26+19.071216276Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 30.354s 2025-09-26 05:27:02.386 6859 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 30.355s 2025-09-26 05:27:02.387 6860 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 30.355s 2025-09-26 05:27:02.387 6861 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 569 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/569 {"round":569,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/569/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 30.357s 2025-09-26 05:27:02.389 6862 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/110
node0 6m 30.416s 2025-09-26 05:27:02.448 6768 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 569
node0 6m 30.418s 2025-09-26 05:27:02.450 6769 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 569
Timestamp: 2025-09-26T05:27:00.516136Z
Next consensus number: 12622
Legacy running event hash: b7443b14df2394a791bfe614de464262c8dc7986bb83e79441da034bd6f3dc0355d71f779f8aa3dfd95bb5dbc1dc8f96
Legacy running event mnemonic: bleak-glad-brick-hen
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -935597965
Root hash: cf0fb2377a2a99c3f7c628e1c6c8950caf75ae776a3b2ef77a7fbaf8b7cd4ed5d6591ef19167d3e9e562695dd0a94c19
(root) ConsistencyTestingToolState / bullet-rice-earn-sibling
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 crash-bone-brief-guess
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -8539900176799430457 /3 victory-novel-about-observe
    4 StringLeaf 569 /4 night-build-miracle-pledge
node3 6m 30.423s 2025-09-26 05:27:02.455 6750 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 569
node0 6m 30.425s 2025-09-26 05:27:02.457 6770 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+26+19.060584658Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 30.425s 2025-09-26 05:27:02.457 6751 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 569
Timestamp: 2025-09-26T05:27:00.516136Z
Next consensus number: 12622
Legacy running event hash: b7443b14df2394a791bfe614de464262c8dc7986bb83e79441da034bd6f3dc0355d71f779f8aa3dfd95bb5dbc1dc8f96
Legacy running event mnemonic: bleak-glad-brick-hen
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -935597965
Root hash: cf0fb2377a2a99c3f7c628e1c6c8950caf75ae776a3b2ef77a7fbaf8b7cd4ed5d6591ef19167d3e9e562695dd0a94c19
(root) ConsistencyTestingToolState / bullet-rice-earn-sibling
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 crash-bone-brief-guess
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -8539900176799430457 /3 victory-novel-about-observe
    4 StringLeaf 569 /4 night-build-miracle-pledge
node0 6m 30.428s 2025-09-26 05:27:02.460 6771 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 542 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+26+19.060584658Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 30.428s 2025-09-26 05:27:02.460 6772 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 30.430s 2025-09-26 05:27:02.462 6773 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 30.430s 2025-09-26 05:27:02.462 6774 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 569 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/569 {"round":569,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/569/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 30.431s 2025-09-26 05:27:02.463 6775 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/110
node3 6m 30.431s 2025-09-26 05:27:02.463 6752 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+26+19.071015685Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 30.434s 2025-09-26 05:27:02.466 6753 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 542 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+26+19.071015685Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 30.434s 2025-09-26 05:27:02.466 6754 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 30.435s 2025-09-26 05:27:02.467 6755 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 30.436s 2025-09-26 05:27:02.468 6756 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 569 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/569 {"round":569,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/569/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 30.437s 2025-09-26 05:27:02.469 6757 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/110
node4 6m 30.443s 2025-09-26 05:27:02.475 1366 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 569
node4 6m 30.446s 2025-09-26 05:27:02.478 1367 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 569
Timestamp: 2025-09-26T05:27:00.516136Z
Next consensus number: 12622
Legacy running event hash: b7443b14df2394a791bfe614de464262c8dc7986bb83e79441da034bd6f3dc0355d71f779f8aa3dfd95bb5dbc1dc8f96
Legacy running event mnemonic: bleak-glad-brick-hen
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -935597965
Root hash: cf0fb2377a2a99c3f7c628e1c6c8950caf75ae776a3b2ef77a7fbaf8b7cd4ed5d6591ef19167d3e9e562695dd0a94c19
(root) ConsistencyTestingToolState / bullet-rice-earn-sibling
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 crash-bone-brief-guess
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf -8539900176799430457 /3 victory-novel-about-observe
    4 StringLeaf 569 /4 night-build-miracle-pledge
node4 6m 30.460s 2025-09-26 05:27:02.492 1368 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+26+35.687670857Z_seq1_minr499_maxr999_orgn526.pces Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr278_orgn0.pces
node4 6m 30.460s 2025-09-26 05:27:02.492 1369 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 542 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+26+35.687670857Z_seq1_minr499_maxr999_orgn526.pces
node4 6m 30.461s 2025-09-26 05:27:02.493 1370 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 30.463s 2025-09-26 05:27:02.495 1371 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 30.464s 2025-09-26 05:27:02.496 1372 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 569 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/569 {"round":569,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/569/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 30.466s 2025-09-26 05:27:02.498 1373 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
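The PERIODIC_SNAPSHOT passes land at rounds 569 and 662, and their consensus timestamps are just under a minute apart, which also gives a rough average round time over that window; the deletions of the old round-110 (and, on node4, round-1) directories show that only a bounded number of saved states is kept on disk. The arithmetic below is derived purely from the two timestamps in the state dumps above and says nothing about the configured save period.

    import java.time.Duration;
    import java.time.Instant;

    // Back-of-the-envelope numbers from the two snapshot passes above; the roughly
    // one-minute cadence is an observation from this log, not a configuration claim.
    public final class SnapshotCadence {
        public static void main(String[] args) {
            Instant round569 = Instant.parse("2025-09-26T05:27:00.516136Z");
            Instant round662 = Instant.parse("2025-09-26T05:28:00.186989Z");

            Duration wallClock = Duration.between(round569, round662);
            long rounds = 662 - 569;
            System.out.println(wallClock);                                            // PT59.670853S
            System.out.printf("%.2f ms per round%n", wallClock.toNanos() / 1e6 / rounds); // ~641.62 ms
        }
    }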
node2 7m 29.150s 2025-09-26 05:28:01.182 7918 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 662 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 29.182s 2025-09-26 05:28:01.214 7799 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 662 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 29.271s 2025-09-26 05:28:01.303 7819 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 662 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 29.366s 2025-09-26 05:28:01.398 2409 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 662 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 29.383s 2025-09-26 05:28:01.415 7791 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 662 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 29.555s 2025-09-26 05:28:01.587 7931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 662 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/662
node2 7m 29.556s 2025-09-26 05:28:01.588 7932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 662
node3 7m 29.571s 2025-09-26 05:28:01.603 7812 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 662 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/662
node3 7m 29.572s 2025-09-26 05:28:01.604 7813 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 662
node1 7m 29.622s 2025-09-26 05:28:01.654 7794 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 662 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/662
node1 7m 29.623s 2025-09-26 05:28:01.655 7795 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 662
node0 7m 29.625s 2025-09-26 05:28:01.657 7832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 662 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/662
node0 7m 29.626s 2025-09-26 05:28:01.658 7833 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 662
node2 7m 29.640s 2025-09-26 05:28:01.672 7963 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 662
node2 7m 29.642s 2025-09-26 05:28:01.674 7964 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 662
Timestamp: 2025-09-26T05:28:00.186989Z
Next consensus number: 15095
Legacy running event hash: 50798feaf1e3ad243c9b4a40e152d73a2802e81cea785a94d38af84bab2328c8b0da46e15bcf0be2892865f1592100db
Legacy running event mnemonic: august-actress-senior-fortune
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1181861837
Root hash: ee552c65d252d1d5734b48041deb27ee647fa0c5677cbefa422a11d5120943d180050d23013cc5f95248af7a1feeaf45
(root) ConsistencyTestingToolState / cube-million-remind-shell
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keen-access-actual-extra
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 4649544575044945589 /3 agree-teach-guess-bind
    4 StringLeaf 662 /4 permit-donate-history-wide
node2 7m 29.648s 2025-09-26 05:28:01.680 7965 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+20+48.332556240Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+26+19.071216276Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 29.648s 2025-09-26 05:28:01.680 7966 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 634 File: data/saved/preconsensus-events/2/2025/09/26/2025-09-26T05+26+19.071216276Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 29.648s 2025-09-26 05:28:01.680 7967 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 29.651s 2025-09-26 05:28:01.683 7968 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 29.652s 2025-09-26 05:28:01.684 7969 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 662 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/662 {"round":662,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/662/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 29.653s 2025-09-26 05:28:01.685 7970 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/206
node3 7m 29.666s 2025-09-26 05:28:01.698 7844 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 662
node3 7m 29.668s 2025-09-26 05:28:01.700 7845 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 662
Timestamp: 2025-09-26T05:28:00.186989Z
Next consensus number: 15095
Legacy running event hash: 50798feaf1e3ad243c9b4a40e152d73a2802e81cea785a94d38af84bab2328c8b0da46e15bcf0be2892865f1592100db
Legacy running event mnemonic: august-actress-senior-fortune
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1181861837
Root hash: ee552c65d252d1d5734b48041deb27ee647fa0c5677cbefa422a11d5120943d180050d23013cc5f95248af7a1feeaf45
(root) ConsistencyTestingToolState / cube-million-remind-shell
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keen-access-actual-extra
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 4649544575044945589 /3 agree-teach-guess-bind
    4 StringLeaf 662 /4 permit-donate-history-wide
node3 7m 29.675s 2025-09-26 05:28:01.707 7854 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+26+19.071015685Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+20+48.750781255Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 29.676s 2025-09-26 05:28:01.708 7855 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 634 File: data/saved/preconsensus-events/3/2025/09/26/2025-09-26T05+26+19.071015685Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 29.676s 2025-09-26 05:28:01.708 7856 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 29.679s 2025-09-26 05:28:01.711 7857 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 29.680s 2025-09-26 05:28:01.712 7858 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 662 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/662 {"round":662,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/662/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 29.681s 2025-09-26 05:28:01.713 7859 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/206
node0 7m 29.708s 2025-09-26 05:28:01.740 7864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 662
node0 7m 29.710s 2025-09-26 05:28:01.742 7865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 662
Timestamp: 2025-09-26T05:28:00.186989Z
Next consensus number: 15095
Legacy running event hash: 50798feaf1e3ad243c9b4a40e152d73a2802e81cea785a94d38af84bab2328c8b0da46e15bcf0be2892865f1592100db
Legacy running event mnemonic: august-actress-senior-fortune
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1181861837
Root hash: ee552c65d252d1d5734b48041deb27ee647fa0c5677cbefa422a11d5120943d180050d23013cc5f95248af7a1feeaf45
(root) ConsistencyTestingToolState / cube-million-remind-shell
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keen-access-actual-extra
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 4649544575044945589 /3 agree-teach-guess-bind
    4 StringLeaf 662 /4 permit-donate-history-wide
node0 7m 29.718s 2025-09-26 05:28:01.750 7866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+26+19.060584658Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+20+48.144926937Z_seq0_minr1_maxr501_orgn0.pces
node0 7m 29.718s 2025-09-26 05:28:01.750 7867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 634 File: data/saved/preconsensus-events/0/2025/09/26/2025-09-26T05+26+19.060584658Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 29.718s 2025-09-26 05:28:01.750 7868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 29.721s 2025-09-26 05:28:01.753 7869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 29.722s 2025-09-26 05:28:01.754 7870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 662 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/662 {"round":662,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/662/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 29.723s 2025-09-26 05:28:01.755 7871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/206
node1 7m 29.724s 2025-09-26 05:28:01.756 7826 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 662
node1 7m 29.726s 2025-09-26 05:28:01.758 7827 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 662
Timestamp: 2025-09-26T05:28:00.186989Z
Next consensus number: 15095
Legacy running event hash: 50798feaf1e3ad243c9b4a40e152d73a2802e81cea785a94d38af84bab2328c8b0da46e15bcf0be2892865f1592100db
Legacy running event mnemonic: august-actress-senior-fortune
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1181861837
Root hash: ee552c65d252d1d5734b48041deb27ee647fa0c5677cbefa422a11d5120943d180050d23013cc5f95248af7a1feeaf45
(root) ConsistencyTestingToolState / cube-million-remind-shell
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keen-access-actual-extra
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 4649544575044945589 /3 agree-teach-guess-bind
    4 StringLeaf 662 /4 permit-donate-history-wide
node1 7m 29.733s 2025-09-26 05:28:01.765 7828 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+20+48.392292879Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+26+19.042073740Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 29.733s 2025-09-26 05:28:01.765 7829 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 634 File: data/saved/preconsensus-events/1/2025/09/26/2025-09-26T05+26+19.042073740Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 29.733s 2025-09-26 05:28:01.765 7830 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 29.737s 2025-09-26 05:28:01.769 7831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 29.737s 2025-09-26 05:28:01.769 7832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 662 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/662 {"round":662,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/662/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 29.738s 2025-09-26 05:28:01.770 7833 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/206
node4 7m 29.760s 2025-09-26 05:28:01.792 2412 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 662 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/662
node4 7m 29.762s 2025-09-26 05:28:01.794 2413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 662
node4 7m 29.872s 2025-09-26 05:28:01.904 2459 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 662
node4 7m 29.874s 2025-09-26 05:28:01.906 2460 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 662
Timestamp: 2025-09-26T05:28:00.186989Z
Next consensus number: 15095
Legacy running event hash: 50798feaf1e3ad243c9b4a40e152d73a2802e81cea785a94d38af84bab2328c8b0da46e15bcf0be2892865f1592100db
Legacy running event mnemonic: august-actress-senior-fortune
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1181861837
Root hash: ee552c65d252d1d5734b48041deb27ee647fa0c5677cbefa422a11d5120943d180050d23013cc5f95248af7a1feeaf45
(root) ConsistencyTestingToolState / cube-million-remind-shell
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 keen-access-actual-extra
    1 SingletonNode RosterService.ROSTER_STATE /1 verify-warrior-speak-dress
    2 VirtualMap RosterService.ROSTERS /2 fancy-struggle-scatter-invite
    3 StringLeaf 4649544575044945589 /3 agree-teach-guess-bind
    4 StringLeaf 662 /4 permit-donate-history-wide
node4 7m 29.884s 2025-09-26 05:28:01.916 2461 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+26+35.687670857Z_seq1_minr499_maxr999_orgn526.pces Last file: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+20+48.485855987Z_seq0_minr1_maxr278_orgn0.pces
node4 7m 29.884s 2025-09-26 05:28:01.916 2462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 634 File: data/saved/preconsensus-events/4/2025/09/26/2025-09-26T05+26+35.687670857Z_seq1_minr499_maxr999_orgn526.pces
node4 7m 29.884s 2025-09-26 05:28:01.916 2463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 29.887s 2025-09-26 05:28:01.919 2464 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 29.888s 2025-09-26 05:28:01.920 2465 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 662 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/662 {"round":662,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/662/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 29.890s 2025-09-26 05:28:01.922 2466 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/13
node3 7m 56.438s 2025-09-26 05:28:28.470 8323 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 3 to 0>> NetworkUtils: Connection broken: 3 <- 0
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T05:28:28.469677313Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T05:28:28.469677313Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node3 7m 56.575s 2025-09-26 05:28:28.607 8324 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 3 to 2>> NetworkUtils: Connection broken: 3 <- 2
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node3 7m 56.677s 2025-09-26 05:28:28.709 8325 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 3 to 1>> NetworkUtils: Connection broken: 3 <- 1
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node3 7m 56.709s 2025-09-26 05:28:28.741 8326 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T05:28:28.741055079Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-09-26T05:28:28.741055079Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more