node3 0.000ns 2025-10-30 18:18:14.789 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 85.000ms 2025-10-30 18:18:14.874 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 86.000ms 2025-10-30 18:18:14.875 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 100.000ms 2025-10-30 18:18:14.889 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 174.000ms 2025-10-30 18:18:14.963 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 190.000ms 2025-10-30 18:18:14.979 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 207.000ms 2025-10-30 18:18:14.996 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 235.000ms 2025-10-30 18:18:15.024 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 298.000ms 2025-10-30 18:18:15.087 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 327.000ms 2025-10-30 18:18:15.116 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 972.000ms 2025-10-30 18:18:15.761 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 998.000ms 2025-10-30 18:18:15.787 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 1.059s 2025-10-30 18:18:15.848 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.075s 2025-10-30 18:18:15.864 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 1.089s 2025-10-30 18:18:15.878 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 1.106s 2025-10-30 18:18:15.895 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.184s 2025-10-30 18:18:15.973 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 1.213s 2025-10-30 18:18:16.002 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.221s 2025-10-30 18:18:16.010 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 1.253s 2025-10-30 18:18:16.042 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 1.513s 2025-10-30 18:18:16.302 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1276ms
node3 1.524s 2025-10-30 18:18:16.313 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 1.527s 2025-10-30 18:18:16.316 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.532s 2025-10-30 18:18:16.321 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1204ms
node2 1.540s 2025-10-30 18:18:16.329 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.542s 2025-10-30 18:18:16.331 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 1.572s 2025-10-30 18:18:16.361 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.576s 2025-10-30 18:18:16.365 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 1.584s 2025-10-30 18:18:16.373 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node3 1.633s 2025-10-30 18:18:16.422 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 1.633s 2025-10-30 18:18:16.422 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 1.641s 2025-10-30 18:18:16.430 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.642s 2025-10-30 18:18:16.431 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 1.677s 2025-10-30 18:18:16.466 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 1.694s 2025-10-30 18:18:16.483 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 1.809s 2025-10-30 18:18:16.598 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 1.840s 2025-10-30 18:18:16.629 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 2.467s 2025-10-30 18:18:17.256 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1252ms
node4 2.478s 2025-10-30 18:18:17.267 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 2.481s 2025-10-30 18:18:17.270 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 2.523s 2025-10-30 18:18:17.312 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 2.585s 2025-10-30 18:18:17.374 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 2.586s 2025-10-30 18:18:17.375 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 2.670s 2025-10-30 18:18:17.459 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1415ms
node1 2.679s 2025-10-30 18:18:17.468 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 2.683s 2025-10-30 18:18:17.472 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 2.724s 2025-10-30 18:18:17.513 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 2.801s 2025-10-30 18:18:17.590 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 2.803s 2025-10-30 18:18:17.592 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 3.100s 2025-10-30 18:18:17.889 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1258ms
node0 3.109s 2025-10-30 18:18:17.898 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 3.112s 2025-10-30 18:18:17.901 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 3.163s 2025-10-30 18:18:17.952 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 3.226s 2025-10-30 18:18:18.015 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 3.227s 2025-10-30 18:18:18.016 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 3.599s 2025-10-30 18:18:18.388 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 3.673s 2025-10-30 18:18:18.462 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 3.697s 2025-10-30 18:18:18.486 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 3.700s 2025-10-30 18:18:18.489 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 3.735s 2025-10-30 18:18:18.524 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 3.759s 2025-10-30 18:18:18.548 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 3.761s 2025-10-30 18:18:18.550 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 3.794s 2025-10-30 18:18:18.583 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 4.478s 2025-10-30 18:18:19.267 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.480s 2025-10-30 18:18:19.269 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 4.486s 2025-10-30 18:18:19.275 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 4.496s 2025-10-30 18:18:19.285 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.497s 2025-10-30 18:18:19.286 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.561s 2025-10-30 18:18:19.350 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.563s 2025-10-30 18:18:19.352 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.568s 2025-10-30 18:18:19.357 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.577s 2025-10-30 18:18:19.366 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.579s 2025-10-30 18:18:19.368 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.611s 2025-10-30 18:18:19.400 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 4.705s 2025-10-30 18:18:19.494 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.709s 2025-10-30 18:18:19.498 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 4.745s 2025-10-30 18:18:19.534 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 4.863s 2025-10-30 18:18:19.652 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 4.966s 2025-10-30 18:18:19.755 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.969s 2025-10-30 18:18:19.758 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 5.005s 2025-10-30 18:18:19.794 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 5.294s 2025-10-30 18:18:20.083 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 5.396s 2025-10-30 18:18:20.185 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.398s 2025-10-30 18:18:20.187 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 5.434s 2025-10-30 18:18:20.223 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5.528s 2025-10-30 18:18:20.317 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.530s 2025-10-30 18:18:20.319 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5.538s 2025-10-30 18:18:20.327 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 5.552s 2025-10-30 18:18:20.341 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.555s 2025-10-30 18:18:20.344 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.617s 2025-10-30 18:18:20.406 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26294413]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=201170, randomLong=2227812170395008988, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9240, randomLong=7019830592653063681, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1059470, data=35, exception=null]
OS Health Check Report - Complete (took 1022 ms)
node3 5.645s 2025-10-30 18:18:20.434 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 5.653s 2025-10-30 18:18:20.442 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 5.655s 2025-10-30 18:18:20.444 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 5.691s 2025-10-30 18:18:20.480 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26337733]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=242540, randomLong=-6133820656988547914, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8740, randomLong=-658375362765559644, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1538150, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node2 5.722s 2025-10-30 18:18:20.511 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.730s 2025-10-30 18:18:20.519 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 5.731s 2025-10-30 18:18:20.520 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 5.732s 2025-10-30 18:18:20.521 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IgqYQw==", "port": 30124 }, { "ipAddressV4": "CoAASQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iqq2gg==", "port": 30125 }, { "ipAddressV4": "CoAASg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjnI6w==", "port": 30126 }, { "ipAddressV4": "CoAANw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ioh/yQ==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ij1K6g==", "port": 30128 }, { "ipAddressV4": "CoAAQA==", "port": 30128 }] }] }
node3 5.754s 2025-10-30 18:18:20.543 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 5.755s 2025-10-30 18:18:20.544 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 5.767s 2025-10-30 18:18:20.556 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: ad8577967f5791194c1a2b78596fbb20a8a0a8cd140330f4cf5430a26ab1928e326a1b20b3ad5469eeacd625fb924bca (root) VirtualMap state / exhaust-version-turtle-table
node3 5.770s 2025-10-30 18:18:20.559 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node1 5.807s 2025-10-30 18:18:20.596 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.810s 2025-10-30 18:18:20.599 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 5.814s 2025-10-30 18:18:20.603 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IgqYQw==", "port": 30124 }, { "ipAddressV4": "CoAASQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iqq2gg==", "port": 30125 }, { "ipAddressV4": "CoAASg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjnI6w==", "port": 30126 }, { "ipAddressV4": "CoAANw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ioh/yQ==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ij1K6g==", "port": 30128 }, { "ipAddressV4": "CoAAQA==", "port": 30128 }] }] }
node1 5.818s 2025-10-30 18:18:20.607 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 5.829s 2025-10-30 18:18:20.618 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 5.831s 2025-10-30 18:18:20.620 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.836s 2025-10-30 18:18:20.625 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 5.837s 2025-10-30 18:18:20.626 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.849s 2025-10-30 18:18:20.638 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: ad8577967f5791194c1a2b78596fbb20a8a0a8cd140330f4cf5430a26ab1928e326a1b20b3ad5469eeacd625fb924bca (root) VirtualMap state / exhaust-version-turtle-table
node2 5.851s 2025-10-30 18:18:20.640 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node3 5.963s 2025-10-30 18:18:20.752 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 5.967s 2025-10-30 18:18:20.756 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 5.971s 2025-10-30 18:18:20.760 43 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 5.972s 2025-10-30 18:18:20.761 44 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 5.973s 2025-10-30 18:18:20.762 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 5.976s 2025-10-30 18:18:20.765 46 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 5.977s 2025-10-30 18:18:20.766 47 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 5.978s 2025-10-30 18:18:20.767 48 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 5.979s 2025-10-30 18:18:20.768 49 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 5.979s 2025-10-30 18:18:20.768 50 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 5.981s 2025-10-30 18:18:20.770 51 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 5.982s 2025-10-30 18:18:20.771 52 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 5.984s 2025-10-30 18:18:20.773 53 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 162.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 5.989s 2025-10-30 18:18:20.778 54 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.056s 2025-10-30 18:18:20.845 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.061s 2025-10-30 18:18:20.850 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.065s 2025-10-30 18:18:20.854 43 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.066s 2025-10-30 18:18:20.855 44 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.067s 2025-10-30 18:18:20.856 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 6.070s 2025-10-30 18:18:20.859 46 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 6.071s 2025-10-30 18:18:20.860 47 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.071s 2025-10-30 18:18:20.860 48 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.073s 2025-10-30 18:18:20.862 49 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.073s 2025-10-30 18:18:20.862 50 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.076s 2025-10-30 18:18:20.865 51 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.077s 2025-10-30 18:18:20.866 52 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.078s 2025-10-30 18:18:20.867 53 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 173.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.082s 2025-10-30 18:18:20.871 54 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 6.223s 2025-10-30 18:18:21.012 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.225s 2025-10-30 18:18:21.014 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 6.230s 2025-10-30 18:18:21.019 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 6.240s 2025-10-30 18:18:21.029 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.243s 2025-10-30 18:18:21.032 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.695s 2025-10-30 18:18:21.484 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26150412]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=239640, randomLong=5125986067874750299, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11020, randomLong=-6550366521027859996, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1282380, data=35, exception=null]
OS Health Check Report - Complete (took 1023 ms)
node4 6.725s 2025-10-30 18:18:21.514 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6.733s 2025-10-30 18:18:21.522 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6.734s 2025-10-30 18:18:21.523 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 6.822s 2025-10-30 18:18:21.611 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IgqYQw==", "port": 30124 }, { "ipAddressV4": "CoAASQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iqq2gg==", "port": 30125 }, { "ipAddressV4": "CoAASg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjnI6w==", "port": 30126 }, { "ipAddressV4": "CoAANw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ioh/yQ==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ij1K6g==", "port": 30128 }, { "ipAddressV4": "CoAAQA==", "port": 30128 }] }] }
node4 6.845s 2025-10-30 18:18:21.634 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6.846s 2025-10-30 18:18:21.635 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 6.858s 2025-10-30 18:18:21.647 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: ad8577967f5791194c1a2b78596fbb20a8a0a8cd140330f4cf5430a26ab1928e326a1b20b3ad5469eeacd625fb924bca (root) VirtualMap state / exhaust-version-turtle-table
node4 6.861s 2025-10-30 18:18:21.650 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node1 6.975s 2025-10-30 18:18:21.764 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26225425]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=233770, randomLong=-548744209332325951, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10149, randomLong=-3956229954303705472, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1543980, data=35, exception=null]
OS Health Check Report - Complete (took 1026 ms)
node1 7.006s 2025-10-30 18:18:21.795 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 7.015s 2025-10-30 18:18:21.804 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 7.017s 2025-10-30 18:18:21.806 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 7.067s 2025-10-30 18:18:21.856 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 7.072s 2025-10-30 18:18:21.861 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 7.077s 2025-10-30 18:18:21.866 43 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 7.078s 2025-10-30 18:18:21.867 44 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 7.079s 2025-10-30 18:18:21.868 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 7.082s 2025-10-30 18:18:21.871 46 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 7.083s 2025-10-30 18:18:21.872 47 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 7.084s 2025-10-30 18:18:21.873 48 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 7.086s 2025-10-30 18:18:21.875 49 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 7.086s 2025-10-30 18:18:21.875 50 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 7.088s 2025-10-30 18:18:21.877 51 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 7.089s 2025-10-30 18:18:21.878 52 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 7.091s 2025-10-30 18:18:21.880 53 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 175.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 7.097s 2025-10-30 18:18:21.886 54 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 7.105s 2025-10-30 18:18:21.894 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IgqYQw==", "port": 30124 }, { "ipAddressV4": "CoAASQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iqq2gg==", "port": 30125 }, { "ipAddressV4": "CoAASg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjnI6w==", "port": 30126 }, { "ipAddressV4": "CoAANw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ioh/yQ==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ij1K6g==", "port": 30128 }, { "ipAddressV4": "CoAAQA==", "port": 30128 }] }] }
node1 7.131s 2025-10-30 18:18:21.920 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 7.132s 2025-10-30 18:18:21.921 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 7.146s 2025-10-30 18:18:21.935 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: ad8577967f5791194c1a2b78596fbb20a8a0a8cd140330f4cf5430a26ab1928e326a1b20b3ad5469eeacd625fb924bca (root) VirtualMap state / exhaust-version-turtle-table
node1 7.150s 2025-10-30 18:18:21.939 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node0 7.365s 2025-10-30 18:18:22.154 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26309037] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=268920, randomLong=928222256313786810, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11920, randomLong=787595844836773452, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1351160, data=35, exception=null] OS Health Check Report - Complete (took 1024 ms)
node1 7.378s 2025-10-30 18:18:22.167 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 7.383s 2025-10-30 18:18:22.172 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 7.388s 2025-10-30 18:18:22.177 43 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 7.389s 2025-10-30 18:18:22.178 44 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 7.390s 2025-10-30 18:18:22.179 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 7.393s 2025-10-30 18:18:22.182 46 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 7.394s 2025-10-30 18:18:22.183 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 7.395s 2025-10-30 18:18:22.184 47 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 7.395s 2025-10-30 18:18:22.184 48 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 7.397s 2025-10-30 18:18:22.186 49 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 7.397s 2025-10-30 18:18:22.186 50 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 7.399s 2025-10-30 18:18:22.188 51 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 7.400s 2025-10-30 18:18:22.189 52 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node0 7.402s 2025-10-30 18:18:22.191 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 7.402s 2025-10-30 18:18:22.191 53 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 195.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 7.403s 2025-10-30 18:18:22.192 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 7.407s 2025-10-30 18:18:22.196 54 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 7.484s 2025-10-30 18:18:22.273 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IgqYQw==", "port": 30124 }, { "ipAddressV4": "CoAASQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iqq2gg==", "port": 30125 }, { "ipAddressV4": "CoAASg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjnI6w==", "port": 30126 }, { "ipAddressV4": "CoAANw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ioh/yQ==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ij1K6g==", "port": 30128 }, { "ipAddressV4": "CoAAQA==", "port": 30128 }] }] }
node0 7.507s 2025-10-30 18:18:22.296 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 7.508s 2025-10-30 18:18:22.297 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 7.520s 2025-10-30 18:18:22.309 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: ad8577967f5791194c1a2b78596fbb20a8a0a8cd140330f4cf5430a26ab1928e326a1b20b3ad5469eeacd625fb924bca (root) VirtualMap state / exhaust-version-turtle-table
node0 7.523s 2025-10-30 18:18:22.312 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node0 7.738s 2025-10-30 18:18:22.527 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 7.743s 2025-10-30 18:18:22.532 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 7.747s 2025-10-30 18:18:22.536 43 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 7.748s 2025-10-30 18:18:22.537 44 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 7.749s 2025-10-30 18:18:22.538 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 7.752s 2025-10-30 18:18:22.541 46 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 7.753s 2025-10-30 18:18:22.542 47 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 7.754s 2025-10-30 18:18:22.543 48 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 7.755s 2025-10-30 18:18:22.544 49 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 7.755s 2025-10-30 18:18:22.544 50 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 7.757s 2025-10-30 18:18:22.546 51 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 7.758s 2025-10-30 18:18:22.547 52 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 7.761s 2025-10-30 18:18:22.550 53 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 184.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 7.766s 2025-10-30 18:18:22.555 54 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 8.983s 2025-10-30 18:18:23.772 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 8.986s 2025-10-30 18:18:23.775 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 9.079s 2025-10-30 18:18:23.868 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.084s 2025-10-30 18:18:23.873 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 10.089s 2025-10-30 18:18:24.878 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 10.090s 2025-10-30 18:18:24.879 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 10.400s 2025-10-30 18:18:25.189 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 10.402s 2025-10-30 18:18:25.191 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 10.762s 2025-10-30 18:18:25.551 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 10.765s 2025-10-30 18:18:25.554 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 16.078s 2025-10-30 18:18:30.867 57 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.173s 2025-10-30 18:18:30.962 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.185s 2025-10-30 18:18:31.974 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 17.497s 2025-10-30 18:18:32.286 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 17.855s 2025-10-30 18:18:32.644 57 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 18.310s 2025-10-30 18:18:33.099 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node1 18.449s 2025-10-30 18:18:33.238 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node2 18.454s 2025-10-30 18:18:33.243 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node3 18.455s 2025-10-30 18:18:33.244 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node4 18.456s 2025-10-30 18:18:33.245 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node0 18.739s 2025-10-30 18:18:33.528 60 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 18.883s 2025-10-30 18:18:33.672 59 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 1.4 s in CHECKING. Now in ACTIVE
node1 18.886s 2025-10-30 18:18:33.675 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 18.887s 2025-10-30 18:18:33.676 59 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 2.7 s in CHECKING. Now in ACTIVE
node3 18.887s 2025-10-30 18:18:33.676 59 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 2.8 s in CHECKING. Now in ACTIVE
node2 18.889s 2025-10-30 18:18:33.678 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.890s 2025-10-30 18:18:33.679 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 18.891s 2025-10-30 18:18:33.680 59 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE
node4 18.893s 2025-10-30 18:18:33.682 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 19.081s 2025-10-30 18:18:33.870 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node3 19.083s 2025-10-30 18:18:33.872 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 19.088s 2025-10-30 18:18:33.877 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node4 19.090s 2025-10-30 18:18:33.879 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 19.100s 2025-10-30 18:18:33.889 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node0 19.102s 2025-10-30 18:18:33.891 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 19.121s 2025-10-30 18:18:33.910 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node2 19.123s 2025-10-30 18:18:33.912 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 19.158s 2025-10-30 18:18:33.947 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node1 19.160s 2025-10-30 18:18:33.949 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.319s 2025-10-30 18:18:34.108 107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.321s 2025-10-30 18:18:34.110 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-10-30T18:18:32.634048Z Next consensus number: 13 Legacy running event hash: 3a3609a7c6608c2fb01983096eef1b1a623c99a5f4b8702609a1e62cd08630ca92b948441eeb477ef5566034ab5adfba Legacy running event mnemonic: egg-crunch-cross-kick Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: fc6987c45916a4bb5473e5e8485c9f15a4382c7a52a4e61af75471b53f51c2aaca047727b9574987b09dc2e2106c0610 (root) VirtualMap state / indicate-always-carpet-hamster
node0 19.344s 2025-10-30 18:18:34.133 107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 19.347s 2025-10-30 18:18:34.136 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-10-30T18:18:32.634048Z Next consensus number: 13 Legacy running event hash: 3a3609a7c6608c2fb01983096eef1b1a623c99a5f4b8702609a1e62cd08630ca92b948441eeb477ef5566034ab5adfba Legacy running event mnemonic: egg-crunch-cross-kick Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: fc6987c45916a4bb5473e5e8485c9f15a4382c7a52a4e61af75471b53f51c2aaca047727b9574987b09dc2e2106c0610 (root) VirtualMap state / indicate-always-carpet-hamster
node2 19.353s 2025-10-30 18:18:34.142 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 19.356s 2025-10-30 18:18:34.145 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-10-30T18:18:32.634048Z Next consensus number: 13 Legacy running event hash: 3a3609a7c6608c2fb01983096eef1b1a623c99a5f4b8702609a1e62cd08630ca92b948441eeb477ef5566034ab5adfba Legacy running event mnemonic: egg-crunch-cross-kick Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: fc6987c45916a4bb5473e5e8485c9f15a4382c7a52a4e61af75471b53f51c2aaca047727b9574987b09dc2e2106c0610 (root) VirtualMap state / indicate-always-carpet-hamster
node4 19.357s 2025-10-30 18:18:34.146 107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 19.358s 2025-10-30 18:18:34.147 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 19.359s 2025-10-30 18:18:34.148 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 19.359s 2025-10-30 18:18:34.148 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.359s 2025-10-30 18:18:34.148 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-10-30T18:18:32.634048Z Next consensus number: 13 Legacy running event hash: 3a3609a7c6608c2fb01983096eef1b1a623c99a5f4b8702609a1e62cd08630ca92b948441eeb477ef5566034ab5adfba Legacy running event mnemonic: egg-crunch-cross-kick Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: fc6987c45916a4bb5473e5e8485c9f15a4382c7a52a4e61af75471b53f51c2aaca047727b9574987b09dc2e2106c0610 (root) VirtualMap state / indicate-always-carpet-hamster
node3 19.360s 2025-10-30 18:18:34.149 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 19.367s 2025-10-30 18:18:34.156 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.383s 2025-10-30 18:18:34.172 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 19.384s 2025-10-30 18:18:34.173 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 19.384s 2025-10-30 18:18:34.173 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 19.385s 2025-10-30 18:18:34.174 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 19.391s 2025-10-30 18:18:34.180 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 19.393s 2025-10-30 18:18:34.182 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 19.393s 2025-10-30 18:18:34.182 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 19.394s 2025-10-30 18:18:34.183 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 19.395s 2025-10-30 18:18:34.184 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.397s 2025-10-30 18:18:34.186 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 19.397s 2025-10-30 18:18:34.186 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 19.398s 2025-10-30 18:18:34.187 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.399s 2025-10-30 18:18:34.188 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 19.401s 2025-10-30 18:18:34.190 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 19.404s 2025-10-30 18:18:34.193 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 19.421s 2025-10-30 18:18:34.210 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 19.424s 2025-10-30 18:18:34.213 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2 Timestamp: 2025-10-30T18:18:32.634048Z Next consensus number: 13 Legacy running event hash: 3a3609a7c6608c2fb01983096eef1b1a623c99a5f4b8702609a1e62cd08630ca92b948441eeb477ef5566034ab5adfba Legacy running event mnemonic: egg-crunch-cross-kick Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -799502542 Root hash: fc6987c45916a4bb5473e5e8485c9f15a4382c7a52a4e61af75471b53f51c2aaca047727b9574987b09dc2e2106c0610 (root) VirtualMap state / indicate-always-carpet-hamster
node1 19.465s 2025-10-30 18:18:34.254 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 19.466s 2025-10-30 18:18:34.255 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 19.467s 2025-10-30 18:18:34.256 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 19.468s 2025-10-30 18:18:34.257 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 19.475s 2025-10-30 18:18:34.264 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 19.533s 2025-10-30 18:18:34.322 114 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE
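At this point all five nodes have walked the status sequence STARTING_UP -> REPLAYING_EVENTS -> OBSERVING -> CHECKING -> ACTIVE. When skimming a run like this, the transitions can be pulled out of the PLATFORM_STATUS lines with a small matcher; a sketch against the line layout shown in this export (the exact field order is an assumption of this export format, not of the platform):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Extracts (node, time spent, from-status, to-status) from StatusStateMachine log lines. */
public class StatusTransitionGrep {

    private static final Pattern TRANSITION = Pattern.compile(
            "^(node\\d+)\\b.*StatusStateMachine: Platform spent ([\\d.]+ m?s) in (\\w+)\\. Now in (\\w+)");

    public static void main(String[] args) {
        String sample = "node0 19.533s 2025-10-30 18:18:34.322 114 INFO PLATFORM_STATUS "
                + "<platformForkJoinThread-1> StatusStateMachine: Platform spent 1.7 s in CHECKING. Now in ACTIVE";
        Matcher m = TRANSITION.matcher(sample);
        if (m.find()) {
            // Prints: node0: CHECKING -> ACTIVE after 1.7 s
            System.out.printf("%s: %s -> %s after %s%n", m.group(1), m.group(3), m.group(4), m.group(2));
        }
    }
}

Applied to the five CHECKING -> ACTIVE lines above, this yields the per-node time spent in CHECKING, which ranges from 1.4 s (node1) to 2.8 s (node3).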
node0 46.472s 2025-10-30 18:19:01.261 723 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 46.480s 2025-10-30 18:19:01.269 731 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 46.554s 2025-10-30 18:19:01.343 732 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 46.594s 2025-10-30 18:19:01.383 724 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 46.621s 2025-10-30 18:19:01.410 728 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 61 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 46.677s 2025-10-30 18:19:01.466 727 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61
node1 46.678s 2025-10-30 18:19:01.467 728 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node3 46.712s 2025-10-30 18:19:01.501 735 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61
node3 46.713s 2025-10-30 18:19:01.502 736 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node2 46.725s 2025-10-30 18:19:01.514 734 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61
node2 46.726s 2025-10-30 18:19:01.515 735 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node1 46.767s 2025-10-30 18:19:01.556 759 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node1 46.769s 2025-10-30 18:19:01.558 760 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61 Timestamp: 2025-10-30T18:19:00.291468259Z Next consensus number: 2215 Legacy running event hash: 97d509513f5c5d776aea5de2205550942ca2285d2779e4434bf62aa0fc272488f2a07199270392720c93fee3b41c9ce2 Legacy running event mnemonic: air-clay-twist-puzzle Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1738029037 Root hash: 1e61a3a3eb7477db77a17956bc59c82edecb0a8b2e50f95d5d0783421cc8d661851a8699f516b5e27928c2ff335d717e (root) VirtualMap state / slow-merry-muscle-window
node1 46.778s 2025-10-30 18:19:01.567 761 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 46.779s 2025-10-30 18:19:01.568 762 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 33 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 46.779s 2025-10-30 18:19:01.568 763 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 46.781s 2025-10-30 18:19:01.570 764 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 46.782s 2025-10-30 18:19:01.571 765 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 46.784s 2025-10-30 18:19:01.573 731 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61
node4 46.785s 2025-10-30 18:19:01.574 732 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node3 46.787s 2025-10-30 18:19:01.576 775 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node3 46.789s 2025-10-30 18:19:01.578 776 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61 Timestamp: 2025-10-30T18:19:00.291468259Z Next consensus number: 2215 Legacy running event hash: 97d509513f5c5d776aea5de2205550942ca2285d2779e4434bf62aa0fc272488f2a07199270392720c93fee3b41c9ce2 Legacy running event mnemonic: air-clay-twist-puzzle Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1738029037 Root hash: 1e61a3a3eb7477db77a17956bc59c82edecb0a8b2e50f95d5d0783421cc8d661851a8699f516b5e27928c2ff335d717e (root) VirtualMap state / slow-merry-muscle-window
node3 46.798s 2025-10-30 18:19:01.587 777 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 46.799s 2025-10-30 18:19:01.588 778 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 33 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 46.799s 2025-10-30 18:19:01.588 779 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 46.801s 2025-10-30 18:19:01.590 780 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 46.801s 2025-10-30 18:19:01.590 781 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 46.805s 2025-10-30 18:19:01.594 766 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node2 46.808s 2025-10-30 18:19:01.597 767 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61 Timestamp: 2025-10-30T18:19:00.291468259Z Next consensus number: 2215 Legacy running event hash: 97d509513f5c5d776aea5de2205550942ca2285d2779e4434bf62aa0fc272488f2a07199270392720c93fee3b41c9ce2 Legacy running event mnemonic: air-clay-twist-puzzle Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1738029037 Root hash: 1e61a3a3eb7477db77a17956bc59c82edecb0a8b2e50f95d5d0783421cc8d661851a8699f516b5e27928c2ff335d717e (root) VirtualMap state / slow-merry-muscle-window
node2 46.817s 2025-10-30 18:19:01.606 768 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 46.818s 2025-10-30 18:19:01.607 769 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 33 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 46.819s 2025-10-30 18:19:01.608 770 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 46.820s 2025-10-30 18:19:01.609 771 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 46.821s 2025-10-30 18:19:01.610 772 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 46.867s 2025-10-30 18:19:01.656 763 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node4 46.869s 2025-10-30 18:19:01.658 764 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61 Timestamp: 2025-10-30T18:19:00.291468259Z Next consensus number: 2215 Legacy running event hash: 97d509513f5c5d776aea5de2205550942ca2285d2779e4434bf62aa0fc272488f2a07199270392720c93fee3b41c9ce2 Legacy running event mnemonic: air-clay-twist-puzzle Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1738029037 Root hash: 1e61a3a3eb7477db77a17956bc59c82edecb0a8b2e50f95d5d0783421cc8d661851a8699f516b5e27928c2ff335d717e (root) VirtualMap state / slow-merry-muscle-window
node4 46.877s 2025-10-30 18:19:01.666 765 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 46.878s 2025-10-30 18:19:01.667 766 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 33 File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 46.879s 2025-10-30 18:19:01.668 767 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 46.880s 2025-10-30 18:19:01.669 768 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 46.881s 2025-10-30 18:19:01.670 769 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 46.894s 2025-10-30 18:19:01.683 726 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 61 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61
node0 46.895s 2025-10-30 18:19:01.684 727 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node0 46.980s 2025-10-30 18:19:01.769 769 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 61
node0 46.982s 2025-10-30 18:19:01.771 770 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 61 Timestamp: 2025-10-30T18:19:00.291468259Z Next consensus number: 2215 Legacy running event hash: 97d509513f5c5d776aea5de2205550942ca2285d2779e4434bf62aa0fc272488f2a07199270392720c93fee3b41c9ce2 Legacy running event mnemonic: air-clay-twist-puzzle Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1738029037 Root hash: 1e61a3a3eb7477db77a17956bc59c82edecb0a8b2e50f95d5d0783421cc8d661851a8699f516b5e27928c2ff335d717e (root) VirtualMap state / slow-merry-muscle-window
node0 46.990s 2025-10-30 18:19:01.779 771 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 46.990s 2025-10-30 18:19:01.779 772 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 33 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 46.991s 2025-10-30 18:19:01.780 773 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 46.993s 2025-10-30 18:19:01.782 774 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 46.993s 2025-10-30 18:19:01.782 775 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 61 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61 {"round":61,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
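As the directories in these STATE_TO_DISK lines show, each snapshot lands under data/saved/<app main class>/<node id>/<the "123" segment>/<round> (rounds 2 and now 61 for every node). A quick way to enumerate which rounds a node has on disk, assuming that layout and using node0's path from the log as an example:

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

/** Lists saved-state round directories for one node, assuming the layout seen in the log above. */
public class SavedStateRounds {

    public static void main(String[] args) throws IOException {
        Path nodeDir = Paths.get(
                "/opt/hgcapp/services-hedera/HapiApp2.0/data/saved/"
                + "com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123");
        try (DirectoryStream<Path> rounds = Files.newDirectoryStream(nodeDir)) {
            for (Path round : rounds) {
                System.out.println("saved round: " + round.getFileName()); // e.g. 2, 61
            }
        }
    }
}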
node3 1m 46.315s 2025-10-30 18:20:01.104 2205 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 46.329s 2025-10-30 18:20:01.118 2190 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 46.350s 2025-10-30 18:20:01.139 2184 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 46.401s 2025-10-30 18:20:01.190 2175 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 46.441s 2025-10-30 18:20:01.230 2177 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 190 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 46.500s 2025-10-30 18:20:01.289 2187 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190
node0 1m 46.501s 2025-10-30 18:20:01.290 2188 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node1 1m 46.553s 2025-10-30 18:20:01.342 2178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190
node1 1m 46.554s 2025-10-30 18:20:01.343 2179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node3 1m 46.565s 2025-10-30 18:20:01.354 2208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190
node3 1m 46.566s 2025-10-30 18:20:01.355 2209 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node2 1m 46.571s 2025-10-30 18:20:01.360 2193 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190
node2 1m 46.571s 2025-10-30 18:20:01.360 2194 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node0 1m 46.584s 2025-10-30 18:20:01.373 2227 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node0 1m 46.586s 2025-10-30 18:20:01.375 2228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190 Timestamp: 2025-10-30T18:20:00.168450Z Next consensus number: 6995 Legacy running event hash: 72e40e88e3f9053fbd8d334e6b521ef5d2e39270972ff22377a4e9d5cfbeb31c59d318e5e661a460dc2ac3b528da3995 Legacy running event mnemonic: frame-speed-liberty-pony Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: -1804764669 Root hash: 96ba70e9dec06f277f288fef734ccf7cc4d7cc856d6c2092df9e5bff8b4969997b5f03fc903cb5db88b31a0b916e2d71 (root) VirtualMap state / churn-brush-chimney-top
node0 1m 46.596s 2025-10-30 18:20:01.385 2229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 46.596s 2025-10-30 18:20:01.385 2230 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 46.596s 2025-10-30 18:20:01.385 2231 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 46.601s 2025-10-30 18:20:01.390 2232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 46.602s 2025-10-30 18:20:01.391 2233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 46.611s 2025-10-30 18:20:01.400 2180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 190 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190
node4 1m 46.612s 2025-10-30 18:20:01.401 2181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node1 1m 46.638s 2025-10-30 18:20:01.427 2218 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node1 1m 46.640s 2025-10-30 18:20:01.429 2219 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-10-30T18:20:00.168450Z
Next consensus number: 6995
Legacy running event hash: 72e40e88e3f9053fbd8d334e6b521ef5d2e39270972ff22377a4e9d5cfbeb31c59d318e5e661a460dc2ac3b528da3995
Legacy running event mnemonic: frame-speed-liberty-pony
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804764669
Root hash: 96ba70e9dec06f277f288fef734ccf7cc4d7cc856d6c2092df9e5bff8b4969997b5f03fc903cb5db88b31a0b916e2d71
(root) VirtualMap state / churn-brush-chimney-top
node1 1m 46.647s 2025-10-30 18:20:01.436 2220 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 46.647s 2025-10-30 18:20:01.436 2221 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 46.648s 2025-10-30 18:20:01.437 2222 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 46.652s 2025-10-30 18:20:01.441 2223 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 46.653s 2025-10-30 18:20:01.442 2224 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 46.657s 2025-10-30 18:20:01.446 2248 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node3 1m 46.659s 2025-10-30 18:20:01.448 2249 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-10-30T18:20:00.168450Z
Next consensus number: 6995
Legacy running event hash: 72e40e88e3f9053fbd8d334e6b521ef5d2e39270972ff22377a4e9d5cfbeb31c59d318e5e661a460dc2ac3b528da3995
Legacy running event mnemonic: frame-speed-liberty-pony
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804764669
Root hash: 96ba70e9dec06f277f288fef734ccf7cc4d7cc856d6c2092df9e5bff8b4969997b5f03fc903cb5db88b31a0b916e2d71
(root) VirtualMap state / churn-brush-chimney-top
node2 1m 46.662s 2025-10-30 18:20:01.451 2225 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node2 1m 46.665s 2025-10-30 18:20:01.454 2226 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-10-30T18:20:00.168450Z
Next consensus number: 6995
Legacy running event hash: 72e40e88e3f9053fbd8d334e6b521ef5d2e39270972ff22377a4e9d5cfbeb31c59d318e5e661a460dc2ac3b528da3995
Legacy running event mnemonic: frame-speed-liberty-pony
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804764669
Root hash: 96ba70e9dec06f277f288fef734ccf7cc4d7cc856d6c2092df9e5bff8b4969997b5f03fc903cb5db88b31a0b916e2d71
(root) VirtualMap state / churn-brush-chimney-top
node3 1m 46.667s 2025-10-30 18:20:01.456 2250 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 46.667s 2025-10-30 18:20:01.456 2251 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 46.667s 2025-10-30 18:20:01.456 2252 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 46.672s 2025-10-30 18:20:01.461 2253 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 46.672s 2025-10-30 18:20:01.461 2254 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 46.674s 2025-10-30 18:20:01.463 2235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 46.674s 2025-10-30 18:20:01.463 2236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 46.674s 2025-10-30 18:20:01.463 2237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 46.679s 2025-10-30 18:20:01.468 2238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 46.680s 2025-10-30 18:20:01.469 2239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 46.693s 2025-10-30 18:20:01.482 2220 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 190
node4 1m 46.695s 2025-10-30 18:20:01.484 2221 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 190
Timestamp: 2025-10-30T18:20:00.168450Z
Next consensus number: 6995
Legacy running event hash: 72e40e88e3f9053fbd8d334e6b521ef5d2e39270972ff22377a4e9d5cfbeb31c59d318e5e661a460dc2ac3b528da3995
Legacy running event mnemonic: frame-speed-liberty-pony
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1804764669
Root hash: 96ba70e9dec06f277f288fef734ccf7cc4d7cc856d6c2092df9e5bff8b4969997b5f03fc903cb5db88b31a0b916e2d71
(root) VirtualMap state / churn-brush-chimney-top
node4 1m 46.702s 2025-10-30 18:20:01.491 2222 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 46.702s 2025-10-30 18:20:01.491 2223 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 163 File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 46.703s 2025-10-30 18:20:01.492 2224 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 46.708s 2025-10-30 18:20:01.497 2225 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 46.708s 2025-10-30 18:20:01.497 2226 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 190 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190 {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
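The round-190 saves above follow the same sequence on every node: DefaultSavedStateController queues the signed state, SignedStateFileWriter starts the write, MerkleTreeSnapshotWriter produces the merkle snapshot, BestEffortPcesFileCopy copies the matching preconsensus event file, and the writer finishes with a StateSavedToDiskPayload JSON fragment. A minimal sketch for pulling the round, reason, and directory out of those payload lines; it uses only the JDK, the payload shape is copied from the entries above, and the log file path passed in is an assumption:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StateSavedPayloadScan {
    // Shape of the JSON fragment on "Finished writing state ..." lines, as seen above:
    // {"round":190,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///..."}
    private static final Pattern PAYLOAD = Pattern.compile(
            "\\{\"round\":(\\d+),\"freezeState\":(true|false),\"reason\":\"([A-Z_]+)\",\"directory\":\"([^\"]+)\"\\}");

    public static void main(String[] args) throws Exception {
        // Hypothetical path to a combined log export like this one.
        java.nio.file.Path log = java.nio.file.Path.of(args.length > 0 ? args[0] : "swirlds.log");
        for (String line : java.nio.file.Files.readAllLines(log)) {
            Matcher m = PAYLOAD.matcher(line);
            if (m.find()) {
                System.out.printf("round=%s reason=%s dir=%s%n", m.group(1), m.group(3), m.group(4));
            }
        }
    }
}

For the excerpt above this would report round 190 with reason PERIODIC_SNAPSHOT once per node, each pointing at its own .../ConsistencyTestingToolMain/<node id>/123/190 directory.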
node1 2m 46.884s 2025-10-30 18:21:01.673 3577 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 318 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 46.999s 2025-10-30 18:21:01.788 3638 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 318 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 47.035s 2025-10-30 18:21:01.824 3631 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 318 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 47.053s 2025-10-30 18:21:01.842 3599 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 318 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 47.085s 2025-10-30 18:21:01.874 3612 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 318 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 47.170s 2025-10-30 18:21:01.959 3618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 318 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/318
node0 2m 47.171s 2025-10-30 18:21:01.960 3619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node2 2m 47.191s 2025-10-30 18:21:01.980 3644 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 318 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/318
node2 2m 47.192s 2025-10-30 18:21:01.981 3645 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node3 2m 47.240s 2025-10-30 18:21:02.029 3637 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 318 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/318
node3 2m 47.241s 2025-10-30 18:21:02.030 3638 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node0 2m 47.260s 2025-10-30 18:21:02.049 3652 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node0 2m 47.261s 2025-10-30 18:21:02.050 3653 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 318
Timestamp: 2025-10-30T18:21:00.354475Z
Next consensus number: 11831
Legacy running event hash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
Legacy running event mnemonic: live-wood-sword-timber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1623232148
Root hash: c86862d70bf57fcb718a519474a6ac363199aa3cfcabb4c2aaa726e952f833079931af3140733ac2397195b47738f5b7
(root) VirtualMap state / vessel-foot-leaf-west
node0 2m 47.270s 2025-10-30 18:21:02.059 3654 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 47.270s 2025-10-30 18:21:02.059 3655 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 291 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 47.270s 2025-10-30 18:21:02.059 3656 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 47.276s 2025-10-30 18:21:02.065 3688 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node2 2m 47.278s 2025-10-30 18:21:02.067 3689 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 318
Timestamp: 2025-10-30T18:21:00.354475Z
Next consensus number: 11831
Legacy running event hash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
Legacy running event mnemonic: live-wood-sword-timber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1623232148
Root hash: c86862d70bf57fcb718a519474a6ac363199aa3cfcabb4c2aaa726e952f833079931af3140733ac2397195b47738f5b7
(root) VirtualMap state / vessel-foot-leaf-west
node0 2m 47.279s 2025-10-30 18:21:02.068 3657 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 47.279s 2025-10-30 18:21:02.068 3658 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 318 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/318 {"round":318,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/318/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2m 47.286s 2025-10-30 18:21:02.075 3690 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 47.286s 2025-10-30 18:21:02.075 3691 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 291 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 47.286s 2025-10-30 18:21:02.075 3692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 47.287s 2025-10-30 18:21:02.076 3605 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 318 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/318
node4 2m 47.288s 2025-10-30 18:21:02.077 3606 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node2 2m 47.295s 2025-10-30 18:21:02.084 3693 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 47.295s 2025-10-30 18:21:02.084 3694 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 318 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/318 {"round":318,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/318/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 47.301s 2025-10-30 18:21:02.090 3583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 318 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/318
node1 2m 47.302s 2025-10-30 18:21:02.091 3584 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node3 2m 47.322s 2025-10-30 18:21:02.111 3671 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node3 2m 47.324s 2025-10-30 18:21:02.113 3672 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 318
Timestamp: 2025-10-30T18:21:00.354475Z
Next consensus number: 11831
Legacy running event hash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
Legacy running event mnemonic: live-wood-sword-timber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1623232148
Root hash: c86862d70bf57fcb718a519474a6ac363199aa3cfcabb4c2aaa726e952f833079931af3140733ac2397195b47738f5b7
(root) VirtualMap state / vessel-foot-leaf-west
node3 2m 47.333s 2025-10-30 18:21:02.122 3673 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 47.333s 2025-10-30 18:21:02.122 3674 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 291 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 47.334s 2025-10-30 18:21:02.123 3675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 47.343s 2025-10-30 18:21:02.132 3676 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 47.343s 2025-10-30 18:21:02.132 3677 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 318 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/318 {"round":318,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/318/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 47.369s 2025-10-30 18:21:02.158 3639 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node4 2m 47.371s 2025-10-30 18:21:02.160 3640 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 318
Timestamp: 2025-10-30T18:21:00.354475Z
Next consensus number: 11831
Legacy running event hash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
Legacy running event mnemonic: live-wood-sword-timber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1623232148
Root hash: c86862d70bf57fcb718a519474a6ac363199aa3cfcabb4c2aaa726e952f833079931af3140733ac2397195b47738f5b7
(root) VirtualMap state / vessel-foot-leaf-west
node4 2m 47.378s 2025-10-30 18:21:02.167 3641 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 47.378s 2025-10-30 18:21:02.167 3642 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 291 File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 47.378s 2025-10-30 18:21:02.167 3643 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 47.386s 2025-10-30 18:21:02.175 3644 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 47.387s 2025-10-30 18:21:02.176 3645 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 318 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/318 {"round":318,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/318/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 47.388s 2025-10-30 18:21:02.177 3617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 318
node1 2m 47.390s 2025-10-30 18:21:02.179 3618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 318
Timestamp: 2025-10-30T18:21:00.354475Z
Next consensus number: 11831
Legacy running event hash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
Legacy running event mnemonic: live-wood-sword-timber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1623232148
Root hash: c86862d70bf57fcb718a519474a6ac363199aa3cfcabb4c2aaa726e952f833079931af3140733ac2397195b47738f5b7
(root) VirtualMap state / vessel-foot-leaf-west
node1 2m 47.397s 2025-10-30 18:21:02.186 3629 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 47.397s 2025-10-30 18:21:02.186 3630 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 291 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 47.397s 2025-10-30 18:21:02.186 3631 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 47.406s 2025-10-30 18:21:02.195 3632 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 47.406s 2025-10-30 18:21:02.195 3633 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 318 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/318 {"round":318,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/318/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
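The PERIODIC_SNAPSHOT cycle repeats roughly once per minute in this run: round 190 is queued at 18:20:01, round 318 at 18:21:01, and round 449 at 18:22:00 further down. A small sketch that measures the gap between successive "Signed state from round ... created" entries for one node; JDK only, the timestamp column format is taken from these lines, and restricting to node0 is an arbitrary choice so per-node offsets do not skew the gaps:

import java.time.Duration;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SnapshotCadence {
    // Wall-clock column as it appears in the log, e.g. 2025-10-30 18:21:01.874
    private static final DateTimeFormatter TS = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS");
    private static final Pattern MARK = Pattern.compile(
            "(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{3}).*Signed state from round (\\d+) created");

    public static void main(String[] args) throws Exception {
        List<LocalDateTime> marks = new ArrayList<>();
        for (String line : java.nio.file.Files.readAllLines(java.nio.file.Path.of(args[0]))) {
            if (!line.startsWith("node0 ")) {
                continue; // one node is enough to measure the cadence
            }
            Matcher m = MARK.matcher(line);
            if (m.find()) {
                marks.add(LocalDateTime.parse(m.group(1), TS));
            }
        }
        for (int i = 1; i < marks.size(); i++) {
            System.out.println("gap: " + Duration.between(marks.get(i - 1), marks.get(i)).toSeconds() + "s");
        }
    }
}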
node1 3m 13.965s 2025-10-30 18:21:28.754 4234 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:21:28.751325766Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node0 3m 13.966s 2025-10-30 18:21:28.755 4283 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:21:28.751100690Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 3m 13.966s 2025-10-30 18:21:28.755 4282 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:21:28.751953517Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node2 3m 13.967s 2025-10-30 18:21:28.756 4305 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:21:28.751672191Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
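All four SOCKET_EXCEPTIONS entries above describe the same event from different peers: the RPC sync connection to node 4 was reset mid-read (1 -> 4, 0 -> 4, 3 -> 4, 2 -> 4), which points at node 4 rather than at a general network problem. A quick way to confirm that pattern over a longer log; JDK only, with the "Connection broken: X -> Y" wording taken from the warnings above:

import java.util.Map;
import java.util.TreeMap;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BrokenConnectionTally {
    private static final Pattern BROKEN = Pattern.compile("Connection broken: (\\d+) -> (\\d+)");

    public static void main(String[] args) throws Exception {
        Map<String, Integer> byTarget = new TreeMap<>();
        for (String line : java.nio.file.Files.readAllLines(java.nio.file.Path.of(args[0]))) {
            Matcher m = BROKEN.matcher(line);
            if (m.find()) {
                byTarget.merge(m.group(2), 1, Integer::sum);
            }
        }
        // For the excerpt above this prints a single entry: peer 4 with a count of 4.
        byTarget.forEach((node, count) ->
                System.out.println("peer " + node + ": " + count + " broken connection(s)"));
    }
}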
node0 3m 46.140s 2025-10-30 18:22:00.929 5172 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 449 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 46.164s 2025-10-30 18:22:00.953 5135 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 449 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 46.178s 2025-10-30 18:22:00.967 5136 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 449 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 46.203s 2025-10-30 18:22:00.992 5111 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 449 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 46.420s 2025-10-30 18:22:01.209 5139 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 449 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/449
node2 3m 46.421s 2025-10-30 18:22:01.210 5140 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node3 3m 46.494s 2025-10-30 18:22:01.283 5148 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 449 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/449
node3 3m 46.494s 2025-10-30 18:22:01.283 5149 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node2 3m 46.501s 2025-10-30 18:22:01.290 5179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node2 3m 46.502s 2025-10-30 18:22:01.291 5180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 449
Timestamp: 2025-10-30T18:22:00.053683Z
Next consensus number: 15819
Legacy running event hash: 490475142fdd6843ac330adb755c223ffdbd99daede18d09cf262bd7608f6b507b7c0ee1b6f269b313bc3a31a8d4d7e5
Legacy running event mnemonic: calm-penalty-wing-anchor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 640126699
Root hash: 2db85a2c874be81c21d882e781432aecc543ecfc6e819d6c851c27bca3020a7dc451d2914f88acf682846ba7ca136c90
(root) VirtualMap state / require-easily-birth-wish
node0 3m 46.508s 2025-10-30 18:22:01.297 5185 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 449 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/449
node0 3m 46.509s 2025-10-30 18:22:01.298 5186 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node2 3m 46.510s 2025-10-30 18:22:01.299 5181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 46.510s 2025-10-30 18:22:01.299 5182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 422 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 46.510s 2025-10-30 18:22:01.299 5183 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 46.521s 2025-10-30 18:22:01.310 5184 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 46.521s 2025-10-30 18:22:01.310 5185 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 449 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/449 {"round":449,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/449/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 46.564s 2025-10-30 18:22:01.353 5114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 449 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/449
node1 3m 46.565s 2025-10-30 18:22:01.354 5115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node3 3m 46.571s 2025-10-30 18:22:01.360 5180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node3 3m 46.572s 2025-10-30 18:22:01.361 5181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 449
Timestamp: 2025-10-30T18:22:00.053683Z
Next consensus number: 15819
Legacy running event hash: 490475142fdd6843ac330adb755c223ffdbd99daede18d09cf262bd7608f6b507b7c0ee1b6f269b313bc3a31a8d4d7e5
Legacy running event mnemonic: calm-penalty-wing-anchor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 640126699
Root hash: 2db85a2c874be81c21d882e781432aecc543ecfc6e819d6c851c27bca3020a7dc451d2914f88acf682846ba7ca136c90
(root) VirtualMap state / require-easily-birth-wish
node3 3m 46.579s 2025-10-30 18:22:01.368 5190 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 46.579s 2025-10-30 18:22:01.368 5191 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 422 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 46.579s 2025-10-30 18:22:01.368 5192 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 46.590s 2025-10-30 18:22:01.379 5228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node3 3m 46.590s 2025-10-30 18:22:01.379 5193 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 46.591s 2025-10-30 18:22:01.380 5194 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 449 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/449 {"round":449,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/449/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 46.592s 2025-10-30 18:22:01.381 5229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 449
Timestamp: 2025-10-30T18:22:00.053683Z
Next consensus number: 15819
Legacy running event hash: 490475142fdd6843ac330adb755c223ffdbd99daede18d09cf262bd7608f6b507b7c0ee1b6f269b313bc3a31a8d4d7e5
Legacy running event mnemonic: calm-penalty-wing-anchor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 640126699
Root hash: 2db85a2c874be81c21d882e781432aecc543ecfc6e819d6c851c27bca3020a7dc451d2914f88acf682846ba7ca136c90
(root) VirtualMap state / require-easily-birth-wish
node0 3m 46.598s 2025-10-30 18:22:01.387 5230 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 46.598s 2025-10-30 18:22:01.387 5231 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 422 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 46.598s 2025-10-30 18:22:01.387 5232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 46.609s 2025-10-30 18:22:01.398 5233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 46.609s 2025-10-30 18:22:01.398 5234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 449 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/449 {"round":449,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/449/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 46.656s 2025-10-30 18:22:01.445 5157 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 449
node1 3m 46.658s 2025-10-30 18:22:01.447 5158 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 449
Timestamp: 2025-10-30T18:22:00.053683Z
Next consensus number: 15819
Legacy running event hash: 490475142fdd6843ac330adb755c223ffdbd99daede18d09cf262bd7608f6b507b7c0ee1b6f269b313bc3a31a8d4d7e5
Legacy running event mnemonic: calm-penalty-wing-anchor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 640126699
Root hash: 2db85a2c874be81c21d882e781432aecc543ecfc6e819d6c851c27bca3020a7dc451d2914f88acf682846ba7ca136c90
(root) VirtualMap state / require-easily-birth-wish
node1 3m 46.667s 2025-10-30 18:22:01.456 5159 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 46.667s 2025-10-30 18:22:01.456 5160 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 422 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 46.667s 2025-10-30 18:22:01.456 5161 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 46.678s 2025-10-30 18:22:01.467 5162 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 46.679s 2025-10-30 18:22:01.468 5163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 449 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/449 {"round":449,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/449/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
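
The round-449 block above is the complete periodic-snapshot sequence, repeated on every running node: the controller marks the signed state for saving, SignedStateFileWriter and MerkleTreeSnapshotWriter write it under .../ConsistencyTestingToolMain/<nodeId>/123/449, BestEffortPcesFileCopy copies the relevant PCES files, and each node logs the same root hash for the round. A minimal, hypothetical cross-check for a merged log like this one is sketched below; the merged.log file name and the regex are assumptions for illustration, not part of the platform.

# Hypothetical sketch: parse the StateSavedToDiskPayload JSON that SignedStateFileWriter
# appends to its "Finished writing state" lines and report which nodes completed each round.
import json
import re
from collections import defaultdict

SAVED = re.compile(
    r"^(node\d+)\b.*?(\{.*\})\s*\[com\.swirlds\.logging\.legacy\.payload\.StateSavedToDiskPayload\]"
)

def rounds_saved(path="merged.log"):
    """Map snapshot round -> set of node IDs that logged a finished write."""
    by_round = defaultdict(set)
    with open(path, encoding="utf-8") as log:
        for line in log:
            match = SAVED.search(line)
            if match:
                node, payload = match.group(1), json.loads(match.group(2))
                by_round[payload["round"]].add(node)
    return dict(by_round)

if __name__ == "__main__":
    for rnd, nodes in sorted(rounds_saved().items()):
        print(f"round {rnd}: written by {sorted(nodes)}")

For round 449 in this log the sketch would report node0 through node3 only, since node4 is not running at that point.
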
node2 4m 46.318s 2025-10-30 18:23:01.107 6753 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 588 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 46.331s 2025-10-30 18:23:01.120 6827 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 588 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 46.394s 2025-10-30 18:23:01.183 6744 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 588 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 46.394s 2025-10-30 18:23:01.183 6746 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 588 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 46.539s 2025-10-30 18:23:01.328 6747 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 588 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/588
node1 4m 46.539s 2025-10-30 18:23:01.328 6748 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node0 4m 46.543s 2025-10-30 18:23:01.332 6830 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 588 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/588
node0 4m 46.544s 2025-10-30 18:23:01.333 6831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node3 4m 46.553s 2025-10-30 18:23:01.342 6749 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 588 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/588
node3 4m 46.554s 2025-10-30 18:23:01.343 6750 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node2 4m 46.610s 2025-10-30 18:23:01.399 6756 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 588 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/588
node2 4m 46.611s 2025-10-30 18:23:01.400 6757 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node1 4m 46.622s 2025-10-30 18:23:01.411 6787 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node0 4m 46.624s 2025-10-30 18:23:01.413 6870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node1 4m 46.624s 2025-10-30 18:23:01.413 6788 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 588
Timestamp: 2025-10-30T18:23:00.218449Z
Next consensus number: 19130
Legacy running event hash: 1b88de64fdfbf3c837f1fc495aabc25bbb1503a7debe31eeaf1c094577ade249da0540875aad7882c250a99d7713db40
Legacy running event mnemonic: illegal-crime-doll-mandate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1843081586
Root hash: 82617e6df9810dd6a7ddff8f22f725d123f5b22ebe18d96c33512fac2bcadc0eb1dd6336412846ce30976529d0962919
(root) VirtualMap state / recycle-soap-soup-river
node0 4m 46.626s 2025-10-30 18:23:01.415 6871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 588
Timestamp: 2025-10-30T18:23:00.218449Z
Next consensus number: 19130
Legacy running event hash: 1b88de64fdfbf3c837f1fc495aabc25bbb1503a7debe31eeaf1c094577ade249da0540875aad7882c250a99d7713db40
Legacy running event mnemonic: illegal-crime-doll-mandate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1843081586
Root hash: 82617e6df9810dd6a7ddff8f22f725d123f5b22ebe18d96c33512fac2bcadc0eb1dd6336412846ce30976529d0962919
(root) VirtualMap state / recycle-soap-soup-river
node3 4m 46.627s 2025-10-30 18:23:01.416 6789 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node3 4m 46.629s 2025-10-30 18:23:01.418 6790 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 588
Timestamp: 2025-10-30T18:23:00.218449Z
Next consensus number: 19130
Legacy running event hash: 1b88de64fdfbf3c837f1fc495aabc25bbb1503a7debe31eeaf1c094577ade249da0540875aad7882c250a99d7713db40
Legacy running event mnemonic: illegal-crime-doll-mandate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1843081586
Root hash: 82617e6df9810dd6a7ddff8f22f725d123f5b22ebe18d96c33512fac2bcadc0eb1dd6336412846ce30976529d0962919
(root) VirtualMap state / recycle-soap-soup-river
node1 4m 46.631s 2025-10-30 18:23:01.420 6789 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 46.631s 2025-10-30 18:23:01.420 6790 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 561 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 46.632s 2025-10-30 18:23:01.421 6791 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 46.633s 2025-10-30 18:23:01.422 6872 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 46.633s 2025-10-30 18:23:01.422 6792 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 46.634s 2025-10-30 18:23:01.423 6873 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 561 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 4m 46.634s 2025-10-30 18:23:01.423 6874 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 46.634s 2025-10-30 18:23:01.423 6793 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 588 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/588 {"round":588,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/588/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 46.635s 2025-10-30 18:23:01.424 6875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 46.635s 2025-10-30 18:23:01.424 6794 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node0 4m 46.636s 2025-10-30 18:23:01.425 6876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 588 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/588 {"round":588,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/588/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 46.636s 2025-10-30 18:23:01.425 6791 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 46.636s 2025-10-30 18:23:01.425 6792 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 561 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces
node3 4m 46.636s 2025-10-30 18:23:01.425 6793 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 46.637s 2025-10-30 18:23:01.426 6877 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node3 4m 46.638s 2025-10-30 18:23:01.427 6794 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 46.638s 2025-10-30 18:23:01.427 6795 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 588 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/588 {"round":588,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/588/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 46.640s 2025-10-30 18:23:01.429 6796 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node2 4m 46.692s 2025-10-30 18:23:01.481 6796 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 588
node2 4m 46.694s 2025-10-30 18:23:01.483 6797 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 588
Timestamp: 2025-10-30T18:23:00.218449Z
Next consensus number: 19130
Legacy running event hash: 1b88de64fdfbf3c837f1fc495aabc25bbb1503a7debe31eeaf1c094577ade249da0540875aad7882c250a99d7713db40
Legacy running event mnemonic: illegal-crime-doll-mandate
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1843081586
Root hash: 82617e6df9810dd6a7ddff8f22f725d123f5b22ebe18d96c33512fac2bcadc0eb1dd6336412846ce30976529d0962919
(root) VirtualMap state / recycle-soap-soup-river
node2 4m 46.701s 2025-10-30 18:23:01.490 6798 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 46.702s 2025-10-30 18:23:01.491 6799 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 561 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 46.702s 2025-10-30 18:23:01.491 6800 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 46.703s 2025-10-30 18:23:01.492 6801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 46.704s 2025-10-30 18:23:01.493 6802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 588 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/588 {"round":588,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/588/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 46.705s 2025-10-30 18:23:01.494 6803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
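
Starting with the round-588 snapshot, each finished write is followed by a FileUtils deletion of the oldest saved-state directory (round 2 here, round 61 after the next snapshot), so the on-disk history stays bounded. Below is an illustrative pruning sketch under the assumption that retention simply keeps the newest N round directories; N is not stated anywhere in this log and the helper name is hypothetical.

# Illustrative sketch only: prune old round directories under
# .../ConsistencyTestingToolMain/<nodeId>/123/, keeping the newest `keep` of them.
import shutil
from pathlib import Path

def prune_saved_states(state_dir: Path, keep: int = 4) -> list[Path]:
    """Delete all but the newest `keep` round directories; return what was removed."""
    rounds = sorted(
        (p for p in state_dir.iterdir() if p.is_dir() and p.name.isdigit()),
        key=lambda p: int(p.name),
    )
    removed = rounds[:-keep] if keep else rounds
    for old in removed:
        shutil.rmtree(old)  # mirrors the "FileUtils: deleting directory ..." lines above
    return removed

With the directories visible in this log, each call would remove only the oldest round directory (2, then 61), matching the deletion lines above.
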
node2 5m 46.209s 2025-10-30 18:24:00.998 8368 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 726 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 46.210s 2025-10-30 18:24:00.999 8432 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 726 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 46.224s 2025-10-30 18:24:01.013 8345 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 726 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 46.293s 2025-10-30 18:24:01.082 8309 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 726 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 46.367s 2025-10-30 18:24:01.156 8312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 726 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/726
node1 5m 46.368s 2025-10-30 18:24:01.157 8313 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node2 5m 46.409s 2025-10-30 18:24:01.198 8371 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 726 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/726
node2 5m 46.410s 2025-10-30 18:24:01.199 8372 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node1 5m 46.448s 2025-10-30 18:24:01.237 8352 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node1 5m 46.450s 2025-10-30 18:24:01.239 8353 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 726
Timestamp: 2025-10-30T18:24:00.143027330Z
Next consensus number: 22433
Legacy running event hash: 8b6fa27bce0a135c4aacda71147a335344cf5687a426f3171c1f2b5e0b2460b7d687392f48e4661173af42359055f201
Legacy running event mnemonic: mutual-radar-disagree-high
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1952579943
Root hash: 0b9b0e535bfcbf2ac40d7e0f400623cc402d15f2f73ef1242a2fee173a49f8c9652aaf2419262753f283680cda5ab79a
(root) VirtualMap state / inflict-skill-legend-bracket
node1 5m 46.458s 2025-10-30 18:24:01.247 8354 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 46.459s 2025-10-30 18:24:01.248 8355 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 699 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 46.459s 2025-10-30 18:24:01.248 8356 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 46.464s 2025-10-30 18:24:01.253 8357 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 46.465s 2025-10-30 18:24:01.254 8358 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 726 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/726 {"round":726,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/726/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 46.467s 2025-10-30 18:24:01.256 8359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/61
node2 5m 46.490s 2025-10-30 18:24:01.279 8411 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node2 5m 46.493s 2025-10-30 18:24:01.282 8412 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 726
Timestamp: 2025-10-30T18:24:00.143027330Z
Next consensus number: 22433
Legacy running event hash: 8b6fa27bce0a135c4aacda71147a335344cf5687a426f3171c1f2b5e0b2460b7d687392f48e4661173af42359055f201
Legacy running event mnemonic: mutual-radar-disagree-high
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1952579943
Root hash: 0b9b0e535bfcbf2ac40d7e0f400623cc402d15f2f73ef1242a2fee173a49f8c9652aaf2419262753f283680cda5ab79a
(root) VirtualMap state / inflict-skill-legend-bracket
node3 5m 46.493s 2025-10-30 18:24:01.282 8348 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 726 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/726
node3 5m 46.494s 2025-10-30 18:24:01.283 8349 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node2 5m 46.500s 2025-10-30 18:24:01.289 8413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 46.501s 2025-10-30 18:24:01.290 8414 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 699 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 46.501s 2025-10-30 18:24:01.290 8415 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 46.505s 2025-10-30 18:24:01.294 8416 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 46.505s 2025-10-30 18:24:01.294 8417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 726 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/726 {"round":726,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/726/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 46.507s 2025-10-30 18:24:01.296 8418 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/61
node3 5m 46.565s 2025-10-30 18:24:01.354 8380 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node3 5m 46.567s 2025-10-30 18:24:01.356 8381 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 726
Timestamp: 2025-10-30T18:24:00.143027330Z
Next consensus number: 22433
Legacy running event hash: 8b6fa27bce0a135c4aacda71147a335344cf5687a426f3171c1f2b5e0b2460b7d687392f48e4661173af42359055f201
Legacy running event mnemonic: mutual-radar-disagree-high
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1952579943
Root hash: 0b9b0e535bfcbf2ac40d7e0f400623cc402d15f2f73ef1242a2fee173a49f8c9652aaf2419262753f283680cda5ab79a
(root) VirtualMap state / inflict-skill-legend-bracket
node0 5m 46.571s 2025-10-30 18:24:01.360 8435 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 726 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/726
node0 5m 46.572s 2025-10-30 18:24:01.361 8436 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node3 5m 46.573s 2025-10-30 18:24:01.362 8382 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 46.573s 2025-10-30 18:24:01.362 8383 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 699 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 46.573s 2025-10-30 18:24:01.362 8384 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 46.577s 2025-10-30 18:24:01.366 8385 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 46.578s 2025-10-30 18:24:01.367 8386 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 726 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/726 {"round":726,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/726/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 46.579s 2025-10-30 18:24:01.368 8395 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/61
node0 5m 46.646s 2025-10-30 18:24:01.435 8470 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 726
node0 5m 46.647s 2025-10-30 18:24:01.436 8471 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 726
Timestamp: 2025-10-30T18:24:00.143027330Z
Next consensus number: 22433
Legacy running event hash: 8b6fa27bce0a135c4aacda71147a335344cf5687a426f3171c1f2b5e0b2460b7d687392f48e4661173af42359055f201
Legacy running event mnemonic: mutual-radar-disagree-high
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1952579943
Root hash: 0b9b0e535bfcbf2ac40d7e0f400623cc402d15f2f73ef1242a2fee173a49f8c9652aaf2419262753f283680cda5ab79a
(root) VirtualMap state / inflict-skill-legend-bracket
node0 5m 46.654s 2025-10-30 18:24:01.443 8472 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 46.654s 2025-10-30 18:24:01.443 8473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 699 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 46.654s 2025-10-30 18:24:01.443 8474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 46.658s 2025-10-30 18:24:01.447 8475 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 46.659s 2025-10-30 18:24:01.448 8476 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 726 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/726 {"round":726,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/726/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 46.660s 2025-10-30 18:24:01.449 8477 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/61
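
The BestEffortPcesFileCopy lines above show the selection narrowing as the lower bound advances: with lower bound 422 the single seq0 file (minr1, maxr501) is copied, but with lower bound 561 and 699 only the seq1 file (minr474, maxr5474) qualifies. The sketch below illustrates that selection under the assumption that the filename fields ..._seq<S>_minr<A>_maxr<B>_orgn<O>.pces encode an inclusive bound range [A, B] and that a file is worth copying only when B is at least the snapshot's lower bound; this is a reading of the log, not the platform's documented rule.

# Hypothetical sketch: pick PCES files whose encoded upper bound (maxr) could still
# contain events at or above the snapshot's lower bound.
import re
from pathlib import Path

NAME = re.compile(r"seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$")

def files_to_copy(pces_dir: Path, lower_bound: int) -> list[Path]:
    """Return PCES files whose maxr field is at least the given lower bound."""
    keep = []
    for path in sorted(pces_dir.rglob("*.pces")):
        match = NAME.search(path.name)
        if match and int(match.group("maxr")) >= lower_bound:
            keep.append(path)
    return keep

For the round-726 snapshot above, lower_bound=699 keeps only the seq1 file (maxr 5474) and skips the seq0 file (maxr 501), matching the "Found 1 preconsensus event file meeting specified criteria to copy" lines.
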
node4 5m 55.213s 2025-10-30 18:24:10.002 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 55.301s 2025-10-30 18:24:10.090 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 55.317s 2025-10-30 18:24:10.106 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 55.425s 2025-10-30 18:24:10.214 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 55.454s 2025-10-30 18:24:10.243 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 56.792s 2025-10-30 18:24:11.581 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1337ms
node4 5m 56.801s 2025-10-30 18:24:11.590 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 56.804s 2025-10-30 18:24:11.593 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 56.853s 2025-10-30 18:24:11.642 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 56.925s 2025-10-30 18:24:11.714 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 56.926s 2025-10-30 18:24:11.715 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 59.004s 2025-10-30 18:24:13.793 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 59.097s 2025-10-30 18:24:13.886 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.104s 2025-10-30 18:24:13.893 16 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/318/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/190/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/SignedState.swh
node4 5m 59.104s 2025-10-30 18:24:13.893 17 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 59.104s 2025-10-30 18:24:13.893 18 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/318/SignedState.swh
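
After listing the four saved states above, StartupStateUtils loads the one with the highest round (318). A minimal sketch of that "latest state" selection is shown below; it assumes only the directory layout visible in this log (.../ConsistencyTestingToolMain/<nodeId>/<swirldId>/<round>/SignedState.swh), and the helper name is hypothetical rather than the platform's actual API.

# Hypothetical sketch: pick the saved state with the highest round number.
from pathlib import Path
from typing import Optional

def latest_signed_state(saved_dir: Path) -> Optional[Path]:
    """Return the SignedState.swh under the highest-numbered round directory, or None."""
    candidates = sorted(
        saved_dir.glob("*/SignedState.swh"),
        key=lambda p: int(p.parent.name),  # the parent directory name is the round
    )
    return candidates[-1] if candidates else None

For node4's directory above, this returns .../4/123/318/SignedState.swh, matching the "Loading signed state from disk" line.
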
node4 5m 59.113s 2025-10-30 18:24:13.902 19 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 59.231s 2025-10-30 18:24:14.020 29 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 59.991s 2025-10-30 18:24:14.780 31 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 59.996s 2025-10-30 18:24:14.785 32 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":318,"consensusTimestamp":"2025-10-30T18:21:00.354475Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 59.999s 2025-10-30 18:24:14.788 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.000m 2025-10-30 18:24:14.789 37 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 6.000m 2025-10-30 18:24:14.793 39 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 6.000m 2025-10-30 18:24:14.800 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.000m 2025-10-30 18:24:14.803 41 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6m 1.099s 2025-10-30 18:24:15.888 42 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26159493]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=288280, randomLong=-2160913475866417600, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=23910, randomLong=9078952552007680953, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1232549, data=35, exception=null]
OS Health Check Report - Complete (took 1026 ms)
node4 6m 1.132s 2025-10-30 18:24:15.921 43 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6m 1.261s 2025-10-30 18:24:16.050 44 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 374
node4 6m 1.264s 2025-10-30 18:24:16.053 45 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6m 1.266s 2025-10-30 18:24:16.055 46 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 6m 1.354s 2025-10-30 18:24:16.143 47 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IgqYQw==", "port": 30124 }, { "ipAddressV4": "CoAASQ==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iqq2gg==", "port": 30125 }, { "ipAddressV4": "CoAASg==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "IjnI6w==", "port": 30126 }, { "ipAddressV4": "CoAANw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "Ioh/yQ==", "port": 30127 }, { "ipAddressV4": "CoAAQQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "Ij1K6g==", "port": 30128 }, { "ipAddressV4": "CoAAQA==", "port": 30128 }] }] }
node4 6m 1.379s 2025-10-30 18:24:16.168 48 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long -7732178496350251092.
node4 6m 1.380s 2025-10-30 18:24:16.169 49 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 317 rounds handled.
node4 6m 1.381s 2025-10-30 18:24:16.170 50 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 1.381s 2025-10-30 18:24:16.170 51 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 1.423s 2025-10-30 18:24:16.212 52 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 318
Timestamp: 2025-10-30T18:21:00.354475Z
Next consensus number: 11831
Legacy running event hash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
Legacy running event mnemonic: live-wood-sword-timber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1623232148
Root hash: c86862d70bf57fcb718a519474a6ac363199aa3cfcabb4c2aaa726e952f833079931af3140733ac2397195b47738f5b7
(root) VirtualMap state / vessel-foot-leaf-west
node4 6m 1.429s 2025-10-30 18:24:16.218 54 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node4 6m 1.639s 2025-10-30 18:24:16.428 55 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 0c1590171ee2a0ef3ecbea957ca511818b899bb45cb1ef215601b1c5dc2bc178c1ac8b577f38b0e5b3b3dc48b5d33345
node4 6m 1.648s 2025-10-30 18:24:16.437 56 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 291
node4 6m 1.654s 2025-10-30 18:24:16.443 58 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6m 1.655s 2025-10-30 18:24:16.444 59 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6m 1.656s 2025-10-30 18:24:16.445 60 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6m 1.659s 2025-10-30 18:24:16.448 61 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6m 1.660s 2025-10-30 18:24:16.449 62 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6m 1.661s 2025-10-30 18:24:16.450 63 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6m 1.663s 2025-10-30 18:24:16.452 64 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 291
node4 6m 1.669s 2025-10-30 18:24:16.458 65 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 180.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6m 1.950s 2025-10-30 18:24:16.739 66 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:5fb24bb6a562 BR:316), num remaining: 3
node4 6m 1.952s 2025-10-30 18:24:16.741 67 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:219e34b6e1ca BR:316), num remaining: 2
node4 6m 1.952s 2025-10-30 18:24:16.741 68 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:dc13d2987c3e BR:316), num remaining: 1
node4 6m 1.954s 2025-10-30 18:24:16.743 69 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:1ddd20ad3bfe BR:316), num remaining: 0
node4 6m 2.404s 2025-10-30 18:24:17.193 459 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 3,243 preconsensus events with max birth round 374. These events contained 4,421 transactions. 55 rounds reached consensus spanning 26.5 seconds of consensus time. The latest round to reach consensus is round 373. Replay took 740.0 milliseconds.
node4 6m 2.409s 2025-10-30 18:24:17.198 460 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6m 2.411s 2025-10-30 18:24:17.200 461 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 740.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6m 3.252s 2025-10-30 18:24:18.041 574 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Preparing for reconnect, stopping gossip
node4 6m 3.252s 2025-10-30 18:24:18.041 575 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=373,ancientThreshold=346,expiredThreshold=291] remote ev=EventWindow[latestConsensusRound=765,ancientThreshold=738,expiredThreshold=664]
node4 6m 3.252s 2025-10-30 18:24:18.041 576 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=373,ancientThreshold=346,expiredThreshold=291] remote ev=EventWindow[latestConsensusRound=765,ancientThreshold=738,expiredThreshold=664]
node4 6m 3.252s 2025-10-30 18:24:18.041 577 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=373,ancientThreshold=346,expiredThreshold=291] remote ev=EventWindow[latestConsensusRound=765,ancientThreshold=738,expiredThreshold=664]
node4 6m 3.253s 2025-10-30 18:24:18.042 578 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 841.0 ms in OBSERVING. Now in BEHIND
node4 6m 3.253s 2025-10-30 18:24:18.042 579 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Preparing for reconnect, start clearing queues
node0 6m 3.323s 2025-10-30 18:24:18.112 8948 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=765,ancientThreshold=738,expiredThreshold=664] remote ev=EventWindow[latestConsensusRound=373,ancientThreshold=346,expiredThreshold=291]
node1 6m 3.323s 2025-10-30 18:24:18.112 8779 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=765,ancientThreshold=738,expiredThreshold=664] remote ev=EventWindow[latestConsensusRound=373,ancientThreshold=346,expiredThreshold=291]
node2 6m 3.323s 2025-10-30 18:24:18.112 8844 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=765,ancientThreshold=738,expiredThreshold=664] remote ev=EventWindow[latestConsensusRound=373,ancientThreshold=346,expiredThreshold=291]
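
The SELF_FALLEN_BEHIND / OTHER_FALLEN_BEHIND pair above is driven by the EventWindow values each side reports: node4's newest consensus round (373) is already below its peers' expired threshold (664), so normal gossip can no longer catch it up and a reconnect is required. The sketch below is a simplified, illustrative reading of that comparison using only the numbers printed in the log; it is not the platform's exact decision rule.

# Illustrative comparison of the EventWindow values logged above (simplified reading,
# not the platform's exact fallen-behind rule).
from dataclasses import dataclass

@dataclass(frozen=True)
class EventWindow:
    latest_consensus_round: int
    ancient_threshold: int
    expired_threshold: int

def looks_fallen_behind(local: EventWindow, remote: EventWindow) -> bool:
    """True when everything local still holds is already expired from the remote's view."""
    return local.latest_consensus_round < remote.expired_threshold

local = EventWindow(373, 346, 291)   # node4, from the SELF_FALLEN_BEHIND lines
remote = EventWindow(765, 738, 664)  # nodes 0/1/2, from the OTHER_FALLEN_BEHIND lines
assert looks_fallen_behind(local, remote)
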
node4 6m 3.405s 2025-10-30 18:24:18.194 580 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Queues have been cleared
node4 6m 3.406s 2025-10-30 18:24:18.195 581 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Waiting for a state to be obtained from a peer
node0 6m 3.499s 2025-10-30 18:24:18.288 8949 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectStateTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":0,"otherNodeId":4,"round":765} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node0 6m 3.500s 2025-10-30 18:24:18.289 8950 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectStateTeacher: The following state will be sent to the learner:
Round: 765
Timestamp: 2025-10-30T18:24:16.959650323Z
Next consensus number: 23364
Legacy running event hash: f40e304f3fe59b67239eca091c34be63383b0397c34607f39dd7efe722ad309e871b98bc0a5d553dbb14243a24282124
Legacy running event mnemonic: layer-antique-witness-nasty
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 555498558
Root hash: e239455d93e2b0e5e8138845af4f9693a11697a1e40903a27ef9a21c4c99bd341c19455201143b1e6b0e8b0071ed5e8c
(root) VirtualMap state / rib-produce-long-toddler
node0 6m 3.500s 2025-10-30 18:24:18.289 8951 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectStateTeacher: Sending signatures from nodes 0, 2, 3 (signing weight = 37500000000/50000000000) for state hash e239455d93e2b0e5e8138845af4f9693a11697a1e40903a27ef9a21c4c99bd341c19455201143b1e6b0e8b0071ed5e8c
node0 6m 3.500s 2025-10-30 18:24:18.289 8952 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectStateTeacher: Starting synchronization in the role of the sender.
node4 6m 3.568s 2025-10-30 18:24:18.357 582 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> ReconnectStatePeerProtocol: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":0,"round":373} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6m 3.569s 2025-10-30 18:24:18.358 583 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> ReconnectStateLearner: Receiving signed state signatures
node4 6m 3.571s 2025-10-30 18:24:18.360 584 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> ReconnectStateLearner: Received signatures from nodes 0, 2, 3
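The teacher above sends signatures from nodes 0, 2, and 3 with a signing weight of 37500000000 out of 50000000000. A hedged sketch of the kind of weight check implied by that line, assuming for illustration that a state is acceptable once the collected weight is a strict majority of the total; the class and method names are made up, and the platform's actual completeness rule may differ:

    // Hypothetical sketch; the real rule for a "complete" signed state may differ.
    final class SignatureWeightCheck {
        // Strict majority, written to avoid overflow: 37_500_000_000 of 50_000_000_000 passes (3/4 of total weight).
        static boolean isMajority(long signedWeight, long totalWeight) {
            return signedWeight > totalWeight - signedWeight;
        }
    }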
node0 6m 3.620s 2025-10-30 18:24:18.409 8981 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node0 6m 3.628s 2025-10-30 18:24:18.417 8982 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3de34a38 start run()
node4 6m 3.780s 2025-10-30 18:24:18.569 611 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: learner calls receiveTree()
node4 6m 3.781s 2025-10-30 18:24:18.570 612 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: synchronizing tree
node4 6m 3.781s 2025-10-30 18:24:18.570 613 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node4 6m 3.788s 2025-10-30 18:24:18.577 614 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@4339498a start run()
node4 6m 3.845s 2025-10-30 18:24:18.634 615 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 4 -> 4, lastLeafPath: 8 -> 8
node4 6m 3.846s 2025-10-30 18:24:18.635 616 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6m 4.000s 2025-10-30 18:24:18.789 617 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 4.000s 2025-10-30 18:24:18.789 618 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 4.001s 2025-10-30 18:24:18.790 619 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 4.001s 2025-10-30 18:24:18.790 620 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 4.001s 2025-10-30 18:24:18.790 621 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 4.002s 2025-10-30 18:24:18.791 622 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 4.002s 2025-10-30 18:24:18.791 623 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node4 6m 4.023s 2025-10-30 18:24:18.812 633 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 4.024s 2025-10-30 18:24:18.813 635 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 4.024s 2025-10-30 18:24:18.813 636 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 4.024s 2025-10-30 18:24:18.813 637 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 4.025s 2025-10-30 18:24:18.814 638 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@4339498a finish run()
node4 6m 4.025s 2025-10-30 18:24:18.814 639 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node4 6m 4.026s 2025-10-30 18:24:18.815 640 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: synchronization complete
node4 6m 4.026s 2025-10-30 18:24:18.815 641 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: learner calls initialize()
node4 6m 4.027s 2025-10-30 18:24:18.816 642 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: initializing tree
node4 6m 4.027s 2025-10-30 18:24:18.816 643 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: initialization complete
node4 6m 4.027s 2025-10-30 18:24:18.816 644 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: learner calls hash()
node4 6m 4.027s 2025-10-30 18:24:18.816 645 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: hashing tree
node4 6m 4.027s 2025-10-30 18:24:18.816 646 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: hashing complete
node4 6m 4.027s 2025-10-30 18:24:18.816 647 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: learner calls logStatistics()
node4 6m 4.030s 2025-10-30 18:24:18.819 648 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.245,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":9,"leafNodes":5,"redundantLeafNodes":2,"internalNodes":4,"redundantInternalNodes":0} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 4.030s 2025-10-30 18:24:18.819 649 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=9; transfersFromLearner=8; internalHashes=3; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=5; leafCleanHashes=2; leafData=5; leafCleanData=2
node4 6m 4.030s 2025-10-30 18:24:18.819 650 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> LearningSynchronizer: learner is done synchronizing
node4 6m 4.031s 2025-10-30 18:24:18.820 651 INFO STARTUP <<platform-core: SyncProtocolWith0 4 to 0>> ConsistencyTestingToolState: New State Constructed.
node4 6m 4.036s 2025-10-30 18:24:18.825 652 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> ReconnectStateLearner: Reconnect data usage report {"dataMegabytes":0.005864143371582031} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
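Several of the entries above append a small flat JSON payload (ReconnectStartPayload, SynchronizationCompletePayload, ReconnectDataUsagePayload). A minimal regex-based sketch for pulling a numeric field such as timeInSeconds or dataMegabytes out of such a line, assuming the payload is the single flat {...} object shown in the log; this is not the platform's payload classes:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    final class PayloadFieldExtractor {
        // Returns the numeric value of "field" from a log line like the ones above, or null if absent.
        static Double extract(String logLine, String field) {
            Pattern p = Pattern.compile(
                    "\"" + Pattern.quote(field) + "\"\\s*:\\s*([-+]?[0-9]*\\.?[0-9]+(?:[eE][-+]?[0-9]+)?)");
            Matcher m = p.matcher(logLine);
            return m.find() ? Double.valueOf(m.group(1)) : null;
        }
    }

For the SynchronizationCompletePayload entry above, extract(line, "timeInSeconds") yields 0.245; for the data usage report, extract(line, "dataMegabytes") yields roughly 0.00586.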
node0 6m 4.060s 2025-10-30 18:24:18.849 8986 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3de34a38 finish run()
node0 6m 4.061s 2025-10-30 18:24:18.850 8987 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> TeachingSynchronizer: finished sending tree
node0 6m 4.063s 2025-10-30 18:24:18.852 8990 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectStateTeacher: Finished synchronization in the role of the sender.
node0 6m 4.108s 2025-10-30 18:24:18.897 8991 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectStateTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":0,"otherNodeId":4,"round":765} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 4.136s 2025-10-30 18:24:18.925 653 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> ReconnectStatePeerProtocol: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":0,"round":765} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 4.137s 2025-10-30 18:24:18.926 654 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> ReconnectStatePeerProtocol: Information for state received during reconnect:
Round: 765 Timestamp: 2025-10-30T18:24:16.959650323Z Next consensus number: 23364 Legacy running event hash: f40e304f3fe59b67239eca091c34be63383b0397c34607f39dd7efe722ad309e871b98bc0a5d553dbb14243a24282124 Legacy running event mnemonic: layer-antique-witness-nasty Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 555498558 Root hash: e239455d93e2b0e5e8138845af4f9693a11697a1e40903a27ef9a21c4c99bd341c19455201143b1e6b0e8b0071ed5e8c (root) VirtualMap state / rib-produce-long-toddler
node4 6m 4.137s 2025-10-30 18:24:18.926 655 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: A state was obtained from a peer
node4 6m 4.139s 2025-10-30 18:24:18.928 656 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: The state obtained from a peer was validated
node4 6m 4.139s 2025-10-30 18:24:18.928 658 DEBUG RECONNECT <<platform-core: reconnectController>> ReconnectController: `loadState` : reloading state
node4 6m 4.140s 2025-10-30 18:24:18.929 659 INFO STARTUP <<platform-core: reconnectController>> ConsistencyTestingToolState: State initialized with state long 7058921390453627106.
node4 6m 4.140s 2025-10-30 18:24:18.929 660 INFO STARTUP <<platform-core: reconnectController>> ConsistencyTestingToolState: State initialized with 764 rounds handled.
node4 6m 4.140s 2025-10-30 18:24:18.929 661 INFO STARTUP <<platform-core: reconnectController>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 4.140s 2025-10-30 18:24:18.929 662 INFO STARTUP <<platform-core: reconnectController>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 4.156s 2025-10-30 18:24:18.945 667 INFO STATE_TO_DISK <<platform-core: reconnectController>> DefaultSavedStateController: Signed state from round 765 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 4.156s 2025-10-30 18:24:18.945 668 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 902.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 4.157s 2025-10-30 18:24:18.946 669 INFO STARTUP <platformForkJoinThread-3> Shadowgraph: Shadowgraph starting from expiration threshold 738
node4 6m 4.159s 2025-10-30 18:24:18.948 672 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 765 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/765
node4 6m 4.161s 2025-10-30 18:24:18.950 673 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 765
node4 6m 4.166s 2025-10-30 18:24:18.955 677 INFO EVENT_STREAM <<platform-core: reconnectController>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: f40e304f3fe59b67239eca091c34be63383b0397c34607f39dd7efe722ad309e871b98bc0a5d553dbb14243a24282124
node4 6m 4.173s 2025-10-30 18:24:18.962 682 INFO STARTUP <platformForkJoinThread-3> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr374_orgn0.pces. All future files will have an origin round of 765.
node4 6m 4.174s 2025-10-30 18:24:18.963 685 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Reconnect almost done, resuming gossip
node4 6m 4.312s 2025-10-30 18:24:19.101 711 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 765
node4 6m 4.315s 2025-10-30 18:24:19.104 712 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 765 Timestamp: 2025-10-30T18:24:16.959650323Z Next consensus number: 23364 Legacy running event hash: f40e304f3fe59b67239eca091c34be63383b0397c34607f39dd7efe722ad309e871b98bc0a5d553dbb14243a24282124 Legacy running event mnemonic: layer-antique-witness-nasty Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 555498558 Root hash: e239455d93e2b0e5e8138845af4f9693a11697a1e40903a27ef9a21c4c99bd341c19455201143b1e6b0e8b0071ed5e8c (root) VirtualMap state / rib-produce-long-toddler
node4 6m 4.350s 2025-10-30 18:24:19.139 713 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr374_orgn0.pces
node4 6m 4.350s 2025-10-30 18:24:19.139 714 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 738
node4 6m 4.356s 2025-10-30 18:24:19.145 715 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 765 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/765 {"round":765,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/765/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 4.359s 2025-10-30 18:24:19.148 716 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 202.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 4.668s 2025-10-30 18:24:19.457 717 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 4.671s 2025-10-30 18:24:19.460 718 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 5.314s 2025-10-30 18:24:20.103 719 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:0d17374c58e9 BR:763), num remaining: 3
node4 6m 5.315s 2025-10-30 18:24:20.104 720 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:f5a7144cb617 BR:763), num remaining: 2
node4 6m 5.315s 2025-10-30 18:24:20.104 721 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:4e1bb002d3f0 BR:763), num remaining: 1
node4 6m 5.315s 2025-10-30 18:24:20.104 722 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:44f0ce1c773c BR:763), num remaining: 0
node4 6m 8.916s 2025-10-30 18:24:23.705 852 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 4.6 s in CHECKING. Now in ACTIVE
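The PLATFORM_STATUS lines above trace node4 through OBSERVING, BEHIND, RECONNECT_COMPLETE, CHECKING, and finally ACTIVE, each reporting the time spent in the previous status. A small sketch for tallying that time per status from lines of this exact shape ("Platform spent <n> ms|s in <A>. Now in <B>"); the class name and approach are illustrative only:

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    final class StatusTimeTally {
        private static final Pattern P =
                Pattern.compile("Platform spent ([0-9.]+) (ms|s) in (\\w+)\\. Now in (\\w+)");

        // Sums reported milliseconds per status across all matching log lines.
        static Map<String, Double> tally(Iterable<String> logLines) {
            Map<String, Double> millisByStatus = new LinkedHashMap<>();
            for (String line : logLines) {
                Matcher m = P.matcher(line);
                if (m.find()) {
                    double value = Double.parseDouble(m.group(1));
                    double millis = "s".equals(m.group(2)) ? value * 1000.0 : value;
                    millisByStatus.merge(m.group(3), millis, Double::sum);
                }
            }
            return millisByStatus;
        }
    }

Feeding it node4's lines above yields roughly 841 ms for OBSERVING, 902 ms for BEHIND, 202 ms for RECONNECT_COMPLETE, and 4.6 s for CHECKING.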
node3 6m 46.315s 2025-10-30 18:25:01.104 9860 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 860 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 46.338s 2025-10-30 18:25:01.127 9886 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 860 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 46.427s 2025-10-30 18:25:01.216 1750 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 860 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 46.478s 2025-10-30 18:25:01.267 10051 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 860 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 46.505s 2025-10-30 18:25:01.294 9833 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 860 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 46.540s 2025-10-30 18:25:01.329 10054 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 860 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/860
node0 6m 46.541s 2025-10-30 18:25:01.330 10055 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 860
node1 6m 46.573s 2025-10-30 18:25:01.362 9836 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 860 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/860
node1 6m 46.573s 2025-10-30 18:25:01.362 9837 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 860
node0 6m 46.617s 2025-10-30 18:25:01.406 10094 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 860
node0 6m 46.619s 2025-10-30 18:25:01.408 10095 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 860 Timestamp: 2025-10-30T18:25:00.151879Z Next consensus number: 26678 Legacy running event hash: 54ab14c83838b6594aea3c449174389e770d00783538e0b814647cf9fe93ce177ed7e82a1bf0ae526bf6d994e276c76d Legacy running event mnemonic: nice-october-alpha-crunch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1490175464 Root hash: 7c40c73d53d4d5cbc5e06d58986d30d733a5ed4191a6d920743ad6e4dbdb79e4a76dbcfcf629a3392e53f430a9b5885e (root) VirtualMap state / marine-blur-kind-effort
node0 6m 46.626s 2025-10-30 18:25:01.415 10096 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 46.626s 2025-10-30 18:25:01.415 10097 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 832 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 46.626s 2025-10-30 18:25:01.415 10098 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 46.633s 2025-10-30 18:25:01.422 10099 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 46.634s 2025-10-30 18:25:01.423 10100 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 860 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/860 {"round":860,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/860/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 46.635s 2025-10-30 18:25:01.424 10101 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/190
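The BestEffortPcesFileCopy lines above (and the earlier "No preconsensus event files meeting specified criteria" warning on node4) suggest the copy criterion can be read off the file name: a .pces file is copied when the maxr in its name is at or above the reported lower bound. A sketch of that filter under that assumption; the class, method, and regex are not the platform's actual selection code:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    final class PcesFileFilter {
        private static final Pattern ROUNDS = Pattern.compile("_minr(\\d+)_maxr(\\d+)_");

        // A file is worth copying if its maximum round is at or above the lower bound.
        static boolean shouldCopy(String fileName, long lowerBound) {
            Matcher m = ROUNDS.matcher(fileName);
            if (!m.find()) {
                return false; // not a recognizable .pces file name
            }
            long maxRound = Long.parseLong(m.group(2));
            return maxRound >= lowerBound;
        }
    }

For the round-860 snapshot on node0 this keeps the seq1 file (maxr 5474 >= 832) and skips the seq0 file, while node4's earlier reconnect snapshot skipped its only file because 374 < 738, consistent with the warning logged there.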
node4 6m 46.645s 2025-10-30 18:25:01.434 1753 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 860 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/860
node4 6m 46.646s 2025-10-30 18:25:01.435 1754 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 860
node1 6m 46.655s 2025-10-30 18:25:01.444 9876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 860
node1 6m 46.657s 2025-10-30 18:25:01.446 9877 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 860 Timestamp: 2025-10-30T18:25:00.151879Z Next consensus number: 26678 Legacy running event hash: 54ab14c83838b6594aea3c449174389e770d00783538e0b814647cf9fe93ce177ed7e82a1bf0ae526bf6d994e276c76d Legacy running event mnemonic: nice-october-alpha-crunch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1490175464 Root hash: 7c40c73d53d4d5cbc5e06d58986d30d733a5ed4191a6d920743ad6e4dbdb79e4a76dbcfcf629a3392e53f430a9b5885e (root) VirtualMap state / marine-blur-kind-effort
node1 6m 46.664s 2025-10-30 18:25:01.453 9878 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 46.667s 2025-10-30 18:25:01.456 9879 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 832 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 46.667s 2025-10-30 18:25:01.456 9880 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 46.674s 2025-10-30 18:25:01.463 9881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 46.674s 2025-10-30 18:25:01.463 9882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 860 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/860 {"round":860,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/860/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 46.676s 2025-10-30 18:25:01.465 9883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/190
node3 6m 46.695s 2025-10-30 18:25:01.484 9873 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 860 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/860
node3 6m 46.695s 2025-10-30 18:25:01.484 9874 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 860
node2 6m 46.709s 2025-10-30 18:25:01.498 9899 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 860 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/860
node2 6m 46.709s 2025-10-30 18:25:01.498 9900 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 860
node4 6m 46.754s 2025-10-30 18:25:01.543 1799 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 860
node4 6m 46.757s 2025-10-30 18:25:01.546 1800 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 860 Timestamp: 2025-10-30T18:25:00.151879Z Next consensus number: 26678 Legacy running event hash: 54ab14c83838b6594aea3c449174389e770d00783538e0b814647cf9fe93ce177ed7e82a1bf0ae526bf6d994e276c76d Legacy running event mnemonic: nice-october-alpha-crunch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1490175464 Root hash: 7c40c73d53d4d5cbc5e06d58986d30d733a5ed4191a6d920743ad6e4dbdb79e4a76dbcfcf629a3392e53f430a9b5885e (root) VirtualMap state / marine-blur-kind-effort
node4 6m 46.766s 2025-10-30 18:25:01.555 1801 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr374_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+24+19.541949330Z_seq1_minr738_maxr1238_orgn765.pces
node4 6m 46.766s 2025-10-30 18:25:01.555 1802 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 832 File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+24+19.541949330Z_seq1_minr738_maxr1238_orgn765.pces
node4 6m 46.766s 2025-10-30 18:25:01.555 1803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 46.770s 2025-10-30 18:25:01.559 1804 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 46.771s 2025-10-30 18:25:01.560 1805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 860 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/860 {"round":860,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/860/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 46.772s 2025-10-30 18:25:01.561 1806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node3 6m 46.783s 2025-10-30 18:25:01.572 9913 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 860
node3 6m 46.785s 2025-10-30 18:25:01.574 9914 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 860 Timestamp: 2025-10-30T18:25:00.151879Z Next consensus number: 26678 Legacy running event hash: 54ab14c83838b6594aea3c449174389e770d00783538e0b814647cf9fe93ce177ed7e82a1bf0ae526bf6d994e276c76d Legacy running event mnemonic: nice-october-alpha-crunch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1490175464 Root hash: 7c40c73d53d4d5cbc5e06d58986d30d733a5ed4191a6d920743ad6e4dbdb79e4a76dbcfcf629a3392e53f430a9b5885e (root) VirtualMap state / marine-blur-kind-effort
node2 6m 46.788s 2025-10-30 18:25:01.577 9931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 860
node2 6m 46.790s 2025-10-30 18:25:01.579 9932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 860 Timestamp: 2025-10-30T18:25:00.151879Z Next consensus number: 26678 Legacy running event hash: 54ab14c83838b6594aea3c449174389e770d00783538e0b814647cf9fe93ce177ed7e82a1bf0ae526bf6d994e276c76d Legacy running event mnemonic: nice-october-alpha-crunch Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1490175464 Root hash: 7c40c73d53d4d5cbc5e06d58986d30d733a5ed4191a6d920743ad6e4dbdb79e4a76dbcfcf629a3392e53f430a9b5885e (root) VirtualMap state / marine-blur-kind-effort
node3 6m 46.792s 2025-10-30 18:25:01.581 9915 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 46.793s 2025-10-30 18:25:01.582 9916 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 832 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 46.795s 2025-10-30 18:25:01.584 9917 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 46.797s 2025-10-30 18:25:01.586 9933 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 46.800s 2025-10-30 18:25:01.589 9934 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 832 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 46.800s 2025-10-30 18:25:01.589 9935 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 46.802s 2025-10-30 18:25:01.591 9918 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 46.803s 2025-10-30 18:25:01.592 9919 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 860 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/860 {"round":860,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/860/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 46.804s 2025-10-30 18:25:01.593 9920 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/190
node2 6m 46.807s 2025-10-30 18:25:01.596 9936 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 46.808s 2025-10-30 18:25:01.597 9937 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 860 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/860 {"round":860,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/860/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 46.809s 2025-10-30 18:25:01.598 9938 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/190
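All five nodes above report the same round-860 root hash (7c40c73d...885e), which is what the consistency testing tool is exercising. A trivial sketch of that cross-node comparison, assuming the "Root hash:" values have already been collected per node; this is an illustration, not the tool's validation code:

    import java.util.Map;

    final class RootHashConsistency {
        // True when every node reported the same root hash for a given round.
        static boolean allAgree(Map<String, String> rootHashByNode) {
            return rootHashByNode.values().stream().distinct().count() <= 1;
        }
    }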
node2 7m 46.329s 2025-10-30 18:26:01.118 11356 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 989 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 46.333s 2025-10-30 18:26:01.122 11287 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 989 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 46.334s 2025-10-30 18:26:01.123 11334 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 989 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 46.357s 2025-10-30 18:26:01.146 3234 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 989 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 46.410s 2025-10-30 18:26:01.199 11513 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 989 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 46.523s 2025-10-30 18:26:01.312 11290 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 989 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/989
node1 7m 46.524s 2025-10-30 18:26:01.313 11291 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 989
node2 7m 46.594s 2025-10-30 18:26:01.383 11359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 989 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/989
node2 7m 46.594s 2025-10-30 18:26:01.383 11360 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 989
node1 7m 46.606s 2025-10-30 18:26:01.395 11330 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 989
node1 7m 46.608s 2025-10-30 18:26:01.397 11331 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 989 Timestamp: 2025-10-30T18:26:00.083146416Z Next consensus number: 31505 Legacy running event hash: 271a04b8d1d62b94c629785d2cf556a77173b40afdcc6eaeec3ee8837eaaf324be088e1eaf2c4ae037fbb9e102c22f69 Legacy running event mnemonic: today-novel-shed-income Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2069302838 Root hash: 48b125781f3838262b04a87ae37d80a8f7f3f36edc027e0d0986d73f801fb7d51d909192fd2b0c21b38a19f13627927f (root) VirtualMap state / bike-pyramid-train-age
node4 7m 46.611s 2025-10-30 18:26:01.400 3237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 989 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/989
node0 7m 46.612s 2025-10-30 18:26:01.401 11516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 989 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/989
node0 7m 46.612s 2025-10-30 18:26:01.401 11517 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for round 989
node4 7m 46.612s 2025-10-30 18:26:01.401 3238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for round 989
node1 7m 46.615s 2025-10-30 18:26:01.404 11332 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+18+31.105410861Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 46.615s 2025-10-30 18:26:01.404 11333 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 962 File: data/saved/preconsensus-events/1/2025/10/30/2025-10-30T18+22+23.609759974Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 46.615s 2025-10-30 18:26:01.404 11334 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 46.625s 2025-10-30 18:26:01.414 11335 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 46.626s 2025-10-30 18:26:01.415 11336 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 989 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/989 {"round":989,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/989/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 46.627s 2025-10-30 18:26:01.416 11337 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/318
node3 7m 46.643s 2025-10-30 18:26:01.432 11337 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 989 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/989
node3 7m 46.644s 2025-10-30 18:26:01.433 11338 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 989
node2 7m 46.674s 2025-10-30 18:26:01.463 11399 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 989
node2 7m 46.676s 2025-10-30 18:26:01.465 11400 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 989 Timestamp: 2025-10-30T18:26:00.083146416Z Next consensus number: 31505 Legacy running event hash: 271a04b8d1d62b94c629785d2cf556a77173b40afdcc6eaeec3ee8837eaaf324be088e1eaf2c4ae037fbb9e102c22f69 Legacy running event mnemonic: today-novel-shed-income Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2069302838 Root hash: 48b125781f3838262b04a87ae37d80a8f7f3f36edc027e0d0986d73f801fb7d51d909192fd2b0c21b38a19f13627927f (root) VirtualMap state / bike-pyramid-train-age
node2 7m 46.683s 2025-10-30 18:26:01.472 11401 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+18+30.997820882Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 46.683s 2025-10-30 18:26:01.472 11402 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 962 File: data/saved/preconsensus-events/2/2025/10/30/2025-10-30T18+22+23.632420342Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 46.683s 2025-10-30 18:26:01.472 11403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 46.692s 2025-10-30 18:26:01.481 11548 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for round 989
node2 7m 46.693s 2025-10-30 18:26:01.482 11404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 46.694s 2025-10-30 18:26:01.483 11549 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 989 Timestamp: 2025-10-30T18:26:00.083146416Z Next consensus number: 31505 Legacy running event hash: 271a04b8d1d62b94c629785d2cf556a77173b40afdcc6eaeec3ee8837eaaf324be088e1eaf2c4ae037fbb9e102c22f69 Legacy running event mnemonic: today-novel-shed-income Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2069302838 Root hash: 48b125781f3838262b04a87ae37d80a8f7f3f36edc027e0d0986d73f801fb7d51d909192fd2b0c21b38a19f13627927f (root) VirtualMap state / bike-pyramid-train-age
node2 7m 46.694s 2025-10-30 18:26:01.483 11405 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 989 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/989 {"round":989,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/989/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 46.696s 2025-10-30 18:26:01.485 11406 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/318
node0 7m 46.701s 2025-10-30 18:26:01.490 11550 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+18+31.050357805Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 46.701s 2025-10-30 18:26:01.490 11551 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 962 File: data/saved/preconsensus-events/0/2025/10/30/2025-10-30T18+22+23.517231454Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 46.701s 2025-10-30 18:26:01.490 11552 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 46.711s 2025-10-30 18:26:01.500 11553 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 46.712s 2025-10-30 18:26:01.501 11554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 989 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/989 {"round":989,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/989/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 46.713s 2025-10-30 18:26:01.502 11555 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/318
node3 7m 46.720s 2025-10-30 18:26:01.509 11369 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 989
node3 7m 46.722s 2025-10-30 18:26:01.511 11370 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 989 Timestamp: 2025-10-30T18:26:00.083146416Z Next consensus number: 31505 Legacy running event hash: 271a04b8d1d62b94c629785d2cf556a77173b40afdcc6eaeec3ee8837eaaf324be088e1eaf2c4ae037fbb9e102c22f69 Legacy running event mnemonic: today-novel-shed-income Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2069302838 Root hash: 48b125781f3838262b04a87ae37d80a8f7f3f36edc027e0d0986d73f801fb7d51d909192fd2b0c21b38a19f13627927f (root) VirtualMap state / bike-pyramid-train-age
node3 7m 46.729s 2025-10-30 18:26:01.518 11371 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+18+30.900078163Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 46.729s 2025-10-30 18:26:01.518 11372 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 962 File: data/saved/preconsensus-events/3/2025/10/30/2025-10-30T18+22+23.605359218Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 46.729s 2025-10-30 18:26:01.518 11373 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 46.732s 2025-10-30 18:26:01.521 3283 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for round 989
node4 7m 46.734s 2025-10-30 18:26:01.523 3284 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 989 Timestamp: 2025-10-30T18:26:00.083146416Z Next consensus number: 31505 Legacy running event hash: 271a04b8d1d62b94c629785d2cf556a77173b40afdcc6eaeec3ee8837eaaf324be088e1eaf2c4ae037fbb9e102c22f69 Legacy running event mnemonic: today-novel-shed-income Rounds non-ancient: 26 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 2069302838 Root hash: 48b125781f3838262b04a87ae37d80a8f7f3f36edc027e0d0986d73f801fb7d51d909192fd2b0c21b38a19f13627927f (root) VirtualMap state / bike-pyramid-train-age
node3 7m 46.739s 2025-10-30 18:26:01.528 11374 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 46.740s 2025-10-30 18:26:01.529 11375 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 989 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/989 {"round":989,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/989/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 46.741s 2025-10-30 18:26:01.530 11376 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/318
node4 7m 46.741s 2025-10-30 18:26:01.530 3285 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+18+31.225738898Z_seq0_minr1_maxr374_orgn0.pces Last file: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+24+19.541949330Z_seq1_minr738_maxr1238_orgn765.pces
node4 7m 46.741s 2025-10-30 18:26:01.530 3286 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 962 File: data/saved/preconsensus-events/4/2025/10/30/2025-10-30T18+24+19.541949330Z_seq1_minr738_maxr1238_orgn765.pces
node4 7m 46.741s 2025-10-30 18:26:01.530 3287 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 46.747s 2025-10-30 18:26:01.536 3288 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 46.748s 2025-10-30 18:26:01.537 3289 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 989 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/989 {"round":989,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/989/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 46.749s 2025-10-30 18:26:01.538 3290 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/61
node1 7m 57.953s 2025-10-30 18:26:12.742 11623 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 1 to 3>> NetworkUtils: Connection broken: 1 -> 3
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:26:12.741676059Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 7m 57.958s 2025-10-30 18:26:12.747 3568 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 4 to 3>> NetworkUtils: Connection broken: 4 <- 3
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:26:12.743701583Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node1 7m 58.067s 2025-10-30 18:26:12.856 11624 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 1 to 2>> NetworkUtils: Connection broken: 1 -> 2
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:26:12.855687946Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 7m 58.067s 2025-10-30 18:26:12.856 3569 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 4 to 2>> NetworkUtils: Connection broken: 4 <- 2
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:26:12.855785122Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 7m 58.107s 2025-10-30 18:26:12.896 3570 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 4 to 0>> NetworkUtils: Connection broken: 4 <- 0
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:26:12.895422168Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node1 7m 58.109s 2025-10-30 18:26:12.898 11625 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 1 to 0>> NetworkUtils: Connection broken: 1 <- 0
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-10-30T18:26:12.895541068Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
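Note: the SOCKET_EXCEPTIONS entries above all follow one pattern. The peer protocol runs a reader task (readMessages) and a writer task (writeMessages) in parallel on a thread pool; when the remote node drops the TCP connection, the reader fails with "Connection reset", the writer's "Connection or outbound has closed" is attached as a suppressed failure, and both are wrapped in a ParallelExecutionException stamped with the time it was thrown. The following is a minimal, hypothetical sketch of that wrapping using only the JDK (ExecutorService, Socket, Future); it is not the platform's RpcPeerProtocol or CachedPoolParallelExecutor implementation, and the host and port are placeholders.

// ParallelSocketSketch.java -- simplified illustration only
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSocketSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newCachedThreadPool();
        // "localhost" and 50204 are placeholder values, not the platform's real endpoint.
        try (Socket socket = new Socket("localhost", 50204)) {
            DataInputStream in = new DataInputStream(socket.getInputStream());
            DataOutputStream out = new DataOutputStream(socket.getOutputStream());

            // Reader and writer are submitted side by side, loosely mirroring the
            // readMessages/writeMessages pair visible in the traces above.
            Future<Object> reader = pool.submit(() -> {
                // Blocks on the socket; throws java.net.SocketException
                // ("Connection reset") if the peer drops the connection.
                return in.readShort();
            });
            Future<Object> writer = pool.submit(() -> {
                out.writeShort(0);
                out.flush(); // fails once the connection has been torn down
                return null;
            });

            try {
                reader.get();
                writer.get();
            } catch (ExecutionException e) {
                // e.getCause() is the underlying SocketException. A framework-level
                // wrapper (like the ParallelExecutionException in the log) would
                // rethrow it with a "Time thrown" timestamp and attach the other
                // task's failure as a suppressed exception.
                System.err.println("Connection broken: " + e.getCause());
            }
        } finally {
            pool.shutdownNow();
        }
    }
}

In the real log the surviving side simply reports "Connection broken" at WARN under the SOCKET_EXCEPTIONS marker and renegotiates the protocol, so these entries are expected noise whenever a peer restarts or the network blips rather than an error in their own right.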