Node ID | Elapsed | Timestamp | Seq | Log Level | Log Marker | Thread | Class | Message
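Each entry below follows the single-line column layout above; multi-line payloads (the startup banner, the OS health report, the roster JSON) continue the preceding entry. As a convenience, the following is a minimal Java sketch of a parser for that layout; LogEntry and LogLineParser are hypothetical names for this sketch, not platform types.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical record modeling one aggregated log entry.
record LogEntry(String node, String elapsed, String timestamp, long seq,
                String level, String marker, String thread, String source, String message) {}

class LogLineParser {
    // node, elapsed, "yyyy-MM-dd HH:mm:ss.SSS", seq, LEVEL, MARKER, <thread>, Class:, message
    private static final Pattern LINE = Pattern.compile(
            "^(\\S+) (\\S+) (\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{3}) "
                    + "(\\d+) (\\w+) (\\w+) <(.+?)> (\\S+): ?(.*)$");

    // Returns the parsed entry, or null for continuation lines, which should be
    // appended to the previous entry's message. Doubly bracketed thread names such
    // as <<start-node-2>> keep one pair of brackets in the captured name.
    static LogEntry parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) {
            return null;
        }
        return new LogEntry(m.group(1), m.group(2), m.group(3),
                Long.parseLong(m.group(4)), m.group(5), m.group(6),
                m.group(7), m.group(8), m.group(9));
    }

    public static void main(String[] args) {
        // First entry of this capture, parsed as a smoke test.
        System.out.println(parse(
                "node2 0.000ns 2025-11-07 05:46:59.182 1 INFO STARTUP <main> StaticPlatformBuilder:"));
    }
}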
node2 0.000ns 2025-11-07 05:46:59.182 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 80.000ms 2025-11-07 05:46:59.262 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node2 91.000ms 2025-11-07 05:46:59.273 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 108.000ms 2025-11-07 05:46:59.290 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 169.000ms 2025-11-07 05:46:59.351 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 185.000ms 2025-11-07 05:46:59.367 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 221.000ms 2025-11-07 05:46:59.403 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 251.000ms 2025-11-07 05:46:59.433 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 297.000ms 2025-11-07 05:46:59.479 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 327.000ms 2025-11-07 05:46:59.509 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 446.000ms 2025-11-07 05:46:59.628 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node0 538.000ms 2025-11-07 05:46:59.720 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 555.000ms 2025-11-07 05:46:59.737 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 671.000ms 2025-11-07 05:46:59.853 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 701.000ms 2025-11-07 05:46:59.883 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 1.502s 2025-11-07 05:47:00.684 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1250ms
node2 1.512s 2025-11-07 05:47:00.694 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.515s 2025-11-07 05:47:00.697 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 1.553s 2025-11-07 05:47:00.735 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.624s 2025-11-07 05:47:00.806 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.626s 2025-11-07 05:47:00.808 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 1.775s 2025-11-07 05:47:00.957 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1446ms
node1 1.785s 2025-11-07 05:47:00.967 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 1.788s 2025-11-07 05:47:00.970 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 1.800s 2025-11-07 05:47:00.982 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node1 1.826s 2025-11-07 05:47:01.008 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 1.888s 2025-11-07 05:47:01.070 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 1.888s 2025-11-07 05:47:01.070 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 1.913s 2025-11-07 05:47:01.095 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 1.932s 2025-11-07 05:47:01.114 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 2.090s 2025-11-07 05:47:01.272 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 2.126s 2025-11-07 05:47:01.308 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 2.149s 2025-11-07 05:47:01.331 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1446ms
node0 2.159s 2025-11-07 05:47:01.341 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 2.162s 2025-11-07 05:47:01.344 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.199s 2025-11-07 05:47:01.381 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node0 2.257s 2025-11-07 05:47:01.439 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 2.258s 2025-11-07 05:47:01.440 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 2.677s 2025-11-07 05:47:01.859 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 2.786s 2025-11-07 05:47:01.968 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 2.806s 2025-11-07 05:47:01.988 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 2.947s 2025-11-07 05:47:02.129 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 2.985s 2025-11-07 05:47:02.167 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 3.651s 2025-11-07 05:47:02.833 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 3.738s 2025-11-07 05:47:02.920 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 3.740s 2025-11-07 05:47:02.922 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 3.768s 2025-11-07 05:47:02.950 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1641ms
node2 3.774s 2025-11-07 05:47:02.956 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 3.778s 2025-11-07 05:47:02.960 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 3.780s 2025-11-07 05:47:02.962 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 3.823s 2025-11-07 05:47:03.005 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 3.891s 2025-11-07 05:47:03.073 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 3.892s 2025-11-07 05:47:03.074 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 3.920s 2025-11-07 05:47:03.102 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 4.016s 2025-11-07 05:47:03.198 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.018s 2025-11-07 05:47:03.200 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 4.053s 2025-11-07 05:47:03.235 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.331s 2025-11-07 05:47:03.513 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 4.431s 2025-11-07 05:47:03.613 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 4.433s 2025-11-07 05:47:03.615 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 4.468s 2025-11-07 05:47:03.650 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 4.549s 2025-11-07 05:47:03.731 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.550s 2025-11-07 05:47:03.732 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.556s 2025-11-07 05:47:03.738 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.565s 2025-11-07 05:47:03.747 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.567s 2025-11-07 05:47:03.749 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.567s 2025-11-07 05:47:03.749 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1581ms
node4 4.576s 2025-11-07 05:47:03.758 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 4.579s 2025-11-07 05:47:03.761 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 4.621s 2025-11-07 05:47:03.803 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 4.691s 2025-11-07 05:47:03.873 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 4.692s 2025-11-07 05:47:03.874 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 4.839s 2025-11-07 05:47:04.021 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.841s 2025-11-07 05:47:04.023 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.849s 2025-11-07 05:47:04.031 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 4.860s 2025-11-07 05:47:04.042 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.863s 2025-11-07 05:47:04.045 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.280s 2025-11-07 05:47:04.462 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.283s 2025-11-07 05:47:04.465 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 5.289s 2025-11-07 05:47:04.471 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 5.302s 2025-11-07 05:47:04.484 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 5.304s 2025-11-07 05:47:04.486 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.683s 2025-11-07 05:47:04.865 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26021153]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=234410, randomLong=4351010363724982655, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9820, randomLong=-6487627409460845346, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1342909, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node2 5.715s 2025-11-07 05:47:04.897 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.723s 2025-11-07 05:47:04.905 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 5.725s 2025-11-07 05:47:04.907 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 5.811s 2025-11-07 05:47:04.993 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IqtR8Q==", "port": 30124 }, { "ipAddressV4": "CoAAZA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iiz5zQ==", "port": 30125 }, { "ipAddressV4": "CoAAaA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHfBAg==", "port": 30126 }, { "ipAddressV4": "CoAAZg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I95M3w==", "port": 30127 }, { "ipAddressV4": "CoAAZQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjcdWA==", "port": 30128 }, { "ipAddressV4": "CoAAZw==", "port": 30128 }] }] }
node2 5.835s 2025-11-07 05:47:05.017 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 5.835s 2025-11-07 05:47:05.017 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.848s 2025-11-07 05:47:05.030 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: ba74fe5e13965e32d9456fa7baecccf83f6c98f57928e629710fdc0923e4678aabe9af6fcecf6d0fd9dbf42d44ddfe65
(root) VirtualMap state / ranch-crowd-index-shock
node2 5.851s 2025-11-07 05:47:05.033 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
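The gossipEndpoint entries in the roster logged above encode ipAddressV4 as a base64 string of the four raw address bytes, so each node advertises one external endpoint and one internal 10.128.0.x endpoint on the same port. A minimal Java sketch of the decoding (RosterEndpointDecoder is a hypothetical helper, not a platform class):

import java.util.Base64;

class RosterEndpointDecoder {
    // Decodes a base64-encoded 4-byte ipAddressV4 value into dotted-quad form.
    static String decodeIpV4(String base64) {
        byte[] octets = Base64.getDecoder().decode(base64);
        if (octets.length != 4) {
            throw new IllegalArgumentException("expected 4 bytes, got " + octets.length);
        }
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < octets.length; i++) {
            if (i > 0) {
                sb.append('.');
            }
            sb.append(octets[i] & 0xFF); // Java bytes are signed; mask to 0..255
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(decodeIpV4("IqtR8Q==")); // 34.171.81.241 (external)
        System.out.println(decodeIpV4("CoAAZA==")); // 10.128.0.100 (internal)
    }
}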
node1 6.003s 2025-11-07 05:47:05.185 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26327348]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=158431, randomLong=1188766718220881970, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11900, randomLong=-3393580029651933095, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1810440, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node1 6.033s 2025-11-07 05:47:05.215 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 6.042s 2025-11-07 05:47:05.224 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 6.043s 2025-11-07 05:47:05.225 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node1 6.044s 2025-11-07 05:47:05.226 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 6.070s 2025-11-07 05:47:05.252 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.076s 2025-11-07 05:47:05.258 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.081s 2025-11-07 05:47:05.263 43 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.082s 2025-11-07 05:47:05.264 44 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.083s 2025-11-07 05:47:05.265 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 6.087s 2025-11-07 05:47:05.269 46 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 6.088s 2025-11-07 05:47:05.270 47 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.089s 2025-11-07 05:47:05.271 48 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.091s 2025-11-07 05:47:05.273 49 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.091s 2025-11-07 05:47:05.273 50 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.093s 2025-11-07 05:47:05.275 51 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.094s 2025-11-07 05:47:05.276 52 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.097s 2025-11-07 05:47:05.279 53 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 187.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.103s 2025-11-07 05:47:05.285 54 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 6.130s 2025-11-07 05:47:05.312 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IqtR8Q==", "port": 30124 }, { "ipAddressV4": "CoAAZA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iiz5zQ==", "port": 30125 }, { "ipAddressV4": "CoAAaA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHfBAg==", "port": 30126 }, { "ipAddressV4": "CoAAZg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I95M3w==", "port": 30127 }, { "ipAddressV4": "CoAAZQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjcdWA==", "port": 30128 }, { "ipAddressV4": "CoAAZw==", "port": 30128 }] }] }
node3 6.140s 2025-11-07 05:47:05.322 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.143s 2025-11-07 05:47:05.325 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 6.155s 2025-11-07 05:47:05.337 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 6.155s 2025-11-07 05:47:05.337 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 6.167s 2025-11-07 05:47:05.349 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: ba74fe5e13965e32d9456fa7baecccf83f6c98f57928e629710fdc0923e4678aabe9af6fcecf6d0fd9dbf42d44ddfe65
(root) VirtualMap state / ranch-crowd-index-shock
node1 6.170s 2025-11-07 05:47:05.352 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node3 6.193s 2025-11-07 05:47:05.375 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 6.388s 2025-11-07 05:47:05.570 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 6.392s 2025-11-07 05:47:05.574 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 6.397s 2025-11-07 05:47:05.579 43 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 6.398s 2025-11-07 05:47:05.580 44 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 6.399s 2025-11-07 05:47:05.581 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 6.402s 2025-11-07 05:47:05.584 46 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 6.403s 2025-11-07 05:47:05.585 47 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 6.404s 2025-11-07 05:47:05.586 48 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 6.405s 2025-11-07 05:47:05.587 49 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 6.405s 2025-11-07 05:47:05.587 50 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 6.407s 2025-11-07 05:47:05.589 51 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 6.408s 2025-11-07 05:47:05.590 52 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 6.413s 2025-11-07 05:47:05.595 53 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 186.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 6.419s 2025-11-07 05:47:05.601 54 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 6.423s 2025-11-07 05:47:05.605 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26265352]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=224020, randomLong=4879285224404134415, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9100, randomLong=-5740797598201081563, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1443820, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node0 6.456s 2025-11-07 05:47:05.638 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 6.464s 2025-11-07 05:47:05.646 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 6.466s 2025-11-07 05:47:05.648 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 6.553s 2025-11-07 05:47:05.735 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IqtR8Q==", "port": 30124 }, { "ipAddressV4": "CoAAZA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iiz5zQ==", "port": 30125 }, { "ipAddressV4": "CoAAaA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHfBAg==", "port": 30126 }, { "ipAddressV4": "CoAAZg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I95M3w==", "port": 30127 }, { "ipAddressV4": "CoAAZQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjcdWA==", "port": 30128 }, { "ipAddressV4": "CoAAZw==", "port": 30128 }] }] }
node0 6.579s 2025-11-07 05:47:05.761 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 6.580s 2025-11-07 05:47:05.762 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 6.593s 2025-11-07 05:47:05.775 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: ba74fe5e13965e32d9456fa7baecccf83f6c98f57928e629710fdc0923e4678aabe9af6fcecf6d0fd9dbf42d44ddfe65
(root) VirtualMap state / ranch-crowd-index-shock
node0 6.596s 2025-11-07 05:47:05.778 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node4 6.804s 2025-11-07 05:47:05.986 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 6.837s 2025-11-07 05:47:06.019 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 6.842s 2025-11-07 05:47:06.024 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 6.847s 2025-11-07 05:47:06.029 43 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 6.848s 2025-11-07 05:47:06.030 44 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 6.849s 2025-11-07 05:47:06.031 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 6.853s 2025-11-07 05:47:06.035 46 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 6.854s 2025-11-07 05:47:06.036 47 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 6.854s 2025-11-07 05:47:06.036 48 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 6.857s 2025-11-07 05:47:06.039 49 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 6.857s 2025-11-07 05:47:06.039 50 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 6.859s 2025-11-07 05:47:06.041 51 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 6.861s 2025-11-07 05:47:06.043 52 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 6.862s 2025-11-07 05:47:06.044 53 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 206.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 6.869s 2025-11-07 05:47:06.051 54 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6.907s 2025-11-07 05:47:06.089 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.910s 2025-11-07 05:47:06.092 16 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 6.949s 2025-11-07 05:47:06.131 21 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 7.139s 2025-11-07 05:47:06.321 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.142s 2025-11-07 05:47:06.324 27 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 7.150s 2025-11-07 05:47:06.332 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 7.163s 2025-11-07 05:47:06.345 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 7.166s 2025-11-07 05:47:06.348 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 7.800s 2025-11-07 05:47:06.982 24 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 7.801s 2025-11-07 05:47:06.983 26 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 7.808s 2025-11-07 05:47:06.990 28 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 7.823s 2025-11-07 05:47:07.005 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 7.827s 2025-11-07 05:47:07.009 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 8.292s 2025-11-07 05:47:07.474 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26211055]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=243410, randomLong=21079988848113460, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=22150, randomLong=2292714516962704765, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1644159, data=35, exception=null]
OS Health Check Report - Complete (took 1030 ms)
node3 8.328s 2025-11-07 05:47:07.510 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 8.340s 2025-11-07 05:47:07.522 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 8.342s 2025-11-07 05:47:07.524 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 8.440s 2025-11-07 05:47:07.622 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IqtR8Q==", "port": 30124 }, { "ipAddressV4": "CoAAZA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iiz5zQ==", "port": 30125 }, { "ipAddressV4": "CoAAaA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHfBAg==", "port": 30126 }, { "ipAddressV4": "CoAAZg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I95M3w==", "port": 30127 }, { "ipAddressV4": "CoAAZQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjcdWA==", "port": 30128 }, { "ipAddressV4": "CoAAZw==", "port": 30128 }] }] }
node3 8.467s 2025-11-07 05:47:07.649 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 8.467s 2025-11-07 05:47:07.649 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 8.482s 2025-11-07 05:47:07.664 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: ba74fe5e13965e32d9456fa7baecccf83f6c98f57928e629710fdc0923e4678aabe9af6fcecf6d0fd9dbf42d44ddfe65
(root) VirtualMap state / ranch-crowd-index-shock
node3 8.486s 2025-11-07 05:47:07.668 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node3 8.712s 2025-11-07 05:47:07.894 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 8.719s 2025-11-07 05:47:07.901 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 8.726s 2025-11-07 05:47:07.908 43 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 8.727s 2025-11-07 05:47:07.909 44 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 8.728s 2025-11-07 05:47:07.910 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 8.732s 2025-11-07 05:47:07.914 46 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 8.734s 2025-11-07 05:47:07.916 47 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 8.734s 2025-11-07 05:47:07.916 48 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 8.736s 2025-11-07 05:47:07.918 49 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 8.737s 2025-11-07 05:47:07.919 50 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 8.739s 2025-11-07 05:47:07.921 51 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 8.740s 2025-11-07 05:47:07.922 52 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 8.742s 2025-11-07 05:47:07.924 53 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 192.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 8.748s 2025-11-07 05:47:07.930 54 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 8.955s 2025-11-07 05:47:08.137 31 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26336190]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=238120, randomLong=2347182902892275002, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=29490, randomLong=-4539751248819438838, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1756850, data=35, exception=null]
OS Health Check Report - Complete (took 1032 ms)
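Each node runs the OSHealthChecker once at startup, producing one PASSED/FAILED line per check plus a completion summary. Gating a test run on these checks only needs a line scan; a sketch, assuming failures are reported symmetrically with a FAILED prefix (a path this run never exercises):

```python
import re

def failed_health_checks(log_path: str) -> list[str]:
    """Return any OS health check lines that did not pass.

    Assumes the report layout seen above: one 'PASSED - ...' line per
    check, with failures (hypothetically) rendered as 'FAILED - ...'.
    """
    with open(log_path, encoding="utf-8") as fh:
        return [line.rstrip() for line in fh if re.match(r"\s*FAILED - ", line)]
```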
node4 8.991s 2025-11-07 05:47:08.173 32 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 9.001s 2025-11-07 05:47:08.183 33 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 9.003s 2025-11-07 05:47:08.185 34 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 9.093s 2025-11-07 05:47:08.275 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.095s 2025-11-07 05:47:08.277 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 9.097s 2025-11-07 05:47:08.279 35 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IqtR8Q==", "port": 30124 }, { "ipAddressV4": "CoAAZA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iiz5zQ==", "port": 30125 }, { "ipAddressV4": "CoAAaA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHfBAg==", "port": 30126 }, { "ipAddressV4": "CoAAZg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I95M3w==", "port": 30127 }, { "ipAddressV4": "CoAAZQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjcdWA==", "port": 30128 }, { "ipAddressV4": "CoAAZw==", "port": 30128 }] }] }
node4 9.122s 2025-11-07 05:47:08.304 36 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 9.123s 2025-11-07 05:47:08.305 37 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 9.136s 2025-11-07 05:47:08.318 38 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: ba74fe5e13965e32d9456fa7baecccf83f6c98f57928e629710fdc0923e4678aabe9af6fcecf6d0fd9dbf42d44ddfe65
(root) VirtualMap state / ranch-crowd-index-shock
node4 9.140s 2025-11-07 05:47:08.322 40 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node4 9.351s 2025-11-07 05:47:08.533 41 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 9.358s 2025-11-07 05:47:08.540 42 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 9.364s 2025-11-07 05:47:08.546 43 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 9.365s 2025-11-07 05:47:08.547 44 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 9.367s 2025-11-07 05:47:08.549 45 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 9.372s 2025-11-07 05:47:08.554 46 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 9.374s 2025-11-07 05:47:08.556 47 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 9.375s 2025-11-07 05:47:08.557 48 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 9.378s 2025-11-07 05:47:08.560 49 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 9.378s 2025-11-07 05:47:08.560 50 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 9.380s 2025-11-07 05:47:08.562 51 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 9.382s 2025-11-07 05:47:08.564 52 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 9.383s 2025-11-07 05:47:08.565 53 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 183.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 9.389s 2025-11-07 05:47:08.571 54 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 9.412s 2025-11-07 05:47:08.594 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 9.415s 2025-11-07 05:47:08.597 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 9.858s 2025-11-07 05:47:09.040 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 9.861s 2025-11-07 05:47:09.043 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 11.739s 2025-11-07 05:47:10.921 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 11.741s 2025-11-07 05:47:10.923 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 12.377s 2025-11-07 05:47:11.559 55 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 12.380s 2025-11-07 05:47:11.562 56 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 16.190s 2025-11-07 05:47:15.372 57 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 16.505s 2025-11-07 05:47:15.687 57 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 16.955s 2025-11-07 05:47:16.137 57 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 17.791s 2025-11-07 05:47:16.973 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node1 17.837s 2025-11-07 05:47:17.019 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node0 17.848s 2025-11-07 05:47:17.030 58 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node3 17.895s 2025-11-07 05:47:17.077 57 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node4 17.928s 2025-11-07 05:47:17.110 57 INFO STARTUP <<scheduler TransactionHandler>> DefaultTransactionHandler: Ignoring empty consensus round 1
node2 18.233s 2025-11-07 05:47:17.415 59 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 2.0 s in CHECKING. Now in ACTIVE
node2 18.235s 2025-11-07 05:47:17.417 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 18.264s 2025-11-07 05:47:17.446 59 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 1.8 s in CHECKING. Now in ACTIVE
node1 18.266s 2025-11-07 05:47:17.448 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.275s 2025-11-07 05:47:17.457 59 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 1.3 s in CHECKING. Now in ACTIVE
node0 18.278s 2025-11-07 05:47:17.460 61 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.324s 2025-11-07 05:47:17.506 59 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 18.358s 2025-11-07 05:47:17.540 59 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 2 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 18.601s 2025-11-07 05:47:17.783 74 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node3 18.604s 2025-11-07 05:47:17.786 75 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 18.617s 2025-11-07 05:47:17.799 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node2 18.620s 2025-11-07 05:47:17.802 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 18.654s 2025-11-07 05:47:17.836 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node0 18.656s 2025-11-07 05:47:17.838 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 18.684s 2025-11-07 05:47:17.866 76 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
node1 18.687s 2025-11-07 05:47:17.869 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 18.781s 2025-11-07 05:47:17.963 74 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 2 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node4 18.784s 2025-11-07 05:47:17.966 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 18.836s 2025-11-07 05:47:18.018 107 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 18.855s 2025-11-07 05:47:18.037 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node2 18.857s 2025-11-07 05:47:18.039 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-11-07T05:47:16.531408093Z
Next consensus number: 7
Legacy running event hash: 86db94f55ac03122ff3b3d1c8b645c90448fb9253b0cf5a28ef4e6230ceaaa63c70fcb76dcba1051dfe8dec93f5a2518
Legacy running event mnemonic: caution-gesture-fix-cattle
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: db38889426e9a8d11ae87e51c9c6d2874eeddedba65e0f34834d2d526ffef8a459e0fe019693ed9cb1ec9337a516c67f
(root) VirtualMap state / october-unique-tag-tourist
node3 18.877s 2025-11-07 05:47:18.059 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node3 18.881s 2025-11-07 05:47:18.063 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-11-07T05:47:16.531408093Z
Next consensus number: 7
Legacy running event hash: 86db94f55ac03122ff3b3d1c8b645c90448fb9253b0cf5a28ef4e6230ceaaa63c70fcb76dcba1051dfe8dec93f5a2518
Legacy running event mnemonic: caution-gesture-fix-cattle
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: db38889426e9a8d11ae87e51c9c6d2874eeddedba65e0f34834d2d526ffef8a459e0fe019693ed9cb1ec9337a516c67f
(root) VirtualMap state / october-unique-tag-tourist
node0 18.894s 2025-11-07 05:47:18.076 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node0 18.897s 2025-11-07 05:47:18.079 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-11-07T05:47:16.531408093Z
Next consensus number: 7
Legacy running event hash: 86db94f55ac03122ff3b3d1c8b645c90448fb9253b0cf5a28ef4e6230ceaaa63c70fcb76dcba1051dfe8dec93f5a2518
Legacy running event mnemonic: caution-gesture-fix-cattle
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: db38889426e9a8d11ae87e51c9c6d2874eeddedba65e0f34834d2d526ffef8a459e0fe019693ed9cb1ec9337a516c67f
(root) VirtualMap state / october-unique-tag-tourist
node2 18.898s 2025-11-07 05:47:18.080 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 18.899s 2025-11-07 05:47:18.081 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 18.899s 2025-11-07 05:47:18.081 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 18.900s 2025-11-07 05:47:18.082 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 18.906s 2025-11-07 05:47:18.088 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 18.925s 2025-11-07 05:47:18.107 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 18.926s 2025-11-07 05:47:18.108 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 18.926s 2025-11-07 05:47:18.108 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 18.927s 2025-11-07 05:47:18.109 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 18.928s 2025-11-07 05:47:18.110 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node1 18.931s 2025-11-07 05:47:18.113 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-11-07T05:47:16.531408093Z
Next consensus number: 7
Legacy running event hash: 86db94f55ac03122ff3b3d1c8b645c90448fb9253b0cf5a28ef4e6230ceaaa63c70fcb76dcba1051dfe8dec93f5a2518
Legacy running event mnemonic: caution-gesture-fix-cattle
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: db38889426e9a8d11ae87e51c9c6d2874eeddedba65e0f34834d2d526ffef8a459e0fe019693ed9cb1ec9337a516c67f
(root) VirtualMap state / october-unique-tag-tourist
node0 18.932s 2025-11-07 05:47:18.114 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 18.933s 2025-11-07 05:47:18.115 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 18.934s 2025-11-07 05:47:18.116 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 18.935s 2025-11-07 05:47:18.117 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 18.935s 2025-11-07 05:47:18.117 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 18.941s 2025-11-07 05:47:18.123 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 18.972s 2025-11-07 05:47:18.154 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 18.973s 2025-11-07 05:47:18.155 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 18.973s 2025-11-07 05:47:18.155 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 18.974s 2025-11-07 05:47:18.156 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 18.980s 2025-11-07 05:47:18.162 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 19.047s 2025-11-07 05:47:18.229 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 2
node4 19.050s 2025-11-07 05:47:18.232 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 2
Timestamp: 2025-11-07T05:47:16.531408093Z
Next consensus number: 7
Legacy running event hash: 86db94f55ac03122ff3b3d1c8b645c90448fb9253b0cf5a28ef4e6230ceaaa63c70fcb76dcba1051dfe8dec93f5a2518
Legacy running event mnemonic: caution-gesture-fix-cattle
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -799502542
Root hash: db38889426e9a8d11ae87e51c9c6d2874eeddedba65e0f34834d2d526ffef8a459e0fe019693ed9cb1ec9337a516c67f
(root) VirtualMap state / october-unique-tag-tourist
node4 19.091s 2025-11-07 05:47:18.273 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 19.092s 2025-11-07 05:47:18.274 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 19.092s 2025-11-07 05:47:18.274 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 19.094s 2025-11-07 05:47:18.276 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 19.100s 2025-11-07 05:47:18.282 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 2 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2 {"round":2,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
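With round 2 now persisted on all five nodes, note that every "Information for state written to disk" block reports the same root hash (db388894…c67f) and mnemonic (october-unique-tag-tourist): identical state across nodes is precisely the invariant the consistency testing tool exercises. Checking that invariant mechanically from logs like these is straightforward; a sketch, assuming one list of lines per node and the field-per-line dump layout used here:

```python
import re
from collections import defaultdict

def root_hashes(log_lines):
    """Map round -> root hash from the state dumps in one node's log.

    Assumes the dump layout above ('Round: N' on one line, 'Root hash:'
    followed by 96 hex chars on another); a sketch, not a hardened parser.
    """
    hashes, current_round = {}, None
    for line in log_lines:
        stripped = line.strip()
        if m := re.fullmatch(r"Round: (\d+)", stripped):
            current_round = int(m.group(1))
        elif m := re.match(r"Root hash: ([0-9a-f]{96})", stripped):
            if current_round is not None:
                hashes[current_round] = m.group(1)
    return hashes

def divergent_rounds(per_node):
    """per_node: dict of node name -> root_hashes(...) result."""
    seen = defaultdict(set)                 # round -> {root hashes seen}
    for hashes in per_node.values():
        for rnd, h in hashes.items():
            seen[rnd].add(h)
    return {rnd: hs for rnd, hs in seen.items() if len(hs) > 1}
```

A non-empty result from divergent_rounds would correspond to an ISS (inconsistent signed state), the condition tracked by the platform.iss scratchpad shown earlier (LAST_ISS_ROUND null, i.e. none recorded so far).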
node4 19.477s 2025-11-07 05:47:18.659 135 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 19.925s 2025-11-07 05:47:19.107 140 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 1.1 s in CHECKING. Now in ACTIVE
node4 21.385s 2025-11-07 05:47:20.567 167 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE
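The PLATFORM_STATUS lines trace every node through the same lifecycle, STARTING_UP to REPLAYING_EVENTS to OBSERVING to CHECKING to ACTIVE, with OBSERVING pinned at roughly 10.1 s on all five nodes and CHECKING taking 1 to 2 s. A minimal model of just the transitions witnessed in this run (the production state machine has additional statuses and edges, e.g. for reconnect and freeze, none of which occur here):

```python
from enum import Enum, auto

class PlatformStatus(Enum):
    STARTING_UP = auto()
    REPLAYING_EVENTS = auto()
    OBSERVING = auto()
    CHECKING = auto()
    ACTIVE = auto()

# Edges actually observed in this log; a sketch, not the full platform
# state machine (which also covers reconnect, freeze, and failure paths).
OBSERVED_TRANSITIONS = {
    PlatformStatus.STARTING_UP:      {PlatformStatus.REPLAYING_EVENTS},
    PlatformStatus.REPLAYING_EVENTS: {PlatformStatus.OBSERVING},
    PlatformStatus.OBSERVING:        {PlatformStatus.CHECKING},
    PlatformStatus.CHECKING:         {PlatformStatus.ACTIVE},
}

def is_legal(prev: PlatformStatus, nxt: PlatformStatus) -> bool:
    return nxt in OBSERVED_TRANSITIONS.get(prev, set())
```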
node3 1m 2.662s 2025-11-07 05:48:01.844 1095 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 95 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 2.671s 2025-11-07 05:48:01.853 1110 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 95 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 2.720s 2025-11-07 05:48:01.902 1107 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 95 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 2.741s 2025-11-07 05:48:01.923 1084 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 95 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 2.776s 2025-11-07 05:48:01.958 1117 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 95 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 2.877s 2025-11-07 05:48:02.059 1090 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 95 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/95
node2 1m 2.878s 2025-11-07 05:48:02.060 1091 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node1 1m 2.912s 2025-11-07 05:48:02.094 1113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 95 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/95
node1 1m 2.913s 2025-11-07 05:48:02.095 1114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node4 1m 2.941s 2025-11-07 05:48:02.123 1123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 95 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/95
node4 1m 2.942s 2025-11-07 05:48:02.124 1124 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node0 1m 2.947s 2025-11-07 05:48:02.129 1116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 95 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/95
node0 1m 2.948s 2025-11-07 05:48:02.130 1117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node2 1m 2.971s 2025-11-07 05:48:02.153 1134 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node2 1m 2.974s 2025-11-07 05:48:02.156 1135 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 95
Timestamp: 2025-11-07T05:48:00.411184Z
Next consensus number: 3432
Legacy running event hash: 2f6009d23d5b71cc0d8b92f3f8e83f1cc7ca3986a97f557694d5714f12218496bf630e8fd5070bb399c6374cbba10667
Legacy running event mnemonic: ugly-demise-problem-define
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -29204778
Root hash: c626f7b315c2f9db5b905ad8f98e9cf424ba8aba78633810146ba12d4cc9ba8696f1c957508307cac2cf8c6c4ae60a2e
(root) VirtualMap state / deputy-answer-reunion-seven
node3 1m 2.982s 2025-11-07 05:48:02.164 1101 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 95 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/95
node3 1m 2.983s 2025-11-07 05:48:02.165 1102 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node2 1m 2.985s 2025-11-07 05:48:02.167 1136 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 2.987s 2025-11-07 05:48:02.169 1137 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 68 File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 2.987s 2025-11-07 05:48:02.169 1138 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 2.990s 2025-11-07 05:48:02.172 1139 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 1m 2.991s 2025-11-07 05:48:02.173 1140 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 95 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/95 {"round":95,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/95/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 1m 2.999s 2025-11-07 05:48:02.181 1147 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node1 1m 3.001s 2025-11-07 05:48:02.183 1148 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 95
Timestamp: 2025-11-07T05:48:00.411184Z
Next consensus number: 3432
Legacy running event hash: 2f6009d23d5b71cc0d8b92f3f8e83f1cc7ca3986a97f557694d5714f12218496bf630e8fd5070bb399c6374cbba10667
Legacy running event mnemonic: ugly-demise-problem-define
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -29204778
Root hash: c626f7b315c2f9db5b905ad8f98e9cf424ba8aba78633810146ba12d4cc9ba8696f1c957508307cac2cf8c6c4ae60a2e
(root) VirtualMap state / deputy-answer-reunion-seven
node1 1m 3.009s 2025-11-07 05:48:02.191 1159 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 3.010s 2025-11-07 05:48:02.192 1160 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 68 File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 3.011s 2025-11-07 05:48:02.193 1161 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 3.013s 2025-11-07 05:48:02.195 1162 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 3.014s 2025-11-07 05:48:02.196 1163 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 95 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/95 {"round":95,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/95/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 3.034s 2025-11-07 05:48:02.216 1150 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node4 1m 3.036s 2025-11-07 05:48:02.218 1167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node0 1m 3.037s 2025-11-07 05:48:02.219 1151 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 95
Timestamp: 2025-11-07T05:48:00.411184Z
Next consensus number: 3432
Legacy running event hash: 2f6009d23d5b71cc0d8b92f3f8e83f1cc7ca3986a97f557694d5714f12218496bf630e8fd5070bb399c6374cbba10667
Legacy running event mnemonic: ugly-demise-problem-define
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -29204778
Root hash: c626f7b315c2f9db5b905ad8f98e9cf424ba8aba78633810146ba12d4cc9ba8696f1c957508307cac2cf8c6c4ae60a2e
(root) VirtualMap state / deputy-answer-reunion-seven
node4 1m 3.039s 2025-11-07 05:48:02.221 1168 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 95
Timestamp: 2025-11-07T05:48:00.411184Z
Next consensus number: 3432
Legacy running event hash: 2f6009d23d5b71cc0d8b92f3f8e83f1cc7ca3986a97f557694d5714f12218496bf630e8fd5070bb399c6374cbba10667
Legacy running event mnemonic: ugly-demise-problem-define
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -29204778
Root hash: c626f7b315c2f9db5b905ad8f98e9cf424ba8aba78633810146ba12d4cc9ba8696f1c957508307cac2cf8c6c4ae60a2e
(root) VirtualMap state / deputy-answer-reunion-seven
node0 1m 3.046s 2025-11-07 05:48:02.228 1152 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 3.047s 2025-11-07 05:48:02.229 1153 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 68 File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 3.047s 2025-11-07 05:48:02.229 1154 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 3.047s 2025-11-07 05:48:02.229 1169 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 3.048s 2025-11-07 05:48:02.230 1170 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 68 File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 3.048s 2025-11-07 05:48:02.230 1171 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 3.050s 2025-11-07 05:48:02.232 1155 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 3.050s 2025-11-07 05:48:02.232 1156 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 95 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/95 {"round":95,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/95/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 3.051s 2025-11-07 05:48:02.233 1172 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 3.052s 2025-11-07 05:48:02.234 1173 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 95 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/95 {"round":95,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/95/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 3.078s 2025-11-07 05:48:02.260 1135 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 95
node3 1m 3.080s 2025-11-07 05:48:02.262 1136 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 95
Timestamp: 2025-11-07T05:48:00.411184Z
Next consensus number: 3432
Legacy running event hash: 2f6009d23d5b71cc0d8b92f3f8e83f1cc7ca3986a97f557694d5714f12218496bf630e8fd5070bb399c6374cbba10667
Legacy running event mnemonic: ugly-demise-problem-define
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -29204778
Root hash: c626f7b315c2f9db5b905ad8f98e9cf424ba8aba78633810146ba12d4cc9ba8696f1c957508307cac2cf8c6c4ae60a2e
(root) VirtualMap state / deputy-answer-reunion-seven
node3 1m 3.090s 2025-11-07 05:48:02.272 1137 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 3.091s 2025-11-07 05:48:02.273 1138 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 68 File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 3.091s 2025-11-07 05:48:02.273 1139 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 3.094s 2025-11-07 05:48:02.276 1140 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 3.095s 2025-11-07 05:48:02.277 1141 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 95 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/95 {"round":95,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/95/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
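The PCES file names copied next to each snapshot are self-describing. Reading 2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces left to right: a wall-clock timestamp with ':' swapped for '+' (filesystem-safe), a sequence number, the minimum and maximum rounds the file may hold, and an origin round. Note also that the copy criterion tightens as consensus advances: the Lower bound passed to BestEffortPcesFileCopy grows from 1 at round 2 to 68 at round 95, which lines up with the 26-round non-ancient window reported in the state dumps. A parsing sketch, with the naming pattern inferred from this log rather than from any documented format guarantee:

```python
import re

PCES_RE = re.compile(
    r"(?P<ts>.+Z)_seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$"
)

def parse_pces_name(name: str) -> dict:
    """Split a .pces file name into its metadata fields.

    Pattern inferred from this log; treat field meanings as assumptions.
    """
    m = PCES_RE.search(name)
    if not m:
        raise ValueError(f"unrecognized PCES file name: {name}")
    return {
        "timestamp": m["ts"].replace("+", ":"),  # restore ISO-8601 colons
        "sequence": int(m["seq"]),
        "min_round": int(m["minr"]),
        "max_round": int(m["maxr"]),
        "origin_round": int(m["orgn"]),
    }

print(parse_pces_name(
    "2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces"))
# {'timestamp': '2025-11-07T05:47:15.409510130Z', 'sequence': 0,
#  'min_round': 1, 'max_round': 501, 'origin_round': 0}
```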
node2 2m 1.914s 2025-11-07 05:49:01.096 2548 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 223 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 2.004s 2025-11-07 05:49:01.186 2538 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 223 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 2.006s 2025-11-07 05:49:01.188 2551 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 223 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 2.022s 2025-11-07 05:49:01.204 2567 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 223 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 2.100s 2025-11-07 05:49:01.282 2535 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 223 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 2.165s 2025-11-07 05:49:01.347 2541 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 223 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/223
node0 2m 2.166s 2025-11-07 05:49:01.348 2542 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node3 2m 2.170s 2025-11-07 05:49:01.352 2538 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 223 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/223
node3 2m 2.171s 2025-11-07 05:49:01.353 2539 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node2 2m 2.223s 2025-11-07 05:49:01.405 2551 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 223 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/223
node2 2m 2.223s 2025-11-07 05:49:01.405 2552 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node1 2m 2.235s 2025-11-07 05:49:01.417 2554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 223 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/223
node1 2m 2.236s 2025-11-07 05:49:01.418 2555 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node0 2m 2.250s 2025-11-07 05:49:01.432 2573 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node0 2m 2.252s 2025-11-07 05:49:01.434 2574 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 223
Timestamp: 2025-11-07T05:49:00.266526Z
Next consensus number: 8191
Legacy running event hash: f873a2a4d158400d00a1c2fe272e67ba54c3d5cd7243c8483a5af7875b5b965cf1f6319e79655fe0073baaac819f9f7f
Legacy running event mnemonic: vast-process-steak-accuse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1788626791
Root hash: e0171439d8eab21dac3d5fc1766cffa69332fca419c01dad13bf1e6d4c590504112e18e3b7c20e0ffe16e201b28f1f56
(root) VirtualMap state / wood-mango-tissue-round
node3 2m 2.258s 2025-11-07 05:49:01.440 2570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node0 2m 2.260s 2025-11-07 05:49:01.442 2583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 2.260s 2025-11-07 05:49:01.442 2584 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 196
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 2.260s 2025-11-07 05:49:01.442 2585 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 2.260s 2025-11-07 05:49:01.442 2571 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 223
Timestamp: 2025-11-07T05:49:00.266526Z
Next consensus number: 8191
Legacy running event hash: f873a2a4d158400d00a1c2fe272e67ba54c3d5cd7243c8483a5af7875b5b965cf1f6319e79655fe0073baaac819f9f7f
Legacy running event mnemonic: vast-process-steak-accuse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1788626791
Root hash: e0171439d8eab21dac3d5fc1766cffa69332fca419c01dad13bf1e6d4c590504112e18e3b7c20e0ffe16e201b28f1f56
(root) VirtualMap state / wood-mango-tissue-round
node0 2m 2.266s 2025-11-07 05:49:01.448 2586 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 2.267s 2025-11-07 05:49:01.449 2587 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 223 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/223 {"round":223,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/223/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 2.268s 2025-11-07 05:49:01.450 2572 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 2.268s 2025-11-07 05:49:01.450 2573 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 196
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 2.268s 2025-11-07 05:49:01.450 2574 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 2.274s 2025-11-07 05:49:01.456 2575 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 2.275s 2025-11-07 05:49:01.457 2576 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 223 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/223 {"round":223,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/223/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 2.301s 2025-11-07 05:49:01.483 2570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 223 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/223
node4 2m 2.302s 2025-11-07 05:49:01.484 2571 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node2 2m 2.317s 2025-11-07 05:49:01.499 2591 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node2 2m 2.321s 2025-11-07 05:49:01.503 2592 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 223
Timestamp: 2025-11-07T05:49:00.266526Z
Next consensus number: 8191
Legacy running event hash: f873a2a4d158400d00a1c2fe272e67ba54c3d5cd7243c8483a5af7875b5b965cf1f6319e79655fe0073baaac819f9f7f
Legacy running event mnemonic: vast-process-steak-accuse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1788626791
Root hash: e0171439d8eab21dac3d5fc1766cffa69332fca419c01dad13bf1e6d4c590504112e18e3b7c20e0ffe16e201b28f1f56
(root) VirtualMap state / wood-mango-tissue-round
node2 2m 2.329s 2025-11-07 05:49:01.511 2593 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 2.329s 2025-11-07 05:49:01.511 2594 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 196
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 2.329s 2025-11-07 05:49:01.511 2595 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 2.335s 2025-11-07 05:49:01.517 2596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 2.336s 2025-11-07 05:49:01.518 2597 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 223 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/223 {"round":223,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/223/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 2.361s 2025-11-07 05:49:01.543 2586 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node1 2m 2.363s 2025-11-07 05:49:01.545 2587 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 223
Timestamp: 2025-11-07T05:49:00.266526Z
Next consensus number: 8191
Legacy running event hash: f873a2a4d158400d00a1c2fe272e67ba54c3d5cd7243c8483a5af7875b5b965cf1f6319e79655fe0073baaac819f9f7f
Legacy running event mnemonic: vast-process-steak-accuse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1788626791
Root hash: e0171439d8eab21dac3d5fc1766cffa69332fca419c01dad13bf1e6d4c590504112e18e3b7c20e0ffe16e201b28f1f56
(root) VirtualMap state / wood-mango-tissue-round
node1 2m 2.372s 2025-11-07 05:49:01.554 2588 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 2.373s 2025-11-07 05:49:01.555 2589 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 196
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 2.373s 2025-11-07 05:49:01.555 2590 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 2.378s 2025-11-07 05:49:01.560 2591 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 2.379s 2025-11-07 05:49:01.561 2592 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 223 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/223 {"round":223,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/223/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 2.392s 2025-11-07 05:49:01.574 2610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 223
node4 2m 2.394s 2025-11-07 05:49:01.576 2611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 223
Timestamp: 2025-11-07T05:49:00.266526Z
Next consensus number: 8191
Legacy running event hash: f873a2a4d158400d00a1c2fe272e67ba54c3d5cd7243c8483a5af7875b5b965cf1f6319e79655fe0073baaac819f9f7f
Legacy running event mnemonic: vast-process-steak-accuse
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1788626791
Root hash: e0171439d8eab21dac3d5fc1766cffa69332fca419c01dad13bf1e6d4c590504112e18e3b7c20e0ffe16e201b28f1f56
(root) VirtualMap state / wood-mango-tissue-round
node4 2m 2.403s 2025-11-07 05:49:01.585 2612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 2.403s 2025-11-07 05:49:01.585 2613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 196
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 2.403s 2025-11-07 05:49:01.585 2614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 2.409s 2025-11-07 05:49:01.591 2615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 2.410s 2025-11-07 05:49:01.592 2616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 223 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/223 {"round":223,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/223/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 1.784s 2025-11-07 05:50:00.966 4020 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 355 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 3m 1.876s 2025-11-07 05:50:01.058 4033 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 355 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 1.885s 2025-11-07 05:50:01.067 4022 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 355 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 1.929s 2025-11-07 05:50:01.111 4031 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 355 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 2.050s 2025-11-07 05:50:01.232 4007 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 355 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 2.080s 2025-11-07 05:50:01.262 4025 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 355 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/355
node0 3m 2.081s 2025-11-07 05:50:01.263 4026 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node2 3m 2.145s 2025-11-07 05:50:01.327 4023 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 355 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/355
node2 3m 2.146s 2025-11-07 05:50:01.328 4024 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node1 3m 2.150s 2025-11-07 05:50:01.332 4034 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 355 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/355
node1 3m 2.151s 2025-11-07 05:50:01.333 4035 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node0 3m 2.165s 2025-11-07 05:50:01.347 4060 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node0 3m 2.167s 2025-11-07 05:50:01.349 4061 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 355
Timestamp: 2025-11-07T05:50:00.053854Z
Next consensus number: 12969
Legacy running event hash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
Legacy running event mnemonic: father-extra-excite-wine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1876023202
Root hash: 8eda315162490851cf3bdc8e4b4613f6c89fa961f98dc98bc197f0323e4168b39eb2359174ad85034700291c95f0119c
(root) VirtualMap state / cave-clay-empty-any
node4 3m 2.171s 2025-11-07 05:50:01.353 4036 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 355 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/355
node4 3m 2.173s 2025-11-07 05:50:01.355 4037 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node0 3m 2.174s 2025-11-07 05:50:01.356 4062 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 2.174s 2025-11-07 05:50:01.356 4063 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 328
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 2.174s 2025-11-07 05:50:01.356 4064 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 2.178s 2025-11-07 05:50:01.360 4010 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 355 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/355
node3 3m 2.179s 2025-11-07 05:50:01.361 4011 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node0 3m 2.184s 2025-11-07 05:50:01.366 4065 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 2.184s 2025-11-07 05:50:01.366 4066 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 355 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/355 {"round":355,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/355/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 2.225s 2025-11-07 05:50:01.407 4056 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node2 3m 2.228s 2025-11-07 05:50:01.410 4059 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 355
Timestamp: 2025-11-07T05:50:00.053854Z
Next consensus number: 12969
Legacy running event hash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
Legacy running event mnemonic: father-extra-excite-wine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1876023202
Root hash: 8eda315162490851cf3bdc8e4b4613f6c89fa961f98dc98bc197f0323e4168b39eb2359174ad85034700291c95f0119c
(root) VirtualMap state / cave-clay-empty-any
node2 3m 2.237s 2025-11-07 05:50:01.419 4060 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 2.237s 2025-11-07 05:50:01.419 4061 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 328
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 2.237s 2025-11-07 05:50:01.419 4062 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 2.243s 2025-11-07 05:50:01.425 4077 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node1 3m 2.245s 2025-11-07 05:50:01.427 4078 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 355
Timestamp: 2025-11-07T05:50:00.053854Z
Next consensus number: 12969
Legacy running event hash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
Legacy running event mnemonic: father-extra-excite-wine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1876023202
Root hash: 8eda315162490851cf3bdc8e4b4613f6c89fa961f98dc98bc197f0323e4168b39eb2359174ad85034700291c95f0119c
(root) VirtualMap state / cave-clay-empty-any
node2 3m 2.246s 2025-11-07 05:50:01.428 4063 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 2.247s 2025-11-07 05:50:01.429 4064 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 355 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/355 {"round":355,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/355/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 2.253s 2025-11-07 05:50:01.435 4079 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 2.253s 2025-11-07 05:50:01.435 4080 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 328
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 2.253s 2025-11-07 05:50:01.435 4081 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 2.262s 2025-11-07 05:50:01.444 4082 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 2.263s 2025-11-07 05:50:01.445 4083 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 355 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/355 {"round":355,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/355/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 3m 2.265s 2025-11-07 05:50:01.447 4075 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node3 3m 2.268s 2025-11-07 05:50:01.450 4045 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 355
node4 3m 2.268s 2025-11-07 05:50:01.450 4076 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 355
Timestamp: 2025-11-07T05:50:00.053854Z
Next consensus number: 12969
Legacy running event hash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
Legacy running event mnemonic: father-extra-excite-wine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1876023202
Root hash: 8eda315162490851cf3bdc8e4b4613f6c89fa961f98dc98bc197f0323e4168b39eb2359174ad85034700291c95f0119c
(root) VirtualMap state / cave-clay-empty-any
node3 3m 2.271s 2025-11-07 05:50:01.453 4046 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 355
Timestamp: 2025-11-07T05:50:00.053854Z
Next consensus number: 12969
Legacy running event hash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
Legacy running event mnemonic: father-extra-excite-wine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1876023202
Root hash: 8eda315162490851cf3bdc8e4b4613f6c89fa961f98dc98bc197f0323e4168b39eb2359174ad85034700291c95f0119c
(root) VirtualMap state / cave-clay-empty-any
node4 3m 2.277s 2025-11-07 05:50:01.459 4077 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 2.277s 2025-11-07 05:50:01.459 4078 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 328
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces
node4 3m 2.277s 2025-11-07 05:50:01.459 4079 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 2.278s 2025-11-07 05:50:01.460 4047 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 2.279s 2025-11-07 05:50:01.461 4048 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 328
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 2.279s 2025-11-07 05:50:01.461 4049 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 3m 2.287s 2025-11-07 05:50:01.469 4080 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 3m 2.287s 2025-11-07 05:50:01.469 4081 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 355 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/355 {"round":355,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/355/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 2.288s 2025-11-07 05:50:01.470 4050 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 2.289s 2025-11-07 05:50:01.471 4051 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 355 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/355 {"round":355,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/355/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 15.254s 2025-11-07 05:50:14.436 4392 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:50:14.434802732Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node0 3m 15.255s 2025-11-07 05:50:14.437 4379 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:50:14.435236721Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node2 3m 15.256s 2025-11-07 05:50:14.438 4381 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:50:14.436370197Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 3m 15.258s 2025-11-07 05:50:14.440 4368 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:50:14.435284583Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node1 4m 1.689s 2025-11-07 05:51:00.871 5640 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 493 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 1.693s 2025-11-07 05:51:00.875 5569 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 493 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 1.720s 2025-11-07 05:51:00.902 5567 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 493 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 1.730s 2025-11-07 05:51:00.912 5584 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 493 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 1.906s 2025-11-07 05:51:01.088 5572 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 493 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/493
node2 4m 1.906s 2025-11-07 05:51:01.088 5573 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node3 4m 1.940s 2025-11-07 05:51:01.122 5587 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 493 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/493
node3 4m 1.940s 2025-11-07 05:51:01.122 5588 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node0 4m 1.943s 2025-11-07 05:51:01.125 5570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 493 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/493
node0 4m 1.944s 2025-11-07 05:51:01.126 5571 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node1 4m 1.963s 2025-11-07 05:51:01.145 5643 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 493 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/493
node1 4m 1.964s 2025-11-07 05:51:01.146 5644 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node2 4m 1.988s 2025-11-07 05:51:01.170 5604 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node2 4m 1.990s 2025-11-07 05:51:01.172 5605 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 493
Timestamp: 2025-11-07T05:51:00.002803862Z
Next consensus number: 16624
Legacy running event hash: 69444f4b2a4539e91058cbd7c21cdf71248a1bb1abd0eafda1d04123bbbfa9f61ab91b81c077e654fa8a7e95c66cdab5
Legacy running event mnemonic: chuckle-truly-emerge-diamond
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -987425334
Root hash: 2ec6def6b8f0a588f99bda7ab9930071e3c23c74097066a699bf4e4e67c031786d0e8890b570a15ab6ea0b6e1dc77aac
(root) VirtualMap state / sugar-candy-tilt-excuse
node2 4m 1.998s 2025-11-07 05:51:01.180 5606 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 1.998s 2025-11-07 05:51:01.180 5607 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 466
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 1.998s 2025-11-07 05:51:01.180 5608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 2.010s 2025-11-07 05:51:01.192 5609 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 2.010s 2025-11-07 05:51:01.192 5610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 493 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/493 {"round":493,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/493/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 2.023s 2025-11-07 05:51:01.205 5619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node0 4m 2.024s 2025-11-07 05:51:01.206 5602 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node3 4m 2.026s 2025-11-07 05:51:01.208 5620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 493
Timestamp: 2025-11-07T05:51:00.002803862Z
Next consensus number: 16624
Legacy running event hash: 69444f4b2a4539e91058cbd7c21cdf71248a1bb1abd0eafda1d04123bbbfa9f61ab91b81c077e654fa8a7e95c66cdab5
Legacy running event mnemonic: chuckle-truly-emerge-diamond
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -987425334
Root hash: 2ec6def6b8f0a588f99bda7ab9930071e3c23c74097066a699bf4e4e67c031786d0e8890b570a15ab6ea0b6e1dc77aac
(root) VirtualMap state / sugar-candy-tilt-excuse
node0 4m 2.027s 2025-11-07 05:51:01.209 5603 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 493
Timestamp: 2025-11-07T05:51:00.002803862Z
Next consensus number: 16624
Legacy running event hash: 69444f4b2a4539e91058cbd7c21cdf71248a1bb1abd0eafda1d04123bbbfa9f61ab91b81c077e654fa8a7e95c66cdab5
Legacy running event mnemonic: chuckle-truly-emerge-diamond
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -987425334
Root hash: 2ec6def6b8f0a588f99bda7ab9930071e3c23c74097066a699bf4e4e67c031786d0e8890b570a15ab6ea0b6e1dc77aac
(root) VirtualMap state / sugar-candy-tilt-excuse
node0 4m 2.034s 2025-11-07 05:51:01.216 5604 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 2.034s 2025-11-07 05:51:01.216 5605 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 466
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 2.034s 2025-11-07 05:51:01.216 5606 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 2.034s 2025-11-07 05:51:01.216 5621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 2.034s 2025-11-07 05:51:01.216 5622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 466
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 2.034s 2025-11-07 05:51:01.216 5623 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 2.046s 2025-11-07 05:51:01.228 5607 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 2.046s 2025-11-07 05:51:01.228 5608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 493 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/493 {"round":493,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/493/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 2.046s 2025-11-07 05:51:01.228 5624 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 2.047s 2025-11-07 05:51:01.229 5625 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 493 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/493 {"round":493,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/493/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 2.049s 2025-11-07 05:51:01.231 5683 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 493
node1 4m 2.051s 2025-11-07 05:51:01.233 5684 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 493
Timestamp: 2025-11-07T05:51:00.002803862Z
Next consensus number: 16624
Legacy running event hash: 69444f4b2a4539e91058cbd7c21cdf71248a1bb1abd0eafda1d04123bbbfa9f61ab91b81c077e654fa8a7e95c66cdab5
Legacy running event mnemonic: chuckle-truly-emerge-diamond
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -987425334
Root hash: 2ec6def6b8f0a588f99bda7ab9930071e3c23c74097066a699bf4e4e67c031786d0e8890b570a15ab6ea0b6e1dc77aac
(root) VirtualMap state / sugar-candy-tilt-excuse
node1 4m 2.059s 2025-11-07 05:51:01.241 5685 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 2.059s 2025-11-07 05:51:01.241 5686 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 466
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 2.060s 2025-11-07 05:51:01.242 5687 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 2.071s 2025-11-07 05:51:01.253 5688 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 2.071s 2025-11-07 05:51:01.253 5689 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 493 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/493 {"round":493,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/493/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 2.089s 2025-11-07 05:52:01.271 7261 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 632 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 2.100s 2025-11-07 05:52:01.282 7172 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 632 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 2.154s 2025-11-07 05:52:01.336 7166 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 632 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 2.162s 2025-11-07 05:52:01.344 7183 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 632 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 2.295s 2025-11-07 05:52:01.477 7169 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 632 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/632
node2 5m 2.295s 2025-11-07 05:52:01.477 7170 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node3 5m 2.309s 2025-11-07 05:52:01.491 7186 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 632 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/632
node3 5m 2.309s 2025-11-07 05:52:01.491 7187 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node0 5m 2.365s 2025-11-07 05:52:01.547 7175 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 632 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/632
node0 5m 2.366s 2025-11-07 05:52:01.548 7176 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node2 5m 2.372s 2025-11-07 05:52:01.554 7201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node2 5m 2.374s 2025-11-07 05:52:01.556 7202 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 632
Timestamp: 2025-11-07T05:52:00.416695Z
Next consensus number: 19944
Legacy running event hash: 3a1057c72f662d1475d0ca7bb219f04883d24930b49aea6cc7dbc60df64f314eff1c98e9bf6150c0f347df133032509c
Legacy running event mnemonic: dragon-wedding-under-penalty
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400548510
Root hash: 62b2ae74c5786e907d3237c56390a47bc1136f85b2398cddc3ba3485654fc189b2e8b875945090539902ab096009a2c0
(root) VirtualMap state / film-cover-extra-spirit
node1 5m 2.379s 2025-11-07 05:52:01.561 7264 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 632 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/632
node1 5m 2.380s 2025-11-07 05:52:01.562 7265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node2 5m 2.380s 2025-11-07 05:52:01.562 7203 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 2.380s 2025-11-07 05:52:01.562 7204 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 605
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 2.380s 2025-11-07 05:52:01.562 7205 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 2.383s 2025-11-07 05:52:01.565 7206 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 2.383s 2025-11-07 05:52:01.565 7207 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 632 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/632 {"round":632,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/632/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 2.385s 2025-11-07 05:52:01.567 7208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/2
node3 5m 2.399s 2025-11-07 05:52:01.581 7226 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node3 5m 2.402s 2025-11-07 05:52:01.584 7227 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 632
Timestamp: 2025-11-07T05:52:00.416695Z
Next consensus number: 19944
Legacy running event hash: 3a1057c72f662d1475d0ca7bb219f04883d24930b49aea6cc7dbc60df64f314eff1c98e9bf6150c0f347df133032509c
Legacy running event mnemonic: dragon-wedding-under-penalty
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400548510
Root hash: 62b2ae74c5786e907d3237c56390a47bc1136f85b2398cddc3ba3485654fc189b2e8b875945090539902ab096009a2c0
(root) VirtualMap state / film-cover-extra-spirit
node3 5m 2.411s 2025-11-07 05:52:01.593 7228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+51+04.452326284Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 2.411s 2025-11-07 05:52:01.593 7229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 605
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+51+04.452326284Z_seq1_minr474_maxr5474_orgn0.pces
node3 5m 2.411s 2025-11-07 05:52:01.593 7230 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 5m 2.414s 2025-11-07 05:52:01.596 7231 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 2.414s 2025-11-07 05:52:01.596 7232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 632 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/632 {"round":632,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/632/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 2.416s 2025-11-07 05:52:01.598 7233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/2
node0 5m 2.446s 2025-11-07 05:52:01.628 7207 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node0 5m 2.448s 2025-11-07 05:52:01.630 7208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 632
Timestamp: 2025-11-07T05:52:00.416695Z
Next consensus number: 19944
Legacy running event hash: 3a1057c72f662d1475d0ca7bb219f04883d24930b49aea6cc7dbc60df64f314eff1c98e9bf6150c0f347df133032509c
Legacy running event mnemonic: dragon-wedding-under-penalty
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400548510
Root hash: 62b2ae74c5786e907d3237c56390a47bc1136f85b2398cddc3ba3485654fc189b2e8b875945090539902ab096009a2c0
(root) VirtualMap state / film-cover-extra-spirit
node0 5m 2.455s 2025-11-07 05:52:01.637 7209 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+51+04.450512141Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 2.455s 2025-11-07 05:52:01.637 7211 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 605
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+51+04.450512141Z_seq1_minr474_maxr5474_orgn0.pces
node0 5m 2.456s 2025-11-07 05:52:01.638 7219 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 2.458s 2025-11-07 05:52:01.640 7220 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 2.458s 2025-11-07 05:52:01.640 7221 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 632 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/632 {"round":632,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/632/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 2.460s 2025-11-07 05:52:01.642 7222 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/2
node1 5m 2.467s 2025-11-07 05:52:01.649 7304 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 632
node1 5m 2.469s 2025-11-07 05:52:01.651 7305 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 632
Timestamp: 2025-11-07T05:52:00.416695Z
Next consensus number: 19944
Legacy running event hash: 3a1057c72f662d1475d0ca7bb219f04883d24930b49aea6cc7dbc60df64f314eff1c98e9bf6150c0f347df133032509c
Legacy running event mnemonic: dragon-wedding-under-penalty
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1400548510
Root hash: 62b2ae74c5786e907d3237c56390a47bc1136f85b2398cddc3ba3485654fc189b2e8b875945090539902ab096009a2c0
(root) VirtualMap state / film-cover-extra-spirit
node1 5m 2.476s 2025-11-07 05:52:01.658 7306 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+51+04.341116730Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 2.476s 2025-11-07 05:52:01.658 7307 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 605
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+51+04.341116730Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 2.476s 2025-11-07 05:52:01.658 7308 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 2.479s 2025-11-07 05:52:01.661 7309 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 2.479s 2025-11-07 05:52:01.661 7310 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 632 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/632 {"round":632,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/632/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 2.481s 2025-11-07 05:52:01.663 7311 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/2
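
The .pces paths in these entries follow a visible naming convention: a UTC creation timestamp with ':' written as '+', then _seq<N> (file sequence number), _minr<N> and _maxr<N> (lowest and highest birth round the file may contain), and _orgn<N> (origin round). A sketch of decoding one name under exactly that assumption, inferred from the log rather than from platform documentation:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Illustrative parser for the .pces naming convention visible in this log.
    public final class PcesFileNameParser {
        private static final Pattern NAME = Pattern.compile(
                "(.+Z)_seq(\\d+)_minr(\\d+)_maxr(\\d+)_orgn(\\d+)\\.pces");

        public static void main(String[] args) {
            String name = "2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces";
            Matcher m = NAME.matcher(name);
            if (m.matches()) {
                String created = m.group(1).replace('+', ':'); // '+' stands in for ':' on disk
                System.out.printf("created=%s seq=%s minBirthRound=%s maxBirthRound=%s origin=%s%n",
                        created, m.group(2), m.group(3), m.group(4), m.group(5));
            }
        }
    }
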
node4 5m 57.513s 2025-11-07 05:52:56.695 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 5m 57.607s 2025-11-07 05:52:56.789 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 57.625s 2025-11-07 05:52:56.807 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 57.743s 2025-11-07 05:52:56.925 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 57.775s 2025-11-07 05:52:56.957 5 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 59.277s 2025-11-07 05:52:58.459 6 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1501ms
node4 5m 59.287s 2025-11-07 05:52:58.469 7 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 59.290s 2025-11-07 05:52:58.472 8 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 59.332s 2025-11-07 05:52:58.514 9 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listening on port: 9999
node4 5m 59.394s 2025-11-07 05:52:58.576 10 WARN STARTUP <main> CryptoStatic: There are no keys on disk; ad hoc keys will be generated, but this is incompatible with DAB.
node4 5m 59.395s 2025-11-07 05:52:58.577 11 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 6m 1.491s 2025-11-07 05:53:00.673 12 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 6m 1.588s 2025-11-07 05:53:00.770 15 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6m 1.596s 2025-11-07 05:53:00.778 16 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/355/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/223/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/95/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2/SignedState.swh
node4 6m 1.596s 2025-11-07 05:53:00.778 17 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 6m 1.597s 2025-11-07 05:53:00.779 18 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/355/SignedState.swh
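
node4 found four SignedState.swh files and chose round 355, the highest-numbered round directory. A sketch of that "load the latest state" selection over the data/saved/<app>/<nodeId>/<swirldId>/<round>/ layout shown above; the walk below is illustrative and is not the platform's StartupStateUtils:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Comparator;
    import java.util.Optional;
    import java.util.stream.Stream;

    // Illustrative "pick the newest saved state" scan.
    public final class LatestStateFinder {
        /** Returns the SignedState.swh under the highest-numbered round directory, if present. */
        public static Optional<Path> latest(Path swirldDir) throws IOException {
            try (Stream<Path> rounds = Files.list(swirldDir)) {
                return rounds
                        .filter(p -> p.getFileName().toString().matches("\\d+"))
                        .max(Comparator.comparingLong(p -> Long.parseLong(p.getFileName().toString())))
                        .map(p -> p.resolve("SignedState.swh"))
                        .filter(Files::exists);
            }
        }

        public static void main(String[] args) throws IOException {
            // e.g. data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123
            System.out.println(latest(Path.of(args[0])));
        }
    }
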
node4 6m 1.607s 2025-11-07 05:53:00.789 19 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node3 6m 1.711s 2025-11-07 05:53:00.893 8747 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 769 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 1.724s 2025-11-07 05:53:00.906 8736 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 769 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 1.733s 2025-11-07 05:53:00.915 29 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 6m 1.790s 2025-11-07 05:53:00.972 8726 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 769 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 1.869s 2025-11-07 05:53:01.051 8937 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 769 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 1.926s 2025-11-07 05:53:01.108 8729 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 769 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/769
node0 6m 1.926s 2025-11-07 05:53:01.108 8730 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node1 6m 1.939s 2025-11-07 05:53:01.121 8940 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 769 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/769
node1 6m 1.939s 2025-11-07 05:53:01.121 8941 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node2 6m 1.996s 2025-11-07 05:53:01.178 8739 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 769 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/769
node2 6m 1.997s 2025-11-07 05:53:01.179 8740 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node0 6m 2.007s 2025-11-07 05:53:01.189 8761 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node0 6m 2.009s 2025-11-07 05:53:01.191 8762 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 769
Timestamp: 2025-11-07T05:53:00.029184Z
Next consensus number: 23179
Legacy running event hash: 36faeda6b19eaea3853d7f4ab644d062e2716e81fde752c8cc797ea252f4704085726a7a0794d1c52cf9acae168e2fed
Legacy running event mnemonic: tent-ensure-slush-close
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1405486853
Root hash: 68202b8d46e5e0185d7f592db72f9c9fe445030ae1284882aa31235354f81239472673a54264586e78806a34e21de18b
(root) VirtualMap state / advice-immune-syrup-wonder
node0 6m 2.017s 2025-11-07 05:53:01.199 8763 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+51+04.450512141Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 2.017s 2025-11-07 05:53:01.199 8764 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 742
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+51+04.450512141Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 2.017s 2025-11-07 05:53:01.199 8765 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 2.022s 2025-11-07 05:53:01.204 8766 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 2.022s 2025-11-07 05:53:01.204 8980 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node0 6m 2.023s 2025-11-07 05:53:01.205 8767 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 769 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/769 {"round":769,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/769/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 2.024s 2025-11-07 05:53:01.206 8768 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/95
node1 6m 2.024s 2025-11-07 05:53:01.206 8981 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 769
Timestamp: 2025-11-07T05:53:00.029184Z
Next consensus number: 23179
Legacy running event hash: 36faeda6b19eaea3853d7f4ab644d062e2716e81fde752c8cc797ea252f4704085726a7a0794d1c52cf9acae168e2fed
Legacy running event mnemonic: tent-ensure-slush-close
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1405486853
Root hash: 68202b8d46e5e0185d7f592db72f9c9fe445030ae1284882aa31235354f81239472673a54264586e78806a34e21de18b
(root) VirtualMap state / advice-immune-syrup-wonder
node1 6m 2.033s 2025-11-07 05:53:01.215 8982 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+51+04.341116730Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 6m 2.033s 2025-11-07 05:53:01.215 8983 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 742
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+51+04.341116730Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 2.033s 2025-11-07 05:53:01.215 8984 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 2.038s 2025-11-07 05:53:01.220 8985 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 2.038s 2025-11-07 05:53:01.220 8986 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 769 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/769 {"round":769,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/769/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 2.040s 2025-11-07 05:53:01.222 8987 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/95
node3 6m 2.069s 2025-11-07 05:53:01.251 8760 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 769 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/769
node3 6m 2.070s 2025-11-07 05:53:01.252 8761 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node2 6m 2.078s 2025-11-07 05:53:01.260 8771 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node2 6m 2.080s 2025-11-07 05:53:01.262 8772 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 769
Timestamp: 2025-11-07T05:53:00.029184Z
Next consensus number: 23179
Legacy running event hash: 36faeda6b19eaea3853d7f4ab644d062e2716e81fde752c8cc797ea252f4704085726a7a0794d1c52cf9acae168e2fed
Legacy running event mnemonic: tent-ensure-slush-close
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1405486853
Root hash: 68202b8d46e5e0185d7f592db72f9c9fe445030ae1284882aa31235354f81239472673a54264586e78806a34e21de18b
(root) VirtualMap state / advice-immune-syrup-wonder
node2 6m 2.087s 2025-11-07 05:53:01.269 8773 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 6m 2.087s 2025-11-07 05:53:01.269 8774 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 742
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 2.087s 2025-11-07 05:53:01.269 8775 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 2.091s 2025-11-07 05:53:01.273 8784 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 2.092s 2025-11-07 05:53:01.274 8785 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 769 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/769 {"round":769,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/769/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 2.094s 2025-11-07 05:53:01.276 8786 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/95
node3 6m 2.151s 2025-11-07 05:53:01.333 8803 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 769
node3 6m 2.154s 2025-11-07 05:53:01.336 8804 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 769
Timestamp: 2025-11-07T05:53:00.029184Z
Next consensus number: 23179
Legacy running event hash: 36faeda6b19eaea3853d7f4ab644d062e2716e81fde752c8cc797ea252f4704085726a7a0794d1c52cf9acae168e2fed
Legacy running event mnemonic: tent-ensure-slush-close
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1405486853
Root hash: 68202b8d46e5e0185d7f592db72f9c9fe445030ae1284882aa31235354f81239472673a54264586e78806a34e21de18b
(root) VirtualMap state / advice-immune-syrup-wonder
node3 6m 2.161s 2025-11-07 05:53:01.343 8805 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+51+04.452326284Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 2.161s 2025-11-07 05:53:01.343 8806 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 742
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+51+04.452326284Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 2.161s 2025-11-07 05:53:01.343 8807 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 2.166s 2025-11-07 05:53:01.348 8808 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 2.167s 2025-11-07 05:53:01.349 8809 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 769 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/769 {"round":769,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/769/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 2.168s 2025-11-07 05:53:01.350 8810 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/95
node4 6m 2.575s 2025-11-07 05:53:01.757 31 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 6m 2.580s 2025-11-07 05:53:01.762 32 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":355,"consensusTimestamp":"2025-11-07T05:50:00.053854Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 6m 2.585s 2025-11-07 05:53:01.767 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6m 2.586s 2025-11-07 05:53:01.768 38 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 6m 2.592s 2025-11-07 05:53:01.774 39 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 6m 2.602s 2025-11-07 05:53:01.784 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6m 2.605s 2025-11-07 05:53:01.787 41 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6m 3.724s 2025-11-07 05:53:02.906 42 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26107919]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=250350, randomLong=996443306886336631, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=17720, randomLong=5435691987828708375, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1559309, data=35, exception=null]
OS Health Check Report - Complete (took 1031 ms)
node4 6m 3.761s 2025-11-07 05:53:02.943 43 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6m 3.898s 2025-11-07 05:53:03.080 44 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 385
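
The compaction message shows the file was provisionally named with a generous upper bound (maxr501) and, once its actual contents were inspected on startup, the span was tightened to the highest birth round really present (385). Assuming the tightened bound is reflected by rewriting the maxr token in the name, a sketch of that rename; this is hypothetical, since the log does not show the resulting filename:

    // Hypothetical rename: tighten the maxr token to the bound reported by span compaction.
    public final class PcesSpanCompaction {
        static String compact(String name, long actualMaxBirthRound) {
            return name.replaceFirst("_maxr\\d+", "_maxr" + actualMaxBirthRound);
        }

        public static void main(String[] args) {
            System.out.println(compact(
                    "2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr501_orgn0.pces", 385));
            // -> 2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr385_orgn0.pces
        }
    }
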
node4 6m 3.901s 2025-11-07 05:53:03.083 45 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6m 3.903s 2025-11-07 05:53:03.085 46 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 6m 4.001s 2025-11-07 05:53:03.183 47 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "IqtR8Q==", "port": 30124 }, { "ipAddressV4": "CoAAZA==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "Iiz5zQ==", "port": 30125 }, { "ipAddressV4": "CoAAaA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHfBAg==", "port": 30126 }, { "ipAddressV4": "CoAAZg==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I95M3w==", "port": 30127 }, { "ipAddressV4": "CoAAZQ==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjcdWA==", "port": 30128 }, { "ipAddressV4": "CoAAZw==", "port": 30128 }] }] }
node4 6m 4.029s 2025-11-07 05:53:03.211 48 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long 2684573279949512893.
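
The roster block above is printed as protobuf-style JSON, so bytes fields (the gossipCaCertificate blobs and the ipAddressV4 values) appear base64-encoded; "CoAAZA==" in the first entry, for example, decodes to the four octets 10.128.0.100. A small sketch of recovering dotted-quad addresses from that encoding:

    import java.util.Base64;

    // The protobuf-JSON rendering base64-encodes bytes fields; ipAddressV4 is four raw octets.
    public final class RosterAddressDecoder {
        public static String toDottedQuad(String base64) {
            byte[] b = Base64.getDecoder().decode(base64);
            return (b[0] & 0xFF) + "." + (b[1] & 0xFF) + "." + (b[2] & 0xFF) + "." + (b[3] & 0xFF);
        }

        public static void main(String[] args) {
            System.out.println(toDottedQuad("CoAAZA==")); // 10.128.0.100
            System.out.println(toDottedQuad("IqtR8Q==")); // 34.171.81.241
        }
    }
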
node4 6m 4.030s 2025-11-07 05:53:03.212 49 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 354 rounds handled.
node4 6m 4.030s 2025-11-07 05:53:03.212 50 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 4.030s 2025-11-07 05:53:03.212 51 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 4.077s 2025-11-07 05:53:03.259 52 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 355
Timestamp: 2025-11-07T05:50:00.053854Z
Next consensus number: 12969
Legacy running event hash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
Legacy running event mnemonic: father-extra-excite-wine
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1876023202
Root hash: 8eda315162490851cf3bdc8e4b4613f6c89fa961f98dc98bc197f0323e4168b39eb2359174ad85034700291c95f0119c
(root) VirtualMap state / cave-clay-empty-any
node4 6m 4.083s 2025-11-07 05:53:03.265 54 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Starting the ReconnectController
node4 6m 4.316s 2025-11-07 05:53:03.498 55 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 5a9dbd787a762fdc7b414e129bd23fe7b92ac7a0a543ef4d82c82e8f338eb3ff21836f6506adbb0362409613bf329f8c
node4 6m 4.326s 2025-11-07 05:53:03.508 56 INFO STARTUP <platformForkJoinThread-3> Shadowgraph: Shadowgraph starting from expiration threshold 328
node4 6m 4.332s 2025-11-07 05:53:03.514 58 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6m 4.333s 2025-11-07 05:53:03.515 59 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6m 4.335s 2025-11-07 05:53:03.517 60 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6m 4.338s 2025-11-07 05:53:03.520 61 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6m 4.339s 2025-11-07 05:53:03.521 62 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6m 4.340s 2025-11-07 05:53:03.522 63 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6m 4.349s 2025-11-07 05:53:03.531 64 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 328
node4 6m 4.356s 2025-11-07 05:53:03.538 65 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 210.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6m 4.625s 2025-11-07 05:53:03.807 66 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:312c00b46fc6 BR:353), num remaining: 4
node4 6m 4.627s 2025-11-07 05:53:03.809 67 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:50318d0e7ef5 BR:353), num remaining: 3
node4 6m 4.627s 2025-11-07 05:53:03.809 68 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:32eb67dceea6 BR:353), num remaining: 2
node4 6m 4.628s 2025-11-07 05:53:03.810 69 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:4a9b1f315b80 BR:353), num remaining: 1
node4 6m 4.629s 2025-11-07 05:53:03.811 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:8ae646775d27 BR:353), num remaining: 0
node4 6m 4.849s 2025-11-07 05:53:04.031 232 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 2,039 preconsensus events with max birth round 385. These events contained 2,837 transactions. 29 rounds reached consensus spanning 12.7 seconds of consensus time. The latest round to reach consensus is round 384. Replay took 497.0 milliseconds.
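
The replay summary implies a throughput of roughly 2,039 events / 0.497 s, about 4,100 events per second, and the 29 replayed rounds covered 12.7 s of consensus time, so replay ran roughly 25x faster than real time. A quick check of that arithmetic:

    // Quick arithmetic check of the PcesReplayer summary above.
    public final class ReplayStats {
        public static void main(String[] args) {
            double events = 2_039, transactions = 2_837;
            double replaySeconds = 0.497, consensusSeconds = 12.7;
            System.out.printf("events/s: %.0f%n", events / replaySeconds);       // ~4103
            System.out.printf("txns/s:   %.0f%n", transactions / replaySeconds); // ~5708
            System.out.printf("speed-up: %.1fx real time%n", consensusSeconds / replaySeconds); // ~25.6x
        }
    }
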
node4 6m 4.851s 2025-11-07 05:53:04.033 233 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6m 4.853s 2025-11-07 05:53:04.035 234 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 493.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6m 5.734s 2025-11-07 05:53:04.916 335 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Preparing for reconnect, stopping gossip
node4 6m 5.734s 2025-11-07 05:53:04.916 336 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328] remote ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677]
node4 6m 5.734s 2025-11-07 05:53:04.916 337 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328] remote ev=EventWindow[latestConsensusRound=777,ancientThreshold=750,expiredThreshold=676]
node4 6m 5.734s 2025-11-07 05:53:04.916 339 INFO RECONNECT <<platform-core: SyncProtocolWith3 4 to 3>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328] remote ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677]
node4 6m 5.734s 2025-11-07 05:53:04.916 338 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328] remote ev=EventWindow[latestConsensusRound=777,ancientThreshold=750,expiredThreshold=676]
node4 6m 5.734s 2025-11-07 05:53:04.916 340 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Preparing for reconnect, start clearing queues
node4 6m 5.734s 2025-11-07 05:53:04.916 341 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 880.0 ms in OBSERVING. Now in BEHIND
node0 6m 5.804s 2025-11-07 05:53:04.986 8868 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=777,ancientThreshold=750,expiredThreshold=676] remote ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328]
node1 6m 5.804s 2025-11-07 05:53:04.986 9079 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=777,ancientThreshold=750,expiredThreshold=676] remote ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328]
node2 6m 5.804s 2025-11-07 05:53:04.986 8882 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328]
node3 6m 5.805s 2025-11-07 05:53:04.987 8909 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=778,ancientThreshold=751,expiredThreshold=677] remote ev=EventWindow[latestConsensusRound=384,ancientThreshold=357,expiredThreshold=328]
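
The event windows in these lines make the fallen-behind verdict readable directly: node4's latest consensus round (384) is below every peer's expiredThreshold (676-677), meaning peers have already expired the events node4 would need to catch up by gossip, so a full reconnect is the only option. A minimal sketch of that comparison, assuming this reading of the fields; the platform's exact rule may differ:

    // Field names mirror the EventWindow output above; the predicate is one plausible
    // reading of SELF_FALLEN_BEHIND, not necessarily the platform's exact rule.
    record EventWindow(long latestConsensusRound, long ancientThreshold, long expiredThreshold) {}

    public final class FallenBehindCheck {
        /** Self is behind if its newest rounds are already expired from the peer's view. */
        static boolean selfFallenBehind(EventWindow self, EventWindow peer) {
            return self.latestConsensusRound() < peer.expiredThreshold();
        }

        public static void main(String[] args) {
            EventWindow self = new EventWindow(384, 357, 328); // node4, from the log
            EventWindow peer = new EventWindow(778, 751, 677); // node2/node3, from the log
            System.out.println(selfFallenBehind(self, peer));  // true -> reconnect required
        }
    }
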
node4 6m 5.887s 2025-11-07 05:53:05.069 342 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Queues have been cleared
node4 6m 5.888s 2025-11-07 05:53:05.070 343 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Waiting for a state to be obtained from a peer
node2 6m 5.981s 2025-11-07 05:53:05.163 8883 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":2,"otherNodeId":4,"round":778} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node2 6m 5.982s 2025-11-07 05:53:05.164 8884 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: The following state will be sent to the learner:
Round: 778
Timestamp: 2025-11-07T05:53:03.929197Z
Next consensus number: 23396
Legacy running event hash: 6bcfe06c21392ba636ea29d65514ea6c86c7cc7e0373525abf607c3b8204b3a06c5e03902e48f3f2c911487debd071d5
Legacy running event mnemonic: ketchup-verb-cheese-inquiry
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2138287084
Root hash: 9ddd34e1ff113d76786fcea006fdbe58723b253fd3e942e52d8a6d1ae0279fd7c6f04645b8291f572baa7097d3b8944a
(root) VirtualMap state / dutch-order-bulb-put
node2 6m 5.983s 2025-11-07 05:53:05.165 8885 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Sending signatures from nodes 0, 2, 3 (signing weight = 37500000000/50000000000) for state hash 9ddd34e1ff113d76786fcea006fdbe58723b253fd3e942e52d8a6d1ae0279fd7c6f04645b8291f572baa7097d3b8944a
node2 6m 5.983s 2025-11-07 05:53:05.165 8886 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Starting synchronization in the role of the sender.
node4 6m 6.051s 2025-11-07 05:53:05.233 344 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStatePeerProtocol: Starting reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":384} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6m 6.052s 2025-11-07 05:53:05.234 345 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStateLearner: Receiving signed state signatures
node4 6m 6.053s 2025-11-07 05:53:05.235 346 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStateLearner: Received signatures from nodes 0, 2, 3
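The teacher advertises "signing weight = 37500000000/50000000000" before sending the state: the learner only trusts a state whose signatures carry enough of the total roster weight. A sketch of that check, assuming a strict-majority threshold and a uniform 12.5B weight per signing node (both assumptions; the log shows only the totals, and the platform may require a different fraction).

```java
import java.util.Map;

// Sketch of the weight check implied by "signing weight = 37500000000/50000000000".
// The strict-majority threshold and the per-node weights are assumptions made
// for illustration; the helper names are invented.
final class SignatureWeightCheck {
    static boolean isMajority(long signedWeight, long totalWeight) {
        // Strict majority, written to avoid overflow on very large weights.
        return signedWeight > totalWeight - signedWeight;
    }

    public static void main(String[] args) {
        Map<Integer, Long> weightByNode =
                Map.of(0, 12_500_000_000L, 2, 12_500_000_000L, 3, 12_500_000_000L);
        long signed = weightByNode.values().stream().mapToLong(Long::longValue).sum();
        long total = 50_000_000_000L; // full roster weight from the log line
        System.out.printf("signed=%d/%d majority=%b%n", signed, total, isMajority(signed, total));
    }
}
```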
node2 6m 6.104s 2025-11-07 05:53:05.286 8915 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node2 6m 6.114s 2025-11-07 05:53:05.296 8916 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3750e4a2 start run()
node4 6m 6.277s 2025-11-07 05:53:05.459 373 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls receiveTree()
node4 6m 6.278s 2025-11-07 05:53:05.460 374 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: synchronizing tree
node4 6m 6.279s 2025-11-07 05:53:05.461 375 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node4 6m 6.287s 2025-11-07 05:53:05.469 376 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@695845f8 start run()
node4 6m 6.344s 2025-11-07 05:53:05.526 377 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 4 -> 4, lastLeafPath: 8 -> 8
node4 6m 6.345s 2025-11-07 05:53:05.527 378 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6m 6.489s 2025-11-07 05:53:05.671 379 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 6.490s 2025-11-07 05:53:05.672 380 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 6.490s 2025-11-07 05:53:05.672 381 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 6.491s 2025-11-07 05:53:05.673 382 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 6.491s 2025-11-07 05:53:05.673 383 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 6.491s 2025-11-07 05:53:05.673 384 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 6.491s 2025-11-07 05:53:05.673 385 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node4 6m 6.514s 2025-11-07 05:53:05.696 395 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 6.515s 2025-11-07 05:53:05.697 397 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 6.515s 2025-11-07 05:53:05.697 398 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 6.515s 2025-11-07 05:53:05.697 399 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 6.515s 2025-11-07 05:53:05.697 400 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@695845f8 finish run()
node4 6m 6.516s 2025-11-07 05:53:05.698 401 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route []
node4 6m 6.517s 2025-11-07 05:53:05.699 402 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: synchronization complete
node4 6m 6.517s 2025-11-07 05:53:05.699 403 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls initialize()
node4 6m 6.518s 2025-11-07 05:53:05.700 404 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: initializing tree
node4 6m 6.518s 2025-11-07 05:53:05.700 405 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: initialization complete
node4 6m 6.518s 2025-11-07 05:53:05.700 406 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls hash()
node4 6m 6.518s 2025-11-07 05:53:05.700 407 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: hashing tree
node4 6m 6.518s 2025-11-07 05:53:05.700 408 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: hashing complete
node4 6m 6.518s 2025-11-07 05:53:05.700 409 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner calls logStatistics()
node4 6m 6.521s 2025-11-07 05:53:05.703 410 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.23900000000000002,"hashTimeInSeconds":0.0,"initializationTimeInSeconds":0.0,"totalNodes":9,"leafNodes":5,"redundantLeafNodes":2,"internalNodes":4,"redundantInternalNodes":0} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 6.521s 2025-11-07 05:53:05.703 411 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=9; transfersFromLearner=8; internalHashes=3; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=5; leafCleanHashes=2; leafData=5; leafCleanData=2
node4 6m 6.522s 2025-11-07 05:53:05.704 412 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> LearningSynchronizer: learner is done synchronizing
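Entries like "Finished synchronization" above end with a machine-readable JSON payload followed by the payload class in brackets. A small sketch of pulling those two pieces back out of a raw line, relying only on the `{...} [com.swirlds.logging.legacy.payload.X]` convention visible in this log; the parsing approach is an illustration, not a platform utility.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts the trailing JSON payload and payload class name from log lines
// following the convention seen above. The regex is mine, not platform code.
final class PayloadExtractor {
    private static final Pattern PAYLOAD = Pattern.compile("(\\{.*\\})\\s+\\[([\\w.]+)\\]\\s*$");

    public static void main(String[] args) {
        String line = "LearningSynchronizer: Finished synchronization "
                + "{\"timeInSeconds\":0.239,\"totalNodes\":9,\"leafNodes\":5} "
                + "[com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]";
        Matcher m = PAYLOAD.matcher(line);
        if (m.find()) {
            System.out.println("class = " + m.group(2));
            System.out.println("json  = " + m.group(1)); // hand off to any JSON parser
        }
    }
}
```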
node4 6m 6.523s 2025-11-07 05:53:05.705 413 INFO STARTUP <<platform-core: SyncProtocolWith2 4 to 2>> ConsistencyTestingToolState: New State Constructed.
node4 6m 6.528s 2025-11-07 05:53:05.710 414 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStateLearner: Reconnect data usage report {"dataMegabytes":0.005864143371582031} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node2 6m 6.560s 2025-11-07 05:53:05.742 8920 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@3750e4a2 finish run()
node2 6m 6.560s 2025-11-07 05:53:05.742 8921 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> TeachingSynchronizer: finished sending tree
node2 6m 6.562s 2025-11-07 05:53:05.744 8924 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Finished synchronization in the role of the sender.
node2 6m 6.600s 2025-11-07 05:53:05.782 8925 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> ReconnectStateTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":2,"otherNodeId":4,"round":778} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 6.637s 2025-11-07 05:53:05.819 415 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStatePeerProtocol: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":2,"round":778} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 6.638s 2025-11-07 05:53:05.820 416 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> ReconnectStatePeerProtocol: Information for state received during reconnect:
Round: 778
Timestamp: 2025-11-07T05:53:03.929197Z
Next consensus number: 23396
Legacy running event hash: 6bcfe06c21392ba636ea29d65514ea6c86c7cc7e0373525abf607c3b8204b3a06c5e03902e48f3f2c911487debd071d5
Legacy running event mnemonic: ketchup-verb-cheese-inquiry
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2138287084
Root hash: 9ddd34e1ff113d76786fcea006fdbe58723b253fd3e942e52d8a6d1ae0279fd7c6f04645b8291f572baa7097d3b8944a
(root) VirtualMap state / dutch-order-bulb-put
node4 6m 6.639s 2025-11-07 05:53:05.821 417 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: A state was obtained from a peer
node4 6m 6.641s 2025-11-07 05:53:05.823 418 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: The state obtained from a peer was validated
node4 6m 6.642s 2025-11-07 05:53:05.824 420 DEBUG RECONNECT <<platform-core: reconnectController>> ReconnectController: `loadState`: reloading state
node4 6m 6.642s 2025-11-07 05:53:05.824 421 INFO STARTUP <<platform-core: reconnectController>> ConsistencyTestingToolState: State initialized with state long 7244582931102900703.
node4 6m 6.643s 2025-11-07 05:53:05.825 422 INFO STARTUP <<platform-core: reconnectController>> ConsistencyTestingToolState: State initialized with 777 rounds handled.
node4 6m 6.643s 2025-11-07 05:53:05.825 423 INFO STARTUP <<platform-core: reconnectController>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 6.643s 2025-11-07 05:53:05.825 424 INFO STARTUP <<platform-core: reconnectController>> TransactionHandlingHistory: Log file found. Parsing previous history
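After loading the received state, the consistency tool finds its previous ConsistencyTestLog.csv and re-parses it, so that rounds handled again after reconnect can be checked against what was recorded before the node fell behind. A sketch of that replay under a big assumption: the two-column "round,valueAfterRound" layout below is invented for illustration, since the real CSV schema is not shown in this log.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Sketch of re-reading a per-round history file after reconnect. The CSV
// schema here is an assumption; only the file path comes from the log.
final class HistoryReplay {
    public static void main(String[] args) throws IOException {
        Path log = Path.of("data/saved/consistency-test/4/ConsistencyTestLog.csv");
        if (!Files.exists(log)) {
            System.out.println("No log file found, starting fresh history");
            return;
        }
        Map<Long, Long> valueAfterRound = new HashMap<>();
        for (String line : Files.readAllLines(log)) {
            String[] cols = line.split(",");
            valueAfterRound.put(Long.parseLong(cols[0].trim()), Long.parseLong(cols[1].trim()));
        }
        // A round re-handled after reconnect must reproduce the recorded value.
        System.out.println("rounds on record: " + valueAfterRound.size());
    }
}
```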
node4 6m 6.660s 2025-11-07 05:53:05.842 429 INFO STATE_TO_DISK <<platform-core: reconnectController>> DefaultSavedStateController: Signed state from round 778 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 6.660s 2025-11-07 05:53:05.842 430 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 924.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 6.661s 2025-11-07 05:53:05.843 431 INFO STARTUP <platformForkJoinThread-7> Shadowgraph: Shadowgraph starting from expiration threshold 751
node4 6m 6.664s 2025-11-07 05:53:05.846 434 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 778 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778
node4 6m 6.665s 2025-11-07 05:53:05.847 435 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 778
node4 6m 6.677s 2025-11-07 05:53:05.859 443 INFO EVENT_STREAM <<platform-core: reconnectController>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 6bcfe06c21392ba636ea29d65514ea6c86c7cc7e0373525abf607c3b8204b3a06c5e03902e48f3f2c911487debd071d5
node4 6m 6.678s 2025-11-07 05:53:05.860 448 INFO STARTUP <platformForkJoinThread-4> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr385_orgn0.pces. All future files will have an origin round of 778.
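The discontinuity announced here is visible later in node4's file names: the pre-reconnect file carries orgn0, while the file written after reconnect carries orgn778. A tiny sketch of the idea that files with different origin rounds belong to different stream segments; the helper names are invented.

```java
// After a reconnect the preconsensus event stream restarts from a new origin
// round, and files on either side of the discontinuity carry different "orgn"
// tags (orgn0 vs orgn778 in node4's names). Illustration only; not platform code.
final class PcesOriginCheck {
    static boolean sameStreamSegment(long originA, long originB) {
        return originA == originB; // differing origins mark a discontinuity
    }

    public static void main(String[] args) {
        long beforeReconnect = 0;   // ..._seq0_minr1_maxr385_orgn0.pces
        long afterReconnect = 778;  // ..._seq1_minr751_maxr1251_orgn778.pces
        System.out.println("continuous stream? " + sameStreamSegment(beforeReconnect, afterReconnect));
    }
}
```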
node4 6m 6.679s 2025-11-07 05:53:05.861 449 INFO RECONNECT <<platform-core: reconnectController>> ReconnectController: Reconnect almost done, resuming gossip
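Taken together, the ReconnectController lines from "Preparing for reconnect, start clearing queues" through "Reconnect almost done, resuming gossip" trace a fixed sequence of phases. The enum below condenses exactly the messages seen above into an ordered list as a reading aid; it is not the platform's control flow.

```java
// Ordered condensation of the ReconnectController messages in this log.
enum ReconnectPhase {
    CLEAR_QUEUES("Preparing for reconnect, start clearing queues"),
    QUEUES_CLEARED("Queues have been cleared"),
    AWAIT_STATE("Waiting for a state to be obtained from a peer"),
    STATE_OBTAINED("A state was obtained from a peer"),
    STATE_VALIDATED("The state obtained from a peer was validated"),
    LOAD_STATE("`loadState`: reloading state"),
    RESUME_GOSSIP("Reconnect almost done, resuming gossip");

    final String logMessage;
    ReconnectPhase(String logMessage) { this.logMessage = logMessage; }
}

final class ReconnectPhases {
    public static void main(String[] args) {
        for (ReconnectPhase p : ReconnectPhase.values()) {
            System.out.printf("%-16s %s%n", p, p.logMessage);
        }
    }
}
```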
node4 6m 6.822s 2025-11-07 05:53:06.004 473 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 778
node4 6m 6.826s 2025-11-07 05:53:06.008 474 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 778
Timestamp: 2025-11-07T05:53:03.929197Z
Next consensus number: 23396
Legacy running event hash: 6bcfe06c21392ba636ea29d65514ea6c86c7cc7e0373525abf607c3b8204b3a06c5e03902e48f3f2c911487debd071d5
Legacy running event mnemonic: ketchup-verb-cheese-inquiry
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 2138287084
Root hash: 9ddd34e1ff113d76786fcea006fdbe58723b253fd3e942e52d8a6d1ae0279fd7c6f04645b8291f572baa7097d3b8944a
(root) VirtualMap state / dutch-order-bulb-put
node4 6m 6.865s 2025-11-07 05:53:06.047 475 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr385_orgn0.pces
node4 6m 6.865s 2025-11-07 05:53:06.047 476 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 751
node4 6m 6.871s 2025-11-07 05:53:06.053 477 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 778 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778 {"round":778,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/778/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
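The WARN above is explained by the file name itself: node4's only PCES file covers rounds 1 through 385, which is entirely below the lower bound of 751, so there is nothing worth copying next to the snapshot. A sketch of parsing the seq/minr/maxr/orgn tokens and applying that filter; the filter rule is inferred from the log (maxr 385 < lower bound 751 means skip) and should be read as an illustration, not the exact platform predicate.

```java
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Parses PCES file names like
//   2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr385_orgn0.pces
// and applies the inferred lower-bound filter: a file is only copied if its
// maximum round reaches the requested lower bound.
final class PcesFileFilter {
    private static final Pattern NAME =
            Pattern.compile("seq(\\d+)_minr(\\d+)_maxr(\\d+)_orgn(\\d+)\\.pces$");

    record PcesFile(long seq, long minRound, long maxRound, long originRound) {}

    static PcesFile parse(String fileName) {
        Matcher m = NAME.matcher(fileName);
        if (!m.find()) throw new IllegalArgumentException("not a PCES file: " + fileName);
        return new PcesFile(Long.parseLong(m.group(1)), Long.parseLong(m.group(2)),
                Long.parseLong(m.group(3)), Long.parseLong(m.group(4)));
    }

    public static void main(String[] args) {
        long lowerBound = 751; // from the WARN line above
        List<String> onDisk = List.of("2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr385_orgn0.pces");
        onDisk.stream()
                .map(PcesFileFilter::parse)
                .filter(f -> f.maxRound() >= lowerBound) // inferred criterion
                .forEach(f -> System.out.println("copy " + f));
        // Prints nothing, matching "No preconsensus event files meeting specified criteria".
    }
}
```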
node4 6m 6.875s 2025-11-07 05:53:06.057 478 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 213.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 7.345s 2025-11-07 05:53:06.527 479 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 7.348s 2025-11-07 05:53:06.530 480 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 7.822s 2025-11-07 05:53:07.004 481 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:fa86aed4f4f5 BR:776), num remaining: 3
node4 6m 7.824s 2025-11-07 05:53:07.006 482 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:5982c9f46043 BR:776), num remaining: 2
node4 6m 7.824s 2025-11-07 05:53:07.006 483 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:2780b56dfc0a BR:777), num remaining: 1
node4 6m 7.824s 2025-11-07 05:53:07.006 484 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:fcc78f3271f9 BR:776), num remaining: 0
node4 6m 11.419s 2025-11-07 05:53:10.601 618 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.5 s in CHECKING. Now in ACTIVE
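Node4's recovery is summarized by the PLATFORM_STATUS lines: OBSERVING, then BEHIND, then RECONNECT_COMPLETE, then CHECKING, then ACTIVE, each reporting the time spent in the previous status. A minimal sketch of the dwell-time bookkeeping behind messages like "Platform spent 4.5 s in CHECKING. Now in ACTIVE"; the status names come from the log, while the tracker class itself is invented for illustration.

```java
import java.time.Duration;
import java.time.Instant;

// Invented bookkeeping class that reproduces the dwell-time log format above.
final class StatusTracker {
    enum PlatformStatus { OBSERVING, BEHIND, RECONNECT_COMPLETE, CHECKING, ACTIVE }

    private PlatformStatus current = PlatformStatus.OBSERVING;
    private Instant since = Instant.now();

    synchronized void transitionTo(PlatformStatus next) {
        Duration dwell = Duration.between(since, Instant.now());
        System.out.printf("Platform spent %d ms in %s. Now in %s%n",
                dwell.toMillis(), current, next);
        current = next;
        since = Instant.now();
    }

    public static void main(String[] args) throws InterruptedException {
        StatusTracker t = new StatusTracker();
        Thread.sleep(50);
        t.transitionTo(PlatformStatus.BEHIND);            // mirrors node4's path above
        t.transitionTo(PlatformStatus.RECONNECT_COMPLETE);
        t.transitionTo(PlatformStatus.CHECKING);
        t.transitionTo(PlatformStatus.ACTIVE);
    }
}
```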
node3 7m 1.830s 2025-11-07 05:54:01.012 10296 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 902 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 1.860s 2025-11-07 05:54:01.042 10420 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 902 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 1.871s 2025-11-07 05:54:01.053 10225 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 902 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 1.908s 2025-11-07 05:54:01.090 1827 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 902 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 1.945s 2025-11-07 05:54:01.127 10262 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 902 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 2.043s 2025-11-07 05:54:01.225 10265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 902 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/902
node2 7m 2.044s 2025-11-07 05:54:01.226 10266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 902
node1 7m 2.057s 2025-11-07 05:54:01.239 10423 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 902 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/902
node1 7m 2.058s 2025-11-07 05:54:01.240 10424 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 902
node0 7m 2.092s 2025-11-07 05:54:01.274 10228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 902 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/902
node0 7m 2.093s 2025-11-07 05:54:01.275 10229 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 902
node3 7m 2.114s 2025-11-07 05:54:01.296 10299 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 902 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/902
node3 7m 2.115s 2025-11-07 05:54:01.297 10300 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 902
node2 7m 2.123s 2025-11-07 05:54:01.305 10305 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 902
node2 7m 2.125s 2025-11-07 05:54:01.307 10306 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 902
Timestamp: 2025-11-07T05:54:00.066725581Z
Next consensus number: 27723
Legacy running event hash: 3987525265756dea4371667f7ed97673c9e01e11bbfbf89b884e76f554ef68bf8e5f0e709ba85c4e7c6b7bcd41aef5e4
Legacy running event mnemonic: couch-engage-push-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 749129649
Root hash: 15f4145240472e005e64fa029da207d66023909f5b9917280916e520d9e8afbbe0dd4d532aaad2d5681b1553800d0315
(root) VirtualMap state / salon-medal-angry-scale
node2 7m 2.132s 2025-11-07 05:54:01.314 10307 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+47+15.409510130Z_seq0_minr1_maxr501_orgn0.pces
node2 7m 2.132s 2025-11-07 05:54:01.314 10308 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 875
File: data/saved/preconsensus-events/2/2025/11/07/2025-11-07T05+51+04.346086645Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 2.133s 2025-11-07 05:54:01.315 10309 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 2.140s 2025-11-07 05:54:01.322 10310 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 2.141s 2025-11-07 05:54:01.323 10311 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 902 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/902 {"round":902,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/902/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 2.143s 2025-11-07 05:54:01.325 10312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/223
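The periodic-snapshot cycle visible above ends with a prune: after the round 902 directory is written, the oldest retained round directory (223) is deleted. A sketch of that write-then-prune pattern under an assumed "keep the N newest rounds" policy; the log only shows that exactly one old directory is removed per snapshot, so the policy and the helper names below are illustrative.

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative retention pruner: keep the newest `keep` round directories
// (directory names are round numbers, as in .../2/123/902), delete the rest.
final class SnapshotRetention {
    static void prune(Path stateDir, int keep) throws IOException {
        List<Path> rounds = new ArrayList<>();
        try (DirectoryStream<Path> dirs = Files.newDirectoryStream(stateDir, "[0-9]*")) {
            dirs.forEach(rounds::add);
        }
        rounds.sort(Comparator.comparingLong(p -> Long.parseLong(p.getFileName().toString())));
        while (rounds.size() > keep) {
            Path oldest = rounds.remove(0);
            System.out.println("deleting directory " + oldest);
            deleteRecursively(oldest);
        }
    }

    static void deleteRecursively(Path root) throws IOException {
        try (var walk = Files.walk(root)) {
            walk.sorted(Comparator.reverseOrder()).forEach(p -> {
                try { Files.delete(p); } catch (IOException e) { throw new RuntimeException(e); }
            });
        }
    }

    public static void main(String[] args) throws IOException {
        prune(Path.of("data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123"), 2);
    }
}
```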
node1 7m 2.149s 2025-11-07 05:54:01.331 10455 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 902
node1 7m 2.152s 2025-11-07 05:54:01.334 10456 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 902
Timestamp: 2025-11-07T05:54:00.066725581Z
Next consensus number: 27723
Legacy running event hash: 3987525265756dea4371667f7ed97673c9e01e11bbfbf89b884e76f554ef68bf8e5f0e709ba85c4e7c6b7bcd41aef5e4
Legacy running event mnemonic: couch-engage-push-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 749129649
Root hash: 15f4145240472e005e64fa029da207d66023909f5b9917280916e520d9e8afbbe0dd4d532aaad2d5681b1553800d0315
(root) VirtualMap state / salon-medal-angry-scale
node1 7m 2.158s 2025-11-07 05:54:01.340 10457 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+51+04.341116730Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+47+15.690476938Z_seq0_minr1_maxr501_orgn0.pces
node1 7m 2.161s 2025-11-07 05:54:01.343 10458 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 875
File: data/saved/preconsensus-events/1/2025/11/07/2025-11-07T05+51+04.341116730Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 2.161s 2025-11-07 05:54:01.343 10459 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 2.169s 2025-11-07 05:54:01.351 10460 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 2.169s 2025-11-07 05:54:01.351 10461 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 902 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/902 {"round":902,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/902/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 2.171s 2025-11-07 05:54:01.353 10462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/223
node0 7m 2.175s 2025-11-07 05:54:01.357 10260 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 902
node0 7m 2.178s 2025-11-07 05:54:01.360 10261 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 902
Timestamp: 2025-11-07T05:54:00.066725581Z
Next consensus number: 27723
Legacy running event hash: 3987525265756dea4371667f7ed97673c9e01e11bbfbf89b884e76f554ef68bf8e5f0e709ba85c4e7c6b7bcd41aef5e4
Legacy running event mnemonic: couch-engage-push-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 749129649
Root hash: 15f4145240472e005e64fa029da207d66023909f5b9917280916e520d9e8afbbe0dd4d532aaad2d5681b1553800d0315
(root) VirtualMap state / salon-medal-angry-scale
node0 7m 2.184s 2025-11-07 05:54:01.366 10262 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+47+15.662787367Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+51+04.450512141Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 2.186s 2025-11-07 05:54:01.368 10263 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 875
File: data/saved/preconsensus-events/0/2025/11/07/2025-11-07T05+51+04.450512141Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 2.187s 2025-11-07 05:54:01.369 10264 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 2.194s 2025-11-07 05:54:01.376 10265 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 2.195s 2025-11-07 05:54:01.377 10266 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 902 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/902 {"round":902,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/902/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 2.196s 2025-11-07 05:54:01.378 10267 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/223
node3 7m 2.199s 2025-11-07 05:54:01.381 10339 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 902
node3 7m 2.201s 2025-11-07 05:54:01.383 10340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 902
Timestamp: 2025-11-07T05:54:00.066725581Z
Next consensus number: 27723
Legacy running event hash: 3987525265756dea4371667f7ed97673c9e01e11bbfbf89b884e76f554ef68bf8e5f0e709ba85c4e7c6b7bcd41aef5e4
Legacy running event mnemonic: couch-engage-push-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 749129649
Root hash: 15f4145240472e005e64fa029da207d66023909f5b9917280916e520d9e8afbbe0dd4d532aaad2d5681b1553800d0315
(root) VirtualMap state / salon-medal-angry-scale
node4 7m 2.204s 2025-11-07 05:54:01.386 1830 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 902 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/902
node4 7m 2.206s 2025-11-07 05:54:01.388 1831 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 902
node3 7m 2.208s 2025-11-07 05:54:01.390 10341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+47+15.614127169Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+51+04.452326284Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 2.211s 2025-11-07 05:54:01.393 10342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 875
File: data/saved/preconsensus-events/3/2025/11/07/2025-11-07T05+51+04.452326284Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 2.211s 2025-11-07 05:54:01.393 10343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 2.219s 2025-11-07 05:54:01.401 10344 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 2.219s 2025-11-07 05:54:01.401 10345 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 902 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/902 {"round":902,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/902/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 2.221s 2025-11-07 05:54:01.403 10346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/223
node4 7m 2.324s 2025-11-07 05:54:01.506 1868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 902
node4 7m 2.326s 2025-11-07 05:54:01.508 1869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 902
Timestamp: 2025-11-07T05:54:00.066725581Z
Next consensus number: 27723
Legacy running event hash: 3987525265756dea4371667f7ed97673c9e01e11bbfbf89b884e76f554ef68bf8e5f0e709ba85c4e7c6b7bcd41aef5e4
Legacy running event mnemonic: couch-engage-push-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 749129649
Root hash: 15f4145240472e005e64fa029da207d66023909f5b9917280916e520d9e8afbbe0dd4d532aaad2d5681b1553800d0315
(root) VirtualMap state / salon-medal-angry-scale
node4 7m 2.336s 2025-11-07 05:54:01.518 1870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+53+06.437057360Z_seq1_minr751_maxr1251_orgn778.pces
Last file: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+47+15.554873337Z_seq0_minr1_maxr385_orgn0.pces
node4 7m 2.337s 2025-11-07 05:54:01.519 1871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 875
File: data/saved/preconsensus-events/4/2025/11/07/2025-11-07T05+53+06.437057360Z_seq1_minr751_maxr1251_orgn778.pces
node4 7m 2.337s 2025-11-07 05:54:01.519 1872 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 2.342s 2025-11-07 05:54:01.524 1874 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 2.342s 2025-11-07 05:54:01.524 1882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 902 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/902 {"round":902,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/902/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 2.344s 2025-11-07 05:54:01.526 1883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/2
node2 8.011m 2025-11-07 05:54:59.819 11724 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 2 to 1>> NetworkUtils: Connection broken: 2 <- 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:54:59.819018532Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 8.011m 2025-11-07 05:54:59.821 11740 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 3 to 1>> NetworkUtils: Connection broken: 3 <- 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:54:59.818316648Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 8.011m 2025-11-07 05:54:59.824 3281 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 4 to 1>> NetworkUtils: Connection broken: 4 <- 1
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:54:59.821109323Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node4 8.011m 2025-11-07 05:54:59.871 3282 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 4 to 0>> NetworkUtils: Connection broken: 4 <- 0
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:54:59.870847408Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 8.012m 2025-11-07 05:54:59.874 11741 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 3 to 0>> NetworkUtils: Connection broken: 3 <- 0
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:54:59.871078076Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node2 8.012m 2025-11-07 05:54:59.875 11725 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith0 2 to 0>> NetworkUtils: Connection broken: 2 <- 0
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:54:59.870819067Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 8m 1.232s 2025-11-07 05:55:00.414 11742 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 3 to 2>> NetworkUtils: Connection broken: 3 <- 2
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:55:00.410156722Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more
node3 8m 1.252s 2025-11-07 05:55:00.434 11743 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-11-07T05:55:00.429030092Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:160)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.runProtocol(RpcPeerProtocol.java:293)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 8 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.writeMessages(RpcPeerProtocol.java:388)
        at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$2(RpcPeerProtocol.java:297)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.util.concurrent.ExecutionException: java.net.SocketException: Connection reset
    at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
    ... 8 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readShort(DataInputStream.java:319)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readShort(AugmentedDataInputStream.java:158)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.readMessages(RpcPeerProtocol.java:432)
    at com.swirlds.platform.network.protocol.rpc.RpcPeerProtocol.lambda$runProtocol$1(RpcPeerProtocol.java:296)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
    at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    ... 2 more