node2 0.000ns 2025-12-03 20:59:31.702 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node2 87.000ms 2025-12-03 20:59:31.789 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 102.000ms 2025-12-03 20:59:31.804 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 210.000ms 2025-12-03 20:59:31.912 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 216.000ms 2025-12-03 20:59:31.918 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 228.000ms 2025-12-03 20:59:31.930 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 386.000ms 2025-12-03 20:59:32.088 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node1 473.000ms 2025-12-03 20:59:32.175 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 488.000ms 2025-12-03 20:59:32.190 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 597.000ms 2025-12-03 20:59:32.299 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 603.000ms 2025-12-03 20:59:32.305 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 616.000ms 2025-12-03 20:59:32.318 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 639.000ms 2025-12-03 20:59:32.341 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 640.000ms 2025-12-03 20:59:32.342 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.032s 2025-12-03 20:59:32.734 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 1.032s 2025-12-03 20:59:32.734 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 1.477s 2025-12-03 20:59:33.179 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 836ms
node2 1.484s 2025-12-03 20:59:33.186 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 1.487s 2025-12-03 20:59:33.189 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.512s 2025-12-03 20:59:33.214 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node2 1.523s 2025-12-03 20:59:33.225 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 1.592s 2025-12-03 20:59:33.294 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 1.593s 2025-12-03 20:59:33.295 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 1.609s 2025-12-03 20:59:33.311 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 1.628s 2025-12-03 20:59:33.330 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.751s 2025-12-03 20:59:33.453 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 1.757s 2025-12-03 20:59:33.459 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 1.771s 2025-12-03 20:59:33.473 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 1.918s 2025-12-03 20:59:33.620 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 885ms
node1 1.929s 2025-12-03 20:59:33.631 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 1.933s 2025-12-03 20:59:33.635 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 1.974s 2025-12-03 20:59:33.676 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 2.027s 2025-12-03 20:59:33.729 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node1 2.035s 2025-12-03 20:59:33.737 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 2.036s 2025-12-03 20:59:33.738 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 2.116s 2025-12-03 20:59:33.818 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 2.132s 2025-12-03 20:59:33.834 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.184s 2025-12-03 20:59:33.886 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 2.206s 2025-12-03 20:59:33.908 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 2.208s 2025-12-03 20:59:33.910 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node3 2.245s 2025-12-03 20:59:33.947 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 2.251s 2025-12-03 20:59:33.953 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 2.264s 2025-12-03 20:59:33.966 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 2.279s 2025-12-03 20:59:33.981 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 2.296s 2025-12-03 20:59:33.998 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.415s 2025-12-03 20:59:34.117 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 2.422s 2025-12-03 20:59:34.124 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 2.436s 2025-12-03 20:59:34.138 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 2.710s 2025-12-03 20:59:34.412 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 2.711s 2025-12-03 20:59:34.413 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node0 2.902s 2025-12-03 20:59:34.604 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 2.903s 2025-12-03 20:59:34.605 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 3.330s 2025-12-03 20:59:35.032 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1122ms
node4 3.339s 2025-12-03 20:59:35.041 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 3.342s 2025-12-03 20:59:35.044 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 3.383s 2025-12-03 20:59:35.085 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 3.445s 2025-12-03 20:59:35.147 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 3.446s 2025-12-03 20:59:35.148 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 3.593s 2025-12-03 20:59:35.295 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 882ms
node3 3.602s 2025-12-03 20:59:35.304 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 3.605s 2025-12-03 20:59:35.307 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 3.608s 2025-12-03 20:59:35.310 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 3.644s 2025-12-03 20:59:35.346 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 3.689s 2025-12-03 20:59:35.391 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 3.691s 2025-12-03 20:59:35.393 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 3.692s 2025-12-03 20:59:35.394 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 3.704s 2025-12-03 20:59:35.406 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 3.704s 2025-12-03 20:59:35.406 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 4.057s 2025-12-03 20:59:35.759 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1153ms
node1 4.065s 2025-12-03 20:59:35.767 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 4.066s 2025-12-03 20:59:35.768 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 4.070s 2025-12-03 20:59:35.772 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 4.112s 2025-12-03 20:59:35.814 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 4.143s 2025-12-03 20:59:35.845 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.146s 2025-12-03 20:59:35.848 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 4.147s 2025-12-03 20:59:35.849 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.180s 2025-12-03 20:59:35.882 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 4.181s 2025-12-03 20:59:35.883 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 4.432s 2025-12-03 20:59:36.134 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.442s 2025-12-03 20:59:36.144 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 4.448s 2025-12-03 20:59:36.150 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 4.458s 2025-12-03 20:59:36.160 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.460s 2025-12-03 20:59:36.162 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.912s 2025-12-03 20:59:36.614 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.921s 2025-12-03 20:59:36.623 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.926s 2025-12-03 20:59:36.628 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 4.936s 2025-12-03 20:59:36.638 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.937s 2025-12-03 20:59:36.639 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.428s 2025-12-03 20:59:37.130 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5.532s 2025-12-03 20:59:37.234 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.536s 2025-12-03 20:59:37.238 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 5.536s 2025-12-03 20:59:37.238 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 5.557s 2025-12-03 20:59:37.259 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26354912] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=288190, randomLong=2513279256475049537, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=13570, randomLong=-2315340745216571474, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1453700, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms)
node2 5.586s 2025-12-03 20:59:37.288 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 5.594s 2025-12-03 20:59:37.296 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 5.596s 2025-12-03 20:59:37.298 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node2 5.673s 2025-12-03 20:59:37.375 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8GipA==", "port": 30124 }, { "ipAddressV4": "CoAALw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IkOpMg==", "port": 30125 }, { "ipAddressV4": "CoAAMA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I94SJg==", "port": 30126 }, { "ipAddressV4": "CoAAMQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I7xTNA==", "port": 30127 }, { "ipAddressV4": "CoAAMg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8EAYQ==", "port": 30128 }, { "ipAddressV4": "CoAALg==", "port": 30128 }] }] }
node2 5.692s 2025-12-03 20:59:37.394 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 5.693s 2025-12-03 20:59:37.395 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node2 5.706s 2025-12-03 20:59:37.408 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: e490e5fcd8e922ca682a92546deea302b9bf23ff60b723dcd2cdb7f94a1c41d3e71ffe5f0e41666c7bf9cbac91eda0c4 (root) ConsistencyTestingToolState / host-dog-title-flock 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait 2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
node3 5.770s 2025-12-03 20:59:37.472 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 5.856s 2025-12-03 20:59:37.558 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 5.859s 2025-12-03 20:59:37.561 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 5.860s 2025-12-03 20:59:37.562 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 5.900s 2025-12-03 20:59:37.602 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 5.904s 2025-12-03 20:59:37.606 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 5.909s 2025-12-03 20:59:37.611 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 5.910s 2025-12-03 20:59:37.612 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 5.911s 2025-12-03 20:59:37.613 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 5.915s 2025-12-03 20:59:37.617 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 5.916s 2025-12-03 20:59:37.618 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 5.916s 2025-12-03 20:59:37.618 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 5.918s 2025-12-03 20:59:37.620 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 5.918s 2025-12-03 20:59:37.620 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 5.920s 2025-12-03 20:59:37.622 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 5.922s 2025-12-03 20:59:37.624 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 5.923s 2025-12-03 20:59:37.625 57 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 164.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 5.927s 2025-12-03 20:59:37.629 58 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node1 6.044s 2025-12-03 20:59:37.746 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26241424] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=160500, randomLong=-1816236343966283541, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11040, randomLong=-701539408947879807, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1088860, data=35, exception=null] OS Health Check Report - Complete (took 1022 ms)
node1 6.073s 2025-12-03 20:59:37.775 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 6.081s 2025-12-03 20:59:37.783 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 6.084s 2025-12-03 20:59:37.786 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 6.162s 2025-12-03 20:59:37.864 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8GipA==", "port": 30124 }, { "ipAddressV4": "CoAALw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IkOpMg==", "port": 30125 }, { "ipAddressV4": "CoAAMA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I94SJg==", "port": 30126 }, { "ipAddressV4": "CoAAMQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I7xTNA==", "port": 30127 }, { "ipAddressV4": "CoAAMg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8EAYQ==", "port": 30128 }, { "ipAddressV4": "CoAALg==", "port": 30128 }] }] }
node1 6.182s 2025-12-03 20:59:37.884 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 6.182s 2025-12-03 20:59:37.884 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 6.196s 2025-12-03 20:59:37.898 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: e490e5fcd8e922ca682a92546deea302b9bf23ff60b723dcd2cdb7f94a1c41d3e71ffe5f0e41666c7bf9cbac91eda0c4 (root) ConsistencyTestingToolState / host-dog-title-flock 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait 2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
node4 6.349s 2025-12-03 20:59:38.051 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.358s 2025-12-03 20:59:38.060 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 6.359s 2025-12-03 20:59:38.061 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 6.366s 2025-12-03 20:59:38.068 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 6.378s 2025-12-03 20:59:38.080 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.380s 2025-12-03 20:59:38.082 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 6.400s 2025-12-03 20:59:38.102 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 6.405s 2025-12-03 20:59:38.107 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 6.409s 2025-12-03 20:59:38.111 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node1 6.409s 2025-12-03 20:59:38.111 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 6.410s 2025-12-03 20:59:38.112 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 6.414s 2025-12-03 20:59:38.116 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 6.415s 2025-12-03 20:59:38.117 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 6.415s 2025-12-03 20:59:38.117 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 6.417s 2025-12-03 20:59:38.119 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 6.417s 2025-12-03 20:59:38.119 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 6.418s 2025-12-03 20:59:38.120 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 6.420s 2025-12-03 20:59:38.122 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 6.421s 2025-12-03 20:59:38.123 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 173.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 6.426s 2025-12-03 20:59:38.128 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 6.458s 2025-12-03 20:59:38.160 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.461s 2025-12-03 20:59:38.163 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 6.461s 2025-12-03 20:59:38.163 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 6.701s 2025-12-03 20:59:38.403 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.714s 2025-12-03 20:59:38.416 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 6.721s 2025-12-03 20:59:38.423 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 6.734s 2025-12-03 20:59:38.436 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 6.737s 2025-12-03 20:59:38.439 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.291s 2025-12-03 20:59:38.993 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.302s 2025-12-03 20:59:39.004 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 7.308s 2025-12-03 20:59:39.010 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 7.319s 2025-12-03 20:59:39.021 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.321s 2025-12-03 20:59:39.023 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 7.494s 2025-12-03 20:59:39.196 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26273654]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=132860, randomLong=576994622504930413, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=8800, randomLong=-2957778952424802113, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=970130, data=35, exception=null]
OS Health Check Report - Complete (took 1022 ms)
node4 7.525s 2025-12-03 20:59:39.227 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 7.533s 2025-12-03 20:59:39.235 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 7.536s 2025-12-03 20:59:39.238 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 7.621s 2025-12-03 20:59:39.323 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: (identical to the roster JSON logged above; full certificate dump omitted)
node4 7.643s 2025-12-03 20:59:39.345 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 7.644s 2025-12-03 20:59:39.346 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 7.658s 2025-12-03 20:59:39.360 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e490e5fcd8e922ca682a92546deea302b9bf23ff60b723dcd2cdb7f94a1c41d3e71ffe5f0e41666c7bf9cbac91eda0c4
(root) ConsistencyTestingToolState / host-dog-title-flock
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
node3 7.848s 2025-12-03 20:59:39.550 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26085836]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=264740, randomLong=888258314685225279, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14060, randomLong=967277062325558222, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1094580, data=35, exception=null]
OS Health Check Report - Complete (took 1026 ms)
node4 7.866s 2025-12-03 20:59:39.568 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 7.871s 2025-12-03 20:59:39.573 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 7.876s 2025-12-03 20:59:39.578 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 7.877s 2025-12-03 20:59:39.579 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 7.878s 2025-12-03 20:59:39.580 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 7.882s 2025-12-03 20:59:39.584 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 7.883s 2025-12-03 20:59:39.585 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 7.884s 2025-12-03 20:59:39.586 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 7.885s 2025-12-03 20:59:39.587 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 7.886s 2025-12-03 20:59:39.588 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 7.887s 2025-12-03 20:59:39.589 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 7.888s 2025-12-03 20:59:39.590 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 7.890s 2025-12-03 20:59:39.592 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 7.891s 2025-12-03 20:59:39.593 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 176.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 7.896s 2025-12-03 20:59:39.598 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 7.896s 2025-12-03 20:59:39.598 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 7.900s 2025-12-03 20:59:39.602 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node3 7.997s 2025-12-03 20:59:39.699 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: (identical to the roster JSON logged above; full certificate dump omitted)
node3 8.023s 2025-12-03 20:59:39.725 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 8.024s 2025-12-03 20:59:39.726 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 8.042s 2025-12-03 20:59:39.744 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e490e5fcd8e922ca682a92546deea302b9bf23ff60b723dcd2cdb7f94a1c41d3e71ffe5f0e41666c7bf9cbac91eda0c4
(root) ConsistencyTestingToolState / host-dog-title-flock
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
node3 8.270s 2025-12-03 20:59:39.972 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 8.275s 2025-12-03 20:59:39.977 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node3 8.280s 2025-12-03 20:59:39.982 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 8.281s 2025-12-03 20:59:39.983 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 8.282s 2025-12-03 20:59:39.984 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 8.285s 2025-12-03 20:59:39.987 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 8.286s 2025-12-03 20:59:39.988 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 8.287s 2025-12-03 20:59:39.989 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 8.288s 2025-12-03 20:59:39.990 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 8.289s 2025-12-03 20:59:39.991 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 8.290s 2025-12-03 20:59:39.992 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 8.292s 2025-12-03 20:59:39.994 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 8.294s 2025-12-03 20:59:39.996 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 185.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node3 8.300s 2025-12-03 20:59:40.002 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 8.444s 2025-12-03 20:59:40.146 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26344480]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=201369, randomLong=-5865400154806442658, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=14080, randomLong=4940305524360896363, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1322150, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node0 8.480s 2025-12-03 20:59:40.182 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 8.490s 2025-12-03 20:59:40.192 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 8.494s 2025-12-03 20:59:40.196 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 8.585s 2025-12-03 20:59:40.287 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8GipA==", "port": 30124 }, { "ipAddressV4": "CoAALw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IkOpMg==", "port": 30125 }, { "ipAddressV4": "CoAAMA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I94SJg==", "port": 30126 }, { "ipAddressV4": "CoAAMQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I7xTNA==", "port": 30127 }, { "ipAddressV4": "CoAAMg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8EAYQ==", "port": 30128 }, { "ipAddressV4": "CoAALg==", "port": 30128 }] }] }
node0 8.607s 2025-12-03 20:59:40.309 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 8.607s 2025-12-03 20:59:40.309 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 8.622s 2025-12-03 20:59:40.324 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: e490e5fcd8e922ca682a92546deea302b9bf23ff60b723dcd2cdb7f94a1c41d3e71ffe5f0e41666c7bf9cbac91eda0c4
(root) ConsistencyTestingToolState / host-dog-title-flock
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
  1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
  2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
node0 8.862s 2025-12-03 20:59:40.564 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 8.869s 2025-12-03 20:59:40.571 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 8.876s 2025-12-03 20:59:40.578 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 8.877s 2025-12-03 20:59:40.579 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 8.879s 2025-12-03 20:59:40.581 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 8.882s 2025-12-03 20:59:40.584 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 8.884s 2025-12-03 20:59:40.586 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 8.884s 2025-12-03 20:59:40.586 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 8.886s 2025-12-03 20:59:40.588 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 8.886s 2025-12-03 20:59:40.588 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 8.888s 2025-12-03 20:59:40.590 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 8.890s 2025-12-03 20:59:40.592 56 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 8.893s 2025-12-03 20:59:40.595 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 211.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 8.900s 2025-12-03 20:59:40.602 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 8.919s 2025-12-03 20:59:40.621 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 8.921s 2025-12-03 20:59:40.623 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 9.420s 2025-12-03 20:59:41.122 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 9.423s 2025-12-03 20:59:41.125 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 10.888s 2025-12-03 20:59:42.590 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 10.891s 2025-12-03 20:59:42.593 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 11.294s 2025-12-03 20:59:42.996 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 11.297s 2025-12-03 20:59:42.999 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 11.888s 2025-12-03 20:59:43.590 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 11.891s 2025-12-03 20:59:43.593 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 16.018s 2025-12-03 20:59:47.720 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 16.517s 2025-12-03 20:59:48.219 61 INFO PLATFORM_STATUS <platformForkJoinThread-8> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node4 17.986s 2025-12-03 20:59:49.688 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 18.389s 2025-12-03 20:59:50.091 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 18.986s 2025-12-03 20:59:50.688 61 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 19.784s 2025-12-03 20:59:51.486 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 19.852s 2025-12-03 20:59:51.554 62 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 3.8 s in CHECKING. Now in ACTIVE
node2 19.867s 2025-12-03 20:59:51.569 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 19.950s 2025-12-03 20:59:51.652 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 20.072s 2025-12-03 20:59:51.774 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node3 20.279s 2025-12-03 20:59:51.981 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node3 20.281s 2025-12-03 20:59:51.983 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 20.405s 2025-12-03 20:59:52.107 78 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 2.4 s in CHECKING. Now in ACTIVE
node0 20.415s 2025-12-03 20:59:52.117 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 20.423s 2025-12-03 20:59:52.125 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node0 20.425s 2025-12-03 20:59:52.127 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 20.455s 2025-12-03 20:59:52.157 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node1 20.457s 2025-12-03 20:59:52.159 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 20.468s 2025-12-03 20:59:52.170 91 INFO PLATFORM_STATUS <platformForkJoinThread-8> StatusStateMachine: Platform spent 3.9 s in CHECKING. Now in ACTIVE
node4 20.472s 2025-12-03 20:59:52.174 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node4 20.474s 2025-12-03 20:59:52.176 82 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 20.526s 2025-12-03 20:59:52.228 81 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node2 20.527s 2025-12-03 20:59:52.229 82 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 20.561s 2025-12-03 20:59:52.263 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 20.564s 2025-12-03 20:59:52.266 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-12-03T20:59:49.697052229Z
Next consensus number: 1
Legacy running event hash: c10f87ceee2c160dca068b2472deb81e9d751dcb8be38c3d464926ff2e75d4f28fe82aa6f68c1e85ad8b0b971e04dfd6
Legacy running event mnemonic: taste-alien-dentist-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 94e0b37602cc8a1f0ebc405141125dc1a8d2b713c585c471fb8a4d3592f46055e53fffac650acac8a63caed49f3f0019
(root) ConsistencyTestingToolState / goat-mule-flower-rotate
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 suggest-mixed-fit-toy
  1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
  2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
  3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 20.600s 2025-12-03 20:59:52.302 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 20.601s 2025-12-03 20:59:52.303 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 20.601s 2025-12-03 20:59:52.303 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 20.602s 2025-12-03 20:59:52.304 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 20.606s 2025-12-03 20:59:52.308 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 20.693s 2025-12-03 20:59:52.395 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 20.696s 2025-12-03 20:59:52.398 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-12-03T20:59:49.697052229Z
Next consensus number: 1
Legacy running event hash: c10f87ceee2c160dca068b2472deb81e9d751dcb8be38c3d464926ff2e75d4f28fe82aa6f68c1e85ad8b0b971e04dfd6
Legacy running event mnemonic: taste-alien-dentist-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 94e0b37602cc8a1f0ebc405141125dc1a8d2b713c585c471fb8a4d3592f46055e53fffac650acac8a63caed49f3f0019
(root) ConsistencyTestingToolState / goat-mule-flower-rotate
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 suggest-mixed-fit-toy
  1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
  2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
  3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 20.718s 2025-12-03 20:59:52.420 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 20.721s 2025-12-03 20:59:52.423 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-12-03T20:59:49.697052229Z
Next consensus number: 1
Legacy running event hash: c10f87ceee2c160dca068b2472deb81e9d751dcb8be38c3d464926ff2e75d4f28fe82aa6f68c1e85ad8b0b971e04dfd6
Legacy running event mnemonic: taste-alien-dentist-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 94e0b37602cc8a1f0ebc405141125dc1a8d2b713c585c471fb8a4d3592f46055e53fffac650acac8a63caed49f3f0019
(root) ConsistencyTestingToolState / goat-mule-flower-rotate
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 suggest-mixed-fit-toy
  1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
  2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
  3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 20.732s 2025-12-03 20:59:52.434 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 20.733s 2025-12-03 20:59:52.435 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 20.733s 2025-12-03 20:59:52.435 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 20.734s 2025-12-03 20:59:52.436 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 20.739s 2025-12-03 20:59:52.441 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 20.740s 2025-12-03 20:59:52.442 117 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 2.4 s in CHECKING. Now in ACTIVE
node4 20.741s 2025-12-03 20:59:52.443 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 20.744s 2025-12-03 20:59:52.446 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-12-03T20:59:49.697052229Z
Next consensus number: 1
Legacy running event hash: c10f87ceee2c160dca068b2472deb81e9d751dcb8be38c3d464926ff2e75d4f28fe82aa6f68c1e85ad8b0b971e04dfd6
Legacy running event mnemonic: taste-alien-dentist-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 94e0b37602cc8a1f0ebc405141125dc1a8d2b713c585c471fb8a4d3592f46055e53fffac650acac8a63caed49f3f0019
(root) ConsistencyTestingToolState / goat-mule-flower-rotate
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 suggest-mixed-fit-toy
  1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
  2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
  3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node1 20.762s 2025-12-03 20:59:52.464 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 20.763s 2025-12-03 20:59:52.465 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 20.763s 2025-12-03 20:59:52.465 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 20.764s 2025-12-03 20:59:52.466 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 20.769s 2025-12-03 20:59:52.471 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 20.769s 2025-12-03 20:59:52.471 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 20.772s 2025-12-03 20:59:52.474 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-12-03T20:59:49.697052229Z
Next consensus number: 1
Legacy running event hash: c10f87ceee2c160dca068b2472deb81e9d751dcb8be38c3d464926ff2e75d4f28fe82aa6f68c1e85ad8b0b971e04dfd6
Legacy running event mnemonic: taste-alien-dentist-script
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 94e0b37602cc8a1f0ebc405141125dc1a8d2b713c585c471fb8a4d3592f46055e53fffac650acac8a63caed49f3f0019
(root) ConsistencyTestingToolState / goat-mule-flower-rotate
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 suggest-mixed-fit-toy
  1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
  2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
  3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
  4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 20.785s 2025-12-03 20:59:52.487 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 20.785s 2025-12-03 20:59:52.487 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 20.786s 2025-12-03 20:59:52.488 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 20.787s 2025-12-03 20:59:52.489 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 20.792s 2025-12-03 20:59:52.494 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 20.808s 2025-12-03 20:59:52.510 116 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 20.808s 2025-12-03 20:59:52.510 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 20.809s 2025-12-03 20:59:52.511 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 20.810s 2025-12-03 20:59:52.512 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 20.814s 2025-12-03 20:59:52.516 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 21.247s 2025-12-03 20:59:52.949 118 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 2.3 s in CHECKING. Now in ACTIVE
node2 30.307s 2025-12-03 21:00:02.009 258 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 16 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 30.336s 2025-12-03 21:00:02.038 270 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 16 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 30.345s 2025-12-03 21:00:02.047 256 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 16 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 30.377s 2025-12-03 21:00:02.079 258 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 16 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 30.503s 2025-12-03 21:00:02.205 260 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 16 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 30.735s 2025-12-03 21:00:02.437 260 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 16 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/16
node3 30.736s 2025-12-03 21:00:02.438 261 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node1 30.762s 2025-12-03 21:00:02.464 258 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 16 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/16
node1 30.763s 2025-12-03 21:00:02.465 259 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node4 30.802s 2025-12-03 21:00:02.504 262 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 16 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/16
node4 30.803s 2025-12-03 21:00:02.505 263 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node2 30.833s 2025-12-03 21:00:02.535 270 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 16 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/16
node2 30.834s 2025-12-03 21:00:02.536 271 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node3 30.845s 2025-12-03 21:00:02.547 292 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node3 30.849s 2025-12-03 21:00:02.551 293 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 16
Timestamp: 2025-12-03T21:00:00.577374982Z
Next consensus number: 420
Legacy running event hash: 46148dbac85a57b5322103fa5fdad3b338e404fd23253b2619750e9f389b3f0a644b30a3ae310679332ff62e07280c77
Legacy running event mnemonic: secret-emotion-visit-wall
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 600266007
Root hash: ee155191e636f64b4118e8bb1e2ceac19baf91280e5f0f378b8d2703d2609bfb8f0d13002420ad859358df03ff30960d
(root) ConsistencyTestingToolState  /  dice-cram-deer-cute
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  quote-predict-open-oval
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf 962082079668242219  /3  spawn-carbon-upon-climb
    4 StringLeaf 16  /4  clinic-aspect-pepper-field
node3 30.859s 2025-12-03 21:00:02.561 294 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 30.859s 2025-12-03 21:00:02.561 295 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 30.860s 2025-12-03 21:00:02.562 296 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 30.860s 2025-12-03 21:00:02.562 297 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 30.861s 2025-12-03 21:00:02.563 298 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node3 30.861s 2025-12-03 21:00:02.563 298 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 16 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/16 {"round":16,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/16/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 30.864s 2025-12-03 21:00:02.566 299 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 16
Timestamp: 2025-12-03T21:00:00.577374982Z
Next consensus number: 420
Legacy running event hash: 46148dbac85a57b5322103fa5fdad3b338e404fd23253b2619750e9f389b3f0a644b30a3ae310679332ff62e07280c77
Legacy running event mnemonic: secret-emotion-visit-wall
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 600266007
Root hash: ee155191e636f64b4118e8bb1e2ceac19baf91280e5f0f378b8d2703d2609bfb8f0d13002420ad859358df03ff30960d
(root) ConsistencyTestingToolState  /  dice-cram-deer-cute
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  quote-predict-open-oval
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf 962082079668242219  /3  spawn-carbon-upon-climb
    4 StringLeaf 16  /4  clinic-aspect-pepper-field
node1 30.876s 2025-12-03 21:00:02.578 300 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 30.877s 2025-12-03 21:00:02.579 301 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 30.877s 2025-12-03 21:00:02.579 302 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 30.878s 2025-12-03 21:00:02.580 303 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 30.879s 2025-12-03 21:00:02.581 304 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 16 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/16 {"round":16,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/16/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 30.908s 2025-12-03 21:00:02.610 302 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node4 30.910s 2025-12-03 21:00:02.612 303 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 16
Timestamp: 2025-12-03T21:00:00.577374982Z
Next consensus number: 420
Legacy running event hash: 46148dbac85a57b5322103fa5fdad3b338e404fd23253b2619750e9f389b3f0a644b30a3ae310679332ff62e07280c77
Legacy running event mnemonic: secret-emotion-visit-wall
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 600266007
Root hash: ee155191e636f64b4118e8bb1e2ceac19baf91280e5f0f378b8d2703d2609bfb8f0d13002420ad859358df03ff30960d
(root) ConsistencyTestingToolState  /  dice-cram-deer-cute
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  quote-predict-open-oval
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf 962082079668242219  /3  spawn-carbon-upon-climb
    4 StringLeaf 16  /4  clinic-aspect-pepper-field
node2 30.921s 2025-12-03 21:00:02.623 310 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node4 30.922s 2025-12-03 21:00:02.624 304 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 30.923s 2025-12-03 21:00:02.625 305 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node2 30.924s 2025-12-03 21:00:02.626 311 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 16
Timestamp: 2025-12-03T21:00:00.577374982Z
Next consensus number: 420
Legacy running event hash: 46148dbac85a57b5322103fa5fdad3b338e404fd23253b2619750e9f389b3f0a644b30a3ae310679332ff62e07280c77
Legacy running event mnemonic: secret-emotion-visit-wall
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 600266007
Root hash: ee155191e636f64b4118e8bb1e2ceac19baf91280e5f0f378b8d2703d2609bfb8f0d13002420ad859358df03ff30960d
(root) ConsistencyTestingToolState  /  dice-cram-deer-cute
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  quote-predict-open-oval
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf 962082079668242219  /3  spawn-carbon-upon-climb
    4 StringLeaf 16  /4  clinic-aspect-pepper-field
node4 30.924s 2025-12-03 21:00:02.626 306 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 30.924s 2025-12-03 21:00:02.626 307 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 30.925s 2025-12-03 21:00:02.627 308 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 16 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/16 {"round":16,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/16/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 30.933s 2025-12-03 21:00:02.635 312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 30.933s 2025-12-03 21:00:02.635 313 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 30.933s 2025-12-03 21:00:02.635 314 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 30.934s 2025-12-03 21:00:02.636 315 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 30.934s 2025-12-03 21:00:02.636 316 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 16 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/16 {"round":16,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/16/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 30.995s 2025-12-03 21:00:02.697 272 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 16 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/16
node0 30.996s 2025-12-03 21:00:02.698 273 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node0 31.092s 2025-12-03 21:00:02.794 316 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 16
node0 31.095s 2025-12-03 21:00:02.797 317 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 16
Timestamp: 2025-12-03T21:00:00.577374982Z
Next consensus number: 420
Legacy running event hash: 46148dbac85a57b5322103fa5fdad3b338e404fd23253b2619750e9f389b3f0a644b30a3ae310679332ff62e07280c77
Legacy running event mnemonic: secret-emotion-visit-wall
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 600266007
Root hash: ee155191e636f64b4118e8bb1e2ceac19baf91280e5f0f378b8d2703d2609bfb8f0d13002420ad859358df03ff30960d
(root) ConsistencyTestingToolState  /  dice-cram-deer-cute
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  quote-predict-open-oval
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf 962082079668242219  /3  spawn-carbon-upon-climb
    4 StringLeaf 16  /4  clinic-aspect-pepper-field
node0 31.105s 2025-12-03 21:00:02.807 318 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 31.105s 2025-12-03 21:00:02.807 319 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 31.106s 2025-12-03 21:00:02.808 320 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 31.106s 2025-12-03 21:00:02.808 321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 31.107s 2025-12-03 21:00:02.809 322 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 16 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/16 {"round":16,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/16/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 29.548s 2025-12-03 21:01:01.250 1297 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 106 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 29.823s 2025-12-03 21:01:01.525 1275 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 106 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 29.824s 2025-12-03 21:01:01.526 1303 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 106 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 29.830s 2025-12-03 21:01:01.532 1271 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 106 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 29.865s 2025-12-03 21:01:01.567 1319 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 106 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 30.133s 2025-12-03 21:01:01.835 1278 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 106 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/106
node4 1m 30.133s 2025-12-03 21:01:01.835 1279 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node1 1m 30.151s 2025-12-03 21:01:01.853 1274 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 106 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/106
node1 1m 30.152s 2025-12-03 21:01:01.854 1275 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node4 1m 30.218s 2025-12-03 21:01:01.920 1312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node2 1m 30.220s 2025-12-03 21:01:01.922 1322 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 106 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/106
node2 1m 30.220s 2025-12-03 21:01:01.922 1323 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node4 1m 30.220s 2025-12-03 21:01:01.922 1313 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 106
Timestamp: 2025-12-03T21:01:00.089810692Z
Next consensus number: 2901
Legacy running event hash: 35439000dfe71b785e45ead921b84ebfa705bb770cc431ce3ce4f46defd16a3847e74987abf6d78f051fb89413e4e209
Legacy running event mnemonic: position-abandon-add-humor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1699417553
Root hash: 4c27813a899b86e189b398082f3466ec1fa7beb3438f61266605912ac29843e432511618e70333a82da28e9dd47e2636
(root) ConsistencyTestingToolState  /  lumber-sugar-smile-catalog
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  surround-iron-dinosaur-bronze
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf -52508176360409252  /3  path-female-silver-mom
    4 StringLeaf 106  /4  lounge-salmon-luggage-proof
node4 1m 30.229s 2025-12-03 21:01:01.931 1314 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 30.230s 2025-12-03 21:01:01.932 1315 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 79
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 30.230s 2025-12-03 21:01:01.932 1316 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 30.232s 2025-12-03 21:01:01.934 1317 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 30.233s 2025-12-03 21:01:01.935 1318 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 106 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/106 {"round":106,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/106/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 1m 30.239s 2025-12-03 21:01:01.941 1322 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node1 1m 30.241s 2025-12-03 21:01:01.943 1323 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 106
Timestamp: 2025-12-03T21:01:00.089810692Z
Next consensus number: 2901
Legacy running event hash: 35439000dfe71b785e45ead921b84ebfa705bb770cc431ce3ce4f46defd16a3847e74987abf6d78f051fb89413e4e209
Legacy running event mnemonic: position-abandon-add-humor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1699417553
Root hash: 4c27813a899b86e189b398082f3466ec1fa7beb3438f61266605912ac29843e432511618e70333a82da28e9dd47e2636
(root) ConsistencyTestingToolState  /  lumber-sugar-smile-catalog
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  surround-iron-dinosaur-bronze
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf -52508176360409252  /3  path-female-silver-mom
    4 StringLeaf 106  /4  lounge-salmon-luggage-proof
node1 1m 30.252s 2025-12-03 21:01:01.954 1324 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 30.252s 2025-12-03 21:01:01.954 1325 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 79
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 30.252s 2025-12-03 21:01:01.954 1326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 30.254s 2025-12-03 21:01:01.956 1327 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 1m 30.255s 2025-12-03 21:01:01.957 1328 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 106 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/106 {"round":106,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/106/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 30.302s 2025-12-03 21:01:02.004 1356 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node2 1m 30.304s 2025-12-03 21:01:02.006 1357 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 106
Timestamp: 2025-12-03T21:01:00.089810692Z
Next consensus number: 2901
Legacy running event hash: 35439000dfe71b785e45ead921b84ebfa705bb770cc431ce3ce4f46defd16a3847e74987abf6d78f051fb89413e4e209
Legacy running event mnemonic: position-abandon-add-humor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1699417553
Root hash: 4c27813a899b86e189b398082f3466ec1fa7beb3438f61266605912ac29843e432511618e70333a82da28e9dd47e2636
(root) ConsistencyTestingToolState  /  lumber-sugar-smile-catalog
    0 SingletonNode PlatformStateService.PLATFORM_STATE  /0  surround-iron-dinosaur-bronze
    1 SingletonNode RosterService.ROSTER_STATE  /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS  /2  situate-high-grain-issue
    3 StringLeaf -52508176360409252  /3  path-female-silver-mom
    4 StringLeaf 106  /4  lounge-salmon-luggage-proof
node2 1m 30.312s 2025-12-03 21:01:02.014 1358 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 30.312s 2025-12-03 21:01:02.014 1359 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 79
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 30.312s 2025-12-03 21:01:02.014 1360 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 1m 30.314s 2025-12-03 21:01:02.016 1361 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 30.315s 2025-12-03 21:01:02.017 1300 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 106 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/106
node2 1m 30.315s 2025-12-03 21:01:02.017 1362 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 106 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/106 {"round":106,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/106/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 30.316s 2025-12-03 21:01:02.018 1301 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node3 1m 30.320s 2025-12-03 21:01:02.022 1306 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 106 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/106
node3 1m 30.321s 2025-12-03 21:01:02.023 1307 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node0 1m 30.409s 2025-12-03 21:01:02.111 1334 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node3 1m 30.411s 2025-12-03 21:01:02.113 1340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/15 for round 106
node0 1m 30.412s 2025-12-03 21:01:02.114 1335 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 106
Timestamp: 2025-12-03T21:01:00.089810692Z
Next consensus number: 2901
Legacy running event hash: 35439000dfe71b785e45ead921b84ebfa705bb770cc431ce3ce4f46defd16a3847e74987abf6d78f051fb89413e4e209
Legacy running event mnemonic: position-abandon-add-humor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1699417553
Root hash: 4c27813a899b86e189b398082f3466ec1fa7beb3438f61266605912ac29843e432511618e70333a82da28e9dd47e2636
(root) ConsistencyTestingToolState /  lumber-sugar-smile-catalog
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  surround-iron-dinosaur-bronze
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -52508176360409252 /3  path-female-silver-mom
    4 StringLeaf 106 /4  lounge-salmon-luggage-proof
node3 1m 30.413s 2025-12-03 21:01:02.115 1341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 106
Timestamp: 2025-12-03T21:01:00.089810692Z
Next consensus number: 2901
Legacy running event hash: 35439000dfe71b785e45ead921b84ebfa705bb770cc431ce3ce4f46defd16a3847e74987abf6d78f051fb89413e4e209
Legacy running event mnemonic: position-abandon-add-humor
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1699417553
Root hash: 4c27813a899b86e189b398082f3466ec1fa7beb3438f61266605912ac29843e432511618e70333a82da28e9dd47e2636
(root) ConsistencyTestingToolState /  lumber-sugar-smile-catalog
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  surround-iron-dinosaur-bronze
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -52508176360409252 /3  path-female-silver-mom
    4 StringLeaf 106 /4  lounge-salmon-luggage-proof
node0 1m 30.421s 2025-12-03 21:01:02.123 1336 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 30.421s 2025-12-03 21:01:02.123 1337 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 79 File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 30.422s 2025-12-03 21:01:02.124 1338 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 30.423s 2025-12-03 21:01:02.125 1342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 30.423s 2025-12-03 21:01:02.125 1343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 79 File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 30.423s 2025-12-03 21:01:02.125 1344 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 30.424s 2025-12-03 21:01:02.126 1339 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 30.425s 2025-12-03 21:01:02.127 1340 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 106 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/106 {"round":106,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/106/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 30.426s 2025-12-03 21:01:02.128 1345 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 30.426s 2025-12-03 21:01:02.128 1346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 106 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/106 {"round":106,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/106/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 29.784s 2025-12-03 21:02:01.486 2292 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 193 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 29.853s 2025-12-03 21:02:01.555 2330 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 193 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 29.932s 2025-12-03 21:02:01.634 2300 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 193 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 29.993s 2025-12-03 21:02:01.695 2368 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 193 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 30.033s 2025-12-03 21:02:01.735 2340 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 193 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 2m 30.160s 2025-12-03 21:02:01.862 2343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 193 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/193
node0 2m 30.160s 2025-12-03 21:02:01.862 2344 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node2 2m 30.230s 2025-12-03 21:02:01.932 2333 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 193 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/193
node2 2m 30.231s 2025-12-03 21:02:01.933 2334 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node0 2m 30.247s 2025-12-03 21:02:01.949 2375 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node0 2m 30.249s 2025-12-03 21:02:01.951 2376 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 193
Timestamp: 2025-12-03T21:02:00.317182Z
Next consensus number: 5433
Legacy running event hash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
Legacy running event mnemonic: light-finish-whip-congress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1984754730
Root hash: a5b1e6e1bcbd81e600b1e53e0cf421f41d6382ab863b10bb0f79e7f5abcb3ebb275244fe1b890bc759710ee0aad51289
(root) ConsistencyTestingToolState /  enemy-catalog-half-select
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  dial-plastic-multiply-clock
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -3360451185516512433 /3  reveal-grass-invite-loan
    4 StringLeaf 193 /4  matter-cram-lucky-whale
node0 2m 30.256s 2025-12-03 21:02:01.958 2377 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 30.256s 2025-12-03 21:02:01.958 2378 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 166 File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 30.256s 2025-12-03 21:02:01.958 2379 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 30.261s 2025-12-03 21:02:01.963 2380 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 30.261s 2025-12-03 21:02:01.963 2381 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 193 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/193 {"round":193,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/193/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 2m 30.317s 2025-12-03 21:02:02.019 2365 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node2 2m 30.319s 2025-12-03 21:02:02.021 2374 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 193
Timestamp: 2025-12-03T21:02:00.317182Z
Next consensus number: 5433
Legacy running event hash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
Legacy running event mnemonic: light-finish-whip-congress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1984754730
Root hash: a5b1e6e1bcbd81e600b1e53e0cf421f41d6382ab863b10bb0f79e7f5abcb3ebb275244fe1b890bc759710ee0aad51289
(root) ConsistencyTestingToolState /  enemy-catalog-half-select
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  dial-plastic-multiply-clock
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -3360451185516512433 /3  reveal-grass-invite-loan
    4 StringLeaf 193 /4  matter-cram-lucky-whale
node2 2m 30.325s 2025-12-03 21:02:02.027 2375 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 30.326s 2025-12-03 21:02:02.028 2376 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 166 File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 30.326s 2025-12-03 21:02:02.028 2377 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 30.330s 2025-12-03 21:02:02.032 2378 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 30.330s 2025-12-03 21:02:02.032 2379 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 193 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/193 {"round":193,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/193/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 2m 30.342s 2025-12-03 21:02:02.044 2305 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 193 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/193
node1 2m 30.343s 2025-12-03 21:02:02.045 2306 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node3 2m 30.415s 2025-12-03 21:02:02.117 2371 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 193 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/193
node3 2m 30.416s 2025-12-03 21:02:02.118 2372 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node1 2m 30.428s 2025-12-03 21:02:02.130 2345 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node1 2m 30.430s 2025-12-03 21:02:02.132 2346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 193
Timestamp: 2025-12-03T21:02:00.317182Z
Next consensus number: 5433
Legacy running event hash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
Legacy running event mnemonic: light-finish-whip-congress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1984754730
Root hash: a5b1e6e1bcbd81e600b1e53e0cf421f41d6382ab863b10bb0f79e7f5abcb3ebb275244fe1b890bc759710ee0aad51289
(root) ConsistencyTestingToolState /  enemy-catalog-half-select
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  dial-plastic-multiply-clock
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -3360451185516512433 /3  reveal-grass-invite-loan
    4 StringLeaf 193 /4  matter-cram-lucky-whale
node1 2m 30.438s 2025-12-03 21:02:02.140 2347 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 30.438s 2025-12-03 21:02:02.140 2348 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 166 File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 30.438s 2025-12-03 21:02:02.140 2349 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 30.442s 2025-12-03 21:02:02.144 2350 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 30.443s 2025-12-03 21:02:02.145 2351 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 193 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/193 {"round":193,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/193/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 30.484s 2025-12-03 21:02:02.186 2303 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 193 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/193
node4 2m 30.485s 2025-12-03 21:02:02.187 2312 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node3 2m 30.504s 2025-12-03 21:02:02.206 2411 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node3 2m 30.506s 2025-12-03 21:02:02.208 2412 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 193
Timestamp: 2025-12-03T21:02:00.317182Z
Next consensus number: 5433
Legacy running event hash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
Legacy running event mnemonic: light-finish-whip-congress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1984754730
Root hash: a5b1e6e1bcbd81e600b1e53e0cf421f41d6382ab863b10bb0f79e7f5abcb3ebb275244fe1b890bc759710ee0aad51289
(root) ConsistencyTestingToolState /  enemy-catalog-half-select
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  dial-plastic-multiply-clock
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -3360451185516512433 /3  reveal-grass-invite-loan
    4 StringLeaf 193 /4  matter-cram-lucky-whale
node3 2m 30.514s 2025-12-03 21:02:02.216 2413 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 30.514s 2025-12-03 21:02:02.216 2414 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 166 File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 30.514s 2025-12-03 21:02:02.216 2415 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 30.518s 2025-12-03 21:02:02.220 2416 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 30.519s 2025-12-03 21:02:02.221 2417 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 193 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/193 {"round":193,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/193/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 30.572s 2025-12-03 21:02:02.274 2341 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/22 for round 193
node4 2m 30.574s 2025-12-03 21:02:02.276 2342 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 193
Timestamp: 2025-12-03T21:02:00.317182Z
Next consensus number: 5433
Legacy running event hash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
Legacy running event mnemonic: light-finish-whip-congress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1984754730
Root hash: a5b1e6e1bcbd81e600b1e53e0cf421f41d6382ab863b10bb0f79e7f5abcb3ebb275244fe1b890bc759710ee0aad51289
(root) ConsistencyTestingToolState /  enemy-catalog-half-select
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0  dial-plastic-multiply-clock
    1 SingletonNode RosterService.ROSTER_STATE /1  deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2  situate-high-grain-issue
    3 StringLeaf -3360451185516512433 /3  reveal-grass-invite-loan
    4 StringLeaf 193 /4  matter-cram-lucky-whale
node4 2m 30.581s 2025-12-03 21:02:02.283 2343 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 30.582s 2025-12-03 21:02:02.284 2344 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 166 File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 30.582s 2025-12-03 21:02:02.284 2345 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 30.586s 2025-12-03 21:02:02.288 2346 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 30.587s 2025-12-03 21:02:02.289 2347 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 193 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/193 {"round":193,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/193/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 14.835s 2025-12-03 21:02:46.537 3156 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 2 to 4>> NetworkUtils: Connection broken: 2 -> 4
java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
	at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
	at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
node3 3m 14.836s 2025-12-03 21:02:46.538 3182 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 3 to 4>> NetworkUtils: Connection broken: 3 -> 4
java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
	at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
	at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
node1 3m 14.837s 2025-12-03 21:02:46.539 3138 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 1 to 4>> NetworkUtils: Connection broken: 1 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:02:46.536947323Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:02:46.536947323Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
	Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
		at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
		at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
		at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
		... 12 more
	Caused by: java.net.SocketException: Connection or outbound has closed
		at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
		at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
		at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
		at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
		at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
		at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
		at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
		at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
		at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
		at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
		at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
		at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
		at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
		... 2 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
node0 3m 14.839s 2025-12-03 21:02:46.541 3138 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:02:46.537974964Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:02:46.537974964Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
node2 3m 30.053s 2025-12-03 21:03:01.755 3445 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 285 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 30.134s 2025-12-03 21:03:01.836 3407 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 285 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 30.183s 2025-12-03 21:03:01.885 3419 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 285 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 30.205s 2025-12-03 21:03:01.907 3451 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 285 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 30.589s 2025-12-03 21:03:02.291 3422 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 285 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/285
node0 3m 30.590s 2025-12-03 21:03:02.292 3423 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node1 3m 30.659s 2025-12-03 21:03:02.361 3410 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 285 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/285
node1 3m 30.660s 2025-12-03 21:03:02.362 3411 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node0 3m 30.687s 2025-12-03 21:03:02.389 3462 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node2 3m 30.688s 2025-12-03 21:03:02.390 3448 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 285 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/285
node2 3m 30.688s 2025-12-03 21:03:02.390 3449 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node0 3m 30.689s 2025-12-03 21:03:02.391 3463 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 285
Timestamp: 2025-12-03T21:03:00.457670Z
Next consensus number: 7721
Legacy running event hash: ee7bd4f7ffe69f8717d9186ad42ac7ad85e6f84de348b4609936e5cc3dbd5f39fb0a2f3b39ecb29bc9383fcda5c5eccf
Legacy running event mnemonic: cost-gentle-addict-august
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1377989674
Root hash: 00b4573d0185d65e5e69f998302fcdd40081f9b36e4c78a1aaea56ecf72b0884e9dbd993145ec2d2ef9b6781346b9f7b
(root) ConsistencyTestingToolState                        /    bird-tray-cost-crowd
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   repair-raise-spatial-absurd
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     -5951042220787565223                 /3   sorry-borrow-wife-step
    4  StringLeaf     285                                  /4   possible-dirt-assume-gasp
node0 3m 30.696s 2025-12-03 21:03:02.398 3464 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 30.696s 2025-12-03 21:03:02.398 3465 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 258
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 30.696s 2025-12-03 21:03:02.398 3466 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 30.703s 2025-12-03 21:03:02.405 3467 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 30.704s 2025-12-03 21:03:02.406 3468 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 285 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/285 {"round":285,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/285/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 30.748s 2025-12-03 21:03:02.450 3445 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node1 3m 30.750s 2025-12-03 21:03:02.452 3446 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 285
Timestamp: 2025-12-03T21:03:00.457670Z
Next consensus number: 7721
Legacy running event hash: ee7bd4f7ffe69f8717d9186ad42ac7ad85e6f84de348b4609936e5cc3dbd5f39fb0a2f3b39ecb29bc9383fcda5c5eccf
Legacy running event mnemonic: cost-gentle-addict-august
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1377989674
Root hash: 00b4573d0185d65e5e69f998302fcdd40081f9b36e4c78a1aaea56ecf72b0884e9dbd993145ec2d2ef9b6781346b9f7b
(root) ConsistencyTestingToolState                        /    bird-tray-cost-crowd
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   repair-raise-spatial-absurd
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     -5951042220787565223                 /3   sorry-borrow-wife-step
    4  StringLeaf     285                                  /4   possible-dirt-assume-gasp
node1 3m 30.758s 2025-12-03 21:03:02.460 3447 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 30.758s 2025-12-03 21:03:02.460 3448 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 258
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 30.758s 2025-12-03 21:03:02.460 3449 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 30.758s 2025-12-03 21:03:02.460 3454 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 285 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/285
node3 3m 30.759s 2025-12-03 21:03:02.461 3455 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node1 3m 30.764s 2025-12-03 21:03:02.466 3450 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 30.764s 2025-12-03 21:03:02.466 3451 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 285 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/285 {"round":285,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/285/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 3m 30.780s 2025-12-03 21:03:02.482 3488 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node2 3m 30.782s 2025-12-03 21:03:02.484 3489 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 285
Timestamp: 2025-12-03T21:03:00.457670Z
Next consensus number: 7721
Legacy running event hash: ee7bd4f7ffe69f8717d9186ad42ac7ad85e6f84de348b4609936e5cc3dbd5f39fb0a2f3b39ecb29bc9383fcda5c5eccf
Legacy running event mnemonic: cost-gentle-addict-august
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1377989674
Root hash: 00b4573d0185d65e5e69f998302fcdd40081f9b36e4c78a1aaea56ecf72b0884e9dbd993145ec2d2ef9b6781346b9f7b
(root) ConsistencyTestingToolState                        /    bird-tray-cost-crowd
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   repair-raise-spatial-absurd
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     -5951042220787565223                 /3   sorry-borrow-wife-step
    4  StringLeaf     285                                  /4   possible-dirt-assume-gasp
node2 3m 30.788s 2025-12-03 21:03:02.490 3490 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 30.788s 2025-12-03 21:03:02.490 3491 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 258
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 30.788s 2025-12-03 21:03:02.490 3492 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 30.794s 2025-12-03 21:03:02.496 3493 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 30.794s 2025-12-03 21:03:02.496 3494 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 285 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/285 {"round":285,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/285/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 3m 30.859s 2025-12-03 21:03:02.561 3489 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/29 for round 285
node3 3m 30.861s 2025-12-03 21:03:02.563 3490 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 285
Timestamp: 2025-12-03T21:03:00.457670Z
Next consensus number: 7721
Legacy running event hash: ee7bd4f7ffe69f8717d9186ad42ac7ad85e6f84de348b4609936e5cc3dbd5f39fb0a2f3b39ecb29bc9383fcda5c5eccf
Legacy running event mnemonic: cost-gentle-addict-august
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1377989674
Root hash: 00b4573d0185d65e5e69f998302fcdd40081f9b36e4c78a1aaea56ecf72b0884e9dbd993145ec2d2ef9b6781346b9f7b
(root) ConsistencyTestingToolState                        /    bird-tray-cost-crowd
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   repair-raise-spatial-absurd
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     -5951042220787565223                 /3   sorry-borrow-wife-step
    4  StringLeaf     285                                  /4   possible-dirt-assume-gasp
node3 3m 30.869s 2025-12-03 21:03:02.571 3491 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 30.870s 2025-12-03 21:03:02.572 3492 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 258
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 30.870s 2025-12-03 21:03:02.572 3493 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 30.876s 2025-12-03 21:03:02.578 3494 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 30.877s 2025-12-03 21:03:02.579 3495 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 285 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/285 {"round":285,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/285/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 29.683s 2025-12-03 21:04:01.385 4556 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 376 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 29.753s 2025-12-03 21:04:01.455 4512 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 376 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 29.753s 2025-12-03 21:04:01.455 4506 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 376 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 4m 29.814s 2025-12-03 21:04:01.516 4562 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 376 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 30.160s 2025-12-03 21:04:01.862 4515 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 376 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/376
node0 4m 30.161s 2025-12-03 21:04:01.863 4516 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node2 4m 30.230s 2025-12-03 21:04:01.932 4575 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 376 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/376
node2 4m 30.231s 2025-12-03 21:04:01.933 4576 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node3 4m 30.238s 2025-12-03 21:04:01.940 4569 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 376 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/376
node3 4m 30.239s 2025-12-03 21:04:01.941 4570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node0 4m 30.257s 2025-12-03 21:04:01.959 4547 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node0 4m 30.259s 2025-12-03 21:04:01.961 4548 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 376
Timestamp: 2025-12-03T21:04:00.400063Z
Next consensus number: 9261
Legacy running event hash: 7a95097c1e4440c2c767f8040c097cc07df71639b1edd0ad132aba6d41191ca55f7eca91e3667f6c193dda3d9f12536a
Legacy running event mnemonic: famous-convince-silk-blue
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1680318172
Root hash: b625d779131a1bf28781dc0205a88f7f44057f66d6d2b30febb7707f0f3f2d73def58c93fef1cfca91fbfebcf41f58df
(root) ConsistencyTestingToolState                        /    derive-axis-globe-layer
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   snow-keen-work-hen
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     871391813947533678                   /3   heart-drift-story-coconut
    4  StringLeaf     376                                  /4   age-bacon-purse-tourist
node0 4m 30.267s 2025-12-03 21:04:01.969 4552 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 30.268s 2025-12-03 21:04:01.970 4553 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 349
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 4m 30.268s 2025-12-03 21:04:01.970 4554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 30.275s 2025-12-03 21:04:01.977 4555 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 30.275s 2025-12-03 21:04:01.977 4556 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 376 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/376 {"round":376,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/376/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 30.277s 2025-12-03 21:04:01.979 4557 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node2 4m 30.314s 2025-12-03 21:04:02.016 4607 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node2 4m 30.316s 2025-12-03 21:04:02.018 4608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 376
Timestamp: 2025-12-03T21:04:00.400063Z
Next consensus number: 9261
Legacy running event hash: 7a95097c1e4440c2c767f8040c097cc07df71639b1edd0ad132aba6d41191ca55f7eca91e3667f6c193dda3d9f12536a
Legacy running event mnemonic: famous-convince-silk-blue
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1680318172
Root hash: b625d779131a1bf28781dc0205a88f7f44057f66d6d2b30febb7707f0f3f2d73def58c93fef1cfca91fbfebcf41f58df
(root) ConsistencyTestingToolState                        /    derive-axis-globe-layer
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   snow-keen-work-hen
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     871391813947533678                   /3   heart-drift-story-coconut
    4  StringLeaf     376                                  /4   age-bacon-purse-tourist
node1 4m 30.320s 2025-12-03 21:04:02.022 4519 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 376 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/376
node1 4m 30.320s 2025-12-03 21:04:02.022 4520 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node2 4m 30.325s 2025-12-03 21:04:02.027 4620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 30.325s 2025-12-03 21:04:02.027 4621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 349
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 30.325s 2025-12-03 21:04:02.027 4622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 30.332s 2025-12-03 21:04:02.034 4623 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 30.332s 2025-12-03 21:04:02.034 4624 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 376 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/376 {"round":376,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/376/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 30.334s 2025-12-03 21:04:02.036 4625 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node3 4m 30.334s 2025-12-03 21:04:02.036 4609 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node3 4m 30.337s 2025-12-03 21:04:02.039 4610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 376
Timestamp: 2025-12-03T21:04:00.400063Z
Next consensus number: 9261
Legacy running event hash: 7a95097c1e4440c2c767f8040c097cc07df71639b1edd0ad132aba6d41191ca55f7eca91e3667f6c193dda3d9f12536a
Legacy running event mnemonic: famous-convince-silk-blue
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1680318172
Root hash: b625d779131a1bf28781dc0205a88f7f44057f66d6d2b30febb7707f0f3f2d73def58c93fef1cfca91fbfebcf41f58df
(root) ConsistencyTestingToolState                        /    derive-axis-globe-layer
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   snow-keen-work-hen
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     871391813947533678                   /3   heart-drift-story-coconut
    4  StringLeaf     376                                  /4   age-bacon-purse-tourist
node3 4m 30.344s 2025-12-03 21:04:02.046 4611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 30.344s 2025-12-03 21:04:02.046 4612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 349
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 30.345s 2025-12-03 21:04:02.047 4613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 30.351s 2025-12-03 21:04:02.053 4614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 30.352s 2025-12-03 21:04:02.054 4615 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 376 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/376 {"round":376,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/376/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 30.354s 2025-12-03 21:04:02.056 4616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node1 4m 30.409s 2025-12-03 21:04:02.111 4554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/36 for round 376
node1 4m 30.411s 2025-12-03 21:04:02.113 4555 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 376
Timestamp: 2025-12-03T21:04:00.400063Z
Next consensus number: 9261
Legacy running event hash: 7a95097c1e4440c2c767f8040c097cc07df71639b1edd0ad132aba6d41191ca55f7eca91e3667f6c193dda3d9f12536a
Legacy running event mnemonic: famous-convince-silk-blue
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1680318172
Root hash: b625d779131a1bf28781dc0205a88f7f44057f66d6d2b30febb7707f0f3f2d73def58c93fef1cfca91fbfebcf41f58df
(root) ConsistencyTestingToolState                        /    derive-axis-globe-layer
    0  SingletonNode  PlatformStateService.PLATFORM_STATE  /0   snow-keen-work-hen
    1  SingletonNode  RosterService.ROSTER_STATE           /1   deposit-fog-skull-wait
    2  VirtualMap     RosterService.ROSTERS                /2   situate-high-grain-issue
    3  StringLeaf     871391813947533678                   /3   heart-drift-story-coconut
    4  StringLeaf     376                                  /4   age-bacon-purse-tourist
node1 4m 30.418s 2025-12-03 21:04:02.120 4566 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 30.419s 2025-12-03 21:04:02.121 4567 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 349
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 4m 30.419s 2025-12-03 21:04:02.121 4568 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 30.426s 2025-12-03 21:04:02.128 4569 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 30.427s 2025-12-03 21:04:02.129 4570 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 376 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/376 {"round":376,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/376/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 30.429s 2025-12-03 21:04:02.131 4571 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node3 5m 29.724s 2025-12-03 21:05:01.426 5645 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 466 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 29.929s 2025-12-03 21:05:01.631 5653 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 466 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 30.060s 2025-12-03 21:05:01.762 5609 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 466 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 30.070s 2025-12-03 21:05:01.772 5589 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 466 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 30.191s 2025-12-03 21:05:01.893 5592 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 466 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/466
node0 5m 30.192s 2025-12-03 21:05:01.894 5593 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node1 5m 30.209s 2025-12-03 21:05:01.911 5612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 466 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/466
node1 5m 30.210s 2025-12-03 21:05:01.912 5613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node1 5m 30.297s 2025-12-03 21:05:01.999 5652 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node0 5m 30.299s 2025-12-03 21:05:02.001 5632 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node1 5m 30.299s 2025-12-03 21:05:02.001 5653 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 466
Timestamp: 2025-12-03T21:05:00.358576Z
Next consensus number: 10778
Legacy running event hash: 72bc25ff55a41ec9618e83c166359415080746872deff2264d6a982317a338237910062fe5fe6686f6c3d3b5f0cddc28
Legacy running event mnemonic: fiscal-say-region-summer
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1679564926
Root hash: ee11fedf0b19c50da58a9bc61eac74623e00039f11a370916e25524f114547880b655c2fe9ed1f921dee6af6bcf326c5
(root) ConsistencyTestingToolState / digital-category-glimpse-fury
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 border-you-demand-admit
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3400946007740669306 /3 goat-valid-faint-faint
    4 StringLeaf 466 /4 belt-robust-today-work
node0 5m 30.301s 2025-12-03 21:05:02.003 5633 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 466
Timestamp: 2025-12-03T21:05:00.358576Z
Next consensus number: 10778
Legacy running event hash: 72bc25ff55a41ec9618e83c166359415080746872deff2264d6a982317a338237910062fe5fe6686f6c3d3b5f0cddc28
Legacy running event mnemonic: fiscal-say-region-summer
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1679564926
Root hash: ee11fedf0b19c50da58a9bc61eac74623e00039f11a370916e25524f114547880b655c2fe9ed1f921dee6af6bcf326c5
(root) ConsistencyTestingToolState / digital-category-glimpse-fury
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 border-you-demand-admit
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3400946007740669306 /3 goat-valid-faint-faint
    4 StringLeaf 466 /4 belt-robust-today-work
node1 5m 30.306s 2025-12-03 21:05:02.008 5654 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 30.306s 2025-12-03 21:05:02.008 5655 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 439 File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
node1 5m 30.306s 2025-12-03 21:05:02.008 5656 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 30.308s 2025-12-03 21:05:02.010 5634 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 30.308s 2025-12-03 21:05:02.010 5635 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 439 File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 5m 30.308s 2025-12-03 21:05:02.010 5636 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 30.314s 2025-12-03 21:05:02.016 5657 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 30.314s 2025-12-03 21:05:02.016 5658 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 466 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/466 {"round":466,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/466/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 30.316s 2025-12-03 21:05:02.018 5637 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 30.316s 2025-12-03 21:05:02.018 5638 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 466 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/466 {"round":466,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/466/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 30.316s 2025-12-03 21:05:02.018 5659 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/16
node0 5m 30.318s 2025-12-03 21:05:02.020 5639 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/16
node3 5m 30.562s 2025-12-03 21:05:02.264 5648 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 466 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/466
node3 5m 30.563s 2025-12-03 21:05:02.265 5649 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node3 5m 30.657s 2025-12-03 21:05:02.359 5683 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node3 5m 30.660s 2025-12-03 21:05:02.362 5684 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 466
Timestamp: 2025-12-03T21:05:00.358576Z
Next consensus number: 10778
Legacy running event hash: 72bc25ff55a41ec9618e83c166359415080746872deff2264d6a982317a338237910062fe5fe6686f6c3d3b5f0cddc28
Legacy running event mnemonic: fiscal-say-region-summer
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1679564926
Root hash: ee11fedf0b19c50da58a9bc61eac74623e00039f11a370916e25524f114547880b655c2fe9ed1f921dee6af6bcf326c5
(root) ConsistencyTestingToolState / digital-category-glimpse-fury
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 border-you-demand-admit
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3400946007740669306 /3 goat-valid-faint-faint
    4 StringLeaf 466 /4 belt-robust-today-work
node3 5m 30.667s 2025-12-03 21:05:02.369 5685 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 30.668s 2025-12-03 21:05:02.370 5686 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 439 File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 30.668s 2025-12-03 21:05:02.370 5687 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 30.672s 2025-12-03 21:05:02.374 5656 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 466 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/466
node2 5m 30.673s 2025-12-03 21:05:02.375 5657 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node3 5m 30.676s 2025-12-03 21:05:02.378 5688 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 30.676s 2025-12-03 21:05:02.378 5689 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 466 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/466 {"round":466,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/466/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 5m 30.678s 2025-12-03 21:05:02.380 5690 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/16
node2 5m 30.762s 2025-12-03 21:05:02.464 5701 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 466
node2 5m 30.764s 2025-12-03 21:05:02.466 5702 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 466
Timestamp: 2025-12-03T21:05:00.358576Z
Next consensus number: 10778
Legacy running event hash: 72bc25ff55a41ec9618e83c166359415080746872deff2264d6a982317a338237910062fe5fe6686f6c3d3b5f0cddc28
Legacy running event mnemonic: fiscal-say-region-summer
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1679564926
Root hash: ee11fedf0b19c50da58a9bc61eac74623e00039f11a370916e25524f114547880b655c2fe9ed1f921dee6af6bcf326c5
(root) ConsistencyTestingToolState / digital-category-glimpse-fury
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 border-you-demand-admit
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3400946007740669306 /3 goat-valid-faint-faint
    4 StringLeaf 466 /4 belt-robust-today-work
node2 5m 30.770s 2025-12-03 21:05:02.472 5703 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 30.770s 2025-12-03 21:05:02.472 5704 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 439 File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 30.770s 2025-12-03 21:05:02.472 5705 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 30.778s 2025-12-03 21:05:02.480 5706 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 30.779s 2025-12-03 21:05:02.481 5707 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 466 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/466 {"round":466,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/466/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 30.780s 2025-12-03 21:05:02.482 5708 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/16
node4 5m 55.362s 2025-12-03 21:05:27.064 1 INFO STARTUP <main> StaticPlatformBuilder:
//////////////////////
// Node is Starting //
//////////////////////
node4 5m 55.472s 2025-12-03 21:05:27.174 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 55.491s 2025-12-03 21:05:27.193 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 55.627s 2025-12-03 21:05:27.329 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 55.633s 2025-12-03 21:05:27.335 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 55.648s 2025-12-03 21:05:27.350 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 56.140s 2025-12-03 21:05:27.842 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 56.141s 2025-12-03 21:05:27.843 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 57.352s 2025-12-03 21:05:29.054 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1210ms
node4 5m 57.361s 2025-12-03 21:05:29.063 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 57.365s 2025-12-03 21:05:29.067 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 57.410s 2025-12-03 21:05:29.112 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 5m 57.485s 2025-12-03 21:05:29.187 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 5m 57.486s 2025-12-03 21:05:29.188 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 59.511s 2025-12-03 21:05:31.213 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 59.610s 2025-12-03 21:05:31.312 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.618s 2025-12-03 21:05:31.320 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/193/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/106/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/16/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/SignedState.swh
node4 5m 59.619s 2025-12-03 21:05:31.321 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 59.619s 2025-12-03 21:05:31.321 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/193/SignedState.swh
node4 5m 59.624s 2025-12-03 21:05:31.326 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 59.628s 2025-12-03 21:05:31.330 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 59.783s 2025-12-03 21:05:31.485 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 59.788s 2025-12-03 21:05:31.490 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":193,"consensusTimestamp":"2025-12-03T21:02:00.317182Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
node4 5m 59.792s 2025-12-03 21:05:31.494 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.803s 2025-12-03 21:05:31.505 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 59.807s 2025-12-03 21:05:31.509 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 59.815s 2025-12-03 21:05:31.517 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 59.818s 2025-12-03 21:05:31.520 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 6.014m 2025-12-03 21:05:32.555 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26186290]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=199330, randomLong=966010218878518803, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=10830, randomLong=6973262774739336960, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=904669, data=35, exception=null]
OS Health Check Report - Complete (took 1017 ms)
node4 6.015m 2025-12-03 21:05:32.582 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 6.016m 2025-12-03 21:05:32.670 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 262
node4 6.016m 2025-12-03 21:05:32.673 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 6.016m 2025-12-03 21:05:32.676 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 6m 1.053s 2025-12-03 21:05:32.755 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "I8GipA==", "port": 30124 }, { "ipAddressV4": "CoAALw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IkOpMg==", "port": 30125 }, { "ipAddressV4": "CoAAMA==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "I94SJg==", "port": 30126 }, { "ipAddressV4": "CoAAMQ==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "I7xTNA==", "port": 30127 }, { "ipAddressV4": "CoAAMg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "I8EAYQ==", "port": 30128 }, { "ipAddressV4": "CoAALg==", "port": 30128 }] }] }
node4 6m 1.078s 2025-12-03 21:05:32.780 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long -3360451185516512433.
node4 6m 1.079s 2025-12-03 21:05:32.781 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 193 rounds handled.
node4 6m 1.079s 2025-12-03 21:05:32.781 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 1.080s 2025-12-03 21:05:32.782 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 1.896s 2025-12-03 21:05:33.598 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 193
Timestamp: 2025-12-03T21:02:00.317182Z
Next consensus number: 5433
Legacy running event hash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
Legacy running event mnemonic: light-finish-whip-congress
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -1984754730
Root hash: a5b1e6e1bcbd81e600b1e53e0cf421f41d6382ab863b10bb0f79e7f5abcb3ebb275244fe1b890bc759710ee0aad51289
(root) ConsistencyTestingToolState / enemy-catalog-half-select
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 dial-plastic-multiply-clock
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3360451185516512433 /3 reveal-grass-invite-loan
    4 StringLeaf 193 /4 matter-cram-lucky-whale
node4 6m 2.175s 2025-12-03 21:05:33.877 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: dbf4dd480fd3d688bb67fe4314bf02423715784807feb9446248ee2f63f80ab76937d9bbf0132a6edbe6b6c0efa3f1ce
node4 6m 2.190s 2025-12-03 21:05:33.892 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 166
node4 6m 2.197s 2025-12-03 21:05:33.899 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 6m 2.199s 2025-12-03 21:05:33.901 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 6m 2.200s 2025-12-03 21:05:33.902 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 6m 2.203s 2025-12-03 21:05:33.905 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 6m 2.205s 2025-12-03 21:05:33.907 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 6m 2.206s 2025-12-03 21:05:33.908 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 6m 2.208s 2025-12-03 21:05:33.910 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 166
node4 6m 2.218s 2025-12-03 21:05:33.920 69 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 218.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6m 2.426s 2025-12-03 21:05:34.128 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:de9e9d3381c8 BR:191), num remaining: 4
node4 6m 2.427s 2025-12-03 21:05:34.129 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:0a41598d35de BR:192), num remaining: 3
node4 6m 2.428s 2025-12-03 21:05:34.130 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:ccc7e5b76aa1 BR:191), num remaining: 2
node4 6m 2.429s 2025-12-03 21:05:34.131 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:b83ef6834578 BR:191), num remaining: 1
node4 6m 2.430s 2025-12-03 21:05:34.132 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:3b24bf627098 BR:191), num remaining: 0
node4 6m 2.813s 2025-12-03 21:05:34.515 677 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 2,632 preconsensus events with max birth round 262. These events contained 6,781 transactions. 68 rounds reached consensus spanning 44.5 seconds of consensus time. The latest round to reach consensus is round 261. Replay took 603.0 milliseconds.
node4 6m 2.816s 2025-12-03 21:05:34.518 680 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6m 2.816s 2025-12-03 21:05:34.518 681 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 596.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 6m 3.703s 2025-12-03 21:05:35.405 692 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 884.0 ms in OBSERVING. Now in BEHIND
node4 6m 3.704s 2025-12-03 21:05:35.406 693 INFO RECONNECT <platformForkJoinThread-8> ReconnectController: Starting ReconnectController
node4 6m 3.705s 2025-12-03 21:05:35.407 694 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 6m 3.705s 2025-12-03 21:05:35.407 695 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 6m 3.706s 2025-12-03 21:05:35.408 696 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 6m 3.707s 2025-12-03 21:05:35.409 697 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 6m 3.708s 2025-12-03 21:05:35.410 698 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
node0 6m 3.939s 2025-12-03 21:05:35.641 6221 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":0,"otherNodeId":4,"round":517} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node0 6m 3.940s 2025-12-03 21:05:35.642 6222 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 517
Timestamp: 2025-12-03T21:05:34.037049Z
Next consensus number: 11599
Legacy running event hash: 23a7547602cc87ffc45794f456b83d719744f1e333bc1ee0d79401965bfaecd5d1ac19f361485d7332c37618809ac31b
Legacy running event mnemonic: insane-mom-flower-ability
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1954272148
Root hash: 15101fd4cd190e024cf19b4e5856991cd34f5b561cb279dcae4e5d052c19cf0e0fe659fa590e209d6bec32101b5700b8
(root) ConsistencyTestingToolState / pear-whisper-friend-scare
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 lounge-treat-cabin-finish
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3489044930256852996 /3 bronze-helmet-pioneer-bulb
    4 StringLeaf 517 /4 arrest-farm-inherit-finish
node0 6m 3.941s 2025-12-03 21:05:35.643 6223 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 1, 2 (signing weight = 37500000000/50000000000) for state hash 15101fd4cd190e024cf19b4e5856991cd34f5b561cb279dcae4e5d052c19cf0e0fe659fa590e209d6bec32101b5700b8
node0 6m 3.941s 2025-12-03 21:05:35.643 6224 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node0 6m 3.952s 2025-12-03 21:05:35.654 6225 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node0 6m 3.962s 2025-12-03 21:05:35.664 6226 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1af17f2b start run()
node4 6m 4.010s 2025-12-03 21:05:35.712 699 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":0,"round":260} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 6m 4.013s 2025-12-03 21:05:35.715 700 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 6m 4.014s 2025-12-03 21:05:35.716 701 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 1, 2
node4 6m 4.018s 2025-12-03 21:05:35.720 702 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 6m 4.018s 2025-12-03 21:05:35.720 703 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 6m 4.019s 2025-12-03 21:05:35.721 704 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 4.027s 2025-12-03 21:05:35.729 705 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@40bb517b start run()
node4 6m 4.034s 2025-12-03 21:05:35.736 706 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node0 6m 4.117s 2025-12-03 21:05:35.819 6240 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1af17f2b finish run()
node0 6m 4.117s 2025-12-03 21:05:35.819 6241 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> TeachingSynchronizer: finished sending tree
node0 6m 4.118s 2025-12-03 21:05:35.820 6242 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node0 6m 4.119s 2025-12-03 21:05:35.821 6243 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1ea61859 start run()
node4 6m 4.246s 2025-12-03 21:05:35.948 730 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 4.247s 2025-12-03 21:05:35.949 731 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 4.248s 2025-12-03 21:05:35.950 732 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@40bb517b finish run()
node4 6m 4.249s 2025-12-03 21:05:35.951 733 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 6m 4.249s 2025-12-03 21:05:35.951 734 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 4.253s 2025-12-03 21:05:35.955 735 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@705f4c85 start run()
node4 6m 4.310s 2025-12-03 21:05:36.012 736 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 6m 4.310s 2025-12-03 21:05:36.012 737 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 6m 4.313s 2025-12-03 21:05:36.015 738 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 6m 4.314s 2025-12-03 21:05:36.016 739 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 6m 4.314s 2025-12-03 21:05:36.016 740 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 6m 4.315s 2025-12-03 21:05:36.017 741 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 6m 4.315s 2025-12-03 21:05:36.017 742 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 6m 4.315s 2025-12-03 21:05:36.017 743 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 6m 4.316s 2025-12-03 21:05:36.018 744 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node0 6m 4.383s 2025-12-03 21:05:36.085 6247 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@1ea61859 finish run()
node0 6m 4.384s 2025-12-03 21:05:36.086 6248 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> TeachingSynchronizer: finished sending tree
node0 6m 4.386s 2025-12-03 21:05:36.088 6251 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 6m 4.488s 2025-12-03 21:05:36.190 754 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 6m 4.489s 2025-12-03 21:05:36.191 756 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 6m 4.490s 2025-12-03 21:05:36.192 757 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 6m 4.490s 2025-12-03 21:05:36.192 758 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 6m 4.491s 2025-12-03 21:05:36.193 759 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@705f4c85 finish run()
node4 6m 4.492s 2025-12-03 21:05:36.194 760 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 6m 4.492s 2025-12-03 21:05:36.194 761 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 6m 4.492s 2025-12-03 21:05:36.194 762 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 6m 4.493s 2025-12-03 21:05:36.195 763 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 6m 4.493s 2025-12-03 21:05:36.195 764 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 6m 4.493s 2025-12-03 21:05:36.195 765 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 6m 4.493s 2025-12-03 21:05:36.195 766 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 6m 4.495s 2025-12-03 21:05:36.197 767 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 6m 4.496s 2025-12-03 21:05:36.198 768 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 6m 4.502s 2025-12-03 21:05:36.204 769 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.47400000000000003,"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 6m 4.504s 2025-12-03 21:05:36.206 770 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 6m 4.504s 2025-12-03 21:05:36.206 771 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 6m 4.509s 2025-12-03 21:05:36.211 772 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.006054878234863281} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 6m 4.514s 2025-12-03 21:05:36.216 773 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":0,"round":517,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 4.515s 2025-12-03 21:05:36.217 774 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 517
Timestamp: 2025-12-03T21:05:34.037049Z
Next consensus number: 11599
Legacy running event hash: 23a7547602cc87ffc45794f456b83d719744f1e333bc1ee0d79401965bfaecd5d1ac19f361485d7332c37618809ac31b
Legacy running event mnemonic: insane-mom-flower-ability
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1954272148
Root hash: 15101fd4cd190e024cf19b4e5856991cd34f5b561cb279dcae4e5d052c19cf0e0fe659fa590e209d6bec32101b5700b8
(root) ConsistencyTestingToolState / pear-whisper-friend-scare
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 lounge-treat-cabin-finish
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3489044930256852996 /3 bronze-helmet-pioneer-bulb
    4 StringLeaf 517 /4 arrest-farm-inherit-finish
node4 6m 4.516s 2025-12-03 21:05:36.218 776 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 6m 4.517s 2025-12-03 21:05:36.219 777 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long -3489044930256852996.
node4 6m 4.517s 2025-12-03 21:05:36.219 778 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 517 rounds handled.
node4 6m 4.517s 2025-12-03 21:05:36.219 779 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 6m 4.518s 2025-12-03 21:05:36.220 780 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
node4 6m 4.544s 2025-12-03 21:05:36.246 787 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 517 created, will eventually be written to disk, for reason: RECONNECT
node4 6m 4.545s 2025-12-03 21:05:36.247 788 INFO PLATFORM_STATUS <platformForkJoinThread-8> StatusStateMachine: Platform spent 840.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 6m 4.546s 2025-12-03 21:05:36.248 790 INFO STARTUP <platformForkJoinThread-7> Shadowgraph: Shadowgraph starting from expiration threshold 490
node4 6m 4.548s 2025-12-03 21:05:36.250 792 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 517 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/517
node4 6m 4.549s 2025-12-03 21:05:36.251 793 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 517
node4 6m 4.551s 2025-12-03 21:05:36.253 794 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 23a7547602cc87ffc45794f456b83d719744f1e333bc1ee0d79401965bfaecd5d1ac19f361485d7332c37618809ac31b
node4 6m 4.552s 2025-12-03 21:05:36.254 795 INFO STARTUP <platformForkJoinThread-4> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr262_orgn0.pces. All future files will have an origin round of 517.
node0 6m 4.584s 2025-12-03 21:05:36.286 6252 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":0,"otherNodeId":4,"round":517,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 6m 4.753s 2025-12-03 21:05:36.455 838 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/3 for round 517
node4 6m 4.757s 2025-12-03 21:05:36.459 839 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 517
Timestamp: 2025-12-03T21:05:34.037049Z
Next consensus number: 11599
Legacy running event hash: 23a7547602cc87ffc45794f456b83d719744f1e333bc1ee0d79401965bfaecd5d1ac19f361485d7332c37618809ac31b
Legacy running event mnemonic: insane-mom-flower-ability
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1954272148
Root hash: 15101fd4cd190e024cf19b4e5856991cd34f5b561cb279dcae4e5d052c19cf0e0fe659fa590e209d6bec32101b5700b8
(root) ConsistencyTestingToolState / pear-whisper-friend-scare
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 lounge-treat-cabin-finish
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3489044930256852996 /3 bronze-helmet-pioneer-bulb
    4 StringLeaf 517 /4 arrest-farm-inherit-finish
node4 6m 4.799s 2025-12-03 21:05:36.501 840 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr262_orgn0.pces
node4 6m 4.800s 2025-12-03 21:05:36.502 841 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 490
node4 6m 4.805s 2025-12-03 21:05:36.507 842 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 517 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/517 {"round":517,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/517/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 4.810s 2025-12-03 21:05:36.512 843 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 263.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6m 5.212s 2025-12-03 21:05:36.914 844 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 6m 5.216s 2025-12-03 21:05:36.918 845 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 6m 5.515s 2025-12-03 21:05:37.217 846 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:e8272b4447c0 BR:515), num remaining: 3
node4 6m 5.519s 2025-12-03 21:05:37.221 847 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:5f39804dde91 BR:515), num remaining: 2
node4 6m 5.519s 2025-12-03 21:05:37.221 848 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:c65f0282f548 BR:516), num remaining: 1
node4 6m 5.520s 2025-12-03 21:05:37.222 849 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:248c8b91a55b BR:516), num remaining: 0
node4 6m 10.140s 2025-12-03 21:05:41.842 943 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 5.3 s in CHECKING. Now in ACTIVE
node3 6m 29.774s 2025-12-03 21:06:01.476 6733 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 557 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 29.842s 2025-12-03 21:06:01.544 6678 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 557 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 29.843s 2025-12-03 21:06:01.545 6695 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 557 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 29.967s 2025-12-03 21:06:01.669 6747 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 557 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 30.077s 2025-12-03 21:06:01.779 1266 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 557 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 30.256s 2025-12-03 21:06:01.958 6691 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 557 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/557
node0 6m 30.257s 2025-12-03 21:06:01.959 6692 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 557
node1 6m 30.319s 2025-12-03 21:06:02.021 6698 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 557 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/557
node1 6m 30.319s 2025-12-03 21:06:02.021 6699 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 557
node3 6m 30.327s 2025-12-03 21:06:02.029 6736 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 557 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/557
node3 6m 30.327s 2025-12-03 21:06:02.029 6737 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 557
node0 6m 30.354s 2025-12-03 21:06:02.056 6731 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/52 for round 557
node0 6m 30.356s 2025-12-03 21:06:02.058 6732 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 557
Timestamp: 2025-12-03T21:06:00.027272674Z
Next consensus number: 12389
Legacy running event hash: 57607f2f32e22de65eb9cdcbf1ca2b3fe5f12b72c474111db8e501e54b6dc201bb70a25e1ccec4ea9128a3de66da3357
Legacy running event mnemonic: ugly-tourist-fox-rubber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1352842891
Root hash: 997d5c8cdc2db0cb8410346e8802cb3ab41e145f7e9ef097b8adee310629d9040904735e26d3489355b2fb2e3350fa3b
(root) ConsistencyTestingToolState / spare-muscle-kidney-trade
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 allow-tourist-unfair-shrimp
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3159150264899496267 /3 execute-core-odor-army
    4 StringLeaf 557 /4 choice-wife-verify-offer
node0 6m 30.362s 2025-12-03 21:06:02.064 6733 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T21+05+25.141511185Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 6m 30.362s 2025-12-03 21:06:02.064 6734 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 530
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T21+05+25.141511185Z_seq1_minr474_maxr5474_orgn0.pces
node0 6m 30.362s 2025-12-03 21:06:02.064 6735 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 30.363s 2025-12-03 21:06:02.065 6736 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 30.364s 2025-12-03 21:06:02.066 6737 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 557 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/557 {"round":557,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/557/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 30.365s 2025-12-03 21:06:02.067 6738 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/106
node2 6m 30.389s 2025-12-03 21:06:02.091 6750 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 557 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/557
node2 6m 30.390s 2025-12-03 21:06:02.092 6751 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 557
node1 6m 30.405s 2025-12-03 21:06:02.107 6733 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 557
node1 6m 30.407s 2025-12-03 21:06:02.109 6734 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 557
Timestamp: 2025-12-03T21:06:00.027272674Z
Next consensus number: 12389
Legacy running event hash: 57607f2f32e22de65eb9cdcbf1ca2b3fe5f12b72c474111db8e501e54b6dc201bb70a25e1ccec4ea9128a3de66da3357
Legacy running event mnemonic: ugly-tourist-fox-rubber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1352842891
Root hash: 997d5c8cdc2db0cb8410346e8802cb3ab41e145f7e9ef097b8adee310629d9040904735e26d3489355b2fb2e3350fa3b
(root) ConsistencyTestingToolState / spare-muscle-kidney-trade
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 allow-tourist-unfair-shrimp
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3159150264899496267 /3 execute-core-odor-army
    4 StringLeaf 557 /4 choice-wife-verify-offer
node1 6m 30.413s 2025-12-03 21:06:02.115 6735 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T21+05+25.220744331Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 30.415s 2025-12-03 21:06:02.117 6736 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 530
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T21+05+25.220744331Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 30.415s 2025-12-03 21:06:02.117 6737 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 30.416s 2025-12-03 21:06:02.118 6744 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 30.417s 2025-12-03 21:06:02.119 6749 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 557 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/557 {"round":557,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/557/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 30.418s 2025-12-03 21:06:02.120 6750 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/106
node3 6m 30.420s 2025-12-03 21:06:02.122 6771 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 557
node3 6m 30.422s 2025-12-03 21:06:02.124 6772 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 557
Timestamp: 2025-12-03T21:06:00.027272674Z
Next consensus number: 12389
Legacy running event hash: 57607f2f32e22de65eb9cdcbf1ca2b3fe5f12b72c474111db8e501e54b6dc201bb70a25e1ccec4ea9128a3de66da3357
Legacy running event mnemonic: ugly-tourist-fox-rubber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1352842891
Root hash: 997d5c8cdc2db0cb8410346e8802cb3ab41e145f7e9ef097b8adee310629d9040904735e26d3489355b2fb2e3350fa3b
(root) ConsistencyTestingToolState / spare-muscle-kidney-trade
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 allow-tourist-unfair-shrimp
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3159150264899496267 /3 execute-core-odor-army
    4 StringLeaf 557 /4 choice-wife-verify-offer
node3 6m 30.429s 2025-12-03 21:06:02.131 6773 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T21+05+25.244137290Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 30.432s 2025-12-03 21:06:02.134 6774 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 530
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T21+05+25.244137290Z_seq1_minr474_maxr5474_orgn0.pces
node3 6m 30.432s 2025-12-03 21:06:02.134 6775 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 6m 30.433s 2025-12-03 21:06:02.135 6776 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 30.434s 2025-12-03 21:06:02.136 6777 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 557 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/557 {"round":557,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/557/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 30.435s 2025-12-03 21:06:02.137 6778 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/106
node2 6m 30.473s 2025-12-03 21:06:02.175 6783 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 557
node2 6m 30.475s 2025-12-03 21:06:02.177 6786 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 557
Timestamp: 2025-12-03T21:06:00.027272674Z
Next consensus number: 12389
Legacy running event hash: 57607f2f32e22de65eb9cdcbf1ca2b3fe5f12b72c474111db8e501e54b6dc201bb70a25e1ccec4ea9128a3de66da3357
Legacy running event mnemonic: ugly-tourist-fox-rubber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1352842891
Root hash: 997d5c8cdc2db0cb8410346e8802cb3ab41e145f7e9ef097b8adee310629d9040904735e26d3489355b2fb2e3350fa3b
(root) ConsistencyTestingToolState / spare-muscle-kidney-trade
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 allow-tourist-unfair-shrimp
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3159150264899496267 /3 execute-core-odor-army
    4 StringLeaf 557 /4 choice-wife-verify-offer
node2 6m 30.482s 2025-12-03 21:06:02.184 6787 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T21+05+25.203631551Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 6m 30.485s 2025-12-03 21:06:02.187 6788 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 530
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T21+05+25.203631551Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 30.485s 2025-12-03 21:06:02.187 6789 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 30.486s 2025-12-03 21:06:02.188 6790 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 30.487s 2025-12-03 21:06:02.189 6791 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 557 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/557 {"round":557,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/557/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 30.488s 2025-12-03 21:06:02.190 6792 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/106
node4 6m 30.538s 2025-12-03 21:06:02.240 1269 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 557 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/557
node4 6m 30.538s 2025-12-03 21:06:02.240 1270 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 557
node4 6m 30.662s 2025-12-03 21:06:02.364 1319 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/10 for round 557
node4 6m 30.665s 2025-12-03 21:06:02.367 1320 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 557
Timestamp: 2025-12-03T21:06:00.027272674Z
Next consensus number: 12389
Legacy running event hash: 57607f2f32e22de65eb9cdcbf1ca2b3fe5f12b72c474111db8e501e54b6dc201bb70a25e1ccec4ea9128a3de66da3357
Legacy running event mnemonic: ugly-tourist-fox-rubber
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1352842891
Root hash: 997d5c8cdc2db0cb8410346e8802cb3ab41e145f7e9ef097b8adee310629d9040904735e26d3489355b2fb2e3350fa3b
(root) ConsistencyTestingToolState / spare-muscle-kidney-trade
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 allow-tourist-unfair-shrimp
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf -3159150264899496267 /3 execute-core-odor-army
    4 StringLeaf 557 /4 choice-wife-verify-offer
node4 6m 30.675s 2025-12-03 21:06:02.377 1321 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr262_orgn0.pces
Last file: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T21+05+36.789238347Z_seq1_minr490_maxr990_orgn517.pces
node4 6m 30.675s 2025-12-03 21:06:02.377 1322 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 530
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T21+05+36.789238347Z_seq1_minr490_maxr990_orgn517.pces
node4 6m 30.675s 2025-12-03 21:06:02.377 1323 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 30.678s 2025-12-03 21:06:02.380 1324 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 30.679s 2025-12-03 21:06:02.381 1325 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 557 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/557 {"round":557,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/557/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 30.681s 2025-12-03 21:06:02.383 1326 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
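The `BestEffortPcesFileCopy` entries above select which preconsensus-event (PCES) files to copy into a snapshot: the filenames encode a sequence number (`seq`), a round range (`minr`/`maxr`), and an origin (`orgn`), and with a lower bound of 530 only the file whose range still reaches that round qualifies. As a minimal sketch of that selection rule, assuming the filename layout seen in these logs and using hypothetical helper names (`parse_pces_name`, `files_to_copy` are not the platform's actual API):

```python
import re

# Filename layout observed in the logs above:
#   <timestamp>_seq<N>_minr<A>_maxr<B>_orgn<O>.pces
PCES_NAME = re.compile(
    r"_seq(?P<seq>\d+)_minr(?P<minr>\d+)_maxr(?P<maxr>\d+)_orgn(?P<orgn>\d+)\.pces$"
)

def parse_pces_name(name: str) -> dict:
    """Extract the seq/minr/maxr/orgn fields from a PCES filename."""
    m = PCES_NAME.search(name)
    if m is None:
        raise ValueError(f"not a PCES filename: {name}")
    return {k: int(v) for k, v in m.groupdict().items()}

def files_to_copy(names: list[str], lower_bound: int) -> list[str]:
    """Keep files whose round range may still hold events >= lower_bound
    (an assumed criterion, inferred from the log output)."""
    return [n for n in names if parse_pces_name(n)["maxr"] >= lower_bound]

files = [
    "2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces",
    "2025-12-03T21+05+25.220744331Z_seq1_minr474_maxr5474_orgn0.pces",
]
# Only the seq1 file (maxr 5474) reaches lower bound 530, matching the
# "Found 2 ... / Found 1 ... meeting specified criteria" pair above.
print(files_to_copy(files, 530))
```

Under that assumed rule, the seq0 file (maxr 501) is skipped, which is consistent with each node reporting 2 files on disk but copying only 1.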
node3 7m 29.634s 2025-12-03 21:07:01.336 7836 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 653 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 7m 29.653s 2025-12-03 21:07:01.355 7805 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 653 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 29.769s 2025-12-03 21:07:01.471 7886 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 653 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 29.849s 2025-12-03 21:07:01.551 2384 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 653 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 29.918s 2025-12-03 21:07:01.620 7806 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 653 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 30.069s 2025-12-03 21:07:01.771 7809 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 653 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/653
node1 7m 30.069s 2025-12-03 21:07:01.771 7810 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 653
node3 7m 30.130s 2025-12-03 21:07:01.832 7849 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 653 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/653
node3 7m 30.131s 2025-12-03 21:07:01.833 7850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 653
node1 7m 30.155s 2025-12-03 21:07:01.857 7841 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 653
node1 7m 30.157s 2025-12-03 21:07:01.859 7842 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 653
Timestamp: 2025-12-03T21:07:00.069771Z
Next consensus number: 14841
Legacy running event hash: d83842dbcd5f7cf8b932f713c328f097c6309900c61ca9f8130444e2a22a025897178ccfa42c271e223cb666931fffbf
Legacy running event mnemonic: weekend-fork-feature-pair
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 983617797
Root hash: 724220eaa4bcc4098f09272302144fc8e4587c46efde5a8bff86ab9cff2c84cc41ace57a2af6969299efbecdb619394a
(root) ConsistencyTestingToolState / embark-bubble-hammer-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 swear-foil-lock-cycle
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf 3499517450620198174 /3 decade-earn-utility-surge
    4 StringLeaf 653 /4 practice-relief-athlete-cloth
node1 7m 30.164s 2025-12-03 21:07:01.866 7843 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T20+59+48.207163779Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T21+05+25.220744331Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 30.164s 2025-12-03 21:07:01.866 7844 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 626
File: data/saved/preconsensus-events/1/2025/12/03/2025-12-03T21+05+25.220744331Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 30.164s 2025-12-03 21:07:01.866 7845 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 30.167s 2025-12-03 21:07:01.869 7846 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 30.167s 2025-12-03 21:07:01.869 7847 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 653 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/653 {"round":653,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/653/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 30.169s 2025-12-03 21:07:01.871 7848 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/193
node2 7m 30.192s 2025-12-03 21:07:01.894 7899 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 653 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/653
node2 7m 30.193s 2025-12-03 21:07:01.895 7900 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 653
node0 7m 30.201s 2025-12-03 21:07:01.903 7818 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 653 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/653
node0 7m 30.202s 2025-12-03 21:07:01.904 7819 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for round 653
node3 7m 30.222s 2025-12-03 21:07:01.924 7881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 653
node3 7m 30.225s 2025-12-03 21:07:01.927 7882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 653
Timestamp: 2025-12-03T21:07:00.069771Z
Next consensus number: 14841
Legacy running event hash: d83842dbcd5f7cf8b932f713c328f097c6309900c61ca9f8130444e2a22a025897178ccfa42c271e223cb666931fffbf
Legacy running event mnemonic: weekend-fork-feature-pair
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 983617797
Root hash: 724220eaa4bcc4098f09272302144fc8e4587c46efde5a8bff86ab9cff2c84cc41ace57a2af6969299efbecdb619394a
(root) ConsistencyTestingToolState / embark-bubble-hammer-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 swear-foil-lock-cycle
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf 3499517450620198174 /3 decade-earn-utility-surge
    4 StringLeaf 653 /4 practice-relief-athlete-cloth
node3 7m 30.233s 2025-12-03 21:07:01.935 7883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T21+05+25.244137290Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T20+59+48.093206394Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 30.234s 2025-12-03 21:07:01.936 7884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 626
File: data/saved/preconsensus-events/3/2025/12/03/2025-12-03T21+05+25.244137290Z_seq1_minr474_maxr5474_orgn0.pces
node3 7m 30.234s 2025-12-03 21:07:01.936 7885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 7m 30.237s 2025-12-03 21:07:01.939 7886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 30.237s 2025-12-03 21:07:01.939 7887 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 653 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/653 {"round":653,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/653/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 30.239s 2025-12-03 21:07:01.941 7888 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/193
node2 7m 30.278s 2025-12-03 21:07:01.980 7931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/57 for round 653
node2 7m 30.279s 2025-12-03 21:07:01.981 7932 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 653
Timestamp: 2025-12-03T21:07:00.069771Z
Next consensus number: 14841
Legacy running event hash: d83842dbcd5f7cf8b932f713c328f097c6309900c61ca9f8130444e2a22a025897178ccfa42c271e223cb666931fffbf
Legacy running event mnemonic: weekend-fork-feature-pair
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 983617797
Root hash: 724220eaa4bcc4098f09272302144fc8e4587c46efde5a8bff86ab9cff2c84cc41ace57a2af6969299efbecdb619394a
(root) ConsistencyTestingToolState / embark-bubble-hammer-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 swear-foil-lock-cycle
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf 3499517450620198174 /3 decade-earn-utility-surge
    4 StringLeaf 653 /4 practice-relief-athlete-cloth
node2 7m 30.287s 2025-12-03 21:07:01.989 7933 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T21+05+25.203631551Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T20+59+47.750846119Z_seq0_minr1_maxr501_orgn0.pces
node2 7m 30.287s 2025-12-03 21:07:01.989 7934 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 626
File: data/saved/preconsensus-events/2/2025/12/03/2025-12-03T21+05+25.203631551Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 30.287s 2025-12-03 21:07:01.989 7935 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 30.290s 2025-12-03 21:07:01.992 7936 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 30.290s 2025-12-03 21:07:01.992 7937 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 653 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/653 {"round":653,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/653/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 30.292s 2025-12-03 21:07:01.994 7938 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/193
node0 7m 30.302s 2025-12-03 21:07:02.004 7863 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/59 for round 653
node0 7m 30.304s 2025-12-03 21:07:02.006 7864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 653
Timestamp: 2025-12-03T21:07:00.069771Z
Next consensus number: 14841
Legacy running event hash: d83842dbcd5f7cf8b932f713c328f097c6309900c61ca9f8130444e2a22a025897178ccfa42c271e223cb666931fffbf
Legacy running event mnemonic: weekend-fork-feature-pair
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 983617797
Root hash: 724220eaa4bcc4098f09272302144fc8e4587c46efde5a8bff86ab9cff2c84cc41ace57a2af6969299efbecdb619394a
(root) ConsistencyTestingToolState / embark-bubble-hammer-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 swear-foil-lock-cycle
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf 3499517450620198174 /3 decade-earn-utility-surge
    4 StringLeaf 653 /4 practice-relief-athlete-cloth
node0 7m 30.311s 2025-12-03 21:07:02.013 7865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T21+05+25.141511185Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T20+59+48.129734720Z_seq0_minr1_maxr501_orgn0.pces
node0 7m 30.312s 2025-12-03 21:07:02.014 7866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 626
File: data/saved/preconsensus-events/0/2025/12/03/2025-12-03T21+05+25.141511185Z_seq1_minr474_maxr5474_orgn0.pces
node0 7m 30.312s 2025-12-03 21:07:02.014 7867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 7m 30.315s 2025-12-03 21:07:02.017 7868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 30.315s 2025-12-03 21:07:02.017 7869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 653 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/653 {"round":653,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/653/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 30.316s 2025-12-03 21:07:02.018 7870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/193
node4 7m 30.337s 2025-12-03 21:07:02.039 2387 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 653 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/653
node4 7m 30.338s 2025-12-03 21:07:02.040 2388 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for round 653
node4 7m 30.456s 2025-12-03 21:07:02.158 2433 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/17 for round 653
node4 7m 30.459s 2025-12-03 21:07:02.161 2434 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 653
Timestamp: 2025-12-03T21:07:00.069771Z
Next consensus number: 14841
Legacy running event hash: d83842dbcd5f7cf8b932f713c328f097c6309900c61ca9f8130444e2a22a025897178ccfa42c271e223cb666931fffbf
Legacy running event mnemonic: weekend-fork-feature-pair
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 983617797
Root hash: 724220eaa4bcc4098f09272302144fc8e4587c46efde5a8bff86ab9cff2c84cc41ace57a2af6969299efbecdb619394a
(root) ConsistencyTestingToolState / embark-bubble-hammer-leave
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 swear-foil-lock-cycle
    1 SingletonNode RosterService.ROSTER_STATE /1 deposit-fog-skull-wait
    2 VirtualMap RosterService.ROSTERS /2 situate-high-grain-issue
    3 StringLeaf 3499517450620198174 /3 decade-earn-utility-surge
    4 StringLeaf 653 /4 practice-relief-athlete-cloth
node4 7m 30.467s 2025-12-03 21:07:02.169 2435 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T20+59+48.283102362Z_seq0_minr1_maxr262_orgn0.pces
Last file: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T21+05+36.789238347Z_seq1_minr490_maxr990_orgn517.pces
node4 7m 30.468s 2025-12-03 21:07:02.170 2436 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 626
File: data/saved/preconsensus-events/4/2025/12/03/2025-12-03T21+05+36.789238347Z_seq1_minr490_maxr990_orgn517.pces
node4 7m 30.468s 2025-12-03 21:07:02.170 2437 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 30.471s 2025-12-03 21:07:02.173 2438 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 30.472s 2025-12-03 21:07:02.174 2439 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 653 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/653 {"round":653,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/653/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 30.473s 2025-12-03 21:07:02.175 2440 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/16
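Each `Finished writing state` entry above is followed by `FileUtils: deleting directory .../<nodeId>/123/<oldRound>` (round 106 deleted after 557 was written, round 193 after 653), i.e. saved-state directories are pruned so only a bounded number of recent rounds stay on disk. A minimal sketch of that retention pattern, assuming round-numbered directories under a common base; `prune_saved_states` is a hypothetical helper, not the platform's actual `FileUtils` API:

```python
import shutil
import tempfile
from pathlib import Path

def prune_saved_states(base: Path, keep: int) -> list[Path]:
    """Remove the oldest numeric round directories under `base`,
    keeping only the `keep` highest round numbers (hypothetical sketch)."""
    rounds = sorted(
        (d for d in base.iterdir() if d.is_dir() and d.name.isdigit()),
        key=lambda d: int(d.name),
    )
    doomed = rounds[: len(rounds) - keep] if keep < len(rounds) else []
    for d in doomed:
        shutil.rmtree(d)  # delete the stale snapshot directory
    return doomed

# Demo: rounds 193 and 653 on disk; a retention of 1 prunes round 193,
# mirroring the deletions logged after each periodic snapshot.
base = Path(tempfile.mkdtemp())
for r in ("193", "653"):
    (base / r).mkdir()
prune_saved_states(base, keep=1)
print(sorted(d.name for d in base.iterdir()))  # → ['653']
```

The sort key converts directory names to integers so that, say, round 1000 is correctly treated as newer than round 999 despite sorting earlier lexicographically.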
node0 7m 59.578s 2025-12-03 21:07:31.280 8349 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 0 to 2>> NetworkUtils: Connection broken: 0 -> 2
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.279917450Z
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
	at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
	at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
	at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
	at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
	at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.279917450Z
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
	at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
	at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
	... 7 more
Caused by: java.net.SocketException: Connection reset
	at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
	at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
	at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
	at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
	at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
	at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
	at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
	at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
	at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
	at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
	at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
	at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
	at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
	at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
	at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
	at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
	at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
	at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
	at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
	... 12 more
node3 7m 59.578s 2025-12-03 21:07:31.280 8362 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 3 to 2>> NetworkUtils: Connection broken: 3 <- 2
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.WaitForAcceptReject.transition(WaitForAcceptReject.java:48)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node4 7m 59.584s 2025-12-03 21:07:31.286 2911 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith2 4 to 2>> NetworkUtils: Connection broken: 4 <- 2
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.283600147Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.283600147Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:180)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readBoolean(DataInputStream.java:255)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readBoolean(AugmentedDataInputStream.java:137)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readMyTipsTheyHave$7(SyncUtils.java:163)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node4 7m 59.739s 2025-12-03 21:07:31.441 2912 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 4 to 1>> NetworkUtils: Connection broken: 4 <- 1
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.FilterInputStream.read(FilterInputStream.java:71)
    at org.hiero.base.io.streams.AugmentedDataInputStream.read(AugmentedDataInputStream.java:57)
    at com.swirlds.platform.network.communication.states.SentInitiate.transition(SentInitiate.java:73)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node0 7m 59.740s 2025-12-03 21:07:31.442 8350 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 0 to 1>> NetworkUtils: Connection broken: 0 -> 1
java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.heartbeats.HeartbeatPeerProtocol.initiateHeartbeat(HeartbeatPeerProtocol.java:112)
    at com.swirlds.platform.heartbeats.HeartbeatPeerProtocol.runProtocol(HeartbeatPeerProtocol.java:156)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
node3 7m 59.740s 2025-12-03 21:07:31.442 8363 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith1 3 to 1>> NetworkUtils: Connection broken: 3 <- 1
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.441868009Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.441868009Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more
node0 8.003m 2025-12-03 21:07:31.902 8351 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith3 0 to 3>> NetworkUtils: Connection broken: 0 -> 3
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.901898173Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:31.901898173Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:347)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:420)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:399)
    at java.base/java.io.DataInputStream.readFully(DataInputStream.java:208)
    at java.base/java.io.DataInputStream.readLong(DataInputStream.java:407)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readLong(AugmentedDataInputStream.java:186)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.deserializeEventWindow(SyncUtils.java:640)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readTheirTipsAndEventWindow$3(SyncUtils.java:104)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 11 more
node0 8.006m 2025-12-03 21:07:32.067 8352 WARN SOCKET_EXCEPTIONS <<platform-core: SyncProtocolWith4 0 to 4>> NetworkUtils: Connection broken: 0 -> 4
java.io.IOException: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:32.066963658Z
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:258)
    at com.swirlds.platform.network.communication.states.ProtocolNegotiated.transition(ProtocolNegotiated.java:47)
    at com.swirlds.platform.network.communication.Negotiator.execute(Negotiator.java:79)
    at com.swirlds.platform.network.communication.ProtocolNegotiatorThread.run(ProtocolNegotiatorThread.java:70)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.doWork(StoppableThreadImpl.java:599)
    at com.swirlds.common.threading.framework.internal.StoppableThreadImpl.run(StoppableThreadImpl.java:200)
    at com.swirlds.common.threading.framework.internal.AbstractThreadConfiguration.lambda$wrapRunnableWithSnapshot$3(AbstractThreadConfiguration.java:654)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: com.swirlds.common.threading.pool.ParallelExecutionException: Time thrown: 2025-12-03T21:07:32.066963658Z
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:148)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.readWriteParallel(ShadowgraphSynchronizer.java:304)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.sendAndReceiveEvents(ShadowgraphSynchronizer.java:241)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.reserveSynchronize(ShadowgraphSynchronizer.java:201)
    at com.swirlds.platform.gossip.shadowgraph.ShadowgraphSynchronizer.synchronize(ShadowgraphSynchronizer.java:113)
    at com.swirlds.platform.gossip.sync.protocol.SyncPeerProtocol.runProtocol(SyncPeerProtocol.java:254)
    ... 7 more
    Suppressed: java.util.concurrent.ExecutionException: java.net.SocketException: Connection or outbound has closed
        at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
        at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:154)
        ... 12 more
    Caused by: java.net.SocketException: Connection or outbound has closed
        at java.base/sun.security.ssl.SSLSocketImpl$AppOutputStream.write(SSLSocketImpl.java:1297)
        at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.write(AbstractStreamExtension.java:115)
        at com.swirlds.common.io.extendable.ExtendableOutputStream.write(ExtendableOutputStream.java:64)
        at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
        at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
        at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
        at java.base/java.io.DataOutputStream.flush(DataOutputStream.java:131)
        at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$sendEventsTheyNeed$8(SyncUtils.java:234)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:24)
        at org.hiero.base.concurrent.ThrowingRunnable.call(ThrowingRunnable.java:9)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        ... 2 more
Caused by: java.net.SocketException: Connection reset
    at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:318)
    at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:346)
    at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:796)
    at java.base/java.net.Socket$SocketInputStream.read(Socket.java:1099)
    at java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:489)
    at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:483)
    at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(SSLSocketInputRecord.java:70)
    at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(SSLSocketImpl.java:1461)
    at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(SSLSocketImpl.java:1066)
    at com.swirlds.common.io.extendable.extensions.AbstractStreamExtension.read(AbstractStreamExtension.java:73)
    at com.swirlds.common.io.extendable.ExtendableInputStream.read(ExtendableInputStream.java:63)
    at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:291)
    at java.base/java.io.BufferedInputStream.implRead(BufferedInputStream.java:325)
    at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:312)
    at java.base/java.io.DataInputStream.readUnsignedByte(DataInputStream.java:295)
    at java.base/java.io.DataInputStream.readByte(DataInputStream.java:275)
    at org.hiero.base.io.streams.AugmentedDataInputStream.readByte(AugmentedDataInputStream.java:144)
    at com.swirlds.platform.gossip.shadowgraph.SyncUtils.lambda$readEventsINeed$9(SyncUtils.java:278)
    at com.swirlds.common.threading.pool.CachedPoolParallelExecutor.doParallelWithHandler(CachedPoolParallelExecutor.java:146)
    ... 12 more