node4 0.000ns 2025-09-28 05:43:38.774 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 85.000ms 2025-09-28 05:43:38.859 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 100.000ms 2025-09-28 05:43:38.874 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 210.000ms 2025-09-28 05:43:38.984 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 216.000ms 2025-09-28 05:43:38.990 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 228.000ms 2025-09-28 05:43:39.002 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 265.000ms 2025-09-28 05:43:39.039 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node3 352.000ms 2025-09-28 05:43:39.126 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node3 367.000ms 2025-09-28 05:43:39.141 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node2 414.000ms 2025-09-28 05:43:39.188 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node3 480.000ms 2025-09-28 05:43:39.254 4 INFO STARTUP <main> Browser: The following nodes [3] are set to run locally
node3 486.000ms 2025-09-28 05:43:39.260 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node3 498.000ms 2025-09-28 05:43:39.272 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 509.000ms 2025-09-28 05:43:39.283 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node2 512.000ms 2025-09-28 05:43:39.286 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node2 529.000ms 2025-09-28 05:43:39.303 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 593.000ms 2025-09-28 05:43:39.367 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node1 608.000ms 2025-09-28 05:43:39.382 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 649.000ms 2025-09-28 05:43:39.423 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 650.000ms 2025-09-28 05:43:39.424 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 652.000ms 2025-09-28 05:43:39.426 4 INFO STARTUP <main> Browser: The following nodes [2] are set to run locally
node2 658.000ms 2025-09-28 05:43:39.432 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node2 671.000ms 2025-09-28 05:43:39.445 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node1 721.000ms 2025-09-28 05:43:39.495 4 INFO STARTUP <main> Browser: The following nodes [1] are set to run locally
node1 727.000ms 2025-09-28 05:43:39.501 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node1 739.000ms 2025-09-28 05:43:39.513 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 913.000ms 2025-09-28 05:43:39.687 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node3 914.000ms 2025-09-28 05:43:39.688 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node2 1.114s 2025-09-28 05:43:39.888 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node2 1.115s 2025-09-28 05:43:39.889 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node1 1.147s 2025-09-28 05:43:39.921 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node1 1.148s 2025-09-28 05:43:39.922 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 1.544s 2025-09-28 05:43:40.318 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 894ms
node4 1.553s 2025-09-28 05:43:40.327 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 1.556s 2025-09-28 05:43:40.330 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 1.595s 2025-09-28 05:43:40.369 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node4 1.656s 2025-09-28 05:43:40.430 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node4 1.657s 2025-09-28 05:43:40.431 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node3 1.739s 2025-09-28 05:43:40.513 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 825ms
node3 1.748s 2025-09-28 05:43:40.522 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node3 1.752s 2025-09-28 05:43:40.526 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node3 1.789s 2025-09-28 05:43:40.563 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node3 1.861s 2025-09-28 05:43:40.635 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node3 1.862s 2025-09-28 05:43:40.636 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node1 1.998s 2025-09-28 05:43:40.772 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 850ms
node1 2.007s 2025-09-28 05:43:40.781 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node1 2.011s 2025-09-28 05:43:40.785 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 2.051s 2025-09-28 05:43:40.825 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 2.092s 2025-09-28 05:43:40.866 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 977ms
node2 2.101s 2025-09-28 05:43:40.875 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node2 2.103s 2025-09-28 05:43:40.877 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 2.109s 2025-09-28 05:43:40.883 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node1 2.110s 2025-09-28 05:43:40.884 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 2.143s 2025-09-28 05:43:40.917 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node2 2.201s 2025-09-28 05:43:40.975 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node2 2.202s 2025-09-28 05:43:40.976 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node0 2.242s 2025-09-28 05:43:41.016 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node0 2.339s 2025-09-28 05:43:41.113 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node0 2.356s 2025-09-28 05:43:41.130 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node0 2.480s 2025-09-28 05:43:41.254 4 INFO STARTUP <main> Browser: The following nodes [0] are set to run locally
node0 2.487s 2025-09-28 05:43:41.261 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node0 2.500s 2025-09-28 05:43:41.274 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 2.965s 2025-09-28 05:43:41.739 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node0 2.966s 2025-09-28 05:43:41.740 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 3.660s 2025-09-28 05:43:42.434 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 3.747s 2025-09-28 05:43:42.521 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 3.749s 2025-09-28 05:43:42.523 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node4 3.750s 2025-09-28 05:43:42.524 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node3 3.916s 2025-09-28 05:43:42.690 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 4.000s 2025-09-28 05:43:42.774 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.003s 2025-09-28 05:43:42.777 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node3 4.004s 2025-09-28 05:43:42.778 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.115s 2025-09-28 05:43:42.889 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 1148ms
node0 4.123s 2025-09-28 05:43:42.897 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node0 4.126s 2025-09-28 05:43:42.900 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node1 4.139s 2025-09-28 05:43:42.913 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node0 4.165s 2025-09-28 05:43:42.939 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listing on port: 9999
node1 4.217s 2025-09-28 05:43:42.991 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.219s 2025-09-28 05:43:42.993 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node1 4.220s 2025-09-28 05:43:42.994 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node0 4.236s 2025-09-28 05:43:43.010 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk, Adhoc keys will be generated, but this is incompatible with DAB.
node0 4.238s 2025-09-28 05:43:43.012 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node2 4.296s 2025-09-28 05:43:43.070 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node2 4.380s 2025-09-28 05:43:43.154 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 4.382s 2025-09-28 05:43:43.156 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node2 4.383s 2025-09-28 05:43:43.157 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 4.499s 2025-09-28 05:43:43.273 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.502s 2025-09-28 05:43:43.276 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 4.517s 2025-09-28 05:43:43.291 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node4 4.527s 2025-09-28 05:43:43.301 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 4.529s 2025-09-28 05:43:43.303 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.784s 2025-09-28 05:43:43.558 30 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.788s 2025-09-28 05:43:43.562 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node3 4.802s 2025-09-28 05:43:43.576 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node3 4.813s 2025-09-28 05:43:43.587 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node3 4.815s 2025-09-28 05:43:43.589 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.967s 2025-09-28 05:43:43.741 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.971s 2025-09-28 05:43:43.745 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node1 4.984s 2025-09-28 05:43:43.758 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node1 4.995s 2025-09-28 05:43:43.769 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node1 4.996s 2025-09-28 05:43:43.770 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.170s 2025-09-28 05:43:43.944 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.173s 2025-09-28 05:43:43.947 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node2 5.185s 2025-09-28 05:43:43.959 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node2 5.196s 2025-09-28 05:43:43.970 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node2 5.198s 2025-09-28 05:43:43.972 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5.636s 2025-09-28 05:43:44.410 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26329434] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=160010, randomLong=29526764570334079, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=40930, randomLong=3294675611984936925, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1103180, data=35, exception=null] OS Health Check Report - Complete (took 1021 ms)
node4 5.665s 2025-09-28 05:43:44.439 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5.673s 2025-09-28 05:43:44.447 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5.676s 2025-09-28 05:43:44.450 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5.756s 2025-09-28 05:43:44.530 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHGZ3Q==", "port": 30124 }, { "ipAddressV4": "CoAAbw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IojqDg==", "port": 30125 }, { "ipAddressV4": "CoAAbQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHLsaQ==", "port": 30126 }, { "ipAddressV4": "CoAAaw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IjrriA==", "port": 30127 }, { "ipAddressV4": "CoAAbg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjfxcQ==", "port": 30128 }, { "ipAddressV4": "CoAAbA==", "port": 30128 }] }] }
node4 5.778s 2025-09-28 05:43:44.552 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5.778s 2025-09-28 05:43:44.552 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node4 5.792s 2025-09-28 05:43:44.566 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 262591cdcf23638a153b446be345ccfee0466da3439b882cedd0167aac9cbffb99bdfc1ce727847f5fd354f4cde0de53 (root) ConsistencyTestingToolState / seven-shoe-brick-image 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend 2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
node3 5.936s 2025-09-28 05:43:44.710 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26244764] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=208219, randomLong=1524371798264941920, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11680, randomLong=-5239352002295327217, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1109647, data=35, exception=null] OS Health Check Report - Complete (took 1023 ms)
node3 5.970s 2025-09-28 05:43:44.744 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node3 5.978s 2025-09-28 05:43:44.752 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node3 5.981s 2025-09-28 05:43:44.755 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5.981s 2025-09-28 05:43:44.755 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node4 5.985s 2025-09-28 05:43:44.759 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node4 5.991s 2025-09-28 05:43:44.765 47 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5.991s 2025-09-28 05:43:44.765 48 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5.992s 2025-09-28 05:43:44.766 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5.995s 2025-09-28 05:43:44.769 50 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5.996s 2025-09-28 05:43:44.770 51 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5.997s 2025-09-28 05:43:44.771 52 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5.999s 2025-09-28 05:43:44.773 53 WARN STARTUP <<start-node-4>> PcesFileTracker: No preconsensus event files available
node4 5.999s 2025-09-28 05:43:44.773 54 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node4 6.001s 2025-09-28 05:43:44.775 55 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node4 6.002s 2025-09-28 05:43:44.776 56 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 6.003s 2025-09-28 05:43:44.777 57 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 153.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 6.008s 2025-09-28 05:43:44.782 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node3 6.064s 2025-09-28 05:43:44.838 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHGZ3Q==", "port": 30124 }, { "ipAddressV4": "CoAAbw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IojqDg==", "port": 30125 }, { "ipAddressV4": "CoAAbQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHLsaQ==", "port": 30126 }, { "ipAddressV4": "CoAAaw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IjrriA==", "port": 30127 }, { "ipAddressV4": "CoAAbg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjfxcQ==", "port": 30128 }, { "ipAddressV4": "CoAAbA==", "port": 30128 }] }] }
node3 6.085s 2025-09-28 05:43:44.859 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/3/ConsistencyTestLog.csv
node3 6.086s 2025-09-28 05:43:44.860 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node3 6.101s 2025-09-28 05:43:44.875 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 262591cdcf23638a153b446be345ccfee0466da3439b882cedd0167aac9cbffb99bdfc1ce727847f5fd354f4cde0de53 (root) ConsistencyTestingToolState / seven-shoe-brick-image 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend 2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
node1 6.103s 2025-09-28 05:43:44.877 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26224806] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=195070, randomLong=-2997842775004709940, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=63591, randomLong=3522080768922351111, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1779610, data=35, exception=null] OS Health Check Report - Complete (took 1023 ms)
node1 6.135s 2025-09-28 05:43:44.909 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node1 6.142s 2025-09-28 05:43:44.916 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node1 6.145s 2025-09-28 05:43:44.919 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node1 6.224s 2025-09-28 05:43:44.998 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHGZ3Q==", "port": 30124 }, { "ipAddressV4": "CoAAbw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IojqDg==", "port": 30125 }, { "ipAddressV4": "CoAAbQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHLsaQ==", "port": 30126 }, { "ipAddressV4": "CoAAaw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IjrriA==", "port": 30127 }, { "ipAddressV4": "CoAAbg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjfxcQ==", "port": 30128 }, { "ipAddressV4": "CoAAbA==", "port": 30128 }] }] }
node1 6.244s 2025-09-28 05:43:45.018 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/1/ConsistencyTestLog.csv
node1 6.244s 2025-09-28 05:43:45.018 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 6.258s 2025-09-28 05:43:45.032 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0 Timestamp: 1970-01-01T00:00:00Z Next consensus number: 0 Legacy running event hash: null Legacy running event mnemonic: null Rounds non-ancient: 0 Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=] Minimum judge hash code: 1 Root hash: 262591cdcf23638a153b446be345ccfee0466da3439b882cedd0167aac9cbffb99bdfc1ce727847f5fd354f4cde0de53 (root) ConsistencyTestingToolState / seven-shoe-brick-image 0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate 1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend 2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
node3 6.307s 2025-09-28 05:43:45.081 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node3 6.311s 2025-09-28 05:43:45.085 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.312s 2025-09-28 05:43:45.086 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26351815] PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=181600, randomLong=-4484013265585120488, exception=null] PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=11700, randomLong=5798546866842063596, exception=null] PASSED - File System Check Report[code=SUCCESS, readNanos=1206128, data=35, exception=null] OS Health Check Report - Complete (took 1027 ms)
node3 6.316s 2025-09-28 05:43:45.090 47 INFO STARTUP <<start-node-3>> ConsistencyTestingToolMain: init called in Main for node 3.
node3 6.317s 2025-09-28 05:43:45.091 48 INFO STARTUP <<start-node-3>> SwirldsPlatform: Starting platform 3
node3 6.318s 2025-09-28 05:43:45.092 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node3 6.321s 2025-09-28 05:43:45.095 50 INFO STARTUP <<start-node-3>> CycleFinder: No cyclical back pressure detected in wiring model.
node3 6.322s 2025-09-28 05:43:45.096 51 INFO STARTUP <<start-node-3>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node3 6.323s 2025-09-28 05:43:45.097 52 INFO STARTUP <<start-node-3>> InputWireChecks: All input wires have been bound.
node3 6.325s 2025-09-28 05:43:45.099 53 WARN STARTUP <<start-node-3>> PcesFileTracker: No preconsensus event files available
node3 6.325s 2025-09-28 05:43:45.099 54 INFO STARTUP <<start-node-3>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node3 6.327s 2025-09-28 05:43:45.101 55 INFO STARTUP <<start-node-3>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node3 6.328s 2025-09-28 05:43:45.102 56 INFO STARTUP <<app: appMain 3>> ConsistencyTestingToolMain: run called in Main.
node3 6.330s 2025-09-28 05:43:45.104 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 174.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 6.331s 2025-09-28 05:43:45.105 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node3 6.334s 2025-09-28 05:43:45.108 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 3.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.350s 2025-09-28 05:43:45.124 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node2 6.359s 2025-09-28 05:43:45.133 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node2 6.363s 2025-09-28 05:43:45.137 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 6.417s 2025-09-28 05:43:45.191 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 6.420s 2025-09-28 05:43:45.194 21 INFO STARTUP <main> StartupStateUtils: No saved states were found on disk.
node0 6.421s 2025-09-28 05:43:45.195 22 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node2 6.456s 2025-09-28 05:43:45.230 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHGZ3Q==", "port": 30124 }, { "ipAddressV4": "CoAAbw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IojqDg==", "port": 30125 }, { "ipAddressV4": "CoAAbQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHLsaQ==", "port": 30126 }, { "ipAddressV4": "CoAAaw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IjrriA==", "port": 30127 }, { "ipAddressV4": "CoAAbg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjfxcQ==", "port": 30128 }, { "ipAddressV4": "CoAAbA==", "port": 30128 }] }] }
node1 6.473s 2025-09-28 05:43:45.247 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node1 6.477s 2025-09-28 05:43:45.251 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node1 6.482s 2025-09-28 05:43:45.256 47 INFO STARTUP <<start-node-1>> ConsistencyTestingToolMain: init called in Main for node 1.
node2 6.482s 2025-09-28 05:43:45.256 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/2/ConsistencyTestLog.csv
node2 6.482s 2025-09-28 05:43:45.256 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node1 6.483s 2025-09-28 05:43:45.257 48 INFO STARTUP <<start-node-1>> SwirldsPlatform: Starting platform 1
node1 6.484s 2025-09-28 05:43:45.258 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node1 6.487s 2025-09-28 05:43:45.261 50 INFO STARTUP <<start-node-1>> CycleFinder: No cyclical back pressure detected in wiring model.
node1 6.488s 2025-09-28 05:43:45.262 51 INFO STARTUP <<start-node-1>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node1 6.489s 2025-09-28 05:43:45.263 52 INFO STARTUP <<start-node-1>> InputWireChecks: All input wires have been bound.
node1 6.490s 2025-09-28 05:43:45.264 53 WARN STARTUP <<start-node-1>> PcesFileTracker: No preconsensus event files available
node1 6.490s 2025-09-28 05:43:45.264 54 INFO STARTUP <<start-node-1>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node1 6.492s 2025-09-28 05:43:45.266 55 INFO STARTUP <<start-node-1>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node1 6.493s 2025-09-28 05:43:45.267 56 INFO STARTUP <<app: appMain 1>> ConsistencyTestingToolMain: run called in Main.
node1 6.495s 2025-09-28 05:43:45.269 57 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 184.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node1 6.501s 2025-09-28 05:43:45.275 58 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node2 6.501s 2025-09-28 05:43:45.275 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 262591cdcf23638a153b446be345ccfee0466da3439b882cedd0167aac9cbffb99bdfc1ce727847f5fd354f4cde0de53
(root) ConsistencyTestingToolState / seven-shoe-brick-image
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
node2 6.748s 2025-09-28 05:43:45.522 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node2 6.753s 2025-09-28 05:43:45.527 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node2 6.759s 2025-09-28 05:43:45.533 47 INFO STARTUP <<start-node-2>> ConsistencyTestingToolMain: init called in Main for node 2.
node2 6.760s 2025-09-28 05:43:45.534 48 INFO STARTUP <<start-node-2>> SwirldsPlatform: Starting platform 2
node2 6.761s 2025-09-28 05:43:45.535 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node2 6.764s 2025-09-28 05:43:45.538 50 INFO STARTUP <<start-node-2>> CycleFinder: No cyclical back pressure detected in wiring model.
node2 6.765s 2025-09-28 05:43:45.539 51 INFO STARTUP <<start-node-2>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node2 6.766s 2025-09-28 05:43:45.540 52 INFO STARTUP <<start-node-2>> InputWireChecks: All input wires have been bound.
node2 6.768s 2025-09-28 05:43:45.542 53 WARN STARTUP <<start-node-2>> PcesFileTracker: No preconsensus event files available
node2 6.768s 2025-09-28 05:43:45.542 54 INFO STARTUP <<start-node-2>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node2 6.769s 2025-09-28 05:43:45.543 55 INFO STARTUP <<start-node-2>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node2 6.771s 2025-09-28 05:43:45.545 56 INFO STARTUP <<app: appMain 2>> ConsistencyTestingToolMain: run called in Main.
node2 6.772s 2025-09-28 05:43:45.546 57 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 205.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node2 6.778s 2025-09-28 05:43:45.552 58 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 5.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node0 7.246s 2025-09-28 05:43:46.020 29 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.250s 2025-09-28 05:43:46.024 32 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node0 7.264s 2025-09-28 05:43:46.038 33 INFO STARTUP <main> AddressBookInitializer: Starting from genesis: using the config address book.
node0 7.276s 2025-09-28 05:43:46.050 34 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 7.278s 2025-09-28 05:43:46.052 35 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node0 8.413s 2025-09-28 05:43:47.187 36 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26221232]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=241629, randomLong=3934578102081275822, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=9580, randomLong=1481004566002360916, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1043178, data=35, exception=null]
OS Health Check Report - Complete (took 1024 ms)
node0 8.447s 2025-09-28 05:43:47.221 37 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node0 8.455s 2025-09-28 05:43:47.229 38 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node0 8.458s 2025-09-28 05:43:47.232 39 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node0 8.540s 2025-09-28 05:43:47.314 40 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHGZ3Q==", "port": 30124 }, { "ipAddressV4": "CoAAbw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IojqDg==", "port": 30125 }, { "ipAddressV4": "CoAAbQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHLsaQ==", "port": 30126 }, { "ipAddressV4": "CoAAaw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IjrriA==", "port": 30127 }, { "ipAddressV4": "CoAAbg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjfxcQ==", "port": 30128 }, { "ipAddressV4": "CoAAbA==", "port": 30128 }] }] }
node0 8.563s 2025-09-28 05:43:47.337 41 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/0/ConsistencyTestLog.csv
node0 8.563s 2025-09-28 05:43:47.337 42 INFO STARTUP <main> TransactionHandlingHistory: No log file found. Starting without any previous history
node0 8.580s 2025-09-28 05:43:47.354 43 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 0
Timestamp: 1970-01-01T00:00:00Z
Next consensus number: 0
Legacy running event hash: null
Legacy running event mnemonic: null
Rounds non-ancient: 0
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1
Root hash: 262591cdcf23638a153b446be345ccfee0466da3439b882cedd0167aac9cbffb99bdfc1ce727847f5fd354f4cde0de53
(root) ConsistencyTestingToolState / seven-shoe-brick-image
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 method-topple-elite-gate
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
node0 8.829s 2025-09-28 05:43:47.603 45 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b
node0 8.834s 2025-09-28 05:43:47.608 46 INFO STARTUP <platformForkJoinThread-2> Shadowgraph: Shadowgraph starting from expiration threshold 1
node0 8.838s 2025-09-28 05:43:47.612 47 INFO STARTUP <<start-node-0>> ConsistencyTestingToolMain: init called in Main for node 0.
node0 8.839s 2025-09-28 05:43:47.613 48 INFO STARTUP <<start-node-0>> SwirldsPlatform: Starting platform 0
node0 8.840s 2025-09-28 05:43:47.614 49 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node0 8.843s 2025-09-28 05:43:47.617 50 INFO STARTUP <<start-node-0>> CycleFinder: No cyclical back pressure detected in wiring model.
node0 8.844s 2025-09-28 05:43:47.618 51 INFO STARTUP <<start-node-0>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node0 8.845s 2025-09-28 05:43:47.619 52 INFO STARTUP <<start-node-0>> InputWireChecks: All input wires have been bound.
node0 8.847s 2025-09-28 05:43:47.621 53 WARN STARTUP <<start-node-0>> PcesFileTracker: No preconsensus event files available
node0 8.847s 2025-09-28 05:43:47.621 54 INFO STARTUP <<start-node-0>> SwirldsPlatform: replaying preconsensus event stream starting at 0
node0 8.850s 2025-09-28 05:43:47.624 55 INFO STARTUP <<start-node-0>> PcesReplayer: Replayed 0 preconsensus events with max birth round -1. These events contained 0 transactions. 0 rounds reached consensus spanning 0.0 nanoseconds of consensus time. The latest round to reach consensus is round 0. Replay took 0.0 nanoseconds.
node0 8.852s 2025-09-28 05:43:47.626 56 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 214.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node0 8.853s 2025-09-28 05:43:47.627 57 INFO STARTUP <<app: appMain 0>> ConsistencyTestingToolMain: run called in Main.
node0 8.857s 2025-09-28 05:43:47.631 58 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 4.0 ms in REPLAYING_EVENTS. Now in OBSERVING
node4 9.003s 2025-09-28 05:43:47.777 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 9.006s 2025-09-28 05:43:47.780 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node3 9.328s 2025-09-28 05:43:48.102 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting3.csv' ]
node3 9.330s 2025-09-28 05:43:48.104 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node1 9.495s 2025-09-28 05:43:48.269 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting1.csv' ]
node1 9.498s 2025-09-28 05:43:48.272 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node2 9.772s 2025-09-28 05:43:48.546 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting2.csv' ]
node2 9.775s 2025-09-28 05:43:48.549 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node0 11.852s 2025-09-28 05:43:50.626 59 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting0.csv' ]
node0 11.856s 2025-09-28 05:43:50.630 60 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 16.098s 2025-09-28 05:43:54.872 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 16.424s 2025-09-28 05:43:55.198 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node1 16.590s 2025-09-28 05:43:55.364 61 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node2 16.867s 2025-09-28 05:43:55.641 61 INFO PLATFORM_STATUS <platformForkJoinThread-1> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node3 17.891s 2025-09-28 05:43:56.665 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node2 17.915s 2025-09-28 05:43:56.689 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node4 17.945s 2025-09-28 05:43:56.719 62 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 1.8 s in CHECKING. Now in ACTIVE
node4 17.946s 2025-09-28 05:43:56.720 64 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node1 17.967s 2025-09-28 05:43:56.741 63 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.076s 2025-09-28 05:43:56.850 62 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 1 created, will eventually be written to disk, for reason: FIRST_ROUND_AFTER_GENESIS
node0 18.213s 2025-09-28 05:43:56.987 77 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node0 18.216s 2025-09-28 05:43:56.990 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 18.219s 2025-09-28 05:43:56.993 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node1 18.221s 2025-09-28 05:43:56.995 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 18.248s 2025-09-28 05:43:57.022 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node3 18.249s 2025-09-28 05:43:57.023 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 18.276s 2025-09-28 05:43:57.050 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node4 18.278s 2025-09-28 05:43:57.052 80 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 18.282s 2025-09-28 05:43:57.056 78 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 1 state to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
node2 18.284s 2025-09-28 05:43:57.058 79 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 18.472s 2025-09-28 05:43:57.246 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node1 18.475s 2025-09-28 05:43:57.249 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-09-28T05:43:55.373708641Z
Next consensus number: 1
Legacy running event hash: 0df34cbc76d3af5300bd8a0e5e03e7be52054b86ad623f22b07669b8d944614511858418ea2139973ed1f67219dda379
Legacy running event mnemonic: salmon-nurse-surprise-between
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 9e93b6ece0420ebd962bf7d168db625c716121a3471f70c27751bb3806ce88f2b8e0b51e44d27fb5a631167c33f8ee9a
(root) ConsistencyTestingToolState / color-card-water-stumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 antique-adult-giant-matrix
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 18.483s 2025-09-28 05:43:57.257 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node3 18.487s 2025-09-28 05:43:57.261 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-09-28T05:43:55.373708641Z
Next consensus number: 1
Legacy running event hash: 0df34cbc76d3af5300bd8a0e5e03e7be52054b86ad623f22b07669b8d944614511858418ea2139973ed1f67219dda379
Legacy running event mnemonic: salmon-nurse-surprise-between
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 9e93b6ece0420ebd962bf7d168db625c716121a3471f70c27751bb3806ce88f2b8e0b51e44d27fb5a631167c33f8ee9a
(root) ConsistencyTestingToolState / color-card-water-stumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 antique-adult-giant-matrix
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node0 18.488s 2025-09-28 05:43:57.262 108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node0 18.492s 2025-09-28 05:43:57.266 109 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-09-28T05:43:55.373708641Z
Next consensus number: 1
Legacy running event hash: 0df34cbc76d3af5300bd8a0e5e03e7be52054b86ad623f22b07669b8d944614511858418ea2139973ed1f67219dda379
Legacy running event mnemonic: salmon-nurse-surprise-between
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 9e93b6ece0420ebd962bf7d168db625c716121a3471f70c27751bb3806ce88f2b8e0b51e44d27fb5a631167c33f8ee9a
(root) ConsistencyTestingToolState / color-card-water-stumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 antique-adult-giant-matrix
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node2 18.507s 2025-09-28 05:43:57.281 107 INFO PLATFORM_STATUS <platformForkJoinThread-2> StatusStateMachine: Platform spent 1.6 s in CHECKING. Now in ACTIVE
node1 18.509s 2025-09-28 05:43:57.283 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 18.509s 2025-09-28 05:43:57.283 113 INFO PLATFORM_STATUS <platformForkJoinThread-8> StatusStateMachine: Platform spent 1.9 s in CHECKING. Now in ACTIVE
node1 18.509s 2025-09-28 05:43:57.283 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 18.510s 2025-09-28 05:43:57.284 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 18.511s 2025-09-28 05:43:57.285 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 18.517s 2025-09-28 05:43:57.291 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 18.518s 2025-09-28 05:43:57.292 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 18.518s 2025-09-28 05:43:57.292 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 18.519s 2025-09-28 05:43:57.293 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 18.520s 2025-09-28 05:43:57.294 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 18.521s 2025-09-28 05:43:57.295 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node4 18.524s 2025-09-28 05:43:57.298 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-09-28T05:43:55.373708641Z
Next consensus number: 1
Legacy running event hash: 0df34cbc76d3af5300bd8a0e5e03e7be52054b86ad623f22b07669b8d944614511858418ea2139973ed1f67219dda379
Legacy running event mnemonic: salmon-nurse-surprise-between
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 9e93b6ece0420ebd962bf7d168db625c716121a3471f70c27751bb3806ce88f2b8e0b51e44d27fb5a631167c33f8ee9a
(root) ConsistencyTestingToolState / color-card-water-stumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 antique-adult-giant-matrix
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node3 18.525s 2025-09-28 05:43:57.299 115 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 18.533s 2025-09-28 05:43:57.307 110 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 18.534s 2025-09-28 05:43:57.308 111 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 18.534s 2025-09-28 05:43:57.308 112 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 18.535s 2025-09-28 05:43:57.309 113 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 18.541s 2025-09-28 05:43:57.315 114 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 18.545s 2025-09-28 05:43:57.319 117 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/1 for round 1
node2 18.548s 2025-09-28 05:43:57.322 118 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 1
Timestamp: 2025-09-28T05:43:55.373708641Z
Next consensus number: 1
Legacy running event hash: 0df34cbc76d3af5300bd8a0e5e03e7be52054b86ad623f22b07669b8d944614511858418ea2139973ed1f67219dda379
Legacy running event mnemonic: salmon-nurse-surprise-between
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1450302654
Root hash: 9e93b6ece0420ebd962bf7d168db625c716121a3471f70c27751bb3806ce88f2b8e0b51e44d27fb5a631167c33f8ee9a
(root) ConsistencyTestingToolState / color-card-water-stumble
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 antique-adult-giant-matrix
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 1931016930446315563 /3 almost-atom-novel-view
    4 StringLeaf 1 /4 wreck-whale-old-bottom
node4 18.559s 2025-09-28 05:43:57.333 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 18.559s 2025-09-28 05:43:57.333 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 18.559s 2025-09-28 05:43:57.333 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 18.560s 2025-09-28 05:43:57.334 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 18.566s 2025-09-28 05:43:57.340 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 18.583s 2025-09-28 05:43:57.357 117 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 2.2 s in CHECKING. Now in ACTIVE
node2 18.585s 2025-09-28 05:43:57.359 119 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 18.585s 2025-09-28 05:43:57.359 120 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 18.586s 2025-09-28 05:43:57.360 121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 18.587s 2025-09-28 05:43:57.361 122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 18.593s 2025-09-28 05:43:57.367 123 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 1 to disk. Reason: FIRST_ROUND_AFTER_GENESIS, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1 {"round":1,"freezeState":false,"reason":"FIRST_ROUND_AFTER_GENESIS","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 18.946s 2025-09-28 05:43:57.720 117 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 10.1 s in OBSERVING. Now in CHECKING
node0 21.103s 2025-09-28 05:43:59.877 156 INFO PLATFORM_STATUS <platformForkJoinThread-5> StatusStateMachine: Platform spent 2.2 s in CHECKING. Now in ACTIVE
node0 22.527s 2025-09-28 05:44:01.301 200 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 10 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 22.617s 2025-09-28 05:44:01.391 198 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 10 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 22.618s 2025-09-28 05:44:01.392 198 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 10 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 22.618s 2025-09-28 05:44:01.392 198 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 10 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 22.643s 2025-09-28 05:44:01.417 198 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 10 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 22.760s 2025-09-28 05:44:01.534 200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 10 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/10
node3 22.761s 2025-09-28 05:44:01.535 201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node2 22.764s 2025-09-28 05:44:01.538 200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 10 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/10
node2 22.765s 2025-09-28 05:44:01.539 201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node4 22.787s 2025-09-28 05:44:01.561 200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 10 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/10
node4 22.787s 2025-09-28 05:44:01.561 201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node0 22.831s 2025-09-28 05:44:01.605 202 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 10 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/10
node0 22.832s 2025-09-28 05:44:01.606 203 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node1 22.834s 2025-09-28 05:44:01.608 200 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 10 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/10
node1 22.835s 2025-09-28 05:44:01.609 201 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node3 22.846s 2025-09-28 05:44:01.620 232 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node2 22.847s 2025-09-28 05:44:01.621 238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node3 22.848s 2025-09-28 05:44:01.622 233 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 10
Timestamp: 2025-09-28T05:44:00.307612Z
Next consensus number: 291
Legacy running event hash: d3304aaf1f7cb5aa7b6f84e2a3954cac392e23bbc599bf787a29e7d522897a2222c38009748afd558db00ad3d0cd2ea8
Legacy running event mnemonic: old-field-usage-enlist
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1638304851
Root hash: ae447f240b933f8bfcf68087c07b47c5dc6ca6db9b0910d939365e25b347c08a257801b737792087c2fa298ca750e8ad
(root) ConsistencyTestingToolState / daring-tonight-include-vapor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ceiling-tree-author-eight
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -341525971029347665 /3 health-then-robot-average
    4 StringLeaf 10 /4 announce-help-hire-tip
node2 22.850s 2025-09-28 05:44:01.624 239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 10
Timestamp: 2025-09-28T05:44:00.307612Z
Next consensus number: 291
Legacy running event hash: d3304aaf1f7cb5aa7b6f84e2a3954cac392e23bbc599bf787a29e7d522897a2222c38009748afd558db00ad3d0cd2ea8
Legacy running event mnemonic: old-field-usage-enlist
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1638304851
Root hash: ae447f240b933f8bfcf68087c07b47c5dc6ca6db9b0910d939365e25b347c08a257801b737792087c2fa298ca750e8ad
(root) ConsistencyTestingToolState / daring-tonight-include-vapor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ceiling-tree-author-eight
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -341525971029347665 /3 health-then-robot-average
    4 StringLeaf 10 /4 announce-help-hire-tip
node2 22.857s 2025-09-28 05:44:01.631 240 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node3 22.857s 2025-09-28 05:44:01.631 234 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node2 22.858s 2025-09-28 05:44:01.632 241 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 22.858s 2025-09-28 05:44:01.632 242 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 22.858s 2025-09-28 05:44:01.632 235 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 22.858s 2025-09-28 05:44:01.632 236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 22.859s 2025-09-28 05:44:01.633 243 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 22.859s 2025-09-28 05:44:01.633 244 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 10 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/10 {"round":10,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/10/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 22.859s 2025-09-28 05:44:01.633 237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 22.859s 2025-09-28 05:44:01.633 238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 10 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/10 {"round":10,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/10/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 22.867s 2025-09-28 05:44:01.641 236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node4 22.869s 2025-09-28 05:44:01.643 237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 10
Timestamp: 2025-09-28T05:44:00.307612Z
Next consensus number: 291
Legacy running event hash: d3304aaf1f7cb5aa7b6f84e2a3954cac392e23bbc599bf787a29e7d522897a2222c38009748afd558db00ad3d0cd2ea8
Legacy running event mnemonic: old-field-usage-enlist
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1638304851
Root hash: ae447f240b933f8bfcf68087c07b47c5dc6ca6db9b0910d939365e25b347c08a257801b737792087c2fa298ca750e8ad
(root) ConsistencyTestingToolState / daring-tonight-include-vapor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ceiling-tree-author-eight
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -341525971029347665 /3 health-then-robot-average
    4 StringLeaf 10 /4 announce-help-hire-tip
node4 22.876s 2025-09-28 05:44:01.650 238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 22.876s 2025-09-28 05:44:01.650 239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 22.877s 2025-09-28 05:44:01.651 240 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 22.877s 2025-09-28 05:44:01.651 241 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 22.878s 2025-09-28 05:44:01.652 242 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 10 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/10 {"round":10,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/10/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 22.917s 2025-09-28 05:44:01.691 236 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node1 22.919s 2025-09-28 05:44:01.693 237 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 10
Timestamp: 2025-09-28T05:44:00.307612Z
Next consensus number: 291
Legacy running event hash: d3304aaf1f7cb5aa7b6f84e2a3954cac392e23bbc599bf787a29e7d522897a2222c38009748afd558db00ad3d0cd2ea8
Legacy running event mnemonic: old-field-usage-enlist
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1638304851
Root hash: ae447f240b933f8bfcf68087c07b47c5dc6ca6db9b0910d939365e25b347c08a257801b737792087c2fa298ca750e8ad
(root) ConsistencyTestingToolState / daring-tonight-include-vapor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ceiling-tree-author-eight
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -341525971029347665 /3 health-then-robot-average
    4 StringLeaf 10 /4 announce-help-hire-tip
node0 22.925s 2025-09-28 05:44:01.699 238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/7 for round 10
node1 22.927s 2025-09-28 05:44:01.701 238 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 22.927s 2025-09-28 05:44:01.701 239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node0 22.928s 2025-09-28 05:44:01.702 239 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 10
Timestamp: 2025-09-28T05:44:00.307612Z
Next consensus number: 291
Legacy running event hash: d3304aaf1f7cb5aa7b6f84e2a3954cac392e23bbc599bf787a29e7d522897a2222c38009748afd558db00ad3d0cd2ea8
Legacy running event mnemonic: old-field-usage-enlist
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1638304851
Root hash: ae447f240b933f8bfcf68087c07b47c5dc6ca6db9b0910d939365e25b347c08a257801b737792087c2fa298ca750e8ad
(root) ConsistencyTestingToolState / daring-tonight-include-vapor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 ceiling-tree-author-eight
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -341525971029347665 /3 health-then-robot-average
    4 StringLeaf 10 /4 announce-help-hire-tip
node1 22.928s 2025-09-28 05:44:01.702 240 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 22.928s 2025-09-28 05:44:01.702 241 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 22.929s 2025-09-28 05:44:01.703 242 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 10 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/10 {"round":10,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/10/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 22.938s 2025-09-28 05:44:01.712 240 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 22.939s 2025-09-28 05:44:01.713 241 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 1 File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 22.939s 2025-09-28 05:44:01.713 242 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 22.940s 2025-09-28 05:44:01.714 243 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 22.940s 2025-09-28 05:44:01.714 244 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 10 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/10 {"round":10,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/10/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
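The round-10 entries above each end with a machine-readable StateSavedToDiskPayload JSON object. As a minimal sketch of how that can be used (assuming the line layout shown here; the log file name is hypothetical), the following Python pulls the payload out of every "Finished writing state" line and tallies which nodes completed the snapshot for each round:

import json
import re
from collections import defaultdict

# Matches the node id at the start of a line and the StateSavedToDiskPayload JSON
# object that appears just before the payload class name at the end of the line.
FINISHED_RE = re.compile(
    r"^(node\d+)\b.*?(\{.*\})\s*\[com\.swirlds\.logging\.legacy\.payload\.StateSavedToDiskPayload\]"
)

def saved_rounds(lines):
    """Return {round: set of node ids that reported the state saved to disk}."""
    by_round = defaultdict(set)
    for line in lines:
        m = FINISHED_RE.search(line)
        if m:
            node, payload = m.group(1), json.loads(m.group(2))
            by_round[payload["round"]].add(node)
    return by_round

if __name__ == "__main__":
    with open("swirlds.log") as f:  # hypothetical file name
        for rnd, nodes in sorted(saved_rounds(f).items()):
            print(f"round {rnd}: saved by {sorted(nodes)}")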
node1 1m 22.247s 2025-09-28 05:45:01.021 1551 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 134 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 1m 22.256s 2025-09-28 05:45:01.030 1562 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 134 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 1m 22.266s 2025-09-28 05:45:01.040 1587 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 134 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 1m 22.282s 2025-09-28 05:45:01.056 1597 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 134 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 1m 22.341s 2025-09-28 05:45:01.115 1579 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 134 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 1m 22.466s 2025-09-28 05:45:01.240 1554 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 134 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/134
node1 1m 22.467s 2025-09-28 05:45:01.241 1555 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node0 1m 22.470s 2025-09-28 05:45:01.244 1582 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 134 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/134
node0 1m 22.471s 2025-09-28 05:45:01.245 1583 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node2 1m 22.536s 2025-09-28 05:45:01.310 1590 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 134 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/134
node2 1m 22.538s 2025-09-28 05:45:01.312 1591 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node3 1m 22.540s 2025-09-28 05:45:01.314 1600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 134 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/134
node3 1m 22.541s 2025-09-28 05:45:01.315 1601 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node1 1m 22.553s 2025-09-28 05:45:01.327 1596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node1 1m 22.555s 2025-09-28 05:45:01.329 1597 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 134
Timestamp: 2025-09-28T05:45:00.090902Z
Next consensus number: 5078
Legacy running event hash: 0d45a9b9ca3e5b26c6905a6e6dc8c9b56b2c16d5e3412acc6451b0bf59e9a00aa51f2f74ee7af998b444410120c726af
Legacy running event mnemonic: post-similar-reunion-gospel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -957874872
Root hash: 66caffe506216f9187136fd10cdf76c22debd45b43850a40a419da3d27ed65cc53a474d82865b08a5ae7c7925957f91e
(root) ConsistencyTestingToolState / sun-address-service-vendor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 legend-surprise-segment-grant
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 2979579708113036722 /3 coast-cabbage-town-tragic
    4 StringLeaf 134 /4 sheriff-coach-certain-wrestle
node4 1m 22.556s 2025-09-28 05:45:01.330 1566 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 134 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/134
node4 1m 22.557s 2025-09-28 05:45:01.331 1567 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node0 1m 22.561s 2025-09-28 05:45:01.335 1616 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node0 1m 22.563s 2025-09-28 05:45:01.337 1617 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 134
Timestamp: 2025-09-28T05:45:00.090902Z
Next consensus number: 5078
Legacy running event hash: 0d45a9b9ca3e5b26c6905a6e6dc8c9b56b2c16d5e3412acc6451b0bf59e9a00aa51f2f74ee7af998b444410120c726af
Legacy running event mnemonic: post-similar-reunion-gospel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -957874872
Root hash: 66caffe506216f9187136fd10cdf76c22debd45b43850a40a419da3d27ed65cc53a474d82865b08a5ae7c7925957f91e
(root) ConsistencyTestingToolState / sun-address-service-vendor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 legend-surprise-segment-grant
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 2979579708113036722 /3 coast-cabbage-town-tragic
    4 StringLeaf 134 /4 sheriff-coach-certain-wrestle
node1 1m 22.567s 2025-09-28 05:45:01.341 1598 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 22.567s 2025-09-28 05:45:01.341 1599 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 107 File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 22.567s 2025-09-28 05:45:01.341 1600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 1m 22.571s 2025-09-28 05:45:01.345 1601 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 22.572s 2025-09-28 05:45:01.346 1618 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node1 1m 22.572s 2025-09-28 05:45:01.346 1602 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 134 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/134 {"round":134,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/134/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 1m 22.573s 2025-09-28 05:45:01.347 1619 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 107 File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 1m 22.573s 2025-09-28 05:45:01.347 1620 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 1m 22.577s 2025-09-28 05:45:01.351 1621 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 1m 22.577s 2025-09-28 05:45:01.351 1622 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 134 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/134 {"round":134,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/134/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 1m 22.620s 2025-09-28 05:45:01.394 1632 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node2 1m 22.623s 2025-09-28 05:45:01.397 1633 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 134
Timestamp: 2025-09-28T05:45:00.090902Z
Next consensus number: 5078
Legacy running event hash: 0d45a9b9ca3e5b26c6905a6e6dc8c9b56b2c16d5e3412acc6451b0bf59e9a00aa51f2f74ee7af998b444410120c726af
Legacy running event mnemonic: post-similar-reunion-gospel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -957874872
Root hash: 66caffe506216f9187136fd10cdf76c22debd45b43850a40a419da3d27ed65cc53a474d82865b08a5ae7c7925957f91e
(root) ConsistencyTestingToolState / sun-address-service-vendor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 legend-surprise-segment-grant
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 2979579708113036722 /3 coast-cabbage-town-tragic
    4 StringLeaf 134 /4 sheriff-coach-certain-wrestle
node3 1m 22.623s 2025-09-28 05:45:01.397 1634 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node3 1m 22.625s 2025-09-28 05:45:01.399 1635 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 134
Timestamp: 2025-09-28T05:45:00.090902Z
Next consensus number: 5078
Legacy running event hash: 0d45a9b9ca3e5b26c6905a6e6dc8c9b56b2c16d5e3412acc6451b0bf59e9a00aa51f2f74ee7af998b444410120c726af
Legacy running event mnemonic: post-similar-reunion-gospel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -957874872
Root hash: 66caffe506216f9187136fd10cdf76c22debd45b43850a40a419da3d27ed65cc53a474d82865b08a5ae7c7925957f91e
(root) ConsistencyTestingToolState / sun-address-service-vendor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 legend-surprise-segment-grant
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 2979579708113036722 /3 coast-cabbage-town-tragic
    4 StringLeaf 134 /4 sheriff-coach-certain-wrestle
node2 1m 22.633s 2025-09-28 05:45:01.407 1634 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 22.633s 2025-09-28 05:45:01.407 1635 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 107 File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 1m 22.633s 2025-09-28 05:45:01.407 1636 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 1m 22.635s 2025-09-28 05:45:01.409 1636 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 22.635s 2025-09-28 05:45:01.409 1637 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 107 File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 1m 22.635s 2025-09-28 05:45:01.409 1638 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 22.635s 2025-09-28 05:45:01.409 1608 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/13 for round 134
node2 1m 22.637s 2025-09-28 05:45:01.411 1637 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 22.637s 2025-09-28 05:45:01.411 1609 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 134
Timestamp: 2025-09-28T05:45:00.090902Z
Next consensus number: 5078
Legacy running event hash: 0d45a9b9ca3e5b26c6905a6e6dc8c9b56b2c16d5e3412acc6451b0bf59e9a00aa51f2f74ee7af998b444410120c726af
Legacy running event mnemonic: post-similar-reunion-gospel
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -957874872
Root hash: 66caffe506216f9187136fd10cdf76c22debd45b43850a40a419da3d27ed65cc53a474d82865b08a5ae7c7925957f91e
(root) ConsistencyTestingToolState / sun-address-service-vendor
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 legend-surprise-segment-grant
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 2979579708113036722 /3 coast-cabbage-town-tragic
    4 StringLeaf 134 /4 sheriff-coach-certain-wrestle
node2 1m 22.638s 2025-09-28 05:45:01.412 1638 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 134 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/134 {"round":134,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/134/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 1m 22.639s 2025-09-28 05:45:01.413 1639 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 1m 22.639s 2025-09-28 05:45:01.413 1640 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 134 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/134 {"round":134,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/134/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 1m 22.645s 2025-09-28 05:45:01.419 1610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 22.645s 2025-09-28 05:45:01.419 1611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 107 File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 1m 22.645s 2025-09-28 05:45:01.419 1612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 1m 22.649s 2025-09-28 05:45:01.423 1613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 1m 22.649s 2025-09-28 05:45:01.423 1614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 134 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/134 {"round":134,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/134/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
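For round 134, every node reports the same "Information for state written to disk" block, including an identical root hash, which is the agreement the consistency test is exercising. A hedged sketch of one way to check that from the log, assuming the block layout shown above (a node-prefixed header line followed by un-prefixed "Round:" and "Root hash:" continuation lines):

import re
from collections import defaultdict

NODE_RE = re.compile(r"^(node\d+)\b")
ROUND_RE = re.compile(r"^Round:\s*(\d+)\b")
ROOT_HASH_RE = re.compile(r"^Root hash:\s*([0-9a-f]+)")

def root_hashes_by_round(lines):
    """Collect {round: {node: root hash}} from the state-info blocks in the log."""
    out = defaultdict(dict)
    node, rnd = None, None
    for raw in lines:
        line = raw.strip()
        if m := NODE_RE.match(line):
            node = m.group(1)           # node-prefixed header line
        elif m := ROUND_RE.match(line):
            rnd = int(m.group(1))       # continuation line inside the block
        elif (m := ROOT_HASH_RE.match(line)) and node and rnd is not None:
            out[rnd][node] = m.group(1)
    return out

def report(by_round):
    """Print OK when all nodes report the same root hash for a round, MISMATCH otherwise."""
    for rnd, hashes in sorted(by_round.items()):
        status = "OK" if len(set(hashes.values())) == 1 else "MISMATCH"
        print(f"round {rnd}: {status} across {sorted(hashes)}")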
node0 2m 22.453s 2025-09-28 05:46:01.227 3104 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 269 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 2m 22.520s 2025-09-28 05:46:01.294 3108 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 269 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 2m 22.574s 2025-09-28 05:46:01.348 3108 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 269 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 22.592s 2025-09-28 05:46:01.366 3138 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 269 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 2m 22.684s 2025-09-28 05:46:01.458 3141 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 269 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/269
node2 2m 22.685s 2025-09-28 05:46:01.459 3142 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node1 2m 22.693s 2025-09-28 05:46:01.467 3064 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 269 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 2m 22.745s 2025-09-28 05:46:01.519 3067 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 269 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/269
node1 2m 22.746s 2025-09-28 05:46:01.520 3068 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node2 2m 22.770s 2025-09-28 05:46:01.544 3177 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node2 2m 22.772s 2025-09-28 05:46:01.546 3178 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 269
Timestamp: 2025-09-28T05:46:00.409185161Z
Next consensus number: 9867
Legacy running event hash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
Legacy running event mnemonic: hour-eight-simple-steak
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1528825575
Root hash: bebc818347d1c8f4ab66bec91829943db97b9211a14d7a5f23e07487d6aaac39c2f42277c1a7d6a8098b1ecb67fb372c
(root) ConsistencyTestingToolState / chuckle-soul-when-thunder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 piano-bread-culture-warm
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -7896747143900124556 /3 trick-possible-breeze-canal
    4 StringLeaf 269 /4 convince-example-flush-bicycle
node2 2m 22.779s 2025-09-28 05:46:01.553 3179 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 22.779s 2025-09-28 05:46:01.553 3180 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 242 File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 2m 22.780s 2025-09-28 05:46:01.554 3181 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 2m 22.787s 2025-09-28 05:46:01.561 3182 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 2m 22.787s 2025-09-28 05:46:01.561 3183 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 269 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/269 {"round":269,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/269/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 2m 22.808s 2025-09-28 05:46:01.582 3107 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 269 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/269
node0 2m 22.810s 2025-09-28 05:46:01.584 3108 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node1 2m 22.828s 2025-09-28 05:46:01.602 3099 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node1 2m 22.830s 2025-09-28 05:46:01.604 3100 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 269
Timestamp: 2025-09-28T05:46:00.409185161Z
Next consensus number: 9867
Legacy running event hash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
Legacy running event mnemonic: hour-eight-simple-steak
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1528825575
Root hash: bebc818347d1c8f4ab66bec91829943db97b9211a14d7a5f23e07487d6aaac39c2f42277c1a7d6a8098b1ecb67fb372c
(root) ConsistencyTestingToolState / chuckle-soul-when-thunder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 piano-bread-culture-warm
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -7896747143900124556 /3 trick-possible-breeze-canal
    4 StringLeaf 269 /4 convince-example-flush-bicycle
node1 2m 22.838s 2025-09-28 05:46:01.612 3101 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 22.838s 2025-09-28 05:46:01.612 3102 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 242 File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 2m 22.838s 2025-09-28 05:46:01.612 3103 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 2m 22.845s 2025-09-28 05:46:01.619 3104 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 2m 22.845s 2025-09-28 05:46:01.619 3105 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 269 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/269 {"round":269,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/269/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 22.896s 2025-09-28 05:46:01.670 3121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 269 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/269
node4 2m 22.897s 2025-09-28 05:46:01.671 3122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node3 2m 22.901s 2025-09-28 05:46:01.675 3121 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 269 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/269
node3 2m 22.902s 2025-09-28 05:46:01.676 3122 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node0 2m 22.905s 2025-09-28 05:46:01.679 3143 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node0 2m 22.908s 2025-09-28 05:46:01.682 3144 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 269
Timestamp: 2025-09-28T05:46:00.409185161Z
Next consensus number: 9867
Legacy running event hash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
Legacy running event mnemonic: hour-eight-simple-steak
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1528825575
Root hash: bebc818347d1c8f4ab66bec91829943db97b9211a14d7a5f23e07487d6aaac39c2f42277c1a7d6a8098b1ecb67fb372c
(root) ConsistencyTestingToolState / chuckle-soul-when-thunder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 piano-bread-culture-warm
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -7896747143900124556 /3 trick-possible-breeze-canal
    4 StringLeaf 269 /4 convince-example-flush-bicycle
node0 2m 22.917s 2025-09-28 05:46:01.691 3145 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 22.918s 2025-09-28 05:46:01.692 3146 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 242 File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 2m 22.918s 2025-09-28 05:46:01.692 3147 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 2m 22.926s 2025-09-28 05:46:01.700 3148 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 2m 22.926s 2025-09-28 05:46:01.700 3149 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 269 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/269 {"round":269,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/269/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 2m 22.985s 2025-09-28 05:46:01.759 3165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node4 2m 22.986s 2025-09-28 05:46:01.760 3153 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/19 for round 269
node3 2m 22.987s 2025-09-28 05:46:01.761 3166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 269
Timestamp: 2025-09-28T05:46:00.409185161Z
Next consensus number: 9867
Legacy running event hash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
Legacy running event mnemonic: hour-eight-simple-steak
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1528825575
Root hash: bebc818347d1c8f4ab66bec91829943db97b9211a14d7a5f23e07487d6aaac39c2f42277c1a7d6a8098b1ecb67fb372c
(root) ConsistencyTestingToolState / chuckle-soul-when-thunder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 piano-bread-culture-warm
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -7896747143900124556 /3 trick-possible-breeze-canal
    4 StringLeaf 269 /4 convince-example-flush-bicycle
node4 2m 22.988s 2025-09-28 05:46:01.762 3154 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 269
Timestamp: 2025-09-28T05:46:00.409185161Z
Next consensus number: 9867
Legacy running event hash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
Legacy running event mnemonic: hour-eight-simple-steak
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1528825575
Root hash: bebc818347d1c8f4ab66bec91829943db97b9211a14d7a5f23e07487d6aaac39c2f42277c1a7d6a8098b1ecb67fb372c
(root) ConsistencyTestingToolState / chuckle-soul-when-thunder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 piano-bread-culture-warm
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -7896747143900124556 /3 trick-possible-breeze-canal
    4 StringLeaf 269 /4 convince-example-flush-bicycle
node3 2m 22.995s 2025-09-28 05:46:01.769 3167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 22.996s 2025-09-28 05:46:01.770 3168 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 242 File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 2m 22.996s 2025-09-28 05:46:01.770 3169 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 2m 22.997s 2025-09-28 05:46:01.771 3155 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 22.997s 2025-09-28 05:46:01.771 3157 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 242 File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces
node4 2m 22.998s 2025-09-28 05:46:01.772 3165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 2m 23.003s 2025-09-28 05:46:01.777 3170 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 2m 23.003s 2025-09-28 05:46:01.777 3171 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 269 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/269 {"round":269,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/269/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 2m 23.004s 2025-09-28 05:46:01.778 3166 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 2m 23.005s 2025-09-28 05:46:01.779 3167 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 269 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/269 {"round":269,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/269/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
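The PERIODIC_SNAPSHOT entries above land roughly one minute apart on each node (rounds 10, 134 and 269 start writing at about 05:44:01, 05:45:01 and 05:46:01). A rough sketch that measures that cadence from the timestamps, assuming the date-time column keeps the format shown in these lines:

import re
from datetime import datetime

# Node id at the start of the line, the wall-clock timestamp, and the round number
# from a "Started writing round N state to disk" entry.
STARTED_RE = re.compile(
    r"^(node\d+)\b.*?(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}).*?Started writing round (\d+) state to disk"
)

def snapshot_intervals(lines, node="node0"):
    """Yield (round, seconds since this node started its previous snapshot)."""
    prev = None
    for line in lines:
        m = STARTED_RE.match(line)
        if not m or m.group(1) != node:
            continue
        ts = datetime.strptime(m.group(2), "%Y-%m-%d %H:%M:%S.%f")
        if prev is not None:
            yield int(m.group(3)), (ts - prev).total_seconds()
        prev = ts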
node3 3m 22.070s 2025-09-28 05:47:00.844 4629 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 402 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 3m 22.105s 2025-09-28 05:47:00.879 4575 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 402 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 3m 22.123s 2025-09-28 05:47:00.897 4631 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 402 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 3m 22.207s 2025-09-28 05:47:00.981 4623 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 402 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 3m 22.327s 2025-09-28 05:47:01.101 4632 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 402 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/402
node3 3m 22.328s 2025-09-28 05:47:01.102 4633 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node1 3m 22.381s 2025-09-28 05:47:01.155 4578 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 402 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/402
node1 3m 22.382s 2025-09-28 05:47:01.156 4579 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node2 3m 22.385s 2025-09-28 05:47:01.159 4626 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 402 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/402
node2 3m 22.386s 2025-09-28 05:47:01.160 4627 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node3 3m 22.414s 2025-09-28 05:47:01.188 4664 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node3 3m 22.416s 2025-09-28 05:47:01.190 4665 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 402
Timestamp: 2025-09-28T05:47:00.019835Z
Next consensus number: 14433
Legacy running event hash: e5992a759578807dc3caf15c219970e6d83b526c6aa8a89950150000c258e2d4ae88c570d6652346da7474b62b904246
Legacy running event mnemonic: erupt-exchange-extra-vacant
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 201106013
Root hash: b83d968f188251f8c9af266ba84ac2a0268b8ded816e73e4633b0c173aa1a19bb3b3eb0ebff3321fca01f3a18a8bcfbd
(root) ConsistencyTestingToolState / wear-comfort-aware-disease
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 item-keen-enter-convince
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 8940386125639942911 /3 lunch-convince-board-uniform
    4 StringLeaf 402 /4 hole-exercise-mandate-chapter
node3 3m 22.423s 2025-09-28 05:47:01.197 4666 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 22.424s 2025-09-28 05:47:01.198 4667 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 375 File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 3m 22.424s 2025-09-28 05:47:01.198 4668 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 3m 22.433s 2025-09-28 05:47:01.207 4669 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 3m 22.434s 2025-09-28 05:47:01.208 4670 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 402 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/402 {"round":402,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/402/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 22.452s 2025-09-28 05:47:01.226 4634 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 402 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/402
node0 3m 22.453s 2025-09-28 05:47:01.227 4635 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node2 3m 22.472s 2025-09-28 05:47:01.246 4658 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node1 3m 22.474s 2025-09-28 05:47:01.248 4610 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node2 3m 22.474s 2025-09-28 05:47:01.248 4659 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 402
Timestamp: 2025-09-28T05:47:00.019835Z
Next consensus number: 14433
Legacy running event hash: e5992a759578807dc3caf15c219970e6d83b526c6aa8a89950150000c258e2d4ae88c570d6652346da7474b62b904246
Legacy running event mnemonic: erupt-exchange-extra-vacant
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 201106013
Root hash: b83d968f188251f8c9af266ba84ac2a0268b8ded816e73e4633b0c173aa1a19bb3b3eb0ebff3321fca01f3a18a8bcfbd
(root) ConsistencyTestingToolState / wear-comfort-aware-disease
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 item-keen-enter-convince
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 8940386125639942911 /3 lunch-convince-board-uniform
    4 StringLeaf 402 /4 hole-exercise-mandate-chapter
node1 3m 22.476s 2025-09-28 05:47:01.250 4611 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 402
Timestamp: 2025-09-28T05:47:00.019835Z
Next consensus number: 14433
Legacy running event hash: e5992a759578807dc3caf15c219970e6d83b526c6aa8a89950150000c258e2d4ae88c570d6652346da7474b62b904246
Legacy running event mnemonic: erupt-exchange-extra-vacant
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 201106013
Root hash: b83d968f188251f8c9af266ba84ac2a0268b8ded816e73e4633b0c173aa1a19bb3b3eb0ebff3321fca01f3a18a8bcfbd
(root) ConsistencyTestingToolState / wear-comfort-aware-disease
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 item-keen-enter-convince
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 8940386125639942911 /3 lunch-convince-board-uniform
    4 StringLeaf 402 /4 hole-exercise-mandate-chapter
node2 3m 22.482s 2025-09-28 05:47:01.256 4660 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 22.482s 2025-09-28 05:47:01.256 4661 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 375 File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 3m 22.482s 2025-09-28 05:47:01.256 4662 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 3m 22.483s 2025-09-28 05:47:01.257 4612 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 22.484s 2025-09-28 05:47:01.258 4613 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 375 File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
node1 3m 22.484s 2025-09-28 05:47:01.258 4614 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 3m 22.492s 2025-09-28 05:47:01.266 4663 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 3m 22.493s 2025-09-28 05:47:01.267 4664 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 402 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/402 {"round":402,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/402/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 3m 22.494s 2025-09-28 05:47:01.268 4623 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 3m 22.494s 2025-09-28 05:47:01.268 4624 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 402 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/402 {"round":402,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/402/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 3m 22.548s 2025-09-28 05:47:01.322 4666 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/25 for round 402
node0 3m 22.550s 2025-09-28 05:47:01.324 4667 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 402
Timestamp: 2025-09-28T05:47:00.019835Z
Next consensus number: 14433
Legacy running event hash: e5992a759578807dc3caf15c219970e6d83b526c6aa8a89950150000c258e2d4ae88c570d6652346da7474b62b904246
Legacy running event mnemonic: erupt-exchange-extra-vacant
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 201106013
Root hash: b83d968f188251f8c9af266ba84ac2a0268b8ded816e73e4633b0c173aa1a19bb3b3eb0ebff3321fca01f3a18a8bcfbd
(root) ConsistencyTestingToolState / wear-comfort-aware-disease
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 item-keen-enter-convince
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 8940386125639942911 /3 lunch-convince-board-uniform
    4 StringLeaf 402 /4 hole-exercise-mandate-chapter
node0 3m 22.558s 2025-09-28 05:47:01.332 4671 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 22.558s 2025-09-28 05:47:01.332 4672 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 375 File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
node0 3m 22.558s 2025-09-28 05:47:01.332 4673 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 3m 22.568s 2025-09-28 05:47:01.342 4674 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 3m 22.568s 2025-09-28 05:47:01.342 4675 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 402 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/402 {"round":402,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/402/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
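Through round 402, the PCES copy step still selects files by the bounds encoded in the file name (for example, node3's file with minr1/maxr501 is chosen for lower bound 375). A small sketch that parses those name components; the meanings of seq, minr, maxr and orgn are inferred from the names in this log, not confirmed by it:

import re
from pathlib import Path

# File-name layout seen in these entries: <timestamp>_seq<N>_minr<N>_maxr<N>_orgn<N>.pces
PCES_NAME_RE = re.compile(r"_seq(\d+)_minr(\d+)_maxr(\d+)_orgn(\d+)\.pces$")

def parse_pces_name(path):
    """Return (seq, minr, maxr, orgn) parsed from a .pces file name; field meanings assumed."""
    m = PCES_NAME_RE.search(Path(path).name)
    if not m:
        raise ValueError(f"unrecognised PCES file name: {path}")
    return tuple(int(g) for g in m.groups())

# Example from node3's round-402 copy above: lower bound 375 falls inside [minr, maxr].
seq, minr, maxr, orgn = parse_pces_name(
    "data/saved/preconsensus-events/3/2025/09/28/"
    "2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces"
)
assert minr <= 375 <= maxr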
node2 4m 22.262s 2025-09-28 05:48:01.036 6210 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 541 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 4m 22.289s 2025-09-28 05:48:01.063 6250 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 541 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 4m 22.342s 2025-09-28 05:48:01.116 6210 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 541 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 22.346s 2025-09-28 05:48:01.120 6182 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 541 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 4m 22.438s 2025-09-28 05:48:01.212 6185 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 541 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/541
node1 4m 22.439s 2025-09-28 05:48:01.213 6186 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node0 4m 22.482s 2025-09-28 05:48:01.256 6213 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 541 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/541
node0 4m 22.483s 2025-09-28 05:48:01.257 6214 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node3 4m 22.508s 2025-09-28 05:48:01.282 6253 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 541 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/541
node3 4m 22.509s 2025-09-28 05:48:01.283 6254 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node1 4m 22.524s 2025-09-28 05:48:01.298 6221 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node2 4m 22.525s 2025-09-28 05:48:01.299 6213 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 541 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/541
node1 4m 22.526s 2025-09-28 05:48:01.300 6222 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 541
Timestamp: 2025-09-28T05:48:00.227507Z
Next consensus number: 17663
Legacy running event hash: 52b777c9dafec1c761bd513905c838be0877a4620c7cfcdbbf0fc45be712f9482874c240ec241d8f6ac47925cc54da18
Legacy running event mnemonic: finger-weird-park-shrimp
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -834372272
Root hash: a715f3facb6e25241922374bdd5dff5a54c851b671b86296240bfb0bb23e340c949f8179cb1b652dafc20ea39ce4bf2c
(root) ConsistencyTestingToolState / buddy-liar-myself-pig
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reward-apology-pull-zone
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -534898808152608437 /3 ahead-copper-under-parent
    4 StringLeaf 541 /4 penalty-wink-goose-polar
node2 4m 22.526s 2025-09-28 05:48:01.300 6214 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node1 4m 22.534s 2025-09-28 05:48:01.308 6223 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 22.534s 2025-09-28 05:48:01.308 6224 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 514 File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 4m 22.534s 2025-09-28 05:48:01.308 6225 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 4m 22.535s 2025-09-28 05:48:01.309 6226 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 4m 22.536s 2025-09-28 05:48:01.310 6227 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 541 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/541 {"round":541,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/541/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 4m 22.538s 2025-09-28 05:48:01.312 6228 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/1
node0 4m 22.567s 2025-09-28 05:48:01.341 6245 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node0 4m 22.569s 2025-09-28 05:48:01.343 6246 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 541
Timestamp: 2025-09-28T05:48:00.227507Z
Next consensus number: 17663
Legacy running event hash: 52b777c9dafec1c761bd513905c838be0877a4620c7cfcdbbf0fc45be712f9482874c240ec241d8f6ac47925cc54da18
Legacy running event mnemonic: finger-weird-park-shrimp
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -834372272
Root hash: a715f3facb6e25241922374bdd5dff5a54c851b671b86296240bfb0bb23e340c949f8179cb1b652dafc20ea39ce4bf2c
(root) ConsistencyTestingToolState / buddy-liar-myself-pig
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reward-apology-pull-zone
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -534898808152608437 /3 ahead-copper-under-parent
  4 StringLeaf 541 /4 penalty-wink-goose-polar
node0 4m 22.576s 2025-09-28 05:48:01.350 6247 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 4m 22.576s 2025-09-28 05:48:01.350 6248 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 514 File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 4m 22.576s 2025-09-28 05:48:01.350 6249 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 4m 22.577s 2025-09-28 05:48:01.351 6250 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 4m 22.578s 2025-09-28 05:48:01.352 6251 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 541 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/541 {"round":541,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/541/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 4m 22.579s 2025-09-28 05:48:01.353 6252 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/1
node3 4m 22.591s 2025-09-28 05:48:01.365 6289 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node3 4m 22.593s 2025-09-28 05:48:01.367 6290 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 541
Timestamp: 2025-09-28T05:48:00.227507Z
Next consensus number: 17663
Legacy running event hash: 52b777c9dafec1c761bd513905c838be0877a4620c7cfcdbbf0fc45be712f9482874c240ec241d8f6ac47925cc54da18
Legacy running event mnemonic: finger-weird-park-shrimp
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -834372272
Root hash: a715f3facb6e25241922374bdd5dff5a54c851b671b86296240bfb0bb23e340c949f8179cb1b652dafc20ea39ce4bf2c
(root) ConsistencyTestingToolState / buddy-liar-myself-pig
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reward-apology-pull-zone
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -534898808152608437 /3 ahead-copper-under-parent
  4 StringLeaf 541 /4 penalty-wink-goose-polar
node3 4m 22.600s 2025-09-28 05:48:01.374 6291 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 4m 22.601s 2025-09-28 05:48:01.375 6292 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 514 File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces
node3 4m 22.601s 2025-09-28 05:48:01.375 6293 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node3 4m 22.602s 2025-09-28 05:48:01.376 6294 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 4m 22.602s 2025-09-28 05:48:01.376 6295 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 541 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/541 {"round":541,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/541/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 4m 22.603s 2025-09-28 05:48:01.377 6296 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/1
node2 4m 22.641s 2025-09-28 05:48:01.415 6249 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/31 for round 541
node2 4m 22.643s 2025-09-28 05:48:01.417 6250 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 541
Timestamp: 2025-09-28T05:48:00.227507Z
Next consensus number: 17663
Legacy running event hash: 52b777c9dafec1c761bd513905c838be0877a4620c7cfcdbbf0fc45be712f9482874c240ec241d8f6ac47925cc54da18
Legacy running event mnemonic: finger-weird-park-shrimp
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -834372272
Root hash: a715f3facb6e25241922374bdd5dff5a54c851b671b86296240bfb0bb23e340c949f8179cb1b652dafc20ea39ce4bf2c
(root) ConsistencyTestingToolState / buddy-liar-myself-pig
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 reward-apology-pull-zone
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -534898808152608437 /3 ahead-copper-under-parent
  4 StringLeaf 541 /4 penalty-wink-goose-polar
node2 4m 22.650s 2025-09-28 05:48:01.424 6251 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 4m 22.650s 2025-09-28 05:48:01.424 6252 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 514 File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces
node2 4m 22.650s 2025-09-28 05:48:01.424 6253 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 4m 22.651s 2025-09-28 05:48:01.425 6254 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 4m 22.652s 2025-09-28 05:48:01.426 6255 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 541 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/541 {"round":541,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/541/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 4m 22.653s 2025-09-28 05:48:01.427 6256 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/1
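Each snapshot cycle above ends with a machine-readable StateSavedToDiskPayload appended to the "Finished writing state" line. To track snapshot progress across nodes from these logs, a small parser over that JSON fragment is usually enough. The sketch below is illustrative only: the class name and the regex-based approach are assumptions, not platform code; the sample line is copied from node1's entry 6227 above.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    /** Minimal sketch: extract the StateSavedToDiskPayload fields from a log line. */
    public final class StateSavedPayloadParser {
        // Matches the JSON fragment appended by SignedStateFileWriter, e.g.
        // {"round":541,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///..."}
        private static final Pattern PAYLOAD = Pattern.compile(
                "\\{\"round\":(\\d+),\"freezeState\":(true|false),\"reason\":\"([^\"]+)\",\"directory\":\"([^\"]+)\"\\}");

        public static void main(String[] args) {
            String line = "SignedStateFileWriter: Finished writing state for round 541 to disk. "
                    + "{\"round\":541,\"freezeState\":false,\"reason\":\"PERIODIC_SNAPSHOT\","
                    + "\"directory\":\"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/"
                    + "com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/541/\"}";
            Matcher m = PAYLOAD.matcher(line);
            if (m.find()) {
                System.out.printf("round=%s freeze=%s reason=%s dir=%s%n",
                        m.group(1), m.group(2), m.group(3), m.group(4));
            }
        }
    }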
node3 5m 22.509s 2025-09-28 05:49:01.283 7834 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 680 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 5m 22.546s 2025-09-28 05:49:01.320 7882 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 680 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 5m 22.562s 2025-09-28 05:49:01.336 7854 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 680 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 5m 22.582s 2025-09-28 05:49:01.356 7816 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 680 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 5m 22.750s 2025-09-28 05:49:01.524 7837 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 680 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/680
node3 5m 22.750s 2025-09-28 05:49:01.524 7838 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node0 5m 22.755s 2025-09-28 05:49:01.529 7819 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 680 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/680
node0 5m 22.756s 2025-09-28 05:49:01.530 7820 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node2 5m 22.820s 2025-09-28 05:49:01.594 7867 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 680 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/680
node2 5m 22.821s 2025-09-28 05:49:01.595 7868 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node3 5m 22.833s 2025-09-28 05:49:01.607 7869 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node3 5m 22.835s 2025-09-28 05:49:01.609 7870 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 680
Timestamp: 2025-09-28T05:49:00.413821830Z
Next consensus number: 20985
Legacy running event hash: 996b17eafa6a0622d848995f536430e8533d268dcec1d086bf7f8511de19fe65d089db2d2b51f4f347768c8fcbc03766
Legacy running event mnemonic: sport-wool-ready-session
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1721163818
Root hash: 4c6121841353dfd6ec92ff6a1c7161735d8e6bd2c62d43b456c230e42dd412d70c5fe1e135745c275bcbd5e547ffae26
(root) ConsistencyTestingToolState / method-correct-heavy-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 capable-season-tube-embody
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -7077768802875491408 /3 more-strike-glimpse-uphold
  4 StringLeaf 680 /4 beach-current-wine-job
node3 5m 22.842s 2025-09-28 05:49:01.616 7871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces Last file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 5m 22.842s 2025-09-28 05:49:01.616 7872 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 653 File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces
node3 5m 22.842s 2025-09-28 05:49:01.616 7873 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 22.845s 2025-09-28 05:49:01.619 7851 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node3 5m 22.845s 2025-09-28 05:49:01.619 7874 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 5m 22.846s 2025-09-28 05:49:01.620 7875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 680 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/680 {"round":680,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/680/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 22.847s 2025-09-28 05:49:01.621 7860 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 680
Timestamp: 2025-09-28T05:49:00.413821830Z
Next consensus number: 20985
Legacy running event hash: 996b17eafa6a0622d848995f536430e8533d268dcec1d086bf7f8511de19fe65d089db2d2b51f4f347768c8fcbc03766
Legacy running event mnemonic: sport-wool-ready-session
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1721163818
Root hash: 4c6121841353dfd6ec92ff6a1c7161735d8e6bd2c62d43b456c230e42dd412d70c5fe1e135745c275bcbd5e547ffae26
(root) ConsistencyTestingToolState / method-correct-heavy-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 capable-season-tube-embody
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -7077768802875491408 /3 more-strike-glimpse-uphold
  4 StringLeaf 680 /4 beach-current-wine-job
node3 5m 22.847s 2025-09-28 05:49:01.621 7876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/10
node0 5m 22.854s 2025-09-28 05:49:01.628 7861 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 5m 22.854s 2025-09-28 05:49:01.628 7862 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 653 File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 5m 22.854s 2025-09-28 05:49:01.628 7863 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 5m 22.857s 2025-09-28 05:49:01.631 7864 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 5m 22.858s 2025-09-28 05:49:01.632 7865 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 680 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/680 {"round":680,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/680/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 5m 22.859s 2025-09-28 05:49:01.633 7866 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/10
node1 5m 22.884s 2025-09-28 05:49:01.658 7885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 680 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/680
node1 5m 22.884s 2025-09-28 05:49:01.658 7886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node2 5m 22.915s 2025-09-28 05:49:01.689 7899 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node2 5m 22.917s 2025-09-28 05:49:01.691 7900 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 680
Timestamp: 2025-09-28T05:49:00.413821830Z
Next consensus number: 20985
Legacy running event hash: 996b17eafa6a0622d848995f536430e8533d268dcec1d086bf7f8511de19fe65d089db2d2b51f4f347768c8fcbc03766
Legacy running event mnemonic: sport-wool-ready-session
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1721163818
Root hash: 4c6121841353dfd6ec92ff6a1c7161735d8e6bd2c62d43b456c230e42dd412d70c5fe1e135745c275bcbd5e547ffae26
(root) ConsistencyTestingToolState / method-correct-heavy-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 capable-season-tube-embody
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -7077768802875491408 /3 more-strike-glimpse-uphold
  4 StringLeaf 680 /4 beach-current-wine-job
node2 5m 22.924s 2025-09-28 05:49:01.698 7901 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces Last file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 5m 22.925s 2025-09-28 05:49:01.699 7902 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 653 File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces
node2 5m 22.925s 2025-09-28 05:49:01.699 7903 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 5m 22.928s 2025-09-28 05:49:01.702 7904 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 5m 22.929s 2025-09-28 05:49:01.703 7905 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 680 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/680 {"round":680,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/680/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 5m 22.930s 2025-09-28 05:49:01.704 7906 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/10
node1 5m 22.969s 2025-09-28 05:49:01.743 7921 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/37 for round 680
node1 5m 22.971s 2025-09-28 05:49:01.745 7922 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 680
Timestamp: 2025-09-28T05:49:00.413821830Z
Next consensus number: 20985
Legacy running event hash: 996b17eafa6a0622d848995f536430e8533d268dcec1d086bf7f8511de19fe65d089db2d2b51f4f347768c8fcbc03766
Legacy running event mnemonic: sport-wool-ready-session
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1721163818
Root hash: 4c6121841353dfd6ec92ff6a1c7161735d8e6bd2c62d43b456c230e42dd412d70c5fe1e135745c275bcbd5e547ffae26
(root) ConsistencyTestingToolState / method-correct-heavy-aim
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 capable-season-tube-embody
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -7077768802875491408 /3 more-strike-glimpse-uphold
  4 StringLeaf 680 /4 beach-current-wine-job
node1 5m 22.979s 2025-09-28 05:49:01.753 7926 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces Last file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 22.979s 2025-09-28 05:49:01.753 7927 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 653 File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 5m 22.979s 2025-09-28 05:49:01.753 7928 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 5m 22.982s 2025-09-28 05:49:01.756 7929 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 5m 22.983s 2025-09-28 05:49:01.757 7930 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 680 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/680 {"round":680,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/680/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 5m 22.984s 2025-09-28 05:49:01.758 7931 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/10
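A pattern worth noting across the two snapshot cycles above: the BestEffortPcesFileCopy lower bound tracks the saved round minus the non-ancient window. For round 541 with "Rounds non-ancient: 26" the logs show a lower bound of 514 (541 - 26 - 1), and for round 680 they show 653 (680 - 26 - 1). This relationship is inferred from the log values, not taken from the platform source, so treat the helper below as a sanity check rather than the real formula.

    /** Sketch: lower bound inferred from the round-541 and round-680 copies above. */
    public final class PcesLowerBoundCheck {
        static long pcesCopyLowerBound(long savedRound, int roundsNonAncient) {
            // Assumed relationship: lowerBound = round - roundsNonAncient - 1
            return savedRound - roundsNonAncient - 1;
        }
        public static void main(String[] args) {
            System.out.println(pcesCopyLowerBound(541, 26)); // 514, matches the round-541 copy
            System.out.println(pcesCopyLowerBound(680, 26)); // 653, matches the round-680 copy
        }
    }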
node4 5m 50.068s 2025-09-28 05:49:28.842 1 INFO STARTUP <main> StaticPlatformBuilder:
////////////////////// // Node is Starting // //////////////////////
node4 5m 50.158s 2025-09-28 05:49:28.932 2 DEBUG STARTUP <main> StaticPlatformBuilder: main() started {} [com.swirlds.logging.legacy.payload.NodeStartPayload]
node4 5m 50.174s 2025-09-28 05:49:28.948 3 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 50.290s 2025-09-28 05:49:29.064 4 INFO STARTUP <main> Browser: The following nodes [4] are set to run locally
node4 5m 50.298s 2025-09-28 05:49:29.072 5 INFO STARTUP <main> ConsistencyTestingToolMain: Registering ConsistencyTestingToolState with ConstructableRegistry
node4 5m 50.310s 2025-09-28 05:49:29.084 6 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 50.726s 2025-09-28 05:49:29.500 9 INFO STARTUP <main> ConsistencyTestingToolMain: ConsistencyTestingToolState is registered with ConstructableRegistry
node4 5m 50.727s 2025-09-28 05:49:29.501 10 DEBUG STARTUP <main> BootstrapUtils: Scanning the classpath for RuntimeConstructable classes
node4 5m 51.690s 2025-09-28 05:49:30.464 11 DEBUG STARTUP <main> BootstrapUtils: Done with registerConstructables, time taken 963ms
node4 5m 51.698s 2025-09-28 05:49:30.472 12 INFO STARTUP <main> ConsistencyTestingToolMain: constructor called in Main.
node4 5m 51.701s 2025-09-28 05:49:30.475 13 WARN STARTUP <main> PlatformConfigUtils: Configuration property 'reconnect.asyncOutputStreamFlushMilliseconds' was renamed to 'reconnect.asyncOutputStreamFlush'. This build is currently backwards compatible with the old name, but this may not be true in a future release, so it is important to switch to the new name.
node4 5m 51.747s 2025-09-28 05:49:30.521 14 INFO STARTUP <main> PrometheusEndpoint: PrometheusEndpoint: Starting server listening on port: 9999
node4 5m 51.807s 2025-09-28 05:49:30.581 15 WARN STARTUP <main> CryptoStatic: There are no keys on disk; ad hoc keys will be generated, but this is incompatible with DAB.
node4 5m 51.808s 2025-09-28 05:49:30.582 16 DEBUG STARTUP <main> CryptoStatic: Started generating keys
node4 5m 53.812s 2025-09-28 05:49:32.586 17 DEBUG STARTUP <main> CryptoStatic: Done generating keys
node4 5m 53.901s 2025-09-28 05:49:32.675 20 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 53.908s 2025-09-28 05:49:32.682 21 INFO STARTUP <main> StartupStateUtils: The following saved states were found on disk:
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/269/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/134/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/10/SignedState.swh
- /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1/SignedState.swh
node4 5m 53.908s 2025-09-28 05:49:32.682 22 INFO STARTUP <main> StartupStateUtils: Loading latest state from disk.
node4 5m 53.908s 2025-09-28 05:49:32.682 23 INFO STARTUP <main> StartupStateUtils: Loading signed state from disk: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/269/SignedState.swh
node4 5m 53.912s 2025-09-28 05:49:32.686 24 INFO STARTUP <main> ConsistencyTestingToolState: New State Constructed.
node4 5m 53.917s 2025-09-28 05:49:32.691 25 INFO STATE_TO_DISK <main> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp
node4 5m 54.043s 2025-09-28 05:49:32.817 36 INFO STARTUP <main> StartupStateUtils: Loaded state's hash is the same as when it was saved.
node4 5m 54.046s 2025-09-28 05:49:32.820 37 INFO STARTUP <main> StartupStateUtils: Platform has loaded a saved state {"round":269,"consensusTimestamp":"2025-09-28T05:46:00.409185161Z"} [com.swirlds.logging.legacy.payload.SavedStateLoadedPayload]
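Entries 21-23 above show node4 enumerating its saved states (rounds 269, 134, 10, 1) and loading the highest round. A rough, non-authoritative equivalent of that directory walk, assuming only the layout visible in the log (the round number is the last directory before SignedState.swh), might look like this; the class name is hypothetical and StartupStateUtils' real selection logic may differ.

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.Comparator;
    import java.util.Optional;
    import java.util.stream.Stream;

    /** Sketch: find the newest SignedState.swh under a node's saved-state directory. */
    public final class LatestSavedState {
        public static Optional<Path> latest(Path savedDirForNode) throws IOException {
            // savedDirForNode is e.g. .../data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123
            try (Stream<Path> rounds = Files.list(savedDirForNode)) {
                return rounds.filter(Files::isDirectory)
                        .filter(p -> p.getFileName().toString().matches("\\d+")) // round dirs: 1, 10, 134, 269, ...
                        .max(Comparator.comparingLong(p -> Long.parseLong(p.getFileName().toString())))
                        .map(p -> p.resolve("SignedState.swh"))
                        .filter(Files::exists);
            }
        }
    }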
node4 5m 54.048s 2025-09-28 05:49:32.822 40 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.049s 2025-09-28 05:49:32.823 43 INFO STARTUP <main> BootstrapUtils: Not upgrading software, current software is version SemanticVersion[major=1, minor=0, patch=0, pre=, build=].
node4 5m 54.058s 2025-09-28 05:49:32.832 44 INFO STARTUP <main> AddressBookInitializer: Using the loaded state's address book and weight values.
node4 5m 54.065s 2025-09-28 05:49:32.839 45 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 54.066s 2025-09-28 05:49:32.840 46 INFO STARTUP <main> ConsistencyTestingToolMain: returning software version SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
node4 5m 55.094s 2025-09-28 05:49:33.868 47 INFO STARTUP <main> OSHealthChecker:
PASSED - Clock Source Speed Check Report[callsPerSec=26186747]
PASSED - Entropy Check Report[success=true, entropySource=Strong Instance, elapsedNanos=227110, randomLong=6437812918758976189, exception=null]
PASSED - Entropy Check Report[success=true, entropySource=NativePRNGBlocking:SUN, elapsedNanos=7180, randomLong=7754031097164110503, exception=null]
PASSED - File System Check Report[code=SUCCESS, readNanos=1056980, data=35, exception=null]
OS Health Check Report - Complete (took 1016 ms)
node4 5m 55.119s 2025-09-28 05:49:33.893 48 DEBUG STARTUP <main> BootstrapUtils: jvmPauseDetectorThread started
node4 5m 55.238s 2025-09-28 05:49:34.012 49 INFO STARTUP <main> PcesUtilities: Span compaction completed for data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr501_orgn0.pces, new upper bound is 378
node4 5m 55.240s 2025-09-28 05:49:34.014 50 INFO STARTUP <main> StandardScratchpad: Scratchpad platform.iss contents:
LAST_ISS_ROUND null
node4 5m 55.243s 2025-09-28 05:49:34.017 51 INFO STARTUP <main> PlatformBuilder: Default platform pool parallelism: 8
node4 5m 55.310s 2025-09-28 05:49:34.084 52 INFO STARTUP <main> SwirldsPlatform: Starting with roster history:
RosterHistory[ currentRosterRound: 0 ][ no previous roster set ] Current Roster: { "rosterEntries": [{ "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAK05TS8KZeb1MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTEwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTEwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDBoP9dI3K1PRLRK7h90D9eNCfgzuHTyJi70yDEs90XJXlE6jmgf1NE2av83VAhQHLxu8Ehc/55M9Ayx9IQc0zJLSS+IrRM9QwqoG8ZvNdRgNw+je3V/8rAK/mHId+cPnnyDplCyskyi5kWCv6kTULIewFH8/KVZwhe0/hB2+N6ujWixURrxjjGLHA6b2gPoGAb/nxiVOn+L0cWcOzcyiYShxagj0FBWV7AxKx65Ynzfe7eF0gOzBUA+IM10OM5KXJejk53Xz5KpEyGe8htO/bXFlpLdm3UzrYiIhY0oKPYKECAC1s+VAZA6i+MV0nDpqDgxHRRXD8O2arauPhEI6iVT9f05AtzElrs7U95HbpQUuP1sxkaQw+bLdMOQHHMVCgMgw2g0eDdVDAMJD7wjZ+Bs6kDc/EJELb0l1uy2GEnOZMiHkK4K1r4IyZ/ed6QpyIRKfBCNyT5IIpMoVpzRYxVXgjgFdudd8iErKyvSXHThU6nu92c+vSd+FLBFHPpb6ECAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdga5NYtV48uDCd4vIsmpGWpKuUHtDVDlCvzHc2ij8DxAR6OFp+hIRNEBXkzg1KS5qP8Wba5ptmGoV4f89HemP+AL3Azde+HjpYRtffdfTdQwmMbw7xJg2lKkEo11gDo5+zPZnVbfb3FsZ+IXKji0QshQBfg+ddTkFG3TJG1ttq3ZDw94RxFQivVnkj1p+Ogel/DuBNRWQobFVe5VrmJqbuwwN8AdrPae1dMrkZatF91On5+cpVLGfk96fYUhDohDt6KKQ6DdhvFk5rhd0vsHGMQq2gAW2+Or6ZVsKkHKx8CPINpJVKAdpE0tItI+loMO02jf9oRI/8cThWP1vNAeWnr0D6m275EZf/4qem/DdJ0FJIVou3P7tsq7eSdueDnj5RmcbW/vOBtvlXpD3SqsVRn6sltZ0sk24p+6ZMzopevCZEMf/nL3OzGvSadisXb39H9DgwkNLlefju1QLgHWf0TGfeNHluDgVDhU8+/1/KUGtr2SnZ5EVO1l59FWHALj", "gossipEndpoint": [{ "ipAddressV4": "iHGZ3Q==", "port": 30124 }, { "ipAddressV4": "CoAAbw==", "port": 30124 }] }, { "nodeId": "1", "weight": "12500000000", "gossipCaCertificate": "MIIDpjCCAg6gAwIBAgIIHWg7e2Q/smQwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlMjAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlMjCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKr5WsBepS3+y/0/yfBjzMWje7zianEz7sszrNWV3cGu2KUlR7v2+9wp/EtX1+BdcGlTTojgFs5nEBN4lM76Cp6JjFH461yN8GSkIkpe8GZnb1w4KEjZj5UYMbq+qOUI6QmwmgLeO8RHAsS6lCP1AyGFalb2ZVJ09DcYDxCRXeFj4BqvNbtD5r5DTCtpVT4ax3eb3pzNSGsjQUG9zhyp/WcsAmwmzKdMl72tk6qF8tlAWXyzwiCujWHS0Kln0C5pyEjeFNsG299toC4pgT8juxijgseTeIFRnNHmGSeSmXpAkEELlwLKR8HOnqeiS5UXNqdbxNemx/EpJSc5rTB6kzLX24dIuRsgyIIFWx73goOzmaHUolN4xmenifoMYlSNNM07WrsvmjRC5OLc/uGhdWqhZGBCH6AJB8Cmw84QLXVdHE6LiueP1oMd7g++N4X880wJkuh0ebfV3i7etUIn0jLlM50AkRucG9kwZDJ/M4LY7FT2F85R1/o2FaB/537ARQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQB5lTkqYw0hEW+BJTFsQ8jEHfIDNRJ0kNbVuibfP+u7kzlJy15lCEi+Qw6E3d8hA1QBX3xJMxNBlrtYPrdG26hh/tOwo5Np/OfxQC5jo0Q7n7hu7aLxZRUB/q7AfdDbOun4Za6rJhT3+EsFocyARWp8bYSk3YILBMkP+2VYDRkgQidzKgKtO5yv21Y9sEgziSprc+dQb/tqn5aQZLWavFwCLwnB3t4r4qwLHkkH00Jw51uOvLeM49/t333V5Caa7wmWzMcE+KSWW0QWFRxeJrodSyjPdmDi4D8lKN5WJHSAU5L2yWIODUyWD/cvsAapTv7xXk9ja/Ssb9DpMQnM1xh0hYaESajNeL1QbGuZgPxAwrw981h7kprR2P2iMGRVGA6u4ezxmhW3s7D+yJ3+Yxs/x2J/sw65Z16mRYXRWYWHQmhgaVQjIviiAkVB6CWZo1kHl/eYaVedQzKlrTpbr3JtmwGwhYEOnrkzsC63h8/AG9gRtIAIGWGqTPWbn2pEm8M=", "gossipEndpoint": [{ "ipAddressV4": "IojqDg==", "port": 30125 }, { "ipAddressV4": "CoAAbQ==", "port": 30125 }] }, { "nodeId": "2", "weight": "12500000000", "gossipCaCertificate": 
"MIIDpzCCAg+gAwIBAgIJAJg3GRFp5bT9MA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTMwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTMwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQCl5ut2dCleDmgEneRYpAKa9Pe2qnXzgF+BEIuTfizG2OcPQi/ltv+6HxSrJXtuWNaiX/G4iP7iBzWj2ysaAYwfYj0ezTSMLRqM9hXzVgLtW0LJEF6a8vUXPsJt4GEJkUKiYCCO1MP1NLd3y/3SVJrFhwJSPqKYm2pQNg84WfPDWSkzSneOIO4Z0uWDXgs+vzSNyChWOxVieFQhLjcELtyj6narmLox+Jdo/SxUzPuktuFB3ebNgUqWPkjljgZpl00BTmbRIVHgHfDVulo2PBpXd0VplIDgdPr5zMKdTrKCuDKey8Mft72RkPKMe9LZVZ/21+rXVEh+olvvUCySsP2RkWPUJJD90c8wKo01rZsjAOXscJKQcBYlam5XXO4ZBRYzEdxuivbkPwsOoQ83swCR3alPvwfbg11Va+zXE6sRbUM9LqkYo/M3Hwg8tSIXu8oah6csputanz867dzWwyVJEPzmiXZ6ncVDQO31QlB7RndWCqKTjOQpnpblUMsrE9MCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAnUA8+kz7L+eSOm/iVvUNYF10PKO2nZtxWWL7R1vwK/2Up765PwqxKb0eSEM4bjgvZq1GuGXs9X/Y7dos42yntXvgeUY+/2JzCnw4J5tzxytZ+IKX6DR67NjDzDzVZQfptjLQrb8E7yzml0uxsqrhNPWl57Bmfe66Kg2lD11jImeeEhExlRggFukoiUWVwRNU21Q1jMUWrg2ZwfP+6fFTgRt0WR+X5zkyYPbvI6/yv7reYGjPDuZTOFhbwG8LUTQxdttDswPjnQ606kMyninL+aNelSdV/UIII7lpr/dTvgQAnrlBaGXvdy6brh3wWEwia0FZFZcKEs6M+jZ3MrFxvlTfUIdI3jRq12L10cCDi2VhORg4JmvlM+Tk6kJeSku30ZLAVo3S7GbTdvkuesOxz3UwnF7yfOA1KYOPvhv1oLxGV5z05glsn1OBKnXMdzsKFbAYYHj81bgBni2WLuIpv3oXlai2uc4y9m8LvWAQ+h/ivyog34Ai3Pvr5ZZOFgjy", "gossipEndpoint": [{ "ipAddressV4": "iHLsaQ==", "port": 30126 }, { "ipAddressV4": "CoAAaw==", "port": 30126 }] }, { "nodeId": "3", "weight": "12500000000", "gossipCaCertificate": "MIIDpzCCAg+gAwIBAgIJAN7hww13zBZEMA0GCSqGSIb3DQEBDAUAMBIxEDAOBgNVBAMTB3Mtbm9kZTQwIBcNMDAwMTAxMDAwMDAwWhgPMjEwMDAxMDEwMDAwMDBaMBIxEDAOBgNVBAMTB3Mtbm9kZTQwggGiMA0GCSqGSIb3DQEBAQUAA4IBjwAwggGKAoIBgQDK/bVyv0ZUeJZ4cIOImM+wmqtYjCw4jPAC549WQPPV1vG0lzSpgV+nRKqmWBexhLlKN3bsvrfNCUpKSq8meFyCtdppT1dhUOmEZcoNhLZzqxXb2HYYqRPv82tR+tbh+27WFsBOOqYrYTvr72ECD7qDOuw/Xob6KImaw/b/SIAPecMoYy25fkgYkJSETwd8HUpwssYH/JTLBF8eGjjTTMuu14ARQKeH8BXSs+jjV1+3IItXERS8ryUGDjqc5vC8ZW1kDVQbb91IDxRjqZbFyhuasocCqTAcZuiEgE8Wilwp2g1vbAUnHnvKNfiaEAHoEV6vF4lelaWhOnN2U5tnox/ns6PiDqIbOfs0pmXxjAK0vxc6oZM3TwdRtzo6cSb/AYfQdnmQzkra980kHN12r3f7PK2PzGBuVUPT7fLGA4S3vQDYO4rqcgTc/OLobtqLtdBusOFjZscfIfUW4GVWJUI1j+fwvHacxWLmyZwlQ5Q47UtrtjWpFru7CTn5S477lqMCAwEAATANBgkqhkiG9w0BAQwFAAOCAYEAdW6AWDhT0eOJw+0O6MYngmCgkXfFsgBC/B1plaE596hHo58FHxzCNiLFdvfRj37rxujvqsDAkADWUmOzLzLHYMXu302HzDqAMNY6FZJc32y4ZDsIQpaUOAuiNHAHwFXuPRInVpCqztfJMgw4RhOhcCTEsoIJsqoIN1t4M0pEVAv6x3nJwFKZqSNOZrQ7sOW32FjwWS3kHwRsCTtqdk5n2KxU6wr/fggV3QsSPRMYro8sUfwu93mqggtswwWqfeKlsz5WiaR9aqLnb8z1R6HLvA0bcoPWzjgn8RdP+9we4z06iZ5vdBuNpwBjrCKUELWISyAoekLGGxyS8pPqYiSBRNUoaPITSuUjcCBbJ9EFvm72QgCBesbwF71KPabTPbMPhLmf+uAi+zmeu8ZeVvT6DrX9OHSkIvIEQFry9BrqOT3ce6KBHSO1HpXIetj5Wcd3WHXtz9ulBL9ikWC8eh7/+we51ucmLvFzNKznElhT2Dp+czXUVNEUjp3u/66pyRA4", "gossipEndpoint": [{ "ipAddressV4": "IjrriA==", "port": 30127 }, { "ipAddressV4": "CoAAbg==", "port": 30127 }] }, { "nodeId": "4", "gossipCaCertificate": 
"MIIDpjCCAg6gAwIBAgIId2RXuetTjnAwDQYJKoZIhvcNAQEMBQAwEjEQMA4GA1UEAxMHcy1ub2RlNTAgFw0wMDAxMDEwMDAwMDBaGA8yMTAwMDEwMTAwMDAwMFowEjEQMA4GA1UEAxMHcy1ub2RlNTCCAaIwDQYJKoZIhvcNAQEBBQADggGPADCCAYoCggGBAKw8NsPZVKyXW3smNAUcoN/JOyhyJah6Y2gPOCek6hcOaLjg6DvZxLzinpLKwScrgRIIiuU/9hUsy6EiOtCuX3lT4ByKA5XdN1YQAXL1TS00vUf7BR1b+PW58gaJtepZctHvyklNxFeVT/vq2zWQa1sXeTz0OReJXO+8WWO1AjciYyv963zZ6jvk5yhWGukC5NJ2pJTjFh3+2+PivnLevNNnW6bZkdgSl3MThr78nsWlUQykvx+FNIgLlrq+4fCIFzXMeJRRXqo0MJlsxBfQaSu1arqMGE5BQWBEr51EH9UKNP3MSEntr1h1WMeJ3tZXFJ/z1xzKxsVII8HNtjynv1MoEGY4AQDUWI0CRUBv0uAmBUljGJVyqgHIPO6OGznJdkQsmSOSpqdVeNSWEd85Nh0Niorvpat9g5vUJ9zWKPWmgiww+RQiqq8jA2BiD/3n+I8UIMbMShiJUQBmnveAEWLMeV6TIuIBllGQ0a5oB+PQOZh0g+TvsjnH8/c1h1MSHQIDAQABMA0GCSqGSIb3DQEBDAUAA4IBgQBYPcTDNQjigDZ2iY/nA9BhiyNJDHQBecLoADQ+5mxGrGclUcvd002ovbVhoGlcMkVyG/Am9/Y3t6zpO4pMQy7Jecfx8U8+VtKEPFkpfslCmEHT9EpnCGcS5t1KB2KcqnZ57f9uUZLBtMPem1FO6wcT71TXr4/+hGNuLpLmtKkQWARnVURR1/MHhcJ35BH1/x1VIeNc+PBpTE/blszn69Gwqt/HlpNTnYfqS0vhTCfOO07dxn8n904xb1nZ0l0k7pCQdRTTn+dYxmvQ7MlTRK5iOlOKIiRAbD8Fwyfo93cvDj6V56c7Gz+knyjz0iARMsptumqevmfiJ324HzV/12SukJokFDl9G6MjG99ucg3CQaIxa5kRVN1b5D+QMeowfj0lKTchGm/BbuO4zousIE+aWCGfy41CccTP11JW2quwjsVGufb5fvCfDXop5f9i1+95oHd2JyLmHDnJBWEqBHret4Dx8P8GyQhyZB6jkZpVwJy/ruPyrAL3kcKqUZecaOg=", "gossipEndpoint": [{ "ipAddressV4": "IjfxcQ==", "port": 30128 }, { "ipAddressV4": "CoAAbA==", "port": 30128 }] }] }
node4 5m 55.329s 2025-09-28 05:49:34.103 53 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with state long -7896747143900124556.
node4 5m 55.329s 2025-09-28 05:49:34.103 54 INFO STARTUP <main> ConsistencyTestingToolState: State initialized with 269 rounds handled.
node4 5m 55.329s 2025-09-28 05:49:34.103 55 INFO STARTUP <main> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 55.330s 2025-09-28 05:49:34.104 56 INFO STARTUP <main> TransactionHandlingHistory: Log file found. Parsing previous history
node4 5m 56.061s 2025-09-28 05:49:34.835 57 INFO STARTUP <main> StateInitializer: The platform is using the following initial state:
Round: 269
Timestamp: 2025-09-28T05:46:00.409185161Z
Next consensus number: 9867
Legacy running event hash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
Legacy running event mnemonic: hour-eight-simple-steak
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1528825575
Root hash: bebc818347d1c8f4ab66bec91829943db97b9211a14d7a5f23e07487d6aaac39c2f42277c1a7d6a8098b1ecb67fb372c
(root) ConsistencyTestingToolState / chuckle-soul-when-thunder
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 piano-bread-culture-warm
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf -7896747143900124556 /3 trick-possible-breeze-canal
  4 StringLeaf 269 /4 convince-example-flush-bicycle
node4 5m 56.302s 2025-09-28 05:49:35.076 59 INFO EVENT_STREAM <main> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: 13727a375e48b95820867bd1bb5760eb34c1fa4f698863e78eb1194e1a5bc620e27edb1719260e2e38481e2f9d047f82
node4 5m 56.314s 2025-09-28 05:49:35.088 60 INFO STARTUP <platformForkJoinThread-4> Shadowgraph: Shadowgraph starting from expiration threshold 242
node4 5m 56.320s 2025-09-28 05:49:35.094 62 INFO STARTUP <<start-node-4>> ConsistencyTestingToolMain: init called in Main for node 4.
node4 5m 56.322s 2025-09-28 05:49:35.096 63 INFO STARTUP <<start-node-4>> SwirldsPlatform: Starting platform 4
node4 5m 56.323s 2025-09-28 05:49:35.097 64 INFO STARTUP <<platform: recycle-bin-cleanup>> RecycleBinImpl: Deleted 0 files from the recycle bin.
node4 5m 56.327s 2025-09-28 05:49:35.101 65 INFO STARTUP <<start-node-4>> CycleFinder: No cyclical back pressure detected in wiring model.
node4 5m 56.329s 2025-09-28 05:49:35.103 66 INFO STARTUP <<start-node-4>> DirectSchedulerChecks: No illegal direct scheduler use detected in the wiring model.
node4 5m 56.329s 2025-09-28 05:49:35.103 67 INFO STARTUP <<start-node-4>> InputWireChecks: All input wires have been bound.
node4 5m 56.332s 2025-09-28 05:49:35.106 68 INFO STARTUP <<start-node-4>> SwirldsPlatform: replaying preconsensus event stream starting at 242
node4 5m 56.338s 2025-09-28 05:49:35.112 69 INFO PLATFORM_STATUS <platformForkJoinThread-3> StatusStateMachine: Platform spent 194.0 ms in STARTING_UP. Now in REPLAYING_EVENTS
node4 5m 56.627s 2025-09-28 05:49:35.401 70 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:4 H:c19558feb1c2 BR:267), num remaining: 4
node4 5m 56.628s 2025-09-28 05:49:35.402 71 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:990525ef7c33 BR:267), num remaining: 3
node4 5m 56.629s 2025-09-28 05:49:35.403 72 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:49109e7f9ab1 BR:267), num remaining: 2
node4 5m 56.629s 2025-09-28 05:49:35.403 73 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:85947813f198 BR:267), num remaining: 1
node4 5m 56.630s 2025-09-28 05:49:35.404 74 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:e7efbd00a54d BR:267), num remaining: 0
node4 5m 57.432s 2025-09-28 05:49:36.206 967 INFO STARTUP <<start-node-4>> PcesReplayer: Replayed 4,947 preconsensus events with max birth round 378. These events contained 6,836 transactions. 108 rounds reached consensus spanning 48.8 seconds of consensus time. The latest round to reach consensus is round 377. Replay took 1.1 seconds.
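Entry 967 summarizes the replay: 4,947 events carrying 6,836 transactions and 108 consensus rounds spanning 48.8 s of consensus time were replayed in 1.1 s, i.e. roughly 4,500 events/s and about 44x faster than real time. A throwaway calculation of the same ratios (values copied from the PcesReplayer line above):

    public final class ReplayRates {
        public static void main(String[] args) {
            double events = 4_947, txns = 6_836, rounds = 108;
            double consensusSeconds = 48.8, wallSeconds = 1.1;
            System.out.printf("events/s : %.0f%n", events / wallSeconds);                 // ~4497
            System.out.printf("txns/s   : %.0f%n", txns / wallSeconds);                   // ~6215
            System.out.printf("rounds/s : %.1f%n", rounds / wallSeconds);                 // ~98.2
            System.out.printf("speed-up : %.1fx real time%n", consensusSeconds / wallSeconds); // ~44.4x
        }
    }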
node4 5m 57.434s 2025-09-28 05:49:36.208 968 INFO STARTUP <<app: appMain 4>> ConsistencyTestingToolMain: run called in Main.
node4 5m 57.435s 2025-09-28 05:49:36.209 969 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 1.1 s in REPLAYING_EVENTS. Now in OBSERVING
node3 5m 58.342s 2025-09-28 05:49:37.116 8785 INFO RECONNECT <<platform-core: SyncProtocolWith4 3 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661] remote ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275]
node1 5m 58.343s 2025-09-28 05:49:37.117 8923 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661] remote ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275]
node0 5m 58.347s 2025-09-28 05:49:37.121 8759 INFO RECONNECT <<platform-core: SyncProtocolWith4 0 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661] remote ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275]
node2 5m 58.390s 2025-09-28 05:49:37.164 8839 INFO RECONNECT <<platform-core: SyncProtocolWith4 2 to 4>> RpcPeerHandler: OTHER_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661] remote ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275]
node4 5m 58.413s 2025-09-28 05:49:37.187 1064 INFO RECONNECT <<platform-core: SyncProtocolWith1 4 to 1>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275] remote ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661]
node4 5m 58.413s 2025-09-28 05:49:37.187 1065 INFO RECONNECT <<platform-core: SyncProtocolWith3 4 to 3>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275] remote ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661]
node4 5m 58.416s 2025-09-28 05:49:37.190 1066 INFO RECONNECT <<platform-core: SyncProtocolWith0 4 to 0>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275] remote ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661]
node4 5m 58.417s 2025-09-28 05:49:37.191 1067 INFO PLATFORM_STATUS <platformForkJoinThread-7> StatusStateMachine: Platform spent 980.0 ms in OBSERVING. Now in BEHIND
node4 5m 58.417s 2025-09-28 05:49:37.191 1068 INFO RECONNECT <platformForkJoinThread-4> ReconnectController: Starting ReconnectController
node4 5m 58.418s 2025-09-28 05:49:37.192 1069 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, stopping gossip
node4 5m 58.459s 2025-09-28 05:49:37.233 1070 INFO RECONNECT <<platform-core: SyncProtocolWith2 4 to 2>> RpcPeerHandler: SELF_FALLEN_BEHIND local ev=EventWindow[latestConsensusRound=377,ancientThreshold=350,expiredThreshold=275] remote ev=EventWindow[latestConsensusRound=762,ancientThreshold=735,expiredThreshold=661]
node4 5m 58.570s 2025-09-28 05:49:37.344 1071 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Preparing for reconnect, start clearing queues
node4 5m 58.572s 2025-09-28 05:49:37.346 1072 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectPlatformHelperImpl: Queues have been cleared
node4 5m 58.574s 2025-09-28 05:49:37.348 1073 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: waiting for reconnect connection
node4 5m 58.574s 2025-09-28 05:49:37.348 1074 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectController: acquired reconnect connection
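The BEHIND transition above follows from comparing event windows: node4 reports latestConsensusRound=377 while every peer reports expiredThreshold=661, so node4's entire view is already expired from its peers' perspective. The log does not show how the platform actually evaluates this, but a plausible reading of the SELF_FALLEN_BEHIND / OTHER_FALLEN_BEHIND lines, under that stated assumption, is sketched below; the check itself is inferred, not platform code.

    /** Sketch: an inferred fallen-behind check based on the EventWindow values logged above. */
    public final class FallenBehindCheck {
        record EventWindow(long latestConsensusRound, long ancientThreshold, long expiredThreshold) {}

        // Assumption inferred from this log: a node is behind when its latest
        // consensus round is older than the peer's expired threshold.
        static boolean selfFallenBehind(EventWindow self, EventWindow peer) {
            return self.latestConsensusRound() < peer.expiredThreshold();
        }

        public static void main(String[] args) {
            EventWindow node4 = new EventWindow(377, 350, 275); // node4 after PCES replay
            EventWindow peer  = new EventWindow(762, 735, 661); // nodes 0-3 at the same moment
            System.out.println(selfFallenBehind(node4, peer));  // true  -> node4 must reconnect
            System.out.println(selfFallenBehind(peer, node4));  // false -> peers act as teachers
        }
    }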
node1 5m 58.662s 2025-09-28 05:49:37.436 8935 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Starting reconnect in the role of the sender {"receiving":false,"nodeId":1,"otherNodeId":4,"round":762} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node1 5m 58.663s 2025-09-28 05:49:37.437 8936 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: The following state will be sent to the learner:
Round: 762
Timestamp: 2025-09-28T05:49:35.997831Z
Next consensus number: 22956
Legacy running event hash: bcce506ee4bc2dea29ba8a59f3cd7488772dc340a48f66af1a14123e800136a93860a7cf986955ba521ebc106fd1805a
Legacy running event mnemonic: green-assist-hammer-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1591704947
Root hash: 024d5615136e760046eff9b14bebe4974a396ef24d1f806757cb09745aab8b8e6534f561002f50ff2e2e17fbfbfcf9c3
(root) ConsistencyTestingToolState / endless-section-horse-scale
  0 SingletonNode PlatformStateService.PLATFORM_STATE /0 script-vendor-steel-citizen
  1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
  2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
  3 StringLeaf 7379020150092654631 /3 grass-flavor-sweet-sniff
  4 StringLeaf 762 /4 virtual-monkey-weird-extend
node1 5m 58.664s 2025-09-28 05:49:37.438 8937 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Sending signatures from nodes 0, 1, 2 (signing weight = 37500000000/50000000000) for state hash 024d5615136e760046eff9b14bebe4974a396ef24d1f806757cb09745aab8b8e6534f561002f50ff2e2e17fbfbfcf9c3
node1 5m 58.664s 2025-09-28 05:49:37.438 8938 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Starting synchronization in the role of the sender.
node1 5m 58.672s 2025-09-28 05:49:37.446 8939 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node1 5m 58.684s 2025-09-28 05:49:37.458 8940 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@771a6e9a start run()
node4 5m 58.729s 2025-09-28 05:49:37.503 1075 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Starting reconnect in role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":377} [com.swirlds.logging.legacy.payload.ReconnectStartPayload]
node4 5m 58.731s 2025-09-28 05:49:37.505 1076 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Receiving signed state signatures
node4 5m 58.735s 2025-09-28 05:49:37.509 1077 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Received signatures from nodes 0, 1, 2
node4 5m 58.737s 2025-09-28 05:49:37.511 1078 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls receiveTree()
node4 5m 58.738s 2025-09-28 05:49:37.512 1079 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronizing tree
node4 5m 58.738s 2025-09-28 05:49:37.512 1080 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 58.744s 2025-09-28 05:49:37.518 1081 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@28206985 start run()
node4 5m 58.756s 2025-09-28 05:49:37.530 1082 INFO STARTUP <<work group learning-synchronizer: async-input-stream #0>> ConsistencyTestingToolState: New State Constructed.
node1 5m 58.838s 2025-09-28 05:49:37.612 8959 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@771a6e9a finish run()
node1 5m 58.839s 2025-09-28 05:49:37.613 8960 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: finished sending tree
node1 5m 58.839s 2025-09-28 05:49:37.613 8961 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: sending tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node1 5m 58.840s 2025-09-28 05:49:37.614 8962 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@23fae9a5 start run()
node4 5m 58.956s 2025-09-28 05:49:37.730 1104 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 5m 58.956s 2025-09-28 05:49:37.730 1105 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 5m 58.957s 2025-09-28 05:49:37.731 1106 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@28206985 finish run()
node4 5m 58.957s 2025-09-28 05:49:37.731 1107 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.demo.consistency.ConsistencyTestingToolState with route []
node4 5m 58.958s 2025-09-28 05:49:37.732 1108 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: receiving tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 5m 58.962s 2025-09-28 05:49:37.736 1109 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@5a8960d6 start run()
node4 5m 59.021s 2025-09-28 05:49:37.795 1110 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): firstLeafPath: 1 -> 1, lastLeafPath: 1 -> 1
node4 5m 59.022s 2025-09-28 05:49:37.796 1111 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> ReconnectNodeRemover: setPathInformation(): done
node4 5m 59.024s 2025-09-28 05:49:37.798 1112 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread finished the learning loop for the current subtree
node4 5m 59.025s 2025-09-28 05:49:37.799 1113 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call nodeRemover.allNodesReceived()
node4 5m 59.025s 2025-09-28 05:49:37.799 1114 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived()
node4 5m 59.026s 2025-09-28 05:49:37.800 1115 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> ReconnectNodeRemover: allNodesReceived(): done
node4 5m 59.026s 2025-09-28 05:49:37.800 1116 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: call root.endLearnerReconnect()
node4 5m 59.026s 2025-09-28 05:49:37.800 1117 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call reconnectIterator.close()
node4 5m 59.026s 2025-09-28 05:49:37.800 1118 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call setHashPrivate()
node1 5m 59.095s 2025-09-28 05:49:37.869 8966 INFO RECONNECT <<work group teaching-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@23fae9a5 finish run()
node1 5m 59.095s 2025-09-28 05:49:37.869 8967 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> TeachingSynchronizer: finished sending tree
node1 5m 59.099s 2025-09-28 05:49:37.873 8970 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Finished synchronization in the role of the sender.
node4 5m 59.171s 2025-09-28 05:49:37.945 1128 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: call postInit()
node4 5m 59.172s 2025-09-28 05:49:37.946 1130 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> VirtualMap: endLearnerReconnect() complete
node4 5m 59.172s 2025-09-28 05:49:37.946 1131 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushVirtualTreeView: close() complete
node4 5m 59.172s 2025-09-28 05:49:37.946 1132 INFO RECONNECT <<work group learning-synchronizer: learner-task #2>> LearnerPushTask: learner thread closed input, output, and view for the current subtree
node4 5m 59.172s 2025-09-28 05:49:37.946 1133 INFO RECONNECT <<work group learning-synchronizer: async-input-stream #0>> AsyncInputStream: com.swirlds.common.merkle.synchronization.streams.AsyncInputStream@5a8960d6 finish run()
node4 5m 59.173s 2025-09-28 05:49:37.947 1134 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: received tree rooted at com.swirlds.virtualmap.VirtualMap with route [2]
node4 5m 59.173s 2025-09-28 05:49:37.947 1135 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: synchronization complete
node4 5m 59.173s 2025-09-28 05:49:37.947 1136 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls initialize()
node4 5m 59.174s 2025-09-28 05:49:37.948 1137 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initializing tree
node4 5m 59.174s 2025-09-28 05:49:37.948 1138 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: initialization complete
node4 5m 59.174s 2025-09-28 05:49:37.948 1139 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls hash()
node4 5m 59.175s 2025-09-28 05:49:37.949 1140 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing tree
node4 5m 59.176s 2025-09-28 05:49:37.950 1141 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: hashing complete
node4 5m 59.176s 2025-09-28 05:49:37.950 1142 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner calls logStatistics()
node4 5m 59.179s 2025-09-28 05:49:37.953 1143 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: Finished synchronization {"timeInSeconds":0.435,"hashTimeInSeconds":0.001,"initializationTimeInSeconds":0.0,"totalNodes":12,"leafNodes":7,"redundantLeafNodes":4,"internalNodes":5,"redundantInternalNodes":2} [com.swirlds.logging.legacy.payload.SynchronizationCompletePayload]
node4 5m 59.180s 2025-09-28 05:49:37.954 1144 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: ReconnectMapMetrics: transfersFromTeacher=12; transfersFromLearner=10; internalHashes=0; internalCleanHashes=0; internalData=0; internalCleanData=0; leafHashes=3; leafCleanHashes=3; leafData=7; leafCleanData=4
node4 5m 59.180s 2025-09-28 05:49:37.954 1145 INFO RECONNECT <<reconnect: reconnect-controller>> LearningSynchronizer: learner is done synchronizing
node4 5m 59.183s 2025-09-28 05:49:37.957 1146 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectLearner: Reconnect data usage report {"dataMegabytes":0.0060558319091796875} [com.swirlds.logging.legacy.payload.ReconnectDataUsagePayload]
node4 5m 59.187s 2025-09-28 05:49:37.961 1147 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Finished reconnect in the role of the receiver. {"receiving":true,"nodeId":4,"otherNodeId":1,"round":762,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 5m 59.187s 2025-09-28 05:49:37.961 1148 INFO RECONNECT <<reconnect: reconnect-controller>> ReconnectSyncHelper: Information for state received during reconnect:
Round: 762
Timestamp: 2025-09-28T05:49:35.997831Z
Next consensus number: 22956
Legacy running event hash: bcce506ee4bc2dea29ba8a59f3cd7488772dc340a48f66af1a14123e800136a93860a7cf986955ba521ebc106fd1805a
Legacy running event mnemonic: green-assist-hammer-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1591704947
Root hash: 024d5615136e760046eff9b14bebe4974a396ef24d1f806757cb09745aab8b8e6534f561002f50ff2e2e17fbfbfcf9c3
(root) ConsistencyTestingToolState / endless-section-horse-scale
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 script-vendor-steel-citizen
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7379020150092654631 /3 grass-flavor-sweet-sniff
    4 StringLeaf 762 /4 virtual-monkey-weird-extend
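
Several of the reconnect entries above end in a small JSON payload, e.g. the SynchronizationCompletePayload with its node counts and the ReconnectDataUsagePayload with the transferred megabytes. When post-processing a run like this one, those numeric fields can be pulled out with the JDK alone. The sketch below is a hypothetical log-analysis helper (the class and method names are not from the platform) and assumes only the flat numeric JSON shape visible above.

// Illustrative only: extracts numeric fields from the JSON payload appended to lines such as
// "LearningSynchronizer: Finished synchronization {...} [com.swirlds.logging.legacy.payload...]".
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PayloadFields {
    private static final Pattern FIELD = Pattern.compile("\"(\\w+)\"\\s*:\\s*(-?[0-9.]+)");

    static Map<String, Double> numericFields(String logLine) {
        Map<String, Double> out = new LinkedHashMap<>();
        Matcher m = FIELD.matcher(logLine);
        while (m.find()) {
            out.put(m.group(1), Double.parseDouble(m.group(2)));
        }
        return out;
    }

    public static void main(String[] args) {
        String line = "LearningSynchronizer: Finished synchronization "
                + "{\"timeInSeconds\":0.435,\"hashTimeInSeconds\":0.001,"
                + "\"initializationTimeInSeconds\":0.0,\"totalNodes\":12,\"leafNodes\":7,"
                + "\"redundantLeafNodes\":4,\"internalNodes\":5,\"redundantInternalNodes\":2}";
        Map<String, Double> f = numericFields(line);
        double redundant = f.get("redundantLeafNodes") + f.get("redundantInternalNodes");
        System.out.printf("reconnect took %.3f s; %.0f of %.0f transferred nodes were redundant%n",
                f.get("timeInSeconds"), redundant, f.get("totalNodes"));
    }
}

Run against the payload above, this prints a 0.435 s reconnect with 6 of the 12 transferred nodes reported as redundant.
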
node4 5m 59.188s 2025-09-28 05:49:37.962 1150 DEBUG RECONNECT <<reconnect: reconnect-controller>> ReconnectStateLoader: `loadReconnectState` : reloading state
node4 5m 59.189s 2025-09-28 05:49:37.963 1151 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with state long 7379020150092654631.
node4 5m 59.189s 2025-09-28 05:49:37.963 1152 INFO STARTUP <<reconnect: reconnect-controller>> ConsistencyTestingToolState: State initialized with 762 rounds handled.
node4 5m 59.189s 2025-09-28 05:49:37.963 1153 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Consistency testing tool log path: data/saved/consistency-test/4/ConsistencyTestLog.csv
node4 5m 59.189s 2025-09-28 05:49:37.963 1154 INFO STARTUP <<reconnect: reconnect-controller>> TransactionHandlingHistory: Log file found. Parsing previous history
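
The TransactionHandlingHistory entries show the testing tool re-reading its per-node ConsistencyTestLog.csv after reconnect, so the history on disk can be checked against the state it just received (state long 7379020150092654631, 762 rounds handled). The exact CSV schema is not visible in this log, so the sketch below only illustrates the general idea of such a replay check; the assumed one-row-per-round "round,value" layout, the class name, and the expectedLastRound parameter are all hypothetical.

// Hypothetical sketch in the spirit of the "Parsing previous history" step above.
// ASSUMPTION: a CSV with one "round,value" row per handled round; the real
// ConsistencyTestLog.csv format is not shown in this log.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class HistoryReplayCheck {
    static void check(Path csv, long expectedLastRound) throws IOException {
        long previousRound = -1;
        for (String row : Files.readAllLines(csv)) {
            long round = Long.parseLong(row.split(",")[0].trim());
            if (previousRound >= 0 && round != previousRound + 1) {
                throw new IllegalStateException("gap in handled rounds after " + previousRound);
            }
            previousRound = round;
        }
        if (previousRound != expectedLastRound) {
            throw new IllegalStateException(
                    "history ends at round " + previousRound + " but the state reports " + expectedLastRound);
        }
        System.out.println("history is contiguous through round " + previousRound);
    }

    public static void main(String[] args) throws IOException {
        // Path and round taken from the entries above; the check itself is illustrative.
        check(Path.of("data/saved/consistency-test/4/ConsistencyTestLog.csv"), 762);
    }
}
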
node4 5m 59.211s 2025-09-28 05:49:37.985 1159 INFO STATE_TO_DISK <<reconnect: reconnect-controller>> DefaultSavedStateController: Signed state from round 762 created, will eventually be written to disk, for reason: RECONNECT
node4 5m 59.211s 2025-09-28 05:49:37.985 1160 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 793.0 ms in BEHIND. Now in RECONNECT_COMPLETE
node4 5m 59.212s 2025-09-28 05:49:37.986 1162 INFO STARTUP <platformForkJoinThread-1> Shadowgraph: Shadowgraph starting from expiration threshold 735
node4 5m 59.214s 2025-09-28 05:49:37.988 1164 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 762 state to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/762
node4 5m 59.215s 2025-09-28 05:49:37.989 1165 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 762
node4 5m 59.225s 2025-09-28 05:49:37.999 1173 INFO EVENT_STREAM <<reconnect: reconnect-controller>> DefaultConsensusEventStream: EventStreamManager::updateRunningHash: bcce506ee4bc2dea29ba8a59f3cd7488772dc340a48f66af1a14123e800136a93860a7cf986955ba521ebc106fd1805a
node4 5m 59.225s 2025-09-28 05:49:37.999 1176 INFO STARTUP <platformForkJoinThread-6> PcesFileManager: Due to recent operations on this node, the local preconsensus event stream will have a discontinuity. The last file with the old origin round is 2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces. All future files will have an origin round of 762.
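
The PcesFileManager entry names preconsensus event stream files with a creation timestamp plus seq, minr, maxr, and orgn fields, and says future files will carry an origin round of 762; the same naming shows up in every BestEffortPcesFileCopy entry below. Splitting those names into their parts makes the later copy decisions easier to follow. The sketch is illustrative only: the field meanings are inferred from the surrounding entries (orgn matching the "origin round" wording above), not taken from platform code.

// Illustrative parser for names like 2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PcesFileName {
    private static final Pattern NAME = Pattern.compile(
            "(?<ts>.+Z)_seq(?<seq>\\d+)_minr(?<min>\\d+)_maxr(?<max>\\d+)_orgn(?<orgn>\\d+)\\.pces");

    // Field names reflect an assumption that minr/maxr/orgn are round bounds and the origin round.
    record Descriptor(String timestamp, long sequence, long minRound, long maxRound, long originRound) {}

    static Descriptor parse(String fileName) {
        Matcher m = NAME.matcher(fileName);
        if (!m.matches()) {
            throw new IllegalArgumentException("unrecognized PCES file name: " + fileName);
        }
        return new Descriptor(m.group("ts"), Long.parseLong(m.group("seq")),
                Long.parseLong(m.group("min")), Long.parseLong(m.group("max")),
                Long.parseLong(m.group("orgn")));
    }

    public static void main(String[] args) {
        // The file named above, and the post-reconnect file that appears further down in this log.
        System.out.println(parse("2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces"));
        System.out.println(parse("2025-09-28T05+49+38.434599713Z_seq1_minr735_maxr1235_orgn762.pces"));
    }
}
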
node1 5m 59.257s 2025-09-28 05:49:38.031 8971 INFO RECONNECT <<platform-core: SyncProtocolWith4 1 to 4>> ReconnectTeacher: Finished reconnect in the role of the sender. {"receiving":false,"nodeId":1,"otherNodeId":4,"round":762,"success":false} [com.swirlds.logging.legacy.payload.ReconnectFinishPayload]
node4 5m 59.334s 2025-09-28 05:49:38.108 1198 INFO STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Initializing statistics output in CSV format [ csvOutputFolder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats', csvFileName = 'PlatformTesting4.csv' ]
node4 5m 59.337s 2025-09-28 05:49:38.111 1199 DEBUG STARTUP <<platform-core: MetricsThread #0>> LegacyCsvWriter: CsvWriter: Using the existing metrics folder [ folder = '/opt/hgcapp/services-hedera/HapiApp2.0/data/stats' ]
node4 5m 59.358s 2025-09-28 05:49:38.132 1204 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/2 for round 762
node4 5m 59.361s 2025-09-28 05:49:38.135 1205 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 762
Timestamp: 2025-09-28T05:49:35.997831Z
Next consensus number: 22956
Legacy running event hash: bcce506ee4bc2dea29ba8a59f3cd7488772dc340a48f66af1a14123e800136a93860a7cf986955ba521ebc106fd1805a
Legacy running event mnemonic: green-assist-hammer-run
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: 1591704947
Root hash: 024d5615136e760046eff9b14bebe4974a396ef24d1f806757cb09745aab8b8e6534f561002f50ff2e2e17fbfbfcf9c3
(root) ConsistencyTestingToolState / endless-section-horse-scale
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 script-vendor-steel-citizen
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7379020150092654631 /3 grass-flavor-sweet-sniff
    4 StringLeaf 762 /4 virtual-monkey-weird-extend
node4 5m 59.396s 2025-09-28 05:49:38.170 1206 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus file on disk.
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces
node4 5m 59.399s 2025-09-28 05:49:38.173 1207 WARN STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: No preconsensus event files meeting specified criteria found to copy. Lower bound: 735
node4 5m 59.405s 2025-09-28 05:49:38.179 1208 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 762 to disk. Reason: RECONNECT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/762 {"round":762,"freezeState":false,"reason":"RECONNECT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/762/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
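
The WARN above explains itself once the file name is decoded: the only file on disk tops out at maxr378, below the lower bound of 735 attached to the round-762 state, so nothing qualifies for copying next to that snapshot (the later snapshots at rounds 815 and 945 do find a qualifying file). A minimal sketch of that kind of lower-bound filter follows; it is an inference from these log lines, not the platform's actual selection logic, and the helper names are made up.

// Illustrative filter: keep only PCES files whose upper round bound (maxr in the name)
// is at or above the lower bound carried by the state being written.
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PcesCopyFilter {
    private static final Pattern MAXR = Pattern.compile("_maxr(\\d+)_");

    static List<String> filesWorthCopying(List<String> fileNames, long lowerBound) {
        return fileNames.stream()
                .filter(name -> {
                    Matcher m = MAXR.matcher(name);
                    return m.find() && Long.parseLong(m.group(1)) >= lowerBound;
                })
                .toList();
    }

    public static void main(String[] args) {
        List<String> onDisk = List.of("2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces");
        // Round-762 state, lower bound 735: the result is empty, matching the WARN entry above.
        System.out.println(filesWorthCopying(onDisk, 735));
    }
}
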
node4 5m 59.409s 2025-09-28 05:49:38.183 1209 INFO PLATFORM_STATUS <platformForkJoinThread-6> StatusStateMachine: Platform spent 196.0 ms in RECONNECT_COMPLETE. Now in CHECKING
node4 6.004m 2025-09-28 05:49:39.001 1210 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:1 H:d13b798c668d BR:760), num remaining: 3
node4 6.004m 2025-09-28 05:49:39.002 1211 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:0 H:fe160d0367eb BR:760), num remaining: 2
node4 6.004m 2025-09-28 05:49:39.002 1212 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:3 H:bf3cf01bfdfb BR:760), num remaining: 1
node4 6.004m 2025-09-28 05:49:39.002 1213 INFO STARTUP <<scheduler ConsensusEngine>> ConsensusImpl: Found init judge (CR:2 H:1db5e3f1b029 BR:760), num remaining: 0
node4 6m 5.319s 2025-09-28 05:49:44.093 1359 INFO PLATFORM_STATUS <platformForkJoinThread-4> StatusStateMachine: Platform spent 5.9 s in CHECKING. Now in ACTIVE
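
The StatusStateMachine entries give node4's recovery timeline directly: 793 ms in BEHIND, 196 ms in RECONNECT_COMPLETE, then 5.9 s in CHECKING before returning to ACTIVE. When comparing many runs it is convenient to tabulate these dwell times; the sketch below assumes only that the "Platform spent X in S. Now in T" wording stays stable, and is an analysis aid rather than platform code.

// Illustrative tabulation of platform-status dwell times from StatusStateMachine lines.
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StatusTimeline {
    private static final Pattern TRANSITION = Pattern.compile(
            "Platform spent ([0-9.]+) (ms|s) in (\\w+)\\. Now in (\\w+)");

    record Dwell(String status, double seconds, String next) {}

    static List<Dwell> parse(List<String> lines) {
        List<Dwell> result = new ArrayList<>();
        for (String line : lines) {
            Matcher m = TRANSITION.matcher(line);
            if (m.find()) {
                double value = Double.parseDouble(m.group(1));
                double seconds = m.group(2).equals("ms") ? value / 1000.0 : value;
                result.add(new Dwell(m.group(3), seconds, m.group(4)));
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // The three transitions logged by node4 above.
        List<String> lines = List.of(
                "StatusStateMachine: Platform spent 793.0 ms in BEHIND. Now in RECONNECT_COMPLETE",
                "StatusStateMachine: Platform spent 196.0 ms in RECONNECT_COMPLETE. Now in CHECKING",
                "StatusStateMachine: Platform spent 5.9 s in CHECKING. Now in ACTIVE");
        parse(lines).forEach(System.out::println);
    }
}
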
node3 6m 23.050s 2025-09-28 05:50:01.824 9357 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 815 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node0 6m 23.103s 2025-09-28 05:50:01.877 9341 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 815 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 6m 23.109s 2025-09-28 05:50:01.883 1781 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 815 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 23.113s 2025-09-28 05:50:01.887 9431 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 815 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 6m 23.207s 2025-09-28 05:50:01.981 9548 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 815 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 6m 23.262s 2025-09-28 05:50:02.036 9440 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 815 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/815
node2 6m 23.262s 2025-09-28 05:50:02.036 9441 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 815
node2 6m 23.351s 2025-09-28 05:50:02.125 9479 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 815
node4 6m 23.351s 2025-09-28 05:50:02.125 1790 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 815 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/815
node4 6m 23.351s 2025-09-28 05:50:02.125 1791 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 815
node2 6m 23.353s 2025-09-28 05:50:02.127 9480 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 815
Timestamp: 2025-09-28T05:50:00.070109041Z
Next consensus number: 24760
Legacy running event hash: 5dea3ef84b81be05823180534b0f180040e0487fe49c87bf01bb2669b7d40e8c9de80041e265282ee27f9b439b2e9e28
Legacy running event mnemonic: run-tenant-ice-disorder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -454857274
Root hash: 849ca2b327e0a858b6ba85cdcbfdc6b828ffb56ce0d877a2d691b702edbdaa5c284177e4472646c88a576b4977a9a788
(root) ConsistencyTestingToolState / hold-predict-winter-vital
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twenty-define-work-setup
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7799009721450397269 /3 silly-chest-best-grain
    4 StringLeaf 815 /4 finger-fantasy-direct-aspect
node2 6m 23.361s 2025-09-28 05:50:02.135 9481 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node1 6m 23.362s 2025-09-28 05:50:02.136 9557 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 815 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/815
node1 6m 23.362s 2025-09-28 05:50:02.136 9558 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 815
node2 6m 23.364s 2025-09-28 05:50:02.138 9482 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 788
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces
node2 6m 23.364s 2025-09-28 05:50:02.138 9483 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 6m 23.370s 2025-09-28 05:50:02.144 9498 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 6m 23.371s 2025-09-28 05:50:02.145 9499 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 815 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/815 {"round":815,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/815/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 6m 23.372s 2025-09-28 05:50:02.146 9500 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/134
node3 6m 23.425s 2025-09-28 05:50:02.199 9366 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 815 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/815
node3 6m 23.426s 2025-09-28 05:50:02.200 9367 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 815
node0 6m 23.432s 2025-09-28 05:50:02.206 9350 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 815 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/815
node0 6m 23.433s 2025-09-28 05:50:02.207 9351 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 815
node1 6m 23.450s 2025-09-28 05:50:02.224 9596 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/44 for round 815
node1 6m 23.452s 2025-09-28 05:50:02.226 9597 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 815
Timestamp: 2025-09-28T05:50:00.070109041Z
Next consensus number: 24760
Legacy running event hash: 5dea3ef84b81be05823180534b0f180040e0487fe49c87bf01bb2669b7d40e8c9de80041e265282ee27f9b439b2e9e28
Legacy running event mnemonic: run-tenant-ice-disorder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -454857274
Root hash: 849ca2b327e0a858b6ba85cdcbfdc6b828ffb56ce0d877a2d691b702edbdaa5c284177e4472646c88a576b4977a9a788
(root) ConsistencyTestingToolState / hold-predict-winter-vital
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twenty-define-work-setup
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7799009721450397269 /3 silly-chest-best-grain
    4 StringLeaf 815 /4 finger-fantasy-direct-aspect
node1 6m 23.459s 2025-09-28 05:50:02.233 9598 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 23.459s 2025-09-28 05:50:02.233 9599 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 788
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 6m 23.459s 2025-09-28 05:50:02.233 9600 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 6m 23.465s 2025-09-28 05:50:02.239 9601 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 6m 23.465s 2025-09-28 05:50:02.239 9602 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 815 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/815 {"round":815,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/815/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 6m 23.467s 2025-09-28 05:50:02.241 9603 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/134
node4 6m 23.475s 2025-09-28 05:50:02.249 1832 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/8 for round 815
node4 6m 23.477s 2025-09-28 05:50:02.251 1833 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 815
Timestamp: 2025-09-28T05:50:00.070109041Z
Next consensus number: 24760
Legacy running event hash: 5dea3ef84b81be05823180534b0f180040e0487fe49c87bf01bb2669b7d40e8c9de80041e265282ee27f9b439b2e9e28
Legacy running event mnemonic: run-tenant-ice-disorder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -454857274
Root hash: 849ca2b327e0a858b6ba85cdcbfdc6b828ffb56ce0d877a2d691b702edbdaa5c284177e4472646c88a576b4977a9a788
(root) ConsistencyTestingToolState / hold-predict-winter-vital
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twenty-define-work-setup
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7799009721450397269 /3 silly-chest-best-grain
    4 StringLeaf 815 /4 finger-fantasy-direct-aspect
node4 6m 23.485s 2025-09-28 05:50:02.259 1834 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+49+38.434599713Z_seq1_minr735_maxr1235_orgn762.pces
Last file: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces
node4 6m 23.485s 2025-09-28 05:50:02.259 1835 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 788
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+49+38.434599713Z_seq1_minr735_maxr1235_orgn762.pces
node4 6m 23.485s 2025-09-28 05:50:02.259 1836 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 6m 23.489s 2025-09-28 05:50:02.263 1837 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 6m 23.489s 2025-09-28 05:50:02.263 1838 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 815 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/815 {"round":815,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/815/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 6m 23.491s 2025-09-28 05:50:02.265 1839 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/1
node3 6m 23.508s 2025-09-28 05:50:02.282 9405 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 815
node3 6m 23.510s 2025-09-28 05:50:02.284 9406 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 815
Timestamp: 2025-09-28T05:50:00.070109041Z
Next consensus number: 24760
Legacy running event hash: 5dea3ef84b81be05823180534b0f180040e0487fe49c87bf01bb2669b7d40e8c9de80041e265282ee27f9b439b2e9e28
Legacy running event mnemonic: run-tenant-ice-disorder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -454857274
Root hash: 849ca2b327e0a858b6ba85cdcbfdc6b828ffb56ce0d877a2d691b702edbdaa5c284177e4472646c88a576b4977a9a788
(root) ConsistencyTestingToolState / hold-predict-winter-vital
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twenty-define-work-setup
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7799009721450397269 /3 silly-chest-best-grain
    4 StringLeaf 815 /4 finger-fantasy-direct-aspect
node3 6m 23.517s 2025-09-28 05:50:02.291 9407 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 6m 23.520s 2025-09-28 05:50:02.294 9408 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 788
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces
node3 6m 23.520s 2025-09-28 05:50:02.294 9409 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 23.523s 2025-09-28 05:50:02.297 9397 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/43 for round 815
node0 6m 23.525s 2025-09-28 05:50:02.299 9398 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 815
Timestamp: 2025-09-28T05:50:00.070109041Z
Next consensus number: 24760
Legacy running event hash: 5dea3ef84b81be05823180534b0f180040e0487fe49c87bf01bb2669b7d40e8c9de80041e265282ee27f9b439b2e9e28
Legacy running event mnemonic: run-tenant-ice-disorder
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -454857274
Root hash: 849ca2b327e0a858b6ba85cdcbfdc6b828ffb56ce0d877a2d691b702edbdaa5c284177e4472646c88a576b4977a9a788
(root) ConsistencyTestingToolState / hold-predict-winter-vital
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 twenty-define-work-setup
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf 7799009721450397269 /3 silly-chest-best-grain
    4 StringLeaf 815 /4 finger-fantasy-direct-aspect
node3 6m 23.526s 2025-09-28 05:50:02.300 9426 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 6m 23.526s 2025-09-28 05:50:02.300 9427 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 815 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/815 {"round":815,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/815/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 6m 23.528s 2025-09-28 05:50:02.302 9428 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/134
node0 6m 23.532s 2025-09-28 05:50:02.306 9399 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 6m 23.535s 2025-09-28 05:50:02.309 9400 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 788
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 6m 23.535s 2025-09-28 05:50:02.309 9401 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node0 6m 23.541s 2025-09-28 05:50:02.315 9402 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 6m 23.542s 2025-09-28 05:50:02.316 9403 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 815 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/815 {"round":815,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/815/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 6m 23.543s 2025-09-28 05:50:02.317 9404 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/134
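
Rounds 815 here and 945 below show the steady-state snapshot pattern: every node writes the same round under .../ConsistencyTestingToolMain/<node id>/123/<round>, copies a qualifying PCES file alongside it, and then prunes an older saved round (the FileUtils deletions above). A quick way to confirm from a combined log that all five nodes persisted a given round is to fold the StateSavedToDiskPayload entries into per-round node sets; the sketch assumes this combined layout, with the node ID as the first token of each line, and is purely an analysis aid.

// Illustrative aggregation: which nodes reported a StateSavedToDiskPayload for each round.
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.TreeSet;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SavedRounds {
    private static final Pattern NODE = Pattern.compile("^(node\\d+).*StateSavedToDiskPayload");
    private static final Pattern ROUND = Pattern.compile("\"round\":(\\d+)");

    static Map<Long, TreeSet<String>> nodesByRound(List<String> lines) {
        Map<Long, TreeSet<String>> result = new TreeMap<>();
        for (String line : lines) {
            Matcher node = NODE.matcher(line);
            Matcher round = ROUND.matcher(line);
            if (node.find() && round.find()) {
                result.computeIfAbsent(Long.parseLong(round.group(1)), r -> new TreeSet<>())
                        .add(node.group(1));
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Abbreviated stand-ins for two of the "Finished writing state" lines above.
        List<String> sample = List.of(
                "node2 ... Finished writing state for round 815 to disk. {\"round\":815} [StateSavedToDiskPayload]",
                "node1 ... Finished writing state for round 815 to disk. {\"round\":815} [StateSavedToDiskPayload]");
        System.out.println(nodesByRound(sample)); // {815=[node1, node2]}
    }
}
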
node0 7m 22.192s 2025-09-28 05:51:00.966 10798 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 945 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node1 7m 22.279s 2025-09-28 05:51:01.053 10993 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 945 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node2 7m 22.292s 2025-09-28 05:51:01.066 10872 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 945 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node4 7m 22.313s 2025-09-28 05:51:01.087 3249 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 945 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 22.358s 2025-09-28 05:51:01.132 10836 INFO STATE_TO_DISK <<scheduler TransactionHandler>> DefaultSavedStateController: Signed state from round 945 created, will eventually be written to disk, for reason: PERIODIC_SNAPSHOT
node3 7m 22.429s 2025-09-28 05:51:01.203 10839 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 945 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/945
node3 7m 22.430s 2025-09-28 05:51:01.204 10840 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 945
node2 7m 22.450s 2025-09-28 05:51:01.224 10875 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 945 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/945
node2 7m 22.451s 2025-09-28 05:51:01.225 10876 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 945
node3 7m 22.518s 2025-09-28 05:51:01.292 10871 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 945
node0 7m 22.520s 2025-09-28 05:51:01.294 10811 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 945 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/945
node3 7m 22.520s 2025-09-28 05:51:01.294 10872 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 945
Timestamp: 2025-09-28T05:51:00.093178Z
Next consensus number: 29597
Legacy running event hash: 159d497215cfeb5de603e6c8c0213c2114773a7106db3a87572b12020b337f57377ec3637b86c40e45b2fae1772c2043
Legacy running event mnemonic: recipe-comic-rigid-nerve
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -129527916
Root hash: c067ebc3f0a04fe87ea83f106ed4fef6e90352fcebf8e3575cddd3368fbda94d54165605ca9d9913aae989a58c7ee69e
(root) ConsistencyTestingToolState / veteran-marble-ugly-wonder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 venture-weapon-maze-group
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -5031928959372440322 /3 library-party-cruise-cousin
    4 StringLeaf 945 /4 buzz-hybrid-sea-market
node0 7m 22.521s 2025-09-28 05:51:01.295 10812 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 945
node3 7m 22.528s 2025-09-28 05:51:01.302 10881 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces
Last file: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+43+55.172849588Z_seq0_minr1_maxr501_orgn0.pces
node3 7m 22.528s 2025-09-28 05:51:01.302 10882 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 918
File: data/saved/preconsensus-events/3/2025/09/28/2025-09-28T05+47+43.871813828Z_seq1_minr473_maxr5473_orgn0.pces
node3 7m 22.528s 2025-09-28 05:51:01.302 10883 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 22.535s 2025-09-28 05:51:01.309 10907 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 945
node2 7m 22.537s 2025-09-28 05:51:01.311 10908 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 945
Timestamp: 2025-09-28T05:51:00.093178Z
Next consensus number: 29597
Legacy running event hash: 159d497215cfeb5de603e6c8c0213c2114773a7106db3a87572b12020b337f57377ec3637b86c40e45b2fae1772c2043
Legacy running event mnemonic: recipe-comic-rigid-nerve
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -129527916
Root hash: c067ebc3f0a04fe87ea83f106ed4fef6e90352fcebf8e3575cddd3368fbda94d54165605ca9d9913aae989a58c7ee69e
(root) ConsistencyTestingToolState / veteran-marble-ugly-wonder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 venture-weapon-maze-group
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -5031928959372440322 /3 library-party-cruise-cousin
    4 StringLeaf 945 /4 buzz-hybrid-sea-market
node3 7m 22.537s 2025-09-28 05:51:01.311 10884 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node3 7m 22.537s 2025-09-28 05:51:01.311 10885 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 945 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/945 {"round":945,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/945/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node3 7m 22.539s 2025-09-28 05:51:01.313 10886 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/3/123/269
node2 7m 22.544s 2025-09-28 05:51:01.318 10909 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces
Last file: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+43+55.090200347Z_seq0_minr1_maxr501_orgn0.pces
node2 7m 22.545s 2025-09-28 05:51:01.319 10910 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 918
File: data/saved/preconsensus-events/2/2025/09/28/2025-09-28T05+47+43.774521942Z_seq1_minr474_maxr5474_orgn0.pces
node2 7m 22.545s 2025-09-28 05:51:01.319 10911 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node2 7m 22.554s 2025-09-28 05:51:01.328 10912 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node2 7m 22.554s 2025-09-28 05:51:01.328 10913 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 945 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/945 {"round":945,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/945/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node2 7m 22.556s 2025-09-28 05:51:01.330 10914 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/2/123/269
node0 7m 22.608s 2025-09-28 05:51:01.382 10843 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/49 for round 945
node0 7m 22.610s 2025-09-28 05:51:01.384 10844 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 945
Timestamp: 2025-09-28T05:51:00.093178Z
Next consensus number: 29597
Legacy running event hash: 159d497215cfeb5de603e6c8c0213c2114773a7106db3a87572b12020b337f57377ec3637b86c40e45b2fae1772c2043
Legacy running event mnemonic: recipe-comic-rigid-nerve
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -129527916
Root hash: c067ebc3f0a04fe87ea83f106ed4fef6e90352fcebf8e3575cddd3368fbda94d54165605ca9d9913aae989a58c7ee69e
(root) ConsistencyTestingToolState / veteran-marble-ugly-wonder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 venture-weapon-maze-group
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -5031928959372440322 /3 library-party-cruise-cousin
    4 StringLeaf 945 /4 buzz-hybrid-sea-market
node0 7m 22.617s 2025-09-28 05:51:01.391 10845 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+43+55.143048284Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 7m 22.618s 2025-09-28 05:51:01.392 10846 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 918
File: data/saved/preconsensus-events/0/2025/09/28/2025-09-28T05+47+43.897361566Z_seq1_minr473_maxr5473_orgn0.pces
node0 7m 22.618s 2025-09-28 05:51:01.392 10847 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 22.619s 2025-09-28 05:51:01.393 11006 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 945 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/945
node1 7m 22.620s 2025-09-28 05:51:01.394 11007 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 945
node0 7m 22.631s 2025-09-28 05:51:01.405 10848 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node0 7m 22.631s 2025-09-28 05:51:01.405 10849 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 945 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/945 {"round":945,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/945/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node0 7m 22.633s 2025-09-28 05:51:01.407 10850 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/0/123/269
node4 7m 22.638s 2025-09-28 05:51:01.412 3262 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Started writing round 945 state to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/945
node4 7m 22.639s 2025-09-28 05:51:01.413 3263 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Creating a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 945
node1 7m 22.709s 2025-09-28 05:51:01.483 11042 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/50 for round 945
node1 7m 22.711s 2025-09-28 05:51:01.485 11043 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 945
Timestamp: 2025-09-28T05:51:00.093178Z
Next consensus number: 29597
Legacy running event hash: 159d497215cfeb5de603e6c8c0213c2114773a7106db3a87572b12020b337f57377ec3637b86c40e45b2fae1772c2043
Legacy running event mnemonic: recipe-comic-rigid-nerve
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -129527916
Root hash: c067ebc3f0a04fe87ea83f106ed4fef6e90352fcebf8e3575cddd3368fbda94d54165605ca9d9913aae989a58c7ee69e
(root) ConsistencyTestingToolState / veteran-marble-ugly-wonder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 venture-weapon-maze-group
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -5031928959372440322 /3 library-party-cruise-cousin
    4 StringLeaf 945 /4 buzz-hybrid-sea-market
node1 7m 22.718s 2025-09-28 05:51:01.492 11044 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+43+55.008338827Z_seq0_minr1_maxr501_orgn0.pces
Last file: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 22.718s 2025-09-28 05:51:01.492 11045 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 918
File: data/saved/preconsensus-events/1/2025/09/28/2025-09-28T05+47+43.818134833Z_seq1_minr474_maxr5474_orgn0.pces
node1 7m 22.718s 2025-09-28 05:51:01.492 11046 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node1 7m 22.727s 2025-09-28 05:51:01.501 11047 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node1 7m 22.728s 2025-09-28 05:51:01.502 11048 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 945 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/945 {"round":945,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/945/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node1 7m 22.729s 2025-09-28 05:51:01.503 11049 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/1/123/269
node4 7m 22.761s 2025-09-28 05:51:01.535 3301 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> MerkleTreeSnapshotWriter: Successfully created a snapshot on demand in /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/swirlds-tmp/14 for round 945
node4 7m 22.763s 2025-09-28 05:51:01.537 3302 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Information for state written to disk:
Round: 945
Timestamp: 2025-09-28T05:51:00.093178Z
Next consensus number: 29597
Legacy running event hash: 159d497215cfeb5de603e6c8c0213c2114773a7106db3a87572b12020b337f57377ec3637b86c40e45b2fae1772c2043
Legacy running event mnemonic: recipe-comic-rigid-nerve
Rounds non-ancient: 26
Creation version: SemanticVersion[major=1, minor=0, patch=0, pre=, build=]
Minimum judge hash code: -129527916
Root hash: c067ebc3f0a04fe87ea83f106ed4fef6e90352fcebf8e3575cddd3368fbda94d54165605ca9d9913aae989a58c7ee69e
(root) ConsistencyTestingToolState / veteran-marble-ugly-wonder
    0 SingletonNode PlatformStateService.PLATFORM_STATE /0 venture-weapon-maze-group
    1 SingletonNode RosterService.ROSTER_STATE /1 finger-warfare-tenant-lend
    2 VirtualMap RosterService.ROSTERS /2 decline-exhibit-task-absent
    3 StringLeaf -5031928959372440322 /3 library-party-cruise-cousin
    4 StringLeaf 945 /4 buzz-hybrid-sea-market
node4 7m 22.770s 2025-09-28 05:51:01.544 3303 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 2 preconsensus files on disk.
First file: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+49+38.434599713Z_seq1_minr735_maxr1235_orgn762.pces
Last file: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+43+54.901490133Z_seq0_minr1_maxr378_orgn0.pces
node4 7m 22.771s 2025-09-28 05:51:01.545 3304 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Found 1 preconsensus event file meeting specified criteria to copy.
Lower bound: 918
File: data/saved/preconsensus-events/4/2025/09/28/2025-09-28T05+49+38.434599713Z_seq1_minr735_maxr1235_orgn762.pces
node4 7m 22.771s 2025-09-28 05:51:01.545 3305 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Copying 1 preconsensus event file(s)
node4 7m 22.776s 2025-09-28 05:51:01.550 3306 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> BestEffortPcesFileCopy: Finished copying 1 preconsensus event file(s)
node4 7m 22.776s 2025-09-28 05:51:01.550 3307 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> SignedStateFileWriter: Finished writing state for round 945 to disk. Reason: PERIODIC_SNAPSHOT, directory: /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/945 {"round":945,"freezeState":false,"reason":"PERIODIC_SNAPSHOT","directory":"file:///opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/945/"} [com.swirlds.logging.legacy.payload.StateSavedToDiskPayload]
node4 7m 22.778s 2025-09-28 05:51:01.552 3308 INFO STATE_TO_DISK <<scheduler StateSnapshotManager>> FileUtils: deleting directory /opt/hgcapp/services-hedera/HapiApp2.0/data/saved/com.swirlds.demo.consistency.ConsistencyTestingToolMain/4/123/10
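
As a closing cross-check, the per-round state dumps in this section show all five nodes converging on identical root hashes, 849ca2...a9a788 for round 815 and c067eb...ee69e for round 945, which is the expected outcome after node4's reconnect at round 762. The sketch below automates that comparison over the reconstructed multi-line dumps; it assumes only that each dump's "Root hash:" line appears after its "Round:" line, and it is an analysis aid rather than anything the platform runs.

// Illustrative check: group the "Root hash:" values by round and count distinct hashes.
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class RootHashAgreement {
    static Map<String, Set<String>> hashesByRound(List<String> lines) {
        Map<String, Set<String>> result = new HashMap<>();
        String currentRound = null;
        for (String line : lines) {
            String trimmed = line.trim();
            if (trimmed.startsWith("Round:")) {
                currentRound = trimmed.substring("Round:".length()).trim();
            } else if (trimmed.startsWith("Root hash:") && currentRound != null) {
                result.computeIfAbsent(currentRound, r -> new HashSet<>())
                        .add(trimmed.substring("Root hash:".length()).trim());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Abbreviated stand-ins for the state dumps above; a real run would feed the whole log.
        List<String> sample = List.of(
                "Round: 815", "Root hash: 849ca2...a9a788",
                "Round: 815", "Root hash: 849ca2...a9a788",
                "Round: 945", "Root hash: c067eb...ee69e");
        // Every round should map to exactly one distinct hash when the nodes agree.
        hashesByRound(sample).forEach((round, hashes) ->
                System.out.println("round " + round + ": " + hashes.size() + " distinct hash(es)"));
    }
}
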